Quantifying Gender Bias in Consumer Culture

Authors: Reihane Boghrati, Jonah Berger

License: CC BY-NC-ND 4.0

Abstract: Cultural items like songs play an important role in creating and reinforcing stereotypes, biases, and discrimination. But the actual nature of such items is often less transparent. Take songs, for example. Are lyrics biased against women? And how have any such biases changed over time? Natural language processing of a quarter of a million songs over 50 years quantifies misogyny. Women are less likely to be associated with desirable traits (i.e., competence), and while this bias has decreased, it persists. Ancillary analyses further suggest that song lyrics may help drive shifts in societal stereotypes towards women, and that lyrical shifts are driven by male artists (as female artists were less biased to begin with). Overall, these results shed light on cultural evolution, subtle measures of bias and discrimination, and how natural language processing and machine learning can provide deeper insight into stereotypes and cultural change.
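
The abstract does not spell out the bias measure, but association-with-traits analyses of this kind are commonly computed over word embeddings trained on the corpus. The sketch below is a minimal illustration under that assumption: a WEAT-style relative cosine-similarity score between gendered word lists and a competence word list. The word lists, embedding dimensionality, and toy vectors here are illustrative assumptions, not the paper's actual materials.

```python
# Hedged sketch: relative association between gendered word lists and a
# trait word list (e.g., competence), computed over word embeddings.
# Word lists and toy vectors are illustrative, not the paper's materials.
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(target_words, attribute_words, emb):
    """Mean cosine similarity between target words and attribute words."""
    sims = [cosine(emb[t], emb[a])
            for t in target_words if t in emb
            for a in attribute_words if a in emb]
    return float(np.mean(sims))

def relative_bias(female_words, male_words, trait_words, emb):
    """Positive => trait words sit closer to male words than to female words."""
    return (association(male_words, trait_words, emb)
            - association(female_words, trait_words, emb))

if __name__ == "__main__":
    # Toy random embeddings stand in for vectors trained on song lyrics
    # (e.g., word2vec over the lyric corpus).
    rng = np.random.default_rng(0)
    vocab = ["she", "her", "woman", "he", "his", "man",
             "smart", "capable", "brilliant"]
    emb = {w: rng.normal(size=50) for w in vocab}

    female = ["she", "her", "woman"]
    male = ["he", "his", "man"]
    competence = ["smart", "capable", "brilliant"]  # illustrative trait list

    print("relative competence bias:",
          relative_bias(female, male, competence, emb))
```

In this kind of setup, tracking the score across embeddings trained on lyrics from different decades is one way the trajectory of the bias over time could be examined.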

Submitted to arXiv on 10 Jan. 2022
