Quantifying Gender Bias in Consumer Culture

Reihane Boghrati, Jonah Berger

Cultural items like songs play an important role in creating and reinforcing stereotypes, biases, and discrimination. But the actual nature of such items is often less transparent. Take songs, for example. Are lyrics biased against women? And how have any such biases changed over time? Natural language processing of a quarter of a million songs over 50 years quantifies misogyny. Women are less likely to be associated with desirable traits (i.e., competence), and while this bias has decreased, it persists. Ancillary analyses further suggest that song lyrics may help drive shifts in societal stereotypes toward women, and that lyrical shifts are driven by male artists (female artists were less biased to begin with). Overall, these results shed light on cultural evolution, on subtle measures of bias and discrimination, and on how natural language processing and machine learning can provide deeper insight into stereotypes and cultural change.
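The abstract does not specify how trait associations are measured, but a common approach in this literature is to train word embeddings on the corpus and compare how close gendered words sit to trait words (e.g., competence terms) in the embedding space. The sketch below illustrates that idea with hand-made toy vectors; the words, vector values, and function names are illustrative assumptions, not the paper's actual method or data.

```python
import numpy as np

# Toy vectors standing in for embeddings trained on a lyric corpus
# (values are made up purely for illustration).
vectors = {
    "he":        np.array([0.9, 0.1, 0.2]),
    "she":       np.array([0.1, 0.9, 0.2]),
    "smart":     np.array([0.8, 0.2, 0.3]),
    "brilliant": np.array([0.7, 0.3, 0.1]),
}

def cosine(a, b):
    # Cosine similarity: how close two word vectors point in the same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(target, attributes):
    # Mean similarity between one target word and a set of attribute words.
    return sum(cosine(vectors[target], vectors[a]) for a in attributes) / len(attributes)

competence = ["smart", "brilliant"]

# A positive gap means competence words sit closer to the male term
# than to the female term in this (toy) embedding space.
bias_gap = association("he", competence) - association("she", competence)
print(f"competence association gap (he - she): {bias_gap:.3f}")
```

Tracking such a gap in embeddings trained separately per decade is one way a bias trend over time could be quantified.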
