Not OK, computer: music streaming’s diversity problem

Sexism can be a subtle problem. In the music industry, for example, we have had not just #MeToo scandals, exposing the abuses of male singers, musicians and producers, but also less obvious ways in which women are disadvantaged.

Take people’s listening patterns on streaming services. If you look at Spotify’s top 10 most streamed artists of 2020, for example, only two are women — and Billie Eilish is the highest in seventh place. This might not seem a case of discrimination, but the way we got here raises some important questions.

Now a team of European computer scientists has explored this tendency by looking at streaming services’ algorithms. More specifically, Christine Bauer of Utrecht University in the Netherlands and Xavier Serra and Andres Ferraro of Universitat Pompeu Fabra in Spain analysed the publicly available listening records of 330,000 users of one service. This showed that female artists represented only 25 per cent of the music listened to by users. The authors wrote on The Conversation platform that “on average, the first recommended track was by a man, along with the next six. Users had to wait until song seven or eight to hear one by a woman.”

People come to their musical tastes in all kinds of ways, but the way most of us listen to music now presents a particular problem of embedded bias. When a streaming service offers recommendations, it does so by studying what music has been listened to before. That creates a vicious feedback loop: if the service already recommends more music by men, that music gets streamed more and so gets recommended more often, with startling consequences that most of us listeners never notice.
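
The mechanism is simple enough to sketch in a few lines of code. The toy simulation below (every name and number in it is illustrative, not drawn from the study) ranks artists purely by play count, then lets listeners pick mostly from the top of the list; a small initial imbalance is enough to lock one group out of the top 10 entirely.

```python
import random

# Illustrative sketch of a popularity feedback loop, NOT the researchers'
# actual model. Artists labelled "m0".."m9" start with a slight head start
# over "f0".."f9"; these labels and counts are assumptions for the demo.

def recommend(play_counts, k=10):
    """Recommend the k most-played artists (pure popularity ranking)."""
    return sorted(play_counts, key=play_counts.get, reverse=True)[:k]

def simulate(rounds=50, seed=0):
    rng = random.Random(seed)
    play_counts = {f"m{i}": 105 for i in range(10)}   # slight head start
    play_counts.update({f"f{i}": 100 for i in range(10)})
    for _ in range(rounds):
        # Listeners mostly choose from the top of the recommended list,
        # so already-popular artists accumulate plays fastest.
        for rank, artist in enumerate(recommend(play_counts)):
            if rng.random() < 1.0 / (rank + 1):  # position bias
                play_counts[artist] += 1
    return play_counts

top10 = recommend(simulate())
print(sum(a.startswith("f") for a in top10), "female artists in the top 10")
# prints: 0 female artists in the top 10
```

Because only recommended artists gain plays, the initial five-play gap never closes: the "f" artists never reach the top 10, however long the simulation runs.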

Is there any solution? The researchers offered one: they simulated the algorithm and tweaked it to raise the rankings of female artists (ie giving them more exposure by recommending them earlier) and to lower those of male artists. When they let this system run, a new feedback loop emerged: the AI recommended female artists earlier, making listeners more aware of that option; and as the platform learnt that this music was being chosen, it recommended it more often.
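
A minimal version of that re-ranking tweak is easy to illustrate. The sketch below (the function name, the fixed offset and the playlist are assumptions for the demo, not the paper's exact algorithm) moves each track by a female artist a few positions earlier in a recommendation list.

```python
# Illustrative re-ranking sketch: promote tracks by female artists by a
# fixed number of positions. This mirrors the idea described above, not
# the researchers' precise method.

def rerank(tracks, is_female, up=3):
    """Return a copy of tracks with each female-artist track moved
    `up` positions earlier (clamped at the top of the list)."""
    reranked = list(tracks)
    for track in tracks:
        if is_female(track):
            j = reranked.index(track)
            reranked.insert(max(0, j - up), reranked.pop(j))
    return reranked

playlist = ["m1", "m2", "m3", "m4", "m5", "m6", "f1", "m7", "f2", "m8"]
print(rerank(playlist, lambda t: t.startswith("f")))
# prints: ['m1', 'm2', 'm3', 'f1', 'm4', 'f2', 'm5', 'm6', 'm7', 'm8']
```

In this toy list the first track by a woman moves from position seven to position four, the shift that, in the researchers' simulation, gradually fed back into listening habits.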

Bauer tells me it was “a positive surprise” to change the streaming service’s apparent bias so much with a few tweaks to the algorithm. “Of course, it’s always easier to fix something in theory rather than in practice,” she says, “but if this effect was similar in the real world, that would be great.” She adds that the group is now exploring how to use the same approach to address ethnic and other forms of discrimination in media platforms.

The team stress that this work is still at an early stage, but the study is thought-provoking for two reasons. First, and most obviously, it shows why it pays to have a wider debate on how now-pervasive AI programs work and, above all, whether we want them to extrapolate from our collective pasts into our futures. “We are at a critical juncture, one that requires us to ask hard questions about the way AI is produced and adopted,” writes Kate Crawford, who co-founded an AI centre at New York University, in a powerful new book, Atlas of AI.

Second, music streaming should also make us ponder the thorny issue of positive discrimination. Personally, I have often felt wary of this concept, since I have built my career trying to avoid defining myself by gender. But today, after years working in the media, I also realise the power of the “demonstration effect”: if a society only ever sees white men in positions of power (or on the pages of newspapers), it creates a cultural feedback loop, not unlike those streaming services.

This affects many corners of business. Consider venture capital: research from a multitude of groups shows that diverse teams outperform homogeneous ones. Yet according to Deloitte, 77 per cent of venture capitalists are male and 72 per cent white, while black and Latino investors received just 2.4 per cent of funding between 2015 and 2020, according to Crunchbase.

This pattern has not arisen primarily because powerful people are overtly sexist or racist; the subtler issue is that financiers prefer to work with colleagues who are a good “cultural fit” (ie are like them) and to back entrepreneurs with a proven track record — except most of those entrepreneurs happen to look like them.

“Mainstream investors generally consider funds led by people of colour and women as higher risk, despite widely available evidence that diversity actually mitigates risk,” point out financiers Tracy Gray and Emilie Cortes in the Stanford Social Innovation Review. You could address this by using something akin to a music algorithm rejig: foundations could deliberately elevate diverse employees and overinvest in funds run by diverse groups to change the feedback loop.

Would this work? Nobody knows, since it has never been done at scale, at least not in finance. The reality is that shifting human bias is probably even harder than tweaking an algorithm.

But if you want a reason to feel hopeful, consider this: while computer programs might entrench existing bias, the transparency that big data provides can illuminate the problem with unusual clarity. That, in turn, can galvanise action, if we choose to take it, in music and elsewhere.

Follow Gillian on Twitter @gilliantett and email her at [email protected]

Follow @FTMag on Twitter to find out about our latest stories first
