Question

As more systems and sectors are driven by predictive analytics, there is increasing awareness of the possibility and pitfalls of algorithmic discrimination. In what ways do you think Recommender Systems reinforce human bias? Reflecting on the techniques we have covered, do you think recommender systems reinforce or help to prevent unethical targeting or customer segmentation? Please provide one or more examples to support your arguments.

A few resources:

Evan Estola (2016): When Recommendation Systems Go Bad; MLconf SEA 2016

Rishabh Jain (2016): When Recommendation Systems Go Bad

Moritz Hardt, Eric Price, Nathan Srebro (2016): Equality of Opportunity in Supervised Learning

Response

In what ways do you think Recommender Systems reinforce human bias?

A recommender system may reinforce human bias when it makes recommendations based on biased ratings. The exposure of items is then skewed, and a popularity divide emerges through the feedback loop between the system and its users. Over time this can make the recommendations increasingly biased, until users see only a narrow subset of the full range of available items, a phenomenon known as the ‘filter bubble’.
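A minimal sketch of this feedback loop, under the simplifying assumptions that item exposure is proportional to past clicks and that every item is equally likely to be clicked once shown (the catalogue size, round counts, and click probability below are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

n_items = 100          # catalogue size (hypothetical)
n_rounds = 50          # feedback-loop iterations
users_per_round = 1000

# Every item starts with one click so no item begins with zero exposure.
clicks = np.ones(n_items)

for _ in range(n_rounds):
    # The recommender exposes items in proportion to their past clicks,
    # i.e. a simple popularity-based ranking.
    exposure = clicks / clicks.sum()
    shown = rng.choice(n_items, size=users_per_round, p=exposure)
    # Every item is equally clickable (probability 0.5), so any divide
    # that emerges is produced by the feedback loop, not by item quality.
    clicked = shown[rng.random(users_per_round) < 0.5]
    np.add.at(clicks, clicked, 1)

top10_share = np.sort(clicks)[-10:].sum() / clicks.sum()
print(f"Top 10 of {n_items} items now receive {top10_share:.0%} of all clicks")
```

Under these assumptions the top tenth of the catalogue typically ends up with roughly a third of all clicks even though every item is equally clickable, which is the rich-get-richer dynamic behind the filter bubble.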

Do you think recommender systems reinforce or help to prevent unethical targeting or customer segmentation?

This is a great debate. However, most of the research I have read offers examples of recommender systems reinforcing, rather than preventing, unethical targeting or customer segmentation. For example, in her book Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Noble shows how online search results are far from neutral; instead, they replicate and reinforce the racist and sexist beliefs that reverberate through the societies in which search engines operate. Noble recounts how a friend suggested that she google ‘black girls’. She did, and was horrified to discover that all of the top results led to porn sites. By 2011 she assumed that her own engagement with Black feminist texts, videos and books online would have changed the kinds of results she received from Google, but it had not: the top-ranked information Google provided about ‘black girls’ presented them as commodities for consumption en route to sexual gratification. When problems such as those Noble experienced are pointed out to Google representatives, they typically respond that it is the computer’s fault or an anomaly they cannot control. This reinforces the misconception that algorithms are neutral. In fact, algorithms are created by people, and we all carry biases and prejudices that we write into the algorithms we create (Book Review: Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble, 2019).
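One constructive counterpoint comes from the resources above: Hardt, Price and Srebro (2016) propose ‘equality of opportunity’ as a criterion for detecting exactly this kind of disparate treatment. A minimal sketch of that check on synthetic data (the group labels, base rates, and approval probabilities below are invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: a model decides which customers get a favourable offer.
# `group` is a protected attribute; `qualified` marks customers who truly
# merit the offer; both groups are qualified at the same base rate.
n = 10_000
group = rng.integers(0, 2, size=n)
qualified = rng.random(n) < 0.5

# A deliberately biased model: qualified members of group 1 are approved
# far less often than qualified members of group 0.
approve_prob = np.where(qualified, np.where(group == 0, 0.9, 0.6), 0.1)
approved = rng.random(n) < approve_prob

# Equality of opportunity asks for equal true positive rates across groups
# among the qualified; a large gap flags discriminatory targeting.
for g in (0, 1):
    mask = (group == g) & qualified
    tpr = approved[mask].mean()
    print(f"group {g}: true positive rate = {tpr:.2f}")
```

With these invented numbers the two true positive rates differ by roughly 0.3, and it is this kind of gap that the paper’s post-processing method is designed to remove, which suggests such audits could help recommender systems prevent, rather than reinforce, unethical segmentation.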

Reference

Book Review: Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble. (2019, June 7). LSE Review of Books. https://blogs.lse.ac.uk/lsereviewofbooks/2019/06/07/book-review-algorithms-of-oppression-how-search-engines-reinforce-racism-by-safiya-umoja-noble/