As more systems and sectors are driven by predictive analytics, there is increasing awareness of the possibility and pitfalls of algorithmic discrimination. In what ways do you think Recommender Systems reinforce human bias? Reflecting on the techniques we have covered, do you think recommender systems reinforce or help to prevent unethical targeting or customer segmentation? Please provide one or more examples to support your arguments.
Evan Estola brought up a number of examples in which recommender systems were biased. For example:
- Men were shown ads for higher-paying jobs
- Individuals with African American-sounding names were shown more ads relating to criminal records
When women are not given access to the same ads as men, it perpetuates a pattern in our society in which women and men with the same education do not earn comparable amounts of money. It also makes it harder for women to climb the professional ladder.
When Black individuals are shown more negative ads, it can cause psychological harm, and the presence of those ads implies that more positive, beneficial ads exist that they are not being shown. This limits Black people’s access to products, jobs, and opportunities.
In these cases, the recommender systems reinforce unethical targeting. Just as there is a danger in limiting access through ads, there is also a danger in only presenting individuals with the types of news stories they have previously read. We are becoming segmented as a society. Although we have so much access to information, the information we actually receive seems to be narrowing as systems try to target us ever more precisely as individuals.
However, I believe it is possible for recommender systems to open up access. It would take careful programming, but if race, gender, zip code, and similar attributes were removed from recommender systems, search results could open up opportunities for people. In addition, I would like to connect this to Georgia’s (I think) earlier presentation about serendipity. We should be shown news stories that differ from what we have previously read, and we should be given access to ads that are a little different from what the nearest neighbor would predict (a rough sketch of this idea appears below). This comes at a price, since people may not feel the recommendations are as accurate, but there could be a large societal benefit as we gain access to more information.
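To make the two ideas concrete, here is a minimal sketch, assuming a user profile stored as a feature dictionary and a simple nearest-neighbor style recommender. The attribute names, data, and `serendipity` parameter are hypothetical illustrations, not from Estola's talk: one function drops protected attributes before similarity is computed, and the other reserves a few recommendation slots for items outside the user's usual neighborhood.

```python
# Minimal sketch (hypothetical names and data): (1) strip protected attributes
# before computing similarity, and (2) mix a few "serendipitous" items into an
# otherwise nearest-neighbor recommendation list.

import random
import numpy as np

PROTECTED = {"race", "gender", "zip_code"}  # hypothetical attribute names


def strip_protected(user_profile: dict) -> dict:
    """Remove protected attributes so they cannot influence recommendations."""
    return {k: v for k, v in user_profile.items() if k not in PROTECTED}


def recommend(user_vec, item_vecs, k=5, serendipity=0.2, rng=None):
    """Return k item indices: mostly nearest neighbors, plus a few far-away items."""
    rng = rng or random.Random(0)
    # Cosine similarity between the user vector and every item vector.
    sims = item_vecs @ user_vec / (
        np.linalg.norm(item_vecs, axis=1) * np.linalg.norm(user_vec) + 1e-9
    )
    ranked = list(np.argsort(-sims))             # best match first
    n_serendip = max(1, int(k * serendipity))    # slots reserved for diversity
    picks = ranked[: k - n_serendip]             # usual nearest-neighbor picks
    picks += rng.sample(ranked[k:], n_serendip)  # items the user would not normally see
    return picks


if __name__ == "__main__":
    profile = {"age": 34, "gender": "F", "zip_code": "02139", "clicks_sports": 3}
    print(strip_protected(profile))  # {'age': 34, 'clicks_sports': 3}

    items = np.random.default_rng(1).random((20, 4))  # toy item feature vectors
    user = np.array([0.9, 0.1, 0.4, 0.2])             # toy user preference vector
    print(recommend(user, items, k=5))
```

The trade-off mentioned above shows up directly in the `serendipity` parameter: the larger it is, the less "accurate" the list feels to the user, but the more the system pushes content from outside their usual bubble.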