Research Assignment 4 - How to counter the radicalizing effects of recommender systems and prevent algorithmic discrimination.

The article by Zeynep Tufekci in the New York Times argues that YouTube has an immense radicalizing effect on society today. This becomes apparent when scrolling through the videos the popular video streaming website recommends. The goal of its algorithm is to keep users engaged for as long as possible to maximize traffic revenue, which sometimes leads users to watch increasingly biased or radical videos. Tufekci describes creating an account and watching left-wing political videos about Hillary Clinton and Bernie Sanders so the algorithm could calibrate to that user's preferences. Eventually YouTube began steering the account toward conspiracy videos and radical political content. The same pattern emerged even with less extreme topics, such as vegetarian videos leading to recommendations for veganism and the like.

One suggestion for improving such algorithms is to teach them to discard biases or recommendation patterns that promote extreme content. This can be built in via weights that reduce a piece of content's promotability, or via filters that exclude that type of content outright, as sketched below. Another safeguard is to keep recommender systems from learning from heavily biased datasets. For example, if mostly men watch car videos, a recommender system should not conclude that a woman would not want to watch a similar video simply because of the demographics of the dataset's population. Companies need to prioritize ethics and safety over profitability and reduce harmful biases as best they can.
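
As a concrete illustration, the sketch below shows how a penalty weight and a hard filter might be layered on top of an engagement-based ranking. This is a minimal sketch, not any platform's actual system: the extremeness score (assumed to come from some separate classifier), the Video fields, and the threshold values are all hypothetical assumptions made for illustration.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    engagement_score: float  # predicted likelihood the user keeps watching (0.0 to 1.0)
    extremeness: float       # 0.0 (benign) to 1.0 (extreme); assumed to come from a separate classifier

PENALTY_WEIGHT = 0.8    # tunable: how strongly extremeness reduces promotability
FILTER_THRESHOLD = 0.9  # videos at or above this score are excluded entirely

def rerank(candidates: list[Video]) -> list[Video]:
    """Demote extreme content with a penalty weight and filter out the worst of it."""
    # Hard filter: drop anything the classifier flags as clearly extreme.
    allowed = [v for v in candidates if v.extremeness < FILTER_THRESHOLD]
    # Soft penalty: the adjusted score trades engagement off against extremeness,
    # so borderline content can still appear, just ranked lower.
    return sorted(
        allowed,
        key=lambda v: v.engagement_score * (1 - PENALTY_WEIGHT * v.extremeness),
        reverse=True,
    )

if __name__ == "__main__":
    candidates = [
        Video("Vegetarian recipes", engagement_score=0.7, extremeness=0.1),
        Video("Fringe conspiracy theory", engagement_score=0.9, extremeness=0.95),
        Video("Borderline political rant", engagement_score=0.8, extremeness=0.6),
    ]
    for v in rerank(candidates):
        print(v.title)

The design choice here mirrors the two mechanisms above: the weight reduces promotability gradually, while the filter eliminates the most extreme content outright, even when its engagement score is the highest in the pool. Note also that the ranking uses no demographic features at all, which is one simple way to avoid the kind of dataset-driven discrimination described in the example.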