Consider how to counter the radicalizing effects of recommender systems or ways to prevent algorithmic discrimination.
I read the New York Times opinion article "YouTube, the Great Radicalizer" by Zeynep Tufekci, available at https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html
I had assumed that YouTube recommends videos similar to the ones a user has already watched, thereby limiting the diversity of videos and opinions that the user is exposed to. However, it had not occurred to me that YouTube is intentionally suggesting more extreme videos on related topics to keep us hooked and watching longer. It makes sense that people would be drawn to more intense videos.
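To make that concern concrete, here is a minimal sketch of how an engagement-driven ranker could drift in that direction. This is not YouTube's actual algorithm; the video titles, "relevance" and "intensity" scores, and the watch-time model are all made up for illustration. The only point is that if predicted watch time rises with how intense a video is, then ranking by relevance weighted by predicted watch time will tend to push the more extreme options to the top.

```python
# Toy illustration only: hypothetical candidate videos with made-up scores.
# "relevance" = similarity to what was just watched,
# "intensity" = how extreme the video is.

candidates = [
    {"title": "Calm puppy nap compilation",      "relevance": 0.9, "intensity": 0.1},
    {"title": "Dogs vs. vacuum cleaners",        "relevance": 0.8, "intensity": 0.5},
    {"title": "Animals Can Be Jerks Compilation", "relevance": 0.6, "intensity": 0.9},
]

def predicted_watch_time(video):
    # Hypothetical model: more intense videos keep viewers watching longer.
    return 1.0 + 2.0 * video["intensity"]

def engagement_score(video):
    # Rank by relevance weighted by how long the viewer is expected to stay.
    return video["relevance"] * predicted_watch_time(video)

ranked = sorted(candidates, key=engagement_score, reverse=True)
for v in ranked:
    print(f"{engagement_score(v):.2f}  {v['title']}")
```

With these made-up numbers, the least relevant but most intense video ends up ranked first, which is the kind of "upping the ante" the article describes.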
I decided to test this out by watching cute dog videos on YouTube. The videos that played automatically after the original video I clicked on were more cute dog videos, roughly the same as the one I first watched and matching my search. However, when I did a second, different search, new recommendations appeared in the sidebar: "Animals Can Be Jerks Compilation" and "Ninja Cats vs. Dogs, Who Wins." So perhaps YouTube is trying to keep me engaged by upping the ante.
If it is in fact the case that YouTube is intentionally suggesting more extreme videos, then the simple solution would be to change the recommender. The greater issue is that YouTube wants users to stay on the site longer, and a user may have little incentive to do so if the next recommended video is too similar to the one just watched. If YouTube continues to recommend more extreme videos to users, then I think the best way to manage this is to educate the public, so that people are aware of the pattern and can make more responsible decisions about what they watch and take seriously.
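As a rough sketch of what "changing the recommender" could look like, still under my own made-up attributes from the earlier example rather than anything YouTube actually does, one option is to keep the relevance signal but penalize recommendations that are much more intense than the video the user just finished watching.

```python
# Sketch of one possible fix, using the same hypothetical attributes as before:
# keep relevance, but penalize upward jumps in intensity relative to the
# video the user just watched.

candidates = [
    {"title": "Calm puppy nap compilation",      "relevance": 0.9, "intensity": 0.1},
    {"title": "Dogs vs. vacuum cleaners",        "relevance": 0.8, "intensity": 0.5},
    {"title": "Animals Can Be Jerks Compilation", "relevance": 0.6, "intensity": 0.9},
]

def tempered_score(video, last_intensity, penalty_weight=2.0):
    # Penalize only escalation; sideways or calmer moves carry no penalty.
    escalation = max(0.0, video["intensity"] - last_intensity)
    return video["relevance"] - penalty_weight * escalation

last_watched_intensity = 0.2  # the cute dog video the session started with
ranked = sorted(candidates,
                key=lambda v: tempered_score(v, last_watched_intensity),
                reverse=True)
for v in ranked:
    print(f"{tempered_score(v, last_watched_intensity):.2f}  {v['title']}")
```

Under this scoring the calmer, more relevant videos come back to the top, but the trade-off noted above still holds: the tempered ranking keeps recommendations close to what was just watched, which is exactly the sameness that may shorten sessions, so education about the pattern remains important either way.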