Commercial Recommender

Choose one commercial recommender.

  1. Describe how you think it works (content-based, collaborative filtering, etc.).
  2. Does the technique deliver a good experience or are the recommendations off-target?

YouTube

  1. I definitely think YouTube’s system is a hybrid (collaborative filtering plus content-based). It takes what the user has already watched and uses collaborative filtering to surface videos that viewers with similar histories watched. Then, based on which of those recommendations you select (and how long you watch each), it looks at the content of those videos and either narrows down your interests or expands what you might be into (a rough sketch of this blending follows these two answers).

  2. The recommendations aren’t really off-target, but that’s probably because they never surprise. Rarely have I gone through YouTube and been recommended something I wasn’t already looking for. If you go in for cat videos, you’re gonna get cat videos. And you’re gonna keep getting cat videos until people get tired of cats, which is never. That only holds for entertainment, though; it’s when I’m searching for more academic videos that the recommendations feel less reliably on-target. So, if I had to rate the overall experience as “good” or “bad,” I’d lean on the side of good. The suggestions are safe, and I can respect not taking risks when it comes to recommendations.
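
To make the hybrid idea in (1) concrete, here is a minimal sketch that blends a crude collaborative-filtering signal (what viewers with overlapping histories watched) with a content signal (tag overlap). All of the data, video names, tag sets, and the mixing weight are invented for illustration; this is not YouTube’s actual pipeline.

```python
# Minimal hybrid-recommender sketch: a collaborative-filtering signal plus a
# content (tag overlap) signal. Everything here is made-up toy data.

# Watch time in minutes: user -> {video: minutes watched}
watch_time = {
    "alice": {"cat_fails": 30, "cat_naps": 25, "linear_algebra": 5},
    "bob":   {"cat_fails": 40, "cooking_101": 15},
    "carol": {"cat_naps": 20, "linear_algebra": 45, "calc_review": 50},
}

# Content metadata: video -> set of tags
tags = {
    "cat_fails": {"cats", "comedy"},
    "cat_naps": {"cats", "relaxing"},
    "cooking_101": {"food", "howto"},
    "linear_algebra": {"math", "lecture"},
    "calc_review": {"math", "lecture"},
}

def cf_score(user, video):
    """Collaborative filtering: average watch time of `video` among users
    whose viewing history overlaps with this user's."""
    neighbors = [u for u in watch_time
                 if u != user and set(watch_time[u]) & set(watch_time[user])]
    times = [watch_time[u].get(video, 0) for u in neighbors]
    return sum(times) / len(times) if times else 0.0

def content_score(user, video):
    """Content-based: best Jaccard tag overlap with anything the user watched."""
    overlaps = [len(tags[video] & tags[v]) / len(tags[video] | tags[v])
                for v in watch_time[user]]
    return max(overlaps, default=0.0)

def hybrid_score(user, video, alpha=0.5):
    """Blend the two signals; /60 and alpha are arbitrary rescalings chosen
    only to put both parts on a comparable 0-1 range for this sketch."""
    return (alpha * min(cf_score(user, video) / 60, 1.0)
            + (1 - alpha) * content_score(user, video))

def recommend(user, k=3):
    """Rank the videos the user hasn't seen by the blended score."""
    seen = set(watch_time[user])
    candidates = [v for v in tags if v not in seen]
    return sorted(candidates, key=lambda v: hybrid_score(user, v), reverse=True)[:k]

print(recommend("alice"))  # -> ['calc_review', 'cooking_101']
```

With toy data like this, alice gets calc_review first because a similar viewer watched it heavily and its tags match what she already watches, which is roughly the narrowing/expanding behavior described above.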

Non-personalized Recommender

Also choose one of the three non-personalized recommenders (below) and describe the technique and which of the three you prefer to use:

  1. Metacritic: http://www.metacritic.com/about-metascores
  2. Rotten Tomatoes: https://www.rottentomatoes.com/about/
  3. IMDb Rating: http://imdb.to/2ljPH90

Rotten Tomatoes

I like Rotten Tomatoes. I don’t mind that it can be vague or that it draws in the crowd with a green splat more than with numbers. I like how black and white it is: Rotten or Fresh. Or Certified Fresh (so maybe black and grey and white). That said, I can acknowledge that this probably does not make it the most reliable scoring method. Its grade is simply the percentage of critics who judge the film to be more positive than negative, and the critics themselves have to meet certain criteria to be counted as “critics”. There’s also a vetted subset of “top critics” whose scores are calculated separately. As more reviews come in, RT keeps recomputing that percentage of positive reviews to assign the overall “freshness”. The thing to keep in mind is that RT doesn’t score the movie on how much people liked it, but on the consensus about whether it’s “fresh” at all. It’s like a show of hands: who thinks this movie is worth watching?
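
As a rough sketch of that “show of hands”: the score is just the share of reviews counted as positive, with labels applied at thresholds. The 60% Fresh cutoff and the 75%-plus-enough-reviews idea for Certified Fresh reflect my reading of RT’s published criteria; the review data and the minimum counts below are invented.

```python
# Tomatometer-style score: the share of critic reviews counted as positive
# (no averaging of star ratings). Review data and count thresholds are made up.

reviews = [True, True, False, True, True, False, True, True]  # True = positive review

def tomatometer(positive_flags):
    """Percent of reviews judged positive."""
    return 100 * sum(positive_flags) / len(positive_flags)

def label(score, review_count, top_critic_count):
    """Assign Fresh/Rotten, with an approximate Certified Fresh rule."""
    if score >= 75 and review_count >= 80 and top_critic_count >= 5:
        return "Certified Fresh"  # the count requirements here are approximate
    return "Fresh" if score >= 60 else "Rotten"

score = tomatometer(reviews)
print(f"{score:.0f}% -> {label(score, len(reviews), top_critic_count=2)}")  # 75% -> Fresh
```

Either way, the output is a binary call rolled up as a percentage rather than an average rating, which is exactly the show-of-hands quality I like about it.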

Attacks on Recommender Systems

Read the article https://www.washingtonpost.com/news/morning-mix/wp/2017/04/19/wisdom-of-the-crowd-imdb-users-gang-up-on-the-promise-before-it-even-opens/?utm_term=.329a75ece088

  1. Consider how to handle attacks on recommender systems.
  2. Can you think of a similar example where a collective effort to alter the workings of content recommendations has been successful?
  3. How would you design a system to prevent this kind of abuse?

  1. I don’t think you can fully prevent this kind of abuse, but you can at least make it visible: a movie tanking before it even opens should make people (and the system) curious about why. On a more general level, one way to handle this is to know the people behind the data, so maybe focus on who has access to the recommender system and how accounts are secured.

  2. I can’t recall one specific large-scale example, but this kind of sabotage can easily take place at the local level. A business might try to increase its traffic by inflating its reviews or ratings. Here is an example of a method for generating fake reviews that could sway people through Yelp’s system: https://www.theverge.com/2017/8/31/16232180/ai-fake-reviews-yelp-amazon.

  3. Have a section for trusted critics and their scores. Maybe even have a small section with the top 10 words from a sentiment analysis of the reviews; together with the critics’ ratings, that may give a better idea of where the split in opinion stems from. Or have a small section showing ratings from other popular sites. I’m not sure how determined a large group of individuals would have to be to sabotage not one but all of the rating systems, but that’s a whole lot of effort to put into taking down one movie. Maybe also a weighted recommender system: if you’ve been with the site long enough to be trusted, your input holds more weight than someone who just logged on (a rough sketch of that weighting follows below).
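
A minimal sketch of that trust weighting, assuming weight grows with account age; the ratings, dates, two-year saturation point, and weighting curve are all invented. A burst of day-old accounts barely moves the weighted average, while it wrecks the plain average.

```python
# Trust-weighted average: long-standing accounts count more than brand-new
# ones, which blunts a sudden wave of throwaway accounts. Toy data only.
from datetime import date

def trust_weight(account_created, today=date(2017, 4, 19)):
    """Weight grows with account age and saturates at 1.0 after ~2 years."""
    days = (today - account_created).days
    return min(days / 730, 1.0)

def weighted_rating(ratings):
    """ratings: list of (stars, account_created_date) pairs."""
    total = sum(trust_weight(created) * stars for stars, created in ratings)
    weight = sum(trust_weight(created) for _, created in ratings)
    return total / weight if weight else 0.0

# Three established raters vs. a burst of fifty day-old accounts voting 1/10.
established = [(8, date(2014, 6, 1)), (9, date(2015, 3, 12)), (7, date(2013, 1, 5))]
brigade = [(1, date(2017, 4, 18))] * 50

print(round(weighted_rating(established + brigade), 1))         # ~7.8, barely moves
print(round(sum(s for s, _ in established + brigade) / 53, 1))  # 1.4, the plain average tanks
```

The point isn’t the specific curve; it’s that weighting tied to something attackers can’t fake cheaply (account history, sustained activity) raises the cost of the kind of pile-on the article describes.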