Showing posts from November, 2018

Nov 14 - 28 Journal

Recommender Systems: Since my last journal, both before and after Thanksgiving break, I have been looking into dimensionality reduction within the larger concept map of recommender systems. I found several helpful videos, one of which is linked below. I learned how dimensionality reduction is used to simplify and streamline the processing of the data: using principal component analysis, the high-dimensional data points are projected onto a lower-dimensional subspace that preserves as much of their structure as possible. I look forward to sharing what I have learned with the group, and I feel that dimensionality reduction will play an important part in keeping the matrix computations manageable.

Thanksgiving Break: time to relax. With the start of break, I spent the first weekend finishing all my schoolwork. My family and I went to pick up my sister at the airport and got Japanese soba in Torrance on the way back. I tried out a brunch cafe near my home that I...
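To make the PCA idea concrete, here is a minimal sketch (not from the video, just my own toy example with made-up numbers) of projecting 4-dimensional points onto their top two principal components using NumPy's SVD:

```python
import numpy as np

# Toy "ratings-like" data: 6 points in 4 dimensions,
# roughly forming two clusters.
X = np.array([
    [5.0, 4.0, 1.0, 1.0],
    [4.0, 5.0, 1.0, 2.0],
    [5.0, 5.0, 2.0, 1.0],
    [1.0, 1.0, 5.0, 4.0],
    [2.0, 1.0, 4.0, 5.0],
    [1.0, 2.0, 5.0, 5.0],
])

# Center the data, then take the SVD; the rows of Vt are the
# principal components, ordered by how much variance they capture.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top-2 components: each 4-D point becomes a 2-D point.
X2 = Xc @ Vt[:2].T
print(X2.shape)  # (6, 2)
```

The same projection step is what lets a huge user-by-movie matrix be summarized by a much smaller number of "taste" dimensions.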

Nov 1 - 14 Journal

[Image: a user-by-movie ratings matrix with some entries left blank]
On our visit last Thursday to Caltech, the group and I met with graduate student Ethan and discussed recommender systems. Recommender systems are all around us, from YouTube to Spotify, but we focused on the Netflix recommender system because Netflix has actually published a dataset. Ethan told us that this dataset is out there because Netflix challenged people to develop an even better recommender system than its current one using that data, and there was a prize of $1 million, which added some extra incentive. The dataset used to build the recommender system is similar to the one pictured above: a matrix of users and their ratings of the movies they have watched. The empty spaces are the movies that users have not watched yet, and we are trying to determine which of those blank-space movies should be recommended to each user. After discussing, we explored three ways to break down the matrix and make recommendations. The first is similarity c...
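Since the excerpt cuts off at the similarity approach, here is a small sketch of the kind of matrix we discussed, with made-up ratings: blanks are stored as NaN, and similarity between two users is computed only over the movies both have rated (this is my own illustration, not the exact method Ethan showed us):

```python
import numpy as np

# Toy user x movie ratings matrix; np.nan marks movies not yet rated.
R = np.array([
    [5.0, 4.0, np.nan, 1.0],
    [4.0, np.nan, 1.0, 1.0],
    [1.0, 1.0, 5.0, np.nan],
])

def cosine_sim(u, v):
    """Cosine similarity over the movies both users have rated."""
    mask = ~np.isnan(u) & ~np.isnan(v)   # co-rated movies only
    if not mask.any():
        return 0.0
    a, b = u[mask], v[mask]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Users 0 and 1 gave similar ratings; user 2 rated the opposite way,
# so their similarity to user 0 comes out lower.
print(cosine_sim(R[0], R[1]))
print(cosine_sim(R[0], R[2]))
```

A recommender built this way would fill each user's blank entries by leaning on the ratings of their most similar users.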