
Oct 17 - Oct 31 Journal

Concept Maps: Finishing off my Concept Map on the k-means algorithm, I explained more about k-means to the rest of my group while we all shared what we had learned. I created my Concept Map video on the k-means algorithm (pasted below) while the group and I discussed how centrality works in data graphs (pictured below). While we were originally planning to focus on centrality for our next concept map, our visit to Caltech changed the course of our plans, as centrality was not emphasized there. Will had already figured out parts of the centrality concepts, though, so he taught us how betweenness centrality, closeness centrality, and degree centrality are used to analyze nodes in data graphs. The key idea is that the higher a node's centrality value, the more "central" that node is. If you imagine the data graph as a social network, the node with the highest centrality is the most popular person. After visiting Caltech on Thursday...
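The journal doesn't say what tooling we used for graphs, so purely as an illustration, here is a minimal sketch with the NetworkX library: a toy "social network" where one node bridges two friend groups, scored with the three centrality measures Will described.

```python
import networkx as nx

# A toy "social network": Bob bridges two separate friend groups.
G = nx.Graph([
    ("Alice", "Bob"), ("Bob", "Carol"), ("Carol", "Alice"),  # group 1
    ("Bob", "Dave"), ("Dave", "Erin"), ("Erin", "Bob"),      # group 2
])

# Each measure returns a dict of node -> score; the highest-scoring
# node is the most "central" (most popular) person in the network.
print(nx.degree_centrality(G))       # fraction of other nodes each node touches
print(nx.closeness_centrality(G))    # how near a node is to everyone else
print(nx.betweenness_centrality(G))  # how often a node sits on shortest paths
```

In this example Bob scores highest on all three measures, since he is connected to everyone and every path between the two groups runs through him.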

Oct 5 - Oct 17 Journal

Clustering: These two weeks, I researched the k-means clustering algorithm after the group divided the different clustering algorithms among ourselves. I already knew the basics of k-means: data points are assigned to clusters based on the nearest centroid, and the centroids are shifted repeatedly until all the data points stay in the same clusters. One of the key issues with the k-means algorithm, however, is finding exactly how many clusters the algorithm should create, because the user must input the "k" value, the number of clusters. I found that the "Elbow Method" (pictured below) was the most common way to determine the "k" value. The Elbow Method computes the sum of squared distances between each cluster's data points and its centroid for a range of "k" values, then picks the "k" where adding more clusters stops shrinking that sum significantly (the bend, or "elbow," in the plot). I also learned how the k-means algorithm can be applied: this clustering can be used to group customer purchases, personality test respondents, or even typical YouTube recommendations. During t...
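As a hedged sketch of the Elbow Method (assuming scikit-learn's KMeans, which the post doesn't actually name): fit the algorithm for a range of "k" values and watch where the within-cluster sum of squares stops dropping sharply.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy data: three well-separated blobs, so the "elbow" should appear near k = 3.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(loc=c, scale=0.5, size=(50, 2)) for c in (0, 5, 10)])

for k in range(1, 7):
    model = KMeans(n_clusters=k, n_init=10, random_state=0).fit(data)
    # inertia_ is the within-cluster sum of squared distances to the centroids;
    # it always decreases as k grows, but drops sharply only up to the true k.
    print(k, round(model.inertia_, 1))
```

Plotting inertia against k and picking the bend in the curve is exactly the "elbow" the method is named for.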

Sep 28 - Oct 5 Journal

Decision Trees: This week, I continued looking into decision trees, making conditionals (if/elif/else statements) the topic of my concept map and also working a bit on the Titanic survivability tutorial. Before the new deadline for the concept map was set, I decided to complete my concept map entry early, due to other tests later in the week, and compared conditionals to ordering at In-N-Out. I explained how a decision tree uses conditionals to create its branches and categorize the data. I also learned about the entropy (randomness) of a decision tree from Will today with the group: entropy measures how mixed the classes are at a node, so the tree prefers splits that reduce it. I thought the discussion with Will and the group was very informative but had reservations about the extent to which each person should research individually. My concern was how much individual research is too much, so that our skill sets do not become so different that it impedes discussion. A possible area of further research is random forests, an extension of decision trees. Learning Decision Trees Ent...
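To make the conditionals-as-branches idea concrete, here is a minimal sketch: a hand-written if/elif/else "tree" over a made-up passenger record (the fields and thresholds are illustrative, not taken from the Titanic tutorial), plus a small helper computing Shannon entropy in bits.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits.
    0.0 means a perfectly pure group; 1.0 means a 50/50 mix for two classes."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def predict_survival(passenger):
    """A hand-written 'decision tree': each conditional is one branch.
    The fields and thresholds here are made up for illustration."""
    if passenger["sex"] == "female":
        return "survived"
    elif passenger["age"] < 10:
        return "survived"
    else:
        return "died"

print(entropy(["survived", "died", "survived", "died"]))    # 1.0, fully mixed
print(entropy(["survived", "survived", "survived", "died"]))  # ~0.81, purer
print(predict_survival({"sex": "male", "age": 8}))            # survived
```

A tree-learning algorithm essentially searches for the conditional at each node that lowers entropy the most, which is the "randomness" idea Will walked us through.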