The best Hans Rosling talks you’ve ever seen

algovis/README.md at master · enjalot/algovis

A Visual Introduction to Machine Learning

Finding better boundaries
Let's revisit the 73 m elevation boundary proposed previously to see how we can improve upon our intuition. Clearly, this requires a different perspective. By transforming our visualization into a histogram, we can better see how frequently homes appear at each elevation. While the highest home in New York is 73 m, the majority of them have far lower elevations.

Your first fork
A decision tree uses if-then statements to define patterns in data. For example: if a home's elevation is above some number, then the home is probably in San Francisco. In machine learning, these statements are called forks, and they split the data into two branches based on some value. That value between the branches is called a split point.

Tradeoffs
Picking a split point has tradeoffs. Look at that large slice of green in the left pie chart: those are all the San Francisco homes that are misclassified.

The best split

Recursion
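The fork/split-point idea from the excerpt can be sketched in a few lines of Python. This is not code from the article itself: the handful of (city, elevation) pairs below is invented for illustration, and the quality measure is the simplest one suggested by the pie-chart description, namely counting how many homes each candidate split misclassifies.

```python
# Minimal sketch of choosing a single "fork" (split point) on elevation.
# Data is hypothetical; the real article works with a larger home dataset.

# (city, elevation in meters)
homes = [
    ("SF", 240), ("SF", 150), ("SF", 80), ("SF", 12),
    ("NY", 4), ("NY", 10), ("NY", 33), ("NY", 73),
]

def misclassified(split):
    """Count errors if we predict SF above the split and NY at or below it."""
    errors = 0
    for city, elevation in homes:
        predicted = "SF" if elevation > split else "NY"
        if predicted != city:
            errors += 1
    return errors

# Candidate split points: midpoints between consecutive observed elevations.
elevations = sorted(e for _, e in homes)
candidates = [(a + b) / 2 for a, b in zip(elevations, elevations[1:])]

best_split = min(candidates, key=misclassified)
print(f"best split point: {best_split} m "
      f"({misclassified(best_split)} homes misclassified)")
```

Repeating the same search separately inside each of the two resulting branches is the step the excerpt's final heading, Recursion, refers to: that is how a single fork grows into a full decision tree.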

Datavisualization.ch

DataViz (Mediaeater MMX)

Related: