Gradient Descent

I started taking an online machine learning class recently because I have been working on a data-driven project at Ziba Design, and I thought it would be interesting to view our data set through the eyes of the machine. One of the fundamental tools in machine learning is linear regression, a topic I never thought much about but have used a lot in the past. I have been fitting lines and polynomials to data in Excel for as long as I can remember; now, as I learn about ML, I am starting to understand how that actually works. Spreadsheets use a gradient descent algorithm.

As I learned more about gradient descent, it still felt a little mysterious. All the tools used in the class (Matlab, Octave) gave me the answer but didn't show me the process over time. I wanted to see what it looks like for a machine to "learn" in real time and at a human time scale. I also wanted to compare how different initial assumptions affected the time and path taken for gradient descent to converge on an answer. So I created this tool. ( Repo ) Its purpose is to take calculations a computer needs a fraction of a second to complete and drag them out over the course of minutes or even hours, to expose the gradient descent dance. When I started the project my approach was to create something more artistic than useful, but I found the final product really helped my understanding of gradient descent.
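For anyone who wants the non-animated version of what the tool is doing, here is a minimal sketch of batch gradient descent fitting a line, written in Python. The variable names (alpha, theta0, theta1) mirror the tool's parameters, but the code and the toy data are my own illustration, not the tool's actual implementation:

```python
# Minimal batch gradient descent fitting y = theta0 + theta1 * x
# to a set of points, minimizing mean squared error.
# alpha, theta0, theta1 mirror the tool's parameter names;
# the data below is made up for illustration.

def gradient_descent(xs, ys, alpha=0.01, theta0=0.0, theta1=0.0, steps=10000):
    m = len(xs)
    for _ in range(steps):
        # Prediction error for each point under the current line
        err = [(theta0 + theta1 * x) - y for x, y in zip(xs, ys)]
        # Partial derivatives of the mean-squared-error cost
        grad0 = sum(err) / m
        grad1 = sum(e * x for e, x in zip(err, xs)) / m
        # Step downhill, scaled by the learning rate
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Points lying exactly on y = 2x + 1, so the fit should
# converge close to theta0 = 1, theta1 = 2.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
theta0, theta1 = gradient_descent(xs, ys)
print(theta0, theta1)
```

Watching those two numbers creep toward their final values, step by step, is exactly the process the tool stretches out into an animation.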

http://cloud.engineering-bear.com/apps/ml/index.html

To make the tool more useful, I added the ability to feed it query parameters that change its starting state. Out of the box it assumes a learning rate alpha of 0.01, with theta0 = 0 and theta1 = 0. Here are some other ways to run the tool:

- Alpha = 0.2, Theta0 = 10, Theta1 = -5
- Run all of the plots at once using the all=true flag
- Have the tool calculate 100 steps for every one visual step
- Make a 2 x 2 grid
- Scale the data y axis by -20
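As a sketch of what such a run might look like, here is one way to build a URL for the first option. The parameter names (alpha, theta0, theta1) are my guesses based on the descriptions above; the repo defines the actual query-string keys the tool reads:

```python
# Hypothetical example of composing a query string for the tool.
# The parameter names are assumptions, not confirmed against the repo.
from urllib.parse import urlencode

base = "http://cloud.engineering-bear.com/apps/ml/index.html"
params = {"alpha": 0.2, "theta0": 10, "theta1": -5}
url = base + "?" + urlencode(params)
print(url)
```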