Synopsis

Personally, I think the best way to learn is to build myself a small prototype based on what I have learned from books and theory. The tinkering and debugging really gave me an intuition that would have been missing had I only done a proof of concept on paper.

Here is the list of Jupyter notebooks I created, with theory explanations and the actual code side by side. I hope these will be useful for someone with a similar intention.

Table of Contents

  1. Linear Regression - the basics of the basics. This notebook illustrates the closed-form solution using matrices in Python.
  2. Logistic Regression - another bread-and-butter algorithm, mostly useful for probability-type predictions (churn rate, yes/no answers, certain classifications). Here I use gradient descent as the optimization method.
  3. Neural Network: Perceptron - the simplest type of NN.
  4. Neural Network: Multi-layer - a more elaborate type with a hidden layer in the middle. Illustrates the backpropagation method with the MNIST digit image dataset.
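
To give a flavor of item 1, here is a minimal sketch of the closed-form (normal equation) solution for linear regression, solving w = (XᵀX)⁻¹Xᵀy with NumPy. The data here is made up for illustration; the notebook itself works through the derivation in more detail.

```python
import numpy as np

# Hypothetical toy data: y = 2 + 3x, with a bias column prepended to X.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
X = np.column_stack([np.ones_like(x), x])  # design matrix [1, x]
true_w = np.array([2.0, 3.0])
y = X @ true_w  # noiseless, so the solution is exact

# Normal equation: solve (X^T X) w = X^T y.
# np.linalg.solve is numerically preferable to forming the inverse explicitly.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # recovers [2.0, 3.0]
```

With noisy data the same formula returns the least-squares fit rather than the exact coefficients.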
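
And for item 2, a compact sketch of logistic regression trained with batch gradient descent on the mean cross-entropy loss. The dataset is an invented, linearly separable toy problem, not the one used in the notebook.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy data: label is 1 whenever the feature is positive.
rng = np.random.default_rng(1)
x = rng.normal(0, 1, size=100)
X = np.column_stack([np.ones_like(x), x])  # bias column + feature
y = (x > 0).astype(float)

# Batch gradient descent on the cross-entropy loss.
w = np.zeros(2)
lr = 0.1
for _ in range(1000):
    p = sigmoid(X @ w)                  # predicted probabilities
    grad = X.T @ (p - y) / len(y)       # gradient of the mean loss
    w -= lr * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == y)
```

The gradient (p - y) form falls out of differentiating the cross-entropy loss through the sigmoid, which is what makes this pairing so convenient.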