MLFS 3: Building a Linear Regression Model from Scratch
Thus far, we have been working exclusively with classification problems. Linear regression instead models the relationship between a single continuous-valued target and some number of features. While a closed-form solution exists, the gradient-based optimization methods we have already seen remain useful.
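As a quick preview, a minimal NumPy sketch of the closed-form (normal equation) solution might look like the following; the toy data and variable names are illustrative rather than taken from the post itself.

```python
import numpy as np

# Toy data: 100 samples, 3 features, generated from a known linear model plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 3.0 + rng.normal(scale=0.1, size=100)

# Closed-form (normal equation) solution: w = (X^T X)^{-1} X^T y.
# Prepending a column of ones lets the bias be learned alongside the weights.
Xb = np.column_stack([np.ones(len(X)), X])
w = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

print("bias:", w[0])      # close to 3.0
print("weights:", w[1:])  # close to [2.0, -1.0, 0.5]
```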
MLFS 0: Building a Perceptron Model from Scratch
The first step on the path to modern deep neural networks is the perceptron model. The perceptron is a supervised method for learning a binary classifier, and its design is based on the biology of neurons in the brain.
MLFS 2: Building a Logistic Regression Model from Scratch
Logistic regression is a probabilistic model for binary classification, and it generalizes to multiclass settings via multinomial logistic regression. The model is very similar to ADALINE, but we now implement a specific activation function: the logistic (sigmoid) function.
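As a taste of that, here is a minimal sketch of the logistic (sigmoid) activation and how its output can be turned into a class label; the overflow clipping is an illustrative detail, not necessarily how the post implements it.

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) activation: squashes any real net input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-np.clip(z, -250, 250)))  # clip to avoid overflow

# The output is read as P(y = 1 | x); thresholding at 0.5 gives the predicted class.
z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))                       # approximately [0.119, 0.5, 0.881]
print((sigmoid(z) >= 0.5).astype(int))  # [0, 1, 1]
```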
MLFS 1: Building an ADALINE Model from Scratch
The ADALINE (ADAptive LInear NEuron) model, which we implement below, was developed by Bernard Widrow and his doctoral student Ted Hoff at Stanford University as a modification of the perceptron. For the first time in this series, we optimize the weights and bias using gradient descent.
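As a preview, a minimal full-batch gradient descent loop in the spirit of ADALINE might look like the sketch below; the function name, hyperparameters, and initialization are illustrative assumptions, not taken from the post itself.

```python
import numpy as np

def adaline_fit(X, y, eta=0.01, n_iter=50, seed=1):
    """Full-batch gradient descent on the sum-of-squared-errors cost.

    X: (n_samples, n_features) feature matrix; y: (n_samples,) targets in {-1, 1}.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    for _ in range(n_iter):
        net_input = X @ w + b      # linear (identity) activation
        errors = y - net_input
        w += eta * X.T @ errors    # gradient step for the weights
        b += eta * errors.sum()    # gradient step for the bias
    return w, b
```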