Supervised Learning and Regression

Regression

Regression is one of the simplest supervised learning approaches for learning relationships between input variables (features) and output variables (predictions).

Linear regression

Linear regression assumes a predictor of the form y = X theta, where X is the matrix of features (data), theta is the vector of unknowns (which features are relevant), and y is the vector of outputs (labels).

One observation: y_i = x_i . theta, the dot product of the i-th row of X with the parameters.
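
A minimal numpy sketch of this notation (the shapes and values below are made up for illustration): the prediction for all observations at once is the matrix-vector product X @ theta, and one observation is the dot product of a single row of X with theta.

import numpy

# Illustrative shapes: n = 5 observations, d = 3 features (values are made up)
rng = numpy.random.default_rng(0)
X = rng.normal(size=(5, 3))             # matrix of features (data)
theta = numpy.array([2.0, -1.0, 0.5])   # unknowns (which features are relevant)

y = X @ theta                           # vector of outputs (labels), all observations at once

# One observation: the i-th prediction is the dot product of row i with theta
i = 0
y_i = X[i] @ theta
assert numpy.isclose(y_i, y[i])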

Q: Solve for theta in y = X theta
A: theta = X^{-1} y doesn't work because X is not necessarily invertible (not square)
A: Minimize the squared error ||y - X theta||^2 instead; the least-squares solution is theta = (X^T X)^{-1} X^T y, which numpy computes for us:

import numpy
theta, residuals, rank, s = numpy.linalg.lstsq(X, y, rcond=None)
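
As a quick sanity check on synthetic data (the data and true_theta below are assumptions, not from the notes), lstsq recovers roughly the parameters used to generate the outputs and agrees with the normal-equations formula theta = (X^T X)^{-1} X^T y whenever X^T X is invertible:

import numpy

rng = numpy.random.default_rng(1)
X = rng.normal(size=(100, 3))                      # 100 observations, 3 features (made up)
true_theta = numpy.array([1.5, -2.0, 0.3])
y = X @ true_theta + 0.01 * rng.normal(size=100)   # outputs with a little noise

# Least-squares fit
theta, residuals, rank, s = numpy.linalg.lstsq(X, y, rcond=None)

# Normal-equations solution theta = (X^T X)^{-1} X^T y agrees when X^T X is invertible
theta_ne = numpy.linalg.solve(X.T @ X, X.T @ y)
assert numpy.allclose(theta, theta_ne)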

We can perform arbitrary combinations and transformations of the features and the model will still be linear in the parameters (theta), as in the polynomial example sketched below.
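
A sketch on made-up data (the cubic and the noise level are assumptions): the design matrix contains the transformed features 1, x, x^2, x^3, and least squares applies unchanged because the model is non-linear in x but still linear in theta.

import numpy

rng = numpy.random.default_rng(2)
x = rng.uniform(-2, 2, size=50)
y = 1.0 - 3.0 * x + 0.5 * x**3 + 0.1 * rng.normal(size=50)   # made-up cubic with noise

# Design matrix with transformed features: columns 1, x, x^2, x^3.
# The model y = theta_0 + theta_1 x + theta_2 x^2 + theta_3 x^3 is non-linear
# in x but still linear in theta, so ordinary least squares works as before.
X = numpy.column_stack([numpy.ones_like(x), x, x**2, x**3])
theta, residuals, rank, s = numpy.linalg.lstsq(X, y, rcond=None)
print(theta)   # estimates of [1.0, -3.0, 0.0, 0.5]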

The linear models we've seen so far do not support transformations of the parameters (they need to be linear in their parameters). There are alternative models that do support non-linear transformations of the parameters, e.g. neural networks.
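
For contrast, a minimal sketch of a model that is non-linear in its parameter (the model y = exp(a * x), the synthetic data, and the use of scipy.optimize.curve_fit are illustrative assumptions, not from the notes): such a model cannot be written as X theta, so it needs an iterative fit rather than lstsq.

import numpy
from scipy.optimize import curve_fit

def model(x, a):
    # Non-linear in the parameter a: no design matrix X gives exp(a * x) = X theta
    return numpy.exp(a * x)

rng = numpy.random.default_rng(3)
x = numpy.linspace(0, 1, 30)
y = numpy.exp(0.7 * x) + 0.01 * rng.normal(size=30)   # synthetic data with a = 0.7

(a_hat,), _ = curve_fit(model, x, y, p0=[0.1])        # iterative non-linear least squares
print(a_hat)   # close to 0.7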