Ridge regression In this post I want to write about the connection between Ridge regression and robust regression. Ridge regression (also known as Tikhonov regularization) is a form of regularization or shrinkage, where the parameters of linear regression are shrunk towards 0. There are several reasons why one might want to use methods like this. … Continue reading Regularization as robust regression
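As a quick illustration of the shrinkage effect, here is a minimal sketch (the data, the regularization strength lam, and all variable names are assumptions, not from the post) comparing the closed-form ridge and ordinary least squares solutions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=50)

lam = 1.0  # assumed regularization strength
d = X.shape[1]

# Ridge: solve (X^T X + lam I) beta = X^T y
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
# Ordinary least squares: solve X^T X beta = X^T y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# The ridge solution always has a smaller (or equal) norm than OLS
print(np.linalg.norm(beta_ridge) <= np.linalg.norm(beta_ols))
```

For any lam > 0 the ridge coefficient vector is pulled toward 0 relative to the OLS solution, which is the shrinkage described above.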
Assume we have a portfolio with a Sharpe ratio of S_r = 1. What is the probability of the portfolio losing value over a 4 year time horizon?
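Under the (assumed) simplification that annual excess returns are i.i.d. normal, the cumulative return over T years has mean mu*T and standard deviation sigma*sqrt(T), so the loss probability is Phi(-S_r * sqrt(T)). A small sketch of that calculation:

```python
from math import erf, sqrt

def loss_probability(sharpe: float, years: float) -> float:
    """P(cumulative excess return < 0) assuming i.i.d. normal returns.

    Equals the standard normal CDF evaluated at -sharpe * sqrt(years).
    """
    z = -sharpe * sqrt(years)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Sharpe ratio 1, horizon 4 years: Phi(-2) ~ 2.3%
print(loss_probability(1.0, 4.0))
```

Note this ignores compounding, fat tails, and autocorrelation; the post's question is about exactly this kind of back-of-the-envelope estimate.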
Consider the setting of least squares regression and suppose we are given a data matrix X together with a vector of labels y. The solution to the problem is given by the \hat{\beta} that solves the equation X^T X \hat{\beta} = X^T y. Using matrix inversion the solution can be easily calculated as the well known least squares estimator \hat{\beta} = (X^T X)^{-1} X^T y. However, for large … Continue reading Stepsize for gradient descent in linear regression
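For large problems one typically avoids the inversion and runs gradient descent instead. A minimal sketch (the data and the choice of stepsize 1/L are assumptions for illustration, where L is the largest eigenvalue of X^T X, i.e. the Lipschitz constant of the gradient):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5)  # noiseless labels for a clean comparison

# Gradient of 0.5 * ||X b - y||^2 is X^T (X b - y); its Lipschitz
# constant L is the largest eigenvalue of X^T X, so step = 1/L is safe.
L = np.linalg.eigvalsh(X.T @ X).max()
step = 1.0 / L

beta = np.zeros(5)
for _ in range(5000):
    beta -= step * (X.T @ (X @ beta - y))

# Compare against the closed-form least squares solution
beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.allclose(beta, beta_ls, atol=1e-6))
```

With this stepsize the iterates converge linearly to the least squares estimator whenever X^T X is positive definite, which is the point the linked post elaborates on.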