Linear regression is a solved problem: it finds the B that minimizes |y - (A + XB)|^2. The solution is well known: B = (X'X)^(-1) X' y (see Ordinary least squares). Can we compute this solution without any matrix inverse, using only matrix products and additions? My solution is somewhere on my github account...
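One classic answer, which may or may not match the author's approach, is gradient descent: the update rule uses only matrix products and additions. A minimal sketch (the function name and step-size heuristic are mine, not taken from the repository):

```python
import numpy as np

def linear_regression_no_inverse(X, y, lr=None, n_iter=5000):
    """Solve min_B |y - XB|^2 by gradient descent: no matrix inverse,
    only products and additions (assumes the intercept A is absorbed
    into X as a column of ones if needed)."""
    if lr is None:
        # A step size below 2 / lambda_max(X'X) guarantees convergence;
        # the trace bounds lambda_max from above, so 1/trace is safe
        # and avoids an eigendecomposition.
        lr = 1.0 / np.trace(X.T @ X)
    B = np.zeros(X.shape[1])
    for _ in range(n_iter):
        # Gradient step: B <- B + lr * X'(y - XB)
        B = B + lr * (X.T @ (y - X @ B))
    return B
```

For well-conditioned X this converges to the same B as the closed-form formula, though far more slowly than a direct solver.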
For curious readers, the implementation is in the function linear_regression, and a proof, written in French, appears in a chapter on piecewise linear regression: Régression linéaire par morceaux. It could be much faster with a C++ version, which I have not had time to write; as it stands, it is much slower than numpy.