Jerry W. Lewis

The normal-equations solution for least squares is
=MMULT(MINVERSE(MMULT(TRANSPOSE(xMat),xMat)),MMULT(TRANSPOSE(xMat),yVec))
(entered as an array formula), which will work for up to 52 predictors if your X matrix is orthogonal.
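
For comparison, the same (X'X)^-1 X'y calculation could be written in R (mentioned
below); this is only an illustration, and the names X and y are assumptions, not
from the thread:

    # Direct normal equations, mirroring the Excel formula above.
    # X is a numeric matrix of predictors (with a column of 1s for the
    # intercept) and y is the response vector.
    beta <- solve(crossprod(X)) %*% crossprod(X, y)   # (X'X)^-1 X'y

It has the same numerical weaknesses described next.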

This "method of calculation ... breaks down" for non-orthogonal X
matrices, in the sense that the useful information often ends up well
beyond the accuracy of IEEE double precision. A numerically better
approach is to do a singular value decomposition of the X matrix, so
that much of the numerical junk in (X'X)^-1 X'y can be canceled
analytically instead of needing far more precision than is available.
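
A minimal sketch of that SVD approach in R (again, X and y are assumed names, and
in practice one would also truncate negligibly small singular values):

    # Least squares via X = U D V', so that beta = V D^-1 U'y,
    # without ever forming X'X explicitly.
    s <- svd(X)
    beta <- s$v %*% (crossprod(s$u, y) / s$d)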

My vague recollection is that MATLAB also forms the normal equations,
and hence will have the same numerical instability as the Excel formula
above. R (which Harlan suggested) is free
http://www.r-project.org/
and uses a more stable algorithm akin to the approach that I outlined above.
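
For example, a regression with all 30 predictors could be run in R along these
lines (the data frame and file name here are placeholders, not from the thread):

    # Hypothetical data file with a response column y and 30 predictor columns.
    mydata <- read.csv("mydata.csv")
    fit <- lm(y ~ ., data = mydata)   # lm() uses a numerically stable QR decomposition
    summary(fit)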

Jerry

Ken wrote:

Thanks for the suggestions. I see from an old thread that I missed (LINEST
maximum number of predictor variables 2/15/2005 3:35 PM PST ) that the 16
limit in XL is more than a hard-coded limit; the method of calculation
apparently breaks down well before the limit of 16 is reached. Since I have
MATLAB, I will just export my data and compute the regression there. Thanks
to all for the help.

"Bernard Liengme" wrote:


You may need to buy SPSS!

--
Bernard V Liengme
www.stfx.ca/people/bliengme
remove caps from email

"Ken" wrote in message
...

The regression tool says the limit is 16 x's. I have 30.