  #1
Posted to microsoft.public.excel.misc
newaglish
 
Prove multicollinearity by regressing one X on ALL the others?

I have a multiple regression with 4 independent variables. It has high
predictive value (R2 = 0.82 and the P-value for F = 0.0004). However, two of
the variables (X1 and X2) have high P-values for t (0.86 and 0.3,
respectively).

I suspect multicollinearity. However, the pairwise correlations among the X
variables are inconclusive. The highest correlation is r = 0.64. Is that high
enough to prove multicollinearity?

Regardless, I was wondering if it is OK to do a multiple correlation
analysis of X1 on the other X variables to prove intercorrelation and
multicollinearity? When regressing X1 on the other variables, r = 0.86; R2 =
0.74. Does that prove multicollinearity?
  #2
Posted to microsoft.public.excel.misc
Jerry W. Lewis
 
Prove multicollinearity by regressing one X on ALL the others?

Multicollinearity is used with at least two very different meanings in the
literature.

1. Predictor variables that very nearly lie in a reduced dimensional
subspace, so that it is difficult or impossible to numerically solve for
unique least squares estimates. In this case the
MDETERM(MMULT(TRANSPOSE(xmatrix),xmatrix)) is nearly zero. I see no evidence
of this in the information that you provided.
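For readers without Excel handy, here is a pure-Python sketch of the same check Jerry describes with MDETERM(MMULT(TRANSPOSE(xmatrix),xmatrix)). The predictor matrix below is made up for illustration (the third column is nearly the sum of the first two), so the near-zero determinant is expected:

```python
# Pure-Python analogue of Excel's MDETERM(MMULT(TRANSPOSE(xmatrix), xmatrix)).
# A determinant of X'X that is tiny relative to the size of its entries
# suggests the predictors nearly lie in a lower-dimensional subspace.

def matmul(a, b):
    # (rows of a) x (cols of b) matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def det(m):
    # Determinant via Gaussian elimination with partial pivoting
    m = [row[:] for row in m]
    n = len(m)
    d = 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        if abs(m[p][i]) < 1e-12:
            return 0.0
        if p != i:
            m[i], m[p] = m[p], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

# Hypothetical predictors: the third column is almost column1 + column2
X = [[1.0, 2.0, 3.0],
     [2.0, 3.0, 5.0],
     [3.0, 5.0, 8.0],
     [4.0, 4.0, 8.1]]
xtx = matmul(transpose(X), X)
print(det(xtx))  # ~0.03, tiny compared with entries of X'X in the hundreds
```

In Excel the whole check is a single array formula; the sketch just makes the linear algebra explicit.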

2. Predictor variables that are sufficiently "correlated" that it is
difficult to separate their unique contributions. This may in fact be
happening to you. For more information, you might find some of the following
articles to be instructive:
Sharpe & Roberts (1997) American Statistician 51:46-48
Schey (1993) American Statistician 47:26-30
Hamilton (1988) American Statistician 42:89-90
Lewis & Escobar (1986) The Statistician 35:17-26
Lewis, Escobar, & Geaghan (1985) J. Statistical Computation & Simulation
22:51-66
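One common way to quantify the second meaning is the variance inflation factor, which is built from exactly the regression the original poster ran (X1 on the other predictors). The ~10 cutoff below is a widely used rule of thumb, not a theorem:

```python
# Variance inflation factor from the R-squared of regressing one predictor
# on all the others: VIF = 1 / (1 - R^2). VIF values above roughly 10 are a
# conventional warning sign of problematic collinearity.

def vif(r_squared):
    return 1.0 / (1.0 - r_squared)

# The poster's figure: R^2 = 0.74 for X1 regressed on the other X variables
print(round(vif(0.74), 2))  # 3.85
```

A VIF near 3.9 indicates moderate intercorrelation rather than proof of severe multicollinearity, which is consistent with Jerry's reading of the evidence.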

Jerry

