MULTICOLLINEARITY IN A REGRESSION MODEL

1. When there is multicollinearity, we should not drop variables
Professor Zubair Khan commented as follows: If a variable is very important and has special emphasis in the theoretical framework and in the model, then the literature suggests keeping that collinear variable, for example by estimating the model with ridge regression, rather than dropping the main variable. A sketch of this approach is given below.
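What follows is a minimal sketch of the ridge-regression idea, assuming hypothetical simulated data and the scikit-learn library (Ridge, StandardScaler); it illustrates the general technique, not Professor Khan's own procedure.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.preprocessing import StandardScaler

# Hypothetical data: x2 is nearly collinear with x1, and y depends on both.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)

# Standardize, then fit ridge regression; alpha > 0 shrinks the coefficients.
X = StandardScaler().fit_transform(np.column_stack([x1, x2]))
ridge = Ridge(alpha=1.0)
ridge.fit(X, y)
print(ridge.coef_)  # both collinear predictors stay in the model

Because the penalty shrinks coefficients rather than deleting variables, both collinear predictors remain in the fitted model, which is the point of the comment above.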

2. Can we delete relevant variables to handle a multicollinearity problem?
Professor Sayed Hossain commented as follows: We simply cannot delete variables that are theoretically and practically related to the model. If we do so, we commit a model specification error. So keep those variables intact and look for other ways to solve the multicollinearity issue, such as transforming the variables into logs and so on. A sketch of a log transformation is given below.
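Below is a minimal sketch of a log transformation before an OLS fit, assuming hypothetical data (income, wealth, spending) and the pandas and statsmodels libraries; the variable names and figures are invented purely for illustration.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: household spending explained by income and wealth.
df = pd.DataFrame({
    "income":   [1200, 1500, 1800, 2500, 3200, 4100, 5000, 6500],
    "wealth":   [10000, 14000, 19000, 30000, 42000, 60000, 80000, 110000],
    "spending": [900, 1100, 1300, 1800, 2300, 2900, 3500, 4400],
})

# Log-transform the strictly positive variables, then fit OLS on the logs.
X = sm.add_constant(np.log(df[["income", "wealth"]]))
model = sm.OLS(np.log(df["spending"]), X).fit()
print(model.params)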

3. How to remove multicollinearity?
Professor ÇĦåñd ĻŏÖñy commented as follows: Just look at the VIF and tolerance values; if you find multicollinearity, exclude the correlated variables and proceed with the results. A sketch of a VIF check is given below.
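Here is a minimal sketch of computing VIF and tolerance values, assuming hypothetical predictor data and the statsmodels variance_inflation_factor function; the columns and threshold are illustrative only.

import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

# Hypothetical predictors; height and weight are strongly correlated.
df = pd.DataFrame({
    "height": [150, 160, 165, 170, 175, 180, 185, 190],
    "weight": [50, 58, 62, 68, 74, 80, 85, 92],
    "age":    [23, 35, 41, 29, 52, 33, 47, 60],
})

X = add_constant(df)  # include an intercept column before computing VIFs
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
print(vif)      # a common rule of thumb: VIF > 10 signals a problem
print(1 / vif)  # tolerance = 1 / VIF; values below 0.1 signal a problem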

4. How to handle a multicollinearity problem?
Professor Moulana Naykrasyvishyy Cholovik commented as follows: Do not remove variables; you will lose information. Apply PCA (Principal Component Analysis). This technique transforms the correlated variables into uncorrelated components so that the regression can be run safely. A sketch of this approach is given below.

What is multicollinearity?
Olasehinde Timilehin commented as follows: Multicollinearity: take, for instance, a husband and wife who jointly contribute resources to build a house. It will be hard for the man to drive his wife out of the house they built together, as this would cause problems; their joint contribution is, so to speak, a sine qua non. Also, multicollinearity means what God has joined together, let no man put asunder.

5. What is multicollinearity?
Aadersh Joshi commented as follows: Multicollinearity generally occurs when there are high
correlations between two or more predictor variables. In other words, one predictor
variable can be used to predict the other. This creates redundant information, skewing
the results in a regression model. Examples of correlated predictor variables are: a
person’s height and weight, age and sales price of a car, or years of education and
annual income.

An easy way to detect multicollinearity is to calculate correlation coefficients for all pairs of predictor variables. If the correlation coefficient, r, is exactly +1 or -1, this is called perfect multicollinearity. If r is close to or exactly -1 or +1, one of the variables should be removed from the model if at all possible; a sketch of such a correlation scan is given after this section.

Data-based multicollinearity: caused by poorly designed experiments, data that is 100% observational, or data collection methods that cannot be manipulated. In some cases, variables may be highly correlated (usually due to collecting data from purely observational studies) and there is no error on the researcher's part. For this reason, you should conduct experiments whenever possible, setting the levels of the predictor variables in advance.

Structural multicollinearity: caused by you, the researcher, when creating new predictor variables.
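Below is a minimal sketch of the pairwise correlation scan described above, assuming hypothetical predictor data and the pandas library; the column names and the 0.9 threshold are illustrative choices, not fixed rules.

import pandas as pd

# Hypothetical predictors; height and weight move together closely.
df = pd.DataFrame({
    "height_cm": [150, 160, 165, 170, 175, 180, 185, 190],
    "weight_kg": [50, 58, 62, 68, 74, 80, 85, 92],
    "age_years": [23, 35, 41, 29, 52, 33, 47, 60],
})

corr = df.corr()
print(corr)

# Flag off-diagonal pairs whose |r| exceeds the chosen threshold (0.9 here).
for i in corr.columns:
    for j in corr.columns:
        if i < j and abs(corr.loc[i, j]) > 0.9:
            print(f"Highly correlated pair: {i} and {j}, r = {corr.loc[i, j]:.2f}")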