Econometrics
5BUS1059
Tuhin Ahmed
14089754
Word count: 1645
Introduction
Economic theory suggests that consumption is positively related to income, given the function C = c(y). The aim of this report is to see whether the predictions of this theory hold. The key prediction is that when disposable income increases, consumption will also increase.
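The model actually estimated in the tests below can be written, assuming a simple linear specification with an intercept and an error term, as:

C_t = b0 + b1 DPI_t + u_t

where the theory predicts b1 > 0, so the hypothesis tests that follow concern the coefficient on DPI.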
Hypothesis testing
The regression was run and the following results were obtained to test the significance of the model and to analyse whether disposable income has any significant effect on the dependent variable.
T-test
H0: the coefficient on DPI = 0
One-tailed test
t-value = 2.465599372
Critical t-value = 1.645
Since 2.466 > 1.645, the null hypothesis is rejected.
F-test (to test the joint significance of all the coefficients)
H0: R² = 0
F-value = 3804.450781
Critical F-value = 2.71
Since 3804.5 > 2.71, the null hypothesis is rejected (Cameron 2005; Gujarati 2011).
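As a minimal sketch, the regression and the t- and F-statistics above could be reproduced in Python with statsmodels. The file and column names ("consumption_data.csv", "consumption", "dpi") are placeholders, not the actual dataset used in this report.

```python
# Sketch: OLS regression of consumption on disposable income (DPI)
# and the associated t- and F-statistics, assuming hypothetical
# column names "consumption" and "dpi".
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("consumption_data.csv")   # hypothetical file name

X = sm.add_constant(df["dpi"])             # regressor: disposable income
y = df["consumption"]                      # dependent variable

model = sm.OLS(y, X).fit()

print(model.tvalues["dpi"])   # t-statistic for the DPI coefficient
print(model.fvalue)           # F-statistic for overall significance
print(model.summary())        # full regression output
```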
Introduction to multicollinearity
One of the assumptions of the classical linear regression model (CLRM) is that there is no exact linear relationship among the regressors. If one or more such relationships exist among the regressors, this is called multicollinearity.
Multicollinearity may arise from several causes, such as a poorly constructed sampling design that produces correlation among the X's. Including too many variables in the model can also lead to multicollinearity.
If multicollinearity arises in a model, it can lead to confusing and misleading results. Multicollinearity does not result in biased coefficient estimates; however, the standard errors of the regression coefficients can increase, causing insignificant t-statistics (Gujarati 2011).
How to detect multicollinearity
One indication of multicollinearity is a high F-statistic combined with low t-statistics. From figure 1 one can see that both are high, which suggests that there is no multicollinearity (Gujarati 2011).
Another indication is to calculate pairwise correlations between the independent variables (for example with the COR X1 X2 command). Since this regression has only one independent variable, it was not possible to calculate pairwise correlations here; however, if the regression had other regressors, such as VAT and inflation rates, one could calculate the correlations among them, as sketched below. Pairwise correlations above 0.80 are an indication of multicollinearity (Cameron 2005).
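A minimal sketch of the pairwise-correlation check, assuming the model had the additional regressors mentioned above (hypothetical column names "vat" and "inflation"):

```python
# Sketch: pairwise correlations among hypothetical regressors.
# Values above 0.80 would suggest multicollinearity.
import pandas as pd

df = pd.read_csv("consumption_data.csv")   # hypothetical file name

corr_matrix = df[["dpi", "vat", "inflation"]].corr()
print(corr_matrix)
```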
Lastly, multicollinearity can be detected using the variance inflation factor (VIF); this tells us how much the variance of a coefficient is inflated because of its linear dependence on the other predictors. It was not possible to use this method in our regression because there is only one independent variable. A VIF above 10 indicates multicollinearity (Gujarati 2011).
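A sketch of the VIF check, again assuming the hypothetical extra regressors so that the calculation is meaningful:

```python
# Sketch: variance inflation factors for each regressor.
# A VIF above 10 is the usual warning threshold.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("consumption_data.csv")        # hypothetical file name
X = sm.add_constant(df[["dpi", "vat", "inflation"]])

for i, name in enumerate(X.columns):
    print(name, variance_inflation_factor(X.values, i))
```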
How to solve multicollinearity
The usual remedy is to drop one or more variables from the model; this breaks the linear relationship between the variables. Since we only have one independent variable, we are not able to do so. However, if there were more regressors, such as VAT and inflation rates, one could drop a variable to see if it breaks the linear relationship (Gujarati 2011), as sketched below.
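A brief sketch of the "drop a regressor" remedy under the same hypothetical set of regressors: refit the model without one of the correlated variables and compare the standard errors.

```python
# Sketch: refit without VAT (hypothetical regressor) and compare
# standard errors before and after dropping it.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("consumption_data.csv")   # hypothetical file name
y = df["consumption"]

full = sm.OLS(y, sm.add_constant(df[["dpi", "vat", "inflation"]])).fit()
reduced = sm.OLS(y, sm.add_constant(df[["dpi", "inflation"]])).fit()

print(full.bse)      # standard errors with all regressors
print(reduced.bse)   # standard errors after dropping VAT
```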
Another measure would be to increase the sample size so that one can distinguish between the effects of each of the variables, that is, see how each independent variable affects the dependent variable separately (Gujarati 2011).
Introduction to heteroscedasticity
Another violation of the classical linear regression model is heteroscedasticity. Homoscedasticity means that the variance of the error term is constant across observations; when heteroscedasticity is present, the error variances are not constant. This problem is caused by outliers, mixing observations of different scales, an incorrect functional form and data transformations. This issue causes the ordinary least