WSEAS Transactions on Mathematics, vol.4, no.3, pp.163-169, 2005 (Scopus)
The least squares estimators of the regression coefficients are the best linear unbiased estimators: of all estimators that are both linear functions of the data and unbiased for the parameters being estimated, they have the smallest variance. One of the assumptions of the linear regression model is the independence of the explanatory variables; when those variables are correlated with each other, the multicollinearity problem occurs. In the presence of multicollinearity, even this minimum variance may be unacceptably large. Ridge regression, a biased regression method used under multicollinearity, builds on the fact that a singular square matrix can be made nonsingular by adding a constant to its diagonal. In this study, ridge regression and an artificial neural network algorithm, which requires no distributional assumptions, are applied to two different economic data sets exhibiting multicollinearity. The results of the two models are interpreted and compared.
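The diagonal-constant idea mentioned above can be sketched numerically. The snippet below is a minimal illustration, not the paper's own computation: it builds a hypothetical near-collinear design matrix and compares the conditioning of X'X before and after adding the ridge constant kI, then solves the ridge normal equations (X'X + kI)^(-1) X'y. The data, the choice k = 0.1, and all variable names are assumptions for demonstration only.

```python
import numpy as np

# Hypothetical data: x2 is almost a copy of x1, so X'X is
# nearly singular and OLS coefficients are unstable.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-4, size=n)      # near-perfect collinearity
X = np.column_stack([x1, x2])
y = 3.0 * x1 + rng.normal(scale=0.1, size=n)

k = 0.1                                       # ridge constant (assumed value)
I = np.eye(X.shape[1])

# Adding kI to the diagonal makes the near-singular X'X well conditioned.
cond_ols = np.linalg.cond(X.T @ X)            # very large
cond_ridge = np.linalg.cond(X.T @ X + k * I)  # modest

# Ridge estimator: (X'X + kI)^{-1} X'y
beta_ridge = np.linalg.solve(X.T @ X + k * I, X.T @ y)

print(cond_ols, cond_ridge)
print(beta_ridge)  # the two collinear predictors share the effect stably
```

Because the two columns are nearly identical, ridge splits the true coefficient of 3 between them in a numerically stable way, whereas the unregularized solve is dominated by the ill-conditioning of X'X.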