Estimating Linear Regression Parameters in the Presence of the Autocorrelation Problem and High Leverage Values
DOI: https://doi.org/10.55562/jrucs.v56i1.25

Keywords: autocorrelation, robust methods, high leverage values

Abstract
Linear regression analysis is one of the most important statistical techniques in many fields, such as economics, survival analysis, business administration, medicine, and engineering. The least squares method is commonly used to estimate the coefficients of a linear regression model because of its desirable properties and ease of computation. However, the presence of one or more outliers in a data set can destroy ordinary least squares (OLS) estimates. Many researchers have noted that real data sets typically contain 1% to 10% anomalous observations. Outliers in the explanatory variables, called high leverage points (HLPs), have more serious effects on OLS estimates than outliers in the response variable y. High leverage points are responsible for the masking and swamping problems in linear regression; they can also induce multicollinearity and substantially degrade the accuracy of the estimates. It is therefore necessary to detect these unusual observations. Least squares estimates can also deteriorate significantly in the presence of autocorrelated errors. A group of methods has been proposed to address this problem, including the Cochrane-Orcutt and Prais-Winsten iterative procedures, but most of these traditional methods, on their own, fail to handle autocorrelation between the errors in the presence of outliers. In this study, a robust method is proposed to address the combined problem of autocorrelation in a multiple linear regression model in the presence of outliers and high leverage values. The performance of the proposed method is evaluated using real data.
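To make the notion of a high leverage point concrete, a minimal sketch follows, using the standard hat-matrix diagnostic rather than the robust detection method proposed in the study. The function name `leverage_points` and the common 2p/n cutoff are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def leverage_points(X, cutoff_factor=2.0):
    """Flag high leverage points via the hat matrix (classical diagnostic)."""
    # Add an intercept column to the design matrix
    Xd = np.column_stack([np.ones(len(X)), X])
    n, p = Xd.shape
    # Diagonal of the hat matrix H = X (X'X)^{-1} X'
    H = Xd @ np.linalg.inv(Xd.T @ Xd) @ Xd.T
    h = np.diag(H)
    # Common rule of thumb: flag observations with h_ii > 2p/n
    cutoff = cutoff_factor * p / n
    return np.where(h > cutoff)[0], h
```

For example, in a sample with x values 1 through 5 plus a single point at x = 100, the extreme point receives a leverage value near 1 and is flagged, while the others fall well below the cutoff. Note that this classical rule can itself suffer from masking and swamping when several HLPs are present, which is the motivation for robust detection methods.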
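The classical Cochrane-Orcutt iteration mentioned above can be sketched as follows; this is the traditional (non-robust) procedure for AR(1) errors, not the robust method proposed in the study, and the function name and convergence settings are illustrative assumptions.

```python
import numpy as np

def cochrane_orcutt(X, y, tol=1e-6, max_iter=100):
    """Iterative Cochrane-Orcutt estimation under AR(1) errors (a sketch)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])          # design matrix with intercept
    beta = np.linalg.lstsq(Xd, y, rcond=None)[0]   # initial OLS fit
    rho = 0.0
    for _ in range(max_iter):
        e = y - Xd @ beta
        # Estimate the AR(1) coefficient rho from lagged residuals
        rho_new = (e[:-1] @ e[1:]) / (e[:-1] @ e[:-1])
        # Quasi-difference the data: y*_t = y_t - rho y_{t-1}, x*_t = x_t - rho x_{t-1}
        ys = y[1:] - rho_new * y[:-1]
        Xs = Xd[1:] - rho_new * Xd[:-1]
        # Refit OLS on the transformed data; the intercept column is transformed
        # too, so beta[0] stays on the original scale
        beta = np.linalg.lstsq(Xs, ys, rcond=None)[0]
        if abs(rho_new - rho) < tol:
            rho = rho_new
            break
        rho = rho_new
    return beta, rho
```

Because the residuals feeding the rho estimate come from an OLS fit, a single high leverage point can distort both rho and the final coefficients, which is precisely the failure mode of the traditional methods that the abstract describes.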