The Ordinary Least Squares (OLS) estimator produces the Best Linear Unbiased Estimate (BLUE) of the parameters of the linear regression model when the assumptions of normality and constant variance of the error terms are satisfied. However, the assumption of constant error variance across all observations is frequently violated by real-life data. Because the OLS estimator breaks down for contaminated data, robust alternatives such as the Least Absolute Deviation (LAD) method and M-estimators have been proposed. M-estimators are robust to outliers in the y-direction but fail for outliers in the x-direction. To obtain an M-estimator that is robust to outliers in both directions, weights are applied to two loss functions, Lx and Ly, to remove the effect of outliers in the y- and x-directions. The method handles both simple and multiple linear regression models and yields a set of solutions that are unbiased and efficient. A comparative analysis of the performance of the proposed method against existing methods indicates that it competes favourably and is particularly more robust and efficient than the other estimators considered when outliers lie in the x-direction or in both the x- and y-directions. The finite-sample performance of the proposed method is studied using Monte Carlo simulation.
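The abstract does not give the exact forms of the Lx and Ly weight functions, but the general idea of combining residual-based (y-direction) and leverage-based (x-direction) weights can be illustrated with a minimal Mallows-type GM-estimator sketch fitted by iteratively reweighted least squares. The function names, the Huber tuning constant c = 1.345, and the median/MAD-based leverage weights below are illustrative assumptions, not the authors' specific method.

```python
import numpy as np

def huber_weights(r, c=1.345):
    # psi(r)/r for the Huber loss: 1 for |r| <= c, c/|r| otherwise
    a = np.abs(r)
    w = np.ones_like(a)
    w[a > c] = c / a[a > c]
    return w

def gm_estimate(X, y, max_iter=50, tol=1e-8):
    """Mallows-type GM-estimate via iteratively reweighted least squares.

    X : (n, p) predictor matrix without an intercept column (added below).
    Residual (y-direction) weights use the Huber function; leverage
    (x-direction) weights shrink rows that lie far from the coordinate-wise
    median/MAD centre of X. Both choices are illustrative assumptions.
    """
    n, p = X.shape
    Z = np.column_stack([np.ones(n), X])

    # x-direction weights from robust standardised distances in the design space
    med = np.median(X, axis=0)
    mad = 1.4826 * np.median(np.abs(X - med), axis=0) + 1e-12
    d = np.sqrt(np.sum(((X - med) / mad) ** 2, axis=1))
    w_x = np.minimum(1.0, np.sqrt(p) / (d + 1e-12))   # downweight high-leverage rows

    beta = np.linalg.lstsq(Z, y, rcond=None)[0]        # OLS starting values
    for _ in range(max_iter):
        r = y - Z @ beta
        scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        w = w_x * huber_weights(r / scale)             # combine x- and y-direction weights
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(sw[:, None] * Z, sw * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

On clean data the leverage weights stay near one and the fit is close to OLS; observations with extreme predictor values or large standardised residuals receive weights below one and contribute less to the fit, which is the behaviour the weighted Lx and Ly losses are intended to achieve.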