A Comparative Study of Ridge and Robust Regression Estimators for Handling Multicollinearity and Outliers
DOI: https://doi.org/10.64321/jcr.v3i1.10
Keywords: Ridge Regression, Robust Regression, Multicollinearity, Outliers, Shrinkage Estimation, M-estimation
Abstract
Multicollinearity and outliers often distort classical regression estimates, leading to inefficient and unreliable parameter estimates. This study examines two alternative regression techniques, Ridge regression and robust regression, as remedies for multicollinearity and outlier influence. Using synthetic data simulations, we compare their performance with Ordinary Least Squares (OLS) regression. Results indicate that Ridge regression effectively reduces coefficient variance under multicollinearity, while robust regression provides substantial resistance to outlier contamination. The findings suggest that Ridge and robust estimators offer superior predictive performance and reliability compared to OLS under data irregularities. The study underscores the importance of selecting regression estimators based on data characteristics and suggests future hybridization of both techniques for improved performance.
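The comparison described in the abstract can be illustrated with a small simulation. The sketch below is not the authors' code; it is a minimal, self-contained illustration of the three estimators compared in the study, assuming a simple design with two near-collinear predictors and 5% outlier contamination, with Ridge fit in closed form and the robust fit computed by Huber M-estimation via iteratively reweighted least squares (the sample size, contamination level, and tuning constants are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two near-collinear predictors (x2 is almost a copy of x1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
X = np.column_stack([x1, x2])
beta_true = np.array([2.0, 1.0])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

# Contaminate 5% of the responses with large outliers
idx = rng.choice(n, size=10, replace=False)
y_out = y.copy()
y_out[idx] += rng.normal(loc=30.0, scale=5.0, size=10)

def ols(X, y):
    # Ordinary least squares via least-squares solve
    return np.linalg.lstsq(X, y, rcond=None)[0]

def ridge(X, y, lam):
    # Closed-form Ridge (Hoerl-Kennard) estimator: (X'X + lam*I)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def huber_m(X, y, k=1.345, n_iter=50):
    # Huber M-estimation by iteratively reweighted least squares,
    # with scale estimated by the normalized MAD of the residuals
    beta = ols(X, y)
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745
        u = r / max(s, 1e-12)
        w = np.where(np.abs(u) <= k, 1.0, k / np.abs(u))
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

b_ols = ols(X, y_out)      # OLS on contaminated data
b_huber = huber_m(X, y_out)  # robust fit on contaminated data
```

Under this setup, the Huber fit downweights the contaminated observations and typically lands much closer to the true coefficients than OLS, while the Ridge penalty shrinks the coefficient vector relative to OLS on the collinear design, which is the variance-reduction behavior the study reports.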
License
Copyright (c) 2026 Paul Moses Medugu

This work is licensed under a Creative Commons Attribution 4.0 International License.