A Comparative Study of Ridge and Robust Regression Estimators for Handling Multicollinearity and Outliers

Authors

  • Paul Moses Medugu
  • Yaska Mutah
  • Noel Eckson

DOI:

https://doi.org/10.64321/jcr.v3i1.10

Keywords:

Ridge Regression, Robust Regression, Multicollinearity, Outliers, Shrinkage Estimation, M-estimation

Abstract

Multicollinearity and outliers often distort classical regression estimates, leading to inefficient and unstable parameter estimates. This study examines two alternative regression techniques, Ridge regression and robust regression, as remedies for multicollinearity and outlier influence. Using synthetic-data simulations, we compare their performance with Ordinary Least Squares (OLS) regression. Results indicate that Ridge regression effectively reduces coefficient variance under multicollinearity, while robust regression provides substantial resistance to outlier contamination. The findings suggest that Ridge and robust estimators offer superior predictive performance and reliability compared to OLS under data irregularities. The study underscores the importance of selecting regression estimators based on data characteristics and suggests future hybridization of both techniques for improved performance.
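The kind of comparison the abstract describes can be illustrated with a minimal sketch. The code below is not the authors' simulation; it is an assumed setup using scikit-learn's `LinearRegression`, `Ridge`, and `HuberRegressor` (an M-estimator in the spirit of Huber, 1964), with two nearly collinear predictors and a small fraction of contaminated responses:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, HuberRegressor

rng = np.random.default_rng(42)
n = 200

# Two nearly collinear predictors (high multicollinearity)
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
X = np.column_stack([x1, x2])

# True model: y = 2*x1 + 2*x2 + noise
y = 2.0 * x1 + 2.0 * x2 + rng.normal(scale=1.0, size=n)

# Contaminate 5% of responses with large positive outliers
idx = rng.choice(n, size=10, replace=False)
y[idx] += rng.normal(loc=30.0, scale=5.0, size=10)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)          # shrinkage stabilizes collinear coefficients
huber = HuberRegressor(epsilon=1.35).fit(X, y)  # M-estimation downweights outliers

print("OLS   coefficients:", ols.coef_, "intercept:", ols.intercept_)
print("Ridge coefficients:", ridge.coef_, "intercept:", ridge.intercept_)
print("Huber coefficients:", huber.coef_, "intercept:", huber.intercept_)
```

Under this setup, Ridge shrinks the coefficient vector relative to OLS (its penalty guarantees a smaller or equal norm on the same data), while the Huber fit resists the inflated intercept that the outliers induce in OLS. The choice of `alpha=1.0` and the default Huber tuning constant `epsilon=1.35` are illustrative, not values taken from the study.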

Author Biographies

Paul Moses Medugu

Department of Mathematics and Statistics, Federal Polytechnic Mubi, Mubi, Adamawa State, Nigeria

Yaska Mutah

Department of Mathematics and Statistics, Federal Polytechnic Mubi, Mubi, Adamawa State, Nigeria

Noel Eckson

Department of Mathematics and Statistics, Federal Polytechnic Mubi, Mubi, Adamawa State, Nigeria

References

Beaton, A. E., & Tukey, J. W. (1974). The fitting of power series, meaning polynomials, illustrated on band-spectroscopic data. Technometrics, 16(2), 147–185.

Farrar, D. E., & Glauber, R. R. (1967). Multicollinearity in regression analysis: The problem revisited. Review of Economics and Statistics, 49(1), 92–107.

Ghosh, S., & Vogt, A. (2012). Outliers and multicollinearity: Redesigning regression for modern data. Statistical Science, 27(1), 45–67.

Gujarati, D. N., & Porter, D. C. (2009). Basic Econometrics (5th ed.). McGraw-Hill.

Hampel, F. R. (1971). A general qualitative definition of robustness. Annals of Mathematical Statistics, 42(6), 1887–1896.

Hoerl, A. E., & Kennard, R. W. (1970). Ridge regression: Biased estimation for nonorthogonal problems. Technometrics, 12(1), 55–67.

Huber, P. J. (1964). Robust estimation of a location parameter. Annals of Mathematical Statistics, 35(1), 73–101.

Kutner, M. H., Nachtsheim, C. J., & Neter, J. (2005). Applied Linear Regression Models (4th ed.). McGraw-Hill.

Li, Y., & Martin, A. (2020). Robust penalized regression for high-dimensional contaminated data. Journal of Computational and Graphical Statistics, 29(3), 633–647.

Maronna, R. A., Martin, R. D., Yohai, V. J., & Salibián-Barrera, M. (2019). Robust Statistics: Theory and Methods (with R). Wiley.

Montgomery, D. C., Peck, E. A., & Vining, G. G. (2012). Introduction to Linear Regression Analysis (5th ed.). Wiley.

Rousseeuw, P. J. (1984). Least median of squares regression. Journal of the American Statistical Association, 79(388), 871–880.

Yohai, V. J. (1987). High breakdown-point and high efficiency robust estimates for regression. Annals of Statistics, 15(2), 642–656.

Published

2026-02-28

How to Cite

Paul Moses Medugu, Yaska Mutah, & Noel Eckson. (2026). A Comparative Study of Ridge and Robust Regression Estimators for Handling Multicollinearity and Outliers. Journal of Current Research and Studies, 3(1), 93–99. https://doi.org/10.64321/jcr.v3i1.10