Robust modification of PSO algorithm based on M-means for solving linear regression

Abstract

Linear regression is a fundamental data analysis tool, but it can be strongly affected by outliers and noise, which distort the coefficients and degrade forecast quality. The classical heuristic PSO (Particle Swarm Optimization) and Jaya algorithms perform well on smooth loss functions but are sensitive to outliers when the standard mean squared error (MSE) is used. This creates a need for simple yet effective modifications that preserve global search capability without complicating the basic algorithm's structure.

Aim. To develop and experimentally evaluate a modification of the PSO algorithm (PSO-Robust) that makes linear regression robust to outliers without complicating the core algorithm or introducing additional hyperparameters.

Methods. The algorithmic idea is to keep the standard PSO equations of motion and modify only the fitness function: instead of the mean squared error, an M-means loss based on the Huber function is used, with adaptive weights that reduce the contribution of outliers. Experiments were conducted on synthetic data with 15 % and 25 % outliers, with identical hyperparameters for all compared algorithms and 30 independent runs. Performance was evaluated by mean and median test errors and by dispersion estimates (variance, interquartile range); visual analysis used boxplots of error distributions and regression lines.
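The idea above can be sketched as follows. This is not the authors' code: the swarm size, inertia weight, acceleration constants, Huber threshold, and search bounds are illustrative assumptions, and the adaptive-weight M-means aggregation is simplified here to a plain mean of Huber losses. It shows the key point: the PSO update equations are untouched, and only the fitness function changes.

```python
import numpy as np

def huber_loss(residuals, delta=1.0):
    """Mean Huber loss: quadratic near zero, linear in the tails,
    so large outlier residuals contribute only linearly."""
    r = np.abs(residuals)
    quad = 0.5 * r**2
    lin = delta * (r - 0.5 * delta)
    return np.mean(np.where(r <= delta, quad, lin))

def fitness(coeffs, X, y):
    # Simple linear model y = a*x + b; this is the ONLY robust change.
    pred = X * coeffs[0] + coeffs[1]
    return huber_loss(y - pred)

def pso_robust(X, y, n_particles=30, n_iter=200,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard PSO equations of motion; robustness comes from fitness()."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, size=(n_particles, 2))  # (slope, intercept)
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p, X, y) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))
        # Classic velocity/position update (Kennedy & Eberhart, 1995)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fitness(p, X, y) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Synthetic line y = 2x + 1 with 15 % gross outliers, mirroring the setup.
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, 100)
y = 2 * X + 1 + rng.normal(0, 0.3, 100)
y[:15] += 40  # contaminate 15 % of the targets
slope, intercept = pso_robust(X, y)
```

With an MSE fitness the same contamination pulls the fitted line toward the outliers; with the Huber-based fitness the recovered slope and intercept stay close to the true values.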

Results. PSO-Robust consistently outperforms classical PSO and Jaya in mean and median test errors and shows a smaller spread (variance). Visual analysis confirms reduced sensitivity to outliers (more compact boxplots, more consistent regression lines).

Conclusion. The PSO-Robust modification demonstrates consistent superiority over the original algorithms in accuracy and robustness, producing more compact boxplots and less distorted regression lines. The proposed approach combines simplicity of implementation with robustness, increasing the reliability of regression on heterogeneous data. Future work includes extending the method to multivariate and nonlinear models and exploring alternative robust loss functions.

About the authors

Elena M. Kazakova

Institute of Applied Mathematics and Automation - branch of the Kabardino-Balkarian Scientific Center of the Russian Academy of Sciences

Author for correspondence.
Email: shogenovae@inbox.ru
ORCID iD: 0000-0002-5819-9396
SPIN-code: 4135-3315

Junior Researcher, Department of Neuroinformatics and Machine Learning

89 A Shortanov Street, Nalchik, 360000, Russian Federation

References

  1. Kennedy J., Eberhart R. Particle swarm optimization. Proc. IEEE Int. Conf. on Neural Networks (ICNN). 1995. Pp. 1942–1948. doi: 10.1109/ICNN.1995.488968
  2. Rao R. Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems. Int. J. Indus. Eng. Comput. 2016. Vol. 7. No. 1. Pp. 19–34. doi: 10.5267/j.ijiec.2015.8.004
  3. Shibzukhov Z.M. On a robust gradient boosting scheme based on aggregation functions insensitive to outliers. Automation and Remote Control. 2022. Vol. 83. No. 10. Pp. 1619–1629. doi: 10.1134/S00051179220100149
  4. Shibzukhov Z.M., Dimitrichenko D.P., Kazakov M.A. The principle of minimizing empirical risk based on aggregating average loss functions for solving regression problems. Programmnye produkty i sistemy [Software Products & Systems]. 2017. Vol. 30. No. 2. Pp. 180–186. doi: 10.15827/0236-235X.030.2.180-186. (In Russian)
  5. Zeng J., Yu X., Yang G., Gui H. Dynamic robust particle swarm optimization algorithm based on hybrid strategy. International Journal of Swarm Intelligence Research (IJSIR). 2023. Vol. 14. No. 1. Pp. 1–14. doi: 10.4018/IJSIR.325006
  6. Garg H. A hybrid PSO–GA algorithm for constrained optimization problems. Applied Mathematics and Computation. 2016. Vol. 274. Pp. 292–305. doi: 10.1016/j.amc.2015.11.001
  7. Şenel F.A., Gökçe F., Yüksel A.S., Yigit T. A novel hybrid PSO–GWO algorithm for optimization problems. Engineering with Computers. 2019. Vol. 35. Pp. 1359–1373. doi: 10.1007/s00366-018-0668-5
  8. Kang H., Li X., Shen Y. et al. Particle swarm optimization with historical return decay enhances cooperation in public goods games with investment risks. Chaos, Solitons & Fractals. 2024. Vol. 189. P. 115665. doi: 10.1016/j.chaos.2024.115665
  9. Kazakova E.M. A hybrid PSO–Jaya algorithm for solving various optimization problems. Programmnaya inzheneriya [Software Engineering]. 2024. Vol. 15. No. 2. Pp. 87–96. doi: 10.17587/prin.15.87-96. (In Russian)
  10. Huber P.J. Robust Statistics: monograph. New York: Wiley, 1981. 308 p. ISBN: 0-471-41805-6


Copyright (c) 2026 Kazakova E.M.

This work is licensed under a Creative Commons Attribution 4.0 International License.
