A working set algorithm for support vector regression problems
DOI: https://doi.org/10.33993/jnaat542-1604

Keywords: Support Vector Regression; Dual Quadratic Programming; Primal Simplex Algorithm; Working Set Methods; Convex Optimization; Iterative Algorithms

Abstract
Support vector regression (SVR) remains computationally demanding because of its dual quadratic structure and its complex structural constraints. This paper presents a new working set algorithm, WSA-SVR, which extends a primal simplex method recently proposed for SVM classification to the regression setting.
The proposed approach handles the inherent difficulties of SVR: the coupling of the dual variable pairs, the ε-insensitive loss function, and the equality constraint on the coefficient differences. WSA-SVR maintains primal feasibility throughout the iterations and relies on efficient rank-one updates, eliminating the need for null-space projections and reduced-Hessian computations. These properties ensure both numerical stability and scalability.
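Simplex-type working set methods typically refresh a basis factorization with a rank-one correction instead of refactorizing from scratch. As a generic illustration of this idea (not the paper's specific update, which is not reproduced on this page), the Sherman–Morrison formula updates an explicit inverse in O(n²) rather than O(n³):

```python
import numpy as np

def sherman_morrison_update(B_inv, u, v):
    # Given B^{-1}, return (B + u v^T)^{-1} via the Sherman-Morrison
    # formula -- an O(n^2) rank-one update instead of an O(n^3) refactorization.
    Bu = B_inv @ u          # B^{-1} u
    vB = v @ B_inv          # v^T B^{-1}
    denom = 1.0 + v @ Bu    # must be nonzero for the update to exist
    return B_inv - np.outer(Bu, vB) / denom
```

In practice, simplex implementations update a factorization (e.g. LU) rather than an explicit inverse, but the cost argument is the same.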
Extensive experiments on widely used benchmark datasets show that WSA-SVR not only converges rapidly but also delivers competitive predictive accuracy; in many cases it matches or improves on traditional solvers while requiring substantially fewer iterations. By addressing the structural constraints of SVR in a principled way and demonstrating strong empirical performance, WSA-SVR emerges as an efficient and adaptable solver for kernel-based regression problems.
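For readers unfamiliar with the quantities the abstract refers to, the following is a minimal NumPy sketch of two standard SVR ingredients: the ε-insensitive loss and the classical dual objective with its coupled variable pairs (αᵢ, αᵢ*). Function names are illustrative, not from the paper, and the equality constraint Σᵢ(αᵢ − αᵢ*) = 0 is assumed to be enforced by the solver.

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    # |y - f(x)|_eps = max(0, |y - f(x)| - eps): residuals inside the
    # eps-tube around the prediction incur no penalty.
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

def svr_dual_objective(alpha, alpha_star, K, y, eps):
    # Standard SVR dual objective (maximization form) over the coupled
    # pairs (alpha_i, alpha_i*), with kernel matrix K and targets y.
    # Box constraints and sum_i (alpha_i - alpha_i*) = 0 are handled
    # by the solver, not here.
    d = alpha - alpha_star  # the coefficient differences beta_i
    return -0.5 * (d @ K @ d) - eps * np.sum(alpha + alpha_star) + y @ d
```

The coefficient differences `d` are exactly the quantities the equality constraint in the abstract refers to; a working set method restricts each iteration to a small subset of these pairs.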
References
[1] M. Awad and R. Khanna. Support Vector Regression, pages 67–80. Apress, 2015. DOI: https://doi.org/10.1007/978-1-4302-5990-9_4
[2] F. Zhang and L. J. O'Donnell. Support vector regression. In Machine Learning, pages 123–140. Elsevier, 2020. DOI: https://doi.org/10.1016/B978-0-12-815739-8.00007-9
[3] C. C. Aggarwal. Linear Classification and Regression for Text. Springer International Publishing, 2018. DOI: https://doi.org/10.1007/978-3-319-73531-3_6
[4] G. W. Flake. Efficient SVM regression training with SMO. Machine Learning, 46, 2002. DOI: https://doi.org/10.1023/A:1012474916001
[5] B. Schölkopf and A. J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2002. DOI: https://doi.org/10.7551/mitpress/4175.001.0001
[6] R. F. Chevalier, G. Hoogenboom, R. W. McClendon, and J. A. Paz. Support vector regression with reduced training sets for air temperature prediction: A comparison with artificial neural networks. Neural Computing and Applications, 20:151–159, 2011. DOI: https://doi.org/10.1007/s00521-010-0363-y
[7] M. Najafzadeh and S. Niazmardi. A novel multiple-kernel support vector regression algorithm for estimation of water quality parameters. Natural Resources Research, 30:3761–3775, 2021. DOI: https://doi.org/10.1007/s11053-021-09895-5
[8] Y.-J. Lee, W.-F. Hsieh, and C.-M. Huang. Epsilon-SSVR: A smooth support vector machine for epsilon-insensitive regression. IEEE Transactions on Knowledge and Data Engineering, 17:678–685, 2005. DOI: https://doi.org/10.1109/TKDE.2005.77
[9] G. Steidl. Supervised learning by support vector machines. In O. Scherzer, editor, Handbook of Mathematical Methods in Imaging, pages 1393–1453. Springer, New York, 2015. DOI: https://doi.org/10.1007/978-1-4939-0790-8_22
[10] M. S. Ahmad, S. M. Adnan, S. Zaidi, and P. Bhargava. A novel support vector regression (SVR) model for the prediction of splice strength of the unconfined beam specimens. Construction and Building Materials, 248:118475, 2020. DOI: https://doi.org/10.1016/j.conbuildmat.2020.118475
[11] X. Peng and D. Xu. Projection support vector regression algorithms for data regression. Knowledge-Based Systems, 112:54–66, 2016. DOI: https://doi.org/10.1016/j.knosys.2016.08.030
[12] P. Rivas-Perea and J. Cota-Ruiz. An algorithm for training a large scale support vector machine for regression based on linear programming and decomposition methods. Pattern Recognition Letters, 34:439–451, 2013. DOI: https://doi.org/10.1016/j.patrec.2012.10.026
[13] Y. Zhao and J. Sun. A fast method to approximately train hard support vector regression. Neural Networks, 23:1276–1285, 2010. DOI: https://doi.org/10.1016/j.neunet.2010.08.001
[14] B. Brahmi. An efficient primal simplex method for solving large-scale support vector machines. Neurocomputing, 599:128109, 2024. DOI: https://doi.org/10.1016/j.neucom.2024.128109
[15] D. Dua and C. Graff. UCI Machine Learning Repository, 2017. URL: http://archive.ics.uci.edu/ml
[16] C.-C. Chang and C.-J. Lin. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2:1–27, 2011. DOI: https://doi.org/10.1145/1961189.1961199
License
Copyright (c) 2025 Saida Azib, Belkacem Brahmi

This work is licensed under a Creative Commons Attribution 4.0 International License.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
