Hossein Moosaei, Sparse least squares K-SVCR multi-class classification

Full Text: PDF
DOI: 10.23952/jnva.8.2024.6.07

Volume 8, Issue 6, 1 December 2024, Pages 953-971

 

Abstract. This paper introduces a novel model, the sparse least squares K-class support vector classification-regression with adaptive \ell_p-norm (PLSTKSVC), to tackle challenges in multi-class classification. Leveraging a “1-versus-1-versus-rest” structure, PLSTKSVC dynamically adjusts the parameter p based on the data, enabling an adaptive learning framework. By incorporating cardinality-constrained optimization, the model seamlessly integrates feature selection and classification. Although the \ell_p-norm is non-convex for 0 < p < 1, PLSTKSVC efficiently solves the associated optimization problem via systems of linear equations. PLSTKSVC offers several advantages, including simultaneous feature selection and classification, robust theoretical foundations, algorithmic efficiency, and strong empirical validation. The model’s theoretical contributions include lower bounds on the non-zero entries of solutions and upper bounds on the norm of the optimal solution. Experimental results on multi-class classification datasets highlight the superior performance of PLSTKSVC, establishing it as a significant advancement in machine learning.
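
The abstract notes that, although the \ell_p-norm is non-convex for 0 < p < 1, the associated optimization is handled through linear systems of equations. The sketch below is not the paper's PLSTKSVC algorithm; it is a minimal illustration, under the assumption of a generic \ell_p-regularized least-squares objective, of how such a penalty can be attacked with iteratively reweighted linear solves. The function name irls_lp_least_squares and all parameter values are hypothetical.

# Illustrative sketch only: iteratively reweighted least squares (IRLS) for
# min_w ||Xw - y||^2 + lam * sum_j |w_j|^p with 0 < p < 1. This is NOT the
# paper's PLSTKSVC model; it only shows how a non-convex l_p penalty can be
# reduced to a sequence of linear systems of equations.
import numpy as np

def irls_lp_least_squares(X, y, p=0.5, lam=1.0, n_iter=50, eps=1e-6):
    """Approximately minimize ||Xw - y||^2 + lam * sum_j |w_j|^p."""
    n_features = X.shape[1]
    w = np.zeros(n_features)
    XtX = X.T @ X
    Xty = X.T @ y
    for _ in range(n_iter):
        # Majorize each |w_j|^p by a quadratic around the current iterate,
        # giving diagonal weights d_j = (p / 2) * (|w_j|^2 + eps)^((p - 2) / 2).
        d = 0.5 * p * (w**2 + eps) ** ((p - 2) / 2)
        # Each iteration then amounts to solving one linear system.
        w = np.linalg.solve(XtX + lam * np.diag(d), Xty)
    return w

if __name__ == "__main__":
    # Hypothetical synthetic data with a sparse ground-truth weight vector.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.5, 1.0]
    y = X @ w_true + 0.1 * rng.standard_normal(100)
    w_hat = irls_lp_least_squares(X, y, p=0.5, lam=1.0)
    print(np.round(w_hat, 2))  # entries outside the true support shrink toward zero

The reweighting step is the standard majorization-minimization device for concave penalties; it is used here purely to make the "solve via linear systems" idea concrete, not as a statement of the paper's method.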

 

How to Cite this Article:
H. Moosaei, Sparse least squares K-SVCR multi-class classification, J. Nonlinear Var. Anal. 8 (2024), 953-971.