Shuyao Li, Stephen J. Wright, Jelena Diakonikolas, Outlier-robust nonsmooth stochastic optimization
DOI: 10.23952/jnva.10.2026.2.2
Volume 10, Issue 2, 1 April 2026, Pages 201-230
Abstract. We study nonsmooth stochastic optimization under adversarial data contamination, which models the outliers that are often unavoidable in modern machine learning tasks. While robust methods have been developed for such settings with smooth objectives, nonsmooth models remain largely unexplored despite their central role in machine learning, including regression with ℓ1 losses, support vector machines, and distributionally robust optimization. We introduce a general framework for outlier-robust nonsmooth optimization that combines robust mean estimation with projected subgradient methods. Our analysis establishes the first polynomial-time algorithms with provable guarantees for nonsmooth (weakly) convex objectives under adversarial corruptions. As a key application, we resolve an open problem in outlier-robust distributionally robust optimization, obtaining polynomial-time algorithms with bounded errors for Conditional Value-at-Risk and χ²-divergence-based formulations. These results advance the theory of robust nonsmooth optimization and highlight new directions for robust learning with corrupted data.
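To make the abstract's framework concrete, the sketch below shows one way a "robust mean estimation plus projected subgradient" scheme can look in practice. It is an illustrative reading, not the authors' algorithm: a coordinate-wise trimmed mean stands in for the robust mean estimators the paper analyzes, applied to per-sample subgradients of an ℓ1 regression loss under an ε-fraction of corrupted responses, with each iterate projected onto an ℓ2 ball. All names, step sizes, and the choice of estimator are assumptions for illustration.

```python
# Minimal sketch (NOT the paper's algorithm): projected subgradient descent
# where the sample-mean gradient is replaced by a coordinate-wise trimmed
# mean, a simple stand-in for a robust mean estimator.
import numpy as np

def trimmed_mean(G, eps):
    """Coordinate-wise trimmed mean: in each coordinate, drop the largest
    and smallest eps-fraction of entries, then average the rest."""
    n = G.shape[0]
    k = int(np.ceil(eps * n))
    G_sorted = np.sort(G, axis=0)
    return G_sorted[k:n - k].mean(axis=0)

def robust_projected_subgradient(X, y, eps, radius=10.0, steps=500, lr=0.5):
    """l1 regression f(w) = mean |<x_i, w> - y_i| with an eps-fraction of
    adversarially corrupted samples, constrained to an l2 ball."""
    w = np.zeros(X.shape[1])
    for t in range(steps):
        # Per-sample subgradient of |<x_i, w> - y_i| is sign(res_i) * x_i.
        res = X @ w - y
        G = np.sign(res)[:, None] * X
        g = trimmed_mean(G, eps)              # robust aggregate of subgradients
        w = w - (lr / np.sqrt(t + 1)) * g     # classic decaying step size
        norm = np.linalg.norm(w)              # project back onto the l2 ball
        if norm > radius:
            w *= radius / norm
    return w

# Demo: 10% of responses are grossly corrupted.
rng = np.random.default_rng(0)
n, d, eps = 2000, 5, 0.1
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.1 * rng.normal(size=n)
y[: int(eps * n)] += 50.0                     # adversarial outliers
w_hat = robust_projected_subgradient(X, y, eps)
print("estimation error:", np.linalg.norm(w_hat - w_true))
```

The trimmed mean merely bounds the influence of any eps-fraction of samples on each coordinate; the paper's guarantees rely on more refined robust mean estimators and on an analysis for (weakly) convex objectives, neither of which this toy example attempts to reproduce.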
How to Cite this Article:
S. Li, S.J. Wright, J. Diakonikolas, Outlier-robust nonsmooth stochastic optimization, J. Nonlinear Var. Anal. 10 (2026), 201-230.
