Hongxiang Zhang, Wenguo Yang, Ding-Zhu Du, Gradient approximation algorithms for the $\sigma$-OSS + concave function maximization
DOI: 10.23952/jnva.9.2025.1.02
Volume 9, Issue 1, 1 February 2025, Pages 15-30
Abstract. This paper studies the one-sided $\sigma$-smooth ($\sigma$-OSS) + concave function maximization problem over general convex sets. Provided that the objective function is Lipschitz smooth, we use the jump-start initial point selection technique and the Frank-Wolfe method to propose an approximation algorithm whose approximation ratio and time complexity are bounded in terms of the one-sided smoothness parameter $\sigma$ and a finite constant. The algorithm is discrete-time and incurs discretization losses. To remove the requirement that the gradient of the objective function be Lipschitz continuous, we propose a continuous-time Jump-Start Greedy Frank-Wolfe algorithm with the same approximation guarantee. In addition, through the analysis of our algorithms, we establish several theoretical properties that are important for the analysis of approximation algorithms for continuous function maximization problems.
How to Cite this Article:
H. Zhang, W. Yang, D.Z. Du, Gradient approximation algorithms for the $\sigma$-OSS + concave function maximization, J. Nonlinear Var. Anal. 9 (2025), 15-30.