Li Tan, Ke Guo, Bregman ADMM: A new algorithm for nonconvex optimization with linear constraints
DOI: 10.23952/jnva.9.2025.2.02
Volume 9, Issue 2, 1 April 2025, Pages 179-196
Abstract. The alternating direction method of multipliers (ADMM) is a widely used algorithm for solving two-block separable problems with linear constraints. However, its applicability in various fields is limited by the standard assumption that the gradient of the differentiable component is globally Lipschitz continuous, which often fails to hold in nonconvex optimization problems. To address this limitation, we propose a new version of the Bregman ADMM that reduces to the classical ADMM as a special case while avoiding the need for a globally Lipschitz continuous gradient. By relaxing this requirement, the Bregman ADMM broadens the range of problems to which the method applies. We prove that, when the associated function satisfies the Kurdyka-Łojasiewicz inequality and certain additional assumptions hold, the sequence of iterates generated by our algorithm converges to a critical point of the problem. We also analyze the convergence rate of the algorithm.
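For context, the following is a sketch of the problem class and of one common form of Bregman ADMM iteration; the precise scheme studied in the paper, in particular the choice of Bregman kernels and where the Bregman terms enter, is not given in the abstract, so the updates below are an assumption based on the standard template. The two-block problem is

\[
\min_{x,y}\; f(x) + g(y) \quad \text{subject to} \quad Ax + By = b.
\]

Writing \(\Delta_\phi(u,v) = \phi(u) - \phi(v) - \langle \nabla\phi(v),\, u - v\rangle\) for the Bregman distance generated by a differentiable convex kernel \(\phi\), a typical Bregman ADMM with penalty parameter \(\beta > 0\) and kernels \(\phi, \psi\) (hypothetical choices here) iterates

\[
\begin{aligned}
x^{k+1} &\in \arg\min_x \; f(x) + \langle \lambda^k,\, Ax + By^k - b\rangle + \tfrac{\beta}{2}\|Ax + By^k - b\|^2 + \Delta_\phi(x, x^k),\\
y^{k+1} &\in \arg\min_y \; g(y) + \langle \lambda^k,\, Ax^{k+1} + By - b\rangle + \tfrac{\beta}{2}\|Ax^{k+1} + By - b\|^2 + \Delta_\psi(y, y^k),\\
\lambda^{k+1} &= \lambda^k + \beta\,(Ax^{k+1} + By^{k+1} - b).
\end{aligned}
\]

Taking \(\phi = \psi \equiv 0\) removes the Bregman terms and recovers the classical ADMM, consistent with the abstract's remark that the proposed method reduces to the ADMM as a special case.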
How to Cite this Article:
L. Tan, K. Guo, Bregman ADMM: A new algorithm for nonconvex optimization with linear constraints, J. Nonlinear Var. Anal. 9 (2025), 179-196.