Abstract
In this paper, we study a class of constrained group sparse ℓ0 regularized optimization problems, where the loss function is convex but nonsmooth and the feasible set is defined by box constraints. First, we propose a smoothing proximal gradient block-coordinate (SPGBC) algorithm, a novel combination of the proximal gradient block-coordinate algorithm and the smoothing method. We prove that any accumulation point of the iterates generated by the algorithm is a local minimizer of the considered problem and that its zero entries are identified within finitely many iterations. Moreover, we show that the proposed SPGBC algorithm achieves a local convergence rate of O(k^{-(1-ν)}) on the objective function value, where ν ∈ (1/2, 1) comes from the decay exponent of the smoothing parameter. Second, we consider a randomized variant of the SPGBC algorithm, the R-SPGBC algorithm, and show that the iterates it generates converge to a subset of local minimizers of the original problem with probability 1. In addition, we establish that the R-SPGBC algorithm attains a sublinear convergence rate in expectation. Finally, numerical experiments are presented to demonstrate the efficiency of the proposed algorithms.
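To make the abstract's algorithmic idea concrete, the following is a minimal sketch of one plausible SPGBC-style loop for a box-constrained group sparse ℓ0 problem with a nonsmooth convex loss. It is not the paper's exact scheme: the ℓ1 regression loss, its Huber-type smoothing, the cyclic block order, and the step-size rule are all assumptions chosen for illustration. Only the two structural ingredients named in the abstract are reproduced: a smoothing parameter decaying like k^{-ν} with ν ∈ (1/2, 1), and a per-block proximal gradient step whose prox is a group hard-thresholding combined with the box projection.

```python
import numpy as np

def spgbc_sketch(A, b, groups, lam=0.1, lo=-1.0, hi=1.0,
                 nu=0.75, mu0=1.0, iters=500):
    """Illustrative smoothing proximal gradient block-coordinate loop for
        min ||Ax - b||_1 + lam * sum_g 1[x_g != 0]   s.t.  lo <= x <= hi.
    `groups` is a list of index arrays partitioning {0, ..., n-1}.
    Assumes lo <= 0 <= hi so that zeroing a group is feasible."""
    m, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2              # spectral bound: grad of f_mu is (L/mu)-Lipschitz
    for k in range(1, iters + 1):
        mu = mu0 * k ** (-nu)                  # smoothing parameter mu_k = mu0 * k^{-nu}, nu in (1/2, 1)
        r = A @ x - b
        grad = A.T @ np.clip(r / mu, -1.0, 1.0)  # gradient of the Huber-smoothed l1 loss
        alpha = mu / L                         # step size matched to the smoothed Lipschitz constant
        g = groups[k % len(groups)]            # cyclic block choice (R-SPGBC would draw g at random)
        v = x[g] - alpha * grad[g]             # forward gradient step on the selected block
        z = np.clip(v, lo, hi)                 # box projection of the nonzero candidate
        # prox of alpha*lam*||.||_0 over the box: keep z or zero the whole
        # group, whichever has the smaller proximal objective
        keep = 0.5 * np.sum((z - v) ** 2) + alpha * lam
        kill = 0.5 * np.sum(v ** 2)
        x[g] = z if keep <= kill else 0.0
    return x
```

In this sketch the finite identification of zero entries claimed in the abstract corresponds to the `kill` branch winning permanently for a group once the iterates settle; the randomized R-SPGBC variant would replace the cyclic block choice with a random draw.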
| Original language | English |
|---|---|
| Article number | 14 |
| Journal | Journal of Optimization Theory and Applications |
| Volume | 209 |
| Issue number | 1 |
| DOIs | |
| State | Published - Apr 2026 |
| Externally published | Yes |
Keywords
- Block-coordinate method
- Convergence rate
- Group sparse ℓ0 regularization
- Smoothing method