Diagonal Bundle Method for Nonsmooth Sparse Optimization

Abstract

We propose an efficient diagonal bundle method for sparse, nonsmooth, possibly nonconvex optimization. Convergence of the proposed method is proved for locally Lipschitz continuous functions, which need not be differentiable or convex. Numerical experiments were performed on problems with up to a million variables. The results confirm the usability of the diagonal bundle method, especially for extremely large-scale problems.