Convergence of the Exponentiated Gradient Method with Armijo Line Search
Journal
Journal of Optimization Theory and Applications
Journal Volume
181
Journal Issue
2
Pages
588-607
Date Issued
2019
Author(s)
Cevher, Volkan
Abstract
Consider the problem of minimizing a convex differentiable function on the probability simplex, spectrahedron, or set of quantum density matrices. We prove that the exponentiated gradient method with Armijo line search always converges to the optimum, if the sequence of the iterates possesses a strictly positive limit point (element-wise for the vector case, and with respect to the Löwner partial ordering for the matrix case). To the best of our knowledge, this is the first convergence result for a mirror descent-type method that only requires differentiability. The proof exploits self-concordant likeness of the log-partition function, which is of independent interest. © 2018, Springer Science+Business Media, LLC, part of Springer Nature.
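The abstract describes the exponentiated gradient (EG) method with Armijo backtracking on the probability simplex. As an illustrative sketch only (not the paper's exact algorithm or analysis), the vector-case iteration can be written as a multiplicative update followed by normalization, with the step size shrunk until a standard Armijo sufficient-decrease test holds; the function `eg_armijo` and all parameter defaults below are assumptions for illustration.

```python
import numpy as np

def eg_armijo(f, grad, x0, eta0=1.0, alpha=0.5, beta=0.5,
              max_iter=200, tol=1e-10):
    """Sketch of exponentiated gradient with Armijo backtracking
    on the probability simplex (illustrative; not the paper's code)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        eta = eta0
        while True:
            # EG step: multiplicative update, shifted for numerical stability
            y = x * np.exp(-eta * (g - g.max()))
            y /= y.sum()  # renormalize back onto the simplex
            # Armijo sufficient-decrease test along the EG arc
            if f(y) <= f(x) + alpha * g.dot(y - x) or eta < 1e-12:
                break
            eta *= beta  # backtrack
        if np.linalg.norm(y - x, 1) < tol:
            return y
        x = y
    return x
```

For example, minimizing f(x) = ½‖x − p‖² over the simplex with an interior point p recovers x ≈ p, since the unconstrained minimizer already lies in the simplex.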
Subjects
Armijo line search; Exponentiated gradient method; Peierls–Bogoliubov inequality; Self-concordant likeness
Type
journal article
