Open Access
On fast convergence rates for generalized conditional gradient methods with backtracking stepsize
Author(s) - Karl Kunisch, Daniel Walter
Publication year - 2022
Publication title - Numerical Algebra, Control and Optimization
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.303
H-Index - 20
eISSN - 2155-3289
pISSN - 2155-3297
DOI - 10.3934/naco.2022026
Subject(s) - sublinear function, iterated function, mathematics, convexity, rate of convergence, backtracking, differentiable function, convex function, gradient descent, convergence (economics), mathematical optimization, regular polygon, computer science, combinatorics, mathematical analysis, artificial neural network, computer network, channel (broadcasting), geometry, machine learning, financial economics, economics, economic growth
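The record gives no abstract, so for context here is a minimal, self-contained sketch of a plain conditional gradient (Frank-Wolfe) iteration with a backtracking step size, the class of scheme named in the title. It is not the authors' generalized method; the least-squares objective, the simplex feasible set, and the backtracking constants below are illustrative assumptions.

# Minimal sketch of conditional gradient (Frank-Wolfe) with a backtracking
# step size, shown on a least-squares objective over the probability simplex.
# Generic background only, not the generalized method analyzed in the paper.
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the unit simplex:
    returns the vertex (basis vector) minimizing <grad, s>."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe_backtracking(f, grad_f, x0, max_iter=200, L0=1.0,
                             eta_down=0.9, eta_up=2.0, tol=1e-8):
    """Frank-Wolfe with an adaptive (backtracking) estimate L of the local
    smoothness constant, used to choose the step via a quadratic upper bound."""
    x, L = x0.copy(), L0
    for _ in range(max_iter):
        g = grad_f(x)
        s = lmo_simplex(g)
        d = s - x
        gap = -g @ d                      # Frank-Wolfe duality gap
        if gap <= tol:
            break
        L = max(L * eta_down, 1e-12)      # optimistic decrease of the L estimate
        while True:
            gamma = min(gap / (L * (d @ d) + 1e-16), 1.0)
            # sufficient-decrease test against the quadratic model
            if f(x + gamma * d) <= f(x) - gamma * gap + 0.5 * L * gamma**2 * (d @ d):
                break
            L *= eta_up                   # backtrack: increase L and retry
        x = x + gamma * d
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad_f = lambda x: A.T @ (A @ x - b)
    x0 = np.full(10, 0.1)                 # simplex barycenter as starting point
    x = frank_wolfe_backtracking(f, grad_f, x0)
    print("objective:", f(x), "sum(x):", x.sum())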
