Abstract: We develop a novel optimistic gradient-type algorithmic framework, combining
both Nesterov's acceleration and variance-reduction techniques, to solve a
class of generalized equations involving possibly nonmonotone operators in
data-driven applications. Our framework covers a wide class of stochastic
variance-reduced schemes, including mini-batching as well as unbiased and
biased control-variate estimators. We establish that our method achieves
$\mathcal{O}(1/k^2)$ convergence rates in expectation on the squared norm of
the residual under Lipschitz continuity and a ``co-hypomonotonicity-type''
assumption, improving upon non-accelerated counterparts by a factor of $1/k$.
We also prove faster $o(1/k^2)$ convergence rates, both in expectation and
almost surely. In addition, we show that the sequence of iterates of our method
almost surely converges to a solution of the underlying problem. We demonstrate
the applicability of our method using general error-bound criteria, covering
mini-batch stochastic estimators as well as three well-known control-variate
estimators: loopless SVRG, SAGA, and loopless SARAH, where the latter three
variants attain significantly better oracle complexity compared to existing
methods. We validate our framework and theoretical results through two
numerical examples. The preliminary results illustrate the promising
performance of our accelerated method over its non-accelerated counterparts.
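To make the ingredients named above concrete, the following is a minimal, illustrative Python sketch of an optimistic-gradient step combined with Nesterov-type extrapolation and a loopless-SVRG control-variate estimator of the operator. It is not the paper's exact algorithm: the finite-sum operator $F(x) = \frac{1}{n}\sum_i (A_i x - b_i)$, the step size, batch size, momentum schedule, and snapshot probability are all hypothetical choices made only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite-sum operator F(x) = (1/n) sum_i F_i(x), F_i(x) = A_i x - b_i.
# All problem data and constants below are illustrative assumptions, not from the paper.
n, d = 50, 10
A = rng.standard_normal((n, d, d))
A = A + np.transpose(A, (0, 2, 1)) + 2.0 * d * np.eye(d)   # make each A_i well conditioned
b = rng.standard_normal((n, d))

def F_i(x, idx):
    """Average of the components F_i over the index set idx."""
    return np.mean(A[idx] @ x - b[idx], axis=0)

def F_full(x):
    """Exact operator F(x)."""
    return F_i(x, np.arange(n))

def accelerated_og_lsvrg(x0, eta=0.01, batch=8, p_snapshot=0.2, iters=2000):
    """Generic sketch: optimistic-gradient update with Nesterov-type extrapolation
    and a loopless-SVRG control-variate estimator of F (not the paper's exact scheme)."""
    x_prev = x0.copy()
    x = x0.copy()
    w, Fw = x0.copy(), F_full(x0)        # snapshot point and its exact operator value
    g_prev = Fw.copy()
    for k in range(1, iters + 1):
        t = (k - 1.0) / (k + 2.0)        # Nesterov-style momentum weight (illustrative)
        y = x + t * (x - x_prev)         # extrapolated (accelerated) point
        idx = rng.choice(n, batch, replace=False)
        g = F_i(y, idx) - F_i(w, idx) + Fw          # loopless-SVRG estimator of F(y)
        x_prev, x = x, y - eta * (2.0 * g - g_prev) # optimistic (past-estimate) correction
        g_prev = g
        if rng.random() < p_snapshot:    # loopless snapshot refresh
            w, Fw = x.copy(), F_full(x)
    return x

x_hat = accelerated_og_lsvrg(np.zeros(d))
print("squared residual ||F(x_hat)||^2 =", np.linalg.norm(F_full(x_hat)) ** 2)
```

The printed squared residual norm corresponds to the convergence measure used in the abstract; swapping the loopless-SVRG estimator for a plain mini-batch, SAGA, or loopless-SARAH estimator changes only the line that computes `g`.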