Gradient-Free Two-Point Methods for Solving Stochastic Nonsmooth Convex Optimization Problems with Small Non-Random Noises



Abstract

We study nonsmooth convex stochastic optimization problems with a two-point zero-order oracle, i.e., at each iteration one can observe the values of the function's realization at two selected points. The problem is first smoothed using the well-known double-smoothing technique of B.T. Polyak and then solved by the stochastic mirror descent method. We derive conditions on the admissible level of non-random noise in the evaluation of the function's realizations under which the estimate of the method's convergence rate is preserved.
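For orientation only, the sketch below illustrates the general class of schemes the abstract refers to: a randomized two-point zero-order estimate of a gradient of a smoothed objective combined with entropic mirror descent on the probability simplex. It is not the authors' exact algorithm; the objective f, the smoothing radius tau, the step size, and the noise model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_point_grad(f, x, tau, xi):
    """Randomized two-point estimate of a (sub)gradient of a smoothed f.

    Both evaluations share the same stochastic realization xi; any small
    non-random (adversarial) noise in computing f would enter additively here.
    """
    e = rng.normal(size=x.size)
    e /= np.linalg.norm(e)          # uniform random direction on the sphere
    d = x.size
    return d * (f(x + tau * e, xi) - f(x - tau * e, xi)) / (2.0 * tau) * e

def mirror_descent_simplex(f, x0, n_iter=1000, tau=1e-3, step=0.1):
    """Entropic (KL-prox) mirror descent on the probability simplex."""
    x = x0.copy()
    for _ in range(n_iter):
        xi = rng.normal()                    # stochastic realization
        g = two_point_grad(f, x, tau, xi)
        x = x * np.exp(-step * g)            # multiplicative (entropic) update
        x /= x.sum()                         # renormalize onto the simplex
    return x

# Toy nonsmooth convex objective whose realization carries stochastic noise.
def f(x, xi):
    return np.max(np.abs(x - 0.25)) + 0.01 * xi

print(mirror_descent_simplex(f, np.ones(4) / 4))
```

The multiplicative update is the standard entropic prox step of mirror descent on the simplex; in the paper's setting, the step size, smoothing radius, and the admissible non-random noise level are tied together so that the convergence-rate estimate is preserved.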

About the authors

A. S. Bayandina

Moscow Institute of Physics and Technology (National Research University); Skolkovo University of Science and Technology

Author for correspondence.
Email: anast.bayandina@gmail.com
Russian Federation, Moscow; Moscow

A. V. Gasnikov

Moscow Institute of Physics and Technology (National Research University); Kharkevich Institute for Information Transmission Problems

Email: anast.bayandina@gmail.com
Russian Federation, Moscow; Moscow

A. A. Lagunovskaya

Moscow Institute of Physics and Technology (National Research University)

Email: anast.bayandina@gmail.com
Russian Federation, Moscow

Supplementary Files

1. JATS XML

Copyright (c) 2018 Pleiades Publishing, Ltd.