(Reference obtained automatically from Web of Science, through the information on FAPESP funding and the corresponding grant number included in the publication by the authors.)

VARIANCE-BASED EXTRAGRADIENT METHODS WITH LINE SEARCH FOR STOCHASTIC VARIATIONAL INEQUALITIES

Author(s):
Iusem, Alfredo N. [1]; Jofre, Alejandro [2, 3]; Oliveira, Roberto I.; Thompson, Philip [4]
Total authors: 4
Author affiliations:
[1] IMPA, BR-22460320 Rio de Janeiro, RJ - Brazil
[2] CMM, Santiago 8370448 - Chile
[3] DIM, Santiago 8370448 - Chile
[4] IMPA, BR-22460320 Rio de Janeiro, RJ - Brazil
Total affiliations: 4
Document type: Journal article
Source: SIAM JOURNAL ON OPTIMIZATION; v. 29, n. 1, p. 175-206, 2019.
Web of Science citations: 1
Abstract

In this paper, we propose dynamic sampled stochastic approximated (DS-SA) extragradient methods for stochastic variational inequalities (SVIs) that are robust with respect to an unknown Lipschitz constant L. We propose, to the best of our knowledge, the first provably convergent robust SA method with variance reduction, either for SVIs or stochastic optimization, assuming just an unbiased stochastic oracle within a large sample regime. This widens the applicability and improves, up to constants, the desired efficient acceleration of previous variance reduction methods, all of which still assume knowledge of L (and, hence, are not robust against its estimate). Precisely, compared to the iteration and oracle complexities of O(ε⁻²) of previous robust methods with a small stepsize policy, our robust method uses a DS-SA line search scheme obtaining the faster iteration complexity of O(ε⁻¹) with oracle complexity of O(d ε⁻² ln L) (up to log factors in ε⁻¹) for a d-dimensional space. This matches, up to constants, the sample complexity of the sample average approximation estimator which does not assume additional problem information (such as L). Differently from previous robust methods for ill-conditioned problems, we allow an unbounded feasible set and an oracle with multiplicative noise (MN) whose variance is not necessarily uniformly bounded. These properties are appreciated in our complexity estimates which depend only on L and local variances or fourth moments at solutions. The robustness and variance reduction properties of our DS-SA line search scheme come at the expense of nonmartingale-like dependencies (NMDs) due to the needed inner statistical estimation of a lower bound for L. In order to handle an NMD and an MN, our proofs rely on a novel iterative localization argument based on empirical process theory. (AU)
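To make the setting concrete, the sketch below shows a plain stochastic extragradient iteration with growing mini-batches for a monotone affine variational inequality. This is only an illustrative baseline, not the paper's DS-SA variance-reduced line-search method: the operator `F(x) = A x + b`, the ball feasible set, the fixed stepsize `alpha`, and the batch schedule are all hypothetical choices made for the example.

```python
import numpy as np

# Illustrative sketch only: a basic stochastic extragradient step for a
# variational inequality VI(F, C). The paper's DS-SA method additionally
# uses a line search to avoid knowing the Lipschitz constant L; here a
# fixed stepsize is assumed for simplicity.

rng = np.random.default_rng(0)

A = np.array([[2.0, 1.0],
              [-1.0, 2.0]])   # symmetric part is positive definite, so F is monotone
b = np.array([1.0, -1.0])

def oracle(x, batch):
    # Unbiased stochastic estimate of F(x) = A x + b, averaged over a
    # mini-batch of noisy samples (dynamic sampling = growing batches).
    noise = rng.normal(scale=0.1, size=(batch, x.size)).mean(axis=0)
    return A @ x + b + noise

def project(x, radius=10.0):
    # Euclidean projection onto a ball, standing in for the feasible set C.
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def extragradient_step(x, alpha, batch):
    z = project(x - alpha * oracle(x, batch))      # extrapolation point
    return project(x - alpha * oracle(z, batch))   # corrected update

x = np.zeros(2)
for k in range(200):
    x = extragradient_step(x, alpha=0.1, batch=4 * (k + 1))  # batch grows with k

print(x)  # approaches the solution of A x + b = 0, i.e. roughly (-0.6, 0.2)
```

The two projected steps per iteration are the signature of the extragradient scheme; the growing batch size mimics, very loosely, the variance-reduction role of dynamic sampling discussed in the abstract.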

FAPESP Process: 13/07699-0 - Research, Innovation and Dissemination Center for Neuromathematics - NeuroMat
Grantee: Jefferson Antonio Galves
Support type: Research Grants - Research, Innovation and Dissemination Centers - CEPIDs