
Técnicas amostrais para otimização não suave (Sampling techniques for nonsmooth optimization)

Author(s):
Lucas Eduardo Azevedo Simões
Total Authors: 1
Document type: Doctoral Thesis
Imprint: Campinas, SP.
Institution: Universidade Estadual de Campinas (UNICAMP). Instituto de Matemática, Estatística e Computação Científica
Defense date:
Examining board members:
Sandra Augusta Santos; José Mario Martínez Pérez; Lucio Tunes dos Santos; Claudia Alejandra Sagastizabal; Ademir Alves Ribeiro
Advisor: Sandra Augusta Santos; Elias Salomão Helou Neto
Abstract

The Gradient Sampling (GS) method is a recently developed tool for solving unconstrained nonsmooth optimization problems. Using only first-order information about the objective function, it generalizes the steepest descent method, one of the most classical methods for minimizing a smooth function. This study develops and explores different sampling algorithms for the numerical optimization of nonsmooth functions. First, we prove that a global convergence result for the GS method holds even in the absence of the differentiability-check procedure. Second, we establish the circumstances under which the GS method can be expected to exhibit a linear convergence rate. Lastly, a new sampling algorithm with superlinear convergence is presented, which relies not only on the gradients but also on the objective function values at the sampled points. (AU)
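To make the abstract concrete, the basic GS iteration it builds on can be sketched as follows. This is a minimal, simplified illustration, not the thesis's algorithms: the function names (`gs_minimize`, `min_norm_hull`), the sampling-radius reduction rule, and all parameter values are hypothetical choices for the sketch. At each iterate the method samples gradients at nearby points, takes the minimum-norm element of their convex hull as a (negated) search direction, and performs a backtracking line search; here the min-norm subproblem is solved by a simple Frank-Wolfe (Gilbert) iteration rather than an exact QP solver.

```python
import math
import random


def min_norm_hull(gs, iters=200):
    """Approximate the minimum-norm point of conv{gs} by Frank-Wolfe
    (Gilbert's algorithm) applied to min 0.5*||p||^2 over the hull."""
    p = list(gs[0])
    for _ in range(iters):
        # Vertex most aligned with -p (Frank-Wolfe linear subproblem).
        q = min(gs, key=lambda g: sum(pi * gi for pi, gi in zip(p, g)))
        dpq = [pi - qi for pi, qi in zip(p, q)]
        denom = sum(di * di for di in dpq)
        if denom < 1e-16:
            break
        # Exact line search on the segment [p, q], clipped to [0, 1].
        t = max(0.0, min(1.0, sum(pi * di for pi, di in zip(p, dpq)) / denom))
        p = [pi - t * di for pi, di in zip(p, dpq)]
    return p


def gs_minimize(f, grad, x0, n_iter=200, m=8, eps=0.5, seed=0):
    """Sketch of a basic Gradient Sampling loop (simplified: the
    sampling radius is just halved near approximate stationarity, and
    the differentiability check is omitted)."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(n_iter):
        # Gradients at x and at m random points in the eps-ball around x.
        pts = [x] + [[xi + eps * rng.uniform(-1.0, 1.0) for xi in x]
                     for _ in range(m)]
        grads = [grad(p) for p in pts]
        g = min_norm_hull(grads)
        gnorm = math.sqrt(sum(gi * gi for gi in g))
        if gnorm <= eps:
            eps *= 0.5          # near eps-stationary: refine the radius
            continue
        d = [-gi / gnorm for gi in g]
        # Backtracking (Armijo) line search along d.
        t, fx = 1.0, f(x)
        while t > 1e-12 and \
                f([xi + t * di for xi, di in zip(x, d)]) > fx - 1e-4 * t * gnorm:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
    return x


# Illustrative nonsmooth test problem: f(x) = |x1| + 2|x2|,
# minimized at the origin, nondifferentiable on both axes.
def f_example(x):
    return abs(x[0]) + 2.0 * abs(x[1])


def grad_example(x):
    # Gradient where f is differentiable (sampled points are
    # nondifferentiable only with probability zero).
    return [math.copysign(1.0, x[0]), math.copysign(2.0, x[1])]
```

Run, for instance, `gs_minimize(f_example, grad_example, [3.0, -2.0])`: the iterates track the kinks along the axes, which is exactly the regime where plain steepest descent stalls and gradient sampling does not.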

FAPESP's process: 13/14615-7 - On the nonmonotone line search in gradient sampling methods for nonconvex and nonsmooth optimization
Grantee: Lucas Eduardo Azevedo Simões
Support Opportunities: Scholarships in Brazil - Doctorate