Bibliographic details
Year of defense: 2023
Main author: Chagas, Marcus Vinícius de Morais
Advisor: Melo, Jefferson Divino Gonçalves de
Defense committee: Melo, Jefferson Divino Gonçalves de; Gonçalves, Max Leandro Nobre; Bento, Glaydston de Carvalho; Alves, Maicon Marques
Document type: Dissertation (master's)
Access type: Open access
Language: Portuguese (por)
Defending institution: Universidade Federal de Goiás
Graduate program: Programa de Pós-graduação em Matemática (IME)
Department: Instituto de Matemática e Estatística - IME (RMG)
Country: Brazil
Keywords in Portuguese:
Keywords in English:
CNPq knowledge area:
Access link: http://repositorio.bc.ufg.br/tede/handle/tede/12854
Abstract: In this work, we analyze the Hybrid Proximal Extragradient (HPE) method for finding zeros of maximal monotone operators, and its accelerated version, the Accelerated Hybrid Proximal Extragradient (A-HPE) method, for solving convex optimization problems whose objective function is the sum of two convex functions, one differentiable with Lipschitz continuous gradient and the other not necessarily differentiable. The HPE method was introduced by Solodov and Svaiter; it is an inexact version of the proximal point method in which the proximal subproblems are solved approximately under a relative error criterion and each iteration is completed by an extragradient step. The HPE method can also be viewed as a framework, in the sense that many methods for minimizing convex functions and, more generally, for finding zeros of maximal monotone operators are instances of it, such as the extragradient method, regularized Newton-type methods, and the ADMM. In this work, we analyze both the asymptotic convergence and the iteration-complexity of the HPE method. We also analyze the iteration-complexity of the A-HPE method proposed by Monteiro and Svaiter. The A-HPE is an accelerated first-order method, i.e., it uses only function values and first derivatives or subgradients of the objective function, and it has optimal iteration-complexity.
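For concreteness, the display below sketches the composite problem addressed by the A-HPE method and one generic HPE iteration as described in the abstract, using standard notation from the HPE literature; the symbols (the operator T, its ε-enlargement T^{ε}, the stepsizes λ_k, and the relative-error parameter σ ∈ [0,1)) are notational assumptions of this sketch and are not spelled out in this record.

\[
\min_{x}\; f(x) + h(x), \qquad \text{$f$ convex with $L$-Lipschitz gradient, $h$ convex, possibly nonsmooth;}
\]
\[
\text{HPE step for } 0 \in T(z):\ \text{given } z_{k-1} \text{ and } \lambda_k > 0,\ \text{find } (\tilde z_k, v_k, \varepsilon_k) \text{ with } v_k \in T^{\varepsilon_k}(\tilde z_k),
\]
\[
\|\lambda_k v_k + \tilde z_k - z_{k-1}\|^2 + 2\lambda_k \varepsilon_k \le \sigma^2 \|\tilde z_k - z_{k-1}\|^2,
\qquad z_k = z_{k-1} - \lambda_k v_k \quad \text{(extragradient step)}.
\]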