A new SMBO-Based parameter tuning framework to optimization algorithms

Bibliographic details
Year of defense: 2019
Main author: Áthila Rocha Trindade
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Thesis
Access type: Open access
Language: eng
Defense institution: Universidade Federal de Minas Gerais
Brazil
ENG - DEPARTAMENTO DE ENGENHARIA ELÉTRICA
Programa de Pós-Graduação em Engenharia Elétrica
UFMG
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: http://hdl.handle.net/1843/32403
Abstract: A variety of algorithms have been proposed by optimization researchers for solving many different problems. Heuristics and metaheuristics are two classes of algorithms that have been widely used for practical optimization problems, owing to their ability to reach good solutions within a feasible runtime even for problems of high computational complexity (e.g., in terms of modality, non-differentiability, or NP-hardness). The balance these algorithms strike between global search and local improvement in the search space can be tuned by choosing suitable values for a number of user-defined parameters, which may be numerical or categorical. To take full advantage of the potential of heuristics and metaheuristics when solving hard optimization problems, appropriate strategies are needed for choosing adequate parameter values, that is, values which balance global search and local improvement in the search space. Earlier strategies based on choosing parameter values by trial and error did not achieve satisfactory results. In contrast, methods based on statistical modeling and inference about algorithm performance, i.e., methods that recommend parameter values by generalizing statistics observed on samples of problem instances to whole problem classes, have arisen over the past two decades. Algorithm configurations yielded by these methods not only result in better algorithm performance, but can also help researchers and practitioners experimentally investigate certain aspects of algorithmic behavior. Based on this observation, this work investigates a parameter tuning framework based on iteratively improving a statistical model of algorithm performance, which acts as a prediction model of the algorithm's behavior. The experimental results show that the framework returns best parameter values competitive with those of some classical parameter tuning methods from the literature, with the advantage of providing prediction models that can reveal the relevance of each algorithm parameter. The framework was implemented in the R language and is available to researchers in the field of optimization and metaheuristics as an open-source R package.
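
To make the SMBO idea in the abstract concrete, the sketch below shows a generic model-based tuning loop in R (the language the thesis reports using). It is an illustrative sketch only, not the thesis's package or its API, which are not named in this record: the function run_algorithm, the single numeric parameter p, the quadratic regression surrogate, and the candidate sampling scheme are all hypothetical stand-ins for the real target algorithm, parameter space, and prediction model.

## Illustrative SMBO-style tuning loop (hypothetical sketch; not the
## thesis's package API). One hypothetical numeric parameter 'p' is tuned.
set.seed(42)

## Stand-in for running the target algorithm with parameter value p on a
## problem instance and returning a performance measure (lower is better).
run_algorithm <- function(p) {
  (p - 0.3)^2 + rnorm(1, sd = 0.05)
}

## Initial design: sample a few parameter values uniformly and evaluate them.
design <- data.frame(p = runif(8))
design$y <- vapply(design$p, run_algorithm, numeric(1))

for (iter in 1:20) {
  ## Fit a simple surrogate (quadratic regression) to the observations so far;
  ## this plays the role of the algorithm behavior prediction model.
  surrogate <- lm(y ~ poly(p, 2), data = design)

  ## Propose candidate parameter values and predict their performance.
  candidates <- data.frame(p = runif(200))
  candidates$pred <- predict(surrogate, newdata = candidates)

  ## Evaluate the most promising candidate and add it to the design, so the
  ## surrogate is iteratively refined around good configurations.
  best <- candidates[which.min(candidates$pred), "p", drop = FALSE]
  best$y <- run_algorithm(best$p)
  design <- rbind(design, best)
}

## Recommended configuration: the best parameter value observed so far.
design[which.min(design$y), ]

In the framework described in the abstract, the fitted prediction model is itself an output of the tuning process, which is what allows the relevance of each algorithm parameter to be inspected after tuning.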