Use of structural information from the projection matrix for regularization of Extreme Learning Machines
| Field | Value |
|---|---|
| Year of defense | 2019 |
| Main author | |
| Advisor | |
| Defense committee | |
| Document type | Dissertation (master's) |
| Access type | Open access |
| Language | Portuguese (por) |
| Defending institution | Universidade Federal de Minas Gerais, Brasil. ENG - DEPARTAMENTO DE ENGENHARIA ELÉTRICA, Programa de Pós-Graduação em Engenharia Elétrica, UFMG |
| Graduate program | Not informed by the institution |
| Department | Not informed by the institution |
| Country | Not informed by the institution |
| Keywords in Portuguese | |
| Access link | http://hdl.handle.net/1843/39328 |
Abstract: This work evaluates the use of a linear separability measure, computed from the structure of the hidden-layer projection matrix of an Extreme Learning Machine (ELM), as prior information for automatically obtaining the regularization parameter of Tikhonov regularization. ELMs are networks that can be trained very quickly and possess the universal approximation property. Some form of regularization is usually necessary to keep ELMs from overfitting, and Tikhonov regularization is a straightforward option. This technique, however, requires selecting a regularization parameter that balances minimization of the training error against minimization of the norm of the network weights. This selection is usually carried out by cross-validation, which increases training time and, in fact, runs counter to the ELM philosophy. The proposed methodologies generate regularized models with performance similar to that obtained through cross-validation, in much shorter times. The distance matrix computed from the hidden layer of an ELM is also briefly explored, and a parameterless regularization scheme for Spiking Neuron ELMs is proposed.
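For reference, the Tikhonov-regularized ELM described in the abstract has the standard closed-form solution for the output weights, beta = (H^T H + lambda I)^{-1} H^T T, where H is the hidden-layer projection matrix and lambda is the regularization parameter. The sketch below is a minimal NumPy illustration of that baseline only, not the dissertation's method: the function names, the tanh activation, and the fixed lambda value are assumptions; the dissertation's contribution is precisely to replace the cross-validated choice of lambda with a value derived automatically from the structure of H.

```python
import numpy as np

def elm_train_tikhonov(X, T, n_hidden=100, lam=1e-2, seed=None):
    """Train a single-hidden-layer ELM with Tikhonov (ridge) regularization.

    X   : (n_samples, n_features) input matrix
    T   : (n_samples, n_outputs) target matrix
    lam : regularization parameter weighting the norm of the output weights
          (in the dissertation this is the quantity chosen automatically;
          here it is simply a fixed argument -- an assumption of this sketch)
    """
    rng = np.random.default_rng(seed)
    # Random input weights and biases, drawn once and never trained (ELM principle).
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    # Hidden-layer projection matrix H (tanh activation assumed).
    H = np.tanh(X @ W + b)
    # Closed-form Tikhonov-regularized output weights:
    # beta = (H^T H + lam * I)^{-1} H^T T
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass of the trained ELM."""
    return np.tanh(X @ W + b) @ beta
```

A typical baseline experiment would sweep `lam` over a grid with cross-validation and keep the best value; the abstract argues that this step dominates training time and is what the proposed separability-based selection avoids.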