Local resampling based on structural information of the data with regularization of artificial neural networks
| Year of defense: | 2022 |
|---|---|
| Main author: | |
| Advisor: | |
| Defense committee: | |
| Document type: | Thesis |
| Access type: | Open access |
| Language: | Portuguese (por) |
| Defending institution: | Universidade Federal de Minas Gerais, Brasil, ENG - DEPARTAMENTO DE ENGENHARIA ELÉTRICA, Programa de Pós-Graduação em Engenharia Elétrica, UFMG |
| Graduate program: | Not informed by the institution |
| Department: | Not informed by the institution |
| Country: | Not informed by the institution |
| Keywords (Portuguese): | |
| Access links: | http://hdl.handle.net/1843/39660 ; https://orcid.org/0000-0002-1293-5642 |
| Abstract: | The learning capability of artificial neural networks (ANNs) depends on the constraints imposed on the solution space, which can be defined by the number of parameters of the model or by other constraints on its search space. The complexity of a neural network is controlled through the decomposition of the expected mean squared error into the bias and variance terms of the model family. One technique used to control the tradeoff between bias and variance is regularization, which controls the variance by modifying the error function to include a penalty term. The purpose of this work is to define a large-margin classifier based on local resampling of the training set in feature space. The thesis approach is based on the addition of noise during neural network training and on structural information of the data. Experiments were carried out to compare the proposed model, called Regularization with Noise of Extreme Learning Machine (RN-ELM), against the standard Extreme Learning Machine (ELM) and the Extreme Learning Machine with regularization (ELM-REG). The results showed that the RN-ELM and ELM-REG methods yield smoother solutions and decrease the weight norm. A statistical test applied to the mean accuracy of the models showed significant differences between them. A mathematical formulation of the proposed method shows that the addition of synthetic samples has the same effect as Tikhonov regularization. The RN-ELM approach drives the separation function toward the margin region, without the need to exhaustively cover the whole input space, and its parameters are fitted using the structural information of the samples. |
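The abstract's central claim, that retraining on noise-perturbed (resampled) copies of the training data acts like Tikhonov (ridge) regularization and shrinks the output-weight norm, can be illustrated with a minimal sketch. This is not the thesis's actual RN-ELM algorithm: the function names, the choice to inject noise directly into the hidden-layer features, and all hyperparameters below are illustrative assumptions.

```python
# Hypothetical sketch of the ELM variants described in the abstract.
# Noise is added to the hidden-layer (feature-space) representation, which
# in expectation penalizes the squared norm of the output weights, the same
# effect a Tikhonov (ridge) term has. Details are assumptions, not the
# thesis's exact method.
import numpy as np

rng = np.random.default_rng(0)

def elm_hidden(X, W, b):
    """Fixed random projection + tanh: the ELM hidden layer."""
    return np.tanh(X @ W + b)

def fit_elm(H, y, lam=0.0):
    """Output weights by least squares; lam > 0 adds a Tikhonov term."""
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ y)

def fit_noise_elm(H, y, sigma=0.3, copies=20, rng=rng):
    """Fit on the original features plus noisy resampled copies.

    In expectation the extra terms contribute sigma**2 * ||beta||**2 per
    noisy row, so this approximates ridge regression on (H, y).
    """
    H_aug = [H] + [H + sigma * rng.standard_normal(H.shape)
                   for _ in range(copies)]
    y_aug = [y] * (copies + 1)
    beta, *_ = np.linalg.lstsq(np.vstack(H_aug), np.concatenate(y_aug),
                               rcond=None)
    return beta

# Toy regression problem (arbitrary synthetic data).
X = rng.standard_normal((80, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
W = rng.standard_normal((5, 40))
b = rng.standard_normal(40)
H = elm_hidden(X, W, b)

beta_plain = fit_elm(H, y)            # standard ELM
beta_ridge = fit_elm(H, y, lam=1.0)   # ELM-REG analogue
beta_noise = fit_noise_elm(H, y)      # noise-resampling analogue

print(np.linalg.norm(beta_plain),
      np.linalg.norm(beta_ridge),
      np.linalg.norm(beta_noise))
```

Both the explicit ridge penalty and the noisy resampling shrink the output-weight norm relative to the unregularized fit, mirroring the abstract's observation that RN-ELM and ELM-REG "decrease the weight norm" and yield smoother solutions.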