Bibliographic details
Year of defense: 2020
Main author: Ponte, Jackson Uchoa
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Master's thesis (Dissertação)
Access type: Open access
Language: Portuguese (por)
Defense institution: Not informed by the institution
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: http://www.repositorio.ufc.br/handle/riufc/52214
Abstract:
The multilayer perceptron (MLP) neural network is an important classical architecture of artificial neural networks that finds application in many complex pattern classification and function approximation problems. Despite its wide use, it is known that the performance of the MLP network depends strongly on the number of hidden neurons, and estimating this hyperparameter accounts for much of the time spent designing such a topology. In this work we introduce a new technique for quickly estimating the number of hidden neurons in the MLP network using KPCA (Kernel Principal Component Analysis). This technique is applied to three sets of state variables, (i) hidden neuron outputs, (ii) back-propagated errors, and (iii) back-propagated local gradients, with the aim of reducing the information redundancy in these variables. A comprehensive comparative evaluation of the proposed method using four real datasets and one synthetic dataset is carried out, targeting pattern classification and function approximation problems. The results achieved clearly indicate a superior performance of the proposed technique compared to a previously proposed version that uses linear techniques.
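The core idea described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: it assumes a matrix of hidden-neuron outputs collected over a training set, applies kernel PCA with an RBF kernel, and counts the components needed to retain a given fraction of the kernel variance as a proxy for the number of non-redundant hidden neurons. The function name, the `gamma` kernel width, and the 95% variance threshold are all illustrative assumptions.

```python
import numpy as np

def estimate_hidden_neurons(H, gamma=1.0, var_threshold=0.95):
    """Illustrative sketch: estimate a reduced number of hidden
    neurons by applying kernel PCA (RBF kernel) to a matrix H of
    hidden-neuron outputs (n_samples x n_hidden) and counting the
    components that retain `var_threshold` of the kernel variance.
    The same procedure could be applied to back-propagated errors
    or local gradients, as the abstract describes."""
    n = H.shape[0]
    # RBF (Gaussian) kernel matrix: K[i, j] = exp(-gamma * ||h_i - h_j||^2)
    sq = np.sum(H ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * H @ H.T))
    # Double-center the kernel matrix, as required by kernel PCA
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigenvalues of the centered kernel give variances along kernel PCs
    eigvals = np.linalg.eigvalsh(Kc)[::-1]          # descending order
    eigvals = np.clip(eigvals, 0.0, None)            # drop numerical negatives
    ratios = np.cumsum(eigvals) / eigvals.sum()
    # Smallest number of components whose cumulative variance ratio
    # reaches the threshold
    return int(np.searchsorted(ratios, var_threshold) + 1)
```

In this sketch the estimate can at most equal the number of samples (the rank of the kernel matrix), so in practice it would be capped at the current hidden-layer width; how the thesis maps the retained components back to a neuron count may differ from this simple variance-ratio rule.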