Bibliographic details
Year of defense: 2007
Main author: Galvão, Sebastian David Carvalho de Oliveira
Advisor: Hruschka Júnior, Estevam Rafael
Defense committee: Not informed by the institution
Document type: Dissertation (master's thesis)
Access type: Open access
Language: por (Portuguese)
Defending institution: Universidade Federal de São Carlos
Graduate program: Programa de Pós-Graduação em Ciência da Computação - PPGCC
Department: Not informed by the institution
Country: BR
Keywords in Portuguese:
CNPq knowledge area:
Access link: https://repositorio.ufscar.br/handle/20.500.14289/366
Abstract:
Knowledge Discovery in Databases (KDD) techniques have grown from the need to obtain more information from the data stored by organizations such as enterprises and research institutes. Bayesian Networks (BNs) are a probabilistic-reasoning-based model for representing knowledge and are well suited to KDD tasks. In recent years, BNs have been successfully applied in many supervised and unsupervised learning applications. The process of inducing BNs and Bayesian Classifiers (BCs) from data tries to identify a BN (or a BC) able to represent the relationships among the variables of a given data set. However, this is an NP-complete problem and, thus, its search space may become very large in most applications. That is why many algorithms use some strategy to reduce the search space and make the learning process computationally viable. In this master's thesis, a new Conditional Independence based approach to induce BCs from data is proposed and implemented. The approach relies on the Markov Blanket concept to impose constraints and optimize the traditional PC learning algorithm. Experiments performed with ten data sets revealed that the proposed approach tends to execute fewer comparisons than the traditional PC. The experiments also show that the implemented algorithm produces competitive classification rates when compared with both PC and Naive Bayes.
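The abstract only names the ingredients of the method (conditional-independence tests, the Markov Blanket concept, the PC algorithm) without giving its details. Purely as an illustration of that general idea, and not as the thesis's actual algorithm, the Python sketch below runs a PC-style skeleton search whose CI tests are restricted to the class variable and a user-supplied candidate Markov blanket, which is the kind of restriction that reduces the number of comparisons relative to running PC over all variables. The chi-square test, the significance threshold, and all function and parameter names are assumptions made for this example.

```python
# Illustrative sketch only: a PC-style skeleton search restricted to a
# candidate Markov blanket of the class variable. Not the thesis's algorithm.
from itertools import combinations

import numpy as np
from scipy.stats import chi2, chi2_contingency


def ci_test(data, x, y, cond, alpha=0.05):
    """Test X independent of Y given `cond` on integer-coded discrete data.

    `data` is a 2-D int array (rows = samples, columns = variables).
    Chi-square statistics and degrees of freedom are summed over the
    strata defined by the conditioning variables.
    """
    if not cond:
        strata = [np.ones(len(data), dtype=bool)]
    else:
        keys = data[:, cond]
        _, inv = np.unique(keys, axis=0, return_inverse=True)
        strata = [inv == k for k in range(inv.max() + 1)]

    stat, dof = 0.0, 0
    for mask in strata:
        xs, ys = data[mask, x], data[mask, y]
        if len(xs) < 5:
            continue  # too few samples in this stratum to test reliably
        table = np.zeros((xs.max() + 1, ys.max() + 1))
        np.add.at(table, (xs, ys), 1)
        table = table[table.sum(1) > 0][:, table.sum(0) > 0]  # drop empty rows/cols
        if table.shape[0] < 2 or table.shape[1] < 2:
            continue
        s, _, d, _ = chi2_contingency(table)
        stat, dof = stat + s, dof + d
    if dof == 0:
        return True  # no usable evidence of dependence
    return chi2.sf(stat, dof) > alpha  # True => accept independence


def mb_restricted_skeleton(data, class_var, mb_candidates, alpha=0.05):
    """PC-style skeleton search over {class_var} union mb_candidates only.

    Starts from a complete graph on this reduced variable set and removes an
    edge whenever a CI test, conditioned on subsets of the remaining
    neighbours, accepts independence (the usual PC edge-deletion rule).
    """
    nodes = [class_var] + list(mb_candidates)
    adj = {v: set(nodes) - {v} for v in nodes}
    level = 0  # size of the conditioning sets tried at this pass
    while any(len(adj[v]) - 1 >= level for v in nodes):
        for x in nodes:
            for y in list(adj[x]):
                others = adj[x] - {y}
                if len(others) < level:
                    continue
                for cond in combinations(others, level):
                    if ci_test(data, x, y, list(cond), alpha):
                        adj[x].discard(y)
                        adj[y].discard(x)
                        break
        level += 1
    return adj


# Example usage on synthetic data: column 0 plays the role of the class
# variable, columns 1-3 are assumed Markov-blanket candidates.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.integers(0, 2, size=(500, 4))
    print(mb_restricted_skeleton(demo, class_var=0, mb_candidates=[1, 2, 3]))
```

Because the search runs over only the class variable and its candidate blanket rather than the full variable set, the number of CI tests grows with the blanket size instead of the data set's dimensionality, which is consistent with the abstract's claim of fewer comparisons than the traditional PC.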