Visual novelty detection for autonomous inspection robots

Bibliographic details
Year of defence: 2006
Main author: Vieira Neto, Hugo
Supervisor: Nehmzow, Ulrich
Examination committee: Not informed by the institution
Document type: Thesis
Access type: Open access
Language: eng
Awarding institution: University of Essex
Postgraduate programme: Department of Computer Science
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: http://repositorio.utfpr.edu.br/jspui/handle/1/644
Abstract: Mobile robot applications that involve automated exploration and inspection of environments often depend on novelty detection, the ability to differentiate between common and uncommon perceptions. Because novelty can be anything that deviates from the normal context, we argue that in order to implement a novelty filter it is necessary to exploit the robot's sensory data from the ground up, building models of normality rather than abnormality. In this work we use unrestricted colour visual data as perceptual input to on-line incremental learning algorithms. Unlike other sensor modalities, vision can provide a variety of useful information about the environment through massive amounts of data, which often need to be reduced for real-time operation. Here we use mechanisms of visual attention to select candidate image regions to be encoded and fed to higher levels of processing, enabling the localisation of novel features within the input image frame. An extensive series of experiments using visual input, obtained by a real mobile robot interacting with laboratory and medium-scale real-world environments, is used to discuss different visual novelty filter configurations. We compare the performance and functionality of novelty detection mechanisms based on the Grow-When-Required neural network and incremental Principal Component Analysis. Results are assessed using both qualitative and quantitative methods, demonstrating the advantages and disadvantages of each investigated approach.
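
The kind of on-line novelty filter the abstract describes can be illustrated with a minimal Python sketch of a Grow-When-Required-style network operating on feature vectors (for example, vectors encoded from attended image regions). All parameters here (activity threshold, habituation constants, learning rates) and the simplified habituation rule are illustrative assumptions, not the implementation evaluated in the thesis:

import numpy as np

class GWRNoveltyFilter:
    """Minimal GWR-style novelty filter sketch: nodes model 'normality';
    inputs that are poorly matched by a well-trained node are flagged novel."""

    def __init__(self, activity_threshold=0.8, firing_threshold=0.1,
                 eps_best=0.1, eps_neigh=0.01, tau=3.0):
        self.nodes = []                 # weight vectors (model of normality)
        self.firing = []                # habituation counters, start at 1.0
        self.a_T = activity_threshold   # insertion threshold on activity
        self.h_T = firing_threshold     # insertion threshold on habituation
        self.eps_best = eps_best        # learning rate for the best match
        self.eps_neigh = eps_neigh      # learning rate for the runner-up
        self.tau = tau                  # habituation decay constant

    def update(self, x):
        """Present one feature vector; return True if it is flagged as novel."""
        x = np.asarray(x, dtype=float)
        if len(self.nodes) < 2:
            # Bootstrap: the first two inputs simply become nodes.
            self.nodes.append(x.copy())
            self.firing.append(1.0)
            return True

        # Best and second-best matching nodes by Euclidean distance.
        dists = [np.linalg.norm(x - w) for w in self.nodes]
        order = np.argsort(dists)
        b, s = int(order[0]), int(order[1])
        activity = float(np.exp(-dists[b]))   # close to 1 for familiar inputs
        novel = activity < self.a_T

        if novel and self.firing[b] < self.h_T:
            # The best match is well trained yet still a poor fit:
            # grow a new node halfway between the input and the best match.
            self.nodes.append(0.5 * (x + self.nodes[b]))
            self.firing.append(1.0)
        else:
            # Adapt the best match and runner-up towards the input
            # (learning normality) and habituate the winner.
            self.nodes[b] = self.nodes[b] + self.eps_best * self.firing[b] * (x - self.nodes[b])
            self.nodes[s] = self.nodes[s] + self.eps_neigh * self.firing[s] * (x - self.nodes[s])
            self.firing[b] -= self.firing[b] / self.tau
        return novel

Hypothetical usage with low-dimensional feature vectors standing in for encoded image regions: familiar inputs are gradually absorbed into the model, while an input far from everything seen before is flagged.

rng = np.random.default_rng(0)
filt = GWRNoveltyFilter()
for _ in range(200):
    filt.update(rng.normal(0.0, 0.1, size=2))      # inspection run, normal scene
print(filt.update(rng.normal(0.0, 0.1, size=2)))   # familiar input -> usually False
print(filt.update(np.array([5.0, 5.0])))           # far from normality -> True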