Observação dinâmica cooperativa (Cooperative Dynamic Observation)
Defense year: | 2006 |
---|---|
Main author: | |
Advisor: | |
Defense committee: | |
Document type: | Thesis |
Access type: | Open access |
Language: | por |
Defense institution: | Universidade Federal de Minas Gerais (UFMG) |
Graduate program: | Not informed by the institution |
Department: | Not informed by the institution |
Country: | Not informed by the institution |
Keywords in Portuguese: | |
Access link: | http://hdl.handle.net/1843/RVMR-6QGJSQ |
Abstract: | Surveillance and security systems based on visual sensors are now a very common approach. International terrorism and the growth of urban violence have spurred new applications of computer vision systems, which can be found in international ports, airports, and the train and subway stations of all large urban centers. A common approach is based on analog closed-circuit television, with scene analysis, control, and decision-making centered on a human operator. Modern and sophisticated surveillance systems based on computer vision, however, enable the integration of views from many cameras into a single, consistent scene representation. This thesis addresses the problem of multi-camera target observation, which we name Cooperative Dynamic Observation: finding the camera poses that make it possible to observe moving targets of interest and their trajectories, based on visual information shared between cameras. In this problem, observation means target identification and tracking; dynamic refers to the cameras' ability to move, provided by mobile robots with navigation and positioning capabilities; and cooperation refers to self-organized camera positioning based on visual information shared over a communication network. To address this problem, we developed a framework that computes the camera poses from visual information acquired and shared by a camera communication network. The framework is modelled as three principal modules: the tracker, the camera position planner, and the target/trajectory association module. The tracker is based on a distributed particle filter that fuses, in real time, target motion and visual information given by colors. Position planning is done by an observation function with optical and environmental constraints. Target trajectories between cameras with disjoint fields of view are matched by an adaptation of the classical EM (Expectation-Maximization) algorithm. The robustness of the framework is analyzed and tested in real experiments using a systematic experimental protocol developed for this purpose. |
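The abstract describes a tracker based on a particle filter that fuses target motion with color cues. As a generic illustration only (not the thesis's actual distributed formulation), a minimal single-target particle filter can be sketched as follows; the state layout, noise levels, and the `measure_hist` sensor callback are all hypothetical:

```python
import math
import random

def predict(particles, dt=1.0, noise=0.5):
    """Propagate particles (x, y, vx, vy) with a constant-velocity
    motion model plus Gaussian process noise (hypothetical values)."""
    out = []
    for (x, y, vx, vy) in particles:
        out.append((x + vx * dt + random.gauss(0, noise),
                    y + vy * dt + random.gauss(0, noise),
                    vx + random.gauss(0, noise * 0.1),
                    vy + random.gauss(0, noise * 0.1)))
    return out

def color_likelihood(hist_obs, hist_ref):
    """Bhattacharyya-style similarity between two normalized color
    histograms; equals 1.0 when the histograms match exactly."""
    bc = sum(math.sqrt(p * q) for p, q in zip(hist_obs, hist_ref))
    return math.exp(-(1.0 - bc))

def update(particles, measure_hist, ref_hist):
    """Weight each particle by the color cue observed at its predicted
    position, then resample (stratified resampling)."""
    weights = [color_likelihood(measure_hist(x, y), ref_hist)
               for (x, y, _, _) in particles]
    total = sum(weights)
    cum, s = [], 0.0
    for w in weights:
        s += w / total
        cum.append(s)
    n = len(particles)
    resampled, j = [], 0
    for i in range(n):
        u = (i + random.random()) / n  # one draw per equal stratum
        while j < n - 1 and cum[j] < u:
            j += 1
        resampled.append(particles[j])
    return resampled
```

In the thesis's setting the filter is distributed, with cameras exchanging information over the network; the sketch above only shows the predict/weight/resample cycle that any such filter shares.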
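The abstract also mentions matching trajectories across cameras with disjoint fields of view via an adaptation of the classical EM algorithm. As a textbook illustration of the underlying idea (not the thesis's adaptation), EM can fit a Gaussian mixture to inter-camera transit times, with the soft responsibilities serving as association probabilities; all parameters and the 1-D setup are assumptions for the sketch:

```python
import math

def em_gmm_1d(samples, k=2, iters=50):
    """Fit a k-component 1-D Gaussian mixture by EM.
    Returns (weights, means, variances) and per-sample responsibilities."""
    srt = sorted(samples)
    # Deterministic quantile initialization of the means
    mu = [srt[int((j + 0.5) * len(srt) / k)] for j in range(k)]
    var = [1.0] * k
    pi = [1.0 / k] * k
    resp = []
    for _ in range(iters):
        # E-step: responsibilities r[i][j] ∝ pi_j * N(x_i | mu_j, var_j)
        resp = []
        for x in samples:
            p = [pi[j] * math.exp(-(x - mu[j]) ** 2 / (2 * var[j]))
                 / math.sqrt(2 * math.pi * var[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p] if s > 0 else [1.0 / k] * k)
        # M-step: re-estimate weights, means, variances from soft counts
        for j in range(k):
            nj = sum(r[j] for r in resp)
            pi[j] = nj / len(samples)
            mu[j] = sum(r[j] * x for r, x in zip(resp, samples)) / nj
            var[j] = max(sum(r[j] * (x - mu[j]) ** 2
                             for r, x in zip(resp, samples)) / nj, 1e-6)
    return (pi, mu, var), resp
```

Each responsibility row gives the posterior probability that a transit-time observation belongs to each route/association hypothesis, which is the role association plays in disjoint-view matching.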