Bibliographic details
Year of defense: 2006
Main author: Lopes, Altieres Ribeiro
Advisor: Araujo, Regina Borges de
Defense committee: Not informed by the institution
Document type: Dissertation
Access type: Open access
Language: Portuguese
Defending institution: Universidade Federal de São Carlos
Graduate program: Programa de Pós-Graduação em Ciência da Computação - PPGCC
Department: Not informed by the institution
Country: BR
Keywords in Portuguese:
CNPq knowledge area:
Access link: https://repositorio.ufscar.br/handle/20.500.14289/337
Abstract:
Systems for emergency preparedness support, especially those for accurate monitoring of physical environments subject to emergency situations, are valuable resources for companies and public civil defense institutions, since they can help avoid and/or reduce loss of lives and property. Most monitoring systems described in the literature have limitations, such as: no a posteriori visualization of emergency situations that have occurred; restriction to specific types of applications; inaccurate identification of risk situations; etc. In this work, a system was proposed and evaluated that aims to overcome these limitations through the integrated use of wireless sensor and actor networks, context-aware computing and virtual reality. The work consisted of the creation, implementation and evaluation of a 3D media recording and playback system in which sensors with processing and communication resources are deployed in physical environments subject to emergency situations. These sensors capture and interpret contexts, which are mapped, through a visual language, onto a 3D virtual environment that mimics the physical environment. The use of virtual reality for real-time or a posteriori visualization of, and access to, situations occurring in the physical environment, through a 3D virtual environment, can overcome the limitations of hypermedia interfaces or continuous media, such as video, when the experiences of the real world are very complex. This work describes the design of a recording and playback system that allows users to replay live experiences gathered from the real world for analysis, evaluation, monitoring and training. The novelty of the system resides in two aspects: it uses an optimized recording technique that saves processing time and storage space, and it records scene-update commands independently of 3D players, allowing the collaborative virtual environment (CVE) to be visualized in any existing 3D web player. In collaboration with the Arts and Communication Department (DAC) of UFSCar, a visual language for prompt identification of emergency situations was created, as well as an interface to complex systems. Examples of use include the monitoring of industrial plants, flight rehearsals, oil exploration platforms, etc. This work is part of a collaborative project between the Networked Virtual Reality Lab (LRVNet) of the Computer Science Department at UFSCar and the PARADISE Lab of SITE at the University of Ottawa.
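The player-independent recording of scene-update commands mentioned in the abstract can be illustrated with a short sketch. The snippet below is hypothetical: the dissertation's actual command format, storage scheme and player interfaces are not described in this record, so all names (SceneUpdateRecorder, replay, the JSON log) are assumptions used only to show the general idea of logging timestamped, player-agnostic update deltas and replaying them through a thin adapter for whatever 3D web player is available.

```python
# Hypothetical sketch of player-independent scene-update recording and replay.
# The command schema and adapter interface are assumptions, not the system's API.
import json
import time
from typing import Callable, Dict, List


class SceneUpdateRecorder:
    """Records scene-update commands independently of any specific 3D player."""

    def __init__(self) -> None:
        self._log: List[Dict] = []
        self._start = time.time()

    def record(self, node_id: str, attribute: str, value) -> None:
        # Store only the change (a delta), not a full scene snapshot,
        # which is one way to save processing time and storage space.
        self._log.append({
            "t": time.time() - self._start,  # seconds since recording started
            "node": node_id,                 # e.g. a sensor's counterpart in the 3D scene
            "attr": attribute,               # e.g. "color", "position", "alarm_level"
            "value": value,
        })

    def save(self, path: str) -> None:
        with open(path, "w") as fh:
            json.dump(self._log, fh)


def replay(path: str, apply_update: Callable[[str, str, object], None]) -> None:
    """Replays a recorded session through any player via an adapter callback."""
    with open(path) as fh:
        log = json.load(fh)
    last_t = 0.0
    for cmd in log:
        time.sleep(cmd["t"] - last_t)        # preserve the original timing
        last_t = cmd["t"]
        apply_update(cmd["node"], cmd["attr"], cmd["value"])


if __name__ == "__main__":
    # Example: a sensor reports a high-temperature context, which the visual
    # language maps to a red cue on its 3D counterpart (illustrative only).
    rec = SceneUpdateRecorder()
    rec.record("boiler_07", "color", "red")
    rec.record("boiler_07", "alarm_level", 3)
    rec.save("session.json")

    # Adapter for a hypothetical 3D web player; any player could be driven
    # the same way by translating the generic commands into its own calls.
    replay("session.json", lambda n, a, v: print(f"update {n}.{a} = {v}"))
```

Because the log holds only generic commands, the same recorded session could in principle be rendered by different 3D web players, each through its own small adapter, which is the portability property the abstract highlights.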