A portable mobile terrestrial system with omnidirectional camera for close range applications

Bibliographic details
Year of defense: 2019
Main author: Campos, Mariana Batista
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Doctoral thesis
Access type: Open access
Language: eng
Defending institution: Universidade Estadual Paulista (Unesp)
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: http://hdl.handle.net/11449/182022
Abstract: This research proposes a new technique for close-range mobile data acquisition and processing, consisting of a backpack-mounted, lightweight, low-cost system. The system integrates an omnidirectional camera and a GPS/IMU unit (Global Positioning System/Inertial Measurement Unit) with a tailored photogrammetric processing chain to obtain the sensor location and 3D point coordinates from the fisheye images. Omnidirectional systems, based on multiple cameras covering a full spherical field of view, have been used in close-range photogrammetry applications. Their use is motivated mainly by the 360° coverage around the sensor, which allows more features to be tracked in a single image shot, and by the light weight and low cost of some off-the-shelf omnidirectional cameras. Such systems have been named Personal Mobile Terrestrial Systems (PMTS), and only a few studies have focused on PMTS based on omnidirectional cameras. This research assessed the performance of an omnidirectional PMTS built exclusively from low-cost technologies to indirectly estimate forest and outdoor urban features. An accuracy evaluation of the GPS and IMU sensors was performed, and a rigorous photogrammetric processing chain considering fisheye geometry was developed. The PMTS data, i.e., fisheye images and navigation data, are the input to the photogrammetric process, which focuses on omnidirectional camera modelling, feature-based matching and bundle adjustment considering fisheye geometry. Experimental assessments showed that an integrated sensor orientation approach, using the navigation data as initial information together with a rigorous photogrammetric process, can increase trajectory accuracy, especially in obstructed areas such as dense tropical forests and some urban environments. Overall, using a post-processing approach with PMTS data and ground control points, the exterior orientation parameters (EOPs) were estimated with standard deviations of 0.1° for sensor attitude and centimetre-level accuracy for sensor position. The generated point cloud had an accuracy consistent with the pixel size in object-space units in the central part of the omnidirectional images (3.5–8 cm). A real-time PMTS application was also simulated in the laboratory: the PMTS trajectory was estimated with a planialtimetric accuracy of 0.7 m, while the 3D map was simultaneously computed with an average accuracy between 0.5 m and 2 m.
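As a rough illustration of the fisheye camera modelling step mentioned in the abstract, the sketch below projects a 3D object point into fisheye image coordinates using a generic equidistant model with a simple polynomial distortion term. The record does not state which fisheye model, calibration parameters or distortion formulation the thesis actually adopts, so the function name, parameters and coefficients here are illustrative assumptions, not the author's method.

```python
import numpy as np

def project_fisheye_equidistant(X_obj, R, t, f, c, k=(0.0, 0.0)):
    """Project an object-space 3D point into fisheye image coordinates.

    X_obj : (3,) point in object space
    R, t  : rotation matrix and camera position (exterior orientation parameters)
    f     : focal length in pixels
    c     : (cx, cy) principal point in pixels
    k     : radial distortion coefficients (k1, k2) -- illustrative values only
    """
    # Transform the point into the camera frame
    Xc = R @ (np.asarray(X_obj, dtype=float) - np.asarray(t, dtype=float))
    x, y, z = Xc
    # Angle between the optical axis and the incoming ray
    theta = np.arctan2(np.hypot(x, y), z)
    # Equidistant model: radial image distance proportional to theta,
    # here with a simple polynomial distortion correction
    theta_d = theta * (1.0 + k[0] * theta**2 + k[1] * theta**4)
    r = f * theta_d
    phi = np.arctan2(y, x)
    u = c[0] + r * np.cos(phi)
    v = c[1] + r * np.sin(phi)
    return np.array([u, v])

if __name__ == "__main__":
    # Example: camera at the origin, aligned with the object frame,
    # observing a point 5 m ahead and 2 m to the side
    R = np.eye(3)
    t = np.zeros(3)
    uv = project_fisheye_equidistant([2.0, 0.0, 5.0], R, t,
                                     f=450.0, c=(960.0, 960.0))
    print(uv)
```

In a bundle adjustment such as the one described above, a projection function of this kind would supply the predicted image coordinates whose residuals against the matched fisheye observations are minimised to refine the EOPs and 3D point coordinates.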