DepthLiDAR: Active Segmentation of Environment Depth Map into Mobile Sensors
Main Author: | Limeira M. |
---|---|
Publication Date: | 2021 |
Other Authors: | Piardi L., Kalempa V.C., Leitao P., De Oliveira A.S. |
Format: | Article |
Language: | eng |
Source: | Repositório Institucional da Udesc |
dARK ID: | ark:/33523/00130000067vw |
Download full: | https://repositorio.udesc.br/handle/UDESC/3621 |
Summary: | This paper presents a novel approach for creating virtual LiDAR scanners through the active segmentation of point clouds. The method segments a top-view point cloud into virtual LiDAR sensors that can support the intelligent behavior of autonomous agents. The segmentation is correlated with visual tracking of the agent to localize it in the environment and in the point cloud. Virtual LiDAR sensors with different characteristics and positions can then be generated. This method, referred to as the DepthLiDAR approach, is rigorously evaluated to quantify its performance and determine its advantages and limitations. An extensive set of experiments is conducted with real and virtual LiDAR sensors to compare both approaches. The objective is to propose a novel method for incorporating spatial perception into warehouses, in line with Industry 4.0; the approach is therefore tested in a small-scale warehouse to incorporate realistic features. The analysis of the experiments shows a measurement improvement of 52.24% compared with a conventional LiDAR. |
Journal: | IEEE Sensors Journal, vol. 21, no. 17, pp. 19047-19057 |
ISSN: | 1558-1748 |
DOI: | 10.1109/JSEN.2021.3088007 |
Access Rights: | Open access |
Institution: | Universidade do Estado de Santa Catarina (UDESC) |
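
The record above only summarizes the DepthLiDAR idea of segmenting a top-view point cloud into virtual LiDAR sensors placed at arbitrary poses. As a rough illustration of that general concept, and not the authors' implementation, the sketch below ray-casts a virtual 2D LiDAR scan over a top-view occupancy grid derived from an overhead depth map; the grid representation, function names, and parameters are all assumptions.

```python
# Minimal sketch, not the authors' DepthLiDAR implementation: emulate a 2-D LiDAR
# by ray-casting over a top-view occupancy grid (e.g. obtained by thresholding an
# overhead depth map). Grid layout, names, and parameters are assumptions.
import numpy as np


def virtual_lidar_scan(occupancy, origin_xy, resolution, sensor_xy, sensor_yaw,
                       n_beams=360, max_range=5.0):
    """Cast n_beams rays from sensor_xy (metres, world frame) across a boolean
    occupancy grid (True = obstacle) and return one range reading per beam."""
    angles = sensor_yaw + np.linspace(0.0, 2.0 * np.pi, n_beams, endpoint=False)
    ranges = np.full(n_beams, max_range)
    step = 0.5 * resolution                        # march in half-cell increments
    n_steps = int(max_range / step)
    for i, ang in enumerate(angles):
        direction = np.array([np.cos(ang), np.sin(ang)])
        for k in range(1, n_steps + 1):
            p = sensor_xy + k * step * direction             # point along the beam
            col = int((p[0] - origin_xy[0]) / resolution)    # world x -> grid column
            row = int((p[1] - origin_xy[1]) / resolution)    # world y -> grid row
            if not (0 <= row < occupancy.shape[0] and 0 <= col < occupancy.shape[1]):
                break                                        # beam left the map
            if occupancy[row, col]:
                ranges[i] = k * step                         # first occupied cell hit
                break
    return angles, ranges


if __name__ == "__main__":
    # 10 m x 10 m map at 5 cm resolution with one 1 m x 1 m box obstacle
    # (x: 4.0-5.0 m, y: 3.5-4.5 m), scanned from a virtual sensor at (2.0, 2.0) m.
    res = 0.05
    grid = np.zeros((200, 200), dtype=bool)
    grid[70:90, 80:100] = True
    angles, ranges = virtual_lidar_scan(grid,
                                        origin_xy=np.array([0.0, 0.0]),
                                        resolution=res,
                                        sensor_xy=np.array([2.0, 2.0]),
                                        sensor_yaw=0.0)
    print(f"closest return: {ranges.min():.2f} m "
          f"at beam angle {np.degrees(angles[ranges.argmin()]):.0f} deg")
```

In the paper's setting the grid would come from the segmented overhead depth map and the sensor pose from visual tracking of the agent; here both are synthetic placeholders.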