A Deep Learning Approach for Red Lesions Detection in Video Capsule Endoscopies
Main Author: | Coelho, Paulo |
---|---|
Publication Date: | 2018 |
Other Authors: | Pereira, Ana; Leite, Argentina; Salgado, Marta; Cunha, António |
Format: | Article |
Language: | eng |
Source: | Repositórios Científicos de Acesso Aberto de Portugal (RCAAP) |
Download full text: | http://hdl.handle.net/10400.8/3645 |
Summary: | Wireless capsule endoscopy has revolutionized the early diagnosis of small bowel diseases. However, a single examination produces up to 10 h of video and takes 30–120 min to read, so computational methods are needed to increase both the efficiency and the accuracy of diagnosis. In this paper, an evaluation of the deep learning U-Net architecture for detecting and segmenting red lesions in the small bowel is presented, and its results are compared with those reported in the literature. To bring the evaluation closer to clinical practice, the U-Net was also evaluated on an annotated sequence against the Suspected Blood Indicator (SBI) tool. The results show that detection and segmentation with the U-Net outperformed both the algorithms from the literature review and the SBI tool. |
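The abstract above mentions the U-Net, an encoder-decoder convolutional network with skip connections originally proposed for biomedical image segmentation. As a rough illustration of what such a red-lesion segmentation model looks like, the sketch below builds a small U-Net-style network in PyTorch; the framework choice, layer widths, depth, input size, and single-channel binary output are assumptions for illustration only, not the configuration used in the paper.

```python
# Illustrative sketch only: a small U-Net-style encoder-decoder for binary
# segmentation (red lesion vs. background) of capsule-endoscopy frames.
# Framework, depth, and channel counts are assumptions, not the paper's setup.
import torch
import torch.nn as nn


def double_conv(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU: the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )


class SmallUNet(nn.Module):
    def __init__(self, in_channels=3, out_channels=1):
        super().__init__()
        # Contracting path (encoder).
        self.enc1 = double_conv(in_channels, 32)
        self.enc2 = double_conv(32, 64)
        self.pool = nn.MaxPool2d(2)
        # Bottleneck at the lowest resolution.
        self.bottleneck = double_conv(64, 128)
        # Expanding path (decoder) with skip connections from the encoder.
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = double_conv(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = double_conv(64, 32)
        # 1x1 convolution producing one logit per pixel.
        self.head = nn.Conv2d(32, out_channels, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                   # full resolution
        e2 = self.enc2(self.pool(e1))       # 1/2 resolution
        b = self.bottleneck(self.pool(e2))  # 1/4 resolution
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                # raw logits; apply sigmoid for a mask


if __name__ == "__main__":
    # Dummy RGB frame: batch of 1, 256x256 pixels.
    model = SmallUNet()
    frame = torch.randn(1, 3, 256, 256)
    print(model(frame).shape)  # torch.Size([1, 1, 256, 256])
```

In practice such a network would be trained on annotated frames with a pixel-wise loss (for example binary cross-entropy or a Dice loss), and the predicted masks would be thresholded both to delineate lesions and to flag candidate frames for review.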
id
RCAP_d09da6b07255f05f99e731870a5dce9a
oai_identifier_str |
oai:iconline.ipleiria.pt:10400.8/3645 |
network_acronym_str |
RCAP |
network_name_str |
Repositórios Científicos de Acesso Aberto de Portugal (RCAAP) |
repository_id_str |
https://opendoar.ac.uk/repository/7160 |
dc.title.none.fl_str_mv |
A Deep Learning Approach for Red Lesions Detection in Video Capsule Endoscopies |
dc.contributor.none.fl_str_mv |
Repositório IC-Online |
dc.contributor.author.fl_str_mv |
Coelho, Paulo; Pereira, Ana; Leite, Argentina; Salgado, Marta; Cunha, António
dc.subject.por.fl_str_mv |
Lesion detection; Gastrointestinal bleeding; Machine learning; Capsule endoscopy; Deep learning; U-Net; Computer Science; Image Analysis
publishDate |
2018 |
dc.date.none.fl_str_mv |
2018-11-13T17:12:12Z; 2018; 2018-01-01T00:00:00Z
dc.type.status.fl_str_mv |
info:eu-repo/semantics/publishedVersion |
dc.type.driver.fl_str_mv |
info:eu-repo/semantics/article |
dc.identifier.uri.fl_str_mv |
http://hdl.handle.net/10400.8/3645 |
dc.language.iso.fl_str_mv |
eng |
dc.relation.none.fl_str_mv |
978-3-319-92999-6 (ISBN)
https://doi.org/10.1007/978-3-319-93000-8_63 (DOI)
dc.rights.driver.fl_str_mv |
info:eu-repo/semantics/openAccess |
dc.format.none.fl_str_mv |
application/pdf |
dc.publisher.none.fl_str_mv |
Springer, Cham |
dc.source.none.fl_str_mv |
reponame: Repositórios Científicos de Acesso Aberto de Portugal (RCAAP)
instname: FCCN, serviços digitais da FCT – Fundação para a Ciência e a Tecnologia
instacron: RCAAP
repository.name.fl_str_mv |
Repositórios Científicos de Acesso Aberto de Portugal (RCAAP) - FCCN, serviços digitais da FCT – Fundação para a Ciência e a Tecnologia |
repository.mail.fl_str_mv |
info@rcaap.pt |