Automatic segmentation of bone canals in histological images using fully convolutional neural networks
| Field | Value |
|---|---|
| Year of defense | 2024 |
| Main author | |
| Advisor | |
| Defense committee | |
| Document type | Dissertation |
| Access type | Open access |
| Language | Portuguese (por) |
| Defending institution | Universidade Federal de Uberlândia, Brazil, Programa de Pós-graduação em Ciência da Computação |
| Graduate program | Not informed by the institution |
| Department | Not informed by the institution |
| Country | Not informed by the institution |
| Keywords (Portuguese) | |
| Access links | https://repositorio.ufu.br/handle/123456789/43317 https://doi.org/10.14393/ufu.di.2024.5120 |
| Abstract | In this work, a method for segmenting bone canals in whole-slide histological images is proposed. The method uses a neural network originally developed to segment oral-cavity tumors in Hematoxylin and Eosin-stained histological images, adapted here to the context of bone canals. The dataset consists of 65 whole-slide images stained with Hematoxylin and Eosin, extracted from the femur of healthy Wistar rats. With the assistance of a histology expert, the images were analyzed and their bone canals manually annotated, producing a set of binary masks. Images from both sets (original images and binary masks) were then divided into 640×640-pixel sub-images. The network was trained and validated with 2037 sub-images. Training also included a data augmentation strategy with seven possible image variations. The method was evaluated by comparing the regions segmented by the network against the specialist's annotations: accuracy, specificity, sensitivity, precision, Intersection over Union, and F1-score of the resulting segmentations were computed. Additionally, the method was compared with another automatic bone canal segmentation method from the literature. The proposed method proved efficient and outperformed the compared method, achieving an F1-score of 84.9% and an Intersection over Union of 73.7%, along with good qualitative results. |
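The abstract describes dividing the whole-slide images and their masks into 640×640-pixel sub-images. A minimal sketch of such non-overlapping tiling, assuming zero-padding at the borders so every tile is full-sized (the dissertation's exact border handling is not stated here):

```python
import numpy as np

def tile_image(img, tile=640):
    """Split an image of shape (H, W) or (H, W, C) into non-overlapping
    tile x tile patches, zero-padding the bottom/right borders so that
    every patch is exactly tile x tile."""
    h, w = img.shape[:2]
    pad_h = (tile - h % tile) % tile
    pad_w = (tile - w % tile) % tile
    pad = [(0, pad_h), (0, pad_w)] + [(0, 0)] * (img.ndim - 2)
    padded = np.pad(img, pad)
    patches = []
    for y in range(0, padded.shape[0], tile):
        for x in range(0, padded.shape[1], tile):
            patches.append(padded[y:y + tile, x:x + tile])
    return patches
```

Applying the same function to an original image and to its binary mask keeps the patch pairs aligned, which is what the training set of 2037 sub-images requires.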
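The "seven possible image variations" used for augmentation are not specified in the abstract; one common reading for square patches is the seven non-identity symmetries of the square (three rotations plus four reflections). A sketch under that assumption:

```python
import numpy as np

def dihedral_augmentations(img):
    """Return the seven non-identity flip/rotation variants of a square
    image: 90/180/270-degree rotations, horizontal and vertical flips,
    and the two flip-then-rotate (transpose-like) variants.
    NOTE: this is one plausible interpretation of 'seven image
    variations', not necessarily the dissertation's exact scheme."""
    variants = [np.rot90(img, k) for k in (1, 2, 3)]      # rotations
    variants += [np.fliplr(img), np.flipud(img)]          # axis flips
    flipped = np.fliplr(img)
    variants += [np.rot90(flipped, 1), np.rot90(flipped, 3)]  # diagonal flips
    return variants
```

Together with the original patch this covers the full 8-element dihedral group, effectively multiplying the training set by eight.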
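The six reported metrics (accuracy, specificity, sensitivity, precision, Intersection over Union, F1-score) are all derived from the pixel-wise confusion counts between the predicted mask and the expert annotation. A minimal sketch of that computation:

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Pixel-wise metrics between a predicted and a ground-truth binary mask."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()        # canal pixels found
    tn = np.logical_and(~pred, ~gt).sum()      # background correctly rejected
    fp = np.logical_and(pred, ~gt).sum()       # background marked as canal
    fn = np.logical_and(~pred, gt).sum()       # canal pixels missed
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)               # a.k.a. recall
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "specificity": tn / (tn + fp),
        "sensitivity": sensitivity,
        "precision": precision,
        "iou": tp / (tp + fp + fn),
        "f1": 2 * precision * sensitivity / (precision + sensitivity),
    }
```

The F1-score is the harmonic mean of precision and sensitivity, while IoU divides the overlap by the union of the two masks, which is why the reported IoU (73.7%) is lower than the F1-score (84.9%) on the same segmentations.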