Evaluation of fat coverage in heifers and cows using computer vision

Bibliographic details
Year of defense: 2021
Main author: Santos, Elton Fernandes dos
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Master's thesis (Dissertação)
Access type: Open access
Language: Portuguese (por)
Defending institution: Universidade Federal de Mato Grosso
Brazil
Instituto de Ciências Agrárias e Ambientais (ICAA) – Sinop
UFMT CUS - Sinop
Programa de Pós-Graduação em Zootecnia
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: http://ri.ufmt.br/handle/1/4860
Abstract: Although there is demand for lean meat, this does not diminish the importance of fat coverage, whose function is to preserve the organoleptic characteristics of the meat during the chilling process. The structural changes and biochemical processes that take place in the first 24 hours after slaughter directly affect meat quality, and subcutaneous fat coverage minimizes this impact, adding value to the final product. However, current evaluation methods classify carcasses according to finish quality, not according to the percentage of the carcass protected by the fat layer. The aim of this study was to evaluate computer vision methods for estimating fat coverage in bovine carcasses and to verify the relationship between fat coverage and the SEUROP classification system. A real-time video processing routine was proposed to calculate the percentage of subcutaneous fat coverage; the method mapped the fat-covered regions of the carcass with 98% accuracy. The ratings showed low correlation with the percentage of fat coverage (R² = 0.3 for the slaughterhouse's system and R² = 0.6 for the SEUROP system), and there was also low agreement among the assessments of the four experts (kappa < 0.3). Finally, a deep learning model was proposed to reproduce the slaughterhouse classification, achieving 82% accuracy.
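
As an illustration of the kind of routine the abstract describes (a sketch only, not the thesis's actual implementation, whose segmentation method is not detailed in this record), the Python example below estimates the percentage of a carcass silhouette covered by subcutaneous fat in a single video frame using simple HSV color thresholding with OpenCV. The threshold values and the video source are placeholder assumptions.

# Illustrative sketch: percentage of carcass pixels classified as fat-covered.
# Thresholds are hypothetical placeholders, not values from the thesis.
import cv2
import numpy as np

def fat_coverage_percent(frame_bgr: np.ndarray) -> float:
    """Return the percentage of carcass pixels classified as fat-covered."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Hypothetical carcass mask: everything that is not near-black background.
    carcass_mask = cv2.inRange(hsv, (0, 0, 40), (180, 255, 255))

    # Hypothetical fat mask: pale, low-saturation, bright regions.
    fat_mask = cv2.inRange(hsv, (0, 0, 180), (180, 60, 255))
    fat_mask = cv2.bitwise_and(fat_mask, carcass_mask)

    carcass_px = int(np.count_nonzero(carcass_mask))
    if carcass_px == 0:
        return 0.0
    return 100.0 * np.count_nonzero(fat_mask) / carcass_px

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # placeholder: camera index or video file path
    ok, frame = cap.read()
    if ok:
        print(f"Estimated fat coverage: {fat_coverage_percent(frame):.1f}%")
    cap.release()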
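The agreement and correlation figures quoted in the abstract (Cohen's kappa between evaluators, R² between finish classes and measured coverage) could be computed along the lines of the sketch below; the expert scores and coverage values are placeholders, not data from the study.

# Illustrative sketch of pairwise Cohen's kappa and an R² of class vs. coverage.
from itertools import combinations

import numpy as np
from sklearn.metrics import cohen_kappa_score, r2_score

# Hypothetical finish classes (1-5) assigned by four experts to the same carcasses.
experts = {
    "A": [1, 2, 3, 3, 4, 5, 2, 3],
    "B": [2, 2, 3, 4, 4, 5, 3, 3],
    "C": [1, 3, 2, 3, 5, 4, 2, 4],
    "D": [2, 2, 4, 3, 4, 5, 3, 2],
}

# Pairwise agreement between experts (the thesis reports kappa < 0.3).
for a, b in combinations(experts, 2):
    print(f"kappa({a},{b}) = {cohen_kappa_score(experts[a], experts[b]):.2f}")

# R² of a linear fit of finish class on measured fat coverage (placeholder values).
coverage = np.array([55, 62, 70, 74, 81, 90, 65, 72], dtype=float)
classes = np.array(experts["A"], dtype=float)
slope, intercept = np.polyfit(coverage, classes, 1)
print(f"R² = {r2_score(classes, slope * coverage + intercept):.2f}")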