Exploring transformers for aspect based sentiment analysis

Bibliographic Details
Main Author: Poudel, Roshan
Publication Date: 2023
Format: Master thesis
Language: eng
Source: Repositórios Científicos de Acesso Aberto de Portugal (RCAAP)
Download full: http://hdl.handle.net/10773/41732
Summary: In recent years, sentiment analysis has gained significant attention for its wide range of applications across domains. Aspect-based sentiment analysis (ABSA) is a challenging subtask of sentiment analysis that aims to identify the sentiment polarity towards specific aspects or attributes of a target entity in a given text. Transformers, a type of deep neural network architecture, have shown promising results in many natural language processing (NLP) tasks, including sentiment analysis. This dissertation explores the effectiveness of a BERT+BiLSTM+CRF model for ABSA and investigates the impact of model size and layer freezing. Several experiments are performed on different ABSA datasets, comparing the results with existing state-of-the-art models. The findings indicate that increasing model size is not necessarily the best way to improve performance, and that freezing a subset of layers can yield comparable results with reduced computational requirements. The study also highlights the impact of pretraining methods and pretraining datasets on downstream tasks. The developed end-to-end system is modular, allowing BERT to be replaced with any other transformer-based model depending on the use case. The research contributes to the understanding of transformer-based models for ABSA and provides insights for future studies in this field.
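In the BERT+BiLSTM+CRF pipeline the abstract describes, the CRF layer decodes the per-token scores produced by the BERT+BiLSTM encoder into a consistent sequence of aspect tags. As a rough illustration only (this is not code from the thesis), the decoding step can be sketched as a plain-Python Viterbi search over BIO-style aspect tags; the tag names, scores, and transitions below are invented for the example:

```python
# Hedged sketch of CRF-style Viterbi decoding for ABSA sequence tagging.
# Emission scores would normally come from a BERT+BiLSTM encoder; here they
# are hand-picked so the example is self-contained and runnable.

def viterbi_decode(emissions, transitions, tags):
    """Return the highest-scoring tag sequence.

    emissions:   list of {tag: score} dicts, one per token.
    transitions: {(prev_tag, tag): score}; missing pairs score 0.0.
    tags:        the tag set to search over.
    """
    # best[t] = (score, path) of the best sequence ending in tag t
    best = {t: (emissions[0][t], [t]) for t in tags}
    for emit in emissions[1:]:
        new_best = {}
        for t in tags:
            # Best predecessor for tag t at this position
            score, path = max(
                (best[p][0] + transitions.get((p, t), 0.0), best[p][1])
                for p in tags
            )
            new_best[t] = (score + emit[t], path + [t])
        best = new_best
    return max(best.values())[1]


# Toy example: tokens "the battery life", aspect term = "battery life"
tags = ["O", "B-ASP", "I-ASP"]
transitions = {
    ("O", "B-ASP"): 0.5,      # aspects open with B-ASP
    ("B-ASP", "I-ASP"): 0.5,  # and continue with I-ASP
    ("O", "O"): 0.2,
    ("O", "I-ASP"): -2.0,     # I-ASP may not follow O directly
}
emissions = [
    {"O": 1.0, "B-ASP": 0.2, "I-ASP": 0.0},  # "the"
    {"O": 0.2, "B-ASP": 1.0, "I-ASP": 0.1},  # "battery"
    {"O": 0.3, "B-ASP": 0.1, "I-ASP": 1.0},  # "life"
]
print(viterbi_decode(emissions, transitions, tags))  # ['O', 'B-ASP', 'I-ASP']
```

The transition scores are what distinguish a CRF head from an independent per-token classifier: they let the decoder penalize invalid tag sequences (such as an `I-ASP` with no preceding `B-ASP`) even when the per-token scores would favor them.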
id RCAP_9a16d9736bda698e47ece52bf2ba05c6
oai_identifier_str oai:ria.ua.pt:10773/41732
network_acronym_str RCAP
network_name_str Repositórios Científicos de Acesso Aberto de Portugal (RCAAP)
repository_id_str https://opendoar.ac.uk/repository/7160
dc.title.none.fl_str_mv Exploring transformers for aspect based sentiment analysis
author Poudel, Roshan
dc.subject.por.fl_str_mv Sentiment analysis
Aspect-based sentiment analysis (ABSA)
Transformers
Deep learning
Natural language processing (NLP)
publishDate 2023
dc.date.none.fl_str_mv 2023-07-14T00:00:00Z
2023-07-14
2024-04-29T09:57:46Z
dc.type.status.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.type.driver.fl_str_mv info:eu-repo/semantics/masterThesis
dc.identifier.uri.fl_str_mv http://hdl.handle.net/10773/41732
dc.language.iso.fl_str_mv eng
dc.rights.driver.fl_str_mv info:eu-repo/semantics/openAccess
dc.format.none.fl_str_mv application/pdf
dc.source.none.fl_str_mv reponame:Repositórios Científicos de Acesso Aberto de Portugal (RCAAP)
instname:FCCN, serviços digitais da FCT – Fundação para a Ciência e a Tecnologia
instacron:RCAAP
repository.name.fl_str_mv Repositórios Científicos de Acesso Aberto de Portugal (RCAAP) - FCCN, serviços digitais da FCT – Fundação para a Ciência e a Tecnologia
repository.mail.fl_str_mv info@rcaap.pt
_version_ 1833594569920348160