TY  - JOUR
KW  - Artificial Intelligence
KW  - Machine Learning
KW  - Natural Language Processing
KW  - Sentiment Analysis
KW  - Transformer
AU  - Sergio Arroni
AU  - Yerai Galán
AU  - Xiomarah Guzmán-Guzmán
AU  - Edward Rolando Nuñez-Valdez
AU  - Alberto Gómez
AB  - Sentiment analysis is of great importance to parties interested in analyzing public opinion in social networks. In recent years, deep learning, and particularly the attention-based architecture, has taken over the field, to the point where most research in Natural Language Processing (NLP) has shifted towards the development of ever larger attention-based transformer models. However, those models are designed to be all-purpose NLP models, so for a concrete, smaller problem, a reduced and specifically designed model can perform better. We propose a simpler attention-based model that uses the transformer architecture to predict the sentiment expressed in tweets about hotels in Las Vegas. Using their relative predicted performance, we compare the similarity of our ranking to the actual TripAdvisor ranking against those obtained by more rudimentary sentiment analysis approaches, outperforming them with a 0.64121 Spearman correlation coefficient. We also compare our performance to DistilBERT, obtaining faster and more accurate results and showing that a model designed for a particular problem can outperform models with several million trainable parameters.
IS  - Special Issue on AI-driven Algorithms and Applications in the Dynamic and Evolving Environments
M1  - 1
PY  - 2023
SP  - 53
EP  - 63
T2  - International Journal of Interactive Multimedia and Artificial Intelligence
TI  - Sentiment Analysis and Classification of Hotel Opinions in Twitter With the Transformer Architecture
UR  - https://www.ijimai.org/journal/sites/default/files/2023-02/ijimai8_1_5.pdf
VL  - 8
SN  - 1989-1660
ER  -