LDR 01986nas a2200265 4500
008    2023 d
260 __ $c 03/2023
653 10 $a Artificial Intelligence
653 10 $a Machine Learning
653 10 $a Natural Language Processing
653 10 $a Sentiment Analysis
653 10 $a Transformer
100 1_ $a Sergio Arroni
700 1_ $a Yerai Galán
700 1_ $a Xiomarah Guzmán-Guzmán
700 1_ $a Edward Rolando Nuñez-Valdez
700 1_ $a Alberto Gómez
245 00 $a Sentiment Analysis and Classification of Hotel Opinions in Twitter With the Transformer Architecture
856 __ $u https://www.ijimai.org/journal/sites/default/files/2023-02/ijimai8_1_5.pdf
300 __ $a 53-63
490 0_ $v 8
520 3_ $a Sentiment analysis is of great importance to parties interested in analyzing public opinion in social networks. In recent years, deep learning, and in particular the attention-based architecture, has taken over the field, to the point where most research in Natural Language Processing (NLP) has shifted towards the development of ever larger attention-based transformer models. However, those models are designed as all-purpose NLP models, so for a concrete, smaller problem, a reduced model studied specifically for that task can perform better. We propose a simpler attention-based model that uses the transformer architecture to predict the sentiment expressed in tweets about hotels in Las Vegas. Ranking the hotels by their predicted sentiment, we compare the similarity of our ranking to the actual TripAdvisor ranking against the rankings obtained by more rudimentary sentiment analysis approaches, outperforming them with a Spearman correlation coefficient of 0.64121. We also compare our performance to DistilBERT, obtaining faster and more accurate results and showing that a model designed for a particular problem can perform better than models with several million trainable parameters.
022 __ $a 1989-1660
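
The 520 abstract describes a reduced attention-based classifier, but the record itself carries no implementation details. As an illustration only, the following is a minimal sketch of a small transformer encoder for tweet-level sentiment in PyTorch; the class name TinySentimentTransformer and every hyperparameter (vocabulary size, model width, head count, layer count, class count) are assumptions, not the authors' actual configuration.

import torch
import torch.nn as nn

class TinySentimentTransformer(nn.Module):
    # Hypothetical reduced model: a few encoder layers instead of a
    # multi-million-parameter pretrained transformer such as DistilBERT.
    def __init__(self, vocab_size=10000, d_model=64, nhead=4,
                 num_layers=2, num_classes=3, max_len=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        # Learned positional encoding, broadcast over the batch.
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor, 0 = padding.
        pad_mask = token_ids.eq(0)
        x = self.embed(token_ids) + self.pos[:, :token_ids.size(1)]
        x = self.encoder(x, src_key_padding_mask=pad_mask)
        # Mean-pool over real (non-padding) tokens only.
        x = x.masked_fill(pad_mask.unsqueeze(-1), 0.0).sum(1)
        x = x / (~pad_mask).sum(1, keepdim=True).clamp(min=1)
        return self.head(x)  # logits, e.g. negative / neutral / positive

model = TinySentimentTransformer()
logits = model(torch.randint(1, 10000, (8, 32)))  # 8 dummy tweets, 32 tokens each
print(logits.shape)  # torch.Size([8, 3])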
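
The abstract's 0.64121 figure is a Spearman rank correlation between the hotel ranking induced by the model's predicted sentiment and the actual TripAdvisor ranking. A minimal sketch of that comparison, assuming scipy is available; the two rank lists below are hypothetical placeholders, not data from the paper.

from scipy.stats import spearmanr

# For rankings without ties: rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
# where d_i is the rank difference for hotel i. scipy computes this for us.
predicted_rank   = [1, 2, 3, 4, 5]  # hotels ordered by predicted sentiment
tripadvisor_rank = [2, 1, 3, 5, 4]  # the same hotels' actual TripAdvisor ranks

rho, p_value = spearmanr(predicted_rank, tripadvisor_rank)
print(f"Spearman rho = {rho:.3f}")  # 1.0 = identical rankings, -1.0 = reversed

A higher rho against the TripAdvisor ranking is the paper's yardstick for comparing its model to the more rudimentary sentiment analysis baselines.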