The Possibilities of Classification of Emotional States Based on User Behavioral Characteristics.
Keywords:
Emotion, Behavioral Characteristics, Valence, Arousal, Classification
Abstract
Classifying users' emotions based on their behavioral characteristics, namely their keyboard typing and mouse usage patterns, is an effective and non-invasive way of gathering user data without imposing any limitations on the users' ability to perform tasks. To gather data for the classifier we used an application, Emotnizer, which we developed for this purpose. The output of the classification falls into four emotional categories from Russell's circumplex model: happiness, anger, sadness, and relaxation. The reference database sample consisted of 50 students. Multiple regression analyses gave us a model that allowed us to predict the valence and arousal of a subject from their keyboard and mouse input. Upon re-testing with another group of 50 students and processing the data, we found that the Emotnizer program can classify emotional states with an average success rate of 82.31%.
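The abstract does not reproduce the regression coefficients or the exact keyboard and mouse features used, so the following Python sketch only illustrates the general pipeline it describes: fit a multiple regression from behavioral features to valence and arousal, then map the predicted point to one of the four quadrants of Russell's circumplex model. All feature names and numeric values below are hypothetical placeholders, not the paper's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical behavioral features per session (illustrative only):
# mean keystroke dwell time [s], typing speed [keys/s],
# mean mouse speed [px/s], click rate [clicks/s].
X_train = np.array([
    [0.12, 4.8, 310.0, 0.9],
    [0.09, 6.1, 450.0, 1.4],
    [0.15, 3.2, 190.0, 0.5],
    [0.11, 5.0, 520.0, 1.7],
])

# Self-reported valence and arousal on a -1..1 scale (illustrative values).
y_train = np.array([
    [ 0.6,  0.7],   # happiness:  positive valence, high arousal
    [-0.5,  0.8],   # anger:      negative valence, high arousal
    [-0.6, -0.4],   # sadness:    negative valence, low arousal
    [ 0.5, -0.6],   # relaxation: positive valence, low arousal
])

# One multiple-regression model predicting both dimensions (valence, arousal).
model = LinearRegression().fit(X_train, y_train)

def classify(features):
    """Map predicted (valence, arousal) to a quadrant of Russell's model."""
    valence, arousal = model.predict([features])[0]
    if valence >= 0 and arousal >= 0:
        return "happiness"
    if valence < 0 and arousal >= 0:
        return "anger"
    if valence < 0:
        return "sadness"
    return "relaxation"

print(classify([0.10, 5.5, 400.0, 1.2]))
```

In this sketch the classifier itself is just the sign of the two predicted dimensions; the paper's reported 82.31% success rate refers to its own model and feature set, not to this illustration.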