Innovative Measures of Verhulst Diagram for Emotion Recognition using Eye-Blinking Variability
International Clinical Neuroscience Journal, Vol. 10 No. 1 (2023), 15 Dey 2023, Page e6
Abstract
Background: The human body continuously reveals the status of its organs through biomedical signals. Consequently, the acquisition, monitoring, and analysis of biomedical signals have attracted many researchers for prediction, diagnosis, decision-making, and recognition tasks. Building an intelligent emotion recognition system has recently become a challenging problem in signal processing. Human emotion classification has most often been based on internal physiological responses to affective stimuli. However, external signals, such as eye movements, have also been reported to convey useful information about a participant’s emotions. In this study, we propose an automatic emotion recognition scheme based on the analysis of a single modality: eye-blinking variability.
Methods: First, each eye-blinking signal was transformed into a two-dimensional space using the Verhulst diagram, a simple representation based on the signal’s dynamics. Next, several innovative features were introduced to characterize the resulting maps. The extracted measures were then fed into a support vector machine (SVM) and a k-nearest neighbor (kNN) classifier. The SVM was evaluated with three kernel functions (RBF, linear, and polynomial), and the kNN was examined with different values of k. In addition, the classification results were assessed under two data-partitioning schemes: 5-fold and 10-fold cross-validation.
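For readers who wish to experiment with a comparable workflow, the following Python sketch illustrates the overall pipeline (2D mapping, feature extraction, and SVM/kNN evaluation with 5-fold and 10-fold cross-validation). The map construction in `verhulst_map` (pairing the logistic term s[n]·(1 − s[n]) with the next normalized sample) and the two geometric descriptors in `map_features` are illustrative assumptions, not the exact Verhulst indices proposed in the paper.

```python
# Minimal sketch of a Verhulst-diagram-based classification pipeline.
# ASSUMPTION: the 2D map construction and the two summary features below are
# illustrative stand-ins, not the exact definitions used in this study.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def verhulst_map(sig):
    """Map a 1D eye-blinking signal to 2D points: x[n] = s[n]*(1 - s[n]),
    y[n] = s[n+1], with s normalized to [0, 1] (assumed construction)."""
    s = (sig - sig.min()) / (np.ptp(sig) + 1e-12)
    x = s[:-1] * (1.0 - s[:-1])   # Verhulst (logistic) term of sample n
    y = s[1:]                     # next sample
    return np.column_stack([x, y])


def map_features(points):
    """Two simple geometric descriptors of the map (placeholders): dispersion
    around the centroid and mean distance between consecutive points."""
    centroid = points.mean(axis=0)
    dispersion = np.linalg.norm(points - centroid, axis=1).mean()
    step = np.linalg.norm(np.diff(points, axis=0), axis=1).mean()
    return np.array([dispersion, step])


def evaluate(signals, labels):
    """Evaluate SVM (RBF, linear, polynomial) and kNN (several k) with
    5-fold and 10-fold cross-validation, mirroring the described protocol."""
    X = np.array([map_features(verhulst_map(s)) for s in signals])
    y = np.array(labels)
    classifiers = {f"SVM-{k}": SVC(kernel=k) for k in ("rbf", "linear", "poly")}
    classifiers.update({f"kNN-{k}": KNeighborsClassifier(n_neighbors=k)
                        for k in (1, 3, 5, 7)})
    for name, clf in classifiers.items():
        model = make_pipeline(StandardScaler(), clf)
        for folds in (5, 10):
            acc = cross_val_score(model, X, y, cv=folds).mean()
            print(f"{name:>10s} | {folds:2d}-fold CV accuracy: {acc:.3f}")


if __name__ == "__main__":
    # Synthetic demo segments; real use would substitute preprocessed
    # eye-blinking recordings and their emotion labels.
    rng = np.random.default_rng(0)
    demo_signals = [rng.standard_normal(256) for _ in range(40)]
    demo_labels = np.array([0, 1] * 20)
    evaluate(demo_signals, demo_labels)
```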
Results: All Verhulst indices showed statistically significant differences between neutral and fear and between neutral and sadness. In addition, the average values of these indices were higher for fear and sadness than for the other emotions. A maximum classification rate of 100% was achieved for fear versus neutral. The suggested Verhulst-based approach therefore proved highly capable of emotion classification and analysis using eye-blinking signals.
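The pairwise statistical comparisons reported here (e.g., neutral versus fear) can, in principle, be reproduced with a standard two-sample test applied to each index. The abstract does not state which test was used, so the Mann-Whitney U test in the short sketch below is only an assumed placeholder, shown with synthetic index values.

```python
# Hedged sketch: compare one Verhulst index between two emotion conditions.
# ASSUMPTION: the Mann-Whitney U test stands in for the study's unspecified
# test, and the index values are synthetic.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
index_neutral = rng.normal(0.20, 0.05, size=30)  # synthetic index values (neutral)
index_fear = rng.normal(0.28, 0.05, size=30)     # synthetic index values (fear)

stat, p_value = mannwhitneyu(index_neutral, index_fear, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4g}")      # p < 0.05 suggests a significant difference
```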
Conclusion: The novel biomarkers pave the way for designing a simple, accurate emotion recognition system. In addition, this experiment could strengthen the field of ocular affective computing and open a new horizon for diagnosing or treating disorders that involve emotional deficits.
Keywords: Verhulst diagram; Human emotion recognition; Eye-blinking; Dynamics.