A Review on EEG Signals Based Emotion Recognition

Emotion recognition has become an important research topic in brain-computer interfaces (BCIs), and numerous studies have been conducted to recognize emotions. There are also several important definitions and theories about human emotions. In this paper we try to cover the main topics related to the field of emotion recognition. We review several studies that analyze electroencephalogram (EEG) signals as a biological marker of emotional changes. Considering its low cost and good temporal resolution, EEG has become very common and is widely used in most BCI applications and studies. First, we state some theories and basic definitions related to emotions. Then we describe the main steps of an emotion recognition system: the different kinds of biological measurements (EEG, electrocardiogram [ECG], respiration rate, etc), offline vs online recognition methods, emotion stimulation types, and common emotion models. Finally, the most important recent studies are reviewed.


Introduction
Everyone knows basic emotions such as happiness, fear, anger, disgust, sadness and surprise, but neuroscientists and researchers have no consensus about the nature of emotions. There are two views about emotions: one approach considers emotions as general states of individuals, and the other regards emotions as physiological interactions. Imagine a person driving a car while another car approaches and forces him off the road. That individual probably first experiences fear and then anger. According to the first view, the fear comes from the inference that one might be in danger, and the anger arises because the other driver has just put him in danger. Thagard, 1 Oatley 2 and Nussbaum 3 support the first approach. Oatley demonstrated that basic emotions have a strong relation to the pursuit of goals: people become happy while approaching their goals and sad when they fail, and they become frightened when they experience trouble or feel threatened. Therefore, we can consider emotions a general representation of our problems.
In contrast to the first view, the second approach emphasizes physical and physiological interactions. When someone causes an individual driving a car to deviate off the road, the driver's heart rate, blood pressure and respiration rate increase. Feelings (like fear or anger) originate from the brain's response to these physiological changes, not from an interpretation of the situation. James introduced this approach for the first time in 1884. Psychologically speaking, there are two basic theories of emotion classification: Plutchik's theory and Ekman's theory. Plutchik's theory divides emotions into two categories: basic emotions and secondary ones. The basic emotions are anticipation, joy, trust, sadness, fear, surprise, anger and disgust. Secondary emotions arise from combinations of these elementary feelings and include love, optimism, aggressiveness, submission, contempt, awe, remorse and disapproval. Ekman's theory is known as a discrete model. He introduced six basic emotions: fear, sadness, happiness, surprise, disgust and anger. 4 Later, the number of these emotions was extended to 15.
James and Lange introduced another theory in the 19th century. In this theory, environmental variations cause physiological changes in our autonomic nervous system and consequently produce different emotions; in other words, physiological changes cause emotions. Therefore, researchers analyze signals and images related to these physiological changes in order to recognize feelings and classify emotions. However, physiological signals introduce some problems such as noise and artifacts. [7][8][9][10][11] There are also other factors which affect emotions, such as sex, age and race, and researchers usually consider these parameters while studying emotions. Besides the discrete model of emotions, there is another model, proposed by Lang, called the valence-arousal model. In this model, valence and arousal values are assigned to each emotion. In other words, emotions form a continuous spectrum of valence and arousal values and are generally plotted in a 2D coordinate system called the valence-arousal plane.
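The valence-arousal plane can be made concrete with a short sketch. The coordinates below are illustrative placements only (not values from any published affective rating study), and the quadrant labels are one common reading of the plane:

```python
# Minimal sketch of the valence-arousal (circumplex) representation.
# Coordinates are hypothetical, chosen only to illustrate the quadrants.
EMOTION_COORDS = {
    "happiness": (0.8, 0.5),   # (valence, arousal), each in [-1, 1]
    "anger":     (-0.6, 0.7),
    "sadness":   (-0.7, -0.4),
    "calmness":  (0.5, -0.6),
}

def quadrant(valence: float, arousal: float) -> str:
    """Map a point on the valence-arousal plane to its quadrant label."""
    if valence >= 0:
        return "high-arousal positive" if arousal >= 0 else "low-arousal positive"
    return "high-arousal negative" if arousal >= 0 else "low-arousal negative"

for name, (v, a) in EMOTION_COORDS.items():
    print(f"{name}: {quadrant(v, a)}")
```

Discrete labels then correspond to regions of this plane, which is why the two models are often interchangeable in practice.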
There are four important components of emotion recognition systems: physiologic records, emotion stimulation, online or offline recognition, and emotion models.

Physiologic Records
Emotional states are reflected in physiological changes, which is why biological signals and images are recorded in order to recognize emotions. Some biological systems in the human body and their indexes are as follows: 1- Cardiovascular system: electrocardiogram (ECG), heart rate variability (HRV), cardiac output, blood pressure, etc. 2- Respiratory system: respiration rate, etc. Figure 1 shows some of these signals.
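As a small illustration of the cardiovascular indexes, two standard HRV measures, SDNN and RMSSD, can be computed directly from a sequence of inter-beat (RR) intervals; the interval values below are hypothetical:

```python
import numpy as np

def sdnn(rr_ms):
    """Standard deviation of RR (inter-beat) intervals, in milliseconds."""
    return float(np.std(np.asarray(rr_ms, dtype=float), ddof=1))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences."""
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

rr = [812, 790, 805, 831, 796, 810]  # hypothetical RR intervals in ms
print(sdnn(rr), rmssd(rr))
```

Both measures shrink under stress (reduced vagal tone), which is what makes HRV usable as an emotion index.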
EEG signals, owing to their simplicity of analysis and good temporal resolution, have become common and useful in most BCI applications, including emotion recognition. EEG recording systems are also cheap and accessible. Previous studies show that by recording and processing EEG signals we can achieve very good emotion classification results. We therefore decided to explain and review previous studies on emotion classification through EEG signals.

Offline or Online Recognition
In some applications, such as monitoring patients while they take medicine, recognizing emotions on the spot is really important, so online methods matter there. For example, Iacoviello et al 81 introduced an effective, general and complete classification method for EEG signals in which self-induction was used for emotion elicitation; wavelet transform (WT), principal component analysis (PCA) and a support vector machine (SVM) were used to process and classify the EEG signals. Sourina et al 83 introduced an online emotion recognition study which used spatio-temporal fractal features to characterize brain states. A vital issue in online recognition systems is that the processing methods must be fast and precise. The fractal transform is one such method and has been used in several related studies: Sourina et al 86 determined brain responses to music stimulation using the fractal transform and also calculated Renyi entropy, and Liu et al 55 calculated Higuchi's fractal dimension from EEG signals recorded while participants were listening to music. The other type of emotion recognition system is offline. For example, Zhang and Li 85 recognized positive and negative emotions offline using a neuro-fuzzy method. In this study, an unsupervised clustering method and an adaptive neuro-fuzzy inference system (ANFIS) were used; clustering was applied in the early steps to create primary information related to emotions, and emotions were elicited with the International Affective Picture System (IAPS).
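Higuchi's fractal dimension, used in the fractal-based studies above, is simple enough to sketch. This is one common formulation of the algorithm (kmax is a free parameter, here set to 8 as an assumption):

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate Higuchi's fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = x.size
    ks = np.arange(1, kmax + 1)
    lengths = []
    for k in ks:
        lk = []
        for m in range(k):  # one subsampled curve per offset m
            idx = np.arange(m, n, k)
            norm = (n - 1) / ((idx.size - 1) * k)
            lk.append(np.abs(np.diff(x[idx])).sum() * norm / k)
        lengths.append(np.mean(lk))
    # FD is the slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

print(higuchi_fd(np.arange(1000.0)))  # a straight line has FD ~ 1
print(higuchi_fd(np.random.default_rng(0).standard_normal(1000)))  # white noise ~ 2
```

Because it needs only differences and one least-squares fit per window, the measure is cheap enough for the online settings these studies target.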

Emotion Models
Another issue in emotion recognition studies is the number of elicited emotions and the emotion model. Some studies follow the discrete model and consider a specific set of emotions, while others follow the valence-arousal model and represent emotions continuously. For example, Murugappan et al 45,48,52 studied anger, fear, surprise and happiness according to the discrete emotion model, while Koelstra et al, 51 Koelstra and Patras, 53 and Hidalgo-Munoz et al 24 studied emotions according to the valence-arousal model.

Public Databases
There are some public emotion databases which researchers can use for free. Their advantage is that researchers do not need a laboratory, specific recording systems, appropriate conditions, a shielded environment, etc. They also do not need to recruit participants, and they obtain reliable, free data. In this section, some available databases are described.

SJTU Emotion EEG Dataset (SEED)
Zheng and Lu 98 recorded the SJTU Emotion EEG Dataset (SEED), which contains EEG signals from 15 individuals watching Chinese video clips. Figure 3 shows a participant while watching the clips. Emotions were labeled as positive, negative and neutral, and participants filled in a questionnaire after watching the videos. EEGs were recorded in three sessions, separated by one week or more, to evaluate the stability of patterns and neural signatures across participants and sessions. EEG signals were recorded according to the 10-20 international standard system. Raw and preprocessed signals, as well as face videos, are available. For more details refer to http://bcmi.sjtu.edu.cn/~seed/index.html.

MAHNOB-HCI database
Soleymani et al 96 recorded the MAHNOB-HCI database while participants watched emotional videos; participants' comments about their feelings were also evaluated. Signals were recorded according to the 10-20 international standard system. For more details about this database refer to https://mahnob-db.eu/hci-tagging/. These databases have been used in several studies; Table 2 gives a brief description and lists the references which have used biological signals from these databases so far.

Previous Emotion Studies

Emotions and Normal Cases
In this section we review previous studies which evaluate emotions in normal individuals. Weinreich et al 26 measured variations of the alpha frequency band in the frontal lobe using an oddball paradigm; participants were asked to describe each image regardless of its emotional content, and 16-channel EEG signals were recorded from 20 female and 8 male participants. Hidalgo-Munoz et al 24 studied EEG signals of 26 females while they watched emotional images from IAPS; this study considered emotions according to the valence-arousal model. In the processing step, they used spectral turbulence (ST), a method inspired by ECG studies. Results showed that the left temporal lobe has considerable activity during emotion elicitation. Koelstra and Patras 53 recorded EEG signals from several participants according to the valence-arousal model, using video clips to evoke emotions; the details are described in section 1.5 and Table 2. In this study, the power spectral density of the EEG sub-bands was calculated and facial action units (AU) were detected from videos of participants' faces; a combination of these features was then applied. A Hidden Markov Model (HMM) and GentleBoost were used as the classifiers. Results showed that combining face videos and EEG signals improved accuracy.
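The sub-band power features mentioned above can be sketched with a simple periodogram. The band boundaries below are conventional values, but they vary across studies:

```python
import numpy as np

# Conventional EEG sub-bands in Hz (exact boundaries differ between studies).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Integrate a periodogram PSD estimate over each EEG sub-band."""
    signal = np.asarray(signal, dtype=float)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
    df = freqs[1] - freqs[0]
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in BANDS.items()}

fs = 128
t = np.arange(0, 4, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t)      # a pure 10 Hz oscillation falls in the alpha band
powers = band_powers(x, fs)
print(max(powers, key=powers.get))  # -> alpha
```

In practice a Welch estimate over overlapping windows is usually preferred to a single periodogram, but the band-integration step is the same.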
Lee et al 54 proposed an emotion recognition system based on fuzzy logic. They used video clips to elicit emotions and recorded EEG signals from 12 participants. They extracted dynamic features from emotional states and used 3D fuzzy GIST and 3D fuzzy tensors to extract brain features at a semantic level. Independent component analysis (ICA) was used to remove artifacts, and ANFIS was used to classify emotions; results demonstrated the performance of the proposed method.
Huang et al 60 presented a multimodal approach to emotion recognition using EEG signals from the MAHNOB-HCI database. Discriminant power spectrum and difference power spectrum features were extracted from the EEG signals of 27 participants, and local binary patterns (LBP) were extracted from videos of the participants' faces. Feature-level and decision-level fusion were then applied, and SVM and KNN were used as classifiers. Results showed that using multimodal data gives better recognition results.
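Decision-level fusion of the kind used in such multimodal studies can be as simple as a majority vote over per-modality predictions; this minimal sketch (with made-up labels) illustrates the idea:

```python
from collections import Counter

def decision_fusion(per_modality_preds):
    """Late (decision-level) fusion: majority vote across modality classifiers.

    per_modality_preds is a list of prediction lists, one per modality,
    all covering the same samples in the same order.
    """
    fused = []
    for sample_preds in zip(*per_modality_preds):
        fused.append(Counter(sample_preds).most_common(1)[0][0])
    return fused

eeg_preds  = ["positive", "negative", "negative"]   # hypothetical outputs
face_preds = ["positive", "positive", "negative"]
knn_preds  = ["negative", "positive", "negative"]
print(decision_fusion([eeg_preds, face_preds, knn_preds]))  # -> ['positive', 'positive', 'negative']
```

Feature-level fusion, by contrast, concatenates the modality feature vectors before a single classifier; the studies above report results for both strategies.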
Bozhkov et al 31 considered the valence-arousal model for emotions and recorded EEG signals from 26 females viewing IAPS pictures. They used echo state networks (ESN) to cluster and classify positive and negative emotions, obtained the desired results, and demonstrated the performance of their proposed method. Mavratzakis et al 36 evaluated event-related potentials (ERPs) of 27 individuals while they watched pictures. Three picture databases were used as stimuli: KDEF (Karolinska Directed Emotional Faces Database), RAFD (Radboud Faces Database) and IAPS. Statistical analysis of the ERP components showed that emotions did not influence the P1 component. The N170 component increased while participants watched emotional pictures, but N100 was not sensitive to emotion changes. Moreover, early posterior negativity (EPN) increased while participants watched fearful images.

Emotions and Neural Disorders
An interesting branch of emotion research evaluates psychological diseases and disorders through emotion recognition. In this section, studies on disorders such as Parkinson's disease (PD), autism spectrum disorder (ASD), schizophrenia and depression are reviewed. Yuvaraja et al 88 extracted higher order spectral features from EEG signals and evaluated differences in emotional responses between PD patients and normal individuals; EEG signals were recorded from 20 PD patients and 20 normal participants while they watched video clips, and samples were classified into six basic emotions (sadness, happiness, fear, anger, surprise and disgust). In another study, children's EEG responses to emotional face stimuli were analyzed using a theta coherence index (a cortical connectivity index). This study showed that autistic children have a deficiency in emotion recognition: they showed no theta coherence modulation, while normal children showed theta coherence modulation in the right frontal lobe in response to emotional faces. This theta coherence modulation in response to emotions is related to the social deficiency of autistic children.
Schizophrenia may be detectable through emotion stimulation. Brennan et al 89 examined this hypothesis by processing ERP signals, using the international BRAINnet database, which includes 108 schizophrenic patients and 108 normal cases. All individuals watched emotional pictures depicting sadness, fear, anger, disgust and happiness while ERPs were recorded under conscious and non-conscious conditions. Significant differences between the two groups were then identified through analysis of variance (ANOVA). Results showed that schizophrenic patients had shorter-latency brain activity, at about 70 ms. Also, in temporal-occipital regions, schizophrenic patients showed positive shifts after 70 ms in response to disgust, whereas normal people showed negative shifts in response to fear and anger in comparison with happiness.
Croft et al 95 detected emotion-recognition deficits in Huntington's patients via ERPs. In this study, EEG signals from 11 Huntington's patients and 11 normal individuals were recorded while participants viewed face images (scrambled, neutral, happy, angry and disgusted) and identified the expressed emotions. Results showed lower recognition accuracy for the negative expressions, such as disgust and anger, as well as for neutral faces, reflecting decreased functionality.
Psychogenic non-epileptic seizures (PNES) are difficult to distinguish from epileptic seizures. Recent studies have shown that PNES patients have impairments in the control of their emotions, and Urbanek et al 94 evaluated this hypothesis. In this study, EEG signals were recorded from 56 patients and 68 normal individuals during emotion stimulation. Results demonstrated that these patients report weaker emotions, more negative feelings and stronger control over their emotions than normal people.
Tseng et al 40 evaluated phase synchrony and oscillatory EEG activation in Asperger syndrome (AS) patients while they recognized emotions from face images. The AS group and the normal group each consisted of 10 individuals, and emotions were stimulated with pictures. Results demonstrated that the AS group had no distinct N400 in response to the pictures; they also showed lower synchrony in the temporal and parietal-occipital lobes in the delta/theta bands and weaker phase synchronization across separate brain regions.
Akar et al 72 examined the brain dynamics of major depressive disorder (MDD) patients during stimulation with positive and negative emotions, using music as the stimulus. Three situations were considered: a noisy environment, relaxation and listening to music. EEG signals from 15 MDD patients and 15 normal people were recorded and analyzed with nonlinear methods: complexity measures such as Lempel-Ziv and Kolmogorov complexity were calculated, and significant differences were then evaluated by ANOVA. This study demonstrated that MDD patients have more complex EEG signals in the parietal and frontal lobes than normal people. The EEG signals of these individuals also had lower complexity in the frontal and parietal lobes while listening to music compared to the other situations.
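Lempel-Ziv complexity, one of the measures used in the study above, counts how many new patterns appear as a signal unfolds. The sketch below binarizes a signal around its median and counts distinct phrases in an LZ78-style incremental parse; note that the original 1976 Lempel-Ziv definition uses a slightly different parsing rule, so this is one common variant rather than the exact measure any given study used:

```python
import numpy as np

def lz_complexity(bits: str) -> int:
    """Count distinct phrases in an incremental (LZ78-style) parse."""
    phrases, phrase = set(), ""
    for ch in bits:
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases)

def binarize(signal) -> str:
    """Threshold a signal at its median, a common preprocessing step."""
    signal = np.asarray(signal, dtype=float)
    return "".join("1" if s > np.median(signal) else "0" for s in signal)

print(lz_complexity("0001101001000101"))  # -> 7
print(lz_complexity("0" * 100))           # a constant string parses into few phrases -> 13
```

Regular signals reuse old phrases and stay low-complexity, while irregular signals keep generating new phrases, which is what makes the count usable as an EEG complexity index.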
Li et al 34 evaluated large-scale functional brain networks of depressed and normal people using graph theory. Participants' emotions were elicited by Ekman pictures conveying positive, negative and neutral emotions while EEG signals were recorded from 16 depressed and 14 normal participants. The EEG signals were processed by extracting coherence in the delta, theta, alpha, beta and gamma frequency bands. Results showed that total coherence values in the gamma band were higher for depressed participants than for normal people; among normal participants, total gamma-band coherence was higher for negative emotions. Moreover, depressed participants showed abnormal networks in the prefrontal and occipital lobes. Table 3 describes recent studies related to emotion recognition.
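The band coherence analysis described above rests on magnitude-squared coherence between channel pairs. A minimal Welch-style estimate can be written with NumPy alone (scipy.signal.coherence offers an equivalent, more complete implementation); the sampling rate and segment length below are illustrative choices:

```python
import numpy as np

def msc(x, y, fs, nperseg=256):
    """Welch-style magnitude-squared coherence between two channels."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    win = np.hanning(nperseg)
    nseg = min(x.size, y.size) // nperseg
    pxx = np.zeros(nperseg // 2 + 1)
    pyy = np.zeros(nperseg // 2 + 1)
    pxy = np.zeros(nperseg // 2 + 1, dtype=complex)
    for i in range(nseg):  # average auto- and cross-spectra over segments
        seg = slice(i * nperseg, (i + 1) * nperseg)
        X = np.fft.rfft(win * x[seg])
        Y = np.fft.rfft(win * y[seg])
        pxx += np.abs(X) ** 2
        pyy += np.abs(Y) ** 2
        pxy += X * np.conj(Y)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, np.abs(pxy) ** 2 / (pxx * pyy)

rng = np.random.default_rng(0)
a = rng.standard_normal(4096)
b = rng.standard_normal(4096)
_, c_same = msc(a, a, fs=128)
_, c_indep = msc(a, b, fs=128)
print(c_same.mean(), c_indep.mean())  # identical channels -> 1, independent -> near zero
```

Averaging a band of the resulting coherence values per channel pair yields the edge weights from which graph-theoretic network measures are then computed.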

Conclusion
In this paper, we reviewed several studies on emotion recognition from EEG signals. First, we stated some emotion theories and basic definitions.

Figure 2. Valence and Arousal Values of Video Clips in DEAP Database. Included video clips are shown in green.

Table 1. Different Kinds of Emotion Stimulation

DEAP Database
This multimodal database was recorded by Koelstra et al 51 in two laboratories (Geneva and Twente) in 2012. Forty video clips were used to elicit emotions according to the valence-arousal model. Thirty-two individuals participated, and 32-channel EEG, 4-channel EMG, 4-channel EOG, 2-channel GSR, 2-channel ERG, single-channel temperature, single-channel respiration rate and single-channel blood volume pressure signals were recorded. Five indexes were reported by each participant: arousal, valence, like/dislike, dominance and familiarity. Raw and preprocessed signals from all participants, as well as face videos from 22 of them, are available in this dataset. More detailed descriptions can be found at https://www.eecs.qmul.ac.uk/mmv/datasets/deap/index.html. Figure 2 shows the selected emotional videos in the valence-arousal plane.

Table 3. Recent Emotion Recognition and Evaluation Studies From EEGs

Then we described the different components of emotion recognition systems: the different kinds of biological measurements (EEG, ECG, etc), offline vs online recognition systems, the different types of emotional stimulation, and the emotion models used in the literature (the valence-arousal model and the discrete model). Since EEG has become more and more common in emotion recognition applications in recent years, our main focus was on emotion recognition through EEG signals, and we reviewed a range of papers and studies to cover this topic. We also attempted to highlight recent, valid and reliable studies for young researchers who are interested in this field.