8 years old, is presented. The visual stimulation with the IAPS data set was considered insufficient, hence the authors proposed a more sophisticated scenario to elicit emotions; only peripheral biological signals were recorded, and the measured features were the input of a classification scheme based on an SVM. The results showed accuracies of 78.4% and 61% for three and four different categories of emotions, respectively.

In Ref. [14], also by means of the IAPS repository, three emotional states were induced in five male participants: *pleasant*, *neutral* and *unpleasant*. Using SVMs, an accuracy of 66.7% was obtained for these three classes of emotion, based solely on features extracted from EEG signals. A similar strategy was followed by Macas [15], where EEG data were collected from 23 subjects during an affective picture presentation designed to induce four emotional states in the arousal/valence space. The automatic recognition of the individual emotional states was performed with a Bayes classifier. The mean accuracy of the individual classification was about 75%.

In Ref. [16], four emotional categories of the arousal/valence space were considered and the EEG was recorded from 28 participants. The ensemble average signals were computed for each stimulus category and person. Several characteristics (peaks and latencies) as well as frequency‐related features (event‐related synchronization) were measured on a signal ensemble encompassing three channels located along the anterior‐posterior line. Then, a classifier (a decision tree, the *C*4.5 algorithm) was applied to the set of features to identify the affective state. An average accuracy of 77.7% was reported.
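As an illustration of this kind of pipeline, the sketch below extracts peak‐amplitude and peak‐latency features from an ensemble‐averaged ERP and feeds them to a decision tree. The search window, the feature layout and the use of scikit‐learn's CART implementation (standing in for C4.5, which scikit‐learn does not provide) are assumptions for illustration, not details taken from Ref. [16].

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # CART; C4.5 stand-in

def peak_latency_features(avg, t, window=(0.1, 0.5)):
    """Peak amplitudes and latencies of an ensemble-averaged ERP
    within a post-stimulus search window (seconds, hypothetical)."""
    m = (t >= window[0]) & (t <= window[1])
    seg, seg_t = avg[m], t[m]
    return np.array([seg.max(), seg_t[seg.argmax()],   # positive peak
                     seg.min(), seg_t[seg.argmin()]])  # negative peak

# Hypothetical usage: one row per (participant, stimulus category),
# features concatenated over the three anterior-posterior channels.
# X = np.vstack(rows); y = np.array(labels)
# clf = DecisionTreeClassifier(criterion="entropy").fit(X, y)
```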

In Ref. [17], emotions were elicited through a series of projections of facial expression images, and EEG signals were collected from 16 healthy subjects using only three frontal EEG channels. In Ref. [18], four different classifiers (quadratic discriminant analysis (QDA), k‐nearest neighbor (KNN), Mahalanobis distance and SVMs) were implemented in order to accomplish the emotion recognition. For the single‐channel case, the best results were obtained by the QDA (62.3% mean classification rate), whereas for the combined‐channel case, the best results were obtained using the SVM (83.33% mean classification rate), for the hardest case of differentiating six basic discrete emotions.

In Ref. [19], *IF‐THEN* rules of a neurofuzzy system detecting positive and negative emotions are discussed. The study presents the individual performance (ranging from 60 to 82%) of the system for the recognition of emotions (two or four categories) of 11 participants. The decision process is organized into levels where fuzzy membership functions are calculated and combined to achieve decisions about emotional states. The inputs of the system are not only EEG‐based features, but also visual features computed on the presented stimulus image.
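To make such a rule‐based decision process concrete, here is a toy two‐rule fuzzy sketch: triangular membership functions, min as the AND operator and a max comparison across rules. The feature names, membership shapes and value ranges (an alpha‐asymmetry score and a 1–9 stimulus valence rating) are invented for illustration and do not come from Ref. [19].

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def classify(alpha_asym, stim_valence):
    """Two toy IF-THEN rules (hypothetical shapes and ranges):
    R1: IF asymmetry is POSITIVE and stimulus is PLEASANT   -> positive
    R2: IF asymmetry is NEGATIVE and stimulus is UNPLEASANT -> negative
    AND = min; the class with the strongest rule activation wins."""
    pos = min(tri(alpha_asym, 0.0, 0.5, 1.0), tri(stim_valence, 5, 7, 9))
    neg = min(tri(alpha_asym, -1.0, -0.5, 0.0), tri(stim_valence, 1, 3, 5))
    return ("positive" if pos >= neg else "negative"), (pos, neg)

print(classify(0.4, 7.2))  # -> ('positive', (0.8, 0.0))
```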


**2.2. Event‐related potentials and emotion**

Studies of event‐related potentials (ERPs) deal with signals that can be tackled at different levels of analysis: single‐trial signals, ensemble‐averaged signals, where the ensemble encompasses several single trials, and signals resulting from a grand average over different trials as well as subjects. The segments of the time series containing the single‐trial response signals are time‐locked to the stimulus: from *t*<sub>i</sub> (negative value) before to *t*<sub>f</sub> (positive value) after stimulus onset. The ensemble average, over the trials of one subject, eliminates the spontaneous activity of the brain and the spurious noisy contributions, maintaining only the activity that is phase‐locked to the stimulus onset. The grand average is the average, over participants, of the ensemble averages, and it is used mostly for visualization purposes to illustrate the outcomes of a study. Usually, a large number of epochs linked to the same stimulus type needs to be averaged in order to enhance the signal‐to‐noise ratio (SNR) while keeping the mentioned phase‐locked contribution of the ERP. Experimental psychology studies on emotions show that the characteristics (amplitude and latency) of the early ERP waves change according to the nature of the stimuli [20, 21]. In Ref. [16], the characteristics of the ensemble average are the features of the classifier. However, this model can only roughly approximate reality, since it cannot deal with the robust dynamical changes that occur in the human brain [22].
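The averaging hierarchy can be summarized in a few lines of NumPy. The sketch below is illustrative only; the sampling rate, epoch window and synthetic ERP shape are arbitrary assumptions.

```python
import numpy as np

def ensemble_average(epochs):
    """Average over the trials of one subject; activity that is not
    phase-locked to stimulus onset tends to cancel out."""
    return epochs.mean(axis=0)  # epochs: (n_trials, n_samples)

def grand_average(subject_averages):
    """Average of ensemble averages over participants (visualization)."""
    return np.mean(np.stack(subject_averages), axis=0)

# Synthetic demonstration: a phase-locked bump buried in noise.
rng = np.random.default_rng(0)
fs, t_i, t_f = 250.0, -0.2, 0.8                  # Hz; s before/after onset
t = np.arange(t_i, t_f, 1.0 / fs)
erp = 5e-6 * np.exp(-((t - 0.3) ** 2) / 0.005)   # P300-like component
epochs = erp + 20e-6 * rng.standard_normal((100, t.size))

avg = ensemble_average(epochs)
# Averaging 100 epochs shrinks the noise roughly tenfold (SNR ~ sqrt(n)):
print((epochs[0] - erp).std(), (avg - erp).std())
```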


Due to the mentioned limitation, frequency analysis is more appropriate, as long as it is assumed that certain events affect specific bands of the ongoing EEG activity. Therefore, several investigations have studied the effect of stimuli on characteristic frequency bands. These measures reflect changes in the gamma (γ), beta (β), alpha (α), theta (θ) or delta (δ) bands and can be used as input to a classification system. It is known that beta waves are connected to an alert state of mind, whereas alpha waves are more dominant in a relaxed context [23]. Alpha waves are also typically linked to expectancy phenomena, and it has been suggested that their main sources are located in parietal areas, while beta activity is most prominent over the frontal cortex during intense, focused mental activity [22]. Furthermore, regarding the processing of emotional valence, psychophysiological research has shown different patterns in the electrical activity recorded from the two hemispheres [24]. Comparisons of spectral band power between the left and the right hemisphere of a participant's brain reveal that the left frontal area is related to positive valence, whereas the right one is more related to negative valence [25].
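A common way to turn these observations into classifier inputs is to estimate band power with Welch's method and, for valence, compare homologous frontal electrodes. A minimal sketch, assuming SciPy and hypothetical electrode names (e.g., F3/F4); the band boundaries below vary across studies:

```python
import numpy as np
from scipy.signal import welch

# Canonical EEG bands in Hz (boundaries are a common but not universal choice).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(x, fs):
    """Mean power spectral density per band (Welch's method)."""
    f, psd = welch(x, fs=fs, nperseg=min(len(x), int(2 * fs)))
    return {name: psd[(f >= lo) & (f < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def frontal_alpha_asymmetry(left, right, fs):
    """ln(alpha power, right) - ln(alpha power, left), e.g., F4 vs. F3.
    Alpha is inversely related to activation, so a positive score points
    to greater relative left-frontal activity (positive valence)."""
    return (np.log(band_powers(right, fs)["alpha"])
            - np.log(band_powers(left, fs)["alpha"]))

# Hypothetical usage with two frontal channels sampled at 250 Hz:
# score = frontal_alpha_asymmetry(eeg["F3"], eeg["F4"], fs=250.0)
```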

In brain‐related studies, one of the most popular, a simple and reliable measure from the spectral domain is the event‐related desychronization/synchronization (ERD/ERS). It repre‐ sents a relative decrease (ERD) or increase (ERS) in the power content in time intervals after the stimulus onset when compared to a reference interval defined before the stimulus onset [26]. ERD/ERS estimated for the relevant frequency bands during the perception of emotional stimulus have been analyzed [27, 28]. It is suggested that ERS in the theta band is related to emotional processes, together with an interaction between valence and hemisphere for the anterior‐temporal regions [27]. Later on, experiments showed that the degree of emotional impact of the stimulus is significantly associated with increase in evoked synchronization in the δ‐, α‐, β‐, γ‐ bands [28]. In the same study, it was also suggested that the anterior areas of the cortex of both hemispheres are associated predominantly with the valence dimension of emotion. Moreover, in Ref. [29], it has been suggested that delta and theta bands are involved in distinguishing between emotional and neutral states, either with explicit or implicit emo‐ tions. Furthermore, in Ref. [30], the results showed that centrofrontal areas showed signifi‐ cant differences of ERD‐delta associated with the valence dimension. They also reported that desynchronization of the medium alpha range is associated with attentional resources. More recently, in Ref. [31], the relationships of the late positive potential (LPP) and alpha‐ERD dur‐ ing the viewing of emotional pictures have been investigated. The statistical results obtained by these studies show that it is worth considering ERD/ERS measures as inputs to classifiers meant to automatically recognize emotions. Interestingly, a recent review about affective computing systems [7] emphasizes the advantages of using frequency‐based features instead of the ERP components.
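A minimal ERD/ERS computation in the spirit of [26]: band‐pass filter each trial, square to obtain instantaneous power, average over trials, and express post‐stimulus power relative to a pre‐stimulus reference interval. The interval boundaries and filter order below are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def erd_ers(epochs, t, fs, band, ref=(-0.5, 0.0), test=(0.2, 0.7)):
    """ERD/ERS% = (A - R) / R * 100, where R is the band power in a
    pre-stimulus reference interval and A the post-stimulus power;
    negative values indicate ERD, positive values ERS.
    epochs: (n_trials, n_samples), time-locked so that t = 0 is onset."""
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="bandpass")
    power = (filtfilt(b, a, epochs, axis=-1) ** 2).mean(axis=0)
    R = power[(t >= ref[0]) & (t < ref[1])].mean()
    A = power[(t >= test[0]) & (t < test[1])].mean()
    return (A - R) / R * 100.0

# Hypothetical usage for the alpha band at 250 Hz:
# value = erd_ers(epochs, t, fs=250.0, band=(8.0, 13.0))
```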
