Relatively newer databases have begun recording more subtle emotions hidden behind other forced or dominant emotions. Among these are the MAHNOB [51] database, which focuses on emotional laughter and its different types, and databases that record emotions hidden behind a neutral or straight face, such as the SMIC [34], RML [54] and Polikovsky's [55] databases. One of the more recent databases, iCV-MEFED [52, 56], takes a different approach by posing varying combinations of emotions simultaneously, where one emotion takes the dominant role and the other is complementary. Sample images can be seen in Figure 6.

Figure 6. Combinations of emotions from the iCV-MEFED [52].

3.1. Action units

The Facial Affect Scoring Technique (FAST) was developed to measure facial movement relative to emotion. It describes the six basic emotions through facial behaviour: happiness, surprise and disgust are scored at three intensities, while anger is reported as either controlled or uncontrolled [57]. Building on the earlier work of Darwin [58], Duchenne [59] and Hjortsjö [60], Ekman and Friesen [61] developed the Facial Action Coding System (FACS), a comprehensive system that catalogues all visually distinguishable facial movements.

FACS describes facial expressions in terms of 44 anatomically based Action Units (AUs). They are used, among other applications, for studying facial punctuators in conversation, detecting facial deficits indicative of brain lesions, and emotion detection. FACS deals only with visible changes in the face, which are often induced by a combination of muscle contractions; because a visible change cannot always be traced back to a single muscle, the units are called action units rather than muscle units [61]. A small sample of such expressions can be seen in Figure 7, and a selection of databases based on AUs instead of regular facial expressions is listed in Table 1.
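Because AUs only describe facial movements, emotion labels are usually derived from combinations of active AUs. As a rough illustration, the sketch below matches a set of active AUs against prototypical AU combinations often cited for the six basic emotions (EMFACS-style); the exact sets vary between sources and are an assumption here, not the coding scheme of any database listed below.

```python
# Illustrative sketch: prototypical (EMFACS-style) mappings from FACS
# action units to the six basic emotions. The exact AU sets vary between
# sources; the combinations below are common illustrative choices, not
# the definitive coding used by any particular database.

# AU numbers follow the FACS convention, e.g. 6 = cheek raiser,
# 12 = lip corner puller.
PROTOTYPES = {
    "happiness": {6, 12},
    "sadness":   {1, 4, 15},
    "surprise":  {1, 2, 5, 26},
    "fear":      {1, 2, 4, 5, 7, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},
}

def match_emotion(active_aus):
    """Return emotions whose prototypical AU set is fully active."""
    active = set(active_aus)
    return [emo for emo, aus in PROTOTYPES.items() if aus <= active]

print(match_emotion([6, 12]))        # ['happiness']
print(match_emotion([1, 2, 5, 26]))  # ['surprise']
```

A real AU-coded database additionally stores per-frame intensity scores rather than a binary active/inactive flag, so a practical matcher would threshold intensities before applying such a lookup.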

Figure 7. Induced facial action units from the DISFA database [62].

Database | Year | Participants | Elicitation | Format | Action units | Additional information
CMU-Pittsburgh AU-Coded Face Expression Database [27] | 2000 | 210 | Posed | Videos | 44 | Varying ethnic backgrounds, FACS coding
MMI Facial Expression Database [63, 64] | 2002 | 19 | Posed and audiovisual media | Videos, images | 79 | Continuously updated, contains different parts
Face Video Database of the MPI [65, 66] | 2003 | 1 | Posed | Six viewpoint videos | 55 | Created using the MPI VideoLab
D3DFACS [67] | 2011 | 10 | Posed | 3D videos | 19–97 | Supervised by FACS specialists
DISFA [62] | 2013 | 27 | Audiovisual media | Videos | 12 |

Table 1. Action unit databases.

In 2002, the FACS system was revised: the number of facial contraction AUs was reduced to 33, and 25 head pose AUs were added [68–70]. In addition, there is a separate FACS version intended for children [71].

4. Database types

Emotion recognition databases may come in many different forms, depending on how the data was collected. We review existing databases for different types of emotion recognition. In order to better compare similar types of databases, we split them into three broad categories based on format: the first two separate still images from video sequences, while the last comprises databases with more unique capturing methods.

4.1. Static databases

Most early facial expression databases, like the CK [27], consist only of frontal portrait images taken with simple RGB cameras. Newer databases try to design collection methods that capture data closer to real-life scenarios by using different viewing angles and occlusions.
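Many static databases of this kind are distributed as one folder of images per emotion label. As a purely illustrative sketch (the directory layout and label names are assumptions, not the actual structure of CK or any other specific database), such a collection could be indexed like this:

```python
# Hedged sketch: index a static expression database laid out as one
# sub-directory per emotion label (e.g. root/happiness/s01.png). The
# layout and label names are illustrative assumptions, not the actual
# structure of CK or any other specific database.
import tempfile
from pathlib import Path

def index_dataset(root):
    """Map each emotion label (sub-directory name) to sorted image file names."""
    root = Path(root)
    return {
        d.name: sorted(p.name for p in d.iterdir() if p.suffix == ".png")
        for d in sorted(root.iterdir())
        if d.is_dir()
    }

# Tiny demo on a throw-away directory standing in for a real database.
demo = tempfile.mkdtemp()
for label, files in {"anger": ["s01.png"], "happiness": ["s01.png", "s02.png"]}.items():
    sub = Path(demo, label)
    sub.mkdir()
    for name in files:
        (sub / name).touch()

index = index_dataset(demo)
print(index)  # {'anger': ['s01.png'], 'happiness': ['s01.png', 's02.png']}
```

Keeping the label in the path rather than in a side-car annotation file is what makes these frontal-portrait databases straightforward to load compared with the AU-coded video databases of Table 1.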




