#### **4.1 Contextual approaches to language**

Context-dependence effects are pervasive in everyday cognition (Barutta et al., 2011; Cosmelli and Ibañez, 2008; Ibanez and Cosmelli, 2008; Ibanez et al., 2010a), especially in the case of language (Ledoux, Camblin, Swaab and Gordon, 2006; Rodriguez-Fornells, Cunillera, Mestres-Misse and de Diego-Balaguer, 2009). We hear and utter words embedded in streams of other words. We perceive the emotion of a face together with emotional body language, semantics, prosody and other cues from the situation. Language use can be tracked by assessing the influence of contextual parameters (such as intonation, lexical choice, prosody and paralinguistic cues) on a given communicative situation. ERP studies of early (N170 and ELAN) and late components (N400, LPC, LPP) have provided important insights into the temporal brain dynamics of contextual effects in language. For instance, phenomena such as the automaticity of contextual effects, multimodal blending of meanings, action-sentence coupling, language-like gesture processing, the coupling of language and social information, and early emotional word processing have been demonstrated within ERP research (Aravena et al., 2010; Cornejo et al., 2009; Hagoort, 2008; Ibanez et al., 2006, 2009, 2010b, 2010c, 2011b, 2011c, 2011d, 2011e; Van Petten and Luka, 2006). Contextual effects in language assessed with ERPs are a relevant topic in diverse areas of neuropsychiatric research, such as schizophrenia (Guerra et al., 2009; Ibanez et al., 2011c), Alzheimer's disease and mild cognitive impairment (Schwartz et al., 2003; Taylor and Olichney, 2007), focal basal ganglia lesions (Paulmann, Pell and Kotz, 2008), alcoholism (Roopesh et al., 2009) and aphasia (Wassenaar and Hagoort, 2005), among other conditions.

Event-Related Potential Studies of Cognitive and Social Neuroscience 405


#### **4.2 Emotion and emotional body language**

Today, it is well known that complex social skills depend on basic emotional processing and inference (Grossmann, 2010). Moreover, facial emotional expressions can provide an automatic and rapid shortcut to alarm signaling, mentalizing and inter-subjective communication. Important issues in emotion research, such as facial emotion processing (Eimer and Holmes, 2007), emotion regulation (Hajcak, MacNamara and Olvet, 2010) and the intertwining of attention and emotion (Schupp, Flaisch, Stockburger and Junghofer, 2006), have a long tradition in ERP research.

Early, automatic and unaware processing of emotion in faces, words and pictures has been demonstrated within ERP research (Guex et al., 2011; Ibanez et al., 2010c, 2011d, in press, submitted b; see Figure 3.A). Theoretical models of emotion perception (Vuilleumier and Pourtois, 2007) propose a parallel and interactive system indexing object recognition (e.g., triggered by the fusiform gyrus) and emotional discrimination (e.g., triggered by the amygdala). Emotional signs that can denote confidence or danger may be processed before and in parallel with object codification. In other words, emotional significance can be processed before a stimulus is completely identified. At the same time, processing of complex social stimuli intermixed with emotional processing has been reported at late stages, indexed by the LPP and LPC (Dufey et al., 2010; Hurtado et al., 2009; Ibanez et al., 2009, 2010b, 2011b; see Figure 3.B). Emotional body language (EBL) is another emergent area in neuroscience research (de Gelder, 2006). Neuroimaging studies have shown that EBL activates areas similar to those involved in emotional face processing, such as the amygdala and fusiform gyrus. EBL signals are automatically perceived and influence emotional communication and decision making. ERP research has demonstrated that EBL (a) is automatic and processed early in the brain; (b) influences the emotional recognition of faces; and (c) is processed in an integrated way with face processing (de Gelder et al., 2006; Meeren, van Heijnsbergen and de Gelder, 2005).

Fig. 3. Early and late emotional-cognitive processing. A) Schematic representation of the implicit association test (IAT). Both ingroup and outgroup faces, along with words of positive and negative valence, are presented, and the subject is required to classify each stimulus to the left or to the right according to labels displayed at the top of the screen. B) Early (N170) and late (LPP) effects of the IAT. C) N170 contextual modulation based on stimulus valence and membership. D) Late processing (LPP) of semantic stimulus compatibility. Modified from Hurtado et al., 2009 and Ibanez et al., 2010c.

#### **4.3 Empathy**

404 Neuroimaging – Cognitive and Clinical Neuroscience


A large number of studies using functional MRI, and more recently electrophysiology, have used the presentation of stimuli depicting people in pain (i.e., people with physical injuries or displaying facial expressions of pain) to characterize the neural underpinnings of empathic processing (Botvinick et al., 2005; Jackson et al., 2006; Cheng et al., 2008a; Fan et al., 2008; Han et al., 2008; Akitsuki and Decety, 2009; Decety et al., 2010c). The results from these studies suggest that empathy for pain involves a somatosensory resonance mechanism between self and other that draws on the affective and sensory dimensions of pain processing (Jackson et al., 2006). This mechanism provides crucial and rapid information to help us understand the affective states of others and respond to them (Decety and Lamm, 2006).

ERP studies of empathy for pain have shown an N1 differentiation over frontal areas (neutral pictures eliciting greater negative amplitudes), as well as a late P3 effect over the centroparietal region (pain pictures producing greater positive amplitudes; Fan et al., 2008; Han et al., 2008; Decety et al., 2010c). These ERP studies have shown early modulation by the contextual reality of the stimuli and late modulation by cognitive regulation and task demands (Fan and Han, 2008; Han et al., 2008; Decety et al., 2010c; Li and Han, 2010), as well as by 'other-related' information, such as priming for threat signaling (Ibanez et al., 2011e). ERP studies have thus provided important insights regarding context-dependent processing and the difference between automatic and controlled processing in empathy for pain research.
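Component effects like the N1 and P3 differences described above are typically quantified as mean amplitudes within a priori time windows of the condition-averaged waveforms. The following is a minimal sketch of that step; the sampling rate, window bounds and toy waveforms are illustrative assumptions, not values from the studies cited:

```python
import numpy as np

def mean_window_amplitude(erp, times, t_start, t_end):
    """Mean amplitude (microvolts) of an averaged ERP waveform
    within the window [t_start, t_end), given a matching time axis."""
    mask = (times >= t_start) & (times < t_end)
    return float(erp[mask].mean())

# Illustrative epoch: 1000 Hz sampling, -200 to 800 ms.
times = np.arange(-0.2, 0.8, 0.001)

# Toy condition averages at a frontal site: a negative deflection
# around 100 ms that is larger (more negative) for neutral pictures,
# mimicking the N1 effect described in the text.
neutral = -5.0 * np.exp(-((times - 0.10) ** 2) / (2 * 0.02 ** 2))
pain = -3.0 * np.exp(-((times - 0.10) ** 2) / (2 * 0.02 ** 2))

n1_neutral = mean_window_amplitude(neutral, times, 0.08, 0.12)
n1_pain = mean_window_amplitude(pain, times, 0.08, 0.12)

# Neutral pictures elicit the greater (more negative) N1 amplitude.
print(n1_neutral < n1_pain)  # True
```

In practice the same windowing is applied per subject and electrode cluster before statistical comparison, but the core quantification reduces to this averaging step.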

#### **4.4 Decision making and reward**

The current neuroscience of decision making has assessed the multiple processes engaged in this complex cognitive ability. Evidence from animals, healthy human volunteers and neuropsychiatric patients (e.g., Bechara and Van Der Linden, 2005; Brand et al., 2006; Camerer et al., 2008; Gleichgerrcht et al., 2010; Kable and Glimcher, 2009; Glimcher and Rustichini, 2004; Rangel et al., 2008; Rushworth et al., 2007) highlights the role of frontostriatal and limbic loops in decision making. Despite some discrepancies between models, three main systems are thought to be involved in these loops: a stimulus-encoding system (orbitofrontal cortex), a reward-based action-selection and monitoring system (cingulate cortex) and an expected-reward system (basal ganglia and amygdala). We have shown (Gleichgerrcht et al., 2010) that these systems are crucial to decision making in healthy volunteers as well as in neuropsychiatric disorders, such as neurodegenerative diseases (Figure 4). The action-selection and monitoring system can be tracked directly with the P2, the ERN and the fERN, opening a new branch of research (Nieuwenhuis et al., 2004, 2005). Gambling and decision-making tasks can be assessed with ERPs (e.g., San Martin et al., 2010). Behavioral measures of affective and risky decision-making tasks may not be sensitive enough to detect subtle decision-making deficits in disorders such as adult attention deficit hyperactivity disorder and bipolar disorder. Conversely, abnormal neural processing of the valence and magnitude of rewards in a gambling task, indexed with ERPs, may help to integrate the reward, action-selection and monitoring systems in those disorders, providing an excellent shortcut to goal-directed action (Ibanez et al., submitted a). ERP research on gambling tasks thus provides both a clinical and a theoretical branch of research linking decision making, soft frontal diseases, monitoring-reward systems and psychiatry.

Fig. 4. A neuroanatomical model of decision making. Three main systems are thought to be involved: a stimulus-encoding system (orbitofrontal cortex, shown in red), an action-selection system (anterior cingulate cortex, shown in green) and an expected-reward system (basal ganglia and amygdala, shown in blue). The anterior, medial and posterior cingulate cortex, together with the basal ganglia (ellipse), seem to modulate the ERN and fERN in gambling and error-monitoring tasks. Modified from Gleichgerrcht et al., 2010.

#### **5.1 Sleep research**

The study of cognitive processing during sleep is a topic of great interest because ERPs allow stimulation to be studied with passive paradigms (without conscious or behavioral responses), opening multiple research possibilities during the different sleep phases (Ibanez et al., 2008a). Cognitive discriminations during sleep related to the learning, frequency, intensity, duration, saliency, novelty, proportion of appearance, meaning and even sentential integration of stimuli are topics of intense research (e.g., Ibanez et al., 2006). Methodological controls in ERP sleep research, such as the use of qualitative and quantitative measures of sleep stages (see Figure 5), control of the so-called first-night effect and the assessment of sleep disturbances, are important factors for improving this research area (Ibanez et al., 2008b). Better control of experimental paradigms is relevant for the growth of the neuroscience of sleep.

Fig. 5. Quantitative assessment of sleep stages during ERP recordings. A) Voltage maps and N400 waveform modulation of contextual semantic discrimination during REM sleep. B) Comparison of frequency bands between sleep stages (time-frequency charts for stage II sleep, REM sleep and the stage II minus REM subtraction), and microvolt differences (means and standard deviations) of delta band activity during stage II and REM (bottom right). Modified from Ibanez et al., 2006, 2008b.

#### **5.2 Intracranial recordings**

The use of local field potentials (LFPs) and electrocorticography (ECoG) in patients with surgically implanted electrodes (Figure 6) has provided a new pathway to study the spatiotemporal brain dynamics of cognition. Intracranial recordings help to diagnose and treat neurological conditions such as epilepsy, Parkinson's disease and tumors. LFPs and ECoG are direct measures of brain activity with a better combined temporo-spatial resolution than any other human neuroscience method. ERP assessment, together with evoked oscillatory activity, has provided important insights on working memory, episodic memory, language, face processing, consciousness and spatial cognition (Jacobs and Kahana, 2010; Lachaux et al., 2003).
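Quantitative band comparisons of the kind discussed in this chapter, whether delta activity across sleep stages or evoked oscillatory activity in intracranial recordings, reduce to estimating spectral power within a frequency band. A minimal sketch with SciPy on synthetic surrogate signals follows; the sampling rate, band limits and signal composition are illustrative assumptions, not parameters from the studies cited:

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi):
    """Average spectral power of `signal` within [f_lo, f_hi] Hz,
    estimated with Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(psd[band].mean())

fs = 250  # Hz, illustrative sampling rate
t = np.arange(0, 30, 1 / fs)  # 30 s of signal
rng = np.random.default_rng(0)

# Toy surrogates: a "stage II" signal rich in slow (2 Hz, delta-range)
# activity, and a "REM" signal dominated by faster, smaller activity.
stage2 = 40 * np.sin(2 * np.pi * 2 * t) + rng.normal(0, 5, t.size)
rem = 5 * np.sin(2 * np.pi * 20 * t) + rng.normal(0, 5, t.size)

delta_stage2 = band_power(stage2, fs, 0.5, 4.0)
delta_rem = band_power(rem, fs, 0.5, 4.0)

# Delta power dominates in the stage II surrogate, as in Figure 5.
print(delta_stage2 > delta_rem)  # True
```

Real analyses would average such estimates over epochs and electrodes per sleep stage (or per experimental condition) before computing subtractions like the stage II minus REM contrast shown in Figure 5.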
