## **4. Neural correlates of reading emotion-laden literature**

Reading literature (such as poems and novels) brings various affective responses, such as sadness, feelings of suspense, and a sense of beauty. Literary reading is a constructive process, linked to perspective taking and relational inferences associated with the extended language network [36], the ToM network [37], and regions associated with mood empathy [38] and esthetic positive feelings [39]. The emotional connotation of single words recruits attention, can induce engagement in readers of texts, and seems to be supported by the activity of prefrontal affective networks. Ferstl et al. [40] revealed that auditory presentation of emotion-laden text passages activated the ventral mPFC (vmPFC), the left amygdala, and the pons. Wallentin et al. further localized the neural correlates of intensity ratings of each line of a text to bilateral temporal, IFG, and premotor regions, and to the right amygdala [41]. An fMRI study [42] presented 120 short passages from the Harry Potter book series. Emotional ratings at three levels were used as regressors for a parametric analysis: ratings of single words, ratings of relations between words (e.g., the contrast in emotional valence between words), and ratings of whole passages. The contrast between literary reading and fixation engendered activations in the dorsolateral PFC (dlPFC), TPJ, anterior temporal lobe (aTL), precuneus, and amygdala, regions associated with ToM or affective empathy processing [43], and in the aTL and vmPFC, associated with multimodal integration and emotional conceptualization [44]. The study also demonstrated that arousal ratings of lexical and inter-lexical items were correlated with activity in areas associated with emotional salience, emotional conceptualization, situation model building, multimodal semantic integration, and theory of mind.
For example, more positive valence was associated with activity in the left dlPFC, left premotor cortex, bilateral aTL, left TPJ, left PCC, and precuneus. Lexical valence span modulated the left amygdala, which has been implicated in salience detection, and the effects of arousal span were significant in the anterior insula, extending from the IFG, which has been linked to the integration of autonomic processes with emotional and motivational functions. However, effects of passage-level ratings were demonstrated not in emotion-associated regions but in regions serving ToM or affective empathy processing and multimodal semantic integration (IFG, dlPFC, aTL, TPJ, precuneus, dorsal ACC, vmPFC). This finding differed from the observation in Altmann et al. [45], where stories with negative valence elicited stronger connectivity between the mPFC and the left amygdala and bilateral insula, regions involved in affective empathy and ToM processing. Moreover, the mPFC was more activated when readers made more positive judgments about the negative stories. Whether the emotion potential of short texts can be predicted uniquely by lexical and inter-lexical affective variables, or also by passage-wise ratings, is worth further investigation.
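The logic of such a parametric analysis can be sketched in a few lines: each item's rating enters the design matrix as an amplitude modulator alongside an unmodulated "reading" regressor. The sketch below is illustrative only, not the pipeline of [42]: it uses simple boxcars, ignores hemodynamic convolution, and all ratings and effect sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# 20 passages, each "on" for 5 scans followed by 5 scans of rest.
n_items, on, off = 20, 5, 5
ratings = rng.uniform(1, 7, n_items)   # hypothetical arousal rating per passage
ratings_c = ratings - ratings.mean()   # mean-center the parametric modulator

boxcar, modulator = [], []
for r in ratings_c:
    boxcar += [1.0] * on + [0.0] * off
    modulator += [float(r)] * on + [0.0] * off
boxcar, modulator = np.array(boxcar), np.array(modulator)

# Design matrix: intercept, main effect of reading, rating-modulated effect.
X = np.column_stack([np.ones(boxcar.size), boxcar, modulator])

# Simulate one voxel whose response scales with the arousal rating.
true_betas = np.array([0.5, 2.0, 1.5])
y = X @ true_betas + rng.normal(0, 0.1, boxcar.size)

betas, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(betas, 2))   # betas ≈ [0.5, 2.0, 1.5]
```

Mean-centering the modulator keeps it orthogonal to the main reading regressor, so the two betas can be interpreted separately; the same rationale allows lexical, inter-lexical, and passage-level ratings to be tested as distinct regressors.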

A related question is how one's language experience (e.g., familiarity, age of acquisition) affects the brain responses (especially the prefrontal involvement) underlying the reading of emotion-laden literature. Reading fiction involves language-related processes including constructive content imagination and simulation [46], and perspective taking and relational inferences [47]. These processes are supported by the extended language network associated with discourse comprehension, the neural mechanisms underlying high-level/multimodal semantic integration [44, 48], and the theory of mind network, generally including the vmPFC, dmPFC, IFG, aTL, TPJ, PCC, precuneus, and left amygdala. Effects of emotionality have been shown to predict activity in the left amygdala, vmPFC, and pons when listening to emotion-laden text passages [40]. Ref. [49] showed that happy passages activated the left precentral gyrus (the head/face area of the somatotopy) and bilateral amygdala only when the literature was presented in the reader's first language (L1 reading, German); regardless of language status, emotion-laden literature activated the emotion-related amygdala, as well as lateral prefrontal, anterior temporal, and temporo-parietal regions associated with discourse comprehension, high-level semantic integration, and ToM processing. Moreover, a multivariate pattern analysis revealed better accuracy of differential patterns of brain activity in predicting different emotional contents in L1 than in the second language (L2), with sensitivity attenuated in L2 relative to L1. These patterns suggest that neural activations support a stronger and more differentiated emotional experience when reading texts in one's native language than in a second language.
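The multivariate pattern analysis logic in [49] can be caricatured with a minimal leave-one-out decoder on synthetic "voxel patterns". Everything below is an illustrative sketch under invented assumptions, not the authors' pipeline: the patterns, signal strengths, and the nearest-centroid classifier are all hypothetical, with the "L2-like" condition simply given an attenuated signal to mimic reduced sensitivity.

```python
import numpy as np

rng = np.random.default_rng(1)

def loo_nearest_centroid(patterns, labels):
    """Leave-one-out nearest-centroid decoding accuracy."""
    correct = 0
    for i in range(len(patterns)):
        train = np.delete(patterns, i, axis=0)
        tlabs = np.delete(labels, i)
        centroids = {c: train[tlabs == c].mean(axis=0) for c in np.unique(tlabs)}
        pred = min(centroids, key=lambda c: np.linalg.norm(patterns[i] - centroids[c]))
        correct += pred == labels[i]
    return correct / len(patterns)

n_trials, n_voxels = 40, 50
labels = np.array([0, 1] * (n_trials // 2))     # two emotion categories
signal = np.where(labels[:, None] == 0, 1.0, -1.0) * np.linspace(0.5, 1.0, n_voxels)

# "L1-like" patterns carry a strong, differentiated signal;
# "L2-like" patterns carry the same signal, attenuated, with the same noise.
l1 = signal + rng.normal(0, 1.0, (n_trials, n_voxels))
l2 = 0.2 * signal + rng.normal(0, 1.0, (n_trials, n_voxels))

acc_l1 = loo_nearest_centroid(l1, labels)
acc_l2 = loo_nearest_centroid(l2, labels)
```

Under this construction the decoder classifies the strong-signal patterns near-perfectly while accuracy for the attenuated patterns drops toward chance, mirroring the reported L1-versus-L2 difference in decoding sensitivity.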

## **5. Transmitting and learning language in social contexts**

…memory, planning, and cognitive flexibility) and ToM tasks (including first- and second-order theory of mind, i.e., mentalizing another person's mind or another's knowledge about others), the latter of which predict an individual's performance in communicative-pragmatic tasks.

How are messages propagated? What are the underlying neural mechanisms? One key aspect of how our language is grounded in social interaction is the synchronized linguistic behavior between communicative partners: partners tend to become more similar in their use of nonverbal cues (e.g., [50]) and linguistic structures (e.g., [51]), and in the neural activity associated with producing and decoding narratives (e.g., [52, 53]). Do the mechanisms of verbal synchrony or linguistic style matching between communicators also apply to the relationship between the social language used by one communicator and that used by a listener who subsequently retransmits the message? One fMRI study addressed the neural mechanisms underlying the processing and retransmission of social language in the context of word-of-mouth sharing [54]. The brain systems engaged in considering the mental states of others are particularly engaged by social features of language (e.g., words associated with social interaction, which refer to individuals who may participate in a social interaction, such as "friend", or describe these interactive processes, such as "exchange"), and activity within the brain's mentalizing system during exposure to ideas predicts the extent to which social language is subsequently employed in describing the ideas to others. In particular, the brain's mentalizing system includes the bilateral TPJ and dorsomedial prefrontal cortex (dmPFC) as well as the precuneus and PCC [55]. Previous literature has examined the mechanisms of successful communication in pairs [52, 53] and how simulating others' mental states can facilitate effective idea retransmission [56, 57]. The study showed that the use of more social words to introduce ideas was associated with increased neural activation in the dmPFC, bilateral TPJ, and temporal pole, networks typically responsible for mentalizing.
Moreover, higher levels of activity in the left TPJ and dmPFC during idea exposure were positively associated with greater usage of words from social categories after the experiment. Here, the dmPFC is thought to serve functions associated with considering others' attributes and motivations [58, 59], and to reflect the speaker's pursuit of a specific motivational goal (e.g., to look good by communicating good ideas in a compelling way to others). These findings consolidate the idea that social cues in language (here, lexical items related to social interaction) can activate the medial prefrontal cortex and other systems implicated in understanding the mental states of others (who introduced the ideas about a new object) and in the successful retransmission of ideas.

Prefrontal Cortex: Role in Language Communication during Social Interaction

http://dx.doi.org/10.5772/intechopen.79255


Social communication is fundamental to human daily activity [60–62], and nonlinguistic social cues contribute considerably to it [63]. A growing body of literature suggests that the social inference networks, including those for understanding others' mental states and monitoring others' feelings, may be implicated in communicative tasks. One may recognize another's intention in two ways. One way recruits motor simulation processes involving the premotor cortex (PMC) and the anterior intraparietal sulcus (aIPS), especially in tasks that require understanding an intention conveyed by body motion [64]. The other relies on inferential processes based on "theory of mind" [65] or mentalizing, which is typically supported by regions non-overlapping with the motor system, including the mPFC and the TPJ as well as the posterior superior temporal sulcus (pSTS) [66]. These regions are mainly involved when intentions are embedded in stories or cartoons in which the goals or beliefs of the characters are not explicitly encoded by communicative cues [67], or when individuals are instructed to identify the intentions of actors they observe [68].

Nonverbal communicative cues (e.g., gaze, voice) are essential social signals in language communication. To understand how the mentalizing and mirroring systems contribute to the recognition of intention via nonverbal communicative cues, an fMRI study focused on communicative (e.g., looking at a person) versus private intentions (e.g., looking at an object), as well as other-directed versus self-directed intentions (i.e., whether the actor faced the camera and therefore the participant) [69]. Previous studies on cartoons demonstrated that mentalizing areas were involved when cartoons contained more social interactions and when the characters showed fewer private intentions and more social prospective intentions (e.g., preparing future social interactions, which are considered more "communicative" intentions) [67, 69]. The right TPJ was activated regardless of intention type, whereas the mPFC was selectively activated by social prospective and communicative intentions; the dmPFC was considered uniquely involved in decoding intention during movement observation. The dmPFC has also been associated with social gaze shifts, with increased activity when a participant's gaze shift is directed at another person [70] or when they follow the gaze of another person to engage in joint attention [71], suggesting a role of the medial prefrontal cortex in the engagement of social communication from a second-person perspective. In the fMRI study [69], participants watched videos in which the actor faced and looked toward the camera; faced toward but looked away from the camera (at an object in his/her hand); faced 30° away from the camera and looked toward another person outside of the frame; or faced away and looked away from the camera. Observing actions performed with a communicative intent (looking toward the person) versus a private intent (looking away from the camera) activated the mPFC, bilateral pSTS, and left TPJ of the mentalizing network, and the bilateral PMC and bilateral aIPS of the mirroring system.
The mPFC activity increased with the individual's trait empathy. The self-directed orientation (0° away from the camera) versus the other-directed orientation (30° away from the camera) activated the visual cortex, and enhanced activation was found in the mPFC for the self-directed orientation under communicative as compared with private intention. Moreover, communicative intent further strengthened the connectivity between the mPFC and the bilateral pSTS of the mentalizing system and the left PMC and bilateral aIPS of the mirroring system. These findings suggest a collaboration between the medial prefrontal cortex and other social inference networks during intention recognition in nonverbal communication.


Studies have also demonstrated that the prefrontal cortex is more involved when language acquisition occurs in a socially interactive scenario. Infants must be immersed in a language in a socially interactive situation to develop speech perception [72]. Like spoken language, sign language provides rich grammatical rules, and both activate the left IFG during syntactic comprehension [73]. Live communication, as compared with prerecorded videos, is more rewarding, more arousing, and more attention-grabbing, and provides richer sources of information (such as responsive eye gaze) (e.g., [72]). In a study on the neural correlates of language acquisition [74], naïve Japanese participants learned Japanese sign language either through interaction with a deaf signer or by watching videos for a comparable amount of time. The group that received live exposure showed modulation of BOLD signals in the left IFG between two testing sessions when making grammatical judgments about sequences of signs, a modulation absent in the group that received video exposure. The left IFG is considered crucial for processing syntactic and other linguistic rules in native adult speakers, and this finding demonstrates a clear role of exposure to a communicative environment in acquiring new linguistic knowledge (e.g., a foreign language). The group receiving DVD exposure showed activation in the right IFG and right supramarginal gyrus, suggesting that they developed knowledge of the sign language by incorporating multimodal information from different senses and by imitation learning [75].

Many forms of social interaction do not carry explicit mentalizing demands. Such implicit mentalizing processes include tracking mental state content [76, 77] and monitoring another's communicative intent [70], and they are more engaged when processing communicative cues from a real-time social partner than from a pre-recorded video. In a related experiment, participants listened to short vignettes and were led to believe that half were pre-recorded and the other half presented over a real-time audio feed by a live social partner [78]. Mentalizing regions (defined by an independent localizer paradigm with a typical false belief task, in particular the dorsal/middle/ventral mPFC and bilateral TPJ [79]) and activations associated with social engagement [66] were stronger when participants believed the speech was live than when they listened to matched recorded human speech. Activity in the right dmPFC was further correlated with the subjective rating of liveness for live versus matched speech, and with individuals' autistic traits (measured by the autism-spectrum quotient [80]): the greater the dmPFC activity, the higher the rating of liveness for live versus matched speech, and the lower the score on autistic-like traits. As mentalizing regions have been observed in fMRI studies of speech comprehension [43, 81, 82], the increased activity in prefrontal mentalizing networks may be attributed to increased belief-state reasoning during live interaction, or to an ongoing representation of a social partner that underlies phenomena such as social resonance, synchrony, and coordination.
These findings suggest that the medial prefrontal cortex is a key region indexing ongoing mentalizing about social partners, that its activity is shaped by social context, and that it may be crucial for understanding the implications of social context for typical and atypical social processing, especially in neurodevelopmental disorders such as autism, in which individuals suffer more from social difficulties in live interaction.

…indirect requests may activate regions associated with cognitive empathy (the ability to simulate others in a fictional or real-world interactive setting [24, 85, 86]), in particular the mPFC and TPJ. Moreover, sentences whose meanings are incongruent with one's real-world knowledge or with other types of contextual information (such as speaker identity, counterfactual context, etc.) activate the left IFG [87–91], as well as general executive control networks including the right IFG, IPL, and medial superior frontal gyrus (mSFG) when pragmatic incongruence between linguistic representations or meanings has to be resolved (e.g., [6, 92]). Among the prefrontal executive control networks involved in resolving pragmatic incongruence, the right IFG may subserve a process that inhibits irrelevant information to ensure a representation congruent with the contextual information, whereas the mSFG is involved more generally, regardless of context type. For example, Nieuwland [91] reported that the right IFG responded only to world-knowledge violations in a counterfactual context (e.g., *If NASA had not developed its Apollo Project, the first country to land on the moon would be \*America*), whereas the mSFG responded in both counterfactual and real-world contexts (e.g., *Because NASA developed its Apollo Project, the first country to land on the moon has been \*Russia*).

In an fMRI study, Li et al. [4] demonstrated that readers' cognitive empathy (as measured by the interpersonal reactivity index, IRI [93]) predicts neural activations when they read sentences in which the language use violated a pragmatic constraint. Into sentences with "even", which constrains an event to be of low expectedness, neutral or highly likely events were embedded, creating underspecified (e.g., *Even such a sound can be heard by Zhang, he has a sharp hearing*) and incongruent sentences (e.g., *Even such a \*loud sound can be heard by Zhang, he has a sharp hearing*). When the underspecified sentences were read, activity in the ventral mPFC was associated with the reader's fantasizing ability (an individual's trait of transposing him- or herself into the characters of fictional situations, e.g., novels). This observation of the mPFC and its individual differences may indicate that participants engage an action-related fantasizing or imagining process when making inferences about the underspecified scalar implicature. When incongruent sentences were encountered, the mSFG extending to the ACC was activated, and bilateral IFG activity was correlated with readers' perspective-taking ability (an individual's tendency to adopt the perspectives of others and see things from their point of view). The bilateral IFG was further connected with a number of prefrontal regions, such as the bilateral mSFG, SMA, and ACC (for the left IFG) and the right dlPFC and left IPL (for the right IFG), during the processing of incongruent versus congruent sentences. These conflict control networks were involved in unifying information from different sources and selecting the appropriate representation (inhibiting the inappropriate one) for the incongruent sentences. Most importantly, these findings suggest that cognitive empathy (including the shift of one's perspective to a fictional character or to another person's perspective) supports the neurocognitive mechanisms of making pragmatic inferences and resolving pragmatic failure.

The involvement of the prefrontal cortex in pragmatic processing is also supported by evidence of individual differences in autistic-like traits during language comprehension. Individuals with autism spectrum disorders demonstrated reduced neural activity in the mPFC when they inferred pragmatic meanings from metaphors or ironic remarks [94]. Using structural neuroimaging, Banissy et al. [95] demonstrated that an individual's cognitive empathy was …

The study of language communication should also be situated within the larger picture of emerging research on the social brain, especially the study of brain-to-brain coupling for learning, (re)constructing, and using language through multi-participant experiments [2]. An fMRI study scanned a speaker's brain during production of a 15-min real-life narrative and listeners' brains while they listened to the same narrative [83]. The brain regions specific to production, those specific to comprehension, and those overlapping between the two processes were examined. The left-hemisphere and bilateral temporal networks engaged during production of the narrative were shared with the comprehension system. Moreover, neural activity was coupled between the speaker's and the listeners' brains in both linguistic and extralinguistic areas during production and comprehension of the same narrative. Narrative production engendered activations in areas serving social aspects of story processing (e.g., mPFC, precuneus, dlPFC, PCC), in motor speech areas (e.g., bilateral premotor cortex, bilateral insula, and basal ganglia), in the bilateral IFG associated with the construction of grammatical structures, and in the bilateral STG/MTG previously linked to speech comprehension. The coupling between the speaker's and the listeners' brain responses was found in the precuneus and mPFC, in bilateral temporo-parietal areas associated with comprehension, and in the left IFG, bilateral insula, and left premotor cortex associated with production.
The involvement of the medial prefrontal cortex and precuneus in a range of social functions suggests that a listener's ability to relate to a speaker and to understand the content of a real-world narrative relies on higher-level social processing, including reward-based learning and memory, empathy, and ToM (for the mPFC), and first-person perspective taking and the experience of agency (for the precuneus). In particular, inferring another's intention through verbal cues plays an essential role in the exchange of information between speaker and listener and is integral to the success of real-world communication [84].
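Speaker-listener coupling of this kind is commonly quantified as a lagged inter-subject correlation between time series from corresponding regions, with a peak at a positive lag indicating that listener activity follows the speaker's. The sketch below is a toy illustration under invented assumptions (synthetic signals, a fixed 3-scan listener delay), not the analysis of [83].

```python
import numpy as np

rng = np.random.default_rng(2)

def lagged_isc(speaker, listener, max_lag):
    """Pearson r between two BOLD-like time series at each lag (in scans)."""
    rs = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            s, l = speaker[:len(speaker) - lag], listener[lag:]
        else:
            s, l = speaker[-lag:], listener[:len(listener) + lag]
        rs[lag] = float(np.corrcoef(s, l)[0, 1])
    return rs

# Synthetic shared "narrative" signal; the listener tracks the speaker
# with a 3-scan delay plus independent noise (all numbers illustrative).
n_scans, delay = 300, 3
base = rng.normal(size=n_scans + delay + 4)
story = np.convolve(base, np.ones(5) / 5, mode="valid") * 2   # smoothed signal
speaker = story[delay:] + rng.normal(0, 0.5, n_scans)
listener = story[:n_scans] + rng.normal(0, 0.5, n_scans)

rs = lagged_isc(speaker, listener, max_lag=6)
peak_lag = max(rs, key=rs.get)   # recovers the injected listener delay
```

Sweeping the lag rather than correlating at zero lag is what lets such analyses distinguish listener regions that lag the speaker (comprehension following production) from regions showing anticipatory coupling.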
