Mapping and Timing the (Healthy) Emotional Brain: A Review

*Pablo Revuelta Sanz, María José Lucía Mulas, Tomás Ortiz, José M. Sánchez Pena and Belén Ruiz-Mezcua*

## **Abstract**

The study of the emotional processing in the brain began from a psychological point of view in the last decades of the 19th century. However, since the discovery of the electrical background of mental activity around 1930, a new scientific way of observing and measuring the functioning of the living brain has opened up. In addition, Functional Magnetic Resonance Imaging (fMRI) has given neuroscientists a (literally) deeper instrument to perform such measurements. With all this technological background, the last decades have produced an important amount of information about how the brain works. In this chapter, we review the latest results on the emotional response of the brain, a growing field in neuroscience.

**Keywords:** brain, EEG, fMRI, emotions, stimuli, neuroscience

## **1. Introduction**

The study of emotions deals with the physiological and psychological correlates of subjective experiences that are evident to conscious human beings. Emotions are present and influence our lives and even our perception of reality, making the scientific approach to their study, which has only begun in relatively recent decades, very difficult.

According to [1], the pseudoscience of phrenology brought the critical idea of the physical distribution of psychological functions in the brain, opening the door to modern neuroscience that has largely corroborated this assumption.

It is widely assumed that emotions are the subjective representations of naturally evolved primary neural circuits and functions that have aided survival since the very first complex animals [2, 3]. This has two main consequences: on the one hand, the physical localization of emotional circuits lies hidden in the ancient brain (the limbic system, the amygdalae, and other inner regions). On the other hand, these regions are extensively connected to more recently developed areas, such as the cortex or the cerebellum. Therefore, external stimuli not only trigger automatic motor responses; cognitive information can also act as a "brake" on these autonomous reactions (implemented in the cerebellum) and produce more flexible and adaptive responses.

Although much research in this field focuses on damaged brains, this review covers the healthy brain that responds to emotional stimuli under laboratory conditions.

## **1.1 History**

The connection between the physical processes of the brain and its biomarkers has been assumed since the late 19th century [4].

In 1929, the German psychiatrist Hans Berger developed the novel method of Electroencephalography (EEG), opening a disruptive and scientific way of studying the processes of the living brain. Although a vast, unexplored field had opened, the first results using EEG to measure emotions did not appear until the 1960s [5], and broader interest in emotional studies had to wait until the mid-1970s, when the first such studies began to appear [6, 7].

Since then, the same basic experimental setup has been replicated in research up to the present day: a subject connected to an EEG, or to newer tomographic technologies (as in [8]), is exposed to different stimuli while their brain activity is recorded.

**Figure 1** shows the historical timeline of this research field.

Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI), two new techniques to access information inside the brain, and not only at scalp level, were developed in 1975 and 1979 respectively, and began to yield significant results in the 1980s (see, for example, [10]).

## **1.2 Brain atlases and area references**

The brain began to be mapped according to its anatomical differences in 1909 by Brodmann [11], who defined 52 regions that modern neuroscience considers extraordinarily accurate for their time. In fact, most neuroscientific works today still use Brodmann's nomenclature to specify the areas of activation.

However, Brodmann's map has been shown to be insufficiently precise for evaluating some functional characteristics of the brain, so in the 1990s the Montreal Neurological Institute proposed a more modern division of the human brain [12], with 1 mm³ templates organized in a coordinate system (X, Y, Z). Today, this brain atlas is considered a standard.

The main regions of the brain are depicted in **Figure 2**.

## **1.3 Emotional maps**

Emotions are subjective feelings, but they must be quantified in some way to allow a methodical study. Since the 1980s, there have been two main approaches in the field of emotion research: the categorical approach and the dimensional approach.

#### **Figure 1.**

*A chronology of major events associated with the development of human brain imaging, from [9], adapted with permission.*

*Mapping and Timing the (Healthy) Emotional Brain: A Review DOI: http://dx.doi.org/10.5772/intechopen.95574*

#### **Figure 2.**

*Basic brain anatomy. 1: Brain stem. 2: Limbic system. 3: Cerebellum. 4: Cerebrum. 5: Occipital lobe. 6: Temporal lobe. 7: Parietal lobe. 8: Frontal lobe.*

The dimensional approach considers that emotions are organized along a few psychological dimensions. Step by step, a consensus was established around a two-dimensional representation of emotions, shown in **Figure 3**.

The debate about the separation of the emotional dimensions of valence and arousal centers on the correlation between these two variables. Barrett, for example, found weak correlation between them [14], and Lang supports this idea, inferring that some neural circuits are engaged similarly by motivationally relevant cues regardless of valence, while other hedonic circuits may discriminate valence [3, 15]. However, other researchers have found contradictory results [16], specifically with respect to the valence and arousal of negative stimuli.

This paradigm has presented another issue since Miller's studies [17]: there seems to be a distortion in the linearity of this space, namely a negativity bias (for equal amounts of positive or negative stimulation, the negative one produces a higher response) and a positivity offset (in neutral scenarios, there is a predisposition toward appetitive responses).
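As a toy illustration of these two non-linearities, the evaluative-space idea can be sketched as two linear response functions whose intercepts and slopes differ. All parameter values below are hypothetical, chosen only to exhibit a positivity offset at zero input and a negativity bias at high input:

```python
def appetitive_response(intensity, offset=0.3, gain=0.8):
    """Response of the positive (appetitive) system to stimulus intensity.
    The higher intercept models the positivity offset; values are illustrative."""
    return offset + gain * intensity

def aversive_response(intensity, offset=0.1, gain=1.2):
    """Response of the negative (aversive) system.
    The steeper slope models the negativity bias; values are illustrative."""
    return offset + gain * intensity
```

With these assumed parameters, the appetitive system dominates for weak stimulation and the aversive system for strong stimulation, which is the qualitative shape of the bias and offset described above.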

Combinations of different scales have been used to provide representational spaces with more dimensions and, allegedly, higher accuracy [16]. Examples of such rating scales are the Bivariate Evaluation and Ambivalent Measures (BEAM), described in [18], and the three-dimensional space proposed in [19], which distinguishes between tension arousal and energy arousal.

**Figure 3.** *Emotional map from [13], adapted with permission.*

An example of a standardized assessment instrument is the Self-Assessment Manikin (SAM), a non-verbal pictorial technique that directly measures the valence, arousal, and dominance associated with a person's affective reaction to a wide variety of stimuli [20].

There is an important body of evidence supporting cross-cultural stability in the perception of emotions [21, 22], and the pleasant-unpleasant dimension seems to exist in all cultures [23]. Ekman considers a few categories of innate and universal emotions (happiness, sadness, anger, fear and disgust) from which all other emotions can be derived [24]. The categorical approach, which considers discrete emotional maps related to basic adaptive problems, has been shown to be cross-cultural or even cross-species (for a review, please refer to [25]).

## **1.4 Emotional stimuli**

As stated in the History section, the setup of most neuroscientific experiments involves stimuli to elicit emotions (or any other response) in the subject under study.

In addition, different stimuli trigger different and specific areas of the brain, so the choice of stimuli is crucial for the information expected to be retrieved from the experiment.

We will present the most commonly used ones and some observations about their effectiveness.

## *1.4.1 Emotion elicitation techniques*

Out of a total of 248 reviewed articles, al-Nafjan et al. [26] tally the types of stimuli used in **Table 1**.

*Why use images?*

Psychologists have already shown that images have strong effects on the emotions of human beings [16]. An important result found in the literature states that simple images generate better emotional responses than complex scenes [3, 27].

Please refer to [28] for a deeper review.

*Why use music?*

The role of music in producing emotional responses is widely accepted and is one of its defining features [29]. Moreover, this effect has been proven to be cross-cultural [30, 31], making music a very stable and reliable way to provoke emotions in subjects.

When using audio-visual stimuli, it is important to take into account the predominance of image over sound in cases of ambiguity or emotional conflict [32].


**Table 1.**

*Emotional stimuli used according to their nature, from [26].*


*Why use words?*

It is well known that the brain dedicates exclusive, hierarchically organized resources to word and language processing, such as Broca's and Wernicke's areas, which demonstrates their importance in evolution and survival. It has subsequently been found that words and language stimuli also function as emotional triggers [33, 34].

*Others*

Among the "other" stimuli in **Table 1**, researchers have used olfactory [35, 36] or food [37] stimulation, for example.

## *1.4.2 Databases*

To ease replication, comparison and the contrasting of theories and results, many research institutions and authors have developed publicly available databases of normalized emotional stimuli, labeled according to different paradigms and previously tested. Some of the most widely used are the following:


As we have seen, the databases cover language, images, sounds and combinations thereof.

## **2. Methods**

Nowadays, almost all neuroscientific studies and findings are based on two non-invasive, biomarker-free technologies: Electroencephalography (EEG) and functional Magnetic Resonance Imaging (fMRI).

## **2.1 EEG**

The EEG is based on the evidence that massive clusters of neurons fire at the same time when working in synchrony, producing tiny voltage changes around them (in the range of millivolts to microvolts). The EEG can be measured directly on the brain surface or on the scalp. In both cases, the system has the following elements:


Important advantages of this technique are its setup (some equipment is portable) and its price (the cheapest of the techniques discussed here). In the scalp version, it is non-invasive and very safe for the subject.

Its main advantage, however, is its temporal resolution, on the order of a millisecond, which captures very rapid changes in scalp potentials. Because of this, only EEG allows phase measurements, synchronization computations, spectral analysis, and other time-related processing.

In EEG, the first representational information found was the so-called Event-Related Potentials (ERPs), signals produced as a reaction to a stimulus, typically within a few hundred milliseconds to several seconds. These signals have proven to be stable across subjects and experiments, and some of them, such as the P300 (a positive peak 300 ms after stimulus onset), are universally known and used.
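Since ERPs are buried in the ongoing EEG, they are conventionally extracted by averaging many stimulus-locked trials so that background activity cancels out. The following sketch uses simulated data with illustrative values (not taken from any cited study) to show how a P300-like deflection emerges from the average:

```python
import numpy as np

def average_erp(epochs):
    """Average stimulus-locked epochs (trials x samples): background EEG
    cancels out and the event-related potential emerges."""
    return np.asarray(epochs).mean(axis=0)

# Simulated demonstration; all values are illustrative.
rng = np.random.default_rng(0)
fs = 250                                   # sampling rate in Hz
t = np.arange(0, 0.6, 1 / fs)              # 0-600 ms after stimulus onset
# A 5 uV Gaussian bump centered at 300 ms stands in for the P300.
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.02 ** 2))
epochs = p300 + rng.normal(0.0, 10.0, size=(200, t.size))  # 200 noisy trials
erp = average_erp(epochs)
peak_latency = t[np.argmax(erp)]           # recovered latency, near 0.3 s
```

A single trial here has a signal-to-noise ratio far below 1; averaging 200 trials reduces the noise by a factor of about 14, which is why the peak latency becomes recoverable.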

The EEG also allows frequency analysis, as its signals can be decomposed into bands. The spectral power (also called Power Spectral Density, PSD), i.e., the amount of energy in each band, can be used to extract important information from the raw EEG data.
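A minimal periodogram-based band-power computation might look as follows; the band boundaries are conventional approximations and vary slightly across authors:

```python
import numpy as np

# Conventional EEG frequency bands in Hz (boundaries vary across authors).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power(signal, fs, band):
    """Spectral power of `signal` (1-D array, sampled at `fs` Hz) inside
    `band` (lo, hi), computed from a plain periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    lo, hi = band
    return psd[(freqs >= lo) & (freqs < hi)].sum()
```

Real pipelines would use windowed averaging (e.g., Welch's method) for a less noisy estimate; this sketch only conveys the band-power idea.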

Since 1977 [45], researchers have combined ERPs and band analysis in an approach called Event-Related Synchronization (ERS) and Desynchronization (ERD), to measure instantaneous responses to stimuli in specific bands, as shown in **Figure 4**.
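The ERD/ERS index is usually expressed as the percentage change of band power in a post-stimulus activation window relative to a pre-stimulus reference window; a minimal sketch:

```python
def erd_ers_percent(reference_power, activation_power):
    """Classic ERD/ERS index: percentage band-power change of the
    activation (post-stimulus) window relative to the reference
    (pre-stimulus) window. Negative values indicate desynchronization
    (ERD), positive values synchronization (ERS)."""
    return 100.0 * (activation_power - reference_power) / reference_power
```

For example, a drop from a reference band power of 10 to an activation band power of 5 yields an ERD of -50%.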

The main drawbacks of this technique are the volume conduction effect, which makes it difficult to locate internal potential sources [47], and the limitation of recording only signals at the surface of the brain (unless electrodes are placed inside it), which also limits the measurement of internal sources. To partially address this limitation, some novel techniques reconstruct inner sources from their fingerprint on the scalp voltage through complex algorithms, such as Low Resolution Electromagnetic Tomography (LORETA), first proposed in [48], and other "inverse problem" methods summarized in [49, 50].

In addition, EEG recordings suffer from artifacts (from electrical power networks, lighting, muscle movements, etc.) that must be removed or filtered out before the recordings can be interpreted. Although some automatic approaches have been proposed, this remains a craft that many researchers still perform manually. Another source of noise is the impedance of the electrodes (as they are transducers between the scalp and the wires), which must be kept low enough (typically below 5–10 kΩ) to accurately measure extremely low scalp voltages. This requires the application of a conductive gel, cleaning the skin with alcohol, washing the hair before the experiment, etc.
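As one illustration of artifact handling, power-line interference can be crudely suppressed by zeroing the offending spectral bins. Real pipelines use proper notch filters or decomposition methods such as ICA; this sketch, which assumes 50 Hz mains, only conveys the idea:

```python
import numpy as np

def remove_line_noise(signal, fs, line_freq=50.0, width=1.0):
    """Crude spectral notch: zero the FFT bins within `width` Hz of the
    power-line frequency and reconstruct the signal. Illustrative only;
    real EEG pipelines use proper notch filters."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    spectrum[np.abs(freqs - line_freq) <= width] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```

Applied to a signal containing a 10 Hz EEG-like component plus 50 Hz mains hum, this removes the hum while leaving the 10 Hz component essentially untouched.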


**Figure 4.** *ERD/ERS detection, from [46], adapted with permission.*

Finally, the EEG system, as a voltage recorder, needs a reference. The choice of reference does not change the relative voltage distribution on the scalp, but it can lead to different absolute measures.

Another important standardization of EEG measurements has been the definition of electrode positions, which must be kept constant between studies to allow replication and falsifiability. Depending on the number of electrodes, different standard configurations are defined; the most widely used, here with 32 electrodes, is the international 10–20 configuration of **Figure 5**.

**Figure 5.** *The 10–20 international EEG configuration.*

The names and positions of the electrodes are defined in this standard and applied elsewhere. For a larger number of electrodes some other standards can be found (see, for example, [51]).

The correlation of EEG signals with emotions is well established, as stated in [26]: "We found that the majority of the 130 articles used event-related potentials, whereas 48 articles used Frontal EEG asymmetry in their analysis, six articles used event-related desynchronization/synchronization, and four articles used steady-state visually evoked potentials".

## **2.2 fMRI**

fMRI began in the 1980s and soon produced extremely novel results. It measures differential activations of brain regions [52] according to the distribution of deoxyhemoglobin.

fMRI requires a massive magnet (typically a few tesla), which makes the setup extremely space-demanding and expensive. Besides, it cannot be used with metallic components (whether external or implanted in the subject's body), so the presentation of the stimuli must be redirected through reflective screens, remote speakers, etc.

The temporal resolution of fMRI is poor, in the range of seconds, which makes it unsuitable for recording rapid changes or reactions to stimuli. However, its main strength is its spatial resolution and the true three-dimensionality of the recording, which generates a map of voxels (volumetric units of information) of a very few mm³ if high temporal resolution is not needed (in fact, there is a tradeoff between these two parameters; for example, for a voxel size of 3x3x5 mm³, the sampling rate falls to about one volume every 2 s [53]). Unlike EEG, fMRI faces the difficulty of mapping different brains (of different participants) onto a canonical brain in which activations and regions can be represented. This forces a spatial transformation to standard geometries that implies losses in spatial resolution [53].

The "functional" aspect of MRI is provided, among other mechanisms, by Blood Oxygen Level Dependent (BOLD) imaging, which measures differences in oxygenated blood flowing through the brain (since oxyhemoglobin and deoxyhemoglobin have different magnetic susceptibility), correlated with neural activation.

BOLD techniques inherit the temporal limitations of the physiological processes on which they are based (see [53] for more details). In most studies, fMRI data are statistically processed to generate a meaningful representation of changes, in so-called Statistical Parametric Maps (SPM), yielding images such as that shown in **Figure 6**.
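At its core, an SPM is a map of a test statistic computed independently at every voxel. The following heavily reduced numpy sketch computes a two-sample t map between two conditions; it is illustrative only and omits the hemodynamic modeling, smoothing and multiple-comparison correction that real SPM analyses perform:

```python
import numpy as np

def t_map(condition_a, condition_b):
    """Voxel-wise two-sample t statistic (equal-variance form) between two
    conditions, each shaped (trials, voxels). A reduced sketch of the map
    that SPM-style analyses threshold to find activated regions."""
    a = np.asarray(condition_a, dtype=float)
    b = np.asarray(condition_b, dtype=float)
    na, nb = len(a), len(b)
    pooled_var = (a.var(axis=0, ddof=1) * (na - 1)
                  + b.var(axis=0, ddof=1) * (nb - 1)) / (na + nb - 2)
    se = np.sqrt(pooled_var * (1 / na + 1 / nb))
    return (a.mean(axis=0) - b.mean(axis=0)) / se
```

Voxels whose statistic exceeds a significance threshold are then colored in the map, as in **Figure 6**.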

## **2.3 Simultaneous measurement and comparison**

Since both EEG and fMRI are based on related physiological processes, it is easy to find correlations between them.

These two non-invasive techniques for exploring the interior of the living brain are not mutually exclusive, and both have advantages and disadvantages. Therefore, both are used in neuroscience research today.

**Table 2** summarizes the main characteristics of each one.

In 1996, Gerloff et al. [54] combined fMRI and EEG for the first time to evaluate the co-registration of both techniques applied to the primary motor cortex and the sensory cortex.

Over the past decade, several studies using both techniques have been proposed to, among other goals, find new large-scale brain networks [55], examine specific networks [56–58] or even provide neurofeedback to assist in the regulation of some circuits [59].

#### **Figure 6.**

*SPM in which the color of each pixel represents its p-value and, thus, the statistical significance of its activation or deactivation when two or more tasks are compared. From G. Konstantina, CC BY-SA 4.0, via Wikimedia Commons.*


#### **Table 2.**

*Features comparison between EEG and fMRI.*

Babayan et al. [60] have recently published a large database with combined EEG and fMRI data from 227 healthy participants.

Unfortunately, the joint use of both techniques has its drawbacks: the signal-to-noise ratio (SNR) can be degraded [61] and interference artifacts can be generated, as shown in [62].

For a more in-depth treatment of brain imaging data, please refer to the handbook [63].

## **3. Measuring emotions**

The neuronal correlates of emotions present features and effects in various dimensions that interact in the living brain.

To help understand the results collected in the scientific literature on emotions, we will divide the findings into different categories, although they are mixed and sometimes inseparable.

## **3.1 Timing**

Studies dealing with the temporal signals created by emotional processing are, most of the time, based on EEG recordings. The reason is, as explained above, the temporal resolution of this technique.

Typically, the measurement of an EEG signal follows the scheme of **Figure 7**.

When recording the brain's reaction to stimuli, it is important to define a control condition or baseline against which to compare activations or deactivations. Koelstra [65] proposes the 5 seconds prior to stimulus onset as such a baseline.
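In code, such a baseline correction is simply a mean subtraction over the pre-stimulus window. A minimal sketch follows; the 5 s window follows Koelstra's proposal, while the function name and everything else are illustrative:

```python
import numpy as np

def baseline_correct(recording, fs, onset_s, baseline_s=5.0):
    """Subtract the mean of the pre-stimulus baseline window from the
    post-stimulus epoch. `recording` is a 1-D channel trace sampled at
    `fs` Hz; `onset_s` is the stimulus-onset time in seconds."""
    onset = int(onset_s * fs)
    start = max(0, onset - int(baseline_s * fs))
    baseline = recording[start:onset].mean()
    return recording[onset:] - baseline
```

The same correction is applied per channel and per trial before averaging or band-power analysis.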

**Figure 7.** *Typical EEG signal during a stimulus-based experiment, from [64], adapted with permission.*

What happens in the reference period is far from uninteresting when emotions are studied from a neuroscientific approach. It has been shown that if, in this period, the asymmetry index (the difference in global activation between the hemispheres, calculated as left over right) is high, the subject will present a bias towards positive stimuli, and vice versa as far as fear is concerned [66].
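One common way to quantify such a left-over-right index is a log-ratio of band powers. The chapter describes the index simply as "left over right"; the log form shown here is a frequent convention in the asymmetry literature, not necessarily the exact formulation of [66]:

```python
import numpy as np

def asymmetry_index(power_left, power_right):
    """Hemispheric asymmetry as the log-ratio of left over right band
    power. Positive values indicate greater left-hemisphere power,
    negative values greater right-hemisphere power."""
    return np.log(power_left) - np.log(power_right)
```

The log transform makes the index symmetric around zero, so equal powers give exactly 0 and a doubling on either side shifts the index by the same magnitude.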

After stimulus onset, Wei defines a range of [0.5–4] s as the temporal window in which emotional signals appear [13]. This period has been divided by some authors into three sections: Early [400–1100] ms, Middle [1000–3000] ms, Late [3000–5000] ms [32].

For example, it is widely established that a high positive ERP in the range of 200–300 ms, widely known as the P300, is elicited by emotional stimuli (such as emotional words) compared to neutral ones [67–71]. This ERP appears in the occipito-temporal regions with an arousal-related amplitude (independent of valence) compared to neutral stimuli [72].

As already mentioned, the emotional response is mediated or modulated by different neural systems. The P300 has proven to be a modulator of emotional processing regardless of valence when emotional versus neutral pictures are presented [5, 67, 73–75]. These effects were seen in both pleasant and unpleasant pictures [5, 75, 76].

One of the most stable and reliable neural signatures of emotional processing is the so-called Late Positive Potential (LPP), which appears about 1 second after stimulus presentation and can be traced for up to 6 seconds in the centro-parietal region [77].

It has been shown that the LPP appears with both emotional pictures and words, with an amplitude that depends on arousal intensity [67, 70, 71], being higher for emotional (both positive and negative) images compared to neutral ones [78], and that it does not habituate [67, 79, 80], although it may decrease somewhat with repetition [15]. Another interesting feature of this signal is its ability to appear with very short exposures to visual stimuli (down to 25 ms) [3].

The LPP is independent of characteristics of the stimuli such as realism/symbolism, complexity/simplicity, etc. [27, 67] and therefore very reliable: "The late positive potential evoked by picture stimuli is a reliable, replicable index of their motivational relevance" [3], correlated with self-reported arousal [3].

For all these reasons, the LPP has been labeled as the "motivational significance" of a stimulus [81].

The LPP has been localized in the central-parietal region, but also in the secondary visual processing sites in the lateral occipital cortex [82] with visual stimuli.

Summarizing the findings of time-based analysis of EEG signals correlated with emotional processing, we can say that the P300 and LPP track emotional processes [78]. A golden rule states that valence is processed before arousal [83], since the early ERP components have been shown to correlate with valence [28, 84]. In contrast, the long-term ERP components correlate with arousal [85–87].

Hajcak et al. [67] illustrate the temporal and spatial evolution of the different signals.

## **3.2 Mapping**

fMRI has revealed processing cores in the inner regions of the brain, such as the limbic system. In terms of arousal, the area of greatest response in the brain was found to be the amygdalae, two small clusters of nuclei belonging to the limbic system, located in the temporal lobes on the internal part of the brain. Regarding the role of the amygdalae in emotional processing, survival instincts, memory, etc., Lang et al. found that this area responds to the intensity of emotional stimuli and has a central role in enhancing sympathetic reactivity to such stimuli [15]. But the amygdalae do not react independently of the valence of the stimuli: in some experiments, preferred stimuli selectively activated the right amygdala relative to aversive ones [88, 89].

Valence has also been correlated with specific limbic neural circuits closely connected to the amygdalae: the mesolimbic reward system, in which the nucleus accumbens (NAc) is particularly relevant in the processing of reward- and pleasure-evoking stimuli (the reward, motivation and addiction circuits) [90, 91]. Another study extends this list of central processing centers to the Ventral Tegmental Area (VTA) and the hypothalamus, working together as a tripartite network that manages responses to the emotional aspects of music [92]. Another reward-network component is the ventral striatum, which, along with the cingulate cortex, has also shown correlations with the arousal of positive emotions when listening to music [93, 94].

The relation between the mesolimbic networks and some frontal regions (such as the Orbito-Frontal Cortex (OFC) and the Inferior Frontal Cortex (IFC)), more in charge of cognitive processing, has led some researchers to establish a close relationship between "affective" and "cognitive" processing involved in music listening [92]. Another hypothesis is that the interactions between the OFC and the NAc may be related to the control of emotions [95].

Overall, although it belongs to the cognitive cortex, the role of the OFC in the emotional processing is beyond doubt, and is supported by many studies dealing with music [32, 92, 96, 97], images [98, 99] or decisions [95]. Close to the OFC, the IFC has also been considered relevant, producing a bilateral activation when listening to music, according to [92].

Other important cortical cores for emotional processing are the parietal and temporal areas. The centro-parietal area has shown activation proportional to the arousal of emotional pictures in the first moments (in a range of 300 to 700 ms) [3, 92]. Positive centro-parietal signals [300–6000 ms] have shown valence independence [3]. Furthermore, the link between the frontal and right parieto-temporal areas and the arousal of a stimulus has also been established [100].

It is worth mentioning the anterior insula, lying deep within the lateral sulcus, which has been thoroughly studied and described as a relay between the limbic system (specifically the human mirror neuron system) and the motor system (in the cortex) [31, 101], and which may be the physiological support for subjective states such as pain, hunger, heart-rate perception or emotional awareness [102–104].

Early lateralization has been found to correlate with the valence of sounds [32], an effect that does not appear with neutral sounds. One of the first findings in this field is due to Schwartz [7], who found lateralization in emotion-related brain activity.

Back to the surface of the cortex, there are areas specially engaged in emotional processing, measured with both EEG and fMRI.

In the first case, there is a discussion about which electrodes are most representative of the ongoing emotional processes. For example, Wei et al. propose moving from F1, F2, T3 and T4 to F1, F2, F7 and F8 respectively (shifting the recording area from bilateral fronto-temporal to pre-frontal medial areas), obtaining much better predictions [13]. This change is also proposed independently (and partially) by Lin et al., who state that fronto-central electrodes are especially relevant when measuring theta asymmetry (F7-F8 and FC3-FC4) as a correlate of arousal [105].

Focusing on synchronization between different areas, a phase synchronization has been found between frontal and right temporo-parietal areas depending on valence and energetic arousal [106]. In another study [107], a beta-band synchronization was found between pre-frontal and posterior areas when observing high-arousal images. Finally, unpleasant images caused a phase synchronization in the gamma band according to [108]. Please refer to [109] for further details about synchronization.

We have shown the interactions of the limbic system with the cognitive areas of the brain in relation to images, sounds or decisions. But some interactions with other, seemingly unrelated areas have also been found, mainly the motor areas, when empathy is involved. It seems that in the processing of emotions, many different, specialized areas need to interact to account for such a subjective experience. This cross-modality has been studied in depth. For example, [32] shows that emotional sounds modulate the primary visual cortex response (P1). In the same region, relationships have been found between emotional processing cores and visuo-spatial and visuo-motor regions [110], or even premotor regions including the intra-parietal sulcus and the ventral premotor cortex [111]. The ventral premotor and posterior parietal cortex were elicited during the observation of Classical and Renaissance sculptures, suggesting, as Di Dio et al. state, "motor resonance congruent with the implied movements portrayed in the sculptures" [88].

For a final summary, please refer to **Table 1** in [87], which provides a detailed review of the EEG spatial correlates of emotions.

## **4. Final considerations and conclusions**

It has been shown how the way we react to emotional situations depends on very different and scattered areas of the brain. Both initial reactions and later dependencies rely on different systems [112]. Reward calculation and empathy, involved in fear or appetitive reactions, have been identified as the most primitive, but they interact with more evolved areas of the cortex that deal with visual and auditory processing, decision making and even motor activation.

Many aspects of the emotional brain remain open. For instance, global or synchronized processing networks beyond the cortical surface have yet to be described, as the limitations of fMRI and non-invasive EEG do not allow this unknown field to be addressed.

Furthermore, it is not clear whether and how gender affects emotional processing. Although responses to musical stimuli, recorded with EEG, have shown no significant gender differences in brain frontal regions [100], gender differences have been observed during verbal learning tasks [113], in emotional networks in adolescents [114] and with esthetic stimuli producing bilateral parietal activation in women, but lateralized in men [115].


Other limitations are due to the stimuli used. Real life involves interaction with various external and internal sources of information that modulate our feelings, and laboratory conditions barely address such complex situations.

This branch of neuroscience is not new (compared to neuroscience itself), but still presents many open doors to be explored.

Emotions appear to mobilize extensive and deep resources in the brain, and many other functions (such as memory) depend on them, showing their preeminent role in survival and behavior in human (and many other animals') lives.

## **Author details**

Pablo Revuelta Sanz¹\*, María José Lucía Mulas², Tomás Ortiz³, José M. Sánchez Pena² and Belén Ruiz-Mezcua²


\*Address all correspondence to: revueltapablo@uniovi.es

© 2021 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/ by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

## **References**

[1] A. Damasio, "Descartes' Error and the Future of Human Life," *Scientific American,* vol. 271, no. 4, p. 144, Oct 1994.

[2] N. Frijda, "The Current Status of Emotion Theory," *Bulletin of the British Psychological Society,* vol. 39, p. A75, May 1986.

[3] P. Lang and M. Bradley, "Emotion and the motivational brain," *Biological Psychology,* vol. 84, no. 3, pp. 437-450, Jul 2010.

[4] C. S. Roy and C. S. Sherrington, "On the Regulation of the Blood-supply of the Brain," *J. Physiol.,* vol. 11, no. 1-2, pp. 85-158, Jan 1890.

[5] K. Lifshitz, "The Averaged Evoked Cortical Response to Complex Visual Stimuli," *Psychophysiology,* vol. 3, no. 1, pp. 55-68, 1966.

[6] Z. Drohocki, "Effect of Emotion on Amplitude Spectrogram of EEG," *Electroencephalography and Clinical Neurophysiology,* vol. 36, no. 4, p. 426, 1974.

[7] G. Schwartz, R. Davidson, and F. Maer, "Right Hemisphere Lateralization for Emotion in the Human Brain: Interactions with Cognition," *Science,* vol. 190, no. 4211, pp. 286-288, 1975.

[8] S. Petersen, P. Fox, M. Posner, M. Mintun, and M. Raichle, "Positron Emission Tomographic Studies of the Cortical Anatomy of Single-Word Processing," *Nature,* vol. 331, no. 6157, pp. 585-589, Feb 1988.

[9] M. Raichle, "A brief history of human brain mapping," *Trends in Neurosciences,* vol. 32, no. 2, pp. 118-126, FEB 2009, 2009.

[10] G. Bradac, W. Schorner, A. Bender, and R. Felix, "MRI (NMR) in the Diagnosis of Brain-Stem Tumors," *Neuroradiology,* vol. 27, no. 3, pp. 208-213, 1985.

[11] K. Brodmann, *Vergleichende Lokalisationslehre der Großhirnrinde : in ihren Prinzipien dargestellt auf Grund des Zellenbaues*, Leipzig: Barth, 1909.

[12] A. Evans, M. Kamber, D. Collins, D. Macdonald, S. Shorvon, F. Andermann, G. Bydder, H. Stefan, And D. Fish, "An MRI-Based Probabilistic Atlas Of Neuroanatomy," *Magnetic Resonance Scanning and Epilepsy,* vol. 264, pp. 263- 274, 1994, 1994.

[13] Y. Wei, Y. Wu, and J. Tudor, "A real-time wearable emotion detection headband based on EEG measurement," *Sensors and Actuators a-Physical,* vol. 263, pp. 614-621, AUG 15 2017, 2017.

[14] L. Barrett, and J. Russell, "Independence and bipolarity in the structure of current affect," *Journal of Personality and Social Psychology,* vol. 74, no. 4, pp. 967-984, Apr 1998, 1998.

[15] P. Lang, M. Bradley, B. Cuthbert, R. Simons, and M. Balaban, "Motivated attention: Affect, activation, and action," *Attention and Orienting: Sensory and Motivational Processes*, pp. 97-135, 1997, 1997.

[16] T. Ito, J. Cacioppo, and P. Lang, "Eliciting affect using the international affective picture system: Trajectories through evaluative space," *Personality and Social Psychology Bulletin,* vol. 24, no. 8, pp. 855-879, AUG 1998, 1998.

[17] N. E. Miller, "Liberalization of the basic S-R concepts: Extensions to conflict behavior, motivation and social learning," *Psychology: a study of a science*, S. Kock, ed., pp. 198-292, New York: McGraw-Hill, 1959.

[18] J. T. Cacioppo, W. L. Gardner, and G. G. Bernston, "Attitudes and evaluative space: Beyond bipolar conceptualizations and measures," *Personality and Social Psychology Review,* vol. 1, pp. 3-25, 1997.

*Mapping and Timing the (Healthy) Emotional Brain: A Review DOI: http://dx.doi.org/10.5772/intechopen.95574*

[19] U. Schimmack, and A. Grob, "Dimensional models of core affect: a quantitative comparison by means of structural equation modeling," *European Journal of Personality,* vol. 14, no. 4, pp. 325-345, 2000.

[20] M. Bradley, And P. Lang, "Measuring Emotion - The Self-Assessment Mannequin And The Semantic Differential," *Journal of Behavior Therapy and Experimental Psychiatry,* vol. 25, no. 1, pp. 49-59, MAR 1994, 1994.

[21] P. Ekman, W. Friesen, M. O'Sullivan, A. Chan, I. Diacoyanni-Tarlatzis, K. Heider, R. Krause, W. LeCompte, T. Pitcairn, P. Ricci-Bitti, K. Scherer, M. Tomita, and A. Tzavaras, "Universals and cultural differences in the judgments of facial expressions of emotion," *Journal of Personality and Social Psychology,* vol. 53, no. 4, pp. 712-717, Oct 1987.

[22] P. Ekman, and W. V. Friesen, "Constants Across Cultures in the Face and Emotion," *Journal of Personality and Social Psychology,* vol. 17, no. 2, pp. 124-129., 1971.

[23] J. Russell, "Culture and the Categorization of Emotions," *Psychological Bulletin,* vol. 110, no. 3, pp. 426-450, Nov 1991.

[24] P. Ekman, "Are There Basic Emotions," *Psychological Review,* vol. 99, no. 3, pp. 550-553, Jul 1992, 1992.

[25] R. Levenson, "Basic Emotion Questions," *Emotion Review,* vol. 3, no. 4, pp. 379-386, Oct 2011, 2011.

[26] A. Al-Nafjan, M. Hosny, Y. Al-Ohali, and A. Al-Wabil, "Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review," *Applied Sciences-Basel,* vol. 7, no. 12, Dec 2017.

[27] M. Bradley, S. Hamby, A. Low, and P. Lang, "Brain potentials in perception: Picture complexity and emotional arousal," *Psychophysiology,* vol. 44, no. 3, pp. 364-373, May 2007, 2007.

[28] J. Olofsson, S. Nordin, H. Sequeira, and J. Polich, "Affective picture processing: An integrative review of ERP findings," *Biological Psychology,* vol. 77, no. 3, pp. 247-265, Mar 2008, 2008.

[29] A. Gabrielsson, "Emotions in strong experiences with music.," *Music and emotion: Theory and research*, P. Juslin, Sloboda, J.A., ed., pp. 431-449, Oxford, UK: Oxford University Press, 2001.

[30] L. Balkwill, and W. Thompson, "A cross-cultural investigation of the perception of emotion in music: Psychophysical and cultural cues," *Music Perception,* vol. 17, no. 1, pp. 43-64, Fal 1999, 1999.

[31] I. Molnar-Szakacs, and K. Overy, "Music and mirror neurons: from motion to 'e'motion," *Social Cognitive and Affective Neuroscience,* vol. 1, no. 3, pp. 235-241, Dec 2006, 2006.

[32] D. Brown, and J. Cavanagh, "The sound and the fury: Late positive potential is sensitive to sound affect," *Psychophysiology,* vol. 54, no. 12, pp. 1812-1825, Dec 2017, 2017.

[33] L. Thomas, and K. LaBar, "Emotional arousal enhances word repetition priming," *Cognition & Emotion,* vol. 19, no. 7, pp. 1027-1047, Nov 2005, 2005.

[34] M. Zhang, Y. Ge, C. Kang, T. Guo, and D. Peng, "ERP evidence for the contribution of meaning complexity underlying emotional word processing," *Journal of Neurolinguistics,* vol. 45, pp. 110-118, Feb 2018, 2018.

[35] J. Kline, G. Blackhart, K. Woodward, S. Williams, and G. Schwartz, "Anterior electroencephalographic asymmetry changes in elderly women in response to a pleasant and an unpleasant odor," *Biological Psychology,* vol. 52, no. 3, pp. 241-250, Apr 2000, 2000.

[36] E. Kroupi, J. Vesin, and T. Ebrahimi, "Subject-Independent Odor Pleasantness Classification Using Brain and Peripheral Signals," *IEEE Transactions on Affective Computing,* vol. 7, no. 4, pp. 422-434, Oct-Dec 2016, 2016.

[37] A. Novosel, N. Lackner, H. Unterrainer, M. Dunitz-Scheer, P. Scheer, S. Wallner-Liebmann, and C. Neuper, "Motivational processing of food cues in anorexia nervosa: a pilot study," *Eating and Weight Disorders-Studies on Anorexia Bulimia and Obesity,* vol. 19, no. 2, pp. 169-175, Jun 2014, 2014.

[38] University of Surrey, "Surrey Audio-Visual Expressed Emotion (SAVEE) database," http://kahlan.eps.surrey.ac.uk/savee/.

[39] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, *International affective picture system (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-8*, University of Florida, Gainesville, FL, 2008.

[40] M. M. Bradley, and P. J. Lang, *The International Affective Digitized Sounds (2nd Edition; IADS-2): Affective ratings of sounds and instruction manual. Technical report B-3.*, University of Florida, Gainesville, Fl., 2007.

[41] M. M. Bradley, and P. J. Lang, *Affective Norms for English Words (ANEW): Instruction manual and affective ratings. Technical Report C-3.*, UF Center for the Study of Emotion and Attention. Gainesville, FL., 2017.

[42] M. M. Bradley, and P. J. Lang, *Affective Norms for English Text (ANET): Affective ratings of text and instruction manual. (Tech. Rep. No. D-1).* University of Florida, Gainesville, FL., 2007.

[43] S. Livingstone, and F. Russo, "The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English," *Plos One,* vol. 13, no. 5, May 16 2018, 2018.

[44] S. Paquette, I. Peretz, and B. Pascal, "Can We Dissociate The Musical Emotional Pathway From The Vocal One?," *Journal of Cognitive Neuroscience*, p. 214, 2013.

[45] G. Pfurtscheller, And A. Aranibar, "Event-Related Cortical Desynchronization Detected By Power Measurements Of Scalp EEG," *Electroencephalography and Clinical Neurophysiology,* vol. 42, no. 6, pp. 817- 826, 1977, 1977.

[46] G. Pfurtscheller, and F. da Silva, "Event-related EEG/MEG synchronization and desynchronization: basic principles," *Clinical Neurophysiology,* vol. 110, no. 11, pp. 1842-1857, Nov 1999, 1999.

[47] S. van den Broek, F. Reinders, M. Donderwinkel, and M. Peters, "Volume conduction effects in EEG and MEG," *Electroencephalography and Clinical Neurophysiology,* vol. 106, no. 6, pp. 522- 534, JUN 1998, 1998.

[48] R. Pascualmarqui, C. Michel, And D. Lehmann, "Low-Resolution Electromagnetic Tomography - A New Method For Localizing Electrical-Activity In The Brain," *International Journal of Psychophysiology,* vol. 18, no. 1, pp. 49-65, Oct 1994, 1994.

[49] R. Khemakhem, W. Zouch, A. Ben Hamida, A. Taleb-Ahmed, and I. Feki, "EEG Source Localization Using the Inverse Problem Methods," *International Journal of Computer Science and Network Security,* vol. 9, no. 4, pp. 408-415, Apr 2009.

[50] C. Michel, M. Murray, G. Lantz, S. Gonzalez, L. Spinelli, and R. de Peralta, "EEG source imaging," *Clinical Neurophysiology,* vol. 115, no. 10, pp. 2195-2222, Oct 2004, 2004.

[51] A. Neuro. "Electrodes Layout," 06/06/2019; https://www.antneuro.com/products/waveguard/ electrode-layouts.

[52] A. Cohen, D. Fair, N. Dosenbach, F. Miezin, D. Dierker, D. Van Essen, B. Schlaggar, and S. Petersen, "Defining functional areas in individual human brains using resting functional connectivity MRI," *Neuroimage,* vol. 41, no. 1, pp. 45-57, May 15 2008, 2008.

[53] M. Lindquist, "The Statistical Analysis of fMRI Data," *Statistical Science,* vol. 23, no. 4, pp. 439-464, Nov 2008, 2008.

[54] C. Gerloff, W. Grodd, E. Altenmuller, R. Kolb, T. Naegele, U. Klose, K. Voigt, and J. Dichgans, "Coregistration of EEG and fMRI in a simple motor task," *Human Brain Mapping,* vol. 4, no. 3, pp. 199-209, 1996, 1996.

[55] R. Labounek, D. Bridwell, R. Marecek, M. Lamos, M. Mikl, P. Bednarik, J. Bastinec, T. Slavicek, P. Hlustik, M. Brazdil, and J. Jan, "EEG spatiospectral patterns and their link to fMRI BOLD signal via variable hemodynamic response functions," *Journal of Neuroscience Methods,* vol. 318, pp. 34-46, Apr 15 2019, 2019.

[56] V. Salmela, E. Salo, J. Salmi, and K. Alho, "Spatiotemporal Dynamics of Attention Networks Revealed by Representational Similarity Analysis of EEG and fMRI," *Cerebral Cortex,* vol. 28, no. 2, pp. 549-560, FEB 2018, 2018.

[57] R. Ahmad, A. Malik, N. Kamel, F. Reza, H. Amin, and M. Hussain, "Visual brain activity patterns classification with simultaneous EEG-fMRI: A multimodal approach," *Technology and Health Care,* vol. 25, no. 3, pp. 471-485, 2017, 2017.

[58] Q. Guo, T. Zhou, W. Li, L. Dong, S. Wang, and L. Zou, "Single-trial EEGinformed fMRI analysis of emotional decision problems in hot executive function," *Brain and Behavior,* vol. 7, no. 7, Jul 2017, 2017.

[59] J. Bodurka, "Amygdala Emotional Regulation Training With Real-Time fMRI Neurofeedback and Concurrent EEG Recordings," *Biological Psychiatry,* vol. 83, no. 9, pp. S58-S58, May 1 2018, 2018.

[60] A. Babayan, M. Erbey, D. Kumral, J. Reinelt, A. Reiter, J. Roebbig, H. Schaare, M. Uhlig, A. Anwander, P. Bazin, A. Horstmann, L. Lampe, V. Nikulin, H. Okon-Singer, S. Preusser, A. Pampel, C. Rohr, J. Sacher, A. Thoene-Otto, S. Trapp, T. Nierhaus, D. Altmann, K. Arelin, M. Bloechl, E. Bongartz, P. Breig, E. Cesnaite, S. Chen, R. Cozatl, S. Czerwonatis, G. Dambrauskaite, M. Dreyer, J. Enders, M. Engelhardt, M. Fischer, N. Forschack, J. Golchert, L. Golz, C. Guran, S. Hedrich, N. Hentschel, D. Hoffmann, J. Huntenburg, R. Jost, A. Kosatschek, S. Kunzendorf, H. Lammers, M. Lauckner, K. Mahjoory, A. Kanaan, N. Mendes, R. Menger, E. Morino, K. Naethe, J. Neubauer, H. Noyan, S. Oligschlaeger, P. Panczyszyn-Trzewik, D. Poehlchen, N. Putzke, S. Roski, M. Schaller, A. Schieferbein, B. Schlaak, R. Schmidt, K. Gorgolewski, H. Schmidt, A. Schrimpf, S. Stasch, M. Voss, A. Wiedemann, D. Margulies, M. Gaebler, and A. Villringer, "A mind-brain-body dataset of MRI, EEG, cognition, emotion, and peripheral physiology in young and old adults," *Scientific Data,* vol. 6, Feb 12 2019, 2019.

[61] M. Schrooten, R. Vandenberghe, R. Peeters, and P. Dupont, "Quantitative Analyses Help in Choosing Between Simultaneous vs. Separate EEG and fMRI," *Frontiers in Neuroscience,* vol. 12, Jan 2019.

[62] R. Abreu, A. Leal, and P. Figueiredo, "EEG-Informed fMRI: A Review of Data Analysis Methods," *Frontiers in Human Neuroscience,* vol. 12, Feb 6 2018, 2018.

[63] K. Friston, J. Ashburner, S. Kiebel, T. Nichols, and W. Penny, "Statistical Parametric Mapping: The Analysis of Functional Brain Images," *Statistical Parametric Mapping: the Analysis of Functional Brain Images*, pp. 1-680, 2007, 2007.

[64] iMotions, *EEG (Electroencephalography): The Complete Pocket Guide*, 2017.

[65] S. Koelstra, C. Muhl, M. Soleymani, J. Lee, A. Yazdani, T. Ebrahimi, T. Pun, A. Nijholt, and I. Patras, "DEAP: A Database for Emotion Analysis Using Physiological Signals," *IEEE Transactions on Affective Computing,* vol. 3, no. 1, pp. 18-31, Jan-Mar 2012, 2012.

[66] J. Coan, and J. Allen, "Frontal EEG asymmetry as a moderator and mediator of emotion," *Biological Psychology,* vol. 67, no. 1-2, pp. 7-49, OCT 2004, 2004.

[67] G. Hajcak, A. MacNamara, and D. Olvet, "Event-Related Potentials, Emotion, and Emotion Regulation: An Integrative Review," *Developmental Neuropsychology,* vol. 35, no. 2, pp. 129- 155, 2010, 2010.

[68] L. Carretie, J. Hinojosa, M. Martin-Loeches, F. Mercado, and M. Tapia, "Automatic attention to emotional stimuli: Neural correlates," *Human Brain Mapping,* vol. 22, no. 4, pp. 290-299, AUG 2004, 2004.

[69] H. Schupp, M. Junghofer, A. Weike, and A. Hamm, "Emotional facilitation of sensory processing in the visual cortex," *Psychological Science,* vol. 14, no. 1, pp. 7-13, Jan 2003.

[70] H. Schupp, B. Cuthbert, M. Bradley, C. Hillman, A. Hamm, and P. Lang, "Brain processes in emotional perception: Motivated attention," *Cognition & Emotion,* vol. 18, no. 5, pp. 593-611, AUG 2004, 2004.

[71] H. Schupp, A. Ohman, M. Junghofer, A. Weike, J. Stockburger, and A. Hamm, "The facilitated processing of threatening faces: An ERP analysis," *Emotion,* vol. 4, no. 2, pp. 189-200, Jun 2004, 2004.

[72] I. Franken, L. Gootjes, and J. van Strien, "Automatic processing of emotional words during an emotional Stroop task," *Neuroreport,* vol. 20, no. 8, pp. 776-781, May 27 2009, 2009.

[73] R. Johnson, "For Distinguished Early Career Contribution To Psychophysiology - Award Address, 1985 - A Triarchic Model Of P300 Amplitude," *Psychophysiology,* vol. 23, no. 4, pp. 367-384, Jul 1986, 1986.

[74] A. Mini, D. Palomba, A. Angrilli, and S. Bravi, "Emotional information processing and visual evoked brain potentials," *Perceptual and Motor Skills,* vol. 83, no. 1, pp. 143-152, Aug 1996, 1996.

[75] J. Radilova, "The Late Positive Component Of Visual Evoked-Response Sensitive To Emotional Factors," *Activitas Nervosa Superior*, pp. 334-337, 1982, 1982.

[76] D. Palomba, A. Angrilli, and A. Mini, "Visual evoked potentials, heart rate responses and memory to emotional pictorial stimuli," *International Journal of Psychophysiology,* vol. 27, no. 1, pp. 55-67, Jul 1997, 1997.

[77] B. Cuthbert, M. Bradley, and P. Lang, "Probing picture perception: Activation and emotion," *Psychophysiology,* vol. 33, no. 2, pp. 103-111, Mar 1996.

[78] T. Suo, L. Liu, C. Chen, and E. Zhang, "The Functional Role of Individual-Alpha Based Frontal Asymmetry in the Evaluation of Emotional Pictures: Evidence from Event-Related Potentials," *Frontiers in Psychiatry,* vol. 8, Sep 27 2017, 2017.

[79] M. Codispoti, V. Ferrari, and M. Bradley, "Repetition and event-related potentials: Distinguishing early and late processes in affective picture perception," *Journal of Cognitive Neuroscience,* vol. 19, no. 4, pp. 577-586, Apr 2007, 2007.

[80] J. Olofsson, and J. Polich, "Affective visual event-related potentials: Arousal, repetition, and time-on-task," *Biological Psychology,* vol. 75, no. 1, pp. 101-108, Apr 2007, 2007.

[81] M. Bradley, "Natural selective attention: Orienting and emotion," *Psychophysiology,* vol. 46, no. 1, pp. 1-11, Jan 2009, 2009.

[82] P. Lang, M. Bradley, J. Fitzsimmons, B. Cuthbert, J. Scott, B. Moulder, and V. Nangia, "Emotional arousal and activation of the visual cortex: An fMRI analysis," *Psychophysiology,* vol. 35, no. 2, pp. 199-210, Mar 1998, 1998.

[83] S. Moon, and J. Lee, "Implicit Analysis of Perceptual Multimedia Experience Based on Physiological Response: A Review," *IEEE Transactions on Multimedia,* vol. 19, no. 2, pp. 340- 353, Feb 2017, 2017.

[84] L. Gianotti, P. Faber, M. Schuler, R. Pascual-Marqui, K. Kochi, and D. Lehmann, "First valence, then arousal: The temporal dynamics of brain electric activity evoked by emotional stimuli," *Brain Topography,* vol. 20, no. 3, pp. 143-156, Mar 2008.

[85] E. Bernat, S. Bunce, and H. Shevrin, "Event-related brain potentials differentiate positive and negative mood adjectives during both supraliminal and subliminal visual processing," *International Journal of Psychophysiology,* vol. 42, no. 1, pp. 11-34, Aug 2001, 2001.

[86] R. Roschmann, And W. Wittling, "Topographic Brain Mapping Of Emotion-Related Hemisphere Asymmetries," *International Journal of Neuroscience,* vol. 63, no. 1-2, pp. 5-16, 1992, 1992.

[87] M. Kim, M. Kim, E. Oh, and S. Kim, "A Review on the Computational Methods for Emotional State Estimation from the Human EEG," *Computational and Mathematical Methods in Medicine*, 2013, 2013.

[88] C. Di Dio, E. Macaluso, and G. Rizzolatti, "The Golden Beauty: Brain Response to Classical and Renaissance Sculptures," *Plos One,* vol. 2, no. 11, Nov 21 2007, 2007.

[89] E. Phelps, and J. LeDoux, "Contributions of the amygdala to emotion processing: From animal models to human behavior," *Neuron,* vol. 48, no. 2, pp. 175-187, OCT 20 2005, 2005.

[90] B. Knutson, C. Adams, G. Fong, and D. Hommer, "Anticipation of increasing monetary reward selectively recruits nucleus accumbens," *Journal of Neuroscience,* vol. 21, no. 16, art. no. RC159, Aug 2001.

[91] H. Breiter, I. Aharon, D. Kahneman, A. Dale, and P. Shizgal, "Functional imaging of neural responses to expectancy and experience of monetary gains and losses," *Neuron,* vol. 30, no. 2, pp. 619-639, May 2001.

[92] V. Menon, and D. Levitin, "The rewards of music listening: Response and physiological connectivity of the mesolimbic system," *Neuroimage,* vol. 28, no. 1, pp. 175-184, Oct 15 2005, 2005.

[93] A. Blood, and R. Zatorre, "Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion," *Proceedings of the National Academy of Sciences of the United States of America,* vol. 98, no. 20, pp. 11818-11823, Sep 25 2001, 2001.

[94] G. Bush, P. Luu, and M. Posner, "Cognitive and emotional influences in anterior cingulate cortex," *Trends in Cognitive Sciences,* vol. 4, no. 6, pp. 215- 222, Jun 2000, 2000.

[95] A. Bechara, H. Damasio, and A. Damasio, "Emotion, decision making and the orbitofrontal cortex," *Cerebral Cortex,* vol. 10, no. 3, pp. 295-307, Mar 2000, 2000.

[96] S. Koelsch, T. Fritz, K. Schulze, D. Alsop, and G. Schlaug, "Adults and children processing music: An fMRI study," *Neuroimage,* vol. 25, no. 4, pp. 1068-1076, May 2005.

[97] S. Koelsch, T. Fritz, D. Von Cramon, K. Muller, and A. Friederici, "Investigating emotion with music: An fMRI study," *Human Brain Mapping,* vol. 27, no. 3, pp. 239-250, Mar 2006, 2006.

[98] H. Kawabata, and S. Zeki, "Neural correlates of beauty," *Journal of Neurophysiology,* vol. 91, no. 4, pp. 1699- 1705, Apr 2004, 2004.

[99] D. Cinzia, and G. Vittorio, "Neuroaesthetics: a review," *Current Opinion in Neurobiology,* vol. 19, no. 6, pp. 682-687, Dec 2009, 2009.

[100] L. Schmidt, and L. Trainor, "Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions," *Cognition & Emotion,* vol. 15, no. 4, pp. 487-500, Jul 2001, 2001.

[101] L. Carr, M. Iacoboni, M. Dubeau, J. Mazziotta, and G. Lenzi, "Neural mechanisms of empathy in humans: A relay from neural systems for imitation to limbic areas," *Proceedings of the National Academy of Sciences of the United States of America,* vol. 100, no. 9, pp. 5497-5502, Apr 29 2003, 2003.

[102] A. Craig, "How do you feel? Interoception: the sense of the physiological condition of the body," *Nature Reviews Neuroscience,* vol. 3, no. 8, pp. 655-666, Aug 2002, 2002.

[103] A. Craig, "Human feelings: why are some more aware than others?," *Trends in Cognitive Sciences,* vol. 8, no. 6, pp. 239-241, Jun 2004, 2004.

[104] H. Critchley, S. Wiens, P. Rotshtein, A. Ohman, and R. Dolan, "Neural systems supporting interoceptive awareness," *Nature Neuroscience,* vol. 7, no. 2, pp. 189-195, Feb 2004, 2004.

[105] Y. Lin, Y. Yang, and T. Jung, "Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening," *Frontiers in Neuroscience,* vol. 8, May 1 2014, 2014.

[106] M. Wyczesany, S. Grzybowski, R. Barry, J. Kaiser, A. Coenen, and A. Potoczek, "Covariation of EEG Synchronization and Emotional State as Modified by Anxiolytics," *Journal of Clinical Neurophysiology,* vol. 28, no. 3, pp. 289-296, Jun 2011, 2011.

[107] V. Miskovic, and L. Schmidt, "Cross-regional cortical synchronization during affective image viewing," *Brain Research,* vol. 1362, pp. 102-111, Nov 29 2010, 2010.

[108] N. Martini, D. Menicucci, L. Sebastiani, R. Bedini, A. Pingitore, N. Vanello, M. Milanesi, L. Landini, and A. Gemignani, "The dynamics of EEG gamma responses to unpleasant visual stimuli: From local activity to functional connectivity," *Neuroimage,* vol. 60, no. 2, pp. 922-932, Apr 2012.

[109] C. Han, J. Lee, J. Lim, Y. Kim, and C. Im, "Global Electroencephalography Synchronization as a New Indicator for Tracking Emotional Changes of a Group of Individuals during Video Watching," *Frontiers in Human Neuroscience,* vol. 11, Dec 1 2017, 2017.

[110] L. Fogassi, and G. Luppino, "Motor functions of the parietal lobe," *Current Opinion in Neurobiology,* vol. 15, no. 6, pp. 626-631, DEC 2005, 2005.

[111] T. Jacobsen, R. Schubotz, L. Hofel, and D. Von Cramon, "Brain correlates of aesthetic judgment of beauty," *Neuroimage,* vol. 29, no. 1, pp. 276-285, Jan 1 2006, 2006.

[112] D. Jackson, C. Mueller, I. Dolski, K. Dalton, J. Nitschke, H. Urry, M. Rosenkranz, C. Ryff, B. Singer, and R. Davidson, "Now you feel it, now you don't: Frontal brain electrical asymmetry and individual differences in emotion regulation," *Psychological Science,* vol. 14, no. 6, pp. 612-617, Nov 2003, 2003.

[113] S. Edwards, D. Everhart, H. Demaree, and D. Harrison, "Sexrelated electroencephalographic differences observed during positive and negative affective verbal learning," *Psychophysiology,* vol. 43, pp. S36-S36, 2006, 2006.

[114] J. Hardee, L. Cope, E. Munier, R. Welsh, R. Zucker, and M. Heitzeg, "Sex differences in the development of emotion circuitry in adolescents at risk for substance abuse: a longitudinal fMRI study," *Social Cognitive and Affective Neuroscience,* vol. 12, no. 6, pp. 965-975, Jun 2017, 2017.

[115] C. Cela-Conde, F. Ayala, E. Munar, F. Maestu, M. Nadal, M. Capo, D. del Rio, J. Lopez-Ibor, T. Ortiz, C. Mirasso, and G. Marty, "Sex-related similarities and differences in the neural correlates of beauty," *Proceedings of the National Academy of Sciences of the United States of America,* vol. 106, no. 10, pp. 3847-3852, Mar 2009.

## **Chapter 2**

## Modeling of the Flexible Needle Insertion into the Human Liver

*Veturia Chiroiu, Ligia Munteanu, Cristian Rugină and Nicoleta Nedelcu*

## **Abstract**

The insertion of the needle is difficult because the deformation and displacement of the organs are key elements in the surgical act, and modeling of the liver and the tumor is essential in developing the needle insertion model. The role of the needle is to deliver an active chemotherapeutic agent into the tumor. In this chapter we describe the deformation of the needle during its insertion into the human liver, in the context of surgery simulation for robotic-assisted intraoperative treatment of liver tumors based on integrated imaging-molecular diagnosis. The needle is of the barbed honeybee type and is modeled as a flexible thread within the framework of the Cosserat (micropolar) elasticity theory.

**Keywords:** bee needle, human liver, Cosserat elasticity theory

## **1. Introduction**

Flexible bee needles are useful tools for transporting drugs into liver tumors [1, 2]. The insertion trajectory of the needle must avoid the ribs, blood vessels, and other organs in order to protect the liver [3–6] (**Figure 1a**). The bee needle ensures reduced insertion forces and small tissue deformations because of its tip deflections. The most relevant current publications on surgical needle navigation into the liver can be found in [7–9]. The bee needle is shown in **Figure 1b**: the front angle is 157 deg., the back angle 110 deg., the height 0.5 mm, and the tip thickness 0.15 mm.

A number of studies have been carried out on the collision-free trajectory of the needle to the target. The surgical act requires experience in imaging the tumor location based on the liver structure and on the microstructural interaction between the needle and the liver. Several studies have shown that needle flexibility is essential for achieving good precision in handling.

The strain and stress fields and the topological changes of the liver must not be neglected during the needle navigation towards the tumor [10–13]. Details of the forces during needle insertion into the liver are found in [14], real-time collision detection for virtual surgery in [15], and minimal hierarchical collision detection in [16]. Optimization is required to modify the needle trajectory in order to protect the liver [17, 18], to manage the tumor risk [19], and to change the robot architecture [20–22]. The inverse sonification problem for capturing hardly detectable details in a medical image is treated in [23], and control in [24–27]. Microscopic investigation of the human liver offers details of its microanatomy, with emphasis on the granular and fibrillar components and the irregular solid–fluid interfaces [28–30].

#### **Figure 1.**

*a) Trajectory towards the liver tumor; b) honeybee barbed needle [1, 2].*

#### **Figure 2.** *Hepatic lobule - basic unit of the liver.*

The basic unit of the liver is the hepatic lobule, a hexagonal element comprising the portal triad (portal vein, hepatic artery, and bile duct) [31, 32]. Lobules form two-layer membranes with an internal space of 100 Å, and their cellular elements contain twisted, spiraling fibers braided into helical and screw-shaped gaps (pores) of 40–100 μm in size (**Figure 2**) [33–36].

In this chapter we try to answer questions such as how the needle deforms and how the collision-free trajectories are determined.

## **2. Deformation of the needle**

The needle is a barbed bee needle and is modeled as a flexible thread within the framework of the Cosserat (micropolar) elasticity theory [37–42]. Cosserat elasticity is applied to describe the interaction between the needle and the human liver. Let us consider a serial surgical robot composed of a revolute joint and a flexible needle. A Lagrange frame (*X*, *Y*, *Z*) with base vectors (*e*<sub>1</sub>, *e*<sub>2</sub>, *e*<sub>3</sub>) and origin *O* at the entry point of the skin is attached to the robot (**Figure 3**). The Euler frame *K*(*x*, *y*, *z*), with origin in the joint and base vectors (*d*<sub>1</sub>, *d*<sub>2</sub>, *d*<sub>3</sub>), is attached to the

*Modeling of the Flexible Needle Insertion into the Human Liver DOI: http://dx.doi.org/10.5772/intechopen.96012*

**Figure 3.**

*Lagrange coordinate system OXYZ and the Euler coordinate system oxyz attached to the needle.*

needle. The angle between the flexible arm and the axis *x* is θ. Bending and torsion of the needle are described by the strain functions (*u*<sub>1</sub>, *u*<sub>2</sub>, *u*<sub>3</sub>). The robot has *f* degrees of freedom, *f* = *f*<sub>r</sub> + *f*<sub>e</sub>, where *f*<sub>r</sub> = 1 is the generalized coordinate of the rigid system and *f*<sub>e</sub> = 3 are the degrees of freedom of the flexible needle [43, 44].

The Euler axes are oriented with respect to the Lagrange axes by the Euler angles υ, ψ and φ [43].

$$\begin{aligned} d\_1 &= (-\sin\psi\sin\varphi + \cos\psi\cos\varphi\cos\upsilon)e\_1 + \\ &+ (\cos\psi\sin\varphi + \sin\psi\cos\varphi\cos\upsilon)e\_2 - \sin\upsilon\cos\varphi\, e\_3, \\ d\_2 &= (-\sin\psi\cos\varphi - \cos\psi\sin\varphi\cos\upsilon)e\_1 + \\ &+ (\cos\psi\cos\varphi - \sin\psi\sin\varphi\cos\upsilon)e\_2 + \sin\upsilon\sin\varphi\, e\_3, \\ d\_3 &= \sin\upsilon\cos\psi\, e\_1 + \sin\upsilon\sin\psi\, e\_2 + \cos\upsilon\, e\_3. \end{aligned} \tag{1}$$
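As a quick numerical sanity check, the base vectors defined by Eq. (1) should form an orthonormal frame for any choice of the Euler angles. A minimal NumPy sketch (the angle values below are arbitrary, not taken from the chapter):

```python
import numpy as np

def euler_frame(ups, psi, phi):
    """Base vectors d1, d2, d3 of Eq. (1) for Euler angles (upsilon, psi, varphi)."""
    su, cu = np.sin(ups), np.cos(ups)
    sp, cp = np.sin(psi), np.cos(psi)
    sf, cf = np.sin(phi), np.cos(phi)
    d1 = np.array([-sp * sf + cp * cf * cu,  cp * sf + sp * cf * cu, -su * cf])
    d2 = np.array([-sp * cf - cp * sf * cu,  cp * cf - sp * sf * cu,  su * sf])
    d3 = np.array([ su * cp,                 su * sp,                 cu])
    return d1, d2, d3

# Rows of D are d1, d2, d3; D @ D.T must be the identity (orthonormal frame)
D = np.stack(euler_frame(0.3, 0.7, 1.1))
print(np.allclose(D @ D.T, np.eye(3)))  # True
```

The check also confirms that *d*<sub>3</sub> is the unit tangent whose components reappear in Eq. (5).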

The functions (*u*<sub>1</sub>, *u*<sub>2</sub>, *u*<sub>3</sub>) measure the bending and torsion of the needle as

$$\begin{aligned} u\_1 &= \upsilon'\sin\varphi - \psi'\sin\upsilon\cos\varphi, \\ u\_2 &= \upsilon'\cos\varphi + \psi'\sin\upsilon\sin\varphi, \\ u\_3 &= \varphi' + \psi'\cos\upsilon, \end{aligned} \tag{2}$$

where (·)′ denotes partial differentiation with respect to *s*, the coordinate along the central line of the needle. The functions *u*<sub>1</sub> and *u*<sub>2</sub> describe the bending of the needle, and the function *u*<sub>3</sub> its torsion. In addition, *u*<sub>1</sub> and *u*<sub>2</sub> are the components of the curvature κ of the central line corresponding to the planes (*y*, *z*) and (*x*, *z*)

$$\kappa^2 = u\_1^2 + u\_2^2 = \upsilon'^2 + \psi'^2\sin^2\upsilon. \tag{3}$$

The function *u*<sub>3</sub> measures the torsion τ of the needle

$$u\_3 = \tau = \varphi' + \psi'\cos\upsilon. \tag{4}$$
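The curvature identity (3) can be verified directly from the strain definitions in Eq. (2); a short numerical sketch with arbitrary (hypothetical) values for the angles and their derivatives:

```python
import numpy as np

# Arbitrary sample values: upsilon, varphi and the s-derivatives upsilon', psi', varphi'
ups, phi = 0.6, 1.2
dups, dpsi, dphi = 0.3, 0.8, 0.5

# Strain functions of Eq. (2)
u1 = dups * np.sin(phi) - dpsi * np.sin(ups) * np.cos(phi)
u2 = dups * np.cos(phi) + dpsi * np.sin(ups) * np.sin(phi)
u3 = dphi + dpsi * np.cos(ups)   # the torsion tau of Eq. (4)

# Eq. (3): the varphi terms cancel, leaving upsilon'^2 + psi'^2 sin^2(upsilon)
kappa2 = dups**2 + dpsi**2 * np.sin(ups)**2
print(np.isclose(u1**2 + u2**2, kappa2))  # True
```

The cross terms in *u*<sub>1</sub>² and *u*<sub>2</sub>² cancel exactly, which is why κ² does not depend on φ.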

The needle is rigid along the tangential direction, so the total length *l* of the needle is invariant, the ends being fixed by the force *F* = −*f* with *f* = (*f*<sub>1</sub>, *f*<sub>2</sub>, *f*<sub>3</sub>). This force describes the contact between the needle and the tissue, *f* = *p*<sub>c</sub>*n* = *p*<sub>c</sub>*g*′, where *g* is a gap function with respect to *s*.

The link between the position vector *r* = (*x*, *y*, *z*) and the unit tangential vector *d*<sub>3</sub> is *r* = ∫<sub>0</sub><sup>*s*</sup> *d*<sub>3</sub> d*s*, or

$$x(s) = \int\_0^s \cos\psi\sin\upsilon\, \mathrm{d}s, \quad y(s) = \int\_0^s \sin\psi\sin\upsilon\, \mathrm{d}s, \quad z(s) = \int\_0^s \cos\upsilon\, \mathrm{d}s. \tag{5}$$
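Eq. (5) can be evaluated numerically for prescribed angle profiles. The sketch below uses hypothetical profiles υ(*s*) and ψ(*s*); because *d*<sub>3</sub> is a unit vector, the reconstructed centerline must preserve the needle length *l*, which is a direct check of the inextensibility assumption:

```python
import numpy as np

l, n = 1.0, 2000                  # needle length and number of arc-length steps
s = np.linspace(0.0, l, n + 1)
ds = s[1] - s[0]
ups = 0.2 + 0.3 * s               # assumed nutation profile upsilon(s)
psi = 0.5 * s**2                  # assumed precession profile psi(s)

# Eq. (5): cumulative integration of the unit tangent d3
x = np.cumsum(np.cos(psi) * np.sin(ups)) * ds
y = np.cumsum(np.sin(psi) * np.sin(ups)) * ds
z = np.cumsum(np.cos(ups)) * ds

# |r'(s)| = 1, so the polyline length approximates l
length = np.sum(np.sqrt(np.diff(x)**2 + np.diff(y)**2 + np.diff(z)**2))
print(round(length, 6))  # ≈ 1.0: total length preserved
```

Each discrete step has norm exactly d*s*, so the computed length matches *l* up to rounding.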

We introduce the inertia of the needle characterized by

$$(\rho\_0 A\_0)(s), \quad (\rho\_0 I\_1)(s), \quad (\rho\_0 I\_2)(s), \tag{6}$$

where ρ<sub>0</sub> is the mass density per unit volume, *A*<sub>0</sub> the area of the cross section, and *I*<sub>1</sub>, *I*<sub>2</sub> the geometrical moments of inertia about an axis perpendicular to the central axis and about the central axis, respectively.

The equations which describe the deformation are

$$-\rho \ddot{r} - \lambda' = 0,\tag{7}$$

$$\begin{split} k\_1\left(\dot{\psi}^2\sin\upsilon\cos\upsilon - \ddot{\upsilon}\right) - k\_2\left(\dot{\varphi} + \dot{\psi}\cos\upsilon\right)\dot{\psi}\sin\upsilon - \\ - A\left(\psi'^2\sin\upsilon\cos\upsilon - \upsilon''\right) + C\left(\varphi' + \psi'\cos\upsilon\right)\psi'\sin\upsilon - \\ - \lambda\_1\cos\upsilon\cos\psi - \lambda\_2\cos\upsilon\sin\psi + \lambda\_3\sin\upsilon = 0, \end{split} \tag{8}$$

$$\begin{split} -\frac{\partial}{\partial t}\left\{ k\_1\dot{\psi}\sin^2\upsilon + k\_2(\dot{\varphi} + \dot{\psi}\cos\upsilon)\cos\upsilon \right\} + \\ + \frac{\partial}{\partial s}\left\{ A\psi'\sin^2\upsilon + C(\varphi' + \psi'\cos\upsilon)\cos\upsilon \right\} + \\ + \lambda\_1\sin\upsilon\sin\psi - \lambda\_2\sin\upsilon\cos\psi = 0, \end{split} \tag{9}$$

$$-k\_2\frac{\partial}{\partial t}\left(\dot{\varphi} + \dot{\psi}\cos\upsilon\right) + C\frac{\partial}{\partial s}\left(\varphi' + \psi'\cos\upsilon\right) = 0, \tag{10}$$

where *A* and *C* are the bending stiffness and the torsional stiffness of the needle, respectively, related to the Lamé constants λ, μ by $A = \frac{1}{4}\pi a^4 E$ and $C = \frac{1}{2}\pi a^4\mu$, where $E = \frac{\mu(3\lambda + 2\mu)}{\lambda + \mu}$ is Young's elastic modulus, *a* is the radius of the cross section of the needle, and

$$\rho = A\_0\rho\_0 = \pi a^2\rho\_0, \quad k\_1 = I\_1\rho\_0 = \frac{\pi a^4}{4}\rho\_0, \quad k\_2 = I\_2\rho\_0 = \frac{\pi a^4}{2}\rho\_0. \tag{11}$$
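For concreteness, the coefficients (11) and the stiffnesses *A* and *C* entering Eqs. (8)–(10) can be evaluated for a hypothetical needle; the radius and material constants below are assumed, illustrative values, not data from the chapter:

```python
import math

# Assumed, illustrative needle data (not from the chapter):
a = 0.4e-3      # cross-section radius [m]
E = 200e9       # Young's modulus [Pa]
mu = 77e9       # shear modulus (Lame constant mu) [Pa]
rho0 = 7850.0   # mass density [kg/m^3]

# inertia coefficients of Eq. (11)
rho = math.pi * a ** 2 * rho0        # rho = A0 * rho0
k1 = math.pi * a ** 4 / 4 * rho0     # k1 = I1 * rho0
k2 = math.pi * a ** 4 / 2 * rho0     # k2 = I2 * rho0 = 2 * k1

# bending and torsional stiffnesses: A = (1/4) pi a^4 E, C = (1/2) pi a^4 mu
A = math.pi * a ** 4 * E / 4
C = math.pi * a ** 4 * mu / 2
```

Note that, for a circular cross section, $k\_2 = 2k\_1$ and $A/C = E/(2\mu)$ hold identically, which is a useful sanity check on the coefficients.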

Eqs. (7)–(11) are solved by using the cnoidal method [43].

In short, this method reduces to a generalization of the Fourier series with the cnoidal functions as the fundamental basis. This is possible because the cnoidal functions are much richer than the trigonometric or hyperbolic functions: the modulus *m* of the cnoidal function, $0 \le m \le 1$, can be varied to obtain a sine or cosine function ($m \simeq 0$), a Stokes function ($m \simeq 0.5$), or a solitonic function, sech or tanh ($m \simeq 1$).
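This modulus-dependent behavior is easy to check numerically. The sketch below evaluates cn from its Fourier series in the nome (the same cosine series that reappears in Eq. (20) below), with K(m) computed by the standard arithmetic-geometric-mean identity; it illustrates the limits quoted above and is not part of the solution procedure of [43]:

```python
import math

def agm(x, y, tol=1e-15):
    """Arithmetic-geometric mean (converges quadratically)."""
    while abs(x - y) > tol * x:
        x, y = (x + y) / 2, math.sqrt(x * y)
    return x

def K(m):
    """Complete elliptic integral of the first kind, parameter m (0 <= m < 1)."""
    return math.pi / (2 * agm(1.0, math.sqrt(1.0 - m)))

def cn(u, m, terms=60):
    """Jacobi cn from its Fourier series in the nome q = exp(-pi K'/K)."""
    Km = K(m)
    q = math.exp(-math.pi * K(1.0 - m) / Km)
    s = sum(q ** (k + 0.5) / (1 + q ** (2 * k + 1))
            * math.cos((2 * k + 1) * math.pi * u / (2 * Km))
            for k in range(terms))
    return 2 * math.pi / (Km * math.sqrt(m)) * s

# m ~ 0 reproduces the cosine; m ~ 1 approaches sech
print(abs(cn(1.0, 1e-6) - math.cos(1.0)) < 1e-3)        # True
print(abs(cn(1.0, 0.999) - 1 / math.cosh(1.0)) < 1e-2)  # True
```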

To understand the cnoidal method, consider now a nonlinear system of equations that govern the motion of a dynamical system

$$\frac{\mathbf{d}\theta\_i}{\mathbf{d}t} = F\_i(\theta\_1, \theta\_2, \dots, \theta\_n), i = 1, \dots, n, n \ge 3,\tag{12}$$

*Modeling of the Flexible Needle Insertion into the Human Liver DOI: http://dx.doi.org/10.5772/intechopen.96012*

with $\theta \in R^n$, $t \in [0, T]$, $T \in R$, where *F* may be of the form

$$\begin{aligned} F\_i &= \sum\_{p=1}^{n} a\_{ip}\theta\_p + \sum\_{p,q=1}^{n} b\_{ipq}\theta\_p\theta\_q + \sum\_{p,q,r=1}^{n} c\_{ipqr}\theta\_p\theta\_q\theta\_r + \\ &+ \sum\_{p,q,r,l=1}^{n} d\_{ipqrl}\theta\_p\theta\_q\theta\_r\theta\_l + \sum\_{p,q,r,l,m=1}^{n} e\_{ipqrlm}\theta\_p\theta\_q\theta\_r\theta\_l\theta\_m + \dots, \end{aligned} \tag{13}$$

where $i = 1, 2, \dots, n$, and *a*, *b*, *c*, … are constants.

This system of equations can be reduced to Weierstrass equations of the type

$$\dot{\theta}^2 = P\_n(\theta). \tag{14}$$

We introduce the function transformation

$$\theta = 2\frac{\mathrm{d}^2}{\mathrm{d}t^2}\log\Theta\_n(t), \tag{15}$$

where the theta functions $\Theta\_n(t)$ are defined as

$$\Theta\_1 = 1 + \exp\left(\mathrm{i}\omega\_1 t + B\_{11}\right),$$

$$\Theta\_2 = 1 + \exp\left(\mathrm{i}\omega\_1 t + B\_{11}\right) + \exp\left(\mathrm{i}\omega\_2 t + B\_{22}\right) + \exp\left(\mathrm{i}(\omega\_1 + \omega\_2)t + B\_{12}\right),$$

$$\begin{aligned}\Theta\_3 = 1 &+ \exp\left(\mathrm{i}\omega\_1 t + B\_{11}\right) + \exp\left(\mathrm{i}\omega\_2 t + B\_{22}\right) + \exp\left(\mathrm{i}\omega\_3 t + B\_{33}\right) + \\ &+ \exp\left(\mathrm{i}(\omega\_1 + \omega\_2)t + B\_{12}\right) + \exp\left(\mathrm{i}(\omega\_1 + \omega\_3)t + B\_{13}\right) + \exp\left(\mathrm{i}(\omega\_2 + \omega\_3)t + B\_{23}\right) + \\ &+ \exp\left(\mathrm{i}(\omega\_1 + \omega\_2 + \omega\_3)t + B\_{12} + B\_{13} + B\_{23}\right), \end{aligned} \tag{16}$$

and

$$\Theta\_n = \sum\_{M\in(-\infty,\infty)}\exp\left(\mathrm{i}\sum\_{i=1}^{n} M\_i\omega\_i t + \frac{1}{2}\sum\_{i<j}^{n} M\_i M\_j B\_{ij}\right), \tag{17}$$

$$\exp B\_{ij} = \left(\frac{\alpha\_i - \alpha\_j}{\alpha\_i + \alpha\_j}\right)^2, \quad \exp B\_{ii} = \alpha\_i^2. \tag{18}$$

Further, we write the solution (15) in the form

$$\theta(t) = 2\frac{\partial^2}{\partial t^2}\log\Theta\_n(\eta) = \theta\_{lin}(\eta) + \theta\_{int}(\eta), \tag{19}$$

for $\eta = -\omega t + \phi$. The first term $\theta\_{lin}$ represents a linear superposition of cnoidal waves. Indeed, after some algebraic manipulation, we obtain

$$\theta\_{lin} = \sum\_{l=1}^{n}\alpha\_l\left\{\frac{2\pi}{K\_l\sqrt{m\_l}}\sum\_{k=0}^{\infty}\frac{q\_l^{k+1/2}}{1 + q\_l^{2k+1}}\cos\left[(2k+1)\frac{\pi\alpha\_l t}{2K\_l}\right]\right\}^2. \tag{20}$$

In (20) we recognize the expression [43].

$$\theta\_{lin} = \sum\_{l=1}^{n}\alpha\_l\,\mathrm{cn}^2[\alpha\_l t;\, m\_l], \tag{21}$$

with

$$q = \exp\left(-\pi\frac{K'}{K}\right), \quad K = K(m) = \int\_0^{\pi/2}\frac{\mathrm{d}u}{\sqrt{1 - m\sin^2 u}}, \quad K'(m) = K(m\_1), \quad m + m\_1 = 1.$$
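Under these definitions, $K(m)$ and the nome $q$ can be evaluated numerically; a minimal sketch using the arithmetic-geometric mean (a standard identity for $K$, not stated in the chapter) is:

```python
import math

def agm(x, y, tol=1e-15):
    """Arithmetic-geometric mean."""
    while abs(x - y) > tol * x:
        x, y = (x + y) / 2, math.sqrt(x * y)
    return x

def K(m):
    """K(m) via the identity K(m) = pi / (2 * AGM(1, sqrt(1 - m)))."""
    return math.pi / (2 * agm(1.0, math.sqrt(1.0 - m)))

def nome(m):
    """Nome q = exp(-pi K'/K), with K'(m) = K(m1), m + m1 = 1."""
    return math.exp(-math.pi * K(1.0 - m) / K(m))

print(K(0.0))     # pi/2 = 1.5707963...
print(nome(0.5))  # exp(-pi) = 0.0432139..., since K = K' at m = 0.5
```

At $m = 0.5$ the modulus is self-complementary ($K = K'$), so the nome collapses to $e^{-\pi}$, a convenient check of the implementation.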

The second term $\theta\_{int}$ represents a nonlinear superposition, or interaction, among cnoidal waves. We write this term as

$$2\frac{\mathrm{d}^2}{\mathrm{d}t^2}\log\left[1 + \frac{F(t)}{G(t)}\right] \approx \frac{\beta\_k\,\mathrm{cn}^2(\omega\_k t; m\_k)}{1 + \gamma\_k\,\mathrm{cn}^2(\omega\_k t; m\_k)}. \tag{22}$$

If $m\_k$ takes the value 0 or 1, the relation (22) is directly verified. For $0 \le m\_k \le 1$, the relation is numerically verified with an error of $|e| \le 5\times 10^{-7}$. Consequently, we have

$$\theta\_{int} = \frac{\sum\_{k=0}^{n}\beta\_k\,\mathrm{cn}^2[\eta; m\_k]}{1 + \sum\_{k=0}^{n}\gamma\_k\,\mathrm{cn}^2[\eta; m\_k]}. \tag{23}$$

As a result, the cnoidal method yields solutions consisting of a linear superposition and a nonlinear superposition of cnoidal waves.

Therefore, by applying the cnoidal method, the closed-form solutions for the Euler angles υ, ψ and φ are obtained [43]:

$$\cos\upsilon = \zeta = \zeta\_2 - (\zeta\_2 - \zeta\_3)\,\mathrm{cn}^2\left(\sqrt{\frac{|\lambda\_3|}{2A}(\zeta\_1 - \zeta\_3)}\,(\xi - \xi\_3),\, m\right) = \zeta\_2 - (\zeta\_2 - \zeta\_3)\,\mathrm{cn}^2[w(\xi - \xi\_3), m], \tag{24}$$

where $m = \frac{\zeta\_2 - \zeta\_3}{\zeta\_1 - \zeta\_3}$ and $w = \sqrt{\frac{|\lambda\_3|}{2A}(\zeta\_1 - \zeta\_3)}$,

$$\psi = \frac{1}{4(A - k\_1v^2)^2 w^2}\left\{\frac{\beta + (C - k\_2v^2)\tau}{1 - \zeta\_3}\,\Pi\left[w(\xi - \xi\_3), \frac{\zeta\_2 - \zeta\_3}{1 - \zeta\_3}, m\right] - \frac{\beta - (C - k\_2v^2)\tau}{1 + \zeta\_3}\,\Pi\left[w(\xi - \xi\_3), \frac{\zeta\_2 - \zeta\_3}{1 + \zeta\_3}, m\right]\right\}, \tag{25}$$

$$\varphi = -\frac{\tau\left[C - A - (k\_2 + k\_1)v^2\right]}{A - k\_1v^2}\,\xi + \frac{1}{4(A - k\_1v^2)^2 w^2}\left\{\frac{\beta + (C - k\_2v^2)\tau}{1 - \zeta\_3}\,\Pi\left[w(\xi - \xi\_3), \frac{\zeta\_2 - \zeta\_3}{1 - \zeta\_3}, m\right] - \frac{\beta - (C - k\_2v^2)\tau}{1 + \zeta\_3}\,\Pi\left[w(\xi - \xi\_3), \frac{\zeta\_2 - \zeta\_3}{1 + \zeta\_3}, m\right]\right\}, \tag{26}$$

with $\Pi(x, z, m) = \int\_0^x \frac{\mathrm{d}y}{1 - z\,\mathrm{sn}^2(y, m)}$ the normal elliptic integral of the third kind. The functions $\zeta\_1$, $\zeta\_2$, $\zeta\_3$ are solutions of the equation


$$\frac{1}{2}\zeta'^2 = a\zeta^3 + b\zeta^2 - a\zeta + c,\tag{27}$$

$$a = -\frac{\lambda\_3}{A} \neq 0, \quad b = \frac{1}{2A}\left(\gamma - \frac{C^2\tau^2}{A}\right), \quad c = -\frac{1}{2A}\left(\gamma - \frac{\beta^2}{A}\right). \tag{28}$$

Our objective is to determine the functions which measure the bending of the needle ($u\_1$ and $u\_2$) and its torsion ($u\_3$). To visualize the strain profile of the needle, we chose two routes (**Figure 4**). For the first route, the tumor is red and the entry point is A. The second route is restricted by the presence of blood vessels that must not be touched; its tumor (blue) is reached from entry point B.

**Figures 5** and **6** show the deformation of the needle for both routes. We see that the deformation is small, with no tendency to chaos. The strains are described

#### **Figure 4.**

*Two needle trajectories: For first route the tumor is red and the entry point A, and for the second route tumor is blue and the entry point B.*

**Figure 5.** *Functions u*<sup>1</sup> *and u*<sup>2</sup> *for a) first route and b) second route.*

**Figure 6.** *Function u*<sup>3</sup> *for a) first route and b) the second route.*

by localized solitons, which propagate for a long time without change. A soliton is a localized wave with an infinite number of degrees of freedom; it conserves its properties even after interaction with another wave. In short, it acts somewhat like a particle [43]. The system of Eqs. (7)–(11) has unique, locally preserved properties: an infinite number of exact solutions expressed in terms of the Jacobi elliptic functions or the hyperbolic functions, and simple formulae for the nonlinear superposition of explicit solutions.

## **3. Determination of the collision-free trajectories**

Let us present a model for determining the collision-free trajectories by using the Fibonacci sequence. The trajectories are determined from the restrictions of avoiding collisions with blood vessels, ribs and surrounding tissues, as well as the interference of the needles with each other. We consider that several needles are planned to be inserted into the liver [45].

The trajectory of each needle $j = 1, \dots, n$ is defined as a set of segments connecting the insertion point with the tumor. Two control parameters are introduced for each needle. The first parameter is the length of the *k*th segment of the *j*th needle's trajectory, $l\_{kj} = f\_k/r\_{kj}$, where $f\_k$ is the *k*th Fibonacci number and $r\_{kj}$ a scaling number. The second control parameter is the angle ω between the current needle and the previous one, $\omega \in (0, \pi)$ [46, 47].

The first terms of the Fibonacci sequence are 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144. It is clear that each term is the sum of the two preceding terms. In fact, the sequence can be defined recursively as $f\_n = f\_{n-1} + f\_{n-2}$, $f\_0 = 0$, $f\_1 = 1$. The ratio of two consecutive terms of the Fibonacci sequence tends to the golden ratio $\varphi = (1 + \sqrt{5})/2$ [48, 49].
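A minimal sketch confirming the recursion and the convergence of consecutive ratios to the golden ratio:

```python
def fib_sequence(n):
    """Fibonacci numbers f_0 .. f_n with f_0 = 0, f_1 = 1."""
    f = [0, 1]
    while len(f) < n + 1:
        f.append(f[-1] + f[-2])
    return f

f = fib_sequence(12)
print(f[1:])                         # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]
ratios = [f[k + 1] / f[k] for k in range(2, 12)]
phi = (1 + 5 ** 0.5) / 2             # golden ratio, 1.6180339...
print(abs(ratios[-1] - phi) < 1e-3)  # True
```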

The Fibonacci sequence has been highlighted in nature, for example in the arrangement of flower petals, in seeds, and in the spiral arrangement of pine cones and pineapples. The keys on a piano are divided into Fibonacci numbers, and numerous classical compositions implement the golden section; examples are found in the Hallelujah chorus of Handel's Messiah and in many of Chopin's preludes [50].

The development of a targeted drug delivery technique usually consists of three steps. In step 1, a 2D ultrasound image of the tumor is obtained; the surgeon analyzes the size and position of the tumor and decides the number of required needles. In step 2, the insertion points of the needles on the skin and the positions of the targets in the tumor are established. In step 3, once the needles are placed at the insertion points on the skin, they are guided according to a precise surgical planning


**Figure 7.** *The base coordinate frame.*

based on the optimization of the collision-free trajectories to avoid the ribs, blood vessels and tissues in the abdominal area.

The base coordinate frame is established on the first needle, with the vertical axis *Oz* and the origin at the insertion point (**Figure 7**). The workspace boundary for the needle is a 1D curve in each *xy*-plane, that is, the plane $z = h$, $h \in [0, h\_{max}]$, with the clinical value $h\_{max} = 200$ mm. *S* is the plane of the needle insertion trajectory, and $(\theta\_k, \varphi\_k)$ are the rotation angles with respect to the *y* and *z* axes, with $\theta\_{min} \le \theta\_k \le \theta\_{max}$, $\varphi\_{min} \le \varphi\_k \le \varphi\_{max}$ [45].

The first control parameter is the length of the *k*th segment of the *j*th needle, $l\_{kj} = f\_k/r\_{kj}$, $k = 1, \dots, m\_{lk}$, $j = 1, \dots, n$, where $f\_k$ is the *k*th Fibonacci number

$$f\_0 = f\_1 = 1, \quad f\_{k+2} = f\_{k+1} + f\_k, \quad k \ge 0, \tag{29}$$

and $r\_{kj}$ a scaling number. The second control parameter is the angle φ between the current needle and the previous one, $\varphi \in (0, \pi)$. The kinematic constraint of the *j*th needle, $j = 1, \dots, n$, is given by

$$\Phi = \begin{bmatrix} x\_j - x\_{0j} - \delta\_{xj} - \left(h - z\_{0j} - \delta\_{zj}\right)\tan\theta\_j\cos\varphi\_j \\ y\_j - y\_{0j} - \delta\_{yj} - \left(h - z\_{0j} - \delta\_{zj}\right)\tan\theta\_j\sin\varphi\_j \end{bmatrix} = 0, \tag{30}$$

where $[x\_{0j} + \delta\_{xj},\, y\_{0j} + \delta\_{yj},\, z\_{0j} + \delta\_{zj}]^T$ is the actual target of the tip of needle $(j-1)$, $(\theta\_j, \varphi\_j)$ are the rotation angles with respect to the *y* and *z* axes, with $\theta\_{min} \le \theta\_j \le \theta\_{max}$, $\varphi\_{min} \le \varphi\_j \le \varphi\_{max}$, and $(\delta\_{xj}, \delta\_{yj}, \delta\_{zj})$ denotes the deformation of the liver.

The choice of the scaling number $r\_{kj}$ is made by a binary control

$$d\_{kj} = \|x\_k - x\_{k-1}\| = \frac{u\_{kj}\,f\_k}{r\_{kj}}. \tag{31}$$

The possible collision point between the needle and the tissue is analyzed by an identifier that checks the minimum distance between the needle and the surrounding tissue [45]. The minimum distance is expressed as

$$\min \left( \frac{1}{2} (r\_1 - r\_2)^T (r\_1 - r\_2) \right),\tag{32}$$

with $g\_1(r\_1) \le 0$, $g\_2(r\_2) \le 0$, where $r\_1$, $r\_2$ are the position vectors of two points belonging to the needle and the tissue, respectively, and $g\_1$, $g\_2$ are the surfaces of the needle and the tissue, respectively. The interference distance, or penetration, is defined as

$$\min \left( -d \right), \quad \mathbf{g}\_1(r\_1) \le -\frac{d}{2}, \quad \mathbf{g}\_2(r\_2) \le -\frac{d}{2}, \tag{33}$$

where *d* is the penetration. The configuration of the collision-free trajectories of each needle is defined as a sequence of trajectories corresponding to a particular choice of the control $(u, v)$. The set of all collision-free trajectories is computed based on the kinematic constraint (30) as

$$R = \left\{\sum\_{k=0}^{n}\frac{u\_k f\_k}{r^k}\exp\left(-\mathrm{i}\omega\sum\_{j=0}^{k} v\_j\right) \;\middle|\; u\_k, v\_j \in \{0, 1\}\right\}. \tag{34}$$
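A toy enumeration of the set (34) can be sketched in the plane; the scaling *r*, the angle ω, and the forbidden disk below are hypothetical placeholders for the anatomical constraints:

```python
import cmath
import itertools

def fib(k):
    """Fibonacci numbers with f_0 = f_1 = 1, as in Eq. (29)."""
    a, b = 1, 1
    for _ in range(k):
        a, b = b, a + b
    return a

def endpoint(u, v, r=2.0, omega=cmath.pi / 6):
    """Planar endpoint of one trajectory of Eq. (34) for binary controls (u, v)."""
    z, phase = 0j, 0.0
    for k, (uk, vk) in enumerate(zip(u, v)):
        phase -= omega * vk                   # -omega * sum_{j<=k} v_j
        z += uk * fib(k) / r ** k * cmath.exp(1j * phase)
    return z

# enumerate all binary control sequences of length 3
controls = list(itertools.product([0, 1], repeat=3))
R = [endpoint(u, v) for u in controls for v in controls]
print(len(R))  # 64 candidate trajectories

# keep only trajectories whose endpoint misses a hypothetical forbidden disk
free = [z for z in R if abs(z - (1 + 0j)) > 0.5]
```

In the actual planning problem the filter is, of course, the distance checks (32)–(33) against segmented anatomy rather than a single disk.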

**Figure 8.** *a) Location of the tumor; b) tumor image seen on the microscope.*

**Figure 9.** *Simulation of the collision-free trajectories for three needles.*


As an application, the case of a tumor with a difficult location in the vicinity of the portal tree of the vascular territory of the liver is considered (**Figure 8a**). The tumor image seen under the microscope is shown in **Figure 8b**. White and gray denote forbidden areas, while the shades of purple are safe regions. The tumor is drawn in red. The Fibonacci algorithm is applied to three needles, with restrictions to avoid collisions with the tissues, blood vessels, ribs and previously inserted needles.

The task of our simulation is to determine the boundaries of each needle as a collision-free surface which represents the feasible insertion area based on given constraints. Then, the optimal trajectory of each needle can be chosen in this surface automatically.

Once all needles are placed at the predetermined points on the epidermis and the order of entry is chosen to be 1, 2 and 3, the first needle is inserted and the operation is then repeated for needles 2 and 3.

The simulation of the collision-free trajectories for three needles is presented in **Figure 9**. The insertion scheme is determined by the Fibonacci spirals. A set of collision-free trajectories (red) in the immediate vicinity of the epidermis is suggested.

**Figure 10.** *Optimal solution with 3 collision-free trajectories to the target.*

**Figure 11.** *Three optimal collision-free trajectories for the needle robot.*

From these possible collision-free trajectories (red), the green paths corresponding to the Fibonacci spirals (black) are chosen. These trajectories avoid the blood vessels (purple) and the ribs (brown) in all directions up to the tumor. The optimal solution with three collision-free trajectories to the target is displayed in **Figure 10**. The Fibonacci spirals with the centers in the ribs (brown) are displayed for needles 1 and 3. For needle 2, the Fibonacci.

Three locally optimal collision-free trajectories for the surgical needle corresponding to three different entry points into the skin A, B and C are displayed in **Figure 11**.

## **4. Conclusions**

This study investigates the navigation of a flexible needle into the human liver. The role of the needle is to deliver an active chemotherapeutic agent into the tumor. The deformation of the needle during its insertion into the human liver is described in this chapter in the context of the intraoperative treatment of liver tumors based on integrated imaging-molecular diagnosis. The needle is a bee-stinger barbed type, modeled as a flexible thread within the framework of the Cosserat (micropolar) elasticity theory. The Cosserat elasticity describes the interaction between the needle and the human liver by incorporating the local rotation of points and the couple stress, as well as the force stress, representing the chiral properties of the human liver.

## **Acknowledgements**

This work was supported by a grant of the Romanian Ministry of Research and Innovation, CCCDI – UEFISCDI, project number PN-III-P1-1.2-PCCDI-2017-0221/59PCCDI/2018 (IMPROVE), within PNCDI III. We mention that *all authors have contributed equally* to the work.

## **Conflict of interest**

The authors of this paper certify that they have no affiliations with or involvement in any organization or entity with any financial or nonfinancial interest in the subject matter or materials discussed in this manuscript.

## **Author details**

Veturia Chiroiu\*, Ligia Munteanu, Cristian Rugină and Nicoleta Nedelcu Institute of Solid Mechanics, Romanian Academy, Bucharest, Romania

\*Address all correspondence to: veturiachiroiu@yahoo.com

© 2021 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/ by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


## **References**

[1] Chiroiu, V., Nedelcu, N., Munteanu, L., Rugina, C., Ionescu, M., Dragne, C., *Modeling the flexible needle insertion into the human liver,* Proceedings of the Romanian Academy, series A: Mathematics, Physics, Technical Sciences, Information Science (2020).

[2] Sahlabadi, M., Hutapea, P., *Tissue deformation and insertion force of bee-stinger inspired surgical needles*, Journal of Medical Devices, vol. 12, 034501, 1–3 (2018).

[3] Dionigi, R., *Medical Intelligence Unit*: *Recent Advances in Liver Surgery*, Landes Bioscience Austin, Texas (2009).

[4] Dremin, V., Potapova, E., Zherebtsov, E., Kandurova, K., Shupletsov, V., Alekseyev, A., Mamoshin, A., Dunaev A*., Optical percutaneous needle biopsy of the liver: a pilot animal and clinical study*, Scientific Reports, volume 10, Article number: 14200 (2020).

[5] Abolhassani, N., Patel, R., Moallem, M., *Control of soft tissue deformation during robotic needle insertion*, Minimally Invasive Therapy, 15(3), 165–176 (2006).

[6] Abolhassani, N., Patel, R., Moallem, M., *Needle insertion into soft tissue: A survey*, Medical Engineerings & Physics, 29, 413–431 (2007).

[7] Torzilli, G., Minagawa, M., Takayama, T., *Accurate preoperative evaluation of liver mass lesions without fine-needle biopsy,* Hepatology, 30(4), 889–893 (1999).

[8] Baker, N.E., *Emerging mechanisms of cell competition*, Nature Reviews Genetics (2020).

[9] Petrowsky, H., Fritsch, R., Guckenberger, M., De Oliveira, M.L., Dutkowski, P., Clavien, P.A., *Modern therapeutic approaches for the treatment of malignant liver tumours*, Nature Reviews Gastroenterology & Hepatology (2020).

[10] Gino van den Bergen, *Collision detection in interactive 3D environments*, Elsevier (2004).

[11] Okamura, A.M., Simone, C., O'Leary, M.D., *Force modeling for needle insertion into soft tissue*, IEEE Trans Biomed Eng., 51,1707–16 (2004).

[12] DiMaio, S.P., Salcudean, S.E., *Needle insertion modeling and simulation*, IEEE Trans Robot Automat., 19, 864–75 (2003).

[13] DiMaio, S.P., Salcudean, S.E., *Simulated interactive needle insertion*, Proceedings of the 10th IEEE Symposium on Haptic Interfaces for Virtual Environment & Teleoperator Systems, 344–51 (2002).

[14] Maurin, B., Barbe, L., Bayle, B., Zanne, P., *In vivo study of forces during needle insertions*, Scientific Workshop on Medical Robotics, Navigation and Visualization (MRNV04), Germany, Remagen, 415–422 (2004).

[15] Lombardo, J-C, Cani, M-P, Neyret, F., *Real time collision detection for virtual surgery*, Computer Animation (CA'99), May 1999, Geneva, Switzerland, pp.82– 90 (1999).

[16] Zachmann, G., *Minimal hierarchical collision detection,* Proc. ACM Symposium on Virtual Reality Software and Technology (VRST), Hong Kong, China, 121–128 (2002).

[17] Brișan, C., Boantă, C., Chiroiu, V., *Introduction in optimisation of industrial robots. Theory and applications,* Editura Academiei, Bucharest (2019).

[18] Kataoka, H., Washio, T., Audette, M., Mizuhara, K., *A model for relations between needle deflection, force, and thickness on needle insertion*, Proceedings of the Medical Image Computing and Computer-Assisted Intervention Conference, 966–974 (2001).

[19] Pisla D., Vaida, C., Birlescu I., Nadim, A.H., Gherman, B.,Corina Radu, Plitea N., *Risk Management for the Reliability of Robotic Assisted Treatment of Non-resectable Liver Tumors*, Appl. Sci., 10(1), 52 (2020).

[20] Birlescu I., Manfred, H., Vaida C., Plitea N., Nayak A., Pisla, D., *Complete Geometric Analysis Using the Study SE(3) Parameters for a Novel, Minimally Invasive Robot Used in Liver Cancer Treatment*, Symmetry, 11(12), 1491 (2019).

[21] Vaida, C., Plitea, N., Pisla, D., Gherman, B., *Orientation module for surgical instruments - a systematical approach*, Meccanica, 48(1), 145–158 (2013).

[22] Munteanu, L., Rugină, C., Dragne, C., Chiroiu, V., *On the robotic control based on interactive activities of subjects,* Proceedings of the Romanian Academy, series A : Mathematics, Physics, Technical Sciences, Information Science, 21(1), (2020).

[23] Chiroiu, V., Munteanu, L., Ioan, R., Dragne, C., Majercsik, L., *Using the Sonification for Hardly Detectable Details in Medical Images,* Scientific Reports, 9, article number 17711 (2019).

[24] Korayem, M.H., Nikoobin, A., Azimirad, V.,*Trajectory optimization of flexible link manipulators in point-to point motion*, Robotica, 27, 825–840 (2009).

[25] Chiroiu, V., Munteanu, L., Ioan, R., Mosneguțu, V., Girip, I., *On the dL algorithm for controlling the hybrid systems*, Acta Electronica, Special Issue Proceedings of the XXIXth SISOM, 60(1–2), 58–65, Mediamira Science Publishing (2019).

[26] Chiroiu, V., Munteanu, L., Dragne, C., Știrbu, C., *On the diferential dynamic logic model for hybrid systems*, Acta Technica Napocensis - series: Applied Mathematics, Mechanics, and Engineering, 61(4), 2018.

[27] Majercsik, L., *On the* d*L control* applied *to a Stewart platform with flexible joints,* Romanian Journal of Mechanics**,** 4(1), 27–38 (2019).

[28] Motta, PM,*The three-dimensional fine structure of the liver as revealed by scanning electron microscopy*, Int. Rev. Cytol.,Suppl 6, 347–399 (1977).

[29] Ma, M.H., Biempica, L.,*The normal human liver cell*, Am. J. Pathol., 62, 353– 370 (1971).

[30] Jackson RL, Morrisett JD, Gotto AM., *Lipoprotein structure and metabolism,* Physiol. Rev. 56, 259–316 (1976).

[31] Saxena, R., Theise, N.D., Crawford, J.M., *Microanatomy of the human liverexploring the hidden interfaces*, Hepatology, 30(6), 1339–46 (1999).

[32] Si-Tayeb, K., Lemaigre, F.P., Duncan, S.A., *Organogenesis and development of the liver*, Dev. Cell.,18(2), 175–89 (2010).

[33] Yi-Je Lim, Dhanannjay Deo, Tejinder P. Singh, Daniel B. Jones, Suvranu De, *In Situ Measurement and Modeling of Biomechanical Response of Human Cadaveric Soft Tissues for Physics-Based Surgical Simulation*, Surg Endosc., 23(6), 1298–1307 (2009).

[34] Narayan, K.S., Steele, W.J., Busch, H., *Evidence that the granular and fibrillar components of nucleoli contain 28 and 65 RNA, respectively*, Exp. Cell. Res., 43, 483–492 (1966).


[35] Habenschus, M.D., Nardini, V., Dias, L.G., Rocha, B.A., Barbosa Jr., F., Moraes de Oliveira, A.R., *In vitro enantioselective study of the toxicokinetic effects of chiral fungicide tebuconazole in human liver microsomes*, Ecotoxicol. Environ. Saf., 181, 96–105 (2019).

[36] Takahashi, M., Takani, D., Haba, M., Hosokawa, M., *Investigation of the chiral recognition ability of human carboxylesterase 1 using indomethacin esters*, Chirality, 32(1), 73–80 (2020).

[37] Cosserat, E., Cosserat, F., *Théorie des corps déformables*, Hermann et Fils, Paris (1909).

[38] Eringen, A.C., *Microcontinuum Field Theory*, volume II. Fluent Media, Springer, New York (2001).

[39] Eringen, A.C., Kafadar, C.B., *Polar field theories.* In A. C. Eringen,editor, Continuum Physics, volume IV, pages 1–75. Academic Press, NewYork (1976).

[40] Iesan, D., *Existence theorems in the theory of micropolar elasticity*. International Journal of Engineering Science, 8, 777–791 (1970).

[41] Iesan, D., *Existence theorems in micropolar elastostatics*, International Journal of Engineering Science, 9, 59–78 (1971).

[42] Forest, S., Sievert, R., *Nonlinear microstrain theories*, International Journal of Solids and Structures, 43(24), 7224–7245 (2006).

[43] Munteanu, L., Donescu, St., *Introduction to Soliton Theory*: *Applications to Mechanics,* Book Series Fundamental Theories of Physics, vol.143, Kluwer Academic Publishers, Dordrecht, Boston (Springer Netherlands) (2004).

[44] Chiroiu, V., Munteanu, L., Gliozzi, A.S., *Application of the Cosserat theory for* *modeling the reinforcement carbon nanotube beams*, CMC: Computers, Materials & Continua, 19(1), 1–16 (2010).

[45] Dragne, C., Chiroiu, V., Munteanu, L., Brişan, C., Rugină, C., Ioan, R., Stănescu, N-D, Stan, A.F., *On the collision free-trajectories of a multipleneedle robot based on the Fibonacci sequence,* New Trends in Mechanism and Machine Science, Volume 89 of the Mechanisms and Machine Science series, Chapter 20 pp.1–12 Springer (2019).

[46] Lai, A.C., Loreti, P., Velluci, P.: A Model for Robotic Hand Based on Fibonacci Sequence. In: Proceedings of the 11th International Conference on Informatics in Control, Automation and Robotics (ICINCO-2014), pp.577–584 (2014).

[47] Aghili, F., Parsa, K.: *Design of a reconfigurable space robot with lockable telescopic joints*. In Conference IEEE/RSJ, International Conference on Intelligent Robots and Systems (2006).

[48] Burton, D. M.: *Elementary number theory* (5th ed.). New York: McGraw-Hill (2002).

[49] Fox, W.P.: Fibonacci Search in Optimization of Unimodal Functions. Department of Mathematic Francis Marion University, Florence, SC 29501 (2002).

[50] Silverman, J. H.: *A friendly introduction to number theory*. (3rd ed.) Upper Saddle River, NJ: Pearson Education (2006)
