Dialogue System for Nursing Robot

#### **Chapter 5**

## Issues in the Development of Conversation Dialog for Humanoid Nursing Partner Robots in Long-Term Care

*Tetsuya Tanioka, Feni Betriana, Ryuichi Tanioka, Yuki Oobayashi, Kazuyuki Matsumoto, Yoshihiro Kai, Misao Miyagawa and Rozzano Locsin*

#### **Abstract**

The purpose of this chapter is to explore the issues in developing conversational dialog for nursing robots, especially in long-term care, and to forecast the introduction of humanoid nursing partner robots (HNRs) into clinical practice. In order to satisfy the required performance of HNRs, it is important that anthropomorphic robots act with high-quality conversational dialogic functions. As for hardware, allowing an independent range of action and sufficient degrees of freedom reduces the burden of human-robot communication, thereby unburdening nurses and professional caregivers. Furthermore, it is critical to develop a friendlier type of robot by equipping it with non-verbal emotive expressions that older people can perceive. If these functions are conjoined, anthropomorphic intelligent robots could serve as instructors, particularly for rehabilitation and recreation activities of older people. In this way, more than ever before, HNRs will play an active role in healthcare and welfare fields.

**Keywords:** issues, conversation, dialog, humanoid nursing partner robots (HNRs), long-term care

#### **1. Introduction**

The healthcare demands of the increasing older adult population are a significant concern in Japan and other developed countries [1]. This concern is compounded by the decreasing number of healthcare workers, who are themselves getting older, resulting in high turnover rates among healthcare workers [2–4]. In this situation, it is appropriate to consider the use of healthcare robots, which are increasingly recognized as a potential solution to meet the care demands of older persons as well as of patients with mental illness [5].

"What are the prominent areas of concern to support older persons when using healthcare robots?" and "What are the barriers to introducing Humanoid Nursing partner Robots (HNRs) to hospitals or elderly institutions?" Similarly, another question may be, "Which of these types of robots are needed for nursing care, the anthropomorphic or non-anthropomorphic robots?"

Different technological requirements can dictate whether anthropomorphic or non-anthropomorphic robots are needed. For example, if the technological demand is for measuring blood pressure and body temperature, an anthropomorphic machine may not be necessary; currently, non-robotic technologies detect and retrieve this information with digital hand-held devices [6]. However, anthropomorphic robots may be necessary when a conversation is expected, particularly during a dialog with older persons while taking blood pressure and other vital signs, much as a human nurse does today. In addition, the following distinctive questions are asked: "What nursing care tasks can be programmed specifically for anthropomorphic nursing partner robots?" and "What are the core competencies that only nurses and professional caregivers can perform?" Reflecting on these questions, it is essential to establish a field of Robot Nursing science, developed by nurse scientists from the perspective of a unique ontology of nursing, and designed from a foundation of robotics engineering, computer science, and nursing science. The expectation is to develop a knowledge base for robot nursing science as a foundation for the practice of nursing that uniquely embraces anthropomorphic robot realities, particularly in demanding precise conversational capabilities. These realizations were illuminated through the posited questions, from which the answers may further the development of robot nursing science and its practice.

The aim of this chapter is to explore the issues concerning the development of dialog robots for nursing, especially for long-term care, and the prospects for introducing HNRs into nursing practice.

#### **2. Development of robots capable of compassionate conversations**

#### **2.1 Concerns regarding human-robot conversation capabilities**

As an issue for humanoid robot verbalization, the robot voice should have appropriate intonation, speech speed, and a voice range that is easy for older persons to hear [7]. If a cute-looking robot speaks in a low-pitched voice similar to that of an adult male, the user may find it creepy [8]. Therefore, it is necessary to design humanoid robot voices that are easy for older persons to hear and that convey a sense of familiarity.

Challenges to developing robot-nursing science are realistic. These challenges highlight the necessity to promote research with the goal of systematizing technological competencies, ethical thinking, safety measures, and outcomes of using robots in nursing settings. With new devices and technologies developed by engineers, introduced and used in nursing care, robot nursing science can only develop with an ontology of nursing at its core. Healthcare robots are increasingly perceived as nursing partners in practice. Human caring expressed in human-to-human relationships, and between humans and nonhumans, is part of the futuristic vision of healthcare with humanoid robots as main protagonists.

#### **2.2 Development of caring dialogical database of humanoid nursing partner robots and older persons**

*Issues in the Development of Conversation Dialog for Humanoid Nursing Partner Robots… DOI: http://dx.doi.org/10.5772/intechopen.99062*

Full and effective use of robots by nurses and healthcare providers would lead to a better understanding of patients and their needs. Thus, it is necessary to develop a "Caring Dialog Database" for HNRs in order to enhance robot capabilities to know the patient/client, and to share the expressions of human-robot interactions in esthetic ways. Furthermore, it is important to develop dialog patterns that allow humanoid robots to empathize with older persons [9]. The ability to empathize, and to communicate accurate empathy, is likely to enhance the older person's feeling of being cared for through HNR actions such as: 1) Listening attentively to and accepting older persons; 2) Knowing older persons intentionally; and 3) Establishing appropriate caring dialog.

#### **2.3 Robotics and artificial intelligence**

Robotics and Artificial Intelligence (AI) will become a predominant aspect of healthcare and welfare settings. Human caring has traditionally been based on a human-to-human relationship. However, in the nonhuman-to-human relationship introduced by HNRs, it is essential to consider what is required with respect to ethical concerns and human safety. Regarding redefinitions of nursing and its underlying beliefs, values, and assumptions, it is pertinent to understand the implications of AI and its role in HNRs in healthcare. Thus, robotics and AI, together with Natural Language Processing (NLP), will become a predominant aspect of healthcare and welfare settings, particularly among older persons [10].

The HNRs must perform appropriate empathic dialogs [9, 11] by accurately judging the person's facial, non-verbal, and language expressions using AI sensory tools [12].

For emotional recognition and non-verbal output, the required functions include: 1) Recognizing users' facial expressions; 2) Matching the expressions with emotion database information; 3) Selecting appropriate expressions from the emotion database; and 4) Conveying emotional expression through particular motions, for example, flashing lights or moving the upper limbs and head.
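The four functions above can be sketched as a minimal lookup pipeline. This is an illustrative sketch only: the emotion database entries and motion names below are hypothetical placeholders, not part of any actual HNR platform.

```python
# Hedged sketch of the emotion-recognition / non-verbal-output loop.
# The emotion database and motion repertoire are illustrative assumptions.

EMOTION_DB = {            # step 2: facial expression -> emotion label
    "smile": "joy",
    "frown": "sadness",
    "wide_eyes": "surprise",
}

MOTION_DB = {             # step 3: emotion -> non-verbal expression
    "joy": ["flash_light_green", "raise_upper_limbs"],
    "sadness": ["dim_light", "lower_head"],
    "surprise": ["flash_light_yellow", "tilt_head"],
}

def nonverbal_response(facial_expression: str) -> list[str]:
    """Steps 1-4: recognize, match, select, and convey an emotion."""
    emotion = EMOTION_DB.get(facial_expression, "neutral")   # steps 1-2
    return MOTION_DB.get(emotion, ["nod"])                   # steps 3-4

print(nonverbal_response("frown"))  # motions conveying sadness
```

In practice, step 1 would be backed by a facial-expression recognizer rather than a string key, but the match-select-convey flow is the same.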

Furthermore, a robot's recognition of, and verbal response to, the voices of other persons (e.g., older persons) can include: 1) Voice recognition; 2) Text conversion by NLP; 3) Matching with the NLP database for an appropriate response; 4) Speech synthesis; and 5) Vocalization.
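A text-only sketch of this five-step pipeline, assuming each stage is a mocked black box (a real system would use speech recognition and speech synthesis engines; the keyword table here is a hypothetical stand-in for the NLP database):

```python
# Hedged, text-only sketch of the five-step verbal pipeline.
# Each stage is mocked; only the data flow between steps is illustrated.

NLP_DB = {                     # step 3: keyword -> appropriate response
    "pain": "Where does it hurt? Please tell me more.",
    "lonely": "I am here with you. Shall we talk for a while?",
}

def recognize_voice(audio: bytes) -> str:     # steps 1-2: recognition + text
    return audio.decode("utf-8")              # mocked: audio already holds text

def match_response(text: str) -> str:         # step 3: NLP database matching
    for keyword, response in NLP_DB.items():
        if keyword in text:
            return response
    return "I see. Please go on."             # fallback when nothing matches

def synthesize_and_vocalize(sentence: str) -> str:  # steps 4-5: synthesis + voice
    return f"[spoken] {sentence}"

utterance = recognize_voice(b"I feel lonely today")
print(synthesize_and_vocalize(match_response(utterance)))
```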

#### *Information Systems - Intelligent Information Processing Systems, Natural Language Processing...*

**Figure 1.** *Pepper interacting with older persons.*

In conversations between a robot and an older person, the robot is expected to provide an accurate empathic response according to the situation. If an older person says, "I want to eat sushi!", but the humanoid robot responds with, "I cannot eat because I am a robot", this is unlikely to engage older persons with the robot, because such a response does not demonstrate the robot's empathic understanding. However, if the humanoid robot responds, "I am a robot, but I would like to try eating sushi. Tell me, what does it taste like?", this answer is likely to engage older persons because it conveys a feeling of understanding and empathy. If HNRs have this empathic response competency, older persons can attain well-being through the content of dialogs and conversations with robots such as the Pepper robot (**Figure 1**).
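The contrast above suggests a simple response-selection rule: mirror the user's expressed wish back as a question instead of stating the robot's limitation. A minimal sketch, with the pattern and phrasing as illustrative assumptions:

```python
import re

def empathic_reply(user_utterance: str) -> str:
    # Pattern-match "I want to eat X" and mirror the topic back as a
    # question, instead of answering with the robot's own limitation.
    m = re.search(r"i want to eat (\w+)", user_utterance.lower())
    if m:
        food = m.group(1)
        return (f"I am a robot, but I would like to try eating {food}. "
                f"Tell me, what does it taste like?")
    return "That sounds interesting. Please tell me more."

print(empathic_reply("I want to eat sushi!"))
```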

### **3. Required conversation functions for humanoid nursing partner robots**

This section discusses the requirements for HNRs to hold a two-way conversation (dialog) with the user. HNRs should comprehend the content of the user's remarks and the intention behind them, including emotions. From information on speech, paralanguage, and appearance, such as facial expressions and gestures, HNRs can present listening postures to the user, using an appropriate line of sight and nods to signify the appropriateness of the response to the user's remarks (**Figure 2**). These functionalities might facilitate the user's active speech engagement with the HNRs. Furthermore, when HNRs return appropriate responses based on the user's remarks and the content they understand from non-verbal information, the user can feel that the HNRs are listening to and understanding the dialog/conversation, and may feel satisfied with the content of the interactive dialog.

As examples of appropriate responses by HNRs, methods such as repeating a keyword from the user's remark, or offering a topic related to that keyword, can be considered. Further, the voice and movement of the HNR during its response also affect the impression of the dialog (**Figure 3**).

For example, when a user speaks about a sad topic or shows a sad expression, the HNR needs to respond with a sentence and bodily behavior conveying compassion, comfort, and encouragement. Moreover, it is important that the robot's voice carry a tone of artificial compassion that matches the response sentence and accurately delivers the humanoid robot's intention to the user [13, 14]. A humanoid robot response matching the user's emotions may contribute to an expression of artificial sympathy, thereby enhancing empathic expressions for the user.

#### **Figure 2.**

*An example of the required performance of the application for dialog/conversation with older persons (the robot's receiver function).*

#### **Figure 3.**

*An example of the required performance of the application for dialog/conversation with older persons (the robot's sender function).*

In current facial expression recognition technology, reading facial expressions more accurately by recognizing a human face in three dimensions (3D) is being studied [15, 16]. Research is also being conducted on robot capabilities for assessing human emotions not only from the movements of the eyes and mouth, but also from the movements of the facial muscles and upper body [17]. Regarding the response ability of humanoid robots, development is at the stage of creating appropriate response sentences, including examining response vocalization intervals that do not cause discomfort, and implementing, verifying, and improving them on robots [18–21].

The required performance of the application for interacting with older persons includes a function that allows HNRs to speak according to the user's remarks, rather than requiring the user to converse according to the robot's remarks. Alternatively, it is necessary to devise a way to give the user the feeling of having a dialog, by making the user feel as if the HNR understands the user's remarks and intentions. As reception functions of the application, in addition to technology to accurately read the content of remarks from the user's voice, technology to read the user's situation from information other than voice, such as facial expressions and gestures, is required. Furthermore, as transmission functions, expressive functions that match the response sentence, vocalization, and actions to the user's remarks and emotions are required.

#### **4. How to develop a robot that can express verbal and non-verbal expressions**

For a robot to convey verbal and non-verbal expressions like those of a human, it needs receptors equivalent to human sensory receptors for sight [22, 23], smell [24, 25], and touch [26]. The information detected by these receptors is then entered into the system. As such, it is necessary to perform machine learning based on the input information and prepare to output verbal and non-verbal expressions [27, 28]. Additionally, the robot needs to be able to perform the same movements and expressions as humans do in terms of its output [29, 30]. However, depending on how similar the robot's expression is to human behavior, it may fall into the uncanny valley [31], in which human beings find it creepy at some points, thereby influencing how they respond to it.

#### **4.1 Anthropomorphic form necessary in human-robot conversation**

The anthropomorphic form is necessary when an HNR is expected to talk with older persons or take vital signs as a human nurse does. The physiognomy of an HNR is a major determining factor in how efficiently human beings respond in human-robot transactive engagements. Rather than the "Uncanny Valley" dominating robot communication, it is the human-HNR interactive 'fit', or congruence, that may be better appreciated by human persons when HNRs are appropriately matched in appearance. Accurate and appropriate conversational capabilities further the appropriateness of HNR responses, which depend largely on conversational communications that can easily be influenced by artificial affective communication [8].

In a pleasant conversation with HNRs, human beings have a sense of affinity (Shinwa-kan) with them, appreciating, for example, their cuteness and expressions of fun. However, human beings are also disappointed when robots have poor conversational competencies, and they may feel fear, misunderstanding, and confusion depending on deviations in the content of the conversational language.

#### **4.2 Roles and functions of humanoid nursing partner robots**

Nevertheless, as companions in patient care, HNRs should assume multiple roles, including being healthcare assistants that help with task completion. It is necessary for HNRs to possess the ability to express artificial emotions through linguistically appropriate and accurate communication processes, including non-verbal expressions with autonomous bodily movements. It is also critical that the appearance of HNRs be familiar, relatable, and non-intimidating [32], and that it not cause human emotional unease and discomfort such as fear, anxiety, and suspicion, since an overly human-like appearance of HNRs can lead to resistance [33].

One of the essential attributes proposed for HNRs is Artificial Affective Communication (AAC) [8]. With AAC (**Figure 4**) accentuating the significance of language in human-robot interaction, not only will the physiognomies of robots impact HNRs' value, but also their capability to communicate with AI for NLP. By communicating with artificial affection, instilled with phonology and applied appropriately by mimicking human interactions through human features and elements designed with social and cultural nuances, transactive engagements with human beings may be made more valuable and meaningful for human healthcare practice.

#### **Figure 4.**

*Human Emotive Impressions from Conversations During Human-robot Interaction using Artificial Affective Communication (from Ref. [8]).*

#### **5. An example of conversation with older persons and Pepper**

Issues in conversation with Pepper include expressions such as robot gaze [34], eye blink synchrony [35], eye contact [36], and speech [37]. One issue between older persons and Pepper, using the conversation function of the application "Kenkou-oukoku TALK for Pepper [38]", is vocalization with little intonation. This characteristic makes it difficult for older persons to tell whether Pepper's sentence is interrogative or declarative. Similarly, older persons found it difficult to recognize the end of a sentence in Pepper's talk. The pitch of Pepper's voice is high and difficult to hear. Likewise, Pepper's sensors may not register the correct meaning of a sentence because of an older person's soft voice or use of a dialect. If the contents of the conversation cannot be recognized, Pepper may interrupt the conversation or suddenly change the topic, which may offend older persons. Therefore, many situations exist in which the contents of the dialog do not match. In its current performance, Pepper changed the topic while the user was still thinking about the answer to Pepper's question [39]. In addition, an operational issue of Pepper is its line of sight. If its line of sight deviates from the person talking during Pepper's dialog program, Pepper will proceed with the conversation while recognizing other objects around it, thereby losing its line of sight. **Figure 5** presents the robot's line of sight.

The role of an intermediary to support the conversation between older persons and Pepper is important [40]. In the current conversation with Pepper, the user must adapt to Pepper's utterances. In this case, older persons are expected to listen to Pepper's talk instead of doing all the talking, and they must have the cognitive responsiveness to keep up while talking with Pepper. Training to respond quickly and accurately to Pepper's questions may therefore be useful as rehabilitation for cognitive function.

**Figure 5.** *Pepper's line of visualization.*

Furthermore, to use Pepper's current conversation application for the cognitive rehabilitation of older persons, researchers propose a method in which older persons play the role of listeners. This role might be useful for training, as they can concentrate on listening to the speaker's utterances, understand the content of the conversation, and convey their personal feelings to the other person. When the conversation with Pepper is over, if the intermediary instructs older persons to recollect the conversation content, this process may lead to memory maintenance and confirmation, and to training of information-processing functions in older people.

As a means of improving the Pepper robot application, it is desirable that conversation not be one-way, with older persons merely adapting to Pepper's utterances. Moreover, it is necessary to improve the conversation performance of the application so that older persons can enjoy talking with the robot for a long time. Thus, the following need improvement: (1) Timing of talk responses; (2) Talk content that matches the situation; (3) Appropriate reactions to the user's speech; (4) Functions for making proper eye contact with the user; and (5) Functions for reacting to users with non-verbal expressions. These functions are considered to assure the accomplishment of mutual conversation.

In order to solve the problem of line of sight, it is necessary to enable robots to express verbal and non-verbal expressions at the same level as human beings. At present, robots are merely showing artificial verbal and non-verbal expressions learned through machine learning [41]. Advanced intelligence is required to express verbal and non-verbal expressions by incorporating artificial thinking, mind, and compassion [42]. Therefore, it is necessary to give the computer an artificial self [43].

Demands for quality nursing care and household responsibilities may be successfully met through the anticipated automation and robotization of work activities via AI and other technological advancements [44]. AI has become the latest "buzzword" in industry today. To date, no AI machine is able to 'learn' collective tacit knowledge. AI applies supervised learning and needs a great deal of data to do so. Humans learn in a 'self-supervised' way: they observe the world and figure out how it works. Humans need less data because they can understand facts and interpret them using metaphors, and they can transfer their abilities from one brain path to another. These are skills that AI will need if it is to progress to human intelligence [45].

#### **6. Dialog systems**

Dialog systems can be classified into task-oriented and non-task-oriented systems. A task-oriented dialog system [46] performs the dialog necessary to achieve the user's demands, while a non-task-oriented dialog system [47] aims to continue the dialog itself. In order to continue a dialog, a system must be able to handle non-task-oriented dialog [48]. Currently, one robot can converse with only one person at a time; however, for a high-performance humanoid robot in the future, it is desirable that one robot can hold a dialog with three people.
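The task/non-task distinction can be sketched as a simple dispatcher. Keyword-based task detection is a deliberate simplification here; a real system would use intent classification:

```python
# Hedged sketch of the two dialog-system classes described above.
# The task keywords and canned replies are illustrative assumptions.

TASK_KEYWORDS = {"measure", "medicine", "call"}   # demands the robot can fulfil

def classify(utterance: str) -> str:
    """Label an utterance as 'task' (task-oriented) or 'chat' (non-task)."""
    words = set(utterance.lower().split())
    return "task" if words & TASK_KEYWORDS else "chat"

def respond(utterance: str) -> str:
    if classify(utterance) == "task":
        return "Understood. I will start the task now."   # achieve the demand
    return "I see. What happened next?"                   # keep the dialog going

print(respond("Please measure my blood pressure"))
print(respond("I talked with my grandchild yesterday"))
```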

Regarding the initiative in dialog, in the case of HNRs, the nurse and caregiver hold the initiative. Dialog systems are also distinguished by whether or not they have a physical body. For example, Siri does not have a body, whereas AI speakers and communication robots do. In this book, HNRs having physicality are premised on mounted AI dialog-processing technology.

#### **Figure 6.**

*Dialog processing mechanism using natural language processing.*

As for learning methods and modality, in the case of a voice dialog robot, learning from multiple sources of information (multimodal learning) [49] is needed. For example, information on a dialog between a skilled nurse and a care recipient is recorded simultaneously, and the AI then learns the motions and biological data acquired from the moving images and sensors as multimodal information.
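Multimodal learning of this kind can be sketched with a toy corpus of feature records labeled by a skilled nurse, and a nearest-centroid predictor standing in for the learning model. The feature names and care labels below are illustrative assumptions, not from any real corpus:

```python
# Hedged sketch: multimodal records (speech rate + gesture amount) with
# nurse-assigned labels, and a trivial nearest-centroid "model".
import math

CORPUS = [
    ({"speech_rate": 0.2, "gesture": 0.1}, "needs_encouragement"),
    ({"speech_rate": 0.3, "gesture": 0.2}, "needs_encouragement"),
    ({"speech_rate": 0.8, "gesture": 0.9}, "engaged"),
    ({"speech_rate": 0.9, "gesture": 0.7}, "engaged"),
]

def centroid(records):
    keys = records[0].keys()
    return {k: sum(r[k] for r in records) / len(records) for k in keys}

# "Training": one centroid per nurse-assigned care label
CENTROIDS = {
    label: centroid([f for f, lab in CORPUS if lab == label])
    for label in {lab for _, lab in CORPUS}
}

def predict_care_label(features: dict) -> str:
    """Assign the label whose centroid is closest to the new observation."""
    def dist(c):
        return math.sqrt(sum((features[k] - c[k]) ** 2 for k in c))
    return min(CENTROIDS, key=lambda lab: dist(CENTROIDS[lab]))

print(predict_care_label({"speech_rate": 0.25, "gesture": 0.15}))
```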

The following are the steps involved in the HNR's generation of a response sentence containing emotions in response to the patient's speech. These steps are illustrated in **Figure 6**.



To synthesize emotions, expressions in the emotion expression dictionary that are semantically similar to those in the response sentence are found and replaced with synonymous emotional expressions in accordance with the patient's emotions. If the patient has expressed a feeling of "sadness", the expressions in the response sentence are replaced by expressions that suit that feeling.
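This replacement step can be sketched as a dictionary lookup, where a miniature, hypothetical emotion expression dictionary maps neutral expressions to emotion-suited synonyms:

```python
# Hedged sketch of emotion synthesis by synonym replacement.
# The dictionary entries below are an illustrative miniature.

EMOTION_EXPR_DICT = {
    # neutral expression -> {patient emotion: synonymous emotional expression}
    "I understand": {"sadness": "I am so sorry to hear that",
                     "joy": "That is wonderful to hear"},
    "goodbye": {"sadness": "take care of yourself",
                "joy": "see you again soon"},
}

def synthesize_emotion(response: str, patient_emotion: str) -> str:
    """Replace semantically similar expressions with ones suiting the emotion."""
    for neutral, variants in EMOTION_EXPR_DICT.items():
        if neutral in response and patient_emotion in variants:
            response = response.replace(neutral, variants[patient_emotion])
    return response

print(synthesize_emotion("I understand. goodbye", "sadness"))
# -> "I am so sorry to hear that. take care of yourself"
```

A full implementation would match by semantic similarity (e.g., word embeddings) rather than exact substrings.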

6. Lastly, a sentence is generated. Here, the response sentence is synthesized with artificial emotions and corrected if it contains unnatural elements or does not match the context of the preceding sentence. Moreover, sentences that evoke a speech response are added so as not to stop the conversation with the patient.

This system requires speech act type estimation, topic estimation, concept extraction, and frame processing. Few studies have experimented with and applied interactive processing using machine learning at a practical level in fields such as long-term care. The likely causes are the difficulty of responding to situations that do not occur in normal conversation, of acquiring utterances, and of predicting emotions in special dialog situations such as long-term care.

In order to incorporate the tacit knowledge of nursing/long-term care into AI as explicit knowledge, we will record multimodal dialog data at nursing/long-term care sites. A multimodal nursing/long-term care corpus is constructed by having skilled human nurses label the data. Based on this corpus, we will develop a machine learning method for predicting care labels from multimodal information. It is important to evaluate and tune the model with the goal of producing labels with high prediction accuracy. An important point for future development may be to adapt the current mainstream methods to long-term care dialog with older persons and perform dialog processing accordingly. The following problems are expected when the target holds a natural dialog with a dialog system built only by adapting current mainstream methods.


We also believe it necessary to build a language model suitable for caregiving dialog with older persons for use in speech recognition, by collecting and analyzing an audio-visual corpus of long-term care dialogs. Not only the language model but also the acoustic model must be tailored to the nursing care setting and to the speech of older persons.

As HNRs learn to perform nursing functions, such as ambulation support, vital sign measurement, medication administration, and infectious disease protocols, the role of nurses in care delivery will change [52].

Hamstra [53] argued:

*With less burden on nurses and improved quality of care for patients, collaboration with nurse robots will improve current trends of nursing shortages and unsafe patient ratios.*

*While there is much to be excited about, there are aspects of nursing that cannot be replaced by robots. Nurses' ability to understand the context, interpret hidden emotions, recognize implications, reflect empathy, and act on intuition are innate human skills that drive success as nurses.*

It is not yet obvious whether robotics can parallel these characteristics, as research on these topics is still ongoing. It is also important to remember that high-ability humanoid robots that can function much like a human being had not been developed as of 2020. Such robots are still in the developmental stage and cannot yet be used as autonomous, functional work robots. Therefore, intermediaries such as healthcare providers currently play critical roles in the transactive relations between robots and older persons.

Transactive is a term focusing on the transactional nature of things [54]. As an active process, it illuminates the main feature of the relationship among humans, and between humans and intelligent machines, which is always a transaction. The term thus illuminates the relationship between HNRs and human persons. **Figure 7** shows a transactive relationship among older persons, an occupational therapist as intermediary, and Pepper.

**Figure 7.** *A transactive relationship among older persons, intermediary, and Pepper.*

#### **7. Prospects for the introduction of humanoid nursing partner robots in clinical nursing practice**

Since the AI and robots used for nursing and long-term care are diverse, it is important to clarify what AI is, what robots are used for in nursing and long-term care, how to use them, and how to apply them to nursing care. Therefore, conducting research on this topic is important.

In addition, we will search for solutions in clinical nursing settings and clarify the performance and functions required of AI. From the perspective of nursing and medical care, it is important to establish academic disciplines that explore these questions in collaboration with AI and robot developers from engineering faculties. For example, consider how much, and what kind of, nursing work a healthcare robot can perform. Looking at the current situation, various robots that do not have human shapes have already "invaded" the medical world: robots for improving efficiency and accuracy, such as surgical robots, robots that support dispensing operations, and robots that assist caregivers by providing transfer and bathing support.

In the future, one can imagine a robot that scans the course of blood vessels and performs injections using programmed injection techniques. It may be able to secure blood vessels for intravenous injection more safely than nurses and doctors can. However, what should we do to ensure safety when such a robot breaks down or goes out of control? Like a human, it must be able to judge whether to continue puncturing.

Will humans take on the role of monitoring robots in the relationship between robots and humans (nurses)? Will nurses add the role of monitoring robots to keep patients safe? Alternatively, just as computer engineers were stationed in hospitals when computers were introduced, will robot-dedicated engineers be stationed there as well?

#### **8. Conclusion**

The purpose of this chapter was to explore the issues in developing conversational dialog for HNRs in nursing, especially in long-term care, and to forecast the introduction of these robots into clinical practice. The major issues concern HNR verbalization, including inappropriate intonation, voice range, and speech speed. These issues pose challenges to introducing HNRs into clinical practice. In order for robots to meet the demands and situations of nursing and healthcare, it is essential to improve the conversation functions and performance of HNRs, such as the ability to produce appropriate verbal and non-verbal expressions. To meet this challenge, collaboration between nursing researchers and AI and machine developers is recommended.

#### **Acknowledgements**

This work was partially supported by JSPS KAKENHI Grant Number JP17H01609. We would like to express our sincere appreciation to all the patients and participants who contributed to this article. With many thanks to Dr. Savina Schoenhofer for reviewing this chapter prior to its publication.

#### **Conflict of interest**

The authors have no conflicts of interest directly relevant to the content of this article.


### **Author details**

Tetsuya Tanioka<sup>1</sup>\*, Feni Betriana<sup>1</sup>, Ryuichi Tanioka<sup>1</sup>, Yuki Oobayashi<sup>1</sup>, Kazuyuki Matsumoto<sup>1</sup>, Yoshihiro Kai<sup>2</sup>, Misao Miyagawa<sup>3</sup> and Rozzano Locsin<sup>1</sup>

1 Tokushima University, Tokushima, Japan

2 Tokai University, Kanagawa, Japan

3 Tokushima Bunri University, Tokushima, Japan

\*Address all correspondence to: tanioka.tetsuya@tokushima-u.ac.jp

© 2021 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

### **References**

[1] Muramatsu N, Akiyama H: Japan: super-aging society preparing for the future. Gerontologist. 2011;51(4):425-432. DOI: 10.1093/geront/gnr067

[2] Buchan J, Aiken L: Solving nursing shortages: a common priority. J Clin Nurs. 2008;17(24):3262-3268. DOI: 10.1111/j.1365-2702.2008.02636.x

[3] Murray MK: The nursing shortage. Past, present, and future. J Nurs Adm. 2002;32(2):79-84. DOI: 10.1097/ 00005110-200202000-00005

[4] World Health Organization. Global strategy on human resources for health: Workforce 2030 [Internet]. 2016. Available from: https://apps.who.int/iris/bitstream/handle/10665/250368/9789241511131-eng.pdf;jsessionid=F4F54C19AA58FCDA94DE34746F6DA886?sequence=1 [Accessed: 2020-11-24]

[5] Stuck RE, Rogers WA. Understanding older adult's perceptions of factors that support trust in human and robot care providers. In: Proceedings of the 10th International Conference on Pervasive Technologies Related to Assistive Environments, PETRA 2017; 21-23 June 2017; Island of Rhodes, Greece. New York, NY, USA: ACM, p.372-377; 2017. DOI: 10.1145/3056540.3076186

[6] Kakria P, Tripathi NK, Kitipawang P: A Real-Time Health Monitoring System for Remote Cardiac Patients Using Smartphone and Wearable Sensors. Int J Telemed Appl. 2015;373474. DOI: 10.1155/2015/373474

[7] Law M, Sutherland C, Ahn HS, et al.: Developing assistive robots for people with mild cognitive impairment and mild dementia: a qualitative study with older adults and experts in aged care. BMJ Open. 2019;9(9):e031937. DOI: 10.1136/bmjopen-2019-031937

[8] Betriana F, Osaka K, Matsumoto K, Tanioka T, Locsin RC: Relating Mori's Uncanny Valley in generating conversations with artificial affective communication and natural language processing. Nurs Philos. 2021;22(2):e12322. DOI: 10.1111/nup.12322

[9] Pepito JA, Ito H, Betriana F, Tanioka T, Locsin RC: Intelligent humanoid robots expressing artificial humanlike empathy in nursing situations. Nurs Philos. 2020;21:e12318. DOI: 10.1111/nup.12318

[10] Pou-Prom C, Raimondo S, Rudzicz F: A Conversational Robot for Older Adults with Alzheimer's Disease. ACM Trans Hum-Robot Interact. 2020;9(3):Article 21. DOI: 10.1145/3380785

[11] Nocentini O, Fiorini L, Acerbi G, Sorrentino A, Mancioppi G, Cavallo F: A Survey of Behavioral Models for Social Robots. Robotics. 2019,8(3),54. DOI: https://doi.org/10.3390/robotics8030054

[12] Behera A, Matthew P, Keidel A, et al.: Associating Facial Expressions and Upper-Body Gestures with Learning Tasks for Enhancing Intelligent Tutoring Systems. Int J Artif Intell Educ. 2020;30:236-270. DOI: 10.1007/s40593-020-00195-2

[13] Ray A. Compassionate Artificial Intelligence. Compassionate AI Lab; 2018. 258p. ISBN-10: 9382123466

[14] Kerasidou A: Artificial intelligence and the ongoing need for empathy, compassion and trust in healthcare. Bull World Health Organ. 2020;98(4):245-250. DOI: 10.2471/BLT.19.237198

[15] Tornincasa S, Vezzetti E, Moos S, et al.: 3D Facial Action Units and Expression Recognition using a Crisp Logic. Computer-Aided Design & Applications. 2019;16(2):256-268. DOI: 10.14733/cadaps.2019.256-268


[16] Agbolade O, Nazri A, Yaakob R, Ghani AA, Cheah YK: 3-Dimensional facial expression recognition in human using multi-points warping. BMC Bioinformatics. 2019;20(1):619. DOI: 10.1186/s12859-019-3153-2

[17] Vithanawasam TMW, Madhusanka BGDA: Dynamic Face and Upper-Body Emotion Recognition for Service Robots. In: Proceedings of the 2018 IEEE/ACIS 17th International Conference on Computer and Information Science (ICIS); 6-8 June 2018; Singapore; 2018, p.428-432. DOI: 10.1109/ICIS.2018.8466505

[18] Milhorat P, Lala D: A Conversational Dialogue Manager for the Humanoid Robot ERICA. Advanced Social Interaction with Agents. 2018;119-131.

[19] Bilac M, Chamoux M, Lim A. Gaze and filled pause detection for smooth human-robot conversations. In Proceedings of the 2017 IEEE-RAS 17th International Conference on Humanoid Robotics; (Humanoids), 15-17 November 2017; Birmingham, UK; p.297-304. DOI: 10.1109/HUMANOIDS.2017.8246889

[20] Aoyagi S, Hirata K, Sato-Shimokawara E, Yamaguchi T. A Method to Obtain Seasonal Information for Smooth Communication Between Human and Chat Robot. In 2018 Joint 10th International Conference on Soft Computing and Intelligent Systems (SCIS) and 19th International Symposium on Advanced Intelligent Systems (ISIS), 2018; Toyama, Japan; p.1121-1126. DOI: 10.1109/SCIS-ISIS.2018.00176

[21] Lala D, Nakamura S, Kawahara T. Analysis of Effect and Timing of Fillers in Natural Turn-Taking. In Proceedings of the Interspeech 2019, 15-19 September 2019; Graz, Austria; p. 4175-4179. DOI: 10.21437/Interspeech.2019-1527

[22] Simul NS, Ara NM, Islam MS. A support vector machine approach for real time vision based human robot interaction. In Proceedings of 19th International Conference on Computer and Information Technology (ICCIT), 2016; Dhaka, Bangladesh; p.496-500. DOI: 10.1109/ICCITECHN.2016. 7860248

[23] Liu Z, Wu M, Cao W, et al.: A Facial Expression Emotion Recognition Based Human-robot Interaction System. IEEE/ CAA Journal of Automatica Sinica. 2017; 4(4):668-676.

[24] Miwa H, Umetsu T, Takanishi A, Takanohu H. Human-like robot head that has olfactory sensation and facial color expression. In Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation; 21-26 May 2001; Seoul, South Korea. p.459-464. DOI: 10.1109/ROBOT.2001.932593

[25] Coradeschi S, Ishiguro H, Asada M, Shapiro SC, Thielscher M, Ishida H: Human-Inspired Robots. IEEE Intelligent Systems. 2006; 21(4):74-85. DOI: 10.1109/MIS.2006.72

[26] Martinez-Hernandez U, Prescott TJ. Expressive touch: Control of robot emotional expression by touch. In Proceedings of 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). 26-31 August 2016; New York. New York: IEEE; 2016. p.974-979. DOI: 10.1109/ROMAN.2016.7745227

[27] Li Y, Hashimoto M. Effect of emotional synchronization using facial expression recognition in human-robot communication. In Proceedings of 2011 IEEE International Conference on Robotics and Biomimetics. 7-11 December 2011; Karon Beach, Phuket. New York: IEEE; 2011. p.2872-2877. DOI: 10.1109/ROBIO.2011.6181741

[28] Yoon Y, Ko WR, Jang M, Lee J, Kim J, Lee G. Robots Learn Social Skills: End-to-End Learning of Co-Speech Gesture Generation for Humanoid Robots. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA). 20-24 May 2019; Montreal; New York: IEEE; 2019. p.4303-4309. DOI: 10.1109/ ICRA.2019.8793720

[29] Hua M, Shi F, Nan Y, Wang K, Chen H, Lian S. Towards More Realistic Human-Robot Conversation A Seq2Seqbased Body Gesture Interaction System. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS): 3-8 November 2019; Macau; New York: IEEE; 2019. p.247-252. DOI: 10.1109/ IROS40897.2019.8968038

[30] Salem M, Rohlfing K, Kopp S, Joublin F. A friendly gesture: Investigating the effect of multimodal robot behavior in human-robot interaction. In 2011 RO-MAN. 31July-3 August 2011; Atlanta; New York: IEEE; 2011. p.247-252. DOI: 10.1109/ ROMAN.2011.6005285

[31] Mori M: The Uncanny Valley. Energy. 1970;7:33-35 (in Japanese).

[32] Prakash A, Rogers WA: Why Some Humanoid Faces Are Perceived More Positively Than Others: Effects of Human-Likeness and Task. International Journal of Social Robotics. 2015;7(2):309-331. DOI: 10.1007/s12369-014-0269-4

[33] Müller BCN, Gao X, Nijssen SRR, Damen TGE: I, Robot: How Human Appearance and Mind Attribution Relate to the Perceived Danger of Robots. International Journal of Social Robotics. 2020. DOI: 10.1007/ s12369-020-00663-8

[34] Boucher JD, Pattacini U, Lelong A, et al.: I reach faster when I see you look: gaze effects in human-human and human-robot face-to-face cooperation. Frontiers in Neurorobotics. 2012;6(3). DOI: 10.3389/fnbot.2012.00003

[35] Tatsukawa K, Nakano T, Ishiguro H, Yoshikawa Y: Eyeblink synchrony in multimodal human-android interaction. Scientific Reports. 2016;6:39718. DOI: 10.1038/srep39718

[36] Xu T, Zhang H, Yu C: See you see me: the role of eye contact in multimodal human-robot interaction. ACM Trans Interact Intell Syst. 2016;6(1):2. DOI:10.1145/2882970

[37] Cid F, Moreno J, Bustos P, Núñez P: Muecas: a multi-sensor robotic head for affective human robot interaction and imitation. Sensors. 2014;14(5):7711- 7737. DOI: 10.3390/s140507711

[38] XING INC. Kenkou-oukoku TALK for Pepper [Internet]. Available from: https://roboapp.joysound.com/talk/ [Accessed 2020-10-2]

[39] Miyagawa M, Yasuhara Y, Tanioka T, et al.: The Optimization of Humanoid Robot's Dialog in Improving Communication between Humanoid Robot and Older Adults. Intelligent Control and Automation. 2019;10(3):118-127. DOI: 10.4236/ ica.2019.103008

[40] Osaka K, Sugimoto H, Tanioka T, et al.: Characteristics of a Transactive Phenomenon in Relationships among Older Adults with Dementia, Nurses as Intermediaries, and Communication Robot. Intelligent Control and Automation. 2017;8(2):111-125. DOI: 10.4236/ica.2017.82009

[41] Greeff J, Belpaeme T: Why Robots Should Be Social: Enhancing Machine Learning through Social Human-Robot Interaction. PLoS ONE. 2015;10(9):e0138061. DOI: 10.1371/ journal.pone.0138061

[42] Asada M: Development of artificial empathy. Neuroscience Research. 2015;90:41-50. DOI: 10.1016/j. neures.2014.12.002


[43] Endres LS: Personality engineering: Applying human personality theory to the design of artificial personalities. Advances in Human Factors/Ergonomics. 1995;20:477-482. DOI: 10.1016/S0921-2647(06)80262-5

[44] "Future of Work 2035: For Everyone to Shine" Panel. "Future of Work: 2035"-For Everyone to Shine-[Report] [Internet]. 2016. Available from: https:// www.mhlw.go.jp/file/06- Seisakujouhou-12600000- Seisakutoukatsukan/0000152705.pdf. [Accessed 2020-11-30]

[45] Badimo, KH. How Artificial Intelligence can help to address some of the limitations of knowledge management. 2019. Available from: https://www.linkedin.com/pulse/ how-artificial-intelligence-can-helpaddress-some-knowledge-badimo/ [Accessed 2020-11-30]

[46] Liao K, Liu Q, Wei Z, et al.: Task-oriented Dialogue System for Automatic Disease Diagnosis via Hierarchical Reinforcement Learning. arXiv preprint arXiv:2004.14254; 2020.

[47] Isoshima K, Hagiwara M. A Non-Task-Oriented Dialogue System Controlling the Utterance Length. In Proceedings-2018 Joint 10th International Conference on Soft Computing and Intelligent Systems and 19th International Symposium on Advanced Intelligent Systems (SCIS-ISIS 2018): 5-8 December 2018; Toyama, Japan. Institute of Electrical and Electronics Engineers Inc.; 2019. p.849-854. DOI: 10.1109/ SCIS-ISIS.2018.00140

[48] Zhou Y, Black AW, Rudnicky AI. Learning Conversational Systems that Interleave Task and Non-Task Content. In: Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI-17): 19-25 August 2017; Melbourne, Australia. International Joint Conferences on Artificial Intelligence; 2017. p.4214- 4220. https://www.ijcai.org/ Proceedings/2017/0589.pdf

[49] Fernández-Rodicio E, Castro-González Á, Alonso-Martín F, Maroto-Gómez M, Salichs MÁ: Modelling Multimodal Dialogues for Social Robots Using Communicative Acts. Sensors. 2020;20(12):3440. DOI: 10.3390/s20123440

[50] Khalil RA, Jones E, Babar MI, Jan T, Zafar MH, Alhussain T: Speech Emotion Recognition Using Deep Learning Techniques: A Review. IEEE Access. 2019;7,117327-117345. DOI: 10.1109/ ACCESS.2019.2936124

[51] Blei DM, Ng AY, Jordan MI: Latent Dirichlet Allocation. Journal of Machine Learning Research. 2003;3:993-1022.

[52] Robert N: How artificial intelligence is changing nursing. Nursing Management. 2019;50(9),30-39. DOI: 10.1097/01.NUMA.0000578988. 56622.21

[53] Hamstra, B. Will these nurse robots take your job? Don't freak out just yet [Internet]. 2020. Available from: https:// nurse.org/articles/nurse-robots-friendor-foe/ [Accessed: 2020-12-23]

[54] Tanioka T: The Development of the Transactive Relationship Theory of Nursing (TRETON): A Nursing Engagement Model for Persons and Humanoid Nursing Robots. Int J Nurs Clin Pract. 2017;4:223. DOI: https://doi. org/10.15344/2394-4978/2017/223

#### **Chapter 6**

## Robot Therapy Program for Patients with Dementia: Its Framework and Effectiveness

*Kyoko Osaka, Ryuichi Tanioka, Feni Betriana, Tetsuya Tanioka, Yoshihiro Kai and Rozzano C. Locsin*

#### **Abstract**

Robot therapy uses humanoid and animal-like robots. For older adults, robot therapy is expected to support therapeutic goals, including improving physical condition and cognitive function and providing joy. Through interactions with a humanoid or animal-like robot, such as hugging and stroking it, talking with it, and participating in activities involving the robot, older adults who are not physically active may improve their physical condition. Typical examples show that animal therapy and robot therapy have almost the same effectiveness among older people: like animal therapy, robot therapy can be expected to have a healing effect on patients, improve motivation for activity, and increase the amount of activity. Furthermore, the intermediary role of nurses in connecting the robot and older adults was found to be essential, even when the robot is not sophisticated enough to be useful as a humanoid nurse robot for rehabilitation and dialogue with older adults. Thus, robot therapy could be considered another important intervention in the challenging health and innovative care practices needed in the care of older persons. This chapter explains the robot therapy program for patients with dementia from the viewpoint of its framework and effectiveness.

**Keywords:** robot therapy, animal therapy, older adults, dementia, intermediary role

#### **1. Introduction**

In Japan, the number of older adults requiring medical and nursing care is increasing, constituting a super-aged society [1]. This trend is exacerbated by the decrease in the active working population and is accompanied by a declining birth rate [2]. This situation in Japan and other countries increases the demand for nursing care of older adults, including current and future rehabilitation services. As a result, nursing staff shortages are becoming more serious [3, 4]. Therefore, it is necessary to bridge the gap between human resources and the demand for healthcare services. In addition, the number of patients with dementia, who need more engaged medical and nursing care, is also increasing, especially among the older adult population [5].

There are many reports on the benefits of animal therapy, which began in the USA in the 1970s [6]. One study of patients with schizophrenia found that cortisol levels were significantly reduced after an animal-assisted therapy session, which could indicate that interaction with the therapy dogs reduced stress [7]. Another study, using actigraphy, reported that sleep duration (in minutes) increased when visitors were accompanied by a dog rather than by a robotic seal or a soft toy cat [8]. A further study reported that animal therapy is associated with decreased impulsivity, aggression, and anxiety, and increased sociability [9].

Meanwhile, several studies have suggested that the robots used in robot therapy can improve the cognitive level and reduce the Behavioral and Psychological Symptoms of Dementia (BPSD) in patients [10, 11]. However, none of these studies tested the robots on a large sample, so their findings have limited generalizability [12]. Yokoyama reported that the caregiver must play an intermediary role during robot therapy for older people with dementia [13]. For such therapy to be effective, it must be evaluated from the perspectives of both the user (an older adult with dementia) and the caregiver (a nurse or other professional caregiver).

Osaka and colleagues [14–16] analysed Heart Rate Variability (HRV) and accelerometer data in two-second increments and displayed the results in real time. Changes in autonomic activity and the intensity of physical exercise could thus be determined during the implementation of robot therapy. The sensor device was small and therefore imposed only a minimal burden on older persons. Another advantage was that it transmitted data to a computer wirelessly, allowing the subjects to move freely. An observer can also supplement the data by recording, through participant observation, the relationship between the older person and the caregiver during the intervention.

Few studies of robot therapy in older adults with dementia have assessed the intervention objectively and comprehensively by combining participant observation with HRV and accelerometer data. The study by Osaka et al. [14] is valuable in that it provides objective data to those involved in caring for older adults with dementia (caregivers and health care professionals involved in rehabilitation), allowing them to review their interventional approaches against a standard. Moreover, if low-cost robot therapy can be demonstrated to be effective, the study will offer valuable data for developing policies on cost-effective robotics in dementia care.

This chapter explains the robot therapy program for patients with dementia from the viewpoint of its framework and effectiveness.

#### **2. Definition of terms**

#### **2.1 Animal therapy**

The American Veterinary Medical Association (AVMA) defines Animal Assisted Therapy (AAT) as one form of Animal Assisted Intervention (AAI) [17]. Various animals are used for AAT, such as canines, felines, and equines, depending on the purpose of treatment, but the most frequently used animal is the dog [18]. For older people, AAI can be expected to suppress the decline in cognitive functions and to improve the peripheral symptoms (depression, agitation, aggression) and insomnia associated with dementia [10].

#### **2.2 Robot therapy**

*Robot Therapy Program for Patients with Dementia: Its Framework and Effectiveness DOI: http://dx.doi.org/10.5772/intechopen.96410*

The number of articles on robot therapy for rehabilitation or recreation that includes communication is increasing. Several definitions of a care robot have recently been offered [19], but there is no consensus about their findings, and the devices and applications in those studies have yet to be integrated into widespread clinical use [20]. Robot therapy functions include providing therapy, education, communication, and so on [21]. A pilot study showed that by interacting with Paro, a seal-like robot, older people improved their communication and interaction skills and their activity participation [22]. Another study reported that the use of Paro is associated with improved emotional state and social interaction and with reduced challenging behaviours among older people [23]. Research has shown that robot therapy has the same effect on people as animal therapy [21]. Both robot therapy and animal therapy influence the physical, cognitive, and mental conditions of users, especially older people.

#### **3. Theoretical framework**

The Model for the intermediary Role of nurses in Transactive relationships with Healthcare robots (MIRTH©) [24] explains the engagement processes that characterize the activities of older adults with dementia, the nurse as mediator, and the communication robot (**Figure 1**). Healthcare robots function in transactive relationships among patients and nurses. The nurses' role as intermediaries is integral to facilitating the interaction between these robots and the older adult patients in those transactive relationships. The effects of the intermediary role are especially prominent with the low-fidelity robots in use today. The functional abilities of the nurse as intermediary include knowledge of advancing robot technologies that foster quality care.

Nurses as intermediaries should: (1) have an accurate awareness of each of the functions of robot performance and the usefulness of each function relevant to patient care situations; (2) create relationships with healthcare robots so that they can promote the health and safety of older adults while increasing their enjoyment through physical and social activities; and (3) seek safe, secure, and competent ways to facilitate using healthcare robots for healthcare. In essence, intermediaries

**Figure 1.** *Illustration of MIRTH©.*

should prepare the environment for using healthcare robots. In doing so, older adults can use healthcare robots for complicated operations, with the nurse as intermediary monitoring the effective use of robots, identifying clinical problems, and working with healthcare institutions to address preventable healthcare problems.

The MIRTH© model has five assumptions:

It is the responsibility of nurses as professionals to practice nursing grounded in discipline-related knowledge of nursing. The most important attribute in nursing is the relationship expressed as caring. Robot performance requires an intermediary for effective and safe use [14, 15]. This assumption expresses the importance of the functions of intermediaries in robot-human situations. The intermediary is inextricably linked with the patient and the healthcare robot.


It is the responsibility of nurses as professionals to practice nursing grounded in discipline-related knowledge of nursing. The interactive engagement, the lived experience of caring between patients and nurses, gives meaning to the nursing relationship, the most important attribute in nursing.

#### **3.1 Framework for robot therapy program**

It has been reported that therapies using animals have a healing effect on patients and improve their motivation for performing activities [28, 29]. Park et al. performed a meta-analysis of animal-assisted and pet robot interventions, which suggested that AAI and Pet Robot Intervention (PRI) significantly reduce depression in patients with dementia. In that report, nine studies were analysed, and seven of them showed confirming results. The outcome measurements used scales such as functional tests and depression scales. In two of the studies, pulse oximetry, pulse rate, or galvanic skin response (GSR) was also evaluated as a physiological indicator [10].

Intervention therapies using animals for hospitalized patients are not uncommon. Studies by Osaka and others [14–16] suggest that robot therapy can be expected to have a greater healing effect on patients and to improve older people's motivation for activities, using robots ranging from an expensive humanoid robot such as Pepper to an inexpensive communication robot.

**Figure 2** shows Osaka's framework for the effectiveness of robot therapy. Robot therapy uses humanoid and animal-like robots and is expected to support therapeutic goals, including physical effects (e.g., relaxation, motivation), physiological effects (e.g., improvement of vital signs), and social effects (e.g., stimulation of communication among inpatients and caregivers) [30].

#### **Figure 2.**

*The framework for robot therapy program.*

Through interactions with a humanoid or animal-like robot, such as hugging and stroking it, talking with it, and participating in activities involving the robot, older adults who are not physically active may improve their physical condition [31].

The intermediary role of the nurse involves mediating and connecting patients with robots. It also involves attention to the ethical and moral issues inherent in nursing situations that include activities by healthcare robots [25]. Specifically, the intermediary was in charge of connecting the subject with the robot. An intermediary can support older persons in interacting well with the robot according to their physical condition, and can foster joy for older adults in interacting with the robot and with other persons. Moreover, through this interaction, the cognitive function of older persons with dementia may improve as they communicate or converse with robots [32]. The intermediary can also ask older adults whether they are having fun and whether they feel they have a companion in their daily life.

In Japan, various robots are produced and introduced for robot therapy in hospitals and other health care facilities. However, the performance and functions of these robots are often of lower fidelity and functionality than facilities expect, preventing their continued use after the initial introduction [33, 34]. Dialogue between healthcare robots and older adults is difficult without an intermediary, because the speed of the robot's speech and its tone of vocalization make it hard for older adults to understand [15]. Because of this robot inefficiency, it is often essential to institute the intermediary role of nurses to engage the robot with the older adults. This nursing role can enhance utility and instigate efficiency even if the robot is not sophisticated enough to be useful for rehabilitation and dialogue with older adults.

#### **4. Method**

#### **4.1 Subjects of the study**

The subjects of the study were two female older persons diagnosed with dementia using the Hasegawa's Dementia Scale-Revised (HDS-R) [35]. Both were in their 80s and met the inclusion criterion of a diagnosis of dementia with an HDS-R score in the range of 3-20 points. Exclusion criteria were older people who could not communicate verbally, those who could not interact with dogs and small stuffed-toy robots, those who could not wear a portable electrocardiograph, and those for whom consent could not be obtained from their families.


Data collection occurred on a single day, in a single observation period for each person; data collection for animal therapy was on October 10, 2017, and for robot therapy was October 25, 2019.

#### **4.2 Hypothesis**

The hypothesized prediction of the data comparison was that animal therapy and robot therapy have the same effects on the physical, mental, and cognitive functions of the older person. Data for each subject were extracted from the animal and robot therapy sessions, and the effects were compared. The data extraction methods are shown in **Figures 3** and **4**.

#### **4.3 Ethical consideration**

The data collection procedure was performed following the Private Information Protection Law, with approval from the Tokushima University Hospital Ethics Board (approval number 2039) and Mifune Hospital (approval number 20170201-1). The purpose and methods used in the study were explained to all subjects and their guardians. Subjects were assured that their personal information would be protected and would only be used for research purposes, and that anonymity would be maintained in the report.

#### **4.4 Data extraction method**

#### *4.4.1 Animal therapy protocol*

As described below, a total of 5 minutes of data was extracted before, during, and after the therapy (**Figure 3**). In the animal therapy, the subject was an older person with dementia who was admitted to the facility. Animal therapy was performed by a therapist after music therapy. The animal used was a dog.

#### *4.4.2 Robot therapy protocol*

As described below, robot therapy was set to extract a total of 5 minutes of data before, during, and after the therapy. However, due to data collection constraints, 5 minutes of data could not be obtained during and after the therapy (**Figure 4**). The subject of the robot therapy procedure was an older person with dementia. The robots used were Kabo-chan (W23 × H28 (sitting height) cm, weight 680 g) and Mi-chan (W25 × D20 × H30 (sitting height) cm, weight 390 g). These robots can talk, sing, and nod charmingly in response to touch and spoken words. Sensors installed in the mouth, head, hands, feet, and main body allow the robots to respond verbally to sounds and movements.

#### **4.5 Procedure of data analysis**

#### *4.5.1 Analysis of autonomic nervous activity*

Heart Rate Variability (HRV) data were assessed at various frequency bands using an HRV software tool (MemCalc/Bonaly Light: GMS, Tokyo, Japan).


#### **Figure 3.**

*Animal therapy protocol. Animal therapy data extraction method. Before therapy 10:24:00 ~ 10:28:58 (5 minutes in total); During therapy 11:32:00 ~ 11:36:58 (5 minutes in total); After therapy 11:37:00 ~ 11:41:58 (5 minutes in total). As described above, a total of 5 minutes of data was extracted before, during, and after the therapy.*

#### **Figure 4.**

*Robot therapy protocol. Robot therapy data extraction method. Before therapy 10:05:00 ~ 10:09:58 (5 minutes in total); During therapy 10:26:16 ~ 10:27:58 (1 minute 41 seconds in total); After therapy 10:27:58 ~ 10:28:40 (42 seconds in total). As described above, it was set to extract a total of 5 minutes of data before, during, and after the therapy. However, due to data constraints, data during and after the therapy was less than 5 minutes. The measurement time was sometimes short because the measurement was performed according to the procedure of robot therapy in the clinical setting. In addition, an artifact was included in the electrocardiogram due to the subject's movements, so it was not possible to obtain all the data for 5 minutes. This was the limitation of this clinical study.*

The low-frequency (LF) and high-frequency (HF) bands of heart rate variability (HRV) reflect sympathetic and parasympathetic nervous activity, which are commonly accepted indices of the autonomic nervous system [36, 37]. From the continuously recorded data, inter-beat (R-R) intervals were obtained for 1-minute segments using the maximum entropy method. In this study, the two major spectral components of HRV, the variances of the low-frequency (LF: 0.04-0.15 Hz) and high-frequency (HF: 0.15-0.4 Hz) bands, were calculated. The HF power can be used as an index of parasympathetic nervous activity, and the LF/HF ratio as an index of sympathetic nervous activity.
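As a rough sketch of this computation, LF and HF power can be estimated from a series of R-R intervals as follows. This is not the MemCalc implementation used in the study: the band limits are those given above, but an FFT periodogram over an evenly resampled beat series substitutes for the maximum entropy method, and the function name and sampling rate are illustrative assumptions.

```python
import math
import numpy as np

LF_BAND = (0.04, 0.15)  # low-frequency band (Hz)
HF_BAND = (0.15, 0.40)  # high-frequency band (Hz)

def hrv_lf_hf(rr_intervals_s, fs=4.0):
    """Estimate (LF power, HF power, LF/HF ratio) from R-R intervals in seconds.

    The unevenly spaced beat series is resampled to an even grid at fs Hz so
    an FFT-based periodogram can be used (a stand-in for the maximum entropy
    method of the MemCalc software used in the study).
    """
    rr = np.asarray(rr_intervals_s, dtype=float)
    beat_times = np.cumsum(rr)                      # time of each beat
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    rr_even = np.interp(grid, beat_times, rr)       # evenly sampled R-R series
    rr_even = rr_even - rr_even.mean()              # remove the DC component
    psd = np.abs(np.fft.rfft(rr_even)) ** 2 / (fs * len(rr_even))
    freqs = np.fft.rfftfreq(len(rr_even), d=1.0 / fs)
    df = freqs[1] - freqs[0]

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return float(psd[mask].sum() * df)          # integrate PSD over the band

    lf = band_power(*LF_BAND)
    hf = band_power(*HF_BAND)
    return lf, hf, lf / hf

# Demo on synthetic data whose only rhythm is a 0.3 Hz (HF-band) oscillation,
# mimicking respiratory sinus arrhythmia around a 0.8 s mean beat interval.
rr, t = [], 0.0
for _ in range(300):
    rr.append(0.8 + 0.05 * math.sin(2 * math.pi * 0.3 * t))
    t += rr[-1]
lf, hf, ratio = hrv_lf_hf(rr)
```

With only an HF-band rhythm present, the HF power dominates and the LF/HF ratio stays well below 1, consistent with HF serving as the parasympathetic index described above.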

An optimal level of variability within an organism's key regulatory systems is critical to the inherent flexibility, adaptability, and resilience that epitomize healthy functioning and well-being [38]. HRV is the change in the time intervals between adjacent heartbeats, an emergent property of interdependent regulatory systems that operate on different time scales to adapt to environmental and psychological challenges. The heart's rhythms reflect both the physiological and the psychological functional status of internal self-regulatory systems. Lowered parasympathetic activity, rather than reduced sympathetic functioning, appears to account for the reduced HRV in aging [39]. Lowered HRV can be observed when persons engage in meeting a challenge that requires effort and increased sympathetic activation; alternatively, it can indicate increased parasympathetic activity, as occurs during slow breathing [40]. With respect to psychological regulation, lower HF power is associated with stress, panic, anxiety, or worry [41].

In this study, the use of HRV was critical in measuring the psychological functional status and emotional experience of older persons, particularly those with dementia. Changes in autonomic nervous activity were determined using HRV. Heart rate (HR)-mean, HF, LF/HF, NU % (ratio of sympathetic nerve components), and body movement (acceleration) are shown in graphs depicting the subjects' interactions with the dog or robot doll during the intervention. The results were recorded graphically, enabling visual assessment and measurement (**Figure 5**).

#### **Figure 5.**

*Heart Rate Variability (HRV) data were assessed at various frequency bands using an HRV software tool (MemCalc/Bonaly Light: GMS, Tokyo, Japan).*

#### **5. Typical examples and discussions**

#### **5.1 Comparison of animal therapy and robot therapy using the HRV and accelerometer as biological responses of older persons with dementia**

#### *5.1.1 Animal therapy*

#### *5.1.1.1 The experimental data before the animal therapy*

The data were recorded graphically, enabling visual assessment and measurement. Analysis of the HRV allowed evaluation of autonomic nervous function.

After the dog moved its trunk, sympathetic nerve activity became predominant; afterwards, parasympathetic activity became predominant. Changes in autonomic nervous activity accompanying body movement can be confirmed from the accelerometer recordings and the results of the heart rate variability analysis (**Figure 6**).

#### *5.1.1.2 The experimental data during the animal therapy*

Subject A's HRV data showed sympathetic nerve predominance after she touched the dog. Before she touched the dog a second time, her pulse rate decreased and the parasympathetic nerve became predominant. Through interaction with the dog, fluctuations in heart rate were observed, and a balance between sympathetic and parasympathetic nerve activities was noted. Therefore, it was considered that effective stimulation could be provided to Subject A through contact and interaction with the dog (**Figure 7**).

*Robot Therapy Program for Patients with Dementia: Its Framework and Effectiveness DOI: http://dx.doi.org/10.5772/intechopen.96410*

#### **Figure 6.**

*The experimental data before the animal therapy (Subject A). Note: HR-mean: heart rate mean; HF: high frequency; LF/HF: low frequency/high frequency ratio; NU: normalized unit; Movement: body movement. Vertical-axis units: HR-mean, beats/min; HF, msec²; LF/HF, dimensionless; NU, %; Movement, G. Horizontal axis: hour:min:sec.*

#### *5.1.2 Robot therapy data*

#### *5.1.2.1 The experimental data before robot therapy*

**Figure 8** shows the alternating sympathetic and parasympathetic activities before the robot therapy.

#### *5.1.2.2 The experimental data during the robot therapy*

After holding the robot, it was observed that parasympathetic nerve activity became predominant, and then sympathetic activity became predominant. In addition, after the subject held the robot a second time, the parasympathetic nerve again became predominant.

#### **Figure 7.**

*The experimental data during animal therapy (Subject A).*

Sympathetic nerve activity was predominant before the start of the therapy. However, during the therapy, sympathetic predominance was observed to continue immediately after the subject held the robot. Autonomic nervous activity was stable at the end of the therapy (**Figure 9**).

#### *5.1.2.3 Comparison of the interactions between older people with dementia and their caregivers during animal therapy and during robot therapy*

In both animal therapy and robot therapy, stable heart rate and body movements were confirmed in all processes before, during, and after therapy. These were during the awake state, and the awakening could be confirmed visually by participant observation and recorded video data.

In the animal therapy, the LF/HF value was high even before the start of therapy, and the predominance of sympathetic nerve activity was confirmed. During the therapy, the LF/HF value increased immediately after the first touch of the dog, immediately before the second touch of the dog, and immediately after the touch. These activities confirmed that the sympathetic nerve activities were dominant.


#### **Figure 8.**

*The experimental data before robot therapy (Subject B).*

At the end of the therapy, the LF/HF value was high for about one minute, confirming the predominance of sympathetic nerve activity.

In the robot therapy, the subjects had high LF/HF values and predominant sympathetic nerve activity before the start of the therapy. In addition, during the therapy, high LF/HF values that continued immediately after the robot was first held were observed, confirming the predominance of sympathetic nerve activity. After the therapy, the autonomic nervous activity became stable.

#### **5.2 Comparison of intermediary interactions during animal therapy and robot therapy using participant observations with older adults**

Animal therapy was conducted by the therapist and a pianist in the healthcare institution. After music therapy, the therapist brought a dog to the older person (Subject A). Her expression can be seen in **Figure 10**. From the observation, it was evident that the older person spontaneously touched and stroked the dog. Subject A seemed happy touching the dog as the intermediary brought the dog near her.

#### **Figure 9.**

*The experimental data during robot therapy (Subject B).*

The nurse intermediary was in charge of connecting Subject A with the dog. The intermediary asked Subject A some questions: "Do you like dogs and animals?" and "Have you ever owned a dog?" Subsequently, the intermediary asked, "May I bring the dog closer to you?" Since Subject A requires a cane when walking and her daily activities are slow due to old age, the intermediary held the small dog and brought it near Subject A's chest (**Figure 10**) to make it easier for her to touch the dog. Subject A replied that she liked dogs, that she had owned many dogs, and that one of her dogs was as small as the dog used for the animal therapy. The intermediary then picked up this conversational topic and, as the dog wagged its tail, commented, "This dog seems happy."

Subject B was in the robot therapy section. It was observed that the older person seemed happier during her interactions with the robots. Subject B touched and held the robots. She stroked their legs, arms, and heads as if the robots were her grandchildren. When she saw the pictures displayed on the television screen, she turned the robot to face the TV screen and exclaimed to the robot, "Look at the TV!" After that, she asked the robot some questions, like "Do you like animals?", and she stroked the robot's head. When the robot said, "Thank you," she

#### **Figure 10.**

*Subject A's scene during the animal-assisted therapy. During animal therapy: she spontaneously stretched her arms, stroked dog's body, and smiled from beginning to end.*

laughed. When the robot sang, she clapped her hands and said, "You're so good!" (**Figure 11**).

The intermediary asked Subject B about her impression of the robots and whether she was interested in them. Subject B was interested in the robot and listened to what the robot had to say. She replied, "around 80 years old" to the robot's question, "How old are you?" However, there were some occasions when she could not hear what the robot said, so the intermediary had to repeat it. Since the inexpensive robot used had limited conversational word content, the intermediary supplemented the conversation content and filled the intervals in the conversation. Also, since the robot can sing songs, the intermediary sang along, thereby illustrating that it was the robot that was singing. Subject B listened intently while the robot was singing, and she enjoyed singing along to the tune with the robot. On one occasion, the intermediary informed Subject B that she could touch the robot, asking, "Would you like to pick up robot Mi-chan?"

Conversations with the robots illustrated that nurses as intermediaries can show that emotional conversations establish effective transactive engagements between subjects and robots.

The comparison of AAI and robot therapy showed that each method has its benefits and shortcomings, indicating that the two methods could potentially complement each other. Both therapies were shown to have possible beneficial effects on the emotional wellbeing of patients with dementia. If robot therapy using an inexpensive robot, such as the one used in this study, can obtain the same effect as AAI, then barriers peculiar to AAI, such as zoonotic diseases, animal bites, and allergies, can be avoided. In addition, it will be possible to use AAI and robot therapy selectively, taking advantage of their respective characteristics and advantages.

#### **6. Conclusion**

This chapter explained the robot therapy program for patients with dementia from the viewpoint of its framework and effectiveness.

The electrographic data provided neurophysiological evidence of the influence of robot utilization on the autonomic nervous system activity of older adults with dementia. The examples described were demonstrations of studies, which captured how data were collected through different devices and specific procedures to describe, explain, predict, and prescribe phenomena, as evidenced from a rigorous analysis of data regarding human-robot interaction with nurses as intermediaries.

The typical examples show that animal therapy has almost the same effectiveness as robot therapy among older people. It was clarified that robot therapy, like animal therapy, can be expected to have a healing effect on patients, improve motivation for activity, and increase the amount of activity. Furthermore, it was essential to consider the intermediary role of nurses in connecting the robot with older adults, even if the robot is not sophisticated enough to be useful as a humanoid nursing robot for rehabilitation and dialogue with older adults.

Thus, robot therapy could be considered another important intervention in the challenging health and innovative care practices needed in the care of older persons. Nevertheless, two issues were identified regarding living with a human-type communication robot as a strategy for rehabilitation care, for improving cognitive functions, and for preventing cognitive decline in older adults. In this regard, robot therapy has not been generalized, and more analysis, description, and discussion of its practical utility are required.

#### **Acknowledgements**

This work was supported by JSPS KAKENHI Grant Number JP17K17504. We would like to express our sincere appreciation to all the patients and participants who contributed to this article. With many thanks to Dr. Savina Schoenhofer for reviewing this chapter prior to its publication.


#### **Author details**

Kyoko Osaka<sup>1</sup>\*, Ryuichi Tanioka<sup>2</sup>, Feni Betriana<sup>2</sup>, Tetsuya Tanioka<sup>3</sup>, Yoshihiro Kai<sup>4</sup> and Rozzano C. Locsin<sup>3</sup>

1 Department of Clinical Nursing, Kochi Medical School, Kochi University, Kochi, Japan

2 Graduate School of Health Sciences, Tokushima University, Tokushima, Japan

3 Department of Nursing, Institute of Biomedical Sciences, Tokushima University Graduate School, Tokushima, Japan

4 Department of Mechanical Engineering, Tokai University, Kanagawa, Japan

\*Address all correspondence to: osaka@kochi-u.ac.jp

© 2021 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

### **References**

[1] Muramatsu N, Akiyama H: Superaging society preparing for the future. The Gerontologist. 2011;51(4):425-432. DOI: 10.1093/geront/gnr067

[2] Ministry of Health, Labour, and Welfare. 2015 Edition, Annual Health, Labour and Welfare Report - Consideration of a depopulating society - Towards a society where people can live in peace and realize their hopes. 2015. https://www.mhlw.go.jp/english/wp/wp-hw9/dl/summary.pdf [Accessed: 2020-12-25]

[3] Department of International Affairs, Japanese Nursing Association: Nursing in Japan 2016. https://www.nurse.or.jp/jna/english/pdf/nursing-injapan2016.pdf [Accessed: 2020-12-25]

[4] Buchan J, Aiken L: Solving nursing shortages: A common priority. J Clin Nurs. 2008;17(24):3262-3268. DOI: 10.1111/j.1365-2702.2008.02636.x

[5] Amjad H, Carmichael D, Austin AM, Chang C, Bynum JPW: Continuity of care and health care utilization in older adults with dementia in Fee-for-Service Medicare. JAMA Intern Med. 2016;176(9):1371-1378. DOI: 10.1001/jamainternmed.2016.3553

[6] Mangalavite AM: Animal-Assisted Therapy: Benefits and Implications for Professionals in the Field of Rehabilitation. 2014;1-26. Research Papers. Paper 547. http://opensiuc.lib.siu.edu/gs\_rp/547

[7] Calvo P, Fortuny JR, Guzmán S, Macías C, Bowen J, García ML, Orejas O, Molins F, Tvarijonaviciute A, Cerón JJ, Bulbena A, Fatjó J: Animal Assisted Therapy (AAT) program as a useful adjunct to conventional psychosocial rehabilitation for patients with schizophrenia: Results of a small-scale randomized controlled trial. Frontiers in Psychology. 2016;7:1-13. DOI: 10.3389/fpsyg.2016.00631

[8] Thodberg K, Sørensen L.U, Christensen J.W, Poulsen P.H, Houbak B, Damgaard V, Keseler I, Edwards D, Videbech P.B: Therapeutic effects of dog visits in nursing homes for the elderly. Psychogeriatrics. 2016;16(5):289-297. DOI: 10.1111/psyg.12159

[9] Richeson NE: Effects of animal-assisted therapy on agitated behaviors and social interactions of older adults with dementia. Am J Alzheim Dis Other Dement. 2003;18(6):353-358. DOI: 10.1177/153331750301800610

[10] Park S, Bak A, Kim S, Nam Y, Kim H, Yoo DH, Moon M: Animal-assisted and pet-robot interventions for ameliorating behavioral and psychological symptoms of dementia: a systematic review and meta-analysis. Biomedicines. 2020;8(6):150. DOI: 10.3390/biomedicines8060150

[11] Minmin L, Peng L, Ping Z, Mingyue H, Haiyan Z, Guichen L, Huiru Y, Li C: Pet robot intervention for people with dementia: a systematic review and meta-analysis of randomized controlled trials. Psychiatry Research. 2019;271:516-525. DOI: 10.1016/j.psychres.2018.12.032

[12] Feast AR, White N, Candy B, Kupeli N, Sampson EL: The effectiveness of interventions to improve the care and management of people with dementia in general hospitals: a systematic review. Int J Geriatr Psychiatry. 2020;35(5):463-488. DOI: 10.1002/gps.5280

[13] Yokoyama A: Challenges for the future of robot assisted therapy from the viewpoint of psychiatry: with the knowledge of animal assisted therapy. Journal of The Society of Instrument and Control Engineers. 2012;51(7):598-602.

[14] Osaka K, Sugimoto H, Tanioka T, Yasuhara Y, Locsin R, Zhao YR, Okuda K, Saito K: Characteristics of a transactive phenomenon in relationships among older adults with dementia, nurses as intermediaries, and communication robot. Intelligent Control and Automation. 2017;8:111-125. DOI: 10.4236/ica.2017.82009

[15] Osaka K, Tanioka T, Tanioka R, Kai Y, Locsin RC: Effectiveness of care robots, and the intermediaries' role between and among care robots and older adults. In: Proceedings of the IEEE/SICE International Symposium on System Integration (SII '20); 12-15 January 2020; Honolulu, Hawaii: IEEE; 2020. p. 611-616. DOI: 10.1109/SII46433.2020.9026262

[16] Tanioka R, Yasuhara Y, Osaka K, et al: Autonomic nervous activity of patient with schizophrenia during Pepper CPGE-led upper limb range of motion exercises. Enferm Clin. 2020;1:48-53. DOI: 10.1016/j.enfcli.2019.09.023

[17] American Veterinary Medical Association. Animal-assisted interventions: Definitions. (n.d.). Retrieved from https://www.avma.org/ resources-tools/avma-policies/animalassisted-interventions-definitions [Accessed: 2020-12-25]

[18] Nimer J, Lundahl B: Animal-assisted therapy: a meta-analysis. Anthrozoos. 2007;20(3):225-238. DOI: 10.2752/089279307X224773

[19] Hosseini SH, Goher KM: Personal care robots for older adults: An overview. 2017;13:11-19. DOI: 10.5539/ass.v13n1p11

[20] Fiske A, Henningsen P, Buyx A: Your robot therapist will see you now: ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. J Med Internet Res. 2019;21(5). DOI: 10.2196/13216

[21] Shibata T, Wada K: Robot therapy: a new approach for mental healthcare of the elderly - a mini review. Gerontology. 2011;57: 378-386. DOI: 10.1159/000319015

[22] Sung HC, Chang SM, Chin MY, Lee WL: Robot-assisted therapy for improving social interactions and activity participation among institutionalized older adults: a pilot study. Asia Pac Psychiatry. 2015;7:1-6. DOI: 10.1111/appy.12131

[23] Birks M, Bodak M, Barlas J, Harwood J, Pether M: Robotic seals as therapeutic tools in an aged care facility: a qualitative study. J Aging Res. 2016;article ID 8569602:7 pages. DOI: 10.1155/2016/8569602

[24] Osaka K: Development of the model for the Intermediary role of nurses in transactive relationships with healthcare robots. Int J Hum Caring. 2020;24(4):265-274. DOI: 10.20467/ HumanCaring-D-20-00014

[25] Yasuhara Y, Tanioka R, Tanioka T, Ito H, Tsujigami Y: Ethico-legal issues with humanoid caring robots and older adults in Japan. Int J Hum Caring. 2019;23(2):141-148. DOI: 10.20467/1091-5710.23.2.141

[26] Locsin R: Technological competency as caring in nursing. Sigma Theta Tau International Press. 2005. 229 p.

[27] Boykin A, Schoenhofer S: Nursing as caring: A model for transforming practice. 2nd ed. Jones and Bartlett. 2001. 71 p.

[28] Cherniack EP, Cherniack AR: The benefit of pets and animal-assisted therapy to the health of older individuals. Curr Gerontol Geriatr Res. 2014;2014:623203. DOI: 10.1155/2014/623203

[29] Lundqvist M, Carlsson P, Sjödahl R, Theodorsson E, Levin LÅ: Patient benefit of dog-assisted interventions in health care: a systematic review. BMC Complement Altern Med. 2017;17(1):358. DOI: 10.1186/s12906-017-1844-7

[30] Wada K, Shibata T, Saito T, Tanie K: Effects of robot-assisted activity for elderly people and nurses at a day service center. Proceedings of the IEEE. 2004;92(11):1780-1788. DOI: 10.1109/JPROC.2004.835378

[31] Fasola J, Mataric MJ: Using socially assistive human-robot interaction to motivate physical exercise for older adults. Proceedings of the IEEE. 2012;100(8):2512-2526. DOI: 10.1109/JPROC.2012.2200539

[32] Tanaka M, Ishii A, Yamano E, Ogikubo H, Okazaki M, Kamimura K, Konishi Y, Emoto S, Watanabe Y: Effect of a human-type communication robot on cognitive function in elderly women living alone. Med Sci Monit. 2012;18(9):CR550-557. DOI: 10.12659/msm.883350

[33] Tanioka T, Smith CM, Osaka K, Zhao Y: Framing the development of humanoid healthcare robots in caring science. Int J Hum Caring. 2019;23(2):112-120. DOI: 10.20467/1091-5710.23.2.112

[34] Ferreira FMRM, Chaves MEA, Oliveira VC, Van Petten AMVN, Vimieiro CBS: Effectiveness of robot therapy on body function and structure in people with limited upper limb function: a systematic review and meta-analysis. PLoS One. 2018;13(7):e0200330. DOI: 10.1371/journal.pone.0200330

[35] Imai Y, Hasegawa K: The revised Hasegawa's Dementia Scale (HDS-R) - Evaluation of its usefulness as a screening test for dementia. Hong Kong Journal of Psychiatry. 1994;4(SP2):20-24.

[36] von Rosenberg W, Chanwimalueang T, Adjei T, Jaffer U, Goverdovsky V, Mandic DP: Resolving ambiguities in the LF/HF ratio: LF-HF scatter plots for the categorization of mental and physical stress from HRV. Front Physiol. 2017;8:360. DOI: 10.3389/fphys.2017.00360

[37] Takahashi N, Kuriyama A, et al: Validity of spectral analysis based on heart rate variability from 1-minute or less ECG recordings. Pacing Clin Electrophysiol. 2017;40:1004-1009. DOI: 10.1111/pace.13138

[38] Shaffer F, McCraty R, Zerr CL: A healthy heart is not a metronome: an integrative review of the heart's anatomy and heart rate variability. Frontiers in Psychology. 2014;5:1-19. DOI: 10.3389/fpsyg.2014.01040

[39] Umetani K, Singer DH, McCraty R, Atkinson M: Twenty-four hour time domain heart rate variability and heart rate: relations to age and gender over nine decades. J Am Coll Cardiol. 1998;31(3):593-601. DOI: 10.1016/s0735-1097(97)00554-8

[40] Russo MA, Santarelli DM, O'Rourke D: The physiological effects of slow breathing in the healthy human. Breathe. 2017;13(4):298-309. DOI: 10.1183/20734735.009817

[41] McCraty R, Shaffer F: Heart rate variability: new perspectives on physiological mechanisms, assessment of self-regulatory capacity, and health risk. Glob Adv Health Med. 2015;4(1):46-61. DOI: 10.7453/gahmj.2014.073

