
*Communication and Interaction between Humanoid Robots and Humans*

*DOI: http://dx.doi.org/10.5772/intechopen.97334*


**Figure 2.** *Aibo robot [6].*

**Figure 3.** *Temi robot [7].*

The Aibo robot, shown in **Figure 2**, has one main camera and another, slanted camera; it can memorize up to 200 different interactions and recognize and respond appropriately to them. Aibo is also equipped with six sensors, a motion detector and a light detector, which enable the robot to detect obstacles and move smoothly around the house. The robot also has four microphones, so it can hear and respond accurately to your voice commands. You can get this robot for around three thousand dollars, and it is available online.

The Temi robot, shown in **Figure 3**, is presented as the first robot that interacts with humans while providing a seamless connection between your devices and your loved ones. It is equipped with a robotic navigation system: a 360-degree lidar, a depth camera, an RGB camera, five proximity sensors and real-time sensor fusion, which together enable autonomous navigation with 3D mapping, path planning, and obstacle detection, avoidance and tracking. Its 10.1-inch LCD colour touch display, with a pixel density of up to 225 PPI, is driven by a brushless DC motor and planetary gear, so the robot can autonomously track your face and tilt the screen accurately while you interact with it. A 13-megapixel high-resolution camera records video at 30 FPS and provides two-way live conversation with loved ones. Temi has 20-watt speakers with high-fidelity equalizers, which deliver good-quality music, and four omnidirectional, i.e. all-directional, digital microphones with real-time localization for the best audio-call experience. With built-in Alexa, one can ask it to play music, place calls, check the weather and even control smart devices without leaving one's seat. Temi is a personal robot that you can order online for a price of some 1,500 dollars.
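The navigation capabilities described above (3D mapping, path planning, obstacle avoidance) can be illustrated with a toy planner. The sketch below is a generic breadth-first search over a small occupancy grid; it is purely illustrative and is not Temi's actual navigation stack.

```python
from collections import deque

# Toy path planner over a 2-D occupancy grid (1 = obstacle, 0 = free),
# illustrating the map / plan / avoid loop of an autonomous robot.
# Generic breadth-first search, not any vendor's navigation software.

def plan_path(grid, start, goal):
    """Return a shortest list of (row, col) cells from start to goal,
    moving in four directions around obstacles, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:               # reconstruct the path backwards
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                        # goal cannot be reached
```

For example, on a 3x3 grid with a wall across the middle, `plan_path(grid, (0, 0), (2, 0))` returns the seven-cell detour around the wall.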

*Collaborative and Humanoid Robots*

**2.6 Speech recognition**


To create a humanoid robot capable of speech recognition, one has to combine different hardware and software elements: the Python programming language and AI packages such as SpeechRecognition and ChatterBot, integrated on a single-board computer such as the Raspberry Pi. Nowadays, humanoid robots can recognize the words of multiple people speaking simultaneously, retrieve information from the internet, and so on. Certain types of them can be used in halls and offices and can communicate with many people [8].
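The basic pattern such packages support is a recognize-and-respond loop. The sketch below is a minimal, self-contained illustration of that loop: the speech-to-text step (which a real robot would delegate to an engine such as the SpeechRecognition package) and the chatbot step (ChatterBot-style response selection) are replaced by toy stand-ins, and the function names and canned replies are invented for the example.

```python
# Minimal sketch of a recognize-and-respond loop for a Raspberry-Pi-class
# robot.  Real systems would call a speech-to-text engine and a chatbot
# library; both are stubbed out here so the sketch runs on its own.

def recognize(audio: str) -> str:
    """Stand-in for speech-to-text: a real robot would pass microphone
    audio to an STT engine and receive a transcript back."""
    return audio.lower().strip()       # pretend the audio is already text

RESPONSES = {
    "hello": "Hello! How can I help you?",
    "weather": "I can look up the weather online for you.",
    "name": "My name is Robo.",
}

def respond(transcript: str) -> str:
    """Stand-in for a ChatterBot-style selector: return the reply for the
    first keyword found in the transcript, else a fallback."""
    for keyword, reply in RESPONSES.items():
        if keyword in transcript:
            return reply
    return "Sorry, I did not understand that."

def interact(audio: str) -> str:
    """One turn of the loop: hear, transcribe, answer."""
    return respond(recognize(audio))
```

Calling `interact("Hello robot")` returns the greeting reply; an unrecognized utterance falls through to the fallback sentence.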

The Nao humanoid robot is also able to see, talk and hear, and it can interact naturally with humans. As shown in **Figure 4**, it has four built-in microphones, loudspeakers and two cameras. Nao can learn and adapt to almost every interaction, becoming more and more intelligent with time and experience: it remembers answers and content and can immediately reuse them in similar situations. It acquired its skills through a programming interface to IBM Watson's Language, Vision, Speech and Data APIs, which present almost endless possibilities for further development.

Sophia from Hanson Robotics, shown in **Figure 5**, is a good example of how AI is implemented in humanoid robots. Its robot intelligent system combines unstructured language learning with statistical natural-language learning and natural-language generation: for some answers it may go to the Web, some answers come from the natural-language components, and some are drawn from its robotic personality, which lets it behave in a human-like way. These hardware and software components are not distinct things; the real crux of intelligence is how they come together and interact to form an entire architectural organism, as shown in **Figure 5**. AI algorithms are included in humanoid robots for reasoning (logic), learning, perception and interaction, all of which inter-operate as a whole in a complex form of communication and interaction with humans.
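The interplay described above, where some answers come from the Web, some from natural-language components and some from the robot's personality, amounts to a dispatcher that routes each query to one of several answer sources. The sketch below is purely illustrative; the routing rules, function names and replies are invented for the example and are not Hanson Robotics' actual architecture.

```python
# Illustrative dispatcher: route a query to one of several answer sources,
# mimicking the "web / language components / personality" split described
# for Sophia.  All rules and source names are hypothetical.

def web_lookup(query: str) -> str:
    return f"[web result for: {query}]"       # stand-in for a web search

def language_model(query: str) -> str:
    return f"[generated answer to: {query}]"  # stand-in for NL generation

def personality(query: str) -> str:
    return "I am a robot, but I enjoy talking with people."

def route(query: str) -> str:
    """Pick an answer source based on simple features of the query."""
    q = query.lower()
    if q.startswith(("who are you", "how do you feel")):
        return personality(q)                 # personal questions
    if any(word in q for word in ("weather", "news", "price")):
        return web_lookup(q)                  # factual, time-sensitive
    return language_model(q)                  # everything else
```

For instance, `route("How do you feel?")` answers from the personality source, while a weather question is sent to the web stand-in.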

**Figure 4.** *Nao robot [9].*

**Figure 5.** *Sophia Intelligent Robot [10].*

TRI's vision and mission are focused on how technology can enhance and ease the human experience, bringing forth a higher quality of life, independence and happiness. TRI envisions a future where Toyota products improve the quality of life for societies around the world through outstanding performance and contribution, its mission being the development of automated driving, robotics and other human-enhancement and amplification technologies from Toyota: technological capabilities that will help people navigate safely from their kitchen to their living room, or safely across town. Most importantly, by providing this kind of human-amplification technology, TRI hopes to make the quality of life for everyone much better [11].

TRI's work also centres on happiness, meaning and purpose; in Japan, this is called Ikigai. Studies of Ikigai teach that people feel the most fulfilment when their lives incorporate work that they love and that helps society, enabling more people to achieve their Ikigai. TRI pursues new forms of automation in society with a human touch, developing capabilities that amplify, rather than replace, human ability. This is Toyota's historic philosophy of Jidoka, an idea that embraces the concept of Artificial Intelligence (AI): the human and the machine work together to do something better than either of them could do alone.

TRI is currently pursuing this vision in four research areas: Robotics, Automated Driving, Accelerated Materials Design and Discovery, and Machine-Assisted Cognition [11].

A growing number of Japanese businesses are testing robots as a viable answer to the country's shrinking workforce. Robots are popping up in stores and banks, and are soon expected in hotels as well. Bank of Tokyo-Mitsubishi UFJ is testing Nao as a robotic client-service agent that answers basic questions and is designed to speak 19 languages; this multilingual robot was planned to serve foreign clients during the Tokyo 2020 Olympics.

In time, the bank hopes to have even more robots on its staff. Pepper is a humanoid robot that talks to clients. A humanoid has human-like features, for example arms, legs and a head, but is designed to look like a robot. Its producer, SoftBank, hopes Pepper will become a family robot, as in the Jetsons cartoons.

A hotel planned to open at the Huis Ten Bosch amusement park in Nagasaki this summer aims to have 10 robots as staff members, and soon to increase that number until more than 90 percent of hotel services are provided by robots, as shown in **Figure 6**.

Today's innovation may be the necessity of tomorrow. Japan's aging population has fuelled heated debates about the involvement of robots in the country's workforce. A survey by the home-service operator Orix Living found that more seniors feel comfortable being nursed by a robot than when receiving services from a foreign nurse. The number of elderly citizens in Japan is steadily increasing, bringing about a real need for humanoid service robots that help them deal with and take care of various home tasks.

In a country such as Japan, where the population and the workforce are shrinking and there is considerable resistance to an influx of immigrants, it appears that robots may play a very big role in the future [13].

**3.1 Pepper robot understands emotions**

One group that seems keen to embrace robots are the elderly citizens of Japan who are cared for by them. Using emotion-recognition functions, the Pepper robot, released in February 2015, can understand and respond to people who joke, dance and even rap in the Japanese language, see **Figure 7**. Pepper robot is
