*Collaborative and Humanoid Robots*

Pepper, which is 1.20 meters tall, has been designed by SoftBank Robotics and can handle a conversation with people. It can analyse human expressions, voice tones and gestures, thereby enabling it to respond. This type of robot can serve education, health and entertainment purposes; its primary purpose, however, is not hard work but home entertainment or shopping.

Pepper is the ideal robot for a family and can very quickly become a companion for individuals living alone and feeling lonely in their households, as it makes the elderly feel comfortable in their interactions with this type of humanoid robot. In this context it should be mentioned that Japanese society is prone to a friendly approach in its relations with robots in general and humanoid robots in particular. This is related also to their history and especially their world-famous manga books.

**Figure 6.**
*Robot chief in preparing the pancakes [12].*

**Figure 7.**
*Pepper robot at the working place [14].*

**4. The gesture-based remote human-robot using Kinect**

Kinect is a line of motion-sensing input devices produced by Microsoft. Initially, the Kinect was developed as a gaming accessory for the Xbox 360 and Xbox One video game consoles and for Microsoft Windows PCs, as shown in **Figure 8**.

**4.1 Structure of the control system**

**5. Connecting the Bioloid robot with Kinect**

This system consists of both hardware and software. It works by capturing the user's gestures, processing them and sending the processed signal on to the humanoid robot. Initially, the user makes a certain body gesture and maintains it for a short period of time [16].

The Kinect sensor then captures a depth image of the user and recognizes the gesture by tracking the user's skeleton. This stage is called the image-capturing stage. The captured depth image is processed on a computer in order to obtain an approximation of the position of each body joint. In the gesture-recognition stage, the angles between some of the body joints are calculated and used as features for gesture classification. Once the correct gesture is recognized, the robot executes the motion correlated with it [16]. The robot receives the command via a wireless interface. A built-in mechanism embedded by the PC provides additional stability control to maintain balance while moving and gives the robot the ability to get back up from potential falls autonomously [17].
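As a concrete illustration of the angle features used in the gesture-recognition stage, the angle at a joint can be computed from the 3-D positions of three neighbouring joints. This is a minimal Python/NumPy sketch; the function name and the sample coordinates are our own, not part of the cited system:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c.

    a, b, c are 3-D joint positions as produced by a skeleton tracker."""
    a, b, c = np.asarray(a, float), np.asarray(b, float), np.asarray(c, float)
    u, v = a - b, c - b
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clamp to [-1, 1] to guard against floating-point drift before arccos.
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

# Elbow angle from illustrative shoulder/elbow/wrist positions (metres):
shoulder, elbow, wrist = [0.0, 0.0, 2.0], [0.3, 0.0, 2.0], [0.3, 0.3, 2.0]
angle = joint_angle(shoulder, elbow, wrist)   # right-angle pose, about 90 degrees
```

Computing one such angle per limb joint, per frame, yields the feature vector that the classifier compares against the stored gesture templates.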

*Communication and Interaction between Humanoid Robots and Humans*
*DOI: http://dx.doi.org/10.5772/intechopen.97334*

**Figure 9.**
*Interaction between Bioloid Robots and Humans control architecture.*

**Figure 10.**
*Extraction of arm reference points.*

The hardware used for this system consists of a Kinect sensor, a PC and a humanoid robot kit, along with other additional tools. The Kinect used in this research is a depth-imaging camera made by Microsoft Corp., originally intended for entertainment and gaming on the Xbox game console. The humanoid robot kit used is the Bioloid Premium Kit.

Presented above is the Kinect v2 connection method with the Bioloid robot, using the V-REP simulation software.

The robot simulator V-REP, with its integrated development environment, is based on a distributed control architecture: each object/model can be individually controlled via an embedded script, a plug-in, a ROS or BlueZero node, a remote API client, or a custom solution. This makes V-REP very versatile and ideal for multi-robot applications. Controllers can be written in C/C++, Python, Java, Lua, MATLAB, or Octave. V-REP is used for fast algorithm development, factory automation simulations, fast prototyping and verification, robotics-related education, remote monitoring, safety double-checking, etc. [18–20].

A direct connection for the human gait parameters, using the Kinect camera's real-time human-body tracking, is shown in **Figures 9**–**11**. The position of the hands is continuously updated and relayed to the robot, which moves towards the indicated position.
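To make the hand-to-robot step concrete, here is a minimal sketch that turns a tracked hand position into arm-joint targets using textbook two-link planar inverse kinematics. The planar model, the link lengths and the function name are our own illustrative assumptions, not the Bioloid's actual kinematics:

```python
import math

def hand_to_arm_angles(x, y, l1=0.075, l2=0.075):
    """Shoulder and elbow angles (radians) that place a planar two-link
    arm's end-effector at the tracked hand position (x, y).

    Link lengths l1 and l2 are illustrative, not Bioloid measurements."""
    cos_e = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_e) > 1.0 + 1e-9:
        raise ValueError("hand position is out of the arm's reach")
    cos_e = max(-1.0, min(1.0, cos_e))      # guard floating-point drift
    elbow = math.acos(cos_e)                 # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Hand up-and-forward at equal offsets: the elbow bends to about 90 degrees.
s, e = hand_to_arm_angles(0.075, 0.075)
```

Re-solving this at every Kinect frame and streaming the resulting angles over the wireless link is one simple way to realise the continuous hand-following behaviour described above.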

#### **5.1 Pseudo-code for communication with a humanoid robot**

The pseudo-code for connecting the Kinect to MATLAB and the V-REP software is given as follows:

```
Algorithm 1
    Initialize parameters: Neck, Head, Right Leg, Left Leg, Right Hand,
        Left Hand && Spine ∈ SkeletonConnectionMap
    Insert variables:
        SkeletonConnectionMap = [ 3 ... 15 ; ... ; 12 ... 24 ]   % M-by-N joint-pair matrix
    Insert variables: V-REP API
        vrep = remApi('remoteApi')          % load the V-REP remote API
        clientID = vrep.simxStart(...)      % connect to the simulator
        if clientID > -1
            disp('Connected to the V-REP API')
        end
    Create Right Arm:
        return vrep.simx object handles ∈ clientID && ART(n : m),
            with ART = [ 1 ... 3 ; ... ; 4 ... 6 ]
    Create color and depth videoinput objects:
        colorVid = videoinput('kinect', 1)
        depthVid = videoinput('kinect', 2)
        depthSource = getselectedsource(depthVid)   % frame-trigger settings
        himg = figure
    while ishandle(himg)
        trigger([depthVid colorVid])
        colorImg = getdata(colorVid)
        metaData = getdata(depthVid)
        if any body is tracked
            trackedBodies = find(metaData.IsBodyTracked)
            nBodies = length(trackedBodies)
            jointCoordinates  = metaData.JointPositions
            colorJointIndices = metaData.ColorJointIndices
            image(colorImg(:, :, :, 1))
            robotArmControl:
            for body = 1 : nBodies
                for i = 1 : 25      % 25 tracked joints per body
                    map the joint angles to the robot arm
                end
            end
        end
    end
```
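The nested loops at the end of the algorithm walk over each tracked body and its 25 joints, and the resulting joint angles feed the gesture classifier that selects the robot's motion. A minimal sketch of the thresholding idea in Python follows; the gesture names, angle thresholds and command mapping are illustrative assumptions, not values from the cited system:

```python
def classify_gesture(elbow_angle_deg, shoulder_elevation_deg):
    """Map two arm-joint angles to a coarse gesture label.

    Thresholds are illustrative, not taken from the cited system."""
    if shoulder_elevation_deg > 150 and elbow_angle_deg > 150:
        return "arm raised"          # straight arm held high
    if 60 <= elbow_angle_deg <= 120:
        return "arm bent"            # elbow near a right angle
    return "idle"

# Hypothetical gesture-to-command table sent over the wireless link:
commands = {"arm raised": "WALK_FORWARD", "arm bent": "TURN", "idle": "STOP"}
command = commands[classify_gesture(170, 160)]   # "WALK_FORWARD"
```

Requiring the gesture to persist for several consecutive frames, as the text describes, filters out spurious single-frame detections before a command is issued.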