**5.1 Specifications of KUKA's youBot**

KUKA's youBot consists of an omnidirectional mobile platform and a 5-degree-of-freedom (DOF) manipulator with a two-finger gripper, each of which can be controlled independently. The omnidirectional base has a payload of 20 kg and is driven by four omni wheels, which allow movement in all directions without any external mechanical steering. Each wheel carries a series of rollers mounted at a 45° angle. The wheels are driven by brushless DC motors with a built-in gearbox, relative encoder, and joint bearing. Each wheel's motion is controlled independently through its drive using commands. The lowest-level command is pulse width modulation (PWM); the higher-level control commands are current control (CC), velocity control (VC), and position control (PC).
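The mapping from a desired base motion to the four independent wheel commands can be sketched with the standard inverse kinematics for a four-wheeled base with 45° rollers. This is a minimal illustration, not youBot's driver code; the wheel radius `r` and half-footprint dimensions `lx`, `ly` are placeholder values, not the robot's actual geometry.

```python
def mecanum_wheel_speeds(vx, vy, omega, r=0.05, lx=0.25, ly=0.15):
    """Inverse kinematics for a base with four 45-degree-roller wheels.

    vx, vy: desired forward/lateral body velocity (m/s)
    omega:  desired yaw rate (rad/s)
    r, lx, ly: illustrative wheel radius and half wheelbase/track (m)
    Returns angular speeds (rad/s) for front-left, front-right,
    rear-left, rear-right wheels.
    """
    k = lx + ly
    fl = (vx - vy - k * omega) / r
    fr = (vx + vy + k * omega) / r
    rl = (vx + vy - k * omega) / r
    rr = (vx - vy + k * omega) / r
    return fl, fr, rl, rr
```

Driving straight ahead spins all four wheels equally, while a pure sideways command spins diagonal wheel pairs in opposite directions, which is how the rollers produce lateral motion without steering.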

The youBot arm can manipulate a load of up to 0.5 kg in any position in three dimensions. The distributed axis controllers are connected via the EtherCAT backplane bus (EBUS). Each joint in the youBot arm has its own servo controller, which contains an ARM Cortex-M3 microcontroller, Hall sensors, an EtherCAT interface, and position, velocity, and current PID controllers. Using predefined positions provided over WLAN or Ethernet as input, the PC calculates torque, speed, or position instructions for each axis and handles their interpolation individually.

**Figure 7.** *KUKA's youBot mobile manipulator (image courtesy: http://www.youbot-store.com/).*
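The per-axis interpolation can be illustrated with a simple rate-limited position profile: intermediate setpoints are generated at the control period so that each joint never moves faster than its velocity limit. This is an assumed sketch of the idea; the text does not specify the actual interpolation scheme used on the youBot PC.

```python
def interpolate_joint(q_start, q_goal, v_max, dt):
    """Generate intermediate position setpoints from q_start to q_goal,
    stepping at most v_max * dt per control tick (linear profile).
    All parameters are illustrative, not youBot limits."""
    step = v_max * dt
    q = q_start
    setpoints = []
    while abs(q_goal - q) > step:
        q += step if q_goal > q else -step
        setpoints.append(q)
    setpoints.append(q_goal)
    return setpoints
```

Each axis can run this independently, which matches the statement that interpolation is handled per axis rather than jointly.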

The youBot arm's wrist is equipped with a two-finger parallel gripper with a 20 mm stroke. There are multiple mounting points for the gripper fingers, which the user can choose based on the size of the objects to be picked up [14]. In the standard gripper, the jaws are operated by two stepper motors.

The robot can be operated either from a connected power supply unit or from its batteries. The power controller features separate charging controls for the two maintenance-free lead-acid batteries (24 V, 5 Ah) when the 200 W charger is connected. If no charger is connected, the two batteries can supply power for up to 90 minutes. In addition, the power controller features individually switchable power supplies for the computer and the motors. Battery current is monitored via the on-board computer.
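As a back-of-the-envelope check on these figures, two 24 V, 5 Ah batteries store about 240 Wh, so a 90-minute runtime implies an average draw of roughly 160 W. This is an idealized estimate that ignores lead-acid discharge losses and depth-of-discharge limits.

```python
def avg_power_from_runtime(n_batteries=2, voltage=24.0,
                           capacity_ah=5.0, runtime_h=1.5):
    """Average power draw implied by battery capacity and runtime.
    Idealized (lossless) estimate; defaults match the quoted specs."""
    energy_wh = n_batteries * voltage * capacity_ah  # 2 * 24 V * 5 Ah = 240 Wh
    return energy_wh / runtime_h
```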

#### **5.2 V-REP and MATLAB interface for youBot**

V-REP is an open-source robot simulation tool that offers various features, relatively independent functions, and elaborate application program interfaces (APIs). It is more stable than other open-source tools and allows easy plugin and configuration of robotic systems [15]. Each object/model in a V-REP scene can be individually controlled via an embedded script, a plugin, a Robot Operating System (ROS) node, a remote API client, or a custom solution. Control algorithms can be written in C/C++, Python, Java, Lua, MATLAB, Octave, or Urbi.

The remote API can operate through blocking function calls, non-blocking function calls, data streaming, and synchronous operation. In this research, we used MATLAB as the remote API client because it provides a very convenient and easy way to write, modify, and run control code. This also allows a simulation or a model to be controlled with exactly the same code that runs the real robot. The remote API functionality relies on the remote API plugin on the server side and the remote API code on the client side. Both are open source and can be found in the "programming" directory of V-REP's installation. The robot HIL control system is connected to the V-REP robot simulator through MATLAB remote API functions, as shown in **Figure 8**.
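The typical remote API call pattern (connect, resolve an object handle with a blocking call, then read a signal by starting a stream and polling its buffer) can be sketched as follows. The `simx*` names mirror V-REP's legacy remote API, but `VRepClientStub` below is a self-contained stand-in written for this illustration, not the real binding; the joint name and stubbed angle are likewise hypothetical.

```python
class VRepClientStub:
    """Stand-in for V-REP's remote API bindings; only mimics the call
    pattern (return-code, value) so the example runs without a simulator."""
    OK = 0
    OPMODE_BLOCKING = "blocking"
    OPMODE_STREAMING = "streaming"
    OPMODE_BUFFER = "buffer"

    def __init__(self):
        self._handles = {"youBot_arm_joint1": 11}  # hypothetical scene object
        self._joint_pos = {11: 0.42}               # stubbed joint angle (rad)
        self._streams = set()

    def simxStart(self, host, port, wait, reconnect, timeout_ms, cycle_ms):
        return 0  # client ID; >= 0 means connected

    def simxGetObjectHandle(self, client_id, name, opmode):
        return self.OK, self._handles[name]

    def simxGetJointPosition(self, client_id, handle, opmode):
        # A stream must be started once before buffered reads succeed.
        if opmode == self.OPMODE_STREAMING:
            self._streams.add(handle)
            return self.OK, self._joint_pos[handle]
        if opmode == self.OPMODE_BUFFER and handle not in self._streams:
            return 1, 0.0  # no data buffered yet
        return self.OK, self._joint_pos[handle]

vrep = VRepClientStub()
client_id = vrep.simxStart("127.0.0.1", 19999, True, True, 5000, 5)
_, h = vrep.simxGetObjectHandle(client_id, "youBot_arm_joint1",
                                vrep.OPMODE_BLOCKING)
vrep.simxGetJointPosition(client_id, h, vrep.OPMODE_STREAMING)  # open stream
rc, q = vrep.simxGetJointPosition(client_id, h, vrep.OPMODE_BUFFER)
```

Blocking calls wait for the server's reply and suit one-off queries such as handle lookup, while streaming plus buffered reads suit high-rate sensor feedback in the control loop; the MATLAB binding follows the same pattern.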

A simulation scene in the V-REP robot simulator contains several elemental objects that are assembled in a tree-like hierarchy and operate in conjunction with each other to achieve physical interactions. In addition, V-REP possesses several calculation modules that can directly operate on one or several objects in a scene. The major objects and modules used in the simulation scene include (i) sensors, (ii) CAD models of the plant and robot manipulator, (iii) inverse kinematics, (iv) minimum distance calculation, (v) collision detection, (vi) path planning, and (vii) visual servo control. Other objects used as basic building blocks are dummies, joints, shapes, graphs, paths, lights, and cameras.

**Figure 8.** *Schematic diagram for the robot HIL system with V-REP through the MATLAB API.*

*Physical Interaction and Control of Robotic Systems Using Hardware-in-the-Loop Simulation DOI: http://dx.doi.org/10.5772/intechopen.85251*

V-REP supports different vision sensors (orthographic and perspective type) and proximity sensors (ray-type, pyramid-type, cylinder-type, disk-type, and cone- or randomized ray-type). In this study, we used tactile sensors, force sensors, a camera, an RGB sensor, an XYZ sensor, and a laser range finder. The following steps are required to interface V-REP with MATLAB.


#### **5.3 HIL-based control law implementation**

Commands to the actual physical youBot drives are sent from an Intel Atom on-board computer running a real-time Linux kernel with the Simple Open EtherCAT Master (SOEM). Real-time communication between the drives and the on-board computer is established using EtherCAT, a technology used in KUKA's industrial robots.

The KUKA youBot drive protocol is open source, which encourages users to develop their own applications and control systems. This flexibility enables us to deploy the control algorithm on a Texas Instruments C2000 microcontroller. Since we deploy the HIL-based control law against a virtual model of the youBot, we eliminate the Intel Atom on-board system and use the TI C2000 hardware target board for the grasp and task control algorithms. **Figure 9** shows the HIL setup for the robotic system using the TI C2000 real-time controller, which runs the vision and force control scheme for physical interaction along with the youBot model developed in MATLAB. The animated actions of the youBot during object handling under the HIL setup are observed in the V-REP robot simulator.
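The resulting closed loop can be sketched as a repeated exchange between the controller (on the target hardware) and the simulated plant: the simulator reports the sensed state, the controller returns an actuation command, and the plant model is integrated one step. The proportional law and first-order plant below are illustrative placeholders for the actual vision/force control scheme and youBot model, not a description of them.

```python
def hil_step(controller, plant_state, setpoint, dt):
    """One HIL cycle: sense -> control -> actuate -> integrate."""
    u = controller(setpoint, plant_state)  # runs on the C2000 in the real setup
    return plant_state + u * dt            # runs in the V-REP/MATLAB plant model

def p_controller(setpoint, state, kp=2.0):
    # Illustrative proportional law; placeholder for the vision/force scheme.
    return kp * (setpoint - state)

state = 0.0
for _ in range(200):  # 2 s of simulated time at a 10 ms HIL cycle
    state = hil_step(p_controller, state, setpoint=1.0, dt=0.01)
```

The key property of the setup is that the controller code exercised in this loop is the same code later deployed against the physical drives; only the plant side is swapped.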

The MATLAB Simulink environment supports deploying the developed algorithm and the environment model on real-time target boards, thereby providing support for HIL. MATLAB can also interface with the Texas Instruments C2000 microcontroller board. With this extended support, the control algorithms are deployed on the C2000 microcontroller board.

The following steps are carried out for these tasks:


**Figure 9.** *Hardware-in-the-loop simulation setup of the robotic system.*


Repeated testing of the control algorithms is performed until the desired performance is met. Steps 4 and 5 are repeated to gain confidence in the real-time implementation of the control algorithms on the physical KUKA youBot mobile manipulator.

These testing phases enable the algorithm to be deployed in real time. This was demonstrated using the model of the KUKA youBot mobile manipulator in the V-REP robot simulation environment. Sensor data are taken from the V-REP environment as input to the HIL system. The performance of the robot was assessed for the implemented control algorithms, and the results are presented in **Figures 10**–**12**.

#### **5.4 Physical interaction of youBot**


**Figure 10** shows a snapshot of the animated KUKA youBot performing physical interaction with objects in the environment simulated using the V-REP robot simulator. The trajectory taken by the robotic arm and the force applied for grasping objects are commanded by the HIL-based robot control algorithms implemented on the TI C2000 microcontroller board. Force and tactile feedback, along with visual feedback, are acquired from the robot deployed in the V-REP environment. These signals are given as input to the control algorithms executing on the C2000 controller through the MATLAB environment.


**Figure 10.** *KUKA's youBot performing physical interaction with the objects in the environment simulated using the V-REP robot simulator.*

A range of sensors monitoring the environment during the robot's physical interaction passes the acquired environmental parameters to the MATLAB environment through the APIs. **Figure 11** shows the 2D visual range feedback of the youBot along the x and y axes, acquired from the virtual physical environment in the V-REP robot simulator.

Based on the acquired information and additional sensor feedback to the MATLAB environment, sensor fusion is performed using a Kalman filter. The Kalman filter output is fed to the C2000 real-time controller through the target PC connector. The vision and force control algorithm running on the C2000 board generates control signals based on the required tasks and grasp forces for the dynamic range of objects in the physical environment. This ensures that the required grasp force is applied, maintains the stability of the grasped objects, and prevents them from slipping in the robot's hand. Initially, the tracking characteristics of the youBot were tested on a circular trajectory for object handling. The plot in **Figure 12** shows the deviation of the actual trajectory from the desired trajectory. Across different observations, the deviation was around 8% for the proposed algorithm.

**Figure 11.** *2D visual range of KUKA's youBot feedback to the MATLAB environment from V-REP along the x and y axes.*

**Figure 12.** *Circular trajectory tracking characteristics of KUKA's youBot for the vision and force control algorithm implemented on the TI C2000 real-time controller.*
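The Kalman filter fusion described above can be illustrated in its minimal scalar form, fusing successive noisy readings of the same quantity (e.g. a vision-based and a range-based position estimate). The measurement variances and initial uncertainty below are illustrative values, not the ones used in the actual setup, which is also multidimensional.

```python
def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update: fuse estimate (x, variance p)
    with measurement z of variance r."""
    k = p / (p + r)        # Kalman gain: trust measurement vs. estimate
    x = x + k * (z - x)    # corrected estimate
    p = (1.0 - k) * p      # reduced uncertainty after the update
    return x, p

def fuse(measurements, variances, x0=0.0, p0=1e6):
    """Fold a sequence of (measurement, variance) pairs into one estimate,
    starting from an uninformative prior."""
    x, p = x0, p0
    for z, r in zip(measurements, variances):
        x, p = kalman_update(x, p, z, r)
    return x, p
```

Fusing two equally trusted readings of 1.0 and 1.2 yields an estimate near 1.1 with a variance below either measurement's, which is the behavior the control loop relies on: the fused signal is smoother and more certain than any single sensor.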

Based on the satisfactory performance observed in tracking the circular trajectory, the youBot was configured for an object-handling application. It was engaged in picking and placing an object using the proposed vision and force control algorithm, implemented in the HIL setup shown in **Figure 6** using the TI C2000 real-time controller. **Figure 13** shows the tracking of the desired trajectory during the physical interaction of the youBot with the environment. The youBot, in a static position, engaged in physical interaction with the objects. The actual trajectory deviates by 8% from the desired trajectory, which corresponds to a tracking accuracy of 92% for the control algorithm implemented using the proposed HIL setup and TI C2000 hardware. Further improvement can be achieved with appropriate adaptive sensor fusion techniques, control algorithms, and optimization techniques for fine-tuning the system parameters.

**Figure 13.** *Object handling trajectory of the static KUKA youBot for the vision and force control algorithm implemented on the TI C2000 real-time controller.*
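A percentage deviation of the kind quoted above can be computed from logged trajectories as the mean point-wise error normalized by the mean magnitude of the desired path. The circle radius, sample count, and the uniform 8% radial error injected below are illustrative test data, not the experiment's logs.

```python
import math

def percent_deviation(desired, actual):
    """Mean point-wise deviation between two 2D trajectories, normalized
    by the mean magnitude of the desired trajectory, in percent."""
    err = sum(math.hypot(dx - ax, dy - ay)
              for (dx, dy), (ax, ay) in zip(desired, actual))
    ref = sum(math.hypot(dx, dy) for dx, dy in desired)
    return 100.0 * err / ref

# Illustrative circular path (radius 0.3 m) with a uniform 8% radial error.
n = 64
desired = [(0.3 * math.cos(2 * math.pi * i / n),
            0.3 * math.sin(2 * math.pi * i / n)) for i in range(n)]
actual = [(0.92 * x, 0.92 * y) for x, y in desired]
```

With this synthetic error the metric returns 8%, matching the form of the reported figure; on real logs the deviation varies around the path rather than being uniform.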
