**1. Introduction**

Robotics has flourished into a platform with an extensive set of applications: what was once regarded as a distant prospect has now begun to dominate the industrial sector. Because of its expanding domain of applications, robotics may play an even larger role in human life in the near future. Robotic systems have become powerful essentials of today's industrial sector. They are often expected to reproduce certain features of human tasks at large scale using actuators, sensors, and personal or on-board computers. Designing and controlling such robotic systems requires combining concepts from several classical fields of science [1].

Robotic manipulators are a class of robotic systems consisting of a mechanically articulated arm mounted on a static base. Mobile robots, by contrast, form a different class of robotic systems built on a mobile base. Currently, many automation tasks in industry also rely on hybrid robotic systems, which combine both a manipulator and a mobile base [2]. Robots are used mainly to reduce the manpower needed to control industrial, household, and commercial appliances, by providing accuracy in processing and in performing physical interactions. Actions performed by robotic systems are considerably faster than those of humans and also add reliability and robustness to the system.

Various robotics research domains explore how robots can work together with human beings in households and workplaces as useful and capable agents. Another essential objective of robotic systems is to interact with the physical environment in specific application domains, tracking objects and moving the gripper with accuracy [3]. Precise trajectory tracking, grasping of objects, grip stabilization, and force control are all challenging research issues that, if not properly addressed, can seriously affect production in the industrial sector [4].

Object handling with robotic systems is becoming predominant in industry and other applications that require physical interaction. It is driven by proper motion control, accomplished by fixing global, joint, and object reference frames in the physical environment [5]. The workspace of the robotic system also plays a major role in physical interaction. The workspace is the set of points the robot can reach, determined by its configuration, link sizes, joint types, and mechanical limits.
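The workspace idea can be illustrated with a hypothetical two-link planar arm: sampling the joint angles within their limits and applying forward kinematics yields the set of reachable end-effector points. This is a minimal sketch; the link lengths and joint limits are illustrative assumptions, not values from the chapter.

```python
import math

def reachable_points(l1=0.5, l2=0.3, q1_lim=(-math.pi, math.pi),
                     q2_lim=(-2.0, 2.0), steps=50):
    """Sample joint angles within their limits and return the reachable
    (x, y) end-effector positions of a 2-link planar arm (illustrative)."""
    pts = []
    for i in range(steps):
        q1 = q1_lim[0] + (q1_lim[1] - q1_lim[0]) * i / (steps - 1)
        for j in range(steps):
            q2 = q2_lim[0] + (q2_lim[1] - q2_lim[0]) * j / (steps - 1)
            # forward kinematics of the two-link arm
            x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
            y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
            pts.append((x, y))
    return pts

pts = reachable_points()
```

Every sampled point lies inside the annulus bounded by the radii |l1 − l2| and l1 + l2, which is the familiar workspace of a two-link arm when the joints can rotate freely.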

To establish the physical link between the robot and the environment, tactile sensors, together with force sensors and vision sensors, can be deployed depending on the application. Sensing the handled object, and the force applied to it in relation to the object's dexterity requirements, are vital parameters in physical interaction. Controlling the force applied to objects and stabilizing the grasp of the gripper are crucial challenges when interacting with the environment. Researchers have proposed various controllers for reducing the control effort and the noise in the control torque, as well as adaptive techniques for operating robotic systems in unknown external environments. Accurate driving of the robot's end-effector can be accomplished by tracking the desired path of each joint with a well-chosen closed-loop controller. In this chapter, systematic solutions are provided for the design of grasp and force control, supported by visual servoing, for controlling hybrid robotic systems interacting with the physical environment [6].
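As a minimal sketch of the closed-loop joint tracking mentioned above, the following simulates a single joint, modeled as a unit-inertia double integrator (an assumption for illustration only), driven by a PD controller toward a desired joint angle. Gains and time step are hypothetical.

```python
def track_joint(q_des, kp=40.0, kd=9.0, dt=0.001, t_end=2.0):
    """PD position control of one joint modeled as a unit-inertia
    double integrator q'' = tau (illustrative plant model)."""
    q, qd = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        tau = kp * (q_des - q) - kd * qd   # closed-loop control torque
        qd += tau * dt                     # integrate acceleration
        q += qd * dt                       # integrate velocity
    return q

final_angle = track_joint(1.0)
```

With these gains the closed loop is well damped, so the joint settles close to the 1.0 rad reference well within the simulated 2 s window.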

### **2. Physical interaction of robotic systems**

A robotic system interacts physically with its workspace by first establishing contact with the environment in an appropriate configuration. The robot is then fed the force required to move its physical parts and perform the desired tasks. This section deals with interaction through grasping of objects in the workspace and with task-constrained motion control. Since the early days of robotics research, the interaction of robots with their environment has been studied by many researchers.

#### **2.1 Interaction with environments**

#### *Physical Interaction and Control of Robotic Systems Using Hardware-in-the-Loop Simulation DOI: http://dx.doi.org/10.5772/intechopen.85251*

Grasping of objects in the environment can be implemented using either a contact-level method or a knowledge-based method. The contact-level method applies force and torque to the objects handled by the robot; its variants include frictionless contacts, soft contacts, and frictional contacts. The reliability of a grasp can be estimated using the force-closure property, which guarantees that external disturbances can be compensated. Other possible contact forces form random potential grasps, which may not be optimal but span the vector space of all feasible contact forces.
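A simple planar instance of the force-closure property is the two-finger antipodal test: a grasp with two frictional contacts is force closure when the line connecting the contacts lies inside both friction cones. The sketch below is a hypothetical illustration of that test, not an algorithm from the chapter.

```python
import math

def antipodal_force_closure(p1, n1, p2, n2, mu):
    """Planar two-contact force-closure test (illustrative): the grasp
    is force closure if the line joining the contacts lies inside both
    friction cones.  p1, p2: contact points; n1, n2: inward unit
    normals; mu: friction coefficient."""
    alpha = math.atan(mu)                        # friction cone half-angle
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    ux, uy = dx / d, dy / d                      # unit vector along contact line
    # angle between the contact line and each inward normal
    a1 = math.acos(max(-1.0, min(1.0, ux * n1[0] + uy * n1[1])))
    a2 = math.acos(max(-1.0, min(1.0, -ux * n2[0] - uy * n2[1])))
    return a1 <= alpha and a2 <= alpha

# Two opposing contacts on a block: force closure with modest friction.
ok = antipodal_force_closure((0, 0), (1, 0), (1, 0), (-1, 0), mu=0.3)
```

Perfectly opposed normals pass the test for any positive friction coefficient, while contacts whose normals are perpendicular to the connecting line fail it.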

The limitation of the contact-based approach is that it does not consider the constraints imposed on the robot hand, which may result in target points the hand cannot reach. The knowledge-based method instead starts from predefined hand postures, providing a qualitative method for grasp planning that adapts to the workspace geometry. Control algorithms based on task-oriented grasping consider the requirements of the tasks in the robot's workspace. This method requires fewer contacts than conventional methods, so the number of fingers needed in the workspace can be minimized. For any given task, the task-oriented approach is reasonably more efficient and also increases the versatility of the robot across different tasks.

#### **2.2 Role of sensors for physical interaction**

Tactile sensors are among the best choices for physical interaction of robots with the environment. They are also used for estimating contact information and detecting slippage of objects during grasping. Force sensors, on the other hand, estimate the impact of the forces applied to objects and characterize the dexterity of the object under test. Vision sensors are among the best choices for observing the effect of the robot's physical interaction with the environment [7]; however, they also challenge the designer with a series of preprocessing stages during implementation. Model-based pose estimation techniques can simultaneously observe the position and orientation of the hand and of the objects under manipulation.

Robust estimation of environmental parameters during physical interaction of the robotic system can be obtained by implementing sensor fusion techniques. Sensor fusion combines the outputs of different sensors to produce a better observation of the environment. Accurate prediction of environmental parameters is possible only when the information acquired from multiple sensors is combined for decision making. Popular approaches such as the Kalman filter and the extended Kalman filter are good choices for sensor fusion, particularly in robotic applications.
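The core of Kalman-style fusion can be shown in one dimension: two noisy measurements of the same quantity are combined by inverse-variance weighting, which is exactly the Kalman measurement update for a static state. The sensor values and variances below are hypothetical.

```python
def fuse(z1, var1, z2, var2):
    """Fuse two scalar measurements of the same quantity by
    inverse-variance weighting (the Kalman measurement update for a
    static state).  Returns the fused estimate and its variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * z1 + w2 * z2) / (w1 + w2)   # weighted mean
    var = 1.0 / (w1 + w2)                 # fused variance (always smaller)
    return x, var

# A precise force sensor (variance 0.01) dominates a noisy tactile
# estimate (variance 1.0) of the same contact force.
x, var = fuse(2.0, 1.0, 2.5, 0.01)
```

Note that the fused variance is smaller than either individual variance, which is the formal sense in which fusing sensors improves the observation.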

#### **2.3 Planner for physical interaction**

The requirements of physical interaction tasks can be defined with the support of interaction planning algorithms. Physical interaction tasks can be specified automatically by feeding descriptions of the objects and of the tasks to be handled in the workspace to the planning algorithms. Generic task-based planning algorithms choose the ideal gripper, gripper pre-shape, and gripper adaptor, giving the robot's end-effector good versatility. The planning algorithm may also be environment-specific, based on the dexterity requirements of the objects to be grasped and the tasks performed in the environment.

A vast category of objects may limit the physical interaction tasks supported by the planner. Coupling the planner with vision-based sensors enables physical interaction with unknown objects in the environment and provides the robotic system with versatility and autonomy. By approximating the shapes of unknown objects, the planning algorithm generates suitable velocity and force references for the end-effector, which are used to drive the joint actuators.
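A minimal sketch of such a velocity reference is a proportional approach toward the grasp point with a saturated speed, leaving the joint-level controller to realize the commanded Cartesian velocity. The gain and speed limit here are hypothetical values, not from the chapter.

```python
import math

def velocity_reference(ee_pos, grasp_pos, gain=1.5, v_max=0.2):
    """Proportional Cartesian velocity reference toward a grasp point,
    saturated at v_max m/s (illustrative planner output)."""
    ex = grasp_pos[0] - ee_pos[0]
    ey = grasp_pos[1] - ee_pos[1]
    vx, vy = gain * ex, gain * ey          # proportional approach velocity
    speed = math.hypot(vx, vy)
    if speed > v_max:                      # saturate the commanded speed
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy

v = velocity_reference((0.0, 0.0), (0.4, 0.3))
```

Far from the object the reference moves at the speed cap along the approach direction; near the object it decays proportionally to the remaining error, so the end-effector slows down as it reaches the grasp point.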

#### **2.4 Application of force and feedback during interaction**

During physical interaction of the robot with the environment, even a small positioning error may cause the planner to generate unwanted interaction forces. Active and passive stiffness methods can be employed to regulate the forces applied at the end-effector. The active stiffness method controls the desired end-effector stiffness based on the requirements of the task, while the passive method lets the mechanical body of the gripper deform under the externally applied forces.
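Active stiffness control is often written, per Cartesian axis, as a spring-damper law: the commanded force is proportional to the position error, with a damping term on the velocity. The sketch below uses hypothetical gains to illustrate the idea.

```python
def stiffness_force(x_des, x, v, k=300.0, d=25.0):
    """Active stiffness (impedance-style) control for one Cartesian
    axis: F = k*(x_des - x) - d*v.  Gains k [N/m] and d [N*s/m] are
    illustrative assumptions."""
    return k * (x_des - x) - d * v

# A 5 mm position error with the end-effector at rest yields a
# restoring force of 300 * 0.005 = 1.5 N.
f = stiffness_force(0.105, 0.100, 0.0)
```

Raising k makes the end-effector behave as a stiffer spring (larger corrective forces for the same error), while the damping term d prevents oscillation when contact is made.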
