**4. Results**

#### **4.1 Simulation**

To confirm the effectiveness of the proposed method, we show some numerical examples. As a model of the hand/arm system, we use the 7-DOF arm of the HRP-3P and a newly developed four-fingered hand. This hand has a thumb with 5 joints and index, middle, and third fingers with 4 joints each. The distal joint of each finger is not directly actuated and moves along with the adjacent joint.


*Algorithm 1* Determination of Grasping Posture for *n* finger grasp

```
 1: loop
 2:   Sample Δp_p and Δp_s.
 3:   if Arm IK is solvable then break
 4: end loop
 5: loop
 6:   for i = 1 to n
 7:     for j = 1 to m
 8:       Sample Δp_i.
 9:       if Finger i IK is solvable then
10:         if Found finger i posture contacting object then break
11:       end if
12:     end for
13:     if Not found finger i posture then break
14:   end for
15:   if n − f(p_1, ··· , p_n) ≥ δ_f then break
16: end loop
```

Fig. 10. Grasping module on Choreonoid

The grasping module provides three commands: **Start** starts grasp planning, **Stop** aborts grasp planning, and **SetEnv** assigns obstructions to the selected items. The result of grasp planning is shown in a graphics window.
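Read as pseudocode, Algorithm 1 is a two-stage rejection-sampling loop: first sample palm offsets until the arm IK is solvable, then sample finger offsets until every finger contacts the object and the force-closure condition is met. A minimal Python sketch of this control flow follows; the IK solvers, the contact test, and the force-closure score *f* are hypothetical callables standing in for the hand/arm-specific routines, not the authors' implementation.

```python
import random

def plan_grasp(n_fingers, m_tries, delta_f,
               solve_arm_ik, solve_finger_ik, contacts_object, closure_score):
    """Two-stage sampling mirroring Algorithm 1.

    All four callables are placeholders for the hand/arm-specific routines
    described in the chapter (IK solvers, contact check, force-closure score f).
    """
    # Stage 1: sample palm position/orientation offsets until the arm IK is solvable.
    while True:
        dp_p, dp_s = random.random(), random.random()   # stand-ins for pose samples
        if solve_arm_ik(dp_p, dp_s):
            break
    # Stage 2: sample finger postures until all n fingers contact the object
    # and the termination condition n - f >= delta_f holds (line 15 of Algorithm 1).
    while True:
        postures = []
        for i in range(n_fingers):
            found = None
            for _ in range(m_tries):
                dp_i = random.random()                   # finger offset sample
                if solve_finger_ik(i, dp_i) and contacts_object(i, dp_i):
                    found = dp_i                          # line 10: posture found
                    break
            if found is None:
                break                                    # line 13: restart sampling
            postures.append(found)
        if len(postures) == n_fingers and \
           n_fingers - closure_score(postures) >= delta_f:
            return dp_p, dp_s, postures
```

With permissive stand-in callables, the function returns one palm sample and one posture per finger; in the real planner each callable encodes the kinematics and contact geometry.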

Fig. 11. Simulation environment

We prepared nine reference motions as shown in Fig. 4. An overview of the numerical examples is shown in Fig. 11. We used a 0.15 [kg] can placed on a table as the grasped object. For this object weight, the grasp planner selects either the three-fingered or the four-fingered fingertip grasp. For both grasping styles, only the distal link of each finger contacts the object.

As shown in Fig. 12, when grasping the can at breast height, the humanoid robot grasps it using the four-fingered fingertip grasp. On the other hand, as shown in Fig. 13, when grasping the can at waist height, the hand grasps the top of the can using the three-fingered fingertip grasp. When grasping the top of the can, it is often difficult to use the four-fingered fingertip grasp due to the size of the fingers.

On the other hand, we set the weight of the object to 0.35 [kg] for the case of Fig. 14. In this case, the humanoid robot grasps the object using the enveloping grasp while squatting down, since it is difficult to grasp the object by simply standing on the ground.
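The weight-dependent choice of grasping style described above can be sketched as a simple range test over per-style mass bounds, in the spirit of the thresholds *m*min,*i* and *m*max,*i* used by the planner. The numeric ranges below are illustrative assumptions, not the chapter's actual values.

```python
# Illustrative grasp-style selection by object mass. The per-style mass
# ranges [m_min, m_max] follow the chapter's thresholding idea, but the
# numeric values here are assumptions for illustration only.
GRASP_STYLES = {
    "3-fingered fingertip": (0.0, 0.2),
    "4-fingered fingertip": (0.0, 0.2),
    "4-fingered enveloping": (0.2, 1.0),
}

def candidate_styles(mass):
    """Return the grasp styles whose mass range admits the object mass [kg]."""
    return [style for style, (m_min, m_max) in GRASP_STYLES.items()
            if m_min <= mass <= m_max]
```

For a 0.15 [kg] can this admits both fingertip grasps, while 0.35 [kg] leaves only the enveloping grasp, matching the behavior reported for Figs. 12-14.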

We then performed an experiment. As the grasped object, we used a 200 [ml] can. We set *m*max,*i* and *m*min,*i* so that the hand grasps this can using the enveloping grasp. For the enveloping grasp, we determined the finger posture (the 10th line of Algorithm 1) as follows: we first specified that the distal (5th) link of the thumb and the 2nd and 4th links of the other fingers contact the object. Then, for each finger link supposed to make contact, we assigned a joint and adjusted its angle. If all the contacts are realized at the same time, the 10th line of Algorithm 1 returns **true**. This grasp planning takes less than 1 [s] on a Pentium M 2.0 [GHz] PC, and the force-closure calculation takes less than 1 [ms].
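The fast force-closure calculation mentioned above relies on the chapter's ellipsoidal approximation of the friction cone. As a much-simplified stand-in, the sketch below tests positive span of contact force directions in the 2-D force plane only (torques and the 3-D wrench space are ignored): a set of planar directions positively spans the plane exactly when, sorted by angle, no angular gap reaches π. This illustrates what a force-closure check decides, not how the chapter computes it.

```python
import math

def positively_spans_plane(vectors):
    """True if the nonzero 2-D vectors positively span R^2, i.e. every planar
    force is a nonnegative combination of them.

    Simplified 2-D stand-in for the wrench-space force-closure test; the
    chapter's actual check uses an ellipsoidal friction-cone model in 3-D.
    """
    angles = sorted(math.atan2(y, x) for x, y in vectors)
    gaps = [b - a for a, b in zip(angles, angles[1:])]
    gaps.append(angles[0] + 2 * math.pi - angles[-1])  # wrap-around gap
    # At least three directions are needed, and no half-plane may contain them all.
    return len(vectors) >= 3 and max(gaps) < math.pi
```

For example, three fingertip force directions spread around the object pass the test, while three directions all pointing into the same half-plane fail it.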

#### **4.2 Experiment**

Grasp Planning for a Humanoid Hand 75

Fig. 12. Four-fingered fingertip grasp when grasping a can at breast height

Fig. 13. Three-fingered fingertip grasp when grasping a can at waist height

Fig. 14. Four-fingered enveloping grasp with squatting down

Fig. 15(a) shows the image taken by the camera attached to the head of the HRP-3P humanoid robot. Three cameras are attached to the head. We used image processing software developed by VVV, AIST (Maruyama et al. (2009)). Given the geometrical model of the can, its position/orientation was calculated using the segmentation-based stereo method. Fig. 15(b) shows the result of extracting the segment. Although it takes less than one second to obtain the position/orientation of the can, we took the image several times to cope with failures of the image processing.

The image processing was performed on the same PC as the motion planner. This PC was connected to the cameras by an IEEE 1394 cable. The images taken by the cameras were processed on this PC, and the position/orientation data were transferred to the motion planner. After the motion planner planned the grasping motion, the joint angle data were transferred to the CPU boards controlling the motion of the robot. Two CPU boards are installed in the chest of the robot: one controls the multi-fingered hand and the other controls the rest of the robot. The robot is equipped with a wireless LAN. A directory of the motion planner PC was mounted on the CPU boards, and the robot was controlled using the joint angle data stored in this directory. As shown in Fig. 16, HRP-3P successfully grasps objects.
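The planner-to-controller hand-off above is file-based: a planner-PC directory is mounted on the robot's CPU boards, and the controller reads joint angle data from it. A minimal sketch of the writing side is below; the file name and text format are made up for illustration, since the chapter does not specify the actual HRP-3P controller interface.

```python
import os

def write_joint_angles(directory, trajectory):
    """Write one line of whitespace-separated joint angles [rad] per time step.

    `directory` stands in for the planner-PC directory mounted on the CPU
    boards; the file name and format are illustrative assumptions only.
    """
    path = os.path.join(directory, "joint_angles.dat")
    with open(path, "w") as f:
        for angles in trajectory:
            f.write(" ".join("%.6f" % a for a in angles) + "\n")
    return path
```

A plain-text, one-row-per-step layout keeps the shared-directory protocol trivially readable on the controller side, at the cost of precision and file size compared with a binary format.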

The robot takes a PET bottle out of a fridge as shown in Fig. 17. First, the robot opens the fridge door using a predefined motion. The vision sensor measures the object position, and the planner cannot find a feasible posture using the arm/hand kinematics alone. The planner therefore searches for a feasible posture by changing the shoulder height, and finds one. The robot squats down, reaches the object, and grasps it successfully.
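The whole-body fallback described above can be sketched as a one-dimensional search over candidate shoulder heights, taken from standing height downwards; `ik_feasible` is a placeholder for the planner's arm/hand kinematic feasibility check, not the actual HRP-3P routine.

```python
def find_feasible_posture(heights, ik_feasible):
    """Scan candidate shoulder heights [m] from standing downwards and return
    the first height at which the arm/hand IK becomes feasible, else None.

    `ik_feasible` stands in for the planner's kinematic feasibility check.
    """
    for h in heights:
        if ik_feasible(h):
            return h
    return None
```

For instance, if the arm can only reach the bottle once the shoulder drops to 0.65 [m] or below, scanning `[0.8, 0.7, 0.6]` returns the squatting height 0.6.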


Fig. 15. Image taken by the stereo vision system

Fig. 16. Experimental result of grasping objects

Fig. 17. Experimental result of taking out a pet bottle from a fridge

**5. Conclusions and future works**

In this chapter, we presented grasp planning algorithms for a humanoid hand/arm system. By using convex models of both the hand and the object, the nominal palm position/orientation can be calculated easily. In addition, by using the ellipsoidal approximation of the friction cone, grasp planning can be finished within a short period of time.

We improved the performance of the planner in the following ways. First, depending on which part of the object is to be grasped, the planner changes the grasping style: we numerically confirmed that, when grasping the side of the can, the robot uses the four-fingered fingertip grasp, whereas when grasping the top of the can, it uses the three-fingered fingertip grasp. Second, if it is difficult to find a grasping posture using the arm/hand kinematics alone, the planner tries to use whole-body motion. We confirmed the effectiveness of our planner, modularized as a Choreonoid plugin, through experiments.

A planner for the humanoid hand is necessary for future applications. Grasp planning should be improved with respect to dexterity, and a planner for multi-arm systems and object handling has to be developed.

**6. References**

K. Akachi, K. Kaneko, N. Kanehira, S. Ota, G. Miyamori, M. Hirata, S. Kajita, and F. Kanehiro, (2005). Development of Humanoid Robot HRP-3P, *IEEE-RAS/RSJ Int. Conference on Humanoid Robots*.

Ch. Borst, M. Fischer, and G. Hirzinger, (1999). A fast and robust grasp planner for arbitrary 3D objects, *Proc. of the IEEE International Conf. on Robotics and Automation*, vol. 3, pp. 1890-1896.

C. Borst, M. Fischer, and G. Hirzinger, (2003). Grasping the Dice by Dicing the Grasp, *IEEE/RSJ Int. Conf. on Intelligent Robots and Systems*.

J.A. Coelho Jr. and R.A. Grupen, (1996). Online Grasp Synthesis, *IEEE Int. Conf. on Robotics and Automation*.

M.R. Cutkosky, (1989). On Grasp Choice, Grasp Models, and the Design of Hands for Manufacturing Tasks, *IEEE Trans. on Robotics and Automation*, vol. 5, no. 3.

S. Ekvall and D. Kragic, (2007). Learning and Evaluation of the Approach Vector for Automatic Grasp Generation and Planning, *IEEE Int. Conf. on Robotics and Automation*.

C. Ferrari and J. Canny, (1992). Planning optimal grasps, *IEEE Int. Conf. on Robotics and Automation*, pp. 2290-2295.

L. Han, J.C. Trinkle, and Z.X. Li, (2000). Grasp Analysis as Linear Matrix Inequality Problems, *IEEE Trans. on Robotics and Automation*, vol. 16, no. 6, pp. 663-674.

K. Harada, K. Kaneko, and F. Kanehiro, (2008). Fast Grasp Planning for Hand/Arm Systems based on Convex Models, *Proceedings of IEEE Int. Conf. on Robotics and Automation*.

K. Kaneko, K. Harada, and F. Kanehiro, (2007). Development of Multi-fingered Hand for Life-size Humanoid Robots, *IEEE Int. Conf. on Robotics and Automation*, pp. 913-920.

S.B. Kang and K. Ikeuchi, (1995). Toward Automatic Robot Instruction from Perception: Temporal Segmentation of Tasks from Human Hand Motion, *IEEE Trans. on Robotics and Automation*, vol. 11, no. 5.

I.A. Kapandji, (1985). Physiologie Articulaire, *Maloine S.A. Editeur*.

J. Kerr and B. Roth, (1988). Analysis of multifingered hands, *Int. J. Robot. Res.*, vol. 4, no. 4, pp. 3-17.

G.F. Liu and Z.X. Li, (2004). Real-time Grasping Force Optimization for Multifingered Manipulation: Theory and Experiments, *IEEE/ASME Transactions on Mechatronics*, vol. 9, no. 1, pp. 65-77.

G.F. Liu, J.J. Xu, and Z.X. Li, (2004). On Geometric Algorithms for Real-time Grasping Force Optimization, *IEEE Trans. on Control Systems Technology*, vol. 12, no. 6, pp. 843-859.

Y.H. Liu, (1993). Qualitative test and force optimization of 3-D frictional form-closure grasps using linear programming, *IEEE Trans. Robot. Automat.*, vol. 15, no. 1, pp. 163-173.

K. Maruyama, Y. Kawai, and F. Tomita, (2009). Model-based 3D Object Localization Using Occluding Contours, *Proc. 9th ACCV*, MP3-20.

A.T. Miller and P.K. Allen, (2000). GraspIt!: A Versatile Simulator for Grasp Analysis, *ASME Int. Mechanical Engineering Congress & Exposition*.

A.T. Miller, S. Knoop, H.I. Christensen, and P.K. Allen, (2003). Automatic Grasp Planning using Shape Primitives, *IEEE Int. Conf. on Robotics and Automation*.

B. Mishra, J.T. Schwartz, and M. Sharir, (1987). On the existence and synthesis of multifinger positive grips, *Algorithmica (Special Issue: Robotics)*, vol. 2, no. 4, pp. 541-558.

A. Morales, P.J. Sanz, A.P. del Pobil, and A.H. Fagg, (2006). Vision-based Three-finger Grasp Synthesis Constrained by Hand Geometry, *Robotics and Autonomous Systems*, no. 54.

A. Morales, T. Asfour, and P. Azad, (2006). Integrated Grasp Planning and Visual Object Localization for a Humanoid Robot with Five-Fingered Hands, *IEEE/RSJ Int. Conf. on Intelligent Robots and Systems*.

Y. Nakamura, K. Nagai, and T. Yoshikawa, (1987). Dynamics and stability in coordination of multiple robotic mechanisms, *Int. J. Robot. Res.*, vol. 8, no. 2, pp. 44-61.

S. Nakaoka, A. Nakazawa, K. Yokoi, H. Hirukawa, and K. Ikeuchi, (2003). Generating Whole Body Motions for a Biped Humanoid Robot from Captured Human Dances, *IEEE Int. Conf. on Robotics and Automation*.

V. Nguyen, (1988). Constructing force closure grasps, *Int. J. Robot. Res.*, vol. 7, no. 3, pp. 3-16.

N. Niparnan and A. Sudsang, (2004). Fast Computation of 4-Fingered Force-Closure Grasps from Surface Points, *IEEE/RSJ Int. Conf. on Intelligent Robots and Systems*.

N. Niparnan and A. Sudsang, (2007). Positive Span of Force and Torque Components of Three-Fingered Three-Dimensional Force-Closure Grasps, *Proc. of the IEEE International Conf. on Robotics and Automation*, pp. 4701-4706.

M.S. Ohwovoriole, (1980). *An extension of screw theory and its application to the automation of industrial assemblies*, Ph.D. dissertation, Department of Mechanical Engineering, Stanford University.

Y.C. Park and G.P. Starr, (1992). Grasp synthesis of polygonal objects using a three-fingered robot hand, *Int. J. Robot. Res.*, vol. 11, no. 3, pp. 163-184.

R. Pelossof, A. Miller, P. Allen, and T. Jebara, (2004). An SVM Learning Approach to Robotic Grasping, *IEEE Int. Conf. on Robotics and Automation*.

N.S. Pollard, (2004). Closure and Quality Equivalence for Efficient Synthesis of Grasps from Examples, *Int. J. of Robotics Research*, vol. 23, no. 6, pp. 595-613.

J. Ponce and B. Faverjon, (1995). On computing three-finger force-closure grasps of polygonal objects, *IEEE Trans. Robot. Automat.*, vol. 11, no. 6, pp. 868-881.

J. Ponce, S. Sullivan, J.-D. Boissonnat, and J.-P. Merlet, (1993). On Characterizing and Computing Three- and Four-fingered Force Closure Grasps of Polygonal Objects, *IEEE Int. Conf. on Robotics and Automation*.

OpenRTM: http://www.is.aist.go.jp/rt/OpenRTM-aist/


**5**

**Design of 5 D.O.F Robot Hand with an Artificial Skin for an Android Robot**

Dongwoon Choi, Dong-Wook Lee, Woonghee Shon and Ho-Gil Lee

*Department of Applied Robot Technology, Korea Institute of Industrial Technology, Republic of Korea*

**1. Introduction**

There has been much research on robot hands in robotics, and they are considered one of the most complicated areas. Research on robotic hands is difficult for many reasons, stemming from the complicated structure and functions of the human hand. There are many types of robotic hands, but they can be classified into two major categories: hands for industrial operation and experimental hands that imitate the human hand. Most robotic hands in the industrial area are 1-D.O.F or 2-D.O.F grippers designed for precise, repetitive operations. For human-like robotic hands, the main concerns are how closely the shape of the hand resembles a human hand and how the hand can operate like a human hand. Most human-like robotic hands have 3 to 5 fingers, and their shape, size, and functions are designed based on the human hand. For a long time, the major area of robotic-hand research has been industrial, but the importance of human-like robotic hands is increasing more and more, because the need for robots is shifting from industrial fields to human-friendly environments such as homes, offices, hospitals, and schools. In brief, the mainstream of robotics will change from industrial robots to service robots. One of the important factors for service robots in human-friendly environments is their appearance. In general, most people feel friendly and comfortable with an appearance similar to their own, so the appearance of service robots should resemble humans, and their hands should imitate human hands, too. For this reason, there has been much research on human-like robotic hands.

Haruhisa Kawasaki developed the Gifu Hand 2, which has 5 independent fingers. It has 16 D.O.F and 20 joints, making it one of the most complicated hands. It can operate every joint of each finger, and with the attached tactile sensors, delicate grips can be performed. However, it is too big to install on a human-sized robot (Haruhisa Kawasaki et al., 2002). F. Lotti used spring joints and tendons to make the UBH 3. It has 5 fingers and human-like skin, and like the Gifu hand, each finger has independent joints. The characteristic of this hand is the use of springs in its joints, which makes its structure simple, but it uses too many motors, and they are located elsewhere in the body, so this hand is not well suited to a humanoid robot (F. Lotti et al., 2005). Kenji Kaneko developed a human-sized multi-fingered hand. It has 13 D.O.F, complicated fingers, and all devices located in the hand, but it has 4 fingers and the back of the hand is too big, like a glove. For this reason, the shape of this hand is a little different from the human hand, so

