**Effective Haptic Rendering Method for Complex Interactions**

**1. Introduction**



The development of haptic technology is allowing the introduction of Virtual Reality systems as teaching and working tools into many fields such as engineering (Howard & Vance, 2007; Savall et al., 2002) or surgery (Basdogan et al., 2004; Li & Liu, 2006).

Haptic devices allow users to interact with a certain environment, either remote or virtual, through the sense of touch, considerably enhancing interactivity. A haptic device is a mechanism that allows users to control the movements of a virtual tool or a real robot and receive tactile and kinesthetic information from the working environment (Fig. 1).
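This interaction cycle can be sketched schematically. The `device` and `scene` objects and the `respond` callback below are hypothetical stand-ins used only for illustration, not a real haptics API:

```python
# Schematic haptic servo step (illustrative sketch; `device`, `scene` and
# `respond` are hypothetical stand-ins, not a real haptics API).
def haptic_step(device, scene, respond):
    pose = device.read_pose()                  # current tool position/orientation
    contacts = scene.detect_collisions(pose)   # test the tool against the environment
    if contacts:
        force, torque = respond(contacts)      # contact force and torque
    else:
        force, torque = (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
    device.apply(force, torque)                # feedback felt by the user
    return force, torque
```

In practice a step like this must complete roughly a thousand times per second, which is why the cost of the collision detection and response stages dominates the design of a haptic rendering method.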

The usability of these systems is conditioned by the quality of the haptic feedback applied to the user. Technologically, the computation of appropriate and realistic haptic stimuli continues to be a complicated issue. The human sensory-motor system demands a fast update rate (at least 1 kHz) for the haptic stimuli applied to the user in order to avoid instabilities in the system and to present rigid objects with reasonable stiffness (Shimoga, 1992). However, this update rate is often difficult to reach by haptic rendering methods, especially when working in complex environments. One possible solution is to reduce the computational cost of calculating the haptic response by decreasing the accuracy of the method; however, this can introduce discontinuities in the response. A trade-off must therefore be found between the accuracy of the method, which guarantees a smooth and stable haptic response, and its computational cost.

Fig. 1. Haptic interaction with a virtual environment

This chapter describes a haptic rendering method that properly computes interaction forces and torques in complex environments, ensuring improved feedback by seeking a compromise between continuity and computational cost. In addition, the proposed method pays particular attention to providing users with a comfortable interaction. The method is valid for applications in which the virtual environment is composed of rigid, static objects; deformable objects are excluded.

The remainder of this chapter is organized as follows: Section 2 presents an overview of related research in the area. Section 3 then describes the haptic rendering method proposed by the authors, detailing all the algorithms needed to render appropriate and stable forces and torques to the user. The proposed method is evaluated in Section 4 within two virtual scenarios that simulate collisions common in aeronautic maintainability tasks; aeronautic virtual mock-ups were selected for testing because of their high interaction complexity. Finally, conclusions and future directions are drawn in Section 5.

rendering, that provides transparent manipulation of rigid models with a high polygon count (Otaduy & Lin, 2006).

Unlike penalty methods, constraint-based methods do not use the interpenetration between rigid objects to calculate the collision response. These methods use virtual coupling techniques (Colgate et al., 1995) and restrict the movement of virtual objects to the surface of obstacles. Zilles and Salisbury (1995) proposed a constraint-based method for 3-DOF haptic rendering of generic polygonal objects. They introduced the "god-object", an idealized representation of the position of the haptic device that is constrained to the surface of obstacles. At each time step, the location of the god-object minimizes the distance to the haptic device, and the difference between the two positions provides the force direction. Ruspini et al. (1997) extended this approach by replacing the god-object with a small sphere and by proposing methods to smooth the object surface and add friction. Later, Ortega (2006) extended the 3-DOF constraint-based method of Zilles and Salisbury by employing a 6-DOF god-object.

The haptic rendering methods described above have contributed extensively to a better representation of contact events between virtual objects. However, haptic interactions with multiple contacts, which also include geometrical discontinuities, have not yet been adequately handled, and unrealistic or unstable haptic feedback is computed in these cases. Such situations are very common in real scenarios, so a stable haptic response must be computed properly in order to improve the usability of these systems. The proposed haptic rendering method overcomes the limitations of previous approaches in this type of collision.

**3. Proposed haptic rendering method**

The haptic rendering method outlined in this chapter computes the force and torque that result when a collision occurs between two types of objects: a virtual tool (mobile object) manipulated by the user of the haptic device and any object in the simulation (static object). Three main modules can be identified in the proposed haptic rendering method (Fig. 2): collision detection, collision response and the control module.

The complete haptic rendering sequence can be described as follows: first, the control module acquires the position (*Uh*) and orientation (*Rh*) of the haptic device and sends them to the collision detection module. With this information, the module checks for collisions between the mobile object and the static environment. If no collision occurs, it waits for new information from the control module. Otherwise, when a collision event occurs, the contact information of both the static and mobile objects (**C***s*, **C***m*) is sent to the collision response module, which calculates the interaction force and torque. This haptic feedback approximates the contact force and torque that would arise during contact between real objects (*Fr*, *Tr*). Finally, the collision response module sends this information to the control module, which applies it to the haptic device (*Fh*, *Th*), maintaining stable system behaviour.

A more complete description of each module can be found in the following sections.

**3.1 Collision detection**

The collision detection method presented in this chapter can handle non-convex objects without modifying the original mesh. A technique based on a spatial partition (voxels) has been chosen. Hierarchical methods like octrees have also been tested since they require less
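The voxel idea can be illustrated with a minimal sketch. The class below is a hypothetical illustration, not the authors' implementation: samples of the static geometry are hashed into uniform grid cells, and a query point is flagged as a potential contact when its cell is occupied.

```python
# Illustrative sketch of voxel-based collision queries (hypothetical code,
# not the authors' implementation). Static surface points are hashed into a
# uniform grid; a query point is a potential contact if its cell is occupied.
from collections import defaultdict

class VoxelGrid:
    def __init__(self, cell_size):
        self.cell = cell_size
        self.occupied = defaultdict(list)  # cell index -> stored surface points

    def _index(self, p):
        # Integer cell coordinates of point p = (x, y, z)
        return tuple(int(c // self.cell) for c in p)

    def insert(self, p):
        self.occupied[self._index(p)].append(p)

    def query(self, p):
        # Surface points stored in the cell containing p (empty list if none)
        return self.occupied.get(self._index(p), [])

grid = VoxelGrid(cell_size=0.1)
grid.insert((0.05, 0.02, 0.0))              # sample of the static geometry
print(bool(grid.query((0.07, 0.01, 0.0))))  # → True  (same cell: potential contact)
print(bool(grid.query((0.95, 0.0, 0.0))))   # → False (empty cell: no contact)
```

A uniform grid gives constant-time cell lookup at the price of memory, which is the trade-off against hierarchical structures such as octrees mentioned above.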

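Returning to the constraint-based rendering discussed earlier: for the simplest possible surface, a single plane, the god-object computation reduces to a projection. The sketch below is a hypothetical one-plane illustration (the stiffness value is arbitrary), not Zilles and Salisbury's mesh-based algorithm.

```python
# Hypothetical god-object sketch for a single half-space obstacle z <= 0
# (not the published algorithm, which handles general polygonal meshes).
def god_object_force(device_pos, stiffness=500.0):
    x, y, z = device_pos
    # God-object: the point closest to the device that stays on or outside
    # the obstacle surface
    god = (x, y, max(z, 0.0))
    # Spring force along the device-to-god-object offset gives the direction
    force = tuple(stiffness * (g - d) for g, d in zip(god, device_pos))
    return god, force

# Device tip penetrates 2 mm below the surface
god, force = god_object_force((0.1, 0.0, -0.002))
print(god)    # god-object stays on the surface: (0.1, 0.0, 0.0)
print(force)  # restoring force pushes the user back out along +z
```

In free space the god-object coincides with the device position and the force vanishes, which is what makes the scheme avoid the pop-through artifacts of pure penalty responses.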