**2.2. Composition and operation of the simulator**

272 Electrodiagnosis in New Frontiers of Clinical Research

index, and middle fingers. *Gx*(*s*) is the transfer function that represents the dynamics of the human neuromuscular control system (Akazawa *et al*., 1987; Okuno *et al*., 1999), and is given by:

$$G_x(s) = \frac{K}{(\tau_1 s + 1)(\tau_2 s + 1)} \tag{2}$$

with the time-varying gain *K*(*t*) expressed as:

$$K(t) = K_0 + a\left[A_f(t) + A_e(t)\right] \tag{3}$$

where the time constants were calculated to be *τ*1 = 0.12 s and *τ*2 = 0.25 s, and the gain *K* corresponds to the stiffness of the prosthesis fingers, which is not constant but time-varying in proportion to the contraction level of the extensor-flexor muscle pair. The user can regulate the stiffness of the hand fingers by varying the level of contraction of each of those muscles. The stiffness at the resting state *K*0 is 0.1 Nm/rad, and the coefficient *a* is 0.98 rad⁻¹. A software program implementing this model was loaded into the microprocessor that controls the end effector.

The position control system (see Figure 1) consists of a DC motor (MINIMOTOR SA, Croglio, Switzerland, type 2233), its servo controller (Figure 1(c2)), and a one degree-of-freedom end effector with three fingers (Figure 1(c3)). The index and middle fingers are bound together and endowed with an open-close movement with respect to the thumb. This movement is produced by the DC motor, whose servo controller works to nullify the difference between the commanded angle Θ̃*H* and the actual motor rotational angle Θ̂*H*, measured by an optical encoder.

**Figure 1.** Block diagram of the *Osaka Hand*. The model of the human neuromuscular control system dynamics (labeled as c1) takes the processed EMG signals *Ae* and *Af* from the subject's forearm and calculates the target angle Θ̃*H*; the servo controller (c2) works to nullify the difference between Θ̃*H* and the actual motor rotational angle Θ̂*H*. The end effector (c3) has one opening-closing degree-of-freedom Θ*H*.
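As a rough illustration of how Eqs. (2) and (3) behave together, the sketch below integrates the second-order ODE equivalent of *Gx*(*s*), τ1τ2·y″ + (τ1 + τ2)·y′ + y = *K*(*t*)·u, with a classic fourth-order Runge-Kutta scheme. This is only a sketch: the constant input `u` and the EMG levels `ae`, `af` are hypothetical values, and plain RK4 stands in for whatever integration the embedded software actually uses; only the constants come from the text.

```python
# Constants from the text (Eqs. (2) and (3))
TAU1, TAU2 = 0.12, 0.25   # time constants [s]
K0 = 0.1                  # stiffness at resting state [Nm/rad]
A = 0.98                  # stiffness coefficient [1/rad]

def stiffness(ae, af):
    """Eq. (3): time-varying gain from the processed EMG levels."""
    return K0 + A * (af + ae)

def deriv(y, v, u, k):
    """Second-order ODE equivalent of Eq. (2):
    tau1*tau2*y'' + (tau1 + tau2)*y' + y = K*u."""
    return v, (k * u - y - (TAU1 + TAU2) * v) / (TAU1 * TAU2)

def simulate(ae, af, u, dt=0.001, t_end=3.0):
    """Integrate the model with classic RK4 for a constant
    (hypothetical) command u and constant EMG levels."""
    y = v = 0.0
    k = stiffness(ae, af)
    for _ in range(int(t_end / dt)):
        k1y, k1v = deriv(y, v, u, k)
        k2y, k2v = deriv(y + 0.5 * dt * k1y, v + 0.5 * dt * k1v, u, k)
        k3y, k3v = deriv(y + 0.5 * dt * k2y, v + 0.5 * dt * k2v, u, k)
        k4y, k4v = deriv(y + dt * k3y, v + dt * k3v, u, k)
        y += dt / 6.0 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return y

# A unit step settles at the DC gain K = K0 + a*(Af + Ae)
theta = simulate(ae=0.2, af=0.3, u=1.0)
```

Because the denominator of Eq. (2) equals 1 at *s* = 0, the steady-state output of a unit step is simply the gain *K*; with the example levels above, *K* = 0.1 + 0.98 × 0.5 = 0.59.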

Figure 2 shows the components of the simulator system, which can be divided into three main sub-systems: data acquisition (EMG and video), processing, and display. Ten light-emitting diode (LED) markers and two pairs of surface electrodes are attached to the subject's upper limb as shown in Figures 2(a) and 2(b). These LEDs and electrodes provide the inputs for the processing sub-system, which is implemented in the graphic workstation (Figure 2(c)).

**Figure 2.** Overview of the simulator components. The graphic workstation (GW) receives the 3D locations of the LED markers (a) attached to the subject's arm and the processed EMG signals from two surface electrodes placed on the subject's forearm (b). From these data, the GW calculates and displays (c and d) the finger angle and the arm posture. Inset: detail of the LED marker attachment.
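The EMG conditioning chain detailed in the data acquisition subsection (full-wave rectification, second-order low-pass at 2.7 Hz, sampling at 25 Hz) can be sketched in software. In the snippet below, a synthetic 100 Hz tone stands in for raw EMG; the raw rate of 1 kHz and the filter topology (two cascaded first-order stages approximating a second-order low-pass) are assumptions, since the chapter specifies only the cut-off frequency.

```python
import math

FS = 1000.0    # raw-signal rate [Hz] (assumed for the sketch)
FC = 2.7       # low-pass cut-off from the text [Hz]
FS_OUT = 25    # sampling rate of the acquisition unit [Hz]

def smooth_rectify(raw, fs=FS, fc=FC):
    """Full-wave rectification followed by a second-order low-pass,
    built here as two cascaded first-order stages (topology assumed)."""
    rc = 1.0 / (2.0 * math.pi * fc)
    alpha = (1.0 / fs) / (rc + 1.0 / fs)
    s1 = s2 = 0.0
    out = []
    for x in raw:
        x = abs(x)                 # full-wave rectification
        s1 += alpha * (x - s1)     # first pole
        s2 += alpha * (s1 - s2)    # second pole
        out.append(s2)
    return out

# 2 s of a 100 Hz unit-amplitude tone standing in for raw EMG;
# the smoothed envelope of |sin| settles near its mean, 2/pi
raw = [math.sin(2 * math.pi * 100 * n / FS) for n in range(int(2 * FS))]
env = smooth_rectify(raw)
samples_25hz = env[::int(FS / FS_OUT)]   # decimation to 25 Hz
```

The decimation step keeps one filtered sample every 40, matching the 25 Hz rate at which the acquisition unit hands envelopes to the workstation.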

Figure 3 shows the block diagram of the simulator, illustrating how the processing system (Figure 3(c)) determines the position of the upper limb from the three-dimensional (3D) locations of the markers on the shoulder, elbow, and wrist, detected with an OPTOTRAK™ 3D camera (NORTHERN DIGITAL Inc., Ontario, Canada) (Figures 2(a) and 3(a)). The processing system determines the desired finger angle from the processed surface EMGs of both wrist flexor and extensor muscles of the subject (Figures 2(b) and 3(b)). The virtual upper limb and hand are ultimately presented on the 3D graphic workstation (display system, Figure 3(d)).
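The chapter does not spell out how the arm posture is computed from the marker locations; the standard approach is to take the angle between the two segment vectors meeting at a joint, which is also how the finger angle is defined later from markers M1, M3, and M4. A minimal sketch, with invented marker coordinates:

```python
import math

def angle_between(p_apex, p_a, p_b):
    """Angle (degrees) at p_apex formed by the vectors
    p_apex->p_a and p_apex->p_b (all points are 3D tuples)."""
    va = [a - c for a, c in zip(p_a, p_apex)]
    vb = [b - c for b, c in zip(p_b, p_apex)]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    # Clamp to guard acos against floating-point overshoot
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

# Hypothetical shoulder/elbow/wrist marker positions (metres)
shoulder, elbow, wrist = (0.0, 0.3, 0.0), (0.0, 0.0, 0.0), (0.25, 0.0, 0.0)
elbow_angle = angle_between(elbow, shoulder, wrist)
```

For the example posture above, the upper arm and forearm vectors are orthogonal, so the computed elbow angle is 90 degrees.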

**Figure 3.** Simulator block diagram. With the 3D marker positions obtained from the Optotrak 3D camera (block a), the processing system (c) calculates the subject's arm posture and wrist rotational angle. From the processed EMG signals *Ae* and *Af* (b), the model of the human neuromuscular control system dynamics (c1) calculates a first approximation of the desired angle Θ̃*HS*. The dynamics of the servo control system (c2) can be regarded as the identity, Θ̂*HS* = Θ̃*HS*. Nonlinear characteristics of the relationship between Θ̂*HS* and Θ*HS* are inserted in block c3. The virtual arm and prosthetic hand are displayed in the display system (d).

#### *2.2.1. Data acquisition system*

The OPTOTRAK 3D camera (Figure 2(a) and Figure 3(a)) detects the positions of the LED markers attached to the user's shoulder, elbow, and wrist. The marker on the shoulder is attached at the point where the movement of the *acromion* of the scapula is smallest during arm motion. The marker on the elbow is fixed on the lateral epicondyle of the humerus. The arm posture is calculated from those LED locations.

To measure the rotational angle of the wrist during pronation of the arm, eight LEDs are placed on the external side of a bracelet-like device attached to the wrist (shown in the inset of Figure 2).

The EMG signals of the wrist muscles are picked up with surface electrodes (Figure 2(b) and Figure 3(b)). These signals are then amplified (gain 58.8 dB, CMRR 110 dB) to the range ±5 V, full-wave rectified, smoothed with a second-order low-pass filter (cut-off frequency 2.7 Hz), and sampled at a frequency of 25 Hz with 12 bits per sample (resolution of ±2.4 mV, less than 0.01% of the maximum value) by an OPTOTRAK Data Acquisition Unit (NORTHERN DIGITAL Inc., Ontario, Canada).

#### *2.2.2. Processing system*

The locations of the LEDs and the processed EMG signals are collected by a graphics workstation (GW, SILICON GRAPHICS, Inc., California, USA) that holds the processing system software (Figure 2(c) and Figure 3(c)).

Simulator of a Myoelectrically Controlled Prosthetic Hand with Graphical Display of Upper Limb and Hand Posture
http://dx.doi.org/10.5772/55503

The angle that the user wants to achieve with the prosthesis fingers (the target angle) is given by Eqs. (2) and (3) using the current values of the user's EMG signals *Ae* and *Af*. In the real *Osaka Hand*, those equations are solved by a Z-transform that gives, in discrete time, the solution originally expressed in the frequency domain (see Figure 3(c1)). In the processing system of the simulator, the sampling frequency is not high enough to allow using that transform; therefore, we used the Runge-Kutta-Gill approximation method for differential equations to implement the transfer function *Gx*(*s*) (Eqs. (2) and (3)).

The dynamics of the DC motor servo system of the actual prosthetic hand were calculated in terms of the relationship between the target angle Θ̃*H* and the rotational angle Θ̂*H* of the motor shaft (see Figure 1). In the steady case, we assumed Θ̂*H* = Θ̃*H*, with zero time delay (Figure 3(c2)).

In order to model the relationship between Θ̂*H* and the final finger angle Θ*H* of the real *Osaka Hand* (see Figure 1), we performed the following measurements, attaching two LEDs to the prosthesis chassis and one to each fingertip as shown in Figure 4. The hand finger angle Θ*H* was defined as the angle formed between the vectors M1M3→ and M1M4→, *i.e.*, the angle of the fingertips with respect to the chassis. The operation range of this angle is from 0° to 110°.

**Figure 4.** Finger angle Θ*H* is defined as the angle formed between the vectors M1M3→ and M1M4→. M1 to M4 are LED markers attached to the prosthetic hand.

The relationship between Θ̂*H* and the final finger angle Θ*H* (see Figure 1) was modeled with a piecewise approximation calculated by a least-squares method. The error was always below 8%, with an average of 1.7% (standard deviation 1.36). We roughly divided the operation range into three areas, as shown in Figure 3(c3).
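The piecewise least-squares approximation between motor angle and finger angle described above can be sketched generically: fit one ordinary least-squares line per region and evaluate by region membership. The chapter gives neither the region boundaries nor the fitted coefficients, so the breakpoints at 30° and 80° and the synthetic data below are hypothetical, chosen only to exercise the recipe over the 0° to 110° operation range.

```python
def linfit(xs, ys):
    """Ordinary least-squares line y = m*x + b for one region."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def piecewise_fit(xs, ys, breaks):
    """Fit one line per region; `breaks` holds the two interior
    boundaries splitting the range into three areas (values assumed)."""
    fits = []
    edges = [min(xs)] + list(breaks) + [max(xs) + 1e-9]
    for lo, hi in zip(edges, edges[1:]):
        seg = [(x, y) for x, y in zip(xs, ys) if lo <= x < hi]
        m, b = linfit([p[0] for p in seg], [p[1] for p in seg])
        fits.append((lo, hi, m, b))
    return fits

def evaluate(fits, x):
    """Piecewise-linear prediction of the finger angle."""
    for lo, hi, m, b in fits:
        if lo <= x < hi:
            return m * x + b
    return fits[-1][2] * x + fits[-1][3]

# Synthetic, exactly piecewise-linear calibration data over 0-110 deg
xs = list(range(0, 111))
ys = [0.5 * x if x < 30 else (x - 15 if x < 80 else 0.8 * x + 1)
      for x in xs]
fits = piecewise_fit(xs, ys, breaks=(30, 80))
```

On exactly piecewise-linear data, each regional fit recovers its segment, so `evaluate` reproduces the underlying mapping; on real calibration data, the residuals of each regional fit would give the approximation error quoted in the text.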
