**2.4 Artificial vision system**

The camera used is the 1.3 MP Blackfly, which has the following characteristics: a resolution of 1288 × 964 px, a CCD sensor, and a pixel size of 3.75 μm. This type of camera is manufactured for artificial vision applications. It has robust features and captures precise images that contribute to the image processing stage.

**3. Results**

**3.1 Kinematics development**

To set up the kinematics of the robot in LinuxCNC, it is necessary to define both the forward kinematics and the inverse kinematics; therefore, both must be described mathematically. This is done by linking the input parameters with the output parameters with the help of geometry. The geometrical parameters (GP) of the MR-2000 are shown in **Table 2**.

## *3.1.1 Forward kinematics*

As described earlier in Section 2.3, the forward kinematics allows us to find the coordinates [X*T*, Y*T*, Z*T*, A, B] from the angle of each joint, taking into account the values of the parameters [*θ*1, *θ*2, *θ*3, *θ*4, *θ*5] (see **Figures 3** and **4**).

Knowing this, the coordinates are calculated as follows:

$$b = -a\_1 \sin\left(\theta\_3\right) + L\_2 \cos\left(\theta\_3\right) - a\_2 \sin\left(A\right) + L\_3 \cos\left(A\right) + L\_1 \cos\left(\theta\_2\right) + a\_0 \tag{1}$$

$$X\_T = b \ast \cos\left(\theta\_1\right) \tag{2}$$

$$Y\_T = b \ast \sin\left(\theta\_1\right) \tag{3}$$

$$Z\_T = L\_1 \sin\left(\theta\_2\right) + a\_1 \cos\left(\theta\_3\right) + L\_2 \sin\left(\theta\_3\right) + a\_2 \cos\left(A\right) + L\_3 \sin\left(A\right) + d\_0 \tag{4}$$

$$A = \theta\_3 + \theta\_4 \tag{5}$$


*Implementation of an Artificial Vision System for Welding in the Retrofitting Process… DOI: http://dx.doi.org/10.5772/intechopen.88360*

$$B = \theta\_5 \tag{6}$$

*XT* and *YT* are shown in **Figure 3**, and *ZT*, A, and B in **Figure 4**. Based on the above, the LinuxCNC system can find the final position of the robot in the coordinate system [X*T*, Y*T*, Z*T*, A, B] in joint mode.

## *3.1.2 Inverse kinematics*

An approximation to the structure of the robot is presented in **Figures 3** and **4**, where it is possible to observe the different parameters involved in the kinematics of the robot.

LinuxCNC has a kinematics module that allows loading a previously compiled C++ code file with the kinematic model of the robot. In this file, both the forward and the inverse kinematics must be defined for the complete functioning of the robot.





| GP of the arm | Value |
|---|---|
| a0 | 300 mm |
| a1 | 100 mm |
| a2 | 0 mm |
| d0 | 300 mm |
| L1 | 500 mm |
| L2 | 700 mm |
| L3 | 500 mm |

**Table 2.** *Parameters of robot.*





*Digital Imaging*





In order to find the values of angular positions based on the spatial coordinates, the following equations are developed:

$$b = \sqrt{{\bf X}\_T^2 + {\bf Y}\_T^2} \tag{7}$$

$$h = \sqrt{\left(Z\_T - d\_0 + \sqrt{a\_2^2 + L\_3^2} \sin\left(A - \arccos\left(\frac{L\_3}{\sqrt{a\_2^2 + L\_3^2}}\right)\right)\right)^2 + (b - a\_0)^2} \tag{8}$$

$$\theta\_1 = \arctan\left(\frac{Y\_T}{X\_T}\right) \tag{9}$$

$$\begin{aligned} \theta\_2 &= \arccos\left(\frac{h^2 + L\_1^2 - \sqrt{a\_1^2 + L\_2^2}^2}{2hL\_1}\right) \\ &+ \arccos\left(\frac{Z\_T - d\_0 + \sqrt{a\_2^2 + L\_3^2}\sin\left(A - \arccos\left(\frac{L\_3}{\sqrt{a\_2^2 + L\_3^2}}\right)\right)}{h}\right) \end{aligned} \tag{10}$$

$$\begin{aligned} \theta\_3 &= -\theta\_2 + \arccos\left(\frac{h^2 + L\_1^2 - \sqrt{a\_1^2 + L\_2^2}^2}{2hL\_1}\right) - \arcsin\left(\frac{a\_1}{\sqrt{a\_1^2 + L\_2^2}}\right) \\ &- \arccos\left(\frac{\sqrt{a\_1^2 + L\_2^2}^2 + h^2 - L\_1^2}{2h\sqrt{a\_1^2 + L\_2^2}}\right) \end{aligned} \tag{11}$$

$$\theta\_4 = A - \theta\_3 \tag{12}$$

$$\theta\_5 = B \tag{13}$$

For the execution of a welding process, the robotic arm must meet a series of requirements related to the precision and straightness of the movement of its axes. These parameters are necessary for the robot to work under the welding standards for parts. To verify the precision of the displacement of the axes, a dial comparator is placed as shown in **Figure 5**, and the movement of each axis is measured, obtaining the following results.

Commands were sent by MDI (Manual Data Input) to test the movement of the robotic arm and the kinematics. The result of these movements is shown in **Table 3**, where the precision of the arm is measured with a digital dial comparator with a resolution of 0.01 mm. The results show a maximum difference of 0.16 mm, which is permissible for the application in which the arm will be employed. Each axis is assembled as in **Figure 5**, in which the indicator is located parallel and centered to the axis to be measured.

contrasts with the color of the pieces to facilitate the segmentation and identification of the edges.

Subsequently, the image is converted to gray scale, and a Sobel filter is applied to create an image emphasizing edges. The resulting image can be observed in **Figure 7**; this process was made with the aim of identifying the possible closeness of the working pieces.

Before the pieces are segmented to highlight the edges, a morphological closing operation is applied to eliminate small gaps (filling them) and to join nearby connected components. The kernel used is a rectangular structuring element of 1 × 8 pixels. The result is presented in **Figure 8**, where the space that separates the two pieces was totally filled.

In the last process, it is possible to highlight the area suitable for the welding process; the algorithm achieves this with a Gaussian filter that eliminates the remaining edges of the piece and leaves only the highlighted union of both. With the segmentation process done, the line that traces the union of the pieces is transformed into points that serve as the coordinates the arm must reach to carry out the welding process.

To verify the precision of the algorithm, several tests are made to compare the real coordinates with the coordinates calculated by the computer vision system. In this process, the values of the location of the camera and the relationship between the pixels and the XYZAB location change according to the precision of each servomotor.

**Figure 7.** *Edge detection.*

**Figure 8.** *Welding trajectory.*


**Figure 5.** *Digital indicator.*


**Table 3.** *Movements.*
