*Unmanned Robotic Systems and Applications*

For the position loop, the tracking errors are defined as $e_x = x_9 - x_d$ and $e_y = x_{11} - y_d$, and the corresponding sliding variables are

$$S_x = \dot{e}_x + \lambda_5 e_x, \qquad S_y = \dot{e}_y + \lambda_6 e_y \tag{67}$$

As previously done, the task now is to prove that for the system Eq. (49), with the sliding variables given by Eqs. (56), (64) and (67), if the control laws are designed as Eqs. (59), (60), (62), (63) and (66), then the sliding manifolds are reached in finite time $t_r$ and the tracking errors $e_\phi, e_\theta, e_\psi, e_z, e_x, e_y$ stay on the sliding manifolds thereafter. Consequently, the controlled states $x_1, x_3, x_5, x_7, x_9, x_{11}$ converge to the desired values in finite time $t_f$ in the presence of bounded disturbances and uncertainties.

To do so, let us select a candidate Lyapunov function as:

$$V = \tfrac{1}{2}\,\mathbf{S}^{T}\mathbf{S} \tag{69}$$

where $\mathbf{S} = \left(S_\phi, S_\theta, S_\psi, S_z, S_x, S_y\right)^{T}$. By taking the time derivative of the Lyapunov energy function Eq. (69), one can get:

$$\dot{V} = \mathbf{S}^{T}\dot{\mathbf{S}} \tag{70}$$

where $\dot{\mathbf{S}} = \left(\dot{S}_\phi, \dot{S}_\theta, \dot{S}_\psi, \dot{S}_z, \dot{S}_x, \dot{S}_y\right)^{T}$. After substitution of $\dot{\mathbf{S}}$, $\dot{V}$ can be expressed as:

$$\dot{V} = \mathbf{S}^{T}\left(\ddot{\mathbf{e}} + \mathbf{A}\dot{\mathbf{e}}\right) \tag{71}$$

where $\dot{\mathbf{e}} = \left(\dot{e}_\phi, \dot{e}_\theta, \dot{e}_\psi, \dot{e}_z, \dot{e}_x, \dot{e}_y\right)^{T}$ and $\mathbf{A}$ is the diagonal matrix $\mathbf{A} = \mathrm{diag}\{\lambda_1, \lambda_2, \lambda_3, \lambda_4, \lambda_5, \lambda_6\}$ with $\lambda_i > 0$. Substituting the value of the designed control laws in Eq. (71):

$$\dot{V} = \mathbf{S}^{T}\left(-\mathbf{K}\,\mathrm{sign}(\mathbf{S}) + \mathbf{d}\right) \tag{72}$$

where $\mathbf{K} = \mathrm{diag}\{K_1, K_2, K_3, K_4, K_5, K_6\}$ with $K_i > 0$, $\mathbf{d} = \left(d_\phi, d_\theta, d_\psi, d_z, d_x, d_y\right)^{T}$, and $\mathrm{sign}(\mathbf{S}) = \left(\mathrm{sign}(S_\phi), \mathrm{sign}(S_\theta), \mathrm{sign}(S_\psi), \mathrm{sign}(S_z), \mathrm{sign}(S_x), \mathrm{sign}(S_y)\right)^{T}$. With each disturbance bounded as $|d_i| \le d_{si}$, it follows that

$$\dot{V} = -\sum_{i=1}^{6} K_i |S_i| + \sum_{i=1}^{6} d_i S_i \;\le\; \sum_{i=1}^{6} \left(|S_i|\, d_{si} - K_i |S_i|\right) \;=\; -\sum_{i=1}^{6} |S_i|\left(K_i - d_{si}\right) \;\le\; -\eta_1 \sum_{i=1}^{6} |S_i| \;\le\; -\sqrt{2}\,\eta_1\, V^{1/2} \tag{73}$$

where $\eta_1 = \min_i \left(K_i - d_{si}\right) > 0$, provided the gains are chosen such that $K_i > d_{si}$; the last step uses $\sum_i |S_i| \ge \left(\sum_i S_i^2\right)^{1/2} = \sqrt{2V}$. Hence $\dot{V}$ is negative definite, and integrating $\dot{V} \le -\sqrt{2}\,\eta_1 V^{1/2}$ shows that the sliding manifolds $S_i = 0$ are reached in a finite time $t_r \le \sqrt{2V(0)}/\eta_1$, after which the tracking errors converge to zero along the manifolds.

**6. Simulation results**
This section presents the simulation results of the SMC described in the previous section. The tracking performance of the quadrotor is evaluated by making it track a circle of radius 1 m at an altitude of 3 m with a desired yaw angle of π/6. The tracking performance is shown in **Figures 6**–**9**.
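The reference trajectory used in this scenario can be generated as in the following sketch; the angular rate `omega` and the 100 Hz sampling rate are illustrative assumptions, not values stated in the chapter:

```python
import math

def circle_reference(t, radius=1.0, altitude=3.0, yaw=math.pi / 6, omega=0.5):
    """Desired state (x_d, y_d, z_d, psi_d) at time t for circular tracking.

    radius, altitude, and yaw follow the simulation scenario; omega (rad/s)
    is an assumed rate at which the circle is traversed.
    """
    x_d = radius * math.cos(omega * t)
    y_d = radius * math.sin(omega * t)
    return x_d, y_d, altitude, yaw

# Sample the reference at an assumed 100 Hz for the controller to track.
traj = [circle_reference(0.01 * k) for k in range(1000)]
```

Each sample then serves as the desired value $(x_d, y_d, z_d, \psi_d)$ fed to the outer-loop sliding-mode controller.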

The control inputs are shown in **Figures 10**–**13**. One can observe chattering in the control inputs when the signum function is used, which means they cannot be directly implemented on real-time hardware. To overcome this, a boundary-layer approximation is utilized, which smooths the control inputs.
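The effect of the boundary-layer approximation can be illustrated on a scalar toy system $\dot{x} = -k\,\mathrm{sign}(x)$ under discrete sampling; the gain, step size, and boundary-layer width below are illustrative choices, not the chapter's quadrotor parameters:

```python
def sgn(s):
    # Hard switching: the source of chattering under discrete sampling.
    return 1.0 if s >= 0.0 else -1.0

def sat(s, phi):
    # Boundary-layer approximation: linear inside |s| <= phi, signum outside.
    return max(-1.0, min(1.0, s / phi))

def simulate(switch, k=1.0, dt=0.01, steps=400, x0=1.0):
    """Euler-integrate x_dot = -k * switch(x) and return the state history."""
    x, history = x0, []
    for _ in range(steps):
        x = x - dt * k * switch(x)
        history.append(x)
    return history

chatter = simulate(sgn)                     # oscillates around 0 with ~k*dt amplitude
smooth = simulate(lambda s: sat(s, 0.05))   # converges smoothly inside the layer
```

With the signum function, the state keeps crossing zero and oscillates with an amplitude set by the sampling step; with the saturation function, the control becomes proportional inside the layer and the residual oscillation disappears.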

*Vision-Based Autonomous Control Schemes for Quadrotor Unmanned Aerial Vehicle*
*DOI: http://dx.doi.org/10.5772/intechopen.86057*

**Figure 7.** *Y-position tracking.*

**Figure 8.** *Altitude tracking.*

**Figure 9.** *Yaw tracking.*

**Figure 10.** *u1.*

**Figure 11.** *u2.*

**Figure 12.** *u3.*

**Figure 13.** *u4.*


The Matrice M100 is a fully customizable and programmable flight platform that lets its users perform operations such as pipeline health monitoring, surveillance, and search and rescue, as well as applications requiring an external sensor interface. A series of add-ons accompanies the M100 and helps make its handling user-friendly. Like any other development drone on the market, the Matrice M100 comes with a pre-programmed flight controller.

To aid in the implementation of user-defined controllers and task maneuvers, a separate on-board computer, named the *DJI Manifold*, is provided (**Figure 16**). The Manifold is an embedded Linux computer which incorporates an NVIDIA Tegra K1 SoC (CPU + GPU + ISP in a single chip) with both standard and extended connections and interfaces. The GPU (Graphical Processing Unit) lets us run CUDA to perform complex image-processing operations. The Linux environment acts as a support to run ROS (Robot Operating System), which is the key element for any sort of development on the Matrice M100. This is mentioned in detail in the upcoming sub-section.

**Figure 16.** *DJI manifold.*

To gather visual data, the DJI Matrice M100 is provided with a completely controllable Zenmuse X3 gimbal, which can be easily interfaced with the DJI Manifold for image processing. However, in this case, a separate downward-facing camera is used to perform the task of vision-based landing. This is done so as to keep the gimbal free to perform other tasks such as image capture, video capture, and the like.

The downward-facing camera chosen is the *Logitech C310* (**Figure 17**), which can be interfaced with the Manifold using a USB connection. The landing pad is a wooden platform of dimension 4 feet × 4 feet. At the center, an ArUco marker of dimension 12.5 cm × 12.5 cm is placed. The marker chosen is a 4 × 4 matrix with marker ID 7. The dimension of the marker is chosen such that it is clearly detected from an altitude as high as 10 m as well as from an altitude as low as 0.4 m. The landing pad setup, as shown in **Figure 18**, would be mounted on the roof of a car for experimental purposes.

**8.2 Software description**

This section briefly describes the software abstraction layer, its control paradigm, and the associated hardware flow of the Matrice M100 quadrotor. As discussed in the hardware setup, the DJI M100 uses the DJI Manifold as its on-board computer to control and communicate with the flight controller and the on-board sensors interfaced with it. The *DJI On-board SDK (OSDK)* is an open-source software library which enables the OBC (On-Board Computer) to handle the input-output data coming from the flight controller.

**Figure 14.** *Complete block diagram.*
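As a rough check on the marker-size choice described above, a pinhole-camera model can estimate how many pixels the 12.5 cm marker spans at a given altitude. The focal length below is a made-up placeholder, not a measured Logitech C310 parameter:

```python
def marker_pixels(marker_m, altitude_m, focal_px):
    """Approximate side length, in pixels, of a square marker seen
    from directly above at the given altitude (pinhole model)."""
    return focal_px * marker_m / altitude_m

# Hypothetical focal length in pixels (placeholder, not a C310 spec).
FOCAL_PX = 800.0

near = marker_pixels(0.125, 0.4, FOCAL_PX)   # lowest detection altitude
far = marker_pixels(0.125, 10.0, FOCAL_PX)   # highest detection altitude
```

Under this assumed focal length, the marker shrinks from roughly 250 px at 0.4 m to about 10 px at 10 m, which illustrates why detection from the highest altitude constrains the minimum physical marker size.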
