#### *2.3.6 Millimeter-wavelength radar*

It is a detection radar operating in the microwave band, transmitting signals with wavelengths in the millimeter range. Because the wavelength is so short, it can detect movements as small as a fraction of a millimeter [26]. In the complex farmland operating environment, ultrasonic sensors and sensors based on optical principles are easily affected by climatic conditions, whereas millimeter-wavelength radar, a non-cooperative sensor, works in all weather conditions with strong penetrating ability, long operating distance, reliable detection and resistance to electromagnetic interference [6]. Compared with other ranging sensors, however, millimeter-wavelength radar has low resolution and has traditionally been implemented with discrete components that increase power consumption and overall system cost. It can only measure the distance to obstacles along the line of sight and can hardly describe the outline of the obstacle-avoidance (OA) objects or their angle in the field of view. Its use has therefore been limited to terrain-imitation flight systems for agricultural UAVs. Iovescu and Rao [26] noted that Texas Instruments has addressed these challenges with complementary metal-oxide semiconductor (CMOS)-based mmWave radar devices implementing frequency-modulated continuous wave (FMCW) operation, which measures range as well as angle and velocity. This differs, they noted, from traditional pulsed-radar systems, which transmit short pulses periodically.
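The FMCW principle mentioned above can be illustrated with a short sketch. This is not taken from the chapter's references; it only shows the standard relation that a chirp of bandwidth B swept over time T produces, for a target at range R, a beat frequency proportional to R, and that the sweep bandwidth sets the range resolution.

```python
# Illustrative FMCW range calculation (standard textbook relations, not a
# device implementation): a chirp with slope S = B/T gives a target at
# range R a beat frequency f_b = 2*R*S/c, and range resolution dR = c/(2B).
C = 3.0e8  # speed of light, m/s (approximate)

def fmcw_range(beat_freq_hz, bandwidth_hz, chirp_time_s):
    """Range (m) of a target from the measured beat frequency."""
    slope = bandwidth_hz / chirp_time_s   # chirp slope S, Hz/s
    return C * beat_freq_hz / (2.0 * slope)

def fmcw_range_resolution(bandwidth_hz):
    """Range resolution set by the sweep bandwidth: dR = c / (2B)."""
    return C / (2.0 * bandwidth_hz)

# A hypothetical 4 GHz sweep over 40 us, with a measured 1 MHz beat:
print(fmcw_range_resolution(4.0e9))        # 0.0375 m resolution
print(fmcw_range(1.0e6, 4.0e9, 40e-6))     # target at 1.5 m
```

The wide sweep bandwidths available at millimeter wavelengths are what give these sensors centimeter-scale range resolution despite their low angular resolution.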

#### *2.3.7 Time of flight (TOF) ranging*

It is one of the most widely used ranging methods. It works on the round-trip flight time: the camera illuminates the whole scene, including the objects, using a pulsed or continuous-wave light source, measures the time the reflected light takes to travel to and from the object at the speed of light, and converts that time into distance to obtain a 3D depth map. It is the quickest technology for capturing 3D information, acquired in a single shot of an area or scene [9]. Compared with other ranging methods, it has low energy consumption and easy deployment and is suitable for applications where high ranging accuracy is required. However, owing to the transmission characteristics of light, non-linear propagation factors such as reflection, refraction and diffraction can all cause deviations in the measured time, which lead to large errors in the calculated distance. For agricultural UAV safety, an auxiliary device combining the TOF ranging principle with other sensors is therefore more suitable for obstacle avoidance [26]. AMS TOF obstacle detection and avoidance sensors, for example, are based on a proprietary single-photon avalanche photodiode (SPAD) pixel design and time-to-digital converters (TDCs) with extremely narrow pulse width and can measure, in real time, the flight time of a laser-emitted infrared ray reflected from an object [7].
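The conversion from round-trip time to distance described above is a one-line calculation, sketched here to make the timing requirement concrete (a minimal illustration, not any vendor's implementation):

```python
# TOF ranging principle: distance is half the round-trip travel time
# multiplied by the speed of light.
C = 3.0e8  # speed of light, m/s (approximate)

def tof_distance(round_trip_s):
    """Distance (m) from a measured round-trip flight time."""
    return C * round_trip_s / 2.0

# A 20 ns round trip corresponds to an obstacle ~3 m away.
print(tof_distance(20e-9))   # 3.0 m
```

Since light covers a millimeter in about 3.3 picoseconds of one-way travel, millimeter-scale accuracy demands picosecond-class timing, which is why SPAD pixels are paired with time-to-digital converters in the sensors cited above.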

*Review of Agricultural Unmanned Aerial Vehicles (UAV) Obstacle Avoidance System DOI: http://dx.doi.org/10.5772/intechopen.103037*

#### *2.3.8 Monocular visual ranging*

It captures images through a single-lens camera. The size, weight and power (SWaP) payload constraints of small and micro UAVs favor monocular cameras. Monocular ranging reconstructs 3D depth from a single still image. It is simple in structure, mature in technology and fast in computing speed [26], but it cannot directly obtain the depth of obstacles because neither optical flow nor perspective cues handle frontal obstacles well [21]. What enables monocular vision cameras to build 3D images, determine distances between objects and detect obstacles is the algorithm deployed to interpret the image data using the various cues in the image [9]. Hence, given the complexity of the farmland operating environment, monocular ranging can hardly meet the real-time performance and accuracy requirements of obstacle avoidance for UAVs. However, a relative-size cue for detecting frontal collisions, which works on the premise that the apparent size of an approaching object increases with nearness, appears promising for real-time frontal obstacle detection with a single camera. According to Mori and Scherer [14], the time to collision can be obtained by measuring the expansion of the obstacle.
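The relative-size cue can be sketched as follows. This is a hedged illustration of the general idea behind expansion-based detection (not Mori and Scherer's actual implementation): if the apparent size s(t) of an obstacle in the image grows as it approaches, the time to collision is approximately s divided by its rate of growth, estimated here from two frames.

```python
# Time-to-collision from apparent-size expansion between two frames:
# TTC ~ s / (ds/dt), with ds/dt approximated by finite differences.
def time_to_collision(size_prev_px, size_curr_px, dt_s):
    """Approximate TTC (s) from the obstacle's apparent size in pixels
    at two frames dt_s seconds apart."""
    growth = size_curr_px - size_prev_px
    if growth <= 0:
        return float("inf")   # not expanding: no frontal threat detected
    return size_curr_px * dt_s / growth

# Hypothetical example: an obstacle's bounding box grows from 100 px to
# 110 px between frames 0.1 s apart, i.e. roughly 1.1 s to impact.
print(time_to_collision(100.0, 110.0, 0.1))
```

Note that this cue yields time, not distance: the UAV learns how soon it will hit the obstacle without ever knowing how far away it is, which is exactly what makes it usable from a single camera.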

#### *2.3.9 Binocular stereo vision sensors*

Beginning with identifying the image pixels that correspond to the same point in a physical scene observed by multiple cameras, stereoscopic vision computes depth information by combining two-dimensional images from two cameras at slightly different viewpoints. Triangulation using a ray from each camera can then determine the 3D position of a point [15]. Stereo vision can recognize obstacles and measure the distance between the fuselage and the obstacles, thanks to its good concealment and its ability to obtain comprehensive information, including the color and texture of obstacles as well as 3D depth. The most serious issue with binocular vision, however, is stereo matching. Lighting changes, scene rotation, object occlusion, low image resolution, interference and even the overwhelming of target features all lead to unstable target features and reduced object detection accuracy [6]. It is, therefore, used more in consumer and professional UAVs than in agriculture.
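For rectified cameras, the triangulation step described above reduces to a single formula. The sketch below assumes stereo matching has already produced a disparity (the pixel shift of the same point between the left and right images); the parameter values are hypothetical.

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d,
# where f is the focal length (pixels), B the baseline (meters), and
# d the disparity (pixels) found by stereo matching.
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth (m) of a matched point along the optical axis."""
    if disparity_px <= 0:
        raise ValueError("matched points must have positive disparity")
    return focal_px * baseline_m / disparity_px

# f = 700 px, 12 cm baseline, 21 px disparity -> 4 m depth
print(stereo_depth(700.0, 0.12, 21.0))
```

The inverse relation between depth and disparity also explains the resolution limit noted above: distant obstacles produce sub-pixel disparities, so depth accuracy collapses exactly where an Ag UAV most needs early warning.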

#### **3. Conclusion**

The attempt was made to situate the topic and review within the milieu of the national discourse on economic development and to isolate it from the pool of academic exercises that has become the bane of research in Nigerian Institutions of Learning (NIL). The imperatives of enhanced mechanization of the nation's farm systems, as a cardinal strategy to fight poverty and to drive capital formation and economic growth, were highlighted. The role of unmanned aircraft in precision agriculture was examined, covering soil and water analysis, planting, crop and spot spraying, crop monitoring, irrigation, farm health assessment and livestock production systems. Based on the three functional compartments of an unmanned aircraft system, guidance, navigation and control (GNC), the application of UAVs to agriculture was examined. This covered the two main platforms, fixed-wing and rotor airframes; the navigation sensors, such as real-time kinematic (RTK) positioning, ultrasonic sensors, laser sensors, infrared sensing technology, structured light, time-of-flight (TOF) ranging, millimeter-wavelength radar, monocular visual ranging and binocular stereo vision; and the controls, namely geometric-relationship, real-time planning and decision-making algorithms, the autonomous mental development (AMD) algorithm, online free-path generation and navigation, the multi-UAV genetic algorithm, simultaneous localization and mapping (SLAM) technology, UAV swarm control and sensor fusion systems. The obstacle avoidance methods of many Ag UAV systems were found to be by on-site suspension, a planned travel route and/or autonomous obstacle avoidance. The merits and demerits of the available Ag UAV technologies were highlighted, and the gaps to bridge were noted: in airframe technology, cost, payload and flight endurance; in sensors, direct imaging sensors of lower cost, size and weight; in reliability, mechanical and electronic robustness and resistance to interference; in off-the-shelf devices for rapid remediation; in operation, autonomous take-off and landing, automated computation of flight paths and an integrated spray and remote sensing algorithm to direct spraying; and in power, higher-capacity, lightweight solar cells.

The objective of the study was to examine the literature on the use of UAVs for pesticide and fertilizer application on obstacle-rich farms. The study shows that a UAV platform for agricultural pesticide and fertilizer application needs a higher payload weight, which demands an enhanced mechanical structure (especially the landing gear) and an integrated electrical structure. Enhanced flight endurance, to ensure continuous spraying over a unit task area, is important. It was further observed that rotor aircraft, with their ability for vertical take-off and landing (VTOL), are better suited to agricultural operations.

It was also observed that the farm environment is replete with flight obstacles for Ag UAVs, and the various types of obstacles were outlined. Nine sensors that are central to obstacle sensing and avoidance technology were therefore reviewed for their applicability to such an environment, and their merits and demerits were highlighted. None was found to achieve the objectives of real-time, all-weather autonomous operation completely on its own. Fusion of multi-sensor data is therefore in vogue, with the sensors complementing one another's shortfalls. The potential for real-time and near-real-time operation exists, but the technology for quality imagery and rapid processing leading to real-time response still needs development. The infrared, time-of-flight and millimeter-wavelength radar sensors for detecting farm and flight-environment obstacles appear most promising for Ag UAVs, especially in their modern versions.

The autonomous mental development (AMD) algorithm and simultaneous localization and mapping (SLAM) technology appear to be ahead of the others in achieving autonomous identification of obstacles and real-time obstacle avoidance for agricultural UAVs. They are, therefore, found fit for further study and development for deployment on Ag UAVs for pesticide and fertilizer application in obstacle-rich farmlands.

