UAV and obstacle sensing techniques – a perspective

N. Aswini (Department of Electronics and Communication Engineering, MVJ College of Engineering, Bengaluru, India)
E. Krishna Kumar (Indian Space Research Organisation, Bengaluru, India)
S.V. Uma (Department of Electronics and Communication Engineering, RNS Institute of Technology, Bengaluru, India)

International Journal of Intelligent Unmanned Systems

ISSN: 2049-6427

Publication date: 2 January 2018

Abstract

Purpose

The purpose of this paper is to provide an overview of unmanned aerial vehicle (UAV) developments, types, major functional components, challenges, and trends; among the various challenges, the authors concentrate on obstacle sensing methods. The paper also highlights the scope of on-board vision-based obstacle sensing for miniature UAVs.

Design/methodology/approach

The paper initially discusses the basic functional elements of UAVs, then considers the different challenges faced by UAV designers. The authors have narrowed the study down to obstacle detection and sensing methods for autonomous operation.

Findings

Among the various existing obstacle sensing techniques, on-board vision-based obstacle detection has the best scope for meeting the future requirements of miniature UAVs and making them completely autonomous.

Originality/value

The paper offers original review points based on a thorough literature survey of the various obstacle sensing techniques used for UAVs.

Keywords

Citation

Aswini, N., Krishna Kumar, E. and Uma, S. (2018), "UAV and obstacle sensing techniques – a perspective", International Journal of Intelligent Unmanned Systems, Vol. 6 No. 1, pp. 32-46. https://doi.org/10.1108/IJIUS-11-2017-0013


Publisher: Emerald Publishing Limited

Copyright © 2018, Emerald Publishing Limited


1. Introduction

Unmanned aerial vehicles (UAVs) are aircraft that fly without a human pilot on board. UAVs have become popular largely because of their greatest advantage: no risk to human life. They play a predominant role in military services such as surveillance, monitoring, tracking enemies, and destruction using modern tools, and in civil applications such as atmospheric research, weather forecasting, firefighting, road traffic monitoring and control, and crop and harvest monitoring. Commercial applications include logistics, aerial photography, film making, etc. Around the world, drones equipped with cameras and sensors are providing companies with clearer, more comprehensive views of their businesses, and of the opportunities and threats that surround them.

The remaining sections are organized as follows. Section 2 presents early developments of UAVs and their different types and applications. In Section 3, the functional components of UAVs are discussed. In Section 4, current and evolving trends and challenges faced by UAVs are discussed. Among the various challenges faced while designing UAVs, we concentrate on the various obstacle sensing mechanisms, which are covered in Section 5. Section 6 then explores the necessity of incorporating an obstacle sensing and processing mechanism on board the UAV. To make this possible, vision-based sensing and avoidance techniques are a better solution compared with RADAR, LIDAR, or laser sensors.

2. Overview of UAV development

2.1 Early developments

UAVs date back to 1849, when Austrians attacked the Italian city of Venice with unmanned balloons loaded with explosives. There was considerable advancement in the types of UAV used during the early years, the First World War, the Second World War, the Korean War, the Cold War, the Vietnam War, and later the Gulf War (Keane and Carr, 2013). The first UAV was the "Kettering Bug", developed by the US Army during the First World War. It was a self-flying torpedo. Compared with the latest UAVs, it was a simple machine that could be calibrated for precision attacks against fortified enemy defenses up to 75 miles away. Gyroscopes were used to stabilize it, and it was more a guided missile than a drone. The German V-1 buzz-bomb, a jet-powered cruise missile, was the most used drone of the Second World War. The development of RADAR to replace television as the primary guidance system was a remarkable change in Second World War operations. The OQ-2 and OQ-3 drones developed in this period had a one-hour endurance and a speed of 85 mph. The ideas were good, but the technology was not advanced. In the 1960s and 1970s, the USA developed a huge number of UAVs for military purposes. Throughout the twentieth century, there was tremendous development in UAV technology. Drones have done remarkable work during anti-terrorist operations performed by the USA, at little cost and with no risk to armed forces. Even though UAVs were initially used for military purposes, people later started using them for civilian applications. Now, many countries such as the USA, UK, France, Iran, China, Israel, India, and Pakistan are using UAVs.

2.2 Types and applications of UAV

UAVs come in various shapes and sizes depending upon the application. The common types are single rotor (www.ebuav.com/wurenji/chanpinzhongxin/Single_Rotor_UAV/2014/1120/1.html), fixed wing (www.shadowair.com/uavs?lightbox=image23zd), multirotor (http://838inc.com/expertise/unmanned-aerial-vehicles/), and hybrid (http://newatlas.com/carbonix-volanti-vtol-fixed-wing-industrial-uav/48253/#gallery). A comparison of the performance of fixed wing and multirotor UAVs is given in Table I.

With respect to weight, civil UAVs are classified as micro (less than 2 kg), mini (greater than 2 kg and less than 20 kg), small (greater than 20 kg and less than 150 kg), and large (greater than 150 kg). According to their range/altitude (Gupta et al., 2013), UAVs can be classified as shown in Table II. In Table III, the endurance of some UAVs is given along with examples. In Plate 1, images of a few UAVs are included (Figure 1).
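The weight-based classes above map directly onto a small helper function; the sketch below is illustrative only (the function name is ours, the thresholds come from the text):

```python
def classify_by_weight(weight_kg):
    """Classify a civil UAV by weight, per the classes in the text.

    micro: < 2 kg, mini: 2-20 kg, small: 20-150 kg, large: > 150 kg.
    """
    if weight_kg < 2:
        return "micro"
    elif weight_kg < 20:
        return "mini"
    elif weight_kg < 150:
        return "small"
    return "large"

print(classify_by_weight(1.5))   # micro
print(classify_by_weight(45))    # small
```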

UAVs can take different forms according to the applications (Accenture, 2016). Depending on their lift capacity and payload specifications, UAVs can also carry multiple sensors to extract a wide range of information, increasing the number of possible applications and the business value of their outcomes.

Apart from military operations, UAVs are chosen for:

  • atmospheric research;

  • geological surveys;

  • hurricane evolution and research;

  • oceanographic observations;

  • volcanoes study and eruption alert;

  • weather forecasting;

  • cloud study programs;

  • ozone layer studies and monitoring;

  • disaster operations management;

  • firefighting;

  • oil slick observations;

  • flood watch;

  • catastrophic situation assessment;

  • search and rescue (looking for survivors from shipwrecks, aircraft accidents, etc.);

  • earthquake monitoring;

  • nuclear radiation monitoring;

  • international border patrol;

  • environmental monitoring;

  • law enforcement;

  • road traffic monitoring and control; and

  • crop and harvest monitoring.

In Table IV, the technical details of some of these applications are given.

3. Functional components of UAV

A UAV can be remote controlled, or can fly autonomously along a predefined path. Functionally, UAVs can be used for:

  • military training;

  • reconnaissance;

  • combat;

  • research and development; and

  • civil and commercial applications.

Regardless of the category in which they are used, the main functional units of a UAV in general are the sensors, flight control system, power supply, propulsion system, communication unit, and the ground station, as depicted in Figure 2. The ground control station (GCS) can be land based or sea based and consists of the facilities for human control of unmanned vehicles. The ground station helps to control and monitor the UAV, providing information about speed, heading direction, altitude, bank angle, GPS status, battery status, telemetry signal, etc. Endurance and performance of a UAV are directly connected with the power source used, which can be fuel based or battery based. The batteries (Meyer et al., 2009; Cwojdziński and Adamski, 2014) used are lithium polymer batteries (3.7-4.2 V, six to eight cells), photovoltaic cells, super capacitors, and hydrogen fuel cells. Solar energy collected from photovoltaic cells can also be used as a source of power for a UAV. The propulsion system provides the necessary power for the UAV to move forward or hover; its main parts are the brushless motors, propellers, and electronic speed controller. There are now many sensors in the market that vary in size, weight, resolution, and cost. Among the sensors, RADARs can detect objects without cooperative communication. They are the most widely used mechanism to detect air-to-air vehicles; sunlight, smoke, fog, dust, and other factors do not affect them, and they have good directionality and range characteristics. LIDARs are highly precise and can detect objects of different sizes and shapes by calculating the time taken for light to travel back and forth, but they are too large to be incorporated on small systems.

Optical and ultrasonic sensors are small in size, low in cost, and have low power consumption. Micro electro mechanical system (MEMS) sensors (Kumar et al., 2017) and sensor fusion technology are more advanced methods of sensing. Camera-based sensing, one of the feasible methods of obstacle detection for small UAVs, is discussed in detail in the later sections. Infrared sensors can be used at night. Navigation sensors such as inertial navigation sensors and GPS are used to measure the position, velocity, attitude, and rotation (pitch, roll, and yaw) of the UAV. Sensors are one of the main payloads of a UAV. For long endurance UAV payloads, the processing power has a direct effect on the power consumption; optical image sensor payloads (Meyer et al., 2009) are suitable for long endurance UAVs.

A UAV system cannot operate without secure and reliable communication. The communication can be of three ways – communication between UAV and GCS, communication between UAVs, and communication between UAV and satellite. The main parts of the communication module are the transmitter and receiver.

The commonly used UAV control, telemetry, and video frequencies are 900 MHz, 1.2 GHz, 2.4 GHz, and 5.8 GHz. The communication between the UAV and the GCS provides a data link for transmitting the information captured by the UAV in the form of image and video files, while the GCS sends control and guidance commands to the UAV.

4. Challenges and trends

UAVs, better known as drones, are one of the technological marvels of our age. Designing a UAV means integrating hardware, software, sensors, actuators, communication systems, and payloads into a single unit for the application involved, which is a challenging task. The challenges can be technical and managerial, and the risks associated with UAV system development need to be identified. A major challenge faced by military UAVs is that they need to integrate with the existing traffic collision avoidance system (TCAS) provided by RADARs. There are various policy challenges (Swaminathan, 2015) related to structure, the air traffic control network, physical and electronic identification of drones, protocols, and communication systems. There are also many pre-flight requirements that need to be followed before using UAVs for any commercial application, including registration, pre-flight inspection, use of trained flight operators, air space, the height at which UAVs can fly (400 ft), weight (less than 55 pounds), weather visibility, visual line of sight, speed (less than 100 mph), time of day, accident reporting, etc. A few more challenges faced by drones are as follows:

  • limited flight endurance and payload capacity;

  • power;

  • limitations in type of sensors used;

  • obstacle sensing and avoidance;

  • difficulty in launching;

  • safety;

  • cost; and

  • losing control during flight.

A great amount of research work is going on in areas like longer-endurance batteries, stealth technologies for military purposes, smaller and lighter sensors, increased on-board computing power, multifunctional UAVs, etc. Recently, Intel engineers set a new world record by flying 500 drones at the same time, all fitted with LED lights, which jointly formed 3D shapes in the air. It was a demonstration of how a large number of drones can communicate with each other. The tech world is also waiting for EHANG 184, a unique passenger drone which is an aerial version of Uber (http://dronelife.com/2017/01/11/drone-trends-2017/). The latest drone designs include Diodon, an inflatable drone; DJI Spark, which has 3D sensing technology for obstacle avoidance; Cleo, a donut-shaped drone which can be slipped inside a pocket; and Mola UFO, a selfie drone, to list a few. Technological improvements will make UAVs faster, stronger, and safer. A team of scientists has demonstrated that UAVs are able to build a rope bridge, assemble items to create a structure, and detect and catch an object in the air (Accenture, 2016), as shown in Plate 2.

Addressing all the challenges faced by UAVs is beyond the scope of this paper, so the following sections address one major challenge faced by UAVs: sensing and avoiding obstacles.

5. UAV sensing methods

Commercial applications require small UAVs to fly at low altitudes or operate inside buildings, where they are exposed to many hazards and obstacles. Current UAV technology for automatically sensing, detecting, and avoiding fixed obstacles such as power lines, buildings, towers, and trees, and moving obstacles such as birds and other aircraft, is still immature compared with that of manned aerial vehicles. As the UAV market is predicted to provide billions of dollars in economic growth, especially in the commercial sector, researchers around the world are now trying to develop an efficient automatic sense and avoid system to satisfy the demands and requirements of UAVs. So, there is great scope for research in embedding sense-and-avoid algorithms on board the UAV. Figure 3 shows how a sense and avoid system works.

Table V gives an account of the merits and demerits of existing UAV sensing techniques.

Table VI compares the various types of sensing methods in terms of their range, power, size, and cost.

The obstacle detection and avoidance technology started with sensors detecting objects in front of the drone. TCAS and automatic dependent surveillance-broadcast (ADS-B) technology have already matured for collision avoidance in manned aerial vehicles, but these systems currently work only for cooperative intruders. Non-cooperative sensors of active and passive type can provide better detection of non-cooperative obstacles, especially when data link loss occurs. They are particularly important for UAVs that fly inside buildings, since the obstacles are mostly walls, machines, office equipment, and humans. Active sensing includes laser ranging, RADAR, and sonar, which transmit signals to detect obstacles. They usually provide very good information on obstacle distance, which is an essential criterion in tracking operations. On the other hand, passive sensors such as electro-optical, infrared, thermal imaging, and motion detectors depend on the detection of signals emitted from the obstacle.
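Active sensors such as sonar and laser rangers recover obstacle distance from the round-trip time of the transmitted signal. A minimal sketch (the speed values are standard physical constants, not figures from the paper):

```python
def range_from_echo(round_trip_s, wave_speed_m_s):
    # Distance = speed x time / 2 (the signal travels to the obstacle and back)
    return wave_speed_m_s * round_trip_s / 2.0

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 C (ultrasonic/sonar)
SPEED_OF_LIGHT = 3.0e8   # m/s (laser ranging, RADAR)

print(range_from_echo(0.01, SPEED_OF_SOUND))   # 1.715 m: sonar echo after 10 ms
print(range_from_echo(2e-7, SPEED_OF_LIGHT))   # 30.0 m: laser echo after 200 ns
```

The same relation explains why ultrasonic sensors, with their slow wave speed and weak echoes, end up with the short usable ranges noted later in this section.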

In Lin and Saripalli (2015), path planning is performed to generate a collision-free trajectory for the UAV. They use a cooperative surveillance technology to know the position of intruder aircraft using ADS-B. In Sahawneh (2016), the minimum sensing range to safely avoid a collision is first calculated. They use both a deterministic approach and a probabilistic approach to estimate the collision risk of an encounter scenario; the implementation uses various sensors such as a camera, RADAR, and ADS-B. Using a single fish eye camera with a 185° FOV, visual tracking and position estimation of UAVs are performed in Sapkota et al. (2016), but collision avoidance is not incorporated. A lightweight RADAR system usable on mini UAVs, with a complete system design (both hardware and software), is given in Moses et al. (2011); target detection is performed, but methods to avoid collision still need to be added.

A non-cooperative UAV system utilizing a vision-based navigation system, integrated with a GNSS system and a MEMS-IMU sensor (Salazar and Sabatini, 2013), performs obstacle detection and tracking. Low-level tracking is done with a Viterbi algorithm and high-level tracking with a Kalman filter, and the system tries to avoid both static and dynamic obstacles. Using five cameras (Zarandy et al., 2011), a vision system was developed to calculate the attitude of a UAV and thereby provide collision warnings.

Various bottlenecks in the implementation of vision processing on board a small UAV, and their possible solutions, are given in Ehsan and McDonald-Maier (2009). There is a need for better on-board computer architecture capable of processing images/video in real time. To increase the flight time, the power consumption also needs to be low, and this power constraint limits the frequency of operation. So, an on-board vision-based collision avoidance system requires low-power vision processing, a low clock frequency, real-time processing, and light weight. Another notable work is the use of four ultrasonic sensors (Bhardwaj et al., 2015), each covering a span of 15°; but since the method depends on the transmission and reception of an echo, the range is small (10-15 cm), and the angle covered by the sensors also needs to be improved.

In Aguilar et al. (2017), a monocular on-board camera is used. It compares the image obtained in real time from the UAV with a database of known obstacles, using the speeded up robust features (SURF) feature point detector. The drawback is that the algorithm works only for the obstacles stored in the database. Scale invariant feature transform (SIFT) techniques (Divya Lakshmi and Vaithiyanathan, 2017) can also be used for selecting feature points in the images returned by UAVs. There are two fundamental groups of monocular vision-based obstacle avoidance techniques: those that compute the apparent motion (optical flow), and those that rely on the appearance of individual pixels (basic image processing). Using optical flow and feature tracking methods, an obstacle avoidance technique was developed for fixed wing UAVs (Dayton et al., 2015). Using the Lucas-Kanade optical flow technique, the displacement between two frames of video is used to calculate the velocity of the motion. The system works with a single moving or stationary target, but it does not consider objects (moving or stationary) approaching from multiple directions. In Omkar et al. (2014), a vision-based obstacle detection method is used for a fixed wing UAV. The algorithm works well for dark- and light-colored obstacles in the environment, but it may not be suitable in a foggy environment or while detecting multiple obstacles in urban areas, because a fixed wing platform is used. A quadcopter would be more suitable there, since it has the hovering capability most needed in urban areas. The method also needs improvement for navigating inside a building, or when a bird or a person suddenly comes into the path; it would be feasible if improved image processing techniques were used to detect obstacles and calculate the distance to them. In Agrawal et al. (2014), optical flow-based guidance for a UAV is performed using an on-board forward-facing pinhole camera with a 90° field of view at 20 frames/second. A review of the sensors and techniques used for collision avoidance, from automobiles to unmanned air vehicles, is given in Connolly (2007). A collision avoidance scheme using an aerial quadrotor (drone) is presented in Esrafilian and Taghirad (2016). Video streams obtained using the front camera of the drone, and navigation data measured by the quadrotor, are transmitted to a ground-based laptop through a wireless network connection. Simultaneous localization and mapping (SLAM) is helpful in navigation and mapping: the received data are processed by oriented FAST and rotated binary robust independent elementary features (BRIEF) (ORB) SLAM to compute three-dimensional sparse maps and three-dimensional position. The scaling parameter of monocular SLAM is estimated using linear filtering, and a Kalman filter is employed for sensor fusion with the monocular camera.
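The Lucas-Kanade method referenced above solves a small least-squares system over an image window, using the spatial gradients and the frame-to-frame difference. A minimal numpy sketch of the core computation (the window size and test pattern are illustrative, not from the cited works):

```python
import numpy as np

def lucas_kanade(I0, I1, y, x, w=2):
    """Estimate the (vy, vx) optical flow at pixel (y, x) between frames I0, I1.

    Solves [Ix Iy] v = -It in least squares over a (2w+1)^2 window,
    which is the core of the Lucas-Kanade method.
    """
    I0 = I0.astype(float)
    I1 = I1.astype(float)
    Iy, Ix = np.gradient(I0)         # spatial gradients (central differences)
    It = I1 - I0                     # temporal gradient
    sl = (slice(y - w, y + w + 1), slice(x - w, x + w + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v[1], v[0]                # (vy, vx)

# A brightness ramp shifted one pixel to the right between frames:
# the recovered horizontal flow should be ~1 pixel/frame.
f0 = np.tile(np.arange(20, dtype=float), (20, 1))
f1 = np.roll(f0, 1, axis=1)
vy, vx = lucas_kanade(f0, f1, 10, 10)
```

In a real pipeline this is run at many tracked feature points per frame, typically in a coarse-to-fine pyramid to handle large displacements.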

From the survey of various UAV sensing and detection methods, we can classify them as given in Figure 4.

For sensing obstacles, small UAVs cannot carry RADAR- or LIDAR-based systems, so the use of cameras is a better option. Camera-based sensing can rely on SLAM techniques, optical flow methods, stereo vision, or monocular (single camera) vision-based techniques. Monocular methods do not need a 3D model of the objects and are diverse. The various existing monocular vision-based object detection techniques include estimation of the relative size of obstacles (objects coming closer expand in size), relative clarity (objects appear blurry, foggy, or hazy when far away), texture gradient (the front part of the picture has more texturized features), linear perspective (parallel lines appear to converge as they recede from us), interposition (of two objects, the nearer one blocks the view of the other), and relative motion (motion parallax). Various image processing algorithms can be used in monocular vision-based techniques; for this, feature detection and description algorithms are the basic steps for object detection and tracking. Features can be edges and corners.
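The relative-size cue above has a simple quantitative form: for an object approaching at constant speed along the optical axis, the time to contact is approximately the current image size divided by its rate of growth. A pure-Python sketch (the pixel values are illustrative):

```python
def time_to_contact(size_prev_px, size_now_px, dt_s):
    """Approximate time to contact from the image-size expansion of an obstacle.

    Derived from the relative-size cue: tau = s / (ds/dt) for an object
    approaching at constant speed along the optical axis.
    """
    growth_rate = (size_now_px - size_prev_px) / dt_s
    if growth_rate <= 0:
        return float("inf")   # not expanding: no imminent collision
    return size_now_px / growth_rate

# Obstacle width grows from 40 px to 44 px over 0.1 s
tau = time_to_contact(40, 44, 0.1)   # 1.1 s until contact
```

Note that this estimate needs no range sensor and no camera calibration, which is why expansion-based cues are attractive for small monocular systems.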

Algorithms such as the Canny, Sobel, or Laplacian edge detectors can be used for edge detection. For corner and region detection, the Harris and Shi-Tomasi detectors can be used. Feature descriptor algorithms include SURF, SIFT, and improved ones such as BRIEF (Calonder et al., 2010) and ORB (Rublee et al., 2011). Although SIFT has proven to be very efficient in object recognition applications, it has high computational complexity, which is a major drawback, especially for real-time applications. The SURF technique, which approximates SIFT, performs faster than SIFT without reducing the quality of the detected points (Jain et al., 2017). These two robust feature descriptors are invariant to scale changes, blur, rotation, illumination changes, and affine transformation.
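To make the edge-detection step concrete, the Sobel operator named above convolves the image with two small gradient kernels and thresholds the gradient magnitude. A minimal numpy sketch (a real pipeline would add smoothing and non-maximum suppression, as in Canny):

```python
import numpy as np

def sobel_edges(img, thresh=1.0):
    """Mark edge pixels using the Sobel gradient magnitude."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)   # horizontal gradient
            gy[i, j] = np.sum(ky * patch)   # vertical gradient
    return np.hypot(gx, gy) > thresh

# A dark frame containing a bright square "obstacle":
# edge pixels appear along the square's border only.
frame = np.zeros((16, 16))
frame[4:12, 4:12] = 10.0
edges = sobel_edges(frame)
```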

6. New requirements for UAV sensing and avoidance

In the existing method of collision avoidance, obstacles are detected using various sensors such as RADAR, LIDAR, ultrasonic sensors, cameras, etc., and the image/video data are transmitted to the GCS. Maneuvering commands are decided in the GCS and sent back to the flight control system of the UAV. Instead of a wireless round trip to the GCS, the incorporation of an on-board processing unit can reduce the time taken for maneuvering, as shown in Figure 4. From the previous discussions, what we would like to add is that for commercial applications, to perform on-board data processing, a vision-based obstacle avoidance system is the only promising solution because of the constraints of size, weight, power demand, and cost. We consider visual sensors rather than RADAR because the latter is too bulky and too expensive to fit on a small- or medium-sized UAV. For micro and small aerial vehicles, the constraints of size, weight, and power make digital cameras the best choice for obstacle detection. Due to this trend, image processing techniques are emerging as an attractive proposition for UAV collision avoidance. Advanced photogrammetry software supports video/image processing of UAV-captured videos and images. The programming language used to develop applications for UAVs depends on the platform used to control them; some of the platforms are Arduino, Raspberry Pi, and BeagleBoard. MATLAB/Simulink is a good choice for rapid prototyping and embedded system design in video/image processing. Using MATLAB, we can perform feature detection, extraction, and matching; object detection and tracking; motion estimation; and video processing. There are toolboxes such as the Computer Vision Toolbox, which supports camera calibration, stereo vision, 3D reconstruction, and 3D point cloud processing. We can even design and simulate estimation and control algorithms in MATLAB/Simulink and generate embedded C code from them (Figure 5).

The on-board camera needs to capture video/images, which must be processed inside the UAV itself. The vision-based processing unit can pass its outputs to the flight control system, which can then take decisions and give maneuvering commands to the UAV. This will drastically reduce the processing time. The plan of action can be represented as shown in Figure 6; the use of a camera makes the process simpler.
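The sense-decide-command loop described above can be sketched as a single on-board iteration. In this sketch, `detect`, `estimate_range`, and `flight_controller` are hypothetical callbacks standing in for the vision and autopilot modules; the structure, not the names, is the point:

```python
def onboard_avoidance_step(frame, detect, estimate_range, flight_controller,
                           safe_range_m=5.0):
    """One iteration of the on-board loop: sense, decide, command,
    all inside the UAV with no round trip to the GCS."""
    obstacles = detect(frame)                 # e.g. a feature-based detector
    for obs in obstacles:
        if estimate_range(obs) < safe_range_m:
            flight_controller("avoid", obs)   # issue a maneuvering command
            return "avoiding"
    flight_controller("continue", None)
    return "clear"

# Toy stand-ins for a quick check: one obstacle detected 3 m ahead
log = []
state = onboard_avoidance_step(
    frame=None,
    detect=lambda f: ["tree"],
    estimate_range=lambda o: 3.0,
    flight_controller=lambda cmd, o: log.append(cmd),
)
```

Keeping this loop on board is what removes the GCS round-trip latency from the avoidance path.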

While embedding a single camera, processor, and control system on board, we need to consider the following requirements from a technical point of view for enhanced capabilities in collision avoidance:

  1. position, size, speed, and angle of UAV;

  2. speed and direction of the moving intruders need to be predicted;

  3. size, weight, and electrical power of UAV;

  4. payload limitation;

  5. path planning algorithm should be efficient, less complex;

  6. memory requirement for on-board sensor data processing;

  7. processing speed required for on-board data processing;

  8. environment, weather conditions, background noise, and clutter;

  9. static/dynamic intruders;

  10. unanticipated maneuvers;

  11. the time taken for the obstacle avoidance process;

  12. detecting small point like objects;

  13. minimum distance between UAV and intruder; and

  14. UAV egomotion.

7. Conclusion

This paper has attempted to explore, understand, and compile the different types of UAVs, their major functional components, technological advancements, and current and near-future trends, along with studies on the problems associated with enhancing UAV capabilities as they evolve from their current environment of open space to relatively more occupied spaces, with tall or other structures likely to be encountered in their flight paths. The survey included fundamentals on each of the major functional blocks, with descriptions of the functional components, the elements involved, and their working, plus relevant references from the literature. Subsequent sections explored challenges and trends seen from the perspective of the functional blocks, followed by a survey of various UAV sensing methods. The survey concluded by discussing why and how vision-based sensing techniques emerge as an attractive proposition for UAV collision avoidance compared with all existing techniques, highlighting the scope for better understanding of this research and its utilization in the future.

Figures

Types of UAV

Figure 1

Types of UAV

Functional components of UAV

Figure 2

Functional components of UAV

Sense and avoid system

Figure 3

Sense and avoid system

UAV sensing and detection methods

Figure 4

UAV sensing and detection methods

New requirement for UAV collision avoidance

Figure 5

New requirement for UAV collision avoidance

Vision-based collision avoidance

Figure 6

Vision-based collision avoidance

Different types of UAV

Plate 1

Different types of UAV

Technical improvements

Plate 2

Technical improvements

Comparison of fixed wing and multirotor UAV

Fixed Wing Multirotor
Efficient in fast flying Efficient in low flying
High flying Navigation inside buildings
More endurance Vertical take-off capability, hovering
Good for aerial survey of large areas Stabilized video return
Limitation – take-off and landing is difficult. Limitation – low endurance and mechanical complexity

Classification of UAVs

UAV type Weight (kg) Altitude (ft) Application
Long range
HALE >600 65,000 Military
MALE >600 45,000 Military
Medium range
TUAV 150-600 10,000 Military
Short range
Micro UAV 2-20 3,000 Military/Commercial

Examples of UAVs

UAV type Examples Endurance
HALE Global Hawk 24 hours
Phantom Eye 80 hours
MALE Predator 30-40 hours
HERON 45 hours
TUAV Aerostar 12 hours
Zanka III 28 hours
Micro UAV Netra 30 minutes
RQ-16 T-Hawk 40 minutes

Notes: HALE, high altitude long endurance; MALE, medium altitude long endurance; TUAV, tactical UAV

Technical details of some of the applications of UAV

Area of usage Technical details
Construction sites Surveying, mapping, and 3D modelling are done
UAVs mounted with high definition cameras are used (Accenture, 2016)
Drones must weigh less than 55 pounds and must fly less than 400 feet height above ground level.
Flown during day time, speed should not exceed 100 mph and should not fly out of line of sight (LOS)
Mining Vehicles need to travel vertical, horizontal, and hover around to get into tighter spots for better imagery
UAVs can weigh 1-2 kg, with high resolution RGB, infrared cameras, GPS, altimeters, and 30-45 minutes endurance
Use of LIDAR is the latest technology
Law enforcement, search, and rescue Need to provide full 360° view of the rescue places
Aerial video processing need to be done
Cover up to 10 miles in an hour
In military and border control, UAVs equipped with cameras and sensor payloads are used
Providing medical kits to remote areas
Agriculture Used in precision farming
Inbuilt sensors, microcontrollers, and GPS receivers
Inclusion of plant counting algorithms.
Spraying of pesticides (Kale et al., 2015)
Latest drones can cover up to 33 acres/hour
Wildlife protection Use of UAV images in species identification, study of endangered species, population count, assessing mortality rate of animals, etc. (Bird, 2014)
Ultra HD cameras capable of capturing photos at a faster rate
Equipment of thermal sensors to detect forest fire, provide visual ID on wildlife, night surveillance, etc.

UAV obstacle sensing methods – merits and demerits

UAV obstacle sensing methods Merits Demerits
TCAS/ADS-B Prevents aircraft collision.
Uses secondary surveillance RADARs for traffic control
It cannot display aircrafts without TCAS transponders
RADAR Detects aircraft and gives its altitude
Sunlight, smoke, fog, dust, and other factors do not affect them
Big size
Power requirement is more
LIDARS High precision and resolution in distance measurement
Accurate 3D map of surroundings
Able to detect objects of different sizes and shapes
High cost
Power Consumption
Using visual sensors (IR camera, optical camera) Small, light, flexible, and can be easily equipped
Resembles natural vision
Need to reduce the background clutter and noise
Image processing techniques and memory requirement
Using ultrasonic sensors Low cost
Easy to use
Sensitive to extreme weather

UAV sensing methods – a comparison

UAV obstacle sensing methods Range Power Size and weight Cost
TCAS/ADS-B Good High High High
RADAR Good High High High
LIDARS Good High High High
Using visual sensors (IR camera, optical camera) Medium Less Less Less
Using ultrasonic sensors Medium Less Less Less

Glossary

UAV

unmanned aerial vehicles

GPS

global positioning system

LIDAR

light detection and ranging

RADAR

radio detection and ranging

HD

high definition

MEMS

micro electro mechanical systems

GCS

ground control station

ADS-B

automatic dependent surveillance-broadcast

GNSS

global navigation satellite system

TCAS

traffic collision avoidance system

SIFT

scale invariant feature transform

SURF

speeded up robust features

BRIEF

binary robust independent elementary features

ORB

oriented FAST and rotated BRIEF

SLAM

simultaneous localization and mapping

References

Accenture (2016), “A business approach for the use of drones in the engineering & construction industries”, available at: www.accenture.com/t00010101T000000__w__/fr-fr/_acnmedia/PDF-24/Accenture-Drones-Construction-Service.pdf

Agrawal, P., Ratnoo, A. and Ghose, D. (2014), “A composite guidance strategy for optical flow based UAV navigation”, Third International Conference on Advances in Control and Optimization of Dynamical Systems, Vol. 47, pp. 1099-1103.

Aguilar, W.G., Casaliglla, V.P. and Pólit, J.L. (2017), “Obstacle avoidance based-visual navigation for micro aerial vehicles”, Electronics, Vol. 6 No. 1, pp. 1-23, available at: www.mdpi.com/journal/electronics

Bhardwaj, S., Warbhe, A. and Kumar, B.R. (2015), “Sensor system implementation for unmanned aerial vehicles”, Indian Journal of Science and Technology, Vol. 8 No. S2, pp. 7-11.

Bird, D.M. (2014), “Unmanned vehicle systems and wildlife management in the 21st century”, Spotlight Presentation Royal Society for the Protection of Birds Sandy, Bedfordshire.

Calonder, M., Lepetit, V., Strecha, C. and Fua, P. (2010), “BRIEF: binary robust independent elementary features”, 11th European Conference on Computer Vision (ECCV).

Connolly, C. (2007), “Collision avoidance technology: from parking sensors to unmanned aircraft”, Sensor Review, Vol. 27 No. 3, pp. 182-188, doi: 10.1108/02602280710758101.

Cwojdziński, L. and Adamski, M. (2014), “Power units and power supply systems in UAV”, Aviation, Taylor and Francis, Vol. 18 No. 1, pp. 1-18.

Dayton, J., Enriquez, M., Gan, M., Liu, J., Quintana, J. and Richards, B. (2015), “Obstacle avoidance system for UAVs using computer vision”, American Institute of Aeronautics and Astronautics, doi: 10.2514/6.2015-0986.

Divya Lakshmi, K. and Vaithiyanathan, V. (2017), “Image registration techniques based on the Scale Invariant Feature Transform”, IETE Technical Review, Vol. 34 No. 1, pp. 22-29.

Ehsan, S. and McDonald-Maier, K.D. (2009), “On-board vision processing for small UAVs: time to rethink strategy”, IEEE NASA/ESA Conference on Adaptive Hardware and Systems, pp. 75-81.

Esrafilian, O. and Taghirad, H.D. (2016), “Autonomous flight and obstacle avoidance of a quadrotor by monocular SLAM”, IEEE 2016 4th International Conference on Robotics and Mechatronics (ICROM), Tehran, 26-28 October.

Gupta, S.G., Ghonge, M.M. and Jhawandhiya, P. (2013), “Review of unmanned aircraft systems”, International Journal of Advanced Research in Computer Engineering & Technology, Vol. 2 No. 4, pp. 1646-1658.

Jain, S., Sunil Kumar, B.L. and Shettigar, R. (2017), “Comparative study on SIFT and SURF face feature descriptors”, IEEE International Conference on Inventive Communication and Computational Technologies, Coimbatore, 10-11 March.

Kale, S.D., Khandagale, S.V., Gaikwad, S.S., Narve, S.S. and Gangal, P.V. (2015), “Agriculture drone for spraying fertilizer and pesticides”, International Journal of Advanced Research in Computer Science and Software Engineering, Vol. 5 No. 12, pp. 804-807.

Keane, J.F. and Carr, S.S. (2013), “A brief history of early unmanned aircraft”, Johns Hopkins APL Technical Digest, Vol. 32 No. 3, pp. 558-571.

Kumar, R., Purohit, M., Saini, D. and Kaushik, B.K. (2017), “Air turbulence mitigation techniques for long-range terrestrial surveillance”, IETE Technical Review, Vol. 34 No. 4, pp. 416-430.

Lin, Y. and Saripalli, S. (2015), “Sense and avoid for unmanned aerial vehicles using ADS-B”, IEEE International Conference on Robotics and Automation, pp. 6402-6407, doi: 10.1109/ICRA.2015.7140098.

Meyer, J., du Plessis, F. and Clarke, W. (2009), “Design considerations for long endurance unmanned aerial vehicles”, in Lam, T.M. (Ed.), Aerial Vehicles, ISBN: 978-953-7619-41-1, InTech, 1 January.

Moses, A., Rutherford, M.J. and Valavanis, K.P. (2011), “Radar-based detection and identification for miniature air vehicles”, IEEE International Conference on Control Applications (CCA) Part of IEEE Multi-Conference on Systems and Control, Denver, CO, pp. 933-940.

Omkar, S.N., Tripathi, S., Kumar, G. and Gupta, I. (2014), “Vision based obstacle detection mechanism of a fixed wing UAV”, International Journal of Advanced Computer Research, Vol. 4 Nos 1-14, pp. 172-178.

Rublee, E., Rabaud, V., Konolige, K. and Bradski, G. (2011), “ORB: an efficient alternative to SIFT or SURF”, IEEE International Conference on Computer Vision (ICCV), Barcelona, 6-13 November.

Sahawneh, L.R. (2016), “Airborne collision detection and avoidance for small UAS sense and avoid systems”, available at: http://scholarsarchive.byu.edu/etd/5840

Salazar, L.R. and Sabatini, R. (2013), “A novel system for non-cooperative UAV sense-and-avoid”, European Navigation Conference, Vienna.

Sapkota, K.R., Roelofsen, S., Rozantsev, A., Lepetit, V., Gillet, D., Fua, P. and Martinoli, A. (2016), “Vision-based unmanned aerial vehicle detection and tracking for sense and avoid systems”, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1556-1561, doi: 10.1109/IROS.2016.7759252.

Swaminathan, R. (2015), Drones & India, Exploring Policy and Regulatory Challenges Posed by Civilian Unmanned Aerial Vehicles, Paper No. 58, Observer Research Foundation.

Zarandy, A., Nagy, Z., Vanek, B., Zsedrovits, T., Kiss, A. and Nemeth, M. (2011), A Five-Camera Vision System for UAV Visual Attitude Calculation and Collision Warning, Springer-Verlag, Berlin and Heidelberg.

Further reading

Lwin, N. and Tun, H.M. (2014), “Implementation of flight control system based on Kalman and PID controller for UAV”, International Journal of Scientific & Technology Research, Vol. 3 No. 4, pp. 309-312.

Unmanned Aircraft Systems and Technologies (2010), “Technology focus”, Bulletin of DRDO, Vol. 18 No. 6.

Corresponding author

N. Aswini is the corresponding author and can be contacted at: shijiaswini@gmail.com

About the authors

N. Aswini is presently working as an Assistant Professor in the Department of Electronics and Communication, MVJ College of Engineering, Bengaluru, India. She holds a Master’s in VLSI Design and Embedded Systems and has more than ten years of teaching experience. She has guided various postgraduate and undergraduate student projects. She is currently doing research in obstacle sensing, detection, and avoidance techniques for unmanned aerial vehicles under Visvesvaraya Technological University, Karnataka, India.

Dr E. Krishna Kumar is a former Senior Scientist, Indian Space Research Organisation (ISRO), with over 30 years of experience that includes aerospace and electronics, satellite control, mission-critical support for on-board and on-ground critical systems and flight software, aircraft communications, surveillance, navigation, and technology development. He received his PhD from the Indian Institute of Science, Bangalore, India. He has also supported academics in various engineering colleges and universities in different capacities for research, curriculum development, and academic committees. He is currently a Consultant for research and product development in aerospace and electronics.

Dr S.V. Uma is presently working as an Associate Professor in the Department of Electronics and Communication, RNS Institute of Technology, Bengaluru, India. She holds a Doctorate in the area of “Congestion control and improved QoS in multimedia networks” from Bangalore University. She has nearly 20 years of teaching experience, and her interests include communication, network security, signal processing, and image processing.