It’s easy to fly even small drones like the DJI Mavic Pro and the GoPro Karma outside of your visual range, but this violates the rules for both recreational and commercial flying under the excluded RPA category. These limitations exist because these aircraft simply don’t have the situational awareness and reliability to be operated safely Beyond Visual Line of Sight (BVLOS) in civilian airspace.
In war zones, the military routinely flies smaller drones BVLOS using GPS waypoints and live video telemetry. These techniques aren’t safe enough for civilian use because they don’t let the operator reliably detect and avoid other aircraft. So what new technology is being developed in the detect and avoid space to enable BVLOS operations?
Improved airspace control for drones
A number of well supported efforts are betting that automated delivery drones will soon become an everyday part of city life. To support what may become thousands of drones using the airspace over cities simultaneously, there are now efforts to use cellular networks as the data backbone to track and coordinate drone movements. This means that only a small cellular radio chip needs to be integrated into the flight controller of each drone, in addition to the GPS, to ensure that drones maintain sufficient separation from other drones and manned aircraft. Drone position data received over the cellular network is communicated to manned aircraft in the vicinity via a ground station. An implementation of this approach known as the Low Altitude Tracking and Avoidance System (LATAS) by Precision Hawk has received the first ever exemption under the FAA Part 107 rules to allow BVLOS operation.
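As a rough sketch of the kind of separation check such a network-side service might run on the position reports it receives (the 50 m horizontal and 30 m vertical thresholds are illustrative assumptions, not LATAS parameters):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def separation_ok(drone_a, drone_b,
                  min_horizontal_m=50.0, min_vertical_m=30.0):
    """Hypothetical check on two (lat, lon, altitude_m) reports.

    Two aircraft are considered separated if they are far enough
    apart on at least one axis, horizontal or vertical.
    """
    horiz = haversine_m(drone_a[0], drone_a[1], drone_b[0], drone_b[1])
    vert = abs(drone_a[2] - drone_b[2])
    return horiz >= min_horizontal_m or vert >= min_vertical_m
```

A real system would also have to project each track forward in time, but the core geometry is no more complicated than this.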
The Low Altitude Tracking and Avoidance System (LATAS) by Precision Hawk

In more rural areas, systems like LATAS are likely to be difficult to implement due to poor cellular network coverage. Operations in these areas rely almost exclusively on the see and avoid principle, which means that drones must be able to detect other aircraft in their vicinity directly.
Drone based sensors for detection and avoidance
Ultrasonic and camera based collision avoidance sensors are now appearing on small drones, but commonly have a range of only a few meters. This doesn’t allow enough time to detect and avoid manned aircraft travelling at speed. The ever reducing cost of high resolution image sensors points to a future where vision based systems may be able to increase their detection range (assuming no flying at night or in cloud). The challenge here is that detecting objects further away demands greater pixel density, and greater processing power to match. Companies such as NVIDIA are working hard on reducing the size and cost of GPU based image processors that are well suited to Artificial Intelligence (AI) techniques such as deep learning and neural networks for object identification, driven mainly by the push for autonomous vehicles. In time, Moore’s Law may bring these AI object recognition techniques down to a scale that can be implemented in even very small drones for aircraft detection and avoidance.
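A back-of-the-envelope calculation shows why detection range drives pixel density (the wingspan, sensor resolution and lens figures below are illustrative assumptions, not from any particular product):

```python
import math

def pixels_on_target(target_size_m, range_m, sensor_px, fov_deg):
    """Approximate pixels a target spans on the sensor.

    Multiplies the sensor's pixels-per-radian by the angle the
    target subtends at the given range (small-angle approximation).
    """
    px_per_rad = sensor_px / math.radians(fov_deg)
    angle_rad = target_size_m / range_m  # small-angle approximation
    return px_per_rad * angle_rad

# Illustrative numbers: a light aircraft with a ~10 m wingspan seen
# by a 4000-pixel-wide sensor behind a 60-degree lens at 1 km.
print(pixels_on_target(10, 1000, 4000, 60))  # → roughly 38 pixels
```

A few dozen pixels is near the lower limit for reliable recognition, and doubling the detection range halves the pixels on target, so longer-range vision systems quickly demand higher resolution and the processing power to match.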
Miniature radar might be a more powerful answer
The future of detection and avoidance of other aircraft may lie in the field of miniature radar. Two approaches are showing promise for the low weight, power and cost that may make them suitable for aircraft detection and avoidance in any weather and at night.
Synthetic aperture radars (SAR) use a small, low cost antenna arrangement that detects objects moving relative to the antenna. The technique has traditionally been used to survey ground objects from a moving drone, but can also be used to detect approaching aircraft.
Echodyne are developing a different approach based on their proprietary Metamaterials Electronically Scanned Array (MESA) radar. This technique can identify and track both stationary and moving targets similarly to the military Active Electronically Scanned Array (AESA) radars used in fighter jet targeting systems, but uses metamaterials to vastly simplify the transmitter/receiver module.
Both of these radar approaches have no moving parts and can deliver solutions that weigh less than 1 kg, use less than 20 W and can detect manned aircraft at ranges greater than 1 km. These radars have a typical field of view of 120°, meaning that three would be required for full 360° situational awareness. This currently puts them out of contention for very small drones, but means use in aircraft as small as 10 kg may be possible.
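A quick sanity check shows what a 1 km detection range actually buys (the 200 km/h closing speed is an illustrative assumption for a light aircraft on a head-on course):

```python
def reaction_time_s(detection_range_m, closing_speed_kmh):
    """Seconds available to manoeuvre after detection.

    Closing speed is the combined speed of the drone and the
    approaching aircraft along the collision course.
    """
    return detection_range_m / (closing_speed_kmh / 3.6)

# Illustrative: a 1 km detection range against a light aircraft
# closing at 200 km/h leaves about 18 seconds to react.
print(round(reaction_time_s(1000, 200), 1))  # → 18.0
```

Eighteen seconds is workable for an automated avoidance manoeuvre, which is why a detection range beyond 1 km is the benchmark these radars are chasing.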
Where do we stand
The bottom line is that while very small off the shelf aircraft such as the Sensefly eBee can easily be programmed to fly out of the visual range of the operator, they don’t yet have the situational awareness to do it safely. New technologies and regulations are coming, but they are likely to initially require larger, more expensive drones or substantial ground infrastructure to operate effectively. In the meantime, extended line of sight operations can work within the current VLOS legislation to get better coverage from the current fleet of small drones that aren’t equipped with sense and avoid technology.