Why Drones Need AI

By Rick Rys
Category: Technology Trends

Drones can now make sense of still and video images and act on them intelligently, thanks to new onboard sensors, computing platforms, and artificial intelligence. Powerful new development platforms are also making AI increasingly easy for product developers to apply. But how does AI help drones accomplish their missions?

For one, it is about autonomy. It takes depth perception and situational awareness to keep drones from crashing into things. Unlike piloted aircraft, drones must often operate in close proximity to other objects or structures, or even while touching or connecting to them. If drones are ever going to deliver packages in a city environment, they will need to deal safely with other drones, cars, people, dogs, birds, and many other obstacles. Drones can be equipped with video cameras, LIDAR, GPS, and a range of other sensors, but they also need the brains to interpret and act on this information, and for that they need serious processing power. That processing is best done locally, and embedded systems are being developed at the chip level to give drones the processing power to perform complex tasks quickly.
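
To make the local-processing point concrete, here is a minimal obstacle-avoidance reflex sketched in Python. It is illustrative only: the LidarScan structure and the controller interface are hypothetical stand-ins for whatever sensor and autopilot APIs a real drone platform exposes, and the braking threshold is an assumption.

```python
# A minimal onboard obstacle-avoidance reflex. Running this loop locally
# keeps the sense-decide-act cycle off the wireless link, where latency
# or dropouts could turn a near miss into a crash.
from dataclasses import dataclass

SAFE_DISTANCE_M = 2.0  # assumed braking threshold; real values are tuned per airframe

@dataclass
class LidarScan:
    """Distance readings in meters, keyed by bearing in degrees (0 = straight ahead)."""
    distances_m: dict

def nearest_obstacle_ahead(scan: LidarScan, cone_deg: int = 30) -> float:
    """Closest LIDAR return within a +/- cone_deg cone around the flight path."""
    ahead = [d for bearing, d in scan.distances_m.items() if abs(bearing) <= cone_deg]
    return min(ahead, default=float("inf"))

def avoidance_step(scan: LidarScan, controller) -> None:
    """One iteration of the reflex loop: braking overrides pilot and mission input."""
    if nearest_obstacle_ahead(scan) < SAFE_DISTANCE_M:
        controller.hold_position()
    else:
        controller.continue_mission()
```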

The military is working hard on autonomous drones. This includes giving drones the ability to recognize situations and delegating to them the decision to act independently of the remote human pilot. Considering that human drone pilots have made mistakes, perhaps AI can do better. The situational awareness and object recognition technology behind this work has many commercial and industrial applications.

We’ve all heard the claims that self-driving cars will be safer than human-operated cars. Those of us who live in New England and other densely populated areas have certainly seen examples of bad human driving, and fatality statistics show that humans make plenty of mistakes.

In Formula 1 racing, you can see just how far digital technology has crept into the operation of the car and how much it improves the chances of winning. Considering that simple cruise control outperforms humans, it is not hard to see why synchronizing the race car to a digital twin in the pits, operating on a 3D model of the racetrack, can provide a competitive advantage. The digital twin knows the exact topology of the racetrack and simulates intimate details of the car’s engine, steering, and aerodynamics. As an example, a Renault Sport Formula One car has more than 200 sensors and relies on Microsoft Dynamics AX and Microsoft’s Azure machine learning suite. The software used to design a car part can also dynamically simulate how that part functions with all the other parts of the car in a virtual system. Audi built a self-driving version of its RS 7 sports car that can compete with experienced human drivers, and just as IBM’s Deep Blue computer eventually beat world chess champion Garry Kasparov, cars operated with AI can already outperform many human drivers, whether commuting to work, out on a Sunday drive, or on a race track.

Leading companies like Intel and Qualcomm are building processors specifically for drones, and NVIDIA is making lightweight, low-power embedded products that support drone applications. Combining such powerful processors with wireless communications, a vast array of cameras and sensors, and actuators that give drones robotic capabilities will result in some impressive new capabilities. Advances in machine vision and machine learning can bring AI to the next generation of drones.

 

[Image: NVIDIA Jetson TX2 embedded processor]

The new Intel Nervana platform is expected to deliver breakthrough performance and dramatic reductions in the time needed to train complex neural networks. Intel’s RealSense cameras and Movidius vision processing units (VPUs) are designed to put artificial intelligence into video surveillance and will likely find applications in drones.

DJI’s Phantom 4 consumer drone uses front and rear vision sensors along with side-facing infrared sensors to stop short of obstacles like walls, even when the pilot steers it toward a crash. The system can also be trained to recognize you: the Phantom 4 can follow you and film a video selfie, avoiding obstacles as you ride a bicycle along a curved path. It is also smart enough to return to its launch point should it lose communication with the base station controller.
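
That return-to-home behavior hints at the kind of failsafe logic an onboard computer runs. Here is a minimal sketch, assuming a hypothetical flight-mode enum and a link timeout chosen purely for illustration; it is not DJI’s actual implementation.

```python
# A minimal link-loss failsafe: switch to return-to-home once the control
# link has been silent too long, and latch there so a flaky link cannot
# flip the drone back and forth between modes.
import time
from enum import Enum, auto

LINK_TIMEOUT_S = 3.0  # assumed threshold; real products tune this carefully

class Mode(Enum):
    MISSION = auto()
    RETURN_TO_HOME = auto()

def failsafe_mode(last_packet_time: float, current_mode: Mode) -> Mode:
    """Decide the flight mode from the age of the last control packet."""
    if current_mode is Mode.RETURN_TO_HOME:
        return Mode.RETURN_TO_HOME  # latched until landing
    if time.monotonic() - last_packet_time > LINK_TIMEOUT_S:
        return Mode.RETURN_TO_HOME
    return Mode.MISSION
```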

In the US, NASA is working with the FAA on a new air traffic control system for drones, and other countries are doing the same. Because drone traffic control is being developed without legacy rules, there is plenty of opportunity to insist that standards and computing power make drones safe, smart, fast, reliable, and secure. For manned aircraft, the FAA is moving away from radar and increasing the use of ADS-B (Automatic Dependent Surveillance-Broadcast). ADS-B uses satellite navigation, including GPS, to determine an aircraft’s precise position and broadcast it, and it also communicates information about other nearby aircraft, ground vehicles, weather conditions, and terrain. With the FAA moving manned aviation to ADS-B, it’s likely that we’ll see this technology appear in commercial drones.

Issues like the safe spacing of individual drones, or multiple drones operating in formation, require processing power. That processing power needs to be local to the drone to ensure the fast response times required for reliable action. Flying a drone with such local intelligence should make it easier for a pilot to accomplish a mission, and AI applications using data from ADS-B have great potential to make drones aware of their situation so they can act to accomplish their missions safely.
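
To illustrate what an onboard separation check might look like, here is a minimal sketch that screens ADS-B-style position reports for traffic inside an assumed safety bubble. The PositionReport structure and the 50 m threshold are illustrative assumptions, not values from any actual standard.

```python
# A minimal onboard separation check over ADS-B-style position reports.
import math
from dataclasses import dataclass

MIN_SEPARATION_M = 50.0        # assumed safety bubble; regulators will set real minima
EARTH_RADIUS_M = 6_371_000.0

@dataclass
class PositionReport:
    """The kind of state an ADS-B-style broadcast carries (simplified)."""
    aircraft_id: str
    lat_deg: float
    lon_deg: float
    alt_m: float

def horizontal_distance_m(a: PositionReport, b: PositionReport) -> float:
    """Great-circle distance between two reports, via the haversine formula."""
    phi1, phi2 = math.radians(a.lat_deg), math.radians(b.lat_deg)
    dphi = phi2 - phi1
    dlam = math.radians(b.lon_deg - a.lon_deg)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(h))

def conflicts(own: PositionReport, traffic: list) -> list:
    """IDs of reported aircraft violating the assumed separation bubble."""
    return [t.aircraft_id for t in traffic
            if horizontal_distance_m(own, t) < MIN_SEPARATION_M
            and abs(own.alt_m - t.alt_m) < MIN_SEPARATION_M]
```

In practice such a check would feed an avoidance maneuver rather than just return a list, but it shows why the computation must run on the drone itself: waiting on a round trip to a ground station would cost exactly the response time that safe spacing depends on.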
