
How do drones and augmented reality work together?

Combining drones and augmented reality changes the way humans interact with drones, says IEEE member Todd Richmond, offering hands-free control over real-time drone operation.

Before jumping into how drones and augmented reality mix, let's look at drones themselves. A drone uses sensors to gather data -- direction, speed, altitude and so on -- and that sensor data feeds algorithms that make decisions. The reality for the drone is that it presumably has a task along with a desire -- at the risk of anthropomorphizing -- not to crash, i.e., to manage speed and altitude.

The interesting calculations come when a drone must determine whether to continue its mission or divert to avoid hurting a human or damaging property. In this case, sensor data feeds the algorithms, and the algorithms have to determine the proper course of action. This scenario has nothing to do with augmented reality as it is typically described or discussed.
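To make that decision concrete, here is a minimal sketch of a divert-or-continue policy of the kind described above. The sensor fields, thresholds and action names are all assumptions for illustration, not part of any real drone autopilot or SDK.

```python
# Hypothetical sketch of the divert-or-continue decision described above.
# Sensor fields, thresholds and action names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SensorReading:
    obstacle_distance_m: float   # range to the nearest detected object
    obstacle_is_person: bool     # classifier output from an onboard camera
    altitude_m: float

SAFE_DISTANCE_M = 10.0   # assumed minimum standoff from a person
MIN_ALTITUDE_M = 2.0     # assumed floor to avoid terrain/property

def choose_action(reading: SensorReading) -> str:
    """Return 'divert', 'climb' or 'continue' from one sensor reading."""
    if reading.obstacle_is_person and reading.obstacle_distance_m < SAFE_DISTANCE_M:
        return "divert"      # never risk hurting a human
    if reading.altitude_m < MIN_ALTITUDE_M:
        return "climb"       # avoid damaging property or terrain
    return "continue"        # otherwise, keep flying the mission

print(choose_action(SensorReading(8.0, True, 20.0)))   # divert
```

In practice the inputs would come from fused sensor streams rather than a single reading, but the shape of the logic -- sensor data in, course of action out -- is the same.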

Current drones can perform highly autonomous flight. A human only needs to somehow tell the drone where to go and what to do, and algorithms can handle the rest. For many situations, that degree of interaction and situational awareness for the human is sufficient. For instance, if a drone is monitoring agricultural areas with cameras and other sensors, constant human intervention likely is not needed, as the data can be collected, parsed and analyzed when the drone returns. So, the human is mostly out of the loop other than setting the paths and goals for the drone.

For other applications, a human may want real-time information from the drone and may want to interact or intervene while the drone is in flight. For instance, a surveillance drone may carry a camera that transmits a real-time video feed. The user may want to maneuver the drone or manipulate the camera -- point, tilt or zoom -- to get a better view of a situation. Currently, this is done on a tablet or computer screen, requiring the operator to either hold something or have associated hardware. One could imagine a headset with an augmented reality display that presents the information to the user hands-free and overlays other relevant information -- map data, telemetry from the drone, communication options and so forth.

Drones and augmented reality could also be used for command and control, using gaze and gesture for control. For instance, if a set of augmented reality glasses was providing a first-person view from the drone, turning your head could initiate the drone to turn, leaning forward would move the drone forward and so on. Head movements could also control just the camera pan, tilt and zoom, with hand gestures picked up by cameras on the augmented reality device used for navigating the drone. Voice commands could also be used instead of or to supplement gestures and head movements.
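The head-and-body mapping above can be sketched as a simple function from headset pose to drone commands. The function name, gains and deadzone are assumptions for illustration; no real AR headset or drone API is implied.

```python
# Illustrative mapping from headset pose to drone velocity commands,
# as described above: turn head -> yaw, lean forward -> move forward,
# head pitch -> camera tilt. All names and gain values are assumptions.

def pose_to_command(head_yaw_deg: float, head_pitch_deg: float,
                    lean_forward_m: float) -> dict:
    YAW_GAIN = 0.5        # degrees of head turn -> deg/s of drone yaw rate
    FORWARD_GAIN = 2.0    # meters of lean -> m/s of forward velocity
    DEADZONE_DEG = 5.0    # ignore small, unintentional head movements

    yaw_rate = YAW_GAIN * head_yaw_deg if abs(head_yaw_deg) > DEADZONE_DEG else 0.0
    forward = FORWARD_GAIN * max(lean_forward_m, 0.0)   # no backward flight here
    return {"yaw_rate_dps": yaw_rate,
            "forward_mps": forward,
            "camera_tilt_deg": head_pitch_deg}

print(pose_to_command(20.0, -10.0, 0.3))
```

The deadzone is the key design choice: without it, every involuntary head twitch would translate into drone motion, which is why real gaze-based interfaces filter small movements before acting on them.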

Augmented reality provides a way to embody the drone rather than control it externally. The best illustration is to compare older radio-controlled aircraft -- the pilot standing on the ground, watching the aircraft with no view from the airplane itself -- with current drone piloting, where the pilot gets a cockpit view and often cannot see the actual craft from their vantage point.

