Drones allow exploring dangerous or impassable areas safely from a distant point of view. However, flight control from an egocentric view in narrow or constrained environments can be challenging. Arguably, an exocentric view would afford a better overview and, thus, more intuitive flight control of the drone. Unfortunately, such an exocentric view is unavailable when exploring indoor environments. This paper investigates the potential of drone-augmented human vision, i.e., of exploring the environment and controlling the drone indirectly from an exocentric viewpoint. If used with a see-through display, this approach can simulate X-ray vision to provide a natural view into an otherwise occluded environment. The user's view is synthesized from a three-dimensional reconstruction of the indoor environment using image-based rendering. This user interface is designed to reduce the cognitive load of the drone's flight control. The user can concentrate on the exploration of the inaccessible space, while flight control is largely delegated to the drone's autopilot system. We assess our system with a first experiment showing how drone-augmented human vision supports spatial understanding and improves natural interaction with the drone.