Drones offer camera angles that are not possible with traditional cameras and are becoming increasingly popular for videography. However, flying a drone and controlling its camera simultaneously requires manipulating 5-6 degrees of freedom (DOF), which demands significant training. We present ARPilot, a direct-manipulation interface that lets users plan an aerial video by physically moving their mobile devices around a miniature 3D model of the scene, shown via Augmented Reality (AR). The mobile device acts as the viewfinder, making it intuitive to explore and frame shots. We leveraged AR technology to explore three 6-DOF video-shooting interfaces on mobile devices (AR keyframe, AR continuous, and AR hybrid) and compared them against a traditional touch interface in a user study. The results show that AR hybrid was the most preferred by participants and required the least effort among all the techniques, while users' feedback suggests that AR continuous enables more creative shots. We discuss several distinct usage patterns and report insights for further design.
Yu-An Chen, Te-Yen Wu, Tim Chang, Jun You Liu, Yuan-Chang Hsieh, Leon Yulun Hsu, Ming-Wei Hsu, Paul Taele, Neng-Hao Yu, and Mike Y. Chen. 2018. ARPilot: designing and investigating AR shooting interfaces on mobile devices for drone videography. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’18). Association for Computing Machinery, New York, NY, USA, Article 42, 1–8.