I made this drone prototype in 2015 while exploring whether it is possible to fuse Virtual Reality headsets and drones to make remote (teleoperated) flying easier. You can read the full overview in our paper “Stereoscopic First Person View System for Drone Navigation”.
This is a drone (a hexacopter, to be precise) with a custom-made wide-angle stereo camera. It has two calibrated fish-eye cameras in a parallel setup with a human-like baseline (calibrating the fish-eye lenses was a bit tricky). The camera images form two stereo hemispheres, one for each eye. The lenses cover 185 degrees, and the stereo effect is good over a 160-degree range. There are some convergence issues at the sides, but those fall within human peripheral vision.
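To give a feel for the geometry, here is a minimal sketch of the standard equidistant fish-eye model applied to a parallel stereo pair like the one above. The focal length, principal point, and the 64 mm baseline are illustrative assumptions (64 mm is a typical human interpupillary distance), not measured values from the actual rig:

```python
import numpy as np

# Equidistant fish-eye model: a ray at angle theta from the optical
# axis lands at radius r = f * theta on the sensor, so a 185-degree
# lens fills a disc of radius f * (92.5 degrees in radians).
F_PIX = 300.0          # focal length in pixels (illustrative)
CX, CY = 640.0, 480.0  # principal point (illustrative)
BASELINE_M = 0.064     # assumed human-like baseline, in metres

def project_equidistant(point_cam):
    """Project a 3D point (camera frame, z forward) with r = f * theta."""
    x, y, z = point_cam
    theta = np.arctan2(np.hypot(x, y), z)   # angle from the optical axis
    phi = np.arctan2(y, x)                  # azimuth on the sensor
    r = F_PIX * theta
    return np.array([CX + r * np.cos(phi), CY + r * np.sin(phi)])

def stereo_project(point_world):
    """Project into a parallel stereo pair separated by BASELINE_M on x."""
    left = project_equidistant(point_world - np.array([-BASELINE_M / 2, 0, 0]))
    right = project_equidistant(point_world - np.array([BASELINE_M / 2, 0, 0]))
    return left, right
```

A point straight ahead projects slightly right of center in the left image and slightly left of center in the right image, which is the disparity the headset fuses into depth.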
The camera streams a real-time H.264 stereo video feed at 30 fps over WiFi or 4G. The streaming and image processing are done on an NVIDIA Tegra TK1 board. Big thanks to NVIDIA for making such an awesome mobile computer! The Tegra TX1 is even better now (in 2016).
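For reference, a sender pipeline in the spirit of this setup could look like the GStreamer launch string below (built here as a Python string). The hardware encoder element name matches the TK1-era L4T GStreamer plugins, but the device path, resolution, bitrate, and host are illustrative placeholders, not the project's actual configuration:

```python
# Hypothetical GStreamer pipeline: capture, hardware H.264 encode on
# the Tegra, then RTP over UDP to the ground station. All concrete
# values here are placeholders, not the project's real settings.
SENDER = (
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,width=1280,height=960,framerate=30/1 ! "
    "omxh264enc bitrate=4000000 ! h264parse ! "
    "rtph264pay config-interval=1 pt=96 ! "
    "udpsink host=192.168.1.10 port=5000"
)
print(SENDER)
```

The same string can be handed to `gst-launch-1.0` on the drone side for quick testing before wiring it into an application.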
The ground station receives the video feed and renders it in an Oculus Rift DK2 at 75 fps with up-sampling. The higher frame-rate rendering is important for reducing VR sickness. In theory, one needs to stream 90 fps from the camera to avoid VR sickness, but in practice that consumes too much network bandwidth and is not practical (as of 2015-2016). Wireless streaming from a moving drone with existing protocols (over WiFi or 4G) proved to be a challenge. In addition to streaming issues, I had one nasty flyaway and one crash due to radio interference between the GPS and other antennas. I ended up adding some RF shielding (photos below). I guess aluminum foil is critical tech in aerospace.
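A quick back-of-envelope check of the bandwidth argument: at constant per-frame quality, H.264 bitrate grows at most roughly linearly with frame rate, so a 90 fps stream costs about 3x the 30 fps one. The 4 Mbit/s base rate below is an illustrative figure, not a measured number from this project:

```python
# Rough bandwidth estimate for streaming 90 fps instead of 30 fps.
# BASE_MBPS is an assumed bitrate for the 30 fps stereo stream.
BASE_FPS, BASE_MBPS = 30, 4.0
scale = 90 / BASE_FPS               # 3x more frames per second
required_mbps = BASE_MBPS * scale
print(required_mbps)                # -> 12.0 (Mbit/s)
```

Sustaining that kind of rate over 2015-era WiFi or 4G from a moving drone is exactly the problem described above, which is why rendering at 75 fps from a 30 fps feed is the more practical trade-off.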
The drone operator has a 180-degree immersive view with digital panning in the Virtual Reality headset and can pilot the drone at the same time. This project uses digital panning rather than a mechanical gimbal like the other projects I looked at. Digital panning reduces VR sickness because the view-rendering latency is very low (<10 ms when looking around). In a way, the user experience is similar to cinema, where the video runs at 24 fps but looking around happens at a much higher, human-like rate.
Here is the flight video (with Oculus footage):
I tested this system with 15 people flying it for real and 80+ people watching offline footage in VR headsets. Test pilots could take off, fly, and land with no issues while in VR. Users reported a sensation of flying and out-of-body experiences. About 70% of users experienced no motion/VR sickness with the rendering/control scheme used in this project. Usually, people experience less sickness when standing than when flying seated.
Interestingly, this kind of teleoperation can be used to gather data for deep imitation learning: the system knows exactly where the human operator is looking and what they are doing at all times, so this data could be used to teach autonomous navigation AI for drones.
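A minimal sketch of what such a demonstration log could look like: one record per control tick, pairing the operator's gaze direction with the stick inputs. The field names and JSON-lines format are assumptions for illustration, not the project's actual telemetry format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PilotSample:
    """One imitation-learning sample; field names are hypothetical."""
    t: float             # timestamp, seconds
    head_yaw: float      # where the operator is looking (degrees)
    head_pitch: float
    throttle: float      # what the operator is doing (normalized sticks)
    roll: float
    pitch: float
    yaw: float

def log_sample(f, sample):
    # One JSON object per line, easy to replay for training later.
    f.write(json.dumps(asdict(sample)) + "\n")
```

Replaying such a log gives (observation, action) pairs, which is exactly the supervision signal behavioral-cloning-style imitation learning needs.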