Insect-Inspired Vision System Helps Drones Pass Through Small Gaps

We’ve posted before about autonomous drones flying through small gaps, but the big difference here is that in this case, the drone has no information about the location or size of the gap in advance. It doesn’t need to build up any kind of 3D map of its environment or model of the gap, which is good because that would be annoying to do with a monocular camera. Instead, UMD’s strategy is to “recover a minimal amount of information that is sufficient to complete the task under consideration.”

To detect where the gap is, the drone uses an optical-flow technique. It takes a picture, moves a little bit, and then takes another picture. It identifies matching features in both pictures, and thanks to parallax, the farther-away features visible through the gap will appear to have moved less than the closer features around the gap. The edges of the gap are the places where you've got the biggest difference between how far features appear to have moved. And now that you know where all those things are, you can just zip right through!
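The parallax idea can be sketched in a few lines. This is a 1-D toy, not the paper's actual pipeline (which tracks features across full images): given per-pixel optical-flow magnitudes along one image row, the gap shows up as a low-flow region, and its edges are where the flow magnitude jumps the most. The function name and threshold here are illustrative assumptions.

```python
import numpy as np

def find_gap_edges(flow_mag, threshold=0.5):
    """Locate gap edges as the largest jumps in optical-flow magnitude.

    flow_mag: 1-D array of flow magnitudes along an image row.
    The scene behind the gap (far) moves less between frames than the
    foreground wall (near), so the gap is a low-flow region bounded by
    sharp discontinuities in flow.
    """
    diff = np.abs(np.diff(flow_mag))          # flow change between neighbors
    return np.flatnonzero(diff > threshold)   # indices of big jumps

# Toy row: foreground wall flows ~3 px, scene behind the gap flows ~1 px.
row = np.array([3.0, 3.1, 2.9, 1.0, 1.1, 0.9, 1.0, 3.0, 3.2])
print(find_gap_edges(row))  # → [2 6], the two edges of the gap
```

In the real system the same contrast-in-flow signal is computed over 2-D feature tracks, but the principle is identical: big discontinuities in apparent motion mark the gap boundary.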

Or, almost. The other piece of this is using visual servoing to pass through the gap. Visual servoing is just using visual feedback to control motion: The drone takes a picture of the gap, moves forward, takes another picture, and then adjusts its movement to make sure that its position relative to the gap is still what it wants. This is different from a pre-planned approach, where the drone figures out in advance the entire path that it wants to take and then follows it—visual servoing is more on the fly. Or, you know, on the bee.
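Here's what that feedback loop looks like in miniature. This is a toy proportional controller, not the paper's actual control law: the drone measures the pixel error between the gap center and the image center, commands a velocity proportional to that error, and repeats with each fresh image until the gap is centered. The gain value and the simplified camera model are assumptions for illustration.

```python
def visual_servo_step(gap_center_px, image_center_px, gain=0.05):
    """One visual-servoing update: a velocity command proportional to
    the pixel error between where the gap appears and where we want it.
    (Toy proportional controller; the paper's control law differs.)"""
    return gain * (gap_center_px - image_center_px)

# Toy closed loop: each sideways move shifts the gap's apparent position
# back toward the image center; repeated visual feedback drives the
# error to zero without any pre-planned path.
gap_px, center_px = 400.0, 320.0
for _ in range(200):
    cmd = visual_servo_step(gap_px, center_px)
    gap_px -= cmd  # simplified camera model: motion shifts the gap in-image
print(round(gap_px))  # → 320, gap centered
```

The key contrast with pre-planned flight is that no trajectory is ever computed: the command at each step depends only on the latest image.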

The UMD researchers tested this out with a Bebop 2 drone packing an NVIDIA Jetson TX2 GPU. Gaps of varying sizes and shapes were cut in a foreground wall, which was covered in newspapers to give it some extra texture, and this is where we're obligated to point out that this technique probably won't work out if you're trying to fly through a gap in one white wall with another white wall on the other side. Anyway, as long as you've got newspapered walls, this system works quite well, the researchers say: "We achieved a remarkable success rate of 85 percent over 150 trials for different arbitrary shaped windows under a wide range of conditions which includes a window with a minimum tolerance of just 5 cm."

The maximum speed that the drone was able to achieve while passing through the gap was 2.5 m/s, primarily constrained by the rolling-shutter camera (which could mess up the optical flow at higher speeds), but again, this method isn't really intended for high-performance drones. Having said that, the researchers do mention in the conclusion of their paper that "IMU data can be coupled with the monocular camera to get a scale of the window and plan for aggressive maneuvers." So, hopefully we'll be seeing some of that in the near future.

“GapFlyt: Active Vision Based Minimalist Structure-Less Gap Detection For Quadrotor Flight,” by Nitin J. Sanket, Chahat Deep Singh, Kanishka Ganguly, Cornelia Fermüller, and Yiannis Aloimonos from the University of Maryland, is published in IEEE Robotics & Automation Letters.

[ UMD ]

Source: IEEE Spectrum Robotics