Today’s autonomous drones have reaction times of tens of milliseconds, which is not enough for fast navigation in complex, dynamic environments. To safely avoid fast-moving objects, drones need low-latency sensors and algorithms. We departed from state-of-the-art approaches by using event cameras, which are bioinspired sensors with reaction times of microseconds. Standard vision algorithms cannot be applied to event cameras because the output of these sensors is not images but a stream of asynchronous events that encode per-pixel intensity changes. Our approach exploits the temporal information contained in the event stream to distinguish between static and dynamic objects and leverages a fast strategy to generate the motor commands necessary to avoid approaching obstacles. The resulting algorithm has an overall latency of only 3.5 milliseconds, which is sufficient for reliable detection and avoidance of fast-moving obstacles. We demonstrate the effectiveness of our approach on an autonomous quadrotor using only onboard sensing and computation. Our drone was capable of avoiding multiple obstacles of different sizes and shapes, at relative speeds up to 10 meters per second, both indoors and outdoors.
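The key idea above, using the temporal information in the event stream to separate static scene structure from independently moving objects, can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): events in a short time window are warped back by the drone's ego-motion, a mean-timestamp image is accumulated, and pixels whose mean timestamp deviates strongly from the window average are flagged as dynamic. The function name, the pure-translation flow approximation, and the threshold are all assumptions made for this sketch.

```python
import numpy as np

def dynamic_obstacle_mask(events, ego_flow, dt, shape, thresh=0.5):
    """Flag pixels likely belonging to independently moving objects.

    Hypothetical sketch: warp each event back to the start of the window
    using the ego-motion-induced image flow (assumed here to be a pure
    translation in pixels/second), accumulate a normalized mean-timestamp
    image, and threshold its deviation from the window mean. Events from
    static structure collapse onto the same pixels with timestamps spread
    across the window, so their mean stays near the window average; a
    moving object leaves a spatial trail of systematically early/late
    timestamps that deviates from it.

    events: (N, 3) array of (x, y, t), with t in [0, dt]
    ego_flow: (fx, fy) apparent image shift in pixels/second (e.g. from IMU)
    shape: (height, width) of the sensor
    """
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    # Warp events back to the window start along the ego-motion flow.
    xw = np.clip(np.rint(x - ego_flow[0] * t).astype(int), 0, shape[1] - 1)
    yw = np.clip(np.rint(y - ego_flow[1] * t).astype(int), 0, shape[0] - 1)

    ts_sum = np.zeros(shape)
    count = np.zeros(shape)
    np.add.at(ts_sum, (yw, xw), t / dt)   # accumulate normalized timestamps
    np.add.at(count, (yw, xw), 1.0)

    mean_ts = np.divide(ts_sum, count, out=np.zeros(shape), where=count > 0)
    rho = np.mean(t / dt)                 # window-mean normalized timestamp
    # Dynamic pixels: mean timestamp deviates from the window mean even
    # after ego-motion compensation.
    return (count > 0) & (np.abs(mean_ts - rho) > thresh * rho)
```

In this toy model, events from a wall swept across the sensor by the drone's own rotation all warp back to the same pixels and average out, while a ball thrown at the drone produces residual motion that the threshold picks up. The real system additionally has to generate evasive motor commands within the quoted 3.5 ms budget, which this sketch does not address.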

