Watch this self-flying drone zip around obstacles at 30 MPH

Last week, a human-controlled drone flew into power lines in Los Angeles, leaving hundreds of people without electricity. If the drone had had its own set of eyes to guide it, it might have steered itself away from the wires and prevented the accident.

Most toy drones don’t have that yet, but engineers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are working on a program that allows a drone to sense its environment accurately as it flies at 30 mph, all by itself, without human help.

“Everyone is building drones these days, but nobody knows how to get them to stop running into things,” said CSAIL PhD student Andrew Barry in a statement.

Self-driving cars often use LIDAR, a type of remote-sensing technology, but that’s too heavy for a small drone to carry. Cars also need maps to navigate the streets, and companies like Google and Apple have poured millions into building reliable ones. Doing the equivalent for low-altitude airspace, which is filled with trees, power lines, and other obstacles, just isn’t feasible.

“If we want drones that can fly quickly and navigate in the real world, we need better, faster algorithms,” Barry said.

His system is about 20 times faster than similar existing software, according to an MIT press release. In the video above, you can see the drone zipping through trees with relative ease, again without any human intervention. It’s able to spot even small, thin branches in its flight path with enough time to shift course on its own. It’s pretty impressive.

The program that Barry and his team developed allows drones to build a map of their environment in real time. Cameras attached to the 1-pound drone’s wings survey its surroundings at 120 frames per second. Most algorithms use this data to detect and map any objects just a few meters out. But that’s computationally intensive, and it limits both the flight time and the speed a drone can achieve. This is why most commercially available drones don’t fly faster than 5 or 6 mph, according to the researchers.
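To see why that per-frame work is so heavy, here is a minimal, illustrative sketch (Python with NumPy, not the team’s code) of the conventional approach: brute-force block-matching stereo that, for every pixel in every frame, searches over dozens of candidate disparities to estimate depth.

```python
import numpy as np

def dense_disparity(left, right, max_disparity=64, block=5):
    """Brute-force block matching on two grayscale images: for every pixel,
    try every candidate disparity and keep the one whose left/right patches
    differ the least. This is the expensive, per-pixel search."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disparity, w - half):
            patch_l = left[y - half:y + half + 1,
                           x - half:x + half + 1].astype(float)
            costs = [np.abs(patch_l -
                            right[y - half:y + half + 1,
                                  x - d - half:x - d + half + 1].astype(float)).sum()
                     for d in range(max_disparity)]
            disp[y, x] = int(np.argmin(costs))  # larger disparity = closer object
    return disp
```

At 120 frames per second, doing that search for every pixel of every frame is exactly the kind of load a small, battery-powered drone can’t afford.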

The MIT team realized that in most scenarios the environment doesn’t change all that much between frames, which meant it was enough to compute depth at a single fixed distance of about 10 meters ahead and carry that information forward as the drone flies. This is an approach other computer vision experts are taking to cut down the amount of processing machines need to do to understand moving images. Perfecting this kind of technique will be useful not only for autonomous drones and self-driving cars, but also for things like better video search.
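By contrast, here is a hedged sketch of that fixed-distance idea, again in plain NumPy rather than the team’s actual code (which is linked below): it checks only the single disparity corresponding to the one depth the drone cares about, so each pixel needs one comparison instead of a whole search.

```python
import numpy as np

def obstacles_at_fixed_depth(left, right, disparity, block=5, threshold=8.0):
    """Check a single stereo disparity (i.e. a single distance): pixels whose
    left/right patches agree at that disparity are treated as obstacles
    sitting roughly at the chosen depth. One comparison per pixel instead of
    a search over many candidate disparities."""
    h, w = left.shape
    half = block // 2
    mask = np.zeros((h, w), dtype=bool)
    for y in range(half, h - half):
        for x in range(half + disparity, w - half):
            patch_l = left[y - half:y + half + 1,
                           x - half:x + half + 1].astype(float)
            patch_r = right[y - half:y + half + 1,
                            x - disparity - half:x - disparity + half + 1].astype(float)
            # Low mean absolute difference => the patches line up at this
            # disparity, so something sits near the fixed distance. (A real
            # system would also reject low-texture patches that match anywhere.)
            if np.abs(patch_l - patch_r).mean() < threshold:
                mask[y, x] = True
    return mask
```

The fixed disparity would come from the stereo geometry (roughly focal length times camera baseline divided by the chosen distance); the block size and threshold here are illustrative placeholders, not values from the MIT system.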

Barry’s algorithms are freely available online for others to tweak and improve.
