Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a sophisticated computer vision system for flying robots.
NanoMap allows drones to navigate through dense environments at 20 miles per hour.
Drones’ abilities are taking off
Today’s commercial drones far exceed the capabilities of their predecessors. But if they are to take on more complex or commonplace roles in the workplace, they need to get much smarter and safer.
The vast majority of drones deployed in construction, media, or agriculture applications have some form of computer vision. At the very least they can sense obstacles directly in front of them and avoid collisions.
Some, like DJI’s latest model and those enhanced with Intel’s RealSense technology, can detect obstacles in multiple directions and plot a path around them.
However, CSAIL’s NanoMap system aims to take that awareness to the next level.
As outlined in a new research paper, NanoMap integrates sensing more deeply with control. It works from the starting point that any drone’s position in the real world is uncertain over time.
The new system allows a drone to model and account for that uncertainty when planning its movements – as this video reveals.
Navigating around warehouses to check stock levels or move items from one place to another is just one example of the kind of dynamic environments where drones will need to operate safely.
This ability will be vital in helping drones’ commercial applications to spread.
SLAM dunk scenarios
Developing drones that can build a picture of the world around them and react to shifting environments is a challenge. This is particularly true when computational power tends to be proportional to weight.
Simultaneous localisation and mapping (SLAM) technology is a common way for drones to build a detailed picture of their location from raw data. However, this technique is unreliable at high speed, which makes it unsuitable for tight spaces, or environments where objects are being moved, or the layout is dynamic.
“Overly confident maps won’t help you if you want drones that can operate at higher speeds around humans,” said graduate student Pete Florence, lead author on a related paper.
“An approach that is better aware of uncertainty gets us a much higher level of reliability in terms of being able to fly in close quarters and avoid obstacles.”
NanoMap works with uncertainty
Using NanoMap, a drone can build a picture of its surroundings by stitching together a series of depth-sensing measurements. Not only can the drone plan around what it currently sees, but it can also plan how to move through areas it can't see yet, based on what it has already seen.
“It’s like saving all of the images you’ve seen of the world as a big tape in your head,” explains Florence. “For the drone to plan its motions, it essentially goes back into time to think individually of all the different places that it was in.”
NanoMap operates under an assumption that humans are familiar with: if you know roughly where something is and how large it is, you don’t need much more detail if your only aim is to avoid crashing into it.
By accounting for uncertainty in its measurements, the NanoMap system has reduced the team's crash rate to just two per cent.
“The key difference to previous work is that the researchers created a map consisting of a set of images with their position uncertain, rather than just a set of images with their positions and orientation,” says Sebastian Scherer, a systems scientist at Carnegie Mellon University’s Robotics Institute.
“Keeping track of this uncertainty has the advantage of allowing the use of previous images, even if the robot doesn’t know exactly where it is. This allows for improved planning.”
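The idea of planning against a remembered "tape" of uncertain views can be illustrated with a minimal sketch. This is not the actual NanoMap code: the class names, the 2D simplification, and the Gaussian-style uncertainty model (inflating each obstacle by the positional uncertainty accumulated since its frame was captured) are all assumptions made for illustration.

```python
import math
from dataclasses import dataclass, field

@dataclass
class DepthFrame:
    points: list                    # obstacles seen in this frame, (x, y) in metres
    offset: tuple = (0.0, 0.0)      # estimated drone translation since this frame
    sigma: float = 0.0              # std. dev. of that translation estimate (metres)

class FrameHistory:
    """Sketch of uncertainty-aware obstacle checking: keep a short
    history of depth frames, and when testing a candidate position,
    inflate each remembered obstacle by the positional uncertainty
    accumulated since that frame was captured."""

    def __init__(self, safety_radius=0.5, k_sigma=2.0):
        self.frames = []
        self.safety_radius = safety_radius   # drone radius plus margin
        self.k_sigma = k_sigma               # confidence multiplier on sigma

    def add_frame(self, points):
        # A fresh depth measurement has zero offset and zero uncertainty.
        self.frames.append(DepthFrame(list(points)))

    def propagate(self, dx, dy, step_sigma):
        # Each motion step shifts every stored frame's offset and grows
        # its uncertainty (variances add for independent noise).
        for f in self.frames:
            ox, oy = f.offset
            f.offset = (ox + dx, oy + dy)
            f.sigma = math.hypot(f.sigma, step_sigma)

    def is_safe(self, x, y):
        # (x, y) is a candidate position relative to the current pose.
        for f in self.frames:
            ox, oy = f.offset
            # Older frames carry more uncertainty, so their obstacles
            # get a larger keep-out margin.
            margin = self.safety_radius + self.k_sigma * f.sigma
            for (px, py) in f.points:
                # Obstacle position re-expressed relative to current pose.
                if math.hypot(px - ox - x, py - oy - y) < margin:
                    return False
        return True
```

A quick usage example: record an obstacle 3 m ahead, fly 1 m forward with 10 cm of motion noise, then test candidate positions. A point well clear of the inflated obstacle passes, while one inside the margin fails, and the margin keeps growing as uncertainty accumulates.

```python
hist = FrameHistory()
hist.add_frame([(3.0, 0.0)])       # obstacle 3 m ahead at capture time
hist.propagate(1.0, 0.0, 0.1)      # move 1 m forward, 10 cm noise
hist.is_safe(1.0, 0.0)             # candidate 1 m ahead: safe
hist.is_safe(1.9, 0.0)             # candidate near obstacle: unsafe
```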
Internet of Business says
As drones spread into more and more vertical applications, such as farming, manufacturing, critical infrastructure maintenance, building, environmental monitoring, security, law enforcement, broadcasting, autonomous cargo, deliveries, and even public transport, demonstrating their safety around human beings and in complex environments becomes ever more important.
Light-touch regulation is a good idea, but public safety must remain paramount.
Over time, the regulatory environment will relax to accommodate drones as safety improves. But until then, it will remain cautious and conservative – except in remote areas, such as over the sea at offshore wind farms or oil rigs.
MIT should be congratulated for this latest innovation in drone safety, but progress remains incremental.
The core lesson is this: a two per cent crash rate is impressive, but it is still unacceptable. In enterprise software or cloud services, no one would accept 98 per cent reliability, so it is certainly not acceptable for industrial machinery operating in public spaces.
Battery-operated, rotary-wing autonomous vehicles have multiple points of failure. In smart cities, factories, or other public spaces, a single catastrophic incident could set back the industry for years. It is incumbent on all of us to ensure that no one is harmed.