Social media giant Facebook just announced Habitat 2.0, its upgraded simulation platform that allows researchers to train robots in virtual environments. The upgrade moves the company closer to "embodied AI," the technology that might one day allow robots to perform everyday tasks.
Habitat 2.0 is the newer version of Facebook's original Habitat platform. Just like its predecessor, 2.0 lets researchers train at speed while rendering the high-level details needed to thoroughly prepare the robots. These details include objects and obstacles the robots may encounter in household settings, like countertops, chairs, toys, and boxes, so that the robots can learn to navigate real-world spaces alongside humans.
Eventually, such robots could handle simple commands like "load the dishwasher" or "get me a soda from the fridge." The implications reach further, though: the same technology could help visually impaired users take a walk around the block, recognizing obstacles and steering them clear.
Compared to physical training, virtual training saves both time and money, and it is accessible to far more researchers. Facebook is hoping that Habitat will make it easier to quickly train assistive robots, especially those designed to tackle boring household chores (I'll take two, thank you!).
To be successful, however, the robots will first need to learn how to navigate a variety of surfaces, room layouts, and other elements that properly mimic real-world environments. That's precisely where Habitat 2.0 comes in handy. It can quickly train these robots across all kinds of environments (like multi-story homes and office conference rooms), accounting for tons of obstacles and other variables, instead of researchers spending months or years letting them roam around house after house in real life.
Habitat 2.0 also takes on another tough challenge: object interaction. Previous datasets, like Replica, were static and didn't allow for this, despite it being an important part of the training. With 2.0, robots can now "practice" rolling over carpet, grabbing brushes, and so on.
Dhruv Batra, a research scientist at Facebook, stated, "With this new data set and platform, AI researchers can go beyond just building virtual agents in static 3D environments and move closer to creating robots that can easily and reliably perform useful tasks like stocking the fridge, loading the dishwasher, or fetching objects on command and returning them to their usual place."
Replica was also upgraded to ReplicaCAD. Its humble library of 18 3D scans was expanded to over 110 living area layouts and includes nearly 100 objects; it can also add realistic clutter and allow the robots to “interact” with doors and other elements.
The platform is also dramatically faster than most other 3D simulators out there. Where other platforms can only simulate an assistive robot interacting at around 400 steps per second (SPS), Habitat 2.0 easily handles 1,200 SPS, with a maximum of 26,000 with extra GPUs. It'll be interesting to see how the training ultimately goes and whether we ever do get consumer-level assistive robots to handle household chores.
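For readers curious what an SPS figure actually measures, here's a minimal toy sketch in Python. This is not Habitat's real API; the `ToySimulator` class and its trivial physics are hypothetical stand-ins used purely to show how steps per second is computed as simulation ticks per wall-clock second.

```python
import time

class ToySimulator:
    """Hypothetical stand-in for a physics simulator.

    Each call to step() advances the simulated world by one tick;
    a real simulator like Habitat would also run collision checks,
    sensor simulation, and rendering inside each step.
    """
    def __init__(self):
        self.position = 0.0

    def step(self, action):
        # Trivial "physics" update so the loop does real work.
        if action == "move_forward":
            self.position += 0.25
        return self.position

def measure_sps(sim, num_steps=100_000):
    """Return steps per second: ticks completed per wall-clock second."""
    start = time.perf_counter()
    for _ in range(num_steps):
        sim.step("move_forward")
    elapsed = time.perf_counter() - start
    return num_steps / elapsed

if __name__ == "__main__":
    sps = measure_sps(ToySimulator())
    print(f"{sps:,.0f} SPS")
```

A do-nothing loop like this runs at millions of SPS; the 400 vs. 1,200 figures above reflect simulators doing full physics and rendering work inside each step, which is why that gap matters for training time.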