David Mindell, MIT Professor in Aerospace Engineering and the History of Technology, is the author of five books, including the recent “Our Robots, Ourselves.” He’s also a pilot and an expert on automated underwater subs who has collaborated with Titanic discoverer Robert Ballard and others in more than 25 underwater explorations. In 2015, he launched a Cambridge, Mass., startup called Humatics that is developing a microlocation system for positioning people and objects down to the millimeter scale.
“Our mission is to revolutionize the way we locate, navigate, and collaborate between people and machines,” says Mindell, CEO of the 25-person company. The Humatics technology was not developed at MIT, but “we build on work from MIT,” says Mindell. The company has also benefited from MIT’s various venture startup services, and has been selected as a member of MIT ILP’s elite STEX25 (Startup Exchange 25) accelerator program.
Humatics, which counts Ballard and Apollo 15 astronaut and MIT graduate Dave Scott among its scientific advisors, is initially targeting its technology at industrial automation. The primary mission is to streamline the choreography of robots and humans working together in factories. The application has more in common with deep-sea exploration than you might expect.
“Humatics incorporates lessons we have learned in robotics and automation in extreme environments, such as the deep ocean, outer space, aviation, and some military environments,” explains Mindell. “Our microlocation technology is based on work I did in the 1990s on sonar for high precision navigation of robots in deep ocean. From a navigation standpoint, factories have a lot in common with the deep ocean. There’s no GPS, a lot of noise, and only a few fixed beacons to work from.”
For years, Mindell has sought to translate sonar technologies into new terrestrial applications, but only in the last few years has it become feasible, driven in part by developments in automotive radar. “We’re driving the Moore’s Law for radar,” says Mindell. “Costs are coming down to the point where small, short-range radar can be brought to bear on robotics and autonomous systems.”
Our mission is to revolutionize the way we locate, navigate, and collaborate between people and machines.
The RF-based Humatics technology is unlike traditional sonar, radar, or LIDAR in that it doesn’t do “blobology on reflective echoes,” says Mindell. “However, it uses a lot of the same core technologies. Unlike GPS, it can work indoors, and is not limited to an uncertainty range of three to nine meters. Almost all robotics and human work happens within a much smaller circle.”
Humatics technology won’t replace all the sensor systems in a factory, says Mindell. Yet, he adds, “Right now there’s no solution for millimeter-scale precise positioning in the harsh environment of manufacturing, where there are strange lighting conditions and lots of stuff moving around.”
The Humatics technology differs from most sensor and machine vision solutions used with robotics in that it doesn’t perform blobology, which Mindell describes as “looking at a big dot cloud of echoes and trying to apply different algorithms to get 80 percent certainty.” Instead, the microlocation system can pinpoint multiple transponders within a 30-meter range, each of which has a unique digital tag.
“You might wear a wristband, or have safety equipment with built-in transponders, or they can be attached to robots, workpieces, or engine parts,” says Mindell. “We know exactly what and where each transponder is and how far it is from other transponders -- not just how far an object is from your hand, but how far it is from a particular point on your hand.”
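For illustration only (this is not Humatics’ actual implementation), the geometry behind range-based microlocation can be sketched as trilateration: given a few fixed beacons at known positions and a measured distance from each beacon to a tagged transponder, a least-squares solve recovers the transponder’s position. The beacon coordinates, noise level, and function name below are hypothetical.

```python
# Minimal trilateration sketch: estimate a transponder's 2-D position from
# range measurements to fixed beacons at known positions (hypothetical values).
import numpy as np

def trilaterate(beacons: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a 2-D position from >=3 beacon positions and measured ranges.

    Subtracting the first beacon's range equation from the others linearizes
    the problem, so a single least-squares solve yields the position.
    """
    p0, r0 = beacons[0], ranges[0]
    # Each row i satisfies: 2*(p_i - p0) . x = r0^2 - r_i^2 + |p_i|^2 - |p0|^2
    A = 2.0 * (beacons[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(beacons[1:]**2, axis=1) - np.sum(p0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

if __name__ == "__main__":
    beacons = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0], [30.0, 30.0]])
    true_pos = np.array([12.345, 7.891])
    rng = np.random.default_rng(0)
    # Simulated millimeter-level range noise (1 mm standard deviation).
    ranges = np.linalg.norm(beacons - true_pos, axis=1) + rng.normal(0, 0.001, 4)
    print(trilaterate(beacons, ranges))  # approximately [12.345, 7.891]
```

With millimeter-level range noise, the recovered position stays within millimeters of the true one, which is the property that distinguishes this kind of ranging from probabilistic "blobology" on echoes.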
Uncaging the robots
Millimeter-scale precision is essential for the growing efforts to free robots from their cages and let them work with people. “The robotics industry is transitioning,” says Mindell. “Now the innovations are not so much in the robots as in the applications, use cases, and environments that allow them to work in human settings. Most robots are still inflexible clockwork mechanisms doing repetitive tasks. We’re interested in making robots more flexible and collaborative, and less walled off from humans. Right now, we still have a very stand-off relationship with most robots, and for good reason – they can kill you. That’s changing, but we have a long way to go in improving communications, mutual awareness, and safety.”
To improve safety and productivity in a mixed workspace, Humatics has developed spatial technology algorithms that work with the microlocation network to track mobile humans and robots. “Our Spatial Intelligence Platform™ tracks and analyzes how people and machines and parts move through the factory,” says Mindell. “We can then gather the tremendously rich information from those motion paths, and compare it to a larger database.”
Customers can augment the Humatics platform with visualization, machine learning, and analytics tools that compare daily motion paths against historical patterns. The resulting lessons can be fed back to workers, enabling continuous improvement by emulating the motions of the best workers. “Our system helps to explore the choreography of people working with machines,” says Mindell.
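As a rough sketch of what such a comparison might look like (assumed for illustration; this is not the Humatics analytics stack), the snippet below resamples a recorded motion path and a historical baseline to a common length and reports their average deviation, the kind of per-path metric that could feed a continuous-improvement dashboard. All names and coordinates are hypothetical.

```python
# Hypothetical sketch: compare a recorded motion path against a baseline path
# by resampling both to a common length and averaging point-to-point deviation.
import numpy as np

def resample(path: np.ndarray, n: int = 100) -> np.ndarray:
    """Resample an (m, 2) path to n points evenly spaced along its arc length."""
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(t, s, path[:, k]) for k in range(2)])

def mean_deviation(path: np.ndarray, baseline: np.ndarray) -> float:
    """Average distance, in meters, between a path and a baseline path."""
    a, b = resample(path), resample(baseline)
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

if __name__ == "__main__":
    # Hypothetical example: today's walk to a workstation vs. a best-practice path.
    baseline = np.array([[0.0, 0.0], [5.0, 0.0], [5.0, 3.0]])
    today = np.array([[0.0, 0.2], [4.5, 0.4], [5.2, 3.1]])
    print(f"mean deviation: {mean_deviation(today, baseline):.3f} m")
```

A real deployment would presumably use richer path-similarity measures and aggregate over many workers and shifts, but the basic idea, turning tracked motion into a number that can be compared against history, is the same.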
Beyond the factory: automobiles and drones
Humatics is already looking to other applications, including driverless cars. The microlocation technology could be used to identify the location of other vehicles, pedestrians, and infrastructure more precisely than is possible with GPS, LIDAR, or vision systems.
“Our technology enables relative positioning info with greater reliability and accuracy than is possible now with blobology-based sensors,” says Mindell. “For example, bicycles are very hard to recognize at night or in the snow and rain, but our system is immune to those conditions.”
The catch is that all the other cars, bicycles, pedestrians, and traffic infrastructure would also need transponders. Yet, the hardware is simple enough that it should be able to affordably scale to the level of today’s RFID technology.
Drones are another possibility, especially when operating in tight urban environments or interior spaces where precise positioning is crucial. “To be active in populated areas, drones need to be in very precise relationships to the people and things around them,” says Mindell.
Mindell says that when he pilots a plane, he uses GPS about 99.99 percent of the time. However, the FAA requires a ground-based navigation system backup. “It’s crazy to think that drones flying through airspace won’t have a similar requirement,” he says. “When you’re close to buildings, you still want to be operating in direct relationship with things. Our microlocation technology can provide short-lived, very precise navigational interactions.”
We’re interested in making robots more flexible and collaborative, and less walled off from humans. Right now, we still have a very stand-off relationship with most robots, and for good reason – they can kill you. That’s changing, but we have a long way to go in improving communications, mutual awareness, and safety.
The myth of full autonomy
Some of the guiding principles behind Humatics were laid out by Mindell in his 2015 book, “Our Robots, Ourselves: Robotics and the Myths of Autonomy,” which argued that the drive to create fully autonomous robots risks missing out on the greater potential of robotics. “For autonomous systems to be useful, they need to situate themselves in human environments,” says Mindell. “The highest form of technology is that which gives you exactly the right level of automation you need at the time.”
“The more sophisticated companies are designing cars that are your collaborator and friend, that can learn from your driving habits, work with the environment, and draw answers from the cloud,” he says. “These robots bring new levels of decision making embedded within human context. Those relationships should be built into the core of autonomous systems.”
Even if the automobile industry moves to a fully self-driving experience instead of an advanced driver-assistance system (ADAS) interaction, the cars won’t be as autonomous as advertised, says Mindell. “The driver will still pick the destination and possibly change it en route.”
People also tend to overlook the hidden human inputs baked into a car’s programming. “Autonomous car projects draw on huge databases of human drivers,” says Mindell. “There are thousands of human inputs and assumptions from programmers about what constitutes a pedestrian and how fast the car is moving. Even in a driverless car, there’s always a wrapper of human control. All these systems are networks of people and machines.”
The Humatics technology may well provide one essential piece of the puzzle for connecting the robot and human worlds. “The boundaries of autonomous, remote, and manual control are blurring,” says Mindell. “We’re building the navigational envelopes that allow those robots to work with precision, safety, and collaboration in human environments. The integration is where the action is.”