
10.10.23-Showcase-Seoul-Manus Robotics

Startup Lightning Talk
FAYE WU: Hi, good afternoon. My name is Faye. I'm the co-founder and CTO of Manus Robotics. I studied mechanical engineering at MIT. My research was focused on the design and control of wearable robots. Manus Robotics was actually co-founded by my PhD advisor, Professor Harry Asada, and his graduate students, including me, to develop wearable technologies that can greatly enhance human functionality and productivity.
Today I'm going to tell you a little bit about our latest project, a wearable sensor called Hemyo that enables intuitive and effective gesture-based human-machine interaction. This project is currently supported by the National Science Foundation. A study in 2022 reported that, on average, we spend over seven hours a day interacting with devices around us.
That's thousands of keystrokes, button presses, and screen taps, if not more, plus speaking to our microphones and waving at cameras, whatever we have to do to get these devices to respond the way we want them to. We don't really notice all the effort we put into this process, because we're so used to it, until something goes wrong. I'm sure many of us have had the experience of yelling at Alexa or Siri for misinterpreting our commands or providing unwanted responses when we didn't even talk to it.
It can be extremely frustrating when a device we rely on doesn't behave predictably because it has trouble understanding the user's intent. In extreme cases, when using a device is more trouble than it's worth, these frustrations can even lead to device abandonment, one example being assistive devices for people with disabilities. Many of these use electromyography, or EMG, to detect very weak signals coming from the muscles to control their actions. Because EMG signals are inherently noisy, they are easily corrupted, and in general they take a long time to learn to use well.
So it's not surprising that up to 45% of these devices end up in cupboards collecting dust rather than helping people in need achieve a better quality of life. More recent development in human-machine interfaces has focused on voice- and vision-based solutions, but many practical problems remain to be solved.
In particular, the privacy issues commonly associated with these types of solutions may be the biggest challenge to overcome before there is general public acceptance and adoption. After all, nobody wants to be monitored 24/7 by some company and potentially have their private data sold to who knows where for even more exploitation. So with our experience designing wearable robots and sensors at MIT, we developed Hemyo, a wristband sensor that detects hand gesture commands. We hope it can help make device interaction more accessible, more intuitive, and safer.
With our simple API, the sensor can be easily integrated into a range of applications, from assistive devices and industrial IoT to smart home automation and even AR/VR. The technology uses modified near-infrared spectroscopy to optically measure local blood flow changes in different layers of tissue around the arm while a gesture is performed. After we capture the blood flow pattern of a gesture during calibration, we use generative networks to create more patterns that are curated and personalized to that user, then use all of this data to train our algorithm, which converts blood flow patterns into unique gesture codes that are unaffected by natural variations in gesture performance.
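The talk doesn't detail the internals of this calibration-and-augmentation step, but the idea of turning a few recorded patterns into a larger personalized training set can be sketched roughly as follows. All names are hypothetical, and the simple noise-and-scaling model here is only an illustrative stand-in for the generative networks mentioned in the talk:

```python
import numpy as np

def augment_patterns(calibration_samples, n_synthetic=50, noise_scale=0.05, rng=None):
    """Create synthetic blood-flow patterns from a few calibration recordings.

    Illustrative stand-in for generative augmentation: jitter recorded samples
    with random amplitude scaling and additive noise to mimic the natural
    variation in how a person performs the same gesture.
    """
    rng = np.random.default_rng(rng)
    samples = np.asarray(calibration_samples, dtype=float)
    synthetic = []
    for _ in range(n_synthetic):
        base = samples[rng.integers(len(samples))]   # pick a real recording
        scale = 1.0 + rng.normal(0.0, noise_scale)   # amplitude variation
        jitter = rng.normal(0.0, noise_scale, size=base.shape)  # sensor noise
        synthetic.append(scale * base + jitter)
    return np.stack(synthetic)

# A few real recordings of one gesture (here, fake sinusoidal "blood flow"
# traces) become a much larger personalized training set.
real = [np.sin(np.linspace(0, 2 * np.pi, 100)) for _ in range(3)]
training_set = augment_patterns(real, n_synthetic=200)
print(training_set.shape)  # (200, 100)
```

The augmented set would then feed whatever classifier maps blood-flow patterns to gesture codes.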
That's why we're able to achieve over 98% real-time gesture detection accuracy with just a 30-second calibration process. So far we can distinguish up to 15 gestures. A unique advantage of this sensor is that it can be actively controlled by tuning the light intensity, the pulse pattern, and the receiver sensitivity, so that we can capture blood flow from deeper tissue layers, even through clothing, and still get a signal of higher granularity and quality than EMG, without the extensive signal processing or user training that EMG requires.
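The active control described above, trading emitter power, pulse timing, and receiver gain against sensing depth, can be pictured as a small configuration policy. The parameter names, units, and depth thresholds below are invented purely for illustration and are not from the talk:

```python
from dataclasses import dataclass

@dataclass
class SensorConfig:
    led_intensity: float   # normalized emitter drive level, 0.0-1.0 (hypothetical)
    pulse_width_us: int    # emission pulse width in microseconds (hypothetical)
    receiver_gain: float   # photodetector amplifier gain (hypothetical)

def config_for_depth(depth_mm: float) -> SensorConfig:
    """Illustrative rule: deeper tissue needs stronger light and higher gain."""
    if depth_mm < 5.0:
        return SensorConfig(led_intensity=0.3, pulse_width_us=50, receiver_gain=1.0)
    if depth_mm < 15.0:
        return SensorConfig(led_intensity=0.6, pulse_width_us=100, receiver_gain=2.0)
    return SensorConfig(led_intensity=1.0, pulse_width_us=200, receiver_gain=4.0)

# Sensing through clothing or fat layers would push the controller toward
# the stronger settings.
cfg = config_for_depth(10.0)
print(cfg.led_intensity, cfg.pulse_width_us, cfg.receiver_gain)
```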
Our algorithm can also be trained 600 times faster using 1,000 times less memory than conventional solutions in this space, which is why we can run everything efficiently on a simple edge computing device. And best of all, there is no privacy concern when it comes to using Hemyo on a daily basis.
We have conducted field tests with stroke survivors and spinal cord injury patients from eight different facilities in the greater Boston area, and all participants responded favorably to using Hemyo to control a wearable gripper that assists them in performing simple activities of daily living. With these exciting results, we have started working with prosthetics and exoskeleton manufacturers in the US and Europe to develop a version of Hemyo that can be easily integrated into their products and improve their user experience.
At the same time, we are in conversations with companies in the AR/VR and consumer electronics spaces to explore other business opportunities for the sensor. As for our activities in Asia, we are currently evaluating sensors from two Japanese companies, with the goal of developing a version of the Hemyo hardware that incorporates their technology. We welcome other co-development opportunities with Korean companies as well, especially in the sensor and hardware manufacturing sector.
And of course, we're still looking for more partners and customers who are interested in adopting new types of HMI for their products. So if you're curious and would like to know more, please talk to me at my booth. Thank you.