10.3.23-Showcase-Osaka-Manus Robotics

Startup Exchange Video | Duration: 6:46
October 3, 2023

    FAYE WU: Good afternoon, everyone. I'm Faye. I'm the CEO and co-founder of Manus Robotics. As you can see over here, I got all of my degrees from MIT. There's actually a term for people like me who never leave. It's called a lifer. And of course, there's another tier above this, the super-lifer, who just becomes a professor here. I'm not a super-lifer yet, but we'll see.

    So Manus Robotics was founded by my PhD advisor Harry Asada, and [INAUDIBLE] students, including me, to develop wearable technologies that can greatly enhance human functionality and productivity. So today I'm going to tell you a little bit about our latest project, a wearable sensor called Hemyo that can enable intuitive and effective gesture-based human-machine interaction. This project is currently being supported by the National Science Foundation.

    So a study in 2022 reported that on average, we spend over seven hours a day interacting with various devices around us. We don't really notice all the work and effort involved in controlling these devices, because we're just so used to it, until something goes wrong, right? So I'm sure many of us have had some experience yelling at our Alexa or Siri for misinterpreting our commands, or for giving unwanted responses when we didn't even talk to it.

    And it can be extremely frustrating when a device we rely on cannot behave very predictably, just because it has trouble understanding the user's intent. And in extreme cases, when using a device is more trouble than it's worth, these frustrations can even lead to device abandonment, one example being assistive devices for people with disabilities. Many of these use EMG or electromyography to measure weak electrical signals coming from the muscle to control the actions of these devices.

    But the EMG signals are inherently noisy, can be very easily corrupted, and just in general take a long time to learn to use well. So it's not very surprising to see that over 45% of these devices end up sitting in cupboards collecting dust, instead of really helping people improve their quality of life. The latest developments in human-machine interface have been focusing on voice and vision-based solutions, because of the rise of AI.

    But there are still many practical problems that remain to be solved. In particular, the privacy issue commonly associated with solutions like these can be a huge hurdle to their general acceptance and adoption. After all, nobody really wants to be monitored 24/7 and face the possibility of these companies selling your private data to who knows where for even more exploitation.

    So with our experience developing wearable robots and sensors at MIT, we developed Hemyo as a low cost and compact wristband sensor that can detect gesture commands, and hopefully can help improve device interaction and make it more accessible, intuitive, and safe. We have a simple API to enable the sensor to be easily integrated into many different applications, ranging from assistive devices, industrial IoT, to smart home automation, and even AR/VR.
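The integration idea described above can be sketched in code. This is a purely hypothetical illustration of what a gesture-callback API for a wristband sensor might look like; the class and method names are invented for this sketch and are not Manus Robotics' actual API.

```python
# Illustrative sketch only: a minimal gesture-to-callback dispatcher, the kind
# of surface a simple sensor API might expose. Names are hypothetical.

class HemyoSensor:
    """Toy wristband sensor wrapper that maps gesture names to callbacks."""

    def __init__(self):
        self._handlers = {}

    def on_gesture(self, gesture_name, callback):
        # Register a callback to run whenever the named gesture is detected.
        self._handlers[gesture_name] = callback

    def dispatch(self, detected_gesture):
        # In a real device, this would be driven by the sensor's event loop.
        handler = self._handlers.get(detected_gesture)
        if handler:
            handler()

# Example: wiring a gesture to a smart-home action.
sensor = HemyoSensor()
sensor.on_gesture("fist", lambda: print("lights off"))
sensor.dispatch("fist")
```

The same registration pattern would apply whether the downstream consumer is an assistive gripper, an IoT controller, or an AR/VR input layer.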

    This innovative technology utilizes modified near-infrared spectroscopy to optically measure local blood flow changes caused by muscle contractions from different layers of tissue around the arm. After we capture a particular set of blood flow patterns from a gesture during the calibration process, we use generative networks to create personalized, curated patterns similar to what we just captured, and use all of that to help train our algorithm and create a unique gesture code that's unaffected by natural variations in gesture performance.

    As a result, we're able to achieve over 98% gesture detection accuracy with just a 30 second calibration. And so far we can distinguish up to 15 gestures at the same time. A unique advantage of this sensor is that it can be accurately controlled by tuning the light intensity, the pulse pattern, and the receiver sensitivity, such that we can capture blood flow from deeper layers of tissue, even through clothes, and still get a signal of greater granularity and quality compared to EMG, without the extensive signal processing or user training that EMG requires.

    Our algorithm can also be trained 600 times faster and uses 1,000 times less memory compared to conventional machine learning solutions in this space. And so that's why we're able to run everything very efficiently on a very small edge computing device. And best of all, there is no privacy concern associated with using our sensor.

    So we field-tested with stroke survivors and spinal cord injury patients from eight medical facilities in the greater Boston area by asking them to use a Hemyo-enabled assistive wearable gripper to help them with daily living activities. All participants responded favorably to this test. And with this exciting result, we have started to work with top prosthetics and exoskeleton manufacturers in Europe and the US to develop a version of Hemyo that can be easily integrated into their products.

    At the same time, we are involved in conversations with companies in AR/VR and consumer electronics to explore other business opportunities for this sensor. As for our activities in Japan, we are currently evaluating sensors from two Japanese companies with the goal of working together with them to co-develop an upgraded version of the Hemyo sensor hardware using their technologies.

    And we would like to engage in more collaboration opportunities with other Japanese companies, particularly in the sensor and hardware manufacturing sectors. And, of course, we're looking for more partners and customers who are interested in adopting new types of HMI for their products or their processes. So thanks for listening. And if you are interested or curious to learn more about us, please talk to me after the session. Thank you.
