5.10.23-Ecosystem-Manus-Robotics

Startup Exchange Video | Duration: 5:01
May 10, 2023

    FAYE WU: Hi. Good afternoon. My name is Faye. I studied mechanical engineering at MIT, receiving my bachelor's, master's, and PhD from this wonderful institute. Manus Robotics was cofounded by my PhD advisor, Professor Harry Asada, and his PhD students, including me, to develop wearable technologies that can empower independent functional living.

    Today, I'm going to tell you a little bit about our latest project, a wearable sensor called Hemyo that can detect hand gestures. This project recently won an SBIR Phase II award from the National Science Foundation.

    A 2022 report found that, on average, we spend more than seven hours a day interacting with the various devices around us. We don't notice the millions or even billions of keystrokes and button taps involved in this process because the resulting device responses are predictable and reliable, so these actions are seamlessly integrated into our lives.

    But when there is a problem with the human-machine interface, or HMI for short, it can be extremely frustrating: the machine starts to behave unpredictably because it doesn't understand the user's actual intent. I'm sure many of us have had the experience of yelling at Alexa or Siri for misunderstanding a command or giving an unprompted response.

    And in extreme cases, this bad user experience caused by HMI issues can even lead to device abandonment. One example is assistive devices, which have a reported abandonment rate of 35% to 45%. The latest advances in HMI technology have focused on voice- and vision-based solutions, but many problems remain to be solved, especially around privacy. In addition, mounting cameras or microphones all over the place can be quite challenging for people with disabilities or the elderly, which can really limit their access to these more advanced HMIs.

    So we developed Hemyo, a gesture recognition wearable. The user can wear it on their wrist or somewhere on their forearm to very intuitively control devices across different types of applications, from assistive technology to AR/VR, smart home automation, and even industrial IoT.

    So this innovative technology employs an array of near-infrared sensors to optically measure local blood flow changes caused by muscle contractions in different layers of tissue. Our patent-pending algorithm converts these unique blood flow patterns into gesture codes that are unaffected by natural variations in how a gesture is performed. So far, we're able to distinguish up to 15 gestures and achieve an accuracy of 98% with just a 30-second calibration sequence.
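
    Manus hasn't published how Hemyo's algorithm works, so purely as an illustration of the kind of pipeline described above, here is a minimal Python sketch: windowed multi-channel near-infrared readings are reduced to an amplitude-invariant "gesture code," a short calibration stores one template per gesture, and live windows are matched to the nearest template. The channel count, window size, and nearest-template matching are all assumptions for this sketch, not the company's actual method.

        import numpy as np

        N_CHANNELS = 8    # assumed number of near-infrared channels in the array
        WINDOW = 50       # assumed samples per gesture window

        def gesture_code(window):
            """Reduce one (WINDOW, N_CHANNELS) block of blood-flow readings to a
            fixed-length code that ignores overall signal amplitude."""
            x = window - window.mean(axis=0)        # remove per-channel baseline
            rms = np.sqrt((x ** 2).mean(axis=0))    # per-channel activation energy
            total = np.linalg.norm(rms)
            return rms / total if total > 0 else rms

        def calibrate(labeled_reps):
            """Short calibration: average the codes of a few repetitions of each
            gesture into one stored template (a stand-in for the 30-second step)."""
            return {name: np.mean([gesture_code(w) for w in reps], axis=0)
                    for name, reps in labeled_reps.items()}

        def classify(window, templates):
            """Return the gesture whose template is nearest to this window's code."""
            code = gesture_code(window)
            return min(templates, key=lambda g: np.linalg.norm(code - templates[g]))

        # Toy usage with synthetic data: two "gestures" activating different channels.
        rng = np.random.default_rng(0)
        def fake_window(active):
            w = rng.normal(0.0, 0.05, (WINDOW, N_CHANNELS))
            w[:, active] += np.sin(np.linspace(0, np.pi, WINDOW))[:, None]
            return w

        templates = calibrate({"pinch": [fake_window([0, 1]) for _ in range(5)],
                               "fist":  [fake_window([4, 5]) for _ in range(5)]})
        print(classify(fake_window([0, 1]), templates))   # -> "pinch"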

    A unique advantage of this sensor is that it can be actively controlled: by tuning the light intensity, the receiver sensitivity, and the pulse patterns, we can capture blood flow activity from deeper tissue layers and even through clothing. Compared to the industry standard for measuring muscle activity, electromyography or EMG, Hemyo obtains signals of greater granularity and quality without the extensive signal processing that EMG requires. Our algorithm also needs a much shorter training time and far less memory than conventional machine-learning-based pattern recognition approaches, so we're able to run everything very efficiently on a small edge computing device.
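
    The actual Hemyo tuning loop is proprietary, so the following is only an assumed sketch of what "actively controlled" can mean for an optical sensor: hypothetical emitter-current, receiver-gain, and pulse-width parameters are nudged until the normalized return signal sits in a usable band, boosting power when the signal is too weak (deeper tissue, or clothing in the optical path) and backing off near saturation.

        from dataclasses import dataclass

        @dataclass
        class NirConfig:
            led_current_ma: float = 20.0   # emitter light intensity (assumed knob)
            pd_gain: float = 1.0           # receiver sensitivity (assumed knob)
            pulse_width_us: float = 100.0  # pulse pattern; wider pulses deposit
                                           # more energy and can reach deeper tissue

        def autotune(read_amplitude, cfg, lo=0.2, hi=0.8, max_steps=20):
            """Adjust drive power and gain until the normalized reading is usable."""
            for _ in range(max_steps):
                a = read_amplitude(cfg)    # normalized 0..1 photodiode reading
                if a < lo:                 # too weak: drive harder, listen closer
                    cfg.led_current_ma = min(cfg.led_current_ma * 1.25, 100.0)
                    cfg.pd_gain = min(cfg.pd_gain * 1.1, 8.0)
                elif a > hi:               # near saturation: back off the emitter
                    cfg.led_current_ma = max(cfg.led_current_ma * 0.8, 1.0)
                else:
                    break                  # signal is in the usable band
            return cfg

        # Toy usage: a stand-in sensor whose reading scales with the drive settings.
        demo = lambda c: min(1.0, 0.003 * c.led_current_ma * c.pd_gain)
        print(autotune(demo, NirConfig()))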

    So to field test this device, we built a Hemyo-enabled wearable gripper to help stroke survivors perform activities of daily living. We tested it with patients from eight Boston-area facilities and received positive feedback from all the participants regarding control of the device.

    We plan to further develop this wearable gripper into a multifunctional Hemyo-enabled assistive robot. But at the same time, we're also engaged in conversations with several companies about ways Hemyo can be integrated into their products for industrial and consumer gesture recognition applications. Toward that end, we have already sent a few sensors to these companies for evaluation.

    So we're looking for more collaborators and partners in different areas to help us further explore the usability and functionality of our sensors. Thank you for listening. I really hope we can achieve a future where you can control any device around you, a TV, a robot, or a Roomba, with just a snap of your fingers. Thank you.

    [APPLAUSE]
