RD-11.15-16.2022-Manus-Robotics

Startup Exchange Video | Duration: 4:58
November 15, 2022

    - Hi, good afternoon. My name is Faye Wu. I studied mechanical engineering at MIT. And I co-founded Manus Robotics together with my PhD advisor, Professor Harry Asada, and, also from our lab, Dr. Sheng Liu. And today I'm going to tell you a little bit about the wearable technology we're developing at Manus to empower independent and productive living.

    There are over 500 million people around the world living with self-care issues. And these people require a lot of help performing simple activities of daily living, for example, dressing and cooking. Even though there are many technologies out there to help these patients, there currently isn't a perfect solution available.

    Many of the existing assistive devices exhibit unpredictable and inconsistent behaviors, mostly due to incorrectly detecting the user's intent, which can often lead to user frustration, aversion to technology, and eventually device abandonment. And even the most advanced technologies out there require a lot of time and effort to learn to use well, which really alienates users who require immediate daily living assistance. And so that's why we developed Hemyo, a low-cost and compact optical sensor that detects gesture commands, such that users, especially the elderly and people with disabilities, can very easily and intuitively control various devices around them in their daily living and achieve higher levels of functionality and independence.

    So how does this innovative technology work? Well, whenever a hand gesture is performed, a particular set of muscles within the arm needs to contract to make that happen. And that results in specific blood flow change patterns locally within the arm. And we utilize an array of near-infrared sensors to detect these local blood flow change patterns and use our algorithm to convert them into unique gesture codes that are unaffected by natural variations in gesture performance.
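
    To make the pipeline concrete, here is a minimal sketch, in Python, of one plausible way to quantize multi-channel near-infrared readings into a discrete gesture code. The channel layout, threshold, and baseline-normalization scheme are illustrative assumptions, not the actual Hemyo algorithm:

    import numpy as np

    def gesture_code(nir_window, rest_baseline, threshold=0.05):
        """Map a window of NIR readings to a binary channel-activation code.

        nir_window    : (samples, channels) reflectance from the sensor array
        rest_baseline : (channels,) mean reflectance recorded at rest during
                        a short calibration sequence
        threshold     : fractional blood-flow change counted as "active"
        """
        # Relative blood-flow change per channel, averaged over the window.
        # Normalizing by the resting baseline is one way to make the code
        # insensitive to natural variation in how forcefully a gesture is made.
        rel_change = (nir_window.mean(axis=0) - rest_baseline) / rest_baseline
        # Quantize to a tuple of active/inactive channels: the gesture code.
        return tuple(int(abs(c) > threshold) for c in rel_change)

    At run time, incoming sensor windows would be matched against the codes recorded once per gesture during calibration, consistent with the short calibration sequence described next.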

    And that's why we're able to achieve over 98.8% gesture detection accuracy with just a very short calibration sequence, without involving extensive user training. A unique advantage of our Hemyo sensor is that it can be actively controlled by tuning the light intensity, the pulsing pattern, and the receiver sensitivity, such that we can capture blood flow activity from deeper tissue layers, even through clothes, and still get a very clean signal back.
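
    The active control described above could look something like the following per-channel tuning step. This is a hedged sketch only: the parameter names, the 50 mA limit, and the simple proportional adjustments are assumptions for illustration, not the actual Hemyo control law:

    def adjust_channel(signal_level, led_current_ma, receiver_gain,
                       target=0.6, band=0.1, step=0.1):
        """One tuning step for a single emitter/receiver pair.

        signal_level : normalized received amplitude in [0, 1]
        Returns the updated (led_current_ma, receiver_gain).
        """
        if signal_level < target - band:
            # Too weak (e.g. deeper tissue, or sensing through clothing):
            # drive the LED harder first, then fall back to receiver gain.
            if led_current_ma < 50.0:  # assumed safe LED drive limit (mA)
                led_current_ma *= 1 + step
            else:
                receiver_gain *= 1 + step
        elif signal_level > target + band:
            # Near saturation: back the light intensity off to keep the
            # returned signal clean.
            led_current_ma *= 1 - step
        return led_current_ma, receiver_gain

    Such a step would run per channel during calibration, and again whenever signal quality drifts.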

    Other technologies that can be used to detect gesture commands and control devices are limited by the fundamental nature of the signals being obtained. Electromyography, or EMG, for example, measures the very weak electrical signals being sent to activate muscles. And these signals are inherently noisy, easily corrupted, and generally not very consistent. So using them for this purpose requires a lot of training and processing.

    Our Hemyo sensor, on the other hand, requires minimal training and processing. Our algorithm can also perform fast learning and even live adaptation. And we believe this may help us capture new customers who previously could not effectively operate devices controlled by these existing means.

    So we have conducted field tests with patients from eight different facilities in the greater Boston area. And all participants have responded favorably to our sensor. In particular, we were able to capture clear and repeatable signals from their impaired limbs and use that to control a simple assistive device to do tasks like the one you can see in this picture: opening a water bottle. That might seem trivial to us, but it's a really big deal to people who have not been able to do it for over 15 years because of disease or injury.

    At the same time, we're also very interested in exploring business opportunities in the consumer gesture control application area. And in that pursuit, we have already started talking to several companies and sent our sensor to them for evaluation and feedback. In addition, we believe our sensor data may help with rehab progress as well: during the field tests, several patients asked us to let them try the sensor a few more times after we showed them a visualization of the data, because apparently they wanted to get better numbers. And so we think this may help encourage them to do more exercises with their impaired limb and eventually help quantify their rehab progress.

    And so our business strategy is to license our IP to interested parties or sell our sensor as a unit component to manufacturers. And right now we're looking for pilot and co-development partners in several different areas, primarily in healthcare assistive devices and medical devices, and we also welcome opportunities, both in the US and globally, in other areas, for example, AR/VR, smart home device control, and computer interfaces. Thank you for listening, and please stop by our booth to see a demo of our sensor.

    [APPLAUSE]
