
RD-11.15-16.2022-Skylla-Technologies

FELIPE ORTIZ: Good afternoon. My name is Felipe Ortiz. I am the product manager at Skylla Technologies and we, again, specialize in people-centric mobile robo-intelligence. So we started around 2017 as a spinoff from an MIT lab focusing on robotics. Our chairman, Professor Harry Asada, is a renowned roboticist as well as an MIT professor. He's our main contact and connection to MIT.
So the challenge we're trying to tackle is providing mobile automation in an imperfect environment. You can see it's a growing field, and there are a lot of players in it. But a lot of the time they focus on essentially controlling the environment and stabilizing it, while in our experience these environments are much more like the image on the left: very dynamic, a bit messy, and with people involved.
A lot of problems arise when trying to deploy robots in these environments. One example is setting up infrastructure: all of these mobile robots can be deployed, but they also need infrastructure, whether that's beacons, a dedicated area for the robot, or even people to load them up.
Additionally, I want to point out application and task adaptability: these robots are usually highly tuned for one specific task, rather than handling general tasks or being able to switch between several tasks. So here I would introduce our solution, the Skylla Jetstream Core. Essentially it's the brain for a robot which, when integrated with a machine, allows it to navigate agilely and with superior accuracy.
Our technology has four major differentiators compared to other players in the field. The first is our human-centric navigation. When we teamed up with JR East, Japan Railways, back in 2017, we were able to collect a huge wealth of information about how people move through their train stations. All of that gets fed into our models, so our navigation is primarily focused on how people behave, and we can work in an environment that has people in it.
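To make the human-centric navigation idea concrete, here is a rough, hypothetical Python sketch of people-aware path costing. It is only an illustration, not Skylla's actual navigation stack: the constant-velocity pedestrian prediction, the comfort radius, and the example waypoints are all assumptions.

# Hypothetical sketch of people-aware path costing (illustration only).
# Idea: score candidate waypoints by distance to predicted pedestrian positions,
# so the planner prefers paths that keep a comfortable margin around people.
import math
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in meters

def predict_pedestrian(position: Point, velocity: Point, horizon_s: float) -> Point:
    """Constant-velocity prediction of where a pedestrian will be."""
    return (position[0] + velocity[0] * horizon_s,
            position[1] + velocity[1] * horizon_s)

def social_cost(waypoint: Point, pedestrians: List[Tuple[Point, Point]],
                horizon_s: float = 1.0, comfort_radius_m: float = 1.2) -> float:
    """Penalty that grows as a waypoint approaches any predicted pedestrian."""
    cost = 0.0
    for pos, vel in pedestrians:
        px, py = predict_pedestrian(pos, vel, horizon_s)
        d = math.hypot(waypoint[0] - px, waypoint[1] - py)
        if d < comfort_radius_m:
            cost += (comfort_radius_m - d) ** 2  # quadratic penalty inside the comfort zone
    return cost

def path_cost(path: List[Point], pedestrians: List[Tuple[Point, Point]]) -> float:
    """Total cost of a candidate path: path length plus social penalties."""
    length = sum(math.hypot(b[0] - a[0], b[1] - a[1]) for a, b in zip(path, path[1:]))
    return length + sum(social_cost(w, pedestrians) for w in path)

if __name__ == "__main__":
    pedestrians = [((2.0, 0.5), (0.0, 0.3))]  # one person drifting across the aisle
    straight = [(0.0, 0.0), (1.0, 0.4), (2.0, 0.8), (3.0, 1.2)]
    detour   = [(0.0, 0.0), (1.0, -0.5), (2.0, -0.8), (3.0, 1.2)]
    print("straight:", round(path_cost(straight, pedestrians), 2))
    print("detour:  ", round(path_cost(detour, pedestrians), 2))

In this toy example, the straight path that cuts close to the predicted pedestrian ends up costing more than the detour that keeps a wider margin, which is the kind of trade-off a people-aware planner makes.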
Second is our end-effector manipulation accuracy: we can get down to 0.3 millimeter accuracy, which allows the robot to easily switch out pieces, for example in a CNC machine, or perform tasks that are very general but need precision, without having to be highly tuned for each specific task. Of course, vision and perception are necessary, since we map the area around the robot, detect obstacles, and identify people and their movements.
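As a hypothetical illustration of how sub-millimeter placement is typically achieved without per-task tuning, here is a sketch of an iterative, vision-guided correction loop in Python. The gain, tolerance, and the measure_offset/move_by callbacks are assumptions for illustration, not Skylla's actual controller.

# Hypothetical vision-guided fine-positioning loop (illustration only).
# Idea: repeatedly measure the tool-to-target offset and command small corrections
# until the error is within tolerance, instead of one highly tuned open-loop move.
from typing import Callable, Tuple

Vec2 = Tuple[float, float]  # (x, y) offset in millimeters

def servo_to_target(measure_offset: Callable[[], Vec2],
                    move_by: Callable[[Vec2], None],
                    tolerance_mm: float = 0.3,
                    gain: float = 0.7,
                    max_iters: int = 50) -> bool:
    """Close the loop on the measured offset until it is inside the tolerance."""
    for _ in range(max_iters):
        dx, dy = measure_offset()
        if (dx * dx + dy * dy) ** 0.5 <= tolerance_mm:
            return True  # within tolerance of the target
        move_by((-gain * dx, -gain * dy))  # proportional correction step
    return False

if __name__ == "__main__":
    # Toy stand-ins for the camera measurement and the motion command.
    true_offset = [5.0, -3.0]  # start 5 mm / 3 mm away from the target

    def measure_offset() -> Vec2:
        return (true_offset[0], true_offset[1])

    def move_by(delta: Vec2) -> None:
        true_offset[0] += delta[0]
        true_offset[1] += delta[1]

    print("converged:", servo_to_target(measure_offset, move_by))

The point of the sketch is that closing the loop on measured offsets lets a general-purpose robot reach tight tolerances without open-loop tuning for every individual task.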
And finally, our user experience and interface: we have a no-coding-needed UI that allows anyone to set up the robot in their environment and easily teach it behaviors and movements. And since the robot is driving around collecting all of this data, we can provide insights and information on end users' applications, processes, and operations, all stored on-premise, so customers keep control of their own data.
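As a hypothetical sketch of what a taught, no-code behavior might compile to, here is a small declarative step list replayed by a runner in Python. The step names, targets, and structure are invented for illustration and are not Skylla's actual format.

# Hypothetical representation of a behavior recorded through a no-code UI
# (illustration only): a declarative list of steps replayed by a small runner.
from typing import Any, Dict, List

taught_behavior: List[Dict[str, Any]] = [
    {"step": "navigate", "target": "cnc_station_3"},
    {"step": "dock",     "offset_mm": [0.0, 12.5]},
    {"step": "pick",     "item": "finished_part"},
    {"step": "navigate", "target": "inspection_bench"},
    {"step": "place",    "slot": "incoming_tray"},
]

def run_behavior(steps: List[Dict[str, Any]]) -> None:
    """Replay each recorded step; a real runner would call the robot's motion APIs."""
    for step in steps:
        kind = step["step"]
        params = {k: v for k, v in step.items() if k != "step"}
        print(f"executing {kind} with {params}")  # placeholder for real robot calls

if __name__ == "__main__":
    run_behavior(taught_behavior)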
One customer story we'd like to share is our partnership with DMG MORI. We have been working with them for about two years, and now the WH-AMR5 is actually on the market. Over the last two years, we've been out in almost 30 different manufacturing plants and factories and driven more than 2,500 miles combined, which is roughly enough to drive across the United States.
So I think you can imagine what sorts of unforeseen issues came up during that time. Now we're looking to release a third product with them by the end of this year, 2022. What we're looking for are distribution partnerships and automation partnerships. So if you're interested in getting more into the automation field and need this sort of technology, or if it could be complementary to what you're trying to delve into, we'd be very interested in talking with you at the booth.
Our usual sales cycle is to explore the customer requirements, if you're an end user, and develop a proof of concept if a specific task is needed. Otherwise, you can basically use the general robotic platform that we provide and plug it into whatever your end users' needs are. Again, my name is Felipe Ortiz. My contact information is there, and I look forward to hearing from you.