
10.5.23-Showcase-Tokyo-Ubicept

Startup Lightning Talk
SEBASTIAN BAUER: Thank you very much. My name is Sebastian Bauer. I'm co-founder and CEO of Ubicept. And we make computer vision and image processing solutions that operate very well in all kinds of environments.
I'm sure many of you are dealing with some kind of perception, mostly camera based, and you've run into challenging problems like the ones you see here. Driver monitoring, for example, is very challenging in low light, especially detecting where people are looking. Pedestrian detection in the dark is challenging, especially when there is a fast-moving car. And in industrial manufacturing, QR code detection and similar problems are solved in a very suboptimal way by existing camera solutions.
However, what we can do is make that perception better. I can't go into all the details here, but we utilize an emerging class of new image sensors and build our processing on top of them.
So we can see much better in low light. We can get very reliable driver monitoring results, we can detect all the pedestrians at night, and the QR code detection works really well.
To drive home that point, what we can do is operate very well in all three challenging environments. The first is fast motion, which is a problem for existing camera sensors. The bar you see here is existing camera sensors, CMOS sensors, and you can improve their output with a little bit of software, for example for motion deblurring.
I'm sure many of you have heard of event cameras. They have extremely high frame rates, but they have other downsides.
Human vision is this bar. And this one is a single-photon-sensitive sensor called a SPAD, short for Single Photon Avalanche Diode. And this is the improvement that Ubicept delivers on top of it.
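For context on what processing on top of a single-photon sensor can look like, here is a minimal, generic sketch: a SPAD pixel effectively reports, for each very short exposure, whether at least one photon arrived, and many such binary frames can be combined into a conventional intensity estimate under a standard Poisson model. The numbers and names below are illustrative assumptions, not a description of Ubicept's actual pipeline.

import numpy as np

# Minimal sketch (not Ubicept's pipeline): under a Poisson model with per-frame
# photon flux H, a binary SPAD pixel fires with probability p = 1 - exp(-H).
# Averaging many binary frames and inverting that relation recovers a linear
# intensity estimate even when each individual frame is nearly black.
rng = np.random.default_rng(0)

true_flux = 0.2        # assumed mean photons per pixel per micro-frame
num_frames = 2000      # number of binary frames to accumulate

# Simulate binary SPAD measurements for a single pixel.
binary_frames = rng.random(num_frames) < (1.0 - np.exp(-true_flux))

# Maximum-likelihood inversion of the detection probability.
detection_rate = binary_frames.mean()
flux_estimate = -np.log(1.0 - detection_rate)

print(f"true flux:      {true_flux:.3f} photons/frame")
print(f"estimated flux: {flux_estimate:.3f} photons/frame")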
The second problem is low light. Note that this axis is 1 over lux, so a higher bar means better performance in darker environments.
We have CMOS sensors here. This is where event cameras really fall short, in low-light environments. Humans can see reasonably well, depending on their adaptation to low light. And single-photon sensors are, by nature, uniquely positioned to see well in low light.
The third problem is dynamic range. So when you think of single photon sensing, it's not just about seeing well in low light but also seeing very well in extremely bright light. This is also where single photon perception with our processing on top really shines.
And this white line, maybe you have wondered about it: this is what we call the reliable AI threshold. For car applications, for example, it is extremely important to capture about 1,000 frames per second, which corresponds to roughly 1 millisecond of motion blur.
That's already a demanding target. You also need to see very well in low light, at 10 millilux or even 1 millilux, and to have a dynamic range of at least 120 dB, ideally better. And it's really important to drive home this point: we can excel in all three of these environments at the same time.
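As a quick sanity check on those threshold numbers, here is a small illustrative calculation relating frame rate to per-frame motion blur and converting the dynamic range figure from decibels to a contrast ratio; the 20 * log10 convention is assumed.

# Back-of-the-envelope check of the "reliable AI threshold" numbers mentioned
# above (illustrative arithmetic only, not an official specification).
frames_per_second = 1_000
frame_period_ms = 1_000 / frames_per_second       # each frame spans this many milliseconds
print(f"{frames_per_second} fps -> about {frame_period_ms:.0f} ms of exposure (motion blur) per frame")

dynamic_range_db = 120
contrast_ratio = 10 ** (dynamic_range_db / 20)    # assumed convention: 20 * log10(max / min)
print(f"{dynamic_range_db} dB dynamic range -> roughly a {contrast_ratio:.0e}:1 brightest-to-darkest ratio")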
Let's take a look at how it works in practice and what the outcome is. This is something we set up in our lab: a disk that we can spin.
There are two cameras, a single-photon-sensitive camera and a low-light camera for comparison. We can adjust the brightness levels and also make the disk spin.
When there is still some light, at 1 lux, which is actually already pretty dark but somewhat manageable, the low-light camera can provide output that is good enough for an off-the-shelf object detector to work reliably.
Now the disk is spinning faster, and you can see that the low-light sensor has a lot of problems with the motion blur, whereas our single-photon-based sensing still operates very reliably.
This works even better in much lower light. Now we go down to 100 millilux, with extremely fast motion. The low-light camera almost always fails to provide a reliable detection, whereas our output is still very high quality.
To drive home the point, this is not just about low light and fast motion but also about huge dynamic range. What we do now is shine a cell phone flashlight into the scene, and the camera settings haven't been changed between any of these video sequences. Even though the flashlight is moving very fast and shining directly into the camera, there is no saturation on the single-photon sensor, whereas with the low-light camera you see a lot of saturated, white image regions. This sequence, by the way, simulates a sunrise.
A more practical use case is license plate recognition. On the left is the low-light camera; on the right, the single-photon-sensitive camera, in this case with motion deblurring. Here you can read the license plate extremely reliably, and you can also detect the make and model of the car, which is actually the more challenging task for surveillance and security camera applications.
And where we are right now is that you can engage with us in many different ways. We have an evaluation kit available. We focus on the processing side; the camera itself is made by someone else.
If you have some sense of camera perception, you know these parameters are outstanding: 140 dB of dynamic range at up to 500 frames per second. This is really outstanding because usually, with a higher frame rate, the dynamic range goes down. And this is all made possible by detecting light in fundamentally the best possible way.
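To see why that combination is unusual, here is a rough, hypothetical comparison: with assumed full-well and read-noise figures for a conventional CMOS pixel, a single exposure falls well short of 140 dB, and the usual way to close the gap is multi-exposure capture, which trades away frame rate. None of the numbers below come from Ubicept or any specific sensor.

import math

# Hypothetical illustration only (not datasheet values): why high dynamic range
# at a high frame rate is unusual for conventional sensors.

def dynamic_range_db(max_signal_e, noise_floor_e):
    """Dynamic range as 20 * log10(brightest usable signal / noise floor)."""
    return 20 * math.log10(max_signal_e / noise_floor_e)

# A single CMOS exposure with an assumed 10,000 e- full well and 2 e- read noise.
single_exposure_dr = dynamic_range_db(10_000, 2.0)
print(f"hypothetical CMOS, single exposure: ~{single_exposure_dr:.0f} dB")

# The quoted 140 dB is roughly a 10^7:1 contrast ratio. Conventional sensors
# usually close the remaining gap by bracketing several exposures per output
# frame, which costs capture time, hence the frame-rate vs. dynamic-range trade-off.
gap_db = 140 - single_exposure_dr
print(f"gap to 140 dB: ~{gap_db:.0f} dB, typically bridged with multi-exposure capture")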
Right now, we see really good feedback from companies that make moving platforms for uncontrolled environments: cars, trucks, drones. We are already engaged with a couple of major players in Japan that are working on POCs with us and testing our evaluation kit. If you have challenging imaging use cases, please don't hesitate to reach out. We are open to joint development projects, POCs, and also evaluation kit sales in many different scenarios. Surveillance, automotive, robotics, surveying, and mapping are all things we can do extremely well.
So if you're interested, please stop by our booth. I have the camera with me and can show the perception live. We also have many more use cases on our web page: if you go to ubicept.com/blog, there are a couple of interesting demo videos, and we continuously update it with new ones. Thank you very much.
[APPLAUSE]