10.10.23-Showcase-Seoul-Ubicept

Startup Exchange Video | Duration: 7:01
October 10, 2023

    SEBASTIAN BAUER: Hello. Good afternoon. My name is Sebastian Bauer. I'm a cofounder and the CEO of Ubicept. We are five cofounders, three of whom are associated with MIT: a professor, a former postdoc, and a former PhD student.

    What we do is focus on the processing side of the data generated by a new class of image sensor. If you're working in any kind of imaging, you are probably aware of these problems. On the left, you see a driver monitoring scenario in the dark. Detecting the face and the viewing direction of that person is extremely challenging or even impossible.

    In the center, we have a pedestrian detection example. And on the right, QR code detection. Detecting QR codes reliably, and everything else associated with manufacturing, quality control, and so on, these are challenging scenarios, especially when there is low light in combination with fast motion.
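
    To make the low-light-plus-motion problem concrete, here is a minimal sketch, with purely hypothetical numbers, of the exposure-time tradeoff a conventional camera faces: longer exposures gather more light in the dark but smear anything that moves.

```python
# Hypothetical illustration of the tradeoff mentioned in the talk:
# a conventional camera must pick one exposure time per frame.

def motion_blur_px(speed_px_per_s: float, exposure_s: float) -> float:
    """Blur extent, in pixels, for an object crossing the image at a
    constant speed during a single exposure."""
    return speed_px_per_s * exposure_s

# A pedestrian moving at 2,000 pixels per second across the frame
# (a made-up figure, only for illustration):
for exposure_ms in (1, 10, 33):
    blur = motion_blur_px(2_000, exposure_ms / 1_000)
    print(f"{exposure_ms:>2} ms exposure -> ~{blur:.0f} px of motion blur")

# Short exposures keep the blur small but collect few photons in low
# light; long exposures collect light but blur the subject. That is the
# corner that low light plus fast motion pushes conventional sensors into.
```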

    And this is the difference that our perception system actually makes. We can have reliable driver monitoring in low-light scenarios. We can detect pedestrians extremely reliably, no matter if there is fast motion, low light, or very bright light. And you see that the QR code on the right is detected very reliably.

    In a nutshell, we operate in all environments. Seeing when there is fast motion involved is very challenging. What we have here, for comparison, is CMOS sensors on the left. That's the small bar, and that can be improved somewhat with software.

    On top, detecting very fast motion is kind of the wheelhouse of event sensors, if you're familiar with them. So they can read out very fast and track very fast motion. And we also have human vision and a new class of image sensors on the right called SPAD, Single-Photon Avalanche Diode, with Ubicept processing on top of that.
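
    For background, a SPAD pixel reports, at very high rates, whether at least one photon arrived during a short window. The sketch below is a generic illustration of how such binary frames can be turned back into a linear image, based on the standard photon-counting model; it is not Ubicept's actual pipeline, and all numbers are made up.

```python
import numpy as np

# Generic single-photon image formation, not Ubicept's proprietary
# processing: each binary frame records, per pixel, whether at least one
# photon was detected during a very short exposure. Averaging many such
# frames and inverting the exponential arrival model recovers intensity.

rng = np.random.default_rng(0)

flux = np.array([[0.5, 5.0], [50.0, 500.0]])  # true photons/second per pixel (toy scene)
tau = 1e-3                                    # per-frame exposure: 1 ms
n_frames = 2_000                              # number of binary frames to combine

# Probability that a pixel fires (sees >= 1 photon) in one binary frame.
p_fire = 1.0 - np.exp(-flux * tau)

# Simulate how often each pixel fired across all frames.
fires = rng.binomial(n_frames, p_fire)

# Maximum-likelihood inversion from firing rate back to photon flux.
flux_hat = -np.log(1.0 - fires / n_frames) / tau
print(np.round(flux_hat, 1))  # close to the true flux values above
```

    Because each binary frame is extremely short, motion between frames can in principle be compensated before combining them, which is presumably where processing like Ubicept's does the heavy lifting.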

    So our wheelhouse, our expertise, really is the processing. And you can see that there is this white line that we call the reliable AI threshold. That means, in order for an AI-based system that builds on these cameras to operate reliably in all environments, you want to be above this bar. And you see that this new class of sensor with Ubicept processing on top actually exceeds that requirement.

    The same holds for low-light applications. There, you should watch out: the axis is 1 over lux, so a higher bar is actually better. And you see that event cameras, which really shine in fast-moving scenes, have significant disadvantages when it comes to low-light imaging. For Ubicept perception, we are also above that reliable AI threshold.
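
    As a small illustration of that axis (the talk does not give the actual chart values), here is how the reciprocal-lux score behaves for a few hypothetical minimum usable light levels:

```python
# The low-light chart scores a camera as 1 / (dimmest usable illuminance
# in lux), so a higher bar means the camera keeps working in darker scenes.
# The lux values below are hypothetical and only show how the score scales.

min_usable_lux = {
    "usable down to 10 lux":   10.0,
    "usable down to 1 lux":     1.0,
    "usable down to 0.01 lux":  0.01,
}

for label, lux in min_usable_lux.items():
    print(f"{label}: score = {1 / lux:g} (1/lux)")
# Scores of 0.1, 1, and 100: the camera that works in the darkest scene
# scores a thousand times higher on this axis than the 10 lux camera.
```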

    And the third problem is dynamic range, the ability to resolve super bright and super dark parts of a scene at the same time. Being able to adjust to varying brightness conditions very fast is also something that our processing can do extremely well. So you see that the new class of image sensor with Ubicept processing on top is the only solution that can reliably operate in all of these environments at the same time, and, importantly, without compromising on one versus the other.

    What does it look like? That's, of course, very important when you talk about perception and camera imaging. What we have here is a disk with a couple of toy models glued to it. And then there are outputs from the Ubicept solution, a low-light camera with Ubicept processing, and from a conventional low-light camera.

    And we can make the disk spin at 1,000 millilux, which is 1 lux. The object detector that is running on the video output can still detect these objects on the right. That is perfectly fine. But now the disk will spin faster, so we have very fast motion involved. And you can see that on the right computer screen, the detections actually become much more sparse.

    And when we look at even lower light, 100 millilux in this scenario, it's so dark that you can almost not see it on that video. The conventional low-light camera fails to provide reliable detections almost all the time, whereas with Ubicept processing, everything remains nicely visible.

    This, again, is low light in combination with fast motion. You might wonder, OK, when you talk about a single-photon camera-- and I'm happy to explain the details to you-- what about bright light? And it actually turns out that, with our processing, these cameras are extremely good in bright light as well.

    So this is with autoexposure turned off. And now we have a bright flashlight shining directly into the camera. The conventional sensor on the right is fully saturated, as you can see in this example, a completely washed-out image, whereas on the left, there is a crisp image that actually leads to a reliable detection.

    Looking at a more realistic use case out in the wild, license plate recognition is very challenging because of the problems we talked about. On the left, there is a lot of motion blur. It's impossible to detect that license plate, not to mention the make and model of the car, which is something that is very important for surveillance and security applications.

    And how can you work with us? We have an evaluation kit available that delivers on all three of these domains: a super high frame rate of over 500 frames per second with 140 dB of dynamic range in each frame, which is outstanding. No other camera can do that. And of course, it can see extremely well in low light.
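
    For a sense of scale, here is what a 140 dB figure means as a linear contrast ratio, assuming the usual 20 dB-per-decade convention for image-sensor dynamic range (the talk quotes only the dB number):

```python
# Convert a dynamic-range figure in dB to a linear brightest-to-darkest
# ratio, using the common image-sensor convention DR_dB = 20 * log10(ratio).
def db_to_ratio(db: float) -> float:
    return 10 ** (db / 20)

print(f"140 dB -> {db_to_ratio(140):,.0f} : 1")  # 10,000,000 : 1 in a single frame
print(f" 70 dB -> {db_to_ratio(70):,.0f} : 1")   # ~3,162 : 1, a figure often quoted for standard CMOS sensors
```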

    Right now, we are working mostly with companies that make moving platforms for uncontrollable environments-- cars, trucks, drones, helicopters, airplanes, and so on. Most of them are from the automotive industry. But we are also very open to applications in industrial inspection, quality assurance, logistics, and many, many more.

    So you can, for example, purchase the evaluation kit. Or we can work together on a proof of concept. And as you can probably guess, perception and cameras are used in many, many different applications. We touched on mobility, surveillance, defense, quality assurance, and surveying and mapping. We have a nice example for that.

    So if you're working in any of these spaces-- and I'm sure you have unsolved perception problems, particularly camera-based ones-- then please don't hesitate to reach out. I'm at my booth. I have a couple more demo videos available and can also show you the camera. Thank you very much.

    [APPLAUSE]
