
4.5.23-AI-Ikigai-Labs

Interactive transcript
VINAYAK RAMESH: Hi, everyone. Vinayak Ramesh, co-founder and CEO of Ikigai Labs. I know a couple of speakers mentioned LLMs earlier. So just by a show of hands, raise your hand if you've heard of or used ChatGPT. Now keep your hand up if you've used GPT-2 or GPT-3 directly.
A lot of people put their hands down, right? What that shows is that GPT-2 and GPT-3 are models, but ChatGPT is an app, and we've all used ChatGPT. So we really need to go beyond models and turn them into apps in order to make AI usable. And that's what we do at Ikigai.
Ikigai was founded out of MIT. I'm an MIT alum, and my co-founder Devavrat Shah is a Professor of Computer Science at MIT. We're also repeat entrepreneurs: we've been building AI apps in different industries, like health care and supply chain, over the past decade. And with this explosion of interest in AI, we knew that everyone needed a platform for building their own custom AI apps in the enterprise.
So like I mentioned, there's been a ton of work, tons of tools, around building models. But clearly models are not enough for effectively utilizing AI. So we have to turn these models into AI apps. But building AI apps is really, really difficult.
It's like nothing we've built before and completely different from building normal, traditional enterprise software. In enterprise software, you're hard-coding rules, and then behind the scenes you have DevOps, right? You're making sure your applications are up, running, and scaling.
But AI apps are different for a few reasons. One is that instead of hard-coding rules, you're building models. So now you have to manage not only that your applications scale, but also that your models stay up to date as data changes.
And the other big difference about AI apps is that they're always self-learning as experts interact with the application. So you have to be able to manage this expert-in-the-loop interaction so that the AI apps are continuously learning. That's exactly what we do at Ikigai: we simplify this entire process.
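For a rough picture of what an expert-in-the-loop update cycle can look like, here is a minimal sketch in Python. It is a generic pattern, not Ikigai's implementation; the classifier, feature shapes, and feedback format are all assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Generic expert-in-the-loop cycle: the model predicts, an expert reviews and
# corrects a batch, and the corrections become the next incremental update.
rng = np.random.default_rng(0)
model = SGDClassifier()
classes = np.array([0, 1])  # full label set, required on the first partial_fit

# Initial training on whatever labeled history already exists.
X_hist = rng.random((200, 5))
y_hist = (X_hist[:, 0] > 0.5).astype(int)
model.partial_fit(X_hist, y_hist, classes=classes)

def apply_expert_feedback(model, X_batch, corrected_labels):
    """Fold a batch of expert-corrected labels back into the model."""
    model.partial_fit(X_batch, corrected_labels)
    return model

# Stand-in for a round of human review: in practice the labels come from
# experts interacting with the app, not from a synthetic rule.
X_new = rng.random((20, 5))
corrected = (X_new[:, 0] > 0.5).astype(int)
model = apply_expert_feedback(model, X_new, corrected)
```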
So we have a no/low-code platform that goes from connecting to raw data, to building your models, to turning those into applications, and we take care of everything from the self-learning to managing models to scaling the compute behind the scenes. And everyone here is working in the enterprise. Large language models are great, but they've primarily been used on consumer data.
But in the enterprise, data is tabular. It comes from different systems. It's disparate, it's missing, it's dirty. So we've developed something we call LGMs, large graphical models, a different type of data structure for handling the challenges of enterprise data, which LLMs and a lot of the other things you're hearing about weren't really meant for in the first place.
As part of that, we enable three foundation models on top of which these apps are built, and I'll give a few examples in a second. The first is Deep Match. Like I said, enterprise data is spread all over the place across multiple disparate systems, and it's very hard to join together in order to reconcile and get insights out of. That's what Deep Match solves with AI.
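To make the reconciliation problem concrete, here is a tiny record-linkage sketch in Python. The table columns and the string-similarity heuristic are illustrative assumptions; Deep Match's actual approach is not shown here.

```python
import pandas as pd
from difflib import SequenceMatcher

# Two systems describing the same customers with slightly different spellings.
crm = pd.DataFrame({"crm_id": [1, 2], "name": ["Acme Corp.", "Globex Inc"]})
erp = pd.DataFrame({"erp_id": ["A9", "B3"], "name": ["ACME Corporation", "Globex, Inc."]})

def similarity(a: str, b: str) -> float:
    """Crude string similarity on normalized names."""
    norm = lambda s: s.lower().replace(".", "").replace(",", "")
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

# Score every CRM record against every ERP record and keep the best match.
matches = []
for _, c in crm.iterrows():
    best = max(erp.itertuples(), key=lambda e: similarity(c["name"], e.name))
    matches.append({"crm_id": c["crm_id"], "erp_id": best.erp_id,
                    "score": similarity(c["name"], best.name)})

print(pd.DataFrame(matches))  # a reconciled mapping between the two systems
```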
Deep Cast is forecasting when you have limited or missing historical data. We're seeing a lot of supply chain teams use this, especially when they're trying to forecast demand, let's say post-COVID, whereas with all the other tools you have to have lots and lots of data and you can't take into account these short-term trends and changes.
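As a toy version of the setting described here, a short, gappy demand history, the sketch below interpolates the missing months and produces a simple exponential-smoothing forecast. It is a generic baseline under made-up numbers, not the Deep Cast method.

```python
import numpy as np
import pandas as pd

# Short, gappy monthly demand history -- the setting described above.
demand = pd.Series(
    [120, np.nan, 135, 150, np.nan, 170, 180, np.nan, 210],
    index=pd.date_range("2022-01-01", periods=9, freq="MS"),
)

# Fill the gaps with simple interpolation before forecasting.
filled = demand.interpolate()

def simple_exp_smoothing(series: pd.Series, alpha: float = 0.4) -> float:
    """Return a one-step-ahead forecast using basic exponential smoothing."""
    level = series.iloc[0]
    for value in series.iloc[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

print(f"Next-month forecast: {simple_exp_smoothing(filled):.1f}")
```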
And then Deep Plan is what-if analysis on steroids. We run hundreds of thousands of scenarios for you, so you can understand things like: how much inventory should I order, versus how much inventory should I have on hand, versus how much am I going to miss in sales? Or how much labor should I hire or spend on in my warehouse versus the level of service I'm going to provide to my customers?
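To give a feel for this kind of scenario analysis at a very small scale, here is a toy Monte Carlo sweep over candidate order quantities under uncertain demand. The cost figures and demand distribution are invented for illustration and are not Deep Plan's internals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy what-if sweep: for each candidate order quantity, simulate many demand
# scenarios and compare expected lost sales against expected holding cost.
unit_margin, holding_cost = 10.0, 2.0            # assumed economics
order_quantities = range(100, 201, 20)           # candidate decisions
demand_scenarios = rng.normal(150, 30, 100_000)  # uncertain demand draws

for q in order_quantities:
    sold = np.minimum(q, demand_scenarios)
    lost_sales = (demand_scenarios - sold) * unit_margin
    leftover = (q - sold) * holding_cost
    expected_cost = (lost_sales + leftover).mean()
    print(f"order {q}: expected cost {expected_cost:,.0f}")
```

A real planning run would sweep many more decision variables and scenarios than this sketch, but the shape of the trade-off is the same.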
Those are the problems that Deep Plan solves. At Ikigai, we're already working with a number of enterprises, especially across banking, financial services, and insurance, and supply chain in retail and manufacturing. Some of the most popular use cases, or AI apps, that we've started deploying with these customers are around labor planning optimization, especially in warehouses, figuring out the optimal amount of labor. For one of the largest retail 3PL networks, we reduced their labor costs by about 20% using Deep Plan.
Demand planning, especially, like I mentioned, in a post-COVID world where you don't have the reliable data that you had before. And then finally, on the banking and insurance side, a lot around payment, transaction, and trade monitoring, and reconciliations as well. So we would love to engage. Here's my email. And thank you, everyone.