
6.22.22-Showcase-Jetcool

Compute Faster and More Sustainably
BERNIE MALOUIN: Well, very good, very good. Thank you, thank you. A great lineup today. And I'm Bernie Malouin from Jetcool Technologies, a Lincoln Laboratory spinoff. We're in our fourth year now of solving what we think are some very interesting problems in technology and sustainability.
So we started with a very simple premise. We asked ourselves, what if data centers could compute faster and more sustainably? And that's a really important question, a really important problem. Because by most accounts, data centers today consume about 3% of the planet's electricity. In the United States alone, that's enough to power about 12 million households for an entire year. It's about half the homes in the state of California.
Amazingly, though, 30% of that electricity has nothing to do with pushing electrons through transistors. It's got everything to do with keeping those transistors from melting down. So I can see some of the folks doing the math in their head, and you're right. That means that about 1% of the planet's electricity is spent just keeping data centers cool.
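As a quick sanity check of that arithmetic, here is a minimal sketch using only the round figures cited above; the 3% and 30% shares are the speaker's estimates, not measured values.

```python
# Back-of-the-envelope check of the cooling-electricity figure cited in the talk.
# The input shares are the speaker's round estimates, not measured data.

datacenter_share_of_world_electricity = 0.03    # ~3% of global electricity goes to data centers
cooling_share_of_datacenter_electricity = 0.30  # ~30% of that is spent on cooling

cooling_share_of_world_electricity = (
    datacenter_share_of_world_electricity * cooling_share_of_datacenter_electricity
)

print(f"Cooling share of global electricity: {cooling_share_of_world_electricity:.1%}")
# -> Cooling share of global electricity: 0.9%  (roughly the 1% mentioned in the talk)
```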
So that's a big opportunity, a big opportunity. And it's only part of the dirty secret: data centers also consume a lot of water, placing additional stress on areas that are already dealing with water scarcity. But we don't need fewer data centers, we need more data centers. So we're packing more processors into those data center facilities.
And we're seeing something very interesting, because for the first time in 20 years, the processors going into data centers are getting higher and higher power. Processor power had plateaued for the past 20 years; this rise is a very recent phenomenon. So we're packing more processors into data centers, and those processors are higher power. Combined, that becomes a very, very difficult challenge.
And we're addressing this challenge with our cooling technology, something we call smart plates. Smart plates use intelligent liquid cooling to cool right on the processor, exactly where the heat is being generated. Smart plates are the easiest way to safely cool processors while increasing efficiency. At a PUE of 1.03, about 3% of a data center's electricity is spent cooling the processors, as opposed to the roughly 30% that is the industry standard today.
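For readers unfamiliar with PUE, the sketch below shows how a PUE value maps to the overhead share of facility electricity, using the standard definition of PUE as total facility energy divided by IT equipment energy. The 1.03 figure is from the talk; the 1.43 comparison point is an inferred PUE corresponding to roughly 30% overhead, not a number the speaker gave.

```python
# Illustrative mapping between PUE and the share of facility electricity
# that goes to overhead (cooling plus other non-IT loads).
# PUE is defined as total facility energy divided by IT equipment energy.

def overhead_share_of_total(pue: float) -> float:
    """Fraction of total facility electricity that is not IT load."""
    return 1.0 - 1.0 / pue

for pue in (1.03, 1.43):
    print(f"PUE {pue:.2f}: ~{overhead_share_of_total(pue):.0%} of facility electricity is overhead")
# PUE 1.03: ~3% of facility electricity is overhead
# PUE 1.43: ~30% of facility electricity is overhead
```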
So we can. We can help data centers compute faster and more sustainably. And we're doing that already. Today we're lowering the temperature of processors in data centers by 37%. We're enabling them to continue those trends of higher-power processing, cooling processors over 1,000 watts. And in doing so, we're using less electricity to keep those facilities cool.
So how do we do that? The essence of our technology is something we call microconvective cooling. That's using arrays of small fluid jets to cool intelligently, right where the heat is being generated in processors. It involves two components. The first is our proprietary nozzle technology. We use those nozzles to form these arrays of fluid jets. There can be tens, there can be hundreds, there can be 1,000 jets on a single processor. That allows us to achieve 10 times better heat transfer than competing approaches.
And we combine that proprietary nozzle technology with our patented architectures. Architectures that, again, allow us to cool right where the heat is being generated, at the device or, in some instances, within it. So let's take a look at one use case. This is a solution for a popular AMD chipset. Again, we're targeting the hotspots using our intelligent, advanced fluid dynamics.
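To make the "10 times better heat transfer" claim concrete, here is an illustrative, simplified one-dimensional thermal model, not Jetcool's actual design data; the die area and baseline heat transfer coefficient are placeholder assumptions chosen only to show the effect of a 10x improvement.

```python
# Illustrative (not Jetcool-specific) model of why better heat transfer matters:
# a simple one-dimensional picture where the chip's temperature rise over the
# coolant is power times convective thermal resistance, R = 1 / (h * A).
# The coefficients below are assumed placeholder values.

chip_power_w = 1000.0        # high-power processor, per the talk's >1,000 W figure
die_area_m2 = 0.0008         # assumed ~8 cm^2 cooled area (placeholder)
h_baseline = 20_000.0        # W/(m^2*K), assumed baseline cold-plate coefficient
h_jets = 10 * h_baseline     # "10x better heat transfer" from the jet arrays

for label, h in (("baseline cold plate", h_baseline), ("microconvective jets", h_jets)):
    resistance_k_per_w = 1.0 / (h * die_area_m2)
    delta_t = chip_power_w * resistance_k_per_w
    print(f"{label}: ~{delta_t:.0f} K rise above coolant")
# baseline cold plate: ~62 K rise above coolant
# microconvective jets: ~6 K rise above coolant
```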
In this situation, we actually replaced an existing incumbent cooling solution, using 70% less electricity to run our solution than the incumbent. But we also found something very interesting. We found that with the better cooling we provided, we lowered the temperature of that processor and increased the intrinsic efficiency of the silicon itself. By reducing the leakage current within that silicon, we increased the efficiency of the processor by 10%.
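The leakage effect can be sketched with a common rule of thumb that static leakage roughly doubles for every ~10 degrees C rise in junction temperature. This is an illustrative model, not Jetcool's measurement; the power, leakage share, and temperatures below are assumptions chosen to land near the ~10% figure cited above.

```python
# Illustrative sketch (not Jetcool's model) of how lowering die temperature can
# reduce leakage power, using the rule of thumb that static leakage roughly
# doubles for every ~10 degrees C increase in junction temperature.
# All specific numbers here are assumptions for illustration only.

def leakage_scale(t_hot_c: float, t_cool_c: float, doubling_interval_c: float = 10.0) -> float:
    """Relative leakage power at t_cool_c compared to t_hot_c."""
    return 2.0 ** ((t_cool_c - t_hot_c) / doubling_interval_c)

total_power_w = 400.0       # assumed processor power
leakage_fraction = 0.15     # assumed share of power lost to leakage at the hot temperature
t_hot, t_cool = 85.0, 65.0  # assumed junction temperatures before/after better cooling

leakage_hot = total_power_w * leakage_fraction
leakage_cool = leakage_hot * leakage_scale(t_hot, t_cool)
saved = leakage_hot - leakage_cool
print(f"Leakage drops from {leakage_hot:.0f} W to {leakage_cool:.0f} W "
      f"(~{saved / total_power_w:.0%} of total power recovered)")
# -> Leakage drops from 60 W to 15 W (~11% of total power recovered)
```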
It's not all about electricity, though. We also have an installation at a national laboratory where we're trying to eliminate the need for evaporative coolers. Again, back to the water savings. And we're able to do that. This is with a data center using Intel servers. As long as the temperature outside is lower than 125 degrees Fahrenheit, which I thought was absurd until the past couple of days here in Mountain View, those evaporative coolers don't have to be turned on.
That's a big deal, because evaporative coolers consume about 10,000 gallons of water every single day for a megawatt of compute power. So there's a really big impact there as well. By combining both the performance and the efficiency sides of what Jetcool can do, we can make sustainability the practical choice, not just for ESG goals, but also for P&L goals.
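Scaling the 10,000 gallons-per-day-per-megawatt figure gives a sense of annual savings; the 5 MW facility size below is an assumed example, not a number from the talk.

```python
# Back-of-the-envelope water savings from shutting off evaporative coolers,
# using the ~10,000 gallons per day per megawatt figure from the talk.
# The 5 MW facility size is an assumed example.

gallons_per_day_per_mw = 10_000
facility_mw = 5            # assumed facility size for illustration
days_per_year = 365

annual_gallons = gallons_per_day_per_mw * facility_mw * days_per_year
print(f"A {facility_mw} MW facility avoids ~{annual_gallons:,} gallons of water per year")
# -> A 5 MW facility avoids ~18,250,000 gallons of water per year
```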
We'd love to work with more companies, companies just like you. Partners that are intent on having an impact. So if you're in the semiconductor, compute, data center, or blockchain space, please-- I'm out of time, but please stop by later and say hello. I'd love to chat. Thanks, everybody.
[APPLAUSE]