Microsoft is resorting to laser etching AI-designed cooling channels directly into data center chips to tame their massive heat


If you think the power consumption of today's gaming graphics cards is bad, it's nothing compared to how much energy the massive processors in AI data centers use. All that power ends up as heat, making chip cooling a serious challenge. Microsoft reckons it has a great solution, though, and it's all about getting water into the processors themselves.

The most complex direct-die liquid cooling loops you'll see in a gaming PC all involve a chamber that mounts on top of the CPU; at no point does the coolant ever touch the chip itself. In a recently published blog, Microsoft explains how it has developed a system in which the coolant does precisely that.

By etching an intricate pattern of tiny channels into the surface of the processor die, Microsoft can pump water directly through the silicon itself, albeit at a very shallow depth.
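To get a feel for why such tiny channels can still shift serious heat, here's a back-of-the-envelope Python sketch. Every figure in it (channel count, dimensions, flow speed, coolant temperature rise) is an assumption for illustration, not anything from Microsoft's design; the physics is just the standard q = ṁ · c_p · ΔT relation.

```python
# Rough estimate of how much heat water flowing through microchannels can
# carry away. All dimensions and flow figures are illustrative assumptions,
# not Microsoft's actual design parameters.

WATER_DENSITY = 997.0         # kg/m^3, at roughly 25 degC
WATER_SPECIFIC_HEAT = 4182.0  # J/(kg*K)

def heat_removed_watts(num_channels: int,
                       channel_width_m: float,
                       channel_depth_m: float,
                       flow_velocity_m_s: float,
                       coolant_temp_rise_k: float) -> float:
    """Heat carried away per second: q = m_dot * c_p * delta_T."""
    channel_area = channel_width_m * channel_depth_m                    # m^2 per channel
    volumetric_flow = num_channels * channel_area * flow_velocity_m_s  # m^3/s
    mass_flow = WATER_DENSITY * volumetric_flow                        # kg/s
    return mass_flow * WATER_SPECIFIC_HEAT * coolant_temp_rise_k

# Assume 200 hair-width channels (~100 um wide, 100 um deep), water moving
# at 1 m/s and warming by 20 K on its way across the die.
print(f"{heat_removed_watts(200, 100e-6, 100e-6, 1.0, 20.0):.0f} W")  # ~167 W
```

Even with these modest assumed numbers, a patch of channels no deeper than a scratch carries off well over a hundred watts, which is why the approach scales to big AI silicon.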

The keyword here is microfluidics, a technology that's been around for decades, and if the history of consumer tech is anything to go by, it'll be a phrase plastered across every CPU cooler within a couple of years (even if it doesn't actually do anything).

This might all sound like Microsoft is simply cutting a few grooves into the chip and letting water flow through them, but it's far more complicated than that. For a start, the channels themselves are no wider than a human hair, and they're not simple straight lines either. Microsoft employed the services of Swiss firm Corintis, which used AI to determine the best pattern for maximum heat transfer.
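As a purely illustrative toy, the basic intuition behind an optimised layout is to route more channels through the hottest parts of the die. The sketch below does that with a crude proportional split over a made-up power map; Corintis's actual AI-driven method is far more sophisticated and not public.

```python
# Toy stand-in for the idea behind AI-optimised channel layouts: put more
# cooling capacity where the die runs hottest. A crude proportional
# allocation, not Corintis's actual method.

# Hypothetical per-region power map of a die (watts), on a coarse 3x3 grid.
power_map = [
    [12, 45, 10],
    [30, 90, 28],   # hot compute cluster in the middle
    [ 8, 25,  9],
]

total_channels = 200  # channel "budget" to distribute across the die

total_power = sum(sum(row) for row in power_map)
layout = [
    [round(total_channels * watts / total_power) for watts in row]
    for row in power_map
]

for row in layout:
    print(row)  # more channels end up routed through the hottest regions
```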

A close-up image of microchannels etched into the surface of a processor, as part of Microsoft's microfluidics cooling project.

(Image credit: Microsoft)

The end result is a network of microchannels that genuinely look organic, though at first glance, you'd be forgiven for thinking the complex patterns were just manufacturing defects. It certainly looks super cool (pun very much intended).

Microsoft claims the tech is up to three times more effective at removing heat from a massive AI GPU than a traditional cold plate (aka waterblock), citing a 65% reduction in the maximum temperature rise of the silicon.
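To put that 65% figure in perspective, here's a quick worked example. The cold-plate baseline is an assumed number purely for illustration:

```python
# What a 65% cut in maximum temperature rise means in practice.
# The cold-plate baseline delta-T here is an assumption for illustration.

ambient_coolant_c = 30.0
cold_plate_rise_c = 40.0                          # assumed baseline rise
microfluidic_rise_c = cold_plate_rise_c * (1 - 0.65)

print(f"Cold plate peak:   {ambient_coolant_c + cold_plate_rise_c:.0f} degC")    # 70 degC
print(f"Microfluidic peak: {ambient_coolant_c + microfluidic_rise_c:.0f} degC")  # 44 degC
```

A cooler-running chip can either be clocked higher or fed less aggressive cooling, which is exactly the trade-off data center operators care about.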

Since the coolant transfer apparatus doesn't need to sit directly on top of the microchannels, the system can also be applied to stacked chips, with each die etched before mounting. That way, each die within the stack is cooled individually, meaning they can all operate closer to their maximum specifications than with a conventional cold plate.

Take AMD's X3D processors, for example. These all have one stacked chip underneath the heat spreader: a Core Complex Die (CCD) bonded to a 3D V-Cache die. Each acts as a thermal barrier to the other, though the CCD generates far more heat than the cache die. If both could be cooled via microfluidics, you'd be able to run them at higher clock speeds.
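A crude series-resistance model shows why per-die cooling helps a stack like this. All the resistance and power values below are invented for illustration; the point is the structure of the sums, not the numbers:

```python
# Toy thermal model of a two-die stack. Resistance values are invented
# purely for illustration, not measured figures for any real X3D chip.

R_DIE = 0.15         # K/W, assumed resistance through one die
R_COLD_PLATE = 0.10  # K/W, assumed TIM + cold plate on top of the stack
R_MICRO = 0.05       # K/W, assumed microchannels etched into each die

ccd_power = 80.0     # W, hypothetical CCD heat output

# Conventional: the CCD's heat passes through its own die, then the
# cache die stacked on it, then the cold plate - resistances in series.
rise_cold_plate = ccd_power * (R_DIE + R_DIE + R_COLD_PLATE)

# Microfluidic: channels etched into the CCD carry its heat away directly.
rise_microfluidic = ccd_power * R_MICRO

print(f"Via cold plate:    +{rise_cold_plate:.0f} K")    # +32 K
print(f"Via microchannels: +{rise_microfluidic:.0f} K")  # +4 K
```

The exact numbers don't matter; what matters is that the conventional path stacks resistances in series, while etched channels give each die its own short path to the coolant.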

Of course, such complex tech isn't cheap to develop or implement, and the likelihood of it ever appearing at the consumer level is very slim. But I wouldn't be surprised if somebody takes an RTX 5090, rips off the heatsink, and swaps it for a homebrewed microfluidic cooler.

There again, if ramping up power consumption is the only way AMD, Intel, and Nvidia can keep improving chip performance, perhaps we'll see etched processors and direct-die cooling become standard fare in our gaming PCs. After all, it wasn't that long ago that heatpipes and vapour chambers were phrases never uttered by a PC component manufacturer, yet now they're in coolers of every kind.
