New Chip-Cooling Technique May Reduce Data-Center Power Costs by 5%
Electricity is one of the biggest ongoing expenses of operating a data center, and much of that demand comes from cooling: computer equipment generates enormous amounts of heat, especially when performing complex calculations.
In most systems today, CPUs and GPUs are cooled by placing a heat sink directly on top of the processor. The sink pulls heat away from the chip, and that heat is then dispersed into the environment, typically by powerful fans inside each machine. While heat sinks are effective, they are not especially efficient. According to a group of mechanical engineering researchers, there is a new and better option that could help revolutionize CPU cooling.
On-Chip Cooling
The researchers developed a method of bonding microchannels directly onto the chip's silicon using a process similar to 3D printing. The additive material is 'printed' onto the silicon in a spiral or maze-like pattern, and coolant then travels through the channels to carry heat away quickly.
In the group's study, the technique kept electronics about 18 degrees Fahrenheit cooler than traditional heat sinks, and the researchers determined that it could cut power use in data centers by as much as 5%. Given that data centers spend millions of dollars per year on electricity, a 5% cut could translate into very significant savings.
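As a rough illustration of the scale involved, the sketch below applies that reported 5% reduction to a hypothetical annual electricity bill of $10 million; the bill figure is an assumption chosen for the example, not a number from the study.

```python
# Back-of-the-envelope savings estimate. The 5% figure comes from the study;
# the annual electricity bill is an assumed, illustrative number.

annual_electricity_cost = 10_000_000  # assumed: $10M/year spent on electricity
power_reduction = 0.05                # ~5% reduction reported by the researchers

annual_savings = annual_electricity_cost * power_reduction
print(f"Estimated annual savings: ${annual_savings:,.0f}")
# Estimated annual savings: $500,000
```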
The additive is a tin-silver-titanium alloy roughly 1,000 times thinner than a human hair, bonded directly to the chip. Laser melting is used to print the heat-dissipating channels onto the silicon in a sub-millisecond operation, eliminating any gaps between the chip and the cooling layer.
This is critical because traditional heat sinks must be attached to the processor with thermal paste, which fills the microscopic gaps that would otherwise slow heat transfer. Even so, those extra layers reduce the efficiency of the cooling process. With the new method, the heat-dissipating layer is essentially one with the chip at a microscopic level, allowing for an extremely fast and efficient transfer of heat.
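One way to see why eliminating that interface layer helps is to model the heat path as thermal resistances in series. The sketch below compares a conventional paste-and-heat-sink stack with a directly bonded cooling layer; all of the power and resistance values are assumptions chosen for illustration, not measurements from the study.

```python
# Series thermal-resistance model of the heat path from the die to the air.
# All values below are assumed for illustration only; they are not
# measurements from the Binghamton study.

chip_power_w = 150.0     # assumed processor power dissipation (W)

# Conventional stack: die -> thermal paste (TIM) -> heat sink -> air
r_tim = 0.10             # assumed thermal-paste resistance (deg C per W)
r_heatsink = 0.20        # assumed heat-sink-to-air resistance (deg C per W)

# Directly bonded channels: the interface layer is effectively eliminated
r_bonded_interface = 0.0 # idealized: no paste layer or gap

dT_conventional = chip_power_w * (r_tim + r_heatsink)
dT_bonded = chip_power_w * (r_bonded_interface + r_heatsink)

print(f"Temperature rise with thermal paste:  {dT_conventional:.0f} C above ambient")
print(f"Temperature rise with direct bonding: {dT_bonded:.0f} C above ambient")
# With these assumed numbers, removing the interface layer alone
# cuts the temperature rise by 15 degrees C.
```

This only isolates the effect of the interface layer; the printed microchannels themselves would change the remaining resistance as well, but the series model shows why every extra layer between die and coolant costs degrees.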
Many Important Benefits
In addition to reducing the processor's operating temperature and cutting energy consumption, the new process offers several other key advantages over previous cooling solutions. Once mass-produced, it should be a cheaper option, since it is far less bulky and uses less material. It would also free up room inside the cases of devices that currently rely on large heat sinks for cooling.
According to Scott Schiffres, an assistant professor at Binghamton University who worked on the project, “Lower operating temperatures will improve the energy efficiency of data centers by about 5 percent, which can save $438 million in electricity and can prevent 3.7 billion pounds of carbon dioxide from being emitted per year. It will also reduce toxic electronic waste by about 10 million metric tons – enough to fill 25 Empire State Buildings – because of the lower rates of heat-based device failures.”
The benefits for large-scale data centers are obvious, but there is also a lot of potential for consumer-grade machines. Many hard-core gamers like to overclock their CPUs to improve performance, which generally causes an increase in heat and can lead to chip failure if done incorrectly. With this new cooling technique, chips could be cooled more efficiently, allowing gamers to push performance even further.
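To illustrate why overclocking generates so much extra heat, the sketch below applies the standard CMOS dynamic-power approximation (power scales roughly with clock frequency and the square of core voltage); the baseline power, frequency, and voltage figures are assumptions, not numbers from the article.

```python
# Rough dynamic-power estimate for an overclocked CPU using the common
# CMOS approximation P ~ C * V^2 * f. Baseline figures are assumed for
# illustration and are not from the article.

base_power_w = 100.0   # assumed stock power draw (W)
base_freq_ghz = 4.0    # assumed stock clock frequency
base_volt = 1.20       # assumed stock core voltage

oc_freq_ghz = 4.6      # hypothetical overclock
oc_volt = 1.35         # hypothetical voltage bump to keep it stable

# Power scales linearly with frequency and quadratically with voltage.
oc_power_w = base_power_w * (oc_freq_ghz / base_freq_ghz) * (oc_volt / base_volt) ** 2

print(f"Estimated power draw at overclock: {oc_power_w:.0f} W "
      f"({(oc_power_w / base_power_w - 1) * 100:.0f}% more heat to remove)")
# Estimated power draw at overclock: 146 W (46% more heat to remove)
```

With these assumed numbers, a 15% clock increase plus a modest voltage bump produces nearly half again as much heat, which is exactly the headroom a more efficient cooling layer would provide.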