
Liquid cooling for data centres

May 24, 2018 | Feature Articles

By Bruce Nagy

In 1943 the president of IBM said: “I think there is a world market for maybe five computers.” In 1957 the editor of Prentice Hall Business Books said, “Data processing is a fad that won’t last out the year.”  In 1977 the founder of Digital Equipment Corp. said: “There is no reason for any individual to have a computer in his home.”  Oops.

Computers now pervade every aspect of our lives. Buildings, vehicles, entertainment and business equipment in every sector are controlled by electronic components trafficking in zeroes and ones, and increasingly they are connected to the internet.

Billions of people post to Facebook or Instagram, buy on Amazon, or stream from Netflix. Consumer use accounts for only about 30 percent of this traffic; organizations and governments account for the rest.
According to Cisco, a worldwide internet technology and networking company, IP traffic was 4.7 zettabytes in 2015. (A zettabyte is a trillion gigabytes: 1000 gigabytes x 1000 x 1000 x 1000.) With the rise of cloud computing, big data, artificial intelligence and exascale supercomputers for scientific research, global data traffic could climb to 15.3 zettabytes per year by 2020.
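
For scale, the unit arithmetic can be checked in a few lines of Python (the figures are Cisco's, from above; the snippet is purely illustrative):

    # Unit arithmetic for the traffic figures above (illustrative only).
    GB = 10**9    # bytes in a gigabyte (decimal units)
    ZB = 10**21   # bytes in a zettabyte

    print(ZB // GB)     # 1 ZB = 1,000,000,000,000 GB (a trillion gigabytes)
    print(15.3 / 4.7)   # ~3.3x projected traffic growth, 2015 to 2020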

In total, there are 10²⁶ computers that require cooling.

Heavy power use

Most traffic flows through data centres, which in the U.S. alone use an estimated 70 billion kilowatt-hours of electricity per year. About half of that energy goes to cooling the servers.

At the beginning of this century governments started talking about data centre energy use while organizations began worrying about the costs of cooling, processing, and future expansion.

Data centre (DC) innovators began capturing expelled heat for use in nearby buildings, and locating data centres in Canada, Finland, or other cooler zones. Microsoft famously put servers in a tank and dropped it to the ocean floor. From an engineering perspective it wasn't that nutty. Liquid cooling is far more efficient than air cooling: liquid is a superior heat conductor, and new approaches allow it to be used effectively around electronics.

Free cooling

Big players like Google and Facebook said they were tackling the problem. Annual growth in electricity consumption from digital processing fell from 90 percent per year in the first five years of the century to 25 percent per year from 2005 to 2010, and to just four percent by 2015.

Most of the early gains were made using free cooling, locating data centres up north and bringing in naturally cool air. IBM opened a 25,000 square foot data centre in Barrie, Ont. in 2016. The company expected to cool the facility for 210 days each year without chillers, saving about 50 percent on energy.

In some cases, remote free-cooling data centres are located beside hydro dams, further increasing energy efficiency by minimizing transmission losses. This solution works for large firms, but there are thousands of small companies that cannot or will not lease computing power from a remote service bureau.

CoolIT distribution units feed each server.

Slow progress

Liquid cooling has been on the horizon at annual IT conferences for a decade or two but has made slow progress, likely because the idea of mixing liquids with electronics seems a little ‘out there.’ At the risk of being ridiculed someday, like the Prentice Hall editor in 1957, I’m willing to predict that some form of liquid cooling will dominate data centre cooling in the future. A handful of North American and European firms have been demonstrating highly successful projects.

One of the most incredible astrophysical research facilities in the world is in Kaleden, B.C. The Canadian Hydrogen Intensity Mapping Experiment (CHIME) is a radio telescope that maps the largest volume of space ever surveyed.

Heavy data crunching

It’s called a telescope, but it looks like several snowboard half-pipes covering an area the size of five hockey rinks. At seven quadrillion computer operations per second, its data throughput is equivalent to the flow through all the world’s mobile networks.

It opened last year after five years in development, a collaboration among 50 Canadian scientists from the National Research Council (NRC) and three universities. When construction started in 2012 there was no computer system available that could process a terabyte of information per second, but video game advances brought CHIME an answer: 1024 high-speed processors working simultaneously.
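
The throughput arithmetic is easy to check; assuming decimal units (my assumption, not CHIME's specification), one terabyte per second across 1024 processors is about a gigabyte per second each:

    # CHIME ingest: one terabyte per second spread across 1024 processors
    # works out to roughly 1 GB/s per node (decimal units assumed).
    TB = 10**12
    per_node_gb_s = TB / 1024 / 10**9
    print(f"{per_node_gb_s:.2f} GB/s per processor")  # -> 0.98 GB/s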

High performance computing (HPC) nodes at CHIME reside directly adjacent to the telescope in sealed shipping containers designed to prevent leakage of electromagnetic interference from the servers.

“A normal server rack has 80 central processing units (CPUs),” says Geoff Lyon, CEO & chief technology officer at CoolIT Systems Inc., Calgary, which designed the high-density liquid cooling solution for CHIME.

“This can be increased to 160-200 CPUs with an advanced liquid cooling solution…Our system also saves 70 percent on energy compared with most instances when air conditioning is used for the whole server room.”

Coolant flows through the servers in blue feed and red return lines.

Looks can be deceiving

“It looks like a conventional air-cooled system, but we bring cooling liquid in through tubes and a series of manifolds and disconnects, directly to the hottest components – the CPUs and GPUs (graphics processing units). The tubes deliver cooling liquid to a thin copper plate that comes into contact with the hot spots and acts as a heat exchanger. Heated liquid circulates out of the facility for external cooling.”

In the CHIME system, 26 CoolIT DCLC CHx40 coolant distribution units (CDUs) move liquid from the Intel Xeon E5-2620 v3 CPUs and dual AMD FirePro S9300x2 GPUs in 256 custom General Technics GT0180 4U servers, across 26 racks, to external dry coolers.

The system provides continuous operation of the world’s largest low-frequency radio correlator, with no active cooling components inside the DC itself. “Some of the efficiency results from the fact that the cooling liquid can be 40 degrees Celsius,” says Lyon. “The CPU runs happily at 80 degrees.”
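
Those temperatures invite a back-of-the-envelope flow calculation using the standard relation Q = ṁ × c_p × ΔT. The 200-watt CPU load and 10-degree coolant temperature rise below are my own illustrative assumptions, not CoolIT or CHIME figures:

    # Rough coolant flow for one CPU: Q = m_dot * c_p * dT.
    # The 200 W heat load and 10 K temperature rise are assumed
    # for illustration, not vendor specifications.
    cpu_heat_w = 200.0   # heat to remove from one CPU (W)
    cp_water = 4186.0    # specific heat of water, J/(kg*K)
    delta_t = 10.0       # allowed coolant temperature rise (K)

    m_dot = cpu_heat_w / (cp_water * delta_t)   # kg/s
    litres_per_min = m_dot * 60                 # ~1 kg of water per litre

    print(f"{m_dot:.4f} kg/s = {litres_per_min:.2f} L/min per CPU")
    # -> 0.0048 kg/s = 0.29 L/min: a trickle of water handles a hot CPU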

The system seals the liquid inside copper heat exchangers and tubes to avoid contact with the electronics. A similar approach is taken by several other DC innovators, like Cloud & Heat Technologies GmbH in Dresden, Germany.

Oil-cooling

There is another liquid cooling solution in use at leading-edge facilities around the world. In this case servers are submerged in a dielectric (non-conductive) synthetic mineral oil blend. It looks like water, but when it contacts the electronics nothing shorts out; there is just highly efficient cooling. Brandon Moore of Green Revolution Cooling (GRC) in Austin, Texas, says it offers about 1200 times the heat-absorbing capacity of air.
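
That figure is plausible on paper. Comparing volumetric heat capacity (density times specific heat) using typical textbook property values, assumed here rather than taken from GRC, lands in the same ballpark:

    # Why immersion works: volumetric heat capacity (rho * c_p) of a
    # mineral oil vs. air. Property values are typical textbook figures.
    rho_air, cp_air = 1.2, 1005.0     # kg/m^3, J/(kg*K) at room temperature
    rho_oil, cp_oil = 850.0, 1900.0   # typical synthetic mineral oil

    ratio = (rho_oil * cp_oil) / (rho_air * cp_air)
    print(f"Oil absorbs roughly {ratio:.0f}x more heat per unit volume than air")
    # -> about 1,300x, in line with the ~1,200x figure quoted above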

At a research university in Texas there was a 90 percent reduction in cooling energy use. The PUE (power usage effectiveness) was reported at 1.03 to 1.05, compared to 1.7 for an average air-cooled DC. The project also reduced the power load for computing by 20 percent and cut construction costs by 60 percent. Although heavy, the server tanks are relatively small and there are no fans, water loops or heat exchangers.
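
PUE itself is simply total facility power divided by IT equipment power; a toy example with round numbers matching the figures above:

    # PUE = total facility power / IT equipment power.
    # The kilowatt values are invented round numbers for illustration.
    def pue(total_kw: float, it_kw: float) -> float:
        return total_kw / it_kw

    print(pue(1030, 1000))   # immersion-cooled example -> 1.03
    print(pue(1700, 1000))   # typical air-cooled DC    -> 1.70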

The company has installed similar liquid cooling systems in Japan, Australia, Spain, Austria, Costa Rica and the UK. They are not alone. Several competitors also plunge electronics into tanks full of oil.

Supply and return manifolds connect the servers.

Growing market

LiquidCool Solutions of Rochester, Minnesota is working with Johnson Controls, Siemens, Bonneville Power, and the state of New York. On its website it cites WiseGuy Research Consultants, headquartered in Pune, India, predicting that the global data centre liquid immersion cooling market will grow from $109.51 million USD in 2015 to $959.62 million in 2020, a compound annual growth rate of 54.3 percent.
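
The quoted growth rate checks out: compound annual growth is (end/start)^(1/years) - 1.

    # Verifying the market forecast's growth rate from its endpoints.
    start, end, years = 109.51, 959.62, 5   # $M USD, 2015 -> 2020

    cagr = (end / start) ** (1 / years) - 1
    print(f"{cagr:.1%}")   # -> 54.4%, consistent with the quoted 54.3 percent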

In 2007 the CEO of Microsoft said: “There’s no chance that the iPhone is going to get any significant market share.” He probably regrets having said that. In 2018 I’m saying that liquid cooling for data centre servers is a real thing. Hope I’m right.
