Better data center efficiency starts with better facility cooling

Want a more energy-efficient data center? Start by optimizing your cooling efforts.

Data center cooling efficiency has been a particularly hot topic among industry experts in recent months for one very simple reason: It's an energy hog. 

In 2014, data centers consumed 70 billion kilowatt-hours of electricity. The good news, according to Data Center Knowledge's editor in chief, Yevgeniy Sverdlik, is that experts predict this figure will rise only modestly by 2020, to 73 billion kilowatt-hours (a 4 percent increase). That projection assumes data center operators keep pace with current energy efficiency efforts, which have improved significantly since 2010.

Despite these enhancements, one area that's still in need of optimization is data center cooling, which accounts for, on average, about 40 percent of the energy these facilities consume. Given the exorbitant amount of money tied to these systems, it's clear that facility managers who achieve greater cooling efficiency will be in a strong position to lower their operational expenses. 
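To put the figures above in perspective, a quick back-of-the-envelope calculation shows how much energy is tied up in cooling. The 20 percent improvement figure below is purely illustrative, not a claim from this article:

```python
# Back-of-the-envelope estimate using the consumption figures cited above.
TOTAL_KWH_2014 = 70e9          # total data center consumption, 2014 (kWh)
COOLING_SHARE = 0.40           # cooling's average share of facility energy

cooling_kwh = TOTAL_KWH_2014 * COOLING_SHARE
print(f"Cooling energy: {cooling_kwh / 1e9:.0f} billion kWh")      # 28 billion kWh

# Hypothetical: a 20 percent cut in cooling energy via better efficiency
savings_kwh = cooling_kwh * 0.20
print(f"Potential savings: {savings_kwh / 1e9:.1f} billion kWh")   # 5.6 billion kWh
```

Even under these rough assumptions, a modest efficiency gain in cooling translates into billions of kilowatt-hours.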

Exploring the wilder options

Some rather inventive cooling strategies have been explored within the past few years. In January, The New York Times published an article about a prototype of a submarine server room engineered by Microsoft. The facility is powered primarily by the tides and sits hundreds of feet beneath the ocean's surface. The biggest benefit of the aquatic data center is that it eliminates what the Times refers to as "one of the technology industry's most expensive problems": data center cooling.

The concept's drawback, however, is that deploying such facilities on a larger scale could have environmental repercussions and cause unforeseen technical issues, and many experts remain skeptical of the technology's long-term viability.

Another inventive approach to enhancing cooling efficiency came from social media giant Facebook. In September, Mark Zuckerberg posted photos of the company's facility in Sweden, located just 70 miles south of the Arctic Circle. Only 150 employees are needed to staff the data center, and they get around on scooters to save time. The best part is that, because of its location, the cooling system is technically more of a ventilation system.

It sounds ingenious, and in many ways it is. Unfortunately, it's not in every organization's best interest (or budget) to build a state-of-the-art data center on the outer rim of the Arctic Circle. 

There are easier ways than moving to the Arctic to improve data center cooling efficiency.

A more sensible approach: Streamlined airflow management

There's certainly something to be said for constructing aquatic server rooms and data centers cooled with Arctic air, but for the vast majority of organizations, these are simply out of the question. At the same time, merely dialing up the cooling capacity has proven ineffectual, and for good reason. Experts contend that the most common cause of poor cooling efficiency has nothing to do with the computer room cooling units, and everything to do with airflow management.

Bypass airflow, for instance, occurs when cool air passes around servers instead of through them. Meanwhile, poor exhaust containment, another major airflow pitfall, introduces the risk of warm exhaust air recirculating back into cold aisles. These two issues are typically the main culprits behind cooling inefficiency.
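One common way to spot recirculation is to compare each rack's intake temperature against the cooling unit's supply temperature: if the intake runs well above the supply, warm exhaust air is likely mixing back into the cold aisle. Here is a minimal sketch of that check; the 3-degree threshold and the rack readings are illustrative values, not figures from this article:

```python
# Flag racks whose intake air is much warmer than the supply air,
# a sign that hot exhaust is recirculating into the cold aisle.
CRAC_SUPPLY_C = 18.0           # cold-aisle supply air temperature (Celsius)
RECIRC_THRESHOLD_C = 3.0       # tolerated supply-to-intake temperature rise

rack_intake_c = {"rack-01": 18.5, "rack-02": 24.0, "rack-03": 19.2}

flagged = [rack for rack, temp in rack_intake_c.items()
           if temp - CRAC_SUPPLY_C > RECIRC_THRESHOLD_C]
print(flagged)  # ['rack-02']
```

In practice the same comparison would run against live sensor feeds, but the principle is identical: a large supply-to-intake temperature gap points to an airflow problem, not a capacity problem.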

External factors such as regional climate will always influence cooling requirements, but there are some factors that data center managers can control, and airflow management is one of them.

Don't wiggle into your diving suits and thermal underwear just yet. Contact Geist today to learn about a more sensible approach to cooling efficiency.