Top 4 Things to Know About Data Center Cooling

Today's data centers require a great deal of monitoring to keep running efficiently. Proper maintenance involves a variety of factors, one of which is appropriate temperature control. The IT equipment in a data center consumes electricity to function and produces heat as a byproduct. In the enclosed space of a data center, that heat can build up quickly.


While the American Society of Heating, Refrigerating and Air-Conditioning Engineers' (ASHRAE) temperature guidelines for data centers allow temperatures up to 104 degrees Fahrenheit for some equipment classes, sustained temperatures at or above this level can damage or destroy devices and components within the data center. Such damage not only results in significant repair costs but can also lead to costly downtime.


To prevent such outages, removing heat from a data center is of the utmost importance. Fortunately, a variety of methods and technologies are available today to help maintain data center temperatures at an appropriate level.


Choose the Best Cooling Design for Your Facility

If you're in the process of designing a data center or renovating your current one, one of the most valuable data center cooling tips is to choose the plan best suited to your facility's needs. The right choice depends on several factors, including your facility's size and layout, rack density, budget and local climate, and it can have a significant impact on your data center's cooling efficiency.

Here are four of the most important things to keep in mind about data center cooling strategies so you can determine the best cooling methods for your data center.


1. Pick the Right Cooling Method

By weighing these factors and balancing them appropriately, you can get a clearer picture of what your facility can handle and choose the cooling method that best suits its needs. The most common cooling methods are described in more detail below:


Air Cooling

This basic method of cooling uses computer room air conditioners, or CRACs. These units draw in warm air, remove its heat and recirculate it as cold air. CRACs can be arranged in any number of configurations, focusing on the whole room, a single row or even a single rack.


CRACs can also be used in combination with raised floors, supplying cold air beneath the floor so fans can draw it upward to cool the equipment. The CRACs then collect the heated air from the upper portions of the room, cool it and recycle it back underneath the floor, moving the heat out of the room.


Liquid Cooling

Liquid cooling is more demanding than air cooling on both a technical and a budgetary level, but it offers more cooling power. Liquid cooling designs circulate cold water or refrigerant from a cooling tower to remove heat from the facility. The liquid moves around or through the racks or CRACs, collecting heat as it goes, then returns to the cooling tower, which rejects the collected heat as waste.


Because it offers greater cooling power than air cooling, this type of system is often a necessity for high-density deployments. However, it requires much more infrastructure and planning and is more expensive than air cooling or free cooling.


Free Cooling

Free cooling is also known as air-side or water-side economization. The air-side method uses outside air to cool the equipment, limiting the expenses associated with chillers and air conditioners by minimizing their usage. This approach can introduce problems with airborne contaminants and humidity swings, but more sophisticated systems reduce these problems by transferring heat outdoors in a more indirect fashion.
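To make the air-side idea concrete, here is a minimal Python sketch of the enable/disable decision an economizer controller makes. The thresholds are illustrative assumptions, not ASHRAE limits:

```python
# A minimal sketch of an air-side economizer decision: use outside air when
# it is cool and dry enough, otherwise fall back to mechanical cooling.
# Both thresholds below are assumptions chosen for illustration.

def use_outside_air(outdoor_temp_f, outdoor_rh_pct):
    """Return True if free cooling with outside air looks viable."""
    cool_enough = outdoor_temp_f < 65.0           # assumed supply-air ceiling
    dry_enough = 20.0 <= outdoor_rh_pct <= 80.0   # assumed acceptable humidity band
    return cool_enough and dry_enough

print(use_outside_air(55.0, 45.0))  # True -> economize with outside air
print(use_outside_air(90.0, 45.0))  # False -> run the mechanical cooling
```

Real controllers weigh more inputs, such as dew point and indoor setpoints, but the basic trade-off is the same: mechanical cooling runs only when outside air can't do the job.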


Water-side economization works on similar principles, combining outside air with evaporative techniques to cool liquid that runs through the facility without using chillers. This approach has grown more popular as allowable data center temperatures have increased. Free cooling usually requires infrastructure similar to liquid cooling but costs significantly less to operate.


By considering each of these options and comparing them to your company's plans and expectations, you can choose the system that best suits your data center's needs.


2. Be Proactive About Potential Problems With Your Cooling System

Designing your cooling system doesn’t end at choosing the type of system to implement. Too many organizations simply pick a room, pick a cooling system and fill the space in whatever arrangement will fit. However, this approach can lead to several problems.


Improper Infrastructure

An undersized power or cooling infrastructure can severely limit your operational capacity, while an oversized infrastructure can unnecessarily increase your capital expenses and monthly expenditures.
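To see why sizing matters, here's a back-of-the-envelope Python estimate of required cooling capacity. The 120 kW IT load is an assumed figure for illustration, while the conversion factors (about 3,412 BTU/hr of heat per kW of load, 12,000 BTU/hr removed per ton of cooling) are standard:

```python
# Rough cooling-capacity estimate for right-sizing. The IT load is assumed;
# the conversion constants are standard.

it_load_kw = 120                        # assumed IT load
heat_btu_per_hr = it_load_kw * 3412     # ~3,412 BTU/hr of heat per kW of load
cooling_tons = heat_btu_per_hr / 12000  # 1 ton of cooling removes 12,000 BTU/hr

print(f"{it_load_kw} kW of IT load produces about {heat_btu_per_hr:,} BTU/hr of heat")
print(f"Minimum cooling capacity: {cooling_tons:.1f} tons, before any safety margin")
```

Running the numbers this way before buying equipment helps you avoid both the undersized and the oversized trap.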


Poor Component Placement

If you place your components without considering your infrastructure, you may reduce your cooling system’s efficiency. For example, misplacing a rack may force you to detour an air duct in a way that reduces its potential airflow.


Inappropriate Floor Planning

Too little space in your data center room complicates component placement, while a poorly planned layout wastes floor space.


Preventing Cooling System Pitfalls

To avoid these pitfalls with your cooling system, take some preventative measures both in designing and maintaining it.


Implement a Hot-Aisle/Cold-Aisle Configuration

Instead of trying to cool your entire data center to a single low temperature, focus on removing hot air from the room before it has a chance to recirculate. You can easily separate hot and cold air by arranging the racks in rows: have the fronts of the racks face each other to create a "cold aisle," and the backs face each other to create a "hot aisle." With this configuration in place, you can more efficiently vent air from the hot aisles before it affects the servers in adjacent rows.


More sophisticated variations on this design add containment walls between the racks or between the racks and the ceiling, isolating the warm and cold air even further. This type of configuration can help decrease cooling energy usage by up to 20 percent and prevent some common cooling inefficiencies.
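As a simple illustration of the layout rule, this hypothetical Python check records which way each row of rack fronts faces and confirms that adjacent rows alternate, which is what creates the paired hot and cold aisles:

```python
# Illustrative check that rack rows alternate orientation, so fronts meet
# fronts (cold aisle) and backs meet backs (hot aisle) down the room.

def aisles_alternate(row_facings):
    """row_facings: direction each row's fronts face ('N' or 'S'), in floor order."""
    return all(a != b for a, b in zip(row_facings, row_facings[1:]))

print(aisles_alternate(["N", "S", "N", "S"]))  # True: proper hot/cold aisles
print(aisles_alternate(["N", "N", "S", "S"]))  # False: a row exhausts into a neighbor's intake
```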


Maintain Organized Cables

Tangled cables can substantially block airflow, preventing cold air from distributing beneath raised floors, and they trap heat within enclosures, letting temperatures climb to dangerous levels more quickly. Avoid these issues by routing cables through overhead cable managers and using high-capacity cable managers inside enclosures to minimize tangling. This is easy to implement early on and can prevent a lot of cooling inefficiencies.


Plan for Your Racks

Too many companies design the room before they choose their racks. A better approach is to select the racks best suited to your needs, decide on their configuration and density, and design the room around them. That way, you don't risk overcrowding or under-provisioning, and you can plan the best cooling and power infrastructure for your data center's needs, preventing costly structural deficiencies.


3. Look for New Ways to Improve Your Data Center’s Cooling Efficiency

Years ago, energy efficiency in data center cooling systems was an afterthought. With relatively cheap energy available and rack density low, companies weren’t too worried about the electric bill each month. However, today’s data center managers are expected to pay much closer attention to energy costs.


Data center cooling can account for up to 30 percent of a data center's operational costs and 70 percent of its energy use. In a 2011 survey, 97 percent of data center managers said reducing energy expenditure was either a "somewhat" or "very" important concern, and 87 percent of those respondents cited lower costs as the primary motivation.
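A quick worked example shows why those percentages get managers' attention. The monthly consumption and utility rate below are assumed for illustration; the 70 percent share and the 20 percent containment savings come from the figures cited in this article:

```python
# Back-of-the-envelope cooling cost, using assumed consumption and rate.

monthly_kwh = 500_000   # assumed facility consumption
rate_per_kwh = 0.10     # assumed utility rate, USD
cooling_share = 0.70    # upper-bound share of energy use cited above

cooling_cost = monthly_kwh * cooling_share * rate_per_kwh
print(f"Cooling cost at a 70% share: ${cooling_cost:,.0f} per month")
print(f"A 20% cooling reduction would save: ${cooling_cost * 0.20:,.0f} per month")
```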


With costs such a high priority, data center managers have to do what they can to continually improve the efficiency of their data center cooling systems. Here are a few ways to improve the effectiveness of your existing cooling system.


Install Blanking Panels

Block off unused rack spaces by installing blanking panels. If you use these panels in combination with a hot-aisle/cold-aisle configuration, they prevent hot air from recirculating through the unused rack space. Plan for these blanking panels by using racks compatible with snap-on panels. This way you don’t need to break out the toolkit every time you need to reconfigure your data center arrangement.


Replace Inefficient UPS Systems

Any unnecessary heat sources should be removed from the room if you want to improve efficiency. Replacing traditional online UPS systems with more modern, energy-efficient models reduces your system's heat output, making your cooling efforts more effective.


Use Close-Coupled Cooling

Close-coupled cooling systems tend to be more efficient than traditional perimeter and raised-floor systems. Instead of cooling the whole room, they pair hot-aisle/cold-aisle configurations with row-based air conditioning units placed close to the equipment, focusing cooling where it's needed most and reducing unnecessary energy expenditure. The modular nature of these systems also makes it easier to reconfigure them when installing new equipment or dealing with overheating racks.


Regularly Re-Evaluate the System

Whenever you make a substantial change to your current system or company budget, re-evaluate your setup to make sure you're still using the best cooling system for your configuration, room and budget. Check equipment temperatures regularly to make sure they fall within industry standards, such as the ASHRAE guidelines mentioned above, and make changes when necessary. The same system you've used for years may fall short if your equipment configuration has changed substantially since the initial implementation.
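Regular checks are easy to automate. Here is a minimal Python sketch that flags readings above the 104-degree allowable ceiling discussed earlier; get_rack_temps() is a hypothetical stand-in for whatever sensor network or DCIM API your facility exposes:

```python
# Minimal temperature audit against the allowable ceiling cited earlier.

ALLOWABLE_MAX_F = 104.0

def get_rack_temps():
    """Hypothetical stand-in for your sensor network or DCIM API."""
    return {"rack-01": 78.5, "rack-02": 81.2, "rack-03": 106.3}  # stubbed readings

for rack, temp_f in get_rack_temps().items():
    status = "OVER LIMIT - investigate" if temp_f >= ALLOWABLE_MAX_F else "ok"
    print(f"{rack}: {temp_f:.1f} F  {status}")
```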


4. Remember Why Structured Cabling Makes a Difference for Your Cooling System

One way to help your cooling system work more efficiently, regardless of the cooling system you use, is through structured cabling.


In most data centers, cabling follows a simple point-to-point methodology, with patch cables running directly between connected hardware. In a structured cabling system, those direct runs are replaced with patch panels placed at the top of each rack.


These patch panels, in turn, connect via multi-fiber trunk assemblies to the main distribution area, where all moves, adds and changes can be accomplished with short patch cables, improving visibility and organization.
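To illustrate why that layout simplifies changes, here is a toy Python model of the topology; all the labels are hypothetical. The rack-to-MDA trunks stay fixed, so a move, add or change touches only one short patch cable at the MDA:

```python
# Toy model of structured cabling: fixed trunks, changeable MDA patches.
# All names are hypothetical.

trunks = {"rack-01": "mda-panel-A", "rack-02": "mda-panel-B"}  # fixed trunk runs
mda_patches = {"mda-panel-A:port1": "core-switch:port7"}       # short patch cables

# Re-routing rack-01's uplink is a one-line change at the MDA; nothing at
# the rack or in the trunk pathway is touched.
mda_patches["mda-panel-A:port1"] = "core-switch:port9"
print(f"rack-01 trunks to {trunks['rack-01']}; uplink now lands on {mda_patches['mda-panel-A:port1']}")
```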


The result is a modular, standardized alternative to the point-to-point approach. When properly designed and installed, a structured system gives your data center an organized cabling infrastructure with benefits such as improved airflow, easier moves, adds and changes, and better visibility and organization.

In short, a structured cabling system can bring more efficiency and flexibility to your data center and its cooling system.


Contact Intellicom to Help Install a Structured Cabling System

If you’re interested in implementing a structured cabling system in your data center, Intellicom’s data cabling services can help. Intellicom has provided structured cabling services and support for data centers for over 25 years, helping companies accomplish everything from data center construction to expansion and reconfiguration. We offer expert care and service across the United States, operating as a structured cabling company in North Carolina, our home state, and across the country.
