Manage Airflow for Cooling Efficiency
The challenge of airflow management is conceptually simple: 1) ensure that the cold “supply” air from cooling equipment gets to the IT equipment inlets without mixing with hot exhaust (“return”) air; and 2) ensure that hot exhaust air returning to the cooling equipment intake does not mix with the cold supply air.
Airflow management has become even more important as data centers incorporate high–density server racks, which demand as much as 60 kW of power per rack versus 1-5 kW per rack just a few years ago—and generate ten or more times the amount of heat per square foot.
In an ideal world, cold supply air would be ducted directly to IT equipment intakes, and hot return air would be ducted directly back to the intake of the cooling unit(s). But even simply clearing clutter from the airflow path and installing physical devices to direct and separate hot and cold air can help improve cooling efficiency. Three highly effective airflow management strategies (hot aisle/cold aisle layout, containment, and in-rack/in-row cooling) are covered in detail on other pages on this website. This page focuses on smaller devices and modifications that can improve airflow and make almost any data center more energy efficient.
Install devices that improve airflow
Blanking panels cover up open, unused areas of server racks so that cold supply air passes through the equipment rather than over or under it. Blanking panels decrease server inlet air temperatures and increase the temperature of air returning to the CRAC, both of which improve operational efficiency. (See Figures 1 and 2 below.) A single 12 inch blanking panel can reduce rack temperatures by 20°F.1
Figures 1 and 2: A server rack shown prior to and after the installation of blanking panels. Photos courtesy of PlenaForm Systems and EDP Europe.

Structured cabling systems eliminate tangles of cables that might constrain exhaust airflow from rack–mounted equipment. In addition, cutting cables and power cords to the correct length provides more room for air to flow away from the back of the rack. (See Figures 3 and 4 below.)
Figures 3 and 4: Unstructured cabling (left) can impede the flow of hot exhaust air
away from the server, causing hot spots. Structured cabling (right) improves
airflow. Image credit: Dotcom-Monitor.com blog.
Air restrictors or floor grommets (Figure 5) improve cooling efficiency by sealing around cables where they enter and exit plenums (such as a raised floor). Reducing air leaks from around cabling helps direct more cold air to the IT equipment.
Figure 5: Raised–floor grommet. Image credit: 42U.com.
CRAC “chimneys” extend CRAC air intakes toward the ceiling, which maximizes the temperature of return (exhaust) air at the cooling units and improves efficiency. These sheet metal extensions raise the return air intakes closer to the ceiling of the server room, where they pull in only return (hot) air, not supply (cold) air. See Figure 6. Note that chimneys can impede maintenance access to cooling unit filters; however, the filters can be relocated to the top of the chimney, or access panels can be provided in the sides of the chimney.
Figure 6: Computer Room Air Conditioning (CRAC) Unit with a “chimney” that extends
the return air intake into the ceiling plenum, where it can pull in hot air exhausted by
servers. Photo credit: Polargy.com.
Tune up raised floors
Most data centers still utilize a “raised floor” for cold air distribution. A raised floor is a data center construction model in which a slightly higher floor is constructed above the building's original floor, leaving the open space created between the two for cold air distribution and wiring.2 Computer room air conditioners (CRACs) deliver their cold “supply” air below the raised floor, where it travels through the open space to vented tiles that are typically placed in front of server racks. Assuming there is sufficient pressure in the underfloor plenum, the conditioned air rises through the vented (or “perforated”) tiles and enters the server racks above to cool them.
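For a sense of why underfloor pressure matters, the short sketch below relates plenum pressure to the airflow a vented tile can deliver, using a simple orifice-flow approximation. The tile geometry, plenum pressure, and discharge coefficient are illustrative assumptions, not values from this page.

```python
import math

# Rough orifice-flow approximation for air at standard density:
#   velocity (ft/min) ~= 4005 * sqrt(static pressure, inches w.c.)
# The discharge coefficient and tile geometry below are assumed values
# chosen only to illustrate the relationship.

def tile_airflow_cfm(plenum_pressure_in_wc, open_area_sqft, discharge_coeff=0.6):
    """Approximate airflow (CFM) delivered by one vented tile."""
    velocity_fpm = 4005.0 * math.sqrt(plenum_pressure_in_wc)
    return discharge_coeff * open_area_sqft * velocity_fpm

# Example: a 2 ft x 2 ft tile that is 25% open, with 0.05 in. w.c.
# of underfloor pressure (both assumed).
open_area = 2.0 * 2.0 * 0.25
print(round(tile_airflow_cfm(0.05, open_area)))   # ~537 CFM
print(round(tile_airflow_cfm(0.10, open_area)))   # ~760 CFM: more pressure, more air
```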
It’s not unusual to find “hot spots” – warm areas in the data center – caused by inadequate cold air distribution or dense heat loads. This is not necessarily the fault of the CRAC units or of improper facility maintenance. It is more likely due to airflow patterns below the floor, the placement and type of vented floor tiles, and/or the flow of air around equipment racks.3 Vented tiles in particular are incorrectly located or sized in many data centers: there are often too many vented tiles, too few, the wrong type, or tiles placed in sub-optimal locations.
Figure 7: A typical vented (or perforated) tile and a tile-pulling tool.
Photo credit: LinkedIn.4
Determining the optimal number, type, and location of vented tiles in a data center requires some technical expertise. But don’t overlook low-hanging fruit, such as stray vented tiles located in hot aisles, which allow cold air from the subfloor to mix with hot exhaust air and reduce cooling efficiency.
- Be on the lookout for boxes, service carts, and other obstructions sitting on top of vented tiles in cold aisles, impeding airflow.
- Place perforated tiles in cold aisles only. Placing perforated tiles in any location but a cold aisle will increase bypass air flow.5 The only reason to place a perforated tile in a hot aisle is if it is a “maintenance tile.” A maintenance tile can be carried to where work is being performed in a hot aisle. An IT employee can then work in the hot aisle, standing on the vented tile in relative comfort. However, the perforated tile should never be left in the hot aisle once work is completed.
- Seal gaps between raised floors and walls, columns and other structural members. Raised floor tiles should be assessed periodically for unwanted leaks – in other words, cool air escaping from the subfloor in places where it is not cooling equipment. Sealing the spaces between the raised floors and room walls is a no-brainer. Gaps are easily identified by a simple visual inspection. A more subtle form of bypass can be found when columns are not finished above the ceiling or below the floor. Often, the sheet rock used to enclose a column forms a chase for direct bypass of cold air into the hot return air stream. These chases must be sealed to reduce bypass air flow.
- Use an appropriate selection of tiles. Frequently, data center managers address insufficient airflow and hot spots by installing high-velocity “grates” in the floor near the hot spots. Grates typically pass three times more air than perforated tiles. (See Figure 8.) Although placing grates near hot spots may seem like a solution, it can actually make the problem worse. If the under-floor space is maintained at a fixed pressure for perforated tiles, the throughput of the grate is such that the cold air will blow straight to the top of the aisle with very little capture at the racks.
- Adjust the placement of perforated tiles independently for each cold aisle. Calculate the IT or heat load of each cold aisle and place an appropriate number of perforated tiles or grates (but not perforated tiles mixed with grates – see above) to cool the IT load in that aisle; a worked example follows this list. Placing too few tiles in the cold aisle will cause recirculation. (Recirculation is the mixing of hot exhaust air with the cold intake of the IT equipment.) Placing too many will increase the amount of bypass airflow. If one needs to choose between a little recirculation and a little bypass, the latter is always more prudent. Remember that heat loads change as servers are added or removed. When the loads change, the number of tiles must be adjusted accordingly.6
- Eliminate sub–floor obstructions. Many data centers use the sub–floor plenum for more than just airflow. The subfloor area should be inspected for obstructions, such as bundled cabling or equipment, which may be impeding airflow.
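As a concrete version of the per-aisle arithmetic referenced above, the sketch below uses the standard sensible-heat relation for air (CFM ≈ BTU/hr ÷ (1.08 × ΔT°F)). The aisle load, server temperature rise, and per-tile airflow figures are assumptions chosen for illustration.

```python
import math

BTU_PER_HR_PER_KW = 3412.0  # conversion from kW to BTU/hr

def required_cfm(it_load_kw, delta_t_f=20.0):
    """Airflow the aisle's IT equipment must draw to carry away its heat,
    using the sensible-heat relation CFM = BTU/hr / (1.08 * delta T in F)."""
    return it_load_kw * BTU_PER_HR_PER_KW / (1.08 * delta_t_f)

def tiles_needed(it_load_kw, cfm_per_tile, delta_t_f=20.0):
    """Number of identical tiles needed to supply that airflow.
    Use one tile type per aisle; do not mix perforated tiles and grates."""
    return math.ceil(required_cfm(it_load_kw, delta_t_f) / cfm_per_tile)

aisle_load_kw = 40.0                        # assumed IT load in one cold aisle
print(round(required_cfm(aisle_load_kw)))   # ~6,319 CFM required
print(tiles_needed(aisle_load_kw, 500.0))   # 13 perforated tiles at ~500 CFM each
print(tiles_needed(aisle_load_kw, 1500.0))  # 5 grates at roughly 3x the airflow
```

Rerunning the same arithmetic whenever servers are added or removed shows how many tiles to add or pull.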
Use CFD or thermal imaging technology to find hidden opportunities
Two technologies allow us to “see” otherwise invisible airflow problems and to capitalize on energy efficiency opportunities.
Computational fluid dynamics (CFD) modeling uses a computer to model a data center's airflow and thus inform tile placement for optimum cooling and energy efficiency. CFD has been around since the early 20th century, initially used to analyze airflow around aircraft for aerodynamics, but in recent years it has emerged as a tool for data centers. CFD provides data center managers with a detailed 3-D thermal map of how cold air is moving through a data center, and predicts how airflow will change in response to vented tile modifications and other cooling and airflow adjustments. See Figure 9.
Thermal imaging makes it possible to view equipment temperature conditions that are invisible to the naked eye. All objects emit infrared radiation based on their temperature; the hotter an object is, the more it radiates, allowing infrared (or thermal) imaging to reveal cooling and airflow problems in the data center. In addition to identifying hot spots, thermal imaging can locate “cold spots,” where equipment is receiving too much cool air and wasting energy. As a bonus, thermal imaging can also identify overloaded circuits and loose connections before they result in unplanned outages. Figure 10 below shows a row of server racks with cold air (depicted in blue) coming up through perforated floor tiles. The air visibly warms as it moves from the bottom to the top of the racks, transitioning from yellow, to orange, to red along the way.
Figure 10: Infrared image showing temperature gradients across server racks. Photo
credit: Thermal Imaging Services, LLC.
With knowledge gleaned from CFD or thermal imaging, data center managers can adjust airflow management devices (e.g., vented floor tiles, grommets) and cooling equipment to reduce hot and cold spots—ultimately reducing the amount of time cooling units need to run and saving energy.
Savings and Costs
Blanking panels
Adding a single 12" blanking panel to the middle of a server rack can yield 1% to 2% energy savings per rack.7 Blanking panels can cost as little as $7 per 1U panel.8
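Putting those two figures together, the following is an illustrative payback sketch. The rack power draw, the electricity price, and the reading of the 1-2% figure as a share of the rack's overall energy footprint are assumptions made only for this example.

```python
HOURS_PER_YEAR = 8760

def annual_savings_usd(rack_kw, savings_fraction, usd_per_kwh):
    """Yearly dollar savings if a blanking panel trims rack-related energy use."""
    return rack_kw * savings_fraction * HOURS_PER_YEAR * usd_per_kwh

panel_cost = 7.0                                      # ~$7 per 1U panel (from the text)
savings = annual_savings_usd(rack_kw=5.0,             # assumed rack draw
                             savings_fraction=0.015,  # midpoint of the 1-2% figure
                             usd_per_kwh=0.10)        # assumed electricity price
print(round(savings, 2))                    # ~$65.70 saved per year
print(round(panel_cost / savings * 12, 1))  # payback in ~1.3 months
```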
Staff at Kaiser Permanente’s ENERGY STAR certified data center used blanking panels and other airflow management strategies to eliminate nearly 70,000 cubic feet per minute (CFM) of bypass air. Read the case study (PDF, 79.2 KB) for additional details.
Tuning up raised floors and installing floor grommets
Tuning up floor tiles and minimizing cold air leakage from the subfloor can have a major impact on cooling efficiency. A single unprotected opening of approximately 12" x 6" can bypass enough air to reduce the system cooling capacity by 1 kW.9 Floor grommets cost roughly $40 to $125 each, depending on the size and function.
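The sketch below shows the kind of back-of-envelope arithmetic behind a figure like that, combining an orifice-flow approximation with the sensible-heat relation for air. The plenum pressure, discharge coefficient, and supply/return temperature difference are assumed values.

```python
import math

def bypass_cfm(opening_sqft, plenum_pressure_in_wc, discharge_coeff=0.6):
    """Air escaping through an unsealed opening (orifice-flow approximation)."""
    return discharge_coeff * opening_sqft * 4005.0 * math.sqrt(plenum_pressure_in_wc)

def lost_cooling_kw(cfm, delta_t_f):
    """Sensible cooling the bypassed air could have delivered to IT equipment."""
    return 1.08 * cfm * delta_t_f / 3412.0

opening_sqft = 1.0 * 0.5              # a 12 in. x 6 in. cable cutout
cfm = bypass_cfm(opening_sqft, 0.05)  # assumed 0.05 in. w.c. plenum pressure
print(round(cfm))                          # ~269 CFM of cold air lost to bypass
print(round(lost_cooling_kw(cfm, 15), 1))  # ~1.3 kW at an assumed 15 F delta-T
```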
Correctly locating and sizing vented tiles, combined with sealing open spaces around cable feeds in the floor plenum, improves subfloor air pressure and reduces bypass airflow. This can reduce energy expense by 1-6%.10
CRAC “chimneys” and vented tile optimization
Google’s Green Data Centers: Network POP Case Study (PDF, 3.6 MB) examines a small data center’s experience with CRAC air return extensions (“chimneys”), vented tile optimization, cold aisle containment, and temperature and humidity adjustments. Taken together, these measures showed a return on investment (ROI) in less than one year.
CFD modeling
Some vendors offering CFD-aided cooling optimization services have demonstrated over 25% energy savings.11
Managing Airflow Pays Off for QTS

QTS undertook a variety of measures to save energy at its Atlanta Metro data center. One critical initiative was to optimize airflow, which was no small task considering the facility is one of the largest data centers in the world. At 990,000 square feet, it serves more than 200 enterprise customers and has its own on–site electrical substation. After the operations team conducted an extensive study and an airflow assessment, QTS undertook four important steps to improve airflow and minimize the loss of cool air in its data center.
The results speak for themselves. After implementing these measures, QTS’s PUE (power usage effectiveness) dropped by 0.11, resulting in a savings of approximately $60,000 over a two–month period. These low-cost airflow management measures delivered approximately 20 percent of the company’s energy efficiency savings. Additional efficiency measures included chiller temperature adjustments, the use of economizers, and installation of efficient lighting. The multiple initiatives undertaken by QTS earned the Atlanta metro facility a Leadership in Energy and Environmental Design (LEED) Gold certification.
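As a rough illustration of how a PUE drop translates into dollars (facility power equals IT power times PUE, so the avoided energy is IT power × ΔPUE × hours), the sketch below uses an assumed IT load and electricity price; these are not QTS's actual figures.

```python
def savings_usd(it_load_kw, pue_drop, hours, usd_per_kwh):
    """Dollar savings from running at a lower PUE over a given period."""
    return it_load_kw * pue_drop * hours * usd_per_kwh

two_months = 2 * 730  # roughly 1,460 hours
print(round(savings_usd(it_load_kw=5000.0,   # assumed multi-megawatt IT load
                        pue_drop=0.11,       # PUE reduction reported above
                        hours=two_months,
                        usd_per_kwh=0.07)))  # assumed electricity price: ~$56,210
```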
Tips and Considerations
Due to the complexity of airflow in data centers, the proper use of airflow management devices such as blanking panels, structured cabling, floor grommets, and vented tiles is not always obvious. A professional air flow assessment can help identify ways to optimize cooling efficiency in data centers by better deploying these devices. A professional assessment should include examination of:
- The amount of airflow in relation to the hottest server racks
- The size of under–floor supply plenums
- Pressure in under–floor supply plenums
- The size of return plenums or ceiling height
- Location and size of vented tiles.
A comprehensive airflow and tile analysis also allows for CRAC unit location optimization.
1 DOE's Better Buildings Data Center Partnerships by Dale Sartor, Lawrence Berkeley National Laboratory, June 15, 2015, slide 26. https://datacenters.lbl.gov/sites/default/files/NYC_DataCenterConference_52015.pdf (PDF, 13.7 MB)
2 Raised Floor by Margaret Rouse, TechTarget, May 2012: http://searchdatacenter.techtarget.com/definition/raised-floor
3 Understanding Datacenter Vent Floor Tile for Higher Thermal Efficiencies by G. H. White. LinkedIn, March 24, 2016. https://www.linkedin.com/pulse/understanding-datacenter-vent-floor-tile-higher-thermal-white-gene
4 Understanding Datacenter Vent Floor Tile for Higher Thermal Efficiencies by G. H. White. LinkedIn, March 24, 2016. https://www.linkedin.com/pulse/understanding-datacenter-vent-floor-tile-higher-thermal-white-gene
5 Bypass air flow is cold supply air that does not lead to productive cooling of the IT load.
6 Air flow management strategies for efficient data center cooling, by Vali Sorell, Search DataCenter. http://searchdatacenter.techtarget.com/tip/Air-flow-management-strategies-for-efficient-data-center-cooling
7 Data Center Energy Efficiency Best–Practices — Insights Into The ROI On Best–Practices, December 3, 2008, 42U. http://www.slideshare.net/42u/data-center-energy-efficiency-best-practices
8 The vertical space taken up by a server is measured in Rack Units (“RU” or “U”). A “U” is equivalent to 1.75 inches (4.45 cm).
9 Air flow management strategies for efficient data center cooling, by Vali Sorell, Search DataCenter. http://searchdatacenter.techtarget.com/tip/Air-flow-management-strategies-for-efficient-data-center-cooling
10 Energy Efficiency Best Practices, 42U.
11 Guidelines for Energy-Efficient Datacenters, The Green Grid, p. 4. Information provided by the Green Grid in report no longer available online.