Properly Deployed Airflow Management Devices

Description

All airflow management strategies strive either to maximize cooling by delivering supply air directly to equipment or to eliminate the mixing and recirculation of hot equipment exhaust air.

A long-term monitoring study of 19 data centers by the Uptime Institute concluded that only 60% of the cool air being pumped into the data center was cooling equipment. It also found that 10% of the data centers had hot spots.22 (Data center hot spots refer to server inlet air conditions that are either too hot or too dry, according to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) TC 9.9 guidelines.)

Sound airflow management strategies are becoming even more important as data centers accommodate modern high-density server racks, which draw 20 kW to 30 kW of power per rack versus 2 kW per rack just a few years ago, and which generate ten or more times the heat per square foot.
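
To put that density increase in concrete terms, the short sketch below converts power per rack into heat per square foot. The 25 square feet allotted per rack (cabinet footprint plus a share of aisle space) is an illustrative assumption, not a figure from this document; essentially all IT power is rejected as heat.

    # Rough heat-density comparison for legacy vs. modern racks.
    # The 25 sq ft per rack (cabinet footprint plus a share of aisle
    # space) is an illustrative assumption, not a figure from the text.
    SQFT_PER_RACK = 25.0

    legacy_kw = 2.0                              # kW per rack, a few years ago
    modern_kw_low, modern_kw_high = 20.0, 30.0   # kW per rack today

    # Essentially all IT power is rejected as heat, so kW in ~= kW of heat out.
    legacy_wsf = legacy_kw * 1000 / SQFT_PER_RACK
    modern_wsf_low = modern_kw_low * 1000 / SQFT_PER_RACK
    modern_wsf_high = modern_kw_high * 1000 / SQFT_PER_RACK

    print(f"Legacy rack: {legacy_wsf:.0f} W/sq ft")
    print(f"Modern rack: {modern_wsf_low:.0f} to {modern_wsf_high:.0f} W/sq ft")
    print(f"Increase:    {modern_kw_low / legacy_kw:.0f}x to {modern_kw_high / legacy_kw:.0f}x")

Under these assumptions a legacy rack rejects about 80 W per square foot, while a modern rack rejects 800 to 1,200 W per square foot, in line with the tenfold-or-more increase noted above.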

  • Diffusers should be positioned to deliver cool air directly to the IT equipment. At a minimum, diffusers should not be placed such that they direct air at rack or equipment heat exhausts.
  • Blanking panels are fundamental to efficient airflow control in server racks. On the front of server racks, unused rack spaces (open areas) are covered with blanking panels so that air passes through the equipment rather than around it. Blanking panels decrease server inlet air temperatures and increase the temperature of air returning to the computer room air conditioner (CRAC), both of which improve operational efficiency. (See Figures 7 and 8 below.) BNY Mellon, an ENERGY STAR certified data center, employed blanking panels as part of their energy efficiency upgrades. See their case study and public service announcement recognizing them as a Low Carbon IT Champion.
  • Structured cabling systems can eliminate disorderly and excess cables that might constrain exhaust airflow from rack-mounted equipment. In addition, cutting cables and power cords to the correct length will provide more room for air to flow away from the back of the rack. (See Figures 9 and 10 below.)
  • Eliminating sub-floor obstructions can improve efficiency. Many data centers use the sub-floor plenum for more than just airflow. Cabling, for instance, can impede proper air circulation, and taking fixed obstructions into account can yield a more efficient floor tile arrangement.
  • Floor grommets (see Figure 11) improve cooling efficiency by sealing areas where cables enter and exit plenums (such as a raised floor). Less leakage helps direct more cold air to the equipment that needs cooling.
  • Vented tiles are incorrectly located or sized in many data centers. Due to the complexity of airflow behavior, the correct configurations are not readily obvious. A professional airflow assessment can help identify ways to improve cooling efficiency.

Staff at Kaiser Permanente’s ENERGY STAR certified data center used boat covers, underfloor baffles, and blanking panels to eliminate nearly 70,000 CFM of bypass air. See their case study and public service announcement recognizing them as a Low Carbon IT Champion.
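
Eliminating bypass air saves energy chiefly by allowing fan output to be turned down (for example, with variable frequency drives) or CRAC units to be shut off; fan power then falls roughly with the cube of airflow. The sketch below applies this fan affinity relationship to the 70,000 CFM figure. The total airflow and baseline fan power are illustrative assumptions, not numbers from the Kaiser Permanente case study.

    # Fan affinity sketch: when fan speed is turned down to match a
    # reduced airflow requirement, fan power falls roughly with the
    # cube of the flow ratio. The 70,000 CFM of eliminated bypass air
    # is from the case study; total airflow and baseline fan power
    # below are illustrative assumptions only.
    total_cfm = 400_000.0    # assumed total CRAC airflow
    bypass_cfm = 70_000.0    # bypass air eliminated (case study figure)
    baseline_fan_kw = 300.0  # assumed total fan power before the fix

    flow_ratio = (total_cfm - bypass_cfm) / total_cfm
    new_fan_kw = baseline_fan_kw * flow_ratio ** 3

    print(f"Airflow reduced to {flow_ratio:.0%} of baseline")
    print(f"Fan power: {baseline_fan_kw:.0f} kW -> {new_fan_kw:.0f} kW "
          f"({baseline_fan_kw - new_fan_kw:.0f} kW saved)")

Even a modest flow reduction yields an outsized power savings under these assumptions, which is why bypass air is such a valuable target.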

Figure 7 shows temperature data from a blanking panel installation. The server rack with blanking panels maintains a cool server inlet temperature by preventing mixing between the hot and cold aisles. Without blanking panels, hot and cold aisle air mixes and server inlet temperatures are much higher.
Figure 7: Actual data from a blanking panel installation demonstrates the panels’ effectiveness in maintaining cool server inlet temperatures.23
Figure 8 shows a server rack before and after blanking panel installation on the cold aisle side. Before installation, one can see straight through the rack in the spaces where no servers are installed; these openings are the conduits through which hot and cold air mix.
Figure 8: A server rack shown prior to and after the installation of blanking panels. (Photo courtesy of PlenaForm Systems and EDP Europe.)
Figure 9 shows the back of a server rack with unstructured cabling. The tangled weave of cables will restrict proper airflow.
Figure 9: Unstructured cabling. (Photo courtesy of Dotcom-Monitor.com Blog)
Figure 10 shows structured cabling: tightly bundled cables (bound with plastic ties) coming from the back of each server in a large server rack.
Figure 10: Structured cabling
Figure 11 shows a raised-floor grommet. The grommet prevents unwanted leakage of airflow from the data center by sealing every cabling entry point. Cables enter through the grommet’s dense, flexible brush portal.
Figure 11: Raised-floor grommet. (Photo courtesy of 42U.com)

Savings and Costs

Savings

  • Adding a single 12" blanking panel to the middle of a server rack can yield 1% to 2% energy savings.24

Costs

  • Blanking panels cost approximately $4 to $12 per 1U panel.25 Assuming 10 U of empty space per rack, the costs will be $40 to $120 per rack.26 (A worked payback sketch follows this list.)
  • Labor costs will vary, depending on the type of blanking panel. Installing 1000 U of blanking panels required more than 80 hours of labor, according to one study.27
  • Floor grommets cost roughly $50 to $100 each, depending on the size and function.
  • The Green Grid recently completed a comprehensive analysis of return on investment and power usage effectiveness for data center efficiency upgrades. Payback for adding baffles and blanking panels, repositioning temperature/humidity sensors, and adjusting temperature setpoints was 29 months on capital costs alone.
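
A worked sketch can tie the savings and cost figures above together. The annual energy cost attributable to a rack is an illustrative assumption, and as a simplification the 1% to 2% figure is treated as the savings from filling a rack’s open space; actual paybacks will vary with energy prices and rack loads.

    # Simple cost/payback sketch for blanking panels. The per-rack
    # energy cost is an illustrative assumption; the 1%-2% savings is
    # applied, as a simplification, to filling the rack's open space.
    panel_cost_per_u = (4 + 12) / 2   # $/U, midpoint of the $4-$12 range
    empty_u_per_rack = 10             # empty space assumed in the Costs section
    panel_cost = panel_cost_per_u * empty_u_per_rack   # $80 per rack

    rack_energy_cost = 4000.0   # assumed $/year per rack (IT + cooling)
    savings_fraction = 0.015    # midpoint of the 1%-2% savings figure

    annual_savings = rack_energy_cost * savings_fraction
    payback_months = panel_cost / annual_savings * 12

    print(f"Panel cost per rack: ${panel_cost:.0f}")
    print(f"Annual savings:      ${annual_savings:.0f}")
    print(f"Simple payback:      {payback_months:.0f} months")

Under these assumptions the simple payback is on the order of one to two years, broadly consistent with the Green Grid’s 29-month figure for a larger package of measures.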

How Managing Airflow Saves Big

QTS undertook a variety of measures to save energy at its Atlanta Metro data center. One critical initiative was to optimize airflow, which was no small task considering the facility is one of the largest data centers in the world. At 990,000 square feet, it serves more than 200 enterprise customers and has its own on-site electrical substation.

After the operations team conducted an extensive study and an airflow assessment, QTS took four important steps to maximize airflow and minimize the loss of cool air in its data center. QTS:

  1. Ensured that open vented tiles were not obstructed by unused equipment, plastic bins, and cables.
  2. Closed vented tiles that were not in use.
  3. Installed self-sealing grommets with brush material in the raised floor to seal holes for cables and wiring.
  4. Used vinyl covers to seal gaps around entrance doors, pipes and windows to keep cold air from escaping, or hot air from entering, the raised floor.

The results speak for themselves. After implementing these measures, QTS’s PUE (power usage effectiveness) dropped by 0.11, resulting in a savings of approximately $60,000 over a two-month period.
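
Because PUE is total facility power divided by IT power, a PUE drop of 0.11 eliminates 0.11 kW of overhead for every kW of IT load. The sketch below shows the arithmetic; the IT load and electricity rate are illustrative assumptions chosen to land near the reported savings, not figures disclosed by QTS.

    # PUE = total facility power / IT power, so a PUE drop of 0.11
    # removes 0.11 kW of overhead per kW of IT load. IT load and
    # electricity rate are illustrative assumptions, not QTS figures.
    it_load_kw = 6000.0    # assumed IT load
    pue_drop = 0.11        # reported PUE improvement
    rate_per_kwh = 0.06    # assumed $/kWh
    hours = 2 * 730        # roughly two months

    saved_kwh = it_load_kw * pue_drop * hours
    saved_dollars = saved_kwh * rate_per_kwh

    print(f"Energy saved: {saved_kwh:,.0f} kWh")
    print(f"Cost saved:   ${saved_dollars:,.0f} over two months")

With these assumed inputs the sketch yields roughly $58,000 over two months, close to the approximately $60,000 QTS reported.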

These measures accounted for approximately 20 percent of the company’s energy efficiency savings. Additional power management efforts included economizers, efficient lighting, and work on chiller temperatures.

The multiple initiatives undertaken by QTS earned the Atlanta Metro facility a Leadership in Energy and Environmental Design (LEED) Gold certification.

Considerations

  • Due to the complexity of airflow in data centers, the proper use of airflow management devices such as diffusers, blanking panels, structured cabling, floor grommets, and vented tiles is not always obvious. A professional airflow assessment can help identify ways to improve cooling efficiency in data centers by better deploying these devices. A professional assessment should include examination of:
    • The amount of airflow in relation to the hottest server racks,
    • The size of under-floor supply plenums,
    • Pressure in under-floor supply plenums,
    • The size of return plenums or ceiling height, and
    • Location and size of vented tiles (e.g., not located in the hot aisle).

Useful Links

22 http://upsitetechnologies.com/images/stories/pdf/whitepapers/reducing bypass airflow is essential for eliminating hotspots.pdf
23 http://www.ptsdcs.com/whitepapers/39.pdf
24 Data Center Energy Efficiency Best-Practices: Insights Into The ROI On Best-Practices, December 3, 2008, 42U.
25 The vertical space taken up by a server is measured in Rack Units (RU or "U"). A "U" is equivalent to 1.75 inches (4.45 cm).
26 http://www.ptsdcs.com/whitepapers/39.pdf
27 http://www.ptsdcs.com/whitepapers/39.pdf