The limitations of “room-based” cooling solutions
All of the electrical power delivered to the IT loads in a data center ends up as waste heat that must be removed to prevent sensitive electronic equipment from overheating. An example from Industrial Light & Magic, a special effects studio, puts this task in perspective. A single rack in their data center containing 84 “blade” servers had an IT load of 28 kW, and generated the same amount of heat as four Weber Spirit barbecue grills!1
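The conversion behind figures like this is direct: essentially every kilowatt of IT load becomes a kilowatt of heat, commonly expressed in BTU/hr for cooling-capacity planning. A minimal sketch (the 28 kW figure is from the example above; everything else is standard unit conversion):

```python
# Essentially all electrical power drawn by IT equipment ends up as heat,
# so cooling capacity must match the IT load.
BTU_PER_KWH = 3412.14  # 1 kWh of electrical energy = ~3,412 BTU of heat

def heat_output_btu_per_hr(it_load_kw: float) -> float:
    """Heat output (BTU/hr) that cooling must remove for a given IT load."""
    return it_load_kw * BTU_PER_KWH

rack_load_kw = 28  # the blade-server rack from the ILM example
print(f"{heat_output_btu_per_hr(rack_load_kw):,.0f} BTU/hr")  # 95,540 BTU/hr
```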
Computer room air conditioning (CRAC) units push chilled air into a data center and around the equipment – a setup known as “room-based cooling.” In most cases, cooling the vast volumes of air in an entire room is very inefficient. Raised floors and hot and cold aisle containment reduce the volume of air to be moved, but still result in a lot of excess chilling, since cold air is pushed into empty space rather than directly onto hot equipment.
In some data centers, traditional room-based cooling systems are simply reaching the limits of their capabilities. Certain types of servers (blade servers, for instance) pack a lot of power into a small space. That means that they also exhaust a lot of heat into a small space. As a result, some room-based cooling systems – which are designed for “power densities”2 on the order of 1-5 kW per rack – simply cannot keep up with the heat output, and hot spots can develop.
Close-coupled cooling solutions
To address this problem, air conditioning systems can be brought closer to servers by placing them within rows of server racks or by integrating them with individual racks. Such systems are generally referred to as “close-coupled” cooling devices. These devices can be used in addition to, or instead of, standard room-based cooling systems.
With a row-oriented cooling architecture, each CRAC unit is dedicated to cooling one row of server racks. The CRAC units may be mounted among the racks (see Figure 1), overhead, or under the floor. Compared with room-based cooling, airflow paths are shorter and more clearly defined. This reduces the CRAC fan power required, increasing efficiency.
A row-oriented design also allows cooling capacity and redundancy3 to be targeted to the actual needs of specific rows. For example, a row-oriented architecture allows one row of racks to run high-density applications such as blade servers, while another row satisfies lower-density applications such as communication enclosures. Similarly, mission-critical servers can be installed in one row that features redundant cooling, while less important workloads reside on servers in a row without redundant cooling.
A row-oriented architecture can be implemented without a raised floor. This increases the floor load bearing capacity, reduces installation costs, eliminates the need for access ramps, and allows data centers to exist in buildings that otherwise do not have the headroom to permit the installation of a raised floor.
In rack-oriented cooling architectures, each CRAC unit is dedicated to a single server rack and is mounted directly to (or within) it (see Figure 2). Compared with room- or row-oriented architectures, rack-oriented airflow paths are even shorter and more precisely defined, so the highest power density (up to 60 kW per rack) can be achieved. As with row cooling, the shorter airflow path reduces the CRAC fan power required, increasing efficiency. And rack-oriented cooling allows cooling capacity and redundancy to be targeted to the actual needs of specific racks of servers.
Mixed cooling architectures
Nothing prevents the room, row, and rack cooling architectures from being used together in the same facility. In fact, there are many cases where mixed use is beneficial. Specifically, a data center operating with a broad spectrum of power densities could benefit from a mix of all three types.
Savings and Costs
Generally speaking, in-row and in-rack cooling systems have higher capital costs than room-based cooling systems, but lower operating costs – particularly at higher power densities, where room-based cooling solutions have trouble keeping up with higher, concentrated heat loads. Unnecessary airflow is avoided, which can save more than 50% of fan power consumption compared with room-based cooling.4
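The operating-cost arithmetic behind that savings claim is straightforward: cooling fans run continuously, so annual cost is power draw times hours times the electricity rate. A minimal sketch – the 40 kW fan load and $0.10/kWh rate below are illustrative assumptions, not values from the cited paper:

```python
HOURS_PER_YEAR = 8760  # cooling fans run around the clock

def annual_cooling_cost(fan_power_kw: float, rate_per_kwh: float) -> float:
    """Annual electricity cost of running cooling fans continuously."""
    return fan_power_kw * HOURS_PER_YEAR * rate_per_kwh

# Hypothetical illustration: if close-coupled cooling cuts fan power from
# 40 kW to 20 kW (the >50% savings cited above) at $0.10/kWh:
room_based = annual_cooling_cost(40, 0.10)  # ~$35,040/yr
row_based = annual_cooling_cost(20, 0.10)   # ~$17,520/yr
print(f"Annual fan-power savings: ${room_based - row_based:,.0f}")
```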
Figure 3 shows a simulation of “first costs” (capital costs for cooling units, piping, chillers, installation, and containment) associated with room-based, row-based, and rack-based cooling systems. The simulation assumes 480 kW of total IT load and looks at first costs associated with different rack densities (3 kW to 20 kW). First costs for all systems decline as rack density increases thanks to a shrinking data center footprint – e.g., fewer racks, less raised floor, and less piping are needed. First costs for row-based cooling are slightly higher than for room-based, as more cooling units and more piping are needed. Rack-based first costs drop dramatically as server density increases (e.g., as racks fill up with servers), but they remain higher than those of row- and room-based cooling.
Contrary to conventional thinking, it actually costs less to operate high-density IT environments than low-density environments. Figure 4 offers an illustration.
Figure 4 shows annual cooling costs (electricity costs) for the same simulation depicted in Figure 3. Note that row-based cooling systems use less electricity than room-based systems across all server rack densities because the cooling units are located closer to the IT loads and unnecessary airflow is avoided. Rack-based systems begin to use dramatically less electricity than room-based systems as rack density goes beyond 6 kW per rack because servers can be added to existing racks, with little additional cooling needed. However, as server rack density increases from 12 to 20 kW, rack- and row-based solutions begin to require more airflow and water flow, and annual costs increase. Nevertheless, rack- and row-based cooling costs remain lower than room-based cooling costs.
Figure 5 illustrates the results of a demonstration project conducted by Lawrence Berkeley National Laboratory (LBNL) in partnership with the Silicon Valley Leadership Group. Like Figure 4, it shows that rack- and row-mounted, close-coupled cooling devices are more efficient than computer room cooling devices and have lower operating costs.5
Tips and Considerations
One advantage of rack and row-based cooling is that the failure of a single cooling unit only affects one rack or row, as opposed to the entire data center.
On the other hand, since rack-based cooling capacity cannot be shared with other racks, redundancy can be an issue with rack-based systems, and, to a lesser extent, row-based systems. N+1 redundancy is commonly used in room-based cooling architectures. (For example, a room-based cooling system consisting of 3 CRAC units would need one extra CRAC unit on hand in the event that one unit failed.) With row-based cooling, a redundant (extra) cooling unit would be needed for each row. For rack-based cooling, a redundant (extra) cooling unit would be needed for each rack.6
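The footprint of that redundancy difference is easy to count. A minimal sketch of the N+1 logic described above – the 4-row, 40-rack data center is a hypothetical example:

```python
def redundant_units(architecture: str, rows: int = 1, racks: int = 1) -> int:
    """Extra CRAC units needed for N+1 redundancy under each architecture."""
    if architecture == "room":
        return 1      # one spare unit covers the whole room
    if architecture == "row":
        return rows   # cooling capacity isn't shared across rows: one spare per row
    if architecture == "rack":
        return racks  # cooling capacity isn't shared across racks: one spare per rack
    raise ValueError(f"unknown architecture: {architecture}")

# Hypothetical data center: 4 rows of 10 racks each
print(redundant_units("room"))            # 1
print(redundant_units("row", rows=4))     # 4
print(redundant_units("rack", racks=40))  # 40
```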
For a more detailed comparison of cooling architectures, including pros and cons, see APC’s white paper, Choosing between Room, Row, and Rack-based Cooling for Data Centers (PDF, 629 KB). Information about relative costs and savings is also provided.
1 Mitchell, Robert. Data Center Density Hits the Wall. Computerworld. June 7, 2010. Available online at: http://www.computerworld.com/s/article/349433/Data_Center_Density_Hits_the_Wall
2 “Power density” refers to the power consumption of an entire server rack.
3 “Redundancy” is the duplication of critical components of a system with the intention of increasing the reliability of the system, usually in the form of a backup or fail-safe.
4 Dunlap, Kevin; Rasmussen, Neil. Choosing Between Room, Row, and Rack-based Cooling for Data Centers, APC White Paper. 2012. Available online at: http://www.dcsawards.com/files/VAVR-6J5VYJ_R2_EN.pdf (PDF, 629 KB)
5 Bell, Geoffrey. Improving Data Center Efficiency with Rack or Row Cooling Devices. Federal Energy Management Program. March 2012. Available online at: http://energy.gov/sites/prod/files/2013/10/f3/dc_chilloff2.pdf (PDF, 816 KB)
6 Dunlap, Kevin; Rasmussen, Neil. Choosing Between Room, Row, and Rack-based Cooling for Data Centers, APC White Paper. 2012. Available online at: http://www.dcsawards.com/files/VAVR-6J5VYJ_R2_EN.pdf (PDF, 629 KB)