Raise the Temperature

You Should Not Feel Cold in Your Data Center

Fifteen to twenty years ago it wasn’t unusual to put on a jacket or sweater upon entering a data center. In fact, some data center operators installed coat hangers and kept spare jackets by the entrance for exactly that purpose. At the time, data centers set thermostats as low as 55°F. Today, if you walk into your data center and find it noticeably colder than the rest of the building, chances are good that you can save energy by raising the temperature.

At what temperature should a data center be operated? In 2008, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) expanded its recommended temperature range from 68°F–77°F (the 2004 guideline) to 64.4°F–80.6°F, measured at the inlet (or front) of the server. Some data centers can be run at even higher temperatures.

Server Inlet Temps

Figure 1: Recommended server inlet temperatures

Although any data center will save energy by raising temperatures, data centers with air-side or water-side economizers and/or variable speed fan drives in their cooling units stand to benefit the most. That's because higher set point temperatures increase the amount of time that: 1) outdoor air can be used for cooling, and 2) cooling units can run at reduced fan speeds.

Modern Equipment Can Handle Higher Temperatures

Most recent data center equipment is rated for a maximum inlet temperature of 95°F. For example, the Dell PowerEdge R7301 and the HP ProLiant DL580 Gen8 servers2 both have upper temperature limits of 95°F. Some servers have limits as high as 113°F or more,3 making them well suited to data centers that rely on outdoor air for cooling year round.

Savings and Costs

Data centers can save 4% to 5% in energy costs for every 1°F increase in server inlet temperature.4  As a result, raising server inlet temperatures has been a key component of many data center energy efficiency efforts:

  • BNY Mellon, an ENERGY STAR certified data center, raised the temperature of their “supply air” (air exiting the cooling unit) from 72°F to 78°F, which allowed them to increase their chilled water temperature from 44°F to 47°F, lowering cooling costs. Their story is featured in an ENERGY STAR case study (PDF, 600 KB).
  • RagingWire, an ENERGY STAR certified data center, raised their chilled water temperature from 50°F to 60°F after installing hot aisle containment. An extensive wireless sensor network was installed to monitor and maintain appropriate server inlet temperatures. The project paid for itself in a year, and their story is featured in an ENERGY STAR case study (PDF, 1 MB).
  • The Green Grid completed a comprehensive analysis (PDF, 2 KB) detailing the return on investment (ROI) and power usage effectiveness (PUE) associated with several data center efficiency upgrades. The payback for adding airflow management devices such as baffles and blanking panels, repositioning temperature/humidity sensors, and adjusting temperature set points was just 29 months.
  • Google’s Green Data Centers: Network POP Case Study (PDF, 4 KB) documents specific energy efficiency measures taken in a server room.  Measures included temperature and humidity adjustments, optimized air vent tiles, cold aisle containment, and CRAC air return extensions.  The initiative had a return on investment (ROI) of less than one year.

Tips and Considerations

  • Although adjusting inlet temperature is relatively simple, metering temperature, humidity and power changes across the data center can be complicated. Contact a firm specializing in data center monitoring for assistance.
  • Raising inlet temperature may lead to uncomfortable working conditions. For example, at the higher end of ASHRAE's operating range, the cold aisle (server inlet) temperature might be 80.6°F, but the neighboring hot aisle temperature could be as high as 105°F to 110°F.
  • The highest recommended server inlet temperature setting may not be the most efficient. A study by Dell5 found that higher server inlet temperatures can cause internal server fans to automatically speed up, increasing overall energy use. The study recommended server inlet temperatures between the upper 70s and lower 80s (°F) to avoid this effect.
  • Raising temperatures may lead to data center “hot spots,” or areas receiving insufficient cool air. Hot spots can lead to equipment failure if maximum operating temperatures are exceeded. ASHRAE recommends measuring and recording server inlet temperatures at the center of the top, middle and bottom of server racks, about 2 inches from the front of the equipment.6 See Figure 2. This helps ensure that you don’t miss hot spots on server racks.
    Where to Measure Server Inlet Temps

    Figure 2: Where to measure server inlet temperatures

  • One way to identify hot spots is through thermal imaging. All objects emit infrared radiation based on their temperature; the hotter an object is, the more it radiates. Thermal imaging makes it possible to view equipment temperature conditions that are invisible to the naked eye, allowing cooling and airflow problems in the data center to be assessed. In addition to identifying hot spots, thermal imaging can locate “cold spots,” where equipment is receiving too much cool air, wasting energy. (As a bonus, thermal imaging can also identify overloaded circuits and loose connections before they result in unplanned outages.) With knowledge gleaned from thermal imaging, data center managers can adjust airflow management devices (e.g., vented floor tiles, diffusers) and cooling equipment to reduce hot and cold spots—ultimately reducing the amount of time cooling units need to run. Figure 3 below shows a row of server racks with cold air (depicted in blue) coming up through perforated floor tiles. The temperature gradient visibly increases as air moves from the bottom to the top of the racks, becoming hotter (and transitioning from yellow, to orange, to red) along the way.
    Infrared Image

    Figure 3: Infrared image showing temperature gradients across server racks.  Photo credit: Thermal Imaging Services, LLC.
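The hot-spot check described above (comparing top, middle, and bottom inlet readings on each rack against ASHRAE's recommended upper limit) can be sketched in a few lines. The rack names, sensor values, and function name below are hypothetical examples, not a real monitoring system.

```python
# Minimal hot-spot check: flag any server inlet reading above the
# ASHRAE recommended upper limit of 80.6 F. Per ASHRAE guidance,
# readings are taken at the top, middle, and bottom of each rack.
# All rack names and temperatures here are hypothetical.

ASHRAE_MAX_INLET_F = 80.6

readings = {  # rack -> {position: inlet temperature in degrees F}
    "rack-01": {"top": 79.2, "middle": 76.5, "bottom": 73.1},
    "rack-02": {"top": 84.0, "middle": 78.8, "bottom": 74.0},
}

def find_hot_spots(readings, limit=ASHRAE_MAX_INLET_F):
    """Return (rack, position, temp) tuples for readings above the limit."""
    return [(rack, pos, temp)
            for rack, temps in readings.items()
            for pos, temp in temps.items()
            if temp > limit]

for rack, pos, temp in find_hot_spots(readings):
    print(f"Hot spot: {rack} {pos} inlet at {temp}°F")
```

In practice these readings would come from a wireless sensor network or DCIM system rather than a hard-coded dictionary, but the threshold logic is the same.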


1 https://www.dell.com/support/manuals/us/en/04/poweredge-r730xd/r730xd_ompublication/standard-operating-temperature?guid=guid-c5c1a8e6-c380-46ea-a788-604fd8778370&lang=en-us

2 https://support.hpe.com/hpsc/doc/public/display?docId=c04142791

3 Jon Fitch. Dell’s Next Generation Servers: Pushing the Limits of Data Center Cooling Cost Savings. February 2012. http://www.dell.com/downloads/global/products/pedge/data_center_cooling_fresh_air.pdf (PDF, 838 KB)

4 http://www.42u.com/cooling/data-center-temperature.htm

5 David Moss, Data Center Operating Temperature: The Sweet Spot.  June 2011. http://en.community.dell.com/cfs-file/__key/telligent-evolution-components-attachments/13-4491-00-00-20-10-90-79/Datacenter-Operating-Temperature.pdf?forcedownload=true

6 http://www.raritan.com/blog/2010/08/ashrae-colder-is-not-better-for-a-data-center-2