Make Humidification Adjustments

Until very recently, most data center managers tightly controlled humidity, keeping it between 45% and 50% relative humidity (RH). The concern was that low humidity could lead to electrostatic discharge (ESD) failures, and that high humidity could cause water droplets to condense inside equipment. In addition to causing electrical shorts that trip circuit breakers or damage electrical circuits, high humidity and condensation could also result in rust and corrosion, leading to component failure.

A typical computer room air conditioner (CRAC) can heat, cool, humidify, and dehumidify, and a data center with several CRAC units can consume a great deal of energy maintaining humidity within a narrow tolerance range. Consider the following scenario involving two CRAC units, described as “extremely common” in early papers on the topic:1

  • When the humidity gets too high, CRAC unit #1 sub-cools the air to remove moisture. However, this also lowers the temperature, so CRAC #1 then reheats the air (typically with an energy-intensive electric resistance heater) to maintain a constant set-point temperature.
  • CRAC #2, located nearby, now reads the humidity as too low and adds moisture using energy-intensive infrared or “steam canister” humidifiers.2 However, adding steam also raises the temperature, so CRAC #2 must then cool the air to bring the temperature back down.
  • CRAC #1 subsequently reads the humidity as too high, and the cycle starts over again.

This scenario is known as “CRAC fighting,” and it can waste an enormous amount of energy.
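
To see why tight, unsynchronized setpoints produce this oscillation, consider a toy simulation. This is a minimal sketch with made-up numbers (the deadbands, %RH steps, and kWh figures are illustrative assumptions, not measurements from the papers cited here):

```python
# Toy model of CRAC fighting: two units share one air volume but hold
# slightly offset relative-humidity (RH) deadbands. All numbers are
# illustrative assumptions.

rh = 49.0          # room relative humidity, %
DRIFT = 1.0        # assumed moisture gain per hour from infiltration, %RH
wasted_kwh = 0.0

for hour in range(6):
    rh += DRIFT
    if rh > 50.0:             # CRAC #1: sub-cool to condense moisture,
        rh -= 4.0             # then electric reheat to hold the temperature
        wasted_kwh += 30 + 20
    if rh < 48.0:             # CRAC #2: steam humidifier raises RH,
        rh += 3.0             # then extra cooling offsets the steam's heat
        wasted_kwh += 25 + 10
    print(f"hour {hour}: RH {rh:.0f}%, humidity-control energy {wasted_kwh:.0f} kWh")
```

After the first hour, both branches fire every hour: each unit undoes the other’s work while the room’s actual moisture content barely changes. Raising CRAC #1’s dehumidification threshold toward the relaxed upper limit discussed below (60% RH) breaks the cycle entirely.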

Relax humidity tolerances and measure with dew point, not RH

Improved humidity tolerance in IT equipment has led ASHRAE to relax its recommended humidity ranges.3 ASHRAE’s current recommendations for data centers are:

  • Lower limit: 42°F dew point
  • Upper limit: 59°F dew point and 60% relative humidity (regardless of temperature)

An important part of adopting this ASHRAE range is to measure “dew point,” an absolute measure of humidity, rather than “relative humidity,” which varies with temperature and can therefore fluctuate widely across the temperature gradients in a data center. (See Figure 1.) With broader allowable humidity ranges, CRAC units will spend less time and energy adjusting humidity in your data center.

Figure 1: Typical server inlet and outlet temperatures, relative humidity (RH), and dew point. Note that RH varies with temperature, but dew point remains constant. Image credit: APC.
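
To make the dew point versus relative humidity distinction concrete, the short sketch below converts a temperature/RH reading to dew point and checks it against the ASHRAE envelope above. The Magnus approximation constants and the sample inlet/outlet readings are assumptions for illustration; the guidance above does not prescribe a particular formula.

```python
import math

# Magnus approximation over water; these constants are a common choice,
# reasonable for roughly 0-60 degrees C.
B, C = 17.62, 243.12

def dew_point_c(temp_c, rh_pct):
    """Dew point (C) from dry-bulb temperature (C) and relative humidity (%)."""
    gamma = math.log(rh_pct / 100.0) + B * temp_c / (C + temp_c)
    return C * gamma / (B - gamma)

def f_to_c(f): return (f - 32.0) * 5.0 / 9.0
def c_to_f(c): return c * 9.0 / 5.0 + 32.0

# Hypothetical readings in the spirit of Figure 1: cool, moist air at the
# server inlet; hot, "dry" air at the outlet. It is the same air either way.
for label, t_f, rh in (("inlet", 68.0, 50.0), ("outlet", 95.0, 21.0)):
    dp_f = c_to_f(dew_point_c(f_to_c(t_f), rh))
    ok = 42.0 <= dp_f <= 59.0 and rh <= 60.0  # ASHRAE envelope listed above
    print(f"{label}: {t_f:.0f}F at {rh:.0f}% RH -> dew point {dp_f:.1f}F, in range: {ok}")
```

Both readings report a dew point near 49°F: RH drops sharply as the air heats up across the servers, but the absolute moisture content, and therefore the dew point, stays essentially constant. A controller keyed to dew point would correctly do nothing here, while one keyed to RH might start fighting.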

Turn off CRAC unit humidifying and dehumidifying capabilities

Data center operators can avoid CRAC fighting by simply turning off the humidifying/dehumidifying functions in some of their CRAC units. For example, leaving humidity adjustments to just two CRAC units located far from each other minimizes CRAC fighting. In a small data center, a single CRAC unit can serve as a central humidification unit.

In addition, the reheat function should be disabled on all CRAC units. Reheat, which raises the temperature of sub-cooled air that has been dehumidified, is unnecessary given the high cooling load5 of a data center. In fact, California’s Title 24 does not allow the use of reheat in newly constructed data centers.

Consider eliminating humidity control entirely

Many data centers eliminate humidification with no adverse effects. In fact, a growing body of research suggests that humidity controls may not be necessary in a data center at all. Regarding the upper limit on humidity: IT equipment runs significantly hotter than the dew point maintained by the cooling coils, so condensation from high humidity is not expected. Regarding the lower limit: humidification appears to be unnecessary if you follow best practices for electrostatic discharge (ESD), namely using IT equipment that is rated and tested for ESD conformance with IEC 61000-4-2.6 Where personnel handle electronic circuit boards and components, use personal grounding procedures. (See Tips and Considerations, below.)

Additional evidence in support of eliminating humidity control:

  • An ASHRAE Journal article found that high humidity is rarely an issue in most data centers and concluded that "it is difficult to make a case for actively controlling humidity in data centers."7
  • A 2014 ASHRAE study concluded that if industry-standard data center facility design and operating practices are followed, static discharge has a very low probability of harming IT hardware, even at 8% RH, the lowest level tested. As a result, data centers in cool climates (particularly those that use free cooling) should no longer need to humidify at all for much of the year. Many data center experts expect this study to lead ASHRAE to revise the lower limits of acceptable humidity in its next official thermal guidelines for data centers.8
  • In 2008, Intel reported a 10-month study assessing the effectiveness of using only outside air to cool a data center. Temperatures ranged from 64°F to 92°F; humidity varied from 4% to over 90% and at times changed rapidly. No increase in server failure was observed.9
  • Most IT equipment is rated for operation up to 80% RH.

Nevertheless, a 2016 study concluded that managers of data centers with air-side economizers should keep an eye on humidity levels. Researchers collected data from nine Microsoft data centers worldwide that host over 1 million disk drives. Six were located in hot, humid areas, while three were in cool, dry conditions. The dry data centers had the lowest failure rates, regardless of which cooling technology they used. Data centers in hot, humid areas with the highest internal relative humidity readings had failure rates 107% to 260% higher than the facility with the lowest failure rate (a chiller-cooled data center in a cool, dry area). Internal relative humidity seemed to have the biggest impact on disk lifetime, with average temperature exerting a lower (but still significant) impact, the researchers note.10

If you must humidify, use energy-efficient technologies11

In the past, energy-intensive technologies were used to humidify server rooms by essentially generating steam:

  • Steam canisters introduce liquid water into a canister containing electrodes. When the electrodes are powered, the water boils and steam (water vapor) is produced.
  • Infrared humidifiers suspend quartz lamps over an open pool of water. Infrared energy striking the water surface releases water vapor.
  • Other technologies include direct steam injection, heated pan humidifiers, and gas-fired humidifiers.

Adiabatic (no heat used) humidification technologies use much less energy and, unlike steam-based humidifiers, do not add heat to data center air. In fact, adiabatic systems help cool the data center as water evaporates into the air (a rough energy comparison follows the list below):

  • Nozzle humidification systems consist of numerous nozzles that supply a fine mist of water to the airstream. High-pressure water is delivered by pumps or compressed air. Water is typically purified before being supplied to these systems.
  • Ultrasonic humidifiers use a series of piezoelectric transducers to create cavitation within a basin of water, producing a fine mist that evaporates into the air. Water used in these systems typically needs to be purified through both reverse-osmosis filtration and deionization.
  • Wetted media assemblies, often referred to as direct evaporative coolers/humidifiers, work by spraying or dripping purified water over layered polyester fabric. The water wicks down the fabric, saturating it along the way. Excess water is recirculated to the top of the fabric via a pumping system, where the cycle repeats.
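
The energy advantage follows directly from where the heat of vaporization comes from. Below is a back-of-the-envelope comparison; the 10 kg/h humidification rate and the 0.5 kW ultrasonic draw are assumptions chosen for illustration (broadly consistent with the eBay figures cited in the next section):

```python
# Comparing steam vs. adiabatic humidification at an assumed 10 kg/h.
LATENT_KJ_PER_KG = 2450.0    # heat of vaporization near room temperature
SENSIBLE_KJ_PER_KG = 335.0   # heating 20 C water to boiling (~4.19 kJ/kg-K * 80 K)
rate_kg_h = 10.0

# Steam canister: electricity supplies sensible + latent heat, and all of
# that heat is dumped into the data center air, adding to the cooling load.
steam_kw = rate_kg_h * (SENSIBLE_KJ_PER_KG + LATENT_KJ_PER_KG) / 3600.0

# Adiabatic (e.g., ultrasonic): a small electrical draw creates the mist;
# the latent heat is pulled FROM the air as the mist evaporates, cooling it.
ultrasonic_kw = 0.5          # assumed transducer draw
evap_cooling_kw = rate_kg_h * LATENT_KJ_PER_KG / 3600.0

print(f"steam: {steam_kw:.1f} kW in, all of it added to the room as heat")
print(f"ultrasonic: {ultrasonic_kw:.1f} kW in, plus {evap_cooling_kw:.1f} kW of evaporative cooling")
```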

Savings and Costs

A number of studies have examined the savings from adjusting humidification settings or from using more efficient humidification technologies:

  • An ENERGY STAR certified data center operated by BNY Mellon changed humidification set points from relative humidity (which varies with temperature) to dew point (an absolute humidity value). The change reduced humidifier run-time from 80% to 20% of operating hours. See the BNY Mellon case study (PDF, 600 KB) for more details.
  • Kaiser Permanente set up a sophisticated temperature and humidity monitoring network to optimize cooling efficiency in their ENERGY STAR certified data center. See the Kaiser Permanente case study (PDF, 79.2 KB) for more details.
  • Google’s Green Data Centers: Network POP Case Study (PDF, 3.6 MB) examines a small data center’s experience with vented tile optimization, temperature and humidity adjustments, cold aisle containment, and CRAC air return extensions.  Taken together, these measures showed a return on investment (ROI) in less than one year.  During this study, relative humidity controls were adjusted from a set level of 40% RH to a range of 20% to 80% RH. 
  • eBay completed a case study (PDF, 578 KB) documenting costs and savings for ultrasonic humidification at its Phoenix data center. The power draw of the original steam humidification units was 30 kW, compared to less than 1 kW for the ultrasonic units. ENERGY STAR and eBay estimated savings at $50K annually, for a simple payback of 1.9 years. A generous $70K rebate from Arizona Public Service, the local utility, cut the payback to 0.5 years (the arithmetic is reconstructed below).
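
For reference, the eBay figures above are mutually consistent; a few lines of arithmetic recover the implied project cost (the cost itself is inferred, not stated in the summary above):

```python
# Reconstructing the payback arithmetic from the figures quoted above.
annual_savings = 50_000.0            # dollars/year, as reported
reported_payback_years = 1.9         # simple payback before the rebate
implied_cost = reported_payback_years * annual_savings   # about $95K (inferred)
rebate = 70_000.0                    # Arizona Public Service utility rebate
net_payback_years = (implied_cost - rebate) / annual_savings

print(f"implied project cost: ${implied_cost:,.0f}")
print(f"payback after rebate: {net_payback_years:.1f} years")  # matches the reported 0.5
```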

Tips and Considerations

  • Adiabatic humidifiers often require pre-treated water. For example, ultrasonic humidifiers should only be used with deionized water.
  • ESD damage can be prevented in the data center with the following commonly-used devices:12
    • Antistatic wrist strap, also known as an ESD strap, consists of an elastic band of fabric with fine conductive fibers woven into it, attached to a wire with a clip on the end to connect it to a ground conductor.  If the wrist strap wire is too short to reach the dedicated connecting point on a device, connect it to an unpainted surface or to a ground wire mounted on the device or rack.
    • Antistatic bags are used to store or ship sensitive equipment.
    • ESD static mats (tested to meet the ANSI/ESD S20.20 standard) consist of an insulated material and two wires with clips on each end. Spread the mat on a flat surface and attach one of the clips to ground.
    • ESD flooring includes raised access floors made with a grounded high-pressure laminate, sealed concrete floors, and static dissipative floor coverings.  All of these floor types dissipate static electricity.  Carpets should be removed from the data center.
    • ESD shoes have metallic elements that allow static electricity to discharge when the shoes come in contact with ESD flooring. Data center operators should avoid footwear with polymer-based sole material, such as sneakers or deck shoes.

1 Rasmussen, Neil. Avoidable Mistakes that Compromise Cooling Performance in Data Centers and Network Rooms.  APC White Paper #49.  2003.  

2 Steam canister humidifiers use electrodes inserted into the water reservoir to pass current through the water, causing it to boil and to discharge pure steam.

3 Data Center Environments: ASHRAE’s Evolving Thermal Guidelines, by Robin Steinbrecher and Roger Schmidt, Ph.D., ASHRAE Journal, December 2011.

4 Evans, Tony. Humidification Strategies for Data Center and Network Rooms. APC White Paper #59. 2004.

5 Cooling load is the rate at which a cooling system must remove heat from a conditioned space to maintain temperature and humidity.

6 The IEC 61000-4-2 standard is commonly used to certify equipment such as mobile phones, computers and any sensitive electronic equipment.

7 Hydeman, Mark.  Humidity Controls for Data Centers: Are They Necessary? ASHRAE Journal, March 2010, p. 48. 

8 McFarlane, Robert. Data Center Humidity Recommendation Falls Dramatically. TechTarget. 2015.

9 Miller, Rich. Intel: Servers Do Fine with Outside Air. Data Center Knowledge. September 18, 2008.

10 Beware of Humidity in Free-Cooled Data Centers. Buildings. April 2016. 

11 Colby, Jeffrey.  The Why, What and How of Data Center Humidification.  Engineered Systems. February 1, 2016.

12 Guidelines and Best Practices for the Installation and Maintenance of Data Networking Equipment. Cisco Systems, Inc. January 2014.  

 