What do you need to know about designing HVAC systems in data centers?
Will data centers get larger? More efficient? Have different HVAC systems? Learn about the trends here.
Cooling data centers
- Cooling data centers has changed over the past several years, in part due to servers that can handle higher temperatures.
- Water use and plumbing issues must be considered when designing a data center’s cooling system.
Respondents:
- Bill Kosik, PE, CEM, BEMP, Senior Energy Engineer, DNV, Oak Park, Illinois
- Brian Rener, PE, LEED AP, Principal, Mission Critical Leader, SmithGroup, Chicago, Illinois
- Ameya Soparkar, Market Leader, Mission Critical, Affiliated Engineers Inc., Rockville, Maryland
- Robert Sty, PE, LEED AP, Vice President, HDR Inc., Phoenix, Arizona
What unique cooling systems have you specified for such projects? Describe a difficult climate in which you designed an HVAC system for a data center.
Ameya Soparkar: We designed a data center for a desert environment in the Middle East. It was a typical data center design that used a chiller plant, computer room air handler (CRAH) units, uninterruptible power supply (UPS), generators, had an adjoining office space, etc. However, the measures required to ensure continuous operation under the local environmental conditions were extraordinary. The system had to be designed for a higher ambient temperature of 120°F and condenser water operating temperatures of 95°F-105°F, and airborne sand and sandstorms that could clog the air intakes and cooling towers had to be factored into the design.
Robert Sty: Large-scale air handling units (AHU) with a fan matrix supplying directly into the data hall have gained popularity in recent years. This type of system works well with air-side economizers for energy savings. Geographic regions with mild temperatures, rather than extremes, may present challenges with rapid and dynamic changes in outside air temperature and humidity. The controls that select the AHU's mode of operation must be able to respond quickly and maintain the proper conditions, typically based on the standards set by ASHRAE Technical Committee 9.9 Equipment Thermal Guidelines for Data Processing Equipment.
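The mode-selection logic described above can be sketched in a few lines. This is a minimal, hypothetical illustration only; the threshold values below are invented for the example and are not ASHRAE TC 9.9 limits, which should be consulted for real designs.

```python
# Minimal sketch of AHU operating-mode selection for an air-side
# economizer serving a data hall. All thresholds are illustrative
# placeholders, not ASHRAE TC 9.9 values.

def select_mode(oa_temp_f: float, oa_dewpoint_f: float,
                supply_setpoint_f: float = 75.0) -> str:
    """Pick an operating mode based on outside air (OA) conditions."""
    if oa_dewpoint_f > 59.0:
        # Too humid for direct outside air; use mechanical cooling.
        return "mechanical"
    if oa_temp_f <= supply_setpoint_f - 2.0:
        # Cool, dry air: full air-side economizer operation.
        return "economizer"
    if oa_temp_f <= supply_setpoint_f + 10.0:
        # Marginal: evaporative assist can still reach the setpoint.
        return "evaporative_assist"
    return "mechanical"
```

In a mild climate, real controls would also add deadbands and minimum dwell times so the unit does not hunt between modes as outside conditions swing rapidly.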
What unusual or infrequently specified products or systems did you use to meet challenging cooling needs?
Bill Kosik: In an analysis and consulting project, the team used in-rack carbon dioxide (CO2) cooling for high-density computer loads. A project in Sweden analyzed using deep snow and stockpiled snow as a cooling source. Another project in Amsterdam used a canal in a central city location for condenser water and free cooling; we analyzed the monthly fluctuations in the canal water temperature to see what the energy reduction would be. We have also tried liquid cooling at the component level (processor, memory, storage). Depending on the climate, we can cool the main IT equipment without using compressorized cooling equipment.
Brian Rener: A thermosyphon uses refrigerant and condenser fans to provide heat rejection for liquid-cooled high-performance computing or water-cooled chillers. Its efficiency is comparable to that of cooling towers when outdoor temperatures are at about 50°F or less, but it uses no water. Operating with water temperatures greater than 95°F can raise the optimum operating range, but the system is geared toward cooler climates. The potential water savings are significant.
Ameya Soparkar: We had to specify sand trap louvers to reduce the amount of sand pulled into the air intakes, which would otherwise clog the filters of the air handling units. We specified a sump sweeping system with jet spray nozzles in the cooling tower sump that pushed away particulate matter, which was then captured by the filtration part of the system, reducing the amount of sand that got into the condenser piping. Lastly, the condenser piping, which was underground, was designed with no 90-degree bends — all the bends were 45 degrees — to reduce the chance of the pipes clogging and to aid cleaning if they did clog.
Robert Sty: The Phoenix area is known for summer air temperatures above 115°F and, as if that were not enough, designers should consider other heat sources such as radiant temperatures at the roof level and generator exhaust. These additional sources can seriously degrade the performance of standardized heat rejection equipment located on the roof. There are equipment strategies that can handle these higher temperatures, but the designer should also look at alternative locations for air-cooled chillers and generators to see if altering the layout can help mitigate the risk. Tools such as computational fluid dynamics modeling help determine where equipment location will cause issues.
Describe a project in which the building used free cooling. Outline the building’s location, size, efficiency and other HVAC needs.
Brian Rener: The University of Utah Downtown Data Center in the high desert of Salt Lake City uses a combination of outside-air free cooling economizers, direct evaporative cooling and fluid coolers for an up to 10 MW, predominantly air-cooled system (with a limited amount of liquid-cooled high-performance computing). It is an Uptime Institute Tier III system with a power usage effectiveness (PUE) of 1.2-1.3. Air-cooled chillers provide backup cooling but under normal conditions operate less than 5% of the year.
Ameya Soparkar: We designed a 10 MW data center for a client in Korea with direct outside air economization and supply air at 75°F and 40% relative humidity. The mechanical system comprised building-integrated air handling units, fan arrays, cooling coils installed in the wall and water sprays for adiabatic cooling. In that part of the world, yellow dust, microscopic dust blown in from the Gobi Desert, is a concern, so the infrastructure was designed to switch to mechanical cooling mode when the contaminant level becomes too high; hence, a separate air-cooled chiller plant was installed. The design included a controls system that automatically transitions from economization to mechanical cooling mode. A PUE of 1.25 was achieved.
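The PUE figures cited above follow directly from the standard definition: total facility energy divided by IT equipment energy. A minimal sketch, with invented illustrative numbers:

```python
# Power usage effectiveness (PUE) = total facility energy / IT energy.
# The example quantities below are hypothetical, not project data.

def pue(it_kwh: float, cooling_kwh: float, power_loss_kwh: float,
        other_kwh: float = 0.0) -> float:
    """Compute PUE from the major facility energy components."""
    total_kwh = it_kwh + cooling_kwh + power_loss_kwh + other_kwh
    return total_kwh / it_kwh

# 10,000 kWh of IT load with 1,500 kWh of cooling energy and
# 1,000 kWh of electrical distribution losses:
print(round(pue(10_000, 1_500, 1_000), 2))  # → 1.25
```

A PUE of 1.25 therefore means the facility draws 25% more energy than the IT equipment alone, most of it for cooling, which is why economizer hours matter so much.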
Robert Sty: Our mission critical design team has designed several projects across the globe using both air-side and water-side free cooling strategies. For air-side economization, the data center should use aisle containment strategies to support raised supply air temperatures, which increase the number of hours of free cooling on both the air side (supply air temperature, TSAT) and the water side (chilled water supply temperature, CHWST). Air-side economization can present unique challenges in how the air intake and exhaust/relief strategies align with the building architecture. Based on acoustic studies, sound attenuation may be required, which will drive up the static pressure requirements at the AHU and lead to increased fan horsepower. One often overlooked issue is the pressure differential between the data halls and adjacent areas such as equipment rooms or corridors. It's important to make sure this is addressed through vestibules or other mitigation efforts.
How have you worked with HVAC system or equipment design to increase a building’s energy efficiency?
Bill Kosik: Engineers need to design HVAC systems that mirror the build-out of the power infrastructure. IT equipment will typically grow in increments and the UPS systems will match the growth of the IT systems. A similar efficiency strategy applies to the HVAC systems: building them up at the same pace as the IT and electrical systems allows for better load matching and operation at the peak efficiency point.
Robert Sty: The HVAC system is designed primarily around two parameters:
- The data hall environmental requirements.
- The external climatic conditions that influence the ability to reject the heat to the atmosphere.
There may be other limitations, such as the ability to use water in heat rejection strategies or cost and schedule, but once we overlay all the constraints, the design can take shape. As important as it is for the systems to operate during hot and cold extremes, the engineer should look at the yearly bin temperature data profile to understand where the system will operate and under what percentage of loading the facility will be. These elements will help guide equipment selection and control strategies to maximize efficiency.
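The bin-temperature analysis described above can be sketched simply: bucket a year of hourly dry-bulb readings into temperature bins to see where the plant will actually spend its operating hours. The sample readings below are made up for illustration; a real study would use a TMY weather file for the site.

```python
# Sketch of a bin-temperature analysis: count hours per 5°F dry-bulb
# bin from hourly readings. Sample data are hypothetical.
from collections import Counter

def bin_hours(hourly_temps_f, bin_width=5):
    """Return hours per bin, keyed by each bin's lower edge (°F)."""
    bins = Counter()
    for t in hourly_temps_f:
        lower = int(t // bin_width) * bin_width
        bins[lower] += 1
    return dict(sorted(bins.items()))

sample = [58, 61, 63, 67, 72, 74, 88, 91, 104]  # hypothetical hours
print(bin_hours(sample))  # → {55: 1, 60: 2, 65: 1, 70: 2, 85: 1, 90: 1, 100: 1}
```

Pairing each bin with the expected facility load fraction in that bin shows where part-load efficiency matters most, which is exactly what guides the equipment selection and control strategies mentioned above.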
What best practices should be followed to ensure an efficient HVAC system is designed for this kind of building?
Robert Sty: There are several organizations that put forth best practices and design guidelines for data center design. ASHRAE, the Uptime Institute and the Telecommunications Industry Association are just a few. The HVAC engineer must balance energy and water use with uptime and availability. Considerations for maintenance and the principles of safety in design are also important. Knowledge sharing can be difficult in a market sector driven by secrecy, however, collaboration in the architecture, engineering and construction industry helps raise the collective knowledge base. My advice to any engineer interested in data center design is to proactively reach out and find a mentor to help teach you the best way to apply the various industry standards.
What type of specialty piping, plumbing or other systems have you specified recently?
Robert Sty: As co-location providers respond to their clients' requests for flexibility, we are deploying multiple chilled water loops with varying temperature bands in the data center. Traditional chilled water systems with supply/return ranges of 44°F-56°F serve offices, back-of-house cooling and specific data hall environments. Other loops serving data halls with much warmer environmental requirements reach into the 50°F-60°F chilled water supply temperature band, allowing for increased use of water-side economizers. Any direct liquid cooled applications have specialty piping and connections from the distribution system to the cabinet. If piping systems from distribution to final cabinet connection use dissimilar metals, then dielectric unions should be installed to prevent corrosion.
What are some of the challenges or issues when designing for water use in such facilities?
Bill Kosik: Water has always been an issue, not just in areas with water scarcity but anywhere. We need to get water to the site and back to the treatment facility. This is a careful balance between energy consumption (which consumes water at the generation facility) and direct water use, and all of the treatment and pumping of water requires electricity. While it is difficult to precisely compare the total environmental impact of water-cooled and air-cooled HVAC systems for data centers, decisions about water and overall environmental impact cannot be made simply on the basis of climate and energy consumption. Additionally, different types of electricity generation plants use different techniques, which also determines water consumption.
Robert Sty: Municipalities in the Western United States are pushing back on water-based heat rejection, specifically cooling towers and direct/indirect evaporative cooling, due to historic drought conditions. First, the owner and engineer should engage the local municipality to understand if there are local restrictions or concerns about water use. If water is to be used for heat rejection, develop models that show how much water use is anticipated and for what percentage of the year it would be used. Depending on the system, it may serve only trim cooling applications and be relatively minor. Another suggestion is to review how the power for the specific site is generated — fossil fuel, nuclear or renewable. If the utility's power generation is a water-intensive process, is it better to use some water at the source to reduce the overall system impact? Our industry needs to better understand the entire energy-water nexus and how to solve the problem in a holistic manner.