Analyzing data centers: fire/life safety and HVAC

Data is the lifeblood of any business or organization—which makes a data center a facility’s beating heart. Here, engineers with experience on data center projects show how to succeed on such facilities, and how to keep your finger on the pulse of data center trends regarding fire/life safety and HVAC.

By Consulting-Specifying Engineer April 27, 2017


  • Robert C. Eichelman, PE, LEED AP, ATD, DCEP, Technical Director, EYP Architecture and Engineering, Albany, N.Y.
  • Karl Fenstermaker, PE, Principal Engineer, Southland Engineering, Portland, Ore.
  • Bill Kosik, PE, CEM, LEED AP, BEMP, Senior Mechanical Engineer, exp, Chicago
  • Kenneth Kutsmeda, PE, LEED AP, Engineering Manager—Mission Critical, Jacobs, Philadelphia
  • Keith Lane, PE, RCDD, NTS, LC, LEED AP BD&C, President, Lane Coburn & Associates LLC, Bothell, Wash.
  • Brian Rener, PE, LEED AP, Senior Electrical Engineer, SmithGroupJJR, Chicago
  • Mark Suski, SET, CFPS, Associate Director, JENSEN HUGHES, Lincolnshire, Ill.
  • Saahil Tumber, PE, HBDP, LEED AP, Senior Associate, Environmental Systems Design, Chicago
  • John Yoon, PE, LEED AP, Lead Electrical Engineer, McGuire Engineers Inc., Chicago

CSE: Describe the cost and complexity of fire protection systems involved with data centers. Have they changed over the years?

Rener: In years past, data centers often used both gaseous and pre-action double-interlocked sprinkler systems, and even conventional smoke detectors under a raised floor. Environmental regulations and the cost of seal-tight room construction led to pre-action sprinklers paired with more accurate high-sensitivity, air-aspirated smoke-detection systems. These days, with high-density hot-aisle containment, we are seeing close coordination within the enclosures, with special high-temperature sprinkler heads and no raised floors. Newer suppression systems that use high-pressure nitrogen to micronize water, causing no damage to equipment, are being examined for use in data centers.

CSE: What unique HVAC requirements do data center building projects have that you wouldn’t encounter in other buildings?

Suski: There are many unique HVAC requirements that affect data centers. In fact, the design of the HVAC system is one of the most critical factors for any data center. The HVAC system in a data center not only provides comfort air for occupants, but it also (and more importantly) provides cooling air for the equipment. Damage to circuit boards and electronic equipment can begin at 80°F, so it is critical to data center operations that the HVAC system function properly. Understanding the different HVAC design approaches allows fire protection engineers to identify the appropriate fire protection strategy for a particular data center. HVAC system configurations range from flooded to targeted to contained, and each type of air-distribution configuration calls for a unique approach and protection strategy.

Fenstermaker: The proliferation of direct-evaporative cooling in large data centers, combined with expanded allowable temperature and humidity limits for IT-equipment intakes, means mechanical engineers need to pay close attention to all wall and system-component surfaces for the possibility of surface temperatures dropping below the room dew point, effectively creating rain inside the data center.
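The condensation risk Fenstermaker describes can be checked numerically. The sketch below, a minimal illustration rather than a design tool, uses the Magnus approximation to estimate the room dew point and flag any surface cold enough to condense moisture; the function names and example conditions are hypothetical.

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (deg C) from dry-bulb temperature and relative
    humidity, using the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients for water vapor over liquid
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def condensation_risk(surface_temp_c: float, room_temp_c: float,
                      rh_percent: float) -> bool:
    """True if the surface is at or below the room air's dew point."""
    return surface_temp_c <= dew_point_c(room_temp_c, rh_percent)

# Example: a 27 deg C data hall at 60% RH has a dew point near 18.6 deg C,
# so a 17 deg C wall surface would sweat while a 22 deg C one would not.
print(round(dew_point_c(27.0, 60.0), 1))
print(condensation_risk(17.0, 27.0, 60.0))
print(condensation_risk(22.0, 27.0, 60.0))
```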

Tumber: The project requirements and design attributes of a data center differ from those of other building types. The mission is to sustain IT equipment rather than humans. Data centers are graded on criteria such as availability, capacity, resiliency, flexibility, adaptability, time to market, scalability, cost per megawatt, and more. These criteria are unique to data centers, and designing a mechanical system that meets all of them can be challenging.

CSE: Have you specified distinctive HVAC systems on any data centers? What unusual or infrequently specified products or systems did you use to meet challenging HVAC needs?

Rener: We’ve worked with HVAC systems ranging from traditional computer room air handler/computer room air conditioning (CRAH/CRAC) units and fan wall air handlers to close-coupled pumped refrigerant systems. The selection and specification of a certain system are driven by the individual facility requirements as well as the geographical region in which the facility is located. The key is to do the upfront analysis and decision matrix to select a system that balances flexibility, maintainability, and performance.

Tumber: There are many products that have been designed specifically for the data center market. Each project is unique so there is no one-size-fits-all solution. The unique technologies I have used on projects include evaporative cooling (direct and indirect); outdoor DX units that use heat pipe, heat wheel, or plate-type heat exchanger for economization; liquid cooling; CRAC units; and chillers with pumped refrigerant technology. These technologies have been deployed significantly over the past couple of years and are no longer considered unusual in the data center industry.

CSE: Have you specified VRF systems, chilled beams, or other types of HVAC systems in a data center? If so, describe the challenges and solutions.

Yoon: We have specified large top-of-rack pumped refrigerant systems where the primary challenge was the sheer volume of refrigerant. These systems were large enough for the AHJ to require self-contained breathing apparatus gear and refrigerant gas-alarm/exhaust systems within the white space. While not necessarily any more or less hazardous than many other systems present within the data center, the system types and the potential presence of nontechnical personnel emphasized the need for proper training for anyone with access to the white space.

Tumber: I have used a VRF system and dedicated outdoor air system for the ancillary spaces of the data center, such as network operations centers, security operations centers, offices, and conference rooms. Technologies such as VRF, chilled beam, and radiant cooling are not intended for process-cooling requirements of data centers. They are designed for comfort-cooling applications and can be deployed in limited areas in a data center.

Rener: We have worked with close-coupled pumped refrigerant systems, and one of the challenges is the amount of refrigerant that is in the system as it relates to monitoring. The proper refrigerant-monitoring system and other measures in compliance with ASHRAE Standard 15: Safety Standard for Refrigerant Systems must be incorporated into the design to ensure the safety of the employees. The close-coupled systems are very energy-efficient and allow for at-the-rack cooling solutions without chilled water in the data hall.
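The ASHRAE Standard 15 compliance check Rener describes ultimately compares the worst-case refrigerant concentration—the full system charge released into the smallest occupied space—against the refrigerant concentration limit (RCL) tabulated in ASHRAE Standard 34. A minimal sketch of that arithmetic follows; the charge, room volume, and the R-410A limit shown are illustrative assumptions, so verify the actual RCL for the refrigerant in question.

```python
def refrigerant_concentration(charge_kg: float, room_volume_m3: float) -> float:
    """Concentration (kg/m3) if the entire charge leaks into the space."""
    return charge_kg / room_volume_m3

# Assumed limit for illustration only; look up the real RCL for the specific
# refrigerant in ASHRAE Standard 34 before making any design decision.
ASSUMED_RCL_KG_PER_M3 = 0.42  # approximate published value for R-410A

charge_kg = 250.0       # hypothetical system charge
room_volume_m3 = 500.0  # hypothetical occupied machinery-space volume

conc = refrigerant_concentration(charge_kg, room_volume_m3)
needs_mitigation = conc > ASSUMED_RCL_KG_PER_M3  # monitoring/exhaust required
print(f"{conc:.2f} kg/m3, mitigation required: {needs_mitigation}")
```

When the concentration exceeds the limit, Standard 15 points toward the measures Rener mentions: refrigerant detection, alarms, and mechanical ventilation of the space.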

CSE: When designing a data center in an unusual climate (for example, cooling with primarily outside air or in an exceptionally humid environment), what should the mechanical engineer consider? What unique HVAC requirements are there in these different climate types?

Tumber: As in any project, climate greatly impacts the mechanical system design. For example, direct-evaporative cooling is feasible in Prineville, Ore., but not in Miami. A one-size-fits-all approach is not appropriate; a thorough engineering analysis is essential to determine the best solution. Engineers need to keep abreast of the latest technologies so that they can guide their clients. Frequently there are tradeoffs that also need consideration. For example, a mechanical system that incorporates direct-airside economization will reduce electrical consumption in winter. However, dry outside air in winter will require humidification, which will increase water consumption.

Rener: If a data center is going to use outside-air free cooling, then two issues are critical. First, the proper level of filtration must be incorporated into the AHUs to ensure various particulates do not enter the data floor from the outside. In desert climates, such as Phoenix, this can even include sand particles. Second, the system must be able to react to the dynamic changes in weather that can happen almost in an instant. The University of Utah data center in Salt Lake City uses airside free cooling 75% of the year on average. The fan-matrix AHUs use MERV 8/14 pre- and final filtration and a sophisticated weather station to inform the BMS of what action to take based on the weather patterns of that particular moment.

Kosik: Climate analysis is a critical part of the design process, not only to examine peak design days but also to understand how the temperature and moisture content of the outside air change over the course of a year. Studying hourly data and bin analysis will help make HVAC equipment selection more effective by avoiding oversized equipment. This is where close communication with the client is very important. Is the client willing to downsize the equipment and “gamble” that temperatures don’t exceed the design parameters? Or is this a chance the client is not willing to take? Also, climate change is pushing year-round temperatures higher. There are data models that predict where temperatures are headed, but this is a design decision that needs to be made deliberately, because installing cooling equipment to address this warming will increase capital costs.
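The bin analysis Kosik mentions is straightforward to sketch: group the 8,760 hourly dry-bulb readings from a typical-year weather file into fixed-width temperature bins and count the hours in each, which shows how rarely the peak design condition actually occurs. The function and the tiny sample data below are hypothetical stand-ins for a real hourly data set.

```python
from collections import Counter

def bin_hours(hourly_temps_c, bin_width=2.0):
    """Count hours falling into fixed-width dry-bulb temperature bins.

    Returns {bin_lower_edge_c: hours}, sorted by bin edge.
    """
    bins = Counter()
    for t in hourly_temps_c:
        lower = bin_width * (t // bin_width)  # floor to the bin's lower edge
        bins[lower] += 1
    return dict(sorted(bins.items()))

# Eight sample hours standing in for a full 8,760-hour TMY record
sample = [-3.0, 1.5, 2.0, 2.9, 10.0, 10.5, 24.0, 35.1]
for lower, hours in bin_hours(sample).items():
    print(f"{lower:>6.1f} to {lower + 2.0:>5.1f} C: {hours} h")
```

Run against a full year of data, a table like this makes the oversizing conversation with the client concrete: it shows exactly how many hours per year fall above any proposed design temperature.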

Fenstermaker: High-altitude desert climates have become prime locations for large data centers since outside air economization and direct- or indirect-evaporative cooling can be employed all year long, without the use of supplemental refrigerant-based cooling (chillers or otherwise). However, at the higher altitudes, the required airflow can increase 10% or more, which may impact building design.
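The airflow penalty Fenstermaker cites follows from air density: a fan must move more volume at altitude to deliver the same cooling mass flow. A minimal sketch, assuming the supply-air temperature is the same regardless of site so that the density ratio tracks the standard-atmosphere pressure ratio:

```python
def density_ratio(altitude_m: float) -> float:
    """Air density at altitude relative to sea level, for a fixed air
    temperature, via the standard-atmosphere pressure ratio."""
    return (1.0 - 2.25577e-5 * altitude_m) ** 5.2559

def airflow_multiplier(altitude_m: float) -> float:
    """Volumetric airflow needed at altitude to match sea-level mass flow."""
    return 1.0 / density_ratio(altitude_m)

# At roughly 900 m, a typical high-desert site elevation, the required
# airflow rises by a bit more than 10%, consistent with the figure above.
print(round(airflow_multiplier(900.0), 3))
```

That extra volume is what drives the building-design impact: larger fans, ducts, and air-path cross sections than the same IT load would need at sea level.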