Reduce data centers’ drain on energy: a history

While this information is still very fresh and many initiatives are still in the early stages of development, it is important to realize that the discussion on identifying and reducing energy use in data centers actually has roots dating back more than a decade.

By William J. Kosik, P.E., CEM, LEED AP October 18, 2007

A 1999 Forbes Magazine article titled “Dig More Coal—the PCs Are Coming” by Mark Mills and Peter Huber (also authors of “The Bottomless Well”) analyzed the impact of the Internet on global energy markets.
The article was controversial (and, in retrospect, visionary): it pointed out that the Internet, and more specifically computers, were having a dramatic impact on the world’s electrical energy supply.
At the same time, Lawrence Berkeley National Laboratory researchers were studying actual measured power consumption of existing data centers. This effort, led by Bill Tschudi and his team, was among the first thorough studies of the actual make-up of data center power consumption and where the major inefficiencies exist. Based on these studies, researchers focused on server power supplies and uninterruptible power supply systems, where efficiencies would often be less than 50% for certain system topologies and load levels.
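The way such losses compound along the power delivery chain is simple arithmetic, and a minimal sketch makes the point. The component efficiencies below are hypothetical figures chosen to mirror the era's worst cases, not measured values from the LBNL studies:

```python
# Illustrative sketch: how losses compound along a data center power chain.
# Efficiencies are hypothetical examples, not measured LBNL data.
chain = {
    "UPS (lightly loaded)": 0.80,
    "power distribution unit": 0.95,
    "server power supply": 0.65,
}

overall = 1.0
for component, efficiency in chain.items():
    overall *= efficiency
    print(f"after {component}: {overall:.1%} of utility power reaches the load")
```

With these assumed numbers, less than half of the utility power (about 49%) actually reaches the computing load, which is the kind of end-to-end result that drew attention to power supplies and UPS topologies in the first place.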
Occurring in tandem with this were the resurgence of the economy and the development of faster, more power-hungry PCs and cabinet-mounted servers, namely blade servers. For the first time, the focus shifted from electrical power distribution, which was often the limiting factor in bringing on new types of server technology, to the cooling systems.
IT managers were struggling to cool high-density server installations, and that had to be resolved. The power intensity of the cooling systems was limiting the overall growth potential of the IT systems, even where physical space was available to house the equipment. Managers had to look at the whole power delivery chain, not just “watts per square foot,” when considering a new IT technology deployment.
Conservation and responsibility
Unrest in the Middle East refocused the world on the volatility of, and dependence on, foreign energy supplies. Detroit responded with new, more fuel-efficient and hybrid cars that can, in some cases, outperform their conventional gasoline-only counterparts in both power and fuel efficiency.
The computer industry had long ago implemented energy conservation techniques (sleep mode, screen savers, turning off hard drives, etc.) in their PC and laptop technology and was commonly using the EPA’s Energy Star labeling system, but few energy reduction tactics were actually being used in enterprise server and storage equipment.
The resource conservation and environmental responsibility movement also gained momentum during this period, with many major corporations developing their green agendas. The U.S. Green Building Council’s LEED rating system gained acceptance in the design, construction, and commissioning of green facilities. Mass media and mainstream publications (Time Magazine, Newsweek, the Wall Street Journal) were all publishing very detailed, technical articles on topics such as global warming and energy use. Natural disasters, such as the record number of hurricanes, the deadly tsunami in Asia-Pacific, and record high temperatures across the globe, also gained notice.
On Dec. 20, 2006, Congress enacted Public Law 109-431, which tasked the EPA to “… study and promote the use of energy-efficient computer servers in the United States.” The law also included developing “… recommendations regarding potential incentives and voluntary programs that could be used to advance the adoption of energy efficient data centers and computing.”
Prior to this law being passed, the IT industry formed The Green Grid, a group aimed solely at the development and proliferation of energy-reduction strategies for data centers and IT technology. This group has published a number of white papers on power use within the data center.
An important element of The Green Grid’s constituency is the processor, graphics board, and computer manufacturers. These members have one of the most important roles in developing the overall energy reduction/performance improvement roadmap, since they sit at the beginning of the pipeline that determines how power is delivered to the computers.
Some things haven’t changed
In this exciting arena of new thought leadership and technological development, some very fundamental criteria have not changed—uptime and cost.
Arguably, the awareness of and need for uptime, availability, and reliability have increased, since our society is so heavily dependent on the electronic transfer and storage of information. Loss of processing and access to data spells trouble for businesses that rely heavily on electronic data, in terms of lost revenue, dissatisfied customers, and, in the worst-case scenario, risk to human life.
Add to this the compliance issues stemming from Congress’s Sarbanes-Oxley Act of 2002, and the term “mission critical” no longer applies to a small niche of banking and governmental operations.
Businesses are still required to reduce expenses—both upfront expenditures and ongoing operational costs such as maintenance, repair, and utility costs. Depending on the lifecycle cost of a project, managers may be able to push through projects with higher up-front costs and lower annual operational expenditures, since the long-term cost may ultimately be lower.
Each business will have different criteria for evaluating first costs versus lifecycle costs, whether by solid engineering economic principles (payback, return on investment, net present value) or by more subjective methods (social responsibility, good neighbor policy) that do not necessarily lower costs.
The good news for data centers is that traditional payback models used in exploring energy-efficiency upgrades show faster returns due to the continuous operation and the fairly constant load profile. On a square-foot basis of raised floor area, a data center will cost five to ten times as much to construct as a commercial office building; its annual electricity costs will be 25 to 30 times as much. So, as a percentage of initial cost, energy-reduction alternatives in a data center make a much stronger economic argument than they would in a commercial office building.
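A simple payback calculation makes the comparison concrete. The dollar figures below are hypothetical, chosen only to reflect the article's rough ratio of roughly 25 to 30 times the annual electricity cost; they are not drawn from any actual project:

```python
# Hypothetical sketch: why payback models favor data centers.
# A data center runs 8,760 h/yr at a near-constant load, so an
# efficiency measure saves energy every hour of the year.

def simple_payback(extra_first_cost, annual_energy_cost, savings_fraction):
    """Years to recoup an efficiency upgrade via reduced utility bills."""
    annual_savings = annual_energy_cost * savings_fraction
    return extra_first_cost / annual_savings

# Same $100,000 upgrade saving 10% of the electricity bill, applied to
# an office building and to a data center with ~27x the electricity cost
# (a mid-range value from the article's 25-30x figure):
office_payback = simple_payback(100_000, 40_000, 0.10)
data_center_payback = simple_payback(100_000, 40_000 * 27, 0.10)

print(f"office building payback: {office_payback:.1f} years")
print(f"data center payback:     {data_center_payback:.1f} years")
```

Under these assumptions the office-building upgrade takes decades to pay for itself, while the identical upgrade in a data center pays back in under a year, which is the economic argument the payback models capture.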
All of these areas of research have carved out their own distinctive channels to carry ideas and information. We now see a confluence of these flows of information, which, as in nature, often ends in turbulent flow as the channel becomes more crowded with ideas and approaches. However, the hope is that once the ideas pass through the whitewater of conflicting interests and widely varying approaches, they will eventually flow into a large body of calm water where ideas and philosophies can combine and create a greater whole.