Transformer efficiency: Yesterday’s news, tomorrow’s concerns
Although many engineers and design professionals don’t get a lot of time to spend looking under the hood of equipment, we should certainly take a closer look at transformers.
In case you haven’t heard, there is an interesting debate going on concerning transformers. On one side, the U.S. Dept. of Energy has proposed—and has pending legislation on—requiring improvements to the existing transformer efficiency standards. The DOE has made its analysis available to the public for comment. On the other side, those most affected are pushing back against this proposed rule. Details regarding this debate are covered later in this article. Also, refer to “A brief time line of transformer laws and standards” for a little history on the topic.
Brief refresher on transformer theory
Primary and secondary windings are coiled around the transformer’s core. Current passing through the primary windings creates a magnetic field, or flux, which induces voltage in the secondary coil. A load connected to the transformer enables secondary current to flow at a rate that depends on the load’s impedance.
Existing only in theory, a perfect transformer would have no losses. Real-world transformers have both no-load and load losses, categories that correspond to the test procedures used to measure them. No-load losses include hysteresis and eddy-current losses in the core; load losses are caused by winding resistance and the skin effect.
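These two loss categories drive all of the efficiency math that follows. A minimal sketch in Python, using illustrative loss figures (the watt values and the 75 kVA rating below are assumptions, not data from this article):

```python
def efficiency(kva_rating, load_fraction, no_load_w, full_load_w, pf=1.0):
    """Transformer efficiency at a given per-unit load.

    No-load (core) losses are constant whenever the unit is energized;
    load (winding) losses scale with the square of load current, hence
    the load_fraction**2 term.
    """
    p_out = kva_rating * 1000 * load_fraction * pf  # output power, W
    losses = no_load_w + full_load_w * load_fraction ** 2
    return p_out / (p_out + losses)

# Hypothetical 75 kVA unit: 300 W no-load loss, 1,800 W load loss at rated load.
for load in (0.25, 0.35, 0.50, 0.75, 1.00):
    print(f"{load:.0%} load: {efficiency(75, load, 300, 1800):.2%}")
```

Note how efficiency peaks at partial load: the fixed no-load losses dominate at light loads, while the quadratic load losses dominate near full load.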
This explanation is very basic and a disservice to engineers who design transformers. However, this short version is sufficient for the scope of this article.
Defining transformer efficiency
The National Electrical Manufacturers Association (NEMA) developed its TP-1 standard in 2002 as a guide for determining transformer efficiencies. This analysis redefined how energy efficiency was calculated in two ways. First, it set the standards against which transformer efficiency is measured (see Table 1).
Table 1: NEMA Class I efficiency levels for 3-phase, dry-type, low-voltage transformers

| kVA | Efficiency |
| --- | --- |
| 15 | 97.0% |
| 30 | 97.5% |
| 45 | 97.7% |
| 75 | 98.0% |
| 112.5 | 98.2% |
| 150 | 98.3% |
| 225 | 98.5% |
| 300 | 98.6% |
| 500 | 98.7% |
| 750 | 98.8% |
| 1000 | 98.9% |
Second, and more significant, was the change to the efficiency calculation itself. This wasn't as simple as just raising the required efficiency: TP-1 redefined the point at which efficiency is measured. Previously, transformer efficiency was based on the transformer's full operating load. Under TP-1, efficiency is measured at 35% of full load for low-voltage dry-type transformers and at 50% of full load for liquid-filled and medium-voltage transformers. Prior to TP-1, a DOE study had determined the average loading of transformers to be around 32%. The point of highest efficiency is commonly called the sweet spot.
Sweet spot
The sweet spot for transformer efficiency occurs when the no-load losses equal the load losses. Transformer manufacturers were left with a challenge to change the way they designed transformers—and in particular, the transformer core. Prior to the enactment of these requirements, transformers were designed to operate most efficiently between 80% and 100% of full-load rating. Now the new constraint—for transformers to be most efficient at 35% of full load—is requiring transformer manufacturers to look at efficiency and losses closer to the no-load conditions.
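The equal-losses condition makes the sweet spot easy to locate: because load losses vary with the square of load, they equal the fixed no-load losses at a load fraction of sqrt(no-load loss / full-load loss). A quick sketch (the loss figures are illustrative assumptions, not manufacturer data):

```python
import math

def sweet_spot(no_load_w, full_load_w):
    # Load losses vary as load_fraction**2, so they equal the constant
    # no-load losses when load_fraction = sqrt(no_load / full_load_loss).
    return math.sqrt(no_load_w / full_load_w)

# Hypothetical loss pair: 300 W no-load, 1,800 W load loss at rated load.
print(f"Peak efficiency near {sweet_spot(300, 1800):.0%} load")

# To move the sweet spot down to 35% of full load, the ratio of no-load
# to full-load losses must be pushed down to 0.35**2, about 0.12, which
# is why the core (the source of no-load losses) gets all the attention.
print(f"Required loss ratio for a 35% sweet spot: {0.35 ** 2:.4f}")
```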
Manufacturers are addressing this challenge with several process improvements including:
- Making the cores larger by increasing the amount of steel to reduce the flux density
- Using a better grade of magnetic steel
- Implementing better construction practices such as using miter joints instead of butt joints during core lamination, which reduces no-load core losses
- Using grain-oriented electrical steel to improve core-loss performance.
A combination of these methods is typically used to achieve the required efficiency at 35% load. Keep in mind that the energy efficiency requirements did not target efficiency at full load, and they created several issues that design professionals must consider.
Specifying engineers face new challenges because of the changed focus of transformer efficiency. Altering transformer characteristics to meet the new requirements increases inrush current and physical size, and changes how loading should be evaluated.
Design considerations
Increased inrush current: When a transformer is first energized, either upon installation or after a power outage, it experiences inrush current. Good coordination dictates that the overcurrent protection serving the transformer is designed to let the transformer energize while still protecting it from damage. In the past, the inrush current was typically between six and 12 times the transformer's full load current; with the new transformer cores, it can be between eight and 17 times. One manufacturer offers a 45 kVA transformer with an inrush current of 17 times the transformer's full load amps (FLA).
The one-line diagram in Figure 2 shows a typical distribution for a 75 kVA delta-wye transformer. The National Electrical Code (NEC) requires primary and secondary protection with the secondary sized at 125% of the transformer’s FLA. The NEC allows the primary breaker to be sized at up to 250% of the transformer FLA.
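The sizing arithmetic behind that example can be sketched as follows. The article does not state the system voltages, so the 480 V delta primary and 208 V wye secondary below are assumptions for illustration:

```python
import math

def fla(kva, line_voltage):
    """Full load amps of a 3-phase transformer."""
    return kva * 1000 / (math.sqrt(3) * line_voltage)

# Assumed voltages for the 75 kVA delta-wye example: 480 V primary, 208 V secondary.
primary_fla = fla(75, 480)    # roughly 90 A
secondary_fla = fla(75, 208)  # roughly 208 A

# NEC limits cited above: secondary protection at 125% of FLA,
# primary protection permitted up to 250% of FLA.
max_secondary_ocp = 1.25 * secondary_fla
max_primary_ocp = 2.50 * primary_fla

print(f"Primary FLA: {primary_fla:.1f} A, secondary FLA: {secondary_fla:.1f} A")
print(f"Secondary OCP at 125%: {max_secondary_ocp:.0f} A; primary OCP up to 250%: {max_primary_ocp:.0f} A")
```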
When the transformer is energized, the primary breaker is the only overcurrent protective device through which the inrush current will pass, and therefore it is the only breaker required to coordinate with the transformer inrush. Now, consider both fault and overload damage protection: Which breaker should protect the transformer?
Ideally it’s the primary breaker, but according to the NEC, either or both breakers could be the protection for the transformer. The primary is ideal because it would protect from any fault that occurs between the secondary conductors and the primary breaker. Although it is ideal, accommodating the inrush and protecting the transformer with just the primary breaker can create quite a challenge. There are also transformer types where the NEC allows primary-only protection, which is equally challenging.
The time-current curve shown in Figure 3 is for this distribution and shows both primary and secondary breakers at 125% as allowed by the NEC. When the inrush was eight times the FLA or less, this scenario was acceptable. With the greater inrush current, however, the primary overcurrent protection must be increased beyond 125%, and the secondary breaker must then provide the transformer damage protection. This should be evaluated with the local authority having jurisdiction, as some inspectors may also require the primary breaker to protect the transformer from damage.
The time-current curve in Figure 4 represents the best of both worlds: primary and secondary breakers that protect the transformer from damage while allowing for the larger inrush current. This example does, however, use an oversized breaker with the long-time pickup dialed down, creating a primary breaker that operates within 125% of the transformer's FLA.
Size: With the increased core steel and copper content, transformers are physically larger than they used to be. A 15 kVA transformer is now the size a 30 kVA transformer was 10 years ago; it takes up more space and makes a one-to-one replacement challenging. The same is true for aluminum-wound transformers.
Transformer loading: The DOE and NEMA did investigations into transformer loading and determined that low-voltage dry-type transformers are typically only loaded to 35% of the transformer’s capacity. Although this may be the case, engineers typically select transformers to be 80% loaded, using NEC demand factors.
The real waste in efficiency appears to be how applying NEC-required demand factors yields actual operating loads roughly 45 percentage points below design-calculated loads. This should be one of the considerations when selecting a transformer size: the spare capacity on paper vs. what we all know the spare capacity will really be once the transformer is operational. Also, many manufacturers publish test results for their transformers that identify efficiency at 25%, 35%, 50%, 75%, and 100% of capacity. Design professionals should review and understand these data when selecting transformers.
Pending legislation
The DOE is reviewing pending legislation (Notice of Proposed Rulemaking, NOPR, 10 CFR 430) to determine if it should require improvements to the existing transformer efficiency standards. It has analyzed three low-voltage dry-type transformer sizes—25 kVA, 75 kVA, and 300 kVA—based on different efficiency levels, as well as the associated manufacturing, operating, and lifecycle costs.
The proposed efficiencies for comparison are the existing NEMA TP-1-2002 levels; Trial Standard Level 1 (TSL-1), which is also identified as Efficiency Level 2 (EL-2); and NEMA Premium Efficiency (EL-3). The terms the DOE uses are Trial Standard Level (TSL), Efficiency Level (EL), and Candidate Standard Level (CSL), but the actual proposed rule is identified by the TSL number (see Table 2).
Table 2: Efficiency levels for 3-phase, dry-type, low-voltage transformers

| kVA | NEMA TP-1-2002 | TSL-1/EL-2 (proposed in NOPR 10 CFR 430) | NEMA Premium Efficiency (EL-3/CSL-3) |
| --- | --- | --- | --- |
| 15 | 97.0% | 97.44% | 97.90% |
| 30 | 97.5% | 97.95% | 98.25% |
| 45 | 97.7% | 98.20% | 98.39% |
| 75 | 98.0% | 98.47% | 98.60% |
| 112.5 | 98.2% | 98.66% | 98.74% |
| 150 | 98.3% | 98.78% | 98.81% |
| 225 | 98.5% | 98.92% | 98.95% |
| 300 | 98.6% | 99.02% | 99.02% |
| 500 | 98.7% | 99.17% | 99.09% |
| 750 | 98.8% | 99.27% | 99.16% |
| 1000 | 98.9% | 99.34% | 99.23% |
The current DOE economic analysis is available on its website (www1.eere.energy.gov) and is included as part of the proposed rule. The analysis indicates between a 4.5-year payback (design line 7: 75 kVA transformer) and an 8-year payback (design line 8: 300 kVA transformer). The intent is to extrapolate these data across the board to all sizes of dry-type transformers, and the paybacks appear to somewhat justify the added transformer cost. The DOE also makes its economic analysis spreadsheets available to the public and has included some provisions for user input: items such as load-level basis and high utility rate areas can be selected to recalculate the payback.
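The simple-payback arithmetic behind those figures can be sketched directly; the dollar and energy values below are assumptions for illustration (the article reports only the resulting payback periods):

```python
def simple_payback_years(first_cost_premium, annual_kwh_saved, rate_per_kwh):
    """Years for energy savings to repay the efficiency first-cost premium."""
    annual_savings = annual_kwh_saved * rate_per_kwh
    return first_cost_premium / annual_savings

# Hypothetical: a $900 premium for a higher-efficiency unit that saves
# 2,000 kWh/yr at $0.10/kWh pays back in about 4.5 years.
print(simple_payback_years(900, 2000, 0.10))

# A higher utility rate shortens the payback, which is why the DOE
# spreadsheets let users adjust the rate and load-level assumptions.
print(simple_payback_years(900, 2000, 0.15))
```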
The debate mentioned earlier in the article is evident in the payback analysis and market impact. Manufacturers, utilities, design professionals, and even NEMA have joined forces to push back on the DOE’s analysis for the proposed rule. Their major concerns include:
- The assumptions made by DOE were not validated by a third party or by market pricing
- The “theoretical” transformers used in the model were never tested at prototype installations
- The lifecycle costs do not include real market components such as the limited sources for the high-grade steel that would be required
- A clear and complete breakdown of what is included in the lifecycle analysis is not provided
- Higher first costs of transformers would promote refurbishing old transformers over replacing them with new units
- Stricter standards can have negative impacts on small manufacturing facilities and the American transformer industry overall.
The preceding list is definitely the short version. However, it is evident that the industry has valid concerns that should be addressed and to which the codes- and standards-making bodies should respond.
From a building design perspective, the entire study has caused a different group of concerns:
- Environmental: Because transformers are typically loaded to only about 35% of capacity, much of the material in every low-voltage dry-type transformer across the country goes effectively unused, representing a great deal of unnecessary harvesting of steel, copper, and other materials; it's surprising this receives no attention
- For many owners, the breaking point for payback vs. infrastructure investment is five years, and these transformer efficiencies—even using DOE’s numbers—are right up against that
- So much attention is paid to increasing transformer efficiency by less than 1%, but there are so many other building components that could benefit from even minimal efficiency standards.
The exciting thing about this topic is that it is a live debate, actively going on as you read. This is only a proposed rule, with the public comment period just about to start.
As with most debates, it’s up to you to form your own informed opinion and act accordingly.
Ferris is an electrical project engineer with TLC Engineering for Architecture. He specializes in power distribution for healthcare facilities.
A brief time line of transformer laws and standards
For the last decade, the three major players in the development of transformer energy standards have been the DOE, NEMA, and the U.S. Environmental Protection Agency (EPA). The DOE and NEMA conducted a significant amount of research and analyzed transformer performance, costs, and the potential impact of legislation. These reports covered both liquid-filled and dry-type distribution transformers with primary voltages less than 69 kV and secondary voltages less than 600 V. The following is a summary of these efforts:
2002: NEMA developed NEMA TP-1-2002: Guide for determining energy efficiency in distribution transformers. This document describes the formulas for energy efficiency and defines at what percentage of loading transformer efficiencies are documented. It also includes the energy efficiency requirements that transformers must meet to be considered NEMA Class I for both liquid filled and dry types. At this point, NEMA’s work was only a guideline.
2005: The Energy Policy Act of 2005 (EPAct 2005) set standards for low-voltage dry-type transformers, which specified that all low-voltage dry-type transformers manufactured on or after Jan. 1, 2007 must be Class I Efficiency Level as defined by NEMA in TP-1-2002. This made the transformer efficiency guidelines—as well as the 35% loading standard for low-voltage dry-type transformers in NEMA’s TP-1—a law.
2006-2007: The EPA, which previously had adopted the efficiency levels of TP-1 as the standard for the Energy Star label, proposed suspending the Energy Star program for distribution transformers because the standard models in the market met the criteria. EPA also stated that its research indicated that energy efficiency improvements were not cost effective at the time it suspended the program. The Energy Star program for transformers is still closed today.
2007: The DOE established minimum efficiency values for all distribution transformers through 2,500 kVA in 10 CFR 431 Subpart K. This rule took effect for all transformers manufactured after Jan. 1, 2010.
2008: NEMA announced the use of the Premium Efficiency mark on transformers. To obtain this mark, transformers must exceed NEMA TP-1 with a 30% reduction in losses.
2009: After the DOE established the minimum efficiency rule in 2007, environmental groups filed a lawsuit against the DOE challenging it. In July 2009, the U.S. Court of Appeals approved a settlement agreement allowing the DOE standards to go into effect on Jan. 1, 2010, but also requiring the DOE to reassess the standards and, if necessary, promulgate new efficiency standards for distribution transformers.
2011: The DOE initiated its notice of proposed rulemaking to review and amend the current standards in effect for distribution transformers, including the aforementioned transformer types. As of the writing of this article, no final legislation has yet been created as a result of this initiative.
2012: If the proposed rulemaking is determined to be justified, the DOE must provide a final rule by Oct. 1, 2012.