Why some industrial organizations find benchmarking difficult

Benchmarking, in the simplest terms, involves comparing performance to peers, understanding gaps in operations, and taking steps to close those gaps and improve performance. Unfortunately, it is not as simple as it sounds.

03/06/2013


The concepts behind benchmarking research for industrial operations have been applied successfully by many of the world’s leading organizations. However, many companies still struggle with the basics. Benchmarking, in the simplest terms, involves comparing performance to peers, understanding gaps in operations, and taking steps to close those gaps and improve performance.

 

Unfortunately, it is not as simple as it sounds. Below are the top three challenges I’ve seen industrial organizations face while benchmarking operations.

 

 

 

Granularity: Benchmarking research is too general or too specific

 

Often the topic that needs to be benchmarked is too general or too high level, which makes it difficult to take specific steps for improvement. A good example is benchmarking “operational excellence.” The term means different things to different companies, and it’s difficult to derive real, actionable steps for improvement when companies set out to benchmark such a broad topic.

 

At the other extreme, there’s a similar issue when the business process that needs to be benchmarked is very specific. For any benchmarking process to be successful, it’s critical to understand how the outcome of the process affects the key goals of the division, the plant, or even the overall organization. In that sense, specificity is good, but it can make the quest for data very challenging.

 

A good example of this would be trying to benchmark the throughput and mean time to failure metrics for a very specialized piece of machinery in a specialized industry. All I can say is good luck getting a statistically relevant sample in such a case.
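To make the sample-size point concrete, here is a minimal sketch in Python (the failure-interval figures are hypothetical and for illustration only):

  import statistics

  # Hours of operation between successive failures for one specialized machine
  # (hypothetical figures for illustration only)
  failure_intervals_hours = [1800.0, 2400.0, 950.0, 3100.0]

  mttf = statistics.mean(failure_intervals_hours)      # mean time to failure
  spread = statistics.stdev(failure_intervals_hours)   # sample standard deviation

  # With n this small, the uncertainty around the MTTF estimate dwarfs the
  # differences you would hope to detect against a benchmark peer group.
  print(f"MTTF estimate: {mttf:.0f} h over n={len(failure_intervals_hours)} "
        f"failures (std dev {spread:.0f} h)")

With only four observed failures, the spread around the estimate swamps the kind of difference you would hope to detect against a peer group, so the comparison tells you very little.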

 

 

 

Data: Availability, quality, statistical relevance, and more

 

The second key challenge with benchmarking industrial operations is data. If all data had the following characteristics, benchmarking projects would almost always run smoothly:

 

  • Easily available
  • In one central location
  • Using common definitions
  • Having a statistically relevant sample size

Unfortunately, this is not the case and it takes a lot of hard work to get there.

 

For example, if you’re planning to benchmark the quality processes of five plants in North America, and there is no consistency in the way data is collected across these plants or how metrics are defined, it will be an uphill battle. In such a case, it becomes very challenging to effectively execute the benchmarking process internally and even harder to do it externally.
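As a rough sketch of what “common definitions” means in practice (the plant names, field names, and unit conversions below are hypothetical), the first step is usually to map each plant’s local reporting onto one shared metric before any comparison is attempted:

  # Each plant reports scrap differently: different field names and units
  # (hypothetical reporting formats for illustration only).
  raw_reports = {
      "plant_a": {"scrap_pct": 2.3},     # already a percentage
      "plant_b": {"scrap_rate": 0.019},  # fraction of units produced
      "plant_c": {"scrap_ppm": 24500},   # parts per million
  }

  def to_scrap_percent(plant, report):
      """Map each plant's local definition onto one common metric: scrap %."""
      if "scrap_pct" in report:
          return report["scrap_pct"]
      if "scrap_rate" in report:
          return report["scrap_rate"] * 100.0
      if "scrap_ppm" in report:
          return report["scrap_ppm"] / 10_000.0
      raise ValueError(f"No recognized scrap metric for {plant}")

  normalized = {p: to_scrap_percent(p, r) for p, r in raw_reports.items()}
  for plant, scrap in sorted(normalized.items()):
      print(f"{plant}: {scrap:.2f}% scrap")

Only once every plant’s number means the same thing does an internal comparison become meaningful, let alone an external one.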

 

Getting value from the results

 

The final challenge has two parts and focuses on how the results of the benchmarking process are used. This stage is often more important than the benchmarking process itself. If what you learn from the results isn’t applied to the business, the entire exercise may have been futile.

 

The first part is understanding which actions need to be taken based on the results, and how to execute those actions. To accomplish this effectively, there needs to be buy-in from all levels of the organization, as well as the right culture in place to accept the changes those actions bring.

 

The second part of this challenge lies in answering the question, “What happens next?” Organizations that think about the benchmarking process as a one-time exercise are likely to fail. The key to the success of any benchmarking process is in setting up a culture and process of continuous improvement.

 

The real value of a benchmarking exercise is delivered when you learn from the results of the program, apply those recommendations, track the success of the actions, and continuously improve based on the results of those actions.

 

Matthew Littlefield is a principal at LNS Research, which will hold the first meeting of the Global Quality Advisory Council in March. For more information, click this link.

 

 

 


