Part II: Driving Manufacturing Performance with OEE

November 29, 2018 | Mariner

In the first segment of this series, I shared some initial thoughts on the recent Industry Week article, “Finding Manufacturing Performance Gaps with OEE” by Louis Columbus. I thanked Mr. Columbus for his contribution to the important discussion of Overall Equipment Effectiveness (OEE) as a metric for driving manufacturing operational excellence. He played devil’s advocate to the widespread enthusiasm for using OEE as the centerpiece of manufacturing improvement programs. I agreed with his conclusion that “if OEE is not applied properly and supported by accurate and unambiguous data, it could hinder an organization from achieving its true operational excellence potential.”

That said, I hinted that the final two segments of this series would challenge some of the “lessons learned” Mr. Columbus shares in the article.   In this segment, I want to consider the first two (of four) “lessons learned” that the author discusses.

Lesson Learned #1: Beware the use of OEE as a multi-plant measurement metric

Mr. Columbus asserts that OEE is only appropriately used “as a baseline for each [individual] plant’s performance” when there are no differences between plants in terms of “product lines, suppliers and product runs.” He suggests that when conditions vary across plants, “comparison across plants [is] meaningless.” We disagree. As with most performance metrics, the design of the measure determines whether it is ultimately effective and fair. We have worked with many manufacturers who have built very effective cross-plant improvement programs and scorecards on OEE. Mr. Columbus is right to caution against comparing apples and oranges. However, calculating OEE and its components, tracking improvement trends, and comparing relative performance across facilities can all be effective elements of a multi-plant scorecard. With accurate, detailed data collected directly from the shop floor, manufacturers can better understand the root causes of differences in OEE results. This enables plant management to focus attention on the issues that matter, whether improved supplier performance or smoother changeovers.
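For readers less familiar with the mechanics, the standard OEE calculation multiplies three component ratios: availability, performance, and quality. The sketch below is a minimal illustration of that arithmetic; the function name and the sample shift figures are my own, invented for illustration, and do not come from the article or any particular plant.

```python
# Standard OEE arithmetic: OEE = Availability x Performance x Quality.
# All sample figures below are illustrative, not data from any real plant.

def oee(planned_time_min, run_time_min, ideal_cycle_time_min, total_count, good_count):
    """Return (availability, performance, quality, oee) as fractions."""
    availability = run_time_min / planned_time_min                      # actual run time vs. planned production time
    performance = (ideal_cycle_time_min * total_count) / run_time_min   # actual speed vs. ideal rate
    quality = good_count / total_count                                  # good parts vs. total parts produced
    return availability, performance, quality, availability * performance * quality

# Example shift: 480 min planned, 420 min actually running,
# 1.0 min ideal cycle time, 400 parts produced, 380 of them good.
a, p, q, overall = oee(480, 420, 1.0, 400, 380)
print(f"Availability {a:.1%}, Performance {p:.1%}, Quality {q:.1%}, OEE {overall:.1%}")
```

Because each component isolates a different loss category (downtime, slow cycles, defects), two plants with the same headline OEE can have very different underlying problems, which is exactly why the component-level view matters in a multi-plant scorecard.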

No single metric tells the “whole story.”  But as single metrics go, OEE tells a pretty important one.  The real “lesson learned” here is that the metric needs to be backed by data analytics that enable the manufacturing team to drill into root cause issues and use that insight to address recurring situations that impact operational excellence.  There is no fixed value for OEE that represents “excellence” for all manufacturing operations.  Any performance measurement system that assumes so is misguided.  However, understanding the drivers that impact OEE results is important to achieving optimal performance. Building a program around OEE that educates and informs all members of the production team on what influences its outcome is a worthwhile undertaking.
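To make the "drill into root causes" idea concrete: once shop-floor events are captured with reason codes, even a simple Pareto ranking of downtime minutes points the team at the biggest availability losses. The event records and reason codes below are hypothetical, invented purely to sketch the technique.

```python
# Hypothetical sketch: ranking downtime reasons behind a low Availability score.
# Event records (reason, minutes lost) are invented for illustration.
from collections import Counter

downtime_events = [
    ("changeover", 35), ("material shortage", 20), ("changeover", 25),
    ("unplanned maintenance", 40), ("material shortage", 10),
]

minutes_by_reason = Counter()
for reason, minutes in downtime_events:
    minutes_by_reason[reason] += minutes

# Pareto view: the biggest contributors to lost availability come first.
for reason, minutes in minutes_by_reason.most_common():
    print(f"{reason}: {minutes} min")
```

A real implementation would pull these events from the plant's data historian or IIoT platform rather than a hard-coded list, but the principle is the same: the metric flags the gap, and the underlying event data explains it.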

Lesson Learned #2:  Aim to create a trusted, scalable, data set that isn’t inflated or politicized

No argument with this point.  Having accurate, timely, and trusted data driving any performance metric or evaluation program is critical.  It focuses attention and discussion on action to address issues rather than on the measurement itself.  It is essential to credibility and a sense of fairness that the data set enables more than just the metric outcome itself, but a true understanding of the underlying conditions.

Mr. Columbus goes on, though, to imply that including OEE in compensation and performance reviews is a misguided trend. He suggests that “manufacturers are practically asking for the data to be skewed.” We disagree. If management communicates that improving OEE is important to the company’s strategy and business success, tying compensation to other metrics sends mixed signals. Improving OEE takes determination, effort, creativity, and a willingness to try new approaches. Absent a tangible benefit, many stakeholders will resist change and the perceived risks it entails. Management is then left using OEE as a “stick”: a prod to induce improvement by highlighting shortfalls from theoretically optimal performance. Manufacturing teams might even be reluctant to take bold steps to improve OEE results, fearing that “extra effort” would set an unsustainable benchmark for the future.

I believe that including OEE in pay-for-performance plans is a strategic decision that should not be made lightly. For leaders who are considering this option, I recommend careful planning and preparation. Read Ken Gibson’s “The Five Essentials of Pay for Performance.” In that article, the author stresses the importance of aligning the metrics selected with what matters to the company’s strategy, business results, and shareholder value. This is absolutely critical. If manufacturing excellence is not critical to overall business success, executive leadership will be reluctant to make the investments needed to build and sustain ongoing OEE measurement and improvement. Leadership will be tempted to underfund the performance payout pool, which can undermine the perceived commitment to operational excellence. OEE could become yet another statistic that gets reported but does not garner much energy or enthusiasm.

As part of a well-designed pay-for-performance program, OEE is a simple and effective metric for driving the right behaviors across the operation. Procurement is encouraged to balance cost decisions with delivery performance capabilities, for example. Plant managers are not tempted to sacrifice quality to achieve seemingly higher output. Operators and process engineers are encouraged to talk to each other and work together. But again, Mr. Columbus’ emphasis on trusted and scalable data sets must be heeded. When your team is properly incented, they will press for access to accurate data and the tools to analyze it so that they can make improvements. Without the support of a strong Industrial Internet of Things (IIoT) solution like Spyglass, backed by the robust, scalable performance of an IIoT platform like Microsoft Azure, your teams could become frustrated or worse. Without detailed, accurate data and analysis, “trial and error” approaches could reverse OEE gains.

If your organization is committed to operational excellence but does not want to commit to a pay-for-performance program, OEE can still be an effective metric to guide your efforts.  Think about non-compensation benefits that can reinforce the importance of operational excellence and improving OEE results.  Work with your HR team to identify creative ways to recognize and encourage teams that move the needle.  Demonstrate your commitment to the program by staying on top of trends, asking good questions related to the metric, and personally recognizing individuals who contribute to OEE improvement.

Coming Up:  Final thoughts on OEE “Lessons Learned”

In the final installment of this series, I’ll take a look at the last two “lessons learned” in Louis Columbus’ Industry Week article. I hope you’ll join me for a look at getting “underneath” the OEE metric and at the role of setup standards and Total Effective Equipment Performance (TEEP) in my next post.

Read Part 3 of the Series

This post was authored by Mark Adelhelm