More Data ≠ More Process Knowledge
Aligning Process Mining with Pre-Existing Methodologies
Six Sigma methodologies were developed, refined, and ultimately adopted when data was scarce and manual processes were the norm. As operations became more digital, data was often logged but remained in a “black box”: available neither for drawing conclusions and insights nor to business users. As an organization, we cannot fix what we cannot track, and this gap in visibility hinders ongoing process improvement and business monitoring.
Define, Measure, Analyze, Improve, and Control (DMAIC) is a common framework used to plan a process improvement program. It requires improvement projects to first define the problem and then measure its impact in order to draw conclusions and propose improvements. This places math and statistics at the core of the improvement effort: by measuring key metrics such as cycle times, processes can be simulated, improved, and ultimately controlled. It is a tried-and-true approach that has reshaped entire industries; Six Sigma has helped companies such as GE and Toyota reach new heights and become market leaders in their respective industries.
If Six Sigma can be applied to a wide range of businesses and has a proven track record of success, why would anyone want to change or evolve from this standard approach? Performing Six Sigma projects requires investment from the business to launch data collection plans, deconstruct processes, and ultimately implement new controls. Most importantly, Six Sigma initiatives require time, resources, and business support that are not justified for every project – especially projects outside of manufacturing. As a result, many businesses follow the Six Sigma methodology but apply it selectively: typically to manufacturing processes rather than “back office” processes, and even then it is no guarantee of a successful project.
The process improvement mindset must be extended to all processes, whether machine- or human-driven. The consequences of not adopting this mindset can range from operational inefficiencies to a full revolt. Though causation has not been completely proven, shareholders of a major cosmetics company launched multiple class-action lawsuits claiming poor oversight of the implementation of its ERP system. The lawsuits claimed the stock price was negatively affected because the company's recent SAP roll-out suffered sub-par controls at the process level, resulting in inaccurate accounting of sales, COGS, inventory, and accounts receivable. The company confirmed it was unable to fulfill shipments of $60 million worth of product.
An extreme example, undoubtedly, but indicative of what occurs in the business world when processes are redesigned. Technology solutions are meant to support business processes; however, documentation or subject matter expert input is often not readily available. Automate a poor process and you are only failing faster.
Undoubtedly, their manufacturing and warehousing operations are overseen by robust Six Sigma practices in the traditional manufacturing sense. Their service processes could have benefitted from similar oversight in terms of traditional process improvement prior to launching such a massive IT overhaul.
Process improvement methodologies have a key place in business, but as we’ve highlighted, they face significant challenges. One of the biggest: how quickly can we understand our current processes and create process documentation? In traditional projects, this is done by meeting with subject matter experts (SMEs) in workshops and “walking through the process.” With the growing prevalence of team-based software, understanding how work gets done becomes even harder. No longer is there a single owner; instead, multiple stakeholders with various vested interests are part of a larger end-to-end platform.
An opportunity to improve the Six Sigma methodology exists when we add process mining to the equation. Process mining leverages system and event logs to create an objective, fact-based process model. These models include all system-driven tasks such as order entry, processing, and delivery, which in most cases account for roughly 80% of the entire end-to-end process. What is missing is the non-system-driven work – steps like “I need to email Carol” or “let’s put this into Excel” that are critical for the work to be completed but not as easily captured. This is where we engage our SMEs: they fill in the missing manual tasks and answer detailed questions about why certain steps occur in the process. In this manner, we capture 100% of the system-driven process, reduce the burden on our business SMEs, and ultimately complete our current-state analysis faster.
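The discovery step behind this can be illustrated with a minimal sketch. Assuming a hypothetical event log of (case, activity, timestamp) records exported from a source system – the names and data below are invented for illustration – counting which activity directly follows which yields the fact-based process model described above:

```python
from collections import defaultdict

# Hypothetical event log: (case_id, activity, timestamp) tuples,
# as might be exported from an ERP or ticketing system.
event_log = [
    ("order-1", "Order Entry", 1), ("order-1", "Processing", 2), ("order-1", "Delivery", 3),
    ("order-2", "Order Entry", 1), ("order-2", "Delivery", 2),  # a variant that skips Processing
]

def directly_follows(log):
    """Count how often activity A is directly followed by activity B within a case."""
    by_case = defaultdict(list)
    for case, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
        by_case[case].append(activity)
    edges = defaultdict(int)
    for trace in by_case.values():
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return dict(edges)

print(directly_follows(event_log))
```

Real process mining tools build far richer models (frequencies, durations, variants), but this directly-follows counting is the same underlying idea.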
If our goal is to improve our processes, process mining can only go so far: we still must interpret the data through the lens of our business operations. Data gleaned from process mining neither represents the entire end-to-end process nor contains the context of the business operations. This is where “traditional process improvement” remains necessary – to take the outputs of process mining and add further context and clarity. A good example is the confusion between causation and correlation.
One pitfall of manual process mapping, relying on a room of experts, is confusing causation with correlation. Correlation is when two or more factors move in proportion to one another; causation is when one of those factors, when moved, causes the other to move. This can be counterintuitive and, at best, confusing and, at worst, dangerous and wasteful. A simple example involves ice cream and murders. It is known that in metropolitan areas, when ice cream sales rise, so do murder rates. And when sales go down, so does the number of murders. So, does eating ice cream cause people to commit murder? Obviously not. Statistically, there is a correlation. As temperatures rise in cities, people go outside more, thus there are more chances to be a murder victim. Also, as temperatures go up, people buy more ice cream. So, there is causation between temperatures and ice cream sales, and causation between temperatures and murders, but not between ice cream and murder. (Marchand, 2017)
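The ice cream example can be reproduced numerically. In this toy simulation (all numbers invented), temperature drives both series while they never interact, yet the two correlate strongly:

```python
import random

random.seed(0)
# Hypothetical daily data: temperature drives both ice cream sales and
# (in this toy model) violent incidents; sales and incidents never interact.
temps = [random.uniform(0, 35) for _ in range(365)]
sales = [10 + 2.0 * t + random.gauss(0, 3) for t in temps]
incidents = [5 + 0.5 * t + random.gauss(0, 2) for t in temps]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Sales and incidents correlate strongly even though neither causes the other;
# the shared driver (temperature) creates the spurious relationship.
print(round(pearson(sales, incidents), 2))
```

The same spurious pattern appears in mined process data whenever two metrics share an upstream driver, which is exactly where Six Sigma judgment is needed.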
If applying Six Sigma without process mining has its biases and limitations, the converse can be equally problematic. Applying the power of process mining tools without a grounding in the principles of Six Sigma can be akin to a layperson performing surgery on a friend. Six Sigma experience is critical in translating the outputs of process mining into remediation actions with the best ROI. Otherwise, you may find yourself chasing down process variations that are statistically insignificant or updating systems when a business process change would be more efficient. A Six Sigma Black Belt can analyze the outputs of process mining and identify the process owners who need to, for example, use poka-yoke to mistake-proof processes, create dashboards to illuminate those who are not following the process, or simplify the ways a user can enter a process. Without a solid grounding in process improvement and Six Sigma, the power of process mining can take a team on expensive and time-wasting wild goose chases.
Leverage the abilities of process mining to derive actual processes, pain points, and opportunities for improvement based on the system logs. Only then should SMEs be interviewed: we have already captured most of the process details and can ask specific questions to build out non-system activities and understand why activities are done in a particular manner. Beyond reducing the burden on SMEs, this approach also ensures the process is 100% documented from the system perspective – we are not relying on SMEs to remember everything done in a system.
We then leverage Six Sigma mindsets and tools to drive the adoption of these changes to the processes and systems impacted. The biggest difference is that instead of following the DMAIC approach, a waterfall-based sequential process, we follow a more agile workflow. Since we have already connected to the source systems and modeled the process, we can start in the middle at “Measure” and then cycle through the Define, Measure, and Analyze phases until we are confident in the updates to the process.
In practice, this is very similar to DMAIC in traditional Six Sigma, as the Define phase is where we create a hypothesis and then work to confirm it with data. In a process-mining-first approach, we already have all the data – so instead of confirming the impacts, we work backward to confirm the business reasons for these activities. This cyclical approach ensures that we understand the entire process and can drive improvements in the most effective manner.
Once these updates have been made to the process or system, as a closeout we want to ensure we are realizing the expected benefits and have improved the process as planned. In traditional Six Sigma, this would require another time study – a short-duration review of the process. With process mining, we can automatically create control charts and alerts for when we deviate from the process, and then further refine the future-state process. Instead of running an improvement project, we move to an improvement program in which we continually monitor and improve the process.
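As a rough sketch of what such automated monitoring looks like – the cycle times and limits below are illustrative, not from any real system – a classic 3-sigma control chart over mined cycle times flags deviations as new cases arrive:

```python
import statistics

# Hypothetical daily cycle times (hours) pulled continuously from the mined process.
cycle_times = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2, 4.3, 4.0, 3.7, 4.1]

mean = statistics.mean(cycle_times)
sigma = statistics.stdev(cycle_times)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma  # classic 3-sigma control limits

def check(new_value):
    """Flag a new observation that falls outside the control limits."""
    if not lcl <= new_value <= ucl:
        return f"ALERT: cycle time {new_value}h outside [{lcl:.2f}, {ucl:.2f}]"
    return "in control"

print(check(4.2))   # within limits: in control
print(check(6.5))   # well above the upper limit: alert
```

Because the data feed is continuous, the limits themselves can be recomputed as the improved process stabilizes, which is what turns a one-off project into an ongoing improvement program.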
All things being equal, a company that is more efficient will be more profitable and, ultimately, more successful over the long term. Beyond the efficiency gains, companies that follow this approach will have a much better understanding of their current processes and be able to adapt more quickly to changing market conditions. In the past few years, we’ve seen how SARS, COVID-19, and China trade disputes have dramatically reshaped supply lines; those that can transform quickest will have a competitive advantage.