Simon Copley offers some valuable tips for successful process modelling in FMCG manufacturing
As manufacturing consultants, we are often asked by FMCG brand owners to help them understand their complex production processes so that they can improve performance or forecast behaviour with new products or formulations.
FMCG manufacturers typically face a number of common challenges, such as manufacturing new products with existing production assets, reducing production costs and energy consumption, and increasing throughput.
Addressing these challenges relies on a strong knowledge of the current process’s underlying physics and chemistry, but manufacturers often lack this level of detail in what is a vital building block of innovation. One valuable way to gain this understanding is to build a model or simulation of the process.
In an ideal world, every manufacturer would have a perfect computer model that captures every last detail of their process. It could be used to understand the impact of any alteration before it has even been made, or to determine compatibility between current manufacturing assets and any new product designs or recipes. Without this level of detail, manufacturers are often reliant on expensive and lengthy trial-and-error testing, needing either a pilot setup or costly downtime on the main line.
In the middle ground, an imperfect model is often the best investment. Drawing conclusions from a model is quicker and cheaper than testing, but these judgements can only be trusted if the model is sufficiently detailed. This article shares some of our recommendations and experience gained from helping clients build imperfect models of their process.
Why do you need a process model?
Models can be used for many purposes, including:
- to optimise processes for better output quality, throughput and profit;
- to diagnose the cause of issues, scrappage and stoppages; or
- to understand the likely cost of altering equipment to manufacture new products.
Without a model, the team may not necessarily agree on how the process is currently functioning. It is often assumed that operators, R&D teams, and other stakeholders have the same understanding of a manufacturing process. But we typically find knowledge ‘silos’, equipment evolution and operator-created workarounds all cloud understanding.
For example, during one recent project we received multiple differing opinions on how things worked: the line operators’ description of a process step contradicted information from the original designers of the line, who had had little involvement during the process’s evolution over the previous decade. Both viewpoints were invaluable to the project, but had the company maintained an up-to-date process model, they might never have diverged.
What is the business case?
Given the expense of modelling, it’s important to match the model’s level of detail to the business need. It will never be cost effective to develop a detailed, near-perfect model of an aspect of the process that has little effect on final product quality. Equally, a model returns poor value if it is too simplistic for the high-value or safety-critical decisions it was intended to inform.
Which inputs should be modelled?
Of course you don’t have to model everything, but you do have to model the right things, because even for very simple processes a surprising number of input parameters affect the output quality.
The most obvious might be those that are controllable, such as conveyor speeds, temperatures, ingredient ratios, product residence times, or other machinery parameters. But don’t neglect less obvious inputs such as ingredient variability, ambient conditions or air flows within equipment, to name but a few.
From experience, the right parameters can be identified by:
- Involving day-to-day operators throughout, so that nothing is overlooked.
- Bringing in a fresh pair of eyes to question assumptions and highlight ‘forgotten’ inputs, which is often where accuracy is forfeited!
- Listing and ranking all input parameters by their expected impact (a simple screening approach is sketched after this list).
- Prioritising the highest impact parameters for inclusion, then iterating the model to include others when further accuracy is needed.
- For vital yet overly complex inputs, identifying whether proxies could simplify the challenge. Simple experimentation may also help.
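To make the ranking step concrete, here is a minimal sketch of a one-at-a-time sensitivity screen in Python. The process function, parameter names and variation ranges are all hypothetical placeholders, not figures from any real line.

```python
# One-at-a-time sensitivity screen: perturb each input in turn and
# rank inputs by their effect on the modelled output quality.
# The model and all parameter values below are hypothetical.

def process_model(params):
    """Toy stand-in for a real process model (output in arbitrary units)."""
    return (0.8 * params["seal_temp_c"]
            + 5.0 * params["dwell_time_s"]
            - 0.3 * params["ambient_temp_c"])

# Nominal settings and the plausible variation of each input.
nominal = {"seal_temp_c": 180.0, "dwell_time_s": 0.5, "ambient_temp_c": 22.0}
variation = {"seal_temp_c": 5.0, "dwell_time_s": 0.1, "ambient_temp_c": 8.0}

baseline = process_model(nominal)
impacts = {}
for name, delta in variation.items():
    perturbed = dict(nominal, **{name: nominal[name] + delta})
    impacts[name] = abs(process_model(perturbed) - baseline)

# Rank inputs by impact: model the top of the list first, iterate later.
for name, impact in sorted(impacts.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {impact:.2f}")
```

Even a crude screen like this makes the prioritisation discussion concrete: the team debates the impact estimates rather than competing opinions.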
In a recent project to improve a heat-seal process, this approach allowed us to model the effect of temperature variations with a simple yet powerful thermal model. Ranking the sources of thermal loss meant we could focus on the most significant ones, quickly reaching a level of accuracy that gave confidence in our conclusions. A number of alterations were then modelled, and the best-performing were implemented, delivering significant improvements to the consistency of the process.
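To give a flavour of what a ‘simple yet powerful’ thermal model can look like, the sketch below uses a single lumped-capacitance energy balance with one dominant convective loss. Every value, and the choice of loss term, is a hypothetical illustration rather than data from the project described above.

```python
import math

# Lumped-capacitance sketch of a heated seal jaw losing heat between
# cycles. One dominant (convective) loss gives an exponential decay with
# time constant tau = m * c_p / (h * A). All values are hypothetical.
mass = 0.01        # jaw mass, kg
c_p = 900.0        # specific heat capacity, J/(kg*K)
h_conv = 50.0      # convective loss coefficient, W/(m^2*K)
area = 0.01        # exposed surface area, m^2
t_ambient = 22.0   # ambient temperature, degC
t_start = 180.0    # jaw temperature at the end of heating, degC

tau = mass * c_p / (h_conv * area)   # 18 s with the values above

def jaw_temp(t_seconds):
    """Jaw temperature t_seconds after heating stops."""
    return t_ambient + (t_start - t_ambient) * math.exp(-t_seconds / tau)

for t in (0.0, 1.0, 2.0, 5.0):
    print(f"t = {t:4.1f} s  ->  {jaw_temp(t):6.1f} degC")
```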
Up to now we have only discussed fully automated processes with high consistency. But you can’t assume consistency for processes requiring human action, such as food assembly lines with many operators. For these cases, it’s important to consider the variability of staff performance, and if necessary, take measurements to allow you to judge the impact of these production steps on the overall process.
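Where manual steps matter, one pragmatic option is to sample measured cycle-time distributions in a small Monte Carlo model. The sketch below illustrates the idea; the two-step line and both distributions are hypothetical stand-ins for real measurements.

```python
import math
import random

# Monte Carlo sketch of a two-step manual assembly line. The cycle-time
# distributions are hypothetical stand-ins for measured operator data.
random.seed(1)

def line_cycle_time():
    # Manual placement: positively skewed, so modelled as log-normal.
    place = random.lognormvariate(math.log(4.0), 0.25)   # seconds
    # Manual check: roughly normal, floored at a physical minimum.
    check = max(1.0, random.gauss(2.5, 0.6))             # seconds
    return place + check

samples = sorted(line_cycle_time() for _ in range(10_000))
mean = sum(samples) / len(samples)
p95 = samples[int(0.95 * len(samples))]
print(f"mean cycle {mean:.2f} s, 95th percentile {p95:.2f} s")
```

A percentile view like this matters because it is often the slow tail of a manual step, not its average, that limits overall throughput.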
When is complex simulation needed?
There is a range of powerful software packages available for simulation, including ANSYS, COMSOL and MATLAB, but we often find it more useful to start with a simple Microsoft Excel model. If you can rapidly build an imperfect model that provides 80 percent of the value, that might be all that’s needed. And although Excel is commonly seen as not a ‘proper’ way to do modelling, the fact remains that it is often the quickest and easiest tool for the job.
Unlocking the final increments of accuracy can be much more expensive, though. So while the first 80 percent might be achievable in Python or Excel, you might need more complex tools such as computational fluid dynamics (CFD) to reach 90 percent.
This two-step methodology was applied with great success for a client with a variable-performance cooling tunnel. A spreadsheet model was initially used to understand conditions throughout the tunnel, before it became clear that the actual performance depended on the exact interaction between the product and the cooling air. The Excel model was therefore supported by results from selected use of thermofluidic CFD, allowing us to model reality more closely. The combined model was then used to analyse several improvements, and the best were implemented to deliver significant performance gains and increased line speed.
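As an indication of what the spreadsheet-style first step can look like, here is a minimal one-dimensional marching model of product temperature along a cooling tunnel, the kind of calculation that maps naturally onto one spreadsheet row per step. All geometry and coefficients are hypothetical.

```python
# 1-D marching model of product temperature along a cooling tunnel:
# an explicit Euler step of Newton's law of cooling per tunnel segment.
# All geometry and coefficients are hypothetical illustrations.
tunnel_length = 12.0   # m
belt_speed = 0.2       # m/s
air_temp = 4.0         # cooling air temperature, degC
h_conv = 60.0          # product-to-air coefficient, W/(m^2*K)
area = 0.05            # exposed product area, m^2
mass = 0.1             # product mass, kg
c_p = 3200.0           # product specific heat, J/(kg*K)

n_steps = 100
dt = (tunnel_length / belt_speed) / n_steps   # residence time per segment, s
temp = 35.0            # product inlet temperature, degC

for i in range(n_steps):
    # dT/dt = -(h*A)/(m*c_p) * (T - T_air)
    temp -= h_conv * area / (mass * c_p) * (temp - air_temp) * dt
    if (i + 1) % 25 == 0:
        x = (i + 1) * dt * belt_speed
        print(f"x = {x:5.1f} m  ->  product at {temp:5.1f} degC")
```

A model like this cannot capture the detailed product–air interaction that CFD later resolved, but it is enough to show roughly where along the tunnel a performance problem sits.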
Top tips for successful process modelling
The issues discussed above are all fairly obvious when you take a step back, but we have often seen otherwise well-intentioned modelling and simulation exercises deliver limited results for entirely avoidable reasons.
If you are planning to build a process model then it’s worth considering the following top tips:
- Be clear on what you will use the model for. What is the business case, and what accuracy is required?
- Make sure you involve the right stakeholders: those with day-to-day knowledge of the process, as well as those responsible for the wider business case of the model.
- Identify the parameters with the biggest effects first; focus on these and postpone including others known to have negligible effect. You can always iterate the model if unsure.
- Take a pragmatic approach: if 80 percent of perfection is unlocked by a simple spreadsheet or script, weigh up the marginal cost before investing in a more complex simulation.
- Although FMCG manufacturers often believe they need a complete and accurate model of their production processes, in our experience imperfect models are almost as valuable. Indeed, the payback from them is often much faster than expected, and they can deliver unexpected improvements and insights along the way.