Putting the digital into formulation
Industry 4.0 is already here and is changing the formulating industry; at Formulation 4.0 delegates heard firsthand from pioneers in the field.
Delegates heard how the UK is leading the way, with major collaborative activities well supported by academic, industrial and government resources, and how individual companies are levelling the playing field by taking advantage of the new technologies.
The UK is leading the way in the application of Industry 4.0 to formulation, and at Formulation 4.0 delegates got an in-depth insight into two of the leading programmes currently underway, as well as the views of the different industry players involved.
The EPSRC Connected Everything: Industrial Systems in the Digital Age network is aimed at providing the tools which will enable Industry 4.0. Professor David Brown introduced us to this aspirational project and some of the tools that may enable Formulation 4.0 for your processes and plants.
CAFE4DM, the Unilever / The University of Manchester led Prosperity Partnership, is pioneering the application of digital manufacturing techniques to formulated products. Jo Cook and Professor Phil Martin told us how the Centre in Advanced Fluid Engineering for Digital Manufacturing (CAFE4DM) is revolutionising the traditional chemical engineering approaches which have been used to incrementally improve the manufacture of formulated products, through the development of new academic and industrial approaches.
CPI is also leading the way, together with the universities of Birmingham, Edinburgh and Leeds, with the ambitious development of a major facility to enhance the manufacture of formulated liquid products. Katharina Roettger explained how the facility will enable companies to innovate in the complex area of manufacturing new multi-component formulations without having to disrupt their current manufacturing until the new formulations have been optimised. A key part of this facility will be the advanced process control needed to enable digital manufacturing; Simon Mazier from Perceptive Engineering talked about how such control allows real-time control of product quality attributes. Dave Berry from CPI explained how the learning from liquid product manufacture is to be applied in a new flexible powder manufacturing system.
Professor David Brown, University of Portsmouth
Professor Phil Martin, The University of Manchester
Katharina Roettger, National Formulation Centre, CPI
Dave Berry, National Formulation Centre, CPI
Simon Mazier, Perceptive Engineering
Sean Bermingham, PSE
Mo Chowdhury, AkzoNobel
Dr Simon Gibbon, AkzoNobel RD&I
Dr Tom Rodgers, The University of Manchester
Dr Bernadeta Pochopien, BP
Dr Helen Ryder, The University of Manchester
Formulation 4.0 - Meeting Summary
Tom Rodgers (The University of Manchester) and Simon Gibbon (FSTG/AkzoNobel) introduced Formulation 4.0. Simon explained the genesis of the meeting and how it was expected that everyone would be a little outside their comfort zone, as today you would hear about everything from mechanical engineering to business engineering, with lots of formulation too. Simon talked about how, at first reading, many of the concepts of the 4th industrial revolution do not appear to fit well with formulated products. So the idea of the meeting was to showcase how Industry 4.0 is being applied to formulation today, the plans for future application within research projects, and how, through open facilities, you could leverage these advances in your own formulation work. Simon then highlighted how the European-funded AceForm 4.0 project had shown that digitalisation is a key part of being able to move towards more sustainable ways of working and will be essential to underpin the circular economy.
David Brown – University of Portsmouth
The technical content of the meeting started at the mechanical end with Connected Everything, an EPSRC-funded network with over 90 groups involved, established after the financial crash of 2008, when the UK realised it needed to enhance its manufacturing base. "Connected Everything – Industrial Systems in the Digital Age" is an EPSRC-funded Network Plus project which aims to identify the key challenges we face as digital technologies transform our industrial systems, and to highlight research excellence relating to these challenges. It is creating and supporting new collaborations between academics from diverse discipline areas to address these challenges, and provides a forum for discussion and knowledge exchange, as well as funding for risky early-stage research projects in partnership with industry.
David highlighted how they are working with both SMEs and large companies, on projects envisioned to lead to autonomous factories. David works extensively with the food industry, from specialist tea bagging to massive dairy processing operations. The cloud is a key part of the vision of the future, not just for storage but also for processing. However, to ensure resilience and allay security concerns, raw data is not passed to the cloud: considerable processing is done locally and only information is passed to the cloud, where aggregated data from many factories is used to provide improved information back to the factory. If access to the cloud is interrupted, production can still continue, as there is sufficient local processing to make the necessary control decisions. The use of artificial intelligence is only possible if training data sets are created which contain the full range of conditions that will be experienced during operation of the factory – so things like start-up and shutdown, but also transients in electric power or variation caused by raw material differences.
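The local-first architecture described above can be sketched in a few lines. Everything here (the feature names, setpoint and gain) is illustrative rather than taken from the Connected Everything projects; the point is that the control decision never depends on the cloud link:

```python
import statistics

def local_features(raw_window):
    """Condense a window of raw sensor readings into summary features.
    Only these features, never the raw data, are sent to the cloud."""
    return {
        "mean": statistics.fmean(raw_window),
        "stdev": statistics.pstdev(raw_window),
        "peak": max(raw_window),
    }

def local_control(features, setpoint=50.0, gain=0.1):
    """Control decision made entirely on-site, so production continues
    even when the cloud link is down."""
    return gain * (setpoint - features["mean"])

def push_to_cloud(features, cloud_available):
    """Best-effort upload for fleet-wide aggregation; the return value
    is never used by the control loop."""
    return bool(cloud_available)

window = [48.0, 51.0, 49.5, 50.5, 47.0]
feats = local_features(window)
action = local_control(feats)                       # local decision
sent = push_to_cloud(feats, cloud_available=False)  # outage: control unaffected
```

The cloud side would aggregate features from many factories and push improved models back down, but a lost connection only costs the fleet-level learning, not production.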
Full measurement and control has been fitted to an SME's spray coating system for bamboo food containers. The combination of high-speed / high-definition cameras, vibration sensors, thermography, and temperature / humidity sensors allows not just standard control functions to be carried out, but also early-stage fault detection and correction, as well as final product quality control. With the introduction of AI in this SME factory, the high-speed camera looking at the droplets of liquid in the spray is able, through analysis of size and colour, to determine whether the coating will pass quality control once applied; of course, this is then rechecked with further cameras at the end of the process.
Dairy filling machines have been fitted with sensors on motors, where local processing using wavelet packet transforms is able to remove the influence of noise and highlight the information which shows locally when a machine is approaching a failure state, and suggest altered operating conditions which will delay or even remove the need for maintenance. As the machines run at 100% capacity, tied to supply contracts with big penalty clauses for non-supply, the ability to remove unprogrammed shutdowns has huge financial benefits. Such a large dairy processing company is also able to compare machine to machine and factory to factory, identify long-term trends and, from these comparisons, continuously bring all machines up to the operational effectiveness of the best in the company.
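As a sketch of the wavelet-packet idea, a hand-rolled Haar packet transform (the simplest wavelet basis; a real deployment would more likely use a library such as PyWavelets and a longer filter) splits a vibration signal into frequency bands whose energies can be compared against a healthy-machine baseline:

```python
import math

def haar_step(x):
    """One Haar step: scaled sums and differences of adjacent samples,
    splitting the signal into low- and high-frequency halves."""
    s = 1 / math.sqrt(2)
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x) - 1, 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x) - 1, 2)]
    return approx, detail

def packet_energies(x, depth):
    """Full wavelet-packet tree: unlike the plain wavelet transform, both
    branches are decomposed at every level. Returns energy per leaf band."""
    nodes = [x]
    for _ in range(depth):
        nodes = [half for n in nodes for half in haar_step(n)]
    return [sum(v * v for v in n) for n in nodes]

# a purely alternating signal puts all its energy in the high-frequency band
vibration = [1.0, -1.0] * 4
energies = packet_energies(vibration, 1)
```

A drift of energy into a band associated with, say, a bearing defect frequency is the kind of local indicator that flags an approaching failure state before it becomes a shutdown.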
David is surprised that equipment manufacturers and integrators have not so far wanted to get directly involved with Connected Everything, so all the work has been retrofitting equipment onto existing systems, rather than integrating at machine build.
However sophisticated the AI is, the weak point is often the human-computer interface. Suitable displays are therefore essential: multiple views should be available, with only the information needed to make the current decision visible, so multi-layer displays for operator, engineer and so on. Equally important is keeping displays clear, through approaches such as a hyper-sphere, where the screen displays a circle with good performance indicated by a signal at the centre; as the quality of operation deteriorates the signal moves towards the edge, so that corrective action can be taken before the line operates outside safe or efficient conditions. Some systems are piggybacking on other media technologies, from augmented reality to games.
Sean Bermingham – Process Systems Engineering (PSE)
PSE's key approach is to identify the mechanistic models which enable formulation. These models can be truly molecularly mechanistic or at a more abstracted level, but they always relate parameters of ingredients and processing to the required properties. Digitisation vs digitalisation – PSE are doing digitalisation, exploiting all the IT tools available.
R&D is enabled by digital design, and manufacture by digital operation. Mechanistic models are typically developed through small-scale experiments, then large-scale experiments for blind testing, at which point the validated models are moved into manufacturing for operation.
PSE have been working on a project called "Systems-based Pharmaceutics" with a number of pharma companies. This aims to provide optimal design with fewer resources, reduced risk and better end-use performance, and relies on rapid configuration, calibration and deployment of mechanistic models using systems approaches across: drug substance manufacturing; drug product manufacturing; drug delivery: organ absorption and pharmacokinetics (PK); and drug delivery: pharmacodynamics. The aim is to understand the factors which affect each stage – failure at any stage of manufacturing is a blocker – so the manufacturing process and the product (formulation) should be designed in parallel to avoid these issues, using "Systems-based Pharmaceutics".
Models and digital twins pervade the entire formulation process: digital twins of the manufacturing process, the drug product and the patient population or individual patient; and, across plan, source, make, deliver and buy, a digital twin of the pharmaceutical supply chain.
The Medicines Manufacturing Industry Partnership's (MMIP) project ADDoPT is essentially delivering a virtual medicine manufacturing system, such that the virtual medicine and process are developed before anything is tested in the real world, allowing more rapid progression from idea to product, as issues are dealt with at the virtual stage.
PSE produces a complete software framework across the key processes of pharmaceutical production – gFormulate, gCrystal, gSolids, gCoas – which map to the different mechanistic unit operations. Each is calibrated per unit operation; for a drying step, for example, you may have data from vapour sorption measurements or individual drop drying data for a spray dryer. These unit operations then combine into a system model. Perceptive Engineering then develop control systems based on these models.
To produce your calibrated, science-based digital twin you may need to look at 10-20 factors, hence the need for access to high-performance computing (HPC). This allows you to understand the complete variability and whether the designed formulation will be manufacturable, effectively debugging and debottlenecking your production in silico without waste. Sensitivity analysis shows which parameter is responsible for how much of the variability – virtual system assurance in the face of uncertainty in knowledge. Ultimately you understand which parameters are responsible for the variation, and whether those parameters can be controlled to a sufficient degree to allow manufacture under sufficient control.
With access to plenty of HPC capacity you can understand the role of variability, but you often don't know how big the variability really is; batch experimental data can be used to understand a continuous process, with kinetic models feeding into sensitivity analysis, and sensitivity analysis applied back to the reaction kinetics.
Virtual experimental design replaces much of real experimentation.
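The variance-attribution step described above can be illustrated with a toy Monte Carlo: freeze one factor at its nominal value and see how much of the output variance disappears. The response model and factor ranges below are invented purely for illustration:

```python
import random

random.seed(0)

def response(surfactant, temperature):
    """Hypothetical product-property model: surfactant level dominates."""
    return 5.0 * surfactant + 0.5 * temperature

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

N = 20000
surf = [random.random() for _ in range(N)]
temp = [random.random() for _ in range(N)]

v_total = variance([response(s, t) for s, t in zip(surf, temp)])
# freeze each factor in turn and measure the variance that remains
share_surf = 1 - variance([response(0.5, t) for t in temp]) / v_total
share_temp = 1 - variance([response(s, 0.5) for s in surf]) / v_total
```

With this linear model essentially all the variability is attributed to the surfactant level, which tells you it is the parameter that must be controlled tightly for manufacture to stay in specification.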
Jo Cook – Unilever
Products of the Beauty and Personal Care division of Unilever fall into two classes: high-volume, enduring products and low-volume, high-turnover products. The biggest gains from leveraging in-silico design and manufacturing are materials efficiency for the high-volume products and rapid optimisation for the high-turnover products. Typical products have between 15 and 30 raw materials, so optimisation of such products, and coping with the variations in often bio-sourced ingredients, are key to successful manufacture. Many of the products need to be innovated rapidly, driven by fashion trends.
To get the benefits across the formulated product lifetime, Unilever has developed two approaches: digital product engineering, a suite of in-silico tools for formulation and processing; and digital manufacturing, a suite of novel measurements and analysis methods for process data.
To exemplify how CAFE4DM would help Unilever, one of the first things Jo did was to define a formulation as "a recipe, with a list of ingredients and the process steps to make the dish" (the product).
One ambition is in-silico-first recipe design: here a suite of models allows many recipes to be tested in parallel without the need to carry out any experimental work, drastically accelerating product development. This is made possible by the in-silico tools putting expert knowledge into the hands of all product scientists; having linked rheology to viscosity and developed rules for scale-up, combined with high-throughput product assembly and performance measurement, the time-limiting step is now data capture.
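A minimal sketch of the in-silico-first idea: enumerate candidate recipes against a structure-property model (here an entirely invented linear stand-in for the real tools) and keep only those predicted to land in specification, before any bench work is done:

```python
from itertools import product

def predicted_viscosity(water, surfactant, polymer):
    """Stand-in for the real in-silico property models: thickener raises
    viscosity, water dilutes it (coefficients are invented)."""
    return 200.0 * polymer + 40.0 * surfactant - 50.0 * water

levels = [0.1, 0.2, 0.3]   # mass fractions to screen for each ingredient
spec = (20.0, 60.0)        # target viscosity window, arbitrary units

candidates = [
    {"water": w, "surfactant": s, "polymer": p,
     "viscosity": predicted_viscosity(w, s, p)}
    for w, s, p in product(levels, repeat=3)
]
in_spec = [c for c in candidates if spec[0] <= c["viscosity"] <= spec[1]]
```

Only the `in_spec` shortlist would go forward to high-throughput product assembly and performance measurement, which is what shifts the bottleneck to data capture.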
Another ambition is quality right first time: by measuring impurities in raw materials and having models for the impact of impurities on final performance, the recipe will be automatically adjusted to keep the performance in specification despite the presence of impurities.
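In the simplest case the compensation step reduces to rescaling the raw-material charge by its assayed purity; the numbers below are illustrative:

```python
def adjusted_charge(nominal_charge, nominal_purity, measured_purity):
    """Scale the raw-material charge so the delivered active content stays
    on target despite batch-to-batch purity variation."""
    return nominal_charge * nominal_purity / measured_purity

# the recipe assumes 95%-pure surfactant; this delivery assays at only 90%
charge = adjusted_charge(10.0, 0.95, 0.90)  # kg of raw material to add
delivered = charge * 0.90                   # active actually delivered, kg
```

The delivered active stays at the nominal 9.5 kg. In practice the models would also account for the impurities' own effect on performance, not just the dilution of the active.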
A further ambition is process to end-point: instead of fixing processing parameters based on extensive process development, real-time measurements of a relevant physical property would be used to determine when a process has reached completion, offering the opportunity to shorten the production process.
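A sketch of end-point detection: stop when the in-line measurement has stopped moving, rather than after a fixed time. The window size, tolerance and readings below are all illustrative:

```python
def reached_endpoint(history, window=3, tol=0.5):
    """Declare completion once the last `window` in-line readings span
    less than `tol`, i.e. the monitored property has plateaued."""
    if len(history) < window:
        return False
    recent = history[-window:]
    return max(recent) - min(recent) < tol

# simulated in-line property readings during a batch
readings = [10.0, 25.0, 34.0, 39.0, 41.0, 41.3, 41.4]
stop_at = next(i for i in range(len(readings))
               if reached_endpoint(readings[:i + 1]))
```

Here the batch would be stopped at the seventh reading, as soon as the property plateaus, instead of running to a conservative fixed end time.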
CAFE4DM is split into a series of work packages. Jo leads WP2, which is tasked with developing the structure-property relationships needed to underpin the in-silico design; these come from sophisticated measurement techniques, machine learning from performance data and fundamental ab-initio calculations. A typical structure-property relationship is the viscosity of isotropic liquids: 30 components chosen from a palette of hundreds, measuring the structure of surfactants and simulating structural features, and machine-learned group contributions, all leading to prediction of properties.
Phil Martin – The University of Manchester
Industry 4.0's digital twins are models for science. However, as the same ingredients can give very different viscosities on different occasions, the modelling needs to take account of the process – i.e. the complete effect of fluid flow, temperature, pressure and anything else that might determine the microstructure the ingredients adopt. Dave Prosser and Tom Rodgers are developing models which deal with this complexity, integrating DPD (dissipative particle dynamics) with CFD (computational fluid dynamics) via constitutive equations to join the longer time and length scales up to continuum models.
Traditional scale-up has relied purely on achieving the same turbulence; this is not enough to give identical microstructures in these complex products. So process analytics are used to measure in line, accelerating engineering design validated by electrical resistance tomography, which in turn validates the CFD calculations and laboratory NMR measurements.
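For contrast, the traditional rule of thumb fits in two lines: with geometric similarity and turbulent flow, specific power scales as P/V ∝ N³D², so holding P/V constant fixes the large-scale impeller speed. The point above is that matching this single number does not guarantee the same microstructure:

```python
def speed_for_constant_power_per_volume(n_small, d_small, d_large):
    """Impeller speed at the large scale that keeps power per unit volume
    constant, assuming geometric similarity and turbulent flow:
    P/V ~ N**3 * D**2  =>  N_large = N_small * (d_small / d_large)**(2/3)."""
    return n_small * (d_small / d_large) ** (2.0 / 3.0)

# 300 rpm with a 0.1 m impeller, scaled up to a 0.5 m impeller
n_large = speed_for_constant_power_per_volume(300.0, 0.1, 0.5)  # roughly 103 rpm
```

Constant P/V leaves other quantities (tip speed, shear rate, circulation time) unmatched, which is exactly why in-line measurement and validated CFD are needed for these structured products.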
One of the ongoing challenges is how to get a probe into a vessel without disturbing the process; while this is a long-standing measurement challenge, CFD is now capable of designing probes which don't affect the process. An ability to deal with less precise in-line versus very precise off-line measurement – viscosity vs rheology, low-field NMR in line instead of high-field NMR off line – is being developed. Density Functional Theory (DFT) is being used to understand Raman signals, with predicted vibrations identifying the impurities present.
A further approach being taken within CAFE4DM, to ensure that the approaches developed give the greatest benefit, is the study of behavioural change – from operators controlling the factory, to leaders using the information to make quicker decisions, to better data visualisation and even complete organisational change.
Our two exhibitors gave brief talks on their products. Phil Kay from SAS explained how JMP allows information to be extracted effectively from complex, non-linear formulation data, and Nektaria Servi from SMS showed the capabilities of their systems for measuring vapour sorption over a wide range of conditions and on a range of different sample types.
Katharina Roettger – CPI
The Prospect CL project involves CPI, Perceptive Engineering and the universities of Birmingham (where the kit currently is), Leeds and Edinburgh. It is aimed at providing real-world, scalable, predictive tools for complex liquids. The project will demonstrate predictable scale-up, validate new process sensor technology, validate new process technology and validate new formulations. The predictive scale-up/scale-down approach will be validated across 0.1 L, 1 L, 10 L, 100 L and 1000 L vessels. A design of experiments (DoE) looking at high internal phase emulsions showed that a model based on the D90 droplet size measurement was not good enough to control the properties – more complex particle size distribution parameters are needed. Pseudo-random binary sequence (PRBS) experiments for MPC development give control of particle size and viscosity from one-step-ahead real-time predictions; using this approach on 100 L equipment gave similar results to those seen in the laboratory-scale DoE.
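A PRBS is typically generated from a linear-feedback shift register. The sketch below uses a 4-bit maximal-length register (polynomial x⁴ + x³ + 1, period 15) purely to show the mechanics; an industrial identification experiment would use a longer register and map the 0/1 levels onto low/high actuator moves:

```python
def prbs(length, seed=0b0001):
    """Pseudo-random binary sequence from a 4-bit Fibonacci LFSR
    (x^4 + x^3 + 1). The signal is rich in frequencies yet exactly
    repeatable, which is what makes it useful for identifying the
    process dynamics behind an MPC model."""
    state = seed
    out = []
    for _ in range(length):
        out.append(state & 1)
        feedback = (state & 1) ^ ((state >> 1) & 1)
        state = (state >> 1) | (feedback << 3)
    return out

sequence = prbs(30)  # two full periods of the length-15 sequence
```

A maximal-length register visits every non-zero state once per period, so each period contains 8 ones and 7 zeros, keeping the excitation nearly balanced around the operating point.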
Next steps: quantitative validation of the DoE at the 100 L scale, scale-up of MPC through adaptive modelling, and testing the predictive scale-up approach on a new model system.
Simon Mazier – Perceptive Engineering Ltd
Perceptive Engineering carries out advanced process control, enabled by the Perceptive APC Software Platform: data import/export, quality monitoring and analysis, SPC monitoring, multivariate modelling, process monitoring, MPC and continuous/batch optimisation.
The aim is not just to model the process but also to model the product, and to use both to control the process – lean model management combined with agile processes. Context is everything: machine learning can learn the wrong thing.
A further driving force is that pharma is moving from batch to continuous, while also aiming to make personalised medicines a reality – so predictive models are required to allow on-the-fly manufacturing switches. The vision is scalable blocks, allowing both scale-up in production and the push towards personalised medicines.
Lean model management requires both the development and validation of models, which must be robust enough to cope with all process / raw material variation.
Through analysis of large quantities of process data it is possible to identify the inflection point that indicates the process is about to go out of specification. This allows the process to run towards out-of-specification conditions, but by reacting when the inflection point is detected there is time to respond and bring the plant back to optimum before product goes out of specification – adaptation.
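Curvature is one simple way to spot such a turning point early: the second difference of the monitored signal changes sign before the signal itself reaches a limit. A sketch with invented data (a real system would work on filtered multivariate trajectories rather than one raw series):

```python
def inflection_index(values):
    """Return the first index where the second difference (discrete
    curvature) changes sign, or None if the trend never turns."""
    second = [values[i + 1] - 2 * values[i] + values[i - 1]
              for i in range(1, len(values) - 1)]
    for i in range(1, len(second)):
        if second[i] * second[i - 1] < 0:
            return i + 1  # index back in the original series
    return None

# an accelerating rise that starts to level off mid-series
trend = [0.0, 1.0, 3.0, 6.0, 10.0, 13.0, 15.0, 16.0]
turn = inflection_index(trend)  # detected at index 4
```

Reacting at the detected turn, rather than when a specification limit is actually crossed, is what buys the time to steer the plant back to optimum.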
Through the use of a pseudo-random binary signal to discover error in the model, and then a second control loop to spot where the first control loop is failing, an adaptive model is developed which maintains control even when non-optimum – one loop course-corrects the other.
Real-time manager architecture with cloud-based model management – don't treat it as a black box; show it to operators so they can use the data.
Predicting QC results allows product to be shipped before the QC results come in, saving 25 minutes on a 2-hour batch.
Dave Berry – CPI
The talk started with a couple of quotes: "All models are wrong, some models are useful" (George Box, statistician) and "The future is already here – it's just not very evenly distributed" (William Gibson, writer). To my mind, science could be substituted for models in the first, and the second was very much the driver for this meeting: we suspected that Industry 4.0 is already here in some parts of the formulated product industry but not in others. The need for scalable, agile, continuous manufacture requires real-time alteration of processing parameters.
MPP models particulate processing: a twin-screw wet granulation process on a GEA ConsiGma 1, which scales easily to the 25.
To get enough signal, particles sticking on the sensor window and then dropping off may be ideal, rather than free flow.
Doing parts of this is fine – you don't need to do everything.
Paint faces microbiological challenges due to the drop in allowable concentrations of antimicrobials, as they cause skin sensitisation.
The microstructure of products can change very rapidly – what effect does this have?
Continuous manufacture has the issue of change-over, and of long-term contamination causing deterioration in performance, with the change happening slowly; lots of due diligence is needed – you need to know about the process, the product and their interaction.
Cleaning task 2
Joining things together means converging technologies – some have measurement but no models, others models but no measurement – so there is an infrastructure piece.
Pharma needs targeted medicines: continuous manufacture now allows flexibility in batch size, and release of batches is easier in a continuous process than in batch; a tighter specification also makes it much harder for a generics manufacturer to make the drug.
Build validation into the process, so that the products can then be changed.
Dick Boddy – Statistics for Industry
Statistics for digital formulation – training for scientists. Stepwise multiple regression is a workhorse for dealing with experimental data, but correlated variables need to be avoided, and bad experimental design will make the solution more difficult. Ideally, proper design of experiments (DoE) approaches should be used, as these give zero correlation between variables and allow interactions to be unambiguously identified. In circumstances where the input variables show many inter-correlations, principal component analysis (PCA) or factor analysis should be used. Techniques such as cusum plots and autocorrelation plots are also important.
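The zero-correlation property of a proper design is easy to check numerically: in a coded two-level full factorial, every pair of main-effect columns (and their interactions) is orthogonal, which is why the fitted effects are unambiguous:

```python
from itertools import product

def correlation(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# coded 2^3 full-factorial design: 8 runs, each factor at -1 / +1
design = list(product([-1, 1], repeat=3))
a = [run[0] for run in design]
b = [run[1] for run in design]

r_main = correlation(a, b)                                     # main effects
r_interaction = correlation(a, [x * y for x, y in zip(a, b)])  # a vs a*b
```

Both correlations are exactly zero; in an unplanned (happenstance) data set they generally are not, which is what makes stepwise regression on such data treacherous.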
Data allows you to stay ahead of the curve, and the internet of things (IoT) means that data is no longer scarce, whether that is very local weather data or process parameters. AkzoNobel has developed new business models: in decorative paints, the Visualiser app allows everyone to be their own interior designer, and in heavy-duty marine paints a computer system is able to predict when anti-fouling coatings need to be cleaned or repaired.