I was sparked into arranging this discussion by a question that Stuart Clarke asked at the joint SCI/RSC Colloids / FSTG Meeting in January, and a subsequent discussion with Paul and Stuart Clarke. Colin Bain then suggested that such a topic would resonate with Tom McLeish, so Simon managed to persuade Paul, Stuart and Tom, as academics, to act as futurologists on what has traditionally been a very industrially led topic - very much feeding them to the lions. They were all suitably provocative, largely painting a picture of revolutionary formulation needing to go back to first principles to rebuild formulation, with one of the challenges being to retain the vast pragmatic experience which exists within different industries.
Tom McLeish gave us a verbal view of revolutionary formulation based on experience from the past, where the formulation / design of long chain polymer melts has gone from an entirely empirical process to one that is now driven by precise understanding of the role of topology and structure in generating desired rheological behaviours. Tom explained how this had taken an act of faith on the part of all concerned: industry allowing academics to work on understanding pure materials which appeared to have little industrial relevance, and academia having faith that if they delivered new understanding industry would be willing to validate this new science. After 25 years industry has a robust, reliable product design process and academia has a new understanding of polymer melt rheology. To make this happen industry had to fund research, not just do work / supply materials in kind, and funding bodies had to provide at least matching funding. Tom had a nice analogy: instead of going through the industrial forest of polymeric complexity, he was able to climb up the in-silico knowledge mountain and hence see the way through the forest from above.
So for formulation science Tom feels that first we need to believe there is a theory of formulation, or a set of theories, and go mountain climbing to start looking for the big rules, the big structure and the high-dimensional spaces of mixing; to know what is linear and what is non-linear; to ask whether there is something that looks like thermodynamics; and to see whether we can find the short cuts / empirical strategies. Tom fully acknowledged that his example is trivial compared to much of the formulation discussed during FF, so we will need to keep the academic / industrial collaboration together for 25 years at least. Tom wondered how his friends at EPSRC would cope with this, but there is already a great example in the UK Quantum Technologies Plan, under which BEIS has brought together Innovate UK and the Research Councils for a 25-year programme. So can formulation bring together communities that cross between academia and industry, sustaining a 25-year engagement that ensures an employment-rich, innovation-based industry grows in the UK and doesn't depart for global shores?
I guess my unhelpful immediate reaction was along the lines of "it's formulation science, Tom, but not as we know it", which I now regret, as the more I think about Tom's vision the more it resonates with my experiences.
Paul Bartlett took an apparently different approach, with a couple of generic questions to which he posed some answers.
The first question is one of unity / focus. EPSRC managed to engage a very diverse set of scientists and engineers in the FFCP grants, so are they all doing formulation science, and do we even know what formulation science is? Once we are sure we are doing formulation science then it by definition spans many industries, where formulation is often the key IP which companies hold, so how do we achieve the degree of openness needed to spot the generic questions and the generic knowledge bases? The FFCP projects are highly focused on addressing their specific future formulation challenges, but are we missing the generic?
The second question is whether we are focussing too much on making things. Making things is obviously important, but in fact we need to understand things across time, from raw materials to long term product stability. So essentially, is there a way to produce a theory which tells us how the important parameters evolve over time, so we can measure things and make predictions of, for example, long term stability - a 1-day stability test to predict many-year performance.
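The kind of short-test-to-long-prediction extrapolation hinted at here can be illustrated with a classic Arrhenius-type accelerated-stability calculation. This is a hypothetical sketch, not anything presented at the meeting: the degradation model, the 2% loss figure, the temperatures and the 80 kJ/mol activation energy are all invented for illustration, and real formulations rarely follow a single first-order mechanism this cleanly.

```python
import math

def shelf_life_days(k_accel_per_day, T_accel_K, T_store_K, E_a_J_mol, loss_limit=0.1):
    """Extrapolate a first-order degradation rate measured in a short
    accelerated test down to storage temperature via the Arrhenius
    equation, then return the time to reach the allowed fractional loss.
    Assumes a single mechanism with one activation energy throughout."""
    R = 8.314  # gas constant, J/(mol K)
    # Arrhenius scaling: k_store = k_accel * exp(-Ea/R * (1/T_store - 1/T_accel))
    k_store = k_accel_per_day * math.exp(-E_a_J_mol / R * (1.0 / T_store_K - 1.0 / T_accel_K))
    # First-order decay: fraction remaining = exp(-k t); solve for t at the loss limit
    return -math.log(1.0 - loss_limit) / k_store

# Invented example: 2% active lost in a 1-day test at 50 C (323.15 K),
# extrapolated to 25 C storage with an assumed Ea of 80 kJ/mol.
k_accel = -math.log(1.0 - 0.02)  # first-order rate implied by 2% loss in 1 day
print(round(shelf_life_days(k_accel, 323.15, 298.15, 80e3), 1))  # roughly 63 days to 10% loss
```

The point of the sketch is the gap it exposes: the whole prediction hangs on the assumed mechanism and activation energy, which is exactly the kind of "theory of how parameters evolve over time" Paul is asking for.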
Paul felt that in order for the computer simulations discussed during the meeting to start to fill some of our gaps in understanding, they will need to operate at a more coarse-grained level, where we can cope not only with the time evolution of a formulation, but also with the stochastic variability across different batches.
The future vision is that we go from shelf-reaching to predictive science. All sorts of pressures are making this imperative: the need to remove materials of concern; the need to improve the robustness of products so they can cope with variability; and, as described for biopharmaceuticals, the fact that formulation is no longer the last step in the drug development process, drastically increasing the cost of getting it wrong.
We have accepted that formulation science has to deal with complexity - formulations with 20 or 30 components, where no one gets sacked for adding a new ingredient. To achieve a robust predictive science, simpler, more effective formulations are both needed and will be a by-product of this predictive formulation science. Biology works with a very small set of components but links them together in very sophisticated / adaptive ways to produce the correct properties for the environment. Our formulations are crude: they do something at time 0 and ideally still do that same thing at time 1 year.
Paul is not convinced that high throughput will get us a revolution in formulation - it just reaches the bottle faster. For discovery we need a new synthetic chemistry which will connect molecules across length scales to give us the additional control needed to produce the properties required.
Slide 1 - Generic features not addressed to date?
Free energy pathways for shelf-life predictions
Long-time predictability from short-time lab experiments
Novel coarse-grained simulation techniques
Data analytics for formulation variability / design
Slide 2 - Future demands on formulation science
"Bucket" chemistry -> Predictive science
Formulation "environment" expanding
Complexity - is it necessary - Adaptive?
Stuart Clarke described himself as a very surprised academic from Cambridge to find himself on a formulation panel. However, he has experience of formulation issues through the 90's DTI-Link programme, and day to day he is an academic surface scientist understanding how molecules adsorb.
Stuart felt that formulation science being built on a reductive model, where models of formulations are built from understanding pure components, has meant that generic models are rarely found or used. The similarities to the situation in the 90's, albeit with lots of nice advances, highlight how, as Tom said, this is a 25-year challenge even to address some of the simpler systems, so there is an ongoing need to involve funding bodies, as it can be a challenge to get industries' sight beyond the next 6 months to the next product. Industry engagement is nonetheless key to keeping the academics "honest" - focused on the big industrial challenge and not side-tracked into nice science. Only with these generic models would we be able to effectively screen materials / optimise formulations in-silico to drive true predictive formulation, and these models will need to incorporate the formulation expertise that already exists in industrialists' minds.
Nice things have happened over time, and our capability to understand these complex systems is definitely improving - electron microscopy, synchrotrons, AFM-IR - across different length / time scales, chemical complexity, and temperature / pressure / shear. With a huge parameter space of 20 / 30 components and their synergies / antagonisms, is it ever going to be realistic to understand these complex systems? The Clever Characterisation for Smarter Formulation meeting is very relevant here. In-silico is very attractive, but commercial systems are very complex - is it feasible? We need to keep the chemistry but go across time / length scales - will it ever be accurate enough to avoid screening?
Theory plus modelling from a multi-disciplinary team - hunt as a pack - we need to be around the same table to understand these problems. A different approach is systems formulation: we will never deal with the full complexities, and single components are often themselves complex (different molecular weights etc.). So step back and concentrate on the parameters which determine the properties of interest; we won't know what a specific component does, but we know about the group / class of molecules which do things - not a molecular approach but an effective approach.
Slide 1 - Reductive Approach
Understand components -> build models ('generic understanding') -> predict materials / screen / optimise
Basic problems similar in many ways to those of 20 years ago
(some nice advances!!)
-> Need longer term vision / structure
EPSRC / Industry / Academia -> maintain know-how / experience
How to keep on agenda (if important but not 'sexy')?
-> Advances for the future
Exp: growth in capability (Electron microscopy/synchrotrons/AFM-IR etc.)
Lengthscales / timescales / chemical insight / in-situ P, T, gamma
Huge parameter space and synergistic etc..
Clever Characterisation for Smarter Formulation II - 6th November, RSC
Modelling: very attractive but complex commercial systems challenging
Include chemistry/many components/bug systems/shear etc.
When will this be accurate enough to make screening unnecessary?
Chemistry/Chemical Engineering / Applied Maths / Physics / Materials Science
Slide 2 - Alternative Approaches
E.g. 'Systems Biology' -> 'Systems formulation'
Will never be able to deal with full chemical and physical complexity (solve a sub-set of 'easy' problems)
Forget details - localise on the parameters that matter (affect the physical properties we care about).
High throughput studies -> key parameter analysis
(Robin's talk today)
'understanding' but not 'molecules'...('effective' components?)
I can see a lot of commonality between the three visions: negative formulations / removing components (what did the other 15 components do?); using the big theory of formulation to drive simplification; and the need for a UK formulation community.
In terms of the formulation network, there is the possibility to apply for a network grant from the EPSRC - mainly to support network building.
Other countries treat formulation differently: France has a larger academic base; in the UK it is "chemists in industry"; and Germany takes a more elaborate approach across the different Max Plancks / Fraunhofers / universities.
UK formulation science is strongly rooted in industrial expertise; we need to move to a wider base with more talking across the community - would a stronger academic community help this? However, it is important we don't just have the conversation, but also capture the knowledge. In general we don't do the analysis of accidents - i.e. when a formulation fails you don't get the HAZOP-type capture of what went wrong in the way you do in chemical process safety. To allow this process to occur, with its inherent knowledge capture, industry has to give up a bit of space (secrecy) to give academia some room to synthesise.
CPI is getting industry together, which is starting to give a mechanism for translation of formulation science (knowledge / expertise) between industries.
The UK has an active professional formulation community (FSTG / Joint Colloid Group / IChemE manufacturing etc.), and a developing corporate community; how do we bridge across from the professional formulation discussions into the corporate formulation discussions?
The discussion could be seen to imply that industry doesn't know what it is doing; in fact a lot of industries do understand what they are selling, so we are not starting from a zero-knowledge state, and any revolution will need to take this knowledge forward. The huge opportunity is to go to predictive design.
In pharmacy you need to get the experimental data in order to get regulatory approval, so in-silico design does not help on its own - academia therefore needs to engage with regulation to allow innovation.
Regulation could be seen as stifling innovation, but in fact COSHH / REACH drive innovation by forcing removal of components.
Regulation / specification tests are often used in the development process, and while this obviously aids final acceptance / approval, they are often inappropriate in that they are either too complex, too slow or require a completely formulated product, and as such they increase the resource needed for formulation development or slow the process. There is a need for development tests which guide formulation development; these could be physical tests, indications from sophisticated analytical equipment or even in-silico simulations - effectively screening formulations so that fewer are put through the regulation / specification tests.
Many of the examples in both the presentations and the discussions were around performance properties which can be easily measured, e.g. rheology, allowing measurement to drive optimisation and giving the opportunity for acceleration. The performance of many formulations is complex and not easily measured, e.g. mouth feel / skin feel, which makes it difficult to drive formulation.
Many raw materials are also not constant - they vary batch to batch / season to season - and any revolutionary formulation science will need to be able to deal with these variations.
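One simple way to reason about such variation is to propagate assumed batch-to-batch distributions through a property model by Monte Carlo sampling. The sketch below is purely illustrative: the `viscosity` function, the nominal levels and the spreads are all invented, standing in for whatever measured or predicted relationship a real formulation would use.

```python
import random
import statistics

def viscosity(polymer_frac, surfactant_frac):
    # Invented placeholder property model (arbitrary coefficients) -
    # a stand-in for a real structure-property relationship.
    return 50.0 + 400.0 * polymer_frac + 150.0 * polymer_frac * surfactant_frac

random.seed(1)
samples = []
for _ in range(10000):
    # Assumed batch-to-batch spread around nominal ingredient levels
    p = random.gauss(0.05, 0.004)   # nominal 5% polymer, s.d. 0.4%
    s = random.gauss(0.10, 0.010)   # nominal 10% surfactant, s.d. 1%
    samples.append(viscosity(p, s))

# The spread of the property, not just its nominal value, is what a
# robust formulation has to tolerate.
print(statistics.mean(samples), statistics.stdev(samples))
```

Even this toy version makes the point that a predictive formulation science has to deliver distributions of performance, against which robustness can be judged, rather than a single nominal answer.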
Formulators are looking for unexpected synergy; how will you ever get this from in-silico? You won't see the surprises, so it will never replace laboratory formulation - rather it gives guidance / a picture of how things interact. Physical science will find synergies, but won't connect them to the full formulation performance.
It is surprising that there has been no mention of sustainability in the presentations or discussions - surely the biggest challenge formulation faces is the need to change the palette of materials to drive formulation towards a circular economy.
The formulation community will need to be interdisciplinary, but it will also need to span different industries. Different industries don't speak the same language, so how does one industry spot its solution in another? Aerospace does this well, but perhaps it is easier with common performance criteria.
In general, while processing has been mentioned today, processing aspects have been understated, and they are absolutely key to successful formulation - there is a need not only for revolutionary formulation science, but also for revolutionary formulation engineering.
At the end of the discussions we were left with a lot to think about and one big open question - perhaps formulation is always going to be evolutionary.
Personally I believe that there is the potential for a new approach; for example, I suspect that many formulations are more complex in terms of number of components because the processing is in fact too simple. To take a simple example, a processing aid is often added to enable efficient manufacture; I suspect that if the time / temperature / shear behaviour of the major components and their interactions were better understood, then it would be possible to find a set of manufacturing conditions which allows this component to be eliminated from the formulation.
So the revolution will be that a component is only added once it has been shown either that it is essential for performance, or that no manufacturing conditions exist under which the product could be made without it. This sounds obvious - like what we do now - but in fact it will only be possible once we can combine the theory to direct our formulation with high-throughput techniques that allow us to explore complex parameter spaces fully, and use the advances in characterisation to understand the structure of our formulations.