Pharma 4.0 refers to the new tools and processes enabling smart, decentralised production: intelligent factories, integrated IT systems, IoT, and flexible, highly integrated and automated manufacturing systems. The life science industry has been collecting data in large historian systems for over 40 years, and it continues to complete projects to physically connect all devices and systems. Data collection and visualisation to improve the performance of the manufacturing supply chain has long been a goal for the life science industry. In GMP manufacturing, however, novelty is not the point: the aim is to use proven solutions and approaches to achieve new standards of quality and reliability.
There is growing interest in Pharma 4.0 among leaders and decision-makers in the pharma industry. Pharma manufacturers face an ever-present need to remain competitive in a marketplace where product portfolios are diversifying, innovative start-ups are challenging the status quo, supply chain partners are becoming more integrated and patients are more involved in decisions around their care. Realising the promises of Pharma 4.0 will be the market differentiator for businesses competing in this environment.
Change & reward with reduced patient risk
Pharma 4.0 is a revolution, and with any revolution comes change and, ultimately, reward for those that adapt quickest and most effectively.
We’re already seeing examples of revolutionary new thinking in digital-to-physical transfer (transferring digital instructions into the physical world). Johnson & Johnson is working with HP Inc on 3D printing medical devices, including contact lenses, in community settings. Scale is currently a challenge, but it’s only a matter of time before traditional factories become redundant.
The life science industry is reducing risk through the Pharma 4.0 revolution: the new products, processes and services will be cheaper, faster, safer and of higher quality than their predecessors.
Appetite for change
In a 2018 survey of business and operational leaders from across the life science sector conducted by Zenith Technologies, 58% of respondents said that Pharma 4.0 will drive the most change in life sciences over the next five years, more than any other technology area. Interestingly, the same proportion said that digitalisation is their current focus, with only 46% stating that Pharma 4.0 is their current priority.
When asked their motivation for investing in new technology:
77% said they want to save money in manufacturing processes
69% want to save time
62% are aiming for increases in revenue
Only 19% want to understand patients better
Ultimately, the criteria that drive decision-making and investment in the life science industry remain the same: business leaders want to reduce cost and increase efficiency and revenue. This remains true for Pharma 4.0.
In the Zenith survey, respondents were asked about automation; 42% of participants said they are currently ‘very automated’ and 12% ‘automated wherever possible.’ In 5 years, 50% want to be ‘very automated’ and 23% ‘automated wherever possible’. The desire for automation is clear as it offers several operational benefits – faster, more reliable and ultimately cheaper processes – but its adoption has been a decades-long and gradual process.
The potential that Pharma 4.0 holds for automation is massive with individual management processes throughout manufacturing expected to become automated.
A simple example is the response to a temperature gauge giving a higher-than-expected reading during manufacture. An automated system could detect the reading, interpret the event data against previous information, decide on a course of action and rectify the situation. This removes the need for an operator to intervene, assess the required course of action and carry it out. The time saved on each event will often be minimal, but scaled across an entire factory or enterprise, the impact becomes significant.
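The decision logic described above can be sketched in a few lines. This is a minimal illustration, not a validated control strategy: the setpoint, tolerance, similarity window and event history are all assumed values.

```python
# Hypothetical sketch of an automated response to a temperature excursion.
# Thresholds, event history and the corrective actions are illustrative
# assumptions, not a specific vendor implementation.

def classify_reading(reading_c, setpoint_c=37.0, tolerance_c=0.5):
    """Return 'ok', or the deviation from the setpoint if out of tolerance."""
    deviation = reading_c - setpoint_c
    return "ok" if abs(deviation) <= tolerance_c else deviation

def decide_action(deviation, past_events):
    """Compare the current excursion with similar historical events and
    choose the action that resolved the majority of them."""
    similar = [e for e in past_events if abs(e["deviation"] - deviation) < 0.3]
    if not similar:
        return "escalate_to_operator"   # no precedent: a human decides
    actions = [e["action"] for e in similar]
    return max(set(actions), key=actions.count)

history = [
    {"deviation": 0.9, "action": "increase_coolant_flow"},
    {"deviation": 1.0, "action": "increase_coolant_flow"},
    {"deviation": -1.2, "action": "check_heater"},
]

status = classify_reading(38.0)          # 1.0 C above setpoint
if status != "ok":
    action = decide_action(status, history)
    print(action)                        # -> increase_coolant_flow
```

The point is not the trivial threshold check but the middle step: the system consults contextualised event history before acting, which is exactly what requires reliable, time-stamped data.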
Future developments will also allow machine learning algorithms to adjust manufacturing lines and production scheduling much more quickly than with human intervention. New developments will also pave the way for predictive maintenance and the opportunity to identify and correct issues before they happen.
While manufacturing execution systems (MES) are becoming more widely adopted, they have added complexity to manufacturing environments, bringing implementation challenges. In response, long-term thinking sits at the forefront of every project to ensure that the process is cost-effective and continuously delivers value. With the shift from paper-based processes to intelligent, electronic systems that will be accelerated by Pharma 4.0, MES will give more meaningful business insight, remove room for error and enable resources to be better used. For example, instead of just measuring the downtime of a piece of equipment for overall equipment effectiveness (OEE), an MES can provide batch, cleaning, maintenance and operator inputs as context for the downtime, allowing detailed trend analysis and precise preventative measures to be put in place.
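As a rough illustration of downtime with context, the sketch below totals downtime by its recorded cause and derives the availability figure that feeds into OEE. The record fields and categories are assumptions for the example, not an MES schema.

```python
# Illustrative sketch: attaching MES context to raw downtime records so the
# downtime behind an OEE figure can be broken down by cause.

downtime_events = [
    {"equipment": "filler-01", "minutes": 30, "context": "cleaning"},
    {"equipment": "filler-01", "minutes": 45, "context": "maintenance"},
    {"equipment": "filler-01", "minutes": 15, "context": "batch_changeover"},
    {"equipment": "filler-01", "minutes": 20, "context": "maintenance"},
]

def downtime_by_context(events):
    """Total downtime minutes per recorded cause."""
    totals = {}
    for e in events:
        totals[e["context"]] = totals.get(e["context"], 0) + e["minutes"]
    return totals

def availability(planned_minutes, events):
    """Availability component of OEE: run time over planned time."""
    lost = sum(e["minutes"] for e in events)
    return (planned_minutes - lost) / planned_minutes

print(downtime_by_context(downtime_events))
print(round(availability(480, downtime_events), 3))
```

A bare availability number says only that 110 minutes were lost; the contextual breakdown shows which cause dominates and therefore where a preventative measure would pay off.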
As better approaches to recording and accessing data in real time are adopted, production, both in single plants and across global facilities, will be completely revolutionised by increasingly sophisticated and more connected MES systems.
Connecting everything and creating context
The foundation of any change to a manufacturing environment driven by Pharma 4.0 thinking will be contextualised data and connectivity. Every system and piece of equipment needs to be able to record and distribute reliable event data, communicate with other systems and equipment, and subsequently access relevant and reliable data. Once this connectivity is in place, operational teams have the basis for making better choices, or the need to choose is removed altogether as self-learning systems interpret the data and act on it.
It is imperative that data is contextualised for it to be useful – the integrated technologies need to know what the data is and when it was created. A database of time-stamped data is essential, as consistent time data makes every subsequent interpretation and decision simpler and more reliable. To use the temperature gauge example again: when an irregular reading occurs, the automated system looks for data around a previous event that mirrors the current one, and it simply cannot do this reliably if that data is not accurately time-stamped.
By adding context information, such as product or recipe names, process phases or batch identification, to time-series data in process historians, the value of the data for process engineers is greatly increased. However, historians are ‘write’ optimised rather than ‘read’ optimised: they compress new data as they store it, which makes extraction and interpretation arduous. Finding the relevant historical event and building the process context around it can be laborious, requiring manual manipulation of data rather than an automated approach.
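A simplified version of building batch context around historian data might look like the following. The tag names, batch intervals and join-on-time logic are illustrative assumptions rather than any particular historian’s API.

```python
# Sketch of adding batch context to raw time-series points from a historian.
# A historian typically stores (timestamp, tag, value) tuples; the batch
# intervals here stand in for a context table joined on time.

from datetime import datetime

points = [
    (datetime(2024, 1, 1, 8, 5), "TT-101", 36.9),
    (datetime(2024, 1, 1, 9, 30), "TT-101", 37.4),
    (datetime(2024, 1, 1, 11, 0), "TT-101", 36.7),
]

batches = [
    {"batch": "B-001", "start": datetime(2024, 1, 1, 8, 0), "end": datetime(2024, 1, 1, 10, 0)},
    {"batch": "B-002", "start": datetime(2024, 1, 1, 10, 0), "end": datetime(2024, 1, 1, 12, 0)},
]

def contextualise(points, batches):
    """Label each time-series point with the batch running at that moment."""
    out = []
    for ts, tag, value in points:
        batch = next((b["batch"] for b in batches if b["start"] <= ts < b["end"]), None)
        out.append({"time": ts, "tag": tag, "value": value, "batch": batch})
    return out

for row in contextualise(points, batches):
    print(row["batch"], row["tag"], row["value"])
```

Once every point carries its batch label, “show me this temperature profile for all B-series batches” becomes a filter rather than a manual reconstruction exercise.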
With the right systems, software and approach in place - operators can find specific batches, filter by products or phases and create and overlay profiles for good and bad batches. The ability to search data over a specific timeline and visualise all related events in that timeframe quickly and efficiently will allow users (and eventually machines) to predict more precisely what is occurring or what will occur across industrial processes. Human-machine interfaces such as this are one of the many lauded Pharma 4.0 disruptions that will revolutionise pharma manufacturing.
Big Data analytics draws data from sources that have traditionally been disconnected and looks for relationships and trends that were previously undetectable. For example, combining production data with data from sales and dispatch systems can streamline production planning. ERP tools can already do some of this on a much smaller scale; however, they use smaller data sets than a Pharma 4.0 plant will generate, so their conclusions and recommendations are correspondingly less reliable. The life science industry has been doing Big Data analysis for over 20 years, but there are inherent dangers that will become exponentially more problematic when interpreting the larger data sets created by Pharma 4.0 approaches.
Spurious correlations, which are inevitable when dealing with numerous variables and data points, will no doubt seem attractive to operators and engineers, but they must be set aside if data is to be used effectively. If a business acts on such correlations without validating them, the consequences can be very problematic, even detrimental. It is vital that businesses adopt the data-driven improvement cycle approach, DMAIC (Define, Measure, Analyse, Improve, Control).
The only correlations that should be acted upon must be carefully hypothesised, tested and validated. Process engineers must measure a large amount of data with a small number of variables, wait, monitor and define improvements before implementing change and starting the cycle again. This iterative approach will only be successful if complete, contextualised and accurate data is collected from a fully integrated network of systems and machines.
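One way to make the “hypothesise, test, validate” step concrete is a permutation test: shuffle one variable many times and see how often chance alone reproduces the observed correlation. The data below is invented for illustration, and the significance threshold is the conventional 5%, not a regulatory requirement.

```python
# Minimal sketch of validating a candidate correlation before acting on it,
# using a permutation test on invented process data.

import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def permutation_p_value(x, y, n_perm=2000, seed=0):
    """Fraction of shuffled pairings whose |r| matches or beats the observed one."""
    rng = random.Random(seed)
    observed = abs(pearson(x, y))
    hits = 0
    for _ in range(n_perm):
        shuffled = y[:]
        rng.shuffle(shuffled)
        if abs(pearson(x, shuffled)) >= observed:
            hits += 1
    return hits / n_perm

temperature = [36.8, 37.1, 37.4, 37.9, 38.2, 38.6]   # invented readings
yield_pct   = [92.0, 91.5, 90.8, 89.9, 89.1, 88.4]   # invented readings

p = permutation_p_value(temperature, yield_pct)
print(p < 0.05)   # act only on correlations that survive the test
```

The same check run against a genuinely unrelated pair of variables would typically return a large p-value, which is precisely the signal to set the correlation aside rather than act on it.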
Disruptive thinking & technology
There are a number of systems and technologies that will develop to enable the better gathering and use of data as the pharmaceutical industry embraces Pharma 4.0.
Data lakes &amp; ingestion
Big Data is stored in a data lake. It is vital that businesses upload data into the data lake correctly, through optimised ingestion practices; otherwise it becomes a data swamp. Ingestion tools are already readily available, allowing more time to be spent on analysis. As discussed previously, contextualising data is important: a time stamp makes data ingestion easier and, importantly, when records are created at the same time as a batch, correct ingestion allows time-series and context data to be linked. Ingestion tools can then make accurate suggestions and semi-automate the linking of data.
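A minimal sketch of an ingestion gate, assuming a small set of required fields, shows the idea of keeping unlinkable records out of the lake:

```python
# Hedged sketch: a simple ingestion gate that helps keep a data lake from
# becoming a data swamp by rejecting records lacking the context needed to
# link them later. The required fields are an assumption for illustration.

REQUIRED = {"timestamp", "source", "value"}

def ingest(records):
    """Split incoming records into accepted (complete) and rejected (incomplete)."""
    accepted, rejected = [], []
    for r in records:
        if REQUIRED <= r.keys() and r["timestamp"] is not None:
            accepted.append(r)
        else:
            rejected.append(r)
    return accepted, rejected

incoming = [
    {"timestamp": "2024-01-01T08:00:00Z", "source": "TT-101", "value": 36.9},
    {"source": "TT-102", "value": 37.2},              # no timestamp: unlinkable
    {"timestamp": None, "source": "TT-103", "value": 36.5},
]

ok, bad = ingest(incoming)
print(len(ok), len(bad))   # -> 1 2
```

Rejected records would in practice be routed to a quarantine area for repair rather than discarded, but the principle stands: data without a timestamp cannot be linked to batch context and should never land in the lake as-is.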
Edge devices / edge computing
Edge devices collect data and run local analytics on a piece of equipment or process. They store local data for a limited time and only send information to the data lake if there is an event. This keeps the data lake clean of unnecessary information. The resulting data can be used to create insights into performance.
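The edge pattern described above can be sketched as a device with a bounded local buffer that forwards only threshold-crossing readings. The buffer size and threshold here are arbitrary illustrative values.

```python
# Illustrative edge-filtering sketch: an edge device keeps a short local
# buffer and forwards a reading to the data lake only when it crosses an
# event threshold, keeping the lake free of routine noise.

from collections import deque

class EdgeDevice:
    def __init__(self, threshold, buffer_size=100):
        self.threshold = threshold
        self.buffer = deque(maxlen=buffer_size)   # limited local storage
        self.sent = []                            # stands in for the data lake

    def record(self, value):
        self.buffer.append(value)                 # always kept locally, briefly
        if abs(value) > self.threshold:           # only events leave the edge
            self.sent.append(value)

device = EdgeDevice(threshold=1.0, buffer_size=5)
for v in [0.2, 0.4, 1.5, 0.3, -1.2, 0.1]:
    device.record(v)

print(list(device.buffer))   # last five readings held locally
print(device.sent)           # -> [1.5, -1.2]
```

Note that the oldest reading has already aged out of the local buffer; only the two threshold-crossing events were forwarded, which is the behaviour that keeps the central store clean.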
Machine Learning & digital twins
Machine Learning and Artificial Intelligence will lead to significant change in manufacturing practices. With ‘deep learning’, computers will train themselves by running scenarios and learning from the outcomes using the massive amounts of data available in data lakes with the aim of creating a digital twin.
Using simulations to model a process and running scenarios in this virtual system, rather than running experiments on real equipment, is already established in pharmaceutical process development. The concept of the digital twin goes beyond traditional modelling and simulation to create a generic digital representation of an asset. The twin captures multiple characteristics of the asset from sensor data, and that data can be used for deviation or anomaly detection, prediction and simulation. Digital twin models can also learn continuously, adapting as new information arrives. Building on the DMAIC approach, digital twins will reduce variables by modelling processes, removing a variable and supporting tests for improvement.
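A digital twin in miniature: the sketch below uses an assumed linear model of an asset to predict a sensor value and flags readings that deviate beyond a tolerance. A real twin captures far more characteristics and learns continuously, but the residual-check idea for deviation detection is the same.

```python
# Toy digital-twin residual check. The linear model and tolerance are
# illustrative assumptions, not a real asset model.

def twin_predict(coolant_flow):
    """Assumed model: tank temperature falls linearly with coolant flow."""
    return 40.0 - 2.0 * coolant_flow

def is_anomaly(coolant_flow, measured_temp, tolerance=0.8):
    """Flag a reading whose residual against the twin exceeds tolerance."""
    residual = measured_temp - twin_predict(coolant_flow)
    return abs(residual) > tolerance

print(is_anomaly(1.5, 37.2))   # predicted 37.0, within tolerance -> False
print(is_anomaly(1.5, 39.0))   # 2.0 above prediction -> True
```

In a learning twin, the model itself would be refitted as new sensor data arrives, so the prediction (and therefore the anomaly boundary) tracks the asset’s actual behaviour over time.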
Asset performance management & utilisation
Asset performance, throughout its lifecycle, is key to every organisation. When assets can talk to each other and communicate data, engineers can get a better understanding of causes and effects of faults and their impact on performance on a much more detailed scale. Asset performance management tools that can access, interpret and visualise the relevant data in a data lake will offer better predictive asset analytics, risk-based maintenance and condition-based monitoring. Data can also be used to monitor utilisation more effectively and inform decisions around purchasing extra machines by confirming whether current assets are being fully utilised.
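Utilisation monitoring of the kind described reduces to a simple calculation over run logs; the asset names and figures below are invented for the example.

```python
# Sketch of utilisation monitoring across connected assets. The run-log
# fields and numbers are assumptions; the point is that a purchasing
# decision can be checked against figures like these.

def utilisation(run_minutes, available_minutes):
    """Fraction of available time an asset actually ran."""
    return run_minutes / available_minutes

fleet = {
    "filler-01": {"run": 400, "available": 480},
    "filler-02": {"run": 180, "available": 480},
}

for name, log in fleet.items():
    u = utilisation(log["run"], log["available"])
    print(name, round(u, 2))   # spare capacity shows up as low utilisation
```

Here the second machine is idle most of the shift, suggesting rebalancing work before buying extra equipment, which is exactly the kind of decision contextualised utilisation data supports.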
Contextualised data in a lake will open the door for the development of clever industrial apps that monitor asset utilisation and performance. These apps will be significantly cheaper than current industrial-scale software and offer real-time, remote, visual monitoring of assets.
The future of production
The life science industry has been slower than other sectors to adopt cutting-edge technologies. However, it has spent decades using data to drive operational improvement. Embracing the potential of Pharma 4.0 is going to be critical to the future operations of all manufacturers. Fully automated and connected facilities that can create, interpret and act upon reliable data will take advantage of all that digital manufacturing has to offer.
The life science industry has spent many years becoming more energy-efficient and using better materials and equipment to make improvements. Pharma 4.0 will see the industry connect equipment across plants and enterprises and use better, more reliable and larger volumes of data to revolutionise manufacturing.
Zenith Technologies’ role in the Pharma 4.0 revolution is to help manufacturers connect their systems and equipment and to ingest and analyse the data created to inform operational decisions.
Courtesy: Zenith Tech