Checking and reviewing the suite of validation documentation can be very time consuming. TQS Integration can provide the resources needed to ensure GMP and all regulatory compliance requirements are met. Let us do the heavy lifting: faster end-to-end quality review processes ensure speed to value in implementing your processes and, more importantly, free up your capacity and resources. TQS Integration helps alleviate these pressures by placing people with the right skills, at the right time, in the right place.
The TQS PI documentation set provides evidence that the PI System has been validated.
The expected system lifecycle steps include, but are not limited to:
"I want to thank you all for all the great work and all your efforts to expedite this important validation activity and everything you are doing in general. I really appreciate it. Not only can we put our validation work in your trusted hands, but your work has also freed up so much time for us to focus on other areas of the operation." - Top 10 Pharma Company
How we can help you
At TQS, the validation strategy has been developed to systematically test the PI System at different levels. The Installation Qualification (IQ) covers the minimum set of verifications needed to assure proper installation of the PI System's software components. The Operational Qualification (OQ) verifies correct operation of the PI System against the user requirements and design specification, including data acquisition and storage, startup and shutdown, and high availability.
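An IQ of this kind can be partially automated by comparing installed software components against an approved specification. The sketch below illustrates the idea in Python; the component names and version strings are hypothetical examples, not an actual PI System specification.

```python
# Minimal sketch of an automated Installation Qualification (IQ) check.
# Component names and versions are hypothetical, for illustration only.

APPROVED_SPEC = {
    "PI Data Archive": "2018 SP3",
    "PI AF Server": "2018 SP3",
    "PI Vision": "2020",
}

def run_iq_check(installed: dict) -> list:
    """Compare installed components against the approved specification.

    Returns a list of (component, expected, found, status) records,
    one per component in the specification.
    """
    results = []
    for component, expected in APPROVED_SPEC.items():
        found = installed.get(component, "NOT INSTALLED")
        status = "PASS" if found == expected else "FAIL"
        results.append((component, expected, found, status))
    return results

# Example: one component missing, the rest correctly installed.
installed = {"PI Data Archive": "2018 SP3", "PI AF Server": "2018 SP3"}
for record in run_iq_check(installed):
    print(record)
```

Each record then becomes a line item in the IQ report, with failures routed to QA review.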
Dedicated Quality Assurance Engineers will monitor, review, and approve every phase of the process to ensure the implementation and design of the PI System adhere to company standards and regulations. QA will conduct and participate in every phase of the SDLC, including requirements reviews, design reviews, and test case reviews with test evidence. Empirical evidence that the PI System works as expected ensures a successful outcome during inspections by regulatory organisations.
For more information, please contact us.
Machine Learning (ML) will undoubtedly transform manufacturing, growing from a few selected applications such as predictive maintenance to a wide range of use cases. The technology already exists today: libraries are widely available under open-source licenses, and on-premises IT infrastructure as well as cloud services allow these applications to scale.
So, what is holding it back?
One area that limits the wide adoption of ML models is the underlying data structure. Companies have heavily invested in their data infrastructure and the creation of meta databases (mostly ISA-95 and ISA-88), but the productizing of ML models is still lagging. There are several reasons for this:
The industrial standards ISA-95 and ISA-88 provide a framework for structuring the equipment and batch models, but by design do not support ML modeling. For example, a single piece of equipment can have several ML use cases that each require a different structure, such as multivariate batch modeling, predictive maintenance, or forecasts for predictive control.
One approach to structuring industrial models is ML Relational Mapping (MLRM). It builds on existing object relational mapping (ORM) concepts by linking existing type systems. The concept does not require restructuring existing data models and is therefore fast to implement:
MLRM adds a type or class that links, for example, equipment and batch types, and provides the definitions for the ML model. By separating this functionality, the approach does not clutter the existing type system and offers the flexibility to define several models for one class, or multi-class models, without the need to restructure.
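The linking idea above can be sketched in a few lines of Python. This is an illustrative model only, assuming a simple in-memory type system; all class, attribute, and algorithm names are hypothetical and do not correspond to a specific MLRM implementation.

```python
# Illustrative sketch of ML Relational Mapping (MLRM): a separate ML
# model type links to existing equipment and batch types without
# modifying them. All names are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class EquipmentType:      # existing ISA-95 style type, left untouched
    name: str
    attributes: list

@dataclass
class BatchType:          # existing ISA-88 style type, left untouched
    name: str
    parameters: list

@dataclass
class MLModelType:
    """MLRM type: defines the ML model and links into existing types."""
    name: str
    algorithm: str                               # e.g. "PLS", "PCA"
    inputs: dict = field(default_factory=dict)   # references to existing types

reactor = EquipmentType("Reactor", ["Temperature", "Pressure"])
fermentation = BatchType("Fermentation", ["Phase", "Duration"])

# Two ML use cases on the same equipment, each with its own structure;
# the underlying equipment and batch types are not restructured.
batch_model = MLModelType(
    "MultivariateBatchModel", "PLS",
    inputs={"equipment": reactor, "batch": fermentation},
)
maintenance_model = MLModelType(
    "PredictiveMaintenance", "PCA",
    inputs={"equipment": reactor},
)
```

Because the ML definitions live in their own type, adding or removing a model is a local change: the equipment and batch types are only referenced, never edited.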
The screenshot below shows an OSIsoft AF based UI that implements MLRM:
Machine Learning applications will grow rapidly in the manufacturing environment. The challenge will be to provide the right structure, so that ML models can be built on top of existing type systems. ML Relational Mapping (MLRM) provides a flexible approach by implementing a model-specific type system that links to existing data models.