Process Validation - Problems and Recommendations: An Interview with Cliff Campbell

Cliff Campbell, B.E., is Principal, Cliff Campbell Consulting, Ltd., Cork, Ireland. Mr. Campbell is a globally recognized expert in process validation. He and Paula Katz (FDA) recently published "Process Validation Revisited" in the Journal of Validation Technology in Q4 2012. This paper received the JVT "Paper of the Year" award in 2013 and is currently atop the "selected publications" list on the FDA website.

Mr. Campbell studied Control Engineering at University College Cork, Ireland. His early career involved measurement and control systems, including PAT, in the oil and gas industry in the UK and Ireland. He subsequently worked in the Irish pharmaceutical sector and ultimately formed Cliff Campbell Consulting. He has served the pharmaceutical industry on a global basis for 25 years.

The following is a conversation on the current state of process validation as perceived by Mr. Campbell. It discusses current issues and recommends key approaches to help the pharmaceutical industry implement the PV guidance.

The Journal sincerely thanks Mr. Campbell for his pertinent comments.

Paper Overview

Pluta: Congratulations on the success of your process validation paper [1] with Paula Katz. To get us started today, will you give us an overview and key points of your paper?

Campbell: We had several objectives in the paper. These included clarifying some specific points in the guidance as well as providing encouragement and direction to companies embarking on its deployment. Further, we intended to summarize the regulatory drivers that resulted in the development of the guidance. We were eager to position the guidance as an elaboration rather than a total rewrite of its 1987 predecessor.

Regarding specifics, criticality is no longer a yes or no state, but rather a continuum with control over attributes and parameters commensurate with their impact on process variability and output. Statistics is a required part of compliance with the cGMPs, and the 2011 Guidance reminds manufacturers of this requirement. It clearly recommends that firms "employ objective measures (e.g., statistical metrics) wherever feasible and meaningful to achieve adequate assurance". The "rule of three" has been effectively rejected, with emphasis now being on protocols that build upon process design knowledge. This knowledge must identify criteria and performance indicators that allow for science and risk-based decision-making about the manufacturing process and subsequent improvements.

Regarding knowledge management, we introduced the concept of "validation trilogies", with significant variables located at the top of the information hierarchy. Rather than having evidence scattered across a multitude of documents, each variable now receives its own three-stage biography, maximizing transparency and line-of-sight.

In terms of implementation, we wanted to provide practical recommendations and advice to manufacturers regarding the format and content of each of the three stages of PV. At Stage 1, significant variables (an umbrella term for CPPs, IPCs and CQAs) are defined and justified for each unit operation, including operating limits and monitoring requirements; this represents the essential control strategy. At Stage 2, once facilities and equipment have been qualified in terms of fitness for purpose, the Stage 1 control strategy is addressed by enhanced sampling and analysis to confirm reproducible commercial manufacturing. At Stage 3, the control strategy is subject to inter-batch monitoring to assure an ongoing state of control.

There are two aspects to maintaining a state of control. The process must be capable (i.e. operating within specified operating limits), as well as being stable or drift-free. Rather than swamp the monitoring program, the extent to which variables should "endure" within Stage 3 should be determined by risk assessment, with the focus on variables that are predictive rather than indicative of process performance.
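A minimal, illustrative sketch of how those two checks might look for a series of batch results is shown below in Python. The data, specification limits and formulas used, a Cpk capability index and Shewhart individuals-chart control limits, are standard textbook choices and assumptions for illustration only; they are not prescribed by the interview or the Guidance.

import statistics

def capability_cpk(results, lsl, usl):
    # Cpk: how comfortably the process sits inside its specification limits.
    mean = statistics.mean(results)
    sd = statistics.stdev(results)
    return min((usl - mean) / (3 * sd), (mean - lsl) / (3 * sd))

def individuals_chart_limits(results):
    # Shewhart I-chart limits: mean +/- 2.66 * average moving range.
    mean = statistics.mean(results)
    mr_bar = statistics.mean([abs(b - a) for a, b in zip(results, results[1:])])
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

# Hypothetical batch assay results (% label claim), invented for illustration.
assay = [99.1, 98.7, 99.4, 99.0, 98.9, 99.3, 98.8, 99.2]
print("Cpk vs. 95.0-105.0 spec:", round(capability_cpk(assay, 95.0, 105.0), 2))
lcl, ucl = individuals_chart_limits(assay)
outside = [x for x in assay if not lcl <= x <= ucl]
print("Control limits:", round(lcl, 2), "-", round(ucl, 2), "| points outside:", outside)

A capable but drifting process, or a stable but incapable one, fails the state-of-control test on one of the two counts; both checks are needed, which is the point being made above.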

Industry Problems

Pluta: What are the major problems you see with industry working to implement the new guidance?

Campbell: Many companies are well on their way in terms of implementing the guidance. It's been three years already, so it'd be kind of worrying if we were still at the "navel-gazing" stage. Most people I've been dealing with already have Stage 3 pilot projects in progress or nearly ready to begin. As far as challenges are concerned, the ones that I hear mentioned most frequently include the following:

  • Is it sufficient to "tweak" existing procedures, or do we need a total overhaul?
  • Dealing with poorly understood legacy processes
  • Dealing with corporate inertia/resistance
  • Organizational issues:
    • Handshakes between stages and across silos
    • Access to statistical expertise
    • Data acquisition tools and responsibilities
  • How and when to apply the guidance to CMOs (Contract Manufacturing Organizations).

Another item that keeps cropping up is this question of how to deal with "small number of batches". It's not rocket science to figure out that in these situations every batch is effectively a validation batch, and should be governed by an enhanced sampling and testing program.

Recommendations Overview

Pluta: What are your recommendations to address these problems?

Campbell: In my view, there are five main areas that companies must address in order to successfully implement the Guidance. These are as follows:

  1. Control strategy
  2. Applied statistics
  3. Standardization
  4. Simplification of the lifecycle
  5. Compliance culture

A company that addresses these areas will be well on its way to implementing the lifecycle approach to PV and, more importantly, will derive net value from the implementation.

Control Strategy

Pluta: Let's begin with variation identification and strategies to control variation. What are your thoughts?

Campbell: As an engineer, I think the idea of control strategy is tremendous, and long overdue. I know there are some procedural aspects of compliance control that are mentioned in ICH Q8 and Q11. However, if you analyze the 2011 Guidance, it boils down to the identification, quantification, justification, evaluation and verification of significant variables -- process, material or environmental. In contrast to the historical, fragmented approach, this is achieved incrementally across all stages of the lifecycle. Companies should develop a set of interconnected spreadsheets that map each of the three major stages. Where things get a bit tricky is when different groups and sub-groups are responsible for various aspects of the evidence chain. Everyone talks about the need for handshakes between Stages 1 and 2 and Stages 2 and 3. I agree wholeheartedly, and believe these handshakes need to be far more than just checklists and matrices. We should begin with the end in mind: the control strategy should specify the content and format of the related control charts needed in Stage 3. Stage 1 identifies the critical areas, Stage 2 demonstrates successful control, and Stage 3 maintains control throughout the lifecycle.

I could talk about this at great length, but for me at least, a best-in-class control strategy should contain a thorough and complete process flow diagram. This document should provide set-points, ranges and rationales for significant variables, an enhanced sampling plan that is a direct extension of the process map, and protective limits and control chart requirements. The process flow diagram should use language that Ken Chapman introduced more than 30 years ago, including delineation of potential adverse consequences and approved remedial actions in the event of undershoot and/or overshoot. Ken Chapman's insights are crucial -- a control strategy must incorporate feedback and response commitments -- otherwise it is merely a measurement strategy. What could be more simple, logical or risk-based? What is described here is really not new. And if you think about it, the control strategy is a refinement of content traditionally found in process development reports.
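To make the idea concrete, here is one possible way a single "row" of such a control strategy might be captured in a structured, data-centric form (Python). The field names, the example variable and all of its values are illustrative assumptions rather than content from the interview; the point is simply that set-point, range, rationale, adverse consequence and remedial action travel together as one record.

from dataclasses import dataclass

@dataclass
class SignificantVariable:
    name: str                    # umbrella term covering CPPs, IPCs and CQAs
    unit_operation: str
    set_point: float
    operating_range: tuple       # (low, high) limits the process must respect
    rationale: str               # Stage 1 knowledge justifying the limits
    adverse_consequence: str     # what undershoot/overshoot would mean
    remedial_action: str         # approved response if a limit is breached
    stage_3_monitoring: bool     # does this variable "endure" into Stage 3?

# Hypothetical example, for illustration only.
granulation_temperature = SignificantVariable(
    name="granulation temperature (degC)",
    unit_operation="wet granulation",
    set_point=55.0,
    operating_range=(50.0, 60.0),
    rationale="development DoE showed dissolution failures above 62 degC",
    adverse_consequence="overshoot risks out-of-specification dissolution",
    remedial_action="hold the batch and investigate per the approved procedure",
    stage_3_monitoring=True,
)

Because the remedial action is part of the record, the result is a control strategy rather than merely a measurement strategy, in Chapman's sense.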

Applied Statistics

Pluta: FDA has greatly emphasized the use of statistics in the guidance, and statistics applications are now a pivotal part of the PV lifecycle. What is your view on this?

Campbell: Like everyone else, I think applied statistics are essential for success. Lynn Torbeck in particular is due an enormous debt for his pioneering efforts in this field. This obviously includes experimental design support during Stage 1, justification of sampling plans at Stage 2, and control chart support at Stage 3. FDA used the terminology "wherever feasible and meaningful" regarding statistical sampling. PDA Technical Report #60 also addresses the "feasible and meaningful" issue. For example, risk-based sample size justification is a given during Stage 2, but the extent to which this should require full-blown statistics will vary (e.g. discrete drug product units vs. homogeneous drug substance solutions).

The statistics issue reminds me of industry's indiscriminate rush to qualification following publication of the 1987 Guidance. The last thing we need is the same thing to happen all over again vis-à-vis statistics. To illustrate a more general point in terms of keeping things simple, I saw a situation recently where a company had a really heavy (almost excessive) applied statistics procedure in place, resulting in an overwhelming Stage 3 commitment to statistical data treatment. In spite of this, or maybe because of it, they neglected to consider the rudimentary issue of measurement uncertainty analysis for analytical equipment and process instrumentation.
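As a purely illustrative example of what a risk-based sample size justification can look like, the sketch below uses the textbook zero-failure "success-run" relationship, n = ln(1 - confidence) / ln(reliability). Neither the Guidance nor PDA Technical Report #60 mandates this particular method; it is shown only to make the idea of an objective, documented justification tangible.

import math

def success_run_sample_size(reliability, confidence):
    # Number of consecutive conforming units needed to claim `reliability`
    # at `confidence`, assuming zero failures are observed.
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

for reliability, confidence in [(0.95, 0.95), (0.99, 0.95), (0.99, 0.99)]:
    n = success_run_sample_size(reliability, confidence)
    print(f"{reliability:.0%} reliability at {confidence:.0%} confidence: n = {n}")

Whether a calculation of this kind is warranted for discrete drug product units, as opposed to a homogeneous drug substance solution, is exactly the sort of judgement the "feasible and meaningful" wording calls for.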

Just one final comment on statistics, relating to enhanced sampling. The 2011 Guidance advises that Stage 2 should have a higher level of sampling, additional testing etc. than would be typical of routine commercial production. It strikes me that we should be approaching this the other way around, i.e. specify and justify a Stage 2 sampling strategy first off, then justify a relaxation of this commitment for routine commercial batches.

Standardization

Pluta: I know you're a strong advocate of standardization. Can you elaborate?

Campbell: I'm talking here about three key concepts that are very much interrelated. First of all, our efforts should be recipe driven, with a playbook of unit operations clearly identified, defined and accessible to all. Next, we should be modular, and recognize that complex processes can be assembled from less-complex components. Finally, we should start doing taxonomy, both at the system and process level, with a view to compiling and reutilizing institutional knowledge.

The traditional approach to standardization, certainly when it comes to validation, was very much based on Word-based templates. These are certainly useful, the objective being to eliminate subjectivity and make our documented evidence more predictable and amenable to regulatory review as a result. I think we can go a bit further however, and be more data-centric as well. What I'm advocating is a recipe-based approach to each of the three stages of PV, which ties in with the trilogies I was talking about earlier.

By developing a control strategy cut-sheet for each significant variable type, you can very quickly start playing any process "game" you wish, along similar lines to a deck of cards or a musical score. The process automation community has been doing this for years when specifying and developing electronic batch records and the like. Even if you never got to the stage of standardization across projects, this is well worth the effort within the context of a single project. I know there's a counter-argument which says "every process is different", but I've never really bought that line of thinking. It's all about pattern recognition: your set-points and ranges may be totally different to mine, but we can share the same building blocks and use the same vernacular.

Regarding the systems-thinking aspect, I know this has become a cliché, but it's all about silos, which I guess are a necessary evil and come with our territory. What goes on inside each silo is driven by SMEs and what I would call "lines of longitude". To execute the process validation lifecycle, however, we also need to be thinking on the horizontal, i.e. "lines of latitude". Both views are necessary, and in fact complement one another. One deals with technical detail and the other with compliance essence. An obvious question then poses itself: who takes responsibility for our lines of latitude? My vote goes to distributed, team-based ownership, with Quality Unit oversight.

Simplification of the PV Lifecycle

Pluta: Let's talk about simplification. How do you simplify the PV lifecycle?

Campbell: This is all to do with communication across silos and objective, data-driven handshakes between each of the three lifecycle stages. When this successfully occurs, properly constructed, executed specifications become our validation protocols without transcription or further rework. Also, if companies haven't already done so, they should at least be considering Quality Engineering as a legitimate function within the organizational chart, with emphasis on the design and delivery of "self-regulated" systems and equipment.

It has always struck me that compliance has been treated as an unnecessarily complicated and time-consuming activity. There are a number of reasons for this, including an undeserved aura or mystique that surrounded industry's handling of FDA's 1987 Guidance. With some notable exceptions, the extent to which validation departments still distrust or disregard their engineering counterparts is disheartening. This reminds me of Janet Woodcock's description of the industry as being like a super-tanker, which takes an inordinate amount of time to change course after receiving a request or instruction to do so. Dr. Woodcock has also been encouraging the industry to adopt an "efficient, agile, and flexible mindset" to implement the new PV guidance. If we were to take her at her word, we'd at least be considering Quality Engineering as an integral part of our organizations.

To come back to your question, I see no reason why a simplified and economical validation lifecycle based on the principles of self-regulation cannot be achieved. I've used this term a number of times in the past, and some people have misinterpreted it to imply elimination of the Quality Unit or of the Agency. This is patently not the case, and perhaps "self-evidential" might be a better label. To clarify, what I mean is that as soon as the requirement for an item of any type whatsoever is declared (e.g. equipment, instrument, sample, parameter), all of the acceptance criteria and requisite test procedures are defined from the outset, based on a combination of prior knowledge, experience and process understanding.

Compliance Culture

Pluta: What major compliance culture differences have you encountered in the course of your consultancy assignments?

Campbell: I'm talking specifically about implementation of the 2011 Guidance here. Within the industry and also within individual companies, I've seen visionaries and early adopters at one end of the scale and the "do not disturb" lobby at the other. The vast majority of organizations are somewhere in between. These "wait and see" people want to do the right thing, but are waiting for the visionaries to lead the way. At company level, I've seen situations where one or two individuals act as standard bearers for "new PV", encountering organizational apathy or resistance along the way. I've also heard mixed messages regarding the incremental cost of doing new PV, and whether this can be achieved with existing org charts and budgets. My own view is that, done properly, it can actually result in cost savings rather than penalties.

Final Thoughts

Pluta: Any final thoughts for our readers - especially practical lessons for our industry?

Campbell: I encourage readers to continually develop an inquisitive and learning attitude - especially learning from the recognized experts in the fields of quality and compliance. There is no need to "reinvent the wheel" when so much excellent work has already been described. Deming, Juran, Crosby, Ken Chapman - to name just a few - have already developed basic principles in quality. Validation and compliance professionals must strive to understand these concepts and apply them to their specific situation. I also suggest incorporating the experience of thought leaders from other fields who have offered their wisdom, particularly in regard to systems thinking and first principles. In my experience, these fellow-travellers and their insights have direct application to a post-modern pharmaceutical industry. A few of these follow:

"Objects must not be multiplied beyond necessity" (William Ockham, 14th century).

William of Ockham was an English scholar and logician, best known for "Ockham's Razor" summarized above. It can also be stated as "simpler explanations are generally better than more complex ones." I think the relevance to the pharmaceutical lifecycle is clear. Overkill is an occupational hazard within our industry, and knowing when to stop is very important. For example, we do not need to conduct risk assessments on top of other risk assessments. In terms of documented evidence, you probably already know that protocol size can be inversely proportional to its significance. Process understanding is the key to success. We should of course also be aware of Einstein's constraint: "Everything should be kept as simple as possible, but no simpler."

"The meaning of a statement lies in the method of its verification" (Carnap, 20th century).

Rudolf Carnap was a German philosopher who was active in Europe before 1935, and in the US subsequently. He was a strong advocate of logical analysis as a means of systematically resolving technically complex problems and assertions. In terms of relevance to pharma, the quotation above is the most cogent explanation of the PV lifecycle that I have come across. It confirms the immediate and hardwired connection between specifications and evidence. In terms of protocol preparation, it results in a very economical statement of the obvious. The executed specification becomes the validation protocol, and protocol sentences are an integral part of the underlying specification.

"To be is to be the value of a variable" (Quine, 20th centry)

Willard V. Quine was an American philosopher and logician in the analytical tradition. From 1930 until his death 70 years later, he was directly affiliated with Harvard University. Quine had a strong interest in ontology, a field that addresses what entities exist within a domain, and how such entities can be categorized and interrelated. The Quine slogan quoted above goes back to the concept of standardization. Systems, processes, and components can be categorized so that their datasets and test procedures become converted into readily available institutional knowledge. For example, your approach to temperature measurement may be totally different to mine in terms of its values, but we share a common and configurable vocabulary including target value, proven acceptable range, engineering units, measurement uncertainty, control limits, calibration frequency, and so on.
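A small, hypothetical sketch of that shared vocabulary in practice (Python): two firms assign completely different values to a temperature measurement, yet both records use the same keys, which is what makes the knowledge comparable and reusable. The keys mirror the fields listed above; all values are invented for illustration.

# Same taxonomy, different values: the keys are shared, the numbers are not.
COMMON_KEYS = {"target_value", "proven_acceptable_range", "engineering_units",
               "measurement_uncertainty", "control_limits", "calibration_frequency"}

my_reactor_temperature = {
    "target_value": 72.0,
    "proven_acceptable_range": (68.0, 76.0),
    "engineering_units": "degC",
    "measurement_uncertainty": 0.4,
    "control_limits": (70.5, 73.5),
    "calibration_frequency": "6 months",
}

your_dryer_temperature = {
    "target_value": 40.0,
    "proven_acceptable_range": (35.0, 45.0),
    "engineering_units": "degC",
    "measurement_uncertainty": 1.0,
    "control_limits": (38.0, 42.0),
    "calibration_frequency": "12 months",
}

# Both records can be compiled, compared and reused as institutional knowledge.
assert set(my_reactor_temperature) == set(your_dryer_temperature) == COMMON_KEYS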

Taken collectively, the above concepts help to address the three fundamental elements of knowledge management, namely, what there is (X), what it's like (Y), and how we know (Z).

References

  1. Campbell, Cliff and Paula Katz. "FDA 2011 Process Validation Guidance: Process Validation Revisited." Journal of Validation Technology, Vol. 18, No. 4, 2012.
  2. FDA. Guidance for Industry: Process Validation: General Principles and Practices. January 2011.
