Quality-by-Design for Analytical Procedures: Introduction

Peer Reviewed: Method Validation


A new approach to analytical procedures has arrived. For production processes, quality-by-design (QbD) is being used successfully; the same QbD approach can be applied to analytical procedures. In addition, there is now a technique to definitively link the data to its intended use. These are exciting times for testing laboratories and the users of the data they produce. This series of articles will introduce, describe, and explain this QbD approach so you can start to use it with the analytical procedures in your laboratory.

With this new approach comes a number of buzzwords: QbD, analytical procedure lifecycle, analytical target profile, decision rules, measurement uncertainty, and target measurement uncertainty. Before discussing these terms, consider why this approach is needed and why it is being developed and adopted.

A major deficiency of today’s approach to analytical procedures is the inability to definitively demonstrate and ensure the data is fit for its intended use. Have you ever wondered whether a method is fit for use and been unable to get a clear answer? Have you ever wondered why a method that is fully validated to today’s standards fails a method transfer? Have you ever needed to know how much variability comes from the manufacturing process and how much comes from the analytical procedure, but had difficulty determining these variability components?

These problems have multiple causes. We do not have a defined process for defining the purpose of the data and quantifying the risk and probability of failure. The method for completely and adequately characterizing variability is not broadly known or used. There is no commonly accepted, statistically based technique to translate the use of the method into key performance criteria for method validation. In short, there is no guidance on how to demonstrate that a method is fit for its intended use. The United States Pharmacopeia (USP) General Chapter <1225> on method validation and ICH Q2 Validation of Analytical Procedures describe the parameters that need to be included in the validation of an analytical procedure, but they give no guidance on the acceptance criteria for these parameters, nor on how to link the validation to the intended use of the data.
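
One statistical approach to separating these variability components is a variance-components (one-way analysis of variance) calculation: replicate determinations within each batch estimate the analytical variance, and the spread of the batch means then yields the contribution of the manufacturing process. The sketch below illustrates the idea; the batch names, replicate counts, and assay values are illustrative assumptions only.

import numpy as np

# Illustrative assay results (% label claim), three replicate determinations
# per batch; the batch names and values are assumptions for this sketch.
results = {
    "batch_1": [99.1, 99.5, 99.3],
    "batch_2": [100.2, 100.6, 100.1],
    "batch_3": [98.7, 99.0, 98.9],
    "batch_4": [100.9, 101.2, 100.8],
}

groups = [np.asarray(v, dtype=float) for v in results.values()]
k = len(groups)                # number of batches
n = len(groups[0])             # replicates per batch (balanced design assumed)
grand_mean = np.mean(np.concatenate(groups))

# One-way random-effects ANOVA mean squares
ms_within = np.mean([g.var(ddof=1) for g in groups])   # analytical (repeatability) variance
ms_between = n * sum((g.mean() - grand_mean) ** 2 for g in groups) / (k - 1)

var_analytical = ms_within
var_process = max((ms_between - ms_within) / n, 0.0)    # truncate negative estimates to zero

print(f"analytical SD: {var_analytical ** 0.5:.2f} % label claim")
print(f"process SD   : {var_process ** 0.5:.2f} % label claim")

Run on a laboratory’s own data, the same calculation shows at a glance whether the analytical procedure or the manufacturing process dominates the overall variability of the reportable result.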

Without these processes, techniques, and guidances, the acceptance criteria for validations of analytical procedures are often vague and are not numerical. Examples of such requirements are:

  • The test result must be sufficiently accurate
  • Any bias must not be such that an incorrect decision will be made
  • The variability of the test result must be small enough so that the decision made is fit for its intended use and not based on random variability
  • The result is based on the data and is not based on random variability
  • The test method must be sufficiently sensitive so that the decision that an analyte is present or absent is correct and not solely based on the method’s capability.

These are laudable goals for a validation, but they are not numerical and are not practicable, because it is difficult to judge whether they have been met. Consequently, the decision that a method validation passed often contains an element of subjectivity.

The good news is that the new QbD approach uses risk analysis and probability to clearly define the purpose of a procedure using a decision rule. From this, the analytical target profile is created, which is “the combination of all performance criteria required for the intended analytical application that direct the method development process” (1). The analytical target profile then guides the analytical procedure through its entire lifecycle, ensuring the method produces data that is fit for its intended use.
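
As a small illustration of how a decision rule links a specification, a target measurement uncertainty, and an accept/reject decision, the sketch below applies a guarded-acceptance rule in the spirit of ASME B89.7.3.1 (see the general references): a result is accepted only if it lies inside the specification by at least the expanded uncertainty U. The specification limits and the value of U are assumed for illustration, not taken from any compendial requirement.

# Guarded-acceptance decision rule (illustrative sketch; all numbers are assumptions).
SPEC_LOW, SPEC_HIGH = 98.0, 102.0   # assay specification, % label claim (assumed)
U = 0.6                             # target expanded measurement uncertainty, k = 2 (assumed)

def decide(reportable_result: float) -> str:
    """Apply the guarded-acceptance rule to a single reportable result."""
    if SPEC_LOW + U <= reportable_result <= SPEC_HIGH - U:
        return "accept"         # conforming even after allowing for measurement uncertainty
    if reportable_result < SPEC_LOW or reportable_result > SPEC_HIGH:
        return "reject"         # outside the specification limits
    return "indeterminate"      # inside the limits but within U of a limit; investigate

for x in (100.3, 101.7, 97.8):
    print(f"{x:5.1f} %  ->  {decide(x)}")

Tightening or relaxing the target measurement uncertainty directly changes how close to a specification limit a result may fall and still be accepted, which is exactly the link between the intended use of the data and the performance required of the procedure.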

The QbD approach as applied throughout the lifecycle of an analytical procedure is illustrated in the Figure. The analytical procedure has three stages in its life. Method design includes the research and development of a method to meet the key performance criteria. Method qualification demonstrates that the critical variables, those that affect the uncertainty, are in control; a major outcome of qualification is the control strategy for those critical variables, and qualification includes the activities now associated with method validation. Continued verification includes those activities, such as control charts, used to confirm that the method performs adequately during routine use and in response to change, including method transfer between laboratories. The overall driving control for these activities is the analytical target profile, which is derived from the decision rule that defines the purpose of the reportable result.
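
As a simple illustration of continued verification, the sketch below builds an individuals control chart from historical reportable results and flags new results that fall outside the control limits. The historical values are assumptions for illustration, and the limits use the sample standard deviation rather than the moving-range estimate often preferred for individuals charts.

import statistics

# Historical reportable results (% label claim) gathered during routine use;
# the values are illustrative assumptions.
historical = [99.8, 100.1, 99.9, 100.3, 99.7, 100.0, 100.2, 99.9, 100.1, 99.8]

mean = statistics.mean(historical)
sd = statistics.stdev(historical)
lcl, ucl = mean - 3 * sd, mean + 3 * sd   # conventional 3-sigma control limits

def check(result: float) -> str:
    """Flag a new reportable result that falls outside the control limits."""
    return "in control" if lcl <= result <= ucl else "out of control - investigate"

for new_result in (100.0, 101.1):
    print(f"{new_result:6.1f} %: {check(new_result)} (limits {lcl:.2f} to {ucl:.2f})")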

Figure: The QbD Approach as Applied throughout the Life of a Method, from Method Design through Qualification to Continued Verification.


In the next paper, the analytical procedure lifecycle and QbD approach will be presented in greater detail.

References

  1. M. Schweitzer, et al., Pharmaceutical Technology 34 (2), 52–59 (2010).

General References

  1. M.L. Jane Weitzel, “The Estimation and Use of Measurement Uncertainty for a Drug Substance Test Procedure Validated According to USP <1225>,” Accreditation and Quality Assurance: Journal for Quality, Comparability and Reliability in Chemical Measurement 17 (2), 139–146, 2012, DOI: 10.1007/s00769-011-0835-5.
  2. M.L. Jane Weitzel and W.M. Johnson, “Using Target Uncertainty to Determine Fitness for Purpose,” Accreditation and Quality Assurance: Journal for Quality, Comparability and Reliability in Chemical Measurement 17 (5), 491–495, 2012, DOI: 10.1007/s00769-012-0899-x.
  3. M.L. Jane Weitzel and W.M. Johnson, Application of ISO/IEC 17025 Technical Requirements in Industrial Laboratories: Method Validation, FriesenPress, 2013.
  4. V.J. Barwick and E. Prichard (eds), Eurachem Guide: Terminology in Analytical Measurement – Introduction to VIM 3. ISBN 978-0-948926-29-7, 2011, available at: http://www.eurachem.org/index.php/publications/guides.
  5. M.H. Ramsey and S.L.R. Ellison (eds), Eurachem/EUROLAB/CITAC/Nordtest/AMC Guide: Measurement Uncertainty Arising from Sampling: A Guide to Methods and Approaches, ISBN 978-0-948926-26-6, 2007, available at: http://www.eurachem.org/index.php/publications/guides.
  6. International Vocabulary of Metrology – Basic and General Concepts and Associated Terms, JCGM 200:2012, Joint Committee for Guides in Metrology (JCGM), 2012, available at: www.bipm.org.
  7. S.L.R. Ellison and A. Williams (eds), Eurachem/CITAC guide: Quantifying Uncertainty in Analytical Measurement 3rd ed., ISBN 978-0-948926-30-3, 2012, available at: http://www.eurachem.org/index.php/publications/guides.
  8. ASME B89.7.3.1-2001, Guidelines for Decision Rules: Considering Measurement Uncertainty in Determining Conformance to Specifications, American Society of Mechanical Engineers, 2002.
  9. J.K. Taylor, Quality Assurance of Chemical Measurements, Lewis Publishers, 1987, p. 266.
  10. J.N. Miller and J.C. Miller, Statistics and Chemometrics for Analytical Chemistry, 6th ed., Pearson Education Limited, 2010.
  11. J.M. Juran and A.B. Godfrey (eds), Juran’s Quality Handbook, 5th ed., McGraw-Hill – International Edition, 2000.
  12. AOAC International, How to Meet ISO 17025 Requirements for Method Verification (accessed March 9, 2012), available at: http://www.aoac.org/accreditation/accreditation.ht.
  13. USP <1225> Validation of Compendial Procedures.
  14. Eurachem Guide: The Fitness for Purpose of Analytical Methods: A Laboratory Guide to Method Validation and Related Topics, ISBN 0-948926-12-0, 1998 (accessed March 9, 2012), available at: www.eurachem.org.
  15. G.T. Wernimont, “Evaluating the Ruggedness of an Analytical Process,” Use of Statistics to Develop and Evaluate Analytical Methods, 78-82, 1985.
  16. USP, USP Medicines Compendium.
  17. Royal Society of Chemistry, “Terminology – The Key to Understanding Analytical Science. Part 1: Accuracy, Precision and Uncertainty,” AMC Technical Brief No. 13, 2003, available at: http://www.rsc.org/images/brief13_tcm18-25955.pdf.
  18. ISO 5725-2, Accuracy (Trueness and Precision) of Measurement Methods and Results – Part 2: Basic Method for the Determination of Repeatability and Reproducibility of a Standard Measurement Method.
  19. S.L.R. Ellison, V.J. Barwick, and T.J. Farrant, Practical Statistics for the Analytical Scientist: A Bench Guide, Royal Society of Chemistry, 2009.
  20. L.B. Barrentine, An Introduction to Design of Experiments: A Simplified Approach, ASQ Quality Press, 1999.


