ANDA Submissions – Prior Approval Supplements Under GDUFA, FDA Guidance Document, Oct 2016, Generics
Process Validation of Existing Processes and Systems Using Statistically Significant Retrospective Analysis – Part One of Three
The FDA defined Process Validation in 1987 as follows: “Process Validation is establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specification and quality attributes.” 1 The purpose of this article is to discuss how to validate a process by introducing some basic statistical concepts to use when analyzing historical data from Batch Records and Quality Control release documents to establish specifications and quality attributes for an existing process.

In an ideal world, the qualification of processing equipment, utilities, facilities, and controls would commence at the start-up of a new plant or the implementation of a new system. This would be followed by validation of the process based on developmental data, which would be used to establish the product ranges for in-process and final release testing. However, the ideal case may not exist: there are instances where commissioning of facilities or new systems occurs concurrently with qualification and process validation, or where the facility and equipment are “existing” and no such documentation exists. Some facilities, equipment, or processes pre-date the above definition by many years and therefore have never been validated on qualified equipment, utilities, facilities, and controls. Additionally, in some cases no developmental data exists to establish the product ranges for in-process and release testing.
Basics of Process Validation
Before examining existing processes, it is important to first understand the basic concepts of Process Validation. Figure 1 is a flow chart outlining Process Validation from the developmental stage to the plant floor. As a simple example, a process begins with the raw materials being released; the raw materials are then mixed, the pH is adjusted, purification occurs by gel chromatography, excipients are added for final formulation, and the product is filled and terminally sterilized. Each of these steps has a defined function and therefore a design goal. For example, purification would not begin until the desired pH is reached in the previous step. The desired pH is therefore an in-process attribute of the pH adjustment stage, and the amount of buffer used to adjust the pH is a processing parameter. Each step has attributes that one would want to monitor to confirm that the product is being produced acceptably at that step, so that the next process step can start. The ranges for these attributes are generally determined from process development data, so that if the process attributes are met there is a high degree of confidence that the final container is filled with a product of acceptable attributes.
During Process Validation, there needs to be approved Standard Operating Procedures (SOPs) in place that Plant Operators have been trained on. Analytical testing should be performed by SOPs and the Quality Control (QC) analysts should be trained in these SOPs as well. The analytical tests should have been previously validated by normal analytical methods validation and documented as such. Also, the equipment used to prepare product must be documented to be qualified for its installation, operation, and performance, commonly referred to as IQ, OQ, and PQ. There are methods for performing such qualification retrospectively, but for the purpose of this discussion, it is only important to note that the equipment must be qualified.
Relating Process Validation to Existing Processes
Existing processes may lack developmental data for in-process ranges and release testing. If a retrospective analysis of existing data is used to establish process ranges, including input, output, and in-process testing parameters, then the process can be treated like a new process by following the basics of Prospective Process Validation. The difference between traditional Prospective Process Validation and Prospective Process Validation based on retrospective analysis is that, in place of developmental data, the retrospective analysis establishes ranges from past Batch Production Records, QC test reports, product specifications, and similar documents. The items that need to be in place are therefore: approved SOPs and Batch Records with personnel training; equipment qualification (IQ, OQ, PQ); validated QC methods; and approved SOPs and training for QC personnel. With these in place, all that is missing is a Process Validation Protocol with defined ranges. The best sources of this information are approved completed batch records, process deviation reports, QC release data, and small-scale studies. From these, the following items must be completed:
- Critical parameters and input and output parameters must be defined.
- A statistically valid time frame or number of batches must be determined.
- The data used to establish the parameters must be extracted from controlled documents.
- The data extracted from the controlled documents will be analyzed to establish ranges.
Each one of these steps will be examined in the following sections to describe them in further detail.
Critical parameters and input and output parameters defined.
The Guidelines on General Principles of Process Validation, 15 MAY 1987, state that:
The validity of acceptance specifications should be verified through testing and challenge of the product on a sound scientific basis during the initial development and production phase.1
It is important to determine which parameters in your process are critical to the final product. When determining these parameters and attributes, a variety of personnel with different expertise should be utilized. Assembling a team of professionals is the starting point, and this committee should be multi-disciplined, including Quality, Validation, Systems Engineering, Facility Engineering, Pharmaceutical Sciences (or R&D), and Manufacturing. Critical parameters and attributes are those which, if not controlled or achieved, would have an adverse effect on the product. A risk assessment should be performed to analyze what the risk is and what the consequences are if a specific parameter or attribute is not controlled or achieved (e.g. the resulting product would be flawed). Risk assessment is defined by the Ontario Ministry of Agriculture, Food and Rural Affairs as:
- the probability of the negative event occurring because of the identified hazard,
- the magnitude of the impact of the negative event, and
- consideration of the uncertainty of the data used to assess the probability and the impact of the components. 2
Figure 2 is a list of general questions to consider when assessing risk while Figure 3 is an example of Fault Tree Analysis – a formal approach to evaluating risk, where a Top Level Event is observed and through questions and observations the cause of the event can be determined.
1. National Center for Drugs and Biologics and National Center for Devices and Radiological Health, “Guidelines on General Principles of Process Validation,” Rockville, MD, 15 May 1987.
2. Ontario Ministry of Agriculture, Food and Rural Affairs (2000), Queen’s Printer for Ontario, last updated March 22, 2000; Web: http://www.gov.on.ca/omafra/english/research/risk/assum1b.html.
Process Validation of Existing Processes and Systems Using Statistically Significant Retrospective Analysis – Part Two of Three
A Statistically Valid Time Frame or Number of Batches
How large a sample set of previously recorded data is needed to determine ranges that are truly representative of the process, and will those ranges be useful in the validation effort rather than setting one up for failure? This is a difficult question to answer. It is important to note that the selected batches should have no changes between them; that is, they should all have been produced under the same processing conditions. The draft FDA Guidance for Industry, Manufacturing, Processing, or Holding Active Pharmaceutical Ingredients, from March of 1998, suggests that 10-30 consecutive batches be examined to assess process consistency. 1
This is a good target statistically because, when selecting a sample size or population, the concept of Normality, or Degree of Normality, becomes important. As a general rule of thumb, the population should be large enough that the distribution of the samples within it approaches a Normal, or Gaussian, distribution (one defined by a bell-shaped curve). In theory, if the samples are normally distributed, then approximately 99.7% of them will fall within the +/- 3 Standard Deviation (S.D.) range. 2 One should also think in terms of Statistical Significance (the P-Value), which expresses how likely it is that a given observation is representative of the whole population rather than a chance aberration; at this coverage level, a given sample has well under a 1% chance of falling outside the +/- 3 S.D. range by chance alone. Finally, the central limit theorem holds that as the sample size gets larger, the distribution of sample averages becomes more normal, so a sample size of 10-30, as the FDA suggests, has a good chance of being distributed approximately normally.
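The +/- 3 S.D. coverage discussed above can be illustrated with a short Python sketch. The simulated measurement below (target 100.0, S.D. 5.0) is hypothetical and stands in for any normally distributed in-process attribute:

```python
import random
import statistics

# Simulate a normally distributed in-process measurement
# (hypothetical values: target 100.0, S.D. 5.0).
random.seed(0)  # fixed seed so the sketch is reproducible
samples = [random.gauss(100.0, 5.0) for _ in range(10_000)]

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)

# Fraction of samples inside the +/- 3 S.D. range (theory: about 99.7%)
within = sum(mean - 3 * sd <= x <= mean + 3 * sd for x in samples) / len(samples)
```

With a large simulated population, `within` lands very close to the theoretical 99.7%; with only 10-30 batches the observed fraction fluctuates more, which is why the central limit theorem argument matters.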
The data used to establish the parameters must be extracted from controlled documents.
Once the number of batches to review is selected, the next step is to determine from which documents the processing data will be extracted. Typically, the range-establishing data must be taken from approved and controlled documents (see the examples below).
Examples of Controlled Documents:
- Batch Production Records (BPRs)
- Quality Control (QC) Lab Reports
- Limits established by licensure
  - Product License Agreement (PLA)
  - Biologic License Agreement (BLA)
  - New Drug Application (NDA) or Abbreviated New Drug Application (ANDA)
- Product Release Specifications
- Small scale bench studies simulating plant environment.
The data extracted from the controlled documents will be analyzed to establish ranges.
Having established where the data will be taken from, the data must then be analyzed for specific trends in order to define ranges for the Process Validation Protocol’s acceptance criteria. These acceptance criteria are what the “actual” process data collected during execution of the Protocol will be compared against in order to verify acceptance. This is where much thought needs to be applied, so that the acceptance criteria are neither so tight that failure is imminent nor so broad that meeting the criteria proves nothing. Listed below are general steps that can be used in the analysis.
- Draw “X Charts” and analyze for outliers
- Apply a model to determine Normality
- Determine the +/- 3 S.D. range and plot on Trend Chart
- Determine the Confidence Interval as compared to +/- 3 S.D.
- Process Capability Indices
- Recording the Maximum and Minimum
- Assign Acceptance Criteria Range and justify
Drawing Trend Charts
Trend Charts, also referred to as X-Charts, are a good way of plotting data points from a set of data where the target is the same metric (for example pH as measured at a specific point in the process). It is a matter of defining the X-axis by the number of samples and the Y-axis by the metric that is being used. As an example, the X-axis could be a list of the batches by batch number and the Y axis could be pH. Figure 4 is an example of a type of trend chart. This way the data is presented graphically and can be appreciated with respect to setting a range.
With the data plotted, one can quickly assess any visible trends in the data. Additionally, one can now begin the task of applying statistics to the data. It is important to determine if there are outliers in the data. Outliers may exist and can usually be rationalized by adverse events in processing, as long as they are reported appropriately. Outliers can also exist as samples that are “statistically insignificant.” As mentioned before, the P-Value is the significance that a given sample will be indicative of the whole population, so outliers would have a very low P-Value. One method for determining outliers is to use a box-plot, where a box is drawn from a lower point (typically the 25th percentile) to an upper point (typically the 75th percentile). The “H-spread” is defined as the distance between the upper and lower points. 3 Outliers are then determined to be any data that falls outside a predetermined multiple of the H-spread. For example, if the lower and upper outlier limits are defined as 4 times the H-spread, anything above or below these limits is statistically insignificant and is an outlier.
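The box-plot screen described above can be sketched in Python. The quartile choices and the x4 multiplier follow the example in the text; the function name and the sample pH readings are hypothetical:

```python
import statistics

def h_spread_outliers(data, multiplier=4.0):
    """Flag points lying more than `multiplier` H-spreads beyond the quartiles.

    The H-spread is the distance between the 25th and 75th percentiles;
    the multiplier of 4 mirrors the example in the text.
    """
    q1, _, q3 = statistics.quantiles(sorted(data), n=4)  # quartiles
    h_spread = q3 - q1
    lower = q1 - multiplier * h_spread
    upper = q3 + multiplier * h_spread
    return [x for x in data if x < lower or x > upper]

# Hypothetical batch pH readings; 9.9 is an obviously aberrant point.
flagged = h_spread_outliers([6.1, 6.15, 6.2, 6.2, 6.25, 6.3, 9.9])
```

Any flagged point would then be traced back to its batch record to see whether an adverse processing event explains it before it is excluded from range setting.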
Apply a model to determine Normality
With the accumulated data plotted, the Degree of Normality should be investigated so that the data can be analyzed by the appropriate method. There are several models for determining the Degree of Normality; some common ones are the Kolmogorov-Smirnov test, the Chi-Square goodness-of-fit test, and the Shapiro-Wilks’ W test. 4 Once the Degree of Normality is determined, a more appropriate statistical method can be applied for setting ranges. If the data is determined to be non-Normal, there are two approaches to evaluating it. The first is to apply a nonparametric statistical model or to transform the data toward normality (e.g. the Box-Cox Transformation 5 ); however, nonparametric tests are considered to be less powerful and less flexible in the conclusions they provide, so it is preferred to increase the sample size such that a normal distribution is approached. 5 If the data is determined to be Normal, or the sample size is increased such that the data is distributed more normally, then the data can be better analyzed for its range characteristics.
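As a minimal sketch of a normality screen, the sample skewness below should be near zero for normally distributed data. This is only an informal check, not a substitute for the formal tests named above (in practice one would use, for example, scipy.stats.shapiro or scipy.stats.kstest); the function and data are hypothetical:

```python
import statistics

def sample_skewness(data):
    """Crude normality screen: skewness of a normal sample should be near 0.

    Informal only; formal tests such as Shapiro-Wilk or Kolmogorov-Smirnov
    should be used in a real range-setting study.
    """
    n = len(data)
    mean = statistics.fmean(data)
    sd = statistics.stdev(data)
    return sum((x - mean) ** 3 for x in data) / ((n - 1) * sd ** 3)

# A perfectly symmetric data set has zero skewness.
skew = sample_skewness([6.0, 6.1, 6.2, 6.3, 6.4])
```

A strongly non-zero skewness would prompt either a transformation (e.g. Box-Cox) or collection of more batches before setting ranges.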
Determine the +/– 3 SD range and plot on Trend Chart
The data, having now been displayed graphically, should be analyzed mathematically. This can be done using simple statistics: the mean and the standard deviation. The mean is the average of the samples in the population. The standard deviation is the measure of variation in the population about the mean. If the distribution proves to be normal, whether from the normality tests above or by selecting a population large enough that the central limit theorem predicts the distribution to be normal, then approximately 99.7% of the data will fall within the +/- 3 S.D. range. Using our example from Figure 4, the data is analyzed for its mean and standard deviation using the formulas displayed in Figure 5. Once these are determined, the +/- 3 S.D. limits can be drawn on the trend charts at their values. This graphically displays the data per batch and how it fits within the statistical limit of +/- 3 S.D. (see Figure 6).
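The mean, standard deviation, and +/- 3 S.D. trend-chart limits described above translate directly into code. The pH readings below are hypothetical stand-ins for the data of Figure 4:

```python
import statistics

# Hypothetical pH readings extracted from ten approved batch records.
ph_values = [6.20, 6.25, 6.18, 6.30, 6.22, 6.27, 6.15, 6.24, 6.21, 6.26]

mean = statistics.fmean(ph_values)
sd = statistics.stdev(ph_values)    # sample standard deviation
lower_limit = mean - 3 * sd         # lower trend-chart limit
upper_limit = mean + 3 * sd         # upper trend-chart limit
```

Plotting `lower_limit` and `upper_limit` as horizontal lines on the trend chart shows at a glance whether every batch sits inside the statistical range.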
1. U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Guidance for Industry: Manufacturing, Processing, or Holding Active Pharmaceutical Ingredients, March 1998, 36.
2. StatSoft, Inc. (1999). Electronic Statistics Textbook. Tulsa, OK: StatSoft. Web: http://www.statsoft.com/textbook/stathome.html.
3. Lane, David M. (1993-2000), Rice Virtual Lab in Statistics, HyperStat Online, Houston, TX. Web: http://www.ruf.rice.edu/~lane/rvls.html.
4. StatSoft, Inc. (1999). Electronic Statistics Textbook. Tulsa, OK: StatSoft. Web: http://www.statsoft.com/textbook/stathome.html.
5. Box, G.E.P., and Cox, D.R. (1964), “An Analysis of Transformations,” J. Roy. Stat. Soc., Ser. B.
Process Validation of Existing Processes and Systems Using Statistically Significant Retrospective Analysis – Part Three of Three
Determine the Confidence Interval as compared to +/- 3 S.D.
With the +/- 3 S.D. ranges determined, it is important to evaluate the confidence that the next data point will fall within this range. The rationale is to justify that the +/- 3 S.D. range provides confidence that approximately 99.7% of the data lies within it. Similar to the +/- 3 S.D. range, the confidence interval is a range within which the next measurement would be expected to fall. This level is typically 99% or greater; a 99% confidence interval means there is 99% confidence that the next value will be in the range. To calculate a 99% Confidence Interval, one needs the area under the standard normal curve, the mean, the standard deviation, and the population size. This can be done using the formula in Figure 7. The confidence interval can be added to our previous example and is displayed in Figure 8.
Figure 7: Definition of Confidence Interval Formulas.
Figure 8: Trend Chart with +/- 3 S.D. and Confidence Interval.
Following our example:
Confidence Interval (99.9%) = 6.23 ± z × (0.151202 / √30), where z is the standard normal critical value
Confidence Interval (99.9%) = 6.02 to 6.95
The confidence interval at 99.9% is slightly narrower than the +/- 3 S.D. range, which is consistent with the +/- 3 S.D. range covering approximately 99.7% of normally distributed data.
As can be seen in the trend charts, the data fits well within the +/- 3 S.D. range, and therefore the confidence is very high that the next data point collected will be within this range. This range may therefore be appropriate to use as acceptance criteria based on the statistics. If the confidence interval were wider than the +/- 3 S.D. range, the data would have to be re-examined for errors in calculating the degree of normality, the +/- 3 S.D. range, the confidence interval, or the outliers, or for errors in the sampling technique, in order to rule out computational error.
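The confidence interval calculation of the form mean ± z·s/√n can be sketched as follows. The z-values are standard normal critical values (e.g. 2.576 for 99%, 3.291 for 99.9%, two-sided); the data set is hypothetical:

```python
import math
import statistics

def confidence_interval(data, z):
    """Confidence interval for the mean: mean +/- z * s / sqrt(n).

    z is the standard normal critical value, e.g. 2.576 for 99%
    and 3.291 for 99.9% (two-sided).
    """
    mean = statistics.fmean(data)
    half_width = z * statistics.stdev(data) / math.sqrt(len(data))
    return mean - half_width, mean + half_width

# Hypothetical pH data: 99% confidence interval for the mean.
low, high = confidence_interval([6.1, 6.2, 6.3, 6.2, 6.25, 6.15], 2.576)
```

Because the half-width shrinks with √n, the interval narrows as more batches are added, which is one more argument for a larger retrospective sample.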
Process Capability Indices
Another method for setting ranges to be used as acceptance criteria is to use the Process Capability Indices, defined by:
CPu = (USL – µ) / 3σ
CPl = (µ – LSL) / 3σ
USL = Upper Specification Limit
LSL = Lower Specification Limit
where σ is the process standard deviation, so 3σ corresponds to the 3 S.D. used in our example.
An industry-accepted standard CP would be 1.33. 1 This would mean that only 0.003% of the testing results would be out of specification, or that 99.997% would be within specification. This is a similar concept to the confidence level.
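The capability indices above translate directly into code. The specification limits and process values in this sketch are hypothetical:

```python
def capability_indices(usl, lsl, mean, sd):
    """Upper and lower process capability indices.

    CPu = (USL - mean) / (3 * sd), CPl = (mean - LSL) / (3 * sd).
    Values of at least 1.33 are the commonly cited industry target.
    """
    cpu = (usl - mean) / (3 * sd)
    cpl = (mean - lsl) / (3 * sd)
    return cpu, cpl

# Hypothetical: pH specification 5.8-6.6, process mean 6.23, S.D. 0.075.
cpu, cpl = capability_indices(6.6, 5.8, 6.23, 0.075)
```

Here both indices exceed 1.33, so the hypothetical process comfortably meets the cited industry target on both sides of the specification.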
Recording the Maximum and Minimum
Recording the maximum and minimum values in the data is important because it is a quick way to see whether all the data is within the +/- 3 S.D. range. If the maximum and minimum are within the +/- 3 S.D. range, then there is an additional level of confidence, since all of the data would be within the range. Lastly, the data may be determined to be non-normally distributed; in such a case, the +/- 3 S.D. range may predict too high a possibility of failure, so in the interim the maximum and minimum values can be selected as the range until further data can be collected (this refers back to increasing the sample size in order to approach a more normal distribution).
Assign Acceptance Criteria Range and Justify
Using all of the above analysis techniques, knowledge of the process, and agreement by a cross-discipline committee, acceptance criteria ranges can be assigned for the critical parameters and attributes. A general course of action would be to start by recording all the data for a given point in a spreadsheet; calculating the mean, S.D., population size, +/- 3 S.D., and 95% and 99% confidence intervals; plotting the trend charts with the appropriate ranges; and then deciding which range makes the best sense. When selecting the acceptance criteria, a cross-functional committee should be utilized, with backgrounds in QA, Manufacturing, Validation, R&D, and Engineering. The ranges should be selected and justified by scientifically sound data and conclusions. The ranges should be within the PAR (Proven Acceptable Range) for the product, which means that if +/- 3 S.D. is selected, the range should be checked at the upper and lower limits to verify that acceptable product is prepared. This should be done prior to final agreement on the range and incorporation into the validation protocol. A report should be written to document the ranges, the rationale for selecting them, the justification for the limits, and any determination that the ranges are within the PAR. Additionally, ranges that are not to be included should be discussed within the report to justify why they are excluded. A process validation protocol should be prepared with these ranges as acceptance criteria, and the process should be run at a target within the acceptance criteria ranges at least three consecutive times using identical procedures to verify that the process is valid.
Since the ideal case of validating a process during its implementation does not always exist in the pharmaceutical, biopharmaceutical, biotechnology or medical device industries, it may be important to determine a way to validate these processes using historical data. The historical data can be found in a variety of places as long as it is approved (e.g. approved and completed BPRs or quality control release documents, etc). A cross-functional team should perform a risk assessment on the parameters and attributes to determine which ones would be included in the process validation. A range establishing study for the attributes and parameters should be performed to evaluate historical data and analyze the data set for the concepts of normality, variation (standard deviation), and confidence. With a high degree of confidence, acceptance criteria ranges should be set for each parameter and attribute and a process validation protocol should be written with the appropriate ranges. This protocol should be approved and executed at target settings within the acceptance criteria ranges, from the start of the manufacturing process to the finish using qualified equipment, approved SOPs, and trained operators. In a final report for the process validation, the degree to which the process is valid would be determined by the satisfaction of the approved acceptance criteria.
1. Box, G.E.P., and Cox, D.R. (1964), “An Analysis of Transformations,” J. Roy. Stat. Soc., Ser. B., 26, 211.
2. Kieffer, Robert and Torbeck, Lynn, (1998), Pharmaceutical Technology, (June), 66.
3. Lane, David M., Rice Virtual Lab in Statistics (1993-2000), HyperStat Online, Houston TX, WEB: http://www.ruf.rice.edu/~lane/rvls.html.
4. National Center for Drugs and Biologics and National Center for Devices and Radiological Health,(1987) “Guidelines on General Principles of Process Validation,” Rockville MD. 15 MAY.
5. Ontario Ministry of Agriculture, Food and Rural Affairs (2000), Queen’s Printer for Ontario, Last Updated March 22, 2000; Web: http://www.gov.on.ca/omafra/english/research/risk/assum1b.html.
6. StatSoft, Inc. (1999). Electronic Statistics Textbook. Tulsa, OK: StatSoft. WEB: http://www.statsoft.com/textbook/stathome.html.
7. U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Center for Biologics Evaluation and Research, Center for Veterinary Medicine, “Guidance for Industry: Manufacturing, Processing, or Holding Active Pharmaceutical Ingredients,” Rockville, MD, March 1998, 36.
ICH Q10 was published in its final version already in 2008. However, today many companies still have problems to understand how to implement ICH Q10 “Pharmaceutical Quality System” into practice. Quality Assurance and GMP are basic requirements which have been implemented for many years in the pharmaceutical industry (including the API industry). So what is needed to demonstrate that a Pharmaceutical Quality System has been implemented? Please read more about the GMP Questions and Answers.
ICH offers a set of questions and answers which provide more details about the expectations. They were published in 2009 already but are not well-known by the industry. ICH writes: “When implemented, a company will demonstrate the use of an effective PQS through its documentation (e.g., policies, standards), its processes, its training/qualification, its management, its continual improvement efforts, and its performance against pre-defined key performance indicators (see ICH Q10 glossary on performance indicator). A mechanism should be established to demonstrate at a site how the PQS operates across the product lifecycle, in an easily understandable way for management, staff, and regulatory inspectors, e.g., a quality manual, documentation, flowcharts, procedures. Companies can implement a program in which the PQS is routinely audited in-house (i.e., internal audit program) to ensure that the system is functioning at a high level.”
The questions and answers document also states that there is no certification program in place for a Pharmaceutical Quality System. In addition, ICH provides information about how product-related inspections will differ in an ICH Q8, Q9 and Q10 environment. ICH writes: “In the case of product-related inspection (in particular, preauthorization) depending on the complexity of the product and/or process, greater collaboration between inspectors and assessors could be helpful (for example, for the assessment of development data). The inspection would normally occur at the proposed commercial manufacturing site, and there is likely to be greater focus on enhanced process understanding and understanding relationships, e.g., critical quality attributes (CQAs), critical process parameters (CPPs). The inspection might also focus on the application and implementation of quality risk management principles, as supported by the pharmaceutical quality system (PQS).”
In addition to ICH, regulatory authorities also provide further information. The British Authority MHRA, for example, answers the question: Should a company have a procedure to describe how it approaches QRM related to manufacture and GMP? The answer is: “Yes, the procedure should be integrated with the quality system and apply to planned and unplanned risk assessments. It is an expectation of Chapter 1 that companies embody quality risk management. The standard operating procedure (SOP) should define how the management system operates and its general approach to both planned and unplanned risk management. It should include scope, responsibilities, controls, approvals, management systems, applicability, and exclusions.”
The ECA Academy summarised the most relevant questions and answers from regulators such as ICH, EMA and FDA in a GMP Questions & Answers Guide which allows readers to search for specific GMP questions. A subject index at the beginning of the document lists the most frequently searched terms.
GDUFA: FDA’s new Guidance on Self-Identification of Generic Drug Manufacturers
FDA’s new Guidance requesting generic drug manufacturers who want to export to the USA to self-identify has recently been published in a finalised form. Read more here about what types of generic drug manufacturers are affected and which company data are required by the FDA.
The GDUFA (Generic Drug User Fee Amendments) is a legislative package which came into force in 2012 and entitles the US-American FDA to collect fees from generic drug manufacturers that seek a marketing authorisation for the American market. An annual fee has to be paid after successful registration.
The core of the document is the obligation to “self-identify” for those companies that have to submit essential site-related information to the FDA. The details of this self-identification are set out in a Guidance for Industry entitled “Self-Identification of Generic Drug Facilities, Sites, and Organizations”, published in finalised form by the FDA on 22 September 2016.
The Guidance describes the following elements:
1. Which types of generic facilities, sites, and organizations are required to self-identify?
2. What information is requested?
3. What technical standards are to be used for electronically submitting the requested information?
4. What is the penalty for failing to self-identify?
Hereinafter, you will find a short summary of these four topics:
1. Companies that manufacture finished generic medicinal products for human use, or the APIs for them, or both, are required to self-identify, as are companies that package the finished generic drug into the primary container and label it. In addition, sites that – pursuant to a contract with the applicant (the generic drug manufacturer) – repack or redistribute the finished drug from a primary container into a different primary container are required to submit a self-identification, as are sites that perform bioequivalence/bioavailability studies. Last but not least, the obligation to self-identify also concerns sites listed in the application dossier as contract laboratories for sampling and analytical testing.
2. Essential data are: the D-U-N-S number (a unique nine-digit sequence specific to each site, i.e. each distinct physical location of an entity), the Facility Establishment Identifier (FEI, an identifier used by the FDA for the planning and tracking of inspections), and general information with regard to the facility (company owner, type of business operation, contact data, information about the manufacture of non-generic drugs).
3. The HL7 SPL standard (Health Level Seven Structured Product Labeling) required for generic applications (ANDAs) also has to be used for the submission of self-identification information. A detailed description of this standard can be found in the Guidance “Providing Regulatory Submissions in Electronic Format – Drug Establishment Registration and Drug Listing“.
4. Companies that fail to self-identify do not face an explicit penalty. However, such a failure has two drawbacks: first, the likelihood of a pre-approval site inspection by the FDA is higher. The second, much more serious, drawback is that all APIs or finished drugs from a manufacturer who has not self-identified are deemed misbranded; for the FDA, such products are not allowed for importation into the USA.
In the FDA’s view, the regulations set out in the GDUFA and the provisions laid down in the new Guidance make a major contribution to enhanced transparency, in particular for complex supply chains.
Analytical Lifecycle: USP <1210> “Statistical Tools”, Analytical Target Profile and Analytical Control Strategy
The United States Pharmacopeia (USP) is currently undertaking further steps towards a comprehensive analytical lifecycle approach by publishing a draft of a new General Chapter <1210> Statistical Tools for Procedure Validation and two Stimuli Articles regarding Analytical Target Profile and Analytical Control Strategy in Pharmacopeial Forum. Read more about the life cycle concept for analytical procedures.
Following the recently announced elaboration of a new general chapter <1220> “The Analytical Procedure Lifecycle”, the United States Pharmacopeia (USP) is now proceeding with its comprehensive analytical lifecycle concept. A further step towards this approach is the draft of a new USP General Chapter <1210> Statistical Tools for Procedure Validation, which was published in Pharmacopeial Forum (PF) 42(5) in September 2016. The comment deadline is November 30, 2016.
Additionally, two Stimuli Articles regarding “Analytical Control Strategy” and “Analytical Target Profile: Structure and Application Throughout The Analytical Lifecycle” appeared in the same issue of the PF.
In the draft chapter <1210> Statistical Tools for Procedure Validation, the USP Statistics Expert Committee presents a revision to the proposal of <1210> published in PF 40(5) [Sept.–Oct. 2014]. On the basis of the comments and feedback given by stakeholders, the committee has addressed their concerns about the narrow scope and details on methodology to be used. The chapter is proposed as a companion to general chapter <1225> Validation of Compendial Procedures with the purpose of providing statistical methods that can be used in the validation of analytical procedures. A revision of general chapter <1225>, including a new section on Lifecycle Management of Analytical Procedures, has been published for comment in PF 42(2) in March 2016.
Specifically, the revision clarifies the accuracy and precision calculations while removing specific linearity requirements. Linearity may be inferred from accuracy or other statistical methods as deemed appropriate. The chapter discusses all of the following analytical performance characteristics from a statistical perspective:
- detection limit,
- quantitation limit, and
- linearity.
Additional related topics that are discussed in the draft include statistical power, two one-sided tests of statistical equivalence, tolerance intervals, and prediction intervals.
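To make the two one-sided tests (TOST) of statistical equivalence mentioned above more concrete, here is a minimal sketch in Python using only the standard library. It uses a large-sample normal approximation; the function name, the 100% ± 2% recovery margin, and all numbers are illustrative assumptions, not taken from the draft chapter.

```python
# Sketch of a two one-sided tests (TOST) equivalence check, using a
# large-sample z approximation. All names and numbers are hypothetical.
from math import sqrt
from statistics import NormalDist

def tost_equivalence(mean, sd, n, target, margin, alpha=0.05):
    """Return (p_value, equivalent) for H1: |mean - target| < margin."""
    se = sd / sqrt(n)
    z_lower = (mean - (target - margin)) / se    # test against lower bound
    z_upper = ((target + margin) - mean) / se    # test against upper bound
    phi = NormalDist().cdf
    p = max(1 - phi(z_lower), 1 - phi(z_upper))  # larger one-sided p-value
    return p, p < alpha

# Example: mean recovery 100.4% (sd 1.2, n = 30) vs. a 100% +/- 2% margin
p, ok = tost_equivalence(100.4, 1.2, 30, target=100.0, margin=2.0)
print(f"p = {p:.4g}, equivalent: {ok}")
```

Unlike a conventional significance test, TOST rejects only when both one-sided tests show the mean lies inside the equivalence margin, so "no significant difference" is never mistaken for "equivalent".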
Furthermore, up to now, four Stimuli Articles regarding the analytical lifecycle have been published:
- “Lifecycle Management of Analytical Procedures: Method Development, Procedure Performance Qualification, and Procedure Performance Verification” in PF 39(5),
- “Fitness for Use: Decision Rules and Target Measurement Uncertainty” in PF 42(2),
and new in PF 42(5)
- “Analytical Control Strategy”, and
- “Analytical Target Profile: Structure and Application Throughout The Analytical Lifecycle”.
The Analytical Target Profile (ATP) is the focal point of the lifecycle approach. It is comparable to the Quality Target Product Profile (QTPP) defined in ICH Q8. The Stimuli Article emphasizes “that the current approach to development, validation, verification, and transfer of analytical procedures has served the industry well.” The lifecycle approach – comprising development (design, stage 1), qualification (stage 2), and monitoring of the performance of analytical procedures (control strategy, stage 3) – extends the current guidance, taking advantage of lessons learned from ICH Q8 quality by design (QbD) concepts. The Article considers in particular the following questions (and provides examples):
- What is an ATP, and why is it useful?
- How can the ATP criteria be established?
- How can an ATP be applied during the three stages of the procedure lifecycle?
According to the article, “an additional advantage of using an ATP is that it can drive the development of a robust control strategy, resulting in better, more consistent performance of an analytical procedure throughout its lifecycle.”
In the Stimuli Article on the Analytical Control Strategy (ACS), the following questions are discussed:
- What is the ACS?
- What is the relationship between the ACS and the ATP?
- What is the quality risk management (QRM) process and how can it be applied to an analytical procedure?
- How does the ACS apply to the product lifecycle?
Additionally, examples of the following are provided:
- How to develop an ACS using the QRM process, and
- How to develop and apply a risk-based replicate strategy to minimize variability.
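A replicate strategy of the kind mentioned in the last bullet rests on a standard statistical result: reporting the mean of n independent determinations reduces the standard deviation of the reportable result by a factor of √n. The sketch below illustrates this with hypothetical numbers; the function name and the variability figures are assumptions for illustration only.

```python
# Illustration of a risk-based replicate strategy: averaging n independent
# determinations shrinks the sd of the reported result by sqrt(n).
# All numbers below are hypothetical.
from math import ceil, sqrt

def replicates_needed(procedure_sd, target_sd):
    """Smallest n such that procedure_sd / sqrt(n) <= target_sd."""
    return ceil((procedure_sd / target_sd) ** 2)

procedure_sd = 0.9   # sd of a single determination (% of label claim)
target_sd = 0.5      # sd required of the reportable (mean) result
n = replicates_needed(procedure_sd, target_sd)
print(f"replicates: {n}, resulting sd: {procedure_sd / sqrt(n):.3f}")
```

Because variability falls only with the square root of n, doubling the replicates never halves the standard deviation, which is why a risk-based choice of n (rather than "more is better") is worthwhile.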
The USP Expert Panel would appreciate any feedback on the suggested approaches, as well as any alternative approaches for consideration.
After registering on the USP Pharmacopeial Forum website, you can access the proposal for general chapter <1210> and the complete Stimuli Articles. Because of the importance of the new USP chapter <1220>, ECA and USP are joining forces to organise the first joint event “Lifecycle Approach of Analytical Procedures“ in Prague, Czech Republic, from 8 to 9 November 2016.
All participants of the GMP training course “Product Transfer” will receive a special version of the Guideline Manager CD including documents and templates useable for site change projects.
According to the European GMP rules, written procedures for transfer activities and their documentation are required. For example, a Transfer SOP, a transfer plan and a transfer report are now mandatory and will be checked during inspections.
As a participant of the GMP education course “Product Transfer” in Berlin, from 25-27 October 2016, you will receive a special version of the Guideline Manager CD with a dedicated section on product transfers. This section contains, among other things, a Transfer SOP and a template for a Transfer Plan. Both documents are in Word format and can be used immediately after adaptation to your own situation.
Regulatory Guidance Documents like the WHO guideline on transfer of technology in pharmaceutical manufacturing and the EU/US Variation Guidelines, are also part of the Guideline Manager CD. Due to copyright reasons, this CD is not available for purchase and can only be handed out to participants of the Product Transfer course.