Reference-standard materials and their relevance

The US Food and Drug Administration defines a reference-standard material as a “highly purified compound that is well characterized”.

The US Pharmacopeia (USP) defines reference-standard materials as “highly characterized specimens of drug substances, excipients, reportable impurities, degradation products, compendial reagents, and performance calibrators”.

Scientists performing analytical testing use reference standards to generate quantitative data (e.g., assay and impurity results) and qualitative data (e.g., identification tests), and as performance standards and calibrators (e.g., melting-point standards). The quality and purity of reference standards are therefore critical for obtaining scientifically valid results.

Reference standards can be segregated into two groups: chemical and nuclidic. Chemical purity must be determined for both groups; nuclidic reference standards, however, must also be evaluated for radionuclidic and radiochemical purity.

Types of reference-standard materials

Reference-standard materials can be broadly categorized as follows:

  • Assays—used to determine the potency of active pharmaceutical ingredients (APIs) and their salts
  • Degradation products—used to identify, and possibly quantitate, degradation products
  • Process impurities—used to identify, and possibly quantitate, process-related compounds
  • Resolution—used to demonstrate the performance (e.g., resolution) of assay or impurity methods
  • Metabolites—used to identify, and possibly quantitate, substances generated through metabolic processes.

The level of characterization depends on the intended use of the reference standard. For example, a reference standard used to determine potency requires full characterization and qualification, whereas one used as a resolution component or for identification requires less rigorous analysis.

Sources of reference-standard materials

Reference standards can be compendial or noncompendial and are typically obtained from the following sources.

Compendial (primary):

  • Pharmacopeias such as the United States Pharmacopeia (USP), European Pharmacopoeia (EP), or Japanese Pharmacopoeia (JP)
  • Nationally recognized standards institutions such as the National Institute of Standards and Technology (NIST).

Noncompendial (secondary):

  • The user (who custom-manufactures or synthesizes the reference standard)
  • Contract manufacturer
  • Commercial chemical suppliers.

Reference-standard materials that are synthesized by the user or supplied by a contract manufacturer or secondary source must be characterized. The reference standard and the drug substance may initially be synthesized by the same process; because the reference standard should be of the highest purity possible, however, the drug substance may require further purification before it can serve as a reference standard (any additional purification steps should be fully described and included in the regulatory filing).

Storage and impurity detection

Impurities, classified as organic (process- and drug-related), inorganic, or residual solvents, can be introduced during the manufacturing process for the drug substance, drug product, or excipient, or through storage of the material. Impurities should be controlled throughout the manufacturing process, and process-related impurities should be kept to a minimum to avoid degradation and unwanted pharmacological effects. Compounds that are susceptible to hydrolysis, for example, should be thoroughly dried to remove moisture and then stored in a desiccator. Reference standards that contain a high percentage of organic volatile impurities may experience purity changes over time as the solvents evaporate. Residual acetone, a Class 3 solvent, for example, is permissible up to 5000 ppm (0.5%) according to USP and ICH guidelines. Because acetone is volatile, the amount present may change during storage and thereby alter the reference standard’s purity.
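As a rough numerical sketch (in Python, using hypothetical assay values rather than figures from any monograph), the conversion of the Class 3 limit from ppm to percent, and the way loss of residual acetone shifts the measured purity of the remaining material, look like this:

```python
# Illustrative only: hypothetical assay values, not monograph data.

ACETONE_LIMIT_PPM = 5000                      # ICH/USP Class 3 limit for acetone
limit_percent = ACETONE_LIMIT_PPM / 10_000    # 5000 ppm = 0.5% w/w
print(f"Class 3 limit: {limit_percent:.2f}% w/w")

# Hypothetical reference standard as initially characterized (w/w):
api_content = 99.0       # % assay of the compound of interest
acetone_content = 0.5    # % residual acetone

# If the acetone evaporates during storage, the same mass of compound is
# contained in a smaller total mass, so the measured purity rises.
remaining_mass = 100.0 - acetone_content
purity_after_loss = api_content / remaining_mass * 100
print(f"Assay before solvent loss: {api_content:.2f}%")
print(f"Assay after solvent loss:  {purity_after_loss:.2f}%")    # ~99.50%
```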

Another reason to limit impurities is illustrated by the following scenario. Consider a reference standard that is 90% pure. The remaining 10% of impurities must be identified and monitored throughout the life of the material; more analytical tests must be performed, and the probability of the purity changing during the review period increases. A reference standard with a purity of 99.9%, by contrast, requires less additional characterization, has less potential for degradation, and can be monitored more effectively. Such a standard also reduces the systematic and random error contributed by the combined analytical tests.
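The effect on combined error can be sketched as follows. This assumes a simple mass-balance purity assignment (100% minus the sum of quantitated impurities) and a root-sum-square combination of independent test uncertainties; both are illustrative assumptions, not requirements from any guideline, and the numbers are hypothetical:

```python
import math

def mass_balance_purity(impurities_percent):
    """Purity assigned as 100% minus the sum of quantitated impurities."""
    return 100.0 - sum(impurities_percent)

def combined_uncertainty(uncertainties_percent):
    """Root-sum-square combination of independent test uncertainties
    (an illustrative assumption, not a regulatory rule)."""
    return math.sqrt(sum(u ** 2 for u in uncertainties_percent))

# Hypothetical 90%-pure standard: ten impurities at 1.0% each,
# each quantitated with +/-0.1% uncertainty.
print(mass_balance_purity([1.0] * 10), combined_uncertainty([0.1] * 10))  # 90.0, ~0.32

# Hypothetical 99.9%-pure standard: one impurity at 0.1%, +/-0.02% uncertainty.
print(mass_balance_purity([0.1]), combined_uncertainty([0.02]))           # 99.9, 0.02
```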

If the reference standard is in a salt form, the amount of salt present must be determined so that the purity can be corrected for content. Applying a molecular-weight correction alone will not account for residual salt that may be produced during synthesis. If possible, the reference standard should be in a salt-free form to reduce the number of characterization tests required.
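For the molecular-weight part of that correction, a minimal sketch is shown below. The free-base molecular weight and the salt assay are hypothetical values, and, as noted above, this ratio alone does not capture any excess or residual counterion, which must be determined experimentally:

```python
# Molecular-weight correction of a salt-form assay to free-base content.
# Hypothetical hydrochloride salt; numbers are illustrative only.

MW_FREE_BASE = 300.0        # g/mol (hypothetical free base)
MW_HCL = 36.46              # g/mol, hydrogen chloride
MW_SALT = MW_FREE_BASE + MW_HCL

salt_assay_percent = 99.5   # purity measured on the salt form (% w/w)

free_base_content = salt_assay_percent * MW_FREE_BASE / MW_SALT
print(f"Free-base content: {free_base_content:.2f}% w/w")   # ~88.7%
```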

