The development of bioanalytical methods and subsequent quantification of analytes in a range of biological matrices is fundamental to the drug discovery and development process. Having a reliable and robust methodology, for all target species, including human, plays a key role in understanding how investigational drugs are metabolised following administration.
By accurately evaluating both the analytes and metabolites associated with the parent compound within appropriate biological fluids such as saliva, plasma or urine, we can answer important questions about how the candidate drug behaves and facilitate confident decision-making.
This article discusses bioanalytical method development, the potential challenges involved in this process, and how skilled scientists in bioanalytical laboratories can overcome these.
What is bioanalytical method development?
In clinical development, bioanalysis is the process of measuring analytes in biological samples. This includes the quantitative analysis of drugs, their metabolites, and biomarkers in biological matrices.
Bioanalytical method development is the process of establishing the procedures used to test biological samples, ultimately supporting the safety and pharmacokinetic evaluation of a candidate drug undergoing testing. Bioanalysis can be carried out using a range of different techniques depending on the target molecule.
For example, a method may employ LC-MS-MS or ELISA: LC-MS-MS is typically suited to small molecule analysis, and ELISA to macromolecule (protein) analysis. The purpose of bioanalytical method development is to establish a robust, reproducible and reliable method of analysis for the target analytes that can be used to produce concentration data. This includes outlining the operating conditions of the equipment used, any limitations, and the appropriateness of the methodology for its intended use.
Bioanalysis method development processes also outline:
- Sampling techniques
- Sample preparation
- Data evaluation methods
Bioanalytical method validation
Following development, bioanalytical method validation is then used to demonstrate the reliability and reproducibility of the proposed methodology. During the validation stage the same methodology is used to measure known standards and quality controls within the appropriate biological matrix to check the accuracy, precision, selectivity, sensitivity, reproducibility, and stability of the method which has been developed.
This validation stage ensures that the data obtained are reliable and able to inform critical decisions throughout the drug development journey, answering important questions about the accuracy, variability and limitations of the bioanalytical method. It also helps to answer questions around the interpretation of bioavailability, bioequivalence, toxicokinetic and pharmacokinetic studies carried out in drug development.
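As a concrete illustration of the accuracy and precision checks described above, the calculation can be sketched in a few lines. The replicate values and nominal concentration below are made up for illustration; acceptance limits of ±15% (±20% at the LLOQ) are commonly applied in regulated bioanalysis.

```python
# Hypothetical QC replicate concentrations (ng/mL) at a nominal 50 ng/mL
replicates = [48.2, 51.1, 49.6, 50.8, 47.9]
nominal = 50.0

n = len(replicates)
mean = sum(replicates) / n

# Accuracy expressed as % bias from the nominal concentration
bias_pct = (mean - nominal) / nominal * 100

# Precision expressed as % coefficient of variation (CV), using the
# sample standard deviation (n - 1 denominator)
variance = sum((x - mean) ** 2 for x in replicates) / (n - 1)
cv_pct = (variance ** 0.5) / mean * 100

print(f"Accuracy (bias): {bias_pct:+.1f}%")
print(f"Precision (CV):  {cv_pct:.1f}%")
```

With these made-up replicates both figures fall comfortably inside a ±15% limit, so the QC level would pass.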
Common challenges in bioanalytical method development and validation
Bioanalytical method development is a complex process that requires an agile approach. As a result, researchers may face several challenges when developing and validating biological analysis methods. Some of the most common challenges outlined by FDA reviewers include:
- Flawed extraction techniques
- Analytical issues
- Calculation problems
- Reporting issues
- Sample shipping issues
Flawed extraction techniques
Extraction is a critical component of bioanalysis, and involves both the removal of any interfering components and the preparation of the sample for analysis. There are several techniques used for extraction. Common types include, but are not limited to:
- liquid-liquid extraction
- protein precipitation
- solid phase extraction (SPE)
Each of these extraction methods applies a different technique to prepare the sample for analysis and isolate the desired analyte from the biological matrix. The extraction approach depends on the type of sample and its unique characteristics: for example, urine is high in salt content, plasma is high in phospholipids, and whole blood is high in red blood cell content.
Bioanalytical method development relies on efficient extraction. If the sample is not prepared appropriately, bioanalysis data cannot be reliably obtained. As a result, flawed extraction techniques can result in significant challenges to bioanalytical method development and validation.
As well as challenges in the sample preparation stage, issues can also arise during the sample analysis stage.
High-performance liquid chromatography (HPLC)
High-performance liquid chromatography (HPLC) is an analytical technique used in bioanalysis to separate and identify components in a biological sample. It is the most common chromatography technique used in bioanalytical laboratories across the globe.
HPLC certainly has its advantages, though problems may still arise in bioanalytical method development. Some of the issues that investigators encounter with HPLC include optimising column chromatography, mobile phase contamination, and carry-over in the autosampler.
Optimising column chromatography: One of the most common problems that can occur with analytical columns in HPLC is deterioration. Signs of deterioration can include peak shape problems, such as split peaks and shoulders, lower retention times, decreased resolution and high back pressure. These signs can suggest that contaminants are present on the column inlet, or that there are channels or voids affecting the packing bed.
Optimising column chromatography is particularly important in higher efficiency columns, where deterioration is usually more evident. In order to prevent deterioration, it is vital that researchers ensure that the appropriate column protection and sample preparation measures are in place to get the most out of each column.
Mobile phase contamination: Contamination in the mobile phase can lead to rising baselines, noise and spikes in the chromatogram, ultimately affecting the reliability of the HPLC results. One of the main sources of contamination is water – even distilled and deionised water can cause mobile phase contamination.
To prevent water contamination from causing issues, researchers should only use distilled or deionised water, and furthermore, remove additional contaminants by passing the deionised water through a preparative C18 column or activated charcoal. Another way that investigators can avoid mobile phase contamination is to use only HPLC grade solvents, ion pair reagents, salts, and acid modifiers.
Failure to prevent mobile phase contamination can result in analytical issues and affect the reliability of a bioanalytical method.
Liquid chromatography-tandem mass spectrometry (LC-MS-MS)
Liquid chromatography-tandem mass spectrometry (LC-MS-MS) is an analytical technique used to identify and quantify compounds in a sample. This method combines a physical separation technique (liquid chromatography) with a mass analysis technique (tandem mass spectrometry).
With its multiplex capability, LC-MS-MS is a powerful technique commonly used for bioanalysis. However, that does not mean it comes without challenges. Issues can occur with the instrumentation if it is not optimised correctly, particularly in the mass spectrometry parameters.
For instance, issues can arise from contamination, complex sample matrices, trace-level analytes, and complex sample preparation procedures. As a result, it is vital that investigators optimise mass spectrometry parameters correctly to ensure the data are reliable for bioanalytical method development.
Internal standard selection
As part of the process of method development, selecting an appropriate internal standard is the basis of a reliable chromatographic assay. Without one, losses during sample preparation and instrument variation cannot be accounted for; where a large number of steps are involved in sample extraction prior to analysis, this leads to high variability.
It is possible to develop a chromatographic assay without an internal standard, as is the practice with ELISA; however, it will not have the same accuracy and precision as an assay with an internal standard. As with an ELISA assay, wider acceptance limits may need to be considered, and the data produced should only be considered exploratory or semi-quantitative.
The selection of the right internal standard can make the difference between a good assay and one that simply isn't fit for purpose, producing unreliable data that lacks reproducibility.
The ideal internal standard is one as close in chemical structure as possible to the analyte, ensuring it undergoes the same separation from the biological matrix and behaves much the same in detection. FDA guidelines define an internal standard as "a structural analogue or stable isotope of the analyte". The very best internal standards are stable isotope labelled versions of the analyte. These are commonly labelled with deuterium and/or carbon-13, though nitrogen-15 can also be used. The object of labelling is to give the internal standard a mass at least 3 amu different from the analyte, and preferably more.
Isotope labelled compounds, being chemically identical, behave as if they are the target analyte, undergoing the same physical and chemical separation processes. However, having a different mass from the analyte, an internal standard can be quantitated independently when using mass spectrometry after chromatographic separation. The main drawback of an isotope labelled internal standard is the amount of unlabelled analyte it contains as an impurity, expressed as the isotopic purity of the internal standard. Too much unlabelled analyte in the internal standard reference material will limit the amount of internal standard that can be added before it interferes with determination of low levels of the analyte.
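To illustrate the isotopic-purity constraint, a quick back-of-envelope calculation shows how much unlabelled analyte an internal standard spike contributes. All figures below (spike level, purity, LLOQ, and the 20%-of-LLOQ interference rule of thumb) are illustrative assumptions, not values from a specific method.

```python
# Hypothetical figures: an isotope labelled internal standard spiked at
# 100 ng/mL with 99.5% isotopic purity carries 0.5% unlabelled analyte
# as an impurity.
is_concentration = 100.0   # ng/mL of internal standard added to each sample
isotopic_purity = 0.995    # fraction of the IS that is actually labelled

# Unlabelled analyte contributed by the IS spike itself
analyte_impurity = is_concentration * (1 - isotopic_purity)
print(f"Unlabelled analyte from IS spike: {analyte_impurity:.2f} ng/mL")

# If the assay LLOQ is 1 ng/mL, a common rule of thumb is that any
# interference at the LLOQ should stay well below 20% of the LLOQ response.
lloq = 1.0
print(f"Impurity as % of LLOQ: {analyte_impurity / lloq * 100:.0f}%")
```

Here the impurity would amount to 50% of the LLOQ, so the internal standard spike would need to be reduced, a higher-purity reference material sourced, or the LLOQ revised.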
The next best option for an internal standard is one that is almost chemically and structurally identical, differing only slightly from the analyte: for example, by a methyl, methylene, ethyl or hydroxyl group. Again, these should be selected such that they undergo the same physical and chemical processes during extraction. Several internal standard candidates may be evaluated during method development.
Quantitation by LC-MS-MS will show two chromatographic peaks at two different MRM transitions, much the same as with a stable isotope internal standard, except that the two peaks may also separate chromatographically depending on the column and mobile phase selected.
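The quantitation principle behind internal standardisation can be sketched briefly: the analyte/internal-standard peak-area ratio is interpolated on a calibration line of response ratio versus concentration. The standard concentrations, area ratios, and unknown-sample ratio below are invented for illustration, and a simple unweighted least-squares fit is used (regulated methods often apply 1/x or 1/x² weighting instead).

```python
# Minimal sketch of internal-standard quantitation (all numbers hypothetical).
# Calibration standards: known concentrations with analyte/IS peak-area ratios.
concs = [1.0, 5.0, 10.0, 50.0, 100.0]         # ng/mL
ratios = [0.021, 0.103, 0.205, 1.010, 2.040]  # analyte area / IS area

# Fit ratio = slope * conc + intercept by ordinary least squares
n = len(concs)
sx, sy = sum(concs), sum(ratios)
sxx = sum(x * x for x in concs)
sxy = sum(x * y for x, y in zip(concs, ratios))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Back-calculate an unknown sample from its measured peak-area ratio
unknown_ratio = 0.512
unknown_conc = (unknown_ratio - intercept) / slope
print(f"Estimated concentration: {unknown_conc:.1f} ng/mL")
```

Because the same ratio is formed for standards and unknowns, losses that affect analyte and internal standard equally cancel out of the calculation, which is the point of adding the internal standard in the first place.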
Reporting issues
Bioanalytical laboratories typically produce two types of report: method validation and analytical. These are sometimes appended, but may also exist as separate documents.
The analytical report should cover the following features:
- Title page
- GLP compliance and quality assurance statement
- Study summary table
- Method description
- Preparation and quality control standards
- Sample receipt and storage procedures
- Experimental phase
The FDA names reporting issues as one of the most common challenges in bioanalytical method development, and a complete, accurate report is a requirement for study approval.
Where the report contains issues, for example, problems with data quality, sample receipt, sample preparation procedures, or missing information, it cannot be accepted by regulatory bodies.
Sample shipping issues
Shipping issues can cause significant challenges in clinical research; late or missing shipments can cause study delays. Additionally, samples must be collected, handled and stored appropriately to assure the integrity of the sample and the reliability of the bioanalytical method data.
Investigators should outline the required steps to be followed when collecting biological samples. Furthermore, the conditions for handling samples must also be clear. The samples may need to be frozen during shipping, or stored at a specific temperature in order to ensure their integrity. They may also need to be stored for a specific period of time.
Accounting for these steps in the planning stages of a clinical study can help to mitigate future challenges in bioanalysis, ensuring the resulting study data are reliable.
Bioanalytical method development and sponsors
Bioanalytical method development and validation is a basic requirement for the drug approval process. Robust bioanalytical data are required for approval by both the European Medicines Agency (EMA) and the Food and Drug Administration (FDA). As a result, sponsors submitting an investigational new drug application (IND) must include bioanalytical methodology and validation data in order for the study to be approved.
Prior to developing bioanalytical methods, the sponsor should determine the analyte of interest and consider previous analytical methods that have proved successful for this compound.
In order to be suitable for validation, the methodology must consider the following parameters:
- Reference standards
- Critical reagents
- Calibration curve
- Quality control samples (QCs)
- Selectivity and specificity
- Stability of the analyte in the sample matrix
Any changes to previous or established bioanalytical procedures, along with any issues encountered and the subsequent solutions, should be recorded during the development stages of the methodology. This helps provide a rationale for the final methodology of choice.
GLP and GCP frameworks in bioanalysis
Both Good Laboratory Practice (GLP) and Good Clinical Practice (GCP) provide frameworks for clinical and preclinical study procedures, with the aim of ensuring the quality, integrity, and reliability of clinical data. In assessing data in both clinical and preclinical phases of research, bioanalytical laboratories have a responsibility to follow GLP and GCP guidelines. You can find more information on these guidelines in Organisation for Economic Co-operation and Development (OECD) and International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) documentation.
Bioanalytical laboratory services
We are a full-service CRO offering a range of bioanalytical support services. We provide sample evaluation and analysis from purpose-built bioanalytical laboratories, helping support your bioavailability, bioequivalence, toxicokinetic and pharmacokinetic studies.