Analytical Method Development and Validation (AMV) involves the process of creating and testing new analytical techniques to accurately measure components of products.
This method development can include both fundamental research and applying existing theories to predict unknowns. Once a method has been created, it must then be validated to ensure that it produces consistent results when compared against recognized standards.
This article will focus on the validation of newly developed methods, not revalidation of those already in use.
Revalidation involves retesting a method after its components or parameters have changed, to confirm that the data still support the same conclusions. Different types of validation exist depending on which performance characteristic of the method is being assessed, but every validation must demonstrate accurate results before the method can be accepted and used.
Steps of analytical method development
The first step in developing an analytical technique is to determine whether the current methods are adequate. If they are, then your task is done!
You have already started the process by defining what you want to measure and determining how to go about measuring it. The next steps should be familiar to you as a scientist — experimentation, theory, and application.
Experimentation means putting your new methodology to use by testing it on samples that you have already analyzed using other methods. This way you can compare your results with those of the past techniques without investing more money in equipment and reagents.
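The comparison described above can be sketched in a few lines of Python. The sample values and the simple mean-bias summary are hypothetical illustrations, not a prescribed statistical protocol:

```python
# Sketch: comparing a new method against an established one on the
# same samples. All concentration values here are made up.
reference = [4.8, 5.1, 6.0, 7.2, 5.5]   # results from the established method
new_method = [4.9, 5.0, 6.2, 7.1, 5.6]  # same samples, run by the new method

# Per-sample differences and the mean bias of the new method
diffs = [n - r for n, r in zip(new_method, reference)]
mean_bias = sum(diffs) / len(diffs)
print(f"mean bias: {mean_bias:+.2f}")
```

In practice, laboratories often go further than a mean bias (for example, Bland-Altman limits of agreement), but it is a quick first look at whether the two methods tell the same story.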
Theory refers to mathematical equations used to predict the result of experiments. While not always needed, there are times when theoretical calculations help make sense of experimental data.
Application comes last because no matter how well designed and rigorous your new analysis technique is, it will not prove its quality unless you apply it to something real.
Analytical method development does not necessarily mean creating a brand-new technique from scratch; often it means improving upon existing ones. By doing this, you shift the focus away from reinventing the wheel and onto making things better for future researchers.
Laboratory equipment needed
The first step in developing an analytical method is to evaluate whether the current methods are adequate for your research or not. If they are, then you can simply keep using them!
Often, however, researchers will need to modify these established protocols to make them more effective or to improve the accuracy of the analysis. This is where method development comes into play!
By defining what steps should be done during method development, we as scientists eliminate poor laboratory practices that may have worked well when the method was created but are no longer acceptable.
There are several different types of method development, such as changing reagents, altering reaction conditions, or even changing how the sample is treated once it reaches the laboratory. All of these affect how the chemical being analyzed reacts with other chemicals, and this impacts the outcome of the test.
So, how do you know if something is working? You could run a few samples through the new protocol to see if it works better than before, but that would require taking extra precautions to ensure that the results remain consistent.
Alternatively, you could run a lot of samples at one time under controlled settings to determine if the altered protocol produces similar, if not better, results. Both of those strategies are examples of method validation.
Sample preparation is one of the most important steps in any analysis you perform. If done improperly, there can be disastrous consequences for your test results.
Most analytical tests require samples of blood or other fluids, tissue pieces, or substances such as urine or feces to determine what chemicals are present in the specimen. These samples must be properly collected, stored, and processed to ensure accurate results.
Proper sample collection includes using correct needles or lancets to get the right amount of fluid, storing the sample correctly, and processing the sample correctly. All these steps depend on well-established protocols that work for your specific testing method!
Analysts who perform chemical analyses often have specialised equipment and reagents available to them. They may also develop their own internal procedures or use “standard” methods published by national or international laboratories with certified experts overseeing everything.
When performing an analysis on a new specimen, it is very helpful to run some preliminary experiments to see if the protocol works and whether the required controls are included.
An analytical instrument is any device used to analyze samples for chemical compounds or elements. This could be something as simple as a test tube with reagents, or more advanced devices such as gas chromatography (GC) machines or mass spectrometers (MS).
Many of these instruments require an initial step of method development before you can use them. This means figuring out what part of the sample you want to run in the machine, what reagent(s) need to be added, and how much of each reagent should be added to the sample.
Once this has been done, the analyst runs the experiment! The experimentalist takes their prepared sample and places it into the analyzer. They then start the equipment and wait for results.
The experimentalist also needs to make sure that they have enough of each reagent to last until the analysis is complete, so that the right amounts of chemicals are in the solution when the sample enters the machine.
Beyond determining if a method works, or is working for your lab tests, you will want to evaluate how well it performs its function. This is referred to as method verification or validation.
Methods that have been verified are good tools for your business, but there are many ways to verify them! There are different types of method validation, some more important than others.
Generalized accuracy, precision, and/or specificity testing are usually the first steps in most method validations. These tests determine whether the method is performing its job correctly by comparing results to a known “truth” or benchmark.
By doing this before using the test result too seriously, you can avoid wasting time and resources proving something that has already proven itself to work.
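As a rough sketch of those first checks: accuracy is often summarized as percent recovery against the known value, and precision as the coefficient of variation (CV) of replicate measurements. All numbers below are made up for illustration:

```python
import statistics

# Hypothetical replicate measurements of a control sample whose
# true ("benchmark") concentration is known to be 10.0 units.
true_value = 10.0
replicates = [9.8, 10.1, 10.0, 9.9, 10.2]

# Accuracy: mean result as a percentage of the known value (% recovery)
mean_result = statistics.mean(replicates)
recovery = 100 * mean_result / true_value

# Precision: coefficient of variation of the replicates (% CV)
cv = 100 * statistics.stdev(replicates) / mean_result

print(f"recovery: {recovery:.1f}%  CV: {cv:.1f}%")
```

A method can be precise without being accurate (tightly clustered results that all miss the true value), which is why both checks are usually run together.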
Testing the same sample twice within an hour or a day is not enough; you must also test the same sample at multiple concentrations and compare those results to establish the normal range. Results from these samples are then generalized to predict what other samples should show.
When calculating cutoffs or normal ranges, use the same statistical techniques that you would use in research studies, such as percentiles or z-scores.
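A minimal sketch of both approaches, using hypothetical measurements (a real reference-range study needs far more samples, plus a check that the data are roughly normal before relying on mean ± 2 SD):

```python
import statistics

# Hypothetical results from healthy-donor samples (a real study
# would use many more than ten).
values = [4.2, 4.5, 4.8, 5.0, 5.1, 5.3, 5.5, 5.8, 6.0, 6.4]

# Parametric cutoffs: mean +/- 2 standard deviations (z = +/-2)
mean = statistics.mean(values)
sd = statistics.stdev(values)
lower, upper = mean - 2 * sd, mean + 2 * sd

# Nonparametric cutoffs: 2.5th and 97.5th percentiles
# (n=40 gives cut points in 2.5% steps; 'inclusive' interpolates
# within the observed data rather than extrapolating beyond it)
q = statistics.quantiles(values, n=40, method="inclusive")
p2_5, p97_5 = q[0], q[-1]

print(f"mean +/- 2 SD: {lower:.2f} to {upper:.2f}")
print(f"2.5th to 97.5th percentile: {p2_5:.2f} to {p97_5:.2f}")
```

With skewed data the percentile approach is usually safer, since mean ± 2 SD assumes a roughly normal distribution.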
Factors to consider during method validation
During analytical method development, you will need to determine what factors must be considered for successful analysis. These include instrumentation, reagents, sample preparation, temperature, time, etc.
Instrumentation is the part of the test that produces a signal used to identify the presence or absence of an analyte in your samples. There are many different types of instruments including spectrophotometers, fluorimeters, mass spectrometers, and others. Each one has its benefits and drawbacks so it is important to know which ones work best with your particular analytes.
A common source of error when using an instrument is called carryover. This happens when leftover chemicals from a previous run remain in the system and affect the accuracy of the next measurement. Instruments can have special precautions against this, such as having separate waste containers for each chemical being used.
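One common way to quantify carryover is to run a blank immediately after a high-concentration sample and express the extra blank signal as a percentage of the high signal. The signal values below are hypothetical:

```python
# Hypothetical instrument signals: a high-concentration sample
# followed immediately by a blank, compared against a clean blank.
high_sample = 2000.0     # signal from the high-concentration run
blank_after_high = 12.0  # blank run immediately after the high sample
blank_baseline = 2.0     # blank run in a clean system

# Carryover as a percentage of the high signal, with the baseline
# blank subtracted from both terms.
carryover_pct = 100 * (blank_after_high - blank_baseline) / (high_sample - blank_baseline)
print(f"carryover: {carryover_pct:.2f}%")
```

Whether a given carryover percentage is acceptable depends on the assay; laboratories typically set a tolerance during validation and verify it periodically.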
Using appropriate equipment is an integral part of ensuring accurate results throughout every stage of a testing process. Different manufacturers’ versions may differ slightly, but they all follow similar protocols.
The next major factor in analytical method development is sample preparation or pre-analytics. This includes extracting chemicals from the tissue, breaking down large molecules into smaller ones, and/or purifying the analytes of interest.
Sample preparation can be very specific depending on what type of chemical you want to analyze. For example, if you wanted to determine how much calcium is in a given amount of milk, then you would need to extract it first!
There are many different ways to do this; whichever you choose, make sure you do not degrade the sample during preparation, as some substances break down faster than others while being processed.
Some examples of sample preparation include hydrolysis (the splitting up of larger molecules) for sugar analysis, drying the specimen before weighing, adding reagents to the specimens, etc. Depending on the analyte, adequate specificity and sensitivity cannot be guaranteed without appropriate pretreatment of the sample.
This is perhaps one of the most important steps in any analytical method development or validation. It can be referred to as quality control, precision testing, accuracy testing, verification, etc. The key here is defining what level of analysis this fits under. If you want to test how well your new technique works on samples that already have results from an established method, that is accuracy testing. Testing whether repeated runs of your procedure produce consistent results is precision testing.
If you want to make sure your new way of analyzing something is correct, then quality control is the best term to use. Your final determination will depend on which phase of analytical method development you are working on at the time.
Quality control is typically done after completion of each step in the process, but it can also be done during some stages. For example, if you were performing ICP-MS analysis, you would check to see if there was enough material left over from the previous stage to continue with the next one.