The Never-Ending Challenge of Pesticide Analysis
A growing target list, increasingly complex matrices, and the need for low limits of detection can make our field seem like an uphill treadmill. Here, I share my thoughts on some of the major challenges – and consider how new technology might help us push through the pain barrier.
sponsored by Thermo Fisher Scientific
Earlier this year, I had the pleasure of delivering the plenary lecture at the 1st International Symposium on Recent Developments in Pesticide Analysis in Prague, Czech Republic (watch the presentation online at: http://tas.txp.to/1115/Mastovska).
I wanted to provoke discussion, and so decided on a bold (perhaps even intimidating) title: “New and Never-Ending Challenges for Pesticide Routine Testing Laboratories.” Why do the challenges feel never-ending? Firstly, pesticide residue analysis must constantly react to three (ever-changing) compounding factors: large numbers of analytes, low limits of detection, and a diversity of matrices.
Moreover, the increasingly global nature of trade in the food industry adds to the mix. Wider sourcing of raw materials (and distribution of products), unknown pesticide use in certain regions, and different regional regulatory landscapes all add extra complexity and scope. At Covance, we are well aware of the global nature of the challenge and are focused on global harmonization. That means using the same robust methods, the same SOPs and quality systems – even the same laboratory information management systems – across the company, which is no mean feat.
From a regulatory point of view, even more challenges emerge. We know that there are different maximum residue limits and different compounds in use around the world, but pesticide residue analysis is more than just meeting the appropriate regional regulations. Global companies – and our clients – are increasingly interested in measuring everything, in everything, from everywhere – setting global specifications based on the strictest requirements in each case. Our target lists are growing...
For regulatory and contract labs, strange (and sometimes unknown) matrices are a regular occurrence – especially when it comes to botanicals and other supplements. And though analyzing an unknown sample for (known or unknown) pesticides is clearly an extreme case, it does highlight a challenge that will not go away: the matrix. Perhaps more importantly, it also highlights a trend; gone are the days when cereals, fruits and vegetables were the mainstay of analysis. The matrix challenge appears to be an ever-increasing circle that began with produce, grains and oils, and then expanded to include specialized matrices, such as spices, tea, cocoa, and so on. Today, the circle has grown bigger still, with herbal drug mixtures, dietary supplements... The list continues – as does the complexity.
Maintaining quality in the mayhem
In our labs, we use the SANCO guidelines for pesticide analysis both for validation and routine quality control as a minimum. The importance of quality control, particularly for difficult matrices, cannot be overstated. In these difficult matrices, quantitation accuracy can represent a significant challenge, because unknown matrix effects can potentially affect sample preparation (recovery) and quantification (signal suppression/enhancement).
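To make those two effects concrete, the sketch below shows the standard calculations: recovery from a spiked (fortified) sample, and matrix effect from the ratio of matrix-matched to solvent calibration slopes. It is a minimal Python illustration – the slope and concentration values are hypothetical, not drawn from our labs.

```python
# Minimal sketch: quantifying recovery and matrix effects.
# All numeric values below are hypothetical illustration values.

def matrix_effect_percent(slope_matrix: float, slope_solvent: float) -> float:
    """Signal suppression (negative) or enhancement (positive), in percent.

    Compares calibration slopes from matrix-matched vs. solvent standards.
    """
    return (slope_matrix / slope_solvent - 1.0) * 100.0

def recovery_percent(measured_conc: float, spiked_conc: float) -> float:
    """Extraction recovery from a spiked (fortified) sample, in percent."""
    return measured_conc / spiked_conc * 100.0

if __name__ == "__main__":
    # Hypothetical case: the matrix-matched slope is 30 percent lower than
    # in solvent (ion suppression), and the spike recovery is 85 percent.
    print(f"Matrix effect: {matrix_effect_percent(0.70, 1.00):+.0f} %")  # -30 %
    print(f"Recovery:      {recovery_percent(8.5, 10.0):.0f} %")         # 85 %
```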
Clearly, in all walks of analytical life, identification of contaminants is of paramount importance. The mere presence of certain unexpected contaminants could have huge economic implications (and may actually make quantification unnecessary in some cases). Conversely, the quantification of a wrongly identified compound is entirely pointless.
In short, we need very high confidence in our results. For identification with MS/MS, SANCO/12571/2013 states that the minimum should be:
- ≥ 2 product ions
- ± 30 percent maximum relative tolerance for ion ratios.
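As an illustration of how such a check might be automated, here is a minimal Python sketch that applies both criteria – at least two product ions, and ion ratios within ±30 percent (relative) of those measured for the reference standard. The peak areas are hypothetical; real systems would read them from the instrument data.

```python
# Minimal sketch of the SANCO/12571/2013 MS/MS identification check.
# Peak areas are hypothetical illustration values.

def passes_sanco_identification(sample_areas, reference_areas,
                                max_relative_tolerance=0.30):
    """Require >= 2 product ions, with each ion ratio within +/- 30 percent
    (relative) of the corresponding ratio in the reference standard."""
    if len(sample_areas) < 2 or len(sample_areas) != len(reference_areas):
        return False
    # Express each ion's area relative to the most abundant ion.
    base_sample, base_reference = max(sample_areas), max(reference_areas)
    for area_s, area_r in zip(sample_areas, reference_areas):
        ratio_s = area_s / base_sample
        ratio_r = area_r / base_reference
        if abs(ratio_s - ratio_r) / ratio_r > max_relative_tolerance:
            return False
    return True

# Two MRM transitions: sample ratio 0.48 vs. reference 0.50 -> within 30 %.
print(passes_sanco_identification([10000, 4800], [20000, 10000]))  # True
```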
But are we satisfied with minimum confidence? Notably, improved selectivity and identification confidence can be gained by developing methods that fully exploit the significant analyte overlap between GC-MS/MS and LC-MS/MS, using orthogonal selectivity as a means of confirmation. Another way of improving confidence in challenging matrices is by developing methods that closely evaluate multiple MS/MS transitions – not just the ones that offer sensitivity, but rather those that confer better selectivity.
Last but not least, the use of high-resolution accurate-mass (HRAM)-MS instruments, such as the Q Exactive™ systems, can increase confidence in compound identification by providing additional accurate mass information and thus increasing selectivity. And though right now we don’t use such technology routinely for pesticide analysis, in difficult cases (where other techniques have failed to give us the confidence we need), we have found the selectivity of HRAM-MS analysis very useful. In other application areas – for example, non-targeted analysis of adulterants – full-scan, accurate-mass, high-resolution data really comes into its own.
When we consider our ever-expanding target list (we are currently validating a method that covers over 500 compounds), the ability of HRAM-MS systems to perform non-targeted analysis starts to look increasingly attractive.
What do targeted and non-targeted really mean?
There appears to be a slight lack of consensus on the meaning of targeted and non-targeted – at least in my experience. From a holistic standpoint, you can consider the difference as two simple questions:
- Targeted: is compound X in the sample?
- Non-targeted: what is in the sample?
The reality is, of course, much more complex – and I believe that it is important to consider both data acquisition and data processing. If you are using analyte-specific conditions, then your data acquisition is targeted (for example, multiple/single reaction monitoring, selected ion monitoring). If not, you are acquiring data through non-targeted means (for example, full-scan MS, all-ion fragmentation, data-independent MS/MS). However, when it comes to data processing, the complexity increases; after all, can’t we process non-targeted data in a very targeted way?
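To make that distinction tangible, here is a minimal Python sketch of targeted processing applied to non-targeted acquisition data: extracting an ion chromatogram for one known analyte mass from full-scan spectra within a tight ppm window. The data structures are hypothetical, not any vendor’s actual API.

```python
# Minimal sketch: targeted processing of non-targeted (full-scan) data.
# Data structures are hypothetical, not any vendor's actual API.

def extract_ion_chromatogram(scans, target_mz, tolerance_ppm=5.0):
    """Build an extracted-ion chromatogram (XIC) for one known m/z from
    full-scan spectra: a list of (retention_time, summed_intensity) pairs."""
    half_window = target_mz * tolerance_ppm / 1e6
    xic = []
    for retention_time, peaks in scans:  # peaks: list of (m/z, intensity)
        intensity = sum(i for mz, i in peaks
                        if abs(mz - target_mz) <= half_window)
        xic.append((retention_time, intensity))
    return xic

# Hypothetical full-scan data: two scans with a peak near m/z 304.1521.
scans = [
    (5.01, [(304.1520, 1.2e5), (350.0000, 3.0e3)]),
    (5.05, [(304.1523, 2.4e5), (351.0000, 2.0e3)]),
]
print(extract_ion_chromatogram(scans, target_mz=304.1521))
```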
At this point, Rumsfeldian analogies are inevitable:
- Known knowns: targeted processing of targeted – or non-targeted – acquisition data, using analyte-specific conditions (retention time, MRM or selected ions) in the data processing method created with reference standards.
- Known unknowns: (non-)targeted processing of non-targeted acquisition data, using database/library search (fragment match, structure correlation, accurate mass) to get presumptive identification.
- Unknown unknowns: non-targeted processing of non-targeted acquisition data, using chemometric (differential or statistical) analysis, followed by identification of compounds of interest. A little like trying to find a needle in a haystack.
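As an illustration of the “known unknowns” case, the following minimal Python sketch performs presumptive identification by matching a measured accurate mass against library masses within a ppm tolerance. The two-entry suspect database is deliberately tiny and hypothetical – and any match is only presumptive: confirmation still requires retention time, fragments, and ultimately a reference standard.

```python
# Minimal sketch: presumptive identification ("known unknowns") by
# accurate-mass lookup. The suspect database is purely hypothetical.

SUSPECT_DATABASE = {
    # name: monoisotopic m/z of [M+H]+ (hypothetical illustration values)
    "imidacloprid": 256.0596,
    "thiamethoxam": 292.0271,
}

def presumptive_ids(measured_mz, tolerance_ppm=5.0):
    """Return suspects whose database mass matches within the ppm window.

    A hit is presumptive only; confirmation needs retention time,
    fragment ions, and ultimately a reference standard."""
    hits = []
    for name, db_mz in SUSPECT_DATABASE.items():
        ppm_error = (measured_mz - db_mz) / db_mz * 1e6
        if abs(ppm_error) <= tolerance_ppm:
            hits.append((name, round(ppm_error, 2)))
    return hits

print(presumptive_ids(256.0598))  # [('imidacloprid', 0.78)]
```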
The realities of non-targeted analysis
Having defined non-targeted analysis, we are now in a position to consider the challenges, which I hinted at earlier with the term “analyte-specific conditions.” When we think about non-targeted analysis, we typically focus on the mass spectrometry aspect. But in my presentation in Prague, I told the sad (but poetic) story of “Ten Little Pesticides,” where only one lonely pesticide was identified in non-targeted analysis. My point was: how do we know that all analytes of interest even make it to the data processing step? In other words, all steps of the analytical workflow (extraction, cleanup, separation, ionization, detection, identification) could lead to loss of analytes of interest. The real challenge here? Optimizing non-targeted methods and establishing adequate quality control for those methods.
Despite that warning about non-targeted approaches, let us not be too quick to dismiss the power of HRAM-MS in addressing some of the broader challenges in pesticide analysis. HRAM-MS has utility across the full spectrum of users, which includes academia, pesticide R&D labs, government, the food industry, and contract testing laboratories. We can break that down more simply into two areas: research and routine.
In research, HRAM-MS is clearly useful for discovery and identification of new metabolites, for fate studies for new pesticides, or for the identification of unexpected/illegal pesticides. For routine use, I believe HRAM-MS is well suited as a complementary tool to targeted analysis of pesticides for comprehensive testing or – especially in the commercial world – for the development of risk-based target lists for customized food-safety testing programs. Indeed, we are launching two non-targeted methods that we feel meet our clients’ needs.
What is potentially powerful in both areas is the ability to retrospectively interrogate data, which could be particularly interesting when considering emerging contaminants or investigating whether a new problem is in fact a new problem at all.
A single platform?
As the sensitivity of HRAM-MS instruments increases, I can see a point in the future where we can conduct both targeted analysis and non-targeted screening on a single platform – a very attractive proposition. In fact, for less complex matrices, we are probably pretty close to that point already. But...
Implementing new technology involves a great deal of effort for accredited routine labs (new method development, validation of all aspects), so I suspect that many laboratories will continue to use triple-quad instruments for quite some time. Nevertheless, there’s certainly a real buzz about non-targeted analysis at conferences – the introduction of GC to the Orbitrap™ portfolio will probably add to that buzz. Right now, I get the sense that non-targeted data acquisition (with its potential to speed up method development) followed by streamlined and targeted processing of that data is a good midpoint between the old and the new for routine labs (we don’t need or want every sample to be a research project!). Data processing is an ongoing challenge, but it seems that the software is fast catching up with the hardware.
In five or ten years’ time, who knows how far we will have traveled on our treadmill?