
Chasing the Dioxin Detection Dragon

sponsored by Thermo Fisher Scientific

Before I consider how far we have come in the chase after the lowest detectable level for dioxins, I think it’s good to look back to see where we started. The topics I discuss in this article are based on work from several colleagues, and I’d like to acknowledge them here: B. L’Homme, C. Calaprice, D. Krumwiede, H. Mehlmann and, finally, D. G. Patterson Jr, who also wrote a recent article on a similar topic – To Attograms and Beyond – in The Analytical Scientist towards the end of last year.

Patterson delved into biomonitoring studies, so I’ll be brief and just note that they aim to discover how much of a particular contaminant – dioxins in this case – actually ends up in our bodies. This can be done in two ways: either by using environmental measurements and complex models to predict exposure, or by direct measurement of human samples. The second option makes biomonitoring easier, because you have a direct measurement and don’t need to do any modeling. But it’s also more challenging for analytical chemists: the levels are much lower, so detection limits become increasingly important. It’s a balancing act, then, but I believe direct biomonitoring is the way to go.

In the early days

Back in 1945, the spraying of DDT was tested on beaches in the state of New York with the aim of eradicating mosquitoes – while children gleefully played in a fog of the chemical. In the 1950s, DDT was sprayed in the cabins of aircraft returning from exotic locations. Of course, in such exposure incidents the route is obvious and we need less sensitive methods. (As an aside, even in 2015 you may have been in an aircraft when the doors have closed and the pilot has stated, “Don’t be afraid [...] it’s harmless” as the flight attendants pass through the cabin with their best smiles, waving aerosol cans of some chemical – probably permethrin. Maybe in another 50 years we will look back with surprise on this practice as well.)

Following these DDT “tests” – and the dawn of realization – researchers published papers in the 1960s on pesticide storage in human fat tissue with limits of detection in the ppm range (1). Not bad for the day (and I dream of dioxins at ppm levels from an analytical chemistry point of view!). In 1965, researchers measured DDT and DDE pesticide residues in human milk as well, using GC-ECD for quantitation (2). However, DDT is just one molecule and the levels were high, so the work was relatively straightforward. Thankfully, the levels of all persistent organic pollutants (POPs) have been decreasing since the 1960s. And though the decrease is positive for humankind, it does represent an analytical challenge.

Winds of change

In 1988, Patterson et al. published a landmark paper showing the correlation between adipose tissue and serum levels of 2,3,7,8-TCDD – and that’s why we no longer use adipose samples (3). I’m surprised this paper has not been cited more often for that very reason. (Any group doing serum analysis should be citing this paper as a validation of their work.) But along with the shift to serum analysis, demand for sensitivity increased yet again. After all, there is more fat in milk (>5 percent) than in serum (<0.5 percent), which means a shift from ppt levels in milk to ppq levels in serum. On the plus side, participation rates in volunteer studies are on the up...

Lower limits of detection demand the best chromatography and highest sensitivity afforded by magnetic sector MS systems. And at the same time, we must not lose sight of reproducibility. Currently, when measuring dioxins at the femtogram level, we can expect RSD values of 20–30 percent. So, if we consider a move to the attogram level, what variation can we expect? Assuming an adapted Horwitzian “trumpet” curve approach, we could predict 50–60 percent, which is unacceptable.
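The article’s adapted Horwitz curve is not given explicitly, but the classic Horwitz relationship it is based on can be sketched. The function below is the standard (unadapted) Horwitz equation; its predictions at ultra-trace levels diverge from the adapted 50–60 percent figure cited above, so treat these numbers as illustrative of the trumpet shape only.

```python
import math

def horwitz_rsd(mass_fraction):
    """Classic Horwitz predicted reproducibility RSD (%) for a
    dimensionless mass fraction, e.g. 1e-6 for 1 ppm.
    RSD = 2^(1 - 0.5 * log10(C))."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

# The trumpet widens as concentration falls: ~16% at 1 ppm,
# roughly doubling for every two orders of magnitude lower.
print(horwitz_rsd(1e-6))   # 1 ppm -> 16.0
print(horwitz_rsd(1e-12))  # 1 ppt -> wider still
```

The key qualitative point survives any adaptation of the curve: each step down in concentration buys a predictable penalty in expected RSD, which is why attogram-level work strains acceptable reproducibility.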

In today’s routine biomonitoring labs, we can expect that for 2,3,7,8-TCDD at the ppt level (it’s actually lower) in a 5 ml serum sample we will actually inject around 15 fg (with a 60 percent recovery). So we have 20–30 percent RSD.
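The arithmetic behind that ~15 fg figure can be checked with a back-of-the-envelope sketch. The whole-serum concentration used below (~5 fg/g, i.e. ~5 ppq) is an assumption chosen to reproduce the stated numbers – it is not a value given in the article – and serum density is approximated as 1 g/ml.

```python
# Illustrative back-calculation of the injected mass of 2,3,7,8-TCDD.
sample_volume_ml = 5.0        # serum sample size (stated)
serum_density_g_per_ml = 1.0  # approximation
conc_fg_per_g = 5.0           # ASSUMED whole-serum level (~5 ppq)
recovery = 0.60               # extraction recovery (stated)

injected_fg = (sample_volume_ml * serum_density_g_per_ml
               * conc_fg_per_g * recovery)
print(injected_fg)  # -> 15.0 fg, matching the figure in the text
```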

But let’s not forget the drive towards smaller samples. If I ask you to choose between giving 10–20 ml of blood or taking a finger-prick test, I can guess which you would prefer. It is certainly the right direction – no surgery, no hospital, no syringe, no fear – but now we’re talking about 20–50 µl, and our need for sensitivity just increased again. At such sample volumes, we gain the potential to study those who are not typically included, such as the very young and very old, or isolated populations – or even dolphins.

We cannot forget our uncertainty; in 20 µl of blood there will be only 0.1 fg of 2,3,7,8-TCDD – that’s 100 ag, a real challenge. And even though it is at very low levels, it’s also the most toxic, so it serves as an excellent benchmark. Another Patterson paper appeared in 1996 that boosted sensitivity with GC×GC, getting down to around 335 ag for 2,3,7,8-TCDD (4). We later revisited this work with modern instrumentation in 2011 using cryogenic zone compression (CZC) with a loop modulator (5) on a high-resolution magnetic sector MS system. And in doing so, we are edging closer to our 0.1 fg goal.
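The 100 ag figure follows from the same kind of sketch. Again, the whole-blood concentration (~5 fg/g) and density (1 g/ml) used here are illustrative assumptions chosen to match the numbers in the text.

```python
# Illustrative total 2,3,7,8-TCDD in a finger-prick sample.
sample_volume_ul = 20.0       # finger-prick volume (stated)
blood_density_g_per_ml = 1.0  # approximation
conc_fg_per_g = 5.0           # ASSUMED whole-blood level (~5 ppq)

total_fg = (sample_volume_ul / 1000.0) * blood_density_g_per_ml * conc_fg_per_g
print(total_fg, "fg =", round(total_fg * 1000), "ag")  # 0.1 fg = 100 ag
```

Compared with the ~15 fg injected from a 5 ml serum sample, the finger-prick format leaves more than two orders of magnitude less analyte on hand – before any recovery losses.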

Limits to limits

Working at such low levels poses a number of challenges. Some are instrumental; in particular, the trade-off between sensitivity and accuracy (I go into more detail in a presentation I gave at the 10th International Symposium on Recent Developments in POPs Analysis in 2015: info1.thermoscientific.com/pops-analysis). But at these levels, we can also be confounded by isobaric species contamination of our standards. And as Ferrario et al. noted (6), “It is ironic that the advances in technology that have allowed the progressive lowering of detection limits have reached a limit imposed by the very contaminants the technology was designed to measure.” In other words, even if an instrument has a limit of detection of 50 ag, without a dedicated cleanroom, ultrapure standards, and due care, that limit is essentially unattainable.

How do we fight against the challenges? First, we need to intensify our efforts in sample preparation. As noted, dried blood spot (DBS) analysis followed by micro liquid-liquid extraction or micro-extraction by packed sorbent (MEPS) is one option. Another is volumetric absorptive micro-sampling (VAMS), a method published in 2014 (7) that is very interesting in terms of reproducibility, especially given that quantitative analysis is the aim. I go into detail about the pros and cons of these methods in my presentation, but I’d like to note that although the VAMS method is not as precise as current routine methods, it certainly feels like a step in the right direction.

After sample preparation, we must find novel ways to optimize our instrumental measurements. We need to use the most sensitive instruments – today, that is still magnetic sector MS systems – but there is still room for further improvement. Scientists, including some manufacturers, are exploring a number of areas with a view to improving sensitivity: ion volume geometry and emission current to improve ionization efficiency; alternative ionization methods, for example APCI GC-HRMS; and the potential of multi-collector GC-HRMS. And I’ve already mentioned the real potential of CZC to optimize the GC separation step. Thermo Fisher Scientific is also working to further improve its time-controlled (t)-CZC approach, which will hopefully become commercially available in the future, to enhance the signal of certain selected peaks (8) – these are all moves in the right direction.

So, where do we stand today? The good news is that a renewed focus on sample preparation and the evolution of technology are coming together to the point where the 0.1 fg TCDD target is reachable; however, we must not forget that continual evaluation of measurement uncertainty is essential as we explore the attogram world.

Jean-François Focant is Professor of Chemistry at the University of Liege, Belgium.

About the Author

Jean-François Focant

Jean-François (Jef) Focant leads the organic and biological analytical chemistry group of the mass spectrometry laboratory at the University of Liège in Belgium, where his research interests include the development of new strategies in separation science and the implementation of emerging strategies for human biomonitoring and food control. “I’ve been active in the field of dioxin analyses for the last 15 years and chaired the international Dioxin 2011 symposium in Brussels,” says Jef. Well known as a dioxin expert, he is also active in characterization of complex mixtures of volatile organic compounds (VOCs) for medical and forensic applications. “Working on the hyphenation of state-of-the-art analytical techniques to solve practical analytical issues is what I really enjoy doing,” he says.
