Charting the Contaminant Iceberg: Part Two
Torsten C. Schmidt reflects on the major challenges – both analytical and translational – facing analytical scientists as they strive to have a positive impact on the environment
James Strachan | 10 min read | Interview
What are some of the biggest challenges in environmental analysis?
There are several major analytical challenges. For separations, I’d say we’ve made meaningful progress with HILIC, mixed-mode chromatography, and ion-exchange separations, often in combination with mass spectrometry. These methods are promising but haven’t yet been standardized or widely implemented in regular monitoring.
Moving beyond separations, the advances in high-resolution mass spectrometry (HRMS) have also been significant. Previously, we were limited to targeted analysis, focusing on just a few compounds – maybe a few hundred, at best. But the rise of HRMS has expanded our scope, enabling us to conduct broader suspect screening or non-targeted analysis. This means we can detect compounds that we may not even have reference standards for, which is valuable for environmental monitoring.
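To make the suspect-screening idea concrete, here is a minimal Python sketch of the core step: matching a measured accurate mass against a suspect list within a ppm tolerance. The compound names, monoisotopic masses, assumed [M+H]+ adduct, and 5 ppm tolerance are illustrative choices, not settings from any particular monitoring program.

```python
# Minimal illustration of suspect screening by accurate mass: match a
# measured m/z (assumed [M+H]+ ion) against a suspect list within a ppm
# tolerance. Names, masses, and tolerance are illustrative only.

PROTON = 1.007276  # mass of a proton, Da

# Hypothetical suspect list: name -> monoisotopic neutral mass (Da)
suspects = {
    "carbamazepine": 236.094963,
    "diclofenac": 295.016685,
    "metformin": 129.101445,
}

def ppm_error(measured_mz: float, theoretical_mz: float) -> float:
    """Mass error in parts per million."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

def screen(measured_mz: float, tolerance_ppm: float = 5.0):
    """Return all suspects whose [M+H]+ m/z lies within the tolerance."""
    hits = []
    for name, neutral_mass in suspects.items():
        theoretical_mz = neutral_mass + PROTON
        error = ppm_error(measured_mz, theoretical_mz)
        if abs(error) <= tolerance_ppm:
            hits.append((name, round(error, 2)))
    return hits

print(screen(237.1022))  # tentative hit for carbamazepine, sub-ppm error
```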
A particularly promising area right now involves efforts to quantify compounds even without reference standards, which can be essential for prioritizing unknown contaminants. Anneli Kruve, at Stockholm University, Sweden, for example, is doing substantial work on quantitative screening of unknowns, using artificial intelligence to predict the quantities of unidentified compounds – a system that helps decide whether a compound is worth identifying and, if necessary, developing standards for. This approach isn’t limited to environmental analysis; it has potential applications in other areas, like identifying unknown biomarkers in biological fluids.
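The quantification idea can be illustrated in the same spirit. The toy sketch below is not the published workflow referred to above; it only shows the general principle of training a regression model on compounds with known response factors and using it to estimate a response – and hence an approximate concentration – for a compound without a reference standard. The descriptors, values, and response factors are made up.

```python
# Toy illustration (not any specific published workflow): predict an
# electrospray response factor from simple molecular descriptors, then
# use it to estimate the concentration of a compound without a standard.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: [logP, molecular weight, H-bond donors]
# for calibrated compounds, with their measured response factors
# (peak area per unit concentration).
X_train = np.array([
    [1.2, 236.3, 1],
    [4.5, 296.1, 1],
    [-1.4, 129.2, 3],
    [2.8, 206.3, 1],
])
y_train = np.array([2.1e5, 5.4e5, 0.6e5, 3.2e5])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# A detected "unknown" with a tentative structure: predict its response
# factor and convert its peak area into an approximate concentration.
descriptors_unknown = np.array([[3.1, 254.2, 2]])
predicted_rf = model.predict(descriptors_unknown)[0]
peak_area = 8.0e5
estimated_conc = peak_area / predicted_rf  # same units as the calibration
print(f"Estimated concentration: {estimated_conc:.2f} (arbitrary units)")
```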
Why haven’t those methods you mentioned been widely standardized or implemented in regular monitoring yet?
Standardization naturally takes time, but it often feels like the process could be faster. Setting up standards, like those under ISO norms, to ensure new methods are usable in routine labs, sometimes lags far behind the research. In part, that delay comes from researchers themselves. Once they’ve developed a new method, published it, and seen it recognized, there’s little incentive for them to engage in the lengthy, often bureaucratic, process of standardization. This work isn’t as exciting or rewarding, especially in academia, where incentives are largely aligned with novel findings and high-impact publications, not with procedural work.
This disconnect means routine labs can end up relying on outdated methods, even when better options exist. For example, we’re now focusing more on the “greenness” of analytical methods in environmental analysis – methods that produce data for environmental improvement should ideally avoid creating environmental problems themselves. In recent work, we evaluated the sustainability of common analytical methods, like those from the US EPA, and found that many use large amounts of hazardous solvents or outdated extraction techniques, despite more sustainable alternatives having been available for decades.
So, it’s not unique to water analysis or advanced methods like high-resolution mass spectrometry; it’s an issue across various routine analyses. Sustainable replacements exist, but someone has to take the lead in standardizing and promoting them. Indeed, if our methods aren’t routinely adopted or aren’t sustainable, we may solve a scientific puzzle, but we won’t make the environmental impact we set out to achieve. And that's crucial to me and, I think, to many of us in the field.
So, although I recognize problems with current incentive structures in academia, I believe we should dedicate more time to implementation. Perhaps a good starting point is remaining open to collaborating with those who can guide these methods into routine monitoring. If we’re more receptive and proactive in those conversations, we might see more of our work transition from theory to practice, which is, ultimately, the impact we’re all hoping for.
Are there any other big challenges in environmental analysis that you’re seeing at the moment?
One major trend I’m noticing – and one I hope the community will embrace – is the increasing focus on data processing and analysis. Right now, we still see a lot of research papers that thoroughly document the analytical method – details on the columns used, MS settings, gradient conditions, and so on. But when it comes to data processing, there’s often just a line saying something like, “Data were processed with the manufacturer’s software,” and that’s it. That’s clearly insufficient, especially since data processing can actually have as much impact on the final outcome as the analytical method itself.
We’ve done some work on this, along with others in the field, which shows just how crucial data processing decisions are in influencing the results you get. Unfortunately, standards for reporting on data processing aren’t yet on par with those for the analytical setup, and this can hinder reproducibility across studies. With the same samples and the same analytical methods, outcomes can vary widely simply because of differences in data processing. So, we’re aiming to raise awareness and push for better reporting standards in this area to improve reproducibility and overall data quality.
Is the issue you mentioned mainly due to communication standards and practices, or does it also involve privacy concerns or copyright restrictions related to manufacturers' software?
It’s probably a combination of both. For those relying solely on manufacturer software, there can be limitations on what they’re able to report. However, in the non-target screening (NTS) and HRMS communities, open-source software is actually quite widely used, so there are viable alternatives to proprietary software. Tools like Skyline, XCMS, and MZmine are very common, as are other feature-based or chemometric approaches, such as region-of-interest (ROI) analysis.
While open-source options provide flexibility, the issue often lies in the reporting of settings and user-defined parameters – those inputs are not always documented as thoroughly as they could be. To address this, we and others are working on developing data processing methods that minimize user input, aiming to make the results more consistent. Ideally, with standardized settings, you would achieve the same or very similar results regardless of who processed the data or what instrument was used.
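One simple way to approach this – shown here purely as an illustration, with hypothetical parameter names rather than the settings of any particular package – is to keep every user-defined processing parameter in a single machine-readable file that travels with the results and can be cited or reloaded later.

```python
# Minimal sketch of making data-processing settings reportable: keep every
# user-defined parameter in one dictionary and write it out alongside the
# results. Parameter names and values are hypothetical, not tied to any
# particular software package.
import json

processing_settings = {
    "software": "example-feature-finder 1.0",  # hypothetical tool/version
    "mass_tolerance_ppm": 5.0,
    "min_peak_height": 1e4,
    "min_peak_width_s": 5.0,
    "chromatographic_smoothing": "savitzky-golay",
    "alignment_rt_tolerance_s": 10.0,
    "blank_subtraction_ratio": 3.0,
}

with open("processing_settings.json", "w") as f:
    json.dump(processing_settings, f, indent=2)

# The same file can then be cited in the methods section and reloaded to
# re-run the workflow with identical settings.
with open("processing_settings.json") as f:
    assert json.load(f) == processing_settings
```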
Are there any emerging areas of contaminant research that you think are underexplored or where future analytical efforts should focus?
I see a lot of potential in bridging the currently separate areas of analyzing environmental contaminants and looking at ecosystem impacts, particularly in aquatic systems. My background is in chemical analysis, focusing on contaminants like organic and inorganic compounds. However, we’re starting to recognize that it’s not only anthropogenic contaminants we need to be concerned about – naturally occurring compounds can also have serious impacts. For instance, the Oder River disaster on the Polish-German border highlighted this: higher salt concentrations and warmer temperatures enabled certain algae to grow explosively, and the algae then released toxins that killed fish and other organisms. This wasn’t directly due to a man-made contaminant but rather to a natural toxin exacerbated by anthropogenic influences.
Cases like this emphasize the need for a comprehensive approach that combines chemical data with biological analysis. To address such incidents effectively, we should combine information on chemical composition with data from environmental omics, such as eDNA and RNA-based methods, to see how ecosystems respond to changes in both chemical and biological conditions. This combined approach could advance our understanding of how ecosystem dynamics are influenced by both anthropogenic and natural factors, which is key for developing more effective environmental protection.
Another emerging area involves wastewater analysis. Traditionally, wastewater is treated to reduce contaminants before being released into the environment, but recent research shows it can provide critical insights into public health. For example, monitoring wastewater has been effective in tracking illicit drug use, as well as assessing general public health. The COVID-19 pandemic also underscored its potential, as wastewater surveillance proved valuable for tracking SARS-CoV-2 spread. This approach could be extended to other viruses, such as influenza, to monitor broader public health trends.
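As a rough illustration of how such measurements translate into population-level information, the sketch below runs the standard wastewater-based epidemiology back-calculation with made-up numbers: a measured biomarker concentration is scaled by the daily flow to a mass load, corrected for excretion, and normalized per 1,000 inhabitants. None of the values come from the interview; they are placeholders to show the arithmetic.

```python
# Back-of-envelope sketch of the standard wastewater-based epidemiology
# back-calculation (all numbers are illustrative).

concentration_ng_per_l = 150.0   # measured biomarker concentration
flow_l_per_day = 2.0e8           # daily wastewater flow to the plant
population = 500_000             # people served by the catchment
correction_factor = 2.3          # accounts for excretion fraction and
                                 # parent/metabolite molar mass ratio

load_mg_per_day = concentration_ng_per_l * flow_l_per_day * 1e-6
consumption_mg_per_day_per_1000 = (
    load_mg_per_day * correction_factor / population * 1000
)
print(f"{consumption_mg_per_day_per_1000:.1f} mg/day per 1000 inhabitants")
```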
Perhaps the biggest underexplored area is antibiotic resistance. Antibiotic-resistant bacteria and resistance genes, along with antibiotic residues, are likely to become a major health issue in the coming decades. Monitoring and understanding the sources of antibiotics that contribute to this resistance – whether from medical facilities, agriculture, or wastewater – will be essential. We’re starting to look into this area, collaborating with medical faculty to investigate sources of antibiotics in the environment. This work, integrating chemical analysis with health-related insights, could be pivotal for addressing the antibiotic resistance crisis and represents a powerful example of the potential in bridging these different analytical fields.
To make some of these ideas a reality, do we already have the necessary analytical technologies?
For most of the topics we’ve discussed, I think the tools we need are largely available. Of course, there’s always room for further refinement, and I might look back in 10 years and see things differently, realizing we were missing crucial aspects. But, for now, I believe the main hurdle is not so much a lack of tools but rather a need to bridge gaps across disciplines. For instance, connecting chemical analysis with environmental impact studies doesn’t necessarily require new tools – just a coordinated effort and a deeper understanding of other fields’ requirements.
One area that may need more maturity, though, is in understanding the effects of what we’re detecting. As we get more sensitive, identifying lower and lower concentrations, we have to ask what that actually means. The presence of a compound alone doesn’t tell us about its risk or toxicity. Effect-based methods, rather than just concentration measurements, are essential for gauging actual impacts. This is an area that has seen progress with bioassays and in vitro testing, but these methods aren’t yet widely used in routine monitoring.
We have a promising approach in effect-directed analysis, where we detect an effect first and then try to identify the chemical responsible. This involves running bioassays, fractionating samples, and narrowing down possible causative agents with techniques like LC-MS. While it’s a great concept and there have been some successful cases, it’s rare to definitively pinpoint the exact chemical causing an observed effect. So, while there’s potential here, further advancements are likely needed.
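One prioritization step sometimes used in that workflow can be sketched as follows: rank the LC-MS features detected across the fractions by how closely their intensity profile tracks the measured bioassay response. The feature names, intensities, and effect values below are invented for illustration.

```python
# Toy sketch of feature prioritization in effect-directed analysis: rank
# LC-MS features by the correlation of their fraction-wise intensity
# profile with the bioassay response of those fractions. All values are
# made up for illustration.
import numpy as np

# Bioassay effect measured for five fractions (e.g. % receptor activation)
effect = np.array([5.0, 12.0, 80.0, 75.0, 8.0])

# Intensity of each detected feature in the same five fractions
features = {
    "m/z 301.1415 @ 12.3 min": np.array([1e3, 2e3, 9e4, 8e4, 1e3]),
    "m/z 227.0706 @ 8.1 min":  np.array([5e4, 4e4, 5e4, 6e4, 5e4]),
    "m/z 415.2122 @ 15.0 min": np.array([2e4, 9e4, 3e3, 2e3, 8e4]),
}

def correlation(a, b):
    """Pearson correlation between two profiles."""
    return float(np.corrcoef(a, b)[0, 1])

ranked = sorted(
    ((name, correlation(profile, effect)) for name, profile in features.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, r in ranked:
    print(f"{name}: r = {r:.2f}")  # highest r = most plausible candidate
```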
Another challenge lies in implementation. While chemical analysis methods are part of frameworks like the Water Framework Directive, effect-based methods, even for well-studied endpoints like estrogenic activity, haven’t yet made it into regulatory standards. This comes back to the academic-to-application gap, where political and economic factors can play a significant role.
Given all the challenges and hurdles we’ve discussed, do you feel that the work you and others in your field are doing is having a meaningful impact on the environment right now?
Absolutely. If I didn’t believe that, I’d probably feel a bit discouraged! I’m confident that our work is making a difference, even if it’s sometimes slower than we’d like. Awareness has definitely increased, and in some areas the pace of progress has been encouraging. For instance, the regulation around persistent, mobile, and toxic (PMT) substances is being incorporated into chemical registration processes. That wouldn’t have happened without advancements in analytical techniques proving the relevance of these compounds.
There are several areas where our field’s contributions are clearly making an impact politically. One example is microplastics. The issue gained attention rapidly, and the push to reduce emissions into the environment wouldn’t have been possible without the analytical methods we developed. These methods made it possible to measure microplastics accurately, looking at number, size, and mass across various environmental systems.
While not every individual project or piece of research might have immediate or visible effects, I firmly believe that, as a whole, our work is essential. Environmental analysis provides the data that makes people care, and without it, many of these issues would remain unseen or unaddressed. So, yes, I do think we’re having a positive impact, and we’re doing work that’s critical to advancing environmental protection.
Torsten C. Schmidt is Professor, Instrumental Analytical Chemistry and Centre for Water and Environmental Research (ZWU), University of Duisburg-Essen, Germany