Power (Wish) List
We asked the members of the 2024 Power List: what's missing from the analytical toolbox?
James Strachan | 7 min read | Technology
Alexander Makarov: A technique to “see” a single molecule with atomic resolution directly and rapidly.
Teresa Rocha Santos: Screening devices for fast laboratory and on-site determination of emerging contaminants are lacking. For microplastics specifically, there is a need for instrumentation that allows quick and easy identification and quantification of polymeric particles. As the particle size decreases, the difficulty rises exponentially.
Wim De Malsche: Chromatographic systems should come standard with two types of detector: one for quantification and one for identification. The needed technology is out there: current detection technologies allow for localized detection, and current (miniaturized) flow technologies allow for detection without disturbing the separation.
Frances S. Ligler: Inexpensive methods for nondestructive analysis of living systems in four dimensions at the cellular and molecular level, including deep in tissues. Many investigators are currently tackling this problem, employing technologies such as IR, two-photon, multispectral, impedance, or quantum spectroscopy, usually with fairly expensive equipment. Useful solutions will be inexpensive, flexible, and may not even require direct contact with the tissue. Many challenges remain.
Koen Sandra: The toolbox will never be complete. The life sciences domain is so dynamic, and diverse analytical requests come at such a pace, that we continuously need to challenge and reinvent the analytical arsenal. This creates interesting opportunities and keeps us going. Today, for example, there is a need for technologies that shed light on the structural details of large molecular assemblies, such as transgene-carrying viral vectors and mRNA-encapsulating lipid nanoparticles. Such particulate matter drastically raises the bar for structural characterization. But rest assured: our analytical community will crack that code, as well as the many others that will follow.
Data and AI
Ron Heeren: Accessible artificial intelligence tools that bring together morphological and molecular imaging data with the aim to facilitate easier clinical diagnostics.
Michal Holčapek: We can generate enormous data sets of huge complexity, but data processing is becoming the bottleneck. We have the instrumentation capable of generating such data sets, but we may struggle with the speed and quality of data processing. When reviewing manuscripts or reading published articles in my field of lipidomics, I have observed frequent problems with identification, quantification, and data reporting. Some people are overwhelmed by the amount of data, which can result in low-quality papers even in the leading multidisciplinary journals. I recommend paying more attention to data processing and bioinformatics, which lag far behind instrumental developments.
Konstantin Shoykhet: Although we have versatile instrumentation that can successfully take a sample as input and produce numbers as output, there is significant room for improving data interpretation and validation; for example, intelligent estimation of the quality of a result, with automatic flagging of suspect values. Understanding the entire workflow, from sampling and preparation, through possible interferences during the analysis itself, to adequate interpretation of the results in a global context, also requires more attention as workflows grow more complex. In particular, correct data interpretation and classification depend on the qualification and interdisciplinary education of the operators.
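To make Shoykhet's flagging idea concrete, here is a minimal sketch of an automatic result-quality check; the field names and thresholds are illustrative assumptions, not a standard and not his proposal:

```python
# Toy result-quality flagging: return a list of reasons a reported value
# should be treated with caution (an empty list means no flags raised).
# Thresholds are illustrative; real acceptance criteria are method-specific.

def flag_result(result: dict) -> list[str]:
    flags = []
    if result["calibration_r2"] < 0.995:          # weak calibration linearity
        flags.append("poor calibration fit")
    if not 80 <= result["recovery_pct"] <= 120:   # spike recovery out of range
        flags.append("recovery outside 80-120%")
    if result["signal_to_noise"] < 10:            # below a common LOQ criterion
        flags.append("S/N below quantification threshold")
    return flags

# Example: good calibration, but poor spike recovery
print(flag_result({"calibration_r2": 0.999,
                   "recovery_pct": 65,
                   "signal_to_noise": 42}))       # ['recovery outside 80-120%']
```

Real systems would layer on many more checks (blanks, carryover, internal-standard response, retention-time drift) and, as Shoykhet notes, would still depend on operators who understand the whole workflow.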
What about mass spec?
Torsten Schmidt: I have been using compound-specific stable isotope analysis (CSIA) in my group to study contaminant sources and transformations since my postdoc days. When I started, I expected the field to expand exponentially because of the unique information provided by isotopic composition, and its changes, that no other analytical method can offer. Although CSIA has found highly successful niches and the instruments have been commercially available for decades, widespread application is still hampered by rather sophisticated methods that require special expertise, and by the absence of further ground-breaking instrumental developments. With the combined efforts of manufacturers and researchers, this may be about to change, if widely available high-resolution mass spectrometers achieve sufficient precision to measure the small differences in natural-abundance isotopic composition. The possibilities of performing multi-isotope analysis of compounds and position-specific isotope analysis using specific mass fragments are almost limitless. In recent years, there have been some exciting reports demonstrating the potential of HRMS-based SIA, but the community is small and progress has been slow.
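For readers less familiar with the field: CSIA reports isotopic composition in delta notation, the per mil deviation of a sample's isotope ratio from an international reference standard. For carbon, for example:

\[
\delta^{13}\mathrm{C} = \left( \frac{R_{\text{sample}}}{R_{\text{standard}}} - 1 \right) \times 1000\ \text{‰}, \qquad R = {}^{13}\mathrm{C}/{}^{12}\mathrm{C}
\]

Natural-abundance differences between contaminant sources are typically only a few per mil, so the precision Schmidt mentions amounts to resolving relative changes in R on the order of one part in ten thousand (roughly 0.1 ‰), a demanding target for general-purpose HRMS instruments.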
Robert Kennedy: I'd love to see an ionization method that is as general as ESI but not limited by ionization suppression effects… That would be quite useful. I also think we need better compound identification tools for small molecules. NMR is the gold standard but requires too much material and time for many projects. MS is not there yet in terms of quickly providing unequivocal identifications for many compounds.
Lingjun Li: Hardware-wise, it would be wonderful to have instrumentation offering a highly sensitive ionization source that works for a broad range of biomolecules, with less ionization suppression and deeper molecular coverage. Software-wise, we would benefit from improved computational tools that can integrate multidimensional data acquired from disparate techniques and complementary modalities to achieve a more comprehensive understanding of biological systems and their functioning in health and disease.
Stefan van Leeuwen: One important aspect is how to reach a high level of confidence in identification and quantification by mass spectrometry when authentic reference standards are lacking. Historically, we used authentic reference standards for unequivocal compound identification and reliable quantification. HRMS (for example, in non-target screening) can bring a high degree of confidence, but currently the highest level cannot be achieved without confirmation against an authentic reference standard. The question that arises: can we increase the availability of reference standards, or can analytical scientists devise smart solutions that deliver a level of confidence society can rely on without them?
Davy Guillarme: A mass spectrometry-based detector capable of providing high sensitivity for analyzing very large molecules, such as mAbs, mRNA, and AAVs, would be highly valuable. While ESI-MS performs reasonably well for such large molecules, its sensitivity is often insufficient. The development of a new or revolutionary ionization mode and/or mass analyzer specifically designed for very large molecules would be greatly welcomed.
Michael Gonsior: Extremely high resolution MS with extreme sensitivity and fast response times.
R. Graham Cooks: The same item that has been missing for two decades, a powerful portable commercial mass spectrometer for point-of-care measurements. It must privilege speed over other parameters and must have MS/MS for chemical specificity.
Gary Siuzdak: In metabolomics, and chemical analysis in general, effectively filtering out artifactual (in-source fragmentation) mass spectrometry data from datasets.
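One common heuristic behind such filtering, sketched here purely as an illustration rather than as any specific published tool: an in-source fragment co-elutes with its intact precursor, and its extracted-ion chromatogram (EIC) tracks the precursor's peak shape almost exactly, so features that correlate strongly with a heavier co-eluting feature can be flagged as likely artifacts. The feature-table format below is an assumption made for the sketch:

```python
import numpy as np

def flag_in_source_fragments(features, rt_tol=0.05, min_corr=0.9):
    """Flag features that look like in-source fragments.

    features: list of dicts with 'mz', 'rt' (minutes), and 'eic'
    (a numpy intensity profile sampled on a shared time grid).
    Returns the indices of flagged features.
    """
    flagged = set()
    for i, frag in enumerate(features):
        for j, prec in enumerate(features):
            if i == j or prec["mz"] <= frag["mz"]:
                continue  # only a heavier ion can be the intact precursor
            if abs(frag["rt"] - prec["rt"]) > rt_tol:
                continue  # in-source fragments share the precursor's RT
            corr = np.corrcoef(frag["eic"], prec["eic"])[0, 1]
            if corr >= min_corr:  # peak shapes track each other closely
                flagged.add(i)
    return sorted(flagged)
```

In practice one would also check whether the m/z difference matches a plausible neutral loss and confirm against MS/MS spectra, but even this simple co-elution test can remove many redundant features.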
Four Thoughts on Accessibility
Wim De Malsche: Current HPLC instrumentation is too bulky and expensive. Recent developments in optics and detection devices more generally, as well as in pumping instrumentation, allow for shoebox-sized solutions. For many applications, ultra-high pressures are not needed, especially when performance similar to that of conventional micron-scale packings can be attained using larger channel (packing) dimensions through miniaturization and lateral (vortex) flow methodologies.
Alexander Makarov: This goal could be facilitated by national funding agencies, especially if several of them work together and involve vendors.
Konstantin Shoykhet: International collaborations and educational programs might be the way to go here. However, the availability of new instruments for researchers worldwide is not the primary issue, in my opinion. One important issue is the state of the scientific-technical infrastructure in a given country or region. All the factors must fit together: the education level of the personnel; infrastructure, from stable power delivery to the availability of reagents, spare parts, and service; and societal interest in science and research. All of these are strongly shaped by the standing of education, science, and research within specific cultures, countries, and regions.
Andrew Ault: Where analytical scientists can help address these global environmental challenges is by generating high-quality data with lower-cost instrumentation. In the atmospheric space, we often face a choice between low-quality data from very cheap sensors (with numerous flaws) and very expensive instrumentation that provides amazing analytical detail. Analytical scientists can provide a middle ground: instrumentation that delivers real chemical information, can be deployed widely, and is not so expensive that it is inaccessible to 99 percent of researchers. Advances in 3D printing and miniaturization through microfluidics have amazing potential in this space. One contribution my laboratory is working on is low-cost size separation (3D-printed aerosol cyclones) combined with low-cost aerosol collection into microfluidic devices. Efforts like these are needed to leverage the amazing lab-on-a-chip advances of the past decade.