Landmark Literature 2018: Part II
Every year, we ask experts from across the analytical sciences to select one eye-catching article from the past 12 months. Here, the rising stars from our Top 40 Under 40 Power List pick their top papers of 2018 – and tell us why.
Sergio C. Nanita, Mikhail Savitski, Ken Broeckhoven, Jean-Francois Masson, Anneli Kruve, Juris Meija, Cecilia Cagliero, Hiroshi Tsugawa
De-Bugging Mass Spec Data
By Hiroshi Tsugawa, Researcher, RIKEN Center for Sustainable Resource Science, Yokohama, Kanagawa, Japan.
Landmark paper: H Mohimani et al., “Dereplication of microbial metabolites through database search of mass spectra”, Nat Commun, 9, 4035 (2018).
Advances in computational mass spectrometry are essential to grasp the diversity of metabolomes in living organisms. Metabolite annotation is straightforward when an authentic standard compound is available; however, the coverage of mass spectral libraries (<5 percent) falls far short of the structural diversity found in biological samples. To increase the coverage of metabolic profiling in biology, we need a standard-free annotation pipeline.
In natural product chemistry, “dereplication” (identifying known natural products in a biological sample) is an essential process to accelerate novel antibiotic discovery from natural sources without repeating time-consuming identification processes. Liquid chromatography coupled with high-resolution tandem mass spectrometry (LC-MS/MS) is a popular platform for high-throughput screening of natural products, but informatics methods for small molecule dereplication remain immature.
The Dereplicator+ program described in my chosen paper facilitates the process by searching the GNPS (Global Natural Products Social Molecular Networking) mass spectra repository. It incorporates an algorithm for estimating the false discovery rate (FDR), which helps to control false positive identifications in an automated annotation workflow on MS-based metabolomics platforms. The authors provide the program as a user-friendly web application and command line application (http://cab.spbu.ru/software/dereplicator-plus/). The platform not only facilitates natural product dereplication but also increases the number of characterized metabolites in mass spectrometry-based metabolomics approaches – an important step forward.
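The general idea behind FDR control in automated annotation can be illustrated with a toy target-decoy calculation – a minimal sketch of the standard technique, not the Dereplicator+ algorithm itself; all scores and the cutoff below are hypothetical:

```python
# Toy target-decoy FDR filtering: matches against decoy (shuffled)
# database entries estimate how many target matches at a given score
# are likely false. Scores and the cutoff are illustrative only.

def fdr_threshold(matches, max_fdr=0.01):
    """matches: iterable of (score, is_decoy) pairs.

    Walk matches from best to worst score and return the lowest
    score at which the estimated FDR (decoys/targets among all
    matches at or above that score) is still <= max_fdr.
    Returns None if no score satisfies the cutoff.
    """
    decoys = targets = 0
    threshold = None
    for score, is_decoy in sorted(matches, key=lambda m: -m[0]):
        decoys += is_decoy
        targets += not is_decoy
        if decoys / max(targets, 1) <= max_fdr:
            threshold = score
    return threshold
```

Annotations scoring at or above the returned threshold are then accepted, giving a defined bound on the expected fraction of false positives.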
By Sergio C. Nanita, Principal Investigator, DuPont Industrial Biosciences, Wilmington, Delaware, USA.
Landmark paper: JJ Hsiao et al., “Improved LC/MS methods for the analysis of metal-sensitive analytes using medronic acid as a mobile phase additive”, Anal Chem, 90, 9457–9464 (2018).
Over the past few decades, there have been many advances in chromatography instruments and mass spectrometers, leading to the powerful LC-MS methods available today. Yet, rugged methods for the analysis of “metal-sensitive” compounds are still frustratingly lacking. We have known for a long time that certain compounds, such as phosphorylated metabolites, phosphorylated peptides and carboxylic acids, can form analyte/metal complexes during LC-MS analysis, which can be detrimental to method sensitivity and reproducibility. In some cases, analyte/metal complex formation makes LC-MS analysis unreliable or impractical.
A recent article by Jordy Hsiao and colleagues caught my attention because it addresses the above-mentioned problem in a simple and practical way. The authors demonstrated that LC-MS analysis of phosphorylated metabolites, phosphorylated peptides, carboxylic acids and other metal-sensitive compounds can be improved significantly by adding a “sacrificial ligand” to the mobile phase at trace levels. The task of this mobile phase additive is to bind to any available metal throughout the LC-MS system, so that the analytes do not. The authors tested pyrophosphoric acid, EDTA and medronic acid as additives; medronic acid, added at micromolar concentrations as a metal complexation agent, emerged as the ligand of choice. I expect that this approach will be adopted quickly for metabolomics, proteomics, and other analytical methods where reliable measurements of phosphorylated species are challenging. The paper also discusses the use of new HILIC-type stationary phases, which provide improved capability for the analysis of ionic compounds and represent excellent options for methods employing medronic acid as a mobile phase additive.
By Mikhail Savitski, Team Leader and Head of Proteomics Core Facility, European Molecular Biology Laboratory (EMBL), Heidelberg, Germany.
Landmark paper: CM Potel et al., “Widespread bacterial protein histidine phosphorylation revealed by mass spectrometry-based proteomics”, Nat Methods, 15, 187–190 (2018).
Protein phosphorylation is widely regarded as the most influential protein modification in the life sciences, particularly for its effects on cellular signaling. Over the past 15 years, mass spectrometry-based proteomics has made tremendous progress in identifying serine, threonine, and tyrosine phosphorylation on a proteome-wide scale. Yet it has long been known that phosphorylation can also occur on histidine in bacteria and, more recently, histidine phosphorylation has been demonstrated in human cells by antibody-based methods.
However, strategies for detecting histidine phosphorylation in complex mixtures by mass spectrometry have so far been lacking, owing to the prevailing dogma that histidine phosphorylation would be lost during sample preparation because of the acidic conditions necessary for phosphopeptide enrichment.
Potel and colleagues have challenged this assumption and shown that mild acidic conditions are not as detrimental to histidine phosphorylation as previously thought. The resulting sample preparation strategy, in combination with mass spectrometry, enabled the first sensitive proteome-wide mapping of histidine phosphorylation in E. coli.
The results are very exciting, demonstrating that this organism harbors over an order of magnitude more phospho-histidine sites than previously thought – which opens several exciting avenues of research. Which proteins are responsible for these novel histidine sites? What exactly is their function? Could inhibition of histidine phosphorylation be exploited for antibiotic development? Furthermore, these technological developments will enable mapping of histidine phosphorylation in other organisms and will likely lead to new advances in biology.
Who Watches the Watchmen?
By Ken Broeckhoven, Associate Professor, Department of Chemical Engineering, Vrije Universiteit Brussel, Brussels, Belgium.
Landmark paper: PK Dasgupta et al., “Flow-cell-induced dispersion in flow-through absorbance detection systems: true column effluent peak variance”, Anal Chem, 90, 2063–2069 (2018).
Over the last year, our group has spent a lot of time and effort trying to elucidate the different contributions to extra-column band broadening, so this article was of great interest to me. The article starts with a comprehensive history of absorbance detection in LC and a discussion of its technical limitations from a sensitivity and extra-column perspective. The authors then propose a novel method to determine the overall contribution of the detector to the dispersion of a chromatographic band. Their methodology consists of diverting part of the eluent to waste prior to the detector. By extrapolating measurements at different split ratios towards zero flow to – and thus zero residence time in – the detector, they obtain the true efficiency of the band before the detector.
In a second series of experiments, they use a modified detector to vary the path length, and thus the detection cell volume, to disentangle the dispersion contributions of the cell inlet and outlet from those of the detection cell itself.
The article highlights some very important aspects of detector cell-induced band broadening. The first is that flow cell dispersion is flow rate dependent (except for very long cells), which is often overlooked. A second is that, for a detector with cell volume Vcell, the volumetric dispersion does not necessarily lie between Vcell²/12 (perfect plug flow) and Vcell² (perfect mixing in the cell), as is almost always assumed: poorly swept regions can yield values equaling a multiple of Vcell², so the traditional “perfect mixer” terminology used to describe the detector contribution is poorly chosen. Finally, for short path lengths, the dispersion contributions of the inlet and outlet zones influence each other and make up the majority of the flow cell dispersion, rather than the detection cell volume itself. I firmly believe that this study will be a reference for further work on detector dispersion in LC – one of the most important contributors to extra-column dispersion.
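The extrapolation step can be sketched numerically: measure total peak variance at several detector residence times (set by the split ratio) and fit a line back to zero residence time, whose intercept estimates the true pre-detector variance. This is a minimal illustration with synthetic numbers, not the authors' data or exact procedure:

```python
# Sketch of the split-flow extrapolation idea: observed peak variance
# grows with detector residence time; the intercept of a linear fit
# at zero residence time approximates the true column-only variance.

def extrapolate_to_zero(residence_times, variances):
    """Ordinary least-squares line fit; returns (intercept, slope)."""
    n = len(residence_times)
    mean_t = sum(residence_times) / n
    mean_v = sum(variances) / n
    sxy = sum((t - mean_t) * (v - mean_v)
              for t, v in zip(residence_times, variances))
    sxx = sum((t - mean_t) ** 2 for t in residence_times)
    slope = sxy / sxx
    intercept = mean_v - slope * mean_t
    return intercept, slope

# Synthetic example: a true column variance of 4.0 µL² plus detector
# dispersion proportional to residence time (both values invented).
times = [0.5, 1.0, 2.0, 4.0]                # detector residence time (s)
variances = [4.0 + 0.8 * t for t in times]  # observed variance (µL²)
v_column, _ = extrapolate_to_zero(times, variances)
```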
Two Sensors Are Better Than One
By Jean-Francois Masson, Professor of Chemistry, Université de Montréal, Canada.
Landmark paper: J Shu et al., “Plasmonic enhancement coupling with defect-engineered TiO2-x: A mode for sensitive photoelectrochemical biosensing”, Anal Chem, 90, 2425–2429 (2018).
Chemical and biological sensors rely on sensitive materials that convert a detection event into a physical signal, mainly exploiting the electrical, optical, piezoelectric or mass-sensing properties of the transducer. The field demands ever more sensitive sensors for detecting lower concentrations of analytes, but the basic principles governing them have not changed much in the past decade. Hence, advances in sensing performance have more often come from new biochemical schemes than from fundamental advances in transducer technology. The opportunity still exists to improve sensors further by combining sensing technologies in new forms of transducers. To be truly advantageous, the combination must yield a response that is better than that of the individual components. This is exactly what has been accomplished in my landmark paper by Tang and coworkers at Fuzhou University, China.
Based on photoelectrochemical principles, they have devised a sensing platform that benefits from the plasmonic effect of gold nanoparticles (AuNPs) and the photoelectric response of TiO2. When an analyte – for example, a strand of DNA – joins the two together, the hot electrons generated by the AuNPs under visible light irradiation significantly increase the photocurrent of the TiO2; in the absence of the analyte, the photocurrent remains low. The novelty lies in the synergistic effect between the defect-engineered TiO2-x substrate and the AuNPs as a new sensing modality, which was shown to be very sensitive to DNA, with a detection range from 1 pM to 10 nM. While this range is achievable by other biosensing methods, measuring photocurrents lower than those achieved in this article is foreseeable, and therefore holds the promise of further improvement in the performance of the sensor.
I find it refreshing to see that the combination of two hot fields in biosensing, namely plasmonics and electrochemical sensing, can lead to further advancement in sensing and I am hopeful that this will lead to other significant advances in the field.
By Anneli Kruve, Humboldt Fellow, Institute of Chemistry and Biochemistry, Freie Universität Berlin, Germany.
Landmark paper: MM Plassmann et al., “Nontarget time trend screening in human blood”, Environ Sci Technol Lett, 5, 335–340 (2018).
LC-HRMS is used extensively to identify the (possibly toxic) compounds that humans are exposed to in their daily lives. In spite of this great potential, it is still hard to make sense of the huge amounts of data obtained from non-targeted screening experiments. One of the most crucial problems is that current technology does not yet allow automatic annotation of all the features found in LC-HRMS analysis, which makes prioritization of the relevant features highly important. An obvious solution would be to focus on the most prominent features; however, the intensity of a peak in the mass spectrum does not correlate with the concentration of the compound, because ionization efficiencies vary vastly between compounds. Additionally, high-intensity peaks in the mass spectrum may lack biological significance.
My chosen landmark paper describes the first human blood exposome study to include a significant time series. The authors propose time series analysis as a tool to prioritize features, as increasing intensities over time may point to bioaccumulating compounds or to compounds to which we are increasingly exposed. They analyzed human blood samples collected from 1983 to 2015. By focusing on the time series, they were able to narrow the important features down from 14,460 to 716. Finally, some of the substances were confirmed with the aid of standard substances.
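The prioritization idea can be sketched as follows: fit a linear trend of intensity against sampling year for each feature and keep only those with a clearly increasing trend. This is a hypothetical illustration of the general approach; the threshold, feature names and numbers are invented, not taken from the paper:

```python
# Hypothetical sketch of time-trend feature prioritization for
# non-targeted screening data: keep features whose intensity rises
# across the sampled years. All data and thresholds are illustrative.

def trend_slope(years, intensities):
    """Least-squares slope of intensity vs sampling year."""
    n = len(years)
    mean_y = sum(years) / n
    mean_i = sum(intensities) / n
    sxx = sum((y - mean_y) ** 2 for y in years)
    sxy = sum((y - mean_y) * (i - mean_i)
              for y, i in zip(years, intensities))
    return sxy / sxx

def prioritize(features, years, min_slope=0.05):
    """features: dict mapping feature name -> intensity per year.
    Return the names of features with a rising trend."""
    return [name for name, intensities in features.items()
            if trend_slope(years, intensities) >= min_slope]

# Invented example: one rising feature, one flat feature.
years = [1983, 1990, 2000, 2015]
features = {"rising": [1.0, 2.0, 3.0, 5.0],
            "flat": [1.0, 1.0, 1.0, 1.0]}
rising_features = prioritize(features, years)
```

In practice a significance test on the slope, rather than a fixed cutoff, would be the more robust choice.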
The current shortcoming of this approach is that time series samples are rarely available, and results obtained today and five years ago, in Tokyo and New York, on Orbitrap and ToF instruments, with and without LC separation, are not directly comparable. Still, this paper demonstrates the vast possibilities of time trend screening with non-targeted analysis, and hopefully we will see a major breakthrough in routine time trend analysis soon.
Once More, With Feeling
By Juris Meija, Research Officer, Metrology, National Research Council Canada, Ottawa, Canada.
Landmark paper: AL Plant et al., “How measurement science can improve confidence in research results”, PLoS Biology, 16, e2004299 (2018).
Most of us have heard stories about the inability of scientists to replicate the studies of others. This “reproducibility crisis” is a serious problem in all fields, not least because most scientists tend to overestimate their confidence. In the 1990s, for example, contemporary determinations of the gravitational constant were in severe disagreement; the half-life estimate of the technetium-97 isotope doubled between 2003 and 2012; and the primary international standard of arsenobetaine was revised by 20 percent a decade ago after an interlaboratory comparison revealed errors. These powerful examples show that chemists cannot simply ignore the reproducibility crisis.
Reproducibility is the business of metrologists and my chosen article by Anne Plant and her NIST coworkers provides fresh guidance on how practices of measurement science can improve confidence in the conclusions of many research studies. This article reminds us to consider principles of metrology in our daily work; for example, evaluate the robustness of the data, look for any possible sources of uncertainty, and pay attention to potential systematic biases.
By Cecilia Cagliero, Assistant Professor, Department of Drug Science and Technology, University of Turin, Italy.
Landmark paper: M Varona et al., “Solid-phase microextraction of DNA from mycobacteria in artificial sputum samples to enable visual detection using isothermal amplification”, Anal Chem, 90, 6922–6928 (2018).
Reading the literature and attending conferences in 2018, I was impressed by the number of studies dealing with the development of new analytical methods for the diagnosis of infections that are no longer major threats in Western countries.
Infections are still the main killers in developing countries. Looking at tuberculosis alone, the WHO estimated around 10.4 million new cases in 2016, but fewer than two-thirds of these were diagnosed and reported to health authorities. Proper and rapid diagnosis is key to controlling infection – without it, efforts to provide adequate and prompt treatment are useless. Tests for developing world settings should not only be highly sensitive and specific, but also affordable, rapid and deployable in the field by workers with minimal training (point-of-care assays).
The past year saw a number of very interesting studies dealing with the development of non-invasive sampling approaches, the analysis of volatiles emitted by pathogens, and statistical tools applying a metabolomics approach. However, I was most impressed by the paper by Varona and colleagues, who combined their knowledge of various analytical fields (the authors’ specialisms range from sample preparation to ionic liquid chemistry and biomolecular analytical chemistry) to develop a very simple and reliable tool for the diagnosis of mycobacteria. By extracting genomic DNA from the bacteria with a polymeric ionic liquid SPME approach and analyzing it with an isothermal nucleic acid amplification method coupled with visual detection, the authors developed an approach suitable for point-of-care application.
Both genomic and metabolomic approaches are being explored for diagnostics, with both showing significant strengths and weaknesses. Regardless of who comes out the winner in the battle of metabolomics versus genomics for diagnosis of infections, there will be one definite winner: human health and wellbeing.