
Ten Year Views: With Ruedi Aebersold

Ruedi Aebersold is Professor Emeritus of Molecular Systems Biology and Chair of the Department of Biology at ETH Zurich and the Faculty of Science at the University of Zurich, Switzerland

In January 2023, The Analytical Scientist celebrates its 10th anniversary! We’re using the occasion as an opportunity to bring the community together and reflect on the field of analytical science. To do that, we’re speaking with leading figures and friends of the magazine to understand how far the field has come over the past 10 years, what lessons have been learned, which memories stand out, and where we go from here. In this installment, Ruedi Aebersold discusses exciting progress in proteomics, data-independent acquisition, and more.

What has been the decade’s most significant development in analytical science?

A general trend is the continuous increase in performance and the development of faster, more precise, higher-resolution technologies. There has also been a noticeable diversification of techniques. For example, optical techniques have been pioneered to sequence nucleic acids and reveal how they are organized in the nucleus. Another transformative area has been microscopy – particularly cryo-EM – for structural biology and for tomography measurements on cells and very thin sections.

In the nucleic acid field, there has been significant advancement in single-cell measurements of RNA expression and DNA. Exciting progress has also been made in the proteomics field; we are beginning to learn things that cannot be discerned from bulk analysis, such as how individual cells are composed. Most analytical papers on the proteome have focused on which proteins are present in a sample and at what concentrations. More recently, we have begun measuring other aspects of the proteome – for instance, proteoforms through top-down MS, how proteins interact, and how they change their shape. These exciting developments, driven by analytical techniques, bring us closer to understanding biological functions.

Has anything come out commercially in the last 10 years that you think is particularly innovative?

It is unbelievable what state-of-the-art electron microscopes – such as those developed by FEI and Thermo – can do. In mass spectrometry, there have been incremental improvements at every level, amounting to significant increases in performance. A decade ago, we would have been happy to detect 3,000 proteins in a complex sample such as a cell lysate. Now, we can detect 8,000 proteins from one tenth of the input material. It has also been 10 years since data-independent acquisition (DIA) – a technique that delivers better qualitative and quantitative results – was first published. The uptake was quite fast and, if you have a large set of samples, it’s now the method of choice.

Has the speed of progress over the last decade surprised you?

The speed of progress has not surprised me, but the speed of implementation has. Around 20 years ago, it took five to 10 years to go from publishing a technique to seeing its actual use by the community. Now, uptake is much faster. The DIA SWATH technique was taken up very fast – within about two years – and there were a lot of users. Cryo-EM analysis was applied very quickly, too. The same is true of some nucleic acid techniques – for single-cell RNA sequencing, translation was very fast.

What do you think has driven the increase in translation speed?

I think these new techniques are phenomenal and address what the scientific community actually needs. If you develop a technique for a niche area, you will not see much translation. If the technique is extremely complicated, such as measuring molecular distances in live cells (which involves engineering cells and having the right light acceptors and donors in close proximity), uptake is slow because the technical hurdles are significant. In contrast, if you know how to run a mass spectrometer for proteins, advancing from data-dependent acquisition (DDA) to DIA is relatively simple.

Are there any hard lessons the industry has learned over time?

There are always false starts and things that look extremely promising but then fizzle away. I think one of the field’s hardest lessons was the advent of surface-enhanced laser desorption/ionization (SELDI) MS. Twenty years ago, analytical science was introduced to this as a powerful biomarker detection technology. The machine was simple to use, and it was implemented in many hospitals so that clinical labs could run blood plasma samples in large numbers and find biomarkers in bodily fluids. Unfortunately, it did not work – and this set the proteomics field back a few years, until newer, more robust methods emerged that had clearly defined and tested analytical boundaries. As a result, proteomics has become particularly good at developing accessible, transparent computer algorithms based on statistical tools that validate the identity and quantity of observed molecules. Metabolomics, as a field, is not quite as far along in that regard.

Thinking about your lab and your research, what were your biggest highlights over the past decade?

I was trained as a biochemist, so I always viewed the technological work we did – particularly in the proteomics field – as an avenue to address important biological questions. After wrestling with multiple technical issues, we now have robust, powerful proteomic techniques to ask complex biological questions. For me, the highlight of the past few years is that we can transition back to biology to determine the biochemical state of cells and tissues with these new techniques. We can start asking questions about cells in different states, such as how the organization of proteins in various complexes differs. Paola Picotti, who succeeded me at ETH (I’m now retired), developed limited proteolysis-coupled MS, which can measure how a protein’s shape changes as a function of cell state. This technique is based on the fundamental principle that the function of a protein depends on a certain structure – which can now be tested on hundreds of proteins at a time in a single analysis.

What exciting things do you see on the horizon?

Mass spectrometry has established itself as a technique that can obtain different types of data from proteomes, including the composition, localization, modification, interactions and shape of proteins. I think proteomics will continue to broaden and become more powerful. There are attempts to apply the principles of nucleic acid sequencing to proteins, in which billions of proteins are deposited in a flow cell that can then be probed to determine which are present at a given location. This will bring protein analysis to the level of examining single molecules. These approaches have a long way to go, but this is very exciting.

I also hope the new techniques that probe many functionally relevant attributes of the proteome will become more mainstream and widely applied. Proteomics – and, to some extent, metabolomics – has been inhibited by various practical factors. For example, the techniques are considered complicated, largely because they use complicated instruments with a tendency to break down. I see the instruments, computational tools, and the technology in general becoming more robust and widely used, which should improve accessibility. More people using a technique means more people having creative ideas and producing interesting results, so I look forward to a broadening of the user base and of the ensuing results.

What interesting questions might we be able to answer with these new technologies?

I think the increase in throughput and robustness will allow us to collect large amounts of data and use artificial intelligence techniques to learn new biology. For example, one of the fundamental questions in biology and medicine is: how does a change in the genome affect a (disease) phenotype? For almost all diseases, the relationship between genomic variants and phenotypic expression is very complicated. With the new techniques we have available, we can start to generate datasets to understand how specific genomic changes affect cell biochemistry and disease trajectory in patients. This cannot be done in one experiment; we need a lot of data and computational tools. I’m hoping that, within the next 10 years, this area will advance.

Are there any trends you see that are slightly worrying?

I’m worried about the economic environment. A year ago, when money was cheap, there were huge amounts of venture capital to spin out interesting ideas from universities, and investors actively sought out new technologies. Now, I know colleagues with very exciting new technologies who are struggling to find funding. Startups need funds to build, maintain, and market new innovations, and I worry that this pipeline will take a hit. Many amazing technologies that should reach the market might not.

Another limitation is academic research culture. Often, funding agencies give a lot of research money to whatever they consider translational. They want diseases cured and medicines improved, but that takes time – and analytical sciences are the foundation of those advances. Reduced investment in basic technological advancement will have a detrimental effect down the chain.


About the Authors
Georgia Hulme

Georgia Hulme is Associate Editor at The Analytical Scientist.


James Strachan

Over the course of my Biomedical Sciences degree, it dawned on me that my goal of becoming a scientist didn’t quite mesh with my lack of affinity for lab work. Reflecting on my decision to pursue biology rather than English at age 15 – despite an aptitude for the latter – I realized that science writing was a way to combine what I loved with what I was good at.

From there, I set out to gather as much freelancing experience as I could, spending two years developing scientific content for International Innovation before completing an MSc in Science Communication. After gaining invaluable experience supporting the communications efforts of CERN and IN-PART, I joined Texere – where I am focused on producing consistently engaging, cutting-edge, and innovative content for our specialist audiences around the world.

 
