
Keeping Ahead in Life Science

sponsored by Thermo Fisher Scientific

Ken Miller

ASMS is clearly a big show for you…

Definitely. It’s hard to directly measure the impact of ASMS but, beyond the product launches, it is very much about educating our customers and giving them an exemplary and highly memorable experience. This year, we had a live speed painter and turned our hospitality suite into something we called “Club Quan” – complete with robot DJ – in the evenings; it was pretty amazing. We hope it allowed us to truly show our appreciation to our customers.

What were the main messages at ASMS?

Firstly, having launched a number of new products to “transform science” last year, we wanted to prove that we’d delivered on the promises we made back then and share progress in terms of new features and applications.

Secondly, software was a big point of emphasis for us this year. We introduced some really powerful new workflows for small molecule and protein discovery. In particular, we were incredibly excited to announce the new PepFinder software. The biopharma market has really taken off.  It’s a huge opportunity for us as we can leverage our leadership in protein analysis and characterization. PepFinder allows detailed, quantitative characterization of protein drugs and appears to be the right software at the right time. We’ve been staggered by what it can do as well as the reception it has received. It created a big buzz at ASMS.

Finally, we were very proud to launch the Q Exactive HF, which is really all about productivity, particularly in proteomics.

How does the Q Exactive HF contribute to the advancement of proteomics?

This is the latest chapter in the Orbitrap story. It builds on our Q Exactive Plus platform by combining it with an ultra-high field Orbitrap mass analyzer. Essentially, the Q Exactive HF doubles the spectra acquisition rate, which means comparable results (to the Q Exactive) in about half the time – or twice the resolution in the same run time.

As Ian Jardine noted in his feature last month (see tas.txp.to/0714/jardine), in the early days, the ion trap coupled with John Yates' SEQUEST was a great starting point for proteomics. He described the move away from the linear ion trap/Fourier transform (FT) MS instrument as a gamble. And it's true; we knew the Orbitrap would harm our FT business, but the technology was so compelling that many of us had no doubts about moving forward. Sure enough, within two years, there had been an almost complete shift from what had been a very robust FT business to Orbitrap. But I guess it's better to eat your own lunch than have someone eat it for you…

What do you see as the near future of proteomics?

Proteomics can revolutionize patient health care. Marker discovery and clinical research require the analysis of samples from many patients – and that's been very difficult to do from a proteomics perspective. As a comparison, next-gen systems allow whole-genome sequencing in about a day and costs have been falling dramatically. Until recently, there has been no good way to address large studies at the proteome level; protein analysis typically required fractionation and very long LC-MS runs, which could take days or weeks per sample. Methods have improved dramatically with faster instruments and multiplexing technology, to a point where, for the first time, it's practical to think about proteomics playing an important role in large-scale biological or medical research studies – something that was simply too labour-intensive and far too expensive to contemplate before. With workload and costs starting to come down, it's not hard to imagine personal proteomics beginning to play a role in routine health monitoring.

You seem very focused on the clinic…

That should come as no surprise, as our corporate mission is to enable our customers to make the world healthier, cleaner and safer. All of us on staff derive a great deal of pleasure from seeing evidence of that on a daily basis.

What few people realize is that Thermo Fisher Scientific is the fifth largest clinical company on the planet, making all kinds of reagents, kits, diagnostics and so on. For me, it’s exciting to be working for a company that can now add proteomic screening and genome-based clinical tests to this portfolio. The Life Technologies acquisition is clearly part of that ambition.

I see the life sciences as a vast continuum, starting with research – for example, in proteomics, metabolomics and lipidomics – to understand biology and identify potential markers of disease, health, toxicity or drug efficacy. That research must then be translated into specific assays or platforms that clinics and pharma companies can use to analyze patient samples and deliver improved health care. We aim to facilitate that entire translational journey from discovery to reliable diagnostics – a total analytical ecosystem.

Such a strong influence by a single company could be considered dangerous – does that concern you?

I see your point, but you have to consider that there is plenty of competition. Competitive research technology, such as QToF MS, is out there and constantly improving. It forces us to keep pushing forward. And while our competitors complicate our lives, we know that ultimately it’s a good thing. It keeps us on our toes. We all have to fight to stay competitive. We have a lot of internal capability, but we also work closely with a handful of core collaborators, giving them full access to our technologies, so that they become true partners in the development process.

One great example of successful collaboration is our work with Amgen and one of their scientists, Zhongqi Zhang, who spent over 15 years developing the software that became the basis for PepFinder 1.0. At first, we wondered why Amgen would want to license its own software to us, but it became clear that, from a regulatory perspective, there are distinct advantages to moving everyone towards an industry standard. Driving science forward is a team effort.

What are your thoughts on data-independent acquisition (DIA)?

There's a real divide in MS applications. In targeted analysis, you know what you're looking for and the goal is to quantify as accurately, robustly and inexpensively as possible. In discovery, you need comprehensive analysis to reveal as much about your sample as possible by using the fastest, highest-resolution MS systems. Then there's a big grey area in the middle. In an ideal world, it would be great in all situations to look at a sample in an untargeted way, and identify and quantify as many components as possible. That's essentially the promise of data-independent acquisition.

We’ve been doing data-dependent acquisition (DDA) for a long time.  It’s incredibly fast and sensitive, but it’s a stochastic process; the 10 precursor ions (for example) selected by a survey scan for further mass selection in one sample may not be the same in another sample. DIA has captured the popular imagination because it seems that we can have our cake and eat it too. Indeed, it has the potential to create a high-resolution digital archive of all components in a complex mixture in an unbiased way. In brief, you step up the mass range in increments (for example, 25 m/z windows in SWATH™) and fragment everything in each incremental window to create a very complex, multiplexed MS/MS spectrum. Of course, those complex spectra are both a blessing and a curse. Yes, you have spectral representation of everything present in a given window but, on the other hand, deconvoluting that highly complex data post-acquisition is an extremely complicated process.
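The windowing scheme described above can be sketched in a few lines. This is an illustrative sketch only – not vendor software – showing how fixed-width isolation windows step up a hypothetical precursor m/z range, with every ion in each window fragmented together:

```python
# Illustrative sketch of DIA isolation windows: step up the precursor
# m/z range in fixed-width increments; everything in each window is
# co-fragmented into one multiplexed MS/MS spectrum per cycle.

def dia_windows(mz_start, mz_end, width):
    """Yield (low, high) isolation windows covering [mz_start, mz_end)."""
    low = mz_start
    while low < mz_end:
        high = min(low + width, mz_end)
        yield (low, high)
        low = high

# Hypothetical example: 25 m/z windows over a 400-1000 m/z range.
windows = list(dia_windows(400.0, 1000.0, 25.0))
print(len(windows))       # 24 windows per acquisition cycle
print(windows[0])         # (400.0, 425.0)
```

Narrowing the window width (say, from 25 m/z to 5 m/z) multiplies the number of windows per cycle but reduces how many co-fragmented species end up in each spectrum – the trade-off discussed below.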

SWATH™ and other DIA software attempt to get over that hurdle by matching components in the acquired spectra to a mass spectral library. Our DIA methods offer a considerable advantage in that Orbitraps acquire very high-resolution, very accurate data. We can typically be accurate to within 5 ppm, whereas QToF data is extracted in a 50 ppm window. If the search window is opened to 50 ppm, you will have contributions from a lot of different species, making it difficult to distinguish real signal from noise. The tighter the window, the easier it becomes to separate signal from noise, with the added benefits of improved sensitivity and analytical precision. It's all about selectivity. In other methods we've developed, we shrink the m/z windows (for example, from 25 m/z down to 5 m/z), thus reducing the complexity of each spectrum, which has the same result.

Where we must be careful (and I think the proteomics community has learnt this the hard way) is understanding the absolute need for analytical rigor in terms of how we assess data quality and mitigate the risk of false positive identifications. It caused problems in the early days of proteomics and I think it has the potential to cause further problems in DIA if robust statistical tools are not used. Ultimately, for simple samples, DIA works well; for more complex samples, it can be more challenging. Having said that, with Orbitrap we can dig deeper and derive more accurate quantitative data even in more complex samples, so I think there's plenty of potential to explore.
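The effect of the search tolerance is easy to see numerically. A minimal sketch (with hypothetical m/z values) of how a ppm window is applied when matching an observed fragment against a library entry:

```python
# Illustrative sketch: ppm-tolerance matching against a spectral library.
# The m/z values below are hypothetical, chosen only to show the effect.

def within_ppm(observed_mz, library_mz, tol_ppm):
    """True if observed m/z lies within +/- tol_ppm of the library value."""
    return abs(observed_mz - library_mz) <= library_mz * tol_ppm / 1e6

library_mz = 500.26200
observed_mz = 500.26700   # 0.005 m/z away, i.e. ~10 ppm off

# A 50 ppm window at m/z 500 spans ~0.025 m/z and accepts this match;
# a 5 ppm window spans only ~0.0025 m/z and rejects it as interference.
print(within_ppm(observed_mz, library_mz, 50))   # True
print(within_ppm(observed_mz, library_mz, 5))    # False
```

The tenfold tighter window excludes a proportionally larger set of near-mass interferences, which is the selectivity advantage described above.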

How far are we from Alexander Makarov’s dream of an Orbitrap in every lab?

It's a nice idea! But I still think it's early days for Orbitrap technology. It's used extensively in the high-end research environment right now, though it is beginning to be used in some interesting quantitative and routine applications. A good analogy is automobile manufacturers – most of them invest heavily in Formula One or rally race teams, and that's where the discoveries and inventions are made. But inevitably, the most useful innovations trickle down to the mainstream market. Thermo Fisher Scientific has much the same philosophy. We experiment at the high end and, through a process of refinement, make that technology more accessible to a broader cross-section of our customer base, both in terms of price and in terms of application. We must continue to push innovation into the mainstream.

From R&D to VP

I started along my career path with an undergraduate degree in chemistry and the intention to become a doctor but, after completing the first year, I realized it wasn't for me. However, the experience was not wasted; I gained both a solid grounding in biochemistry and a real insight into the huge impact of clinical and translational science. With that prominent in my mind, I joined Genentech as a research associate. I was fortunate to start in the infancy of the biotech industry, and got a great education in biologic drug development, from cloning, expression, purification and characterization to the approval process and the sales and marketing of protein-based drugs. The guys down the hall from our R&D lab were just figuring out how to use mass spectrometry to analyze proteins, which really fascinated me.

As time went on, I realized that I was a people person and the lab started to feel a little restrictive. I went back to school and graduated with an MBA from the University of California, Berkeley, in 1991. Since then I’ve worked for a succession of analytical instrument companies, initially in sales and then marketing. It has been extremely valuable to once have been a customer – it helps me understand the challenges that our customers face and to develop products and programs to help them succeed.

I joined Thermo Fisher Scientific in 2000, during the first big proteomics wave with ion trap instruments and SEQUEST™ software. Supporting and sustaining collaborations continues to be a source of pleasure and inspiration. Now, I'm VP of marketing for our life sciences mass spectrometry business, and it's great to be working for Thermo Fisher at a time when we are deeply involved in so many aspects of a field that has always meant a lot to me.


About the Author
Rich Whitworth

Rich Whitworth completed his studies in medical biochemistry at the University of Leicester, UK, in 1998. To cut a long story short, he escaped to Tokyo to spend five years working for the largest English language publisher in Japan. "Carving out a career in the megalopolis that is Tokyo changed my outlook forever. When seeing life through such a kaleidoscopic lens, it's hard not to get truly caught up in the moment." On returning to the UK, after a few false starts with grey, corporate publishers, Rich was snapped up by Texere Publishing, where he spearheaded the editorial development of The Analytical Scientist. "I feel honored to be part of the close-knit team that forged The Analytical Scientist – we've created a very fresh and forward-thinking publication." Rich is now also Content Director of Texere Publishing, the company behind The Analytical Scientist.

