
Rodeo Champ

For most of my industrial career, I’ve been a modeler and have had to deliver practical results. I could use any approach I thought would shed light on the problem, which generally entailed using a lot of computing power.

Two-dimensional LC (2D-LC) is just one of the many very interesting techniques that I used at Rohm and Haas. In fact, we brought a great many experimental techniques to bear, from size exclusion chromatography to matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS), to figure out polymer structure and the quantitative aspects of block copolymers.

But for 2D-LC, I collaborated with two very good experimentalists (one of them was Robert Murphy) to develop the technique. This was in the early days for commercial applications, although polymer 2D-LC first appeared in the 1960s. I remember some of the initial work done in Canada was very interesting, and it was all done by collecting fractions and reinjecting them. But people weren’t investigating hundreds of samples; they were focusing on just one or two. We wanted to go beyond that primitive stage of 2D-LC and get into comprehensive 2D-LC, where you take a fraction of the first dimension and shoot it into the second dimension, and you keep doing this continuously across the first-dimension chromatogram. However, we had to work out how to do it first... and that entailed establishing the method development cycle before getting into the sampling problem: how many samples do you take from the first dimension into the second?

One day I was sitting down thinking about the sampling issue, as it was uncharted territory. Could I develop a mathematical theory for it? I thought about it for several days and then realized that the essence of the problem was analogous to Nyquist sampling in signal processing, where you need to work out how many samples you must take across a signal to avoid losing fidelity. Applying this to 2D-LC, you have to work out how many samples you need to take across the first-dimension peak to avoid losing fidelity in the second dimension while not distorting the first dimension. I worked it out (it’s relatively simple mathematics, and I talked about this at Pittcon 2015). I explained that if you’re simply trying to identify a sample, such as a peptide sequence using a mass spectrometer, then 2D-LC is very good at reducing the saturation in your chromatogram, and you don’t have to sample as fast as you do when using the technique for quantitative applications. You simply select some of the first dimension to go through to the second dimension, use a mass spectrometer as a peptide filter, and use a database to identify the proteins.
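To make that arithmetic concrete, here is a minimal sketch (illustrative numbers and function names only, not the exact treatment from the original paper); it assumes the commonly quoted rule of thumb that each first-dimension peak, treated as roughly 8σ wide, should be cut into at least three to four fractions, which caps how long each second-dimension cycle can be.

```python
# Minimal sketch of the 2D-LC sampling arithmetic (illustrative, not the exact
# treatment from the 1998 paper). Assumption: each first-dimension peak is taken
# as roughly 8*sigma wide and should be cut into at least 3-4 fractions, which
# limits the length of each second-dimension cycle.

def max_cycle_time(sigma_1d_s: float, fractions_per_peak: int = 4) -> float:
    """Longest second-dimension cycle time (s) that still yields
    `fractions_per_peak` cuts across an 8-sigma first-dimension peak."""
    peak_width_s = 8.0 * sigma_1d_s
    return peak_width_s / fractions_per_peak

if __name__ == "__main__":
    sigma = 5.0  # first-dimension peak standard deviation in seconds (made up)
    for n in (2, 3, 4, 6):
        print(f"{n} fractions per peak -> cycle time <= {max_cycle_time(sigma, n):.1f} s")
```

The only point of the sketch is that the second-dimension cycle time and the first-dimension peak width are coupled: the stricter the demand on first-dimension fidelity, the shorter each second-dimension run must be.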

Déjà vu – it’s 2000 all over again

Interestingly, all of the early 2D-LC sampling work I did has cropped up again. At Pittcon, there was a talk on 2D-LC in which the presenters said they had a particular liking for papers on sampling and dilution, the latter being important because in 2D-LC you dilute samples by the product of the dilution factors of the two dimensions. This multiplicative dilution causes detector sensitivity problems.
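As a rough illustration of why that matters for detection, the sketch below assumes each dimension dilutes a Gaussian band by approximately σ_v√(2π)/V_inj (peak standard deviation in volume units divided by injected volume) and simply multiplies the two factors; the volumes are invented for the example.

```python
import math

# Rough illustration of multiplicative dilution in comprehensive 2D-LC.
# Assumption: each dimension dilutes a Gaussian band by about
# sigma_v * sqrt(2*pi) / V_inj, where sigma_v is the peak standard deviation
# in volume units and V_inj is the injected volume. Volumes below are invented.

def dilution_factor(sigma_v_uL: float, v_inj_uL: float) -> float:
    """Approximate dilution of a Gaussian peak in a single LC dimension."""
    return sigma_v_uL * math.sqrt(2.0 * math.pi) / v_inj_uL

d1 = dilution_factor(sigma_v_uL=100.0, v_inj_uL=10.0)  # first dimension
d2 = dilution_factor(sigma_v_uL=40.0, v_inj_uL=20.0)   # second dimension (fraction reinjected)
print(f"D1 ~ {d1:.1f}, D2 ~ {d2:.1f}, overall ~ {d1 * d2:.0f}-fold dilution at the detector")
```

Even modest per-dimension dilutions compound quickly, which is why detector sensitivity becomes a central concern in method development.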

We published on this in 1998 and it is, in fact, the second most cited paper after Michelle Bushey and James Jorgenson’s work on computer-controlled 2D-LC.

There was a publishing hiatus after about 2000 – nothing else appears in the journals before 2010. So, I would say we were pioneers; we wanted to use 2D-LC for complex polymers and we needed to develop the basic science. Of course, applying techniques and developing the basic science was one of my jobs throughout my industrial career, the purpose being to give the company a competitive edge. Fortunately, we were allowed to publish technique-oriented papers so that we could enlighten others, although, as you can see from the aforementioned dates, it took people a while to catch on to what we were doing...



About the Author
Mark Schure

Adjunct Professor of Chemical Engineering at the University of Delaware, and Chief Technology Officer at Kroungold Analytical, Blue Bell, Pennsylvania, USA. 2015 Dal Nogare and Uwe Neue award winner.
