Going Against the Grain
We spoke with some of this year’s Top 40 Under 40 Power Listers to find out their most controversial opinions…
| 3 min read | Discussion
Katelynn Perrault: Optimization is a really important tool in chemical analysis – but one has to know when “good enough” is sufficient. Sometimes, we have to sacrifice optimal settings to make our lives easier and be more productive. This is really important in the work I do, because we typically handle batch analysis and high-throughput data processing. For example, we might make a slightly suboptimal column selection to prevent downtime in changing hardware or to avoid issues brought on by retention time shifting. We might also choose to limit our MS mass range to reduce the data size for downstream batch processing.
Finding the right balance is a battle – particularly when going through peer review – but, ultimately, what is best for a single project is not always best for long-term instrument function or productivity. Sometimes, we have to accept that “good enough” is synonymous with “fit for purpose” – a term that I much prefer! I know that this is often a point of contention with manuscript reviewers when they are viewing a single study in isolation from the bigger picture of a large research platform.
Ali Salehi-Reyhani: I’ll say that we don’t need more STEM graduates; we need more STEM graduates who are scientifically literate and competent. We need to approach teaching in a way that gives more exposure to critical thinking and developing hypothesis-driven approaches to problems.
Georg Ramer: In my opinion, there are many ways in which we can make science more open, transparent, and reproducible. More and more journals are becoming stricter with their requirements for data sharing, and virtualization technologies have made reproducible, open data processing very accessible. Where my opinions differ from the majority of the field is that I believe black-box hardware and closed-source software should not have a place in science. Analytical sciences too often rely on complex commercial instruments, and it is not uncommon for these to perform undocumented data preprocessing before outputting “raw” data. Reproducing a result shouldn’t hinge on buying the exact same instrument from the exact same vendor.
Andrew Ault: On a related note, I think a challenge is adapting to an environment that has shifted away from building new instruments to simply enhancing existing ones. I worry that the onus for instrument development has moved away from non-profit entities – such as universities and government labs – to companies that have to balance profits with scientific advancements. These companies can do great things in the short term, but if we lose the capability for fundamental research that focuses on instrument development, I worry that we will limit the breadth of our analytical capabilities.
Alexandre Goyon: The pharmaceutical field is quickly evolving and a variety of drug modalities and delivery systems are being investigated. However, the application of analytical chemistry to solve diverse, real-world challenges may not be valued as much as fundamental studies. Industrial partners are often underrepresented in journal editorial boards and conference scientific committees, which may create a disconnect between fundamental and applied research. For example, liquid chromatography plays an essential role in supporting quality control in the industry, whereas exploratory techniques such as mass spectrometry are given more importance in academia.
Pierre-Hugues Stefanuto: I agree. A lot of people believe that separation science will disappear with the rise of more powerful MS. In my opinion, chromatography will always have a seat at the table.
Michael Marty: Some scientists are very offended when analytical instruments are called machines. My controversial opinion is that I don’t really care. I’d be confused if you called a guitar a machine, but I wouldn’t correct you for calling a high-performance liquid chromatography device a machine.