
The Greatest Analytical Odyssey: Part 2

Jeremy Nicholson

Roy Goodacre

The irreproducibility problem

Roy Goodacre: It’s well documented that there is a reproducibility crisis in science – some of these issues are highlighted in “A manifesto for reproducible science.” The authors of that paper conclude that a staggering “85 percent of biomedical research efforts are wasted,” while 90 percent of respondents to a recent survey in Nature agreed that there is a “reproducibility crisis.” This information may be a few years old, but nothing has really changed.

In chemical analysis, the truth of our measurements should be immutable. For the analytical science field, this is usually represented by mass, abundance, and identity. As analysts, we should be prepared to stand by our data, and our reasoning should be evidence-based. In metabolomics, there are significant challenges in metabolite identification. My team published an opinion piece on this topic earlier this year that reinforces this view – ensuring that metabolite identifications are based on common sense, facts, and evidence.

Jeremy Nicholson: Irreproducibility undermines scientific integrity by casting doubt on the accuracy and validity of research. It wastes resources, as researchers invest time and funding in efforts that cannot be built upon or validated. This challenge has far-reaching implications for policy decisions, medical treatments, and technological advancements, as unreliable findings can lead to misguided actions. Publication bias arises when only positive results are published, skewing the scientific knowledge base and impeding progress.

Moreover, the replication crisis in biomedical research raises concerns about the robustness of scientific findings. Much of modern biomedical science is multidisciplinary and, at the top end, requires the integration of the physical, mathematical, and biological sciences. This is also a challenge for referees and journal editors, who must invest greater effort to check whether analyses are accurate or even appropriate.

As an associate editor of a major journal, I see manuscripts on a daily basis making impossible claims based on statistically underpowered studies, poorly executed experiments, and a lack of appropriate validation. These problems are found in the world’s top journals (and far more often than you might think). Of course, this is driven by the ever-increasing pressure to publish and raise grant funding against the backdrop of an increasingly impatient world.

However, we as scientists are responsible for delivering appropriate, accurate and correct data and models to serve societal demands. If we fail to do that, we are invalidating the thing that sets science apart from all other subjects – the pursuit of fundamental and objectively demonstrable truths.

Davy Guillarme

Hans-Gerd Janssen

The data challenge

Davy Guillarme: One of the biggest challenges facing analysts today is data processing. With today’s much more powerful, faster, and more sensitive instruments, we are faced with a veritable tsunami of data that must be processed quickly so that data handling does not become the bottleneck of the analytical method. The new generation of scientists is much better prepared for data analysis, whereas scientists who graduated in the 20th century sometimes feel overwhelmed in this area. Indeed, this science is evolving very rapidly, and the trend is likely to accelerate with the use of machine learning and artificial intelligence.

Hans-Gerd Janssen: We can perform stable separations that generate huge data sets, but the results aren’t necessarily reliable. We cannot confirm whether each number in a table is correct or whether every peak assignment is on point. Improving the accuracy of these data is one of our main challenges.

Jessica Prenni

Mario Thevis

Mario Thevis: In light of the complexity of cause-and-effect questions, a major challenge appears to be deducing causality from analytical data. Interdisciplinarity seems to be of even greater importance now than ever before.

Jessica Prenni: The mass spectrometry-based metabolomics field is facing a significant challenge in standardizing data acquisition and analysis – specifically in nontargeted workflows. This lack of standardization limits both the comparability of data across experiments and the reuse of data.

Credit: All headshots supplied by interviewees


About the Author
Markella Loi

Associate Editor, The Analytical Scientist
