The Greatest Analytical Odyssey: Part 2
This year’s Power Listers cite reproducibility and data analysis as two big challenges faced by analytical scientists
Markella Loi | 3 min read | Discussion
The irreproducibility problem
Roy Goodacre: It’s well documented that there is a reproducibility crisis in science – some of these issues are highlighted in “A manifesto for reproducible science.” The authors of that paper conclude that a staggering “85 percent of biomedical research efforts are wasted, while 90 percent of respondents to a recent survey in Nature agreed that there is a ‘reproducibility crisis.’” This may be dated information, but nothing has really changed.
In chemical analysis, the truth of our measurements should be immutable. In the analytical sciences, that truth is usually represented by mass, abundance, and identity. As analysts, we should be prepared to stand by our data, and our reasoning should be evidence-based. In metabolomics, there are significant challenges in metabolite identification. Earlier this year, my team published an opinion piece on this topic that reinforces this view – ensuring metabolite identifications are based on common sense, facts, and evidence.
Jeremy Nicholson: Irreproducibility undermines scientific integrity by casting doubt on the accuracy and validity of research. It wastes resources, as researchers invest time and funding in efforts that cannot be built upon or validated. This challenge has far-reaching implications for policy decisions, medical treatments, and technological advancements, because unreliable findings can lead to misguided actions. Publication bias arises when only positive results are published, skewing the scientific knowledge base and impeding progress.
Moreover, the replication crisis in biomedical research raises concerns about the robustness of scientific findings. Much of modern biomedical science is multidisciplinary and, at the top end, requires the integration of physical, mathematical, and biological sciences. This also challenges referees and journal editors, who must make an increasing effort to check whether analyses are accurate or even appropriate.
As an associate editor of a major journal, I see manuscripts on a daily basis that make impossible claims based on statistically underpowered studies, poorly executed experiments, and a lack of appropriate validation. These problems appear in the world’s top journals – and far more often than you might think. Of course, this is driven by the ever-increasing pressure to publish and raise grant funding against the backdrop of an increasingly impatient world.
However, as scientists we are responsible for delivering appropriate, accurate, and correct data and models to serve societal demands. If we fail to do that, we undermine the very thing that sets science apart from all other subjects – the pursuit of fundamental and objectively demonstrable truths.
The data challenge
Davy Guillarme: One of the biggest challenges facing analysts today is data processing. Today’s more powerful, faster, and more sensitive instruments generate a veritable tsunami of data that must be processed quickly so that it does not become the bottleneck of analytical methods. The new generation of scientists is much better prepared for data analysis, whereas scientists who graduated in the 20th century sometimes feel overwhelmed in this area. Indeed, the field is evolving very rapidly, and this trend is likely to accelerate with the use of machine learning and artificial intelligence.
Hans-Gerd Janssen: We can perform stable separations that generate huge data sets, but the results aren’t necessarily reliable. We cannot confirm whether each number in a table is correct or whether every peak assignment is on point. Improving the accuracy of these data is one of our main challenges.
Mario Thevis: In light of the complexity of cause-and-effect questions, a major challenge is deducing causality from analytical data. Interdisciplinarity seems more important now than ever before.
Jessica Prenni: The mass spectrometry-based metabolomics field faces a significant challenge in standardizing data acquisition and analysis – specifically in nontargeted workflows. This lack of standardization limits both the comparability of data across experiments and the reuse of data.
Credit: All headshots supplied by interviewees
Associate Editor, The Analytical Scientist