The Analytical Lab of 2050: Automation and AI

Industry 4.0 describes how new technologies will pave the way for greater efficiency, flexibility, and connectivity. The concept has mainly been applied to manufacturing and the “fourth industrial revolution,” but it also applies to analytical labs, where automation, robotics, and AI are set to change how labs operate. We’re already on that path – so how will things look in 2050?

First, let’s take a step back and assess the current situation. Both automation and AI are at a relatively early stage in terms of their maturity – especially AI, which isn’t currently well integrated into analytical labs. 

Automation is easier to implement at most stages of laboratory workflows, thanks to the wide availability of platforms with application-specific software. Software-as-a-Service (SaaS) and Cloud technology are increasingly part of the analytical lab today. But applications of AI – such as deep learning and machine learning, intelligent robotic lab assistants, and augmented and virtual reality (AR/VR) – are still emerging and not yet mainstream, likely because AI requires more specialized expertise and access to large amounts of high-quality data, with fewer platform tools available to support adoption.

In the future, all these technologies will be transformative in their ability to help researchers interpret complex data and improve overall lab productivity.  

Though it’s not realistic to expect a lab to be fully automated and digital overnight, we are seeing steady progress – with analytical lab instruments, workflows, and services becoming more intelligent with each new generation. 

The potential of AI and automation technology

Automation in the lab today is driven by the goal of increasing sample throughput by reducing hands-on time per sample. For some, the primary appeal of automation is the reduction of human error and variance in manual sample preparation and processing, which increases reproducibility and accuracy. This is true at any sample throughput, and we can anticipate greater commercial availability and adoption of automation solutions aimed at mid- and lower-throughput labs.

The reproducibility of automation can enable smaller sample volumes, which are essential when sample availability is limited – as is increasingly the case for clinical and life science applications. And as areas such as multi-omics and precision medicine take hold, the number of analyses from a single sample is also trending higher; this will continue to drive down the desired amount of sample per analysis.  

Downstream, automation in data processing and interpretation will address human-caused bottlenecks in research and lab productivity. As robotics and automation become easier to use and take on well-defined, repetitive workflow components, lab scientists can instead spend their time on additional (and potentially more interesting) projects.

As we look to future decades, the health of people and our planet is an essential consideration. From a sustainability perspective, smaller samples and robotic sample processing will reduce solvent use and waste. However, the ease of automation and removal of labor – along with our adoption of data-intensive approaches, such as machine learning – can create a desire to scale up. This scale-up should deliver more robust, accurate data more quickly, potentially reducing the number of times a study needs to be repeated – again reducing waste. However, if we run many more experiments with more controls and replicates, the sustainability benefits gained from smaller sample sizes will be countered – and we may also increase plastic waste.

With regard to AI, we’ll continue to see advances – both in the science and in the productivity and efficiency of the analytical lab. For example, AI will help visualize and analyze large, complex, and heterogeneous data sets efficiently, while also allowing analysts to spot anomalies and establish patterns in the classification of these large data sets – leading to improved experimental outcomes and insights. AI will also enable the next level of interpretation and quantification for advanced imaging-based applications, such as in cells and tissue.
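To make the anomaly-spotting idea concrete, here is a minimal, illustrative sketch using an unsupervised isolation forest to flag outlying instrument runs in a large batch. All data and feature names (retention time, peak area, peak width) are synthetic assumptions for illustration – no specific vendor tool or workflow is implied.

```python
# Illustrative sketch: flagging anomalous runs in a large batch of
# chromatography-like measurements with an unsupervised model.
# All values here are synthetic; feature names are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# 500 normal runs, each summarized as (retention_time, peak_area, peak_width)
normal = rng.normal(loc=[5.0, 1000.0, 0.2],
                    scale=[0.05, 30.0, 0.01], size=(500, 3))
# 5 anomalous runs, e.g. column degradation shifting retention, broadening peaks
anomalous = rng.normal(loc=[6.5, 600.0, 0.5],
                       scale=[0.10, 50.0, 0.05], size=(5, 3))

runs = np.vstack([normal, anomalous])

# Fit on the whole batch; contamination sets the expected anomaly fraction
model = IsolationForest(contamination=0.01, random_state=0).fit(runs)
labels = model.predict(runs)       # +1 = inlier, -1 = flagged anomaly

flagged = np.where(labels == -1)[0]
print(f"Flagged {len(flagged)} of {len(runs)} runs for review")
```

The value of this pattern is that it needs no labeled training data: the model surfaces the handful of runs that behave unlike the rest, and the analyst decides what they mean.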

AR and VR, as enablers of virtual work environments, are beginning to emerge in the analytical lab. These methods provide 3D visualization of cells, molecular structures, and interactions – as in biopharma therapeutic development – and VR also offers simultaneous, interactive viewing for collaborative scientific insights. At Agilent, for example, AR methods were accelerated at the beginning of the COVID-19 pandemic so that field support teams could remotely help customers troubleshoot their instruments and workflows.

Looking forward, AI and the Cloud will accelerate not only high-performance, productive, secure, and efficient global laboratory operations, but also global partnership and collaboration – within a company or across external partners. Remote sharing of data was key during COVID-19; similarly, interlaboratory data sharing and data aggregation at scale can support more robust machine and deep learning results.

In the context of automation and AI, droplet technologies are another trend that will change the analytical lab, especially as single-cell technologies take hold. Today, plastic plates, tips, and tubes are used in robotic sample workflows; much of this plastic could be eliminated through droplet formats. But this also has significant implications for data scale, complexity, and computational power: one sample processed a droplet at a time may generate tens of thousands of data sets rather than one!

Finally, an important example of AI, automation, and robotics being applied together is synthetic biology and cellular manufacturing, through which biological cells are reprogrammed for useful and practical purposes. At the core of synthetic biology is the Design-Build-Test-Learn cycle, iterated thousands of times as scientists and engineers seek to understand and control the dynamics of cellular pathways and metabolism, optimizing cellular performance for our intended purposes. This biologically based synthesis and production not only enables new and improved products within the bioeconomy, but is also a potentially transformative, sustainable alternative to petrochemical-based synthesis.

Changing attitudes

In addition to the technological advances that are bringing Industry 4.0 to the analytical lab, changes in attitudes are also acting as an accelerator for these trends.

We’ve all come to realize how important it is to be able to remotely access and control the automated workflows and experimental data in our laboratories from our laptops at home. Such access was critical during the first 18 months of the pandemic, and it is now becoming a workplace expectation that will continue to propel the digital transformation of the laboratory. Hybrid and remote work are increasingly prevalent where laboratory infrastructure is not required; automation, IT, and advanced software are critical to enabling this new reality.

As scientific understanding deepens and we seek to further advance our knowledge of new layers of life science and biology, our interrogations require more complex experiments and data integration, which is driving the adoption of new measurements, tools, and technologies. Single-cell omics, spatial omics, and time-dependent measurements – as in the analysis of live cells – are all becoming increasingly prevalent. These data sets are large and complex, generally require automated data acquisition and analysis, and often need to be integrated and visualized in new ways to glean understanding.

Another contributing trend is the increasing demand for analytical instrumentation and measurement talent. More intelligent instruments with easy-to-use interfaces will require fewer expert resources, maintaining lab productivity while freeing up highly skilled scientists and technologists for other priorities.

A recent Agilent-led survey of pharma lab leaders supports this observation, with responses highlighting the urgency to improve and update laboratory processes. Survey respondents said that they:

  • Wanted to achieve quicker results (55 percent)
  • Saw a demand for superior quality (44 percent)
  • Wanted to improve data integrity (43 percent)
  • Found that their current workflow requires optimization (83 percent)

Attitudes towards sustainability are also changing. Over the past three to four years, climate change has become evident in the extreme weather we are experiencing in so many parts of the planet – unprecedented hurricanes and fire seasons, shifts in agriculture to adapt to changing weather patterns, rising ocean temperatures, and species nearing extinction.

Though the challenges are significant on a global scale, we all need to do our small part. There are choices we can make and things we can do, in our personal lives and in the laboratory. We can’t change the world alone, but we can take ownership of our parts and be a model for others.

The 2050 analytical lab

How these trends will play out over the next few decades is difficult to predict. However, I’m hopeful much of the following will occur well before 2050:

  • Labs will be data- and outcomes-driven, and fully digitally integrated. Complete, comprehensive, accessible, and secure laboratory data will enable the true digital transformation of the laboratory.
  • Labs will have extensive automation, robotic systems, and miniaturization, with programmed end-to-end smart and connected instruments and workflows fully integrated into the lab ecosystem – allowing the human workforce to focus on innovation and more impactful outcomes.
  • Artificial intelligence will be far more widely standardized, accepted, and implemented across all laboratory opportunities; it will enable the next level of depth and breadth of measurement data interpretation, including multidimensional, multi-modal data. 
  • Combined automation, robotics and AI solutions will have increasingly sophisticated feedback loops that allow automated systems to detect mistakes or non-ideal conditions and make real-time corrections without user intervention. 
  • Sample tracing will be comprehensive, starting at the point of sample collection and continuing through to data analysis, with tracking and integration across multiple workflows used to analyze the same sample.
  • Machine vision technologies, robotic assistants, and intuitive collaborative robots with excellent human perception will naturally engage with us and with their environments – assisting our work and research, contributing to lab operations, and increasing human productivity, as well as improving lab safety for dangerous procedures.
  • New technologies will provide for cost-effective, sustainable alternatives to plastics and single-use consumables.  
  • Biologically based synthesis and green chemistry will be standard, and labs will use renewable gases and solvents for all but the most specialized needs.
  • There will be no unplanned downtime of analytical instruments – and the analytical industry will achieve its Net Zero goals.  
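The self-correcting feedback loops mentioned above can be sketched in a few lines: a monitoring loop compares a measured value against a tolerance and nudges the setting back without user intervention. This is a toy illustration under assumed values – the variable names, setpoint, and simple proportional correction are all hypothetical, not any real instrument’s control scheme.

```python
# Toy sketch of a closed-loop correction: an automated system monitors a
# process variable (e.g. a column oven temperature), detects drift beyond
# a tolerance, and adjusts the setting - no user intervention required.
# All names and numbers are hypothetical.

SETPOINT = 40.0   # target temperature, degrees C (assumed)
TOLERANCE = 0.5   # acceptable deviation before correcting
GAIN = 0.8        # fraction of the error corrected per cycle

def control_step(measured: float, setting: float) -> float:
    """Return the adjusted setting after one monitoring cycle."""
    error = measured - SETPOINT
    if abs(error) > TOLERANCE:
        setting -= GAIN * error   # simple proportional correction
    return setting

# Simulate a drifting instrument: the measured value tracks the setting
# plus a constant disturbance of +2.0 degrees.
setting = SETPOINT
history = []
for _ in range(10):
    measured = setting + 2.0
    setting = control_step(measured, setting)
    history.append(measured)

print(f"start: {history[0]:.2f} C, end: {history[-1]:.2f} C")
```

After the first correction, the simulated measurement settles back within tolerance of the setpoint – the essence of an automated system fixing a non-ideal condition in real time.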

Timing is always hard to predict but given the goals we as an industry are setting in terms of sustainability and Net Zero, as well as the obvious practical benefits, there is cause for optimism. 

For companies that recognize the value of automation, robotics, and AI but are wondering where to start, my advice is to start simple and address a well-defined problem you want to solve. And reach out to industry providers, peers, and collaborators to learn from their experiences.

We’re a long way from 2050, but Industry 4.0 technologies are already making life better in analytical labs.

Acknowledgments: I would like to express my appreciation to Anya Tsalenko and Genevieve van-de-Bittner for their valued discussions.

About the Author
Darlene Solomon

Darlene Solomon is Chief Technical Officer at Agilent Technologies
