The Time is Now
Ready or not, automation is coming soon to analytical labs everywhere. Here’s why we should welcome – rather than resist – the rise of the machines.
Bob Boughtflower, Paul Hopkins | Opinion
In every industry, scientists typically follow set workflows to ensure that a product or formulation is prepared in a consistent manner. For analytical scientists, following methods and procedures provides confidence in the results we generate. Depending on the sector, the application and the task, these workflows range from being somewhat intuitive to being completely locked down and controlled.
However, almost all of these workflows are subject to some degree of human interpretation or style that can introduce variations – either deliberately or unknowingly – into the output. Furthermore, external “noise” often has an influence and is, by its nature, difficult to control. For example, a sonication step in a method may be subject to influences from many factors (water depth, water temperature, bath size, power, and so on).
In the analysis of valuable products such as pharmaceuticals, there is still a substantial element of manual processing involved, particularly in the preparation of samples. These processes are carried out within tight specifications or guidelines (according to where in the regulatory framework they belong), but variation may still be introduced. This could be due to differing interpretation of instructions, such as “stir until dissolved”, or human error, such as recording a reading incorrectly. Any error that is introduced in the sample preparation will likely propagate and lead to variation or even failure of a process specification.
There is little doubt that most, if not all, of these “human” errors or variations could be eliminated by introducing automated processes to manage these tasks more reproducibly. Perhaps even more importantly, a machine can record and confirm the progress of the process to ensure compliance with the requirements, or flag any issues before errors are propagated, providing assurance of data integrity. Of course, deliberate automation only becomes worthwhile when the process is well enough understood and the volume high enough – or the product important enough – to justify this level of investment.
As automation becomes more commonplace in the world around us, mankind continues to debate whether the increasing use of robots for some activities and jobs is a threat to society – perhaps our very future. Automating hazardous tasks is an easier sell; but using robots to improve reproducibility or eliminate non-value-adding, repetitive tasks is still met with suspicion from many – even in scientific circles.
So, should we embrace automation? For routine, set workflows, where there is consistent throughput and a good return on investment, adoption of automation seems a sensible approach. The simplicity or complexity of automation should be dictated by need and indeed ambition – we do not advocate automation for the sake of automation, but where it adds real value. However, this value may come in many guises, and so we need to be more strategic in how we make this journey.
For example, even if many procedures never reach a scale sufficient to justify building a dedicated machine, introducing automation for key steps to reduce or eliminate the most common errors may still be valuable. If we’re particularly clever, we can introduce these new technologies with one eye on the future automation of the entire process, building sophisticated workflows from a series of core tasks such as weighing, liquid dispensing, dissolution, and transfer to measurement devices. Where the need never arises, or volumes never justify full automation, the individual, optimized core-task devices and procedures could still be used independently – and still constitute consistent best practice.
This could be a particularly beneficial strategy for industries that need to transfer methods or procedures around the world, where the return does not justify full-scale automation (for example, due to low throughput or lower labor costs), but equivalent results must be generated and procedures controlled.
Key parts of the process can be usefully automated to remove the human variation, while still using local operators to link these core tasks together. Thus, the fundamental performance and compliance of the process is maintained. As scale increases and productivity improves, upgrades to full automation have a more obvious return on investment (ROI) and can be carried out confidently based on well-proven use elsewhere.
This approach also influences the design of automation, as it implies component, task-based modules that can be used on a stand-alone basis to perform a specific, otherwise error-prone task, or configured with other modules into systems that manage a more complex process. On the other hand, where it’s unlikely that a task will ever be carried out in stand-alone mode, a more integrated (specifically configured) design can be considered as an expert system – but it should still be capable of linking to, and transferring work to and from, more sophisticated workflows. This also raises the stakes for commercial involvement to develop appropriate “off the shelf” automation that is interchangeable within a common platform.
All of these proposed approaches support the identification and adoption of more standardized interfaces and formats for work preparation and execution, resulting in vendor-agnostic instrumentation that is seamlessly integrated and managed. The Allotrope Foundation is a pioneer of this journey.
As technology continues to develop at a rapid pace, what were once aspirational approaches are becoming reality. Imagine a world where compliance is built in and machines are intelligent enough to handle errors and process data automatically. In fact, rather than aspirations, these become fundamental expectations of automation.
Automation could guarantee that a process was carried out correctly and progressed through every stage without incident, with all important parameters checked, monitored and recorded. Data integrity compliance comes as a standard feature. As the cost and complexity of monitoring and tracking existing processes in a fast-changing regulatory environment increases, the case for expert automation becomes ever stronger. Further developments in artificial intelligence and automated decision-making mean that automated release of products following testing is an achievable reality.
Time to answer?
The time is upon us when we can successfully automate many of the sophisticated processes that many still believe rely on the skill of highly trained human operatives and scientists. Well-designed automation will increase reliability, reproducibility, productivity, confidence and process compliance, whilst decreasing error rates, time to decision and – ultimately – cost.
We would like to see a more deliberate assessment of the potential for automating many of the tasks that make up key processes in the analytical science industry. In fact, workflows in many industries stand to benefit from this building-block approach, which can be adopted step by step as part of a long-term automation strategy – there are numerous examples of this happening in other industries, such as medicine and automotive manufacturing.
We need to think more strategically and innovatively about how we develop and ultimately deploy automation. Strategies need to be enabling – not just built around what is commercially available or accepting of an inefficient, non-robust “traditional” way of completing a task. Having a fit-for-purpose, scalable approach could remove some of the aforementioned barriers, and is likely to result in increased adoption.
All of the very real benefits of automation will only become a commercial reality with enough buy-in from the user industry as well as the commercial vendors, who will need to work more collaboratively to develop co-operative automation (hardware and software). The science industry in general can be somewhat individualistic about the particular vendor or design of automation that is adopted, and there needs to be more collaboration around performance requirements and desirable standards. It is not entirely down to the big-company end-users to reach these goals individually – the companies that support them, such as CROs, arguably have the biggest interest (and the throughput) in winning the contracts that justify the automation investment.
Change management also has a very important role to play here, and companies can afford to be much more transparent about which roles really require highly trained human staff, and which processes should intentionally be automated.
Transformative change is often mooted. Maybe now is the time to deliberately replace out-of-date, inefficient, error-prone processes with automated machines and systems able to industrialize laboratory and factory processes to a previously unimaginable level.