More than a year has passed since I expressed my hopes for chemometrics to take center stage in this magazine. Where are we now?
By Ewa Szymańska
When I shared my thoughts on chemometrics a year ago, I was surprised and delighted by the varied reaction:
“Chemometrics has very powerful tools, but often the discussions are very theoretical. The more practical applications that are published the better.”
“Software should flag whether certain data requirements are not being met.”
“Any method that requires statistical treatment (e.g., SAS) for interpretation is an inadequate method.”
“Universities should include chemometrics after statistics in the curriculum, especially for students training in analytical chemistry.”
“Chemometrics has, in fact, been moving towards center-stage in the last few years, but mostly in academia and front-line industrial applications.”
Such comments indicate that chemometrics has not been forgotten and that it still triggers a full spectrum of emotions – at least in certain circles. So, has anything changed over the last year? Is chemometrics receiving the attention it deserves – and if not, why not?
Actually, I think chemometrics is doing OK. Last year, chemometricians developed and added many new tools (for example, variable selection methods, new approaches for data integration and image analysis) to an already extensive toolbox that aids in the planning of experimental work and the analysis and interpretation of analytical data. In fact, if you have a quick browse online, you will find new software toolboxes, scripts and codes that you can easily download and test on your data. The International Chemometrics Society now has more than 600 members – another positive sign. And scientific progress within the field of chemometrics is reported not only in analytical chemistry and specific chemometric journals, but also in journals that cover more specific applications of analytical chemistry, for example in diagnostics or flavor analysis.
Perhaps it should come as no surprise that chemometrics is ‘on the up’. After all, it should be an essential part of many complex multidisciplinary projects, acting as the ‘glue’ that helps bring different aspects together. In addition, the transfer of tools from academia into applied use in industry is becoming more commonplace, facilitated by bodies like COAST (see tas.txp.to/0914/coast). But implementation not only requires the provision of code and databases – it also demands time and know-how. In many cases, chemometric tools are highly complex and user manuals are not sufficient; chemometric expertise is required in order for them to be used correctly. Therefore, either an analytical chemist must be trained extremely well or an on-site chemometrician must be involved in data analyses.
There are other risks. Chemometric resources are numerous and freely available – and the amount of data that can be analyzed is almost without end. Such a plentiful supply of tools and ‘work’ sounds like a blessing but, in practice, it can be a curse. Although many people understand the great potential of chemometrics, hardly anybody knows how to choose the right tool for the job (1), which means that data sets are not handled with optimal methods but with ‘popular methods’, leading to suboptimal results and possibly wrong conclusions.
For example, we should always check a number of different methods or their combinations before choosing the ‘best’ one; solid criteria to assess data and method quality are critical in this process. Automation is another important step, as it allows the selection process to be performed in a fast and unbiased manner. Finally, explanation and visualization of effects caused by different method selections is also essential to understand what happens with the data during analysis and to interpret results correctly. In my view, this is the most critical step: taking a closer look at data and methods to understand the interplay between them.
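To make the idea concrete, here is a minimal sketch in plain Python (with hypothetical toy data and deliberately simple candidate models) of the selection step described above: fit several candidate methods, score each one with the same solid criterion – here, leave-one-out cross-validated error – and let that comparison, rather than habit or popularity, pick the winner. Real chemometric workflows would of course compare far richer models on real spectra.

```python
import math

def loo_rmse(fit, xs, ys):
    """Leave-one-out root-mean-square error for a model-fitting function."""
    errs = []
    for i in range(len(xs)):
        tr_x = xs[:i] + xs[i + 1:]        # hold out one sample...
        tr_y = ys[:i] + ys[i + 1:]
        predict = fit(tr_x, tr_y)          # ...train on the rest...
        errs.append((predict(xs[i]) - ys[i]) ** 2)  # ...score on the held-out point
    return math.sqrt(sum(errs) / len(errs))

def fit_mean(xs, ys):
    """Candidate 1: constant baseline (predicts the training mean)."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_line(xs, ys):
    """Candidate 2: ordinary least-squares straight line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

# Hypothetical, roughly linear toy data standing in for real measurements.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

scores = {name: loo_rmse(fit, xs, ys)
          for name, fit in [("mean", fit_mean), ("line", fit_line)]}
best = min(scores, key=scores.get)         # the criterion, not habit, decides
print(best, scores)
```

The same pattern scales up: swap in more serious candidates and criteria, and the loop becomes exactly the fast, unbiased, automated selection process the text argues for – with the per-method scores kept visible so the choice can be explained, not just made.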
According to Lutgarde Buydens (see tas.txp.to/0914/buydens), the Holy Grail for chemometrics is a theory that facilitates the selection of the methods that best suit the data available and the analytical goal. The simple rules and guidelines that embody the ‘Theory of Chemometrics’ require a helicopter view of the different methods and data types – in addition to integration of expert knowledge. Developing such a theory is clearly a long-term ambition and not an afternoon job. Furthermore, any rules and guidelines that are established will need to be updated and adapted as new chemometric tools become available. The good news is that progress is already being made in specific data analysis steps by the groups of Buydens, Beate Walczak and Peter Filzmoser, who offered a glimpse of the future during presentations at the 14th Conference on Chemometrics in Analytical Chemistry this year in Richmond, VA, USA.
You may be wondering what you can do to speed up that progress. Well, here are a few suggestions that I’ve compiled from last year’s comments:
- Invest your time; learn and teach more about chemometric methods and applications.
- Implement theory in practice and vice versa; share and transfer knowledge between academia and applied analytical chemistry labs.
- Feel free to ask – and don’t be afraid to be asked – about data and methods.
- Create new jobs for chemometricians and data analysts in your lab, so that they can help you with your data challenges.
Finally, believe in chemometrics!
1. R. G. Brereton, “The Evolution of Chemometrics”, Analytical Methods, 5, 3785–3789 (2013).