This column starts to answer the question, “how does one actually find FAIR data?” with a detailed example from Imperial College London.
Tony Davies marks the passing of Svante Wold, who gave us “chemometrics”. It all started with a grant application!
Tony Davies has discovered there is a new UK National Data Strategy and that it is on the right lines: echoing many of his suggestions in this column over many years.
COLID is a finding aid, essentially a “catalogue of catalogues” collating any data source with which it is connected. It collects and provides metadata about essentially any resource you want to incorporate, links endpoints, such as spectra in a repository or details in a chemicals database, and relates it all semantically to any other related resource.
Tony Davies and Luc Patiny introduce us to a free online NMR data processing tool in “NMRium browser-based nuclear magnetic resonance data processing”. They run through the background to the project, how it works and how you can try it yourself. There is a video introduction and an online demo page where you can play with different scenarios.
Following our articles on the FAIR initiative, we now look at some examples of the FAIRification of data handling, collection and archiving.
The Tony Davies Column offers a challenge to us all with another contribution on FAIR data, which should be Findable, Accessible, Interoperable and Reusable. It is clearly the way we should all be going: everybody from manufacturers and software developers, through researchers, to publishers needs to work together.
Following on from a recent column that reported on work showing that weight fractions are often the wrong concentration units to use in quantitative chemometric studies, Howard Mark goes into more detail.
Peter McIntyre and Tony Davies remember Bill George, a real Welsh character and educator whose style and charisma influenced many to go on and not only stay in science but to rise to leading positions either in industry or academia.
In quantitative analysis, is it better to weigh materials when making up standard solutions or to use volumetric techniques? Traditionally, the answer has been “volume”; however, things may not be as straightforward as they seem. Henk-Jan and colleagues have conducted a new experiment, using robots for both sample preparation and spectroscopic analysis, which may provide a definitive answer. Unfortunately, the answer must wait for publication of their paper, but Tony and Henk-Jan’s history of this question makes interesting reading nevertheless.
Whilst automation is not a panacea, it can improve the accuracy of manual tasks as well as free up our time for more challenging ones. The authors explore some particular examples they have come across and the lessons learned from them.
Tony Davies and Mohan Cashyap discuss this topic with help from a number of industry experts. Whilst there are undoubted computing and networking issues for regulated industries in allowing working from home as if the user were in the lab, they are not insurmountable.
With a significant proportion of our regular readership probably under home lock-down, we were wondering if we could help you at this difficult time by pointing out some useful online resources. So, when we finally come out of this pandemic, you could do so better skilled and more up-to-date than when we went into it.
Tony and Lutgarde Buydens give us an update on the planning for the major EuroAnalysis 2021 conference, which is being held in Nijmegen, the Netherlands, at the end of August 2021. At this stage, they are keen to gather suggestions from readers on topics they would like to see covered. Groups are also invited to consider hosting their own event under the EuroAnalysis 2021 banner.
The authors offer many useful points to consider when using pre-processing techniques.
A recent conference on Extractables and Leachables in Hamburg not only allowed two ex-colleagues to meet after many years, but also provided information on developments and trends in the regulatory environment. Not only are ever lower levels of detection required, but analytical requirements are also being placed on companies further back in the materials supply chain that have not had to make such considerations before.
Do you have the budget for a new instrument? Well, stop… wait! Tony Davies and Marian Draaisma have very useful guidelines. They run through questions you should be asking yourself (and your team), information you should gather and how you should go about the selection process for a new spectrometer. If you’re not buying just now, this is definitely a column you will want to bookmark for future reference.
Tony Davies and Roy Goodacre raise some issues around the reliance just on vast quantities of data collection in omics experiments. As they put it, should we “just keep throwing the mass spectra, nuclear magnetic resonance data sets and our ion mobility fingerprints onto a big pile for the statisticians to fight over?”.
Tony Davies, Peter Lampen and Robert Lancashire are worried about their metadata, or perhaps the lack of it. With the explosion of data and ways to mine and make use of it, having accurate and appropriate metadata about analytical data sets is vital if they are to be reused efficiently or at all. This is also an area that is being increasingly targeted by regulators, with the US FDA issuing guidance at the end of 2018; others will follow. You have been warned.
Hafiz Abdul Azeem recently presented some interesting results from his work on atmospheric aerosols. Following their capture, he combined the optimisation of the extraction process with chromatographic separation and mass spectrometric detection to identify various sources of pollution through their emission marker fingerprints.1 One spin-off of this work has been the use of a specific biomarker from cellulose combustion to potentially warn of low-heat smouldering in, for example, agricultural materials in bulk storage.