A Raman spectroscopy method was optimised to examine the chemical changes in aspirin tablets after exposure to helium temperatures.
Articles and Columns
Sampling and Vikings seems to be the next unexpected connection within Kim Esbensen’s Sampling Column. Kim has been exploring an area of Southern Norway from where the founder of the Theory of Sampling, Pierre Gy, believed his ancestors originated. You will have to read the column to find the “smoking axe”! Oh, and there is an interesting report on the 10th World Conference on Sampling and Blending.
Tony Davies has started a timeline of significant spectroscopic system developments aligned with Queen Elizabeth’s reign, recently celebrated with her Platinum Jubilee. Jumping from the birth of Princess Anne, the Princess Royal, to Heinrich Kaiser certainly makes for a novel approach! Tony hopes that we can turn this into an online resource with your help.
John Hammond has taken a break from his Four Generations magnum opus and reports on the recent meeting of the ISO technical committee on reference materials (ISO TC 334).
This article describes the potential uses of MALDI imaging in pathology, and the ability of the technique to map hundreds of biomolecules (proteins, lipids and glycans, for example) in a label-free, untargeted manner, or to image target proteins using a modified immunohistochemistry protocol, often from a single tissue section.
This article describes the use of synchrotron X-ray fluorescence and absorption spectroscopies to image metals in the brain.
This column has invited two world-renowned experts in near infrared (NIR) spectroscopy to let the world benefit from decades of leading-edge experience, especially regarding sampling for quantitative NIR analysis.
This article looks at three related spectroscopic techniques in the toolbox, namely fluorescence, near infrared (NIR) and Raman, and discusses the “what”, “where” and “how” of their use to improve the quality of the associated measurement processes.
This column starts to answer the question, “how does one actually find FAIR data?” with a detailed example from Imperial College London.
Despite a multitude of chemical and physical methods capable of detecting fingerprint residues, there are substantial challenges with fingerprint recovery. Spectroscopic methods have played a critical role in the analysis of fingerprints, being used to identify the chemical constituents present, examine their degradation over time and compare the chemical variation between donors.
The latest in this series of “Four Generations of Quality” considers the essential component that controls our modern instrument systems and the associated concept of data integrity that is fundamental to the quality of the data being generated.
Sampling is nothing more than the practical application of statistics. If statistics were not available, one would have to sample every portion of an entire population to determine one or more parameters of interest. There are many potential statistical tests that could be employed in sampling, but many are useful only if certain assumptions about the population are valid. Prior to any sampling event, the operative Decision Unit (DU) must be established. The Decision Unit is the material object to which an analytical result makes inference. In many cases, there is more than one Decision Unit in a population. A lot is a collection (population) of individual Decision Units that will be treated as a whole (accepted or rejected), depending on the analytical results for individual Decision Units. The application of the Theory of Sampling (TOS) is critical for sampling the material within a Decision Unit. However, knowledge of the analytical concentration of interest within one Decision Unit may not provide information on unsampled Decision Units, especially for a hyper-heterogeneous lot where one Decision Unit can have completely different characteristics from an adjacent one. In cases where every Decision Unit cannot be sampled, non-parametric statistics can be used to make inferences from the sampled Decision Units to those that are not sampled. The combination of the TOS for sampling of individual Decision Units with non-parametric statistics offers the best possible inference for situations where there are more Decision Units than can practically be sampled.
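For illustration only (the article details its own statistical machinery): one classic distribution-free route to the kind of inference described above is the “success-run” argument. If n randomly selected Decision Units are all found compliant, the confidence that at least a proportion p of all Decision Units in the lot are compliant is 1 − pⁿ. The Python sketch below implements that relationship; the function names and the 90 %/95 % figures are illustrative assumptions, not taken from the article.

```python
import math

def min_samples_for_confidence(p: float, confidence: float) -> int:
    """Smallest number of randomly selected Decision Units that must all be
    found compliant to claim, with the stated confidence, that at least a
    proportion p of all Decision Units in the lot are compliant.

    Distribution-free "success-run" result: if the true compliant fraction
    were only p, the chance that n random Decision Units are all compliant
    is p**n, so confidence = 1 - p**n.
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(p))

def confidence_from_samples(p: float, n: int) -> float:
    """Confidence that at least a proportion p of Decision Units are
    compliant, given that all n randomly sampled Decision Units were."""
    return 1.0 - p ** n

if __name__ == "__main__":
    # Example: how many Decision Units must be sampled (and all found
    # compliant) to state with 95 % confidence that at least 90 % of the
    # lot is compliant?
    n = min_samples_for_confidence(p=0.90, confidence=0.95)
    print(n)                                   # 29
    print(confidence_from_samples(0.90, n))    # ~0.95
```

The appeal of a non-parametric statement like this is that it makes no assumption about how the analyte is distributed across Decision Units, which is exactly the situation in a hyper-heterogeneous lot.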
This article is about photoacoustic imaging and spectroscopy, and their use for looking inside us, where they have a number of benefits. Hilde Jans and Xavier Rottenberg explain the fundamentals and how new technology may be bringing about a new photoacoustics age.
Tony Davies marks the passing of Svante Wold, who gave us “chemometrics”. It all started with a grant application!
Kim Esbensen, along with Dick Minnitt and Simon Dominy, tackles the ever-present dangers in sub-sampling; in this case in the assaying labs of mining companies.
John Hammond continues his Four Generations of Quality series and starts to look at changes that will affect our activities into the future.
John Hammond continues his journey through four generations of quality, this time focusing on some of the specific Quality “tools” in use in both the ISO and GxP environments; how they are defined, applied and used; and how they have evolved with time.
Tony Davies has discovered there is a new UK National Data Strategy and that it is on the right lines: echoing many of his suggestions in this column over many years.
Getting your sampling right can hardly be more important than in the nuclear waste industry. This column describes how Belgian nuclear waste processing has benefited from the Theory of Sampling, and how it has led to important insights and significant potential improvements in the field of radioactive waste characterisation.
This sponsored article describes the RADIAN ASAP, a dedicated direct analysis system, which uses Atmospheric Solids Analysis Probe (ASAP) technology to analyse solids, liquids and solutions.