
Bruce Kaiser and Aaron Shugar

…strict criteria, the concentration values given by the various instruments’ analytical software (calibrations) are often inaccurate. Using concentrations derived in this manner to draw conclusions regarding, for example, provenance or technology of manufacture usually results in erroneous deductions. Benchtop, lab-based XRF systems rely on destructive analysis of fragments removed from objects, or of material that has been properly “prepared” to ensure that the samples are homogeneous. Preparation typically means milling the samples to a uniform particle size, adding approximately 80% by weight of prescribed glass formers such as borax, mixing the batch well, and then fusing it in a furnace into glass beads (a technique in use for XRF analysis as early as the late 1950s; Instituut 1957:154; International Atomic Energy Agency 1960:44). For XRF analysis of glass objects, whether lab-based or handheld, sampling and homogenizing remains the only failsafe way to obtain an accurate elemental analysis of the bulk glass composition; indeed, it is the very technique that quantitative laboratory XRF systems require. Unfortunately, this process cannot be applied to most important cultural heritage objects made of glass, because its destructive nature is unacceptable for irreplaceable materials. With the advent of the most advanced handheld X-ray fluorescence spectrometers, which are non-destructive and non-invasive (no sampling required), direct object analysis becomes practical without causing damage, not even the formation of color centers in the glass through radiation damage. However, great care must be taken when applying XRF to any unprepared object: surface compositions will dominate the measured signal, and non-ideal geometries will compromise data quality.
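The flux dilution described above has a direct arithmetic consequence for what the instrument measures in a fused bead. A minimal sketch of that dilution; the 80% flux fraction comes from the text, while the sample composition here is a hypothetical soda-lime glass, not data from the chapter:

```python
# Fused-bead preparation: adding ~80% flux by weight dilutes every
# element of the original sample by the same factor (here, 5x).
flux_fraction = 0.80                    # borax-type glass former, by weight
sample_fraction = 1.0 - flux_fraction   # fraction of bead that is sample

# Hypothetical bulk composition of a soda-lime glass fragment (wt%)
sample_wt_pct = {"SiO2": 70.0, "Na2O": 15.0, "CaO": 9.0, "Fe2O3": 0.5}

# Concentrations actually present in the fused bead that the XRF sees
bead_wt_pct = {oxide: c * sample_fraction for oxide, c in sample_wt_pct.items()}

print(round(bead_wt_pct["SiO2"], 3))    # 14.0 -> a five-fold dilution
```

One practical consequence: minor elements such as the 0.5 wt% Fe2O3 above drop to 0.1 wt% in the bead, so the spectrometer must be sensitive enough to quantify them after dilution.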
The physics of characteristic X-ray emission from the elements has not changed since its discovery in the late 1800s (the beginnings of X-ray fluorescence spectroscopy can be traced to Wilhelm Conrad Röntgen’s 1895 discovery of high-energy photons, i.e. X-rays), and quantification depends on several factors, each of which can have an exceptional effect on the result. The fluorescent X-ray intensity depends on: the inverse square of the distance from the emitting atom to the detector; the matrix density (an exponential factor); the emitted X-ray energy of the element (an exponential factor); the element’s location, i.e. depth, in the sample matrix (an exponential factor); and any incident beam filtering together with the incident beam energy, distribution, and angle (all also exponential factors). Ultimately this means that a change to any of these parameters alters the measured emission very rapidly, producing large variation in the reported elemental concentrations. The equation that highlights many of these sensitivities is given in the section on Depth of Analysis below. The sensitivity to incident beam energy can be understood by reviewing the NIST website dedicated to X-ray attenuation (Hubbell and Seltzer 2012). It also means that if an unknown sample is non-uniform (always the case for ancient and historic glass objects, whose surfaces are alkali-depleted by weathering), the automated calibrations built into these instruments will report concentrations that have no real physical basis or meaning. No algorithm or ‘tuned’ calibration can take all of these unknown variables into account. Thus, without the good sample uniformity that a prepared sample affords, quantitative analysis of a non-uniform material by any XRF system will yield erroneous data.
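The exponential sensitivities listed above can be illustrated with the Beer-Lambert attenuation law, I = I0 · exp(-μ/ρ · ρ · x), combined with the inverse-square geometric factor. A minimal sketch under stated assumptions: the mass attenuation coefficient, density, and depths below are illustrative placeholders, not NIST values for any real glass:

```python
import math

def detected_intensity(i0, distance_cm, mu_rho, density, depth_cm):
    """Sketch of two dependencies named in the text: inverse-square
    distance to the detector, and exponential (Beer-Lambert)
    attenuation of the fluorescent X-ray on its way out of the matrix."""
    geometric = 1.0 / distance_cm**2                      # inverse-square factor
    attenuation = math.exp(-mu_rho * density * depth_cm)  # Beer-Lambert factor
    return i0 * geometric * attenuation

# Hypothetical numbers: the same element emitting from slightly
# different depths in the matrix.
shallow = detected_intensity(1e6, 1.0, mu_rho=50.0, density=2.5, depth_cm=0.001)
deeper  = detected_intensity(1e6, 1.0, mu_rho=50.0, density=2.5, depth_cm=0.005)

# A 40-micron difference in emission depth changes the detected
# intensity by roughly 40% under these assumed parameters.
print(round(shallow / deeper, 3))  # 1.649
```

This is why an alkali-depleted weathering layer, which shifts where in the matrix each element effectively emits from, invalidates a calibration built for uniform samples.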
That being said, the raw spectrum contains an exceptional amount of data from which some quantitative information can be derived. One should always examine the spectra closely to glean as much as possible from these data, as long as the physics that produced them is well understood.

Non-uniform glass: the value of qualitative data analysis

Given the discussion above of the physics of X-rays interacting with material (glass is the focus here, but the information is relevant to all analyses), and of everything one must take into account before accepting any quantitative result, one might conclude that the usefulness of XRF is quite limited. This would seem especially true for works of art and archaeological materials, since the great preponderance of material investigated in these fields is non-uniform as a function of depth, and XRF is a surface analysis technique. Actually...
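Qualitative reading of a spectrum typically begins with matching observed peak centroids against tabulated characteristic line energies. A minimal sketch of that matching step; the line energies are standard Kα/Lα values, while the peak centroids and the tolerance are hypothetical, not output from any instrument discussed in the text:

```python
# Standard characteristic line energies in keV (values rounded)
LINES = {"Fe Ka": 6.40, "Cu Ka": 8.05, "Pb La": 10.55, "Sr Ka": 14.16}

def identify_peaks(centroids_kev, tolerance=0.05):
    """Match each observed peak centroid to the nearest tabulated
    line, keeping only matches within the energy tolerance."""
    matches = {}
    for c in centroids_kev:
        best = min(LINES, key=lambda name: abs(LINES[name] - c))
        if abs(LINES[best] - c) <= tolerance:
            matches[c] = best
    return matches

# Hypothetical centroids read off a glass spectrum; the 3.00 keV
# peak matches nothing in this small table and is dropped.
print(identify_peaks([6.41, 10.53, 3.00]))  # {6.41: 'Fe Ka', 10.53: 'Pb La'}
```

Even without any concentration values, a confident identification of, say, a Pb Lα line already carries technological information about the glass.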
