

Dec-2019

Radiometric measurements: accuracy, repeatability and errors

Radiometric measurements for industrial processes have been around for several decades.

Thomas Schmidt and Dr. Steffen Müller
Berthold Technologies GmbH & Co. KG



Article Summary

They are an important component in performing critical level, density and flow measurements. Nuclear measurement gauges work where conventional measuring technology fails, offering outstanding measurement results under extreme conditions. High temperatures, pressures and other difficult ambient and process conditions are no problem for radiometric measurements. Typical measurement tasks are non-contact level and density measurements in various vessels, bunkers and piping systems, interface measurements in oil separators, and moisture measurement. They can also be used as non-contact limit switches.

What is a radiometric measurement?
Nuclear measurement gauges operate on a simple yet sophisticated concept – the principle of attenuation. A typical radiometric measurement consists of
• A source that emits γ-radiation, produced from a nuclear radioisotope.
• A vessel or container with process material under investigation.
• A detector capable of detecting γ-radiation.

If there is little or no material in the path of the radiation beam, the radiation intensity remains strong. If there is material in the path, the beam is attenuated. The amount of radiation detected by the detector can then be used to calculate the desired process value. This principle applies to virtually any nuclear measurement. Nuclear measurement technology is highly reproducible. Using the laws of physics and statistics, as well as sophisticated software, the success of a nuclear-based measurement is all but guaranteed. However, correct and exact application information is imperative for the design of an accurate and reproducible measurement. Considering the benefits of a completely non-contacting and non-intrusive technology, nuclear measurement technology becomes the method of choice for the most difficult and challenging process measurement applications.
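The attenuation principle can be expressed with the well-known exponential (Beer-Lambert) attenuation law. The short Python sketch below illustrates how the count rate at the detector falls as material enters the beam path; the function name and all numerical values are illustrative assumptions, not values from a specific gauge.

```python
import math

def transmitted_intensity(i0_cps, mu_cm2_per_g, density_g_per_cm3, path_cm):
    """Estimate the count rate behind an absorbing medium (exponential attenuation law)."""
    return i0_cps * math.exp(-mu_cm2_per_g * density_g_per_cm3 * path_cm)

# Illustrative numbers: empty-path count rate of 10 000 cps, a water-like product
# (mass attenuation coefficient ~0.086 cm^2/g at the 662 keV line of Cs-137)
# and a 20 cm beam path through the product.
print(transmitted_intensity(i0_cps=10_000, mu_cm2_per_g=0.086,
                            density_g_per_cm3=1.0, path_cm=20))  # ~1790 cps
```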

Every measurement involves errors, and radiometric measurements are no exception. In this document we elaborate on the nature and causes of such deviations and on how they can be suppressed by applying best practice and good workmanship.

Radiation sources
There are many known natural and artificial radioactive isotopes, but not all of them are used for radiometric measurement. In industrial applications only a few nuclides are actually used for measurement purposes. The radioactive isotope is usually placed in a rugged, steel-jacketed lead housing for maximum safety. The housing shields the radiation emitted by the radioactive isotope, except in the direction in which it is supposed to travel. Using a small collimated aperture in the shield, the beam can be projected at various angles into the pipe or vessel. This warrants a high quality of measurement with minimal exposure of personnel to radiation. The ALARA (As Low As Reasonably Achievable) principle for maximum work safety applies to everything that has to do with nuclear isotopes.

Radioactivity
The most commonly applied isotopes are Cesium-137 and Cobalt-60; for certain applications Americium-241 is also used. They differ from each other in half-life, but also in the energy of the emitted gamma radiation. It is very easy to confuse the meaning of the “activity” of a source with the “energy” of the radiation it emits.

• Activity describes the average number of nuclear decays per unit time that result in an emitted gamma quantum. In other words: it reflects the amount of radioactive material.
• Each gamma quantum has a specific energy. The energy distribution of the emitted gamma quanta is characteristic for each isotope. The gamma energy is directly linked to the ability of the radiation to penetrate materials (media, vessel walls, etc.).

It is important to understand that the number of emitted gamma quanta – and hence the activity – has nothing to do with their energy. This is similar to the colour of light, which is not linked to its brightness. Also, the half-life should not be confused with the lifetime of a source! The half-life of an isotope is an unchangeable physical property and refers to the time in which the activity of the material drops to half, whereas radiometric sources are typically designed for a service lifetime of 10 to 15 years, independent of the isotope used. The selection of an isotope for a specific measurement task is based on the media to be measured, but also on the setup (i.e. point or rod source) and on-site constructional specifics. The decay of any isotope is a stochastic process and can therefore be analysed with statistical methods.
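To make the distinction between half-life and source lifetime concrete, the following sketch computes the remaining activity of a source over time. The half-lives are the physical constants of Cs-137 and Co-60; the initial activity and service time are purely illustrative assumptions.

```python
def activity_after(a0, t_years, half_life_years):
    """Remaining activity after t_years, starting from an initial activity a0."""
    return a0 * 0.5 ** (t_years / half_life_years)

# Half-lives are physical constants of the isotopes (approximate values):
CS137_HALF_LIFE = 30.1  # years
CO60_HALF_LIFE = 5.27   # years

# Illustrative example: a 1.0 GBq source after 10 years of service.
print(activity_after(1.0, 10, CS137_HALF_LIFE))  # ~0.79 GBq remaining
print(activity_after(1.0, 10, CO60_HALF_LIFE))   # ~0.27 GBq remaining
```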
 
Point sources
Point sources are widely used in virtually any measurement task, whether it is density, level or use as a limit switch. They usually comprise an inner source capsule, which securely encapsulates the radioactive material, and a shield with a shutter mechanism to block the radiation in a controlled way.

Rod sources
A rod source is a device in which the active area is continuously distributed over the complete measuring range. This can be achieved by winding an activated cobalt wire (Co-60) onto a carrier. One of the benefits of a rod source is that a “perfectly” linear calibration of the detected signal (count rate vs. level %) can be achieved, meaning the percentage change of the count rate is directly related to the percentage change of the process value.
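A minimal sketch of such a linear two-point calibration is shown below; the “empty” and “full” count rates and the current reading are hypothetical values chosen only for illustration.

```python
def level_percent(count_rate, cps_empty, cps_full):
    """Two-point linear calibration: map a count rate to a level in percent."""
    level = (cps_empty - count_rate) / (cps_empty - cps_full) * 100.0
    return max(0.0, min(100.0, level))  # clamp to the calibrated range

# Illustrative calibration points (empty and full vessel) and a current reading:
print(level_percent(count_rate=6_500, cps_empty=10_000, cps_full=1_000))  # ~38.9 %
```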

Detectors
The radiation detector contains a scintillator: a crystal made from a special polymer material or an inorganic crystal such as doped sodium iodide. The scintillator converts the incoming gamma radiation into flashes of visible light. The crystal is optically coupled to a photomultiplier, which converts the light into electrical pulses. While the vacuum photomultiplier tube has been used successfully for decades, silicon photomultipliers (SiPM) are now also available and are becoming more and more established in industrial detectors.

Fig. 4 shows schematically how a detector works. When the emitted gamma quanta hit the crystal, after having passed through the walls of the vessel or pipe and the measured product itself, each gamma photon in the beam may generate a light flash, resulting in thousands of light pulses that are recorded by the photomultiplier. Each light pulse is converted into an electrical pulse by the photomultiplier. After digitising the signal, these pulses are counted to determine the so-called count rate, typically expressed in counts per second (cps) or as a frequency (Hz). The intelligence that distinguishes between the various measuring tasks (i.e. level or density) is implemented in the transmitter or control unit. The count rate is used to compute a process-related signal, which can be used for a display, an analogue current output or bus connections into a DCS or PLC.
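As an example of how a transmitter can turn a count rate into a process value, the sketch below inverts the attenuation law around a single calibration point to estimate a product density. The function, the calibration point and the numerical values are illustrative assumptions, not the algorithm of any particular control unit.

```python
import math

def density_from_count_rate(count_rate, cps_ref, density_ref, mu_cm2_per_g, path_cm):
    """Invert the attenuation law around a known calibration point to estimate density."""
    return density_ref + math.log(cps_ref / count_rate) / (mu_cm2_per_g * path_cm)

# Illustrative calibration on water (1.0 g/cm3) in a pipe with a 20 cm beam path:
print(density_from_count_rate(count_rate=4_800, cps_ref=5_000, density_ref=1.0,
                              mu_cm2_per_g=0.086, path_cm=20))  # ~1.02 g/cm3
```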

The detector measures any γ-radiation arriving in the scintillator, without distinguishing between the “useful” count rate originating from the source and natural background radiation from the environment. We will see later how interference radiation from weld inspection, changes in natural background radiation, etc. can be handled.

Point detectors
Detectors with a small scintillator are called point detectors. They often employ a small cylindrical scintillator, e.g. 50 mm in diameter and 50 mm in height. They are typically used for density applications, but also for level switches or continuous level measurements. Depending on the measurement task, other scintillator sizes may be used. Due to its small sensitive volume, a point detector is not greatly affected by background radiation. Additionally, point detectors can easily be equipped with a lead collimator to further suppress sensitivity to background radiation.

