[PSUBS-MAILIST] Sensor calibration

Sean T. Stevenson via Personal_Submersibles personal_submersibles at psubs.org
Fri Nov 27 15:45:58 EST 2020


I just thought I would share a couple of thoughts regarding calibration, because I noted Jon mentioning possibly using a sensor with a 25% range to calibrate a galvanic sensor with a greater range. While you might be able to get in the ballpark for a linear calibration factor, strictly speaking this is not a proper calibration.

In general, the range of your calibration standard should always match or exceed the range of the device you are calibrating, for a couple of reasons. The first is that while extrapolating a linear factor can indeed produce a reasonable slope estimate, linearization data generated via lookup tables or polynomial corrections is only valid within the calibration range, and cannot be relied upon once that range is exceeded. The second is that absolute error may be proportional to reading, but it may also grow in the domain beyond the calibration range, and you are then unable to characterize that error in the extrapolated region.
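To make that concrete, here is a minimal sketch in Python with entirely made-up numbers (both the readings and the curve shape are assumptions, not data from any particular sensor). A polynomial correction fitted only on points within a 0-25% range says nothing useful about behaviour at full scale:

    import numpy as np

    # Hypothetical reference points within a 0-25% oxygen calibration range
    true_frac = np.array([0.0, 0.10, 0.21, 0.25])   # known O2 fractions
    sensor_mV = np.array([0.0, 4.8, 10.2, 12.4])    # assumed raw readings

    # Second-order linearization fitted inside the calibrated range
    correct = np.poly1d(np.polyfit(sensor_mV, true_frac, 2))

    # Inside the range the fit is constrained by the calibration points...
    print(correct(10.2))   # ~0.21, as calibrated

    # ...but a full-range reading (say 48 mV at ~100% O2) is pure
    # extrapolation, and nothing in the calibration bounds its error there.
    print(correct(48.0))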

Calibrating one sensor against another is subject to the accumulated maximum error of both. While it may indeed be possible to obtain a very high accuracy calibrated sensor for use as a calibration standard, the more accessible standard for an oxygen concentration calibration outside of the laboratory is gas of known concentration at extremely high accuracy. The easiest and most accessible standards are 100% inert gas (0% oxygen), dry atmospheric air (20.95% oxygen), and of course 100% oxygen, because these are not subject to any sensor error. This gives you a three point calibration which, in contrast to a two point, also provides some estimate of error in the regression, because the three points will never be perfectly in line. That error, in turn, can be taken as the minimum error present in any two-point calibration performed over a lesser range using the first sensor as the calibration standard. Of course, the second sensor can also be independently calibrated against appropriate gases at two points within its range, using the comparison against the other sensor only to establish error bounds.
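As a rough sketch of what that three-point regression looks like in practice (the raw millivolt readings below are assumptions; only the gas concentrations are real):

    import numpy as np

    gas_pct    = np.array([0.0, 20.95, 100.0])   # inert gas, dry air, 100% O2
    reading_mV = np.array([0.1, 9.8, 47.2])      # assumed raw sensor output

    # Linear fit: concentration = slope * reading + offset
    (slope, offset), residuals, *_ = np.polyfit(reading_mV, gas_pct, 1, full=True)

    # The three points will never be perfectly collinear; the residual from
    # the least-squares fit quantifies that disagreement and serves as a
    # floor on the calibration error.
    rms_pct = np.sqrt(residuals[0] / len(gas_pct)) if residuals.size else 0.0

    print(f"slope = {slope:.4f} %/mV, offset = {offset:.3f} %")
    print(f"RMS calibration error floor ~ {rms_pct:.3f} % O2")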

With CO2, in the absence of a custom reference standard calibration gas, you are limited to two point calibrations using oxygen or inert gas for 0 ppm, and atmospheric air for (as of today) 414 ppm. Again, for sub use you need to measure up to 5000 ppm, so this is technically not a proper calibration, because it extrapolates. Properly calibrating this sensor would require sourcing a certified high accuracy reference standard calibration gas, ideally at 5000 ppm CO2, so that the sensor's calibration lies entirely within the range of the calibration standard; the three point calibration then again provides an estimate of the minimum error in the measurement.
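A quick sketch of the two-point CO2 case, again with assumed raw readings, just to make the extrapolation explicit:

    # Two-point CO2 calibration: 0 ppm (oxygen or inert gas) and ~414 ppm (air)
    co2_ppm = [0.0, 414.0]
    raw     = [0.02, 0.48]        # assumed raw sensor output, arbitrary units

    slope  = (co2_ppm[1] - co2_ppm[0]) / (raw[1] - raw[0])
    offset = co2_ppm[0] - slope * raw[0]

    def to_ppm(reading):
        """Convert a raw reading to ppm using the two-point line."""
        return slope * reading + offset

    # Anything above ~414 ppm is extrapolation: a 5000 ppm reading sits roughly
    # twelve times beyond the highest calibration point, so its error is
    # uncharacterized. A certified 5000 ppm reference gas closes that gap and
    # turns this into a proper three-point calibration.
    print(to_ppm(0.48))   # ~414, inside the calibrated span
    print(to_ppm(5.5))    # hypothetical raw value, extrapolated result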

This may not matter to most of you/us, but I point it out for the sake of information. It is always good to be aware of the limitations of measurements. An instrument which reads to three decimal places can instill false confidence in its accuracy if the error in that measurement is on the order of the first or second decimal place.

Sean