How to Calibrate a Data Logger / Data Acquisition System
Author: Dr Duncan James, creator of Tcal, a specialised data-logger calibration process.
As the author of Tcal I am often asked: "How do you calibrate a data logger?" Mainly I am asked by customers who are simply intrigued by what I am doing during an on-site visit.
If we exclude some of the more specialist instruments, most data loggers simply measure an input voltage. Unsurprisingly, the basis of every calibration technique I have ever seen is to input a known voltage and check that the instrument records the correct value. Depending on the thoroughness of the technique, this may be done for a single positive voltage; for a zero and a negative voltage as well; and sometimes for a series of voltages over the entire range. In general, the more voltages that are tested, the better.
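To make that check concrete, here is a minimal sketch in Python of comparing recorded readings against known applied voltages. The voltages, readings and tolerance are invented figures for illustration, not values taken from Tcal or from any particular instrument.

    # Minimal sketch: compare recorded readings against known applied voltages.
    # All numbers here are illustrative only.
    applied = [-10.0, -5.0, 0.0, 5.0, 10.0]            # known voltages fed to the input (V)
    recorded = [-9.998, -5.001, 0.0002, 5.002, 9.997]  # what the logger reported (V)
    tolerance = 0.005                                   # acceptable error for this range (V)

    for v_in, v_out in zip(applied, recorded):
        error = v_out - v_in
        status = "PASS" if abs(error) <= tolerance else "FAIL"
        print(f"applied {v_in:+7.3f} V  recorded {v_out:+8.4f} V  error {error:+7.4f} V  {status}")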
That is pretty much it. Timebase is easily done with a known-frequency input. Stability is effectively tested already, provided the calibration data collection takes, say, an hour. Sometimes the response to a fast-changing signal is important; in my experience this has been accounted for during the design of work procedures, before the instrument was even bought. In the rare cases that response time has to be part of the calibration process, I personally recommend recording the instrument's response to a step change in voltage.
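For readers who do need to quantify the step response, the sketch below extracts a 10%-90% rise time from a recorded step. The sample data are invented, and the 10%-90% convention is simply one common choice rather than a requirement of any standard.

    # Sketch: estimate the 10%-90% rise time from a recorded step response.
    # The sample data below are illustrative only.
    times = [0.000, 0.001, 0.002, 0.003, 0.004, 0.005, 0.006]   # seconds
    volts = [0.00, 0.05, 0.62, 1.71, 2.31, 2.47, 2.50]          # step from 0 V to 2.5 V

    v_lo, v_hi = volts[0], volts[-1]
    t10 = next(t for t, v in zip(times, volts) if v >= v_lo + 0.1 * (v_hi - v_lo))
    t90 = next(t for t, v in zip(times, volts) if v >= v_lo + 0.9 * (v_hi - v_lo))
    print(f"10%-90% rise time: {(t90 - t10) * 1000:.1f} ms")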
A problem with modern technology is that there can be so many channels and ranges that the number of individual measurements can exceed 6,000 for a 5-point calibration of a 100-channel instrument. I have heard other calibration experts conclude that "therefore a complete calibration is impossible so it is necessary to simply choose a few to test properly". I believe that this is misleading because of the opportunities provided by modern, fast computers.
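To show where a figure like that comes from, here is the arithmetic, assuming, purely for illustration, a dozen ranges per channel:

    # Illustrative count of individual measurements for a full calibration.
    channels = 100            # example channel count
    ranges_per_channel = 12   # assumed number of input ranges per channel (illustrative)
    points_per_range = 5      # 5-point calibration

    total = channels * ranges_per_channel * points_per_range
    print(f"{total} individual measurements")   # 6000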
I have seen two main approaches to speeding up the calibration process. One approach is to use a macro or script that controls both the data logger and a voltage generator; the system can then sit and calibrate itself. A downside to this approach is that it involves a lot of software development. For an instrument type that does not sell large numbers of units, these development costs are shared between fewer end users.
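A stripped-down sketch of that idea is shown below. The Calibrator and Logger classes are stand-ins for whatever driver layer the real instruments provide (a vendor API, SCPI commands and so on); nothing here describes any actual product's interface.

    import random

    class Calibrator:
        """Stand-in for a programmable voltage source."""
        def set_output(self, volts):
            self.volts = volts

    class Logger:
        """Stand-in for the data logger under calibration."""
        def __init__(self, calibrator):
            self.calibrator = calibrator
        def read_channel(self, channel):
            # Simulated reading: the applied voltage plus a little noise.
            return self.calibrator.volts + random.gauss(0, 0.002)

    calibrator = Calibrator()
    logger = Logger(calibrator)

    test_voltages = [-10.0, -5.0, 0.0, 5.0, 10.0]    # example 5-point set
    results = []
    for channel in range(1, 101):                     # example: 100 channels
        for v in test_voltages:
            calibrator.set_output(v)                  # drive the voltage source
            results.append((channel, v, logger.read_channel(channel)))

    print(f"collected {len(results)} readings")       # 500 for this example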
The second approach, and the one I am most familiar with, is to automate every step of the process except one: a human still has to manually input a series of voltages and then save the data to text files with predictable names. The obvious downside of this approach is that it might require, say, an extra hour of staff time. However, I remain an advocate of this hybrid automated/manual approach: I think it is an example of where the time saved by automation is significantly less than the time it would take to program! Also, a single software product (such as Tcal) can then be used for any instrument type, making comparison of results between instruments much easier.
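The processing half of that hybrid approach might look something like the sketch below; the file-naming pattern and file contents are invented for illustration and are not Tcal's actual conventions.

    # Sketch: process operator-saved text files with predictable names.
    # The naming pattern and file format are invented for illustration.
    from pathlib import Path

    applied_voltages = [-10.0, -5.0, 0.0, 5.0, 10.0]

    for v in applied_voltages:
        path = Path(f"cal_{v:+06.2f}V.txt")              # e.g. cal_+05.00V.txt
        if not path.exists():
            print(f"missing file: {path}")
            continue
        readings = [float(line) for line in path.read_text().split()]
        if not readings:
            continue
        mean = sum(readings) / len(readings)
        print(f"{path.name}: mean reading {mean:+.4f} V, error {mean - v:+.4f} V")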
Which modern approach is best? Obviously it depends on your needs. I encourage you to visit my website http://www.drduncanjames.co.uk/Tcal to find out more. There are many free resources there that I am happy to share with everyone in the industry.
For sales and technical information contact:
Dr Duncan James
Phone: +44 (0) 77 5757 4942
Email: drduncanjames@gmail.com
Website: http://www.drduncanjames.co.uk/Tcal
May 2017