Maybe you were away from college on the day they covered instrumentation theory: accuracy, resolution, repeatability and all that stuff. You are in good company – many engineers have either forgotten this area of engineering or never really understood it. The terminology and the fairly esoteric technical concepts applied to instrumentation are confusing.
Nevertheless, they are crucial to selecting the right measuring instruments for your application. Get the selection wrong one way and you could end up paying way over the odds for over-specified transducers; get it wrong the other way and your product or control system may lack critical performance.
This article focuses on position transducers and explains some of the terminology, the key considerations in specifying appropriate instrumentation for your application, and some common pitfalls.
Firstly, some definitions:
- An instrument's Accuracy is a measure of how closely its output corresponds to the true value of the quantity being measured
- An instrument’s Resolution is a measure of the smallest increment or decrement in position that it can measure
- A position measuring instrument's Precision is its degree of reproducibility: how closely repeated measurements of the same position agree with each other
- A position measuring instrument's Linearity is a measure of the deviation between a transducer's output and the actual displacement being measured
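The distinction between accuracy and precision in the list above can be made concrete with a short numerical sketch. The readings below are made up for illustration: an instrument whose repeated readings cluster tightly (good precision) but sit consistently above the true position (poor accuracy).

```python
import statistics

# Hypothetical repeated readings (mm) of a true displacement of 10.000 mm
true_position = 10.000
readings = [10.021, 10.019, 10.022, 10.020, 10.018]

# Accuracy: how close the readings are, on average, to the true value
mean_error = statistics.mean(readings) - true_position

# Precision: how closely the readings agree with each other (their spread)
spread = statistics.stdev(readings)

print(f"Mean error (accuracy):     {mean_error:+.3f} mm")
print(f"Std deviation (precision): {spread:.4f} mm")
```

Here the spread is only a couple of microns, so the instrument is precise, yet every reading is about 20 microns high, so it is not accurate. A simple calibration offset could correct the accuracy; no amount of calibration can improve poor precision.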
Most engineers get their knickers in a twist over the difference between precision and accuracy. The classic way to explain it is the analogy of arrows fired at a target. Accuracy describes the closeness of an arrow to the bullseye.