Quote:
Originally Posted by sperry
No, I get the difference between accuracy and repeatability.
But for a set of calipers, where accuracy is usually the desired trait, why does the display have a .1mm resolution if the instrument is only accurate to .2mm? Shouldn't the last digit always be a multiple of .2mm to give the most accurate measurements?
It's even more pronounced on the inch side - resolution to .001" with accuracy only to .005".
Kinda like I tried to explain, the electro-mechanical system in the instrument can accurately distinguish a movement as small as .001", so that is what they set the resolution to. But the overall accuracy of the instrument, probably due to its length and maybe its user-zeroing method and user-thumb-pressure-dependent measurement method, is only good for +/-.005". For some instruments the accuracy is spec'd per length instead - like .001"/inch. A rough sketch of how those specs play out is below.
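Just to put numbers on it, here's a quick Python sketch of the idea (the function names and example values are made up for illustration, not from any caliper's datasheet): the display rounds to the resolution step, while the accuracy spec - flat or per-inch - sets how wide the band around that reading really is.

```python
# Minimal sketch (hypothetical numbers): resolution sets the display step,
# accuracy sets the band the true size could actually fall in.

def displayed_reading(true_size, resolution=0.001):
    """What the display shows: the true size rounded to the nearest count."""
    return round(true_size / resolution) * resolution

def error_band(reading, fixed_tol=0.005, per_inch_tol=None):
    """Worst-case band around a reading, for a flat spec or a per-length spec."""
    tol = per_inch_tol * reading if per_inch_tol is not None else fixed_tol
    return reading - tol, reading + tol

reading = displayed_reading(1.2374)                  # display shows 1.237"
lo, hi = error_band(reading)                         # flat +/-0.005" spec
print(f"reads {reading:.3f}\", true size {lo:.3f}\" to {hi:.3f}\"")
lo, hi = error_band(reading, per_inch_tol=0.001)     # .001"/inch spec
print(f"reads {reading:.3f}\", true size {lo:.3f}\" to {hi:.3f}\"")
```

So the display happily shows 1.237", but with a +/-.005" instrument the part could really be anywhere from about 1.232" to 1.242" - which is the original question in a nutshell.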