I'm having trouble truly understanding how to apply calibration tolerances to various makes and models of load cell/strain gauge based weight scales.
I found and downloaded NIST Handbook 44 (Specifications, Tolerances, and Other Technical Requirements for Weighing and Measuring Devices), but that thing is so full of superfluous information that I haven't been able to extract what I need from it.
For example, a Mettler-Toledo Wildcat WS60LVR Floor Scale... 60 kg span, 0.02 kg resolution... but what the heck's the accuracy? Is it percentage-of-FS/IV based? Is it a simple +/- kg value?
I don't know, but I'm really trying to do this "right," instead of just going "Eh... I'll calibrate it to a tolerance of +/- 0.04 kg and call it a day."
Any help would be appreciated... :-D
Does it have a class number, like III, IIIL, etc.?
That one is Class III
I worked with Mettler Toledo for years and had many of the same hang-ups you have. What I used to do was gnat's-ass every calibration; then I knew it was good. Most scales take little or no time to cal, but you already know that.
I got with one of our engineers and he summed it up like this: the indicator and the cells will each have an error percentage, and those stack up cumulatively. How much total error you're allowed depends on the class of scale (regrettably, Handbook 44 is the bible on this... <shaking head and weeping silently>).
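To put rough numbers on that for a Class III scale, here's a quick Python sketch of how I read the Table 6 tolerances. The division bands and the "acceptance = half of maintenance" rule are my reading of the handbook, not gospel, so check them against your own copy before you write them into a procedure:

```python
# Rough sketch of HB44 Table 6 as I read it for Class III
# (verify these bands against your own copy of Handbook 44):
#   test load of     0 -   500 divisions -> +/- 1 d maintenance tolerance
#   test load of   501 - 2,000 divisions -> +/- 2 d
#   test load of 2,001 - 4,000 divisions -> +/- 3 d
#   test load over       4,000 divisions -> +/- 5 d
# Acceptance tolerance is taken as half the maintenance value.

def class_iii_tolerance_kg(test_load_kg, division_kg, acceptance=False):
    """Tolerance in kg at a given applied test load for a Class III scale."""
    divisions_applied = test_load_kg / division_kg
    if divisions_applied <= 500:
        tol_d = 1.0
    elif divisions_applied <= 2000:
        tol_d = 2.0
    elif divisions_applied <= 4000:
        tol_d = 3.0
    else:
        tol_d = 5.0
    if acceptance:
        tol_d /= 2
    return tol_d * division_kg

# The scale in question: 60 kg span, 0.02 kg division -> 3,000 divisions total
for load_kg in (10, 30, 60):
    maint = class_iii_tolerance_kg(load_kg, 0.02)
    accept = class_iii_tolerance_kg(load_kg, 0.02, acceptance=True)
    print(f"{load_kg} kg: +/- {maint:.2f} kg maintenance, +/- {accept:.2f} kg acceptance")
```

If those bands are right, a 60 kg / 0.02 kg scale works out to about +/- 0.04 kg at mid-range and +/- 0.06 kg near capacity for maintenance, so your gut figure of +/- 0.04 kg wasn't far off.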
I learned early on with scales that the best thing to do is a complete cal: snug in the load cells and get the cal as tight as possible. Then there's no real argument. Not much of an answer, I'm afraid, but take the listed error for the cells and the error for the meter, and that will ballpark you.
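By "ballpark" I mean arithmetic like the sketch below. The percentages are placeholders I made up for illustration, not Mettler-Toledo's published specs, and whether you add the terms straight or root-sum-square them is a judgment call (the straight sum is the conservative one):

```python
import math

# Placeholder spec numbers -- NOT the WS60LVR's actual figures; substitute
# whatever the load cell and indicator data sheets list for your scale.
capacity_kg = 60.0
cell_error_pct = 0.02       # combined error of the load cell(s), % of full scale
indicator_error_pct = 0.01  # indicator/meter error, % of full scale

cell_error_kg = capacity_kg * cell_error_pct / 100
indicator_error_kg = capacity_kg * indicator_error_pct / 100

# Conservative ballpark: straight sum of the listed errors.
worst_case_kg = cell_error_kg + indicator_error_kg

# Less conservative: root-sum-square, if you treat the two as independent.
rss_kg = math.sqrt(cell_error_kg**2 + indicator_error_kg**2)

print(f"worst case: +/- {worst_case_kg:.3f} kg, RSS: +/- {rss_kg:.3f} kg")
```

Either way, treat that as a sanity check against the class tolerance, not a substitute for it.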