Measurement Uncertainty is Overrated

Started by Duckbutta, 01-18-2009 -- 23:00:03


flew-da-coup

I used to work for Sypris and they are legit. They have their ducks in a row. As for SIMCO I wouldn't work for them. I know nothing about Davis.
You shall do no injustice in judgment, in measurement of length, weight, or volume. -- Leviticus 19:35

Kalrock

I did work at Transcat in Houston, and I thought they did a really good job; the uncertainties were exactly what they were supposed to be.

Uncertainties aren't based upon the individual tech, because honestly that's not a quantitative measurement, and quantitative measurement is what calibration is mostly about. Also, as far as the Houston lab goes, there was only one tech brought in who didn't have military cal training, and he was brought in as a meter beater, but we trained him as well as you're going to get.

I think your problem is kind of like the problem I had when I worked for Davis after that. I found it to be exactly like you were describing Transcat, but I know why it's like that, and it's because of absolutely horrible management. So don't blame the uncertainty system because of a bad experience at a lab. There are bad labs and good labs all over the place, just like there are bad techs and good techs. Sh!t, I knew techs when I was in the service that weren't worth a bucket of piss, and I'm sure it's the same no matter where you go.

So don't get mad at the system; try to change it. It sounds to me like you're suggesting there should be more training, and if that is something you think is pertinent, then I agree.

I hope that was more on subject for you.

Thanks.

Duckbutta

Coup, I'm with you on Simco. Anyone with an ounce of integrity would draw the line there. I would rather mow lawns than work for them. I'd still have my dignity and the pay would be about the same.

Putting the quality of Simco labs aside, what's the deal with their "certs"? The format is atrocious, and with a logo so archaic, shouldn't they be printing them on parchment paper? Come on, Simco! At least look the part.

Duckbutta

#18
Kalrock,

Please don't try to educate me about uncertainty analysis. I've forgotten more about it than you know. The process is a significant contributor of error in any measurement. Inferior technicians perform inadequate processes that are chock full of sources of error. A 10 uV uncertainty on paper can quickly become a 100 uV uncertainty in reality. And anyone who doesn't consider the process in their analysis is either lazy or incompetent. Probably both.
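
To put rough numbers on what I mean (the contributors below are made up, purely for illustration), here is a quick Python sketch of how unmodeled process terms swamp a paper budget once they are combined root-sum-square:

Code:
import math

# Illustrative-only contributors, in microvolts (1-sigma values).
paper_budget = {
    "reference standard": 6.0,
    "resolution": 3.0,
    "repeatability": 7.0,
}

# Hypothetical process contributors a careless tech never evaluates.
process_terms = {
    "thermal EMF at connections": 40.0,
    "lead and setup errors": 60.0,
    "loading and settling": 50.0,
}

def combined_rss(contributors):
    """Combine independent standard uncertainties by root-sum-square."""
    return math.sqrt(sum(u ** 2 for u in contributors.values()))

real_budget = {**paper_budget, **process_terms}
print(f"On paper:   {combined_rss(paper_budget):.1f} uV")
print(f"In reality: {combined_rss(real_budget):.1f} uV")

With those invented numbers the paper budget comes out around 10 uV and the realistic one closer to 90 uV. That is the kind of gap I'm talking about.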

Kalrock

Well then, how is anyone supposed to know what kind of uncertainty they have? I'm just saying, if you think there is a problem, what is your solution? Because otherwise it just sounds like complaining. Unless you want some answers from the community. In that case, the only thing that comes immediately to mind is standardizing the industry, and that could mean everyone being required to have an ASQ cert. I mean, hundreds of other industries require some kind of standardized test to check for competence, so that could be ours.

Duckbutta

#20
Google "Type A Uncertainties" and then get back to me.
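
Since you'll probably skip the homework, here's the short version: a Type A evaluation is just statistics on repeated observations of the same point, usually the experimental standard deviation of the mean. A rough sketch, with invented readings:

Code:
import statistics

# Hypothetical repeated readings of the same 10 V point, in volts.
readings = [9.99987, 9.99991, 9.99989, 9.99993, 9.99988, 9.99990]

n = len(readings)
mean = statistics.mean(readings)
s = statistics.stdev(readings)      # experimental standard deviation
u_type_a = s / n ** 0.5             # standard uncertainty of the mean

print(f"mean = {mean:.5f} V, u(Type A) = {u_type_a * 1e6:.2f} uV")

A sloppy process shows up directly in that scatter, which is exactly my point.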

Kalrock

Look, I know how to do uncertainties, but that still doesn't factor out the human error element that you keep talking about. I mean, uncertainties are really just statistics, and statistics can be made to say whatever you want. I thought that was one of the reasons behind your post.

Wilk

Quote from: Kalrock
Look, I know how to do uncertainties, but that still doesn't factor out the human error element that you keep talking about. I mean, uncertainties are really just statistics, and statistics can be made to say whatever you want. I thought that was one of the reasons behind your post.

Kalrock,
I was actually going to try to help you out until I saw that post. It pretty much proves you don't understand uncertainties, or the reasons we use them.

Human error will always be a factor in measurement uncertainty. It will always be a portion of all Type A testing. And it will always be tested in PTs, which are a requirement for all 17025 labs. Also, when you get away from calibrating 77s with 5700s, human error can be one of the largest contributions to uncertainty.

As far as statistics go, try making them say whatever you want and getting that by a 17025 assessor. See, the magic behind real statistics is that they are only as good as the data that goes into them and the guy doing the math. Crap in, crap out. With real data, I don't care what statistical method you choose to use, you will come up with answers that are very close to each other, or you did the math wrong. Trust me, there are no real statistical methods out there that will give you answers way out in left field if they are done correctly.

As far as different companies go, compare the scopes. They are all pretty much on par. BMCs and quality of work are two quite different things, though. And remember, contract review is a 50/50 responsibility between the vendor and the client. Did you ask for, and pay for, what you wanted? Or did you just blindly send something in for cal?

flew-da-coup

Quote from: Wilk on 01-24-2009 -- 15:12:45

Spot on.
You shall do no injustice in judgment, in measurement of length, weight, or volume. -- Leviticus 19:35

Kalrock

Quote from: Wilk on 01-24-2009 -- 15:12:45

Kalrock,
I was actually going to try to help you out until I saw that post. It pretty much proves you don't understand uncertainties, or the reasons we use them.

We use TURs instead of TARs because they provide us and our customers with a more reliable and accurate understanding of our readings. This isn't just for them, but for us as well, because if you don't know that your standard is any better than your UUT, then you're practically just doing a lick-and-stick job. Some customers, like some I&E guys, don't give a rat's ass about TURs, but others, like engineers, need to know with confidence that their readings are spot on. If you think differently or have a better way of saying it, go ahead and let me know. I'm only here to learn.
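
For anyone not up on the acronyms, here's a rough sketch of the difference as I understand it (the numbers are invented, and exact definitions of TUR vary a bit from document to document): a TAR compares the UUT tolerance to the standard's accuracy spec alone, while a TUR compares it to the expanded uncertainty of the whole measurement process.

Code:
# Illustrative-only numbers for a single DC voltage test point.
uut_tolerance = 500e-6         # UUT allowed error, +/- 500 uV
standard_spec = 50e-6          # accuracy spec of the standard, +/- 50 uV
expanded_uncertainty = 180e-6  # k=2 uncertainty of the full process, +/- 180 uV

tar = uut_tolerance / standard_spec          # ignores the process
tur = uut_tolerance / expanded_uncertainty   # includes the process

print(f"TAR = {tar:.1f}:1")    # looks like a comfortable 10:1
print(f"TUR = {tur:.1f}:1")    # reality is closer to 2.8:1

Same test point, very different picture of how good the measurement really is.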

Quote from: Wilk on 01-24-2009 -- 15:12:45

Human error will always be a factor in measurement uncertainty. It will always be a portion of all Type A testing. And it will always be tested in PTs, which are a requirement for all 17025 labs. Also, when you get away from calibrating 77s with 5700s, human error can be one of the largest contributions to uncertainty.

I do understand uncertainties and why they are used. I was trying to understand Duckbutta's point. I felt that his real gripe was with an individual lab that was run poorly.

Quote from: Wilk on 01-24-2009 -- 15:12:45
As far as statistics go, try making them say whatever you want and getting that by a 17025 assessor. See, the magic behind real statistics is that they are only as good as the data that goes into them and the guy doing the math. Crap in, crap out. With real data, I don't care what statistical method you choose to use, you will come up with answers that are very close to each other, or you did the math wrong. Trust me, there are no real statistical methods out there that will give you answers way out in left field if they are done correctly.

As far as different companies go, compare the scopes. They are all pretty much on par. BMCs and quality of work are two quite different things, though. And remember, contract review is a 50/50 responsibility between the vendor and the client. Did you ask for, and pay for, what you wanted? Or did you just blindly send something in for cal?

I think you're right. I know that you can change your k factor to give the appearance of a better TUR, and I don't think it's technically wrong as long as you state it, but like you said, it's a 50/50 responsibility. Some customers want a sticker and some want their equipment calibrated.
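
Just to show what I mean about the k factor (numbers invented again): the expanded uncertainty is U = k * u_c, so quoting k=1 instead of k=2 halves the reported U and doubles the apparent TUR, which is why it only stays honest if the k is stated on the cert.

Code:
# Illustrative-only combined standard uncertainty for one test point.
u_c = 90e-6             # combined standard uncertainty, volts
uut_tolerance = 500e-6  # UUT tolerance, volts

for k in (1, 2, 3):
    expanded = k * u_c                 # U = k * u_c
    tur = uut_tolerance / expanded
    print(f"k={k}: U = {expanded * 1e6:.0f} uV, apparent TUR = {tur:.1f}:1")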

Anyway, I need to state that I don't have any problem with uncertainties and their uses. I was just trying to understand Duckbutta's problem. If I'm wrong about something, that's fine; nobody knows everything, and I sure as hell wouldn't claim to.


Wilk

flew-da-coup,
Thanks, that was really the best way I could find to sum that up.

Kalrock,

If you were just playing devil's advocate with Duckbutta, then cool. I wasn't trying to start some childish flame war or anything. It just seemed like the conversation was getting a little off into left field from the truth. If you are looking to learn, Google is free and all the information is there. Most ISO documents have free look-alikes out there with the info needed to learn about this. Past that, look for documents, not opinions. Opinions are everywhere, and most of them in this industry are biased for one reason or another. I try to stay unbiased, but it's tough, so I just stay away from arguments that aren't supported by documentation.

The NIST, NCSLI, and ILAC websites are probably some of the best sources for free info, IMHO. Specifically ILAC, as they are the ones who set the rules for the 17025 accrediting bodies.

skolito

Just out of morbid curiosity, how many of your customers actually use your readings and uncertainties in their process?

I would say 5%, if that. Most of the calibrations I do are for ISO 9000 requirements, and the customer doesn't care what the reading is; they want to know whether it was in or out. But we are a working lab, not a standards lab. A standards lab is another thing in itself, and I want the uncertainties from my standards spelled out in front of me when I calculate mine.

As far as a Fluke 87 goes, the 5500/5520 cannot do the capacitance at a TUR of 4:1, so we just use standard caps; that's what we do. Uncertainties have their place and we use them as such, but if a customer sends in a 1.5" 300 psi 5% pressure gauge that cost him $15 and wants it done to 17025 with uncertainties, what is the freaking point, other than $100-$150 for my company? It's useless for them because it sits on a production line and isn't even looked at until I cal it the next year. It's their auditor, or my biggest headache, "UL", making him have it done.

Wilk

That is going to vary greatly from lab to lab and customer to customer. Uncertainties on 4% devices don't do much other than help establish traceability. On better equipment, though, proper use of uncertainties can help you realize that certain equipment can be used at accuracy levels well within the stated specifications of a device. Most reference standards are a great example of this. A nice reference shunt, for example, could carry a specification of 0.1%. When it is measured at the right lab, and by using the appropriate measured values and uncertainties, you can easily make measurements well within 0.05%.

Your calibration of an 87 with a 5520 is a great example of this also. If your 5520 shows, every time you use it or have it calibrated, that the capacitance value is repeatable within half of its spec, and the calibration lab you use has decent measurement uncertainties, then you very well could be 4:1. With stock specs you are about 3.5:1, so if the 5520 is testing within 75% of the spec, you could be good to go. This is the value-add of uncertainties to a calibration lab. Correct use of uncertainty information can prove that your standards can be used at dramatically more accurate levels than the maximum allowable specifications provided by the MFR. It can also prove that the MFR specifications are way too tight for that instrument, unfortunately.
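
Put another way, with some invented numbers that roughly follow that 3.5:1 example: if your cal history shows the standard holding well inside its published spec, and you account for the uncertainty of the lab that demonstrated that, the limit you can defend is tighter than the stock spec and the ratio comes back over 4:1. This is a simplified, conservative treatment, just to show the shape of the argument.

Code:
# Illustrative-only numbers, loosely following the 5520 capacitance discussion.
uut_tolerance = 1.0              # UUT allowed error at this point (arbitrary units)
stock_spec = 0.286               # published spec -> about 3.5:1 as-is
demonstrated = 0.5 * stock_spec  # history shows the standard holds half its spec
cal_lab_uncertainty = 0.05       # uncertainty of the lab calibrating the standard

# Effective limit = demonstrated performance plus the uncertainty of the
# measurements that demonstrated it (simple and conservative).
effective_limit = demonstrated + cal_lab_uncertainty

print(f"TUR with stock specs:        {uut_tolerance / stock_spec:.1f}:1")
print(f"TUR with demonstrated limit: {uut_tolerance / effective_limit:.1f}:1")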

Proper use of uncertainties is becoming more and more necessary to meet the calibration requirements of end-user equipment, because the manufacturers are now putting what used to be metrology-grade specs in the end users' hands.

I can't really put a percentage on how much client equipment actually gets used with the uncertainties, but I would bet big money it is way above 5%. A lot of calibrations produce correction factors as data rather than an in- or out-of-tolerance call: accelerometers, microphones, load cells, pressure transducers, and so on. It's a pretty big list. Anyway, all of those types of equipment must be used with their calibration factors and their uncertainties to be utilized correctly.
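
As a quick sketch of what "used with their calibration factors and uncertainties" looks like in practice (the sensitivity, reading, and uncertainties below are all invented):

Code:
import math

# Hypothetical load cell cert data: a sensitivity factor with its k=2
# uncertainty, plus one raw indicated reading with its own k=2 uncertainty.
sensitivity = 2.0132       # mV/V per kN, from the cal cert (invented)
u_sensitivity = 0.0008     # k=2 uncertainty of the sensitivity
indicated = 1.0066         # instrument output, mV/V (invented)
u_indicated = 0.0004       # k=2 uncertainty of the readout

force = indicated / sensitivity   # corrected result, kN

# For a simple quotient the relative uncertainties combine root-sum-square.
u_rel = math.sqrt((u_sensitivity / sensitivity) ** 2 +
                  (u_indicated / indicated) ** 2)

print(f"force = {force:.4f} kN +/- {force * u_rel:.4f} kN (k=2)")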

CalLabSolutions

The industry is changing...

We can no longer live by 4 to 1. We are quickly approaching an accuracy floor, because field test instruments are becoming more and more accurate.

We just did a procedure for a Yokogawa WT3000 Power Analyzer. Our customer wanted to test this unit (as best they can) using the standards they had in their lab (5720, 5520, 5790, etc.). We had to drop some test points, like 1 A at 1 MHz, and we were only able to get 2 to 1 on some points.

We also did a procedure for the Agilent N9340A/B Spectrum Analyzer. The Scale Fidelity test on this handheld unit rivals the ESA's specifications. We had to test it just like we test the PSA's Scale Fidelity.

I think these instruments are a good indication of the future. Field test equipment will be almost as accurate as our lab standards, and all we will be able to do is test it as best we can, state our uncertainties, and move on.

Mike....


Michael L. Schwartz
Automation Engineer
Cal Lab Solutions
  Web -  http://www.callabsolutions.com
Phone - 303.317.6670