Voltage measurement and sourcing

Started by Xrayvissle, 04-26-2010 -- 15:29:02

Previous topic - Next topic

Xrayvissle

Just had an interesting subject come up at work and would love some feedback on it.  It involves using a calibrator or any test equipment that sources a voltage, both AC and DC.  The subject was line frequency distortion, or something resembling that, and how it affects AC voltage measurement and sourcing.  If anyone can point me in the direction of white papers, proofs, tech reference material, anything would be a help.  I've come to the understanding that the only time line frequency would become an issue with measurement is when testing power (3-phase) or when calculating power factor, not when testing a handheld DMM.  Anyone have any answers?

Bryan

Not really what you are looking for, but the Agilent test procedure for the 8360B/L series sweepers has a test for line-related spurs: it uses a spectrum analyzer to look at 50 or 60 Hz offsets from the carrier, and the test calls for running the spectrum analyzer off an inverter at 55 Hz.  I suppose it would check plumbing integrity or components breaking down in the power supply.  Have not seen a bad one, so have never diagnosed.  Seems like I have seen similar with other sources once upon a time.
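The point of the 55 Hz inverter, as I understand it, is that line-related spurs sit at multiples of the supply frequency offset from the carrier, so putting the analyzer on a different supply frequency than the DUT lets you tell whose power supply a spur came from. A minimal sketch of that idea (the function name is my own, not from any Agilent procedure):

```python
def line_spur_offsets(line_hz, n_harmonics=3):
    """Offsets (Hz) from the carrier where line-related spurs would appear."""
    return [n * line_hz for n in range(1, n_harmonics + 1)]

# DUT on 60 Hz mains: its power-supply spurs land at 60, 120, 180 Hz offsets.
dut_spurs = line_spur_offsets(60)
# Analyzer on the 55 Hz inverter: its own spurs land at 55, 110, 165 Hz,
# so the two families can't be confused with each other on screen.
analyzer_spurs = line_spur_offsets(55)
```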

Xrayvissle

Quote from: Bryan
Not really what you are looking for, but the Agilent test procedure for the 8360B/L series sweepers has a test for line-related spurs: it uses a spectrum analyzer to look at 50 or 60 Hz offsets from the carrier, and the test calls for running the spectrum analyzer off an inverter at 55 Hz.  I suppose it would check plumbing integrity or components breaking down in the power supply.  Have not seen a bad one, so have never diagnosed.  Seems like I have seen similar with other sources once upon a time.

Kinda in the ballpark of what I'm looking for.  In actuality, a tech at my shop stated that the calibrator we use has degraded accuracy when testing instruments at 60 Hz AC.  Says the mfr recommends not even using the 60 Hz frequency because of the possibility of line jitter.  B.S. in my book.  If that's the case, every electrical parameter ever measured by any calibrator at 60 Hz would be ridiculously inaccurate due to line jitter.  Can anyone else confirm whether this is true or false?  (Calibrators in question are Fluke 55XXs.)