Wrong! Wrong! Wrong!

Started by CalLabSolutions, 04-14-2016 -- 17:24:41


CalLabSolutions

I just had an epiphany, and realized we have been doing automation WRONG for 30 years!

We have been trying to write standard UUT scripts with flexible standards, when we should have been writing standardized test processes with flexible UUT drivers.  Think about it: standardize the test process, standardize the uncertainty calculations, and metrology becomes easier!  And so does automation!

I was stumped trying to explain the advantages of automation using Metrology.NET to a customer of mine when I realized that, in his mind, what we did was backwards.  But in reality, the whole industry has been doing it backwards for 30 years.

Mike. 
Michael L. Schwartz
Automation Engineer
Cal Lab Solutions
  Web -  http://www.callabsolutions.com
Phone - 303.317.6670

N79

Care to expand on this, maybe with an example? I'm not quite sure what you're describing.

griff61

You would then be required to have flexible scripts at both the UUT and the standard ends of the process. Which is basically what we have now, in some of the more useful automated procedures, no?
Sarcasm - Just one more service I offer

CalLabSolutions

#3
I could see that working, griff.. But for me, the standards shouldn't be the thing we interchange..   Mostly because accredited labs put so much effort into their calibration test processes and uncertainty calculations..

Compared to the standards, the UUT's contribution to the uncertainty is just resolution and repeatability..  But the standard and test process contain specs, cal traceability, resolution, gauge R&R, confidence levels, coefficients, etc., etc., etc...

Instead of building a driver on the fly for the standard...
[@std]FREQ <Value>HZ
[@std]VOLTS <Value>V

We should be building the UUT driver on the fly:
[@DUT]FUNC VAC
[@DUT]RANGE <Value>
[@DUT]MEAS[I$]
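A rough Python sketch of that idea, assuming a driver interface of my own invention (the class names, SCPI strings, and `send`/`query` callbacks are illustrative, not Metrology.NET's actual API): the test process only ever calls a fixed trio of methods, and each UUT maps them onto its own commands.

```python
class UutDriver:
    """Standardized interface the test process programs against."""
    def reset(self):
        raise NotImplementedError
    def setup(self, test_point):
        raise NotImplementedError
    def measure(self):
        raise NotImplementedError


class DmmAcVoltsDriver(UutDriver):
    """Maps the generic calls onto a DMM's commands (e.g. a 34401-style meter)."""
    def __init__(self, send, query):
        self.send = send      # callable that writes a command to the UUT
        self.query = query    # callable that writes a query and returns the reply

    def reset(self):
        self.send("*RST")

    def setup(self, test_point):
        self.send("FUNC 'VOLT:AC'")                       # [@DUT]FUNC VAC
        self.send(f"VOLT:AC:RANG {test_point['range']}")  # [@DUT]RANGE <Value>

    def measure(self):
        return float(self.query("READ?"))                 # [@DUT]MEAS
```

In practice `send`/`query` would wrap a VISA session; keeping them as plain callables is just a way to keep the driver testable without hardware.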

But the problem is no system in the past has been built this way... 
When I started building http://www.metrology.net, this specific feature was not my goal.  But now that I'm writing automated calibration procedures in it... I realize it's 180 degrees different from how I have written all my automation in the past.. 
And yes.. I have to get some good examples to show people.

Mike     

CalLabSolutions

#4
Instead of writing a 34401 procedure with a DC Volts section..
Let's write a Source.Volts.DC test process with our 5720.. with only three calls to an unknown UUT (Reset, Setup and Measure). Note this is how Metrology.NET does it. 

Now the test process for the 5720 is called with a simple RUN function..  It first resets everything, then for each test point it calls the UUT's Setup function, passing the test point data..  It then reads the test point data for the DC Volts value and impedance, sets up the 5720 and turns the output on.  Once stabilized, it calls the UUT's Measure function.

Now we want to do the same thing for a TDS3000 scope.  The only thing we need to write is the UUT's Reset, Setup and Measure functions..  Inside the test point are all the UUT's required settings that match its UUT driver.  Only now, instead of range and function for the DMM.. we have Scale, Offset, Position, Impedance, Timebase and Trigger.

But nothing on the 5720 Process / Unc side was changed.
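The loop described above can be sketched in Python. This is a hedged stand-in, not Metrology.NET code: the standard's interface (`source`, `wait_settled`, `output_off`) and the stub classes are assumptions made so the process function is runnable on its own.

```python
class StubStandard:
    """Stand-in for the 5720 side of the process."""
    def source(self, volts, impedance):
        self.output = volts           # pretend to program the calibrator
    def wait_settled(self):
        pass                          # real code would wait for stabilization
    def output_off(self):
        self.output = 0.0


class StubUut:
    """Stand-in for any UUT driver exposing the three standard calls."""
    def __init__(self, standard):
        self.standard = standard
    def reset(self):
        pass
    def setup(self, test_point):
        pass
    def measure(self):
        return self.standard.output   # pretend the UUT reads back perfectly


def run_volts_dc_process(standard, uut, test_points):
    """The Source.Volts.DC process: the part that never changes per UUT."""
    results = []
    uut.reset()
    for tp in test_points:
        uut.setup(tp)                                  # hand test-point data to the UUT
        standard.source(tp["volts"], tp["impedance"])  # program the 5720
        standard.wait_settled()
        results.append(uut.measure())
    standard.output_off()
    return results
```

Swapping the DMM for the TDS3000 means swapping only the `StubUut` class; `run_volts_dc_process` and everything on the standard/uncertainty side is untouched.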

N79

I too have a lot of experience writing calibration tools; in fact, I'm working on an ambitious project right now that automates the engineering of measurement systems. Maybe it's similar to what you are trying to accomplish with metrology.net (at least I think... to be honest, and please take this as constructive criticism, it's hard to tell exactly what metrology.net is trying to accomplish by looking at the website; it's so generalized that it doesn't seem to be saying anything).

If I'm interpreting what you are saying in the OP correctly, I agree. I think it's possible to automate (on the fly!) the creation of the actual measurement procedure using a pool of physical and metrological principles. Basically, use the speed of the computer to generate a procedure to calibrate or verify the specs of a UUT that finds a compromise between the familiarity/ease/quickness of the calibration and the accuracy of the method. Once detailed specs of all available instruments, artifacts, cables, adaptors, etc., are known, the computer should be able to spit out what can measure what and which technique that measurement should use. If it also knows something about the commands for the remote instruments, it should be able to generate instructions for the user and assist where it can in performing the calibration. The computer should be able to determine the optimum method with the available instruments, just as a capable engineer would.

At least, this is what I'm trying to prove with my software. Is this anything like you were thinking? Or am I way off base?

CalLabSolutions

Yes.. We are now working on updating the Metrology.NET Website to make things easier to understand.

When we started the project, the goals were:
1) Collect Data so we can use it! 
2) Be Database, Language and Platform agnostic.
3) Scalable..
4) Better Uncertainty Calculations.
5) Better Flexibility of test equipment.. 

What is on the site now is information from our Proof of Concept.  Now that we have working copies deployed we need to up the documentation.

Mike

CalLabSolutions

I know what I am doing with Metrology.NET has never been done before, so it is hard to explain.  One of the engineers using it calls it indeterminate programming, because when you are writing your lab standard drivers you don't know all the UUTs they will be testing, and when you are writing your UUT driver you don't know what standards will be used.  So when you are developing the code, it is indeterminate how the end-to-end calibration will be performed. 
That is, until the Metrology Engineer or Technician configures the specific UUT to be tested on a specific station using a specific set of standards.

But talking to my developers, asking them how I can explain this to people: it is really a layered architecture that divides the workload of metrology software into several pieces that can be run on any operating system and written in any language.  At the heart of it all is the messaging system between the layers.  This is what allows Metrology.NET to perform the data collection on one computer system and calculate the measurement uncertainties on a completely different system written in a completely different programming language.
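To make the messaging idea concrete, here is a hedged sketch of what a message between the data-collection layer and a separate uncertainty engine might look like. The JSON schema and field names are my own invention for illustration, not Metrology.NET's actual format; the uncertainty side computes a standard Type A component from repeated samples.

```python
import json

# Message the data-collection layer hands off (schema is an assumption).
reading = {
    "test_point": {"function": "VDC", "nominal": 10.0, "unit": "V"},
    "samples": [9.99987, 10.00002, 9.99995],
    "standard": {"model": "5720A", "spec_ppm": 3.5},
}
payload = json.dumps(reading)   # what actually crosses the wire

# The uncertainty engine, possibly on another machine in another language,
# parses the message and evaluates a Type A uncertainty of the mean.
msg = json.loads(payload)
samples = msg["samples"]
n = len(samples)
mean = sum(samples) / n
variance = sum((x - mean) ** 2 for x in samples) / (n - 1)   # sample variance
std_uncertainty = (variance / n) ** 0.5                      # std dev of the mean
```

Because the contract is just the message format, either side can be rewritten in a different language without touching the other, which is the point being made above.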

It's a tool that allows an engineer to focus on writing the best solution in a specific software layer, and other engineers to focus their expertise in different areas.  The system allows you to use each other's tools when it makes sense.  Metrology.NET adds ease of integration.

Mike 

scottbp

So instead of writing separate procedures for each individual UUT (e.g. 500 or so procedures for handheld meters that can be calibrated by a particular calibrator), you write one procedure for the calibrator, then all you have to do is supply a list of test points (with acceptance limits) for each UUT, and the procedure steps through the list and takes care of the rest, right?
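One way to read that summary in code: the procedure is fixed, and each UUT is just data, a list of test points with acceptance limits. The values here are invented for illustration.

```python
# Per-UUT data: test points with acceptance limits (numbers are made up).
test_points = [
    {"nominal": 1.0,  "low": 0.997, "high": 1.003},
    {"nominal": 10.0, "low": 9.970, "high": 10.030},
]

def in_tolerance(reading, tp):
    """Pass/fail against a test point's acceptance limits."""
    return tp["low"] <= reading <= tp["high"]

# The one shared procedure steps through the list and judges each reading.
readings = [1.001, 10.05]
verdicts = [in_tolerance(r, tp) for r, tp in zip(readings, test_points)]
```

Adding the 501st handheld meter then means adding another test-point list, not another procedure.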
Kirk: "Scotty you're confined to quarters." Scotty: "Thank you, Captain! Now I have a chance to catch up on my technical journals!"

CalibratorJ

This sounds oddly familiar......


N79

I think a higher-level LabView-type platform specifically geared towards the calibration world would be a good idea. You could have a huge library of different components (instruments, artifacts, cables, adaptors, etc.) that the user could drag & drop and connect together, draw a lasso around the collection of connected components considered as the "standard", and the software would calculate the accuracy of the ad hoc standard and the TAR of the entire setup. A play button would perform the verification of the test unit, and all the uncertainties would be calculated at the time of measurement.

Taking this a step further (which is what I'm attempting to do now) is to have the software's A.I. determine which available standards (and combinations of standards) would be best to verify a function of a test unit, using a score based on a trade-off between accuracy and efficiency (i.e., if you have a test-unit voltmeter, you would probably prefer to use a sufficient 5720 rather than a 732/divider as a source, due to efficiency). So the software actually creates a virtual measurement system based on built-in metrological and physical principles and can come up with rather unique ways to perform a measurement, with the majority of solutions resembling a Rube Goldberg machine, but it is also smart enough to determine the most efficient and sufficient method to verify the entire instrument. Some of the algorithms are based on neural networks, but constrained by someone (me) deciding how efficient certain measurement methods are.
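A toy version of that accuracy-vs-efficiency trade-off might look like the following. The weights, the 10:1 cap, and the candidate numbers are all assumptions I'm making for illustration, not N79's actual scoring algorithm.

```python
# Candidate setups for sourcing DC volts to a test-unit voltmeter
# (TAR and setup-time figures are invented for the example).
candidates = [
    {"name": "5720A direct",  "tar": 20.0, "setup_minutes": 5},
    {"name": "732B + divider", "tar": 50.0, "setup_minutes": 45},
]

def score(c, accuracy_weight=1.0, time_weight=0.5):
    # Reward TAR up to a 10:1 cap (extra accuracy past that buys little),
    # and penalize setup time, so "sufficient and fast" beats "overkill and slow".
    return accuracy_weight * min(c["tar"], 10.0) - time_weight * (c["setup_minutes"] / 10.0)

best = max(candidates, key=score)
```

Under these weights the 5720 wins even though the 732/divider chain is more accurate, which matches the intuition in the post: past "sufficient", efficiency should decide.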