Part 2 (You're there)
So your Agilent 34461A is sitting on your desk, all set up and running. What now?
Well, now you use it.
Let's go through a few of the most common ways that you might use your new multimeter.
Each of the six white buttons has one or two easy-to-access functions. Secondary functions, if present, are shown here in square brackets [Like this]. They can be accessed by pressing the blue oval Shift button, then the appropriate white button.
Left to right we have:
- Top Row:
  - DCV/[DCI] (DC Voltage and Current)
  - ACV/[ACI] (AC Voltage and Current)
  - Ω2W/[Ω4W] (2- and 4-Wire Resistance)
- Second Row:
  - Freq (Frequency)
  - Cont/[Diode] (Continuity and Diode check)
  - Temp (Temperature)
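As an aside, each of these front-panel functions also has a remote-control counterpart. As a rough sketch, here's a Python mapping to the SCPI `MEASure?` queries the 34461A understands (the commented VISA resource address is a placeholder, not a real instrument):

```python
# Front-panel functions of the 34461A mapped to their SCPI MEASure? queries.
# Sending one of these remotely configures the meter and returns a reading.
SCPI_QUERIES = {
    "DCV":   "MEAS:VOLT:DC?",
    "DCI":   "MEAS:CURR:DC?",
    "ACV":   "MEAS:VOLT:AC?",
    "ACI":   "MEAS:CURR:AC?",
    "2W":    "MEAS:RES?",    # 2-wire resistance
    "4W":    "MEAS:FRES?",   # 4-wire resistance
    "Freq":  "MEAS:FREQ?",
    "Cont":  "MEAS:CONT?",
    "Diode": "MEAS:DIOD?",
    "Temp":  "MEAS:TEMP?",
}

# With pyvisa installed and the meter connected, a reading would look like
# (the resource string below is a made-up example — use your own):
#   import pyvisa
#   dmm = pyvisa.ResourceManager().open_resource("USB0::0x2A8D::0x1301::MY00000000::INSTR")
#   volts = float(dmm.query(SCPI_QUERIES["DCV"]))
```

More on driving the meter remotely in the next blog post.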
I mentioned in the previous blog that I checked the multimeter was working by measuring between the +5V and GND pins on a USB cable attached to the computer. That was done using the DCV setting.
Next to me I have a circuit used to create an adjustable servomotor control signal.
Let's see how easy it is to check how well the multimeter works with circuits.
First we'll yank out the chip and have a look at the resistor values only (since the 61A can't measure capacitors directly).
The table will list the colour bands on the resistors, what they mean, and then show the value measured with the resistor in place.
I'll name the resistors from left to right. The colour bands were:

| Colour bands (left to right) |
| --- |
| P Gre Bla R |
| Or Or Bla R |
| Br Bla Bla R |
| Br Bla Bla R |
| Br Or Bla R |
| Stated on Pot |
Note 1: the last three resistors are connected in a more complex way and so have been ignored here, as I don't really want to pull the circuit apart.
All resistors are ±1% and so have measured values that are within the specified range.
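The colour-band arithmetic is easy to script, incidentally. A minimal sketch of the standard 4-band code and a ±1% tolerance check — the band names and example values here are my own illustration, not the resistors from the table above:

```python
# Standard resistor colour code: first two bands are digits,
# the third is the power-of-ten multiplier.
DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}

def nominal_ohms(band1, band2, multiplier):
    """Decode three colour bands into a nominal resistance in ohms."""
    return (10 * DIGITS[band1] + DIGITS[band2]) * 10 ** DIGITS[multiplier]

def within_tolerance(measured, nominal, tol=0.01):
    """Check a measured value against nominal +/- tolerance (1% here)."""
    return abs(measured - nominal) <= tol * nominal

# brown-black-red -> 10 x 10^2 = 1000 ohms
print(nominal_ohms("brown", "black", "red"))   # 1000
print(within_tolerance(1004.0, 1000.0))        # True: within +/-1%
print(within_tolerance(1020.0, 1000.0))        # False: 2% high
```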
After measuring a few simple things around the desk (a USB battery charger and some R/C aircraft connections), I thought I'd try something whose values I'm more likely to know accurately.
I grabbed a National Instruments NI USB-6008. As well as being a DAQ device, this also has a 12-bit, 0-5V analogue output. Being 12-bit, we have 4096 possible values across that 5V range, which gives an output resolution of approximately 1.22mV.
To keep the numbers nice for the testing, I instead picked a value of 1.25mV as the voltage resolution.
Using LabVIEW, I created a simple VI which slowly incremented the voltage from 0V to 5V in 1.25mV steps. After each step, LabVIEW read the Agilent 34461A (see the next blog post for how to do this) 20 times. This tested several things at once: the output of the NI-6008, the basic voltage reading of the 61A, extended run times, and the USB communication of the 61A.
These readings as well as an average were then exported as a CSV file.
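The actual implementation is a LabVIEW VI, but the loop is simple enough to sketch in Python. Here `set_voltage()` and `read_dmm()` are hypothetical stand-ins for the real DAQ-output and DMM-query calls (nidaqmx and pyvisa in practice):

```python
import csv

STEP_V = 0.00125              # 1.25 mV steps
N_STEPS = int(5.0 / STEP_V)   # 4000 steps from 0 V up to 5 V
READS_PER_STEP = 20

def set_voltage(v):
    """Placeholder for the NI-6008 analogue-output write."""
    pass

def read_dmm():
    """Placeholder for querying the 34461A; returns a dummy reading here."""
    return 0.0

def run_sweep(path="sweep.csv"):
    """Step the output up in 1.25 mV increments, logging 20 reads + mean per step."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["setpoint_V"]
                   + [f"read_{i}" for i in range(READS_PER_STEP)]
                   + ["mean_V"])
        for n in range(N_STEPS + 1):          # include the 5 V endpoint
            setpoint = n * STEP_V
            set_voltage(setpoint)
            reads = [read_dmm() for _ in range(READS_PER_STEP)]
            w.writerow([setpoint] + reads + [sum(reads) / len(reads)])
```

Each row of the CSV then holds a setpoint, its 20 raw readings, and their average, matching the export described above.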
Importing this into Excel allowed me to plot the data to see how it looked.
It's easily observable that the maximum "error" was about 6mV. Observing the specifications for the 6008 device, we can see the following "Absolute Accuracy (no load)" values:
*(Table: "Absolute Accuracy (no load)", maximum at full scale, from the NI 6008 specifications.)*
Given that, it seems like the values read from the 61A match what was expected quite well (and at least shows that the device appears to work).
I didn't increase and then decrease the voltage to check hysteresis, but it'd be simple enough to do if required for your purpose.
The actual test took around 10 hours to run (so I left it overnight). When I returned, the voltage sweep had finished, and the 34461A had gone into a "screen dim" mode, which is pretty cool. The NI-6008 was still flashing away happily.
I did a similar test with a 16-bit device instead (a PCI-6221). Although this device can output ±10V, I decided to keep the same 0V-5V range and the same voltage set points.
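For context, the theoretical step sizes (one LSB across each device's full output range) are straightforward range/2^bits arithmetic:

```python
# One LSB = full-scale output range divided by the number of codes.
lsb_6008 = 5.0 / 2**12    # 12-bit over 0-5 V   -> ~1.22 mV
lsb_6221 = 20.0 / 2**16   # 16-bit over +/-10 V -> ~0.305 mV

print(f"NI-6008 LSB:  {lsb_6008 * 1000:.3f} mV")   # 1.221 mV
print(f"PCI-6221 LSB: {lsb_6221 * 1000:.3f} mV")   # 0.305 mV
```

Resolution alone doesn't determine accuracy (that's set by the devices' absolute-accuracy specs), but it gives a feel for the granularity each DAC can manage.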
Due to a slip-up on my part, I tested in 1.5mV steps instead of the 1.25mV steps used with the 6008. The run was still useful despite having only about 83% of the number of data points, and I wasn't going to redo another overnight test and all the coordination that it involved.
Put simply, the error dropped by about a factor of 10, with all errors less than 1mV (ranging from -0.1mV to ~+0.6mV). This does show it is a more accurate device than the 6008.
The first image below shows both errors on the same scale; the second shows each with its own scale.
One interesting thing to note is that both devices seemed to follow the same basic up-down trend (apart from around the 4V region). I'm guessing this may be temperature related, but I can't be sure without testing (and I can't be bothered doing that).
Overall, the 34461A works both as an everyday testing device and as one suitable for remote monitoring.