This chapter covers a number of instrument performance tests that I conducted to ascertain the capability of the instrument and verify its performance against the datasheet specifications. This is perhaps the most exciting part of the review, not least because it was accomplished through a mixture of WaveForms, the WaveForms SDK and pyvisa, together with a bunch of BNC/banana adapters and the Keithley 2450 SMU from a prior RoadTest as the most accurate instrument I have for generating and measuring voltages. Standby power measurements were made with the Tektronix PA1000 Power Analyzer, also from a prior RoadTest. Note that no user calibration was undertaken – the values were recorded using the factory calibration of the ADP3450.
Oscilloscope Input Accuracy
The first test concerns the accuracy of the oscilloscope input channels. These input channels are the heart of the measurement system, powering the Scope, Logger, Spectrum, Network, Impedance and Tracer modules. As a result, their accuracy is very important, and this seems to be an area where the Analog Discovery Pro diverges from the industry norm of ever-faster sample rates, instead preferring a higher 14-bit resolution at a lower sample rate, possibly to ease data processing demands.
To test the accuracy of the oscilloscope inputs, all four inputs were wired together using four BNC T-connectors. One end of the chain was connected via a BNC to banana socket adapter to the Keithley 2450 SMU, which generated the voltages to be measured. The other end was connected through another BNC to banana socket adapter to the sense inputs on the SMU. While the current is low enough that voltage drop is unlikely to be an issue, this four-wire arrangement guarantees accuracy and ensures the chain is unbroken – if it were, the SMU would report a sense error.
Using the WaveForms SDK, 32,768 samples were recorded from each channel at every test point. The average of the 32,768 samples was used as the representative value; the peak-to-peak amplitude within the capture was also recorded as one way to understand the noise level of the channel (and of the SMU, although its contribution is likely small). The data was analysed by taking the difference between the measured value and the read-back value from the SMU, which offers the highest accuracy I have available (roughly equivalent to a six-digit DMM). This was performed on both the 1V and 25V ranges offered by the ADP3450, although testing only went up to 20V due to the SMU's ranges and accuracy considerations.
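The reduction from raw captures to the plotted error and noise figures can be sketched in a few lines of Python/NumPy. This is a minimal sketch of the analysis step only – the function name and the synthetic capture below are my own stand-ins, not the actual script or data:

```python
import numpy as np

def summarise_setpoint(samples, smu_readback):
    """Reduce one capture (e.g. 32,768 samples) to the statistics used in
    this test: the mean as the representative reading, its error against
    the SMU read-back value, and the peak-to-peak noise amplitude."""
    samples = np.asarray(samples, dtype=float)
    mean = float(samples.mean())
    return {
        "mean": mean,
        "error": mean - smu_readback,                     # measured minus read-back
        "noise_pp": float(samples.max() - samples.min()), # peak-to-peak noise
    }

# Synthetic stand-in for a real capture: a 0.5V setpoint reading 2mV high
# with roughly 4mV of peak-to-peak uniform noise.
rng = np.random.default_rng(0)
capture = 0.502 + rng.uniform(-0.002, 0.002, 32768)
stats = summarise_setpoint(capture, smu_readback=0.500)
```

Averaging 32,768 samples suppresses random noise by a factor of roughly √32768 ≈ 181, which is why the averaged results can come in well under the datasheet accuracy specification.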
In the 1V range, with the exception of Channel 2, all channels read very similarly. The error for those channels ranged over about 1-2mV in total and was relatively consistent across the full voltage range. Channel 2 exhibited a consistent gain error, resulting in an offset of about 0.7mV and an error growing to around 2mV at full scale. Compared to the ±0.5%±10mV specification for this range, the results show better performance (in part due to averaging away the noise).
The noise contribution is flat across the board except for one small spike that may be due to a mains power disturbance. The peak-to-peak noise floor rests around 20mV; however, this is probably dominated by the SMU.
Repeating the test on the 25V range shows a similar characteristic, with Channel 2 again the least-closely matched channel. The channel offsets ranged from -25mV to +10mV, with an overall error ranging from -100mV to +115mV. This also compares favourably to the specification of ±0.5%±100mV.
The peak-to-peak reading noise was also relatively flat throughout, with a floor around 110mV.
Oscilloscope Input Noise Floor
The most traditional method of quantifying an oscilloscope’s noise floor is to do an all-channels-open or all-channels-terminated test. I decided to do this with all terminals open as this was the easiest to achieve.
With the setting at 1mV/div, which presumably puts the front end in the 1V range, the channels showed a mean peak-to-peak noise of around 4.25mV and an RMS noise of about 0.53mV. The theoretical quantisation step for a 14-bit ADC (16,384 levels) over the 1V range (-1V to 1V, a 2V span in the ideal case) equates to 0.122mV. Comparing the span against the observed peak-to-peak noise implies that the inputs have about 9 bits of clean resolution in this range.
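The LSB and clean-bit figures above come from simple arithmetic, which can be verified with a few lines (the helper names are my own):

```python
import math

def lsb_volts(span_volts, bits=14):
    """Ideal quantisation step (one LSB) of an ADC across a given span."""
    return span_volts / 2 ** bits

def clean_bits(span_volts, noise_pp_volts):
    """Noise-free code resolution: the number of bits whose step size
    exceeds the observed peak-to-peak noise."""
    return math.log2(span_volts / noise_pp_volts)

print(round(lsb_volts(2.0) * 1e3, 3))   # 1V range LSB in mV -> 0.122
print(round(clean_bits(2.0, 0.00425)))  # -> 9 clean bits
```

The same calculation on the 25V range (50V span, 107mV peak-to-peak noise) also lands at roughly 9 clean bits, matching the observation further down.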
Engaging the 20MHz bandwidth limiter reduces the mean peak-to-peak noise to 3.99mV and the RMS noise to 0.50mV which is only a slight improvement.
Terminating the first channel into a 50-ohm BNC terminator did not result in any appreciable change to the noise readings, although I found shorting the inputs on the probe itself did result in increased noise levels due to pick-up in the probe and shorting loop.
With the unit set to 500mV/div, which presumably configures the front end in the 25V range, the channels had a mean peak-to-peak noise level of about 107mV and an RMS noise level of 13.4mV. The theoretical quantisation step at this range would be 3.05mV. This implies the inputs have about 9 bits of clean resolution in this range as well, superior to most general 8-bit oscilloscopes. Unfortunately, it also implies that the 14-bit headline figure may not be fully usable – but this is not unusual, as many oscilloscopes have a smaller “effective” number of bits due to the contribution of front-end noise.
A look at the noise in the frequency domain shows that it seems to be concentrated in a shelf in the <500kHz region.
Oscilloscope Input Frequency Response
Unfortunately, I don’t have a wide-ranging signal generator with which to test the frequency response, so I have to rely on the ADP3450 to judge itself using its own Wavegen outputs. In this case, I cabled each of the inputs in parallel and, as they are high-impedance, fitted a 50Ω terminator to the end of the chain to dampen any signal reflections. Running the Network module gives the following plot in absolute mode, which shows slight deviations between channels and a descending amplitude with frequency – the combined effect of the Wavegen output and Scope input behaviours (with potentially a tiny bit of cable loss).
Switching into relative mode and using the first channel as the reference allows us to compare channels to each other. Repeating this with reversed order of connections allows us to confirm the result.
There are minor channel-to-channel frequency response differences up to the 25MHz limit of the Network module (due to Wavegen output limitations), but the channels are all within 1.8dB of each other at 25MHz. The true input bandwidth limitation was not measured; however, using the often-recommended five-samples-per-wave approximation, the 100MSPS default mode is probably best suited to 20MHz and below, while 125MSPS can probably handle up to 25MHz. Above this, repetitive-offset sampling would be required to build a better image of the waveform. While the Nyquist frequency would imply that 50MHz or 62.5MHz may be possible, phase alignment issues make this impractical for single-shot captures, even though the datasheet does specify a bandwidth of >55MHz.
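The five-samples-per-wave rule of thumb works out as follows (a trivial check of my own, not a datasheet figure):

```python
def max_single_shot_freq(sample_rate_hz, samples_per_cycle=5):
    """Rule-of-thumb highest frequency for a faithful single-shot capture."""
    return sample_rate_hz / samples_per_cycle

# Compare the rule of thumb against the Nyquist limit for both clocks.
for fs in (100e6, 125e6):
    print(f"{fs / 1e6:.0f}MSPS: rule-of-thumb {max_single_shot_freq(fs) / 1e6:.1f}MHz, "
          f"Nyquist {fs / 2 / 1e6:.1f}MHz")
```

Nyquist only guarantees that the frequency is recoverable, not that the waveform shape is; five samples per cycle is a conservative margin for seeing the shape in a single shot.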
Input Skew
Attempts to measure input skew were not particularly fruitful, perhaps in part due to the good synchronisation of the ADP3450 and the relatively limited sample rates.
Measurement of the probe compensation signal from another oscilloscope in parallel across all four channels, zoomed into the individual sample level showed no measurable offset. Despite the probes being in 10X mode and the compensation being manually adjusted, the readings from all channels are virtually identical.
Attempting to see further in, I chose the maximum oversampling for best time resolution and could not clearly identify any channel-to-channel skew as all traces overlapped one another, save for some noise. Trying this with the digital channels also did not yield any interesting results with perfect time alignment being observed across all sixteen inputs. I suspect this is because all instruments share a common time-base.
Wavegen Output Voltage Accuracy
The second key component of the ADP3450 is the Wavegen arbitrary waveform generator outputs. These are used in the Wavegen, Network, Impedance and Tracer modules, and accuracy is desirable, although perhaps not as critical, since in the majority of modes the signal delivered to the circuit is measured anyway. Regardless, the output of each Wavegen channel was tested separately across its full range, with the Keithley 2450 SMU at 10PLC being the judge.
Overall, both AWG channels seem well matched, with an error within about ±10mV. This compares well with the datasheet specification of ±5%±10mV at ≤1V and ±5%±25mV at >1V.
Wavegen Output Frequency Response
Testing the frequency response of the Wavegen output at 100mV amplitude using the ADP3450 revealed the following characteristic –
The 3dB bandwidth was about 16.6MHz, which is above the 15MHz claimed in the datasheet; however, this was measured directly into an oscilloscope channel, which is a high-impedance load.
However, using the instrument to judge itself may not be a fair test, so I also repeated the test with the output set to 1V amplitude hooked into my Rohde & Schwarz RTM3004 Oscilloscope with the termination set to 50-ohms to more fairly replicate the conditions expected when actually driving a matched load.
The results suggest that the 3dB bandwidth measured around 12.7MHz which is less than the claimed 15MHz on the datasheet. It also has a different shape to that of the above test – perhaps it cannot achieve the result when driving a 50-ohm load. The result is virtually identical for the second Wavegen channel with a bandwidth of 12.9MHz.
Just to be sure, I tried again, but with 100mV amplitude in case it is slew-rate limited, but the results were virtually identical.
I also decided to try again but with a 5V amplitude as there are separate ranges involved, in case the high-range has a better frequency response into 50-ohms.
Unfortunately, and as expected, the bandwidth was actually lower at a 5V set amplitude, reaching 10.4MHz. Just to be sure of the results, I re-tested using high-impedance mode.
The 3dB bandwidth has increased to 16.4MHz which is very similar to the original measurement. As a result, it seems that the full bandwidth is not available when the output is loaded.
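Reading a 3dB bandwidth off a measured sweep simply means finding where the response crosses the reference minus 3dB. A sketch of how this can be done with linear interpolation between measured points – run here on a synthetic single-pole curve with an assumed 12.7MHz corner, not the actual measured data:

```python
import math

def bandwidth_3db(freqs_hz, gains_db, ref_db=0.0):
    """Return the first frequency where the response falls 3dB below the
    reference, linearly interpolating between adjacent measured points."""
    target = ref_db - 3.0
    for (f1, g1), (f2, g2) in zip(zip(freqs_hz, gains_db),
                                  zip(freqs_hz[1:], gains_db[1:])):
        if g1 >= target > g2:
            frac = (g1 - target) / (g1 - g2)
            return f1 + frac * (f2 - f1)
    return None  # response never crossed -3dB within the sweep

# Synthetic single-pole roll-off standing in for a measured sweep.
fc = 12.7e6
freqs = [1e6 * k for k in range(1, 31)]
gains = [20 * math.log10(1 / math.sqrt(1 + (f / fc) ** 2)) for f in freqs]
bw = bandwidth_3db(freqs, gains)
```

Note that for a single-pole response the -3dB point sits fractionally below the corner frequency (the corner itself is at -3.01dB), which the interpolation reproduces.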
Digital I/O Voltage Output Accuracy
The digital I/O voltage forms the Supplies module, but also affects the voltage threshold for digital signalling and decoding as used in the Logic, Patterns, StaticIO and Protocol modules. The accuracy of the supply is important as it will affect these modules, but also may affect users who intend to use it as a voltage-adjustable power supply for small loads.
At open-circuit, the output voltage tracked the setting with fine granularity, reading about 8mV to 17.5mV low, which is quite acceptable.
Digital I/O Voltage Output I-V curve
However, one may wonder what happens when the output is put under load, so I also performed an I-V surface sweep using the Keithley 2450 SMU. As the maximum sink current the 2450 supports is 1A, the tests topped out at 1A, even though the ADP3450 officially supplies only 300mA on the outputs. It seems there is no over-current protection active on the digital I/O power supply.
The voltage loss as a function of increasing load is steady, appearing resistive in nature. There is a slight dip at 2.7V for some reason, but it is present in the source data. At a load of up to 300mA, the expected voltage drop is about 128mV below the requested voltage, which may be tolerable. At 1A, this increases to 400mV below the requested voltage, which is likely to cause some circuits to malfunction.
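The quoted drops are consistent with a simple resistive source. Fitting drop-versus-current over the figures quoted above (my own quick least-squares fit, using only the two stated points plus the origin) gives an effective output resistance of roughly 0.4Ω:

```python
def effective_resistance(points):
    """Least-squares slope of voltage drop vs load current, i.e. the
    effective source resistance of the supply output.
    points: list of (current_A, drop_V) tuples."""
    n = len(points)
    si = sum(i for i, _ in points)
    sv = sum(v for _, v in points)
    sii = sum(i * i for i, _ in points)
    siv = sum(i * v for i, v in points)
    return (n * siv - si * sv) / (n * sii - si * si)

# 128mV drop at 300mA and 400mV at 1A, plus no drop at no load.
drops = [(0.0, 0.0), (0.3, 0.128), (1.0, 0.400)]
print(round(effective_resistance(drops), 2))  # ~0.4 ohms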
Digital Input Threshold
The digital input threshold voltage is the voltage at which the input changes between a one and a zero, and it exhibits a range and some hysteresis. To test this, I set the digital I/O voltage to several common signalling-standard voltages and measured the threshold in both the upward and downward directions. The “forbidden” zone enclosed between these two borders is where a bit may read as zero, one or undefined.
The threshold voltage uniformity across all channels seemed to improve as the I/O voltage decreased; however, the size of the forbidden region increased as the threshold voltage decreased. The threshold voltages appear to be compatible with most logic families, and the values are roughly in line with those displayed in the WaveForms app.
Digital Input Switching Rate
To test the digital input switching rate, I connected the output of my old Nexys 3 clock generator from my previous oscilloscope review into the inputs, starting with 50MHz at the lowest pin number (as the sampling clock was set to the default 100MHz).
The digital input did not resolve the 50MHz signal particularly well, despite changes to the threshold. This is not unexpected, both because of phase alignment and because the FPGA may not be able to generate a sharp edge or a full swing at that rate. The 25MHz signal is better resolved, although asymmetries definitely appear – as usual, the recommendation of five samples per cycle still holds, as the 12.5MHz signal comes out very clearly.
Slower rates, of course, come out just fine although the memory buffer limitation of 32,768 samples at 100MHz means an observation time of 327.68µs. A slightly better result may be achieved with the system clock set to 125MHz.
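The observation-time trade-off is simple arithmetic: with a fixed 32,768-sample buffer, a faster sample clock shortens the capture window.

```python
def observation_window_us(buffer_samples=32768, sample_rate_hz=100e6):
    """Capture duration in microseconds for a full buffer at a given sample clock."""
    return buffer_samples / sample_rate_hz * 1e6

print(round(observation_window_us(), 2))                      # 327.68µs at 100MHz
print(round(observation_window_us(sample_rate_hz=125e6), 2))  # 262.14µs at 125MHz
```

So the 125MHz clock buys better time resolution at the cost of about 65µs of observation time per capture.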
Standby Power Consumption
Standby power was measured using the Tektronix PA1000 Power Analyzer and PWRVIEW software. Power was supplied from a pure sine-wave inverter source, trimmed through a Variac to within 1% of 230V as required by IEC 62301.
With the power supply alone, the standby power consumption is a low 163mW which is great considering the power supply has a power indication LED which probably consumes a good portion of this. Plugging in the ADP3450 but with the power switch in the off position, the standby consumption increased to 176mW which is still very low. This means leaving the unit plugged in but turned off using the side toggle switch is a very economical proposition.
Having the power switch turned on, however, even in the Standard USB/Network mode but with no cables plugged in, results in a power consumption of about 7.18W which is significantly more, but is still relatively small compared to big box instruments which can consume three to ten times this amount. It is comparable to a charging smartphone, although it likely increases when the instrument is busy.
Input Voltage Range
For safety reasons, the ADP3450 should only be used with the supplied power supply, as its output is grounded. However, for those who might want to run the ADP3450 from other supplies (e.g. for field use on laptop power banks, or adapted USB-C PD supplies), I tested the input power range and consumption at idle.
The ADP3450 is quite picky with power: it seems to have a power monitoring solution that will only allow the unit to start up reliably between 19.3V and 20.3V. After start-up, it will remain running as long as the voltage stays within 19.0V to 20.4V, otherwise the unit shuts down. It is good to see that the unit tries to take care of power anomalies (e.g. from a faulty power supply), and the operating range makes it (potentially) possible to adapt a USB-PD cable for the unit. Current consumption suggests a switching-converter architecture internally, ranging from 0.32A to 0.35A at idle in Standard mode.
Conclusion
I put the ADP3450’s specifications to the test in a barrage of checks. Testing the accuracy of the oscilloscope inputs, I measured the voltage error using the average of 32,768 samples at each point. The results at the 1V range show that Channel 2 is slightly mismatched with the remaining channels; however, all channels fit within about 2.5mV of error, which is better than the specifications imply, in part due to the averaging. At the 25V range, the error remained within 115mV, which is also better than implied by the specification. Reading noise in the 1V range was about 20mV, likely in part due to the Keithley 2450 SMU’s output noise, while in the 25V range it was about 110mV.
Oscilloscope input noise was measured with open-circuit inputs. In the 1V range, the channels had a mean peak-to-peak noise of 4.25mV and an RMS noise of 0.53mV; in the 25V range, 107mV and 13.4mV respectively. In both cases, engaging the 20MHz bandwidth limiter had a negligible effect on the noise level, and both ranges showed about 9 bits of clean resolution, which suggests the full 14-bit headline figure is probably not attainable. This is perhaps not unexpected, as many oscilloscopes have a smaller “effective number of bits” than their ADCs would imply, and the result is still superior to general-purpose 8-bit oscilloscopes. Given that this design seems to prioritise signal quality (i.e. more bits) over speed (i.e. higher sample rates), and with a limited number of true hardware ranges, it seems a little unfortunate that the noise level was not lower. Input skew was immeasurable, likely due in part to a single system clock source and the relatively “slow” clock compared to other instruments.
The input frequency responses were within 1.8dB of each other when tested using the unit's own Wavegen. Wavegen voltage accuracy was also good, besting its own specifications with a measured error of about ±10mV across the full range. The output frequency response measured about 16.5MHz into a high-impedance load, besting the datasheet claim of 15MHz, but diminished when connected to a matched 50Ω input, with a bandwidth of about 12.8MHz in the low range at 1V set amplitude and 10.4MHz in the high range at 5V set amplitude.
The digital I/O voltage output, when unloaded, had an error ranging between 8mV and 17.5mV low. When loaded, however, it dips around 128mV under the full rated 300mA load and continues to output even under a 1A load with a 400mV voltage drop, indicating there is no over-current protection active. The digital input threshold depends on the I/O voltage: testing showed that the threshold uniformity across all channels seemed to improve as the I/O voltage decreased; however, the size of the forbidden region increased as the threshold voltage decreased. The threshold voltages appear to be compatible with most logic families, and the values are roughly in line with those displayed in the WaveForms app.
The digital input switching rate was checked with a Nexys 3 board programmed as a clock generator, as in previous reviews. At the default system clock of 100MHz, the 50MHz signal is not reliably captured, while a 25MHz signal can be captured with significant asymmetries. The rule of thumb that the sample rate should be five times the signal frequency still applies, so with the 125MSPS sample rate, a maximum frequency of about 25MHz would be recommended.
Instrument standby power consumption with the power switch turned off was an excellent 176mW. Once the instrument is running in Standard mode but idle, it consumes about 7.18W, which is quite a bit more, but still very little compared to big-box instruments that usually consume three to ten times as much. The input voltage to the unit is supervised and needs to be between 19.3V and 20.3V for it to power up, and within 19.0V to 20.4V to remain running, with the unit consuming about 0.32A to 0.35A when idle.
---
This post is a part of the Digilent Analog Discovery Pro ADP3450 USB/Ethernet Mixed Signal Oscilloscope RoadTest Review.