Keysight Smart Bench Essentials RoadTest Review

Table of contents

RoadTest: Keysight Smart Bench Essentials

Author: Gough Lui

Creation date:

Evaluation Type: Test Equipment

Did you receive all parts the manufacturer stated would be included in the package?: True

What other parts do you consider comparable to this product?: No other integrated bench solutions were identified.

What were the biggest problems encountered?: The PathWave BenchVue Individual Apps were not entirely feature-complete or stable, integration with Test Flow was missing, and there was confusion regarding the older Keysight BenchVue Platform.

Detailed Review:

Keysight Smart Bench Essentials RoadTest Review

By Gough Lui (Feb – Apr 2022)

It’s hard to think of a facet of life that COVID-19 has not had a hand in changing. The tertiary education sector, especially, faced challenges in continuing business-as-usual, as teaching was predominantly done face-to-face, supplemented by hybrid eLearning approaches. Once lockdowns hit, opportunities to continue teaching face-to-face were minimised to reduce the risk of COVID-19 transmission, which necessitated a new approach: delivering remote teaching over the internet, at as near-equivalent quality as possible, as the primary mode of delivery.

This new constraint resulted in a flurry of interest from educators in refreshing their labs to handle the challenges of remote teaching. I’ve watched as some equipment was connected to the network for the first time, remote front-panel operation over videoconferencing software became commonplace and low-cost “instruments of last resort” were sent out by mail to students. Test equipment vendors have also been working hard to deliver solutions that are effective, easy to use and suited to the constraints of a tertiary education environment. After all, this was perhaps a big wake-up call to some labs that their equipment needed a refresh, as it was not well-suited to the post-pandemic world we find ourselves in.

As a postdoctoral research fellow, former laboratory demonstrator and program author, I was thrilled to have the opportunity to evaluate Keysight’s Smart Bench Essentials package. This suite of test equipment, which includes an oscilloscope, power supply, function generator and multimeter, is targeted at undergraduate-level laboratories, offering Keysight quality, usability and capabilities at an entry-level price, tied together with the new PathWave BenchVue software that has options for Lab Management and Remote Lab Operations. In this context, the absolute performance of the equipment is probably not the prime consideration; rather, value-for-money, ease-of-use, software integration and robustness are critical to the success of the product.

Unlike previous RoadTests, this one was a new challenge for me, being a bigger project involving four pieces of test equipment. As a result, this review and its associated blogs contain a lot of content – I definitely encourage all to click through to the associated blogs, as there is a lot that is missing from this summary review. As usual, if you found this review interesting, entertaining, informative or useful, I’d appreciate it if you’d leave a like or share it with someone who may find it interesting. If you have questions, feel free to leave a comment and I’ll do my best to answer them.

Keysight’s Vision of a Smarter Electronics Lab?

The university lab bench is an unforgiving environment where many gain their first hands-on experience with electronics. It is often a space where accidents happen and equipment gets inadvertently abused or sabotaged, and thus is rarely fitted out with anything but a disjointed, heterogeneous mix of entry-level equipment, frequently without any use of remote-control automation. The environment is highly cost-sensitive given the number of sets of equipment deployed, and key challenges include shrinking numbers of technical support staff and the reluctance of IT staff to support instrument connectivity. Academics frequently have to advocate for better arrangements, or else the status quo is maintained.

The COVID-19 pandemic resulted in a panic to find ways to deliver the laboratory component of electronics courses remotely. This included filmed or live demonstrations performed by demonstrators on higher-end equipment with competent remote front-panel capabilities or display outputs which could be captured. In desperation, cameras could also be pointed at instruments, although the experience is poor. Such demonstrations turn the practical component into a spectator event, leaving students without hands-on experience. Other courses resorted to mail-out component kits and “instruments of last resort” which are often USB-connected and fragile, with limited capabilities, safety, sample rate, resolution and accuracy. While these satisfy the key constraint of price, students lack experience with real test equipment and may be frustrated by the limitations and occasionally confusing results. Finally, a simulation approach has often been taken in conjunction with the above; however, simulations are rarely engaging and often do not show the effects of component non-idealities which separate theory from practice.

The Keysight Smart Bench Essentials seem to target some of these pain-points by offering a new line of EDU instruments which are very competitively priced and have the competent specifications, quality and reliability one would expect from Keysight-branded instruments, while also including BenchVue licenses allowing the use of PathWave BenchVue software to integrate, manage and run cloud-based remote-learning sessions (although some of the latter features require additional licensing). The line consists of oscilloscopes (EDUX1052A, EDUX1052G), function generators (EDU33211A, EDU33212A), a power supply (EDU36311A) and a digital multimeter (EDU34450A), all of which have a consistent UI, the “signature” 7” LCD display, a USB host port on the front, and USB and Ethernet connectivity on the rear for a truly “connected” bench. However, their new form factor does pose challenges when integrating the new equipment with more traditional devices.

This RoadTest review deviates from my proposal due to architectural differences between the PathWave BenchVue Individual Apps (which support the Smart Bench Essentials) and the Keysight BenchVue Platform (their predecessor), the lack of licensing for Lab Operations for Remote Learning, and the limited Test Flow integration at this time. In its place, I have decided to evaluate instrument performance and make use of more traditional code-based automation approaches using pyvisa. I have also conducted some rather interesting and quirky experiments, especially with the function generator, which I hope you will enjoy.


For more information, please read Keysight SBE In-Depth – Ch1: The Need for Smarter Benches?

Unboxing and Design Features


All four items made their way safely from the US to Australia, packed in sturdy cardboard boxes with fitted foam end-pieces and wrapped inside clear plastic bags. The accessories, separated by a single piece of cardboard, did not always stay put in their designated area, but ultimately the instruments were unharmed.


At a glance, the visual consistency of the units from the front makes every unit look like an oscilloscope even when it isn’t, which is a very unusual feeling. In fact, three of the four instruments are housed in what appears to be the same casing, with the exception of the jacks and operation panel section. All instruments share the “signature” 7” LCD, soft-keys and front-panel USB, with variations in the operation panel to suit the instrument type. The rear casing of most instruments also has the jacks in a consistent place, with provisions for Kensington lock security. The colour of the units is mostly a tri-tone grey which is very attractive and modern. The consistency also extends to channel colour coding.

In the range, the power supply unit is a bit of an exception: seen from the side, its gunmetal-grey rear metal casing, greater depth and weight are noticeably different from the other instruments. The indentation at the top, however, is clearly designed to integrate with the other EDU instruments. The DMM and PSU require manual mains-voltage adjustment, and the PSU even requires a fuse replacement for 230V operation. With the exception of the DMM, all instruments have a cooling fan.


Being a price-sensitive market, the included accessories are very scant, usually only a power cord, plus probes in the case of the DSO and DMM. No other cables or spare items are included, and the quick start guide (if included) is only a single page. All instruments are also Made in China, which is a departure from Keysight’s usual Malaysian manufacturing for such instruments.

One downside is that this review kit did not include the (optional) stacking kit, which I would highly recommend, as the units based on the oscilloscope case moulding really do not stack securely on each other, which may limit the flexibility of instrument placement.

For more information, please read Keysight SBE In-Depth – Ch2: Unboxing^4 & Design Features.

Initial Setup and Documentation


To ready the Smart Bench Essentials kit for use, a number of setup tasks must be performed. The first is configuring the mains voltage, which in the case of the power supply also necessitated a fuse replacement, as different values are used for different mains voltages. The next was to consider how to site the units on the bench, as the oscilloscope-like units do not stack well without the EDU190A Instrument Stacking Kit, which was not provided and is available for AU$154 including GST, but with a long lead time. Owing to restrictive bench space, I designed a simple 3D-printed shelf as a substitute for now; however, this does raise concerns in case of a partial bench upgrade, as conserving bench space when mixing instruments of different form factors could be tricky. Labelling probe cables and adjusting the compensation trimmer is another task which is usually necessary; however, I found the probes already well matched out of the bag, which is a bonus.


To best make use of the connected nature of the kit, connection to a PC will be necessary. This can be achieved via USB, which necessitates USB-A to USB-B cables and a USB hub in case there are insufficient ports. This establishes a low-latency, one-to-one connection, but forgoes the possibility of using the web-browser-based remote front-panel interface. Alternatively, Ethernet can be used with four Cat5-or-better 8P8C (RJ-45) cables and an Ethernet switch in case of insufficient ports. Such a configuration makes centralised configuration and updating easier and provides access to remote front-panel functionality through a web browser. However, it is necessarily more complex to provision and may require IT staff to be involved to approve and provision it and to meet cybersecurity best practices.
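
For code-based automation, both transports surface as ordinary VISA resources. Below is a minimal pyvisa sketch of discovering and opening an instrument; the IP address and resource strings are examples only and will differ on your network.

```python
import pyvisa

rm = pyvisa.ResourceManager()

# Enumerate what VISA can see: USB instruments appear as
# USB0::...::INSTR entries, LAN instruments as TCPIP0::...::INSTR.
print(rm.list_resources())

# Open a LAN-connected instrument directly by (example) IP address.
inst = rm.open_resource("TCPIP0::192.168.1.50::INSTR")
inst.timeout = 5000  # ms; generous to cover slower operations
print(inst.query("*IDN?"))  # manufacturer, model, serial, firmware
inst.close()
```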


Installing the supporting software was also somewhat tricky as there are two BenchVues – the older Keysight BenchVue Platform and the newer PathWave BenchVue Individual Apps. Only the latter supports the Smart Bench Essentials instruments, necessitating the download of four separate apps: the BV0001B PathWave BenchVue Digital Multimeter App, BV0002B PathWave BenchVue Function Generator App, BV0003B PathWave BenchVue Power Supply App and BV0004B PathWave BenchVue Oscilloscopes App. The installers download all necessary dependencies, totalling multiple gigabytes, and seem vulnerable to connection interruptions. Unfortunately, the apps do not yet have Test Flow integration, which is expected to be released soon; however, using Test Flow in the future may also require the older Keysight BenchVue Platform to be installed alongside, increasing the size of the installation significantly.


Instrument firmware also needs to be updated from a computer using the Keysight Basic Firmware Update Utility. I noted the presence of an incorrect version of this tool on the power supply’s download page, which led to firmware upgrade failures; however, no harm came of this. Upgrading the power supply required multiple attempts, while the others took the upgrade on the first attempt. The oscilloscope is another exception, requiring firmware upgrades to be performed via the front panel, with the firmware and the VNC server software updated separately. Due to a lack of licensing, I was not able to evaluate whether Keysight's mass firmware update utility would be able to perform all these upgrades automatically and unattended.

Documentation regarding the Smart Bench Essentials was found to be a bit imprecise and led to confusion regarding what to expect; Keysight are working to improve this. Documentation on the instruments themselves, however, is top-notch and a pleasure to use, with datasheets, user manuals, programming manuals and maintenance procedures all documented to a high standard.

For more information, please read Keysight SBE In-Depth – Ch3: Initial Setup & Documentation.

On-the-Bench Impressions

The instruments are attractive on the bench and look like a well-integrated set. They feel sturdy and have a decent amount of heft. The signature 7” 800x480-pixel LCD screen occupies a large amount of real estate and is perhaps a good feature to improve usability. However, when I saw the display in real life, my enthusiasm was dampened somewhat by backlight bleed, limited viewing angles and a distracting, flickering dot-crawl effect. Some displayed screens also have fine detail vulnerable to these effects. Unfortunately, there is no facility to adjust backlight brightness. In spite of this, the screens were still serviceable and robust enough to handle a stray swinging connector to the face.


One less-than-obvious downside to the LCD screen is the limited real estate remaining for front-panel buttons. As a result, the layouts of the buttons and knobs feel a bit cramped, especially on the DSO and PSU, which could lead to inadvertent operation.

The user interfaces across devices are consistent in their layout of soft-keys for common features, but this draws some attention to minor differences in the way menu interactions happen, as the instruments have a mixture of rotary knobs with centre-click, rotary knobs without, directional pads and numeric keypads.

Acoustically, three of the four instruments have fans, so with all instruments running there is noticeable background noise. The biggest offender is the PSU, with a noticeable whir that could be distracting; the DSO and AWG have a more subtle, motorboat-like rumble. It is good to see that the beepers on the instruments can be disabled to ensure a quiet lab.

Specifically for the DMM, it offers a lower level of basic accuracy, ranging from 0.015% to 0.027%, compared to contemporaries which offer around 0.012%, and has limited aperture selections rather than allowing NPLC to be defined directly. The unit lacks a 10A current range, topping out at 3A, and lacks thermocouple temperature measurement, instead opting for thermistor-based temperature. The continuity beeper is software-based with noticeable delay, and onboard data logging is limited to 5kpts with a long sample interval of approximately six seconds. The USB host port is only used for setting storage and recall.
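
Where the onboard logging limits bite, a PC-side logger is a straightforward workaround. This is a hedged sketch using pyvisa; the CONFigure/READ? commands follow common Keysight DMM syntax, but verify them (and the example address) against the EDU34450A programming guide.

```python
import csv
import time
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("TCPIP0::192.168.1.50::INSTR")  # example address
dmm.timeout = 10000  # ms; auto-ranging operations can take seconds

dmm.write("CONF:VOLT:DC AUTO")  # DC volts, auto range

with open("dmm_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "volts"])
    start = time.perf_counter()
    for _ in range(100):  # 100 samples, roughly 1 s apart
        reading = float(dmm.query("READ?"))
        writer.writerow([time.perf_counter() - start, reading])
        time.sleep(1.0)

dmm.close()
```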


I conducted a simple experiment to identify suitable thermistors for the temperature measurement function, as the datasheet-specified model was not available. The experiment concluded that most 5kΩ thermistors appear to work well.
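
For reference, thermistor readings like these follow the common Beta-parameter model. The sketch below assumes typical values (R25 = 5kΩ, β = 3950K) purely for illustration; these are not EDU34450A specifics.

```python
import math

def ntc_temperature_c(resistance_ohm, r25=5000.0, beta=3950.0):
    """Convert NTC thermistor resistance to Celsius via the Beta model."""
    t25_k = 298.15  # 25 degC in kelvin
    inv_t = (1.0 / t25_k) + (1.0 / beta) * math.log(resistance_ohm / r25)
    return (1.0 / inv_t) - 273.15

print(ntc_temperature_c(5000.0))  # ~25.0 degC at the nominal resistance
print(ntc_temperature_c(3000.0))  # lower resistance -> warmer than 25 degC
```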

The AWG was impressive in terms of its memory depth and two-channel capabilities, which enable additional possibilities when it comes to channel combining and modulation by the signal on the second channel. A key memory limitation is that it is not possible to use all 8Mpts in a single waveform, as each waveform is limited to 1Mpts. Another is that such waveforms cannot be loaded across the remote interface, and feedback on .arb file loading and data importing is poor, with the instrument seeming to freeze, frequently without confirmation of successful or unsuccessful loading. The lack of a dedicated trigger button also results in the need to press the Trigger soft-key twice in some cases.

By reverse-engineering the .arb file format, I was able to use the function generator to speak with telephone-line modems and to send modulated voice, Morse code and fax images as a radio signal across the house.
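
To give a flavour of the approach, here is an illustrative .arb writer. The header fields mirror the text-based layout of files saved by Keysight Trueform-family generators as I understand it; treat the exact field names as assumptions and compare against a file saved by your own instrument before relying on it.

```python
import numpy as np

def write_arb(filename, samples, sample_rate, high=1.0, low=-1.0):
    """Write a normalised float waveform (-1..1) as 16-bit DAC codes."""
    codes = np.clip(np.round(samples * 32767), -32768, 32767).astype(int)
    with open(filename, "w") as f:
        f.write("File Format:1.10\n")
        f.write("Channel Count:1\n")
        f.write(f"Sample Rate:{sample_rate}\n")
        f.write(f"High Level:{high}\n")
        f.write(f"Low Level:{low}\n")
        f.write('Data Type:"short"\n')
        f.write(f"Data Points:{len(codes)}\n")
        f.write("Data:\n")
        f.write("\n".join(str(c) for c in codes))
        f.write("\n")

# Example: one cycle of a 1 kHz sine sampled at 1 MSa/s (1000 points).
t = np.arange(1000) / 1_000_000
write_arb("sine1k.arb", np.sin(2 * np.pi * 1000 * t), 1_000_000)
```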

In the case of the PSU, the instrument is fairly usable, and the combination banana/binding-post terminals make it versatile in a lab context, although this does mean it is incompatible with shrouded banana cables, which are becoming increasingly common. The front-panel adjustment knobs for voltage and current have no centre click, which makes adjustments awkward, requiring the use of the directional pad to select the place value. The channel selection and All On/Off key logic also seems a bit less functional than arrangements I have seen in other units, and the key's close placement to the Channel 2 on/off key risks inadvertent operation. Similarly to the DMM, the USB port is only used for settings storage and loading, as the unit lacks onboard data logging and screenshot capabilities.

Finally, the DSO showed performance consistent with a competent and mature product. The user interface was fluid and the response of the knobs is excellent. The limited real estate of the screen is perhaps not best utilised by the fixed bar on the right side where the soft-buttons ordinarily appear, and the limited memory depth is unlikely to be a major issue for simple lab exercises. Instead, the key issue was the cramped operation panel, which made bumping offset knobs while adjusting scale knobs a likely occurrence. Allocating dedicated vertical scale and offset knobs to math/FFT traces also seemed unnecessary, as Channels 1 and 2 already share a set. Some keys were also somewhat small for my liking, and file operations can sometimes be slow, resulting in a spinner appearing. Another key finding is that the test hooks on the probes do not seem all that robust, having bent in a minor cable “tug” incident. In the undergraduate education context, a 50MHz bandwidth limit is not a major issue.


The DSO proved its value, performing UART and I2C decoding competently. The FRA feature was able to construct Bode plots automatically, even if they could not be saved on the unit. The FFT feature was able to identify AM radio stations, while X-Y mode fluidly displayed Lissajous patterns generated with the AWG.
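
Generating such Lissajous patterns is straightforward over SCPI. The sketch below uses the common Keysight APPLy shorthand and an example address; confirm the exact commands in the EDU33212A programming guide.

```python
import pyvisa

rm = pyvisa.ResourceManager()
awg = rm.open_resource("TCPIP0::192.168.1.51::INSTR")  # example address

awg.write("SOUR1:APPL:SIN 1000,1.0,0")  # X axis: 1 kHz, 1 Vpp sine
awg.write("SOUR2:APPL:SIN 1500,1.0,0")  # Y axis: 1.5 kHz -> 3:2 pattern
awg.write("SOUR1:PHAS:SYNC")            # align the two channels' phase
awg.write("OUTP1 ON")
awg.write("OUTP2 ON")
awg.close()
```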

On the whole, the instruments are not without their compromises, but none seem like a deal-breaker in the context of an undergraduate tertiary education lab. Instead, the instruments are venturing into a market which no mainstream manufacturer has previously chosen to explore, trying to offer good value backed by Keysight’s reputation and experience. In the case of the EDUX1052G DSO, the experience definitely shows in the fluidity of the interface and feature set. The EDU33212A AWG provides levels of memory depth and modulation capability that previously would not be found on an undergraduate university lab bench. Similarly, the EDU34450A DMM offers 5.5 digits at a price which undercuts most of the market while remaining competent at the things that matter. Finally, the EDU36311A PSU meets the needs of most lab benches by providing outputs that can be configured into a split-rail supply with an auxiliary rail, while providing both banana and binding-post connectivity options for flexibility.

For more information, please read Keysight SBE In-Depth – Ch4: On-the-Bench User Experience.

Connected to the LAN

Even without using BenchVue, the instruments offer some remote-control capability by virtue of their Ethernet LXI-LAN connectivity featuring a web-based interface. While the user interface and exteriors of the instruments show consistency, there is a clear difference in platforms when it comes to remote web-based interface functionality.


The EDUX1052G oscilloscope offers a fully-featured web-based interface. It allows viewing information about the configuration of the oscilloscope, reconfiguring the network settings and a password (which is disabled by default), provides basic context-specific help, allows sending SCPI commands and receiving responses, provides a remote front-panel capability based on VNC (provided the VNC server software is installed on the oscilloscope) and allows saving data from, and reloading settings to, the oscilloscope directly. The VNC remote front panel achieves an average of about three frames per second, which is a little on the slow side but quite usable.
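
As an aside, screenshots can also be pulled programmatically rather than through the web interface. The :DISPlay:DATA? query below follows InfiniiVision SCPI conventions; verify the arguments against the EDUX1052G programming manual.

```python
import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource("TCPIP0::192.168.1.52::INSTR")  # example address
scope.timeout = 10000  # ms; the image transfer can take a few seconds

# Returns an IEEE 488.2 definite-length block containing a PNG image.
png = scope.query_binary_values(":DISP:DATA? PNG, COLor",
                                datatype="B", container=bytearray)
with open("screenshot.png", "wb") as f:
    f.write(bytes(png))
scope.close()
```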


The remaining instruments, however, have a somewhat simpler web-based interface which, while visually consistent, is less featureful. This interface permits viewing basic configuration, reconfiguring network settings and the password (which is set to “keysight” by default) and provides a remote front-panel capability within the web browser. This appears to be based on polling for JPEG preview images of the front panel, a slow process achieving only around two frames per second with noticeable visual artifacts, including macroblocking and false colours. The design of this system also seems to misuse MIME types, sending a JPEG file with a MIME type of image/bmp. Regardless, a full-quality image is available using the Screen Capture button; however, no facility to execute commands and read responses is provided.

For more information, please read Keysight SBE In-Depth – Ch5: Connected to the LAN.

PathWave BenchVue Applications

A PathWave BenchVue Individual Apps license is included with each EDU instrument and enables access to the software which is a key element of Keysight’s “smart bench” solution. The licensing is generous: multiple PCs can be licensed against an instrument as long as it is connected at least once per year, and the full app is unlocked, permitting use with older or otherwise supported instruments which do not come with BenchVue-included licenses. This opens up possibilities for students to have the software installed on their own machines, for older supported Keysight equipment to be lent out so students can continue working at home, or as a transition path for students moving to surplus, more sophisticated supported models which may not come with a license.


The apps generally follow a consistent user interface design with a dark theme. They take approximately ten seconds to start on my computer and generally provide remote live-view of readings, configuration and control capabilities. This somewhat makes up for the cumbersome soft-button user interface when operating standalone and improves the user experience with more intuitive operation. In the case of the DMM and PSU, it extends additional data logging capabilities to instruments which have limited or no such capabilities when operating standalone. The apps can also retrieve screenshots from the instruments (except the AWG), which is perhaps valuable for USB-connected instruments where the web interface is unavailable, but this is slow and can only auto-refresh at five-second intervals.


Unfortunately, a number of issues were discovered in practice, some of which relate to the nature of remote control of instruments. For example, it is not possible to use the front panels of the instruments while they are under computer command without “pausing” the connection or forcing the instrument to “go local”, which could be an issue in case of a laboratory accident. In the case of the PSU, there is no button to “go local” either, which may necessitate resetting the instrument. In the case of the oscilloscope in “run” mode, one has to work the controls in the software to configure the oscilloscope while watching the oscilloscope screen to see what is actually happening.


The remote front-panel capability of the BenchVue suite, while functional, has not reached feature parity with the instruments. For example, in the case of the DSO, there didn’t seem to be an X-Y mode or support for the digital or FRA modes; for the DMM, there is no support for the diode-drop measurement or dual-display capabilities; for the AWG, there appears to be no way to upload large .arb files nor use triggered modes of operation. In spite of this, many common operations are supported.


The natural sequence of operations is sometimes inhibited by BenchVue as well. For example, it may be common to commence a data log on a power supply and then enable the channel, to ensure all behaviour (including the power-on transient) is captured. Unfortunately, in BenchVue Power Supply, one can either control the channels or log the data, but not both at the same time. This limitation can make some workflows difficult to achieve – a code-based equivalent is sketched below. Furthermore, it was discovered that data logging can reduce the front-panel LCD update rate, which may make it difficult for operators to identify an anomalous situation and take corrective action.
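
For completeness, here is a hedged pyvisa sketch of that log-then-enable workflow. The SCPI follows the E36300-series conventions the EDU36311A appears to share; confirm against its programming guide, and note the address is an example.

```python
import time
import pyvisa

rm = pyvisa.ResourceManager()
psu = rm.open_resource("TCPIP0::192.168.1.53::INSTR")  # example address

psu.write("VOLT 5,(@1)")    # program 5 V on channel 1
psu.write("CURR 0.5,(@1)")  # 500 mA current limit

samples = []
start = time.perf_counter()
output_enabled = False
while time.perf_counter() - start < 5.0:
    if not output_enabled and time.perf_counter() - start > 1.0:
        psu.write("OUTP ON,(@1)")  # enable 1 s into the log,
        output_enabled = True      # capturing the power-on transient
    v = float(psu.query("MEAS:VOLT? (@1)"))
    samples.append((time.perf_counter() - start, v))

psu.write("OUTP OFF,(@1)")
psu.close()
```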


Software stability in operation was problematic with the DMM, especially over Ethernet connectivity, with Query Interrupted SCPI errors being thrown, erratic delays in command execution and spurious values being logged. The data logging capability also proved unreliable when aborting and restarting logs, resulting in truncated logs or graphs showing artifacts. I frequently had to restart the software prior to logging to ensure the logging would actually complete. The BenchVue software also occasionally had problems detecting Ethernet instruments, which necessitated restarting the computer.


Performance also proved to be problematic with the PSU over Ethernet, where all-on and all-off commands were executed on each channel in sequence, resulting in a visually apparent “cascade”. This means such features of the application cannot be trusted to sequence power to the device under test correctly, which may result in damage to devices that require strict power sequencing. Perhaps the app should be using a channel-list command so that these changes are performed synchronously. Such delays seemed much less of an issue over USB and were inconsistent over Ethernet.
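
For reference, the channel-list form looks like the following (syntax per the E36300-series SCPI family, which the EDU36311A appears to follow; verify before use):

```python
import pyvisa

rm = pyvisa.ResourceManager()
psu = rm.open_resource("TCPIP0::192.168.1.53::INSTR")  # example address
psu.write("OUTP ON,(@1,2,3)")   # all three channels switch together
psu.write("OUTP OFF,(@1,2,3)")  # likewise for switching off
psu.close()
```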


But perhaps the biggest disappointment of all is the quality of the graphing module, which has poorly labelled scales that are difficult to configure and often misbehaves if manipulated during data logging. Additional features such as logarithmic scales are also not catered for, while exported screenshot images show aspect-ratio distortion and text sizes/colours that make the results difficult to read. Logged data is also stored in files with an .ivif extension when they are actually standard .hdf5 files – this obscuring of the file format seems unnecessary.
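
Since the logs are standard HDF5 underneath, they can be opened directly with h5py without renaming. The internal group/dataset layout is undocumented, so this sketch simply walks whatever is present:

```python
import h5py

# Open the .ivif log as plain HDF5 and list every group and dataset.
with h5py.File("log.ivif", "r") as f:
    f.visititems(lambda name, obj: print(name, obj))
```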

To make the bench truly smart, however, requires the Test Flow integration to allow users to automate and sequence one or more instruments to perform experiments. At this point, it seems that Test Flow integration is not yet ready.

For more information, please read Keysight SBE In-Depth – Ch6: PathWave BenchVue Oscilloscope, Power Supply, Digital Multimeter & Function Generator.

Keysight BenchVue Test Flow Automation

It is unfortunate, given my enthusiasm and the critical role that Test Flow plays in integrating multiple instruments to perform more sophisticated tasks, that the new PathWave BenchVue Individual Apps are not yet ready to integrate with Test Flow. Keysight seems committed to bringing this integration, but it will require the older Keysight BenchVue Platform to also be installed alongside to provide Test Flow. Whether this will remain the case in the future is unknown, but it is not an optimal situation, as it means even more bandwidth and disk space consumption.


It seems that the Power Supply 2021.2 app already has a preview integration with Test Flow; however, I was not able to get it to detect my Keysight BenchVue Platform installation despite it being up-to-date. As a result, I had to resort to using Test Flow manually, as if working with an unsupported/non-Keysight instrument. This requires using SCPI blocks with manually selected addresses and manually entered commands – a tedious process which would be avoided if the integration were working correctly.


However, once it was all set up, Test Flow is very nice to use. It is intuitive in the way it displays the program flow blocks, highlights execution as it proceeds, live-plots results while recording them for later export, and allows single-stepping for debugging flows. The solution is a lot easier to use and seems lighter than LabVIEW, although it still mainly plays well only with Keysight-supported equipment. The use of SCPI blocks is sub-optimal, but does allow users to practice using SCPI – after all, these instruments are real pieces of test equipment, and this could serve to aid the migration to code-based automation (such as pyvisa, which I have used extensively for more serious tasks). Running multiple instruments is no challenge, and the ability to colour-code blocks makes interpreting data easier; however, I did find the plotting function still somewhat limited in terms of zooming and rescaling the plot.
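
As a point of comparison, the same kind of multi-instrument sequence Test Flow expresses graphically is compact in code. This hedged pyvisa sketch steps the PSU output and records a DMM reading at each point; the addresses and SCPI details are assumptions to verify against the programming guides.

```python
import time
import pyvisa

rm = pyvisa.ResourceManager()
psu = rm.open_resource("TCPIP0::192.168.1.53::INSTR")  # example addresses
dmm = rm.open_resource("TCPIP0::192.168.1.50::INSTR")

psu.write("OUTP ON,(@1)")
for mv in range(0, 5001, 500):           # 0 V to 5 V in 0.5 V steps
    psu.write(f"VOLT {mv / 1000},(@1)")
    time.sleep(0.2)                       # let the output settle
    print(mv / 1000, float(dmm.query("READ?")))
psu.write("OUTP OFF,(@1)")
psu.close()
dmm.close()
```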

I feel that for the bench to be truly “smart”, it must have some automation capability, as the BenchVue apps on their own are little more than another remote front panel with the ability to configure settings and log data. When Test Flow is involved, you can start to involve multiple instruments and run sequences which make decisions based on measured data. As a result, I feel it is unfortunate that Test Flow integration with the Smart Bench Essentials is still a work-in-progress, as the package is not only the hardware but also the supporting software.

For more information, please read Keysight SBE In-Depth – Ch7: Keysight BenchVue Test Flow Automation.

Instrument Performance Tests


Testing the EDU34450A’s voltage measurement accuracy showed that it agreed well with the Keithley 2450 SMU up to 20V, indicating that the specifications are likely to be valid. Results from the 200V range showed noise indicative of oscillations, so the data was not reported; the experiment was not re-run due to a lack of time and the unlikelihood of encountering high voltages on university lab benches. Testing the current accuracy revealed that the specification is perhaps a little conservative, with measured errors falling well inside the error limits. Given its price, the performance is very acceptable.


Moving the microscope over to the EDU36311A power supply, it was shown to have significantly better voltage and current programming and readback accuracy than its datasheet specifications. Either I got a very good unit or Keysight were being conservative with specifications to avoid cannibalising sales of higher-end units, but the performance continued to impress, with output rise and fall times of 4.95ms and 21.4ms for Channel 1 and 7.77ms and 45.79ms for Channel 2 respectively.


Unexpected power-downs did not result in the output becoming unregulated, and OCP was found to function as expected for the most part, so devices under test are protected. Transient response performance was tested and found to be consistent with the 50µs claim, putting this cost-sensitive general-purpose power supply near performance/SMU territory. Ripple and noise also measured favourably at 5.145mV and 5.831mV peak-to-peak on Channels 1 and 2 respectively, matching datasheet specifications. The only downsides seem to be a “lagging” regulation loop that allows about 1.2ms of overshoot on Channel 1, and non-relay-isolated outputs that have significant leakage current if connected to energy sources (e.g. batteries). In all, this seems to be a superb power supply.


The EDU33212A arbitrary waveform generator also impressed, with tests of the DC offset error, 1kHz AC amplitude error, channel-to-channel skew and cross-talk besting the specifications by a significant margin. Output amplitude flatness across the range of frequencies, harmonic/non-harmonic noise output and intermodulation tests recorded results consistent with the datasheet.


For the EDUX1052G, the input noise level averaged 403.53µV peak-to-peak and 412.39µV peak-to-peak across Channel 1 and Channel 2 respectively, corresponding to 51.640µV RMS and 52.204µV RMS respectively. This is a solid result for an oscilloscope of this class/bandwidth and would mean a noise level of about 4mV peak-to-peak with a 10:1 probe, making basic ripple-and-noise measurements possible.

Standby power consumption was virtually non-existent thanks to hardware power switching on all instruments, except for the power supply, which had a tiny 100mW standby draw. Idle power consumption measured 26.1W for the DSO, 5.7W for the DMM, 16.5W for the AWG and 15.5W for the PSU, totalling 63.8W. This is significant, but comparable to the draw of other instruments and not unexpected.


The SCPIBenchv1 results show the instruments all have the potential to be high performers when it comes to remote control, but the results with VXI-11 and SOCKET connectivity show mixed latencies, while USB typically excels with consistently fast responses. The DSO is especially consistent regardless of connectivity, but unlike the others shows marginally better performance on HiSLIP and SOCKET connectivity. Some operations, especially those on the DMM involving auto-ranging, can result in long delays of 2-4 seconds, which may cause VISA timeouts depending on the configuration.
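
For context, the measurement behind such comparisons is simple: time many short query round-trips and summarise the results. A rough sketch of the core idea (SCPIBench itself being my own tool), with an example address:

```python
import statistics
import time
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP0::192.168.1.52::INSTR")  # example address

times = []
for _ in range(100):
    t0 = time.perf_counter()
    inst.query("*OPC?")  # minimal command-and-response round trip
    times.append(time.perf_counter() - t0)

print(f"median {statistics.median(times) * 1000:.2f} ms, "
      f"max {max(times) * 1000:.2f} ms")
inst.close()
```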

While the performance of such cost-sensitive instruments intended for university benches is not a prime consideration, this exploration turned up favourable results which reinforce the impression that Keysight have delivered solid, serious test equipment for the price. Indirectly, this translates into better-quality measurements, better protection for devices under test and fewer test-equipment-induced errors, but also perhaps extends the instruments' appeal beyond the education market into the hobbyist market.

For more information, please read Keysight SBE In-Depth – Ch8: Instrument Performance Tests.

Peeking Under the Covers


Taking a peek under the covers was quite interesting. The DMM has a transformer-based power input with screening. There is a separate PCB for the brains, based around an STM32, and the DMM front-end, as well as another for the front-panel keypad. The main banana jack connectors are directly connected to the front-end PCB, which is perhaps a cost-saving design, along with the use of Chinese HongFa relays. The inputs appear to be protected by MOVs and gas discharge tubes, so safety seems well taken care of. The current shunts appear to be wire-wound resistors. Behind the covers, I could identify a Keysight-branded 1NJ5-0001 chip and a Caddock 1776-C48 cost-reduced decade resistor network reference, driven by another STM32. The front-panel keypad uses membrane switches with carbon-based tracks, while the LCD has a moderately thick glass panel glued to the front for robustness, glued into a carrier so the original manufacturer could not be identified. On the whole, it seems sensible; however, the construction quality left something to be desired, with some flux/washing residue and resulting discolouration.


The AWG follows a similar case design but instead uses a switching power supply (FJ-SW916-A7), which is shared with the DSO. This supply has conformal coating on the rear, covering some (probably non-functional) spark gaps, and mediocre construction quality, with diodes mounted askew. The supply also uses a mixture of ChengX and OKCAP capacitors, which are not highly reputable brands – perhaps the cost of being price-conscious. However, the design of the mainboard itself shows quite a bit of care, using Nexem (formerly NEC-Tokin) signal relays from Japan and being based around an Intel (formerly Altera) Cyclone 10 and an Analog Devices TxDAC. The fan comes from HengLiXin, while the CR2032 coin cell for the real-time clock comes from Chao Chuang, both seemingly preferred suppliers.


The DSO also shows relatively good construction quality (the RTC crystal being slightly askew) and uses a mezzanine board named “BLT”. The board houses a hefty ST SPEAr600 dual-core communications processor, a Xilinx Spartan-3 FPGA and two substantial chips under heatsinks, presumably their own ASIC and perhaps another FPGA. Each channel’s front-end is encased behind a screening can – I didn’t dare remove it in case it affected the calibration. While the DSO uses the same power supply and fan as the AWG, the fan direction is reversed.


Finally, the PSU is a heavy and chunky beast, containing a sophisticated, over-sized toroidal transformer from Eaglerise Electric & Electronic with seven secondaries and a screened primary. The power supply appears to be a linear design, with each output controlled by its own STM32 and another STM32 driving the whole show. Capacitors in this device all appear to be from JiangHai, another less-reputable brand, which appears to be a compromise for price. Similarly, the 12cm rear fan comes from HengLiXin.

In the end, no instruments were harmed in this peek under the covers – all instruments passed their self-tests after reassembly and all screws were accounted for. There seem to be compromises made to reach the low price, namely the use of many more China-sourced components in the designs. Given sufficient quality control, this in itself is not a problem, and I feel inclined to trust Keysight given that there seem to be preferred suppliers for parts which remain consistent across these products, suggesting some care in making deliberate component selections. However, the construction quality of some parts showed slightly less attention to detail than I would have expected, although this shouldn’t affect the operation of the instruments at the end of the day.

For more information, please read Keysight SBE In-Depth – Ch9: Peeking Under the Covers.

Conclusion

It seems Keysight set their sights on reinventing the undergraduate university lab bench, focusing on developing a line of modern equipment at an affordable price with decent specifications, suited to the harsh rigours of student hands. They decided that such a bench should be connected, supported by an ecosystem of PathWave BenchVue software, including the possibility of lab management and remote lab access capabilities suited to a world where remote learning has gained importance. In doing so, Keysight has been venturing into markets and price points not often frequented by mainstream test equipment suppliers, competing with younger Chinese brands while offering their brand, experience, support and hardware/software integration as key selling points.

For the most part, from the hardware perspective, the equipment are all genuine, bona fide test instruments that look and function as you would expect. They offer solid performance all-round, with some equipment besting its datasheet specifications by such a margin that I wonder whether the specifications were deliberately written to avoid cannibalising sales of higher-end equipment. There are some minor compromises with regards to the quality of the screen and some of the functionality available, but nothing show-stopping for an educational market that is probably focused on basic exercises – instead, the additional capabilities over the equipment frequently found in existing labs may open the possibility of new exercises more relevant to a microcontroller-dominated world. Perhaps the biggest usability hurdle is the relatively cramped front-panel controls and the insistence on soft-key, non-touch operation. In some cases, the pricing of the gear may appeal outside the educational market as well, especially as the instruments come with “BenchVue Included” licenses which generously unlock the full applications, allowing supported but otherwise unlicensed gear to be used with the software.

While I was not able to evaluate the remote access lab and management capabilities due to a lack of software licensing, the PathWave BenchVue Individual Apps all seem to need some work. At the present moment, they function as a remote front-panel interface for live viewing, setting parameters, downloading screenshots and data logging (where supported by the instrument). This capability, however, comes with caveats that may prevent certain workflows (e.g. starting data logging and then changing the output status is not possible). The software’s graphing module was a key let-down, with poor axis labelling, limited flexibility in scale configuration, graphical glitches when graphs are modified during logging and incorrect aspect ratios on exported images. Software instabilities in data logging were also experienced.

But perhaps the biggest potential yet to be formally realised is Test Flow integration with the PathWave BenchVue apps. Without formal integration, automation using Test Flow requires falling back to SCPI blocks with manually entered commands, which makes automation more tedious than necessary. Even when this integration arrives, needing both the PathWave BenchVue Individual Apps and the Keysight BenchVue Platform installed to realise the capability is not optimal.

As a result, I can only conclude that the Keysight Smart Bench Essentials kit certainly has the hardware potential to be a powerful, integrated, connected bench. The software, however, is yet to fully realise this potential; it feels a bit basic and unfinished at this stage, which can lead to some frustrating experiences. It seems common for test equipment vendors nowadays to prioritise time-to-market, leaving software to be “fixed” later, so I would implore potential purchasers to evaluate the equipment with the presently available software to make sure it meets their needs prior to committing, as things may change over time.

My sincere thanks to element14 and Keysight for the opportunity to review the Smart Bench Essentials instruments and PathWave BenchVue software suite.

Additional RoadTest Delivery Notes & Updates

The delivery of large, complex, multi-part RoadTest reviews is always a challenging subject. Some reviewers prefer to publish sections as they progress; however, this is something I find hard to do, as it is easy to jump to conclusions without the well-rounded view of the product that develops through the full evaluation process, which can lead to a review that is not cohesive. This is not an approach I felt would work with my review process.

In the past, I have opted to release the full review and all subsections nearly simultaneously. This has the benefit that everything is available from the post date, but the disadvantages that my review will often have a later post date and that readers may not have the time to read all the content. This means they may not revisit and take advantage of the in-depth blogs which accompany the posting. It also increases my vulnerability to issues with uploading and publishing the posts.

For this RoadTest, I have experimented with a hybrid approach, where the in-depth articles were released across the three days prior to the summary RoadTest article. These sections were released in a limited time-frame, which allows me to be sure their contents are finalised but also spreads the “load” of communicating the information over a few days. Unfortunately, they are released out of order, as my review writing process does not follow the review section ordering, which may lead to confusion or missing information that will be filled in over the coming days. I feel this is an acceptable compromise given the amount of work involved.

I’d definitely welcome any feedback as to the RoadTest publication approach – should I return to monolithic “everything-at-once” releases? Or should I continue with a “short” phased release just a few days prior to the full review release? Feel free to leave a comment to let me know!

Any updates to this review and follow-up sections will be listed below as they become available.

18th April 2022 – RoadTest Summary Review uploaded, on the due date with not a moment extra to spare!

Comments

Anonymous
  • Nice, this RoadTest was definitely a beast to try and track all your experiences. Appreciate the fine work.

    For such a wide scope, a staged blog would be good just to show progress and save yourself a massive info launch.

    I usually have a "standard" procedure in the test plan for a RoadTest that lends itself to blogs if the project is big enough to justify it. I also agree that there is a learning curve in parts of the RoadTest that can lend itself to a "frustration" blog that may be unfair as you gain experience.

    On arrival, I first do the visual inspection, parts count, and a non-destructive teardown. This could be blog 1.

    Software tool installation and connection to the device is usually next as step 2, which also can be a blog. First impressions of easy installation and the initial user experience are, I think, very important. Lots of people may have a sour taste in their mouth if the software/installation is obtuse.

    A third blog would be good for test results. In this case, I think readers may be interested in a singular instrument and read more detail on specific ones.

    This product description described a supervisory scheduling tool and cloud-based lesson saving. Was this also part of the RoadTest plan as well?

  • Thanks for your input - definitely appreciate your perspective.

    Interestingly, I think a staged approach may suit RoadTests where design may be necessary or where testing is more of a project. Having recently used this kind of approach for the Experimenting with Thermal Switches design challenge (which I managed to scrape Grand Prize for), I can understand its appeal in allowing readers to follow the progress as it happens.

    But for RoadTest evaluations of products, I'm still hesitant to follow this approach as inconclusive results, incorrect conclusions and contradictions could occur. In fact, I've probably collected about four-times the amount of data presented in all the detailed blogs and this summary combined - much of this never makes it out because they tend to be early experimentation which may not be "ideal" examples (think of them like blurry practice photographs). Publishing as I go along risks putting this out and then having to later correct myself - perhaps at the cost of many more blogs.

    Another key challenge of phased releases is the need to follow some logical sequence - but in my experience, reviewing often occurs in a different order to the review presentation. As a result, it enforces some sort of "rigid" chronological structure on the process and demands a regular time-commitment to writing throughout. Even with this "phased" release over three days, it's clear things get delivered out-of-order, in part because I have developed a review structure I feel is logical, clear and navigable. I fear that a continual phased release approach may result in more blogs that are even less well-organised, resulting in information "hiding" in plain sight.

    I usually collect review assets and run/re-run experiments throughout the majority of the review window, spending the last week-and-a-half on the write-up. In the past, I would upload the sections as draft, then hit publish on them all in the same day. I feel that scattering this across three days has helped to improve early engagement but it's too early to be sure.

    It's interesting to see that you prefer to do teardowns early-on in the review. I always leave this to last, just out of experience, in case a non-destructive teardown actually becomes destructive in subtle ways - e.g. removing the covers will void calibration and may increase drift simply through differences in PCB flex caused by screw torque. I always leave teardowns to last so that if the unit is totally destroyed, that would be on me but the results would already have been collected.

    As for the review proposal - I tend to let myself be guided by it but reserve the right to vary the review in case of issues. In this case, I had originally intended to put emphasis on Test Flow, performing all of the automation with it. I had also intended to focus on BenchVue Mobile, only to find that was part of the older Keysight BenchVue Platform which does not support the EDU-series of instruments. Another key focus was on the Remote Access Lab cloud-learning features, looking at the student experience and the administrative management features (e.g. can an instructor set all instruments to a particular set-up, update firmware all from a central point). Any courseware (Educator's Resource Kit) provided in the form of labs, exercises would also have been included in the review, along with the "bode plot trainer" which is (often) claimed to come with the EDUX models. Unfortunately, because of the PathWave BenchVue platform differences, limited Test Flow integration at this time and a lack of both hardware and included licenses providing access to the features, these elements were cut or substantially scaled back in the review.

    It was not for a lack of trying either - we were in contact with Keysight behind the scenes and I did ask if there was any way I could experience the remote learning without the license (e.g. connecting to an existing class) but this suggestion did not come to any result.

    In lieu of the evaluations which did not take place (and during the time that we were obtaining clarification from Keysight), I performed instrument performance tests and performed certain experiments with the arbitrary waveform generator which were not part of the original proposal. After the evaluation was pretty much complete, I also delivered the teardown section which was proposed as an optional (if-time-permitting) part. I feel this was a fair trade in terms of work and knowledge attained given the constraints.

    - Gough
