Testing is always time consuming, but in many cases, testing time can be reduced through automation. How are you doing it? Take the poll and let us know, and please tell us how you handle testing and test automation below in the Comments section!
Test automation in production is crucial. Test automation in R&D and product development is seldom worth the effort.
That's an interesting take, Doug.
By "in production" do you mean a product that's about to be released to production manufacturing, or a product actually deployed in production which you automate health checking for to ensure it's working properly?
Test automation in R&D and development is absolutely and always worth the effort - we would live and die by the quality, and the automation, of our testing: it's such a boring, repetitive activity that not automating would significantly reduce testing quality and efficacy. Not only that, automation reduces the actual testing time, directly cutting resource costs. Not all tests can be automated, of course, and those are the ones that need careful monitoring. I used to have the argument with less experienced colleagues along the lines of "whilst we're creating these tests, we could just be getting on with it", but not for very long. I also used to have an internal philosophical dialogue along the lines of "who's testing the tests, and do we need to create tests to test the tests?" - but then I went insane and other voices took over
Test automation can mean different things to different people.
In R&D and product development, we would frequently use test automation to assist in testing software release candidates. We would use scripting languages (like Ruby) to exercise product features on each software build, to ensure that our products functioned correctly. As we added features to the software, we would add or modify the test scripts to ensure that they remained effective in testing the device's functionality.
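A per-build feature check along those lines might look something like this minimal Ruby sketch - the device interface, the commands, and the responses are all hypothetical stand-ins for whatever the real product exposed:

```ruby
# Minimal sketch of a per-build feature regression check.
# DeviceStub and its command set are hypothetical stand-ins
# for the interface a real product would expose.
class DeviceStub
  RESPONSES = {
    "VERSION?"     => "FW 2.1.0",
    "SELFTEST?"    => "PASS",
    "FREQ 14.2MHZ" => "OK",
  }.freeze

  def query(cmd)
    RESPONSES.fetch(cmd, "ERROR")
  end
end

# Each test case: a command to send, and a pattern the reply must match.
TESTS = [
  ["VERSION?",     /\AFW /],
  ["SELFTEST?",    /PASS/],
  ["FREQ 14.2MHZ", /OK/],
].freeze

def run_tests(device)
  TESTS.map { |cmd, expected| [cmd, device.query(cmd).match?(expected)] }
end

results  = run_tests(DeviceStub.new)
results.each { |cmd, ok| puts format("%-14s %s", cmd, ok ? "PASS" : "FAIL") }
failures = results.count { |_, ok| !ok }
puts "#{results.size} tests, #{failures} failures"
```

As features are added, extending the test table is the only change needed - which is what keeps a script like this effective build after build.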
I also think that if you wait until production to start test automation, you are only harvesting half the apple. You should think about executable test options as soon as you start to spec. What's the point of making a highly skilled and experienced engineer do repetitive tasks?
I tried to create automation during development. I recall developing a backup/restore routine using snapshots on a MySQL database. It moved from being a backup routine in production to a tool for assisting data testing much later in the project. We could push an update to the database and, in less than 18 seconds, have it restored to its original state if inconsistencies occurred. The routine not only automated backups but became a valuable testing tool.
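The reset-to-a-known-state pattern described there can be sketched like this - a plain Ruby Hash stands in for the database, since the real routine used MySQL snapshots:

```ruby
# Sketch of the snapshot/restore pattern used for test-data resets.
# A Hash stands in for the database; the real routine snapshotted MySQL.
class SnapshotStore
  attr_reader :data

  def initialize(data)
    @data = data
    @snapshot = nil
  end

  # Capture a deep copy of the current state.
  def snapshot!
    @snapshot = Marshal.load(Marshal.dump(@data))
  end

  # Roll the data back to the last snapshot.
  def restore!
    raise "no snapshot taken" unless @snapshot
    @data = Marshal.load(Marshal.dump(@snapshot))
  end

  def apply_update!(changes)
    @data.merge!(changes)
  end
end

store = SnapshotStore.new({ "users" => 2, "orders" => 10 })
store.snapshot!                      # capture the known-good state
store.apply_update!("orders" => -1)  # push a (bad) update
store.restore!                       # back to the snapshot in one call
puts store.data.inspect              # restored state
```

The value in testing is exactly what the comment describes: any update, however destructive, is one call away from being undone.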
I confess automation for production is simpler if the development environment reflects the production environment. It was my norm to create automation during development with the thought of it becoming a production tool.
Absolutely, DFT (design for test) of both prototypes and production units should be part of the design process from the start, but that doesn't mean automated test must be used in development. Designing the automated test systems to be used in manufacturing a product is a project in itself that can be started in parallel with product development, but it is costly to keep redesigning those test systems as the product evolves through development. For example, I've seen a designer revise his complex PCB over 900 times before it was released for manufacturing. The card was designed to be tested with an automated bed-of-nails system, but it would have been prohibitively expensive to design hundreds of beds-of-nails and the associated automation software for each revision, only to use each one for a few minutes before it became obsolete. Meanwhile, perfecting each bed-of-nails system would massively delay the main project, when an hour or two of manual probing tells you everything you need to know. When it takes longer to design and perfect an automated test system than it takes to run all the tests you will ever run, it is usually not worth it. When you can design and perfect an automated test that reduces total test time, is more consistent, and reduces errors, then of course it is worth considering.
I guess it sometimes depends on what are the alternatives. Interns are cheap : )
The most boring weeks of my life as an intern in an R&D lab were spent just sitting (wearing shirt and tie - we were young but professional!) and turning a dial. The aim was to tune from near zero all the way up to 30 MHz, slowly (it takes days to do that!), just listening and writing down every frequency where there were unexpected sounds. The point was that the source of each sound could then be identified (it could be a transistor oscillating, or a 16 MHz clock or one of its by-products leaking through, and so on) and reduced as far as possible so it was no longer audible. Solutions could include putting a screening can around parts of the circuitry, or rerouting the interconnecting cables between boards, etc. However, each time a significant change was made, that knob needed turning from 0 to 30 MHz all over again : ) just in case any new sounds had appeared, or old ones had moved frequency.
Back then, I didn't think of automating it, even though much of the test equipment had HPIB, the transceiver had a remote serial interface (so it didn't need someone turning a knob), and there was an audio meter with HPIB around too that I could have used. If I were doing it today, I'd make use of MATLAB to do the listening for me.
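A sweep like that is straightforward to sketch. Here the receiver tuning and the audio meter are simulated with made-up values; a real rig would step the transceiver over its serial remote interface and read the HPIB audio meter at each step:

```ruby
# Sketch of automating the spurious-signal hunt: step the receiver
# across 0-30 MHz, read the audio level at each step, and log any
# frequency where something unexpected is audible. The instrument
# reads are simulated stand-ins.
STEP_HZ   = 100_000   # tuning step
THRESHOLD = -60.0     # dB; anything louder than this gets logged

# Simulated spur: a 16 MHz clock leaking through.
SPURS = [16_000_000].freeze

def audio_level_db(freq_hz)
  SPURS.include?(freq_hz) ? -40.0 : -90.0  # stand-in for a meter reading
end

anomalies = []
(0..30_000_000).step(STEP_HZ) do |freq|
  level = audio_level_db(freq)
  anomalies << [freq, level] if level > THRESHOLD
end

anomalies.each { |f, db| puts "spur at #{f / 1.0e6} MHz (#{db} dB)" }
```

After each design change, rerunning the script replaces days of knob-turning, and diffing the logs shows immediately whether spurs have moved or new ones have appeared.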
Another interesting scenario was switching relays on and off thousands of times, as a component selection process, to see which makes of relay could survive (they were going to be deployed in a scenario where they would be switched at high speed to perform antenna tuning - fast enough that all you hear is a buzz while the circuit does its thing, finding the perfect topology to match the antenna). I don't know if logging counts as automation; I don't know the details, but an engineer alongside me spent about a year logging (with different batteries and charger designs) to prove out his multi-battery charger circuit.
As design engineers, we could use pretty much any technique we wished to automate, whereas the production engineers tended to use LabVIEW a lot. Some test tools I built included DTMF generators and decoders, and an I2C master controllable from a PC (I think it was to program a frequency synthesizer, or to simulate button presses).
Later, as a software engineer, automation was of course a key part of the design process (developer tests, regression and performance tests, running systems for hours on end, perhaps under generated load, to check for memory leaks too, etc.). Lots of fun : ) I quite liked writing test-case titles (writing the actual test cases was a bit more of a pain!).
Anyway, regarding the question, I think what you're really driving at is whether equipment should have interfaces for automation, or whether it's OK to just buy equipment without them. Maybe I'm wrong, but I'm of the opinion that all test equipment should come with an interface for automation as standard; it doesn't have to cost much. No one can predict what the equipment will be used for in future, and an interface gives it a longer life because it can be retasked: it can turn something into a new product.
For instance, a multimeter can quite easily become an audio meter if it is paired with a sig-gen and a bit of automation. Let the customer or engineer have the opportunity to turn that product into something they need, through the use of interfaces. Otherwise, the manufacturer is only guessing that their built-in display and buttons meet customer needs, often locking out control of functionality within the device, or forcing users down a single workflow that isn't fit for purpose. I don't know if that makes sense; maybe I haven't explained it as well as I could (it's late : ). Also, it sucks having test equipment with a poor user interface.
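As a concrete sketch of that multimeter-plus-sig-gen idea: step the generator through the audio band, read the AC voltage at each point, and report the response in dB relative to 1 kHz. Both instruments are simulated here (the "device under test" is a made-up RC low-pass); real ones would be driven over their automation interfaces:

```ruby
# Sketch of turning a multimeter into an audio meter: sweep a signal
# generator across the audio band, read AC volts at each step, and
# report the response in dB relative to 1 kHz. The reading below is
# a simulated stand-in for a real multimeter measurement.
include Math

FREQS = [100, 300, 1_000, 3_000, 10_000].freeze # Hz test points

# Stand-in for the device under test: an RC low-pass with fc = 3 kHz.
def measured_volts(freq_hz, fc = 3_000.0)
  1.0 / sqrt(1.0 + (freq_hz / fc)**2)
end

ref = measured_volts(1_000) # 0 dB reference at 1 kHz
response = FREQS.map do |f|
  db = 20.0 * log10(measured_volts(f) / ref)
  [f, db.round(2)]
end

response.each { |f, db| puts format("%6d Hz  %+6.2f dB", f, db) }
```

The point stands: nothing in the multimeter itself changed - a control interface plus a short script turned two instruments into a third one.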