You Can't Automate Everything

By Jeff Gainer

(Author's note: This essay originally appeared in the 05 September 2001 issue of the Cutter IT Journal E-Mail Advisor.)

Despite what the slick brochures and vendor representatives will tell you, not all tests can be automated.

There are several factors to consider when deciding which tests will be cost-effective to automate; the first is the selection of an automated test tool. It's folly to think that one tool is completely superior to another, just as it is wrongheaded to claim that they are all alike. For example, will you require a tool that is simple to learn and use, or are you planning to build a large base of test cases that will require ongoing addition and maintenance?

Once you have selected a few candidate tools, arrange for vendor demonstrations; don't rely on a brochure or "demo" disk. Selection of a tool should always hinge on an on-site demonstration and proof of concept. Insist on a demo against your own application. Some automated test tools do not handle custom GUI objects well, and some tools are highly unstable when faced with Java applets, multimedia, or proprietary development environments. Invite the vendors in to give your technical teams a demo, and insist that they leave an evaluation copy behind after the initial demonstration. The evaluation copy should be a fully functional version of the tool that will expire after a certain number of days (usually 10 to 30). Use the evaluation copy to test the tool's effectiveness and stability in your environment. Again, some development environments can cause test automation tools to be highly unstable, so it is essential that you test the tool yourself, without the guiding hand of the vendor. If the vendor won't give you an evaluation copy, find another vendor.

A key part of selecting the best tool for your environment and testing team is evaluating the suitability of automating specific tests and deciding how you will use the tool. Remember that in the short term, test automation takes more time. As a rule of thumb, it takes from four to eight times longer to automate a test than it takes to create the manual test. And the first step to creating an automated test is, of course, creating a manual test that will serve as documentation of the automated test.

If automating a test case might require too much time and produce an overly complicated test, evaluate how the test might be reused. I recently assigned one of my staff to write a suite of tests for a stock-option trading module. The requirements specified about eighty functional tests. The staff member wrote a single module that took the test data as parameters; the parameter combinations, in turn, comprised the eighty tests. The reusable function took nearly three days to complete, but once it was done, he wrote the eighty tests by passing in various combinations of the parameters in less than an hour.
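The pattern described above is what testers now call a data-driven test: one reusable driver, many rows of test data. The sketch below illustrates the idea in Python; the module under test (validate_option_order) and its parameters are invented for illustration, since the article does not name the actual tool or API.

```python
def validate_option_order(symbol, quantity, option_type, strike, limit):
    """Stand-in for the stock-option trading module under test (hypothetical)."""
    if quantity <= 0:
        return "REJECT: quantity"
    if option_type not in ("CALL", "PUT"):
        return "REJECT: type"
    if strike <= 0 or limit <= 0:
        return "REJECT: price"
    return "ACCEPT"

def run_option_test(symbol, quantity, option_type, strike, limit, expected):
    """The single reusable test driver: each row of data becomes one test."""
    actual = validate_option_order(symbol, quantity, option_type, strike, limit)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: {symbol} {quantity} {option_type} {strike}@{limit} -> {actual}")
    return status == "PASS"

# Each tuple is one functional test; eighty such rows would cover the suite.
TEST_DATA = [
    ("ACME", 100, "CALL",   50.0, 2.75, "ACCEPT"),
    ("ACME",   0, "CALL",   50.0, 2.75, "REJECT: quantity"),
    ("ACME", 100, "SPREAD", 50.0, 2.75, "REJECT: type"),
    ("ACME", 100, "PUT",    -1.0, 2.75, "REJECT: price"),
]

results = [run_option_test(*row) for row in TEST_DATA]
print(f"{sum(results)}/{len(results)} tests passed")
```

Adding a new test is then a one-line change to the data table, not a new block of automation code.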

Test automation effort can be significantly reduced if your team develops a single set of utility functions. Utility functions are not tests in themselves; they are procedures that drive the application to certain states prior to executing the actual tests. Creating a utility function for, say, selecting a random customer record in a billing system will not only save test development time, it can dramatically reduce the amount of code to be maintained.
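A minimal sketch of that idea, in Python: the billing-system "API" here (CUSTOMERS, select_random_customer) is invented for illustration. In a real GUI test tool, the utility function would navigate menus and dialogs instead of touching a list directly, but the division of labor is the same: setup lives in one shared routine, and every test reuses it.

```python
import random

# Hypothetical stand-in for the billing system's customer database.
CUSTOMERS = [
    {"id": 1001, "name": "Ada"},
    {"id": 1002, "name": "Grace"},
    {"id": 1003, "name": "Alan"},
]

def select_random_customer(rng=random):
    """Utility function: drives the 'application' to a known state
    (a customer record is selected) before any actual test runs."""
    # In a real tool this would open the search dialog, query, and
    # open the record; here we simply pick one.
    return rng.choice(CUSTOMERS)

def test_print_invoice():
    """An actual test reuses the utility instead of repeating setup code."""
    customer = select_random_customer()
    assert customer["id"] in {c["id"] for c in CUSTOMERS}
    return f"invoice printed for customer {customer['id']}"

print(test_print_invoice())
```

If the application's navigation changes, only select_random_customer needs updating, not every test that depends on having a customer record open.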

If a test needs to be executed only a few times, don't waste the effort of automating it; it's more cost-effective to leave it as a manual test. The same goes for an application that's unstable or rapidly changing. Similarly, if the automated test case would require a high degree of maintenance, if the function returns dynamic data, or if the test involves a real-time information system, manual testing will likely still be necessary.

--Jeff Gainer


(c)2001 Cutter Information Corp. All rights reserved. This article has been reprinted with the permission of the publisher, Cutter Information Corp., a provider of information resources for IT professionals worldwide.

This article originally appeared in the Cutter IT E-Mail Advisor, a supplement to Cutter IT Journal.
