Defense Needs Better Ways to Test Software
By Bernie Gauf, CEO of IDT
Originally published in National Defense magazine, October 2013
Developing software for the Defense Department has many inherent challenges, not the least of which is testing. Traditional software testing for defense systems consumes up to 50 percent of development resources. Yet, it is only during this phase that engineers can be assured systems are ready for deployment. Because of the critical nature of military systems, corners cannot be cut.
In brief, the traditional approach to software testing involves manually creating and running a wide range of tests at all stages of development to ensure that the system requirements have been successfully incorporated. Additionally, tests must be run to confirm that new software works properly with the software and systems already in place.
With the current manual testing approach, tests are typically documented using a word processor or spreadsheet application, with a step-by-step procedure describing operator actions or input and expected response. Test procedures also describe how the system is required to be configured prior to conducting the test. The pass/fail status of each step is usually written down by the test engineer on a printed hard copy of the test procedure. Every time a test is run, the test engineer executes each step of the test procedure and records the results. It is a labor-intensive process.
Innovations in software testing technology provide alternatives to current testing methods. A case in point is automated software testing.
As with manual processes, automated software testing requires engineers to design tests that support the verification of requirements. Automated testing is also similar to manual testing in that the test program is dependent on the design of high-quality tests.
Automated software testing enables the operator actions or input, along with the expected response, to be digitally captured or recorded. When the test engineer executes an automated test, the technology sends the digitally captured operator actions and input to the system under test and evaluates the response against the expected results. A report is automatically generated to document the results. To execute the test, the engineer simply launches it rather than manually conducting each step.
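The capture/replay pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the general idea — recorded steps are replayed against the system under test and compared to expected responses, with a pass/fail report generated automatically. The names (`TestStep`, `run_automated_test`, `echo_system`) are illustrative, not any specific vendor's tool.

```python
from dataclasses import dataclass

@dataclass
class TestStep:
    action: str    # recorded operator action or input
    expected: str  # digitally captured expected response

def run_automated_test(steps, system_under_test):
    """Replay recorded steps against the system and report pass/fail."""
    report = []
    for i, step in enumerate(steps, start=1):
        actual = system_under_test(step.action)  # send captured input
        status = "PASS" if actual == step.expected else "FAIL"
        report.append(f"Step {i}: {step.action} -> {status}")
    return report

# A stand-in "system under test" that simply acknowledges each command.
def echo_system(action):
    return f"ack:{action}"

steps = [
    TestStep("set-mode standby", "ack:set-mode standby"),
    TestStep("query-status", "ack:query-status"),
]
print("\n".join(run_automated_test(steps, echo_system)))
```

A real harness would drive the system over its actual interfaces (network messages, displays, hardware signals), but the structure — replay, compare, report — is the same.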
To transform testing, program managers and engineers must embrace this new technology.
Change is slow. Sometimes the best ideas take years to catch on. Engineers engaged in software development may believe, and rightly so, that they are part of a highly progressive industry. But even there one finds resistance to change.
Progress always involves risk. That is nothing new. In the early 1900s, Henry Ford introduced the assembly line to automobile production. It was novel. It was threatening. The unions didn’t like it and were worried it would lead to job loss. But this early form of automation boosted production and cut assembly time in half, resulting in improved efficiency, better quality and increased affordability. Within five years, Ford’s process revolutionized the automobile manufacturing industry.
Fifty years later, George Devol designed the first programmable robotic arm. In the early 1960s, this device transported die castings in a General Motors plant in New Jersey. Initially seen as a curiosity, robots also prompted speculation. Would they eventually replace the common worker? It was too early to tell; regardless, robotics soon became another important advance for the manufacturing sector.
As a nation, we say that progress in science, technology, engineering and math (STEM) is crucial to America’s future. This certainly has been true in the past. Manufacturing was forever altered by the introduction of automation and robotics. However, if we truly believe this, then we need to embrace the outcome of advances in STEM going forward, and apply them to the development of our military’s systems.
Engineering teams could benefit from embracing new technology — specifically the advances now available in software testing.
Program managers and engineers involved in large, highly complex software projects for the Defense Department devote their resources to planning, designing, developing, and testing. Many highly skilled engineers are required to conduct the testing and to analyze the results. Following the traditional industry approach, software testing consumes, on average, more than half of a project’s schedule and resources.
In the current budget environment, the pressure is on to streamline the software development process and reduce headcount. As a result, fewer engineers will be available to conduct tests. However, the need to verify system requirements and performance has not changed.
Going forward, limited resources will mandate that systems have longer lifespans, increasing the need for more regression tests over time. In addition, the pace of software updates will likely increase, further augmenting the regression testing workload.
Automated software testing technology accelerates execution and reporting time while expanding test coverage. For today’s large, complex, mission-critical systems, this translates to a significant increase in testing efficiency.
The majority of tests that are run manually — functional, performance, concurrency and stress — can be automated, with an expected increase in testing productivity of about 75 percent. Automation can implement a broad range of tests and easily repeat them, covering multiple combinations and increasing the amount of testing completed.
Automated tests readily produce documented, objective, quality evidence, including requirements traceability and comprehensive pass/fail results. They provide the capability to verify thousands to millions of test permutations in minutes to hours.
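The scale of permutation coverage mentioned above is easy to see in a sketch. Assuming a hypothetical system with three configuration parameters, automation can enumerate every combination — something impractical to attempt by hand; the parameter names and the trivial `check` function here are illustrative only.

```python
import itertools

# Hypothetical configuration parameters for a system under test.
modes = ["standby", "active", "maintenance"]
channels = range(1, 5)
power_levels = ["low", "high"]

def check(mode, channel, power):
    # Placeholder verification step; a real harness would drive the
    # system under test and compare its response to an expected result.
    return channel in range(1, 5)

# Enumerate and verify every permutation of the parameters.
results = [(combo, check(*combo))
           for combo in itertools.product(modes, channels, power_levels)]
print(f"{len(results)} permutations verified")  # 3 * 4 * 2 = 24
```

With realistic parameter spaces the count grows into the thousands or millions, which is where automated execution and automatic pass/fail reporting pay off.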
Implementing automated testing does involve an initial commitment to design. But instead of defining and executing every test step and command manually, over and over, creating an automated testing solution requires that focus only once. Upon completion, a well-thought-out automated testing program can support the verification of multiple baselines and configurations.
Just as early manufacturing was transformed by innovators who were willing to forge a new and different path to productivity, so too will those who incorporate automated testing into software development lead us squarely into the future. We need to accelerate the adoption of innovation. Forward-thinking military leaders, program managers and engineers who are still utilizing traditional methods of software testing should embrace this progressive technology.
Bernie Gauf is CEO of Innovative Defense Technologies, an information technology business headquartered in Arlington, Va., and co-author of the book Implementing Automated Software Testing (Addison-Wesley, 2009).
This article was reprinted with the permission of National Defense Magazine and the National Defense Industrial Association. October 2013. www.nationaldefensemagazine.org.