Automated Software Testing Gears

As discussed in our last blog, one of the most important factors in implementing a successful Automated Software Testing (AST) effort is knowing and understanding all of the requirements. These should be gathered and organized concisely, as they will form the basis for effective AST planning. Requirements are the blueprint that guides how an automated testing program is built, implemented and measured.

This series of blogs addresses various types of requirements to be considered when designing an effective automated testing effort. Previously, we addressed the AUT/SUT requirements. This blog will address two more areas of requirements. The complete list is as follows:

  • AUT or SUT requirements (See blog from 03.25.14.)
  • AST framework & tool requirements
  • AST data requirements
  • AST environment requirements
  • AST process requirements
  • AST resources, such as personnel, time lines and budgets

Automated Software Testing Framework and Tool Requirements

Once you understand the SUT or AUT requirements, including the AUT architecture, design and technologies, you will have a baseline with which to start identifying your AST framework and tool requirements. It is a good idea to think about this early in the process. What requirements does the AST framework need to meet in order to verify the SUT requirements? Would a tool that integrates with the AST framework work, or would a platform-independent tool be less intrusive and provide more benefits? Choosing the right tool is an important decision.

There is great variety in the types of tools currently available. Unfortunately, many automated software testing tools are platform-, technology- and/or operating-system-dependent. One vendor’s tool strength might be in the Windows desktop market, while another’s is in the browser market. Some tools support specific GUI controls, while others might support a different family of GUI controls. And so on. Despite due diligence in selecting automated testing tools, sometimes a developer’s desire to modernize a product renders a careful tool selection obsolete.

Testing generally calls for more than just GUI automated testing across platforms and OSs. It is also important to test the backend of the system, for example, the messages being sent back and forth between system interfaces. To be able to do this type of “grey box” testing, test teams need to either find another tool apart from the GUI testing tool or develop an automated test harness or script; few tools exist that provide both GUI and message-based testing features combined in one tool suite.
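As a rough illustration of the grey-box idea, the sketch below pairs a GUI-level step with a message-level check. Everything here is a stub for illustration: GuiDriver, click_submit and the ORDER_SUBMITTED message are hypothetical stand-ins, not any real tool's API, and the Queue simply stands in for the message interface between system components.

```python
import json
from queue import Queue

class GuiDriver:
    """Hypothetical wrapper around a GUI automation tool."""
    def __init__(self, bus):
        self.bus = bus

    def click_submit(self, order):
        # A real driver would exercise the GUI control; the SUT would then
        # emit a message on its interface. We simulate that emission here.
        self.bus.put(json.dumps({"type": "ORDER_SUBMITTED",
                                 "qty": order["qty"]}))

def grey_box_test():
    bus = Queue()                  # stands in for the system's message bus
    gui = GuiDriver(bus)
    gui.click_submit({"qty": 5})   # front-end step: drive the GUI
    msg = json.loads(bus.get(timeout=1))  # back-end step: inspect the message
    assert msg["type"] == "ORDER_SUBMITTED"
    assert msg["qty"] == 5
    return msg
```

The point of the pattern is that the GUI action and the resulting interface message are verified in the same test flow, rather than by two separate tools.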

According to one of IDT’s directors, Elfriede Dustin, “To avoid this ongoing automated testing tool conundrum and frustration, we need a vendor-provided automated testing tool that meets the following requirements all in one tool.”

  • is cross-platform and cross-operating-system compatible
  • is GUI-technology independent and captures the GUI the way the user sees it: as images
  • does not use a scripting language, but instead is test-model based and allows testers to create test flows via a drag-and-drop interface
  • allows testers to do automated “grey box” testing, i.e., GUI-based and backend-based testing combined
  • allows testers to create add-ons and extensions that can be shared across the testing community
  • allows testers across industry to re-use the tool when they switch jobs or programs [i]

To meet the need for the type of testing described here, IDT has developed the patented ATRT technology and methodology, which provides a comprehensive solution to the testing tool conundrum. ATRT: Test Manager offers all of the features listed above and is well worth exploring.

Framework requirements need to be identified based on the automated testing needs and tasks specific to each program. Developing an AST framework and/or implementing a tool is a software development effort in its own right and warrants a mini development lifecycle.

Automated Software Testing Data Requirements

AST inherently exercises high volumes of test data, and an effective AST strategy requires careful acquisition and preparation of that data. Various criteria need to be considered in order to derive effective test data, including, but not limited to, data depth and volume, breadth, scope, test execution data integrity and conditions.
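As one way to picture those criteria, the hedged sketch below generates data along two of them: breadth (every combination of input categories) and depth/volume (several records per combination). The account types and boundary amounts are invented for illustration, not taken from any particular program.

```python
import itertools
import random

# Hypothetical input categories and boundary values for illustration.
ACCOUNT_TYPES = ["checking", "savings"]
AMOUNT_BOUNDARIES = [0, 1, 9999, 10000]  # e.g., values around a 10,000 limit

def generate_test_data(volume_per_case=3, seed=42):
    rng = random.Random(seed)  # fixed seed keeps runs reproducible
    rows = []
    # Breadth: cover every combination of categories.
    for acct, amount in itertools.product(ACCOUNT_TYPES, AMOUNT_BOUNDARIES):
        # Depth/volume: several records per combination.
        for _ in range(volume_per_case):
            rows.append({"account_type": acct,
                         "amount": amount,
                         "txn_id": rng.randint(1, 10**6)})
    return rows

data = generate_test_data()  # 2 types x 4 boundaries x 3 records = 24 rows
```

Scaling `volume_per_case` up turns the same generator into a volume test without losing the breadth coverage.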

Next, the test team will need to consider how the data will be acquired. Will they use “live” or production data? Live data often provides more benefits, but it must be treated with caution and wiped clean of personal identity information. Alternatively, the test tool can generate the data. Regardless of the method chosen, the test team needs to verify that the test data provides the various scenarios, business logic coverage and volume needed to verify the functional and security requirements along with performance requirements.
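For the live-data case, wiping personal identity information can be as simple as replacing sensitive fields with one-way hashes before the data enters the test environment. The sketch below assumes hypothetical field names; a real effort would consult the program's own data dictionary to decide what counts as PII.

```python
import hashlib

# Illustrative list of fields to treat as personal identity information.
PII_FIELDS = {"name", "ssn", "email"}

def scrub(record):
    """Replace PII values with short one-way hashes so records remain
    distinguishable (preserving volume and scenario coverage) without
    remaining identifiable."""
    return {key: (hashlib.sha256(str(value).encode()).hexdigest()[:12]
                  if key in PII_FIELDS else value)
            for key, value in record.items()}

live = {"name": "Jane Doe", "ssn": "123-45-6789", "balance": 250.00}
safe = scrub(live)   # balance survives; name and ssn are anonymized
```

Hashing rather than deleting keeps relationships between records intact, which matters when business logic spans multiple records for the same person.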

Finally, the test team must decide whether test data should be modifiable or unmodifiable. In brief, modifiable test data can be changed without affecting the validity of the test, whereas unmodifiable test data must satisfy specific test requirement conditions and must remain unchanged for the test to stay valid.
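One illustrative way to enforce that distinction in a harness is to hand tests plain dictionaries when the data is modifiable and read-only views when it is not, so an accidental change fails fast. The function name and fields below are hypothetical.

```python
from types import MappingProxyType

def load_test_data(records, modifiable):
    """Return mutable copies for modifiable data sets, read-only
    views for unmodifiable ones."""
    if modifiable:
        return [dict(r) for r in records]          # tests may change these
    return [MappingProxyType(r) for r in records]  # any write raises TypeError

baseline = load_test_data([{"account": "A-1", "limit": 100}],
                          modifiable=False)
try:
    baseline[0]["limit"] = 200   # accidental modification fails fast
except TypeError:
    unchanged = True             # the protected record was not altered
```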

Defining the AST framework, tool and data requirements is another step on the path toward test automation. Suboptimal decisions may result in delays, cost overruns and customer dissatisfaction. An experienced automated software testing partner like Innovative Defense Technologies (IDT) can provide guidance, support and innovative testing tools for the effort. Please contact us for more information.

Please check back soon for the next blog in this series, which will address test environment requirements.

Some information taken from: Dustin, Elfriede, Thom Garrett, and Bernie Gauf. Implementing Automated Software Testing: How to Save Time and Lower Costs While Raising Quality. Upper Saddle River, NJ: Addison-Wesley, 2009. This book was authored by three current IDT employees and is a comprehensive resource on AST. Blog content may also reflect interviews with the authors.

[i]  (Article by Elfriede Dustin published in Software Test Professionals)