Automated Software Testing Gears

By Eric Sweigard—In an earlier post I offered some thoughts on why automated testing might be useful in training operators of systems that lend themselves to automated tests. I also discussed how automated test tools could perform certain training tasks more faithfully than human trainers. This post discusses how such tools might be used to assess trainee and operator performance during a controlled scenario.

The Controlled Scenario

Consider training a future commercial airline pilot to follow emergency procedures for an engine fire.  For illustration only, let’s assume the procedures are:

Initial Indications:

  1. Engine #X Fire visual alarm accompanied by wailing audible alarm

Immediate Actions:

(Note: Immediate Actions must be taken in the precise order shown, in less than 20 seconds.)

  1. Acknowledge the Engine #X Fire visual and audible alarms by selecting Engine #X Fire on the Engine Control Window
  2. Select Engine #X Emergency Fuel Trip on the Engine Control Window
  3. Confirm fuel pressure drops to zero at the Engine #X Fuel Inlet Pressure window on the Engine Control Window
  4. Select Engine #X Fire Suppressant Release on the Engine Control Window
  5. Confirm Engine #X Exhaust Temp on the Engine Control Window drops below 400 degrees Fahrenheit upon release of the fire suppressant
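
To make the scenario concrete, here is a minimal sketch of how the expected procedure might be encoded as data that an automated tool could check a trainee's actions against. The class name, step identifiers, and time-budget constant are illustrative assumptions for this post, not ATRT: Test Manager's actual format.

```python
from dataclasses import dataclass

@dataclass
class ExpectedStep:
    """One expected operator action or confirmation (illustrative only)."""
    order: int                      # required position in the Immediate Actions sequence
    action: str                     # short identifier for the control or indication
    is_confirmation: bool = False   # True if the step verifies a system response

# Hypothetical encoding of the engine-fire Immediate Actions listed above.
ENGINE_FIRE_PROCEDURE = [
    ExpectedStep(1, "ACK_ENGINE_FIRE_ALARM"),
    ExpectedStep(2, "SELECT_EMERGENCY_FUEL_TRIP"),
    ExpectedStep(3, "CONFIRM_FUEL_PRESSURE_ZERO", is_confirmation=True),
    ExpectedStep(4, "SELECT_FIRE_SUPPRESSANT_RELEASE"),
    ExpectedStep(5, "CONFIRM_EXHAUST_TEMP_BELOW_400F", is_confirmation=True),
]

TIME_BUDGET_SECONDS = 20.0  # Immediate Actions must be completed within this window
```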

The Application of Automation Technology

Automation technology, such as ATRT: Test Manager, can sweep a display for images, detect flashing and audible alarms, and compare actual screen images (or internal system messages) to expected ones. ATRT: Test Manager can also verify whether the actions required by the emergency procedures were taken and whether they were taken in the correct order. Additionally, it can calculate the time between actions and the elapsed time from emergency initiation to completion of the procedures. Using these measurements—number of correct actions, number of actions performed in the correct order, and elapsed time—we can calculate objective "measures of performance" for assessing a trainee.
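
As a rough illustration of that kind of scoring (not ATRT: Test Manager's internal logic), the sketch below derives those measures of performance from a time-stamped log of the actions a trainee actually took, reusing the hypothetical ENGINE_FIRE_PROCEDURE and TIME_BUDGET_SECONDS from the earlier sketch.

```python
from dataclasses import dataclass

@dataclass
class ObservedAction:
    """An action captured from the trainee's console, with a timestamp in seconds."""
    action: str
    timestamp: float

def score_session(observed: list[ObservedAction]) -> dict:
    """Compute illustrative measures of performance for one training run."""
    expected_order = [step.action for step in ENGINE_FIRE_PROCEDURE]
    taken = [obs.action for obs in observed]

    # Required actions that were performed at all, in any order.
    correct = sum(1 for action in expected_order if action in taken)

    # Required actions found in the correct relative order: walk the expected
    # sequence and stop at the first step that cannot be found in sequence.
    in_order = 0
    search_from = 0
    for action in expected_order:
        try:
            search_from = taken.index(action, search_from) + 1
            in_order += 1
        except ValueError:
            break

    # Elapsed time from the first to the last observed action, standing in
    # for "emergency initiation to completion of the procedures."
    elapsed = observed[-1].timestamp - observed[0].timestamp if observed else 0.0

    return {
        "correct_actions": correct,
        "correct_in_order": in_order,
        "omitted_steps": [a for a in expected_order if a not in taken],
        "elapsed_seconds": round(elapsed, 1),
        "within_time_budget": elapsed <= TIME_BUDGET_SECONDS,
    }
```

A call such as score_session([ObservedAction("ACK_ENGINE_FIRE_ALARM", 2.1), ...]) would then return the per-run numbers a trainer could use for assessment.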

In addition to collecting metrics and calculating measures of performance, ATRT: Test Manager can identify which procedure steps take the longest to perform and which are most often omitted or performed out of order, and it can run trend analysis from one training session to the next. It can also provide an instant replay of the operator's actions to reinforce where the trainee needs to focus his or her attention during such emergencies. All of this objective evidence helps trainers assess the effectiveness of their own training programs. Finally, this simple procedure could be combined with countless others to build a more complex, more stressful training scenario.
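
The session-to-session trend analysis mentioned above could be as simple as aggregating those per-session scores. The sketch below, again a hypothetical illustration rather than the product's reporting feature, shows one way to surface the most frequently omitted steps and the change in elapsed time across a trainee's sessions.

```python
from collections import Counter

def trend_report(sessions: list[dict]) -> dict:
    """Summarize several score_session results: omitted steps and timing trend."""
    omitted = Counter()
    for session in sessions:
        for step in session.get("omitted_steps", []):
            omitted[step] += 1

    times = [s["elapsed_seconds"] for s in sessions]
    return {
        "most_often_omitted": omitted.most_common(3),   # top three missed steps
        "elapsed_trend": times,                         # one entry per session, in order
        # Positive value means the trainee finished faster in the latest session.
        "improvement_seconds": times[0] - times[-1] if len(times) > 1 else 0.0,
    }
```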

The Bottom Line

Automation technology today makes it possible to support complex interactions with Systems Under Test and those systems' operators. Being able to measure objectively, accurately, and consistently how a trainee is performing lends integrity to the training process and can augment human instructors, especially where human capabilities are limited.

If you would like to know more about the kinds of automated testing software that could readily assume an automated training role, please visit our website and learn more about Automated Test and ReTest (ATRT) technologies and processes.

Eric Sweigard is a Program Manager for Innovative Defense Technologies (IDT), where he oversees Aegis Combat System testing.  His previous experience includes 28 years as a Navy Officer.