By Eric Sweigard. As a former Navy ship captain and a current program manager for automated testing projects at IDT, I couldn't help but wonder whether the power of automation that improves the testing of software-driven systems (like the systems I had on my ships) might also be useful in training the operators of those same systems. I'm not the only one to make that connection, either. During numerous demonstrations of automated testing, audience members have suggested similar applications for automation. Clearly, if we can automate a test on a GUI-based system, it is not too big a stretch to envision using a similar approach to automatically observe, and provide feedback to, a user in training.

Can Automation Really Replace Human Trainers?

First, let's address the natural reaction of many who will ask, "Are you really proposing that a piece of software could replace experienced, hands-on trainers?" Well, not really. What I am suggesting is that technology, specifically the kinds of technology we use to automate GUI-based testing, might be able to augment training so that trainees benefit from extra practice time with a 24/7, lower-cost automated trainer.

For example, consider training a sailor to respond properly to a system alarm on the control console of a piece of software-driven machinery. On Navy ships, just as in commercial airline cockpits and nuclear power plants, when an alarm condition occurs on the system being operated, the operator (sailor, pilot, engineer) must execute a predetermined set of responses in the precise order specified, and must do so quickly.
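To make this concrete, such a procedure can be represented as ordered data that an automated trainer could check a trainee's actions against. Below is a minimal Python sketch of that idea; the alarm name, step names, and time limit are hypothetical illustrations, not drawn from any actual Navy procedure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ResponseProcedure:
    """An ordered alarm-response procedure with a completion deadline."""
    alarm: str
    steps: tuple        # required operator actions, in the required order
    max_seconds: float  # the whole sequence must finish within this window

# Hypothetical example procedure -- the names below are illustrative only.
high_temp_procedure = ResponseProcedure(
    alarm="HIGH_TEMP",
    steps=("acknowledge_alarm", "reduce_load",
           "open_coolant_valve", "log_event"),
    max_seconds=30.0,
)
```

Encoding the procedure as data, rather than leaving it implicit in a trainer's head, is what makes automated observation and scoring possible in the first place.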

With a hands-on human trainer, the trainee is presented with alarm conditions and the trainer observes as the trainee performs, or fails to perform, the predetermined set of responses. The trainer then assesses the trainee's performance upon completion of the response. This approach has some limitations, namely:

  • An experienced trainer must be available to conduct the training;
  • The trainer must be able to observe, assess and accurately record the trainee’s performance of the required steps;
  • The trainer must be able to convince the trainee that what was recorded actually happened;
  • The trainer ought to have a way to record the performance and compare successive sessions for trends of improvement or lack thereof.

With a human trainer, this is easier said than done. During training sessions, many actions often occur at once, and it is easy to miss one of the trainee's actions, correct or otherwise. If the trainer says a trainee acted properly when the trainee in fact performed a step out of sequence but arrived at the correct outcome despite that error, then training has been compromised: the operator may come away with a misunderstanding of the procedure, unintentionally reinforced by the trainer.

Can Automation Augment Hands-on Training?

With automation technology, a trainee's steps will not be missed, nor will the sequence of those steps be incorrectly assessed. Automation would enable a replay of the exact actions conducted by the tester or trainee to support assessments, including performance over time, and provide verifiable evidence that the assessments are, in fact, accurate. Our experience shows that automation technology can increase training effectiveness at lower cost compared to the way training is typically conducted today.
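As a sketch of how such an automated assessment might work, consider comparing a recorded log of trainee actions against the required ordered steps. Everything here (the function name, the step names, the log format of timestamped actions) is a hypothetical illustration under my own assumptions, not IDT's actual ATRT implementation.

```python
def assess(expected_steps, recorded, max_seconds):
    """Score a recorded action log against the required ordered steps.

    expected_steps: the required actions, in the required order.
    recorded: list of (seconds_since_alarm, action) tuples, as a GUI
              automation tool might capture them.
    Returns a dict naming missed steps, out-of-order steps, and whether
    the sequence finished within the time limit.
    """
    actions = [action for _, action in recorded]
    missed = [s for s in expected_steps if s not in actions]

    # A step is "out of order" if it was performed before a step that
    # should have preceded it.
    positions = {a: i for i, a in enumerate(actions)}
    out_of_order = []
    last_pos = -1
    for step in expected_steps:
        if step in positions:
            if positions[step] < last_pos:
                out_of_order.append(step)
            else:
                last_pos = positions[step]

    elapsed = recorded[-1][0] if recorded else 0.0
    on_time = elapsed <= max_seconds
    return {
        "missed": missed,
        "out_of_order": out_of_order,
        "on_time": on_time,
        "passed": not missed and not out_of_order and on_time,
    }

# The trainee performs the third step before the second -- the checker
# flags it even though the final outcome was correct, which is exactly
# the case a busy human trainer can miss.
log = [(2.0, "acknowledge_alarm"), (6.0, "open_coolant_valve"),
       (9.0, "reduce_load"), (12.0, "log_event")]
result = assess(("acknowledge_alarm", "reduce_load",
                 "open_coolant_valve", "log_event"), log, 30.0)
```

Because the log itself is retained, every assessment can be replayed and re-scored, which is what makes the feedback verifiable rather than a matter of recollection.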

How that transition from automated testing to automated training occurs will be the subject of a subsequent post. Thanks for reading, and if you would like to know more about the kinds of automated testing software that could readily assume an automated training role, please visit www.idtus.com and learn more about IDT's Automated Test and ReTest (ATRT) technologies and processes.

Eric Sweigard is a Program Manager for Innovative Defense Technologies (IDT), where he oversees Aegis Combat System testing.  His previous experience includes 28 years as a Navy Officer.