Background:
In this effort, Innovative Defense Technologies (IDT) focused on using automation to support the detailed data analysis and event reconstruction conducted on a system after a major test event was completed. Over the years, as the system became more complex, the amount of data collected and analyzed grew by more than a factor of ten. As a result, the historical approach of manually evaluating results became increasingly manpower- and time-intensive, and only a limited sample of the data was actually evaluated.
Strategy:
An automated test strategy was developed based on a review of the amount of data collected, the analysis tasks performed, the time and manpower required to complete each phase of the analysis, and the reporting requirements. The strategy identified the analysis tasks that were repeated most often, were the most time-consuming, and required the most manpower. It also identified the data that was most important to evaluate in full because of the critical nature of the associated functionality.
Approach:
IDT’s approach to automated data analysis centered on three key areas:
(1) Being able to describe the analysis in a format that was easy to understand, easy to maintain, and traceable to the system requirements
We chose to describe the analysis cases in SysML and were able to import the model information, including requirements, into Automated Test and ReTest (ATRT): Analysis Manager (a simplified illustration appears in the first sketch following this list).
(2) Being able to read, manage, and sort the large amount of data collected during test events
The system being evaluated already recorded all the data necessary to meet the reporting requirements, and the customer had already developed software to read and parse the collected data. As a result, IDT only needed to develop an interface from the existing data reader to the database used within ATRT: Analysis Manager (see the second sketch following this list).
(3) Being able to provide a report format that would enable the analysts to quickly identify any analysis cases that did not pass, the requirements associated with those cases, and the information needed to begin root cause analysis
ATRT: Analysis Manager was designed to provide a flexible reporting format that can be tailored to meet the unique needs of each user. This flexibility is also found in ATRT: Test Manager, another product developed for automated software testing. After meeting with the analysts to define the detailed information they wanted in the reports, we were able to easily update the available reports to meet their requirements (see the third sketch following this list).
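To make the traceability idea in item (1) concrete, the first sketch below shows one way requirement links could be pulled out of a simplified SysML model export. The XML element names, the "verifies" attribute, and the file name are assumptions made for illustration only; the actual SysML/XMI export format and the ATRT: Analysis Manager import mechanism are not shown here.

# Minimal sketch: extract requirement traceability from a simplified model export.
# The <requirement> and <analysisCase> elements and the "verifies" attribute are
# hypothetical simplifications, not the real SysML/XMI or ATRT import format.
import xml.etree.ElementTree as ET


def load_traceability(model_path):
    """Return (requirements, {analysis_case_id: [requirement_id, ...]})."""
    root = ET.parse(model_path).getroot()

    requirements = {
        req.get("id"): req.get("text", "")
        for req in root.iter("requirement")
    }

    traceability = {}
    for case in root.iter("analysisCase"):
        linked = case.get("verifies", "").split()
        # Keep only links that resolve to a known requirement.
        traceability[case.get("id")] = [r for r in linked if r in requirements]
    return requirements, traceability


if __name__ == "__main__":
    reqs, trace = load_traceability("analysis_model.xml")
    for case_id, req_ids in trace.items():
        print(f"{case_id} -> {', '.join(req_ids) or 'NO REQUIREMENT LINKED'}")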
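For item (2), the second sketch illustrates the adapter concept: records produced by an existing reader are loaded into a database for later querying. SQLite stands in for the actual ATRT: Analysis Manager data store, and read_records() is a placeholder for the customer's existing parsing software; both are assumptions for illustration, not the deployed interface.

# Minimal sketch: adapt an existing data reader to a database for analysis.
# SQLite is a stand-in for the ATRT: Analysis Manager data store, and
# read_records() is a placeholder for the customer-provided reader/parser.
import sqlite3


def read_records(recording_path):
    """Placeholder for the existing reader; yields parsed event records."""
    # In the real system this would decode the recorded test-event data.
    yield {"timestamp": 0.000, "source": "TRACKER", "message": "TRACK_UPDATE", "status": "OK"}
    yield {"timestamp": 0.125, "source": "WEAPON", "message": "ENGAGE_CMD", "status": "OK"}


def load_into_db(recording_path, db_path="analysis.db"):
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS events (
               timestamp REAL, source TEXT, message TEXT, status TEXT)"""
    )
    with conn:  # commit all inserted rows as one transaction
        conn.executemany(
            "INSERT INTO events VALUES (:timestamp, :source, :message, :status)",
            read_records(recording_path),
        )
    conn.close()


if __name__ == "__main__":
    load_into_db("test_event_001.rec")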
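For item (3), the third sketch shows the kind of report the analysts asked for: only the failed analysis cases, the requirements they trace to, and a pointer into the recorded data where root cause analysis can begin. The record structure and field names are hypothetical; actual ATRT: Analysis Manager reports are configured within the tool itself.

# Minimal sketch: report only the failed analysis cases, with their linked
# requirements and a starting point for root cause analysis. Field names and
# sample data are hypothetical.
from dataclasses import dataclass, field


@dataclass
class CaseResult:
    case_id: str
    passed: bool
    requirements: list = field(default_factory=list)
    first_failure: str = ""  # e.g. timestamp/message of the first failed check


def failed_case_report(results):
    lines = ["FAILED ANALYSIS CASES", "---------------------"]
    for r in results:
        if r.passed:
            continue
        lines.append(f"{r.case_id}  requirements: {', '.join(r.requirements) or 'none'}")
        lines.append(f"    first failing check: {r.first_failure}")
    if len(lines) == 2:
        lines.append("(none - all analysis cases passed)")
    return "\n".join(lines)


if __name__ == "__main__":
    results = [
        CaseResult("AC-042", True, ["REQ-101"]),
        CaseResult("AC-057", False, ["REQ-117", "REQ-118"], "t=12.375s ENGAGE_CMD missing"),
    ]
    print(failed_case_report(results))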
Conclusion:
Following the strategy IDT outlined and deployed using ATRT, the customer achieved a 77% reduction in time and manpower compared to the historical manual analysis approach. In addition, and even more important for this particular customer, the strategy provided the capability to evaluate all of the data collected while tracing the results back to the requirements more consistently and objectively. The net result is higher confidence in the understanding of the system's performance before final acceptance and deployment.