NAVSEA – Public Release 2015-0262 – Statement A: “Approved for release; distribution is unlimited.”
The US Naval Sea Systems Command (NAVSEA) oversees hundreds of complex, software-driven systems that use leading-edge technology to defend our country and protect our Sailors, Airmen, and Marines. NAVSEA’s Program Executive Office for Integrated Warfare Systems (PEO IWS) directs the combat technologies and systems for surface ships and submarines. One such system is the AEGIS Weapon System, a multi-mission integrated weapon system. This complex system, managed by PEO IWS 1.0, must complete a rigorous test and certification process—including the verification of tens of thousands of requirements and millions of lines of software—to demonstrate that the system is suitable for deployment.
The AEGIS system is developed and tested using a methodology in which new capabilities are added and verified incrementally. Therefore, each new capability build includes testing of new functionality as well as regression testing of functionality from earlier builds, leading up to final test and certification events. To support this comprehensive test program, system testing is conducted across multiple test sites by both contractor and government-led test teams.
PEO IWS 1.0’s interests were to:
- Increase testing efficiency to enable more rapid fielding of AEGIS Weapon System capabilities
- Enable more efficient sharing of test and analysis results among various test teams
- Improve software quality through more thorough testing, thereby reducing the risk and cost of latent software defects
Innovative Defense Technologies (IDT) worked with PEO IWS 1.0 to develop an automated test capability that supports these objectives. A review of current testing processes indicated that applying IDT’s patented Automated Test and ReTest (ATRT) methodology for testing and analysis would yield significant gains. The strategy included two major components:
- Automate the data analysis required to verify critical system and software requirements. Hundreds of analysis cases were being evaluated for each new capability build and certification event to verify critical system and software requirements, along with key performance measures. The analysis was a manual, labor-intensive process and consequently consumed a significant portion of the overall test and certification timeline. In addition, due to time and manpower limitations, the analysis could be conducted on only a subset of the available data. Employing ATRT to automate these analysis cases would not only be more efficient in time and manpower but would also enable all of the data to be evaluated. The result would be increased efficiency, test coverage, and consistency.
- Reuse test and analysis cases and share the results among the test teams responsible for the system. Testing for the AEGIS system is distributed across multiple test teams at different locations and includes acceptance testing, certification testing, and ship qualification/operational testing. Each team has its own set of equipment, areas of expertise, and test objectives; however, there is significant overlap in the testing and analysis required. Automation with ATRT would enable distributed test teams to effectively share test and analysis cases along with the associated results. Consequently, for any test run at any test site, including at sea, all testers and analysts would be able to leverage the knowledge gained from tests conducted at other sites.
NAVSEA PEO IWS 1.0 has successfully employed this strategy with ATRT and realized the following results:
- Increased testing efficiency for those critical system and software requirements where automation was applied. As a result, significantly less time and manpower are required to verify the associated requirements and system performance.
- Improved collaboration among test teams. The application of ATRT facilitated efficient sharing of analysis cases among the various AEGIS test entities. As a result, each test team gained the ability to conduct more thorough analysis at each testing stage.
- Improved software quality and reduced risk. Automation has increased requirements coverage and expanded the amount of data that can be evaluated to assess system performance. Additionally, the sharing of analysis methods among test teams has enhanced defect resolution.
Download a PDF of IDT’s NAVSEA Case Study
Contact IDT for more information.