
5 Steps to Using MBSE to Enhance Automated Software Testing

By Phil Monte. There are many challenges inherent in the deployment of large, complex software systems and Systems of Systems (SoSs). To address the intricacies of the design, development, testing, integration, and maintenance of these software systems, IDT has developed a technical approach that combines Automated Software Testing with Model-Based Systems Engineering (MBSE) and Development (MBSD). "MBSE is the formalized application of modeling to support systems requirements, design, analysis, verification, and validation activities beginning in the conceptual design phase and continuing throughout development and later life cycle phases."[1]

The MBSE approach enhances specification and design quality, promotes reuse of system specifications, components, and design artifacts, and improves communication within the development team. Typically, when implementing MBSE/MBSD, all relationships and requirements throughout the entire system lifecycle are centrally managed in a system modeling tool. System requirements are linked directly to system Use Cases, or behavioral threads, in the system behavioral model.
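Because the model's requirement set mirrors an external requirements management tool, the model must be kept synchronized so that linked Use Cases can be flagged when a specification changes. A minimal sketch of that synchronization check, in plain Python with illustrative requirement IDs (no specific modeling tool's API is assumed):

```python
def sync_requirements(model_reqs, external_reqs):
    """Compare the model's requirement snapshot against the external
    tool's export and report what changed, so Use Cases linked to the
    affected requirements can be flagged for re-verification."""
    added = [rid for rid in external_reqs if rid not in model_reqs]
    removed = [rid for rid in model_reqs if rid not in external_reqs]
    changed = [rid for rid in external_reqs
               if rid in model_reqs and model_reqs[rid] != external_reqs[rid]]
    return {"added": added, "removed": removed, "changed": changed}

# Hypothetical requirement IDs and text, for illustration only.
model = {"SYS-001": "Accept track data.",
         "SYS-002": "Display tracks."}
external = {"SYS-001": "Accept track data.",
            "SYS-002": "Display tracks within 2 s.",
            "SYS-003": "Log all track updates."}

print(sync_requirements(model, external))
# -> {'added': ['SYS-003'], 'removed': [], 'changed': ['SYS-002']}
```

The diff drives the "continuous synchronization" described above: anything in the `changed` or `added` lists triggers an update of the model and of the automated tests traced to those requirements.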

To develop a system model that maps system requirements to behaviors to executable test flows, IDT has defined a five-step process, which has been used extensively and successfully across many programs. These steps are as follows:

  1. Requirements Management Process: System requirements are managed externally to the system model in a requirements management tool. The requirements can be imported directly into the system model and continuously synchronized with the external requirements management tool. This preserves traceability through specification changes and updates, which is critical for automated testing.
  2. Develop a System-Level Functional Decomposition: A large system's behavior is typically complex and can have thousands of unique paths that need to be tested. To model these behaviors, a functional decomposition is necessary. Each Functional Use Case is a compartmentalized functional system behavior within a given scenario. Groups of these Functional Use Cases (referred to as scenarios) roll up to define the system behavior in its entirety.
  3. Decompose Each Functional Use Case into a Sequence of Actions: A Functional Use Case consists of a sequence of actions that describe the given function's behavior. Groups of these actions define a function, and groups of these functions define a behavioral thread.
  4. Map Requirements to Functional Use Case Actions: With actions defined for a given Functional Use Case and the system requirements stored in the model, the requirements can now be linked to the specific actions they pertain to, depending on their associated Use Case(s).
  5. Develop a Set of Test Flows for Each Scenario: Test Flows can now be derived from the full system behavioral model, each linked to its corresponding Use Case(s).
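The decomposition in steps 2 through 5 can be sketched as a small data model: scenarios group Functional Use Cases, Use Cases order actions, actions carry the requirements mapped to them, and a test flow falls out by walking the structure. This is an illustrative Python sketch with hypothetical requirement and Use Case names, not IDT's actual model schema:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    text: str

@dataclass
class Action:
    name: str
    requirements: list  # step 4: requirements mapped to this action

@dataclass
class FunctionalUseCase:
    name: str
    actions: list  # step 3: ordered sequence of actions

@dataclass
class Scenario:
    name: str
    use_cases: list  # step 2: group of Functional Use Cases

def derive_test_flow(scenario):
    """Step 5: flatten a scenario into an ordered test flow, carrying
    requirement IDs so every test step traces back to a requirement."""
    flow = []
    for uc in scenario.use_cases:
        for action in uc.actions:
            flow.append((uc.name, action.name,
                         [r.req_id for r in action.requirements]))
    return flow

# Hypothetical example data:
r1 = Requirement("SYS-001", "The system shall accept track messages.")
r2 = Requirement("SYS-002", "The system shall display tracks within 2 s.")
uc = FunctionalUseCase("Process Track", [
    Action("Receive track message", [r1]),
    Action("Render track on display", [r2]),
])
print(derive_test_flow(Scenario("Track Handling", [uc])))
# -> [('Process Track', 'Receive track message', ['SYS-001']),
#     ('Process Track', 'Render track on display', ['SYS-002'])]
```

Each tuple in the derived flow pairs a test step with its requirement IDs, which is what allows a test run to report requirement verification status directly.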

The resulting test flows integrate into our Automated Test and ReTest (ATRT) technology suite, providing full performance and requirement verification, along with requirement analysis, synthesis, decomposition, allocation, and tracking.

IDT’s approach combines Model-Based Systems Engineering and Development, Automated Software Testing, Continuous Integration and Virtualization to provide customers with a comprehensive capability to efficiently manage, certify, deliver, support and extend mission-critical systems. For more details and information on how this approach can help your organization, contact Innovative Defense Technologies.

Phil Monte is a Senior Systems Engineer at IDT. He spends most of his time focused on engineering automated test solutions for the Navy’s Submarine Programs.

[1] International Council on Systems Engineering (INCOSE), Systems Engineering Vision 2020, Version 2.03, TP-2004-004-02, September 2007.