**2. MS&A approach supporting defense acquisition life cycle**

The existing US DoD acquisition life cycle employs capability-based acquisition and has three key milestones, namely, Milestone A, Milestone B and Milestone C, which correspond to (i) the analysis of alternatives (AoA) and technology development phase, (ii) the system and prototype development and demonstration phase, and (iii) the production and deployment phase, respectively [4]. For the US Air Force and Space Force, the MS&A domains4 supporting the three key milestones and associated phases can be defined as operation, training, test and evaluation, acquisition, analysis, education, experimentation and war-gaming exercise. Based on the identified MS&A domains, **Figure 2** proposes an advanced MS&A framework and associated processes supporting the US DoD defense acquisition life cycle. The proposed framework defines (i) the input in terms of a systems-of-systems CONOPS meeting warfighter needs and stakeholders' inputs and requirements, (ii) the output in terms of data products, and (iii) the related MS&A components supporting the DoD acquisition life cycle. Following is a high-level description of the proposed framework, including four phases with the required MS&A models and tools and related processes supporting the US DoD acquisition life cycle [4]: (i) Phase 1: systems-of-systems architecture trade study supporting pre- and post-Milestone A – Addresses the MS&A tools required for architecture refinement, evolution and spiral development planning, capability gap analysis, system solution development, and conversion of capabilities into system

<sup>4</sup> Depending on the warfighter's needs, the M&S domains can be different.

#### **Figure 2.**

requirements – This chapter focuses on systems-of-systems architecture design and analysis for the development of a cost-effective acquisition strategy and the acquisition of optimum reference architecture solutions; (ii) Phase 2: systems-of-systems integration and operation support at pre-Milestone B – Addresses MS&A models and tools for assessing and evaluating incompatibility testing and operation; (iii) Phase 3: systems-of-systems testing and fielding support at post-Milestone B – Addresses MS&A models and tools for evaluating hardware and software testing and fielding; and (iv) Phase 4: training and sustainment at pre- and post-Milestone C – Addresses models and tools for supporting on/off-line training and sustainment evaluation.

The proposed MS&A framework, including processes, models and tools, presented in **Figure 2** can support the US DoD acquisition life cycle. The framework allows USG stakeholders to incorporate their needs from a systems-of-systems perspective, taking into account (i) the warfighter's needs (CONOPS and mission threads), (ii) the MS&A domains (e.g., operation, training, test and evaluation, acquisition, analysis, education, experimentation and war-gaming), (iii) JCIDS analyses and the US DoD Defense Acquisition Guide (DAG) process [2–4], (iv) DoD Joint Tactical Architecture (JTA) MS&A standards, (v) the USG stakeholder M&S strategic plan (e.g., [8]), and (vi) the USG stakeholder's goals, scopes, objectives, Statement of Objectives (SOO), and System Engineering Plan (SEP).

## **3. MS&A approach supporting architecture design and analysis**

The proposed systems-of-systems MS&A approach presented in this section addresses the system architecture design and analysis for Phase 1 at pre- and post-Milestone A with an emphasis on achieving integrated capabilities at the enterprise level. As discussed in Section 1, US DoD has been using a capability-based approach for developing government reference system architecture (GRA) solutions that are used for generating an optimum acquisition strategy to select the appropriate contractor(s)

*MS&A framework supporting US DoD defense acquisition life cycle.*

#### *Systems-of-Systems MS&A for Complex Systems, Gaming and Decision for Space Systems DOI: http://dx.doi.org/10.5772/intechopen.100007*

for system acquisition. To avoid potential stovepipe GRA solutions, the capability-based approach allows (i) the USG team to develop the desired GRA solution(s) in terms of high-level required capabilities that are independent of technologies, and (ii) the selected contractor to decide what technology enablers (TEs)5 should be used to meet the required capabilities dictated by the GRA. Based on the choices of TEs, the selected contractor defines the system trade space and derives the Level-A specification. From the USG perspective, at pre-Milestone A the system architecture trade space is not well defined6 since the choices of TEs are not yet available for the system architect to perform the architecture design trade study, making the search for an optimum GRA solution very challenging. This section discusses how MS&A models and tools should be developed to support the system architecture trade study at (i) pre-Milestone A, where the USG team is responsible for the trade study to generate a reference architecture solution for the development of an optimum acquisition strategy, and (ii) post-Milestone A, where a contractor (or multiple contractors) is (are) selected to work with the USG team to refine the reference solution and develop the associated CDD at pre-Milestone B.

Practically, at pre-Milestone A, a contractor has not been selected7 and the USG team obtains the required inputs from warfighter needs and associated stakeholders' requirements in terms of the desired systems-of-systems Enterprise (SOSE) CONOPS and associated mission threads for the system to be acquired. It is assumed at this stage that the Capability-Based Analysis (CBA) was completed, that the capability gaps were identified, and that the potential capability solutions for the identified gaps were documented in the preliminary Initial Capability8 Document (ICD). From the USG's perspective, the USG team's objective is two-fold, namely, (i) to develop an optimum reference architecture solution meeting warfighter needs with an affordable cost and deployment schedule, and (ii) to finalize the ICD for post-Milestone A. The goal of the MS&A models and tools for this phase is to support the USG team in achieving these objectives. The key challenge for pre-Milestone A is the lack of a clearly defined system architecture design trade space due to the intent9 of the capability-based approach. **Figure 3** presents a MS&A approach to address this challenge and support key Milestone A activities, including pre- and post-Milestone A activities. As shown in **Figure 3**, Sections 3.1 and 3.2 discuss systems-of-systems MS&A approaches for pre-Milestone A and post-Milestone A, respectively.

#### **3.1 MS&A approach for pre-milestone A**

This section focuses on the systems-of-systems MS&A approach for the pre-award phase at pre-Milestone A. As mentioned earlier, at pre-Milestone A a contractor has not been selected, and the USG team is responsible for developing the reference system architecture with associated program and technical risks. **Figure 3** shows that there are four key pre-Milestone A activities requiring MS&A support: (i) SOSE CONOPS assessment for identifying SOSE architecture solutions and generating corresponding alternative system architecture solutions, (ii) system architecture assessment and trade study for selecting optimum system architecture solutions, (iii) acquisition strategy development and optimization, and (iv) pre-award risk

<sup>5</sup> TE is a specific technology solution meeting a required capability alone or in combination with other TEs.

<sup>6</sup> The ICD captures the CBA results identifying the desired capabilities; the system trade space is not defined until the contract is awarded.

<sup>7</sup> Pre-Milestone A is the pre-award phase, and post-Milestone A is the post-award phase.

<sup>8</sup> Capability is defined as an ability that a system has, which fulfills a warfighter need, for example, the ability to manage satellite trajectories or to disseminate mission data to users via video streaming.

<sup>9</sup> A contractor will be selected to define the trade space for the system design and build.

#### **Figure 3.**

*MS&A framework and processes supporting milestone A.*

assessment. This section discusses MS&A models and tools for supporting these four key activities. Sections 3.1.1, 3.1.2, and 3.1.3 describe SOSE MS&A approaches for pre-Milestone A activities, including SOSE CONOPS assessment, system architecture assessment, and pre-award risk assessment, respectively. As shown in **Figure 3**, the SOSE MS&A approach for the acquisition strategy development and optimization will be addressed in Section 4.

### *3.1.1 Approach for SOSE CONOPS assessment in pre-milestone A*

The MS&A approach for SOSE CONOPS assessment discussed in this subsection is derived from [9, 10] with an emphasis on the design and build of a new space system that can be deployed in a complex space SOSE. The complex space SOSE can be assumed to have three families of systems (FOSs), namely, a FOS of communications satellites, a FOS of sensing satellites and a FOS of position-navigation-and-timing (PNT) satellites. This section describes a MS&A approach to the design and build of a new space system in this complex space SOSE environment.

The proposed MS&A approach employs advanced orbital mathematical and complex space systems simulation models for the assessment of a pre-defined SOSE CONOPS to identify the alternative systems-of-systems architecture solutions meeting warfighter and stakeholder needs [9]. This approach allows the system architecture solution to be optimized within a selected set of alternative systems-of-systems architectures using appropriate APAs and KPPs. Recently, USG has been using the "Resilience" attribute for assessing and optimizing SOSE CONOPS performance [11–14]. The Resilience attribute encompasses avoidance, robustness, reconstitution and recovery. Practically, the Resilience Capacity (RC) metric is defined as the system resilience against an adversary threat; RC is a value representing the fraction of system capability that is retained after the recovery and reconstitution steps. Mathematically, RC is a function of:

• Avoidance - R<sub>AV</sub>: a measure of how likely it is that the threat can be fully avoided,

• Robustness - R<sub>RO</sub>: a measure of how well the system withstands a threat that is not avoided,

• Recovery - R<sub>RV</sub>: a measure of how likely it is that the system recovers its capability after the threat, and

• Reconstitution - R<sub>RC</sub>: a measure of how likely it is that any remaining lost capability can be reconstituted.



Mathematically, RC can be expressed as follows [14]:

$$\text{RC} = R_{AV} + (1 - R_{AV})R_{RO} + (1 - R_{AV})(1 - R_{RO})R_{RV} + (1 - R_{AV})(1 - R_{RO})(1 - R_{RV})R_{RC} \tag{1}$$
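To make the layered structure of Eq. (1) concrete, the following sketch computes RC from hypothetical component values; each successive term applies only to the fraction of threat events not already handled by the preceding resilience layers:

```python
def resilience_capacity(r_av: float, r_ro: float, r_rv: float, r_rc: float) -> float:
    """Compute Resilience Capacity (RC) per Eq. (1).

    Each layer contributes only for the fraction of threat events
    not already handled by the preceding layers.
    """
    return (r_av
            + (1 - r_av) * r_ro
            + (1 - r_av) * (1 - r_ro) * r_rv
            + (1 - r_av) * (1 - r_ro) * (1 - r_rv) * r_rc)

# Hypothetical illustration: avoidance handles 20% of threat events
# outright; each later layer handles half of what reaches it.
rc = resilience_capacity(0.2, 0.5, 0.5, 0.5)
print(round(rc, 3))  # 0.9
```

Note that RC stays in [0, 1] whenever each component value is in [0, 1], and RC = 1 whenever any single layer is perfect.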

For defense space applications, the most pronounced threat is radio frequency interference (RFI) from both friendly and unfriendly sources. Thus, R<sub>AV</sub>, R<sub>RO</sub>, R<sub>RV</sub>, and R<sub>RC</sub> can be defined in terms of the SOSE architecture10 performance.


The SOSE network score is used to assess and evaluate the SOSE network states. It is calculated as the number of communication pairings (e.g., Ground Terminal 1 connected to Satellite 1) possible in the current state divided by the number of pairings possible in an ideal state, i.e., the probability that two arbitrary SOSE network nodes can communicate or connect with each other. Thus, the SOSE network score is defined as:

$$\text{SOSE Network Score} = \frac{\sum_{i=1}^{n} \binom{l_i}{2}}{\binom{N}{2}} \tag{2}$$

where $l_i$ is the number of nodes in fragmented network $i$, $n$ is the total number of fragmented networks, $N$ is the total number of network nodes, and $\binom{l_i}{2}$ is the binomial coefficient.
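As a minimal illustration of Eq. (2), the sketch below computes the network score directly from the fragment sizes; the function name and example values are illustrative:

```python
from math import comb

def sose_network_score(fragment_sizes: list[int]) -> float:
    """Eq. (2): pairings possible in the current (fragmented) state
    divided by pairings possible in the ideal fully connected state."""
    n_nodes = sum(fragment_sizes)                     # N: total network nodes
    current = sum(comb(l, 2) for l in fragment_sizes)  # pairings within fragments
    ideal = comb(n_nodes, 2)                           # pairings in ideal state
    return current / ideal

# A 5-node network split into fragments of 3 and 2 nodes:
# (C(3,2) + C(2,2)) / C(5,2) = (3 + 1) / 10
print(sose_network_score([3, 2]))  # 0.4
```

A single fragment containing all nodes yields a score of 1.0, while fully isolated nodes yield 0.0, matching the probabilistic interpretation above.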

The RC and SOSE network score models are used to evaluate and assess the SOSE communications LM and SOSE network availability [9, 10]. In addition to the RC model,

<sup>10</sup> SOSE architecture consists of families of space systems (FOS) and FOS are connected by communications datalinks. A datalink connects two system nodes, and the nodes are connected when the communications datalink maintains a specified link margin (LM).

[9, 10] also recommended two additional models that are very useful in the SOSE CONOPS assessment, namely, the Resilience Assessment Index (RAI) and the Spectrum Resiliency Assessment Index (SRAI). The RAI model is used to generate a "Heat-Map" identifying areas impacted by RFI threats and the associated reconstitution quality (R<sub>RC</sub>). The SRAI model generates a "Heat-Map" showing the likelihood that a space system can access the allocated frequency band in the presence of RFI events. A description of the RAI and SRAI models is provided in [10].

**Figure 4** describes an advanced MS&A approach with the desired simulation models for SOSE CONOPS assessment. Based on warfighter needs and related stakeholder inputs, a SOSE CONOPS can be developed using the required SOSE databases. The required SOSE databases include the practical operational systems that can impact the pre-defined SOSE CONOPS. The pre-defined SOSE CONOPS focuses on defense space systems and the defense space enterprise, together with the operational space systems that can impact defense space enterprise operations, including civilian and commercial space enterprises. In addition to the RAI-RFI, SRAI and RC models, additional mathematical and simulation models are required to perform the SOSE CONOPS assessment, including SOSE orbital analysis and simulation, dynamic LM calculation, satellite performance, and avoidance models.

The SOSE orbital-analysis-and-simulation model simulates the RFI threats and the dynamics of the space systems of interest, providing accurate SOSE network nodes, associated positions, and network node connectivity. The dynamic-LM-calculation model simulates and evaluates the link margins of SOSE communications links among SOSE network nodes and calculates the network score. The satellite-performance model simulates and evaluates satellite system performance, including signal-to-noise ratio (SNR) calculation and processing-time estimation for assessing the recovery time from the threat. The avoidance model simulates space system threat avoidance techniques, including antenna beam nulling and adaptive modulation-and-coding, to assess and evaluate whether the RFI threats can be avoided (**Figure 4**).
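Since two nodes are considered connected only when the communications datalink maintains a specified link margin, the fragmented networks that feed the network score can be derived from pairwise LM results. A minimal sketch, assuming hypothetical node names, LM values, and a 3 dB threshold:

```python
from collections import defaultdict

def network_fragments(link_margins_db, nodes, threshold_db=3.0):
    """Group nodes into fragmented networks: two nodes belong to the same
    fragment when a path of links with LM >= threshold connects them."""
    adj = defaultdict(set)
    for (a, b), lm in link_margins_db.items():
        if lm >= threshold_db:
            adj[a].add(b)
            adj[b].add(a)
    seen, fragments = set(), []
    for start in nodes:                 # depth-first search per component
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        fragments.append(comp)
    return fragments

# Hypothetical LM results: the GT1-SAT2 link has dropped below margin.
lms = {("GT1", "SAT1"): 6.2, ("SAT1", "SAT2"): 4.8, ("GT1", "SAT2"): 1.1,
       ("GT2", "SAT3"): 5.0}
frags = network_fragments(lms, ["GT1", "GT2", "SAT1", "SAT2", "SAT3"])
print(sorted(len(f) for f in frags))  # [2, 3]
```

The resulting fragment sizes can be passed directly to the Eq. (2) network score.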

**Figure 4.** *MS&A approach for SOSE CONOPS assessment.*


As described in **Figure 4**, the inputs to the SOSE CONOPS MS&A models and tools are the required warfighter capabilities, RFI threats and threat sources, operational use cases, and operational constraints. The MS&A output is a set of optimum (or best) alternative architectures based on a pre-defined SOSE CONOPS. Note that the USG team adjudicates what is the "best" or "optimum" set of alternative system architecture solutions based on warfighter and stakeholder needs and the SOSE network score for each operational use case associated with the pre-defined SOSE CONOPS.

### *3.1.2 MS&A approach using multi-criteria decision support system for system architecture assessment in pre-milestone A*

To address the system requirements trade space challenges, the proposed system architecture assessment approach should be based on the required warfighter capabilities and market survey results identifying the desired TEs for providing the required capabilities. The MS&A approach is derived from [15, 16], where system architecture assessment is based on technical performance, market, cost, and schedule risks. Technical performance risk is referred to as technology risk and is quantified using Technology Readiness Level (TRL), while market risk is related to market uncertainty and is quantified by Manufacturing Readiness Level (MRL). Rough Order of Magnitude (ROM) cost, TRL and MRL data are collected from a market survey for cost and schedule risk assessment. **Figure 5** illustrates a recent advanced MS&A approach for system architecture assessment and an example of an architecture solution output [15, 16]. The approach uses game theory combined with the war-gaming concept to assess and optimize the system architecture solutions using the market survey results. The approach requires input from the warfighter and associated stakeholders, a set of "optimum" alternative architecture solutions obtained from the SOSE CONOPS assessment described in Section 3.1.1, and a pre-defined Payoff-and-Cost Function (PCF). The outputs are (i) the optimum architecture solution, (ii) the associated technology and market risks, and (iii) the predicted schedule and cost risks. The selected optimum architecture solution is captured in terms of the selected TEs and DoD Architecture Framework (DoDAF) views, including Capability View-1 (CV-1) and CV-2.
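One simple way to sketch how TRL, MRL and ROM cost from a market survey can be blended into a single per-TE risk figure is shown below; the TE names, weights, and normalizations are illustrative assumptions, not values from [15, 16]:

```python
# Hypothetical market-survey rows: (TE name, TRL 1-9, MRL 1-10, ROM cost $M)
survey = [("phased-array antenna", 7, 8, 42.0),
          ("optical crosslink",    5, 4, 55.0),
          ("legacy transponder",   9, 9, 30.0)]

def te_risk_score(trl, mrl, rom_cost, max_cost=60.0,
                  w_tech=0.4, w_market=0.3, w_cost=0.3):
    """Blend technology risk (low TRL), market risk (low MRL) and cost
    into a single 0-1 risk figure; the weights are illustrative."""
    tech_risk = (9 - trl) / 8       # TRL 9 -> 0 risk, TRL 1 -> full risk
    market_risk = (10 - mrl) / 9    # MRL 10 -> 0 risk, MRL 1 -> full risk
    cost_risk = min(rom_cost / max_cost, 1.0)
    return w_tech * tech_risk + w_market * market_risk + w_cost * cost_risk

ranked = sorted(survey, key=lambda r: te_risk_score(*r[1:]))
print([name for name, *_ in ranked])  # lowest-risk TE first
```

A real assessment would feed these per-TE figures into the PCF rather than ranking TEs in isolation.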

**Figure 5.** *MS&A approach for SOSE system architecture assessment.*

The MS&A approach requires systems-of-systems analysts to develop the USG game engine (a.k.a. DAA-PWGE) and the contractor game engine (a.k.a. KTR-PWGE) for assessing and optimizing the architecture solutions from the USG perspective and the contractor perspective, respectively [15, 16]. The objective of the USG game engine is to minimize cost and technical risk using an appropriate PCF trading off affordability and technical requirements. The objective of the contractor game engine is to maximize profit and minimize execution risk using an appropriate PCF trading off profit and execution risk. The game engines can play a pure game or a mixed game depending on the survey results. A pure game is used when the contractors are more certain of their risk assessments, so no "belief" and "weighting" functions are needed for assessing the TE risks. A mixed game is used when contractors are more uncertain of their risk assessments, and hence "belief" and "weighting" functions are needed to characterize the TE risks; in this case, TEs are weighted based on their priorities using either a uniform or triangular distribution. The games are static Bayesian games with the goal of reaching a Nash equilibrium, i.e., a stable solution to a game-theoretic problem involving multiple players in which no individual player can improve their payoff by a unilateral change in behavior. The objective of the MS&A models and tools is to select the best architecture solution and the associated architecture solution type for risk assessment. Classification of the architecture solution type depends on the system and associated systems-of-systems requirements and the associated market and technology risks (i.e., uncertainty) [15, 16]. **Figure 6** describes an acquisition strategy mapping framework and shows the mapping of requirement type to architecture solution type according to various market and technology risks.
Section 4 describes a recommended MS&A approach for acquisition strategy development and optimization using this acquisition strategy mapping framework. Theoretically, the players in these games are the USG team and the contractors participating in the game-playing action. In practice, during pre-Milestone A, the USG team can also play the contractor role to determine the win-win acquisition strategy from both USG and contractor perspectives. A detailed description of the game engines can be found in [15, 16].
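The Nash-equilibrium idea behind the game engines can be illustrated with a small pure-strategy search over a two-player payoff table; the action labels and payoff values are hypothetical and greatly simplified relative to the static Bayesian games of [15, 16]:

```python
from itertools import product

def pure_nash(payoffs):
    """Find pure-strategy Nash equilibria of a two-player game.

    payoffs[(i, j)] = (USG payoff, contractor payoff) for USG action i
    and contractor action j. A profile is an equilibrium when neither
    player can gain by deviating alone.
    """
    rows = {i for i, _ in payoffs}
    cols = {j for _, j in payoffs}
    equilibria = []
    for i, j in product(rows, cols):
        u, v = payoffs[(i, j)]
        best_row = all(payoffs[(k, j)][0] <= u for k in rows)
        best_col = all(payoffs[(i, k)][1] <= v for k in cols)
        if best_row and best_col:
            equilibria.append((i, j))
    return equilibria

# Hypothetical 2x2 PCF values: USG picks a low/high risk posture,
# the contractor bids conservative/aggressive.
game = {("low", "conservative"): (3, 3), ("low", "aggressive"): (1, 4),
        ("high", "conservative"): (4, 1), ("high", "aggressive"): (2, 2)}
print(pure_nash(game))  # [('high', 'aggressive')]
```

In this example the unique equilibrium is not the jointly best outcome, which is why the mixed-game machinery with belief and weighting functions matters in practice.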

In practice, when the architecture solution does not converge to a single optimum solution, a brute-force approach can be used to force convergence to a single system architecture solution for acquisition strategy development and optimization. Since the brute-force approach might not converge or might not lead to an


#### **Figure 6.**

*Acquisition strategy mapping framework [15, 16].*


optimum solution, [17] proposed a multiple-criteria decision model, based on the Marquis de Condorcet principle found in the ELECTRE models, to address situations where the game models do not yield an optimal outcome.
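A minimal sketch of the Condorcet principle applied to architecture alternatives follows; the alternatives, criteria, and scores are hypothetical, and a full ELECTRE model would add concordance/discordance thresholds on top of this pairwise comparison:

```python
def condorcet_winner(scores):
    """scores[alt] = per-criterion scores (same criteria order for all).

    Alternative a beats b when it scores strictly higher on more
    criteria than it scores lower; the Condorcet winner beats every
    other alternative pairwise. Returns None when a cycle exists.
    """
    def beats(a, b):
        wins = sum(sa > sb for sa, sb in zip(scores[a], scores[b]))
        losses = sum(sa < sb for sa, sb in zip(scores[a], scores[b]))
        return wins > losses

    for a in scores:
        if all(beats(a, b) for b in scores if b != a):
            return a
    return None  # no Condorcet winner; fall back to ELECTRE-style outranking

# Hypothetical criteria: performance, cost score, schedule score (higher better)
alts = {"arch-A": [8, 5, 7], "arch-B": [6, 9, 6], "arch-C": [7, 4, 5]}
print(condorcet_winner(alts))  # arch-A
```

When the pairwise comparisons form a cycle (no alternative beats all others), the function returns None, which is precisely the situation where the richer outranking machinery of [17] is needed.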

### *3.1.3 MS&A approach for program risk assessment in pre-milestone A*

Based on the existing US defense acquisition life cycle, during the pre-award phase at pre-Milestone A the USG team often assesses the program risk associated with the following nine pre-award events: (i) Program Go-Ahead, (ii) Early Strategy and Issues Session (ESIS) (see AFI 63–138; a key event), (iii) Acquisition Strategy Review Board (ASRB), (iv) Acquisition Strategy Panel (ASP) (see AFI 63–101; a key event), (v) Acquisition Strategy Document (ASD) (considered a key event), (vi) Strategy Review Board (SRB), (vii) Source Selection Plan (SSP) (see 2011 DoD Source Selection Procedures), (viii) Request for Proposal (RFP) (also considered a key event), and (ix) Source Selection and Proposal Evaluation. The MS&A objective for the pre-award risk assessment is to provide MS&A models and tools for evaluating and assessing the program and technical baseline (PTB) risks at each of the key events. As pointed out in [18, 19], there are nine PTB components, including five Program Baseline (PB) and four Technical Baseline (TB) components, as shown in **Figure 7**(**a**) and (**b**), respectively. A detailed description of these PB and TB components can be found in [18, 19].

**Figure 8** proposes a MS&A approach for the program risk assessment of the four TB and five PB components. The approach recommends a set of three MS&A models and tools, namely, MS&A models and tools #1, #2 and #3, supporting three MS&A tasks: (i) program risk quantification of the rolled-up program cost, schedule and performance risks, (ii) PB and TB risk quantification assessing the impact on each (key) pre-award acquisition event, and (iii) prediction of the PTB risk impact at each (key) pre-award acquisition event, respectively.

MS&A model and tool #1 is a set of mathematical models for evaluating the overall program risk based on the individual TB and PB components' risks. The overall PTB risk is quantified in terms of expected likelihood and consequence values that are placed on the pre-award PTB risk management matrix. A notional PTB risk management matrix is depicted in **Figure 9**. MS&A model and tool #2 is a set of mathematical models and software tools for (i) assessing the PB

**Figure 7.** *Description of PTB elements [18, 19].*

#### **Figure 8.**

*MS&A approach for pre-award risk assessment.*


#### **Figure 9.**

*A notional program risk management matrix.*

and TB components' risks, (ii) quantifying the PB and TB risks' impact on a specific pre-award event, and (iii) evaluating the PTB risk roll-up and quantification from the individual PB and TB components' risks. The rolled-up PTB expected likelihood and consequence values are also placed on a program risk management matrix like **Figure 9**.
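The PTB roll-up described above can be sketched as a weighted aggregation of component likelihood and consequence values onto a matrix cell; the component names, weights, and the weighted-average aggregation rule are illustrative assumptions, not the models of [18, 19]:

```python
# Hypothetical PB/TB component assessments: (likelihood 1-5, consequence 1-5, weight)
components = {"cost baseline":         (3, 4, 0.3),
              "schedule baseline":     (2, 3, 0.2),
              "requirements baseline": (4, 4, 0.3),
              "architecture baseline": (2, 2, 0.2)}

def ptb_rollup(comps):
    """Roll individual component risks up to expected likelihood and
    consequence values for placement on a 5x5 risk management matrix."""
    total_w = sum(w for _, _, w in comps.values())
    exp_l = sum(l * w for l, _, w in comps.values()) / total_w
    exp_c = sum(c * w for _, c, w in comps.values()) / total_w
    return exp_l, exp_c

likelihood, consequence = ptb_rollup(components)
cell = (round(likelihood), round(consequence))  # matrix cell placement
print(round(likelihood, 2), round(consequence, 2), cell)
```

Here the rolled-up risk lands in the (3, 3) cell of the notional matrix; a real roll-up could instead propagate full likelihood/consequence distributions rather than point expectations.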

Finally, the MS&A model and tool #3 is a set of mathematical models and software tools for predicting the PTB risk at a future acquisition event given the risk assessment results at the current acquisition event. The PTB risk results are quantified in terms of expected likelihood and consequence values.

#### **3.2 MS&A approach for system architecture analysis in post-milestone A**

This section describes a MS&A approach for supporting the post-award phase of the DoD acquisition life cycle. At post-Milestone A, a contractor has already been selected, and the USG team is responsible for working with the selected contractor to refine the USG reference system architecture and minimize the associated technical and execution risks. **Figure 10** proposes a MS&A approach for post-Milestone A.


**Figure 10.**

*Proposed MS&A approach for supporting post milestone A.*

The approach shown in **Figure 10** includes (i) the required inputs, including warfighter and stakeholder inputs, Performance Assessment Documents (PADs) and Trade Analysis Assumption Documents (TAADs); (ii) the desired MS&A activities and supporting MS&A models and tools; and (iii) the essential outputs supporting post-Milestone A tasks. The figure is color-coded to distinguish (i) warfighter input, USG team input, activities and documents, (ii) contractor input, activities and documents, (iii) third-party (i.e., related subcontractor) involvement, and (iv) joint USG and contractor team activities. The USG team provides the ICD, the SOSE CONOPS and associated threat scenarios, the government reference architecture, and the warfighter needs. Using the USG's inputs, including the PADs, TAADs and SOSE perspective, the selected contractor team is responsible for developing the desired trade space, performing the system architecture trade analysis, refining the government reference architecture (GRA), and providing the "best" (or optimum) system architecture solution and associated system requirements for the development of the hardware prototype. The USG team serves as the final adjudicator of what is "best," defines which Technical Performance Measures (TPMs) are more important than others for meeting the warfighter and stakeholder needs, and determines which residual risks are of the most concern. Typical PADs include (i) Program Technical Objectives and Goals (TOG), (ii) Program TPMs, (iii) Top-Level CONOPS, and (iv) Program Risk Assessment Plan (PRAP). Typical TAADs include (i) Adversary Capability Document (ACD), (ii) Scenarios Document (SD), (iii) Value Model Document (VMD), (iv) Master Test Plan (MTP), (v) Integrated Master Plan/Integrated Master Schedule (IMP/IMS), (vi) System Capability Baseline (SCB), and (vii) Technology Maturity Baseline (TMB).

For post-Milestone A, the selected contractor is responsible for (i) the architecture analysis (what-if analyses) of the selected alternative SOSE architecture solutions, and (ii) providing all MS&A models and tools for the activities supporting the SOSE architecture analysis associated with GRA refinement. The contractor MS&A models and tools should be developed to support, at a minimum, the following SOSE architecture analyses: (i) Technology insertion assessment: what available technologies could be inserted to gain a significant increase in performance without an unacceptable increase in risk; (ii) System capabilities evaluation: increases/decreases in system capabilities vs. gains/losses in overall system performance; (iii) SOSE CONOPS assessment: SOSE CONOPS changes for increased performance vs. ease of integration; (iv) TPMs evaluation: benefits of not meeting threshold objective TPMs vs. not-to-exceed TPMs; (v) Threat/scenario analysis: benefits of not addressing the full baseline operation under different threats/scenarios vs. benefits of addressing scenarios beyond the baseline; (vi) Integrated Master Plan (IMP)/Integrated Master Schedule (IMS) assessment: where the USG would derive benefit from changing quality standards, the cost management system, the award fee structure, or the implementation schedule; and (vii) Master Test Plan analysis: changes in planned test facilities, test resources, or test restrictions that would provide overall benefit to fully testing the capabilities of the system.
