JOINT SIMULATION SYSTEM (JSIMS)
The Joint Simulation System (JSIMS) is a single, distributed, and seamlessly integrated simulation environment that permits integration of real-world and simulated assets of the U.S. Military Services and their allies on a virtual battlefield. The JSIMS virtual battlefield is simulated by a High Level Architecture (HLA) compliant federation of component models. The component models include the National Air and Space Model (NASM), Warfighters' Simulation (WARSIM), WARSIM Intelligence Model (WIM), JSIMS Maritime, the Defense Intelligence Agency's Object-oriented Model of Intelligence Operations (DOMINO), Joint Signals Intelligence Simulation (JSIGSIM), and National Simulation (NATSIM).
JSIMS is to provide a real-time simulation capability that can be configured for exercises of differing durations, scenarios, and complexities. It is to interface with real-world C4I systems, providing a training environment that should be transparent to the training audience. JSIMS is to include scenarios that reflect the transition of military forces into less conventional roles such as multi-national peacekeeping and humanitarian assistance. At Initial Operational Capability (IOC), JSIMS will be an accredited simulation environment supporting joint training for unified combatant command staffs, joint task force (JTF) commanders and staffs, and JTF component commanders and staffs. At Full Operational Capability, JSIMS is to evolve to support professional military and senior officer education, mission planning, mission rehearsal, and doctrine development.
In December 1999, the Under Secretary of Defense for Acquisition, Technology and Logistics (USD (AT&L)) re-designated JSIMS as an Acquisition Category ID program with USD (AT&L) as the Milestone Decision Authority. Program management responsibility moved from the Air Force to the Army; the JSIMS Program Manager reports directly to the Army Acquisition Executive and is responsible for overall materiel development. The United States Joint Forces Command Joint Warfighting Center (JWFC) is the user representative. The Air Force Operational Test and Evaluation Center (AFOTEC) withdrew as the lead Operational Test Agency (OTA) in October 2001. The Navy's Commander, Operational Test & Evaluation Force (COMOPTEVFOR) agreed to become the lead OTA in 2QFY02. Nine Service and Defense Agencies serve as the Development Agents and are responsible for developing the component models and workstation software. The JSIMS Alliance Executive Office (AEO) directs the integration of the individual Development Agents' products.
Development activity is proceeding through a series of five integration events, which will conclude in a full system test in FY02. The Multi-Service Operational Test and Evaluation (MOT&E), a geographically distributed training exercise using Unified Endeavor 03-1, will support a full-rate production decision in FY03.
TEST & EVALUATION ACTIVITY
During FY01, the AEO completed the first two of five integration events leading up to the completion of JSIMS Version Release 1.0. These events follow a crawl-walk-run approach, starting by ensuring that the most basic functions of federating the simulations can be accomplished. Later integration events, beginning in FY02, will examine cross-federate functions (for example, aircraft flown in NASM striking ground targets simulated in WARSIM).
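The cross-federate behavior described above can be illustrated with a toy sketch: one federate publishes an interaction and another federate, which owns the affected objects, subscribes and adjudicates the result. This is only an illustrative stand-in, not the actual HLA Run-Time Infrastructure (RTI) API or the JSIMS software; the federate names echo the report, but every class, method, and data value here is an assumption made for demonstration.

```python
# Toy illustration of a cross-federate interaction, assuming a minimal
# publish/subscribe stand-in for an HLA RTI. NOT the real HLA API or
# any JSIMS component; all names and values are illustrative.

class MiniRTI:
    """Routes interactions from publishing to subscribing federates."""
    def __init__(self):
        self.subscribers = {}  # interaction class name -> callbacks

    def subscribe(self, interaction_class, callback):
        self.subscribers.setdefault(interaction_class, []).append(callback)

    def send_interaction(self, interaction_class, params):
        for callback in self.subscribers.get(interaction_class, []):
            callback(params)


class AirFederate:
    """Stand-in for an air model (e.g., NASM) publishing weapon releases."""
    def __init__(self, rti):
        self.rti = rti

    def release_weapon(self, target_id):
        self.rti.send_interaction("WeaponFire", {"target": target_id})


class GroundFederate:
    """Stand-in for a ground model (e.g., WARSIM) owning ground targets."""
    def __init__(self, rti):
        self.targets = {"T1": "alive", "T2": "alive"}
        rti.subscribe("WeaponFire", self.on_weapon_fire)

    def on_weapon_fire(self, params):
        # Adjudicate the hit against a target owned by this federate.
        if params["target"] in self.targets:
            self.targets[params["target"]] = "destroyed"


rti = MiniRTI()
air = AirFederate(rti)
ground = GroundFederate(rti)
air.release_weapon("T1")
print(ground.targets)  # T1 destroyed, T2 still alive
```

An integration event of the kind described above would verify exactly this sort of end-to-end chain: an action initiated in one federate produces the correct state change in an object owned by a different federate.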
DOT&E approved the AFOTEC JSIMS Operational Test Concept in April 2001. The operational test strategy includes an Early Operational Assessment (EOA) of JSIMS during the JWFC System Functional Assessment (2QFY02) and a combined DT/OT in conjunction with the JWFC System Validation Event in CY02 to determine readiness for the MOT&E later in FY03. The Test and Evaluation Integrated Product Team developed the Test and Evaluation Master Plan (TEMP) and was poised to send the TEMP to OSD for approval in 1QFY02. Because AFOTEC withdrew as the lead OTA, the TEMP was pulled back from further coordination.
TEST & EVALUATION ASSESSMENT
The initial AFOTEC Operational Test Concept was based largely on a questionnaire-based, qualitative assessment. DOT&E recommended adding objective measures of effectiveness and placing additional emphasis on areas such as C4I system-to-simulation interoperability, information assurance, task accomplishment, and logistics supportability. The integration of multiple training simulations and tools developed by multiple Services and Agencies poses a unique test and evaluation challenge. It is imperative that all Service OTAs be part of the planning team and be tasked to provide input on how to assess and evaluate their part of the training.
The withdrawal of AFOTEC as the lead OTA in October 2001 has significantly delayed approval of the TEMP, as well as planning for the MOT&E and the events leading up to it. The Early Operational Assessment did not occur due to the lack of a lead OTA. Any revisions to the approved strategy by COMOPTEVFOR will require DOT&E approval and re-coordination of the TEMP among the Services, which may affect existing program schedules.
Combining multiple independent development efforts into incremental integration and test events has proven a viable means of evaluating exit and entrance criteria while minimizing associated costs. The integration events have required more time than the AEO originally planned. Progress is being made, but delays have consumed the available schedule margin. Because these integration events are necessary for a successful MOT&E and full-rate production decision, further schedule slips may delay the milestone date or reduce the functionality delivered at IOC. This integration, validation, and testing effort is challenging, but it will yield a viable joint training tool for operational test and evaluation.