MANAGING LARGE DISTRIBUTED DATA SETS FOR TESTING IN A JOINT ENVIRONMENT


Thomas Bock (thomas.bock@jte.osd.mil), Tonya Easley (tonya.easley@jte.osd.mil), John "Hoot" Gibson (john.gibson@jte.osd.mil)
Joint Test and Evaluation Methodology, Joint Test and Evaluation Center, 7025 Harbour View Blvd., Suite 105, Suffolk, VA 23435

Keywords: capability test methodology, data, DOD, JTEM, joint mission environment, joint test and evaluation, joint test environment, system of systems

Abstract

As the Department of Defense (DOD) matures its testing methodologies for examining the contributions of a particular system or system of systems (SoS) to joint mission effectiveness (JMe) throughout the acquisition process, the test community has begun examining methods and processes for managing large, distributed data sets in a joint environment. Assuming a realistic joint mission data set can be constructed and relevant quantifiable data obtained, we are still left with the analytic question: can this data repository, operating within this SoS, contribute to a particular desired mission effect, or set of effects, within a given scenario and under specific conditions? This issue becomes even more complex as we examine the fluid environment of modern military operations. More specifically, the Secretary of Defense (SecDef) tasked the Director, Operational Test and Evaluation (DOT&E) to determine the actions necessary to create new joint testing capabilities and to institutionalize the evaluation of JMe. In response to the Strategic Planning Guidance (SPG) tasking, DOT&E's Testing in a Joint Environment Roadmap identifies changes to policy, procedures, and test infrastructure to ensure the Services can conduct test and evaluation (T&E) in joint mission environments (JME). Regarding methods and processes, the roadmap states that T&E must adapt test methodologies to be prepared to test systems and SoS in assigned joint mission environments and to accommodate evolving acquisition processes.

1. INTRODUCTION

Innovative enterprise initiatives are currently under way within the DOD that have the potential to enhance the effectiveness and efficiency of the joint capabilities planning process. One initiative, focused on joint capability assessment and evaluation, is the Joint Test and Evaluation Methodology (JTEM) joint test and evaluation (JT&E) program. JTEM is currently developing an enterprise-level Capability Test Methodology (CTM) to deliver high-quality joint capability assessments and evaluations across the acquisition life cycle.

1.1. Joint Test and Evaluation Methodology Program Overview and Challenges

The JTEM JT&E is chartered to employ multi-service and other DOD agency support, personnel, and equipment to investigate, evaluate, and make recommendations to improve the ability to conduct SoS testing across the acquisition life cycle in a realistic JME. Specifically, JTEM will develop, test, and evaluate methods and processes for defining and using a distributed live, virtual, constructive (LVC) JME to evaluate system performance and JMe. JTEM will focus on developing and enhancing methods and processes for designing and executing tests of SoS in the JME. Because these methods and processes are not yet well defined or understood, JTEM will institutionalize testing in a JME by demonstrating the viability of T&E methods and processes in realistic JMEs as part of the overarching acquisition process.

2. OVERVIEW OF THE JTEM FY07 TEST EVENT (NEW & NLOS USE CASE)

The first JTEM JT&E event was conducted in August 2007 in an LVC distributed joint test environment comprised of simulation facilities and live ranges across the United States (see figure 1). The notional developmental systems under test (SUT) were the US Army Non-Line of Sight - Launch System (NLOS-LS) with Precision Attack Missiles (PAM) and the US Air Force's air-launched, network-enabled weapon (NEW). The JTEM JT&E test event was one of several collaborative tests conducted within the 2007 INTEGRAL FIRE distributed venue (JTEM's involvement in this venue is referred to hereafter as the FY07 test event). INTEGRAL FIRE was a joint capability integration event intended to support test activities while working to establish persistent capabilities for testing in joint environments. The event was mutually sponsored by the Secretary of the Air Force, Warfighter Integration Directorate (SAF/XC); the United States Joint Forces Command, Joint Systems Integration Command (JSIC); the Joint Mission Environment Test Capability (JMETC) program; and the JTEM JT&E.

Figure 1. The FY07 LVC distributed joint test environment of simulation facilities and live ranges across the United States.

2.1. FY07 Test Event Sites

To evaluate SoS in a joint mission and test environment, test data collection efforts were centrally planned but executed in a distributed fashion by participants at the following sites (figure 2):

Simulation and Analysis Facility (SIMAF), Wright-Patterson Air Force Base (WPAFB), Ohio: the INTEGRAL FIRE venue lead and host for two virtual F-16 aircraft, a virtual Airborne Warning and Control System (AWACS), a virtual Joint Surveillance Target Attack Radar System (JSTARS), and the virtual joint terminal attack controller (JTAC).

46th Test Squadron (46 TS), Eglin Air Force Base (EAFB), Florida: the host for the virtual Air Support Operations Center (ASOC), the live JTAC, and one virtual F-15E aircraft.

Guided Weapons Evaluation Facility (GWEF), EAFB, Florida: the host for a constructive NEW.

Global Modular Army Node (GMAN) at Redstone Technical Test Center (RTTC), Huntsville, Alabama: the host for the virtual NLOS-LS and the virtual NLOS-PAM. The Distributed Test Control Center (DTTC) was the RTTC network hub.

Inter Range Control Center (IRCC), White Sands Missile Range (WSMR), New Mexico: the host for two live targets. IRCC also provided a centralized integrated level hierarchy (ILH) data repository.

Air Force Command and Control, Intelligence, Surveillance, and Reconnaissance (AFC2ISR) Center, Langley Air Force Base (LAFB), Virginia: the host for the virtual Air Operations Center (AOC).

Joint Mission Environment Test Capability (JMETC), Virginia: provided technical support to link three separate network enclaves through an aggregation router.

Figure 2. FY07 test event participating sites.

2.2. FY07 Test Event's Test Cases and Objectives

The overall goal of this series of tests was to evaluate the contributions of the NEW and NLOS-LS/PAM systems to JMe when these weapon systems were employed together as participating elements of an overarching SoS. The particular joint mission of interest was joint fire support, including aspects of joint close air support (JCAS). After the evaluation of test results is complete, contributions to JMe will be used to determine which of the tested weapon design and joint tactics, techniques, and procedures (TTP) alternatives warrant further development. JMe was measured by the ability to deny employment of disparate forces (timeliness of attacks) and by the SoS ability to attrite disparate combat assets.

Weapon design alternatives of interest were those related to Link 16 J-11 series message implementation. TTP alternatives were related to NEW employment with a JTAC and to airspace coordination between the NEW and NLOS-LS/PAM systems. The test objectives and their associated measures are listed in table 1.

Table 1. NEW and NLOS Test Objectives and Measures

Objective 1: Determine the ability to perform the NEW handoff function over Link-16.
Measures:
a. Time to achieve successful NEW handoff from launch aircraft to second aircraft.
b. Percentage of successful NEW handoffs from launch aircraft to second aircraft.
c. Time to achieve successful NEW handoff from launch aircraft to JTAC.
d. Percentage of successful NEW handoffs from launch aircraft to JTAC.

Objective 2: Determine the impact of airspace deconfliction on attack timelines when NEW and NLOS systems are employed in potentially conflicting situations requiring the generation of an ACMREQ.
Measures:
a. Time from CAS request to NEW assignment.
b. Percentage of successful NEW assignments.
c. Time from fire support request to NLOS assignment.
d. Percentage of NLOS assignments.

Objective 3: Evaluate guidance message continuity after successful NEW handoff.
Measures:
a. Percentage of guidance messages received by NEW.
b. Percentage of guidance messages correctly received by NEW.

Objective 4: Evaluate NLOS and NEW contributions to force network fires.
Measures:
a. Time from call for fire, or air support request, to mission complete.
b. Percentage of targets successfully prosecuted.
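The measures in table 1 reduce to elapsed times between paired events and to success percentages. As an illustration only (not part of the JTEM tool set), the following sketch shows one way such measures could be computed from a consolidated, timestamped event log; the record fields and event names are assumptions made for the example.

```python
# Illustrative sketch only: field names and event labels are assumed, not JTEM's.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MissionEvent:
    mission_id: str          # one record per logged event, keyed by mission thread
    event: str               # e.g., "cas_request", "new_assignment", "handoff_complete"
    timestamp: datetime      # agreed common time reference (e.g., UTC)
    success: Optional[bool] = None

def elapsed_seconds(events, mission_id, start_event, end_event):
    """Time from start_event to end_event for one mission thread, or None if incomplete."""
    start = next((e.timestamp for e in events
                  if e.mission_id == mission_id and e.event == start_event), None)
    end = next((e.timestamp for e in events
                if e.mission_id == mission_id and e.event == end_event), None)
    if start is None or end is None:
        return None
    return (end - start).total_seconds()

def percent_successful(events, event_name):
    """Percentage of logged attempts of event_name marked successful."""
    attempts = [e for e in events if e.event == event_name and e.success is not None]
    if not attempts:
        return None
    return 100.0 * sum(e.success for e in attempts) / len(attempts)

# Example: Objective 2a (time from CAS request to NEW assignment) for mission "M01"
# and Objective 2b (percentage of successful NEW assignments) across all missions.
# log = [...]  # populated from the sites' standardized submissions
# t_2a = elapsed_seconds(log, "M01", "cas_request", "new_assignment")
# p_2b = percent_successful(log, "new_assignment")
```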

3. DATA COLLECTION AND MANAGEMENT

Testing in a joint environment requires early identification of data collection requirements in the test planning process and continual refinement throughout the test process. This includes determining data elements, data formats, and collection processes and procedures standardized across the distributed environment. Historically, the responsible test organization (RTO) manages data collection requirements within its own facility and therefore establishes independent methods and processes for doing so. Distributed testing requires coordination across multiple services and facilities, and standardization to minimize data error. Data collection needs to be precise and well defined to ensure consistency across the JME. This requires dedicated data collection and reduction resources to meet data requirements and standards. Data collection coordination must begin early in the event planning process and culminate with data collection testing just prior to event execution.

To successfully address the objectives in table 1, each of the participating organizations listed in figure 2 had to perform its portion in lockstep. For example, to receive permission to launch an NLOS weapon, the requesting unit (the JTAC at Eglin AFB) had to ensure the airspace the weapon would traverse was clear of friendly aircraft. This required timely communications with the ASOC (a virtual function performed at Eglin AFB), the CAOC (performed virtually from Langley AFB), and any aircraft currently occupying the airspace in question.

During the FY07 test event, JTEM attempted to mitigate the data management challenges prior to test execution by conducting weekly group telecons with the participating sites. These discussions attempted to define the data formats and requirements, determine who would be responsible for collecting each piece of pertinent data, and standardize the media used to accumulate and eventually transmit the information to the aggregate data collection location (WSMR). The challenge in executing this test was highlighted when the information in figure 2 and table 1 was combined: specifically, how to meld the pieces provided by each of the participating sites to yield the information required to address the test objectives.
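The coordination described above, which site collects which data element, in what format, at what frequency, and how it reaches the WSMR repository, can be captured in a machine-readable collection matrix. The sketch below is a hypothetical example of such an agreement; the sites, data elements, and field names shown are illustrative assumptions, not the actual FY07 plan.

```python
# Hypothetical data collection matrix: not the actual FY07 agreement.
from dataclasses import dataclass

@dataclass(frozen=True)
class CollectionAssignment:
    data_element: str    # what is collected
    site: str            # responsible collecting site
    fmt: str             # agreed file format
    frequency: str       # agreed collection frequency
    medium: str          # how it is delivered to the central repository (WSMR)

COLLECTION_PLAN = [
    CollectionAssignment("NEW handoff events", "46 TS (Eglin AFB)",
                         "CSV, UTC timestamps", "per event", "secure file transfer"),
    CollectionAssignment("NLOS-LS/PAM fire mission events", "GMAN (RTTC)",
                         "CSV, UTC timestamps", "per event", "secure file transfer"),
    CollectionAssignment("Link-16 J-11 message logs", "SIMAF (WPAFB)",
                         "binary capture + CSV summary", "continuous", "shipped media"),
]

def assignments_for(site: str):
    """List the data elements a given site has agreed to collect."""
    return [a for a in COLLECTION_PLAN if a.site == site]

# Example: review one site's obligations during a weekly coordination telecon.
for a in assignments_for("46 TS (Eglin AFB)"):
    print(f"{a.data_element}: {a.fmt}, {a.frequency}, via {a.medium}")
```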
4. FINDINGS & RECOMMENDATIONS

Upon completion of the test, JTEM discerned the following findings and recommendations.

4.1. Finding: Need for Data Standardization

Standardization of data format is imperative for successfully executing any test, and the issue becomes even more pronounced during testing in a joint environment distributed across multiple test ranges and facilities. Early development of data collection methods and processes, including standard formats and content, is required to reduce data variability and data collection errors. While this finding was well addressed prior to test execution, issues remained regarding the units in which data would be collected, the frequency at which information would be gathered, and the media that would be used to collect the information.

Recommendation: Conduct early and continuous meetings with all proposed event participants to ensure the format for data collection is understood. Additionally, plan for, and conduct, a thorough dry run to minimize risk.
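One way to enforce the kind of standardization this finding calls for is to validate each site's records against the agreed conventions before they enter the shared repository. The following sketch illustrates the idea; the required fields, timestamp format, and unit rules are assumptions made for the example rather than JTEM's actual standard.

```python
# Illustrative pre-ingest check; the agreed conventions shown are assumptions.
from datetime import datetime

REQUIRED_FIELDS = {"site", "mission_id", "event", "timestamp_utc"}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is accepted."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    ts = record.get("timestamp_utc")
    if ts is not None:
        try:
            datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ")  # agreed UTC format
        except ValueError:
            problems.append(f"timestamp not in agreed UTC format: {ts!r}")
    if "duration" in record and record.get("duration_units") != "s":
        problems.append("durations must be reported in seconds")
    return problems

# Example: a record checked by a site before transfer to the central repository.
issues = validate_record({"site": "GWEF", "mission_id": "M01",
                          "event": "new_handoff_complete",
                          "timestamp_utc": "2007-08-14T15:22:03.250Z"})
assert issues == []
```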

4.2. Finding: Need for Improved Data Access & Retrieval (and Classification)

From an analytical viewpoint, the goal of any test should be the accumulation of standardized data that is easily discernible by the analyst and applicable to the chosen analytical techniques, so that consistent conclusions can be drawn about how the items under test performed. For this test, the goal was to have the test data stored in a single repository at WSMR. One of the problems encountered was that too much data was collected. Referring to the objectives in table 1, the only information required to answer each of them dealt with time (when each portion of the mission started and stopped) and whether the mission was successful. Each site provided megabytes of data, most of which was not pertinent to answering the test objectives (for example, Link-16 data reporting the location of each aircraft every 6 seconds, far more detail than was required).

Recommendation: Design a process (architecture) for a common data structure and provide easy and intuitive access to each test participant. Also, stay focused on the data necessary to address the test objectives.
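One practical response to this finding is to filter raw feeds down to the records the objectives actually require before they reach the analyst. The sketch below illustrates such objective-driven reduction; the record layout and event names are assumed for the example.

```python
# Sketch of objective-driven data reduction; field and event names are assumed.
EVENTS_OF_INTEREST = {
    "cas_request", "fire_support_request",
    "new_assignment", "nlos_assignment",
    "new_handoff_initiated", "new_handoff_complete",
    "mission_complete",
}

def reduce_for_objectives(records):
    """Keep only the records needed for the table 1 measures, discarding periodic
    position reports (e.g., 6-second Link-16 track updates)."""
    return [r for r in records if r.get("event") in EVENTS_OF_INTEREST]

# Example: of a large raw log, only the mission-thread event records survive.
raw = [
    {"event": "link16_position_report", "track": "F16-1"},
    {"event": "cas_request", "mission_id": "M01"},
    {"event": "link16_position_report", "track": "F16-2"},
    {"event": "new_assignment", "mission_id": "M01", "success": True},
]
assert [r["event"] for r in reduce_for_objectives(raw)] == ["cas_request", "new_assignment"]
```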
4.3. Finding: Need for Tasking Authority and Communication

Separate test customers may have separate data management and analysis requirements. The capability manager's analytical interests will need to address higher-order mission measures of effectiveness (MOE) at the joint mission level, as well as SoS effectiveness and suitability performance. System program manager requirements will be more focused at the system performance and attribute level. The issue is that distributed tests conducted in a joint environment will probably have competing priorities and interests from test strategy through test execution. This requires that an organization be identified as having the authority to de-conflict these competing priorities.

During the planning and execution of the FY07 test event, numerous issues arose affecting multiple participants, and the difficulty was determining who would be responsible for addressing the issue in question. For example, as mentioned in section 4.1, there were issues regarding the format in which data would be collected. Complicating this, no single organization was assigned the authority to decide which format would be used once inputs were received from each of the participants. Additionally, no specific organization was identified as responsible for addressing the minor software and hardware anomalies that arose during execution of a test run.

Recommendation: The RTO must clearly delineate the roles and responsibilities of the participating organizations and programs.

4.4. Finding: Need for Resource Allocation & Facility Management

During execution of the FY07 test event, the personnel performing operational functions within the test were also the people required to gather, process, and reduce the data pertinent to addressing the test objectives. This dual-hatting of responsibilities meant a person acting in the role of an operator could not be fully aware of data collection issues as they occurred. Another issue dealt with certain organizations having fixed office hours: since the FY07 test event was conducted across multiple time zones, what was considered normal operating hours at one location meant another facility had to open earlier or stay open later than it traditionally did.

Recommendation: Assign sufficient personnel to minimize "dual-hatting" of responsibilities; specifically, assign dedicated data collectors as necessary. Tests will inherently have the required operators present to ensure the test is adequately supported; the problem arises when there are not enough personnel on hand to gather the pertinent data, which is the primary reason for conducting the test in the first place. Additionally, ensure support agencies, including the facilities themselves, are available during test execution. This will require coordination with all agencies inherent to the normal operation of a facility (for example, security and administration).

5. CONCLUSION

Data management and analysis continue to be a challenge within the T&E community during the planning and execution of tests executed in a distributed fashion within a JME, because numerous customers have stakes in testing systems and SoS as part of a joint capability. Data management must be an emphasis for coordination between the various ranges and testing facilities to ensure efficient data collection and standardization during a specific test or across a test campaign. Analysis needs to be focused at the mission level as well as at the SoS and SUT attribute levels to support the needs of all test customers, which may require coordinating multiple analysis plans. More refinement is needed in processes for Data Management Plan integration, distributed analysis, and requirements for automated data management systems. JTEM intends to explore a restructuring of the data management and analysis construct and of data automation systems to enable future distributed testing in a JME. A streamlined data management process will facilitate early data collection planning and integration. Differences in test facility methods and processes should be identified and either standardized for SoS testing in an LVC distributed environment (LVC-DE) or addressed with work-arounds that minimize the effects of differences in data management between the sites.

Biography

Tom Bock is a consultant and information technology professional specializing in data management, systems analysis, and business process design and implementation. He has over 15 years of experience working in multi-cultural, international business environments. Tom is employed by Scientific Research Corporation as a senior analyst working for JTEM. In this capacity, he analyzes actual and predictable interacting operational activities of military, governmental, or business systems to obtain a quantitative, rational basis for decision-making or resource allocation. He is currently pursuing an M.E. in Modeling and Simulation at the Virginia Modeling, Analysis and Simulation Center (VMASC) at Old Dominion University. His modeling and simulation interest is in distributed virtual simulation and networked multimedia tools for distributed collaboration in the homeland security domain.

Tonya Easley is a joint T&E professional specializing in data management and data media. She has worked within military and joint community environments, including work on previous tests. Tonya is employed by Scientific Research Corporation as a data analyst and is part of the data management team for JTEM.

John "Hoot" Gibson is an Operations Research Systems Analyst employed by Teledyne Brown, supporting JTEM in Suffolk, VA.