Merging Operational Realism with DOE Methods in Operational Testing
NDIA Presentation, 13 March 2012


Presented by:
Nancy Dunn, DA Civilian, Chief, Editorial & Statistics/DOE Division, nancy.dunn@us.army.mil, (443) 861-9638
Jonathan Fowler, DA Civilian, Chief, Mounted Maneuver Division, Maneuver-Ground Evaluation Directorate, jonathan.fowler@us.army.mil, (443) 861-9624

Problem: The desire for realistic free play in operational testing traditionally precludes or limits the use of scientific, quantifiable test and analysis techniques.

Operational Testing Fundamental Objective: Determine/demonstrate the impact of new equipment on the mission accomplishment of a tactical unit in an operationally realistic environment (i.e., under realistic factors and conditions).

Factors and example levels:
- Mission: Hasty Attack, Raid, Deliberate Attack, Cordon & Search
- Time of Day: Day, Night
- Terrain: Desert, Complex, MOUT
- Threat Intensity: High, Low

Responses for the System Under Test (SUT), based on requirements documents and MBT&E:
- System performance (MOEs/MOPs): percent of detections, probability of kill, message completion rate
- Mission success

DOE Fundamental Objective: Scientific Answers to Four Test Event Challenges

Four challenges faced by each test event:
1. Which points? Answer: span/populate the battle-space.
2. How many? Answer: enough samples to control the twin errors, false positives and false negatives.
3. How to execute? Answer: randomize and block runs to exclude the effects of the unknown-unknowns.
4. What conclusions? Answer: build mathematical models of the input/output relations, quantifying noise and controlling error.

DOE effectively addresses all of these challenges, with many design choices available: full factorials, fractional factorials, D-optimal designs, split plots, etc.
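The briefing itself contains no code, but the first and third challenges (spanning the battle-space and randomizing the run order) are easy to illustrate. A minimal sketch in Python, reusing the factors and levels from the previous slide; the code is illustrative and not part of the original presentation:

```python
import itertools
import random

# Illustrative factors and levels, taken from the previous slide.
factors = {
    "mission": ["Hasty Attack", "Raid", "Deliberate Attack", "Cordon & Search"],
    "time_of_day": ["Day", "Night"],
    "terrain": ["Desert", "Complex", "MOUT"],
    "threat_intensity": ["High", "Low"],
}

# Challenge 1 (which points?): a full factorial spans the battle-space
# by crossing every level of every factor (4 x 2 x 3 x 2 = 48 points).
design = [dict(zip(factors, combo))
          for combo in itertools.product(*factors.values())]

# Challenge 3 (how to execute?): randomize the run order so that unknown,
# uncontrolled effects (weather, crew fatigue, learning) do not line up
# with any factor.
random.seed(1)  # fixed seed so the run sheet is reproducible
random.shuffle(design)

for run, point in enumerate(design, start=1):
    print(run, point)
```

A fractional factorial or D-optimal design, two of the choices named above, would trim this 48-point matrix to fit available test resources.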

Why the Merger Is So Challenging

Goals:
- Operational testing: usually very general.
- DOE: usually very focused, asking specific questions.
- Paradigm shift needed: adopt more specific goals.

Test events and factor levels:
- Operational testing: one large event, sometimes with specific excursions; many factors are not controllable, and test constraints limit the testable factor levels.
- DOE: multiple designs with different factors and levels; factors are controlled by the tester; each response can have its own test design.
- Paradigm shift needed: accept that OT data will have more variability, due to the required BLUFOR free play, and create multiple nested test designs using DOE within the one larger OT.

Responses:
- Operational testing: more qualitative (Soldier surveys, subject-matter-expert observations).
- DOE: more quantitative.
- Paradigm shift needed: keep the qualitative responses and add more quantitative responses when possible.

Creating Multiple Nested DOEs for OT, Based on Specific Test Goals

[Diagram: an overall mission-success DOE containing nested DOEs for Mission Task 1, Mission Task 2, and a specific task. The more tightly scoped designs require more controls from the OPFOR, the BLUFOR higher HQ, and the White Cell.]

A Way This Can Be Done. Example: Stryker ICVV-Scout T&E for three modifications:
- Remote Weapon Station, for firing under armor
- LRAS3 sensor stored inside, no longer mounted on top
- Double-V hull, for additional survivability

First, Review and Understand Platoon/Section Missions: Screen, Area Security, and Route Reconnaissance

[Map graphic of the screen, area security, and route reconnaissance missions omitted.]

Focus on a select three of all possible reconnaissance and security missions to give a broad, representative range of context for task accomplishment.

Next, Review and Understand Section/Team Collective Tasks:
- Ingress without detection
- Establish and maintain communication
- Establish an observation post (OP)
- Prepare the LRAS3 for dismounted operations
- Observe a named area of interest (NAI)
- Report on enemy and noncombatant activity
- Conduct local security
- Conduct maintenance
- React to contact
- Hand over targets from the LRAS3 to the Remote Weapon Station (RWS)
- Call for fires or air support
- Conduct vehicle ingress/egress under duress
- Conduct casualty evacuation
- Recover an observation post (OP)
- Recover and stow the LRAS3 from dismounted operation
- Egress/displace without detection

The test is designed around creating opportunities to observe select tasks within the mission context.

Create Specific Goals Based on the Missions, Collective Tasks, and Modifications to the Stryker ICVV-Scout. It was determined that the test will consist of scout section missions, addressing these questions:
- Can a scout section equipped with the ICVV-S accomplish its reconnaissance and security missions?
- Can the scout section dismount, put into operation, shut down, and redeploy the LRAS3 from the ICVV-S?
- Can the scout section effectively hand over targets from the dismounted LRAS3 to the RWS on the ICVV-S?
- Can a casualty be evacuated from the ICVV-S with the LRAS3 stowed?
- Can the crew ingress and egress the vehicle quickly enough?

Multiple Nested DOEs Based on the Stryker ICVV-Scout Specific Test Goals

[Diagram: an overall scout mission DOE containing nested designs for the tasks using the LRAS3, the ingress & egress task, the LRAS3-to-RWS target handover, and the casualty evacuation task.]

Scout Section Missions (information for the DOE test matrices)

Responses: SME ratings and a unit survey of the ability of a unit equipped with the ICVV-S to conduct its reconnaissance and security missions.

Factor / Control / Levels:
- Mission: SV*; Area Security, Route Recon, and Screen missions
- Light: SV*; Day, Night
- LRAS3: SV*; deploying the LRAS3, not deploying the LRAS3
- Test assets and crew: held constant; two ICVV-Ss manned by a scout section
- LRAS3 stowage: held constant; ICVV-S with all equipment required to complete a mission (LRAS3)
- Terrain: held constant; primary, secondary, cross-country, and trails (all terrain types covered in each mission)

* SV = systematically varied using DOE principles.

Test Design Matrix for Scout Section Mission Success

Test design: the power is >= 91% for α = 0.1 and an S:N ratio of 1.0 for a full factorial, completely randomized design with one or two repetitions per cell. We will be able to analyze all main effects and all interactions.

Runs per cell (Day/Night crossed with deploying the LRAS3 or not):

                 Day,          Day,         Night,          Night,
Mission          deploy LRAS3  no LRAS3     deploy LRAS3    no LRAS3
Area Security    2             1            1               2
Route Recon      1             2            2               1
Screen           1             1            1               1
Total            4             4            4               4
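Power statements like the one above come from a calculation of this general form. A minimal sketch using statsmodels, assuming a two-sample t-test view of one two-level factor that splits the 16 runs into two groups of 8; the presenters' actual tool and model assumptions are not stated on the slide, so this sketch will not necessarily reproduce the 91% figure:

```python
# Hedged sketch: power to detect a one-standard-deviation shift
# (S:N ratio of 1.0) between the two levels of a factor at alpha = 0.1.
# Assumptions (not from the slide): a two-sample t-test on one
# two-level factor, with 8 of the 16 runs at each level.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().solve_power(
    effect_size=1.0,   # signal-to-noise ratio delta/sigma = 1.0
    nobs1=8,           # runs at the first level of the factor
    ratio=1.0,         # equally many runs at the second level
    alpha=0.10,
    alternative="two-sided",
)
print(f"approximate power: {power:.2f}")
```

Pooling error degrees of freedom across the full factorial model, or defining the effect differently, changes the result, which is why such figures are normally taken from the test team's own design software.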

Establish an Observation Post (information for the DOE test matrices)

Responses: time to set up the OP, plus SME ratings and a unit survey of the ability of the scout team to establish an observation post.

Factor / Control / Levels:
- Light: SV*; Day, Night
- Team: SV*; Team #1, Team #2
- LRAS3: SV*; deploying the LRAS3, not deploying the LRAS3

Hand Over Targets (information for the DOE test matrices)

Responses: time (seconds), plus SME ratings and unit surveys of the ability of the scout crew to effectively hand over targets from the dismounted LRAS3 to the RWS on the ICVV-S.

Factor / Control / Levels:
- Light: SV*; Day, Night
- Team: SV*; Team #1, Team #2

* SV = systematically varied using DOE principles.

Test Design Matrix for Observation Post

Test design: the power is 98% for α = 0.1 and an S:N ratio of 1.0 for a completely randomized full factorial design with 2 replications.

                                  Day,          Day,       Night,          Night,
                                  deploy LRAS3  no LRAS3   deploy LRAS3    no LRAS3
Crew #1 (Crew #2 in overwatch)    2             2          2               2
Crew #2 (Crew #1 in overwatch)    2             2          2               2
Total                             4             4          4               4

Test Design Matrix for Ability to Hand Over Targets from the Dismounted LRAS3 to the RWS

Test design: the power is 78% for α = 0.1 and an S:N ratio of 1.0 for a completely randomized full factorial design, which can analyze main effects and interactions.

                                  Day   Night
Crew #1 (Crew #2 in overwatch)    2     2
Crew #2 (Crew #1 in overwatch)    2     2
Total                             4     4
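The statement that these full factorials "can analyze main effects and interactions" corresponds to fitting a linear model with those terms once the responses are collected. A minimal analysis sketch for the 2x2 handover matrix above, using invented placeholder times rather than real test data:

```python
# Hedged sketch: ANOVA of handover time over light condition and crew,
# matching the 2x2 design with 2 replications per cell. The response
# values below are invented placeholders, not test results.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

runs = pd.DataFrame({
    "light": ["Day", "Day", "Night", "Night"] * 2,
    "crew":  ["Crew 1"] * 4 + ["Crew 2"] * 4,
    "time":  [41, 44, 55, 58, 39, 42, 57, 60],  # seconds (placeholder)
})

# 'C(light) * C(crew)' expands to both main effects plus their
# interaction, i.e. the terms the slide says the design supports.
model = smf.ols("time ~ C(light) * C(crew)", data=runs).fit()
print(anova_lm(model, typ=2))
```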

Shaping the Conditions: Planning

Free play applies to the BLUFOR test unit, the unit whose Soldiers will conduct the supporting collective and individual tasks with the new equipment. Everything and everyone else in the operational test box is an enabler, there to create opportunities for data collection from the test unit in a realistic environment.

- Operational Test Team (working with the T&E IPT):
  - Crafts an operationally realistic environment in the box
  - Applies the DOE factors and conditions to a series of vignettes
  - Writes operations orders (OPORDs) from the BLUFOR higher headquarters
  - Designs event-driven, realistic triggers to elicit the desired BLUFOR tasks
- BLUFOR higher headquarters / White Cell:
  - Reinforces constraints and triggers realistically during the test event
- OPFOR:
  - Understands its role as a challenging enabler, not a competitor
  - Exercises free play within the constraints of each encounter, led by the operational test team

The result: free play within realistic constraints in a challenging environment.

Shaping the Conditions: Executing

The BLUFOR conducts doctrinally realistic missions in free play:
- Conducts unit-level troop leading procedures and develops its own operations orders
- Responds to battlefield stimuli as it sees fit, given its training

Everyone else monitors the test and adjusts as necessary:
- Operational Test Team:
  - Orchestrates all actions in the box to maintain a realistic environment and recognizes approaching conditions in order to execute triggers
  - Directs execution of triggered actions by the White Cell and OPFOR
  - Monitors successful collection of data points, using the test matrices as checklists
  - Develops changes to the test schedule in reaction to missed data points and builds consensus for the changes with the Evaluator, User, Developer, and OSD
- BLUFOR higher headquarters / White Cell:
  - Simulates the existence of higher headquarters and adjacent units
  - Sends triggers: fragmentary orders (FRAGOs), intel updates, and reports from adjacent units, as directed by the OT team
- OPFOR:
  - Acts as triggers: executes specific contacts as directed by the OT team
  - Fights hard within the realistic constraints of the scenario and encounter

Summary: Merging Operational Testing with DOE
- Determine the BLUFOR mission echelon for the test.
- Determine which missions and supporting tasks are most important and most affected by the SUT.
- Determine the main goals of the OT to guide the test planning.
- Determine the number of DOE test matrices that need to be created and how they fit together within the BLUFOR missions.
- Using the White Cell and the OPFOR, determine how each condition in the test matrix can be forced to occur.
- Plan time at the end of the test to run select additional missions, creating opportunities to collect data points missed due to free play.
- Analyze the responses using techniques that model them over the factors.

Focus on the right things. Do things right.

Conclusion: Test designs based on DOE that include free play for the test unit in an operationally realistic environment can be created and executed. The resulting data allow the use of scientific, quantifiable test and analysis techniques, which provide more meaningful evaluations that better inform senior acquisition decision makers and Warfighter commanders.