System Test and Evaluation Procedures


ATEC Pamphlet 73-1
Test and Evaluation
System Test and Evaluation Procedures
Headquarters, U.S. Army Test and Evaluation Command
Alexandria, VA
16 June 2010

SUMMARY of CHANGE
ATEC Pamphlet 73-1, System Test and Evaluation Procedures

This revision supersedes the following publications:

- ATEC Pamphlet 73-1, 19 April.
- Revised Guidance for the Early Strategy Review (ESR) and for the Test Concept In-Process Review (CIPR), 23 December.
- Revised Guidance for ATEC System Team (AST) Chair Tasking Authority, 23 December.
- Interim Policy Guidance (IPG) 04-6, ATEC's Level of Involvement on Clothing and Individual Equipment (CIE) and Organizational CIE (OCIE) Programs, 23 December.
- IPG 05-3, Fielding Selected Rapid Acquisition Initiatives (RAI) Programs to the Whole Army, 15 September.
- IPG 07-2, Use of the U.S. Army Materiel Systems Analysis Activity to Support ATEC Evaluations, 5 June.
- IPG 07-4, Coordination of ATEC Assessment and Evaluation Results, 28 November.
- IPG 08-1, Test and Evaluation Document Name Changes, 4 April.
- IPG 08-2, Release of Data, 27 May.
- IPG 08-3, Reliability Test Thresholds for Engineering and Manufacturing Development and Demonstration (EMDD) Phase, 29 July.
- IPG 08-4, Update on the Rapid Initiatives (RI) and Urgent Materiel Release (UMR) Process Within ATEC, 25 September.
- IPG 09-1, Reliability Growth Planning Methodology, 6 February.
- IPG 09-2, Capability Documents Review Checklist, 21 April.
- IPG 09-3, Sustainment (Availability) Key Performance Parameter (KPP) Evaluation, 12 June 2009.
- IPG 09-4, Classified and Unclassified Real Time Casualty Assessment (RTCA) Data Used by ATEC System Team (AST) Chairperson in Test and Evaluation (T&E), 15 July.
- IPG 09-5, Level of Evaluation for Non-Major Programs, 20 August.
- IPG 09-6, Instrumentation in Planning and Documentation, 20 October.
- IPG 09-7, Use of Model Based Data Engineering (MBDE) in ATEC, 21 October.

This revision also:

- Provides more detail on development and staffing of System Evaluation Plans (SEPs).
- Clarifies and updates the staffing required for coordination and approval of Test and Evaluation Master Plans (TEMPs).
- Removes the requirement for document Review Boards.
- Clarifies ATEC support to Source Selection Evaluation Boards (SSEBs).
- Updates policy in support of Rapid Acquisition Initiatives.
- Provides the latest ATEC organization.
- Provides the latest policies and procedures on evaluation strategies (e.g., Design of Experiments (DOE), Mission Based Test and Evaluation (MBT&E)).
- Updates the Test Schedule and Review Committee (TSARC) procedures.
- Updates ATEC support to Capabilities Development for Rapid Transition (CDRT).
- Updates the terminology used for test and evaluation assessments and reports.
- Updates the Early Strategy Review (ESR) and Test and Evaluation (T&E) Concept In-Process Review (CIPR).


Department of the Army                                        *ATEC Pamphlet 73-1
United States Army Test and Evaluation Command
4501 Ford Avenue
Alexandria, VA
16 June 2010

Test and Evaluation

SYSTEM TEST AND EVALUATION PROCEDURES

History. This pamphlet was last revised on 19 April.

Summary. ATEC supports the Army system acquisition and force development processes through overall management of the Army's test and evaluation programs. This pamphlet implements ATEC methodology for planning tests and system evaluation in accordance with ATEC Regulation 73-1; provides background information on integrated T&E strategies; and provides guidance and suggestions for preparing and formatting planning documentation for tests, evaluations, and assessments.

Applicability. This pamphlet applies to ATEC Headquarters (HQ) and all subordinate command activities (SCAs).

Supplementation. Supplementation of this pamphlet is permitted within each subordinate command to implement further guidance within respective areas of responsibility. All supplements will be coordinated with the ATEC HQ Director, Test and Technology, prior to approval by the Technical Director at each command. A copy will be provided to ATEC HQ (CSTE-TT).

Suggested improvements. The proponent for this pamphlet is the Test and Technology Directorate (CSTE-TT), ATEC. Users are invited to send comments and suggested improvements on DA Form 2028 (Recommended Changes to Publications and Blank Forms) through the chain of command to Commander, ATEC (CSTE-TTP), 4501 Ford Avenue, Alexandria, VA.

Distribution: BE. This pamphlet will be widely disseminated throughout ATEC to include HQ, DTC, OTC, AEC, and the Liaison, T&E field, and management offices.

OFFICIAL:

RODERICK BURKE, SR.
COL, GS
Chief of Staff

THOMAS C. CLOHAN
Deputy Chief of Staff for Information Management

*This pamphlet supersedes ATEC Pamphlet 73-1, dated 19 April. It also supersedes Revised Guidance for the Early Strategy Review (ESR) and for the Test Concept In-Process Review (CIPR), 23 December 2004; Revised Guidance for ATEC System Team (AST) Chair Tasking Authority, 23 December 2004; and Interim Policy Guidance (IPGs) 04-2, 04-6, 05-3, 07-2, 07-4, 08-1, 08-2, 08-3, 08-4, 09-1, 09-2, 09-3, 09-4, 09-5, 09-6, and 09-7.

TABLE OF CONTENTS

Chapter 1 Test and Evaluation Overview
  Purpose
  Scope
  References
  Foundation for T&E
  ATEC mission
  ATEC organization
  ATEC interface with acquisition community

Chapter 2 Test and Evaluation Management
  Overview
  Overview of acquisition programs and the Joint Capabilities Integration and Development System (JCIDS) (CJCSI 3170)
  Overview of the Defense Acquisition Framework (DODI 5000.02)
  Overview of the Interoperability and Supportability of Information Technology and National Security Systems Process (CJCSI 6212)
  Overview of the Information Assurance Process (DODD 8500.01E)
  Overview of National Defense Authorization Act (NDAA) FY2007 (Section 231)
  Overview of the ATEC integrated T&E process
  Rapid Acquisition Initiative (RAI)
  ATEC's support to Capabilities Development for Rapid Transition (CDRT)
  Capability Sets
  Technology demonstrations and experiments
  Joint T&E (JT&E) Program
  Chemical and Biological Defense Program (CBDP)
  Clothing and individual equipment (CIE) and organizational CIE (OCIE) programs
  Soldier Enhancement Program
  Combined Test Organization/Combined Test Teams (CTO/CTT)
  ATEC System Team (AST)
  Working groups and decision bodies
  Test Schedule and Review Committee (TSARC)
  Resource management

Chapter 3 System Evaluation Planning and Analysis
  Overview of evaluation planning
  Levels of evaluation for non-major programs
  Development of the evaluation strategy
  Steps in developing the evaluation strategy
  Documenting the evaluation process
  Evaluation tools
  Evaluation planning reviews and documentation
  Continuous evaluation and evaluation trends
  The Ilities

Chapter 4 Test Planning and Execution
  Overview of test planning
  Developmental and operational testing
  Combined and integrated DT/OT
  Contractor DT
  Use of Government test facilities
  Safety Releases and Safety Confirmations
  OTA Test Plan(s)
  DT test planning
  OT event planning
  Test Support Packages
  Overview of Test Readiness Reviews (TRR)
  Evaluation Decision Review (EDR)
  Developmental Test Readiness Review (DTRR)
  Operational Test Readiness Review (OTRR)
  Joint Test Readiness Review (JTRR)
  Overview of test execution
  Developmental test (DT) execution
  Operational test execution
  Customer test
  Data management overview
  DT data management
  OT data management
  AEC data management
  Data authentication
  Data release
  Data retention and disposition

Chapter 5 Test and Evaluation Reporting
  Overview
  Section I. Test Reporting
    Test reporting policy
    Developmental Test Report
    Live Fire Test Report
    Operational Test Reports
    Test Incident Report (TIR)
    Technical notes
    Test reporting timelines
  Section II. System Reporting
    Evaluation reporting
    Overview of ATEC's assessment and evaluation reports
    OTA Milestone Assessment Reports (OMARs)
    OTA Evaluation Reports/OTA Follow-on Report (OER/OFER)
    Draft reports
    Tailored/abbreviated assessments and evaluations
    OTA Assessment Report (OAR)
    ATEC reports that support type classification and materiel release

    System Analysis Report (SAR)
    System reporting timelines
    ATEC document distribution guidelines

Appendix A. References
Appendix B. ATEC Liaison Offices
Appendix C. Joint Capabilities Integration and Development System (JCIDS) Checklists
  Enclosure C-1. ICD Review Checklist
  Enclosure C-2. CDD Review Checklist
  Enclosure C-3. CPD Review Checklist
Appendix D. Capability and Limitations Report (CLR) Writing Guide
  Enclosure D-1. CLR Format
  Enclosure D-2. CLR Staffing
  Enclosure D-3. Example of Timeline
  Enclosure D-4. Data Authentication Group (DAG) Procedures for RAI
Appendix E. Test Schedule and Review Committee (TSARC)
Appendix F. Formats
  Enclosure F-1. Evaluation Determination Matrix (EDM)
  Enclosure F-2. Early Strategy Review and Concept In-Progress Review (ESR and CIPR)
  Enclosure F-3. System Evaluation Plan (SEP)
  Enclosure F-4. Operational Test Agency Test Plan (OTA TP) Format
  Enclosure F-5. Detailed Test Plan (DTP)
  Enclosure F-6. Operational Test Agency Assessment Report (OAR)
  Enclosure F-7. OTA Milestone Assessment Report (OMAR)/OTA Evaluation Report (OER)
  Enclosure F-8. System Analysis Report
  Enclosure F-9. OTA Follow-on Evaluation Report
  Enclosure F-10. Technical Note
Appendix G. Reliability, Availability, and Maintainability
Appendix H. Survivability
Appendix I. MANPRINT
Appendix J. Data Analysis
Appendix K. Test Types
Appendix L. ATEC Test Facilities
Appendix M. Data Authentication Group (DAG)
Appendix N. Enterprise Project Management
Appendix O. System Safety
Appendix P. Rock Drills
Appendix Q. Other T&E Considerations
Appendix R. Assessing Risk
Appendix S. Guidelines for Conducting Operational Test and Evaluation (OT&E) for Software-Intensive System Increments
Appendix T. Staffing of External Documents
Appendix U. Glossary

Table List

Table 2-1. AST Responsibilities
Table 2-2. Obligation Period and Policy
Table 3-1. Evaluation Process Inputs/Key Activities and Outputs by Acquisition Phase
Table 3-2. Levels of Evaluation for Non-Major Programs
Table 3-3. Evaluation Determination Matrix
Table 3-4. OSD Section 231 Report Guidelines and Implementation Methods
Table 3-5. Example Baseline Correlation Matrix
Table 3-6. Three Categories of Systems and Minimum WFT/TMT Evaluation Considerations
Table 3-7. Example Systems and Corresponding WFTs
Table 3-8. Example Factors and Conditions Table for a Ground Based Radar
Table 3-9. Example Full Factorial Design Test Matrix for a Ground Based Radar
Table 4-1. OTA TP Approval Authority
Table 4-2. Test Support Package Requirements
Table 4-3. Data Level Definitions
Table 4-4. Data Retention Guidelines
Table 5-1. Readiness Levels and Likely Risk Levels
Table 5-2. Consequence Risk Levels
Table 5-3. Overall Risk Assessment Guide
Table 5-4. ATEC Distribution List
Table B-1. LNO Locations and Contact Information
Table D-1. CLR Timeline
Table E-1. TSARC Schedule
Table J-1. ATEC Database Level Updates
Table L-1. DTC's Test Center Test Capabilities
Table L-2. OTC's Core Competencies - Test Directorate
Table P-1. Example of Rock Drill Process
Table Q-1. Climatic Design Types
Table Q-2. ILS Responsibilities
Table R-1. Risk Areas and Example Risks
Table R-2. Likelihood Criteria
Table R-3. Impact Criteria
Table R-4. Likelihood/Consequence Matrix: Risk Rating as a Function of Likelihood and Consequence of the Risk Event
Table R-5. Example Risk Ratings and Criteria
Table R-6. Example Presentation of Rated Risk
Table R-7. AMSAA Likelihood Criteria
Table R-8. AMSAA Likelihood/Consequence Matrix
Table R-9. Operational Impact Criteria
Table R-10. Operational Likelihood/Consequence Matrix
Table R-11. Operational Likelihood/Consequence Matrix
Table R-12. Decision Table (Fielded or Not Fielded)
Table R-13. Probability of Obtaining Sample Proportion P
Table R-14. Decision Table with Probabilities
Table T-1. Guide for Staffing Externally Generated Test and Evaluation Actions

Figure List

Figure 1-1. ATEC Organizations and Locations
Figure 1-2. ATEC's Strategic Organizational Construct
Figure 1-3. Primary Players in Army T&E
Figure 2-1. DOD Concurrent Acquisition Processes
Figure 2-2. RAI and Urgent Materiel Release (UMR) Process
Figure 2-3. Selection of Overseas Contingency Operations (OCO) Systems for CDRT
Figure 2-4. Rapid Materiel Acquisition Scenarios
Figure 2-5. Rapid Materiel Acquisition Scenario Detail
Figure 2-6. In-Theater System Assessments for Rapid MA ACAT III ONLY
Figure 2-7. Sample AST Charter
Figure 2-8. Difference between T&E WIPT and the AST
Figure 2-9. ASARC Membership
Figure 2-10. Army OIPT Membership
Figure 3-1. Determination of Level of Evaluation
Figure 3-2. Evaluation Strategy Summary
Figure 3-3. Example System Evaluation Dendritic
Figure 3-4. Example Warfighting Tasks and Measures
Figure 3-5. Relationships
Figure 3-6. A Combined View of the DODAF and AUTL/UJTL
Figure 3-7. Lens Chart Example
Figure 3-8. Mapping of System and System Functions to Tasks
Figure 3-9. Tactical Levels of Missions and Tasks
Figure 3-10. ESR Sample Agenda
Figure 3-11. T&E CIPR Sample Agenda
Figure 3-12. Lens Chart Sample
Figure 3-13. Example Data Source Matrix (DSM)
Figure 3-14. T&E Strategy Approval Process
Figure 3-15. TEMP Processes
Figure 4-1. Test Planning Process
Figure 4-2. OTRR Sample Agenda
Figure 5-1. T&E Reporting Process
Figure 5-2. LFT&E Documents Timeline
Figure 5-3. Test Report Documents Timeline
Figure 5-4. Risk Assessment Dendritic Example
Figure 5-5. Risk Mitigation Strategy
Figure 5-6. Risk Summary Example
Figure 5-7. System Assessment/Evaluation Report Timeline
Figure D-1. CLR Staffing Process Map
Figure H-1. Nuclear, Biological, and Chemical Survivability
Figure H-2. DA-approved CBRCS Criteria for Army Materiel
Figure H-3. CBRCS Evaluation Data Requirements
Figure I-1. MANPRINT Impact Rating Scale
Figure K-1. Examples of Test Types during Acquisition Phase

Figure M-1. Generic DAG Charter and SOP
Figure M-2. DAG Resourcing Considerations
Figure P-1. Example of Rock Drill Reverse Planning Process
Figure R-1. Derivation of Risk Level for the Joint Common Missile Seeker Dome


Chapter 1
Test and Evaluation Overview

1-1. Purpose
This pamphlet implements the policies and responsibilities set forth in Army Test and Evaluation Command (ATEC) Regulation 73-1. It establishes the ATEC guidelines and procedures for test and evaluation (T&E) of Army, Multi-Service and Joint, Interagency, and Multi-National programs. The pamphlet describes the T&E support that ATEC provides to the Defense Acquisition Process through developmental testing (DT), operational testing (OT), field experimentation, rapid acquisition initiatives (RAI), and independent system evaluation.

1-2. Scope
The procedures described in this pamphlet are based on Chairman of the Joint Chiefs of Staff Instructions (CJCSI) 3170 and 6212; Department of Defense (DOD) Directives/Instructions (DODD/DODI) 3200, 4630, 5000 series, and 8500; and Headquarters, Department of the Army (HQDA) policy documentation, including Army Regulation (AR) 70-1, AR 73-1, and Department of the Army (DA) Pamphlet 73-1. This ATEC pamphlet establishes the common procedures and processes that are used to implement the policies contained in ATEC Regulation 73-1. Additional procedures and information are contained in the appendixes of this ATEC pamphlet.

1-3. References
Required and related publications are listed in appendix A.

1-4. Foundation for T&E
The requirement for T&E as an integral part of the acquisition of materiel systems is mandated by law and delineated in multiple DOD and Army directives and regulations.

a. Office of Management and Budget (OMB) Circular A-109, Major System Acquisitions, establishes policies to be followed in the acquisition of major systems.

b. Title 10 United States Code (10 U.S.C.) includes statutory requirements for T&E. The key sections that address T&E are outlined below.

  Section 139 - Establishes the Director, Operational Test and Evaluation (DOT&E)
  Section 2366 - Requires survivability and lethality testing before full-rate production (FRP)
  Section 2399 - Directs operational T&E (OT&E) of Major Defense Acquisition Programs (MDAPs) and restricts contractor involvement in Initial Operational Test and Evaluation (IOT&E)

  Section 2400 - Describes determination of quantities to be procured for Low-Rate Initial Production (LRIP) of new systems
  Section 2681 - Establishes the DOD Major Range and Test Facility Base (MRTFB)

c. DODD 5000.01 and DODI 5000.02 establish the policy and procedures for the acquisition process of technology projects and acquisition programs for all Services. DODD 3200.11 establishes policy and responsibilities for the MRTFB. DODD 4630 addresses policy for interoperability and supportability of Information Technology (IT) and National Security Systems (NSS). DODD 8500 establishes the DOD Information Assurance Certification and Accreditation Program.

d. AR 73-1 implements policies and assigns responsibilities for T&E activities during the system acquisition process.

1-5. ATEC mission
ATEC is a direct reporting unit (DRU) under the Office of the Chief of Staff of the Army (CSA) per AR 10-87, dated 4 September 2007, Army Commands, Army Service Component Commands, and Direct Reporting Units, and by General Order (GO) 13. ATEC's mission is to plan, integrate, and conduct experiments, developmental testing, independent operational testing, and independent evaluations and assessments to provide essential information to acquisition decision makers and commanders.

1-6. ATEC organization
a. ATEC consists of a headquarters (HQ) located in Alexandria, VA; three subordinate command activities (SCAs); liaison offices; and T&E field and management offices. ATEC's organizations and their locations are depicted in figure 1-1. ATEC HQ T&E coordination and management activities and resources include:

(1) Liaison Offices (LNO). See appendix B for a list of current LNOs.

(2) Joint Test and Evaluation (JT&E) Office. See paragraph 2-12 for information.

(3) Joint Test Board (JTB). Coordinates and synchronizes T&E events with Joint Improvised Explosive Device Defeat Organization (JIEDDO) counter-improvised explosive device (IED) requirements and priorities to ensure all systems are adequately tested and evaluated to provide information to decision makers and the Warfighter in support of the Overseas Contingency Operations (OCO) (formerly called Global War on Terror (GWOT)) mission.

b. ATEC SCAs consist of two test commands and an evaluation center. The SCAs are as follows:

(1) Developmental Test Command (DTC). HQ DTC is located at Aberdeen Proving Ground, Maryland, and operates eight subordinate test centers.

These test centers provide the full spectrum of arctic, tropic, desert, and other environments under natural or precisely controlled conditions. They are:

(a) Aberdeen Test Center (ATC), Aberdeen Proving Ground, Maryland.
(b) Electronic Proving Ground (EPG), Fort Huachuca, Arizona.
(c) Redstone Test Center (RTC), Redstone Arsenal, Alabama.
(d) West Desert Test Center (WDTC), Dugway Proving Ground, Utah.
(e) White Sands Test Center (WSTC), White Sands Missile Range, New Mexico.

The following test centers are subordinate activities of Yuma Proving Ground:

(f) Cold Regions Test Center (CRTC), Fort Greely, Alaska.
(g) Tropic Regions Test Center (TRTC), Schofield Barracks, Hawaii, and Panama.
(h) Yuma Test Center (YTC), Yuma Proving Ground, Arizona.

(2) Operational Test Command (OTC). HQ OTC is located at Fort Hood, Texas. OTC has small cells at three locations: Fort Benning has the Infantry Support Cell; Fort Bliss has a Brigade Combat Team (BCT), the Army Evaluation Task Force (AETF); and Fort Leonard Wood has a Test and Evaluation Coordination Office (TECO-FLW). OTC operates eight test directorates that test and assess systems in a realistic operational environment using typical Soldiers to determine whether systems are effective, suitable, and survivable in varying environments. The eight test directorates are:

(a) Airborne and Special Operations Test Directorate (ABNSOTD), Fort Bragg, North Carolina.
(b) Aviation Test Directorate (AVTD), Fort Hood, Texas.
(c) Battle Command and Communications Test Directorate (BCCTD), Fort Hood, Texas.
(d) Futures Integration Test Directorate (FITD), Fort Hood, Texas.
(e) Intelligence Electronic Warfare Test Directorate (IEWTD), Fort Huachuca, Arizona.
(f) Maneuver Test Directorate (MTD), Fort Hood, Texas.
(g) Maneuver Support and Sustainment Test Directorate (MS2TD), Fort Hood, Texas.
(h) Fire Test Directorate (FSTD), Fort Sill, Oklahoma.

Figure 1-1. ATEC Organizations and Locations

(3) Army Evaluation Center (AEC). AEC is located in Alexandria, Virginia, and Aberdeen Proving Ground, Maryland. AEC operates eleven Warfighting Function evaluation directorates and performs system evaluations and assessments for all commodity areas. The Warfighting Function evaluation directorates are:

(a) Ballistic Missile Defense Evaluation Directorate (BMDED). Serves as the Army operational test and evaluation arm of the Ballistic Missile Defense System (BMDS) Combined Test Force (CTF) and lead service member of the BMDS Operational Test Agency (OTA) Team. BMDED is located at Huntsville, Alabama.

(b) Command and Control Evaluation Directorate (C2ED). Evaluates Army and joint command, control, and communications systems, business information systems, and medical information systems.

(c) Fires Evaluation Directorate (FED). Evaluates Army fire support (rockets and missiles, cannon, command and control) and air and missile defense systems.

(d) Future Force Evaluation Directorate (FFED). Evaluates future force and Army transformation acquisition programs.

(e) Integrated Logistics Support Evaluation Directorate (ILSED). Conducts integrated suitability, integrated logistics support (ILS), and MANPRINT evaluation planning, analysis, and reporting for Army warfighting systems. ILSED shapes and influences DOD and Army policies to improve logistics supportability and supports DASA (APL) in their role as the Army Logistician.

(f) Intelligence Evaluation Directorate (IED). Evaluates intelligence-related acquisition programs covering national, theater, coalition, and commercial space, such as surveillance and reconnaissance, electronic and information warfare, and counter-IED equipment.

(g) Maneuver Air Evaluation Directorate (MAED). Evaluates aviation (aircraft, air traffic control, munitions, and Soldier support) systems.

(h) Maneuver Ground Evaluation Directorate (MGED). Evaluates infantry/Soldier systems, wheeled and tracked combat platforms, sensors and target acquisition systems, battle command systems, combat training simulators, and lethal and nonlethal weapons/munitions programs.

(i) Reliability and Maintainability Evaluation Directorate (RMED). Provides reliability, availability, and maintainability (RAM) evaluation of a system. RMED is located at Aberdeen Proving Ground, Maryland.

(j) Survivability Evaluation Directorate (SVED). Provides survivability (ballistic and non-ballistic battlefield threats), live-fire, vulnerability, and lethality evaluations and reports of Army and designated joint systems. SVED also leads ATEC's Information Assurance Task Force (IATF) for the Combatant Commanders (COCOM). SVED is located at Aberdeen Proving Ground, Maryland.

(k) Sustainment Evaluation Directorate (SED). Evaluates programs that provide sustainment, mobility, maneuver support, quartermaster, ordnance, transportation, military police, engineer, and chemical-biological systems.

c. ATEC performs its mission of integrated test and evaluation via the ATEC System Team (AST). An AST is formed for every evaluated system and is composed of members from all SCAs (AEC, DTC, and OTC), plus others as required, including liaison officers, HQ budget officers, and the Deputy Chief of Staff for Information Management (DCSIM). The AST is responsible for all ATEC T&E products. Figure 1-2 is a visualization of this strategic organizational construct.

Figure 1-2. ATEC's Strategic Organizational Construct

1-7. ATEC interface with acquisition community
a. Congress. Congress mandates that the Office of the Secretary of Defense (OSD) ensure maximum effectiveness of T&E. Both general policy and system-specific guidance may come in the Defense Authorization and Appropriations Bills, which are issued late in the calendar year. Figure 1-3 reflects the current DOD and Army T&E structure.

Figure 1-3. Primary Players in Army T&E

b. OSD. OSD oversight extends to MDAPs and other programs designated via the OSD T&E oversight list. The process for designating a lead Service for a multi-Service operational test and evaluation is outlined in the Memorandum of Agreement (MOA) between all the Service Operational Test Agencies. This MOA is updated annually.

(1) The Director, Operational Test and Evaluation (DOT&E) performs operational and live fire T&E oversight.

(2) The Director, Developmental Test and Evaluation (DDT&E), located under the office of the Director, Defense Research and Engineering (DDR&E), performs developmental T&E oversight.

(3) The Defense Acquisition Executive (DAE) performs management of MDAPs. The DAE is the Under Secretary of Defense for Acquisition, Technology, and Logistics [USD(AT&L)] for materiel programs and the Assistant Secretary of Defense (Command, Control, Communications, and Intelligence) [ASD(C3I)] for IT programs.

(4) Defense Information Systems Agency (DISA), Joint Interoperability Test Command (JITC). JITC is DISA's Operational Test Activity (OTA) for DISA-managed programs. JITC also provides T&E support to other government agencies on a cost-reimbursable basis. JITC has responsibility for certifying joint interoperability of all Command, Control, Communications, Computers, and Intelligence (C4I) systems and Automated Information Systems (AIS) that produce, use, or exchange information in any form and exchange information between Services, agencies, or countries. JITC and ATEC have a memorandum of understanding (MOU) to facilitate the coordination of joint interoperability test requirements for IT/NSS defense acquisition and defense technology programs.

(5) Joint Forces Command (JFCOM). JFCOM is one of nine combatant commands in DOD and the only combatant command focused on the transformation of U.S. military capabilities. The command's four primary roles in transformation are joint concept development and experimentation, joint training, joint interoperability and integration, and the primary conventional force provider. ATEC coordinates with JFCOM in order to gather data while forces are training in a common joint environment.

(6) DOD Test Resource Management Center (TRMC). The TRMC plans for and assesses the adequacy of the MRTFB to provide adequate testing in support of the development, acquisition, fielding, and sustainment of defense systems, and maintains awareness of other T&E facilities and resources, within and outside the Department, and their impacts on DOD requirements.

(7) Joint Improvised Explosive Device Defeat Organization (JIEDDO). A DOD directive established a Joint IED Test Board under the lead of ATEC to coordinate all Joint IED Defeat developmental testing in DOD.

(8) Missile Defense Agency (MDA). The MDA has developed a research, development, and test program focusing on missile defense as a single, layered defense system. ATEC participates as the Army operational T&E arm of the Ballistic Missile Defense System Combined Test Force and lead service member of the BMDS Operational Test Activity team.

c. Operational Test Activities (OTAs). OMB Circular A-109 and the DOD 5000 series direct that each military department establish an independent operational test activity to plan and conduct operational test and evaluation. The current Service OTAs are:

(1) Air Force: Operational Test and Evaluation Center (AFOTEC).
(2) Army: Test and Evaluation Command (ATEC).
(3) Marine Corps: Operational Test and Evaluation Activity (MCOTEA).
(4) Navy: Commander, Operational Test and Evaluation Force (COMOPTEVFOR).

d. Army.

(1) The Army Acquisition Executive (AAE) is responsible for all Army acquisition T&E planning, programming, and budgeting. Currently the AAE is the Assistant Secretary of the Army (Acquisition, Logistics, and Technology) [ASA(ALT)]. The Program Executive Office (PEO) reporting chain is through the AAE to the DAE.

(2) The Army T&E Executive is the senior Army official providing oversight on all Army T&E policy and procedural issues. Currently, the Army T&E Executive is located in the Office of the Deputy Under Secretary of the Army (DUSA).

(3) The Test and Evaluation Office (TEO) is the key Army proponent for Army T&E policy and for the DOD Chemical and Biological Defense Program (CBDP) T&E program. Currently the TEO is located in the office of the DUSA. The Director of TEO is also the T&E Executive.

(4) Other HQDA staff elements with involvement in T&E are:

(a) DCS, G-1 provides guidance regarding Manpower and Personnel Integration (MANPRINT).

(b) DCS, G-2 provides guidance regarding threat issues.

(c) DCS, G-3/5/7 reviews and coordinates critical operational issues and criteria (COIC) for all non-tactical C4I/IT programs.

(d) DCS, G-4 provides integrated logistics support (ILS) guidance.

(e) DCS, G-6 provides a joint, net-centric information enterprise that enables Warfighter decision-making superiority. The Central Technical Support Facility (CTSF) at Fort Hood, Texas, serves as the Army's integration, compliance, and certification activity for Warfighter and business management areas of the Army's enterprise information environment. In this role, the CTSF is part of the ATEC T&E environment, playing a role similar to DISA or JITC. DCS, G-6 also manages T&E life cycle activities for IT systems in support of the AAE.

(f) DCS, G-8 plans, programs, and budgets research, development, test, and evaluation (RDT&E), Army procurement appropriations (APA), and operation and maintenance (OMA) T&E funds. G-8 also reviews, coordinates, and approves critical operational issues and criteria (COIC) for all materiel and tactical C4I/IT programs. G-8 manages, solicits, and coordinates Army participation in JT&E.

(5) The Army is reorganizing to increase responsiveness both globally and at home. AR 10-87 identifies the new structure. The U.S. Army now has three types of headquarters: Army Commands (ACOMs), Army Service Component Commands (ASCCs), and Direct Reporting Units (DRUs).

(6) ACOMs.

(a) The U.S. Army Forces Command (FORSCOM) supports T&E by providing user troops and units through the Test Schedule and Review Committee (TSARC) process.

(b) The U.S. Army Training and Doctrine Command (TRADOC) is the principal capabilities developer, doctrine developer, training developer, and trainer for materiel and tactical C4I/IT systems.

(c) The U.S. Army Materiel Command (AMC) manages the Army's science and technology base through its laboratories and research, development, and engineering centers. One of its components, the U.S. Army Communications-Electronics Command's (CECOM) Army Participating Test Unit (APTU), supports joint interoperability testing of C4I/IT systems by conducting tests, performing analysis, and generating joint interoperability certification reports for all Army elements and systems.

(7) The nine ASCCs are U.S. Army Europe (USAREUR), U.S. Army Central (USARCENT), U.S. Army North (USARNORTH), U.S. Army South (USARSO), U.S. Army Pacific (USARPAC), U.S. Army Special Operations Command (USASOC), Military Surface Deployment and Distribution Command (SDDC), U.S. Army Space and Missile Defense Command (SMDC)/Army Strategic Command (ARSTRAT), and Eighth U.S. Army (EUSA). All ASCCs support T&E by providing user troops and units through the TSARC process, as required.

(a) The U.S. Army Europe (USAREUR) supports T&E by providing user troops and units through the TSARC process, as required.

(b) The U.S. Army Pacific (USARPAC) supports T&E by providing user troops and units through the TSARC process, as required.

(c) The SDDC supports T&E as the transportability functional expert, advising the MATDEV and ATEC for acquisition systems.

(d) The SMDC serves as materiel developer and tester for space and missile defense capabilities. SMDC manages the Ronald Reagan Ballistic Missile Defense Test Site (Kwajalein Atoll) and the High Energy Laser Systems Test Facility (HELSTF) at White Sands Missile Range.

(e) USASOC serves as the capabilities developer, doctrine developer, training developer, trainer, and OTA for assigned Special Operations (SO)-peculiar systems. When an SO-peculiar system's use is anticipated outside of USASOC, USASOC teams with ATEC to ensure completeness of the T&E.

(8) The 11 DRUs are U.S. Army Network Enterprise Technology Command/9th Signal Command (Army) [NETCOM/9th SC(A)], U.S. Army Medical Command (MEDCOM), U.S. Army Intelligence and Security Command (INSCOM), U.S. Army Criminal Investigation Command (USACIDC), U.S. Army Corps of Engineers (USACE), U.S. Army Military District of Washington (MDW), ATEC, U.S. Military Academy (USMA), U.S. Army Reserve Command (USARC), U.S. Army Acquisition Support Command (USAASC), and U.S. Army Installation Management Command (IMCOM).

The DRUs support T&E in different ways and to different extents.

(a) MEDCOM serves as the MATDEV, capabilities developer, doctrine developer, training developer, trainer, tester, and evaluator for assigned systems. MEDCOM also maintains a human research review via the U.S. Army Medical Research and Materiel Command (USAMRMC) and provides health hazard assessments via the U.S. Army Public Health Command (Provisional) for acquisition systems prior to conduct of OT&E. The Surgeon General (TSG) is the tester and evaluator liaison/point of contact for the health hazard assessment program.

(b) INSCOM serves as the MATDEV, capabilities developer, doctrine developer, training developer, and trainer for assigned systems. INSCOM also conducts T&E for assigned classified or secure systems.

(c) USACE serves as the capabilities developer, trainer, tester, and evaluator for assigned systems.

(d) The USARC supports T&E by providing user troops and units through the TSARC process, as required.

(e) IMCOM provides base support services as required during testing.

(9) The Rapid Equipping Force (REF) is an organization that takes its guidance from HQDA G-3 and reports directly to the Vice Chief of Staff of the Army (VCSA). The REF's mission is to rapidly increase mission capabilities while reducing risk to Soldiers and others. It works directly with operational commanders to find promising materiel solutions to their identified operational requirements. ATEC works with the REF and materiel developers to collect and provide as much information as is known about the materiel solution to the operational commanders. The REF accomplishes its mission in three ways.

(a) Equip operational commanders with off-the-shelf (government or commercial) solutions or rapid developmental items that can be researched, developed, and acquired quickly.

(b) Insert future force technology solutions that engaged and deploying forces require. Subject to the time available, ATEC tests and evaluates key technologies and systems under operational conditions and provides a Capabilities and Limitations Report (CLR) and a Safety Confirmation (SC) prior to a REF system being equipped for troop use. ATEC also provides a Safety Release for testing of REF systems/items involving Soldiers as test participants.

(c) Address capabilities and advise Army staff of findings that will enable our forces to rapidly confront an adaptive enemy.

e. Other Government agencies.

(1) National Security Agency (NSA). ATEC provides limited testing and evaluation services to NSA for some intelligence-related acquisition programs, surveillance and

reconnaissance, electronic and information warfare, covering national, theater, coalition, and commercial space.

(2) Department of Homeland Security (DHS). ATEC provides limited testing and evaluation services to DHS and various DHS components, including the Transportation Security Administration, Customs and Border Protection, and others. ATEC and DHS have collaborated, and will continue to collaborate, on aspects of T&E policy, including best practices to support federal acquisition programs and T&E training.

(3) DOD Biometrics Organization. The Biometrics Task Force (BTF) provides biometric oversight and policy on all DOD biometric standards and requirements and coordinates with non-DOD agencies. The Biometrics Fusion Center (BFC), located in West Virginia, provides standards performance testing, automated biometric identification system (ABIS) interoperability testing, and some developmental testing of biometric acquisition systems, and coordinates with ATEC to provide results of biometric testing and to ensure biometric test support for operational test events, as necessary.

Chapter 2
Test and Evaluation Management

2-1. Overview
a. Today's Army has a growing need to ensure that fielded systems, as well as new systems in development, are designed to work as part of a larger System of Systems (SoS) and Family of Systems (FoS). They must be compatible, interoperable, integrated, and usable by the Warfighter. ATEC is entrusted as the independent tester and evaluator of Army materiel systems and of designated multi-Service and joint systems. AEC performs evaluation planning and uses the test resources of DTC, OTC, and the materiel developer to carry out detailed evaluations that encompass a system's effectiveness, suitability, and survivability (ESS) in technical and operational contexts. ATEC's role reduces risk to the Warfighter and provides a vital source of information to decision makers. ATEC also closes the feedback loop with materiel developers so that each iteration of a system represents an incremental improvement in Warfighter capabilities.

b. DOD acquisition test and evaluation overall process. Most of the processes described in the following paragraphs are the responsibility of the Capabilities Developer or the PM; the AST will typically encounter issues, constraints, and delays due to concurrency issues and dependencies related to these processes. A basic understanding of these processes will allow AST members to better mitigate the risk of T&E program schedule slippage caused by ATEC. Furthermore, the AST may be able to identify potential opportunities for collecting evaluation-relevant data within the context of the integrated T&E program. The concurrent and interdependent acquisition-related processes that impact the AST mission are illustrated in figure 2-1. Although the figure depicts the DOD 5000 acquisition system timeline from start to finish, systems will enter the acquisition process at various milestones based upon technology maturity. It is possible that not all processes will apply to some systems. Evaluation programs will be tailored, based upon elements of the overall DOD acquisition process, to the specific requirements, conditions, and contexts associated with each program.

Figure 2-1. DOD Concurrent Acquisition Processes
(The figure aligns parallel timelines for the Information Assurance process (DOD 8500), the ATEC T&E process (ATEC 73-1), the Defense Acquisition System (DOD 5000), system-specific JCIDS documents (CJCSI 3170), and IT/NSS interoperability and supportability certification (CJCSI 6212).)

2-2. Overview of acquisition programs and the Joint Capabilities Integration and Development System (JCIDS) (CJCSI 3170)
a. The JCIDS is a system for translating mission requirements, current capabilities, concepts of operation, and needs into doctrine- or materiel-based solution alternatives that can be analyzed and developed through the DOD 5000 framework. The requirements process supports the acquisition process by providing validated capabilities and associated performance criteria to be used as a basis for acquiring the right systems. T&E will assess whether new or modified systems deliver their intended capability within the applicable functional capabilities area. There will be a need to consider realistic test environments, including joint mission and joint test environments, to assess an individual system's contribution to joint mission capability. The Joint Capabilities Document (JCD) will capture the current state of capabilities that will lead to either a materiel or non-materiel solution set (or a combination). For materiel solutions, the JCIDS documents of interest to T&E are the JCD, Initial Capability Document (ICD), Capability Development Document (CDD), Capability Production Document (CPD), and the Concept of Operations (CONOPS) for the system under development. There are many different types of CONOPS prepared to address strategic and tactical employment and support concepts as well as other aspects of mission accomplishment. All of these CONOPS should be used to understand how and in what context the system will be employed.

b. The JCD defines sets of capabilities necessary to support joint missions in support of the Family of Future Joint Concepts or CONOPS. The JCD is a snapshot of the current joint capabilities in a mission area and/or scenario. JCDs are usually developed by one or more of the following entities: Combatant Commands, Combat Support Agencies, or Functional Capabilities Boards (FCBs). The JCD will be used as a baseline for one or more Functional Solutions Analyses (FSAs) leading to the appropriate ICDs or joint doctrine, organization, training, materiel, leadership and education, personnel, and facilities (DOTMLPF) Change Recommendations (DCRs). JCDs are relevant to acquisition processes occurring prior to MS A as defined by the DOD 5000 series and Army regulation. JCDs are not used for the development of CDDs or CPDs. If JCDs point to materiel solutions, they will feed the ICD by characterizing current joint capabilities. The combatant development agency develops the JCD and drives most of the JCIDS process.

c. An ICD provides a prioritized list of materiel approaches to providing the desired joint warfighting capability. It describes the current capability gaps, summarizes the DOTMLPF analysis, and describes why changes to DOTMLPF alone would be insufficient to provide the desired capability. ICDs are required on all new programs when a Mission Need Statement (MNS) is greater than two years old. All programs that are to be initiated at either MS B or MS C are also required to have an ICD. The Capabilities Developer (CBTDEV) develops the ICD in close coordination with the PM. The ICD contains a plan for conducting an Analysis of Alternatives (AoA) between the possible materiel solutions. Performance of the AoA is part of the Materiel Solution Analysis (MSA) phase of the DOD 5000 process. The MSA phase culminates in a preferred system concept.

d. The CDD is the requirements document that is most relevant during the Technology Development (TD) phase of the acquisition process. The CDD captures knowledge gained about individual technologies that will support the integrated system. There will be experiments and demonstrations underway to determine the most effective technologies to integrate in order to close the capability gaps identified in the ICD. When the PM, who has sponsored the experimentation or demonstration process, determines that the demonstration is complete but additional development is required before fielding, the PM will coordinate with the CBTDEV, who creates a CDD or modifies an existing CDD to guide the development process. CDDs are system specific and are drawn upon heavily throughout an acquisition program. CDDs document the results from the TD phase and are required to support a Milestone (MS) B decision. Requirements recorded in the CDD are used to develop Measures of Effectiveness (MOEs) and Measures of Performance (MOPs) that are specified in the ATEC System Evaluation Plan (SEP). CDDs are rewritten or updated for each increment of an evolutionary acquisition program.

e. The CPD captures refinements to integrated system performance requirements and production details that emerge during the Engineering and Manufacturing Development (EMD) phase. During this phase, system-of-systems functionality is defined and a detailed design is completed, with the system components integrated into a functional system suitable for demonstration purposes. The integrated system undergoes DT and/or a limited deployment from which operational data may be collected.
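The four JCIDS documents described above differ mainly in who develops them, what they capture, and which decision they support. The short Python sketch below restates that progression as simple data. It is an illustrative convenience only: the class and field names are not part of JCIDS, DOD 5000, or ATEC policy; the milestone associations are condensed from paragraphs c and d above and paragraph g below; and the developer shown for the CPD is an assumption, since the text does not state it explicitly.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CapabilityDocument:
    """Simplified summary of a JCIDS capability document (illustrative only)."""
    name: str
    developed_by: str
    captures: str
    supports: str


# Condensed from section 2-2; wording abbreviated, structure hypothetical.
JCIDS_DOCUMENTS = (
    CapabilityDocument(
        name="JCD",
        developed_by="Combatant Commands, Combat Support Agencies, or FCBs",
        captures="Snapshot of current joint capabilities in a mission area or scenario",
        supports="Pre-MS A analysis (baseline for FSAs leading to ICDs or DCRs)"),
    CapabilityDocument(
        name="ICD",
        developed_by="Capabilities Developer (CBTDEV), in close coordination with the PM",
        captures="Capability gaps, DOTMLPF analysis summary, and the plan for the AoA",
        supports="Program initiation (required for programs initiated at MS B or MS C)"),
    CapabilityDocument(
        name="CDD",
        developed_by="CBTDEV (created or updated in coordination with the PM)",
        captures="Technology Development results; source of MOEs/MOPs for the ATEC SEP",
        supports="MS B decision (rewritten or updated for each increment)"),
    CapabilityDocument(
        name="CPD",
        developed_by="CBTDEV (assumed here; not stated explicitly in the text above)",
        captures="EMD refinements and production attributes for the system to be fielded",
        supports="MS C decision and entry into LRIP"),
)


def summarize(name: str) -> str:
    """Return a one-line summary of the named capability document."""
    for doc in JCIDS_DOCUMENTS:
        if doc.name == name:
            return f"{doc.name} ({doc.developed_by}): {doc.captures}; supports {doc.supports}."
    raise KeyError(f"Unknown JCIDS document: {name}")


if __name__ == "__main__":
    print(summarize("CDD"))
```

The frozen dataclass is used only to keep the example compact and read-only; any tabular representation of the same four rows would serve equally well.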

f. Desired performance goals (objective) and minimally acceptable goals (threshold) may be revised as production realities, resources, and mission factors change. The CDD is modified in response to those factors, and the changes are also captured in the CPD. The CPD will identify production attributes for a system and contain detailed instructions for producing a system that will meet mission needs and address key requirements within the constraints and conditions defined by the Family of Joint Future Concepts and the CONOPS. The production process will make use of automated tools for specifying standards and supporting interoperability and supportability consistent with CJCSI 6212 requirements.

g. The CPD provides the requirements considerations that drive the progression through the Production and Deployment (P&D) phase. A CPD is required to successfully complete a MS C decision and to enter into Low-Rate Initial Production (LRIP).

h. The AST members will review all capability documents. The AST will use a CDD/CPD review checklist (see appendix C) to ensure requirements are measurable and critical operational issues (COIs) answer pertinent questions concerning the system's or system-of-systems' ESS. These checklists are a guide and should not be viewed as all-encompassing for reviewing capabilities documents.

2-3. Overview of the Defense Acquisition Framework (DODI 5000.02)
a. The Defense Acquisition System, as defined by the DOD 5000 series, is a management process by which the DOD provides effective, affordable, and timely systems to the users. The process is broken down into stages based upon the level of maturity of the technologies comprising the system. The process includes milestone reviews where high-level decision-making is focused. Key participants in the process include:

(1) The Program/Project/Product Manager (PM or Materiel Developer) and staff, including Working-level Integrated Product Teams (WIPTs).
(2) Overarching committees (e.g., Overarching Integrated Product Teams (OIPTs)).
(3) The Defense Acquisition Board (DAB).
(4) Oversight functionaries (e.g., component headquarters, Director of Operational Test and Evaluation (DOT&E)).
(5) Contractors.
(6) CBTDEVs (TRADOC Capabilities Management (TCM)).
(7) Training developers.
(8) Evaluators.
(9) Developmental and operational testers.
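Milestone reviews are where many of the documents described in sections 2-2 and 2-3 converge. The sketch below condenses the MS B entrance criteria and the MS C considerations described in paragraphs d and e that follow into a single lookup. It is a simplified, non-authoritative illustration; the dictionary layout and helper function are assumptions of this example, not anything prescribed by DOD 5000 or ATEC, and the lists are not complete milestone checklists.

```python
# Documents associated with the major milestone reviews, condensed from
# paragraphs 2-3.d and 2-3.e below (illustrative only; not a complete checklist).
MILESTONE_DOCUMENTS = {
    "MS B": [
        "Acquisition Program Baseline (APB)",
        "Acquisition Strategy",
        "Test and Evaluation Master Plan (TEMP)",
        "Approved Capability Development Document (CDD)",
    ],
    "MS C": [
        "Capability Production Document (CPD)",
        "Test and Evaluation Master Plan (TEMP)",
        "OTA Milestone Assessment Report (OMAR)",
        "Information Support Plan (ISP)",
        "J-6 Interoperability and Supportability certification",
        "Information Assurance certification",
    ],
}


def missing_documents(milestone: str, on_hand: set) -> list:
    """Return the documents from the condensed list that are not yet in hand."""
    return [doc for doc in MILESTONE_DOCUMENTS.get(milestone, []) if doc not in on_hand]


if __name__ == "__main__":
    # A hypothetical program approaching MS B with only the APB and TEMP in hand.
    have = {"Acquisition Program Baseline (APB)",
            "Test and Evaluation Master Plan (TEMP)"}
    print(missing_documents("MS B", have))
    # -> ['Acquisition Strategy', 'Approved Capability Development Document (CDD)']
```

A simple difference check like this mirrors how an AST might track readiness for an upcoming review, but the real entrance criteria are governed by the milestone decision authority, not by this sketch.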

b. Different acquisition programs will enter the acquisition process at different milestones based upon technology maturity or other factors. Prior to MS B, the acquisition process is considered to be in its pre-systems acquisition phase. Between MS B and Full-Rate Production (FRP), the acquisition process is in the systems acquisition phase. The phase between the achievement of Full Operational Capability (FOC) and system disposal is considered the Operations and Support (O&S) phase of an acquisition program. Regardless of when a program enters the process, the PM will need to organize a T&E Working-level Integrated Product Team (T&E WIPT).

c. A series of technology demonstrations may be conducted to help the user and the developer agree on an affordable, militarily useful solution based on mature technology. The products of the TD phase include the successor requirements document to the ICD, which is the system-specific CDD, the Acquisition Program Baseline (APB), and a draft Test and Evaluation Master Plan (TEMP). The TD phase ends when the technology capability has been demonstrated in a relevant environment.

d. Milestone (MS) B is the usual point of program initiation for defense acquisition programs. At MS B the PM has a technical solution but has not yet integrated the subsystems into a single, complete system. The entrance criteria for MS B include: (1) the APB; (2) the Acquisition Strategy; (3) the TEMP; and (4) an approved CDD. Each increment of an evolutionary acquisition program will have its own MS B.

e. At MS C, the Milestone Decision Authority (MDA) considers the CPD, TEMP, Operational Test Agency (OTA) Milestone Assessment Report (OMAR), Information Support Plan (ISP), and J-6 Interoperability and Supportability and IA certifications to assess program maturity and to make the decision to commit the DOD to LRIP (or procurement for systems that do not require LRIP). MS C also authorizes limited deployment in support of operational testing for Major Automated Information Systems (MAIS) or other software-intensive systems. One purpose of LRIP is to develop a manufacturing capability sufficient to produce the minimum quantity of systems required for Initial Operational T&E (IOT&E).

f. Full-Rate Production Decision Review (FRP DR). An Operational Test Readiness Review (OTRR) will determine whether the system is ready for OT. The MDA decides if and when the program/system is ready to continue into full-rate production; this meeting is called the FRP DR. The decision to proceed into full-rate production will be documented in an Acquisition Decision Memorandum (ADM). This effort delivers the fully funded quantity of systems and supporting materiel and services for the program or increment to the users. During this effort, units will typically attain Initial Operational Capability (IOC). As technology, software, and threats change, follow-on operational test and evaluation (FOT&E) shall be considered to assess current mission performance and to inform operational users during the development of new capability requirements.

g. The Operations and Support (O&S) phase of the DOD 5000 process comprises the use and sustainment of the system. The purpose of this phase is to execute a support program that meets operational support performance requirements and sustains the system in a cost-effective manner over its total life cycle. O&S has two major efforts: Life Cycle Sustainment and Disposal.

(1) Life Cycle Sustainment includes supply, maintenance, transportation, sustaining engineering, data management, configuration management, manpower, training, supportability, interoperability, and other functions. Sustainment strategies can be expected to evolve throughout the system life cycle. During O&S, the PM will work with the users to document performance and support requirements in agreements that specify objective outcomes, measures, resource commitments, and stakeholder responsibilities.

(2) The Services, in conjunction with users, conduct continuing reviews of sustainment strategies to compare performance expectations, as defined in performance agreements, against actual performance measures. Once the system has reached the end of its useful life, it is disposed of in an appropriate manner.

2-4. Overview of the Interoperability and Supportability of Information Technology and National Security Systems (IT/NSS) Process (CJCSI 6212)
a. CJCSI 6212 establishes policies and procedures for the J-6 Interoperability and Supportability (I&S) certification of all IT/NSS programs. CJCSI 6212 mandates the Net Ready Key Performance Parameter (NR-KPP) as a requirement to be addressed in the CDD and the CPD. The instruction also establishes the Information Support Plan (ISP), which identifies and documents the information needs, infrastructure support, and IT/NSS interface requirements and dependencies, focusing on net-centric, interoperability, supportability, and sufficiency concerns. The ISP will include the NR-KPP requirements in sufficient detail to enable testing and verification of I&S characteristics. CJCSI 6212 requires that the CDDs and CPDs associated with IT/NSS programs be certified by the J-6 for I&S. The J-6 provides total life cycle oversight of Warfighter systems interoperability. The responsibility for end-to-end interoperability testing of IT/NSS systems is assigned to DISA/JITC.

b. Compliance with the Net-Centric Operations and Warfare Reference Model (NCOW-RM). The NCOW-RM is a framework for characterizing enterprise-level activities, services, technologies, and concepts (e.g., the data sharing strategy) that enable a net-centric environment for warfighting, business, and management operations.

(1) The stated purpose of the NCOW-RM is to describe the evolving DOD enterprise aspects of an objective net-centric information environment for the Global Information Grid (GIG). The NCOW-RM, as designed, serves as a common, enterprise-level reference model for the DOD's net-centric operations and for current and future acquisition programs to reference. It enables a shared vision of the DOD enterprise information environment and is used to assist decision makers in promoting enterprise-wide unity of effort.

(2) The stated NCOW-RM objective is to perform program development and oversight with a uniform, department-wide reference architecture. Information-technology-related issues can be addressed within individual programs and across the set of enterprise programs in a constructively consistent, coherent, and comprehensive manner.

(3) The NCOW-RM describes the activities required to establish, use, operate, and manage the net-centric information environment at the enterprise level, to include the generic user interface, the intelligent-assistant capabilities, the net-centric service capabilities, and enterprise net-centric management components. Although acknowledged, the richness of core functional enterprise services, or community-of-interest functional enterprise services, is excluded from the NCOW-RM. These core functional enterprise services are left to the functional community and/or the community of interest to develop and describe.

(4) The NCOW-RM also identifies a selective set of key standards that will be needed for evolving NCOW capabilities to support the GIG.

c. Development of GIG Enterprise Service Profiles (GESPs). (GESPs were formerly called Key Interface Profiles (KIPs).) GESPs support interoperability across the GIG through configuration control of key interfaces. GESPs provide a description of required operational functionality, systems functionality, and technical specifications for the interface. Integrated architectures are pictorial and tabular representations of information flows, systems, and system interfaces. They provide essential information for determining how to test the system and what the required test resources are. The Defense Acquisition Guidebook, chapter 7, provides more information on GESPs.

(1) The GIG Technical Guidance (GTG) is an evolving web-enabled capability providing the technical guidance necessary for an interoperable and supportable GIG built on net-centric principles. The GTG provides a one-stop, authoritative, configuration-managed source of technical compliance guidance that synchronizes previously separate efforts. The GTG is designed to enable users to decide which guidance is applicable and to find detailed information and artifacts needed to meet functional requirements (GIG features and capabilities), DOD Information Technology (IT) Standards Registry (DISR)-mandated GIG net-centric IT standards, supporting GIG IT standards, and GESPs.

(2) The GTG is the source for all technology guidance and standards implementation information used in describing the GESPs necessary to meet the net-centric operational requirements specified in the system/service views of an integrated architecture. The GTG contains a program characterization questionnaire and compliance declaration matrix that points to applicable GESPs. The GESPs are built from DISR-mandated IT standards reflected in a standards profile and include associated implementation guidance, reference architecture, and testing criteria necessary to meet all GIG-related requirements characterized in the integrated architecture system/service views.

d. When DISA/JITC is not the interoperability testing organization, interoperability test plans, test analyses, and test reports will be coordinated with DISA/JITC to ensure sufficient information is available to allow DISA/JITC to certify a system. The J-6 validates that the following have been accomplished:

(1) Interoperability and supportability requirements certification has been granted.
(2) JITC Joint System Interoperability Test Certification has been granted.
(3) DISA/JITC has reviewed and confirmed the measurability and testability of all NR-KPPs.
(4) IA accreditation has been achieved.

The J-6 will issue an interoperability system certification memorandum to the relevant Services, agencies, and developmental and operational testing organizations. 2-5. Overview of the Information Assurance Process (DODD 8500.01E) a. The Director, Operational Test and Evaluation (DOT&E) Procedures for Operational Test and Evaluation of Information Assurance in Acquisition programs, dated 21 January 2009, establishes policy to evaluate information assurance (IA) during OT&E for DOD acquisition programs. This policy applies to all DOT&E oversight Major Automated Information System (MAIS) programs, Major Defense Acquisition Programs (MDAPs), and Platform IT products/weapons systems (mission critical or non-mission critical) that have interconnections to external information systems or networks. However, OTAs are encouraged to apply these procedures to non-oversight systems as well. The focus of these procedures is to evaluate, in a realistic operational environment, an acquisition system's (or system-equipped unit's) IA capabilities, to include its ability to detect and react to penetrations and exploitations and to protect and restore data and information. The procedures describe a six-step process to evaluate IA from early program acquisition activities to major milestone and fielding decisions. This six-step process is as follows: Step 1: Determination of Applicability of DOT&E IA Procedures. Step 2: Initial IA Review. Step 3: OT&E of IA Risk Assessment. Step 4: Operational IA Vulnerability Evaluation. Step 5: Protection, Detection, Reaction, and Restoration Operational Evaluation. Step 6: Continuity of Operations (COOP) Evaluation. b. Steps 1 through 3 are Program Office driven and are typically accomplished by satisfying DOD 8500 requirements. DOD 8500 establishes policy and assigns responsibilities to achieve Information Assurance (IA) for DOD systems through a defense-in-depth approach. DOD 8500 applies to all DOD-owned or DOD-controlled information systems that receive, process, store, display, or transmit DOD information, regardless of Mission Assurance Category (MAC), classification, or sensitivity. c. The process begins by categorizing the system in terms of MAC and Confidentiality Level (CL). The MAC is either I, II, or III, with MAC I being the most critical category. The MAC classification governs information availability and integrity based upon mission criticality and risk. The CL is assigned based on information classification level, sensitivity, and need-to-know. d. DISA maintains a listing of IA tools, defensive measures, or processes collectively referred to as IA controls. The set of IA controls applicable to a given system is completely specified by the system's MAC and CL. The remainder of the DOD 8500 process is one of

33 planning and implementation of the controls and of achieving certification and accreditation of the system s IA posture by means of the Defense Information Assurance Certification and Accreditation Process (DIACAP). (1) IA certification considers the IA posture of the system and the performance of any IA subcomponents. It also considers how the system might impact the larger information environment in terms of the potential introduction of new vulnerabilities. (2) DIACAP establishes a standard DOD infrastructure-centric approach that protects and secures the entities comprising the Defense Information Infrastructure (DII). DIACAP begins with the formation of a system Certification and Accreditation (C&A) team that will plan, initiate and conduct the system C&A process. The C&A team have a representative from the DIACAP Certifying Authority (CA) as a member to assist in the system C&A process. A System Identification Profile (SIP) that uniquely identifies a system within the DIACAP process is developed, usually by the PM. The C&A team will develop a DIACAP implementation plan to guide the implementation of the IA controls specified by DISA based upon the system MAC and CL. (3) Execution of the DIACAP implementation plan is verified through various laboratory tests. A DIACAP Scorecard is completed by the DIACAP CA based upon the results of verification testing. Since there are often gaps associated with any system s IA posture, the C&A team will develop an IA C&A Plan of Action and Milestones (POA&M). POA&Ms are required by the Office of Management and Budget (OMB). They include the severity codes assigned by the DIACAP Certifying Authority to each weakness or security shortcoming found during validation and analysis. e. The C&A Team prepares a comprehensive DIACAP package that contains the SIP, the implementation plan, verification results, the scorecard, POA&M and other supporting information. The DIACAP Accreditation Authority will issue either an Interim Authority to Test (IATT) or an Interim Authority to Operate (IATO), an Authority to Operate (ATO), or a disapproval of accreditation. An IATT or IATO is required entrance criteria to enter operational testing of the system s IA characteristics (IA OT) (or before execution of Steps 4 6 of the DOT&E procedures). IA OT (Steps 4 6) is conducted in support of reaccreditation every three years. f. Upon receiving an IATT, IATO, or ATO, the system enters Steps 4 6 of the DOT&E procedures. Steps 4 6 are driven by the lead OTA. (1) Step 4 is an operational IA vulnerability evaluation whereby the lead OTA, working with the intended user s IA/network security staff, conduct an overt, cooperative, and comprehensive vulnerability assessment in an operational environment. This assessment will include technical and non-technical methodologies to evaluate configuration management and system IA tools, new equipment and IA training, IA incident response, and patch management and network access controls. Additionally, the OTA will leverage as much production representative data as possible, evaluate the system s inherited controls as identified in DIACAP, and identify protect, detect, react, and restore capabilities and limitations. The result of Step 4 is to provide vulnerability evaluation results and recommendations to program managers and ATEC Pamphlet June

34 materiel developers, as appropriate, so that fixes can be made to correct deficiencies in system hardware and/or software prior to conducting Step 5. (2) Step 5 is a Protection, Detection, Reaction, and Restoration (PDRR) operational evaluation whereby the lead OTA, working with an accredited Threat Computer Network Operations Team (TCNOT), conducts an independent and comprehensive evaluation of PDRR capabilities of the system. This comprehensive evaluation will consider the operational vulnerabilities and shortfalls discovered during Step 4, to include their exploitation potential, and their mission impact. Step 5 is conducted in a realistic, system-of-systems, operational environment approved for adequacy by the DOT&E action officer. The Step 5 evaluation will also consider the systems ability to facilitate user and system/network administrator detection of and reaction to penetrations and exploitations. This will support the determination of how well the system s IA measures support mission accomplishment. (3) Step 6 is a COOP evaluation that may be performed concurrently with Step 5 in which COOP and contingency plans are evaluated for all MAC I systems in the context of mission accomplishment in case of an information attack or system failure/malfunction Overview of National Defense Authorization Act (NDAA) FY2007 (Section 231) The National Defense Authorization Act for Fiscal Year 2007, Section 231, directed a review, and amendment if necessary, of defense acquisition T&E. The Office of the Deputy Under Secretary of Defense (Acquisition, Technology and Logistics) produced an initial DOD Report to Congress on Policies and Practices for Test and Evaluation, dated 17 July 2007, in order to satisfy this legal mandate. The report introduces eight key T&E principles that are referred to herein as the OSD Section 231 Report Guidelines. The principles are listed and briefly described below. a. Measure improvements to mission capability and operational support. Conduct periodic assessments of observed improvements. Relate system functionality to warfighting functions and tactical mission tasks and measure improvements. b. Experiment to learn strengths, weaknesses, and the effect on operational capabilities. Use an OT design of experiment: test under controlled conditions to determine capabilities and limitations. c. Integrated DT and OT. Consider using a single test event to address both DT and OT requirements. Use an OT design of experiment during DT. Give DT an operational flavor with appropriate threats or Soldier operators. The goal of integrated testing is to conduct a seamless test program that produces credible qualitative and quantitative data useful to evaluators, and to address developmental, sustainment, and operational issues. Integrated testing allows for the collaborative planning of test events, where a single test point or mission can provide data to satisfy multiple objectives, without compromising the test objectives of participating test organizations. d. Begin early, be operationally realistic, and continue throughout the life cycle. Get involved very early in the development process. Form a liaison with the system PM and 2-10 ATEC Pamphlet June 2010

35 contractors. Coordinate with AMC and pertinent Battle Labs. Communication with all parties is a requirement. e. Evaluate in mission context expected at the time of fielding. Ensure an operational environment with realistic and representative threats that are expected at the time of fielding plus five years. Apply a mission-based test and evaluation (MBT&E) focus to the system evaluation. f. Compare to current mission capabilities. Whenever possible, include in the OT design of experiment comparison to the baseline system. Factors such as terrain, mission, operational mission summary/mission profile (OMS/MP), and tactics, techniques, and procedures (TTPs) must be held constant. g. Use all available data and information. Use historical data as well as data from contractor testing (when verified and validated), DT, modeling and simulation (M&S), etc. for system evaluation. Use a data model to facilitate understanding of differing sources of like data. h. Exploit benefits of M&S. Supplement/complement live testing with virtual and constructive simulations. Plan early verification, validation and accreditation (VV&A) of models and resource the effort Overview of the ATEC integrated T&E process Integrated testing must be embedded in the T&E strategy. An integrated T&E (IT&E) program will employ DT, OT, analysis, deployment observations, and M&S in order to provide essential information over the life cycle of an acquisition program. Information is provided at times that are most critical to decision-makers and also to the PM (Material Developer (MATDEV)) and the requirements community (CBTDEV). IT&E helps to reduce acquisition risk and facilitates useful and ongoing feedback between the development and user communities. IT&E programs will make use of a system data model that relates various DT, OT, M&S event data, as well as field performance data. In some cases, DT and OT data can be gathered from the same test event in order to make the best use of time and resources. a. An IT&E program will (1) Draw requirements information from the Joint Capabilities Integration and Development System (CJCSI 3170). (2) Test to specific joint interoperability and supportability guidelines (as required by CJCSI 6212) in the context of a NR-KPP. (3) Meet information assurance (IA) criteria as directed by DODI 8500 if the system needs to interact with DOD s GIG. (4) Follow a process framework, complete with milestone events and formal decisions, as defined by the DOD 5000 series. ATEC Pamphlet June

36 b. The primary purpose of the ATEC IT&E process, as set forth in ATEC Regulation 73-1, is to support system development and acquisition by serving as a feedback mechanism in an iterative systems engineering process. ATEC has overall responsibility for Army OT, and independent evaluations of acquisition systems. ATEC also has primary responsibility for conducting DT for the Army in support of PMs and Materiel Developers. ATEC ensures that DOD and Headquarters, Department of Army (HQDA), MATDEVs, and CBTDEVs are informed of system operational ESS. c. Due to the diversity of Army systems and underlying technology maturity, an acquisition program can be initiated at any DODD milestone. Consequently, ATEC T&E programs must be flexible in order to accommodate a vast range of programs. d. Some programs deviate from the Defense Acquisition Process and are discussed in paragraphs 2-8 and 2-9 below Rapid Acquisition Initiative (RAI) The RAI process temporarily bypasses some aspects of the DOD 5000 process in favor of rapid fielding. Often the JCIDS process is cut short with the intent to be revisited later. The Army developed the RAI to accommodate the need for accelerated evaluation products that support the need for rapid fielding. a. Initial information about potential RAI systems may be identified from many sources such as LNOs, PMs, or SCAs. Normally the LNOs provide notice of potential RAI systems to the ATEC DCSOPS. Particular attention must be given to the DTC-conducted customer tests with potential to become RAI candidates. The Test Center (TC) Commanders must maintain constant vigilance and reporting. Any ATEC component approached about OCO systems will immediately notify the ATEC DCSOPS. ATEC DCSOPS, with the support of the LNOs, will conduct initial investigation and screening of potential RAI programs under development by joint and service RAI activities. b. The primary customers of ATEC s CLRs are combatant commanders, users of RAI equipment, and the Army acquisition community. One of the key deficiencies in the RAI process is the Joint Use Operational Needs Statement (JUONS), Operational Needs Statement (ONS), and the abbreviated format known as the ten liner. These documents vary greatly in quality and for the most part do not express substantive operational requirements and associated key performance parameters or critical technical parameters. As such, it is incumbent upon the AST chair to use all means to bridge this requirements gap and produce a CLR that provides stakeholders with sufficient information so that readers can draw operational effectiveness and suitability conclusions or seek additional data. It is critical that these customers know as much as possible about the equipment, including the associated technical and operational risks inherent in its employment, as well as any unknowns. c. The proponent is responsible for providing the written security classification guidance and foreign disclosure release instructions in accordance with AR 380-5, Chapter 2. When this information is obtained it will be provided to the appropriate AST. However, all programs, projects, and equipment that are sent into Operation Iraqi Freedom/Operation Enduring Freedom 2-12 ATEC Pamphlet June 2010

37 (OIF/OEF) that are NOT covered by another organization s or Program Manager s Security Classification Guide (SCG), will use the ATEC CG/ED SCG. d. All AST members, AEC Technical Editing, and the AEC Security Manager are provided the final draft CLR for review and comment before it is submitted to the Director, AEC. The AST members review of the CLR is for content, including accuracy and completeness. The AEC Security Manager will ensure documents have correct classification markings. The final version of the CLR will be given to the Director, AEC to coordinate the overall evaluation results/summary on all CLRs with the appropriate Proponent before it is submitted to the ED ATEC for signature. A notation on the ATEC Form 45 will document completion of this task. This informal coordination is not a concurrence or approval, but an acknowledgement that they were provided overall results. e. RAI projects are time hyper- sensitive. It cannot be overemphasized that diligent processing of documentation during technical editing and headquarters approval is essential to the stakeholders. Key programmatic decisions are locked in early by the REF and JIEDDO. Delays of any kind that impact the RAI process must be avoided at all costs. All personnel must remain sensitive to the fact that organizations producing materiel in response to urgent theater needs expect ATEC deliverables on the dates agreed. AST chairpersons must coordinate and obtain approval for any schedule slippage with the program sponsor s (e.g., RAI programs stakeholders) designated project officer. If agreed-upon timelines cannot be met, the AST Chair must contact the project sponsor and obtain an approval to slip the CLR delivery date. ATEC strives to never be the cause of a delay in materiel equipping of RAI solutions where no mission critical deficiencies exist. f. Per ATEC Regulation 73-1, the CLR and the Safety Confirmation are released concurrently. More information on Safety Release and Safety Confirmations is provided in paragraphs j and k below. The DCSOPS will receive copies of all RAI Safety Confirmations. Information in the two documents should not conflict. Issues should be worked within the AST prior to document publication. g. The AST Chair will submit an electronic version of the final CLR, along with the paper copy and the ATEC Form 45 for signature. Currently, classified documents are maintained at both the AKO-S and VDL-S Web sites. h. ATEC will manage the RAI programs using the ATEC Decision Support System (ADSS) database. i. The AST Chair will review each RAI program every 6 months after publication of the original CLR to determine whether an update is required. Any updated CLR will be staffed and approved in the same manner as all CLRs. Updated CLRs will highlight any changes and use a new version number. j. The flow chart below (figure 2-2) provides a visual guide for the RAI and Urgent Materiel Release (UMR) process. Each process step is described below. ATEC Pamphlet June

Figure 2-2. RAI and Urgent Materiel Release (UMR) Process

39 (1) Initial information about potential RAI systems may be identified from many sources such as LNOs, PMs, or SCAs. Normally the LNOs provide notice of potential RAI systems to the ATEC DCSOPS T&E Management Division. Particular attention must be given to the DTCconducted customer tests with potential to become RAI candidates. The Test Center Commanders must maintain constant vigilance and reporting. (2) Any ATEC component approached about RAI systems will immediately notify the ATEC DCSOPS. ATEC DCSOPS, with the support of the LNOs, will conduct initial investigation and screening of potential RAI programs under development by joint and service RAI activities. (3) Decision Point #1 (DP#1) is the decision on the initial level of T&E support required and whether to form an AST. ATEC DCSOPS will recommend the level of T&E support required and whether or not to establish an AST to the ATEC ED or TD. The ATEC DCSOPS will meet with the ATEC ED or ATEC TD daily or as needed to review new initiatives. If neither the ATEC ED nor TD is available, the AEC Director will make the determination. (4) If it is unclear that an initiative will proceed to an equipping decision, then an AST will not be established. When appropriate, the ATEC DCSOPS will continue to monitor developments and revisit at a later date. (5) If approved by the ATEC CG/ED or TD, ATEC DCSOPS will establish an AST, and enter the customer delivery date (date that the Safety Confirmation and CLR are due) into ADSS. AST requests for changes to this date must first be coordinated with the program sponsor. For high visibility systems (ACAT I, ACAT II, and OSD Oversight), notification of the date change will be made to the ATEC TD and AEC Director. The AST will coordinate with the program sponsor for definition and refinement of ATEC T&E support requirements, including definition of the scope of ATEC support to be provided, ATEC products to be delivered, program schedule, resource requirements, system availability for T&E, and funding sources. The AST will keep ADSS updated with program status and anticipated dates for the T&E Concept briefing, Safety Confirmation, and CLR. (6) The AST will determine if the item is a normal acquisition program or a RAI to support an immediate unit equipping. (7) If the program is a normal acquisition program, then the AST will determine if there is a requirement to field initial quantities under a UMR. (8) If the program is a normal acquisition and there is no UMR requirement, it will be released from ATEC DCSOPS oversight and T&E will be managed by the appropriate AST as a normal program under ATEC Regulation Decisions to drop a system from DCSOPS oversight will be reviewed by the ATEC TD. (9) If the program is a RAI/UMR, the AST prepares the T&E Concept. It is intended to describe the minimum essential T&E needed to support an informed equipping decision by the decision maker. The T&E Concept is developed through negotiations between the AST and the program sponsor. It is based on the complexity and maturity of the system, the time and resources available to conduct testing, availability of the system for testing (if needed), and the ATEC Pamphlet June

40 level of risk associated with the intended use of the system (i.e., armor, lethality, and known safety concerns related to a previously-tested like system). As part of the T&E Concept, the AST identifies whether or not a data authentication group (DAG) is needed to support specific events. The purpose of the DAG is to authenticate that the data collected and reduced is suitable for analysis and evaluation. If agreement cannot be reached with the sponsor, the AST chair will rapidly elevate the issues up the chain of command. (10) DP #2 is the T&E Concept approval. The T&E Concept review is a mini-esr/cipr that will provide command guidance to the AST. The ATEC ED is the approval authority for the T&E Concept. Approval authority can be delegated to the ATEC TD or AEC Director. The ED will approve or modify the T&E Concept. When necessary, the decision will be made at this meeting for the AST to develop a more detailed System Evaluation Plan (SEP). The ED also determines at this DP whether or not the RAI system requires preparation of a CLR, although this decision could be made at DP#1. At the conclusion of this DP, the AST Chair will ensure the update of ADSS with dates for any required Safety Releases, test dates, Safety Confirmations, and CLR. (11) If the approved T&E Concept does not require a CLR, DTC will write and distribute the Safety Confirmation without further ATEC review. Distribution of the Safety Confirmation will include ATEC DCSOPS to be posted to the OCO SIPRNET site. (12) Testing will be conducted by the SCAs as required by the approved T&E Concept. If a DAG is required, the DAG duties will be tailored to the unique features of the event and will be described in a DAG charter. Guidelines for a DAG charter tailored to support rapid testing are located in appendix M. (13) The AST will produce the CLR. The CLR provides critical information to decision makers and Warfighters receiving the system. The level of detail will vary depending on the amount of pre-existing information available on the system and the amount of time and resources available. It is written in everyday language to facilitate understanding by non-acquisition readers. Bullets, underlines, bold text, boxing, or other editorial techniques to highlight major points are encouraged. If appropriate, data should be presented in tables or figures. A CLR guide (appendix D) will assist in preparing the reports. (14) DTC will produce the Safety Confirmation (SC). It is a separate document issued by DTC that provides ATEC s risk assessment of safety findings and conclusions pertaining to the safe use of the materiel. The SC will include an explanation that it only addresses risk that the system itself might harm Soldiers; that is, that the SC does not in any way address the capability of the system to perform its intended function. DTC will provide a copy of the final draft SC to AST members and the AST will assure no conflicts exist between the SC and CLR. In a rare case where a SC is not required, AEC will prepare, staff, and distribute the CLR in accordance with paragraph 2-8j(15). (15) DP #3. The CLR will be sent to the ATEC ED or TD with a properly staffed ATEC Form 45 (AEC Technical Editing, the AEC Security Manager, the AEC Director and AEC Technical Director, and the ATEC Assistant TD). The ATEC Form 45 will include the AST s 2-16 ATEC Pamphlet June 2010

41 recommended distribution list. The ATEC ED or TD approves the CLR (or provides guidance for changes). See appendix D for policy on staffing of the CLR. (16) ATEC DCSOPS will notify DTC to make distribution of the SC as soon as the CLR is approved. DTC will distribute the SC to the appropriate AMC Life Cycle Management Command (LCMC) Safety Office and enter it into the MaterielRelease@atec.army.mil mailbox. (17) The AST will submit an electronic version of the final CLR via SIPR to atec.d.rrt@us.army.smil.mil. ATEC DCSOPS will post the CLR and the SC to ATEC s classified Web site. ATEC DCSOPS will make additional distribution of the CLRs to the receiving units and their higher headquarters Force Modernization Offices, ONS originator and, via OTC DCSOPS, to the ATEC Forward Operational Assessment (FOA) Team. (18) The AST Chair will ensure that updated RAI program information is entered into ADSS. (a) The actual dates of publication of the Safety Confirmation and the CLR will be updated in ADSS. (b) The AST Chair will not place programs in a RAI completed status without first verifying with AST members from each SCA that all RAI related efforts are completed. (19) Six months after publication of the CLR, the AST will review the status of the program to determine whether an updated CLR is necessary. ADSS will automatically generate review dates based upon the date of previous CLR publication. (20) If a system becomes stagnant at any point in the process, the ATEC leadership may refer the system back to monitoring (paragraph j(4)). k. The key products of ATEC s RAI/UMR program are the T&E Concept, Safety Release, Safety Confirmation, and the CLR. (1) T&E Concept. The AST develops the T&E Concept. It is intended to describe the minimum essential T&E needed to support an informed equipping decision by the RAI/UMR program sponsor. The T&E Concept is developed between the AST and the program sponsor. It is based on the complexity and maturity of the system, the time available to conduct testing, the availability of the system for testing, and the level of risk associated with the intended use of the system (i.e., armor protection, lethality, and known safety concerns related to a previously-tested like system). The T&E Concept is approved by the ATEC ED or TD. It should be short, one to two pages, and discuss the WHO, WHAT, WHEN, WHERE, and HOW. (2) Safety Release. The Safety Release is a formal document issued by DTC prior to any hands-on testing, training, or maintenance by Soldiers. A Safety Release is issued for a specific event at a specified time and location under specific conditions. It is a document that indicates the conditions under which the system is safe for use and maintenance by Soldiers and describes the specific hazards of the system based on test results, inspections, and system safety analysis. Operational limits and precautions are included. The Safety Release must be available prior to start of testing, training, maintenance, or demonstration. A copy of the Safety Release ATEC Pamphlet June

42 must be provided to the commander of the unit to which participating personnel are assigned and to the test activity conducting the event. The Commander is responsible for ensuring that the Soldiers are qualified to support the tests. The test conductor is responsible for ensuring the Soldiers are properly trained regarding the items contained in the Safety Release. For testing, an Airworthiness Release does not negate the need for a Safety Release. In the rare event that a Safety Release is required for a system being tested in a hostile theater, the ATEC ED or TD must provide authorization prior to distribution by DTC. This applies to all ACAT levels or those programs/items which are not assigned an ACAT level. (3) Safety Confirmation. The Safety Confirmation is a separate document issued by DTC that provides the Materiel Developer with safety findings and conclusions and states whether the specified safety requirements are met. It indicates whether the system is safe for operation or identifies and assesses risks for hazards that are not adequately controlled or mitigated, lists any technical or operational limitations or precautions pertaining to personnel and system safety, and highlights any safety problems that require further investigation and testing. For an ATEC RAI-tracked UMR action, an advance electronic copy of the final Draft Safety Confirmation is forwarded to the AMC Life Cycle Management Command (LCMC) Safety Office to avoid delay in preparation of the proponents Safety and Health Assessment for the Materiel Release Review Board. For RAI programs, in addition to the copy forwarded to AST members, an advance electronic copy of the final draft Safety Confirmation is forwarded to the RAI program sponsor to facilitate any preliminary equipping decisions. (4) CLR. The AST prepares the CLR. It includes all available valid, verifiable data and information gathered during ATEC, other service, and industry testing and assessments. The CLR is intended to provide Warfighters and decision-makers essential information to assist in making an informed decision regarding equipping, employment, and potential future acquisition decisions. The format for the CLR makes it inviting and easy to read for Warfighters, who look for information they need rather than reading cover to cover. The level of detail provided in the CLR will vary depending on the amount of pre-existing information available on the system, and the amount of time and resources available to conduct additional testing on the system. (a) Request from foreign governments or their representatives for copies of CLR not previously determined by the Original Classification Authority (OCA) having foreign disclosure release authority, to be releasable to requesting government, will be directed to HQDA G-2 through AEC/ATEC channels. If the OCA does not have foreign disclosure release authority then by default HQDA G-2 will make the determination in coordination with the OCA. It is HQDA G-2 s responsibility to review the request and provide applicable guidance and/or instructions to ATEC. (b) All classified information and sensitive information relating to operational matters must be transmitted via the Secure Internet Protocol Routing Network in accordance with paragraph 3, ATEC memorandum, CSTE-OPS, 14 Dec 2007, subject: Protective handling of Classified, Sensitive Unclassified, and For Official Use Only (FOUO) information. 
(c) The AST Chair will submit an electronic version of the approved CLR via SIPR to atec.d.rrt@us.army.smil.mil.
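For readers who maintain local tracking tools, the following sketch illustrates one way the six-month CLR review cycle described in paragraphs i and j(19) above could be computed against an ADSS-style program record. The class name, record fields, and the 182-day approximation of six months are assumptions of this sketch only; they do not represent the actual ADSS data model or interface.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import List, Optional

# Illustrative only: the field names below are assumptions of this sketch,
# not the ADSS schema.
@dataclass
class RaiProgramRecord:
    system_name: str
    clr_published: Optional[date] = None           # date of the most recent CLR publication
    safety_confirmation_published: Optional[date] = None
    clr_versions: List[str] = field(default_factory=list)
    completed: bool = False

    def next_clr_review(self) -> Optional[date]:
        """Per paragraphs i and j(19): the AST reviews each RAI program
        six months after publication of the most recent CLR."""
        if self.clr_published is None:
            return None
        return self.clr_published + timedelta(days=182)  # approximately six months

def programs_due_for_review(records: List[RaiProgramRecord], today: date) -> List[RaiProgramRecord]:
    """Return open RAI programs whose six-month CLR review date has arrived."""
    due = []
    for rec in records:
        review = rec.next_clr_review()
        if not rec.completed and review is not None and review <= today:
            due.append(rec)
    return due

if __name__ == "__main__":
    rec = RaiProgramRecord("Example RAI System", clr_published=date(2010, 1, 15),
                           clr_versions=["v1.0"])
    print(rec.next_clr_review())                       # review date ~six months after publication
    print(programs_due_for_review([rec], date(2010, 8, 1)))
```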

43 2-9. ATEC s support to Capabilities Development for Rapid Transition (CDRT) During recent combat operations the Army developed new materiel systems and non-materiel capabilities to meet emerging challenges. The CDRT is a semi-annual Army process that identifies the very best non-standard materiel and non-materiel insertions the Army should incorporate as enduring throughout the force. This program is managed by TRADOC, Army Capabilities Integration Center (ARCIC), in partnership the HQDA G-3/5/7. a. ATEC supports the Army s method of getting capabilities to the Soldier quicker and with less fiscal and schedule risk than through the standard acquisition process. CDRT capitalizes on equipment successfully being used in OCO and recommends acceleration of these technologies for fielding to the Army at large as formal programs of record or as accelerations to existing programs. The following procedures are the processes that ATEC will follow. b. This guidance provides the policies, responsibilities, and general procedures to support the Army s mission to field RAIs which have proven to be successful in the OCO Army-wide. The goal is to achieve a timeline of less than two years from VCSA designation to the Full Rate Production (FRP) decision and Initial Operational Capability (IOC). ATEC will support the Army by prudently accelerating T&E timelines and products to facilitate rapid fielding. The ATEC Commander s intent is the same as for all other programs, that is, ATEC will gather data on these selected systems whenever and wherever possible, making maximum use of existing data to support fielding decisions. c. ATEC will support accelerated fielding efforts to the maximum extent by shortening T&E timelines, using out-of-cycle resourcing documents, and prudently minimizing required data collection. d. For all RAI systems that the VCSA has selected as candidate systems and the AAE has approved for inclusion into the acquisition cycle, ATEC will participate in T&E WIPT meetings with the materiel developer, Capabilities Developer, and TRADOC to determine differences between the RAI system as issued/operated in OCO and the proposed objective system. The key areas for review are the intended environment, system functionality/uses, and system description. The AST will also conduct a data review to determine what data are required to support an adequate evaluation of the proposed system. Figure 2-3 provides an overview of ATEC s input into the CDRT process. ATEC Pamphlet June

44 Figure 2-3. Selection of Overseas Contingency Operations (OCO) Systems for CDRT e. ATEC supports the PM s preparation of the acquisition strategy and facilitates development and approval of a TEMP, unless no TEMP is required because there are no further T&E requirements. f. In order to facilitate the goal of achieving a fielding timeline of less than 2 years, ATEC will support development and approval of required T&E documents on an expedited schedule. g. The AST will collect any additional data to support RAI fielding from events conducted in the continental United States (CONUS). New equipment training (NET), early equipping at home station, Army Special Operations Forces (SOF), and other early users are excellent sources for data collection. Units returning to CONUS who employed the RAI system in theater will be interviewed or surveyed about their operational experience. Only when no other option is feasible, and only for ACAT III systems that are not on the OSD T&E oversight list, the ATEC ED may approve data collection in-theater by the ATEC FOA team through the OTC. Data collected in-theater generally is subjective in nature, and may be limited by combatant battlerhythms. The AEC will prepare an OTA Assessment Report (OAR) (formerly called System Assessment (SA)) based on all available data. h. The formal ATEC Rapid Materiel Acquisition Process begins following the selection by the VCSA of an OCO system for fielding to the Army. The process is shown in figure 2-4, below ATEC Pamphlet June 2010

45 Figure 2-4. Rapid Materiel Acquisition Scenarios i. After a system has been issued to a deployed unit as an RAI, the system proponent, TRADOC, and the ATEC FOA team will monitor the system to attempt to measure success in theater. The FOA team will periodically update the AST Chair, who will perform Continuous Evaluation (CE) and keep other AST members informed. TRADOC will determine which systems should be proposed to the VCSA for issue to the whole Army. TRADOC will alert the ATEC DCSOPS of the systems they will propose to VCSA. ATEC will support TRADOC s decision process as required, developing an OAR for those systems under consideration, and participating in the VCSA decision briefing. DCSOPS will be notified when the VCSA has chosen the systems that will be inserted into the acquisition process, and when the AAE has approved them. j. ATEC DCSOPS will notify each selected system s AST that their program has been nominated by the VCSA for inclusion into the acquisition cycle. The AST will compile information from any previous evaluations/assessments, as well as from the FOA team. The LNO to TRADOC will get current and emerging information. A PM will be named by ASA(ALT). k. A T&E WIPT consisting of representatives from the materiel developer, capabilities developer, AST, and TRADOC will convene to review the CPD. The outcome of this review will be a list of differences in capabilities and operating environments between the system that was already issued to deployed troops as an RAI and the system that is described in the CPD. These differences will be in terms of capabilities, environment, and application of the new system. l. The AST will review the required functionality of the system described in the CPD in view of what data is already available about the performance of the already-issued equipment. They will use any data available, including (but not limited to) contractor testing, DT, OT, FOA team observations, and user surveys. They will develop a briefing for the MDA and PM which describes the evaluation issues and data that needs to be collected and/or addressed. This briefing will serve as an ESR/CIPR for approval within ATEC, will be the basis for a TEMP, and will characterize the T&E risk. In order to support the VCSA s guidance for rapid full-army fielding of these systems, data collection requirements will be kept to the minimum that will support a credible evaluation. A required component of this briefing is a Data Source Matrix (DSM), showing open evaluation issues/measures and desired data collection. Given that the timeline from VCSA selection to full equipping is 120 days to two years, this briefing will be staffed concurrently to SCA Commanders and ATEC HQ in keeping with the short timelines. ATEC Pamphlet June
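As a purely illustrative aid, the sketch below shows one way a Data Source Matrix of the kind described in paragraph l above might be represented, with evaluation measures mapped to candidate data sources (contractor testing, DT, OT, FOA team observations, user surveys) and a simple check for measures that still lack an identified source. The example measures and the structure itself are assumptions of this sketch, not an ATEC format.

```python
# A minimal, illustrative Data Source Matrix (DSM): evaluation measures mapped to
# candidate data sources. The measures are invented for this sketch; the data-source
# categories come from paragraph l above.
DSM = {
    "Probability of detection under desert conditions": ["Contractor testing", "DT"],
    "Mission reliability": ["DT", "OT", "FOA team observations"],
    "Operator workload and training burden": ["User surveys"],
    "Interoperability with unit C2 systems": [],   # open issue: no data source identified yet
}

def open_measures(dsm: dict) -> list:
    """Measures with no identified data source still require planned collection."""
    return [measure for measure, sources in dsm.items() if not sources]

def sources_in_use(dsm: dict) -> set:
    """All data-collection venues the evaluation currently relies on."""
    return {src for sources in dsm.values() for src in sources}

if __name__ == "__main__":
    print("Open evaluation measures:", open_measures(DSM))
    print("Data sources in use:", sorted(sources_in_use(DSM)))
```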

46 m. The MDA, PM, and ATEC will meet to review the acquisition strategy for the system, and to identify any changes arising from the preparation/refinement of the CPD, required system modifications, and impacts to T&E requirements. The process for this meeting is the same as for T&E WIPTs (see AR 73-1). At this meeting, one of four courses of action will be chosen, based on the following cases. Case 1: System needs no further development nor further T&E, and can be issued Army-wide after all MS C/ FRP requirements are met. Case 2: System has minimal modification requirements, but does require further ATEC investigation. Data are still required prior to Army-wide fielding due to select capabilities and/or applications not being collected prior to RAI equipping to OCO. System enters the acquisition process at MS C. Case 3: System as issued to OCO troops needs modifications to meet CPD requirements. System needs test-fix-test phase, which may include ATEC T&E. Case 4: System is not appropriate for rapid Army-wide fielding due to large required system modifications or T&E requirements. n. Based on the acquisition strategy chosen by the MDA, ATEC will develop (or assist the PM in developing) the products required for MS C in accordance with AR (1) For case 1, ATEC will publish an OMAR and a Safety Confirmation as input to a MS C decision. The OMAR may be tailored or abbreviated, as determined by the AST. The report will be based on all available credible data, both data available prior to equipping and data collected by the ATEC FOA team in-theater. Data will be for the system that is currently in the field, as well as the system proposed for Army-wide distribution. (2) For cases 2 and 3, the PM will prepare a TEMP, with assistance from ATEC, which details required future development and post MS C T&E. The PM will coordinate and gain approval of this TEMP prior to the MS C. Concurrently; ATEC will publish an OMAR and a Safety Confirmation as input to the MS C decision. The report will be based on all available credible data, both data available prior to equipping and data collected by the ATEC FOA team in-theater. Data will be for the system that is currently in the field, as well as the system proposed for Army-wide fielding. For case 3, the report will consider data from the test-fix-test events. o. After MS C, systems requiring no further development but still requiring T&E (case 2) will enter an accelerated testing and evaluation cycle. This cycle is similar to the typical defense acquisition management framework, as detailed in published Army and ATEC guidance, but on a much accelerated schedule. The process listed in figure 2-5 below should not wait for MS C, but begin as soon as a draft TEMP has been agreed upon between ATEC and the PM ATEC Pamphlet June 2010
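The following sketch summarizes, for illustration only, how the four courses of action in paragraph m relate to the Milestone C products described in paragraph n. The wording of each entry, and the treatment of Case 4 as falling outside the accelerated path, are interpretations of the text rather than prescribed ATEC outputs.

```python
# Illustrative mapping of the four courses of action (paragraph m) to the
# Milestone C T&E products ATEC develops or supports (paragraph n).
MS_C_PRODUCTS = {
    1: "OMAR and Safety Confirmation as input to the MS C decision; no further development or T&E",
    2: "OMAR and Safety Confirmation, plus a PM-prepared TEMP detailing post-MS C T&E",
    3: "OMAR and Safety Confirmation, plus a TEMP; the report also considers test-fix-test data",
    4: "Not appropriate for rapid Army-wide fielding; handled outside the accelerated path",
}

def products_for_case(case: int) -> str:
    """Return the illustrative product summary for a CDRT course of action (1-4)."""
    if case not in MS_C_PRODUCTS:
        raise ValueError("Course of action must be 1 through 4")
    return MS_C_PRODUCTS[case]

if __name__ == "__main__":
    for case in sorted(MS_C_PRODUCTS):
        print(f"Case {case}: {products_for_case(case)}")
```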

47 Figure 2-5. Rapid Materiel Acquisition Scenario Detail (1) The first AST product is a draft DSM showing the evaluation issues and measures mapped to data collection events identified in the draft/approved TEMP. (2) A combined ESR/CIPR will be conducted at the SCA level before being approved at the ATEC HQ level. (3) No SEP is required. However, an OTA TP (including a Pattern of Analysis (PoA)) is still required to document the plan for what data will be collected and delivered and in what format. Regardless of whether or not a SEP exists, the DSM is not sufficiently detailed to plan event data collection efforts. Traditionally, the PoA has been the document which cross-walked issues and measures to the corresponding data requirements and data elements. (4) OTA TPs will be developed for each data collection event as required by the testing organization. The OTA TPs will reflect the approved TEMP, DSM, and ESR/CIPR strategies. p. Test plans and other test documentation will be developed as required by the DTC and/or OTC. ATEC Pamphlet June

48 q. The AST will gain ATEC approval of the OTA TPs. For those systems on the OSD T&E Oversight List, ATEC will forward the appropriate OTA TP through the TEO for DOT&E approval with a CF: to DOT&E Action Officer. r. ATEC will use the out-of-cycle TSARC process to obtain troops, if needed, for testing. ATEC will develop and approve out-of-cycle Test Resource Plans (TRPs) and Test Cost Proposals for testing as needed. The TRP will be included in the Five Year Test Program (FYTP) approved by HQDA (DCS, G-3) if the timeline permits. s. Prior to any ATEC operational testing, DTC will approve and publish a Safety Release. TRADOC will approve and publish documents required for testing and fielding, including (but not limited to) Training Test Support Packages (TSP); NET TSPs; Doctrinal and Organizational TSPs; Threat TSPs; System Support Packages (SSPs); Tactics, Techniques, and Procedures (TTP) documents; and Logistic Support plans. All documents must be available early to make the VCSA s 2-year timeline. t. During CONUS testing, AEC will be onsite to analyze data as it becomes available, support the data authentication process, and draft the assessment or evaluation report. The test report will be published by the test organization within 30 days of the end of the test. The assessment/evaluation report will be approved within 60 days after the end of the test. u. The ATEC OTA Evaluation Report (OER) and the Safety Confirmation will be provided to support the FRP Decision Review (DR). v. For systems requiring further development (and consequently further review of test data) (Case 3), the above steps will be followed, with the additional step of the AST closely following the test-fix-test phase, and using data as possible to support the evaluation. The ATEC T&E should closely follow the agreements and guidance contained in the TEMP, DSM, and ESR/CIPR. If ATEC testing is required, the steps in paragraph i, above, will be followed. w. In rare cases where ED ATEC has determined that an ACAT III system, that s not on the OSD T&E oversight list, could benefit from in-theater data collection in support of fielding Army-wide, the AST, through the ATEC DCSOPS, will give direction to the OTC FOA team via an abbreviated OTA TP. Data collection tools and procedures will be developed by the AST and provided to the FOA team. In the event that the FOA team needs additional personnel to carry out the data collection, ATEC will mobilize additional ATEC assets. In CONUS, DT will be conducted in parallel with in-theater data collection. It should be remembered that in-theater data collection will be largely subjective, versus quantitative, in nature. NET should provide a good opportunity for data collection, in CONUS or in-theater ATEC Pamphlet June 2010

Figure 2-6. In-Theater System Assessments for Rapid MA ACAT III ONLY 2-10. Capability Sets a. A Capability Set includes interoperable communication systems and applications fielded together to strengthen the Brigade Combat Team (BCT) Battle Command Network (BCN) and improve effectiveness in today's operational environment. New capability sets will continue to build upon the current Network every two years, upgrading units' communication capabilities to provide the capabilities most in demand. Capability Set fielding will be synchronized with the Army Force Generation (ARFORGEN) Model. It will include the necessary Doctrine, Organization, Training, and Leader Development required for training, maintaining, and fighting as fully integrated BCTs. b. The Army is transitioning to a Brigade Modernization approach that biannually develops and fields Capability Packages to adapt to the changing operational environment, filling the highest-priority shortfalls for Soldiers and their leaders with the best capabilities available. Each package will include a Capability Set focused on the BCN and will strengthen ties to critical resources, information, and fellow Soldiers, ultimately empowering individual Soldiers and BCT leaders. c. The significance of the BCN continues to grow. Capability Sets allow the Army to incrementally field fully integrated command and control tools to support command needs while maintaining backward and forward compatibility with other units in the Force. Ultimately, these

sets provide a means for the Army to keep up with the rapid pace of change in the operational environment. d. ATEC will provide to the Army and the gaining commander a report stating the demonstrated capabilities and limitations of each capability set. In the near future, ATEC will provide information on the improvement, or lack of improvement, in the networked command and control system over previous capability sets. This is not part of the fielding decision but will let the Army Staff and gaining Commanders know the capabilities and limitations of the equipment they will be receiving. 2-11. Technology demonstrations and experiments The CBTDEV, with support from ATEC, may utilize the Battle Labs to execute demonstrations/experiments to aid in defining operational requirements that may also support the system evaluation. The following demonstrations/experiments allow the Capabilities Developer to examine and resolve combat development, materiel concept, doctrinal, leadership, organization, and training issues. In support of a concept study, a Technical Feasibility Test (TFT) or an Early User Test (EUT) may be conducted to determine safety and feasibility of the components/subsystems if a concept is chosen. ATEC has an MOA with TRADOC to provide support to demonstrations/experiments. A copy of this MOA is located on the ATEC Intranet web site. a. Joint Capabilities Technology Demonstrations (JCTDs), initiated in FY06, will continue to replace the Advanced Concept Technology Demonstrations (ACTDs). The JCTD Program implemented a new and enhanced business process to better meet the DOD's transformational goal of becoming capability-based vice threat-based. JCTDs focus directly on the Combatant Commanders' most critical Warfighter needs and provide a faster, more agile, and integrated joint response to emerging asymmetrical threats. JCTDs emphasize increased upfront transition planning, provision for a higher level of OSD funding during the first two years, and transition bridge funding from Budget Activity for those projects that demonstrate compelling joint military utility. JCTDs are integrated with the JCIDS and will provide a faster process that focuses on joint and transformation technologies. ATEC participates by assessing these new capabilities. b. Advanced Technology Demonstrations (ATDs). An ATD is a pre-acquisition mechanism for the Warfighter to explore the military utility and potential of technologies to support warfighting concepts. A successful ATD allows accelerated entry into the acquisition life cycle (milestone B or C). ATDs are relatively large-scale in resources and complexity, but typically focus on an individual system or subsystem. The user is involved throughout the process. Experimentation is with Soldiers in a real or synthetic environment. An ATD has a finite schedule of 5 years or less, with exit criteria established by the MATDEV and TRADOC. c. Advanced Warfighting Experimentations (AWEs). AWEs are culminating efforts in the process to evaluate major increases in warfighting capability. They cross DOTMLPF domains and synergistically combine new force structure, doctrine, and materiel to counter a tactically competent opposing force. Moreover, they impact most, if not all, battlefield dynamics and battlefield operating systems. These experiments use progressive and iterative mixes of high

fidelity constructive, virtual, and live simulation to provide the Army leadership with future operational capability insights. AWEs are sponsored by the CG, TRADOC, and approved and resourced by the CSA. Similar to JCTDs, ATEC participates by assessing these new capabilities. d. Concept Experimentation Programs (CEPs). The CEP is a separately funded TRADOC warfighting experimentation program that supports the DOTMLPF operational requirements determination sponsors (TRADOC schools/centers, the Army Medical Department Center and School, and SMDC Capabilities Developers) and their ability to investigate the military utility of, and capitalize on, technologies, materiel, and warfighting ideas. The CEP provides funding and other resources to conduct warfighting experimentation supporting the Army Experimentation Campaign Plan to provide insights that support refinement of warfighting concepts, determination of DOTMLPF needs solutions to approved Future Operational Capabilities (FOCs), development of materiel requirements, and evaluation of organizations for fielding. The CEP is an annual program that provides commanders a quick experimentation response process. 2-12. Joint T&E (JT&E) Program a. The JT&E Program provides non-materiel solutions to critical warfighting issues. It charters operational test projects that improve joint warfighting capabilities with existing equipment. The program develops solutions to joint operational problems and measures the associated improvements through enhanced tactics, techniques, and procedures (TTP). It also measures improvements brought about by enhanced testing methodologies. The JT&E Program's objective is to provide rapid solutions to issues identified by the joint military community. b. The JT&E Program exists to conduct OT&E that cuts across Service lines. The interactions among Services become extremely important during combat, where the success of joint military operations is centered on Service-centric boundaries and responsibilities. A lack of joint OT&E makes detecting certain deficiencies in those interactions very difficult. The JT&E Program provides quantitative information for analysis of existing joint military capabilities that results in recommendations for increasing joint military effectiveness through process improvements. The program is complementary to, but not part of, the weapons acquisition process. JT&E products include the development or refinement of joint or multi-Service TTP; inputs to improve joint and Service training programs; new operational and technical testing methods; new test and training range procedures; and joint and multi-Service operations analysis tools. c. The ATEC JT&E Office generates operational solutions to urgent, specific, joint Warfighter problems through a dynamic, rigorous test process, principally accomplished through Quick Reaction Tests (QRTs). (1) QRT definition. QRTs are short-term (normally 12 months) and normally funded at $1M. The primary products are handbooks containing CONOPS and TTPs to address urgent Warfighter needs.

52 (2) Management of Army resources for QRTs and joint tests. ATEC is the OTA for providing Army resources to support Joint test requirements. The ATEC JT&E Office manages this program. The ATEC JT&E Office also assists Joint tests through the TSARC process by requesting Army money, personnel, equipment to support a specific test event. (3) ATEC JT&E office currently has 13 personnel, and a budget of $4.8M to support QRTs and $1.1M to support office and Joint tests. This office also manages the TDA of an additional 16 personnel serving as Liaisons or attached to joint tests. There are also liaison personnel with JFCOM and the Air Force Joint Office Chemical and Biological Defense Program (CBDP) a. The DOD CBDP is a key component of a comprehensive national strategy to counter the threat of chemical and biological weapons as outlined in the 2002 National Strategy to Combat Weapons of Mass Destruction (CWMD). This national strategy is based on three principal pillars: (1) Counter proliferation to CWMD use. (2) Strengthen nonproliferation to CWMD proliferation. (3) Consequence management to respond to WMD use. b. The CBDP focuses on the first and third pillars of this strategy. The CBDP facilitates capabilities development for the CWMD mission areas of passive defense, consequence management, interdiction, and elimination operations. The CBDP supports strategic initiatives to improve chemical, biological, radiological, and nuclear (CBRN) defense preparedness, to reduce risks to the Warfighter, and to field the appropriate capabilities for sustained military operations with minimal degradation in combat effectiveness caused by CBRN hazards. c. The Army s Director, TEO is the T&E Executive for all Service CBDPs. The CBDP T&E Executive will determine the Lead OTA for each CBDP system based upon the OTAs providing a coordinated recommendation and input from the JPEO-CBD and/or other test sponsors. The CBDP T&E Executive also provides the policy guidance for all CBDP programs Clothing and individual equipment (CIE) and organizational CIE (OCIE) programs The term organizational clothing and individual equipment refers to and includes mission essential Army owned property for which the organization commander retains responsibility, and which may be rotated among using individuals as required, unless identified as non-recoverable. All OCIE items are considered accountable property, regardless of Accounting Requirements Code. a. ATEC will generally not participate in CIE programs unless the PM requests ATEC support. However, in specific cases in which a particular program is deemed to involve critical Soldier safety, lethality, or protection issues, such as ballistic or chemical protective clothing items, ATEC will participate to ensure Soldier interests are protected ATEC Pamphlet June 2010

53 b. ATEC will formally evaluate OCIE items that involve Soldier safety, lethality, or protection issues, such as ballistic or chemical protective clothing items. The ATEC LNO for PEO Soldier will review all OCIE programs and recommend to ATEC DCSOPS the level of ATEC involvement. c. ATEC DCSOPS will task the SCAs to establish an AST, as required. The ATEC DCSOPS will maintain a database and track the documents associated with CIE and OCIE to ensure there are no duplication or overlap of efforts between ATEC and U.S. Army Natick Soldier Research, Development, and Engineering Center (NSRDEC). d. An AEC evaluator will serve as the AST Chair for all CIE/OCIE programs that require an evaluation. A DTC test manager will serve as the AST Chair on all CIE/OCIE programs in which an ATEC waiver for an evaluation is granted. DTC will issue a Safety Confirmation based on review of referenced documentation and/or safety testing. An AST Chair will coordinate efforts within ATEC to provide an assessment or evaluation report and/or Safety Confirmation on CIE/OCIE, as appropriate. The AST members will execute the T&E process as outlined in this pamphlet following tailored/abbreviated formats, except in rare instances in which more extensive documentation is required due to the complexity of the item Soldier Enhancement Program The Soldier Enhancement Program was established by Congress in 1989 and is managed by PEO Soldier and the TCM-Soldier. Its mission is to identify and evaluate commercial off-the-shelf (COTS) individual weapons, munitions, optics, combat clothing, individual equipment, water purification technologies, shelters, communication and navigational aids. a. The Soldier Enhancement Program encompasses all items worn or carried by Soldiers in a tactical environment, and is designed to improve/enhance the Soldier s lethality, command and control, sustainability, mobility, and survivability. Basically, the Soldier Enhancement Program follows the same materiel acquisition process as a typical materiel acquisition program. The major thrust, however, is to identify and evaluate commercially available individual weapons, munitions, combat clothing, individual equipment, food, water, shelters, communication, and navigation aids in order to get successful items into the hands of the Soldier in less than three years. b. Proposals can be generated by anyone and go before the Soldier Enhancement Program Executive Council at least twice each year. The Executive Council is co-chaired by the PEO Soldier and TCM-Soldier. The Council validates and prioritizes all Soldier Enhancement Program proposals and forwards to DCS, G-3/5/7 for Army prioritization and funding. After the proposals are validated, the originating school may begin processing the Soldier Enhancement Program capability document (CDD or CPD). The JCIDS format is used, but is streamlined to the maximum extent possible. PEO Soldier executes the funding for all Soldier Enhancement programs. Approval guidelines follow the same procedures as other acquisition programs. ATEC Pamphlet June

2-16. Combined Test Organization/Combined Test Teams (CTO/CTT)
ATEC will not participate in CTOs or CTTs except as directed by the ATEC Commanding General (CG)/Executive Director (ED) or the ATEC Technical Director.

2-17. ATEC System Team (AST)
a. The AST is responsible for planning, executing, and reporting the T&E for each system evaluated by ATEC. It is a multi-disciplinary team typically composed of representatives from each of the SCAs. The AST is an action officer group which plans and coordinates activities, identifies and resolves issues, and elevates issues up the chain of command as necessary. The AST has a Chair who is ultimately responsible for all AST products and who has tasking authority over the AST members.
b. An AST is established for each evaluated system. An AST is established upon notification of a new requirement from any source or receipt of justification for new capabilities (i.e., ICD, CDD, upgrade or improvement of an existing system). When a new or potential requirement becomes known, the HQ ATEC Deputy Chief of Staff for Operations (DCSOPS) will coordinate the formation of an AST with the SCAs and document required timelines in the ADSS.
c. The PM, PEO, or program sponsor contacts ATEC for test support. This request can come via an ATEC LNO, AEC, DTC, internet request, or e-mail. The receiver of this request forwards the request to the ATEC DCSOPS Admin mailbox (dcsops_admin@us.army.mil).
d. The DCSOPS sends the requester an AST Standup/Test Support Request. This consists of an ADSS Support Form (system/POC information) and a Quad Chart (detailed system description/photo and acquisition information).
e. The requestor completes the form and quad and forwards them to the ATEC LNO who services their agency. The LNO reviews the package for completeness and reviews ADSS to determine whether the system (or any of its increments) is already supported by ATEC.
f. If the LNO determines that there is no current AST and that formation of one is warranted by program need, he/she develops an ATEC Form 45 for AST Standup, and forwards it along with the requestor's form and Quad to the ATEC DCSOPS Admin mailbox. Grouping of several smaller, similar systems into a single AST is acceptable and encouraged. This allows the experience gained on T&E requirements for an individual system to be more easily transitioned to other similar systems with minimum requirements for command resources. However, every system requires separate planning and reporting documents to support the T&E program.
g. The ATEC DCSOPS reviews the Form 45 for completeness, assigns a tasker number, and creates a system shell or updates existing information in ADSS based on the ADSS Support Form and system Quad. They assign the ATEC LNO to the AST in ADSS. They ensure that the System Summary field contains information regarding AST standup based on the Origin of Action and Summary fields from the Form 45. Generally, the Form 45 requests an AST Chair/Evaluator, ILS support, RAM support, and Survivability support from AEC, a DTC test

55 manager, and an OTC test officer. ATEC DCSOPS then issues the Form 45 to all SCA DCSOPS cells. h. Each SCA DCSOPS staffs the Form 45 internally. When an AST member is identified, the SCA (or that AST member) posts the member name and contact information in the ADSS record for the program and sends a notification of action completion to the ATEC DCSOPS Admin mailbox. Each SCA is able to name additional support personnel to the AST as required and post them into ADSS, e.g., AEC or OTC operational research system analysts (ORSAs), MANPRINT support, or DTC Test Center test officers. In selecting the AST members, consideration should be given to: Battlefield Functional Area (BFA), predominant program issues, expertise of available personnel, where the system is in the acquisition lifecycle, and workload. Consideration should also be given to assignment of a military individual to provide military experience and judgment. The ATEC DCSOPS verifies that all required AST members have been identified in ADSS and closes the tasker. i. AST First Meeting. Prior to the first AST meeting, all AST members should have reviewed all the system information available in ADSS, and researched to determine if their individual directorates have had any prior involvement with the system. j. AST Chair Assignment. Prior to the first AST meeting, AEC will designate a person as the AST chair for the first meeting. All AST members, at this first meeting, after coordinating with their chain of command, will recommend which SCA should provide the Chair. If agreement cannot be reached, the ATEC Technical Director will select the AST Chair regardless of system s ACAT. Replacement selections will be made in sufficient time to allow a seamless transition. The ATEC CG/ED or designee will approve all AST chairs (both new and replacement) for OSD T&E oversight programs. k. AST Size and Composition. (1) The AST typically consists of the evaluator, developmental and operational testers, and analysts (such as, RAM, Survivability, MANPRINT, and ILS). Each organization provides the necessary support for their designated members. (2) The size of the AST will change as the system and its T&E program matures. During busy planning and test execution periods, the AST will grow to accommodate the workload and the specialized needs of the programs. During program lulls when there is no active testing, the AST will decline to two or three core members while other members work on other programs. The AST determines when additional members are needed or can be released. Additional members from the SCAs or ATEC HQ staff elements may include ORSAs, programmer/computer support personnel, a resource manager (RM), the system evaluator of a related program, general support analyst (statistical methods, M&S, etc.), instrumentation engineer, software analyst, or threat manager. (3) The AST is an internal ATEC coordination group, but when required, the AST can invite representation from outside ATEC. Most specialty needs from other organizations should be accommodated with invitations to specific AST meetings and not require full-time AST membership. ATEC Pamphlet June

56 (4) The AST composition for MDAPs, Major Automated Information Systems (MAIS), Multi-Service Acquisition Programs, and OSD T&E oversight programs will include representatives from all three SCAs. The composition for non-oversight programs will vary depending on need. l. AST Responsibilities. (1) The chair is empowered by the ATEC CG/ED. The chair is the leader, facilitator, coordinator, and administrator who ensures T&E documentation is correct, information is entered into ADSS, and that system tests and evaluations support Army and DOD senior decision makers. (2) The most important elements of a successful team effort are cooperation and communication. Members are required to contribute fully in their area of expertise and in the general performance of T&E strategy development; issue resolution; and test and evaluation planning, conduct, and reporting. While AST participation is a priority, it is recognized that the workload of some members may present conflicts. These conflicts should be resolved with the member s chain of command. Responsibilities for the AST Chair, AST members, and AST member s supervisors are provided in table 2-1 below ATEC Pamphlet June 2010

57 Table 2-1. AST Responsibilities m. AST Standing Operating Procedures. (1) Each AST member is responsible for keeping his/her chain-of-command informed of AST decisions and for keeping the AST informed of chain-of-command concerns. This is especially true within AST members directorates. For briefings outside the directorate, the whole AST should be involved at least in the preparation. (2) Issues uncovered by the AST should always be resolved at the lowest appropriate organizational level. Issues not resolved within 2 weeks will be raised up the chain of command for quick resolution. Differences among AST members will be resolved prior to attending a T&E WIPT. (3) AST members will coordinate directly with other ATEC elements and with outside commands or agencies to accomplish ATEC s T&E objectives. Especially important is the interface with the materiel, combat, and training developers. The AST Chair is the primary POC regarding the program and generally represents ATEC at external meetings/forums. ATEC Pamphlet June

58 (4) The AST is empowered to take actions and state positions that are consistent with ATEC and Army policies and procedures. After the AST has developed consensus (or received ATEC ED/TD guidance) on a system or issue, the AST chair normally represents ATEC at external forums. However, any AST member with the technical knowledge or expertise may represent ATEC and state the ATEC position at meetings or working groups. To be empowered, an AST member must have, or fully anticipate, the support of the chain of command. The AST is not authorized to operate outside these parameters. If the direction of the meeting, issue, or working group becomes inconsistent with these parameters, members should seek guidance. (5) The AST will speak with one voice when interacting with organizations outside ATEC. (6) The AST Chair normally requests meetings; however, other AST members may also do so. The AST establishes internal procedures and goals to accomplish required tasks. (7) The AST is encouraged to develop a charter; however, it is not mandatory. A sample charter is provided in figure 2-7. (8) AST Chairs, when coordinating products or positions with other AST members, should set a reasonable suspense, understanding that most AST members support multiple programs and have other responsibilities within their chain of command. Generally speaking, with good communication all AST members should know what major actions are up coming and have already briefed their leadership before the tasker from the AST chair arrived. (9) The AST Chair is responsible for the synchronization and integration of ATEC efforts for the assigned system. AST members are responsible for ensuring the synchronization and integration occurs within their areas of expertise ATEC Pamphlet June 2010

59 1. BACKGROUND: a. Authority: The (name of team) is hereby chartered under the authority of ATEC Regulation 73-1, dated March This AST is formed to ensure the coordinated evaluation of the (name of program) and associated test plans, design, conduct, and reporting. This AST will maintain and reinforce ATEC s role as the Army s independent source for the IOT&E while maintaining a cooperative relationship with the materiel developer/pm in the execution of all other tests, experiments, evaluations, and assessments. b. Purpose and Scope: Provide system description and rationale for forming the AST. 2. GOALS AND OBJECTIVES: Identify required documentation, resources, methodology, milestones, meeting schedules, etc. The scope will vary depending on the program and the point in the life cycle process. 3. RESOURCES AND MANAGEMENT. a. Personnel. List each team member s name, organization, phone number, fax number, and address. b. Chair. Provide name of Chair with above information. Include responsibilities for coordinating meetings, planning the agenda, tasking other group members, and reporting status of system under test. c. Procedures. (1) The AST will meet as necessary, based on the guidance received from the chains of command, and as mutually agreed to by all team members. To the maximum extent possible, the AST will use VTC, electronic document transfer and review, to minimize travel requirements. (2) Provide an agenda to all members prior to the meeting. (3) Start on time. (4) Be creative. (Think outside the box.) (5) Strive for consensus but respect members differences of opinions and elevate issues within members chain of command if they cannot be resolved in the AST. (6) Assign team member to capture notes, review notes at the end of the meeting, have all members agree on content, and distribute to each member. (7) Empowerment. Describe appropriate AST members authority and responsibilities. 4. SIGNATURES: All AST members (and their immediate supervisor for buy-in to chain of command commitment to the AST). Figure 2-7. Sample AST Charter n. ASTs for multi-service programs. Multi-Service Operational Test and Evaluation (MOT&Es) are done by agreement among the participating services. For MOT&Es, the lead developing/acquisition Service s OTA will be the lead OTA. If the Service s OTA declines, the lead OTA will be chosen by mutual OTA agreement between participating Services. Testing and documentation procedures follow those of the lead Service; issues between participants will be resolved at the lowest level possible. There is an annually-updated MOA among the OTAs which provides the framework for T&E conducted by two or more OTAs. ATEC forms an AST for all MOT&Es in which it participates. When ATEC is the lead, key OTA representatives from other services are given ADSS access and listed as AST members. In cases where ATEC is not the lead OTA, the AST Chair is the ATEC POC to the other services and provides AST input to the lead OTA, as required. Like attendance at T&E WIPT meetings, the AST Chair works with his team to determine when other AST members need to interface with other OTA personnel. For all MOT&Es, care must be taken to coordinate with other OTAs to avoid duplication of work. ATEC Pamphlet June

2-18. Working groups and decision bodies
a. T&E Working-level Integrated Product Team (T&E WIPT). The T&E WIPT is a forum in which the users, developers, testers, and evaluators work together to refine requirements and work the tradeoffs over the scope and resources allocated to evaluation program events. It is chaired by the PM. The T&E WIPT develops and maintains the T&E strategy (TES) and the TEMP. The PM also collaborates in the development of Critical Technical Parameters (CTPs) and COICs. The forum members also contribute to the development of an acquisition strategy, the CDD and CPD, and the System Threat Assessment Report (STAR). The T&E WIPT will typically have the opportunity to review and contribute to, through coordination with the AST, evaluation-related documents. The T&E WIPT is a key collaborator with the AST on the Concept In-Process Review (CIPR). Figure 2-8 visually shows the difference between the AST and the T&E WIPT: the ATEC AST members (AEC, DTC, and OTC) form a subset of the broader, PM-chaired T&E WIPT membership, which also includes the combat developer, PEO/PM, user unit, DOT&E, TEO, threat (G-2), SLAD, the logistician, and the trainer.
Figure 2-8. Difference between T&E WIPT and the AST
b. Army Systems Acquisition Review Council (ASARC). The ASARC is the Army-level review body for acquisition of all ACAT I/IA programs as well as other selected programs where the AAE is the MDA. The ASARC provides a structured forum where issues requiring top-level consideration are presented to senior Army leadership. ATEC's role is to provide information to enable decision-makers to make the right decisions at the right time. Additional information

61 may be found in DA Pamphlet The ASARC membership consists of senior acquisition managers and functional principals shown in figure 2-9. The Chairman for ASARCs is the ASA (AL&T). Figure 2-9. ASARC Membership c. Overarching Integrated Process/Product Team (OIPT). The Army OIPT conducts full program reviews prior to ASARC s with a primary focus on resolving programmatic issues that cannot be resolved at lower levels. Figure 2-10 provides the Army OIPT membership. ATEC s role is to provide information. The Deputy for Acquisition and Systems Management chairs the Army OIPTs for ACAT ID, IC, and II systems. ATEC Pamphlet June

62 Figure Army OIPT Membership d. Defense Acquisition Board (DAB). The Under Secretary of Defense (Acquisition, Technology, and Logistics) (USD (AT&L)) conducts DAB reviews for ACAT ID and IAM programs at major decision points. An Acquisition Decision Memorandum (ADM) documents the decision(s) and program direction resulting from the review. DAB members are Vice Chairman of the Joint Chiefs of Staff, Secretaries of the Military Departments; USD (Policy); USD (Comptroller); USD (Personnel and Readiness); USD (Intelligence; Assistant Secretary of Defense for Networks and Information Integration/DOD Chief Information Officer; DOT&E; Director, Program Analysis and Evaluation; and Director, Acquisition Resources and Analysis (as the DAB Executive Secretary). The USD(AT&L) may ask other department officials to participate in reviews, as required. e. Information Technology Acquisition Board (ITAB). The ITAB is the decision forum for milestone review of ACAT IAM programs, excluding defense business systems. It contributes strategic-level insight for net-centric, GIG, and information technology issues when they cannot be resolved at the OIPT level. An ADM documents the decision(s) resulting from the review. The USD(AT&L) chairs the ITAB. See the DAG, Chapter 10 for additional information. f. Joint Requirements Oversight Council (JROC). The JROC reviews and approves capabilities documents designated as JROC interest and supports the acquisition review process ATEC Pamphlet June 2010

In accordance with CJCSI 3170, the JROC validates capability needs, and also validates the KPP when it approves the associated capabilities document. Additional information may be found in the DAG, Chapter 10.
g. Industry Participation. Industry representatives may be invited to a WIPT meeting to provide information, advice, and recommendations to the IPT; however, the following policy should govern their participation:
(1) Industry representatives will not be formal members of the IPT.
(2) Industry participation will be consistent with the Federal Advisory Committee Act.
(3) Industry representatives may not be present during IPT deliberations on acquisition strategy or competition-sensitive matters, nor during any other discussions that would give them a marketing or competitive advantage.

2-19. Test Schedule and Review Committee (TSARC)
The mission of the TSARC is to provide high-level centralized management of resources (i.e., troops and personnel, equipment, funding, flying hours, ammunition, and instrumentation) for test and evaluation. The TSARC process is used for all OT, Joint Test (JT), Force Development Test and Experimentation (FDT/E), and DT requiring user troops. Appendix E of this document and AR 73-1 provide additional information.

2-20. Resource management
a. ATEC receives funding either as direct appropriations or as reimbursements from customers for the T&E of acquisition systems. The major appropriations and the typical funding sources for T&E activities are shown below. T&E direct appropriations are reviewed by various organizations and committees, including Congress, TRMC, DOT&E, ASA(ALT), TEO, ATEC, and the TSARC. Other organizations and their functions in the funding process are addressed in AR 73-1, DA Pamphlet 73-1, and DOD R.
b. Major Appropriations. Appropriations are statutory authority to incur obligations and to make payments out of the U.S. Treasury for specified purposes. The appropriations most commonly associated with T&E activities are briefly described below. Details can be found in DFAS-IN FY.
(1) Research, Development, Test, and Evaluation (RDT&E). This 2-year appropriation provides for the development, engineering, design, purchase, fabrication, or modification of end items, weapons, equipment, test requirements documentation (e.g., safety and environmental quality reports), or material, including computer application software. It funds pre-FRP decision testing (DT, LF, and OT) and post-FRP testing for systems in block development. RDT&E funds are normally used to support the conduct of the Market Investigation and the purchase or lease of candidate systems/components required for T&E purposes. RDT&E funds also support T&E activities such as T&E of modifications of test articles; purchase of specifications, manufacturer's publications, repair parts, special tools and equipment; facility and instrumentation sustainment, revitalization, and upgrades; transportation of the test article to and

from the test site; and training, salaries, and temporary duty costs of T&E personnel. The appropriation also funds day-to-day operations of RDT&E activities/installations such as headquarters operations, civilian salaries and awards, travel, fuel, minor construction projects (that do not exceed $750K), expenses of operational military forces, training and education, recruiting, depot maintenance, industrial fund and stock fund support, and mission-unique base operations support.
(2) Procurement. This 3-year appropriation provides for the procurement, manufacture, modification, and configuration of aircraft, missiles, weapons, tracked combat vehicles, ammunition, and other items, to include initial provisioning of spare and repair parts. There are five separate Army procurement appropriations (APA): Other Procurement, Army (OPA); Ammunition; Aircraft; Missiles; and Weapons and Tracked Combat Vehicles (WTCV). For acquisition programs approved for production, this appropriation covers all costs integral and necessary to deliver a useful end item intended for operational use or inventory upon delivery.
(3) Operations and Maintenance. This 1-year appropriation provides for the operation and maintenance of Army installations and major units to include equipment, facilities, supplies, and civilian pay. Post-FRP testing (e.g., surveillance and reconditioning tests) can also be funded from Operations and Maintenance, Army (OMA).
(4) Each appropriation has a period during which it is available for obligation. This is the time frame during which the funds can be executed for requirements. Table 2-2 shows the obligation period and funding policy for each of the appropriations most frequently associated with T&E activity.

Table 2-2. Obligation Period and Policy
    Appropriation    Obligation Period    Funding Policy
    O&M              1 Year               Annual
    RDT&E            2 Years              Incremental
    Procurement      3 Years              Full

c. Funding sources.
(1) RDT&E Direct Appropriations. HQDA or OSD oversees the RDT&E direct appropriated funds used throughout ATEC for most T&E activities. The following activities are funded by this source.
(a) Developmental Testing RDT&E Direct Appropriations are planned to pay for the costs of DOD developmental testing, to include meteorological support, not directly attributable to a customer effort. RDT&E funds are also used for test facility revitalization/upgrade, instrumentation development, M&S development, and overall management. Funds are intended to sustain and improve an objective test capability with the needed facilities and instrumentation across DTC test centers for developmental testing of DOD materiel. It also funds DTC HQ AST support for early planning as well as funding for the DTC

65 safety verification mission. Types of expenses include civilian pay, temporary duty, supplies, and equipment and other non-labor costs. (b) Operational Testing RDT&E Direct Appropriations provide for the OTC s base costs associated with OT. This pays for the early test concept planning, support to the AST, and development of the test/simulation execution strategy. Types of expenses include civilian pay, support contracts, temporary duty, supplies, and equipment for subordinate test directorates of OTC. (c) Evaluation RDT&E Direct Appropriations are the primary source of funding for accomplishing the AEC mission of continuous evaluation to ensure integrated technical and operational evaluations throughout the life-cycle of assigned programs. This pays for salaries of civilian employees assigned to AEC and the associated costs including temporary duty, support contracts, supplies, and equipment. (d) MOT&E RDT&E Direct Appropriations are designated to fund the Army s direct costs of planning and conducting Multi-Service Operational Test and Evaluation (MOT&E) of programs when there is no Army program manager. (e) Early involvement RDT&E Direct Appropriations fund the ATEC liaison personnel co-located with PEOs. The intent is to achieve cost savings and design efficiencies early in a system s development, thereby avoiding more expensive product improvement programs later in a system s life cycle. (2) Reimbursable Funding. Reimbursable funding, typically RDT&E for all T&E up to FRP, is provided by PEOs/PMs and many other sources. Coordination is the critical element that ensures that PEOs/PMs accurately program for known T&E requirements by fiscal year. (a) Developmental Testing PMs and other DOD users fund DT and are charged for all direct costs readily identifiable to a specific testing event. Defense contractors entitled to DOD rates and performing tests under contract to other DOD organizations will also be charged only direct identifiable testing costs. This funding policy applies to only Major Range and Test Facility Base (MRTFB) facilities. Non-DOD customers, including other Government agencies and private industry, will be charged direct and indirect costs for use of MRTFB facilities in accordance with DODD and DOD R. DOD and non-dod customers at non- MRTFB activities shall be charged for both direct and indirect costs. The materiel developer will program and budget funds required to conduct DT based on the requirements of the TEMP. (b) Operational Testing As mentioned above, funds for OT&E are provided by the PEO/PM. This includes test-specific costs (beyond base costs) for the planning, execution and reporting of approved TRPs. (c) Evaluation The programming and funding of T&E for all acquisition systems is the responsibility of the Materiel Developer. The RDT&E funds are used to support system evaluation. Evaluation of a specific event and continuous evaluation throughout the lifecycle is funded by RDT&E. d. T&E systems and costs. ATEC Pamphlet June

66 (1) The MATDEV/PM must plan, program, budget and allocate funds for all acquisition category system testing (DT, LFT, OT), and evaluation of OT events, as identified in the TEMP. The AST must assist the PM in developing their T&E budgetary estimates for use to support their POM process by documenting all reimbursable cost estimates in ADSS. ADSS is used throughout ATEC to document resource requirements reimbursed by customers for T&E efforts. RM Online is the command s official resource management system for requirement planning (budgeting) for direct appropriated and reimbursable funds for personnel, dollars and manpower. Standard Operation and Maintenance Army Research and Development System (SOMARDS) is the Army standard system used across ATEC to capture direct and reimbursable program execution. (2) Each SCA will prepare and approve their cost estimates. All cost estimates must be entered into ADSS. ATEC DCSOPS sends the cost estimates to the PEOs (with copies to ATEC DCSRM, HQDA G8, ASA (AL&T), and each ATEC SCA. DT cost estimates are provided by DTC to the PEO and all customers. e. ATEC DCSRM will follow-up with the PM Business Offices via a reimbursable funding coordination letter to ensure that funding requirements have been disseminated to their level. For DT, this function is done by DTC Test Centers Resource Management ATEC Pamphlet June 2010

Chapter 3
System Evaluation Planning and Analysis

3-1. Overview of evaluation planning
a. The primary mission of ATEC is to make an independent evaluation of acquisition systems' operational effectiveness, suitability, and survivability (ESS) under realistic operational conditions, including joint combat operations with representative users. Operational ESS must be evaluated and reported on the basis of whether a system can be used by Soldiers to accomplish a combat mission. The appropriate environment for that evaluation includes the system under test and all interrelated systems (that is, its planned or expected environment in terms of weapons, sensors, command and control, and platforms, as appropriate) needed to accomplish an end-to-end mission in combat. The data used for evaluation are appropriately called measures of effectiveness (MOE), because they measure the military effect (mission accomplishment) that comes from the use of the system in its expected environment. This statement of policy precludes measuring operational ESS solely on the basis of system-particular performance parameters. A second purpose is to determine whether thresholds in the approved requirements document and the COIs have been satisfied. The measures used for this purpose are appropriately referred to in the context of performance, as in key performance parameters or measures of performance (MOP). These performance attributes are often what the program manager is required to deliver, but they are NOT the military effect or measure of operational effectiveness required for achieving the primary purpose of OT&E. All T&E planning and execution activity should support this effort. Additionally, ATEC is committed to supporting system development and acquisition by providing feedback to the development community in a timely manner so that improvements, if required, can be made as early as possible. ATEC reports and evaluations should never be a surprise.
b. Another objective of evaluation is to determine the extent to which the system meets the requirements for a given milestone based on the technical performance characteristics and operational ESS. The evaluation identifies capabilities and limitations of the tested system(s) and highlights improvements required. Capabilities, limitations, and improvements are addressed with respect to current system and mission profiles, as well as by identifying the impact on the ability of the Warfighter to perform the mission. Other objectives of testing and evaluation are:
(1) Support the formulation of realistic system requirements and specifications and ensure the system performance risks are acceptably minimized.
(2) Discover key issues at the earliest opportunity so they may be addressed and resolved with the least impact to program schedule and cost.
(3) Provide for early and frequent assessment and reporting of a system's status during development and fielding.
(4) Reduce test time and cost through comparative analyses, data sharing, and use of all data sources for evaluation.
(5) Verify the corrective action process during system development.

(6) Ensure the timely assessment of personnel and system safety to support testing during the acquisition cycle and acquisition milestone events leading to fielding.
c. The primary keys to a successful evaluation are early coordination and good communications among AST members. Arriving at and reporting conclusions and recommendations are simple enough as long as there are data to support them; it is capturing relevant data in a usable format that is the hard part. To do this, an evaluation plan has to be developed and executed. There are two levels of planning that occur for every program: system-level planning and event-level planning. The system T&E planning is done first, and then individual event T&E plans are developed to reach particular goals within the system plan.
d. Due to the diversity of Army systems and underlying technology maturity, an acquisition program can be initiated at any DOD 5000 milestone. Table 3-1 contains the main inputs, outputs, and key activities associated with the general ATEC T&E process. Actual programs are likely to deviate from the table based upon technology maturity, shifts in mission and priorities, development schedule slippage, and resource issues. Programs that are initiated at one point in the DOD 5000 process are understood to require completion of some of the basic tasks associated with the preceding phase(s).
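The distinction drawn in paragraph 3-1a between measures of performance (MOP) and measures of effectiveness (MOE) can be illustrated with a small, purely notional sketch in Python. This is not an ATEC tool or prescribed method; the records, field names, and measures below are hypothetical assumptions used only to show that a favorable system-particular MOP does not, by itself, establish the military effect captured by an MOE.

    # Hypothetical engagement and mission records (illustrative only, not test data).
    engagements = [
        {"target_detected": True, "detection_range_km": 6.2},
        {"target_detected": True, "detection_range_km": 5.8},
        {"target_detected": False, "detection_range_km": 0.0},
        {"target_detected": True, "detection_range_km": 6.5},
    ]
    missions = [
        {"mission_accomplished": True},
        {"mission_accomplished": False},
        {"mission_accomplished": True},
    ]

    # MOP: a system-particular performance parameter
    # (mean detection range over successful detections).
    ranges = [e["detection_range_km"] for e in engagements if e["target_detected"]]
    mop_mean_detection_range_km = sum(ranges) / len(ranges)

    # MOE: the military effect -- the fraction of end-to-end missions accomplished
    # with the system employed in its expected operational environment.
    moe_mission_accomplishment = sum(m["mission_accomplished"] for m in missions) / len(missions)

    print(f"MOP (mean detection range, km): {mop_mean_detection_range_km:.1f}")
    print(f"MOE (mission accomplishment rate): {moe_mission_accomplishment:.2f}")

In this sketch the MOP could meet its threshold while the MOE remains low, which is why paragraph 3-1a precludes measuring operational ESS solely on the basis of system-particular performance parameters.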

Table 3-1. Evaluation Process Inputs/Key Activities and Outputs by Acquisition Phase

MSA Phase
Inputs: Joint Capabilities Document (JCD); Initial Capabilities Document (ICD); Joint and Army Universal Task List (UJTL and AUTL).
Key Activities: Form ATEC System Team (AST); conduct mission analysis considering operational capabilities of the system and environmental/threat challenges; develop concept performance objectives with the T&E Working Integrated Process Team (T&E WIPT); develop the T&E Strategy (TES).
Outputs: T&E Strategy (TES); AST Charter.

TD Phase
Inputs: UJTL and AUTL; ICD; draft Capabilities Development Document (CDD); Operational Mode Summary/Mission Profile (OMS/MP); draft Critical Operational Issues and Criteria (COIC); Security Classification Guide (SCG).
Key Activities: Review Critical Operational Issues and Criteria (COIC); develop Measures of Performance (MOPs) and Measures of Effectiveness (MOEs) in the context of a Pattern of Analysis (POA); develop the data model; develop the experiment design; coordinate with AST members, CBTDEV, and MATDEV; consider use of Modeling and Simulation (M&S); develop the Data Source Matrix (DSM); conduct the Early Strategy Review (ESR) and/or Concept In-Process Review (CIPR); develop the System Evaluation Plan (SEP); contribute to the T&E Master Plan (TEMP); coordinate and oversee DT; develop the Milestone B OMAR; conduct Rock Drills; coordinate on development of Test Resource Plan(s) (TRP).
Outputs: Safety Confirmation to support entrance into the E&MD phase; approved COIC; draft T&E Master Plan (TEMP); Test and Simulation Execution Strategy; System Evaluation Plan (SEP); Milestone B OTA Milestone Report (OMAR); Test Resource Plan (TRP).

Table 3-1. Evaluation Process Inputs/Key Activities and Outputs by Acquisition Phase (continued)

E&MD Phase
Inputs: UJTL and AUTL; CDD; COIC; OMS/MP; T&E Master Plan (TEMP); System Evaluation Plan (SEP); Acquisition Program Baseline (APB); system specifications; draft Information Support Plan (ISP); TRP; Logistics Demonstration (LD) plan; military utility assessments or demonstration final reports; SCG.
Key Activities: Update the SEP; update the TEMP; develop the M&S Accreditation Plan, conduct M&S Verification and Validation (V&V) as required, and prepare M&S V&V and Accreditation Reports; coordinate with AST members on OTA TPs; coordinate/oversee DT; conduct a LUT; conduct the LD; conduct the Operational Test Readiness Review (OTRR); conduct OT; develop the Milestone C OMAR; prepare an OTA Assessment Report (OAR) in support of LRIP for ACAT I, II, or oversight systems.
Outputs: TEMP; updated SEP; OTA TPs; accredited M&S; Milestone C SER; DT event-level test reports; LD report; Safety Release(s) to support DT and OT; Safety Confirmation to support entrance into the MS C phase and FRP-DR; OER.

P&D Phase
Inputs: UJTL and AUTL; CDD; Capabilities Production Document (CPD); COIC; OMS/MP; TEMP; SEP; APB; system specifications; ISP; military utility assessments or demonstration final reports; SCG; TRP; OTA TPs.
Key Activities: Update the SEP; update the TEMP; verify and validate M&S as required; evaluators/testers coordinate on OTA TPs; prepare T&E materials for the OT Readiness Review; coordinate/conduct OT; develop the OER in support of the Full-Rate Production Decision Review (FRP-DR).
Outputs: Updated TEMP; updated SEP; validated M&S; OTA TPs; Operational Test (OT) reports; Full-Rate Production Decision Review (FRP-DR) OER.

3-2. Levels of evaluation for non-major programs
a. Upon AST formation, the AST will perform a document review. If there is adequate information to create the T&E strategy, the AST will develop an Evaluation Determination Matrix (EDM) within 90 days, prior to developing the system's evaluation issues and conducting the ESR. The EDM is developed using a quantitative, repeatable methodology described in paragraph 3-2f below, and is used to guide determination of the program's required level of evaluation.

b. There are three levels of evaluation: Tailored, Abbreviated, and Low-Risk. See table 3-2.
(1) A Tailored Evaluation is the most comprehensive, addressing as many evaluation issues as the evaluator deems necessary. This may be all possible evaluation areas or just a few. Tailored evaluations typically require multiple data collection events to provide all required data.
(2) An Abbreviated Evaluation requires some data collection, but generally from few data sources. As a result, less detailed planning documentation is required. There is no need for the evaluator to address how the data will be analyzed across multiple events. The evaluation issues addressed may range from just a few to all those possible.
(3) A Low-Risk Evaluation is completed based on data already available, either through previous ATEC testing, DT, or field performance data. Generally, programs considered for a Low-Risk Evaluation are those which are modifications to existing programs that have a long, successful history of use by Soldiers.

Table 3-2. Levels of Evaluation for Non-Major Programs
Tailored Evaluation: Evaluation Determination Score of 15 or higher; default is a combined ESR/CIPR (including DSM); ESR/CIPR approval by the AEC Director; SEP-T (new format); OMAR-T or OER-T (new format).
Abbreviated Evaluation: Evaluation Determination Score of 11-14, or 10 or less with any 3s or 4s; combined ESR/CIPR (including DSM); ESR/CIPR approval by the AEC Director; no SEP/ASEP (OTA TPs as normally required); OMAR-A or OER-A (new format).
Low-Risk Evaluation: Evaluation Determination Score of 10 or less with no 3s or 4s; after approval by the AEC Director, inform the ATEC TD; standard memorandum to the T&E WIPT with notification of no further evaluation, including no position memorandum (for MR); stand-down of AST members (except DTC as appropriate).

c. The level of evaluation recommended by the AST is approved by the AST Chair's Directorate Director. Appendix F includes the format for the EDM submitted for approval. If the Director approves a low-risk evaluation, that approval must be confirmed by the AEC Director and the ATEC Technical Director must be informed. Figure 3-2 shows how the evaluation determination score is computed.
d. The AST will use flexible, shortened forms of standard planning and evaluation/measurement reports for each of the evaluation levels.
e. Regardless of the level of evaluation, ATEC is required to sign the Test and Evaluation Master Plan (TEMP) for all Army acquisition programs, and DTC will perform the required level of testing for the Safety Confirmation.
f. ATEC's mission is to conduct testing, experiments, assessments, and evaluations in order to provide the essential information to decision-makers and Warfighters, making better use of resources, focusing on systems presenting the greatest risk, increasing the rigor of testing and

evaluations, and producing more timely reports. As a means to accomplish all of this in a timely manner, ATEC will conduct one of three levels of evaluation for non-major programs: Tailored, Abbreviated, or Low-Risk. Figure 3-1 shows the process for determination of the level of evaluation.
Figure 3-1. Determination of Level of Evaluation (the AST develops the Evaluation Determination Matrix to determine the level of evaluation prior to development of evaluation issues: once the AST is formed, it reviews documents; if adequate information is available, the AST develops the EDM and obtains AST Chair Directorate Director approval, otherwise the program is placed in monitor status; a score of 15 or more leads to a Tailored Evaluation, a score of 11-14 or a lower score with any 3s or 4s leads to an Abbreviated Evaluation, and a score of 10 or less with no 3s or 4s leads to a Low-Risk Evaluation)
g. Document review. Upon AST formation, the AST will begin a document review. If the AST determines that there is adequate program information available, it will begin to develop the EDM. If the AST assesses that there is insufficient information about the capability, intent, or maturity of the program, upon confirmation of the AEC directorate director, the AST will go into a monitor status until sufficient documentation is available.
h. Development of the EDM. The EDM should be completed by the AST and submitted for approval by 90 days after standup of the AST. There are five factors in the EDM: system maturity, technical complexity, linkage to other systems, Soldier survivability, and impact on unit mission. The EDM is shown in table 3-3 below.
(1) If an AST determines that one or more of these factors (rows) are not applicable to their program, they can be removed from the matrix. Generally, however, all five factors will be used.
(2) For each factor in the EDM, the AST will assign a score using the definitions shown in the Scoring Rules column in table 3-3. For example, if the acquisition program is a completely new, unproven design, the system maturity factor will be scored as 4.

Table 3-3. Evaluation Determination Matrix

System Maturity (proven historical performance):
4 - Completely new design, unproven.
3 - Existing/proven design with significant modifications.
2 - Existing/proven design with minor modifications.
1 - Proven design from Army, other service, or industry.

Technical Complexity (level of technology compared to state-of-the-art):
4 - Pushes state of the art (new idea, step up in technology).
3 - Government tested but not fielded; demonstrated in labs, on brassboards, as prototypes.
2 - In commercial use.
1 - In military use (ruggedized, tested).

Linkage to Other Systems (dependency/interoperability with other systems):
4 - Critical to operation of another system (must be present for the other system to work; e.g., an IT system that controls a weapon), OR interoperable with many systems.
3 - Supports operation of a major system (the other system can work in a degraded mode), OR interoperable with a few systems.
2 - Supports operation of a non-major system (the other system can work in a degraded mode), OR interoperable with one other system.
1 - No interoperation with other systems.

Soldier Survivability (system provides protection to Soldiers (shield/armor/safety/CBRN)):
4 - Intended to prevent casualty (e.g., body armor with plates, HMMWV add-on armor).
3 - Intended to prevent major injury (e.g., partial shield for Soldier, helmet).
2 - Intended to prevent bodily injury (e.g., ballistic eyeglasses).
1 - Provides no protection to Soldiers.

Impact on Unit Mission (ability of unit to complete mission if system fails to meet requirements):
4 - System critical to unit mission completion (cannot perform the mission without it; e.g., crew-served weapon, asbestos gloves).
3 - Unit mission completed in degraded mode (e.g., transport trailers).
2 - Unit mission completed with workarounds (e.g., training devices, cold-weather gloves).
1 - No direct impact on unit mission (e.g., MRE heater).

(3) The scores for system maturity, technical complexity, linkage to other systems, Soldier survivability, and impact on unit mission will be added to get a total system score.
(4) If all factors are used, the total score will be used to determine the level of evaluation. If one or more of the factors are not used, the total scoring guidelines below should be reduced by 4 points for each unused factor. See table 3-3.
(a) If the score is greater than or equal to (≥) 15, the system should receive a Tailored Evaluation.

(b) If the score is between 11 and 14, inclusive (11 ≤ x ≤ 14), or if the score is less than 11 but there are any individual factors which score 3 or 4, the system should receive an Abbreviated Evaluation.
(c) If the score is 10 or less (x ≤ 10) and no individual factor received a score of 3 or 4, the system should receive a Low-Risk Evaluation.
(d) These scores are general guidelines. If the AST or the ESR approval authority does not agree with the outcome, the EDM guidelines may be waived. (A notional worked example of this scoring logic is provided following paragraph 3-3a.)
i. Approval of the EDM. After the AST has developed and agreed upon the EDM, it will be submitted to the AST Chair's Directorate Director for approval. This should be done within 90 days of the AST standup, unless the system is put on monitor status. As required, individual AST members can staff the proposed EDM matrix through their Directorate chain-of-command prior to submission to the AST Chair's Directorate Director for approval. If a briefing to support the EDM submission is produced, its format and content are at the discretion of the AST.
j. Tailored Evaluations. Acquisition programs with an EDM score of 15 or above will generally be selected for Tailored Evaluations.
(1) After approval of the evaluation level by the AST Chair's Directorate Director, the AST will prepare for a combined ESR/CIPR. The Data Source Matrix (DSM) will be part of the ESR/CIPR. If the AST believes that the ESR and CIPR need to be conducted separately, that is acceptable. The format for this briefing is in appendix F.
(2) Within 30 days of evaluation level approval, the AST should project a date for a combined ESR/CIPR. The goal for the ESR/CIPR is the same as for major programs; that is, to be conducted within 180 days of AST standup. The Chair for the ESR/CIPR is the Director, AEC. The goal for tailored evaluations is a combined ESR/CIPR. However, separate ESR and CIPR can be conducted if the AST believes it necessary. Formats for ESR and CIPR briefings can be found in appendix F. The ESR will contain the EDM results.
(3) The AST (led by the Evaluator) will begin developing evaluation issues, measures, design of experiments, and analysis methodologies. They will assess limitations to the testing or evaluation, and develop required sources of data (types of events).
(4) The AST (led by the Testers) will then develop a test strategy to address the design of experiment as well as draft test cost estimates.
(5) The AST Chair will then coordinate this draft T&E strategy with the WIPT, resolving issues (as possible).
(6) After approval of the ESR/CIPR by the Director, AEC, the AST will develop a Tailored System Evaluation Plan (SEP-T) for the system. The format for the SEP-T is in appendix F. Consistent with current policy, the SEP-T will be approved by the Director, AEC.
(7) The approval authority of the TEMP for tailored evaluation programs remains the same as that for all programs not on the OSD Oversight List.

75 (8) The OTA TPs are prepared by the AST as required, consistent with current policy. (9) The OTA TPs approval authority is provided in table 4-1. (10) The testers will develop Detailed Test Plans in the usual way as required for events supporting the evaluation. (11) At the conclusion of each test event, the tester will develop and publish a test report. OTC will use the Abbreviated Operational Test Report (AOTR) format and timeline; DTC will use the Final Test Report (FTR) format and timeline. Additionally, DTC will provide an Interim Data Report (IDR) to the Evaluator no later than 45 days after conclusion of the test (E+45). (12) At the conclusion of all events necessary to address the evaluation issues, the AST will prepare a Tailored OTA Milestone Assessment Report (OMAR-T) or a Tailored OTA Evaluation Report (OER-T). The formats for these reports are shown in appendix F. (13) The AST will staff the OMAR-T or OER-T through their individual Directorates, then through AEC Technical Editing. (14) The OMAR-T or OER-T is submitted to the AEC Director for final review and approval. k. Abbreviated Evaluations. As described above, acquisition programs with an EDM score between 11 and 14, or a score less than 11 with any EDM factor receiving a score of 3 or 4, will generally be selected for Abbreviated Evaluations. (1) After the AST Chair s Directorate Director approves evaluation level, the AST will prepare for a combined ESR/CIPR. The EDM and the DSM will be part of the ESR/CIPR. Separate ESR and CIPR may be conducted if the AST believes it is necessary. The format for this briefing is in appendix F. (2) Within 30 days of evaluation level approval, the AST should project an ESR/CIPR date. The goal for ESR/CIPR is the same as for major programs; that is, to be conducted within 180 days of AST standup. The Chair for the ESR/CIPR is the Director, AEC. (3) The AST (led by the Evaluator) will begin developing evaluation issues, measures, design of experiments, and analysis methodologies. They will assess limitations to the testing or evaluation, and develop required sources of data (types of events). (4) The AST will (led by the Testers) develop a test strategy to address the design of experiment as well as rough test cost estimates. (5) The AST Chair will coordinate the draft T&E strategy with the T&E WIPT, resolving issues (as possible). (6) After approval of the ESR/CIPR by the Director, AEC, the appropriate AST member will begin preparation of any required OTA TPs. ATEC Pamphlet June

76 (7) The approval authority of the TEMP for the Abbreviated Evaluations remains the same as for TEMPs for ACATs IIs not on the OSD Oversight List. (8) The AST will not develop a System Evaluation Plan (SEP). The ESR, CIPR, and DSM should be sufficient for the AST to develop an OTA TP (if needed) and execute the T&E. (9) OTA TPs are prepared by the AST as required, consistent with current policy. The Evaluator may include evaluation strategy and methodology in OTA TPs if desired, either as part of chapter 3 or as an appendix. (10) The OTA TP is signed by all ATEC SCAs at the same level. That is, if the Tester signature is by the Director of the Test Directorate, the Evaluation Directorate Director will sign, etc. (11) The testers will develop DTPs in the usual way as required for events supporting the evaluation. DT Test Plans will necessarily be reviewed by the Evaluation team to ensure that all required information for the evaluation is included (since no SEP is required and an OTA TP may not be created). (12) At the conclusion of each test event, the tester will develop and publish a test report (TR). OTC will use the AOTR format and timeline; DTC will use the FTR format and timeline. Additionally, DTC will provide an IDR to the evaluator no later than E+45. (13) At the conclusion of all events necessary to address the evaluation issues, the AST will prepare an Abbreviated OTA Milestone Assessment Report (OMAR-A) or an Abbreviated OTA Evaluation Report (OER-A). The formats for these reports are shown in appendix F. (14) The AST will staff the OMAR-A or OER-A through their individual Directorates, then through AEC Technical Editing. (15) The OMAR-A or OER-A is submitted to the AEC Director for final approval. l. Low-Risk Evaluations. As described above, acquisition programs with an EDM score of 10 or less (with no EDM factors receiving a score of 3 or 4) will generally be selected for Low- Risk Evaluations. (1) If the AST Chair s Directorate Director approves a Low-Risk Evaluation, within 5 days the AST Chair must submit a Form 45 requesting evaluation type confirmation from the AEC Director. Submitted in the package should be, at a minimum, the EDM, the briefing given for evaluation-level approval at the Directorate level (if any), and a signature-ready copy of the standard memorandum notifying the T&E WIPT of ATEC s decision. (2) After the AEC Director confirms the decision to perform a Low-Risk Evaluation, the approved package, including signed Form 45, will be forwarded to the ATEC Technical Director for information. Once the ATEC TD initials the Form 45, the package will be returned to the AST Chair ATEC Pamphlet June 2010

(3) The AST Chair will deliver the signed notification memorandum to the T&E WIPT members. The memorandum should be published as a Portable Document Format (PDF) document, and both e-mailed and postal mailed to the T&E WIPT.
(4) The PDF version of the memorandum will be stored/published in the VISION Digital Library (VDL) and documented in the ADSS by the AST.
(5) The AST will be stood down. If DTC continues to be involved in program testing, the AST will remain, with the DTC Test Manager as the AST Chair and no other members. If desired, the evaluator can remain as an AST member.
(6) If a TEMP is required for the program, all ATEC SCAs will sign the coordination page at the Action Officer level (current or former AST members if possible). The current AST Chair's SCA Commander or Director will sign the approval page of the TEMP. If there is no active AST, the former AST Chair's SCA Commander or Director will sign the approval page.
(7) DTC, or any ATEC organization, will notify ATEC DCSOPS if there are significant program changes.
(8) If ATEC DCSOPS is notified of any significant or substantial program changes, or if any documents come in for review or approval, the AST will be re-formed or repopulated, and the AST will restart the document review and EDM process.

3-3. Development of the evaluation strategy
At the center of the evaluation plan is the system evaluation strategy. It is briefed at the ESR, targeted for 180 days after the AST is formed. The approved evaluation strategy is documented in the System Evaluation Plan (SEP). The following paragraphs identify the major considerations and philosophies that the AST must consider when developing the evaluation strategy.
a. An evaluation strategy should address how the AST intends to test a system so the Evaluator can evaluate how well the system meets ESS criteria, determine which factors and conditions affect its performance, and assess performance improvements over current capabilities. There are four phases to the evaluation strategy. First, the system's requirements, functionality, associated measures, and intended operational environment must be clearly specified. This is accomplished by performing a requirements analysis. Second, the Evaluator must document the data needed to evaluate the system. Data requirements, especially for large or complex systems, are documented, organized, and managed using a tool called a system evaluation data model. In addition, evaluation data sets, which are rectangular tables or files where the rows are observations and the columns are variables, must be designed. Data sets are used to analyze the data collected at test events. Evaluators can find additional data model and data set information in the Evaluator and Analyst Handbook. Third, whenever possible, test events must be structured so they not only reflect operationally relevant conditions but also permit determining the effects of test conditions on system performance as well as determining performance improvements. This is accomplished by means of design of experiments. Fourth, each measure must be assigned to one or more data sources.
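As referenced in paragraph 3-2h(4)(d), the following is a minimal, notional sketch in Python of the EDM scoring logic described in paragraph 3-2h and table 3-3. It is not an ATEC tool or prescribed format; the function name, factor keys, and example scores are illustrative assumptions, and the thresholds remain general guidelines that may be waived.

    # Notional sketch of the EDM scoring thresholds in paragraph 3-2h(4) (illustrative only).
    def evaluation_level(scores):
        """Map a dict of EDM factor scores (1-4) to a level of evaluation.

        Per paragraph 3-2h(4), the guideline totals are reduced by 4 points
        for each of the five factors that is not used.
        """
        unused_factors = 5 - len(scores)
        tailored_floor = 15 - 4 * unused_factors
        abbreviated_floor = 11 - 4 * unused_factors
        total = sum(scores.values())
        if total >= tailored_floor:
            return "Tailored"
        if total >= abbreviated_floor or any(s >= 3 for s in scores.values()):
            return "Abbreviated"
        return "Low-Risk"

    # Example: a minor modification of a fielded item with no factor scored 3 or 4.
    example_scores = {
        "system_maturity": 2,           # existing/proven design with minor modifications
        "technical_complexity": 1,      # already in military use
        "linkage_to_other_systems": 1,  # no interoperation with other systems
        "soldier_survivability": 1,     # provides no protection to Soldiers
        "impact_on_unit_mission": 2,    # unit mission completed with workarounds
    }
    print(evaluation_level(example_scores))  # "Low-Risk" (total of 7, no 3s or 4s)

Consistent with table 3-2, the same function returns "Tailored" for totals of 15 or more; as paragraph 3-2h(4)(d) notes, the AST or the ESR approval authority may still waive these guidelines.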

An evaluation strategy describes the process of identifying, defining, structuring, and resourcing the areas of study and data required to test and evaluate Army systems. The process is facilitated by a number of tools. The definition of an evaluation strategy includes:
- Requirements Analysis: Identify areas of study, associated measures and standards, and the data required for the evaluation.
  o Tools and procedures: WBS/Functional Dendritic, Lens Chart, Factors & Conditions, Pattern of Analysis, Mission-Based T&E.
- Data Management: Name, define, and house/store the data requirements associated with the areas of study and measures.
  o Tools and procedures: Data Model, Data Sets, Data Archive.
- Design of Experiment: Design/structure test events to enable causal inferences.
  o Tools and procedures: Scientific method, full factorial/fractional designs, sample sizing, statistical analysis.
- Data Sources: Identify sources of data for the data requirements.
  o Tools and procedures: Data Source Matrix (historical data, developmental test, contractor test, operational test, modeling and simulation, etc.).
Figure 3-2. Evaluation Strategy Summary

b. Requirements Analysis. The requirements analysis involves reviewing the relevant system documentation and identifying the functions the system must perform, the components of the system that perform the required functions, and the measures of system performance and mission accomplishment. KPPs, COIC, and Additional Issues (AIs) are also identified. The analysis also identifies the conditions under which the system is expected to perform. In short, the requirements analysis results in the identification of all the variables needed to evaluate a system.
c. Data Management. All of the data supporting a system evaluation should be documented, organized, and managed using a system evaluation data model. The data model will define all of the data files as well as all of the variables that the Evaluator will draw upon in evaluating the system. Data models are refined into normalized file or table designs. Normalized data tables are optimized for data management, query, and consistency.
d. It is easy to confuse the design of a data set with a physical data set that has data in it. If a physical data set is envisioned as a data table, the data set design would be represented by the column headings of that table. The design comprises a list of names of variables that are needed to address hypotheses about the effects of variables on system performance or overall mission accomplishment. When a physical data set is created, the variable names become the column headings. Physical data sets are populated with data generated at some test event. Each observation captured at the test event becomes a row in the table. The design of a data set is driven by the

e. Design of Experiment. In the classical sense, the purpose of design of experiment is to test hypotheses regarding the effects of conditions (independent variables) on the response variable (dependent variable). This is done by controlling some test conditions (typically by randomizing them or holding them constant) and systematically varying other conditions, in such a way that any effect on the dependent variable due to changes in the independent variable can be determined. For the Evaluator, the dependent variables will be measures of system performance, measures of overall mission accomplishment, and the system's contribution to overall mission accomplishment. Examples of independent variables include mission, enemy, terrain, and weather, but they may also be system specific (e.g., bandwidth, armor, signature). The Evaluator benefits from viewing all tests in terms of controlled conditions, independent variables, and dependent variables. The approved design of experiment must be followed by the testers. Any plans to deviate from the approved design of experiment must be coordinated with the AST Chair, the lead test officer, and the tester chain of command.

(1) The primary purpose of evaluating system performance with quantitative techniques is to produce an estimate of future system performance once the system is deployed in a combat circumstance. This estimate is an assessment of expected system performance, based on test results. However, since the test results are only an estimate of system performance, there is risk associated with the possibility that the test results are not representative of future system performance. Therefore, the test estimate must be matched with a corresponding confidence interval.

(2) This confidence interval represents a range of estimated values predicted for actual system performance. The purpose of any confidence interval is to mitigate the risk associated with the test estimate. Accompanying the confidence interval is a level of confidence, which is the probability that the actual system performance will fall between the lower and upper bounds of the confidence interval. The level of confidence should be 90 percent or higher. The purpose of a high level of confidence is so ATEC can strengthen the conclusion from test results by stating ATEC is at least 90 percent sure the actual system performance will fall within the stated range of values.

(3) Data collection requirements need to be tied to the performance measures that address the objective of the evaluation. Performance measures can be placed into two categories: attribute measures and variable measures. Attribute measures are binary and take on values of pass/fail. Variable measures are numerical values, such as distance, time, or quantity. In either case, data collection requirements can be planned by using the expected results and modeling them with a normal probability distribution. This allows the AST to better plan the test and to describe the data needs in terms of the conclusions required at the end of the test.
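The sketch below illustrates, under simplifying assumptions, how a test estimate and its 90 percent confidence interval might be computed for each category of measure: a normal-approximation interval for an attribute (pass/fail) measure and a t-based interval for a variable (numerical) measure. The sample data are hypothetical, and the sketch is not a statement of ATEC methodology.

```python
# Illustrative sketch: 90 percent confidence intervals for an attribute
# measure (a proportion) and a variable measure (a mean). Data are notional.
import math
from scipy import stats

CONFIDENCE = 0.90
z = stats.norm.ppf(1 - (1 - CONFIDENCE) / 2)   # two-sided normal critical value

# Attribute measure: 46 successes observed in 50 trials.
successes, trials = 46, 50
p_hat = successes / trials
half_width = z * math.sqrt(p_hat * (1 - p_hat) / trials)
print(f"P(success) = {p_hat:.2f}, "
      f"90% CI = ({p_hat - half_width:.2f}, {p_hat + half_width:.2f})")

# Variable measure: observed task completion times in minutes.
times = [4.1, 3.8, 5.0, 4.4, 4.9, 3.7, 4.2, 4.6]
n = len(times)
mean = sum(times) / n
sd = math.sqrt(sum((t - mean) ** 2 for t in times) / (n - 1))
t_crit = stats.t.ppf(1 - (1 - CONFIDENCE) / 2, df=n - 1)
half_width = t_crit * sd / math.sqrt(n)
print(f"Mean time = {mean:.2f} min, "
      f"90% CI = ({mean - half_width:.2f}, {mean + half_width:.2f})")
```

If the resulting interval is wider than the required conclusions can tolerate, the usual remedy is to increase the number of trials in the design of experiment.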
f. Statistical techniques are used in conjunction with design of experiments to test hypotheses, account for effects due to randomness, support generalizations, and establish confidence levels associated with conclusions. An experiment that is not carefully designed or is insufficiently controlled has the potential to confound or bias the results of the analysis. Such cases are unacceptable and can lead to situations where resources are wasted, superior solutions are overlooked, or the Warfighter is denied an increased capability when one is available. Design of experiments is addressed in greater detail in the Evaluator and Analyst Handbook.

g. Data Sources. Evaluation data sources potentially include more than just test results. They can also include the results of analyses, modeling and simulation, interviews, and reviews of relevant literature. The DSM is the tool that relates Critical Operational Issues and Criteria (COICs), AIs, and measures to the relevant data sources where the data needed for the measures can be obtained. The DSM is an important planning tool and is included as an appendix to the SEP. An example DSM is provided later in this chapter.

h. Office of the Secretary of Defense (OSD) Section 231 Report Guidelines for T&E. Section 231 of the National Defense Authorization Act for Fiscal Year 2007 directed a review, and amendment if necessary, of defense acquisition T&E. The Office of the Under Secretary of Defense (Acquisition, Technology and Logistics) produced an initial DOD Report to Congress on Policies and Practices for Test and Evaluation, dated 17 July 2007, in order to satisfy this legal mandate. The report introduces eight key T&E principles that are referred to herein as the OSD Section 231 Report Guidelines. The eight principles and methods for implementing them are provided in table 3-4.

Table 3-4. OSD Section 231 Report Guidelines and Implementation Methods

Principle: Measure improvements to mission capability and operational support.
How to implement: Conduct periodic assessments of observed improvements. Relate system functionality to warfighting functions and tactical mission tasks and measure improvements.

Principle: Experiment to learn strengths, weaknesses, and the effect on operational capabilities.
How to implement: Use an OT design of experiment: test under controlled conditions to determine capabilities and limitations.

Principle: Integrate DT and OT.
How to implement: Consider using a single test event to address both DT and OT T&E requirements. Use an OT design of experiment during DT. Give DT an operational flavor with appropriate threats or Soldier operators.

Principle: Begin early, be operationally realistic, and continue throughout the life cycle.
How to implement: Get involved very early in the development process. Liaise with the PM, contractors, and developmental tester. Coordinate with AMC and pertinent TRADOC Battle Labs. Communication with all parties is a requirement.

Principle: Evaluate in the mission context expected at the time of fielding.
How to implement: Ensure an operational environment with realistic and representative threats that are expected at the time of fielding plus 5 years. Apply a mission-based test and evaluation (MBT&E) focus to the system evaluation.

Principle: Compare to current mission capabilities.
How to implement: Whenever possible, include in the OT design of experiment a comparison to the baseline system.

Principle: Use all available data and information.
How to implement: Use historical data as well as data from contractor testing, DT, M&S, etc. for the system evaluation. Use a data model to facilitate understanding of differing sources of like data.

Principle: Exploit the benefits of modeling and simulation.
How to implement: Supplement/complement live testing with virtual and constructive simulations. Plan for early VV&A of models and resource the effort.

3-4. Steps in developing the evaluation strategy
The products produced during the development of an evaluation strategy are not necessarily produced sequentially but are presented below in a plausible order.

a. Step 1. Gather and review relevant documents. Identify KPPs and COIC. KPPs are attributes or characteristics of a system that are considered critical or essential to the development of an effective military capability and those attributes that make a significant contribution to the key characteristics as defined in the Joint Operations Concepts. (They are found in the CDD or CPD.) COICs are operational ESS concerns with performance standards that must be examined to determine the degree to which the system is capable of performing its mission and is ready to move on to full rate production. COIs are always stated as questions addressing system capabilities. (COICs include a scope and rationale and are found in the COIC Memorandum or TEMP.)

b. Step 2. Use or produce selected DODAF products. DODAF v2.0 is the current version in use. It can be downloaded from the internet. DODAF provides an underlying structure for working with complexity as well as a standard way to organize system architecture into consistent operational and system views (see paragraph 3-6 and appendix I for additional information). Operational views (OVs) are views of the military operational elements, tasks, activities, and information flows that support mission accomplishment. System views (SVs) show the interrelations and dependencies of technologies, systems, and other resources that support mission accomplishment. All major weapons and information technology system procurements are required to develop and document an enterprise architecture (EA) using the views prescribed in the DODAF. Operational views OV-1, OV-5, and OV-6c and system view SV-5 should be developed for all systems. If a system interoperates with other systems, OV-2, SV-2, and SV-6 should also be developed.

c. Step 3. Develop AIs. AIs are operational ESS concerns not addressed by the COIC. The capabilities of, and missions supported by, the system under study should be matched with like capabilities and missions in the Army Universal Task List (AUTL) in order to help identify AIs. Ensure that interoperability, RAM, MANPRINT, ILS, and survivability are adequately addressed. AIs, like COIs, are stated as questions but are not accompanied by formal criteria.

d. Step 4. Develop one or more measures for each issue identified above. The capabilities of, and missions supported by, the system under study should be matched with like capabilities and missions in the AUTL in order to help identify measures. Measures are observable quantities that define two or more levels of some aspect of ESS. Measures have traditionally been grouped into MOEs and MOPs. MOEs are measures that most directly impact overall mission accomplishment. MOPs are measures that most directly indicate levels of system performance.

e. Step 5. Develop a functional dendritic and supplement it with a lens chart. A functional dendritic is a hierarchical decomposition of system attributes or characteristics in terms of ESS categories. A lens chart is a grouping of measures into 2 to 4 categories where the categories are separated by drawings of optical lenses. Potential categories include system components and associated functions, system or system of systems, family of systems, and overall mission accomplishment.

f. Step 6. Develop the DSM. The DSM is a management tool used to identify archived data from past tests and data from future events that will be drawn on in an evaluation. The DSM traces the T&E issues, criteria, and measures given in the SEP to the T&E events in which the data needed for the measures will be obtained (see figure 3-11); a notional DSM sketch is also provided after step 14 below. Considerations for selecting an event include the tactical context, robustness, and fidelity required in the event to properly address the measure.

g. Step 7. Develop a Pattern of Analysis (PoA). The development of a PoA is a process of identifying data requirements. It provides traceability between the T&E issues given in the SEP and their associated criteria, measures, and data requirements. It is an extension of the DSM. Normally, factors and conditions are identified at this time. Factors and conditions are the expected test conditions under which system performance measurements will be made. They constitute the variables affecting system performance.

(1) Data collection is a very important part of the process for evaluating the performance of military and computer systems. It is critical to set up the data collection plan so that the performance goals are thoroughly addressed at the end of the evaluation. The data collection plan needs to determine the type and amount of data required to accomplish the goals set at the beginning of the evaluation.

(2) Objectives of data collection include using the test results to make statistically supportable conclusions on how the system under test will perform when deployed. These objectives include passing some performance requirement, such as a minimum probability of success that indicates accuracy. Another objective could be exceeding some functional measure, such as average distance traveled or time in operation. Whatever the objectives, the data collection plan needs to be tied to the performance measures that focus on system objectives.

h. Step 8. Develop a System Evaluation Data Model. A System Evaluation Data Model is a representation of the information and data assets required to evaluate the system. The System Evaluation Data Model is expressed in terms of entities and the relationships between entities. It ensures that data are properly documented for communication between the evaluator and the tester.

i. Step 9. Develop a design of experiment. An experiment is a controlled event in which the experimenter manipulates variables (independent and blocking) and observes the effects on other variables (dependent variables). Ideally, a design of experiment will provide for assessing the effects of test conditions on system performance and for comparing new capabilities with currently fielded capabilities. Testers will not deviate from the approved design of experiment without first coordinating with the AST.

j. Step 10. Develop evaluation data sets. An evaluation data set is a file containing the data on which a data analysis is performed. The evaluation data set will contain all the variables necessary to test one or more hypotheses concerning system performance.

k. Step 11. Assess the need and provide for M&S and instrumentation. A model is a set of variables with defined relationships that describe an object or event. When the values of the variables are specified, the model simulates a specific situation. A simulation may be defined as a set of related variables with specified values. Instrumentation provides for automated data collection. Many test events require the use of M&S to replicate an operational environment (e.g., stimulate with simulated message traffic) and instrumentation to capture data (e.g., record message traffic).

l. Step 12. Produce the ESR. The ESR focuses on the evaluation strategy, identifying the information needed to evaluate ESS and impacts on mission capability. The ESR provides verification that all KPPs and COIC are addressed. It serves as a basis for developing the SEP, chapters 1-3. The design of experiment presented at the ESR is used to develop supporting event requirements for the CIPR and SEP chapter 4 (test strategy).

m. Step 13. Provide input to the TEMP.

n. Step 14. Produce the SEP. The SEP is a description of the system evaluation issues and the methodology to address those issues. It serves as a contract between the evaluator and tester regarding what needs to be evaluated, and it permits the conduct and replication of an experiment designed to address the evaluation issues.
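A notional Data Source Matrix, referenced at step 6 above, is sketched below. It is illustrative only: the issues, measures, and source assignments are hypothetical, and the DSM delivered with the SEP will follow the approved format.

```python
# Illustrative sketch: a notional Data Source Matrix (DSM) relating measures
# to the data sources planned to supply their data. An "X" marks a planned
# source. All names are hypothetical.
import pandas as pd

dsm = pd.DataFrame(
    {
        "Issue / Measure": [
            "COIC 1 / Probability of detection",
            "COIC 1 / Probability of kill",
            "AI 2 / Mean time to repair",
        ],
        "Contractor Test":       ["",  "",  "X"],
        "Developmental Test":    ["X", "X", "X"],
        "Operational Test":      ["X", "X", ""],
        "Modeling & Simulation": ["",  "X", ""],
        "Historical Data":       ["",  "",  "X"],
    }
)
print(dsm.to_string(index=False))
```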

3-5. Documenting the evaluation process
Having learned about the system, its intended operating environment, the required capabilities, and the tasks which Warfighters will be required to perform with it, the next step is for the Evaluator and his/her team to develop the evaluation framework.

a. The evaluation framework is generally presented as a dendritic (see figure 3-3). Level 2 of the dendritic has three components: effectiveness, suitability, and survivability.

b. Level 3 of the dendritic contains evaluation areas (EAs), also known as study questions, issues, or AIs. Each COI will be an EA, and the evaluation team may create other EAs as groupings for important study areas. These are usually major themes, capabilities, or areas of consideration.

c. Each EA is broken down further into evaluation focus areas (EFAs), and these are shown on level 4 of the dendritic. Each KPP and COIC will be an EFA, and again, the evaluation team may create other EFAs as necessary. Those EFAs which are supported by a KPP or COIC will be asterisked in the dendritic. Every EA will have at least one EFA.

d. Measures are the final level in the dendritic. This level is often not shown graphically due to space considerations. Measures are the questions you want to ask and answer about an EFA, and the data collected will be designed to answer these measures.

(1) Measures must be quantifiable and testable. Ensure the measures will elicit enough data to fully determine whether COICs or KPPs are satisfied under all factors and conditions.

(2) Measures should be designed to provide substantive information to aid in understanding the system response. They should be broad enough to help the Evaluator understand why the capability is as it is and to estimate how it might be expected to change as the system matures.

(3) As you develop the measures, consider how you will answer them: what data you will need to calculate the measure, under what conditions the data will need to be collected, and what analytic and statistical techniques you will use. These will be necessary for the evaluation strategy and the evaluation plan. Carefully considering these questions will also ensure that you have the right measures for a robust evaluation framework.

Figure 3-3. Example System Evaluation Dendritic
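To show how the levels of the evaluation framework nest, the sketch below holds a notional dendritic as a nested structure (ESS category, evaluation area, evaluation focus area, measures). Every name in it is hypothetical and does not represent any actual program.

```python
# Illustrative sketch: a notional evaluation framework dendritic as a nested
# dictionary. Asterisks mark EFAs supported by a KPP or COIC, as described in
# the text. All names are hypothetical.
dendritic = {
    "Effectiveness": {
        "EA 1: Lethality (COI 1)": {
            "EFA 1.1: Target engagement (KPP)*": [
                "Probability of kill per engagement",
                "Launcher firing rate",
            ],
        },
    },
    "Suitability": {
        "EA 2: Reliability": {
            "EFA 2.1: Mission reliability (COIC)*": [
                "Mean time between operational mission failures",
            ],
        },
    },
    "Survivability": {
        "EA 3: Force protection": {
            "EFA 3.1: Crew protection": [
                "Crew casualty rate under a representative threat",
            ],
        },
    },
}

# Walk the dendritic and print it as an indented outline.
for category, areas in dendritic.items():
    print(category)
    for ea, efas in areas.items():
        print("  " + ea)
        for efa, measures in efas.items():
            print("    " + efa)
            for measure in measures:
                print("      - " + measure)
```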

3-6. Evaluation tools
Some additional tools that will help develop the evaluation strategy are presented in the paragraphs below.

a. Baseline Correlation Matrix (BCM). Now that the Evaluator and his team have a first draft of the evaluation framework, they will develop the BCM. The BCM documents the crosswalk between requirements/capabilities and measures, quickly highlighting those requirements which are inconsistent with each other or are not addressed by an evaluation measure.

(1) The BCM table format shows general capabilities in the left column and measures in the right column. In the middle columns are the requirements/capabilities documents, and each row traces a particular capability through all the documentation (see table 3-5). Entries should include the paragraph number from the source document and a summary of the capability, measure, and threshold when applicable. The BCM should include, but not be limited to, the following column headings if the applicable documents exist:

(a) ICD.
(b) CDD or CPD (no need to do both of them; use the most recently published).
(c) AoA. Correlate the AoA measures of effectiveness with system issues and requirements if possible. Resolve inconsistencies.
(d) System specification or Request for Proposal (RFP) if the document details operational requirements. For NDI, the RFP and system specifications may be the primary requirements documents available to the AST.
(e) System MANPRINT Management Plan (SMMP).
(f) Computer Resource Management Plan (CRMP).
(g) System Safety Management Plan (SSMP).
(h) Safety Assessment Report.
(i) Safety Releases and Confirmations.
(j) Critical Operational Issues (COI).
(k) Critical Technical Parameters (CTP) from the TEMP.
(l) KPPs/COICs.

(2) Development of the BCM is an evolutionary process. As new or updated requirements documents are developed, the AST compares them to the requirements already established in the BCM. By tracing the consistency of the requirements for wording, criteria, units, and specific values, the AST can identify discrepancies in requirements early, when their impact can easily be minimized. If an inconsistency, omission, or other change that is not directly traceable to an earlier requirement is noted, it must be justified or rectified. The system capabilities are examined to assure that each is covered by an adequate set of measures. If not, then coverage can easily be added at this stage to the evaluation dendritic. The end product is a consistent, fully justified set of measures that is a firm foundation for an evaluation.

Table 3-5. Example Baseline Correlation Matrix
(Column headings: Capability; ICD; CPD; System Spec/RFP; CRMP; SMMP; Evaluation Area (EA); Evaluation Focus Area (EFA); COIC/KPP/CTP; Measures.)

Capability 1.0, Firepower (probability of kill):
o ICD (para 5.7), CPD (para 2.3), and System Spec/RFP (para 1.2): Probability of kill = 0.96 per enemy plane when raid size is < 20 planes.
o Issue 5: Does System XYZ retain capability of kill in an EW environment? (No criteria.)
o Criteria 1.1: Probability of kill (Pk) > 0.96 when raid size is < 20 planes.
o Measures: Pk = #K / T#Tgts, where #K = number of enemy planes killed in a given battle sequence and T#Tgts = total number of targets in the given sequence; 5.1 Pk = (etc.); 5.2 Pk = (etc.).

Capability 1.0, Firepower (firing rate):
o ICD (para 5.9): Must have a firing rate of 1 round per launcher every 5 seconds. CPD (para 2.7) and System Spec/RFP (para 1.3): Must have a firing rate of 1 round per launcher every 3 seconds.
o Issue 2: Does System XYZ have an effective firing rate during a typical battle scenario?
o Criteria 2.1: Firing rate of 1 round per launcher every 5 seconds.
o Measures: Launcher Firing Rate = (Sum of DUREi) / (Sum of #U), where DUREi = duration of engagement i and #U = number of launchers in the engagement.

Capability 2.0, Target Location (detection):
o ICD (para 4.2): Must detect target with high probability at a distance threat aircraft can deliver ordnance. CPD (para 3.1) and System Spec/RFP (para 2.5): Must detect target with probability 0.91 at a distance of < 2 miles.
o Issue 3: Does System XYZ accurately detect enemy targets in an operational environment?
o Criteria 3.1: Detect an enemy target with probability > 0.91 when target is < 2 miles out.
o Measures: Pd = #D / T#Tgts, where #D = number of enemy planes detected in a given battle sequence and T#Tgts = total number of targets available in the given battle sequence.

Capability 2.0, Target Location (identification):
o ICD (para 4.2): Given target detection, must correctly identify target with 0.98 probability. CPD (para 3.2): Operator must correctly identify target with 0.98 probability. System Spec/RFP (para 5.7.1): Weapons sight shall have a resolution of 0.3 milliradians.
o Issue 4: Does System XYZ correctly identify targets in the field?
o Criteria 4.1: Correctly identify 98% of the targets it detects.
o Measures: Pi = #I / #D, where #I = number of enemy planes correctly identified in a given battle sequence and #D = number of enemy planes detected in the given battle sequence.
o (etc.)
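The measures in the example BCM are simple ratios that can be computed directly from an evaluation data set. The sketch below is illustrative only; the column names and values are hypothetical.

```python
# Illustrative sketch: computing Pd, Pi, and Pk, as defined in the example
# BCM, from a notional engagement-level data set. Column names are
# hypothetical.
import pandas as pd

engagements = pd.DataFrame(
    {
        "target_id":  [1, 2, 3, 4, 5],
        "detected":   [1, 1, 1, 0, 1],
        "identified": [1, 1, 0, 0, 1],
        "killed":     [1, 0, 0, 0, 1],
    }
)

total_targets = len(engagements)                                               # T#Tgts
p_detect = engagements["detected"].sum() / total_targets                       # Pd = #D / T#Tgts
p_identify = engagements["identified"].sum() / engagements["detected"].sum()   # Pi = #I / #D
p_kill = engagements["killed"].sum() / total_targets                           # Pk = #K / T#Tgts

print(f"Pd = {p_detect:.2f}, Pi = {p_identify:.2f}, Pk = {p_kill:.2f}")
```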

b. DODAF. The DODAF products are useful tools in that they convey the details of system functionality within an SoS and/or FoS context. The DODAF is the standard way to organize system and/or enterprise architectures in complementary and consistent views.

(1) The applicable CJCSI defines the JCIDS-required DODAF products that support capability development and acquisition. The PM uses existing DODAF products and also develops new ones. They are used to guide system development and are required to satisfy NR-KPP requirements, in particular with respect to interoperability certification and standards compliance. The AST can use the products to help identify data requirements and test architectures.

(2) The DODAF helps the evaluator structure requirements in operational, system, and technical views. In both the operational and system views, the DODAF structures requirements by interoperability or adjacent network entities, by data flow, by activity (i.e., task or function), and by time phase.

(a) OV products provide descriptions of the tasks and activities, operational elements, and information exchanges required to accomplish military missions. OVs specify operational nodes (e.g., Division Main, Brigade Tactical Operations Center, and Staff Sections), information exchanges, activities or tasks, and the phasing or ordering of the activities or tasks.

(b) System View (SV) products provide graphical and textual descriptions of systems and system interconnections that provide or support a required military task described in the OVs. SVs show system nodes (e.g., ER/MP Unmanned Aerial System, Maneuver Control System) or components, system functions, and the mapping of operational activities and capabilities to system functions.

(3) The following bulleted list shows the views most pertinent to T&E.

(a) Operational Views.
o OV-1 High Level Operational Concept Graphic - High level graphical and textual description of the operational concept (high level organizations, missions, geographic configuration, connectivity, etc.).
o OV-2 Operational Node Connectivity Description - Operational nodes, activities performed at each node, and connectivity and information flow between nodes.
o OV-3 Operational Information Exchange Matrix - Information exchanged between nodes and the relevant attributes of that exchange such as media, quality, quantity, and the level of interoperability required.
o OV-4 Organizational Relationships Chart - Command, control, coordination, and other relationships among organizations.

o OV-5 Operational Activity Model - Activities, relationships among activities, inputs, and outputs. In addition, overlays can show cost, performing nodes, or other pertinent information.
o OV-6c Operational Event-Trace Description - One of the three products used to describe operational activity sequence and timing; it traces the actions in a scenario or critical sequence of events.
o OV-7 Logical Data Model - Documentation of the data requirements and structural business process rules of the Operational View.

(b) System Views.
o SV-2 Systems Communications Description - Physical nodes and their related communications laydowns.
o SV-4 Systems Functionality Description - Functions performed by systems and the information flow among system functions.
o SV-5 Operational Activity to Systems Functionality Traceability Matrix - Mapping of system functions back to operational activities.
o SV-6 Systems Data Exchange Matrix - Detailing of data exchanges among system elements and applications allocated to system elements.
o SV-11 Physical Schema - Physical implementation of the information of the Logical Data Model.

c. Army Universal Task List and Universal Joint Task List (AUTL and UJTL). The AUTL and UJTL describe operational missions or tasks and measures corresponding to those missions or tasks. AUTL tasks are called Army Tactical Tasks (ARTs).

(1) The AUTL categorizes ARTs and corresponding measures into six Warfighting Functions (WFs) and a group of tasks called Tactical Mission Tasks (TMTs). Tasks at the WF level must be executed in order to accomplish the tactical mission tasks. Tactical mission tasks are performed by units while conducting tactical operations (i.e., offensive, defensive, stability, or support operations). Tactical mission tasks describe specific results or effects the commander wants to achieve or activities of friendly forces. Tactical mission tasks are combined arms in nature and do not fall under a specific WF. All other ARTs are WF-specific. The UJTL groups tasks by level of war and, at each level of war, by task areas. The UJTL describes three levels of war: strategic, operational, and tactical. At the tactical level of war, tasks are joint in nature and are called Tactical Tasks (TAs). Examples of WF missions and measures are provided in text boxes later in this chapter.

(2) Measures of Mission Accomplishment. In accordance with the AUTL, mission accomplishment refers to the completion of one or more tactical mission tasks required to successfully complete a tactical operation. The most critical measure for all TMTs is whether the mission was accomplished.

The AUTL lists five additional generic measures of mission accomplishment. In order for a mission to be successful, the following must be true:

(a) The operation was consistent with the higher Commander's intent statement of what the force must do.
(b) The operation was carried out within the specified timeline.
(c) Fratricide and collateral damage were within the bounds dictated by the Commander's risk assessment.
(d) The resources expended during completion of the operation were within dictated bounds.
(e) The unit remains capable of being assigned a continuing or future mission.

Example WF Tasks and Measures

ART 2.0: The Intelligence WF, ART PROVIDES INTELLIGENCE SUPPORT TO TARGETING (1-34. The intelligence officer, supported by the entire staff, provides the Commander information and intelligence support for targeting of the threat's forces and systems. It includes identification of threat capabilities and limitations. (FM 34-1) (USAIC&FH).)
o Scale: Yes/No. Measure: Identify enemy C2 nodes.
o Scale: Yes/No. Measure: Identify enemy communications systems.

ART 3.0: The Fires WF, Section II, Army Tactical Task (ART) 3.2: DETECT AND LOCATE SURFACE TARGETS (3-2. Perceive an object of possible military interest without confirming it by recognition (detect). Determine the placement of a target on the battlefield (locate). Target location can be expressed, for example, as a six-digit grid coordinate. (FM 6-20) (USAFAS).)
o Scale: Yes/No. Measure: Detect and locate all high-priority and high-payoff targets within the AO.
o Scale: Time. Measure: To locate targets during reconnaissance and surveillance of a defined target area of interest.
o Scale: Percent. Measure: Of potential targets detected to targeting accuracy during reconnaissance and surveillance.

Example Tactical Missions

ART 8.5: CONDUCT TACTICAL MISSION TASKS. Tactical mission tasks describe the results or effects the commander wants to achieve (the what or why of a mission statement).
o ART ATTACK BY FIRE AN ENEMY FORCE/POSITION. Attack by fire uses direct fires, supported by indirect fires, to engage an enemy without closing with him to destroy, suppress, fix, or deceive him. (FM 3-90) (USACAC)
o ART BLOCK AN ENEMY FORCE. Block denies the enemy access to an area or prevents his advance in a direction or along an avenue of approach. (FM 3-90) (USACAC)
o ART BREACH ENEMY DEFENSIVE POSITIONS. Breach employs all available means to break through or secure a passage through a defense, obstacle, minefield, or fortification. (FM 3-90) (USACAC)

Figure 3-4. Example Warfighting Tasks and Measures

(3) A combined view of the DODAF and UJTL/AUTL is given in figure 3-6. This figure shows how the AUTL and UJTL contribute to the DODAF operational and system views. The DODAF OVs as well as the AUTL and UJTL address tasks and measures involved in accomplishing various missions given to units or individuals. The DODAF SVs address specific system tasks and measures (i.e., measures of system performance).

Figure 3-5. Relationships

DODAF Operational View:
o UJTL (Army, Air Force, Navy): Strategic; Operational (Joint Task Force); Tactical (Joint Task Force Land Component Command).
o AUTL (Army at the Tactical Level: DIV, BDE/BCT, BN/CAB):
  - Tactical Mission Task: 5 generic measures (e.g., Accomplish mission, Y/N).
  - Warfighting Function Task: Various measures based on zero, one, or more systems (e.g., Detect, Y/N, based on zero, one, or more systems).

DODAF System View:
o System Task: Measure based on the performance of a specific system (e.g., Detect, Y/N, based on a specific EO system).

Figure 3-6. A Combined View of the DODAF and AUTL/UJTL

(4) Associating a Unit's Mission with the AUTL and UJTL. A useful strategy for identifying the functions or tasks that a system must help perform, as well as measures of those tasks, is to associate the unit's mission that a system is intended to support with the missions and tasks given in the AUTL and UJTL. By associating a system with one or more tactical mission tasks and one or more warfighting functions, the AUTL can be used to help determine the tasks or functions a system must perform and can therefore be used to determine MOEs and MOPs.

d. Pattern of Analysis (PoA).

(1) A PoA is a table or outline that shows the data requirements relevant to the measures and issues addressed in the system evaluation. A PoA traces the T&E issues given in the SEP to their associated criteria, measures, and data requirements. The evaluator must work with the tester to ensure a common understanding of the required measures and data requirements.

(2) The PoA is important for several reasons. First, it is a formal way to identify the data requirements relevant to the set of measures used to address the evaluation issues; second, these data requirements identify the variables that are necessary for the Evaluator's data sets used to analyze data collected at test events; and third, the PoA can be used to ensure the variables relevant to the measures are traceable to the system data model. This in turn ensures that the data model is complete and all data requirements are properly defined.

(3) Without decomposing the measure into specific data requirements, as is required during PoA development, the data requirements provided to the tester will not be clear. If confusion results in delays or in the inability to evaluate a capability (for example, the capability to detect targets), the completeness and cost effectiveness of the evaluation will be adversely impacted for no good cause. A well-executed PoA removes much of this ambiguity.

(4) Development of a PoA. A PoA is developed by:
(a) Identifying the evaluation issues.
(b) Identifying all the measures that will be used to address the issues.
(c) Decomposing each measure into derived and primary data requirements.
(d) Defining the data requirements.
(e) Identifying the test conditions under which the measures will be taken (i.e., through identification of all the variables that affect the measures).
(f) Mapping the primary data requirements to the system data model.
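A notional PoA entry is sketched below to show how a single measure decomposes into derived and primary data requirements, together with the conditions under which the data will be collected. All names are hypothetical.

```python
# Illustrative sketch: one notional Pattern of Analysis (PoA) entry tracing an
# issue to its criterion, measure, data requirements, and conditions. All
# names are hypothetical.
poa_entry = {
    "issue": "Does System XYZ accurately detect enemy targets?",
    "criterion": "Probability of detection > 0.91 at ranges under 2 miles",
    "measure": "Pd = number of targets detected / total targets presented",
    "derived_data_requirements": [
        "number of targets detected",
        "total targets presented",
    ],
    "primary_data_requirements": [
        "target_id", "target_presented", "target_detected",
        "detection_time", "target_range",
    ],
    "factors_and_conditions": ["terrain", "time of day", "target type"],
    "data_model_tables": ["TARGET", "DETECTION_EVENT"],
}

for field, value in poa_entry.items():
    print(f"{field}: {value}")
```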

e. LENS. Lens charts, approved at the ESR, are used to investigate evaluation areas, usually at three or four lens levels: component functionality, system or SoS, FoS, and overall mission effectiveness. The AST may further decompose the evaluation of functional areas into sub-areas and measures, as shown in figure 3-7.

(1) The lenses making up a lens chart are best regarded as three or four distinct categories of measurement covering component level testing (DT), system or SoS level testing, FoS testing, and mission level measurements.

(2) Recall that SoS is defined in AR 70-1 as "a set or arrangement of interdependent systems that are related or connected to provide a given capability," and that an example of an SoS is a network of interdependent information systems. Other examples include the Warrior Extended Range Multi-Purpose (ER/MP) Unmanned Aerial System (UAS) and the Excalibur. Testing at this level may address interoperability or networking, as would be the case if we were evaluating the Army Battle Command System (ABCS). A FoS is defined as "a set or arrangement of independent systems that can be arranged or interconnected in various ways to provide different capabilities." An example of a FoS is a unit of action that includes armor, infantry, artillery, and combat support systems. As such, testing at this level will include addressing interoperability or networking (local or wide area) functionality. It is system level testing aimed at assessing a system's capability to interface with other systems.

(3) Measures of mission accomplishment must be defined independently of measures of SoS or FoS networking and measures of other functional areas (system level testing minus local and wide area networking). If the same or similar measures occur at two or more lenses, redundant measures and confusion will result. The highest order lens may be regarded in terms of measures of tactical mission accomplishment or perhaps higher level Army tasks. Lower level lenses may be regarded in terms of measures of component or system performance. Notice that, with or without lens charts, individual evaluation report products will address ESS, functional areas, and progress in meeting criteria given in the COIC and KPPs, as well as tactical mission accomplishment.

(4) In addition to the functions called out in the evaluation issues (e.g., KPPs, COIC), the strategy of associating a unit's mission with missions and tasks in the AUTL and UJTL is a useful aid in identifying tasks. Some systems, such as intelligence and communications systems, support joint operations. For these systems, the UJTL may be a useful guide. Generally, however, the AUTL is used as an aid in identifying tasks to serve as the context for evaluation.

Figure 3-7. Lens Chart Example

(5) Additional specific tasks for team and collective tasks can be drawn from the appropriate Army Training and Evaluation Program (ARTEP) manuals. Such tasks are most appropriately included at the platform level. Also of note in this example is the integration of modeling and simulation to address the lethality of the munitions at the platform and SoS levels.

(6) As the AST completes the lens charts, it should indicate which of the tasks, issues, and measures contain or are KPPs, COIs, or other formal requirements.

f. Mission-Based Test and Evaluation (MBT&E). As an organization, the Army is inherently mission-based. The value of all doctrine, tactics, materiel solutions, and training efforts tends to be measured in terms of their contributions to mission success. In practice, this determination is not always made. In the test and evaluation of Army systems there is a tendency to pass or fail a system based on specification compliance.

(1) The purpose of MBT&E is to focus test and evaluation on mission analysis and a system's contribution to bringing about the effects intended by the higher commander.

(2) The focus on mission analysis and a system's contribution to accomplishing the mission is intended to preclude evaluations based only on the extent to which a system meets key performance parameters and the criteria accompanying critical issues. In addition to evaluating how well system performance meets standards, the evaluator must also evaluate the system's contribution to accomplishing the overall mission.

(3) The adoption of MBT&E is not a matter of choosing one basis for evaluation over another, but of ensuring that the impact on mission success is not neglected. Whether COICs are answered and improvements in system functionality are demonstrated is still important to the customer and, hence, to the evaluator. But the ultimate measure of acquisition success from the perspective of the Warfighter will always be how well the system contributed to meeting mission objectives safely and effectively.

(4) MBT&E is a procedure for analyzing unit missions, analyzing systems supporting unit missions, and measuring unit and system effectiveness. MBT&E is characterized by, and can be defined in terms of, the following processes: mission analysis, system analysis, system to mission mapping, and measure development.

(5) Mission Analysis. Mission analysis starts with the higher Commander's mission statement, intent, and concept of operation. A mission is a task given to an individual or unit. A mission statement describes the task(s), its purpose, and the reason for undertaking the task. The statement contains the elements of who, what, when, where, and why. The Commander's intent describes what constitutes successful task execution. The concept of operation describes how to accomplish the task. The higher Commander's intent, along with the mission statement, is restated in terms of missions and tasks given to subordinates (FM 5-0, paragraph 3-15).

(6) Tactical Levels of Missions and Tasks. Higher Commanders issue operation orders (OPORDs) to subordinate Commanders. The orders include the mission statement, the Commander's intent, and the concept of operation. FM 3-0, Operations, states that the components of full-spectrum operations are offensive, defensive, and stability or civil support operations, along with associated primary tasks undertaken to defeat the enemy and achieve the Commander's end state. These operations, along with the primary tasks, have three levels of subordinate tasks called Tactical Mission Tasks (TMTs), warfighting function tasks (WFTs), and system tasks. Figure 3-9 depicts the relationship between the levels of missions and tasks.

(7) System Analysis and System Tasks. Any Army organization, regardless of branch or echelon, performs tasks related to one or more of the WFs. At the lowest level these are known as system tasks: a system task is an individual task performed by an operator on a specific system in support of a WFT, for example, employing the Paladin howitzer to destroy the enemy by direct fire. System Field Manuals (FMs) and system maintenance manuals contain these tasks. System components (for example, optical lenses that allow an operator to detect targets) enable these low-level tasks.

(8) During system analysis, system components, functions, and tasks are identified. For example, the Warrior UAS has EO/IR, laser, and missile components. The EO/IR components have detect, locate, and identify functions; the laser component has a guidance function; and the missile has a kill function. Each component has associated operator tasks.

(9) System to Mission Mapping. In this step, the subordinates' missions and tasks are related to system tasks performed by an operator. This results in a mapping of system components and functions to different levels of missions and tasks. Figure 3-8 provides an example of mapping the mission and tasks to system and operator functions.

(10) This procedure is a detailed decomposition of missions into tasks and subtasks, a decomposition of the system into components, functions, and tasks, and a mapping of the mission tasks and subtasks to system components and functions. Mission decomposition can often be performed in terms of the missions and tasks given in FM 7-15, The Army Universal Task List (AUTL).

Figure 3-8. Mapping of System and System Functions to Tasks

(11) Measure Development. Measures should be developed to assess the following:
(a) System performance.
(b) Mission accomplishment or accomplishment of tactical tasks (unit capability).
(c) The contribution of system performance to mission accomplishment.

(12) System performance can usually be measured quantitatively. Measuring mission accomplishment and the impact of system performance on mission accomplishment is often done using military judgment (for example, Soldiers measure the degree of mission accomplishment using a 5-point rating scale).
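The system-to-mission mapping can be held as a simple nested structure. The sketch below is a notional example for a reconnaissance and attack unmanned aircraft; the task names are illustrative and are not authoritative ART citations.

```python
# Illustrative sketch: a notional mapping of system tasks to warfighting
# function tasks (WFTs) and a tactical mission task (TMT). Task names are
# examples only.
mapping = {
    "TMT: Attack by fire an enemy force/position": {
        "WFT: Conduct reconnaissance": [
            "System task: Employ the EO/IR sensor to detect targets",
            "System task: Employ the EO/IR sensor to locate and identify targets",
        ],
        "WFT: Conduct lethal direct fire against a surface target": [
            "System task: Designate the target with the laser",
            "System task: Launch a missile against the designated target",
        ],
    },
}

for tmt, wfts in mapping.items():
    print(tmt)
    for wft, system_tasks in wfts.items():
        print("  " + wft)
        for task in system_tasks:
            print("    - " + task)
```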

OPORD with mission statement, etc., given to a combined arms unit.
o Operation: Offensive, defensive, and stability or civil support operations and primary tasks undertaken to defeat the enemy and achieve the commander's end state. (FM 3-0, p. 3-7; FM 7-15, Chapter 8) (Unit capability)

OPORD with mission statement, etc., given to a combined arms unit.
o Tactical Mission Tasks: An activity performed by a combined arms unit, e.g., Attack by Fire an Enemy Force/Position (ART 8.5.1). (Unit capability)

Commander assigns tasks verbally to a smaller unit.
o Warfighting Function Tasks: A collective task that supports a tactical mission task, e.g., Conduct Surface to Surface Attack (ART), Conduct Lethal Direct Fire against a Surface Target (ART 2.4.1). (Unit capability)

Commander assigns tasks verbally to a small unit or an operator.
o System Tasks: An individual task performed by an operator on a specific system, e.g., Employ the Paladin howitzer to destroy the enemy by direct fire (tasks found in system field manuals or maintenance manuals). (System functionality)

Supporting definitions:
o Capability: The ability (i.e., potential) to achieve the desired effect under specified standards and conditions through combinations of means and ways to perform a set of tasks. (CJCSM, p. GL-5; parenthetical added)
o Concept of Operation: A statement that directs the manner in which subordinate units cooperate to accomplish the mission and establishes the sequence of action a force will use. (FM 3-0, p. 5-11)
o Systems are characterized by system attributes, which include components (e.g., optical lens, radar) and functions. A function is an action for which a system is designed and used (e.g., detect targets).
o Mission: The mission is the task, together with the purpose, that clearly indicates the action to be taken and the reason therefor. In common usage, especially when applied to lower military units, a duty assigned to an individual or unit; a task (JP 1-02). (FM 5-0)
o Warfighting Function: A group of tasks and systems (people, organizations, information, and processes) united by a common purpose that commanders use to accomplish missions. (FM 3-0, p. 4-3)

Figure 3-9. Tactical Levels of Missions and Tasks

(13) Scaling MBT&E. System evaluations will always address key performance parameters and the criteria associated with critical issues. The number of primary tasks, tactical mission tasks (TMTs), and warfighting function tasks (WFTs) addressed in the evaluation strategy needs to be established based upon the complexity of the system under evaluation and the resources available for testing.

(14) In most cases (ACAT III systems), TMTs are not assessed in a system evaluation. A TMT would only be assessed if a FoS (consisting of two or more systems from two or more WF areas) were played or simulated as part of the test scenario.

(15) If a FoS is under evaluation, at least one TMT and two or more WFTs should be considered. For example, if ABCS, the Abrams M1A2, and the Paladin M109A6 were a FoS under assessment, the evaluator may want to consider assessing the TMT direct fire (ART 7.5.1) and the WFTs command and control (ART 5.0), direct fire (ART 1.4.1), and indirect fire (ART).

(16) If a SoS with multiple WFs is under assessment, at least two WFTs need to be considered. For example, the Warrior UAS supports two WFs: the reconnaissance, surveillance, and intelligence WF (ART 2.3) and the direct-fires WF (ART 1.4.1).

(17) If a system has a single major WF, at least one WFT needs to be considered. For example, the FMTV 5-ton Cargo Truck has the primary function of surface transportation of cargo and equipment (ART). Note that if the FMTV 5-ton Cargo Truck is the only cargo truck being assessed, its assessment is also the assessment of the WF.

(18) The above three cases are depicted in table 3-6. Examples of systems and the corresponding WFTs that should be considered for evaluation are given in table 3-7.

Table 3-6. Three Categories of Systems and Minimum WFT/TMT Evaluation Considerations
o System: Capability Sets/Packages (for example, FCS spin-outs). Size of Tested Unit: Combined-Arms Unit. Minimum Operational Tasks: 1 TMT and 2 WFTs from different WF categories.
o System: Multi-WF SoS (for example, Warrior and ARH). Size of Tested Unit: Smaller Unit. Minimum Operational Tasks: 2 WFTs from different WF categories.
o System: Single-WF System (for example, M2 HB Machine Gun, FMTV 5-ton Cargo Truck, Small UAS, ABCS). Size of Tested Unit: Smaller Unit. Minimum Operational Tasks: 1 WFT.
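The minimum-task considerations in table 3-6 amount to a small lookup, sketched below. The category labels are hypothetical shorthand rather than ATEC terminology, and the rules are paraphrased from the table.

```python
# Illustrative sketch: the minimum WFT/TMT considerations from table 3-6 as a
# simple lookup. Category keys are hypothetical shorthand.
MINIMUM_TASKS = {
    "capability_set":   "1 TMT and 2 WFTs from different WF categories",
    "multi_wf_sos":     "2 WFTs from different WF categories",
    "single_wf_system": "1 WFT",
}

def minimum_operational_tasks(system_category: str) -> str:
    """Return the minimum TMT/WFT consideration for a system category."""
    return MINIMUM_TASKS.get(system_category, "unknown category")

print(minimum_operational_tasks("multi_wf_sos"))   # e.g., a Warrior-class SoS
```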

Table 3-7. Example Systems and Corresponding WFTs
o Warrior UAS: ART Reconnaissance; ART Lethal Direct Fire.
o ARH: ART Reconnaissance; ART Lethal Direct Fire.
o MLRS: ART Conduct Surface to Surface Attack; ART 1.2 Conduct Tactical Maneuver.
o Paladin: ART Conduct Surface to Surface Attack; ART 1.2 Conduct Tactical Maneuver.
o Small UAS: ART Reconnaissance.
o ABCS: ART Manage Tactical Information.
o FMTV 5-ton Cargo Truck: ART Move by Surface.
o M2 Machine Gun: ART Lethal Direct Fire.

g. Design of Experiments (DOE). An evaluator is interested in assessing system performance to determine system capabilities and limitations. In addition to determining capabilities and limitations, an evaluator should also be able to explain why a system performed well or poorly. A good test design provides the evaluator with the information needed to explain why a system performed well or poorly during test. In order to do this, the evaluator must be able to identify the conditions impacting system performance and be able to assess performance differences.

(1) The primary purpose of evaluating system performance with quantitative techniques is to produce an estimate of future system performance once the system is deployed in a combat circumstance. This estimate is an assessment of expected system performance, based on test results. However, since the test results are only an estimate of system performance, there is risk associated with the possibility that the test results are not representative of future system performance. Therefore, the test estimate must be matched with a corresponding confidence interval.

(2) This confidence interval represents a range of estimated values predicted for actual system performance. The purpose of any confidence interval is to mitigate the risk associated with the test estimate. Accompanying the confidence interval is a level of confidence, which is the probability that the actual system performance will fall between the lower and upper bounds of the confidence interval. The level of confidence should be 90 percent or higher. The purpose of a high level of confidence is so ATEC can strengthen the conclusion from test results by stating ATEC is at least 90 percent sure the actual system performance will fall within the stated range of values.

(3) The analyst is able to identify the conditions impacting performance by using a design of experiment, and is able to assess performance differences using hypothesis testing. Hypothesis testing is a formalized method for posing questions in such a way that statistical techniques can be applied to the collected data, so that detailed, unambiguous, and authoritative conclusions can be drawn. The purposes of a design of experiment are to isolate the effects of variables and to enable tests of hypotheses about their effects. Understanding these concepts is a prerequisite to designing an experiment.

(4) There are four general steps in the use of a design of experiment. The first is the development of the design. The design of experiment must be developed by, and have the concurrence of, the entire AST. This step must also take into account the need to accurately simulate the expected operational environment of the system. The second step is the execution of the test event in accordance with the design of experiment. The design of experiment developed by the AST must be incorporated into the test event. Third, the data generated at a test event must be captured and housed in a database. In addition, the conditions under which the data are generated must be linked with the measures of system functionality (i.e., the dependent variables). Test conditions must be linked with the measures so that data sets may be easily populated. The fourth step is to create data sets and analyze the data using hypothesis testing.
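As an illustration of the four steps, the sketch below builds a small full-factorial design (two factors, two levels each, three replications), attaches notional responses as if the test had been executed, and tests the hypothesis that one factor affects the response. The factors, data, and test choice are hypothetical; it is a minimal sketch, not an approved design of experiment.

```python
# Illustrative sketch: a 2 x 2 full-factorial design with replication and a
# simple hypothesis test on one factor. Factor names and responses are
# notional.
import itertools
from scipy import stats

factors = {
    "terrain": ["open", "wooded"],
    "time_of_day": ["day", "night"],
}
replications = 3

# Steps 1-2: build the design (every combination of factor levels, replicated)
# and execute the test event in that order.
design = [
    dict(zip(factors, levels))
    for levels in itertools.product(*factors.values())
    for _ in range(replications)
]
print(f"{len(design)} trials planned")   # 2 x 2 x 3 = 12

# Step 3: captured responses (e.g., detection range in km), one per trial,
# linked to the conditions under which each was generated.
responses = [1.9, 2.1, 2.0, 1.2, 1.1, 1.3, 1.7, 1.8, 1.6, 0.9, 1.0, 0.8]

# Step 4: create data sets by condition and test a hypothesis: does time of
# day affect detection range?
day = [r for trial, r in zip(design, responses) if trial["time_of_day"] == "day"]
night = [r for trial, r in zip(design, responses) if trial["time_of_day"] == "night"]
t_stat, p_value = stats.ttest_ind(day, night)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # a small p suggests a real effect
```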

h. Measures of Effectiveness, Measures of Performance, and Measures of Association. The JCIDS manual specifies that "the CPD defines a single increment of the performance attributes (key performance parameters, key system attributes, and other attributes) to support a MS C decision." The measures used are appropriately referred to in the context of performance, as in key performance parameters, or measures of performance. The performance attributes are often what the PM is required to deliver. But they are not the military effect or measure of operational effectiveness required for achieving the primary purpose of OT&E. Rather, they are generally system-particular performance parameters. As such, they should be referred to as measures of performance (MOPs) and not measures of effectiveness (MOEs). It is unacceptable to DOT&E and ATEC, in evaluating and reporting operational effectiveness, suitability, and survivability, to parse requirements and narrow the definition of mission accomplishment so that MOPs are confused with MOEs.

(1) A measure of effectiveness assesses the combined contribution of multiple systems. A measure of performance assesses the effects of a single system.

(a) Measure of Effectiveness (MOE): A measure of mission accomplishment, where mission refers to the assignment of one or more tactical mission tasks to a combined arms unit. A MOE can also be a measure of mission accomplishment where mission refers to the assignment of one or more tasks to a unit at the WF level. A measure of a WFT is a MOE when the WFT is supported by two or more systems. A measure of a WFT is a MOP when the WFT is supported by a single system.

(b) Measure of Performance (MOP): A measure of system functionality.

(c) Measure of Association: Assesses the contribution of MOPs to MOEs.

i. Aggregating Measures. To summarize the results of testing, measurements are aggregated or rolled up into higher-order measures.

(1) Measures reflecting the outcome of a sequence of related tasks can be expressed as simple identities. The proportion of targets destroyed, for example, depends on the proportion of targets engaged (i.e., fired upon). The proportion of targets engaged, in turn, depends on the proportion of targets detected, located, and identified. The proportion of targets destroyed is a MOE, while the proportions of targets engaged, detected, located, or identified are MOPs.

(2) Military judgment is often the default means for rolling up lower-order measures into higher-order ones. A military Evaluator derives an overall conclusion regarding the effectiveness and suitability of a system based on an assessment of the relevance and importance of issues, criteria, measures, and outcomes. In effect, the Evaluator uses an intuitive weighting system, implicitly assigns a subjective weight to each measure for each issue, and then combines the results into a conclusion. Judgments become more difficult as systems become more complex.

(3) Another option for aggregating lower-order measures into higher-order ones is to apply a weighted linear combination of measures. The relevant T&E community explicitly assesses the importance of each issue and measure and assigns a weight to each measure. The weights, though explicit, are still produced intuitively and can be challenged.

(4) Military judgment is often combined with a third approach in which measures are categorized by a functional dendritic. The presumption is that changes in lower-level measures will be reflected in changes in higher-level measures.
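The two quantitative roll-up approaches described above, a simple identity and a weighted linear combination, can be sketched briefly. All values and weights below are hypothetical.

```python
# Illustrative sketch: rolling lower-order measures up into higher-order ones.
# The proportions, measures, and weights are notional.

# 1. A simple identity: the proportion of targets destroyed decomposed into a
#    chain of conditional proportions.
p_detect = 0.90                 # targets detected / targets presented
p_engage_given_detect = 0.80    # targets engaged / targets detected
p_destroy_given_engage = 0.75   # targets destroyed / targets engaged
p_destroy = p_detect * p_engage_given_detect * p_destroy_given_engage
print(f"Proportion of targets destroyed = {p_destroy:.2f}")

# 2. A weighted linear combination: explicit (but still judgment-based)
#    weights applied to normalized lower-order measures.
measures = {"lethality": 0.85, "mobility": 0.70, "reliability": 0.60}
weights = {"lethality": 0.5, "mobility": 0.3, "reliability": 0.2}
rollup = sum(weights[name] * score for name, score in measures.items())
print(f"Weighted roll-up score = {rollup:.2f}")
```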

j. Rock Drills. Rock drills are conducted for all ACAT I and ATEC CG/ED designated systems. The rock drill will include the determination of the task attributes (schedule, timelines, dollars, work involved, responsibility, resources, instrumentation, etc.), relationships with or links to or from other tasks, responsible agencies or personnel, and other data, as appropriate. It will identify issues and disconnects. The rock drill process culminates in an integrated schedule that ATEC, across the command, will commit to, execute, and manage to as a baseline. A rock drill may be held for non-oversight programs as required. See appendix P for additional information.

3-7. Evaluation planning reviews and documentation
a. Preliminary Early Strategy Review (ESR). A preliminary ESR is an early discussion with the ATEC technical leadership of what the test and evaluation strategy should contain. The AST may not yet have in-depth details but should be able to discuss the concept of operations, key mission tasks, COIC/KPPs and their user rationale, the analysis used in the AoA, and important factors and conditions. Follow-up discussions will be scheduled as needed. A preliminary ESR will be held for all ACAT I programs as well as all programs on the OSD Oversight List. AEC may hold a preliminary ESR on other programs as required. A sample agenda for the preliminary ESR could include such items as:

(1) System Description.
(2) Concept of Operations.
(3) Key Mission Tasks.
(4) COIC/KPPs.
(5) Potential Data Model.
(6) User Rationale.
(7) AoA information.
(8) Factors/conditions.

b. Early Strategy Reviews (ESRs). Now that the evaluation team has developed a robust evaluation outline, it is time for them to obtain ATEC leadership guidance and approval. The forum for approval of the evaluation methodology is the ESR. This review will not be a PowerPoint presentation sent to ATEC leadership for approval but will be scheduled as face-to-face discussions between the AST and the appropriate approval authority. In cases where a face-to-face meeting is not possible, a video teleconference is an acceptable method of communication.

(1) The ESR is an ATEC internal open meeting chaired by the ATEC CG/ED or designee (for oversight systems) or by the Director, AEC, or designee (for non-oversight systems). All SCA senior leadership and AST members will be invited to participate in the ESR. As appropriate, the Capabilities Developer or the Program Manager (PM) may be invited to participate in the ESR. This could occur if there are outstanding issues that must be resolved in order to execute the evaluation strategy.

(2) By 90 days after the AST standup (when each SCA identifies the AST members), the AST Chair should set a date for the ESR, noting the date in ADSS and sending a Microsoft (MS) Outlook meeting request to the ESR Chair (ATEC CG/ED or AEC Director). Microsoft Outlook is a common tool across the ATEC command. The targeted timeline for conducting an ESR is 180 days after establishment of an AST. This is only a planning timeline; the actual date may vary based on the approved acquisition strategy and program status.

(3) Before formal development of the ESR briefing, the evaluation team first briefs the rest of the AST on the draft evaluation outline and strategy and addresses all comments. Although AEC is the evaluator, and the lead for the evaluation strategy, it is important to have tester understanding and buy-in early. While it is important for the AST to have a consolidated strategy at the ESR, the plan should not be set in stone; the reason for this briefing is to obtain leadership guidance, and ASTs should be prepared to modify their strategy as a result of guidance given at the meeting. Additionally, the Evaluator should present the evaluation strategy to the T&E WIPT for comment prior to conduct of the ESR.

(4) With AEC in the lead, the AST prepares the ESR briefing.

(a) Although the briefing concentrates on the evaluation strategy, it contains general information, such as the AST composition, system description, mission description, key documentation status, COICs, KPPs, program schedule, and evaluator concerns. If special instrumentation or long-lead item requirements are expected, they should be included in the briefing. ASTs will use the same instrumentation technology resources for both DT and OT events, but not at the expense of compromising data requirements.

(b) The ESR is not the place to discuss specific test events or data collection methodologies. If the PM has identified potential developmental or operational test events in his program schedule, acquisition strategy, TES, or TEMP, include them on the program schedule chart, but prior to the ESR the AST should not participate in any event planning if possible.

program schedule, acquisition strategy, TES, or TEMP, include them on the program schedule chart, but prior to the ESR the AST should not participate in any event planning if possible.
(c) The briefing focuses on the evaluation strategy, the lens chart, the potential data model being used, and the information and methodologies that must be identified. The purpose of this meeting is to get high-level advice on and approval of the evaluation strategy and general data requirements; the T&E CIPR is the meeting where HOW the required data will be collected is addressed.
(d) The briefing includes a schedule for the production of documentation to support the program and evaluation requirements. Initially, this schedule may include the system level documentation based upon the known requirements in the milestone decision review process. Once approved at the ESR, the milestones will be recorded in the ATEC System database, currently ADSS. ATEC DCSOPS and the AST members will monitor completion of required documentation and take actions as required to avoid late submissions.
(e) A format is provided in appendix F. An example agenda for an ESR is provided in figure 3-11, below:

ESR Example Agenda

Introduction
- AST Composition
- System Description (include any special test configurations)
- Mission Description
- Operational and Organizational (O&O)
- Program Schedule
- Key documentation status

Evaluation Concepts and Strategy
- Lens Chart(s)
- MBT&E approach
- DOE

Evaluator Concerns
- Priority measures
- Kinds of information that must be captured for the evaluation (Performance, RAM, Survivability, ILS, MANPRINT)
- Methodologies that are necessary to capture the required data
- Analysis plan for treatment of the data to build the system evaluation
- Common instrumentation for both DT and OT
- Data Model with agreed upon definitions of data elements as well as data relationships within the evaluation data model
- High risk areas
- Limitations
- Safety Issues
- Other concerns

Other Items as Necessary (include those that are applicable)
- Draft T&E milestones (mandatory)
- Draft Part III and input to Part IV of the TEMP (mandatory)
- Special instrumentation/long-lead item requirements (if known)
- Potential high cost items
- Potential unusual or lengthy event requirements

Figure 3-11. ESR Sample Agenda

(5) AST Chair responsibilities for the ESR include coordination of the ESR agenda; ensuring the ESR is on the Commander's (through ATEC DCSOPS) or Director's (through AEC Command Group) calendar; and inviting SCA senior leadership and non-ATEC attendees as appropriate.
(6) At the conclusion of the ESR and approval of the evaluation strategy, a designated member of the AST will prepare and distribute the minutes of the ESR.

105 (7) The tester members of the AST draft the T&E Concept In-Process Review (CIPR) briefing. The T&E CIPR gains leader approval of the test strategy designed to collect all data required to support the evaluation strategy approved at the ESR, and is generally conducted 30 days after the ESR. (8) The AST prepares the initial draft System Evaluation Plan (SEP). The AST, led by the evaluator, develops the detailed measures, factors and conditions, sample size requirements, and other significant information. The strategy forms the basis for development of supporting event requirements in Chapter 4 of the SEP. (9) The AST will begin to finalize input to the TEMP. (10) If at any time after the ESR is approved significant programmatic changes occur which will impact the approved T&E strategy, the AST should regroup, adjust the T&E program, and schedule an In-Process Review (IPR) to get ATEC leadership approval. The TEMP may require updates, as well. c. T&E Concept In-Process Review (T&E CIPR). With an approved evaluation strategy and the basic data requirements laid out, the AST can now finalize tests and other data collection events. The purpose of the T&E CIPR is to obtain leadership approval of the basic testing/simulation strategy. (1) The T&E CIPR is conducted in a similar manner as the ESR. It is an open meeting chaired by the ATEC CG/ED or designee (for oversight systems) or by the AEC Director or designee (for non-oversight systems). All SCAs Senior Leadership and AST members will be invited to participate in the T&E CIPR. If needed, representatives from other organizations may be invited (2) By 30 days prior to the ESR, the AST Chair should schedule the T&E CIPR, noting the date in ADSS and sending a MS Outlook meeting request to ESR Chair (ATEC CG/ED or AEC Director). The targeted timeline for conducting a T&E CIPR is 30 days after the ESR. The T&E CIPR date will generally be briefed as part of the ESR. (3) Before formal development of the T&E CIPR briefing, the AST will meet to discuss the types of events which would be best suited to generate the required data for the evaluation. The AST will develop a strategy that avoids duplication of testing and makes optimal use of limited T&E resources, integrating DT with OT if possible. Once the AST has a general test framework, the Tester members of the AST each build and expand the details for each of the events. The testers then brief the whole AST and address all comments. This generally occurs prior to the ESR. Likewise, the overall test strategy should be presented to the T&E WIPT for comment prior to the T&E CIPR. (4) The AST should not move too far forward with event planning until after the T&E CIPR. The test strategy should remain flexible until it has leadership inputs and approval granted at the T&E CIPR. (5) Once the AST has a consolidated test strategy, the tester members lead the AST in preparing the briefing for the T&E CIPR. The principal portion of the briefing should present ATEC Pamphlet June

the AST's proposed strategy for conduct of events to provide the data required for the evaluation. The focus of the briefing should be on the identification of the proposed events and discussion of the allocation of requirements to the events. The Introduction to the briefing will include an abbreviated Data Source Matrix (DSM) which maps the evaluation dendritic approved at the ESR to the proposed test events. See paragraph 3-7d(4)(f) below for a further discussion of DSMs.
(6) A format for the DSM is provided in figure 3-13 and in appendix F. A sample agenda for a T&E CIPR is provided in figure 3-12 below.

T&E CIPR Sample Agenda

Introduction (Usually briefed by AST Chair)
- Purpose
- AST Membership
- Mission Description
- System Description (include any special test configurations)
- Overall Program Schedule
- Summary of ESR
- Abbreviated Data Source Matrix

DT Events (Usually briefed by DTC Test Manager)
DT Event 1
  o Type of Event
  o Description/Concept
  o Entrance Criteria
  o Measures
  o Methodology
  o Limitations
  o Instrumentation
  o Safety Issues
  o Model/Simulations
  o Milestones
DT Event 2
...
DT Event N

Figure 3-12. T&E CIPR Sample Agenda

OT Events (Usually briefed by OTC test officer)
OT Event 1
  o Type of Event
  o Description/Concept
  o Entrance Criteria
  o Measures
  o Methodology
  o Limitations
  o Instrumentation
  o Model/Simulations
  o Milestones
  o Issues
  o Major variables, controls, and treatments for the event
  o Execution, to include schedules, unique event phases, tactical context, threat, and test situation concept
  o Support package status
OT Event 2
...
OT Event M

Other Events (Each event usually briefed by event director/tester)
Event 1
...
Event P

Conclusions (Usually briefed by AST Chair)
- Concerns
- High risk areas
- Limitations
- Unusual requirements
- Other items deemed necessary
- Input to system TEMP
- Cost (transparent)

Figure 3-12. T&E CIPR Sample Agenda (Continued)

(7) The AST Chair responsibilities for the T&E CIPR include coordination of the agenda; ensuring the meeting is on the Commander's (through ATEC DCSOPS) or Director's (through the AEC Command Group) calendar; inviting SCA senior leadership; and inviting non-ATEC attendees as appropriate.

(8) At the conclusion of the T&E CIPR and approval of the data collection strategy:
(a) A designated member of the AST will prepare and distribute minutes of the meeting.
(b) The AST will refine the draft SEP and begin drafting the OTA Test Plans (OTA TPs) for each of the events approved in the T&E CIPR. The Evaluator has the lead for the SEP, and the Tester has the lead for each OTA TP.
(c) The AST Chair will refine input to the system TEMP. The AST Chair will finalize input to Parts III, IV, and V of the TEMP and forward them to the PM.
(9) If at any time after the T&E CIPR significant programmatic changes occur which will impact the approved T&E strategy, the AST should regroup, adjust the T&E program, and schedule an In-Process Review (IPR) to get ATEC leadership approval. The TEMP may require updates as well in this event.
(10) The ESR and T&E CIPR may be combined into a single meeting with the prior approval of the ESR/T&E CIPR Chair.
d. System Evaluation Plan (SEP). The intent of the SEP is to document ATEC's plan for executing the ESR- and CIPR-approved integrated system T&E strategy. The SEP is prepared by the AST, with different AST members leading development of each chapter and appendix. The plan describes the method for assessing system effectiveness, suitability, and survivability, and for evaluating the contribution of the system to overall mission capability. The SEP also describes how ATEC will identify system capabilities and limitations, performance risks, and impact on mission capability.
(1) The SEP is focused on four evaluation domains:
(a) System Effectiveness. System Effectiveness is used here to mean the ability of the system to meet user requirements such as critical criteria and KPPs.
(b) System Suitability.
(c) System Survivability.
(d) System Contribution to Mission Capability.
(2) These four domains provide the basis for evaluating capabilities and limitations at the platform, SoS, or FoS level. The evaluation dendritic is overlaid across these domains, so that each domain contains evaluation areas, supported by their evaluation focus areas and measures. The combined evaluation of these four domains provides the basis for ATEC's integrated system evaluation.
(3) For multi-service and joint programs where ATEC is not the lead OTA, the AST will provide Army-unique input to the lead OTA. Their input is documented in the Army Input to Evaluation Plan (AIEP) and also in ADSS.

(4) The SEP format is flexible; however, there is certain information a SEP should contain. There are generally four chapters and several appendixes.
(a) Chapter 1, Introduction, describes the purpose of the evaluation, a description of the system and how it will be used in the field, and a short discussion of the program milestones, major test events, and planned evaluation reports. Development of this chapter is led by the AST Chair.
(b) Chapter 2, Evaluation Overview, provides a general discussion of the evaluation strategy. Presented in chapter 2 are the KPPs, COIC, and the work breakdown structure (WBS)/functional dendritic. Displayed in the WBS are the functional areas by ESS that address the KPPs and COICs. The WBS may be supplemented with a lens chart as appropriate. If a FoS is played in the test scenario, the lens chart should be used to display Army tactical mission tasks as well as the system functional areas. See figure 3-13 below. The Army tactical mission tasks are shown on the right-hand side of the lens chart.

[Figure 3-13 is a lens chart graphic. It arrays evaluation considerations from the subsystem/system level (KPPs, COIs/AIs, COIC, ICD, CDD, CPD, and specifications) up through FoS, SoS, and joint task force or component levels; lists the supporting data sources (DT, IGT, OT, IOT, CE, certification exercises, training, ACTDs, OA/OE, unit procedures, FUE, M&S, and others); and maps the effectiveness, suitability, and survivability areas to the Army tactical mission tasks and warfighting functions, culminating in overall effectiveness, suitability, and survivability.]

Figure 3-13. Lens Chart Sample

A factors and conditions table is also presented in chapter 2. The factors and conditions table is

used to display the factors and conditions under which the system is expected to operate. It displays each factor, the conditions the factor may assume, and the method of controlling the factor. Factors may be systematically varied, randomized, held constant, tactically varied, or left uncontrolled. Randomized factors are not shown in the factors and conditions table. An example factors and conditions table for a ground based radar is given in table 3-8. Conditions systematically varied (e.g., Aircraft Type) are changed in accordance with a time-ordered events list. Conditions that are controlled by randomization (e.g., Aptitude) are assigned at random across test trials. Conditions held constant (e.g., MOPP) are the same throughout the test. Tactically varied factors (e.g., Terrain) are changed in accordance with unit tactics, techniques, and procedures and are usually uncontrolled. Uncontrolled factors (e.g., Weather) are not managed and may bias test results.

Table 3-8. Example Factors and Conditions Table for a Ground Based Radar

  Factor           Control Type             Conditions
  Aircraft Type    Systematically Varied    Fixed Wing, Rotary Wing
  Environment      Systematically Varied    Benign, Electronic Warfare
  Range            Systematically Varied    Near, Mid, Far
  Flight Profile   Systematically Varied    Transit, Attack
  MOPP             Held Constant            MOPP 0
  Weather          Uncontrolled

A high level discussion of the design of experiments and analytical techniques used to assess system performance under the different factors and conditions should also be given. The PoA, data model data dictionary, data sets, and DSM should be referenced in the appendixes. This chapter is prepared by the Evaluator.
(c) Chapter 3, Evaluation Details, provides the details of how the evaluation analysis will be conducted. At a minimum, the chapter shows the evaluation outline down to the measure level, and discusses how the evaluator plans to combine and analyze data to address ESS, EAs, COIs, EFAs, COICs, and KPPs. Descriptions of the evaluation measures should include a complete definition of the measure including the rationale for why that measure is required, a description of the data required to address the measure including the conditions under which it should be collected, and a proposed analysis methodology for the measure including a description of anticipated statistical treatments. Additionally, no more than 30 measures should be identified as priority measures for the evaluation (ideally 6-10). All measures will be shown in the DSM, included as appendix A of the SEP. This chapter is developed by the evaluation team.

111 1. The system evaluation is an analysis of all data sources, usually including multiple Contractor DT reports when testing is witnessed by ATEC, Government DT reports, AOTRs/OTRs (meaning data from live test events), SME technical analyses, previous data (meaning historical data or archived data from previous tests), and M&S studies. The focus is on integrating all data from all sources to address the evaluation issues. The system evaluation develops conclusions and recommendations by determining capabilities and limitations at multiple levels of complexity starting with the individual system or its major subsystems and culminating with an evaluation of the contribution of the system to the unit missions based on all the data analyses. In preparing analysis reports, each finding must be supported by the data analysis of measures. The more the data analysis allows ATEC to understand the why, when, and how and not just what, the more information can be imparted to the MDA and the stronger the evaluation becomes. Without data analysis, there is no system evaluation, since there is no way to defend and support any conclusions and recommendations. 2. Analysis Planning. Proper data analysis must be planned in the SEP before any testing begins. For each Evaluation Area, the evaluator and analysts need to determine what questions must be answered. This will drive the determination of the measures, which are the basis for the type of data that needs to be collected. Both DT and OT type data, contractor data and M&S, need to be considered. In some cases, the collection of several types of data can lead to a more complete picture at analysis time. Sometimes small-scale simulations are more helpful in trying to find out what the data mean than trying to run a big combat model. 3. Design of experiments is used to structure test events such that the factors and conditions impacting system performance can be identified by analyzing the data. The design determines how factors are controlled, which measures are used, the data that must be collected, and the analytic techniques used to do the analysis. Tools used to represent design of experiments include factors and conditions tables (discussed above), test matrices showing how the factor and conditions are combined, and test logs listing the treatments making up the test matrices. a. The primary purpose of any evaluating system performance with quantitative techniques is to produce an estimate of future system performance once the system is deployed in a combat circumstance. This estimate is an assessment of expected system performance, based on test results. However, since the test results are only an estimate of system performance, there is risk associated with the possibility the test results are not representative of future system performance. Therefore, the test estimate must be matched with a corresponding confidence interval. b. This confidence interval represents a range of estimated values predicted for actual system performance. The purpose of any confidence interval is to mitigate the risk associated with the test estimate. To go along with the confidence interval is a level of confidence, which is the probability the actual system performance will fall between the lower and upper bounds of the confidence interval. The level of confidence should be 90% or higher. 
The purpose of a high level of confidence is so ATEC can strengthen the conclusion from test results by stating ATEC is at least 90% sure the actual system performance will fall within the stated range of values.

c. A test matrix displays the different sets of conditions under which system performance is measured and is created by crossing factors and combining the conditions. Test matrices can be used to display full factorial designs and fractional factorial designs. An example of a full factorial design for the factors and conditions in table 3-8 for the ground based radar is given in table 3-9. Sample sizes are shown in each cell of the design. Events should be structured in such a way as to avoid learning effects or the bias resulting from an uncontrolled factor. Factorial designs are discussed in greater detail in the Evaluator and Analyst Handbook.

Table 3-9. Example Full Factorial Design Test Matrix for a Ground Based Radar

                                 |       Fixed Wing        |      Rotary Wing
                                 |  Near  |  Mid  |  Far   |  Near  |  Mid  |  Far
  Benign              | Attack   |        |       |        |        |       |
                      | Transit  |        |       |        |        |       |
  Electronic Warfare  | Attack   |        |       |        |        |       |
                      | Transit  |        |       |        |        |       |
  (Sample sizes appear in each cell.)

d. A test log is a listing of the combinations of factors and conditions making up each cell of the design of experiment. Test logs can be used by the tester to create a time-ordered event list for the test being planned. For example, the first four runs in the above matrix are (runs are usually randomized):
  Benign, Near, Fixed Wing, Attack.
  Benign, Near, Fixed Wing, Transit.
  Benign, Near, Rotary Wing, Attack.
  Benign, Near, Rotary Wing, Transit.
e. The data elements necessary to do an analysis are the data describing the factors and conditions given in the design of experiment and the data needed to compute the measures that are analyzed. The data needed for the measures is identified in the pattern of analysis. These data should be captured in the system evaluation data model and are used to create data sets. The data sets are used to analyze the data.
f. Before the evaluation design and data analysis procedures for each measure are finalized, there must be coordination with the tester(s). This coordination will either verify that the tester can collect the type of data required or cause a modification to the type of data, evaluation design, and data analysis for a specific measure. Finally, the method of data reduction and its incorporation into the level 3 database needs to be finalized. Table 4-2 provides data level definitions. All of these evaluation designs should be documented in the SEP, chapters 3 and 4. (A short illustrative sketch of building such a matrix and run list appears below.)
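The test matrix and test log in paragraphs c and d above, together with the confidence interval discussion in paragraphs a and b, can be sketched in a few lines of code. The Python fragment below is an illustration only, not an ATEC tool: it crosses the systematically varied factors from table 3-8 into a full factorial matrix, randomizes the resulting run list, and computes a normal-approximation 90% confidence interval for a notional proportion-type measure. The detection counts are invented for the example.

# Minimal sketch: build a full factorial test matrix from the systematically
# varied factors in table 3-8, randomize the run order to produce a test log,
# and compute a 90% confidence interval for a notional proportion measure.
import itertools
import math
import random

factors = {
    "Environment":    ["Benign", "Electronic Warfare"],
    "Range":          ["Near", "Mid", "Far"],
    "Aircraft Type":  ["Fixed Wing", "Rotary Wing"],
    "Flight Profile": ["Attack", "Transit"],
}

# Full factorial design: every combination of conditions (2 x 3 x 2 x 2 = 24 cells).
cells = [dict(zip(factors, combo)) for combo in itertools.product(*factors.values())]

# Test log: randomized run order, as paragraph d notes runs usually are.
random.seed(1)          # fixed seed only so the sketch is repeatable
run_log = random.sample(cells, k=len(cells))
for run_number, cell in enumerate(run_log[:4], start=1):
    print(run_number, cell)

# Notional analysis of one measure: proportion of targets detected in a cell,
# with a 90% confidence interval (normal approximation; counts are invented).
detected, attempts = 46, 50
p_hat = detected / attempts
z_90 = 1.645            # two-sided 90% confidence
half_width = z_90 * math.sqrt(p_hat * (1 - p_hat) / attempts)
print(f"estimate {p_hat:.2f}, 90% CI ({p_hat - half_width:.2f}, {p_hat + half_width:.2f})")

In practice, the design would also account for held-constant, tactically varied, and uncontrolled factors, possible fractional designs, and the sample size allocated to each cell.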

4. Data analysis involves a comparison of the system's performance observed during the test with respect to the SEP requirements as defined by the measures. In addition to computing estimates of the measures, it is important to examine the data for trends and to look for supporting evidence and similar results whenever possible. This could strengthen the conclusions. At the completion of the analysis, the goal should be to answer each question and also investigate the why or when. It is important to understand that as much information as possible should be gleaned from the data. Finally, data displays need to be developed that stand on their own merit and clearly explain the results of the analyst's findings. For more details on data analysis techniques, see appendix J of this document. Again, the planned analysis and statistical techniques should be described in the SEP to ensure all required data is available and to gain leadership approval of the analytic technique.
(d) Chapter 4 of the SEP describes each event that supports the evaluation. The description of each event should include the purpose, scope, design of experiment, and any known requirements for long lead instrumentation, simulation, or stimulation. The factors and conditions under which data must be collected for the event should be discussed and sample sizes determined in any area that might drive test resources. All events shown in the DSM should be discussed in this chapter. The tester members of the AST are the primary authors of this chapter, with each event description being developed by the organization that will execute that event.
(e) Each SEP has at least three appendixes: the DSM; a list of COIs, COICs, and KPPs; and a matrix showing the schedule for approval of future OTA Test Plans (OTA TPs) which further detail the events supporting the evaluation.
(f) The DSM links the evaluation dendritic from chapter 3 of the SEP to the events that will provide the data to address the dendritic's measures. Every event providing evaluation data will have a column in the DSM and will be discussed in chapter 4 of the SEP. The DSM identifies all supporting test and simulation events and allocates measures to those events. The purpose is to provide a crosswalk of all measures to the identified data sources. It is structured to show each level of the dendritic in the left columns and each identified data source across the remaining columns. Measures are allocated to the most appropriate event for generation and collection of data. Events which support a measure are designated as either a primary or secondary source of data for that measure. Each measure must have at least one primary data source. It is possible for a measure to have multiple data sources identified. Government DT should be further broken down by sub-test. The DSM shows the contributions of each data source to the measures, enabling event planners to properly scope the requirements of the events. The DSM serves as a check and balance to ensure there is no unnecessary or redundant testing and that there is a planned source for all required data. All data marked (whether P or S) will be collected. The P indicates the primary source of the data, while the S indicates an event providing additional information. A short sketch of how the primary-source rule can be checked follows, and a sample DSM is shown in figure 3-13, below.
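The allocation rule in paragraph (f) above lends itself to a simple mechanical check. The following Python sketch is illustrative only; the measure names and event columns are hypothetical, patterned loosely on the sample DSM, and the code is not part of any ATEC tool.

# Minimal sketch (not ATEC software): representing a Data Source Matrix (DSM)
# as a mapping from each measure to the events that provide its data, marked
# "P" (primary source) or "S" (secondary source), and checking the rule that
# every measure has at least one primary data source.

# Hypothetical allocation loosely patterned on the sample DSM in figure 3-13.
dsm = {
    "6-1. Ao":                  {"GDT": "P", "LUT": "P", "IOT": "S", "M&S": "S"},
    "6-4. MTTR":                {"CDT": "P", "GDT": "P", "LUT": "S", "IOT": "S"},
    "7-1. Jamming":             {"GDT": "P", "LUT": "S", "IOT": "S"},
    "3-1. Messaging with ABCS": {"GDT": "S"},   # deliberately missing a primary source
}

def measures_without_primary(matrix):
    """Return the measures that violate the one-primary-source rule."""
    return [m for m, events in matrix.items()
            if "P" not in events.values()]

if __name__ == "__main__":
    for measure, events in dsm.items():
        primaries = [e for e, mark in events.items() if mark == "P"]
        secondaries = [e for e, mark in events.items() if mark == "S"]
        print(f"{measure}: primary={primaries or 'NONE'}, secondary={secondaries}")
    bad = measures_without_primary(dsm)
    if bad:
        print("Measures lacking a primary data source:", bad)

A check of this kind can be rerun whenever the DSM is updated, so that no measure is left without a planned primary data source before event planning proceeds.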

[Figure 3-13 is a sample DSM presented as a table. The left-hand columns show the evaluation domain (effectiveness, suitability, survivability), EFA, COI/COIC, and measure; the remaining columns list the data sources (Market Survey, CDT, GDT (broken down by sub-test), LUT, IOT, M&S, and FDTE). Each measure is marked P under its primary data source(s) and S under any secondary sources. The sample includes effectiveness measures (e.g., time to engage, probability of kill, messaging with ABCS), suitability measures (e.g., Ao, EFF, MR, MTTR), and survivability measures (e.g., jamming, EMP, RFDEW, TEMPEST).]

Figure 3-13. Example Data Source Matrix (DSM)

(5) SEPs are due no later than 90 days after the T&E CIPR (CIPR + 90) and at least 6 months before the start of the next major test event (whichever is first). This deadline is important, because the SEP is the foundation for a program's T&E; an approved SEP is the groundwork for the detailed event and data collection planning. The AST, with testers in the lead, develops the OTA TPs based on the SEP.
(6) SEP Staffing and Approval. SEPs for programs on the oversight list are approved by the ATEC CG/ED or his designee. Non-oversight SEPs are approved by the AEC Director or his designee. The AST Chair is the action officer for staffing and gaining approval of the SEP.
(a) Different sections of the initial draft SEP are authored by individual AST members from across the SCAs. Each author sends her input to the AST Chair, who collates and edits them all to make a complete initial draft document.
(b) The AST Chair alerts the AST members that an initial draft is available for their review, either via e-mail or by posting it in the VDLS. The AST members make comments on all chapters of the SEP, and the AST Chair is responsible for ensuring edits are made to the document and for keeping the master copy. This step of the SEP development process should be the longest.
(c) Once the AST has developed a final draft SEP, each AST member should staff it through her Directorate for comment and approval for further staffing. The AST member is responsible for collecting and consolidating comments and forwarding them to the AST Chair to be addressed or included. This review is no higher than the O-6 level, and the review should not occur outside of the Directorates. This does not preclude individual SCAs staffing their originally-authored SEP input during the review outlined above to whatever level the SCA chooses. The final draft SEP, however, should not leave the test or evaluation directorate where

AST members are assigned during this step. Within AEC, this review would go only as high as the Division Chief; the Directorate Technical Director or Director would not complete a review at this time unless desired.
(d) With an AST directorate approved SEP, the AST Chair forwards the SEP to the Technical Editing division of their SCA. This is generally done by sending an e-mail and either attaching the document or posting the document in VDLS. The Technical Editor marks up the document using the MS Word track changes feature and returns it to the AST Chair for acceptance or discussion/resolution of the changes.
(e) At the same time the final draft SEP is sent to Technical Editing, the AST Chair sends the document to the SCA Security division. The purpose of this review is to ensure that the information contained in the SEP is appropriate to the markings and document classification. The Security reviewer proposes any necessary changes to the document using track changes and returns it to the AST Chair for acceptance or discussion/resolution of the proposed changes.
(f) Once the AST Chair has a Technical Editing- and Security-approved document, they prepare it with an ATEC Form 45 for SCA-wide staffing. See ATEC Regulation 1-2 for instructions on the preparation of the Form 45. The first reviewers should be the Technical Director and/or Director of the AEC evaluation directorate. After that approval, there should be concurrent staffing through the chain of command at DTC and OTC (if they are members of the AST). The level of signature on the Form 45 is determined by the SCA doing the review. As each organization provides comments to the AST Chair, the comments will be accepted or resolved.
(g) After final comments and/or signature are received from the testing organizations, the AST Chair forwards the Form 45 to the AEC Technical Director for review.
(h) After approval from the AEC Technical Director, the Form 45 goes to the AEC Director for approval. If there are still outstanding comments or non-concurrences, they should be resolved at this level if at all possible.
(7) SEP Final Approval.
(a) For non-oversight programs, the signature authority is the Director, AEC.
(b) For oversight programs, the signature authority is the ATEC CG/ED.
1. After the AEC Director signs the Form 45, the Form 45 is delivered to the ATEC Command Group's Staff Action Control Officer (SACO) via the Command Group mailbox.
2. The SACO reviews the SEP for format and grammar, and sends any proposed changes back to the AST Chair.
3. After resolution of the SACO comments, the SACO staffs the SEP with the Assistant Technical Director. Again, with any significant comments, the document is returned to the AST Chair for comment resolution.

4. The document is next reviewed by the Technical Director.
5. After resolution of TD comments with the AST Chair, the document is forwarded to the ATEC CG/ED for final review and approval.
6. After resolution of CG/ED comments and CG/ED signature, the ATEC CG/ED will draft and sign a personal memorandum to accompany the SEP when sent by the AST Chair to the MDA; DOT&E; ASA(ALT); DCS, G-1; and the T&E Executive.
7. The SACO returns the signed SEP and signed personal memorandum to the AST Chair. The AST Chair makes any required changes to the SEP and then begins to prepare the document for distribution.
(8) SEP Publication and Distribution. Once all changes are complete and the document is final, the AST Chair publishes the SEP in Portable Document Format (PDF). The PDF SEP should be posted in two places in VDLS, both in the AST's library and in the ATEC document library. Next, an e-mail is prepared.
(a) For non-oversight programs, electronic (e.g., PDF) copies of the SEP are forwarded to:
Program Manager (PM)/Materiel Developer (MATDEV).
DCS, G-8, Programs.
DCS, G-4, Logistics.
DCS, G-3, Operations.
The Surgeon General (TSG).
Capabilities Developer (CBTDEV), TRADOC Systems Manager, Functional Proponent (FP).
Testing and Evaluation Working-level Integrated Process Team (T&E WIPT).
ATEC HQ Technical Library.
Joint Interoperability Test Command (JITC).
U.S. Army Materiel Systems Analysis Activity (AMSAA) (if applicable).
Army Research Laboratory (ARL) [Human Research and Engineering Directorate (HRED) and Survivability/Lethality Analysis Directorate (SLAD)] (if applicable).
ATEC Technical Director (electronic copy only unless specific request).

AST Members (electronic copy only).
(b) For oversight programs, electronic (e.g., PDF) copies of the SEP are forwarded to:
All addressees in paragraph (a) above.
Milestone Decision Authority (MDA) (accompanied by a personal memorandum signed by the ATEC CG/ED).
Director, Operational Test and Evaluation (DOT&E) (accompanied by a personal memorandum signed by the ATEC CG/ED).
USD (AT&L)/(DDT&E) (accompanied by a personal memorandum signed by the ATEC CG/ED).
TEO (accompanied by a personal memorandum signed by the ATEC CG/ED).
DCS, G-1, Personnel (accompanied by a personal memorandum signed by the ATEC CG/ED).
ATEC Technical Director (electronic copy only unless specific request).
Overarching integrated product team (OIPT).
(9) The AST develops the SEP while also coordinating with the T&E WIPT on development of the TEMP. The SEP and the TEMP are complementary documents serving different purposes. They are usually developed concurrently, and both are based on the evolving strategies for evaluation and supporting event requirements. The TEMP is an agreement among the acquisition community which establishes what T&E will be done, at the issue level of detail, while the SEP is an ATEC document which explains how the T&E will be accomplished, at the measure level of detail.
e. TEMP. The TEMP is the agreement or contract among the acquisition community as to the T&E program for an acquisition system. Although the PM has overall responsibility for the TEMP, it is produced by the T&E WIPT. It is the overall summary of all T&E for a system. A TEMP is required for all Army acquisition programs, except investigational drugs, biologicals, and medical devices, regardless of ACAT level. The TEMP is written at the issue level of detail and can include all test events (not just those being used to support the evaluation). Although the AST provides comment on all chapters of the TEMP, ATEC has primary responsibility for authoring Parts III (T&E Strategy) and IV (Resource Summary). The TEMP is written in parallel with the SEP, but cannot be signed by ATEC until after SEP approval. See figure 3-14 below. The Defense Acquisition Guidebook (DAG) and DA Pamphlet 73-1 provide detailed TEMP policies and procedures.

[Figure 3-14 is a graphic of the T&E strategy approval process. It shows the T&E WIPT and TEMP alongside the AST's evaluation work: program inputs (ICD, CDD, SSP, SOW, COIC, KPPs, acquisition strategy, and O&O) feed development of the initial evaluation strategy, which is reviewed at the ESR; the AST then finalizes the evaluation strategy and develops the data source requirements, which are reviewed at the CIPR; final coordination on the T&E strategy leads to the SEP.]

Figure 3-14. T&E Strategy Approval Process

(1) Part I of the TEMP is the system introduction, and includes a mission description, system description, system threat assessment, program background, and key capabilities (KPPs and key system attributes (KSAs)).
(2) Part II of the TEMP is the test program management and schedule. It will discuss the T&E responsibilities of all participating organizations (such as developers, testers, evaluators, and users). It also identifies the T&E organizational construct (i.e., T&E WIPT, LFT&E IPT) and describes:
(a) The requirements and methods for the T&E database.
(b) Deficiency reporting.
(c) TEMP updates.
(d) The integrated test program schedule.
(3) Part III of the TEMP is the T&E Strategy. It briefly describes how the T&E Strategy supports the acquisition strategy. This section summarizes an effective and efficient approach to the test program. Both developmental and operational test objectives are discussed. It provides an evaluation framework that describes the overall evaluation approach. It will include a top-level evaluation framework matrix that shows the correlation between the KPPs/KSAs, CTPs, key test measures (MOEs/MOPs), planned test methods, and key test resources, facility, or infrastructure needs. Again, the DAG (chapter 9) provides more detailed information.
(4) Part IV of the TEMP is the resource summary. This section specifies the resources necessary to accomplish the T&E program. A list of all key T&E resources, both government and contractor, that will be used during the course of the current increment is included. The following test resources/shortfalls should be identified:
(a) Test articles.
(b) Test sites and instrumentation.
(c) Test support equipment.

119 (d) Threat representation. (e) Test targets and expendables. (f) Operational force test support. (g) Models, simulations, and testbeds. (h) Joint mission environment. (i) Special requirements. (j) Federal, state, and local requirements. (k) Manpower/personnel and training. (l) Test funding summary. (5) Staffing and approval process of TEMPs. Formal TEMP approval staffing should not occur until the supporting documentation (JCIDS documents (CDD or CPD), Acquisition Strategy, COICs, STAR) are, or are soon to be, approved. See figure 3-15 for TEMP processes. (a) The AST Chair is empowered to sign the coordination page for ATEC after coordinating it within ATEC. For ACAT I and programs on the OSD Oversight List, the PM signs and submits the coordinated TEMP to PEO, TRADOC, and ATEC at the same time. The goal is simultaneous concurrence. The PM forwards the signed TEMP to TEO for DA Staff review, if required, and the T&E Executive approval. The TEO forwards the Army approved TEMP to OSD for final approval. The process is the same as above for programs on Missile Defense Agency programs and for multi-service programs on the OSD T&E oversight list (when ATEC is the Lead OTA). (b) The TEMP staffing process is similar for non-osd oversight list programs. The T&E WIPT sign the coordination sheet. The PM forwards to PEO (if not the MDA), TRADOC, and ATEC for concurrent signatures. After getting these signatures, the PM forwards the TEMP to the MDA for final approval. (c) Internal ATEC signature authority for TEMPs is as follows: 1. For non-oversight programs, the TEMP approval authority is the Director, AEC or designee. After receiving the signed Form 45 showing the tester SCAs concurrence, if the AEC Director has no concerns, he will sign the cover page of the TEMP and the package is returned to the AST Chair. 2. For ACAT I and OSD oversight programs, the AST Chair continues staffing of the TEMP approval Form 45 from Director, AEC to the ATEC Command Group. ATEC Pamphlet June

120 a. After Director, AEC signs the Form 45; the Form 45 is delivered to the ATEC Command Group s Staff Action Control Officer (SACO) via the Command Group mailbox. b. The SACO staffs the TEMP serially with the Command Sergeant Major, the Command Surgeon, the Command Chief Counsel, the Chief of Staff, and the Technical Director s Assistant. Again, with any significant comments, the document is returned to the AST Chair for comment resolution. c. The document is next reviewed by the Technical Director. d. After resolution of TD comments with the AST Chair, the SACO provides the document to the ATEC CG/ED for review and approval. e. After resolution of CG/ED comments and CG/ED signature on the cover page of the TEMP, the SACO returns the signed TEMP to the AST Chair. (d) The AST Chair forwards the signed TEMP cover page to the PM for publication. When the PM publishes the final TEMP with all signatures, the AST Chair forwards the TEMP to all AST members and posts it in VDLS ATEC Pamphlet June 2010

[Figure 3-15 is a graphic of the TEMP processes in three phases. In the TEMP drafting phase, the T&E WIPT (the MATDEV (PEO, PM, etc.), CBTDEV/TRADOC, AST Chair, system evaluator, developmental and operational testers, logistician, and others) and the AST engage in a continuous process of negotiations and compromises to edit the draft TEMP. In the ATEC coordination process, the draft TEMP is staffed on an ATEC Form 45 through the respective chains of command (the AEC and OTC Directors of Directorate at the COL level, and the DTC Test Manager, with the DTC Test Management Director for ACAT I and OSD oversight list programs) and returned to the AST Chair. In the ATEC final approval process, the final draft TEMP is staffed through the DTC and OTC Commanding Generals to the Director, AEC, or to the ATEC ED for ACAT I and OSD oversight list programs, with an ATEC goal approval time of less than 10 days, producing the ATEC-approved TEMP for the MATDEV.]

Figure 3-15. TEMP Processes

122 3-8. Continuous evaluation and evaluation trends a. Operational needs often dictate that systems that add capability be fielded as long as the operational risks are not perceived as too great. Traditionally, ATEC has conducted independent OT&E and developmental testing in support of milestone decisions. Evaluations were based on assessments of how well a system met specifications and user requirements derived from a slowly changing threat. Continuous evaluation incorporates evaluation relevant information that emerges at any point in the acquisition process, including through post deployment to system retirement. In theory the evaluations or assessments could be used to support a low rate initial production (LRIP), materiel release, or fielding at any time. b. The current trend is that continuous evaluation is ATEC s primary mission, and what was once traditional OT&E is conducted only if required by law or when the risks associated with fielding a system are too high. In order to make this transformation successful, the AST has to imbed itself into the acquisition process and continuously conduct evaluation with the emphasis on system fielding. In order to do this, the capabilities of a new system must be compared with fielded capabilities in any available venue. Some suggested venues are: (1) Pre-milestone A assessment of emerging technologies and concepts. (2) Warfighting Experimentation. (a) Joint Concept Technology Demonstration (JCTD). (b) Advanced Technology Demonstration (ATD). (c) Advanced Warfighting Experimentation (AWE). (d) Concept Experimentation Program (CEP). (3) Prototype development programs. (4) Contractor tests and demonstrations. (5) Developmental testing. (6) Traditional operational testing. (7) Joint Test and Evaluation (JT&E) programs. (8) Multi-Service OT&E. (9) Multi-National Test and Evaluation. (10) Service and joint training exercises. c. In every instance, the AST must attempt to conserve resources and expedite the T&E process. For example, if practical, in lieu of using FORSCOM as operators in a test event, the 3-56 ATEC Pamphlet June 2010

123 use of operators assigned to test agencies might be used. Also, instead of using a dedicated test event, the system might be inserted into test events conducted for other systems or by other agencies (e.g., DARPA) and countries (e.g., tests conducted by UK). For example, the realism of an OT for a tactical networking system is likely to benefit from a large number of other systems attempting to access and compete for the tactical networking resources. Such a test would be an opportunity for several other systems to collect operational test data. d. CE is used to provide a continuous flow of information and data to decision-makers, the MATDEV and CBTDEV across the lifecycle of an acquisition program. The data generated in early development phases is visible and maintained as the system moves into its formal testing, thereby avoiding duplication of testing. CE continues through a system s post deployment so as to verify whether the fielded system meets or exceeds thresholds and objectives for cost, performance, and support parameters. e. ATEC s mission includes CE of acquisition programs in order to provide an assessment or evaluation at any point in time, not just at the formal MDRs. Decisions involving LRIP quantities, materiel release or fielding may be made at points not associated with formal MDRs. Additionally, CE usually requires development of questions pertaining to normal logistical and supportability requirements for systems which may not be specifically addressed in the user s stated requirements. For example, RAM for the system life cycle may require total or supplemental development with the AST in order to provide a comprehensive evaluation of the system. f. The objective of CE is to provide impartial assessments of system progress. It is a form of risk analysis. CE allows the evaluator to be informed of system progress throughout the lifecycle and to feed information back to the PM and developer in order to improve the system at times other than acquisition decision points. Another benefit of CE is that it decreases the AST learning curve just before decision points, as they ve been involved with the program all along, not just at T&E events. CE begins during the mission need determination and concept exploration phases and continues through post deployment support to system retirement. g. The traditional paradigm was to conduct independent OT of production representative items against the anticipated threat using operators and maintainers under realistic combat conditions. The system requirements were the source of the test measures and the overarching goal was to support readiness for production and to reduce acquisition risks. The emerging paradigm is to use contractor and DT, contractor demonstrations, training exercises, experiments, and other alternative test events in lieu of traditional OT. The use of independent, dedicated OT is limited to when required by law or when risks are assessed as high. Items under development are tested in an environment of an unpredictable changing threat using contractor operators and maintainers. Integrated OT is conducted in a SoS/FoS environment as much as possible, and testing determines the degree to which the system adds capability as compared to what already exists. The overarching goal is to assess operational risks, assess whether mission accomplishment is likely, and determine if the system is safe and survivable and can be trained, fielded, and supported. 
Paragraph 5-12 provides information on what is expected from the ATEC assessment or evaluation report(s) (OAR/OMAR/OER/OFER).

124 h. Evaluation support from outside ATEC. ATEC receives evaluation support from many other Army agencies, such as Medical Command (MEDCOM), Research, Development and Engineering Command (RDECOM), and Surface Deployment Distribution Command (SDDC). By agreement between the Army Materiel Command (AMC) and ATEC, data analysis support is provided by AMC s Army Materiel Systems Analysis Activity (AMSAA) and the Army Research Laboratory s (ARL) Human Research and Engineering Directorate (HRED) and Survivability and Lethality analysis Directorate (SLAD). AMSAA s core missions and functions include item and system level performance analysis, logistics analysis, modeling and simulation (M&S), and verification and validation (V&V). HRED provides support in the areas of human factors and manpower and personnel integration (MANPRINT) analysis. SLAD provides survivability/lethality analysis. Requirements are identified and documented annually in a joint support plan. (1) Prior to the planning period for the follow-on fiscal year, the AST coordinates their requests with the respective organizational focal points and gains approval for the support needed. In those cases when AMC cannot support the ATEC request with mission funds, ATEC can contract with AMC or other agencies on a reimbursable basis. The AST identifies requirements for analysis support from AMC by providing the following information: (a) Work statement. (b) Deliverable (including type of analysis product expected). (c) Required completion date (based on ATEC timelines and program milestones). (d) Justification for analysis (i.e., based on COIC, STAR, ICD, CDD, CPD, specification requirements, etc.). (e) Delineation of existing related data and past/ongoing/projected efforts performed by other agencies. (2) The AST analysts and key subject matter experts from the AMC support activity jointly discuss and clarify the evaluation needs and reach preliminary agreement on the proposed analytical support. After obtaining approval from their management, the AMC activity provides statements of these agreements as input to the joint AMC/ATEC Support Plan for the appropriate fiscal year. (3) If appropriate, ATEC will schedule top-level management reviews with the support organizations to discuss and clarify all requests and priorities. ATEC prepares the finalized joint support plan for approval of the commander/directors from the participating organizations. i. Use of data from other DOD and foreign nations. This paragraph discusses OSD sponsored Foreign Comparative Testing (FCT) Program (10 U.S.C. 2350) and the International Test Operating Procedures (ITOPs). ATEC personnel are encouraged to use any appropriate data that other DOD agencies or other nations have gathered on similar test programs to avoid duplication of effort and enhance the evaluation ATEC Pamphlet June 2010

(1) The FCT program involves the evaluation of NATO and non-NATO Allies' defense equipment to determine whether such equipment meets valid existing DOD needs. Policies and procedures for its execution are documented in DOD M-2. The NATO Comparative Test Program has been integrated with the FCT program.
(2) The intent of international test procedures standardization documents is to shorten and reduce the costs of the materiel development and acquisition cycle by minimizing duplicate testing and to improve the interoperability of U.S. and Allied equipment. They also promote the cooperative development and exchange of advanced test technology and expand the customer base.
(3) ITOPs are one type of standardized test procedures document commonly used within ATEC. ITOPs are a cooperative effort of France, Germany, the United Kingdom, and the U.S. Other increasingly important resources for international test procedure standardization are NATO Standardization Agreements (STANAGs) and Allied Procedures. More information on ATEC participation in the development and use of international test procedure standards can be obtained by contacting APGR-TEDTStandards@us.army.mil.
j. Requirement analysis. In broad terms, the requirements analysis identifies all the variables needed to evaluate the system. These are derived from the system issues, measures, and performance conditions which are given in the KPPs, COIC, and any AIs and measures identified by the Evaluator.
(1) KPPs are found in the CDD. The CDD also contains the system Operational Mode Summary/Mission Profile (OMS/MP). The OMS/MP describes a system's expected operating conditions. COIC are found in the TRADOC COIC document for the system or in the TEMP. AIs are added by the AST if needed, and testable measures are generated by the AST for assessment purposes. The CJCSM C, Universal Joint Task List (UJTL), and FM 7-15, Army Universal Task List (AUTL), should also be used as resources to help determine the functions a system must perform and measures of system performance. The Army Training Evaluation Program (ARTEP) manuals can also be used to help determine the functions (or tasks) a system must perform and measures of system performance.
(2) A system's ESS are subdivided in terms of one or more EAs. EAs will be defined differently based upon system types. The particular EAs attributed to a system's ESS will differ based upon the type, nature, and mission of the system. Identification of the EAs by the Evaluator is a key initial step of the overall evaluation process. For each EA, one or more measures that will be used in the evaluation of the area should be listed. Measures may be narrow, reflecting the specific, technical characteristics of a system, or they may be broader, addressing various levels of mission accomplishment. Measures are identified with data requirements, which, at the lowest level, are satisfied by appropriate data elements (a minimal illustrative sketch of this decomposition follows this paragraph). The Evaluator is also concerned with assessing a system's contribution to a unit's capability to accomplish missions. The AUTL is a useful resource for providing an accurate mission context for developing measures to assess a system's performance and mission success. Measures must always be quantifiable and testable. For more detail, see paragraphs 3-3 to 3-6 for using the UJTL and AUTL.
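The decomposition described in paragraph j(2) above, in which ESS are broken into evaluation areas, each supported by measures tied to data requirements, can be illustrated with a small data structure. The Python sketch below is hypothetical and illustrative only; the area, measure, and data element names are invented for a notional ground based radar and are not drawn from any approved evaluation dendritic.

# Minimal illustrative sketch: a functional dendritic represented as a nested
# structure, decomposing ESS (effectiveness, suitability, survivability) into
# evaluation areas (EAs), measures, and the data elements that satisfy them.
# All names below are hypothetical examples for a notional ground based radar.

dendritic = {
    "Effectiveness": {
        "Target Detection": {
            "Proportion of targets detected": ["target truth list", "detection log"],
            "Median time to detect":          ["detection timestamps"],
        },
    },
    "Suitability": {
        "RAM": {
            "Operational availability (Ao)": ["uptime hours", "downtime hours"],
            "Mean time to repair (MTTR)":    ["maintenance action records"],
        },
    },
    "Survivability": {
        "Electromagnetic Environmental Effects (E3)": {
            "Performance under jamming": ["jammer on/off periods", "track quality log"],
        },
    },
}

def walk(tree, depth=0):
    """Print the dendritic from ESS down to measures and data elements."""
    for name, child in tree.items():
        print("  " * depth + name)
        if isinstance(child, dict):
            walk(child, depth + 1)
        else:  # leaf level: the data elements that satisfy a measure
            for element in child:
                print("  " * (depth + 1) + "data element: " + element)

if __name__ == "__main__":
    walk(dendritic)

A structure of this kind can support both the pattern of analysis (the measure-to-data-element decomposition) and the DSM (the measure-to-event allocation) described earlier in this chapter.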

126 (3) Typical products of the requirements analysis are the functional dendritic, factors and conditions, and the PoA. The functional dendritic is a representation of the EAs under study for a particular system and is a hierarchical presentation of functions performed by the system. It may be supplemented with a lens chart (see paragraph 3-6 on Lens charts). While the functional dendritic focuses on system functionality, the lens chart addresses system functionality and mission accomplishment. The distinction is important because while a system s functionality may meet technical standards and criteria given in the COIC, it may not contribute to accomplishing a unit s mission. The factors and conditions represent the planned test conditions and identify which test factors will be under control and how (and why). The PoA is used to represent the decomposition of measures to data requirements. (4) The DODAF products are useful tools for the Evaluator because they convey the details of system functionality within an SoS and/or FoS context. The DODAF is the standard way to organize system and/or enterprise architectures in complementary and consistent views. CJCSI D defines JCIDS required DODAF products to support capability development and acquisition. The PM uses existing DODAF products and also develops new ones. They are used to guide system development and are required to satisfy NR-KPP requirements, in particular with respect to interoperability certification and standards compliance. The AST can use the products to help identify data requirements and test architectures. (5) The Evaluator can exploit the AUTL and/or UJTL for the identification of measures. The process involves the assessment of a system s requirements and related measures in terms of a unit s mission and related tactical tasks. The AUTL provides specific measures pertaining to the WFs of Movement and Maneuver, Intelligence, Fires, Sustainment, Command and Control, and Tactical Mission Tasks and Operations. These measures are used to assess WFs and tactical mission task accomplishment independent of any particular system. By associating a system with one or more tactical mission tasks and one or more WFs, the AUTL can be used to help determine the tasks or functions a system must perform and can therefore be used to determine MOEs and MOPs. k. Software Analysis. One evaluation consideration for the AST is software performance. System evaluations require oversight of the software development process particularly when software has a significant contribution to the system s capabilities. As an integral portion of the system, software impacts the system s ESS. Careful analysis of the software performance establishes whether the system s software enables the system to meet user requirements. Analysis of the software also supports a risk analysis of the system s readiness for T&E for software-intensive systems like C4ISR programs. (1) The system s software must be assessed under conditions required in the system operational profile. The software evaluation must be based on more than the formal OT that takes place at the end of the system development. Evaluating the software capabilities requires aggressive, early assessment of the system s technical and functional characteristics using all available sources. Thus, evaluating the software must start early and continue throughout the acquisition process. (2) The system evaluator uses the CE process to determine the software s capability to support the user s requirements. 
The system evaluator identifies the qualitative and quantitative

127 characteristics of the system s software that will impact the system s capability to support its mission and develop plans to evaluate that software. The system evaluator must consider and understand any factors that may inhibit realistic DT or OT of the software. Risk analysis techniques will help the system evaluator better understand the impact of the limitations and verify whether the software can support the system s mission and meet the capability requirements. Key to this assessment is an understanding of the PM s processes for managing and engineering the system and close coordination with the PM office to ensure that the system has adequately mature software prior to entering operational test. (3) There are a variety of software testing methods used to detect errors and develop software performance metrics. The analysis of the software establishes whether the system s software meets the system and user requirements. The SEP for a software intensive system should include measures to assess the following information: (a) Software features and components that allow the system to perform its required operational mission. CTPs. (b) Software maturity and software performance measures, as part of the system s (c) Key software maturity thresholds as entrance/exit criteria to proceed to the next level testing. (d) Data supporting effective planning of life cycle support for the software product being developed. (e) The relationship between the system objectives and the software characteristics that affect the system missions. (f) The relationship between the system and mission evaluation focus areas that have been identified for the software. (g) The analysis and evaluation criteria that will demonstrate compliance with the software technical performance requirements. (h) The relationship between the software functions being tested and the systemlevel test events and scenarios. events. software. (i) The methods and measures that will identify traceability of requirements to test (j) Any factors that may inhibit realistic developmental and operation test of the l. Systems entering IOT should not possess any known Priority 1 or 2 Problem Change Reports (PCRs) that preclude robust evaluation of critical operational issues. Appropriate impact analyses of Priority 3 PCRs are required to assess the software maturity for IOT. All parties involved with the software configuration management must review the impact analyses and ATEC Pamphlet June

128 recommend whether ATEC proceed, delay, suspend or terminate the operation test. OTC will establish configuration control of the software for the OT in accordance with established OTC policy to ensure that the software under test is not changed during the test period or only changed after agreement by ATEC, PM, and the user that such changes are required and documented. At OTRRs 2 and 3, the PEO must certify, and ATEC must agree, that the software is stable, that software and interface testing of sufficient breadth and depth has been performed, and that the required functionality has been successfully demonstrated at the system level (DUSA(OR) memorandum, Initial Operational Test and Evaluation Software Entrance Criteria, 2 Jul 03). Once the maturity level has been demonstrated, the system or increment is baselined, and a methodical and synchronized deployment plan is implemented for all applicable locations. m. For more in-depth information regarding software capabilities evaluation, refer to appendix Q or appendix S, or visit the Army Software Metrics Web site at n. Data analysis. Knowing the analysis purpose and understanding the characteristics of the data being analyzed is essential in any data analysis. The analysis purpose is either descriptive or inferential. When descriptive, the purpose is to describe a distribution of sampled data. Quantitative dependent variables are described using the arithmetic mean and standard deviation. The mean and standard deviation are considered parameters that are associated with a distribution. Therefore, when statistical analysis is to be performed on this kind of data, parametric methods are used. If data are badly skewed, medians and the interquartile range (i.e., 75 th percentile minus the 25 th percentile) may be used. Qualitative dependent variables are described using counts, percents and proportions. When the analysis purpose is inferential, the purpose is to make inferences about a population characteristic based on a sample of data. Hypothesis testing is used for this purpose. When analysis is directed towards data that is not associated with a distribution, then nonparametric analysis methods are used. The evaluator and analyst handbook provides additional information. o. Assessing Risk. The traditional paradigm has been to conduct independent OTs of production representative items against the anticipated threat using Soldier operators and maintainers under realistic combat conditions. The system requirements were the source of the test measures and the overarching goal was to support readiness for production and to reduce acquisition risks. The emerging paradigm is to use contractor and developmental testing, contractor demonstrations, training exercises, experiments, and other alternative test events in lieu of traditional OT. The use of independent, dedicated OT is limited to when required by law or risks are assessed as high. Items under development are tested in an environment of an unpredictable changing threat using Soldier or contractor operators and maintainers. Integrated operational testing is conducted in a SoS/FoS environment as much as possible, and testing determines the degree to which the system adds capability as compared to what already exists. The overarching goal is to assess operational risks, assess whether mission accomplishment is likely, and determine if the system is safe and survivable and can be trained, fielded, and supported. 
Paragraph 5-12 provides information on what is expected from the OMAR provided at milestone A or B.
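As an illustration of the descriptive and inferential distinction drawn in the data analysis paragraph above, the following sketch shows descriptive summaries for a quantitative measure, the median and interquartile range for skewed data, and a parametric versus nonparametric two-sample comparison. It is a generic statistical sketch in Python, not an ATEC-prescribed tool; the measure, sample values, and two-system comparison are invented for illustration.

```python
# Illustrative only: hypothetical task-completion times (minutes) for a
# baseline system and a candidate system. All values are invented.
import numpy as np
from scipy import stats

baseline = np.array([12.1, 10.4, 11.8, 13.0, 12.6, 11.2, 14.9, 12.3])
candidate = np.array([9.8, 10.1, 9.5, 11.0, 10.4, 9.9, 12.7, 10.2])

# Descriptive analysis: characterize the sampled distribution.
print("candidate mean:", candidate.mean(), "std dev:", candidate.std(ddof=1))

# If the data are badly skewed, report the median and interquartile range instead.
q75, q25 = np.percentile(candidate, [75, 25])
print("candidate median:", np.median(candidate), "IQR:", q75 - q25)

# Inferential analysis: hypothesis test of candidate versus baseline.
# Parametric approach (assumes roughly normal data): two-sample t-test.
t_stat, t_p = stats.ttest_ind(candidate, baseline, equal_var=False)

# Nonparametric alternative (no distributional assumption): Mann-Whitney U.
u_stat, u_p = stats.mannwhitneyu(candidate, baseline, alternative="two-sided")

print(f"t-test p-value: {t_p:.3f}, Mann-Whitney p-value: {u_p:.3f}")
```

Consistent with the paragraph above, the nonparametric result would be the one relied upon when the data cannot reasonably be associated with a distribution.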

129 3-9. The Ilities This paragraph provides a brief description of the areas of study and methodologies used to address issues regarding Reliability, Availability, and Maintainability (RAM); Survivability; Integrated Logistics Support (ILS); and Manpower and Personnel Integration (MANPRINT). a. Reliability, Availability, and Maintainability. RAM has a direct bearing on mission success, as well as on logistical considerations such as maintenance workload, sparing, level of repair decisions, training, and other operating and support cost factors. (1) Reliability is the duration or probability that a system can perform a specified mission for a specified time in a specified environment. Mission reliability is the reliability associated with completion of a specific mission profile. (2) Maintainability is a measure of the ability of an item to be retained in, or restored to, a specified condition when maintenance is performed by personnel having specified skill levels and using prescribed procedures and resources, at each prescribed level of maintenance and repair. It reflects the ease and efficiency of performing both corrective and scheduled maintenance on a system. (3) Availability is the probability that a piece of equipment is in an operable and committable state at a given (random) point in time. Repair, maintenance, and administrative and logistics downtime are the most common causes of equipment non-availability for use. A system s availability is a function of its reliability and maintainability. (4) The System Evaluator, in coordination with other members of the T&E WIPT, is responsible for determining the extent and nature of RAM data required for the RAM portion of the system evaluation. See appendix G for detailed information on RAM. b. Survivability. Survivability against the full spectrum of battlefield threats must be considered in all system acquisition programs, including new developments, Non-Developmental Items (NDI) acquisition, and system modifications/upgrades that can impact the system s ability to withstand the specified threats. (1) The evaluation of survivability must address the system s capabilities to avoid/evade as well as withstand the effects of expected threats. Survivability evaluations will consider the following areas: (a) Electromagnetic Environmental Effects (E3). (b) Information Assurance (IA). (c) Nuclear, Biological, and Chemical (CBR). (d) Nuclear Weapon Effects (NWE). (e) Electronic Warfare (EW). ATEC Pamphlet June

130 (f) Obscurants and Atmospherics. (g) Soldier Survivability (SSv). (h) Ballistic Effects. (2) Each survivability evaluation is focused on the susceptibilities of the system and tailored to address the operational requirements of the CBTDEV. The methodology incorporates the CBTDEV s mission critical tasks for the candidate system and addresses operational implications of survivability, including the Soldier and TTPs, in the survivability measures. (3) Additional information on Survivability can be found in appendix H. c. Integrated Logistics Support. The Army s ILS program provides for all the necessary support resources to ensure the supportability and readiness of the system when fielded. (1) The Army logistician (HQDA, ASA(ALT) ILS) facilitates the development and integration of the ILS elements (see AR ) for all assigned acquisition programs. Additional ILS information can be found in appendix Q. (2) The ILS evaluator works closely with the Army logistician and the acquisition community through the IPT process to provide a continuous assessment of the logistics support of a program and any associated software. (3) The MATDEV provides an ILS manager who will be the focal point for all ILS actions for the program and who chairs the Supportability IPT (SIPT). (4) The ILS evaluator is a member of the SIPT and provides a continuous assessment of the system to ensure that readiness and supportability objectives are identified and achieved. The evaluation strategy will (a) Ensure the ILS assessment considers compatibility with the testing strategy (b) Ensure that a Logistics Demonstration is conducted in accordance with AR 700- (c) Identify, track, and report logistics supportability deficiencies and shortcomings. (d) Provide for testing of the system s logistics support concepts, doctrine, organization, and hardware and ancillary materiel in the intended environment. (e) Provide continuous evaluation of the system throughout its life cycle and provide data as required. d. Manpower and Personnel Integration. MANPRINT is an engineering analysis and management process to identify and articulate requirements and constraints of human resources, human performance, and hazards to personnel to ensure the human is fully and continuously considered in the system design. The assessment of MANPRINT is an essential element of a 3-64 ATEC Pamphlet June 2010

system's evaluation strategy at each decision point. MANPRINT considerations include the ramifications to overall system status when human performance degrades over time. Recommendations as to how risks to system performance due to MANPRINT issues can be mitigated should also be addressed. Both system design and operator/maintainer issues can be a source of MANPRINT issues. (See AR )
(1) The MANPRINT program includes seven domains:
(a) Manpower deals with the number of people in the force structure, irrespective of skill level, required to sustain operations under combat conditions and to maintain and support a system.
(b) Personnel addresses the ability to provide qualified people for specific skills needed to operate, maintain, and support a system.
(c) Training considers the time and resources required to develop the correct skill levels.
(d) Human factors engineering (HFE) considers the characteristics of people (physical, cultural, mental) that must be addressed in designing a system (known as an ergonomic science, this addresses all aspects of the Soldier-machine interface).
(e) System safety considers the safety engineering principles and standards necessary to optimize safety within the bounds of operational effectiveness, time, and cost.
(f) Health hazards consider conditions that can cause illness, disability, or reduced job performance.
(g) Soldier survivability (SSv) considers fratricide, killed in action, and wounded in action prevention.
(2) The AST will typically include a representative from a MANPRINT domain agency such as ARL-HRED. This representative will serve as a MANPRINT Evaluator or Analyst. The MANPRINT Analyst works with the AST to develop an effective strategy to produce valid, reliable, quantitative and qualitative data early and iteratively, providing rapid feedback to the MATDEV's system engineering process.
(3) Measures supporting MANPRINT evaluations will often include parameters related to the accuracy, consistency, reported ease of use, and time required to perform a task using the system.
(4) MANPRINT analysis is best practiced as an iterative, continuous feedback loop with the MATDEV throughout the design process, rather than as a decision-oriented go/no-go assessment of MANPRINT compliance provided just prior to the milestone decision.
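Referring back to the RAM definitions in paragraph 3-9a, availability can be illustrated numerically as a function of reliability and maintainability. The sketch below is illustrative only: the MTBF, MTTR, and logistics delay values are invented, the constant-failure-rate assumption is an assumption of the sketch, and the inherent and operational availability expressions shown are the common textbook forms rather than formulas prescribed by this pamphlet or appendix G.

```python
# Illustrative RAM sketch with invented numbers (hours).
import math

mtbf = 250.0        # mean time between failures (reliability)
mttr = 4.0          # mean time to repair (maintainability)
mldt = 12.0         # mean logistics/administrative delay time

# Inherent availability: considers only corrective-maintenance downtime.
a_inherent = mtbf / (mtbf + mttr)

# Operational availability: also counts logistics and administrative delays,
# reflecting the "operable and committable at a random point in time" idea.
a_operational = mtbf / (mtbf + mttr + mldt)

# Mission reliability under an assumed constant failure rate (exponential model).
mission_hours = 24.0
r_mission = math.exp(-mission_hours / mtbf)

print(f"Inherent availability Ai = {a_inherent:.3f}")
print(f"Operational availability Ao = {a_operational:.3f}")
print(f"P(complete {mission_hours:.0f}-hour mission without failure) = {r_mission:.3f}")
```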


133 Chapter 4 Test Planning and Execution 4-1. Overview of test planning a. The test planning process generally consists of the review of a variety of functional area requirements that may vary significantly dependent upon the type of test related activities. Tests and experiments will normally require most, if not all, of the system s functions to be exercised. Other events, such as market investigations or modeling and/or simulation activities may require performance of only a subset of the areas. The sequence is: (1) Identifying test requirements from appropriate sources. (2) Developing the design for the test. (3) Identifying test control and scenario and/or test schedules. (4) Identifying data management, training resources, instrumentation, administrative and logistical, and other appropriate requirements for the test. b. Performance of these functions generally falls into phases consisting of preliminary analysis and planning, test design, and detailed test planning procedures. The results of preliminary analysis and planning and test design are documented in a test planning document, usually an OTA TP. The results of detailed test planning procedures are documented in the executing command s test execution plan that contains the details required for day-to-day test execution, usually a DTP. c. An overview of the test planning process is described below and shown in figure 4-1. (1) The goal of any test is to collect all the data that is required for the assessment or evaluation; for tests which do not support an evaluation, the data may be collected by the test sponsor (PM, commercial customer, etc.). Thus, the starting point for test planning should be a review of the data requirements. The DSM and the factor and conditions and design of experiment required for each measure from the SEP are the foundation for test design. (2) The test design process identifies the independent, dependent, and uncontrolled variables; the treatment of the independent variables to produce the desired effect on the dependent variables to generate required test data under the appropriate conditions; and required sample size to provide desired level of confidence in test results. Comparison is an additional consideration of the overall test methodology. This may include comparison of a new system to a baseline or to specific standards, performance of an organization with the system to an organization without the system, or to obtain specific data pertaining to elements of system design or performance requirements. The condition under which the test is to be conducted also greatly impacts the test design. Simulation of operational combat conditions and tactical operations may require greater degrees of test design than for other types of test. The degree of detail of test design may vary significantly dependent on the type of test, number of independent variables, and test environment requirements. ATEC Pamphlet June

134 (3) Certain requirements for test design may be met by predetermined standing operating procedures that do not change significantly from system to system. This is especially true for certain types of DT requirements. Other test design requirements may necessitate the creation of a complex test design involving player forces, real time casualty assessment (RTCA), and considerable operational environment simulation from test source material. These requirements, individually or in combinations, occur for many OT tests. Regardless of the methodology and degree of depth required for the test design, the core test design forms the basis for all other test planning requirements. (4) Test designs are clearly and comprehensively described in the test planning document. The test design should provide the overall methodology and design for conduct of the test. Essential information should be shown in a format that most clearly shows what is to be performed and how it will be performed. Overviews of phases, expected or required sample sizes, and organization of trials in accordance with the various combinations of independent variables should be shown in tabular or graphic form that provide for best understanding of the information. Descriptions of other key information should be structured to paint the picture for the decision-maker and other readers. Clear understanding of the design is critical for all personnel and will ultimately lead to a better-executed test. d. The test planning process depicted in figure 4-1 shows a number of sources as inputs to the overall process, many other potential sources exist for specific types of events or for unique event requirements. (1) Test planners must consider all identified sources in determining overall requirements to ensure the test provides usable and creditable information for the overall purpose. The results of the planning are documented in the OTA TP and/or the DTP. (2) External test conducted outside of ATEC will normally use an OTA TP prepared by the AST as general test guidance, but will use the external agency s documentation for the test execution plan. (3) The test organization that requires user participation will prepare a TRP. The TRP identifies and schedules the required resources and provides administrative information necessary to support each test. The TRP will be submitted through the TSARC process for resource approval and required tasking actions. (AR 73-1, chapter 9 and appendix E of this document provide additional TSARC information.) 4-2 ATEC Pamphlet June 2010

Figure 4-1. Test Planning Process
[Figure: flow chart of the test planning process, repeated for each test. Elements shown include: test support packages, OMS/MP, ICD, CDD, and customer requirements; approved SEP; preliminary planning and analysis; Test Resource Plan (TRP) or other resource document; test officer develops draft OTA TP(s); leadership review and approval; refine resource requirements; complete detailed event planning requirements; finalize, staff, and obtain approval of OTA TP (many OTA TPs may be developed from one SEP); develop test execution plan; test execution.]
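The sample-size step described in paragraph 4-1c(2) can be illustrated with a simple calculation. The sketch below uses the standard normal-approximation formula for estimating a proportion to a stated margin of error at a stated confidence level; it is a generic statistical sketch rather than an ATEC-prescribed method, and the planning values (expected success rate, margin of error, and confidence level) are invented for illustration.

```python
# Illustrative sample-size sketch (normal approximation for a proportion).
from math import ceil
from statistics import NormalDist

confidence = 0.90    # desired confidence level (invented planning value)
margin = 0.10        # acceptable margin of error on the estimated proportion
p_expected = 0.80    # planning estimate of the success probability

# Two-sided critical value from the standard normal distribution.
z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)

# n >= z^2 * p(1-p) / E^2
n_required = ceil(z**2 * p_expected * (1 - p_expected) / margin**2)
print(f"z = {z:.3f}, required trials (approximate): {n_required}")
```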

136 4-2. Developmental and operational testing a. A variety of testing must be planned and accomplished, either to confirm program progress or to conform to statutory dictates. It is through testing that DOD validates the performance requirements identified during the JCIDS requirements development process and promised in the APB by the PM. Appendix K provides a list of acceptable Army test names. (1) Traditionally DT has been conducted in preparation for OT. DT determines whether a system undergoing acquisition meets performance specifications as outlined in requirements documents (e.g., ICD/CDD/CPD). Software thread testing, for example, assesses the capability of a software intensive system to perform each specific function the system is intended to perform. When DT is successful, OT can focus on operational issues. Successful DT provides a smooth transition to OT. To the extent operational conditions can be implemented during DT, the data from DT may be used to answer operational issues, and OT can be tailored to address the remaining unanswered questions. For example, if operational message loads are simulated during DT, it may be possible to minimize the amount of interoperability testing during OT. (2) OT is conducted to assess the capabilities of a system in the environment in which it is intended to operate. The environment must accurately simulate red and blue force conditions (i.e., simulate the expected threat and friendly situations). For example, if a system is intended to operate in the arctic under jamming conditions using Soldier operators and maintainers, the system must be tested in an arctic environment and under jamming conditions using Soldier operators and maintainers. If a system is intended to interoperate with other systems (i.e., exchange information) during high intensity conflict when there will be the greatest use of communication channels, the system must be tested during simulated combat with peak message loads. b. DTs carried out by DTC encompass a wide range of engineering-type, technical testing. DTs can be carried out at all stages of technology maturity and throughout the acquisition process. DT is intended to verify that a system s technology is technically capable of meeting KPPs. In simplest form, DT provides data to answer the question Does this thing work? (1) A Logistics Demonstration (LD) is conducted by the PM/MATDEV to: (a) Evaluate the disassembly and re-assembly of the system using the support equipment, and the particular tools and associated support items of equipment. (b) Evaluate the supportability of the material design. (c) Evaluate the adequacy of maintenance planning for the system, to include maintenance concept, task allocation, troubleshooting procedures, and its peculiar support equipment. (d) Evaluate the preliminary System Support Plan (SSP), including interface compatibility of any Test Maintenance, Diagnostic Equipment (TMDE) and support equipment with the material system. (e) Review the technical data. 4-4 ATEC Pamphlet June 2010

137 (f) Evaluate any embedded Built in Test (BIT)/ Built in Test Equipment (BITE) diagnostics application, TMDE procedures in the draft technical manuals. (2) The MATDEV is responsible for the overall conduct of the LD and develops an LD plan and prepares the final LD report. After an LD, a MAC is held in conjunction with MANPRINT WIPT meetings. The MAC considers all sources of program information to include test and experimental data, observations, subject matter expert insights, and other knowledge that affect the capability of users to operate, maintain, and support the system and perform mission tasks. The MAC discusses MANPRINT issues and based on their severity assigns a rating of critical, major, or minor to each issue. c. After it is known that a system meets its technical requirements, OT is used to determine the impact of the system with respect to overall mission requirements with operational units (Soldiers, squads, platoons, etc.). (1) In its simplest form, OT provides data in support of answers to these questions: (a) Will the system work in an operational environment? (b) Will the Soldiers use the system as it is intended? (c) How will Soldiers address system failures or employ workarounds? (d) All things considered, does the system represent a net improvement to operational capability? (e) Does use of the system introduce other issues or problems affecting Mission capabilities? (2) A Limited User Test (LUT) is any type of RDT&E funded user test conducted that does not address all of the ESS issues and is therefore limited in comparison to an IOT that addresses all ESS issues. A LUT addresses a limited number of operational issues. The LUT may be conducted to provide a data source for system assessments in support of the LRIP decision (MS C) and for reviews conducted before IOT. The LUT may be conducted to verify fixes to problems discovered in IOT that must be verified prior to fielding when the fixes are of such importance that verification cannot be deferred to the FOT. A LUT may also be used or required to (a) Support a NDI before materiel release. (b) Support reprocurement. (c) Supported limited procurements. (3) A LUT cannot be used to piece-meal IOT through a series of limited objective tests or to circumvent statutory requirements for IOT. ATEC Pamphlet June

138 d. The Evaluator develops evaluation plans that draw on test resources provided by DTC and OTC to produce data that will determine whether a system has sufficient performance, safety, and operational ESS characteristics relative to key performance parameters. Government witnessed contractor tests can also be leveraged for evaluation purposes Combined and integrated DT/OT a. Considerations for integrating or combining DT/OT. A well-designed system test program considers the use of combined and integrated DT/OT. The goal of combined/integrated DT/OT is to streamline testing, save resources, avoid test duplication, generate data at the earliest opportunity, increase early involvement of the OT community, and generally improve the T&E of acquisition programs. ASTs, in coordination with the T&E WIPT, must look objectively at test types and expected outputs to determine the best and most cost efficient way to generate the data required for system evaluations. (1) The requirements of the developmental or operational environment coupled with statutory and regulatory requirements will almost always require some degree of separate DT early in the program and separate OT late in the program. But combined and/or integrated DT/OT will be conducted when it is judged to be the most effective and efficient test method to support the evaluation requirements. Combined or integrated DT/OT should be scheduled where it makes sense. (2) An approach of integrating DT and OT is to conduct DT in an operational environment with the use of realistic threats, operational conditions, and scenarios consistent with the program s OMS/MP whenever possible. This increases the utility of the DT results as they are used in support of addressing the system evaluation, as well as provides early insight to operational use. Other potential benefits include testing at a single location, most efficient test strategy, reduced transport cost, early user input, and a limited operational environment to LFT&E. Measures collected under similar conditions during DT and OT can be analyzed together to increase confidence in the evaluation. (3) For integrated testing to be successful, it is important that the pedigree of the data be understood and maintained. The pedigree of the data refers to accurately documenting the configuration of the test asset and the actual test conditions under which each element of test data was obtained. b. DT/OT test definitions. ATEC has defined two methods for conducting DT/OT, combined and integrated. (1) Combined DT/OT is a single, multi-phased test that produces data to answer developmental and operational system issues, but not simultaneously. It is usually conducted as a series of distinct phases, each either DT or OT, at a single location using the same test items. Combined DT/OT is encouraged to achieve time, cost, and resource savings. However, it should not compromise DT and OT objectives. (2) Integrated DT/OT is a test with a single phase being used to simultaneously meet a subset of system developmental and operational issues that generates data to address 4-6 ATEC Pamphlet June 2010

139 developmental and operational issues simultaneously under operational conditions. The execution strategy for this test is based on the requirements of the program. Integrated DT and OT has the potential to answer both DT and OT questions efficiently in terms of the time and resources normally required, but it is also the most difficult to execute as it requires maximum coordination and cooperation among members of the test community. This is essentially a control issue. Issues regarding who controls the testing are more easily resolved when testing is combined and conducted in distinct DT and OT phases. When collection of DT and OT measures are integrated, there must be voluntary cooperation where all parties stand to benefit. The developmental test team, operational test team, and evaluation team must develop a test management structure to share control of the event. The lead agency for the integrated DT/OT is determined through the SEP approval process. c. Considerations for combining or integrating testing. Combining or integrating DT and OT events are most feasible prior to an LRIP decision. Testing in support of full rate production of major defense acquisition programs (MDAPs) (ACAT I and II or DOT&E oversight systems) is subject to constraints imposed by Title 10. First, the contractors affiliated with the companies involved in the development or modernization of the system under test is not allowed to participate in the system testing. The limitation does not apply to contractors that will be involved in the operation, maintenance, or support of the system when it is fielded. Second, production representative systems are required. Third, Soldier operators must be used. It may be as simple as using Soldier operators in an Early User and DT format. A post-ms C test might be a production qualification test (PQT) combined or integrated with a LUT. A post-full rate production decision test may be a production verification test (PVT) with a follow-on operational test (FOT). (1) ATEC supports using combined or integrated DT/OT where the test issues allow. In early phases of a program, the test activity will be mainly focused on technical and performance evaluation to establish technical validity, resolve design problems, and support development of a mature, production representative design. At this stage, much of the test activity may not directly address operational considerations. However, the goal of test integration at this stage should be to assure that operational considerations are considered in the test design and in resolution of technical problems and corresponding design changes. (2) Integrated DT/OT following MS C must be carefully considered before inclusion into the test strategy. OTs during these acquisition phases leverages considerable resources. Including DT with these tests could potentially add significant risk to conduct of the OT (system readiness, user safety risks, user personnel training). Additionally, since OT is normally conducted at the operator s home station, there must be consideration of whether DT can be properly executed at that location. (3) Combined DT/OT can generally be conducted within all phases of the acquisition program cycle. Key considerations include the location for the DT/OT, the availability of a Safety Release for user personnel operating the system in the OT phase, use of system contractors, and confidence that the system can transition to the OT phase following the DT phase. ATEC Pamphlet June

140 (4) When the AST is developing the test strategy, consideration should be given to the following options for combined and/or integrated DT/OT: (a) Collecting OT data during DT. (b) Collecting DT data during OT. (c) Conducting DT under expected operational conditions (from the OMS/MP). (d) Using military operators during DT, to provide early operational insight. (e) Using the same military test participants. This will provide OT Soldiers more experience on the test systems, ensuring that the Soldiers are more representative of those who would use the mature, fielded system. It will also provide early user influence in the design allowing the hardware to mature sooner. (f) Using M&S in conjunction with live testing to accomplish some degree of integration. (g) Using the same data collectors for DT and OT. This will ensure the data disseminated in the Test Incident Reports (TIRs) are consistent, making it easier for the evaluator to understand and use the data. (h) Using the same instrumentation. This will eliminate redundant development and ensure instrumentation developed will meet both the DT and OT communities requirements. (i) Using common questionnaires and data forms to facilitate data handling and summarization by the evaluators. d. Test management methods. (1) For a combined DT/OT, ATEC requires that an operational test officer from OTC be in charge of the OT phase of the test. It may be appropriate to use the OTC test officer as the assistant test officer to the DTC test officer for the DT phase and reverse roles during the OT phase. (2) Management of the integrated DT/OT will be determined through recommendation of the AST and confirmed by the SEP approval process. Normally, integrated DT/OT that is predominantly collecting technical data will be managed by DTC. All other integrated DT/OT should be considered for management by OTC to ensure that the OT requirements for operational environment and any statutory or regulatory requirements are met. e. System contractor involvement constraints. Application of the contractor involvement limitations can usually be made without undue difficulty in the separate phases of any combined DT/OT. Clear understanding of actions considered permissible in both phases is needed prior to test execution. This will ensure that all concerned understand the constraints and the point at which DT ends and OT begins. 4-8 ATEC Pamphlet June 2010

141 f. Management of DT/OT data. Prior to any combined or integrated DT/OT test, data management must be carefully planned and then well-executed during the test. It is essential that all personnel from test agencies, PM offices, Capabilities Developer agencies, and the system contractor recognize and understand the processing differences in managing DT and OT data based on statute. g. Benefits. The largest benefit of combined and integrated DT/OT is the potential time and cost savings to the overall program through reduction in scope, number, and duplication of test requirements. Other benefits include collection of data earlier in the acquisition cycle and early operational user involvement. The earlier issues are discovered through testing, the easier they are to address. Overall program test requirements can be reduced by combined and integrated DT/OT. The key risk to DT/OT is to the integrity of the evaluation; the AST must work diligently to ensure that the quality of the evaluation is not jeopardized by joining DT and OT tests. h. Support to required evaluations and assessments. The end result of the combined or integrated DT/OT is information provided to support the system evaluation. When developing the DSM, the AST must challenge conventional thinking about the source of data for each measure. Can data be collected at both DT and OT tests and combined for the evaluation? Can DT be performed under operationally realistic conditions with representative operators so the data can be combined with OT data and used for the evaluation? Can DT data be collected during an OT, perhaps as a side test or during an excursion, to save test resources and inject operational environment? The goal of DT/OT is to get the best system test and evaluation strategy possible, maximizing data from all sources, within the resources available which provide complete, credible information to decision makers and Warfighter as early as possible. i. Common language. When different test agencies participate in the same test event and exchange data, a common language is required. A common language is required at two levels. First, it includes terms used to talk about T&E, such as, issue, mission, and measure. Second, it includes the language used for evaluating a specific system, such as, detection, slant range, and slant angle. A common language requires standard data definitions and formats. A common language will enforce the specific definition of variables and conditions, and interpretation of results. j. Instrumentation. ASTs will use the same instrumentation technology resources for both DT and OT events, but not at the expense of compromising data requirements. ASTs will include SMEs from DTC and OTC when planning DT and OT events. For any conflicts that the AST cannot resolve, the ATEC, Test Technology Directorate-Modeling, Simulation, and Instrumentation Division will also serve as a resource. Instrumentation requirements will be specifically addressed during the preliminary ESR, ESR, and the CIPR and documented in the SEP. k. Test design. Integrated DT/OT also presents the greatest challenge to design. The AST must consider how to design an event to produce valid data for both DT and OT. In particular, the AST must consider how best to assign factors and conditions and how the events should be structured. ATEC Pamphlet June
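Paragraph i above notes that a common language requires standard data definitions and formats. As an illustration only, the sketch below shows one way a shared data definition might be captured and exercised with a dummy record; the table name, data elements, units, and values are invented and do not represent any ATEC database schema. The column comments stand in for data element dictionary entries (name, meaning, and units).

```python
# Illustrative sketch of a shared data definition exercised with dummy data.
import sqlite3

conn = sqlite3.connect(":memory:")   # stand-in for a shared test database

conn.execute("""
    CREATE TABLE detection_event (          -- hypothetical table name
        trial_id        INTEGER,            -- links the record to a test trial
        target_type     TEXT,               -- condition of an independent variable
        slant_range_m   REAL,               -- slant range at detection, meters
        detect_time_s   REAL,               -- time from exposure to detection, seconds
        detected        INTEGER             -- 1 = detected, 0 = not detected
    )
""")

# Dummy record entered early so analysis tools and data displays can be
# prototyped before live test data arrive (values are invented).
conn.execute(
    "INSERT INTO detection_event VALUES (?, ?, ?, ?, ?)",
    (1, "wheeled vehicle", 2450.0, 37.5, 1),
)

for row in conn.execute("SELECT * FROM detection_event"):
    print(row)
conn.close()
```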

142 l. Factors and conditions. Probably the most readily thought of requirement for integrating DT and OT is to enhance the operational realism of DT. This is usually done by giving consideration to realistic threats and observing operational conditions and scenarios specified in the OMS/MP. (1) Sometimes it is relatively easy to impart a realistic operational environment to a test. One example would be the varying of message load when testing interoperability between two communication systems. It can also be difficult. If testing is conducted in accordance with the OMS/MP, for example, Soldier operators are required. This requires a safety release that may not be available during DT when Soldiers are not involved. Further, operating IAW the OMS/MP implies continuous operations which may not be supportable. (2) There will always be a question as to whether the conditions under which testing occurs are sufficiently operational to permit the use of data generated at a DT event to answer operational issues. This is a judgment call made by the AST. If it is impossible to implement operational conditions during a DT event, a combined event with distinct DT and OT phases can be implemented. Conducting DT with deference to operational realism will increase the validity of the DT results as they are used in support of answering operational ESS questions. m. Treatment structure. The goal of ATEC testing is not to predict how a system under test performs but observes and record how the system actually performs. The use of a design of experiment or design of experiments (DOE) must be used. The DOE is a method and tool on how to define a test for a specific system; under conditions the system must operate according to system designed specifications and the OMS/MP for that system. For integrated testing, a design of experiment that accommodates the needs of both the developmental and operational tester must be used. In contrast, Combined DT/OT can employ separate design of experiments for the different DT and OT phases Contractor DT a. The AST may use DT conducted by a contractor whether it is conducted at the contractor facilities or at a government facility as a data source for the evaluation. The AST will determine those contractor tests that are primary and secondary data sources for the evaluation and document those tests in the SEP and DSM as requiring government monitoring. Materiel developers will be encouraged to develop test plans, allowing a minimum of 30 days for AST review prior to the start of test. The DTC AST members will lead review of contractor test plans to determine adequacy of the testing and proposed test data collection methodologies in support of the SEP data requirements. b. The contractor tests that will provide data for the evaluation will be included in the test strategy as well as the DTC strategy for monitoring, and briefed at the T&E CIPR. Costs for support of contractor test plan review, test monitoring, test support, and review of test reports will be included during the development of the program s test cost estimate. Where feasible, contractor tests will be performed at, or be supported by, DTC s test facilities to leverage cost benefits of the DOD MRTFB ATEC Pamphlet June 2010

143 c. Monitoring of contractor testing is intended to verify that testing and data collection is being executed in accordance with approved test plans and test standards. Deviations from the approved contractor test plan will be brought to the attention of the PM by the AST. d. Any AST member may be called on to witness a test. The AST member witnessing the test is responsible for acquiring contractor data and providing it for the evaluation Use of Government test facilities a. When developing the test strategy to best support the evaluation data needs, the AST shall consider the broad spectrum of DOD test facility assets, to maximize the value of the Army s capital investment. DOD has made a significant investment in government ranges and other test facilities, and maximum advantage should be made of these resources. b. If tests which support an evaluation use other than government facilities, the rationale shall be documented in the TEMP per AR If a government facility cannot conduct a required DT or if it would not provide a cost benefit, the decision and rationale to use contractor support will be documented in the TEMP. Assessment of cost benefit must be based upon a documented analysis that includes range capability, availability, cost and the value major DOD ranges and ranges in the production industrial base provide to the Army. c. DODD defines national assets as the MRTFB that exist primarily to provide T&E capabilities for DOD customers. Detailed guidance for the use of MRTFB by other than DOD customers, including commercial entities, other U.S. government agencies, defense contractors, and foreign entities/governments is provided in DTC Pamphlet Safety Releases and Safety Confirmations a. One of the most important aspects of DT is verification of the elimination or control of safety health hazards. HQ DTC test managers and system safety personnel assist in determining the system safety testing and human health data requirements and ensure that pertinent specification analysis and criteria are identified and reflected in test documents. Two important ATEC/DTC documents result from this effort, including the Safety Release and the Safety Confirmation. (1) Safety Release. A Safety Release is a formal document issued by HQ DTC prior to all tests, training, demonstrations, and experimentation before any hands-on use or maintenance by user troops. (a) Safety Releases are issued for specific tests and under specific conditions. T&E policy requires the Safety Release to be issued prior to each test indicating the conditions under which the system is safe for use and maintenance by Soldiers and describing the specific hazards of the system. (b) Primary addressees for Safety Releases are the test organization, the PM, and the Commander of the Soldiers participating in the test. ATEC Pamphlet June

144 (2) Safety Confirmation. A Safety Confirmation is a formal document issued by HQ DTC in support of applicable milestones, the FRP decision, type classification, materiel release decisions (i.e., full, conditional, training, and urgent), fielding, post-system fielding system changes (modifications, upgrades, etc.) and rapid acquisition items fielding. A Safety Confirmation may also be required to support an OAR that has been requested by the PM but not tied to a particular milestone or acquisition decision. (a) The Safety Confirmation indicates if the specified safety requirements have been met, includes a risk assessment for hazards not adequately controlled, lists any technical or operational limitations, and highlights any safety problems requiring further testing. (b) Safety Confirmations are issued to primary addressees, as follows: 1. For acquisition systems or evaluated test projects, the Safety Confirmation is addressed to the AEC and the program office or commander who plans to acquire/buy the assessed materiel for the Army. The Safety Confirmation is attached to the AEC OER or the OAR (as required). 2. For non-acquisition systems in support of RAI items, the Safety Confirmation is addressed to the program office (sponsor) or commander who plans to acquire/buy the materiel for the Army. 3. For post-fielding system changes (modifications, upgrades, etc.), when an evaluation or assessment is required by AEC, the Safety Confirmation amendment is provided to AEC if AEC plans to prepare an OER or OAR. If AEC does not plan one of these documents, yet the item still warrants an amended Safety Confirmation, the Safety Confirmation will be provided to the system Product Manager. b. For items/systems in support of RAI, which may include UMR, REF, IEDs, and Rapid Fielding Initiative (RFI), a Safety Confirmation will be issued in accordance with the guidelines provided in paragraph 2-8. The Safety Confirmation for RAI is distributed by DTC after approval from HQ ATEC. c. Additional information on ATEC/DTC system safety documentation is provided in appendix K. Detailed guidance on Safety Releases and Safety Confirmations is provided in AR (with implementation guidance in DA Pamphlet ) and DTC Pamphlet OTA Test Plan(s) a. Once the AST has determined all the tests required to support the evaluation, the test plan should reflect a clear understanding of what questions will be answered by each test. It then lists the data required, including a specific list of all data elements to be collected in each test. The plan describes the test conditions to be used, how variables will be controlled, and to what tolerances for each group of data elements. It also details statistical procedures to be used to summarize or analyze the data (if any) and how the data will be presented. All analyses should be very clear as to whether a comparison to a standard item is required and how the analysis will be accomplished ATEC Pamphlet June 2010
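Where paragraph a above calls for a comparison to a standard, one generic approach is an exact binomial demonstration against a fixed criterion. The sketch below is illustrative only: the criterion, trial counts, and confidence level are invented, and the exact binomial test is a general statistical technique rather than a procedure mandated by this pamphlet (the proportion_ci call assumes SciPy 1.7 or later).

```python
# Illustrative comparison of an observed success rate against a fixed criterion.
from scipy import stats

successes = 46      # invented: successful detections observed in test
trials = 52         # invented: total detection opportunities
criterion = 0.80    # invented: required probability of detection

# Exact one-sided binomial test: is the true rate credibly above the criterion?
result = stats.binomtest(successes, trials, p=criterion, alternative="greater")
print(f"observed rate = {successes / trials:.3f}, p-value = {result.pvalue:.3f}")

# A lower confidence bound presents the same information in interval form.
lower_bound = result.proportion_ci(confidence_level=0.90).low
print(f"90% lower confidence bound on detection probability: {lower_bound:.3f}")
```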

145 b. There are three main types of OTA TPs: DT, LFT, and OT. These may vary from a onepage OTA TP for DT to a more comprehensive document several hundred pages in length for an LFT or IOT. (1) The requirement for OTA TPs for DT tests will be determined by the DTC AST member. They are usually required only when the SEP or established TOP does not provide the level of detail needed to develop the DT s DTP. The lead for development of OTA TPs for DT tests is DTC. (2) OTA TPs are prepared for all LFTs. The lead for development of the OTA TPs for LFT is AEC. OTC. (3) OTA TPs are prepared for all OT. The lead for development of these documents is (4) For multi-service test programs where the Army is the lead, OTA TPs will be developed. The other services involved will provide their input following the Army s OTA TP format. For multi-service test programs where the Army is not the lead, the AST member most closely associated with the test will prepare the Army Input to the Evaluation Plan (AIEP) following the format of the lead service. (5) For other tests or activities where data will be collected to support the evaluation (e.g., stand-alone modeling runs, training exercises), the AST will work with the conductor (test officer/test director/agency director) of the test to determine what documentation, if any, is required. c. Briefings. For OTs of major programs, the AST must brief DOT&E no later than 120 days prior to the first day of the test on the draft test plan (T-120). Prior to the meeting, the briefing must be coordinated among the AST, SCA leadership, and the ATEC Command Group. d. The AST will transmit the approved OTA Test Plan through the T&E Executive to DOT&E. The normal ATEC/SCA signature approval authority for an OTA-TP is shown in table 4-1. ATEC Pamphlet June

Table 4-1. OTA TP Approval Authority
OTA TP Type: Approval Authority
All Operational Test: OTC Commander (notes 1, 2)
All Developmental Test: DTC Commander/Executive Director, or designee (notes 1, 2)
All Live Fire: ATEC Commander/Executive Director or designee (note 3)
All Integrated/Combined DT/OT: ATEC Commander/Executive Director or designee (note 3)
All Multi-Service T&E: ATEC Commander/Executive Director or designee (note 3)
Any other test event conducted by a non-ATEC organization: AST chair's SCA Commander/Director (note 4)
Notes:
1. After getting signed coordination from AEC. AEC will have 10 work days to review and respond to the test organization.
2. The test organization will prepare a transmittal memorandum for the OTA-TP from the test organization through ATEC through TEO to DOT&E, as required.
3. The AST Chair will get signed coordination from all 3 SCAs and prepare the transmittal memorandum. The SGS will make distribution.
4. The AST Chair provides the approved OTA-TP to the external ATEC agency as guidance for test conduct.
4-8. DT test planning
a. DT tests may include subsystem, system, or system-of-systems tests of hardware, software, or simulations throughout the acquisition process. Test planning for DTs used to support evaluation decisions begins with an understanding of the evaluator's needs and requirements. This is accomplished by a coordinated effort between the DTC HQ test manager supporting the AST and the evaluator. Likewise, the detailed test/simulation test execution planning is a coordinated effort between the DTC HQ test manager, the DTC HQ functional support staff (e.g., safety, environmental), and the test officer at the test center who will be in charge at the test site. This team ensures the requirements of the SEP and/or OTA TP are addressed and documented in the DT DTP.
b. As soon as the test/simulation test is known, DTC establishes an effort in the ADSS to track progress. Initial planning will be based on the scope of work submitted by the requestor. It should contain sufficient detail to develop the cost and schedule. At the test center, tasks are defined through the lowest level with resource assignment and eventual development of time/cost estimates documented in ADSS.
c. OTA TP. As discussed above, a determination will be made by the DTC AST member in coordination with all other AST members as to whether an OTA TP is required for a specific DT event based on the level of detail available in the SEP. If required, a DT OTA TP is developed by the DT member of the AST based on information provided by, and in coordination with, the system evaluator. It is a single document or memorandum that addresses one or more DT events. The DT OTA TP is coordinated within the AST to ensure all system evaluation issues are addressed as necessary, and is co-signed by Commander, DTC and Director, AEC. The DT OTA TP should include the following:
(1) Description of tests to be conducted.

147 (2) Description of the conditions under which the system will be tested (e.g., environmental conditions, component level versus system level, threat). (3) Test criteria and test methodology. (4) Data requirements including sample size; number of shots, miles, trials, etc. (5) Data presentations. (a) Statistical methods, e.g., confidence levels, risk. (b) Structure of the database. (c) Potential data displays (e.g., tables, charts, graphs). (6) Description of how testing relates to how item will be used in the IOT (i.e., how the DT results can be used to address readiness for IOT). (7) Modeling and simulation efforts. (8) Detailed test schedule. (9) Instrumentation to be utilized. d. DT DTP. The DT DTP is the major planning document for the conduct of the DT. It documents explicit instructions for the conduct of tests and subtests. The DT DTP is prepared by the test activity in accordance with the directions provided in the TEMP, SEP, and the OTA TP along with any instructions provided by the HQ, DTC test manager. Development of the DTP determines the best plan for testing the system. TOPs and ITOPs are used as appropriate. (1) The draft DTP is initially provided to the AST and the T&E WIPT for review and comment. After all internal ATEC coordination, the DTP is finalized and approved by ED, DTC or his designee (usually the test division chief). If required, the approved DTP and approved safety release will be provided to the Human Research Protection program (HRPP) Coordinator and Human protections Administrator (HPA). AR addresses conditions that require a HRPP review. (2) For LF testing, the DT DTP may be requested by DOT&E to answer questions not answered by the OTA TP. The AST must ensure that the document is carefully marked as to its approval status prior to delivery to DOT&E (e.g., DRAFT ). (3) No major deviations from the approved test plan can be made without the written approval of HQ, DTC. Minor deviations will be reflected in the narrative portion of the test report. e. Since DTs are designed to provide safety data as early in the test as possible, the test officer needs safety data and other engineering inputs to provide a safety release recommendation to support preparation of a safety release. See appendix O for additional ATEC Pamphlet June

148 information on safety. Specific safety tests are performed on critical devices or components to determine the nature and extent of hazards presented by the materiel. Special attention is directed toward the DT program to (1) Verify the adequacy of safety and warning devices and other measures employed to control hazards. (2) Analyze the adequacy of hazard warning labels on equipment and warnings, precautions, and control procedures in equipment publications. (3) Identify operational limitations to control hazards (e.g., speed limitations) f. To ensure the availability of the skills, instrumentation, facilities, and range time, all capabilities are closely managed. As a test becomes imminent, test priorities must be taken into account. Considerations for the prioritization of tests include overall Army priority, program schedule, test sequence, availability of special resources, weather windows, etc. g. Test schedules are established based on program requirements; consequently, no two test programs are the same OT event planning a. OTs can generally be defined as those operational test and experimentation test conducted to support Army acquisition program evaluation requirements and other events (whether test, experimentation, or exploratory) that are conducted in simulated operational or combat environments with typical user troops and, as appropriate, representative materiel. The key difference of an OT from other types of events is the employment of typical users to operate the system in the environment under which the system is expected to be employed after fielding. Types of operational tests and experiments are defined in appendix K of this document for various acquisition program requirements as well as for as-required testing that can be used for both acquisition and non-acquisition related requirements. b. The AST will provide input for or participate directly in the planning for OTs for acquisition programs. However, the majority of the planning requirements for these events and for non-acquisition related operational events are conducted by the assigned OTC test directorate. c. OT planning requires actions in many areas due to the nature of simulating an operational environment and conditions. Some of these actions will address how to simulate the expected operational environment; integration of the system within a user organizational structure; and integration of new tactics, techniques, and procedures (TTPs) for operation of the system. Other actions may require planning for training of typical users to operate the system and logistical support of the system during the event. Data generation and collection requirements may require identification of new or modified instrumentation for simulation or stimulation of the tested or supporting system(s) as well as event scenario and control requirements. These and many other actions are necessary to ensure proper event execution that provides creditable and usable data to address the evaluation or other customer requirements. A number of these considerations are discussed in the following paragraphs ATEC Pamphlet June 2010

149 d. The Event Summary and Overall Methodology is developed to provide the upper level logic behind the event and how the event will be structured and controlled for generation and collection of data. It identifies the overall design for employment of the system under test and sets the basic parameters for all subsequent planning. e. An OT OTA TP (see appendix F for format) is prepared to document planning actions for an event or combination of events identified in the SEP. The OTA TP documents the test design, supporting methodology, and analytic details required for the specific event. For OT, the OTA TP includes the level 3 database design information including both the data element dictionary and database design. This database information lays the foundation for the evaluation and must be jointly developed by the tester and evaluator to ensure success. Finalizing and building the level 3 database, and entering dummy data, early allows the evaluation team to develop analysis tools and identify data displays that will be used for evaluation reporting. For integrated DT/OTs, the AST will determine if separate OTA TPs or a combined one will be prepared. f. The basis for test design is provided in the SEP, where the evaluator describes the design of experiments, intended analysis of the measures, factors, conditions, and sample sizes. The development of the design of experiment must take into account the OMS/MP and have the concurrence of all members of the AST. The actual test design must incorporate the design of experiment and provide for executing each set of conditions the required number of times. (1) Event variables (factors and conditions). There are four types of event variables (often referred to as event factors) - independent, dependent, error control, and uncontrolled. During events all these variables assume discrete values (or conditions). The independent variables are the variables which are controlled in order to assess their impact on the dependent variables. The dependent variables are the measures of interest. There are three types of error control variables. They are blocking variables, variables that are held constant and randomized variables. These variables are controlled to preclude biasing test results. Uncontrolled, or extraneous, variables are those variables that are not included in the design of experiment but none the less may impact the dependent variable and must be documented. Uncontrolled variables often include tactically varied variables. Tactically varied variables are included in the test event because they enhance event realism. Their conditions develop as a result of tactical operations employed in the event. One of the primary considerations in designing an event is to document the effects caused by extraneous variables. (2) Test controls. Factors are controlled in one of three ways: Systematically varied, held constant, and randomized. Independent variables (for example, target type) and blocking variables (for example, day) are systematically varied by the tester. Variables that are held constant have the same value (for example, MOPP 0) throughout the test. Variables that are randomized (for example, skill level) are presented to the system under test in accordance with a randomization scheme (for example, a random numbers table). Tactically varied variables are normally uncontrolled. (a) Systematically varied factors are used to permit examination of all required factors in sufficient quantity for effective analysis. 
The AST establishes the values that the systematically varied factors will take on during the event and combines them in a design of experiment to create specific operational situations under which data must be collected. When

150 critical factors for a system are identified, the most representative conditions for that factor are developed into the test matrices with the number of conditions held to a minimum. (b) Factors are held constant for the test when prior knowledge or testing indicates a preference, or no other option for that factor is available. (c) Uncontrolled factors should be held to a minimum and the tester will capture the conditions the uncontrolled factors assume. (3) Combining conditions. Sets of possible test conditions are identified by using a test matrix to create all possible combinations of the factors and conditions selected to be systematically varied. For example, a hypothetical system s target detection capability could be influenced by three training level conditions (untrained, average, and highly proficient), three weather conditions (clear, overcast, and precipitation), and two terrain conditions (flat and mountainous). This situation would require consideration of 18 possible test combinations (3 x 3 x 2 = 18). The radio communications capability of the hypothetical system could require consideration of training and terrain conditions (3 x 2 = 6 combinations) because weather conditions have little effect. (4) Number of required trials. Determining the event duration and sample size required for collection requires knowledge of the system or concept under test, data requirements, environment to be simulated, player force structure, and mission requirements. Event duration and sample size must be based on the minimum amount of testing required to provide data to support customer requirements to reach definitive conclusions concerning the system or concept under test. The number of trials for a phase is normally dictated by the amount of risk an AST is willing to take in estimating statistical parameters and drawing conclusions about system performance. Sample sizes may be calculated using standard statistical software packages (for example, SAS and SAS JMP. In some instances the number of required iterations of a set of test conditions may be inconsistent with the operational environment being simulated. Sample sizes may be adjusted by the AST in order to obtain operational realism. Essentially, this process determines how many repetitions are required to provide statistical confidence that the event results are valid and representative of the test. If necessary, the AST should document any event limitations resulting from inadequate sample sizes in paragraph 1.5 ( Event Limitations ) of the OTA TP. A test matrix, also called a trial matrix, is developed for each phase or set of requirements to show the number of iterations necessary to achieve the desired level of statistical rigor for each phase. g. Event planning must always consider the requirement to balance event realism and event control. Test designs that do not include the capability for possible degradation of system performance due to realistic conditions of employment fail to address a critical decision area and can seriously reduce the value of the test results. Test event realism comes from scripting the events to follow the OMS/MP and the approved doctrine, tactics, techniques, and procedures. Event realism is enhanced when the players, friendly and threat, are allowed to respond to the natural battlefield conditions. However, in order to answer the issues and criteria, the event executor must be able to collect the data; this requires that a certain amount of control be maintained during the event trials. 
The conditions for test environments will normally fall into one of three categories of operational realism: maximum, limited, and minimal.

151 (1) Maximum. This type of event requires simulation of a tactical environment. A scenario is developed which merges the event trials and activities into a realistic and believable sequence. The scenario describes the actions of all player and opposing forces (OPFOR) units and includes all information that will be presented to the players. This type of realism is maintained by including initial and updated briefings for friendly and threat force players through operations orders, fragmentary orders, intelligence summaries, messages, and other information designed to evoke player response. Scenarios are based on standard TRADOC scenarios or other scenarios as specified. The particular scenario to use is agreed upon by the AST and the system proponent. In preparing the scenario it is essential to specify the time and location of each planned trial or activity. Once trials begin, there is limited intervention by controllers. (2) Limited. When events do not require maximum operational realism, the preparation of a scenario may be unnecessary. It is, however, necessary to develop a detailed description of the events that will occur. The description should be sufficiently detailed so that the trials or activities can be executed without additional information. For each, the method, time, location, participants, and information to be provided must be specified. Mission event or execution lists may be used to ensure that the required amount of realism is maintained and that the required data is being collected. (3) Minimal. This type of realism may be appropriate for customer tests or user evaluations of a system or concept. Although little realism is simulated in this type of event, there is need for the event executor to maintain close supervision through frequent checks to ensure that the user is properly employing the item or concept. For these tests, this section describes the frequency of checks and inspections and the areas to be checked. h. Event control procedures must be developed to ensure that the event can be properly organized and executed to generate the required event data. Control procedures vary as to the type and need. For events which have limited or maximum tactical realism, detailed control procedures are normally required to ensure that specific tactical operations occur, both friendly and OPFOR units begin and generally conduct operations as required, and instrumentation and simulation or stimulation devices are operating as required. Other control procedures may address placement and recovery of data collection personnel, visitor access, logistical support requirements, and other similar items. Regardless of the type of event, necessary control procedures must be identified by event planners and implemented during event execution to ensure that the execution proceeds in accordance with the test design requirements. A control plan is usually developed to identify the specific control measures required and to identify those personnel and situations in which a specific measure must be implemented. This plan normally is included in the event planning documentation. i. The collection of event data through the use of automated instrumentation systems is a key factor in the majority of events. 
In addition, instrumentation systems that use M&S are often employed to provide realistic simulation of combat environments (weapons simulators; Chemical, Biological, Radiological, and Nuclear (CBRN) stimulants; C4I stimulators) and to generate data for systems to use in lieu of having actual forces in the field (combat simulations and stimulation).

152 (1) Instrumentation. Instrumentation planning is conducted to identify those instrumentation systems that are required to collect data to address the event issues and/or to provide the necessary degree of combat environment realism or generation of cue and/or task loading information. Every effort must be made to utilize the same instrumentation technology resources for both DT and OT events. (2) Modeling and simulation and stimulation. The use of models and simulations is highly recommended and emphasized in operational events. Employment can be used for reducing costs, providing or enhancing test design, predicting results for comparison with field results, providing simulation or stimulation of systems and organizations that cannot be actually present, and assessing areas that cannot be fully tested. However, there are two restrictions on use: (1) modeling and simulation data cannot be the sole source for production decisions in lieu of operational testing; and (2) all models and simulations must undergo accreditation prior to use. Simulators, emulators, drivers, and stimulators that are used to fully workload systems under test are included in this category. Threat simulators are a separate category but must also be approved and certified for use. The OTC, Transformation Technology Directorate can assist in identifying existing models or simulations to meet event requirements, developing requirements for models or simulations, and providing guidance in meeting VV&A requirements. For additional information see ATEC Regulation j. The analytic approach is the methodology by which the event data will be collected and processed to address the event requirements. (1) Tailored requirements. Under normal circumstance, an AOTR that provides required information obtained from an operational test, experiment, or other event is produced. This would be done for all OTC-conducted tests whether they are for oversight, non-oversight, or for non-acquisition program events. Only in the case where the customer or AST Chair has requested a full OTR does the tester have the responsibility for aggregating data at the measure level. The customer or AST can ask OTC to provide data aggregation at the criteria or issue level, but that is generally not the case. No matter which report is prepared, the tester describes the methods used to calculate and aggregate the data. The methodology for aggregation is developed based upon the guidance in the SEP and the overall product that is required for the customer, and is tailored for each event as appropriate. (2) Major areas of discussion. The major areas of discussion for the methodology will center on the collection of data for the specific measures assigned to the event in the SEP s DSM. The tester must be able to explain the relevance of the data requirements with respect to the measures and develop the appropriate data collection, reduction, and aggregation methods. Measures must be clearly defined including unique terms, factors and conditions, and data elements required. (3) Data collection, reduction, and aggregation. The data collection and reduction procedures required to address a measure are a function of the degree of precision established for a given measure. Some measures will require input from several sources in order to provide the AST with the data to answer the measure. Data from instrumentation, data bus recordings, and manually collected data may need to be combined before the measure can be answered. 
In other cases, the measure may be answered by a single source of data, for example a questionnaire

153 provided to the test players. The objective of the data collection, reduction, and aggregation paragraphs under each measure in the OTA TP is to provide a clear explanation of how the data will be collected, reduced (entered into the level 3 database), and aggregated at the conclusion of the event. The OTA TP will reference the level of precision defined in the SEP for the measure. k. Data management planning must address all aspects of requirements for the organization and procedures for data collection and reduction efforts, the critical data process descriptions, the Data Authentication Group (DAG) requirements, if any, and the event database. Data management is discussed later in this chapter. l. Most OTs require employment of a DAG to review collected data and data summaries, identify data anomalies, and authenticate the event database for further use for evaluation or other purposes. DAG requirements are further discussed in appendix M. m. The test database requirements are identified by the AST based on the test measures, criteria, and issues. Both testers and evaluators need to work together to design the database and confirm correct database design before actual testing begins. When the SEP is prepared, AEC will draft a version of the level 3 database which they will use to create the evaluation. The test members of the AST comment on this database design, and the AST together determines the final level 3 database design. The test officer/test analyst in conjunction with the AST also identifies the format (including version) for the files (ASCII, SAS, ACCESS, Excel). To prelude problems during data transfer, the method for transferring each type of file to the AST must be identified and coordinated during the planning phase.. n. A safety release is written by DTC to document operational limitations and precautions for all tests using user troops. Requests for a safety release are made to the DTC AST member. Operational testing, including pretest system training, will not begin until the tester, the trainer, and the Commander of the test Soldiers have received a copy of the approved safety release. o. Human Research Protection Program (HRPP) approval must be received from the U.S. Army Medical Research and Materiel Command Human Subjects Research Review Board (MMRMC HSRRB) prior to conduct of any hands-on training or testing involving user troops. p. As appropriate for the level of planning, an OT DTP containing the necessary details for day-to-day execution of the event will be prepared. The OT OTA TP along with the OT DTP will document planning for assigned events in accordance with OTC policies. A format is not mandated for the OT DTP and no formal staffing or approval is required outside OTC unless requested. q. The Pattern of Analysis (PoA) is a major element in test planning. The PoA is developed in conjunction with both the tester and the evaluator. It provides the transition between the measures contained in the approved SEP to the identification of the actual data elements required for calculation of the measures. The PoA is required for all OT events and becomes an appendix to the OT OTA TP for the event. It depicts the relationship of COI into measures and related specific test and data elements. The PoA can be displayed in narrative terms or graphically. It is normally prepared by the operational test analyst in conjunction with the system evaluator. ATEC Pamphlet June

154 (1) The initial portion of the PoA is developed by the system evaluator as a function of the development of the SEP. Using the approved strategy, the system evaluator develops the initial dendritic portion of the PoA to organize ESS, Issues, and measures. (2) The test analyst then develops simple, measurable data elements required for calculation. A quality PoA is used by the test analyst/test officer to assist in the planning and development of requirements for test scenario or other scheduling plan and the data collection and management plan Test Support Packages The AST must have timely and complete applicable portions of the support packages to properly plan and design events. The Materiel, Combat, and Training Developers normally provide the AST test organization member the TSPs in support of specific test planning and execution requirements. Unless an appropriate waiver is granted, testing in support of a system acquisition is not conducted when support packages are insufficient to support the test objectives. (See DA Pamphlet 73-1 for additional information.) If an issue arises between the AST and the responsible agency over the requirement for, composition, or delivery of a TSP or waiver of a TSP that in the opinion of the AST is needed in order to fulfill the requirements as stated in the TEMP, the AST should first undertake to resolve such differences within the T&E WIPT. If the difference cannot be resolved within the T&E WIPT, the issue along with the opposing viewpoint should be elevated through the chain of command. a. Delivery Requirements. Modifications to delivery dates as set forth in DA Pamphlet 73-1 or established by the T&E WIPT are permissible to allow the phased provision of selected sections of a TSP. In such cases, an agreement on a delivery date(s) and the content of each phased-in portion of the packages will be coordinated between the providing activity and the AST. Default due dates and responsible agencies for TSPs are shown below in table 4-1. Test Support Package SSP NET D&O Threat Training Table 4-2. Test Support Package Requirements Responsible agent or agency PEO/PM PEO/PM Capabilities Developer Capabilities Developer Training Developer Reference Guide Draft TSP due Final TSP due AR DA Pamphlet 73-1 AR DA Pamphlet 73-1 T-270 T-60 AC AC DA Pamphlet 73-1 AC T-270 AR DA Pamphlet 73-1 AC T-270 DA Pamphlet 73-1 T-270 T-60 Note: AC=As coordinated between test agency and responsible agent or agency ATEC Pamphlet June 2010
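The due dates in the table above follow the T-minus convention: T-270 means 270 days before the scheduled test start date (T-date), T-60 means 60 days before, and AC means the date is coordinated between the test agency and the responsible agent or agency. As a minimal illustration of how such offsets translate into calendar suspense dates, the Python sketch below uses a hypothetical T-date and an assumed package-to-offset mapping; neither is taken from the table or from any program schedule.

from datetime import date, timedelta

# Hypothetical test start date (T-date); not taken from any program schedule.
T_DATE = date(2011, 3, 15)

# Assumed package-to-offset mapping for illustration only. The integer is the
# number of days before the T-date; None stands for "AC" (as coordinated
# between the test agency and the responsible agent or agency).
TSP_DUE_OFFSETS = {
    "Example TSP (draft)": 270,         # T-270
    "Example TSP (final)": 60,          # T-60
    "Example TSP (coordinated)": None,  # AC
}

def suspense_date(t_date, offset_days):
    """Calendar due date for a T-minus offset; None means 'as coordinated'."""
    if offset_days is None:
        return None
    return t_date - timedelta(days=offset_days)

for package, offset in TSP_DUE_OFFSETS.items():
    due = suspense_date(T_DATE, offset)
    print(f"{package:<28} {due.isoformat() if due else 'as coordinated'}")

The sketch only mechanizes the offset arithmetic; actual suspense dates remain whatever the T&E WIPT and the providing activity coordinate.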

155 b. System Support Package (SSP). The SSP is a composite of the support elements required to keep a fielded materiel system in an operationally ready condition. It identifies and provides the system support hardware and required supply support elements. It is developed by the Materiel Developer in coordination with the Capabilities Developer, Training Developer, logistician, test organization, and system evaluator. The SSP is evaluated as part of the logistics demonstration and is certified as appropriate during OT. The SSP Component Listing (SSPCL), which informs the tester of what the SSP will consist, is normally due at OTRR 1 with the SSP itself being due at OTRR 2. c. New Equipment Training Test Support Package (NET TSP). The NET TSP, provided by the PM, provides an equipment-specific training program for unit/user instructors to use in training test participants and troops for a specific test. It contains equipment-specific documents, training aids, simulators, Programs of Instruction (POI), and personnel requirements. The NET TSP is provided to DTC at least 60 days prior to start of DT. For OT, the finalized NET TSP is provided to the unit before unit NET begins. Formal delivery is made to OTC no later than OTRR 2. The training developer uses the NET TSP as the basis for developing the training test support package. d. Doctrinal and Organizational (D&O) Support Package. The D&O Support Package is provided by the Capabilities Developer and contains descriptions of means of employment, organization, logistical concepts, mission profiles, and the proposed test setting for use in test planning. The initial portions are normally due prior to OTRR 1 with the complete package required at OTRR 1. The D&O TSP includes the OMS/MP that defines the operational tempo of the item under test. The operational tester uses the CDD/CPD/OMS/MP as the basis for designing the operational test event. e. Threat Test Support Package (Threat TSP). The Threat TSP is provided by TRADOC ADCSINT, Fort Leavenworth, KS. It is coordinated with the T&E WIPT. The Threat TSP provides a description of the threat that a system will be tested against. This package is provided by the proponent threat support office who also participates in test planning, trains personnel for portraying threat forces, and monitors threat portrayal prior to and during a test. The TSP is derived from System Threat Assessment Report (STAR)/System Threat Assessment (STA) and selected regional scenarios. The Threat TSP is prepared for testing of all oversight systems when an operational threat is required. To support DT requirements, the PM/Materiel Developer will expand and tailor the initial threat TSP for each test in which threat force operations are to be portrayed realistically. For OT, the Capabilities Developer will expand and tailor the initial threat TSP for each OT requiring realistic threat portrayal. Format outlines are in AR The draft Threat TSP is required prior to OTRR 1 with the final due by OTRR 1 (T-270). However, the draft Threat TSP must be provided in enough time so that coordination can be accomplished through the TSARC process for resources to portray or replicate the threat. The recommended time is 16 months prior to OTRR 1. Certification or validation of unusual threat conditions or player elements should be completed by OTRR 2. f. Training Support Packages. The Training Support Package provided by the Training Developer defines the training program for the individuals and units who will use the system. 
The Training Support Package outlines target audience descriptions, training schedules for test players, draft Army Training and Evaluation Plans, draft Soldier training publications,

156 maintenance personnel, lesson plans, and lists of major training. Initial draft training requirements are due prior to OTRR 1 with the final TSP due at OTRR Overview of Test Readiness Reviews (TRR) a. Prior to execution of a test, a series of Test Readiness Reviews (TRRs) are conducted. A TRR is a forum in which the designated tester, in conjunction with other members of the AST, brings together the representatives of all agencies associated with or having a principal involvement in an event (DT, OT, FDT/E, combined or integrated DT/OT, customer test, or other type of event). Assessments are made on the readiness of the system, concept, or force development product; support packages; instrumentation; test planning; evaluation planning; and any other area required to support the successful conduct of the test or experiment. b. The TRR chair brings together representatives of all agencies associated with, or having a principal involvement in the test to determine overall readiness for planned testing. Minimum membership includes the PM/Materiel Developer, the operational and developmental testers, and the system evaluator. There are normally at least four components to the overall readiness assessment: (1) System or developmental item. Available test data is reviewed and assessed in terms of TEMP entrance criteria and to assure readiness of the system to enter test. In the case of a DT, this can be contractor pretest/shakedown and/or other available information. In the case of OT, it will be contractor testing and the results of government DT completed to date. The SSP must be as complete as possible. (2) Test plan. The test plan must be complete and sufficient to show that the test, if executed as planned, will provide for the proper collection of adequate and valid data for the evaluators to perform a credible, timely analysis and evaluation. This review includes the identification of any problems that might impact the execution of the test. (3) Test resources. The availability of facilities, support equipment, support personnel, instrumentation, and any other support areas are reviewed for possible impact on the efficient conduct of the test. (4) Pre-test training. This review is to assure that test personnel are adequately trained or that training is planned prior to conduct/participation in test. Training includes training on the system being tested and in the operation of test instrumentation and facilities being used for support of the test. c. Each of the above considerations is the responsibility of one or more members of the readiness review team. The PM affirms that the system and its support materials are ready. The testers affirm that the test plan and test resources are ready and that the test center or directorate personnel are properly trained. The user representative and training developer also affirm that the test players are properly trained or training is planned. d. Problems concerning the test plan or test resourcing/training should be surfaced by the AST and resolved by internal action or external coordination prior to the formal readiness 4-24 ATEC Pamphlet June 2010

157 review. The TRR is a forum where information and positions are presented to a decision-maker for a go, no go decision. e. The TRR is a method used to identify and mitigate risks to a test. Although risks may come from a number of different sources, such as the availability of test players or their training, determining the impact of software defects on an upcoming test is especially difficult. This assessment provides the decision-maker with a comprehensive understanding of the risks to the test Evaluation Decision Review (EDR) a. The EDR is conducted for operational tests and for select developmental tests which provide data for an evaluation. An EDR is required for LUTs and IOTs of programs on the OSD T&E oversight list. Other EDRs are held at the discretion of the Evaluator and the chain of command. EDRs are usually held no later than T-65; they must be conducted prior to OTRR2 or the equivalent DTRR. Completion of the EDR is a hard entrance criterion for OTRR 2. EDRs are chaired by Director AEC or designee. b. Prior to conduct of the EDR: (1) The Tester and AEC will agree upon and complete the level 3 database design based upon the approved SEP, OTA TP, and database validation. The database must be either a flat file containing all the test conditions (independent and blocking variables) or the data elements required for each evaluation measure (dependent variables) or a relational database linking the appropriate test conditions with the data elements required for each evaluation measure. (2) The tester will develop adequate dummy test data to populate the test database and perform data entry, data reduction, quality control, and level 3 data generation to produce a complete, dummy, authenticated level 3 database. The DAG members should elect to review DAG procedures during this step. (3) The tester will provide the dummy authenticated level 3 database to AEC via the means they will use during the record test. The evaluation team will use the database to identify data voids that may impact the evaluation, and to validate the status of evaluation analysis software routines, especially for the quick look measures they have identified. AEC identifies potential problems and works with the Tester to implement corrections prior to the conduct of the actual EDR meeting. c. The EDR includes tester and evaluator representatives with evaluation readiness as the focus of the meeting. The following are key elements of this review: (1) Test officer/test analyst and system evaluator consensus on the level 3 database design, to include results of a database validation exercise using dummy data. (2) Any known data voids and their impact to the evaluation. (3) Status of analysis software routines that support the evaluation team. ATEC Pamphlet June
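As a minimal sketch of the relational option described in paragraph b(1) above, the following Python example uses the standard library sqlite3 module to link a table of trial conditions (independent and blocking variables) to the data elements required for an evaluation measure, populated with dummy data as in paragraph b(2). The table names, columns, and values are illustrative assumptions, not the AEC level 3 database schema.

import sqlite3

con = sqlite3.connect(":memory:")  # stand-in for the shared level 3 database file
cur = con.cursor()

# Trial conditions: one row per executed set of independent/blocking variables.
cur.execute("""CREATE TABLE trial_conditions (
    trial_id    INTEGER PRIMARY KEY,
    target_type TEXT,      -- independent variable (example)
    test_day    INTEGER,   -- blocking variable (example)
    terrain     TEXT       -- independent variable (example)
)""")

# Data elements: raw values needed to compute an evaluation measure,
# linked back to the trial under which they were collected.
cur.execute("""CREATE TABLE data_elements (
    element_id INTEGER PRIMARY KEY,
    trial_id   INTEGER REFERENCES trial_conditions(trial_id),
    measure    TEXT,       -- evaluation measure this element supports
    value      REAL
)""")

# Dummy data, as used before the EDR to validate the design and analysis routines.
cur.execute("INSERT INTO trial_conditions VALUES (1, 'armored', 1, 'flat')")
cur.execute("INSERT INTO data_elements VALUES (1, 1, 'detection_range_km', 4.2)")

# An evaluation-side query joining conditions to the data elements for one measure.
rows = cur.execute("""
    SELECT t.target_type, t.terrain, d.value
    FROM data_elements d JOIN trial_conditions t USING (trial_id)
    WHERE d.measure = 'detection_range_km'
""").fetchall()
print(rows)
con.close()

A flat file with one row per trial and one column per condition and data element would carry the same information; the relational form simply makes the linkage between test conditions and measure-level data elements explicit for the evaluation-side checks run before the EDR.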

158 d. For ACAT I and OSD T&E Oversight programs, the Director AEC will report results of the EDR to ED ATEC and to the chair of the OTRR 2 (if different than the ED ATEC). Additionally, the EDR findings and results will be a specific topic presented during the next test readiness review, especially OTRR Developmental Test Readiness Review (DTRR) The objective of the DTRR is to determine what actions are required to assure resources, training, and test hardware will be in place to support the successful conduct of the test, and to ensure that T&E planning, documentation, design maturity/configuration, and data systems have been adequately addressed. a. The DTRR is conducted and chaired by the PM/MATDEV to determine if the system is ready for developmental testing and if the tester has all processes and support in place to conduct the test. As a minimum, the PM conducts a DTRR for programs on the OSD T&E Oversight list prior to the PQT. The MATDEV and DTC jointly determine whether a DTRR should be conducted for all other test events, regardless of the program ACAT. b. The DTRR working group is composed of the principal members of the T&E WIPT, shown below. Other agencies may be represented based on program requirements and considerations. (1) PM/MATDEV (Chair). (2) MATDEV s representatives. (3) Capabilities Developer. (4) Developmental Tester. (5) Operational Tester. (6) System Evaluator. (7) Logistician. (8) Trainer. c. The DTRR working group reviews all pre-start activities and requirements to determine what actions are required to assure resources, training, and test hardware will be ready for test conduct. The review also helps to ensure that T&E planning and documentation, design maturity/configuration, and data systems have been adequately addressed. See DA Pamphlet 73-1 for a detailed list of factors considered by the DTRR. d. At least 2 weeks prior to the scheduled review meeting, the PM provides the DTRR working group with the time and location of the DTRR scheduled review and the DTRR package. The DTRR package consists of the following documentation: 4-26 ATEC Pamphlet June 2010

159 (1) Coordinated TEMP. (2) Developmental Test Plans. (3) Safety Assessment Report (needed 60 days prior to start of test). (4) Environmental Impact Documents (needed 120 days prior to start of test). (5) Description of test item configuration. (6) RAM Failure Definition/Scoring Criteria. (7) Status of System Support Package (SSP), New Equipment Training (NET), MANPRINT, Instrumentation, Data Collection/Reduction Facilities. (8) Supportability IPT approved Supportability Strategy. (9) Airworthiness release or statement, if required. (10) Status of software. (11) Safety Release. (12) Common Instrumentation (13) Link to Evaluation Data Model (14) Contractor or other test data. (15) Test milestones. (16) Validated Test Operations Procedures (TOPs). e. The outcome of the DTRR is to make recommendations regarding all issues concerning T&E planning. The PM is responsible for resolving disagreement and distributing the DTRR minutes. The PM will issue a developmental test readiness statement (DTRS), which will be included in the DTRR minutes, which verifies that the system is ready for developmental testing. The minutes also include a list of any actions to be completed before testing can begin. The minutes are distributed within 10 working days after the meeting. For additional information, see DA Pamphlet f. Internal DTC directives initiate the planning and/or execution of DT. Prior to start of test, the following documents must be issued and received at the test center: (1) Test Execution Directive. (2) Approved OTA Test Plan or Detailed Test Plan. (3) Appropriate Environmental Documentation. ATEC Pamphlet June

160 (4) Safety Assessment Report. (5) Safety Release (if troops are used). (6) Security Classification Guide. (7) System Support Package Components List. (8) Health Hazard Assessment Report (HHAR) Operational Test Readiness Review (OTRR) a. The OTRRs are conducted by the operational tester, as deemed necessary, for each acquisition program to determine the state of overall readiness for operational testing. Assessments are made at the OTRR on the readiness of the system, support packages, instrumentation, test planning, evaluation planning, and any other area required to support the conduct of the test. The purpose of the assessments is to: (1) Determine if any changes are required in planning, resources, training, equipment, or schedule in order to successfully execute the test. (2) Identify any problems that impact on the start or adequate execution of the test and subsequent evaluation or assessment of the system. (3) Make decisions as appropriate to resolve problems or to change or confirm scheduled requirements. b. The chair of the OTRR is dependent on the ACAT and/or oversight levels of the system. Although OTC hosts and ATEC chairs the meeting, the OTRR should be seen as a collaborative group where the collective goal is to have the system, test unit, test structure, data handling, and evaluation tools ready and available by the test start date to provide data required for the evaluation. The role of the OTRR Chair is to provide an unbiased assessment of each briefing, its implications for test readiness, and to consider the test-start recommendations of each briefer in determining the outcome of the OTRR. (1) The ATEC ED is the chair of OTRRs for ACAT I and OSD T&E oversight programs. However, the ATEC ED may retain the Chair responsibilities at his level for any program at any time. The ATEC ED may delegate the Chair duties to the ATEC Technical Director or to the OTC CG for a specific OTRR. The OTC CG may further delegate to the OTC ED or a Director of a Test Directorate. (2) The OTC CG will be the OTRR Chair for non-oversight systems. The OTC CG may delegate the Chair to the OTC ED or an OTC test director for a specific OTRR. (3) The delegated Chair will act with the authority of the ATEC ED; however, the ATEC ED will be notified when indications show possible test delays or cancellations. c. The principal OTRR attendees include: 4-28 ATEC Pamphlet June 2010

161 (1) Operational tester. (2) Materiel Developer. (3) Capabilities Developer. (4) Training Developer. (5) System evaluator. (6) Developmental tester. (7) Logistician. (8) Command providing player units. (9) AST chair and other members providing special skills, as appropriate. (10) Others, dependent on system requirements. It is not uncommon for the T&E Executive and DOT&E to send representatives to an OTRR. d. OTRR Product. The resultant product of each OTRR is a decision by the chairman to execute the OT as planned, to direct required changes to ensure successful test execution, or to recommend (to the program decision authority) delay or cancellation of OT. Start of the OT will be delayed when a problem is identified that would affect the validity of the data being collected to address the evaluation issues. OT start can also be delayed when it is apparent that the system has little change of successfully attaining critical technical parameters or satisfying critical operational criteria, and deficiencies cannot be resolved before the start of the test. OT may also be delayed when it becomes evident that critical test data or information cannot be obtained to adeq1uately answer the issues. Suspension, delay, or termination of OT is covered in paragraph e. The following key documents are required to support an operational test: OT. (1) Statement from DUSA (A&T) for ACAT ID programs that the program is ready for (2) AST Charter. (3) System Evaluation Plan (SEP). (4) Test Resource Plan (TRP). (5) Test Support Packages (TSP). (6) Failure Definition/Scoring Criteria (FD/SC). (7) OTA Test Plan (OTA TP). ATEC Pamphlet June

162 (8) DAG Charter. (9) Safety Release. (10) Human Use Committee (HUC) Documentation. (11) Airworthiness Release. (12) Record of Environmental Consideration (REC). (13) Software Configuration Control Plan. (14) Detailed Test Plan (DTP). (15) Operational Test Readiness Statements (OTRSs). f. Operational Test Readiness Statements (OTRS) are provided prior to the start of a test by the Materiel Developer, Capabilities Developer, Training Developer, and the Test Unit. These are written statements of readiness to proceed to test from the standpoint of their organization s contribution to that test. The date that OTRSs are required will be documented in the test s TRP (usually OTRR 2). See DA Pamphlet 73-1, for a complete description of each OTRS. (1) The MATDEV OTRS describes the system to be tested in terms of size, shape, weight, transportability, and functional characteristics. This includes the software version to be tested and current documentation to be made available. A detailed statement of how both the system hardware and software characteristics differ from a fully representative initial operational capable system is provided, if appropriate. (2) The CBTDEV OTRS verifies that the doctrine, organization, threat, logistics concept, crew drill, and standing operating procedures (SOPs) in the CBTDEV s support packages are complete, represent planned employment, and are approved for use during the test. (3) The Training Developer OTRS verifies that the training support package is complete, representative of the fielding package, and approved by TRADOC for use during the test. In addition, it verifies that user troops have satisfactorily completed training in accordance with the training support package and are ready for test. (4) The Test Unit OTRS, signed by the test unit commander, certifies that the Unit personnel are military occupation specialty (MOS) qualified through the use of common task testing and, where appropriate, mission training plan tasks. This statement does not certify that unit personnel are trained on the test item. It also certifies that unit schedules and other mission requirements will allow unit participation in the test to the agreed upon levels. g. A normal test will require three OTRRs; however, any OTRR participant may request an additional OTRR be convened. The operational tester will provide at least a 30-day advance notification of the OTRR date to all principal OTRR participants ATEC Pamphlet June 2010

163 (1) A pre-otrr is normally conducted at the action officer level, usually on the day prior to the official OTRR 2 and OTRR 3. The Pre-OTRR is held to identify potential or actual issues, review information to be presented during the official OTRR, develop solutions, and address other areas as appropriate. All agencies attending the official OTRR should provide representatives to the pre-otrr meeting. (2) OTRR 1 should occur at test start (T-date) minus 270 days (T-270) and may be held in conjunction with a system T&E WIPT meeting if possible. Normally this is an action officer review and is chaired by the operational test officer s Test Director. It may be conducted virtually, if this is agreeable to all participants. The operational test representative on the AST develops the agenda and draft content, and forwards to all OTRR members so that they can prepare their briefing. The purpose of OTRR 1 is to identify constraints to test planning and those actions or activities, if any, that appear to be moving too slowly to support the test start date or proper test execution, and to coordinate corrective actions, assign responsibilities and suspense dates to close the gap. The primary risk addressed by this OTRR is that of continuing to expend T&E resources if unacceptable program conditions exist. All OTRRs follow the same general agenda, which is shown in figure 4-4 below. Refer to DA Pamphlet 73-1 for additional information on sub elements of each topic. ATEC Pamphlet June

OTRR Sample Agenda

Overview (Operational Tester)
- Purpose
- Agenda
- Documentation Status
- Entrance Criteria Review

Program Sponsor Issues (MATDEV/PM)
- Results of Previous Testing
- System Equipment Status
- Technology Readiness Level
- Operational Test Readiness Statement
- Safety Release
- System Delivery Schedules
- Contractor Support
- Logistics Support Plan
- Special Instrumentation/MODSIM
- System Transfer Plan
- Certification of System Readiness for OT
- Certification of stable software design
- Test Support Documentation Status (TEMP, TSPs, SSP)
- Special topics (e.g., IA)
- PM Issues and Concerns

Capabilities Developer/Trainer Issues (Capabilities Developer/Trainer)
- Test Soldier Training Status/Results
- Operational Test Readiness Statement(s)
- Test Support Documentation Status (CDD, D&O TSP, Threat TSP, FD/SC, OMS/MP, Training Dev TSP)
- Logistic Concept
- Threat Integration
- Certification of System Readiness for OT
- Special topics
- CBTDEV/Trainer Issues and Concerns

Figure 4-2. OTRR Sample Agenda

Developmental Tester
- Safety Release
- DT Test Results

Operational Tester
- Test Organization and Structure
- TRP Resources/Other Agency Support
- OTA TP/DTP status
- Test Schedule
- Overview of test design to address evaluation measures
- Participation/Other Agencies
- Pilot Test and End-to-End Data Run
- Data Collection, Reduction, and Processing Plans
- Data Displays
- DAG Composition and Operation
- Test Instrumentation Status
- Threat Force Portrayal Readiness
- Test Site Support Plan
- Human Factors
- HUC
- Funding
- Status of MOUs
- MOD/SIM V&V
- Test Data Release Rules and Plans
- Security and Force Protection
- Software Configuration Control
- Risk Assessment
- Common Instrumentation
- Link to Evaluation Data Model
- Special Topics
- Tester Issues and Concerns

Comments from Other Services (as appropriate)

Overall Readiness (AST Chair)
- Evaluator Assessment of System Readiness
- Evaluator Assessment of Tactics, Techniques, and Doctrine
- Evaluator Assessment of Threat
- Evaluator Assessment of Training Readiness
- Evaluator Assessment of Test Readiness
- MOD/SIM Accreditation
- Evaluator Readiness to receive data, OR Results of EDR, including SEP Status
- Evaluator Issues and Concerns
- Overall Assessment

Comments by TEO Representative (as appropriate)

Comments by DOT&E Representative (as appropriate)

Identification and Review of Showstopper/Potential Showstopper Issues (Evaluator and Operational Tester)

Review of Action Items (Operational Tester)

Discussion (All)

Decision (Chair)

Figure 4-2. OTRR Sample Agenda (Continued)

166 (3) OTRR 2 is normally conducted at T-60. The chair resides with the ATEC CG/ED or OTC Commander (or their designees) in accordance with ATEC Regulation The primary focus of OTRR 2 is the system/combat/training readiness, test resourcing, data management, and the identification of any known problems that would delay test start and preclude incurring Soldier support and test team deployment costs. Resource providers confirm their readiness to release resources to the operational tester. The Materiel Developer, Capabilities Developer, Training Developer, and test player unit each provide an OTRS attesting to the system s readiness for operational testing. Normally, a Safety Release is provided at this OTRR. The safety release and HUC approval must be received prior to conducting any hands-on training or testing involving user troops. The primary risk factor addressed by this OTRR is the expenditure of deployment costs to the test site when unacceptable conditions exist. (4) OTRR 3 is normally conducted after the pilot test but before the beginning of record trials. The chair resides with the ATEC CG/ED or OTC Commander (or their designees) in accordance with ATEC Regulation For continuity, the person who chairs OTRR 2 should also chair OTRR 3. The primary focus of OTRR 3 is test start and the determination that the tested system, players, test organization, instrumentation, and data collection and reduction procedures are ready for testing. OTRR 3 normally follows successful completion of Pilot Test, to include a successful end-to-end data run, reduction, population of the level 3 database, DAG, and system evaluator analysis of the data. All prerequisites for a successful test execution are reviewed at this OTRR. The risk factor addressed by this OTRR is expenditure of test costs for an unsuccessful test. h. Within 3 working days after the OTRR, OTC will notify organizations of issues which arose during the OTRR and which their organization is responsible for resolution prior to test start. Within 5 working days, OTC will provide a status report to acquisition decision makers outlining the OTRR results and recommending a way ahead. Minutes of the OTRR are distributed to members within 10 working days. i. The end product of each OTRR is a decision to execute the OT as planned, direct any changes necessary to ensure successful test execution, or to recommend (to the program decision authority) the delay or cancellation of the test. There are three basic cases where the start of OT should be delayed: when a problem is identified that would affect the validity of data being collected to address evaluation issues; when it apparent that the system has little chance of successfully attaining CTPs or satisfying COICs, and deficiencies cannot be resolved prior to the start of test; and when it is evident that critical test data or information cannot be obtained to adequately answer the issues. Additional information is provided in DA Pamphlet Joint Test Readiness Review (JTRR) A JTRR may be conducted for operational tests that are joint or have multi-service participation. JTRRs are normally conducted in accordance with the lead service TRR procedures and requirements. JTRRs conducted for programs where the Army is the lead test service will normally follow procedures for OTRRs, tailored to meet specific program requirements ATEC Pamphlet June 2010
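The post-OTRR suspense intervals in paragraph h above are stated in working days rather than calendar days. The short Python sketch below shows one way to compute those suspense dates; the OTRR date is hypothetical, and the helper ignores federal holidays, so it is a simplified illustration rather than an official schedule calculator.

from datetime import date, timedelta

def add_working_days(start, n):
    """Return the date n working days (Monday-Friday) after start.
    Federal holidays are ignored in this simplified sketch."""
    d = start
    remaining = n
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # 0-4 are Monday through Friday
            remaining -= 1
    return d

# Hypothetical OTRR date; the 3/5/10 working-day intervals come from paragraph h.
otrr_date = date(2011, 3, 15)
suspenses = {
    "notify organizations of open issues": 3,
    "status report to acquisition decision makers": 5,
    "distribute OTRR minutes": 10,
}
for action, days in suspenses.items():
    print(f"{action:<45} {add_working_days(otrr_date, days).isoformat()}")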

167 4-16. Overview of test execution a. The AST members will follow the policies that are provided in the ATEC Regulation 73-1 and supplementing procedures provided by ATEC s SCAs for test execution. Throughout the execution, the AST members monitor the conduct of the tests listed in the DSM to ensure that the system is being tested as planned and that data are reported in a complete and accurate manner in support of the system evaluation. b. The AST members must also monitor developments in the acquisition program to ensure that changes in priorities, requirements, threat, schedules, or the availability of resources, including funding, are reflected and accommodated in the T&E. c. Disposition of equipment should be known to the AST. ATEC has experience a multitude of logistical issues due to abandonment of equipment after a test is completed. This causes such issues as excess on hand; determining ownership, re-establishing accountability; coordinating disposition, obtaining funds to ship/turn in/transfer, and are hard to resolve. All equipment (test and test support) not required after the completion of test will be returned to the PM, TACOM loan coordinator or turned-in Developmental test (DT) execution Developmental tests are conducted in laboratories, contractor facilities, proving grounds, and other test facilities. During the test, the test officer controls the system and test variables. DTs are structured so that the system is ultimately subjected to stress levels commensurate with those to which the mature system will be subjected and those expected to be found in representative operating environments. a. Developmental Detailed Test Plans. The government test plan provides explicit instructions for the conduct of tests and subtests. It governs test control, test configurations, data collection, data analysis, and administrative aspects of the tester s operations. The test officer prepares a test plan in accordance with the directions provided by the test directive, the SEP, and the DT OTA TP if required, and determines the best plan for the testing of the system for the area(s) assigned. b. Test Operations Procedures (TOP) and International TOPs (ITOP). TOPs and ITOPs (as well as other international test procedures standardization documents) serve as guidelines for designing and conducting tests and subtests that will be standardized, within reasonable limits imposed by normal variations of materiel and circumstances. Maximum use is made of TOPs and ITOPs. They document the existing state-of-the-art in developmental testing technology; facilitate the preparation of test plans; prescribe the details of planned operations during testing; and reflect current international agreements in specific technical areas. (1) TOPs. The intent of a TOP is to give simple, complete technical instructions for conducting developmental tests. TOPs capture expertise; serve as a tool for test plan preparation; and alert developers, contractors, and allies to test standards. The use of TOPs permits tests to be conducted with accuracy, precision, consistency, and repeatability of test data; achieves test standardization independent of test time or location; and encourages standardization throughout ATEC, Army, industry, and participating nations. The TOP that covers the type of ATEC Pamphlet June

168 system being tested is checked for guidance on the subtests that should be conducted. Common TOPs should be consulted for subtests required on most items (e.g., human factors, safety, and transportability). In the case of small, limited tests, the TOP may serve as the test plan. (2) International Standards. ITOPs and NATO Standardization Agreements (STANAGs) are international agreements that document state-of-the-art proving ground techniques for technical testing of military systems and allow exchange of test data among signatory nations without retest. c. Contractor Test Plans. If the data from the contractor DT is to be used by the system evaluator, the test plan must reflect all the requirements to support a system evaluation. When the system contractor is conducting a DT, whether at the contractor s facilities or at an ATEC test site, a test plan is provided to the T&E WIPT for review by the members and approval by the PM. The test plan should be reviewed by the AST and comments/suggestions provided to the T&E WIPT. Release of data to the evaluator should be stipulated in the statement of work. d. The test officer considers the NET requirements of the test participants and the testing of the NET package. The NET program is conducted by the test sponsor and can become part of the test program by providing information on how well the NET was conducted and how well the test participants were able to operate/maintain the items. e. Specific internal test center guidance and procedures for test center preparation, review and approval of test plans are reflected in internal DTC guidance. f. DT execution will follow the procedures described below. (1) The prerequisites for DT initiation for evaluated programs are listed below. (a) Approved DT OTA TP (if required). (b) Approved DT DTP. (c) Receipt of environmental documentation. (d) Receipt of the Safety Assessment Report/Health Hazard Assessment Report. (e) Receipt of the Security Classification Guide. (f) Receipt of the SSP. (g) A safety release and a HUC review prior to initiating any training or test involving non-test Soldiers. (h) A DTRR will be convened prior to a PQT to ensure test planning, timing, and resourcing are on schedule and to resolve problems that may impact test initiation/execution. The PM who chairs the DTRR prepares a DTRS as part of the minutes documenting that the system is ready for the PQT ATEC Pamphlet June 2010

169 (i) Test item(s) and funding. (2) During conduct of DT, the test officer must ensure that the test is conducted according to the plans and procedures established by DTC. (3) Suspension of Testing. Testing will be suspended when a major problem is identified during the test (i.e., problems that would affect the validity of the data being collected to address the critical technical parameters). The PM or the HQ DTC test manager may suspend DT. When there are undue hazards to personnel or equipment, the test center immediately interrupts testing and notifies HQ DTC and a decision is made on whether or not to continue the suspension. (a) Suspension of RAM testing will not occur if the design changes being made during testing are contributing to reliability growth. Reliability testing will not be suspended merely because statistical inference cannot be obtained. (b) The system evaluator may also recommend suspension if it is apparent that the system has little chance of successfully attaining the CTPs or if it is evident that the test data being obtained will not be useful for the evaluation. (c) Upon suspension, the PM will convene a review to consider future actions. Once the problem is solved, the T&E WIPT will determine any additional tests needed to validate the solutions. A DTRR may be convened before start of a new test. (4) Termination of Testing. The PM recommends termination of DT to the MDA when circumstances warrant. g. Data Collection. (1) During conduct of test, the test officer issues TIRs. TIRs are the primary means of providing information on the item s performance during test to the PM and the AST. The test manager and evaluator will make the final decision regarding acceptable test data. See internal DTC guidance for post-reporting procedures. (2) The test officer prepares a test report in accordance with the test execution directive or DTC report policy guidance. h. Release of Data. Refer to ATEC Regulation 73-1 for detailed information on release of data in support of an evaluation. Release of test data generated during a non-evaluated test is contingent upon approval by the test sponsor. i. Contractor Involvement. The degree and nature of system contractor involvement in DT must be agreed upon in the T&E WIPT and then communicated through the contractual requirements. Developing these agreements early will help to ensure that test data will be usable for evaluation. (1) Contractor involvement may range from no direct involvement, to providing spare parts and technical advice during the conduct of DT, to performing the entire test. ATEC Pamphlet June

170 (2) When the contractor conducts the test, consideration should be given to the use of a combined government/contractor DT team. The test results are reported by the contractor and verified by the government test personnel, thus avoiding duplication of testing. (3) The degree of system contractor involvement in DT scoring conferences will be agreed upon in the T&E WIPT. System contractor personnel will not attend or be directly involved as either members or observers during R&M Scoring Conferences which address data intended to aggregate the OT data for ultimately addressing compliance with Army s Operational RAM requirements. (4) In cases where DT and OT are planned and described in the TEMP to be combined or integrated, the parameters for contractor involvement must be carefully coordinated to ensure the independent evaluation statutory restrictions are met. j. During the conduct of Live-Fire Test, periodic Damage Assessment Team (DAT) meetings will be conducted. The purpose is to provide the assessment of the damage on the combat utility of the platform being tested (see DA Pamphlet 73-1 for guidance). k. Files are maintained at HQ DTC and will provide a complete record of the test and necessary reference material. l. Once the test has been initiated, it is the responsibility of the test officer to ensure all subtests are conducted in accordance with the approved test plan, and ensure TIRs are entered into the ATIRS/ADACS system. The TIR is the primary means of providing information on the item or system performance and test data required by the PM and system evaluator. (See DA Pamphlet 73-1 for details on TIRs.) Operational test execution The OTC is responsible for the proper and safe conduct of operational tests. Operational tests may occasionally be conducted for OTC by one of the ATEC TECOs. All OTC personnel will execute operational tests in accordance with the guidance below. a. Prior to movement to the field test site, the test officer must ensure the following items are on hand: (1) Test item. (2) Documents listed in figure 4-3, Documents Required Prior to Operational Test Events. (3) Test Support. Supply, services, maintenance, medical, transportation, audio-visual, and facilities must be in place prior to test start. b. Pretest Activities. These activities involve all pretest training, organizing for execution and support, preparation of equipment and test areas, the pilot test, adjustment of plans, and all other actions required to prepare for the test. An end-to-end data run to include DAG operations should also be part of the pretest activities ATEC Pamphlet June 2010

171 (1) Training. Regardless of the type of test, some evaluation of training and training support is normally conducted. This is necessary to ensure the skills and knowledge necessary to operate and maintain a system can be attained and sustained within realistic training environments by units using personnel of the type and qualification expected when the system is deployed. (2) Pilot test. A pilot test must always be conducted prior to the actual OT. The primary purpose of the pilot test is to validate test procedures, test data collection and reduction procedures. These procedures include assessing the organization of the test team, validating the test scenario, verifying the adequacy of data collection forms and instrumentation, exercising the support system, and confirming all planned communication and control procedures. The pilot test should be conducted early enough to allow for reaction and correction of any deficiencies. Tests relying heavily on instrumentation may require additional time after the pilot test to correct problems. Data collected during the training and the pilot test may or may not be considered valid. If any of the data from the pilot test is used as data in the test report, it must be obtained under the same test conditions as the record trials. The pilot test will determine if the test is conducted, suspended or terminated. (3) Accreditation. In most cases, OTC uses a model/simulation for which the verification and validation (V&V) was conducted previously or by the agency that created the model/simulation. The requirement is for the development and execution of an accreditation plan to provide data and information for an accreditation report which is used to support the request for accreditation of the model/simulation for use in the test. The only time that a V&V is required would be if OTC developed the model/simulation and no V&V had been performed or the model/simulation was developed by another agency and no V&V was available. This is an AST responsibility, normally executed by the test team in cooperation with the simulation providers. See ATEC Regulation for additional information. (4) Instrumentation Certification. The ATEC CG/ED is the instrumentation certification authority. The authority is delegated to the Director of Test Technology Directorate (TTD), ATEC. Instrumentation certification should be viewed as a process by which the instrumenters, test organization, and the system evaluators/analysts gain confidence in the instrumentation and system under test prior to testing. See ATEC Regulation 70-1 for ATEC Instrumentation Certification procedures. c. Conduct of Test. (1) Adherence to test design. The test officer must ensure that the test is conducted according to the OTA TP and the operational DTP and procedures established by OTC. (2) Test officer s log. The test officer records all the major events of the test, beginning at the start of the pilot test. The major events are decided at the test officer s discretion. (3) Test changes. During the test, it may become apparent that a deviation from the plan is required. Minor adjustments are frequently required and made by the test officer. The test officer should consult with the other AST members to determine the impact of changes on the evaluation. Major deviations from the test plan should immediately be brought to the attention ATEC Pamphlet June

172 of the AST chair s Commander or Director. In any tests, all deviations from the approved OTA TP should be immediately and fully documented. (4) Test Incident Report (TIR). During test execution, each intermittent or incipient malfunction, safety hazard, or degradation in system performance or operation is to be documented in a TIR, prepared by the organization conducting the test. All TIRs are submitted to the ATIRS. (5) Visitors and briefings. The test officer must control all non-atec visitors to the test site. The test officer will designate someone to prepare and deliver an onsite visitors briefing. (6) Operations Security (OPSEC). The purpose of OPSEC is to prevent hostile intelligence from obtaining information on a new system. Examples of OPSEC-sensitive information include the system s vulnerabilities, delays in development, deficiencies that reduce operational capabilities, concept and doctrine for operational employment and the capabilities of new systems. OPSEC will be included in all plans. Any lapse is reported to the directorate OPSEC officer. d. Delay or Suspension of Testing. Testing will be delayed when a problem is identified that would affect the required quantity or validity of the data to be collected. Start of testing may also be delayed when it is apparent that the system has little chance of successfully meeting requirements and deficiencies can t be resolved before the start of the test. Testing may be suspended when a problem is identified during the test that can t be resolved within the test schedule or within reasonable accommodations. Although the test team may temporarily suspend or delay test, only the ATEC CG/ED may officially delay or suspend an operational test. Any T&E WIPT member may recommend delay or suspension to the testing commander. (1) Anyone involved during test execution may delay or suspend testing when critical or catastrophic safety hazards to personnel or equipment are discovered. HQ. (2) When a test is delayed or suspended for any reason, the AST chair will notify ATEC (3) When a test is delayed or suspended, the PM convenes a program review to consider future actions. Once the PM has solved the problem, the T&E WIPT will convene to determine necessary additional tests to validate the solutions. Before testing is restarted, appropriate test readiness reviews will be conducted. e. Termination of Testing. ED ATEC will recommend termination of operational testing to the Vice Chief of Staff of the Army when circumstances warrant. Termination of testing releases test resources (e.g. troops) and the test must be rescheduled. f. Completion of Test. Every effort will be made to ensure that the test is officially completed at the scheduled completion date; the only reason for an extension of the test window is that unforeseen difficulties have precluded successful collection of all required data. The test is not over until the final level 3 DAG-accredited database has been delivered to and accepted by the evaluator, the report is published, and the test site has been fully dismantled and all closeout activities have been accomplished. The test officer and the evaluator will make the final 4-40 ATEC Pamphlet June 2010

173 decision regarding acceptable test data. The test officer must remain on the test site or revisit the site as necessary to phase out all operations. All test and player personnel must acknowledge completion of their assigned duties before release. Any test causing a potential adverse effect on the environment should be reported to senior directorate personnel and appropriate post agencies. g. Contractor Involvement. Section 2399, Title 10 U.S.C. restricts the involvement of system contractors in IOT&E for MDAPs, except for those persons employed by the contractor involved in the operation, maintenance, and support of the system when the system is deployed. Furthermore, ATEC s policy is that system contractors or their subcontractors shall not participate in the IOT&E for any system, regardless of ACAT level, except to the extent they will be involved in the operation, maintenance, and support of the system being tested if the system is acquired. This policy is meant to ensure compliance with Title 10, Section 2399 which in addition to the above states that a contractor that has participated in, or is participating in, the development, production, or testing of such system for a military department or Defense Agency or for another DOD contractor may not be involved in any way in the establishment of criteria for data collection, performance assessment, or evaluation activities for the operational T&E. For this reason, system contractors may not be directly involved in data authentication group sessions or RAM scoring or assessment conferences. Occasionally, at the Chair s request, they may be brought into these two groups as a subject matter expert, but they will not be present for voting. All SCAs must ensure that, before employing any contractor to support them for T&E planning, execution, or reporting, the SCAs in coordination with the contracting office must identify and alleviate any potential conflicts of interest. (1) The ATEC CG/ED may waive the limitations for programs that are not on the OSD T&E Oversight List if the CG/ED determines, in writing, that sufficient steps have been taken to ensure the impartiality of the contractor in providing these services. (2) In order to preserve the integrity of the test data for all combined DT/OT, the SCAs will identify any restrictions or limitations on system contractors participating in testing, including access to the test site or preliminary test information. These restrictions will be documented in the SEP and coordinated with the PM and the appropriate test center. The PM will ensure consistency with the terms and conditions of the contract and the responsible test center will include these restrictions in the test execution and management procedures. h. Release of Data. Release of data is discussed in ATEC Regulation Customer test When the PM requests a test that does not support an evaluation and the test command determines that the test service requested is indeed a customer test, the tester will follow the PM s request explicitly. In some cases, the test sponsor provides the test plan; in others, the sponsor asks that the DTC or OTC develop the test plan. However, to eliminate any misunderstanding by the PM as to the role the customer test plays, the following statement will appear on the inside front cover of all customer test plans/reports: This test is being (or in the case of reports, was) conducted in accordance with the test sponsor s requirements. It may not relate to the system evaluator s issues. ATEC Pamphlet June

4-20. Data management overview a. The data generated from the testing of systems serve many purposes in addition to supporting the independent evaluation/assessment and milestone decisions. Data from early DT provide feedback to developers on design elements and to system contractors on manufacturing or quality deficiencies. Data from OT provide the user with confidence in the system's performance. Therefore, the ATEC data management system should be flexible enough to accommodate data from multiple sources and allow for data usage by various customers, as well as support the use of actual T&E data to calibrate and accredit constructive models and simulations. To ensure quality data and data reuse, data producers and evaluators must consider the application of data standards as data are generated, collected, processed, analyzed, and stored. b. Real-time and near real-time data handling procedures at the test site will better position the AST to complete OTA Milestone Assessment Reports within the E+60 day or no later than MDR-45 day timeframe. ATEC employees will use the VDLS as a digital repository. VDLS provides a collaborative knowledge management capability that combines dissimilar test information sources (such as scientific databases, databases of video and photographic information, analyses, documents, etc.) and high performance computational capabilities to facilitate knowledge synthesis. It also allows collaboration; facilitates better exchange of information between test organizations and evaluators, ensuring the AST has a better understanding of how the data are collected and reduced; and allows data analyses to begin at the test site rather than waiting for the test report. c. On-site data collection and processing will comply with the following guidelines: (1) Data are shared with the AST as they are collected and processed so that the evaluation team benefits from additional insights into the data and into specific aspects of system performance. (2) Hold DAGs and Scoring Conferences periodically as data are accumulated vice waiting for end of test. (3) The tester will populate the level 3 data set documented in the SEP/OTA TP as data are collected, reduced, quality-controlled, and authenticated. Any difficulties populating the level 3 database are T&E issues that must be corrected to preclude them becoming evaluation limitations. (4) The final DAG and scoring conference will be held, and the complete level 3 dataset will be finalized and provided to the evaluation team, before AST members and test support personnel are released from the test site. Some meeting participants may be connected via telephone or VTC if that provides the best solution for DAG and scoring conference members, provided the necessary information can be made available to remote participants. (5) AEC and OTC will update internal procedures and planning as necessary to ensure appropriate personnel remain committed to finalizing the level 3 data set as early as possible. Normally this will be the date of the final test event plus the time required to process and DAG or score data from those events. For planning purposes, the level 3 data set will be completed NLT

175 E+10. Level 3 databases will not be considered complete until formally accepted by the evaluator. (6) Electronic DAGs are encouraged. d. The following guidelines provide the framework for the evaluation team data management and analyses at the test site. (1) The evaluation team will develop a strawman OMAR/OER/OFER or OAR prior to the start of test. The straw man report will be based on the SEP and OTA TP, and will include sample data displays and graphics. The Evaluator will include and begin analyzing information collected in prior relevant testing (per the DSM). (2) Analysis procedures identified in the SEP and OTA TP, which was rehearsed with the EDR, pilot test, and end-to-end data run, will be included in the strawman report and exercised with the partial data sets to further validate their functionality and gain insights to system performance. If necessary, the planned analysis procedures will be modified to gain better understanding of the data. (3) On-site analyses will include generating the proposed data displays from the draft (OMAR/OER/OFER) based on the partial level 3 dataset and updating them each time new data is delivered. Additional data displays will be generated as appropriate. (4) None of the on-site analysis work is releasable outside of ATEC until it has been validated by the chain of command. e. The Army Test Incident Reporting System (ATIRS) is an integrated information system within the VDLS developed to manage and process test data and corrective action information to assist the continuous evaluation process. The ATIRS is administered by DTC s Aberdeen Test Center, Aberdeen Proving Ground, Maryland, and is available 24 hours a day, 7 days a week. It provides RAM data as well as some performance, firing, and armor vulnerability information. DTC and OTC, as well as contractors, enter test incident data into ATIRS according to directions from the T&E WIPT. ATIRS utilizes the existing Automated Data Collection System (ADACS), an online, integrated information database system. ATIRS facilitates data entry and allows easy customization of data collection and transmission, data organization and categorization, summarization, analysis, and archiving. RAM 2000 and S-RAM are information systems used by OTC that facilitate data entry of RAM data on standard desktop or notebook computers. These systems were developed to assist engineers, analysts, and data managers in collecting, processing, and reporting RAM test data. Both systems have two primary objectives: to produce calculations that are needed to answer RAM test issues and criteria; and to generate a file that can be electronically streamed into ATIRS. See DA Pamphlet 73-1 for detailed information on the processes and procedures for reporting of DT and OT results and corrective action information in ATIRS. f. The RAM WIPT reviews, classifies and charges RAM data from system-level developmental and operational tests, thereby establishing a common test database to ensure proper and consistent categorization of test incidents. See appendix I of this document for additional guidance on RAM. ATEC Pamphlet June
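To make paragraph 4-20d above concrete, the sketch below shows the kind of lightweight, repeatable summary an evaluation team might re-run each time the tester delivers an updated slice of the level 3 dataset, so the draft report's data displays stay current during on-site analysis. It is a minimal illustration only; the record fields (event, condition, elapsed_s) and the measures computed are hypothetical, not a prescribed ATEC format.

```python
from collections import Counter
from statistics import mean

def summarize_partial_dataset(records):
    """Regenerate draft-report data displays from the portion of the
    level 3 dataset delivered so far; re-run after each data delivery."""
    detections = [r for r in records if r["event"] == "detection"]
    by_condition = Counter(r["condition"] for r in detections)
    elapsed = [r["elapsed_s"] for r in records if r.get("elapsed_s") is not None]
    return {
        "records_to_date": len(records),
        "detections_by_condition": dict(by_condition),
        "mean_elapsed_s": round(mean(elapsed), 1) if elapsed else None,
    }

# Example: the same call is repeated as new authenticated data arrive.
partial = [
    {"event": "detection", "condition": "day", "elapsed_s": 12.4},
    {"event": "detection", "condition": "night", "elapsed_s": 18.9},
    {"event": "no detection", "condition": "night", "elapsed_s": None},
]
print(summarize_partial_dataset(partial))
```

Because the same function is applied to every delivery, any change in the displays reflects new data rather than a change in analysis procedure, which supports the intent that planned analyses be exercised and validated before the full dataset exists.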

176 g. All of the data supporting a system evaluation should be documented, organized, and managed using a system evaluation data model. The data model will define all of the data files as well as all of the variables that the Evaluator will draw upon in evaluating the system. Data models are refined into normalized files or tables. A normalized file design describes a single object (for example, person, tank, or sensor) or event (for example, electronic warfare status, firing a gun, or target detection opportunity). Normalized data tables are optimized for data management, query, and consistency. h. It is easy to confuse the design of a data set with a physical data set that has data in it. If a physical data set is envisioned as a data table, the data set design would be represented by the column headings of that table. The design comprises a list of names of variables that are needed to address hypotheses about the effects of variables on system performance or overall mission accomplishment. When a physical data set is created, the variable names become row headings. Physical data sets are populated with data generated at some test. Each observation captured at the test becomes a row in the table. The design of a data set is driven by the design of experiment and the planned data analysis DT data management a. During DT execution, preliminary data are reported and recorded in the test center repositories, VDL, or ATIRS. Our goal is to link all test center repositories within VLDS so that VDLS will be the primary means of providing data to internal and external customers. TIRs may also be written during a test to provide performance and test item failure information. Timely and thorough test reports should be the primary means to provide information on system performance when testing is completed. Timely reporting of DT data is essential to provide feedback on design elements in order to ensure adequate progress toward meeting the users requirements. DT data is provided throughout the test program to allow the materiel developer to consider and affect corrective actions. The PM may allow the contractor immediate access to TIRs so that the developer s analysis and corrective actions can be performed prior to the next test. b. Using information provided in the SEP and the approved DTP, the test officer establishes guidelines for data collection, quality, and management. c. The DT data management process involves: (1) Planning. DT data requirements are determined by the AST and the documents the tests required providing the data. DTC then develops a DTP that provides explicit instructions for the conduct of the test. It includes the specific subtests necessary to provide the data to answer the evaluator s technical parameters and operational issues. (a) Once the test officer has approved the data elements to be collected, a test program specific VDLS folder and ADACS database are established to record all test data and produce the data collection input forms required. The VDLS database provides space for test documents, preliminary and final data, a TIR viewer, and an area for test discussion. The ADACS database provides a complete and thorough listing of all data elements, the size of the 4-44 ATEC Pamphlet June 2010

177 data fields, complete and accurate definitions for all data elements, and a data element dictionary. (b) Prior to the start of test, the test officer will provide data collectors with test specific requirements. Clear and precise instructions will be provided on how and when each data element will be documented, the correct format to be used, and the method of data collection. (2) Data Collection. On a daily basis, the test officer reviews and edits the collected data for completeness, accuracy, consistency, and the adequacy of supporting narratives. Any errors, omissions, or misleading or ambiguous data shall be corrected expeditiously. After review, data are entered into the designated VDLS folder and both TIRs and non-tir data are entered into the ADACS database. (3) Data Reduction. (a) As TIR data are received, the test officer evaluates and assigns the appropriate preliminary evaluation scores or classifications using guidelines that have been established to evaluate the various system or equipment operating parameters. Reports are produced as required for the scoring conference and are available electronically. (b) The ADACS or other test database is updated, including scoring conference reports and decisions and the changes are verified by the test officer. Upon verification, the final approved data and any requested database reports are released to the appropriate acquisition community. (4) Data Confirmation. The purpose of DT data confirmation is to ensure the widest possible use of the data. The T&E WIPT determines the need to confirm certain test data. Data generated by tests at DTC s test facilities should meet data confirmation standards. However, in those instances when a DT is being conducted at a non-government facility and that facility s ability to provide acceptable data is in doubt, the procedures set forth in DA Pamphlet 73-1, will be followed. Data confirmation involves inspection of the facilities, review of test personnel experience and expertise, review of testing and data collection procedures, use of calibrated instrumentation, and may require monitoring by government DT personnel OT data management a. The data management process for operational test shall be structured to ensure that only essential data are collected; data are appropriately safeguarded, reduced, analyzed, and reported; and that record databases are structured and archived for both current and future use. b. The AST shall develop a database and corresponding data element dictionary for each test. The database structure and dictionary must be coordinated within the AST to ensure data elements are appropriate and consistent with any known standardized system data elements. c. The AST shall determine the need for a DAG to support a specific test and coordinate the appointment of a DAG Chair. For tests not associated with an acquisition program, the ATEC Pamphlet June

178 executing agency will determine if a DAG is required. The decision to establish or waive a DAG will be documented in the OTA TP. d. Data management for OT and experimentation is a process that encompasses the full range of activities required to identify, collect, secure, reduce, process, validate, analyze, and report OT data. The data management process includes: (1) Planning. (a) The initial concepts for OT data management are identified by an AST when developing the evaluation strategy and test strategy for the SEP. Following approval of the evaluation strategy and test strategy at the ESR and T&E CIPR, the AST follows a sequential process in identifying data management requirements. Initially, the AST reviews the capabilities and requirement documents; identifies the evaluation focus areas and measures of interest for an evaluation; and develops a DSM that is published in the SEP. The operational tester, in conjunction with the evaluator, uses the DSM and the evaluation details in chapter 3 of the SEP to develop a test level PoA. The PoA identifies the data requirements and data elements that will be collected during OT and used for evaluation purposes. The PoA is documented in appendix A of the OT OTA TP. (b) After data requirements and data elements are known, the AST structures the requirements for the record test database. The OTC AST member then takes the lead in structuring the data collection and reduction organizations and procedures for data management and documents the information in chapter 4 of the OT OTA TP. The OTC test team further documents intricate details in a data management plan in the OT DTP. (c) The data management plan serves as a guidebook for data managers during test conduct. The plan describes the composition of the data collection and data reduction organizations, the chain of responsibility, and mission requirements for organizational elements. The plan also identifies each data recording method (human or instrumentation); assigns data collectors to player units and locations; illustrates the flow of data from collection points through quality control, data reduction, data entry, and record storage; and outlines procedures for controlling data, data recovery, and monitoring of test tests. (2) Data collection. (a) The primary functions of the data collection organization are recording data, checking data for accuracy and completeness, safeguarding data in accordance with the system Security Classification Guide, and transmitting data in a secure and timely manner. Typically, the data collection organization consists of teams that are assigned at pre-defined collection points for the purpose of recording outputs or responses from the test system and player participants, and documenting any observable test conditions that might influence the responses of the player participants. The number of data collectors and teams employed are based on the size and structure of the test. (b) During conduct of an OT, the test analyst is responsible for ensuring that data collection functions are being performed to ensure collection of the appropriate data ATEC Pamphlet June 2010

179 (c) The method used for collection of data (manual or automated) is dependent on the test system and the availability of data collection instrumentation/technology. Typical manual data collection methods are data collection forms, questionnaires, interviews, and player input. Automated methods may include a voice recording system, 1553 data buses, Multiple Integrated Laser Engagement System (MILES), OT-Tactical Engagement System (OT-TES), Global Positioning System (GPS), or other automated technologies. (d) Regardless of the data collection method, the data collection organization implements and maintains control of OT test data from the beginning of the pilot test and throughout all test phases. Before transmitting data, the data collection organization ensures that all data are as accurate and complete as possible. When data are incomplete, the lead data collector, data controller, or data analyst immediately confers with test team members and subject-matter experts to correct inaccuracies or obtain missing information. (3) Data reduction. (a) The data reduction organization is generally responsible for quality assurance, entering test data into an electronic database, and maintaining a library of raw and reduced test data for reference and audit by the DAG. The organization maintains accountability of all data transferred from data collection. Immediately upon receipt, data reducers verify that data received are complete, legible, and valid. The reducers further ensure that incidents are scored by the test team and that incident descriptions contain sufficient information for evaluation and scoring by scoring conference members. (b) Following data entry, the data reduction organization uses manual and automated edit checks to verify that all data have been accurately entered into a record database. Quality control procedures outlined in the data management plan provides guidance for processing test data and structuring test data sets. e. Quick Look Reports. (1) OTC will provide Quick-Look measures to the evaluator on a daily basis or other agreed upon interval of time. This process will begin at the start of the test and will continue as long as the evaluator thinks they are of value. (2) The Quick-Look report will include selected critical measures of system performance and test progress. No more than six (6) measures shall be identified. The number of measures will be kept to the minimum required to ensure the data being collected is adequate and the test is progressing as expected. (3) The format, content, timeliness requirements, and level of quality control of the Quick-Look daily report will be negotiated between the tester and evaluator and documented in the OT OTA TP. (4) The Quick-Look measures will be demonstrated and refined as part of the dummy and end-to-end data run processes. ATEC Pamphlet June
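As an illustration of the Quick-Look reporting described in the Quick Look Reports paragraph above, the sketch below computes a small, fixed set of critical measures (well under the six-measure ceiling) from one day's records. The measure names, record fields, and output format are hypothetical placeholders; the actual content, format, and timeliness are negotiated between the tester and evaluator and documented in the OT OTA TP.

```python
from statistics import mean

# Keep the Quick-Look to a handful of critical measures (six or fewer).
QUICK_LOOK_MEASURES = {
    "detection_opportunities": lambda recs: len(recs),
    "valid_detections": lambda recs: sum(r["detected"] for r in recs),
    "mean_time_to_detect_s": lambda recs: round(
        mean(r["time_s"] for r in recs if r["detected"]), 1),
}

def daily_quick_look(test_day, records):
    """Build the agreed daily Quick-Look summary from the day's records."""
    report = {"test_day": test_day}
    report.update({name: fn(records) for name, fn in QUICK_LOOK_MEASURES.items()})
    return report

print(daily_quick_look(3, [
    {"detected": 1, "time_s": 12.0},
    {"detected": 0, "time_s": None},
    {"detected": 1, "time_s": 15.5},
]))
```

Keeping the measure definitions in one place makes it easy to confirm during the dummy and end-to-end data runs that the daily product matches what the evaluator expects.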

180 4-23. AEC data management AEC personnel involved in managing and producing data products may be referred to as a data management team. The data management team will be composed of one or more Analysts from AEC, possibly to include a statistician. Supporting contractor personnel (e.g., additional Analysts, database manager, and data modeler) may also be included. The mission of the AEC data management team is to manage AEC data products throughout the three phases of a test: planning, operations, and reporting. The Evaluator may also participate in data management activities Data authentication Authentication of data is dependent upon an ongoing process where data is verified to be complete, accurate, consistent, and representative of the system s performance. Data management at a specific test is carried out by the data authentication group (DAG), the test organization s data reduction team, and the evaluation organization s data management team. a. Data Authentication Group (DAG). The DAG is made up of the DAG committee and, as appropriate, the DAG verification team. The committee is made up of the test analyst, system evaluator, and other members of the test community. The DAG is chaired by the tester. The DAG verification team is composed of personnel from the test organization and may include contractors. The mission of the DAG is the verification, validation, and authentication of the data collected at the test. Generally, the data management team will not be part of the committee. They may, however, support the DAG verification team in the production of DAG reports. Additional information is provided in appendix M. b. Data Reduction Team. The data reduction team is made up of test personnel. The primary mission of the data reduction team is the production of a level three test database. The data management team will generally not directly support data reduction, but may be required to design software programs to retrieve data elements required for the evaluation data sets from the test database. c. Data authentication for ATEC-conducted tests may be accomplished by a formal independent DAG or an internal ATEC group, depending on the visibility and size of an test. The DAG is a group of representatives from the testing, evaluation, user, and acquisition community, formed for the purpose of authenticating data that is generated, collected, and reduced. As standard practice, ASTs implement formal DAGs for all OT tests conducted for DOT&E oversight programs, multi-service programs, and for large-scale tests conducted for non-oversight programs. For multi-service programs, the DAG may be Service-specific and may require a MOA or Memorandum of Understanding (MOU) to ensure coordination between Services. ATEC DAGs are generally formed for small-scaled tests when the AST has determined that membership for a formal DAG is not available, funding is not available, or a DAG is not required for other reasons. An ATEC DAG may also be formed to authenticate DT data. See appendix N for additional information ATEC Pamphlet June 2010
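The verification work described in the data authentication paragraphs above (checking that data are complete, accurate, and consistent before they are authenticated) lends itself to simple automated edit checks. The sketch below is a hypothetical example of such checks run against candidate level 3 records before a DAG session; the field names and rules are illustrative assumptions, not an ATEC-defined schema.

```python
REQUIRED_FIELDS = ("trial_id", "condition", "detected", "time_s")

def edit_check(record):
    """Return a list of problems found in one candidate level 3 record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if field not in record:
            problems.append(f"missing field: {field}")
    if record.get("detected") not in (0, 1):
        problems.append("detected must be 0 or 1")
    if record.get("detected") == 1 and record.get("time_s") is None:
        problems.append("detection recorded without a time")
    return problems

def edit_check_dataset(records):
    """Flag duplicate trials and per-record problems for DAG review."""
    seen, findings = set(), {}
    for rec in records:
        tid = rec.get("trial_id")
        probs = edit_check(rec)
        if tid in seen:
            probs.append("duplicate trial_id")
        seen.add(tid)
        if probs:
            findings[tid] = probs
    return findings

print(edit_check_dataset([
    {"trial_id": 1, "condition": "day", "detected": 1, "time_s": 9.8},
    {"trial_id": 1, "condition": "day", "detected": 2, "time_s": None},
]))
```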

4-25. Data release a. The AST members have access to the data as they are collected and reduced to allow for preliminary analysis. Early analysis may help determine if a problem exists in the test, collection, and/or reduction procedures. The T&E WIPT members have access to DAG level 3 data at the same time as the system evaluators. b. Premature release of level 1 or 2 data, unauthenticated level 3 data, or emerging results outside of the AST can be detrimental to the final reporting process. Assessments made prior to the complete analysis of test results can be misleading, can be found to be incorrect when the complete set of test data is thoroughly analyzed, and can cause biases that are difficult to overcome. See ATEC Regulation 73-1 for detailed information on release of DT and OT data, including exceptions to this general rule. c. Table 4-3 defines the data levels. The database is designed before it is populated. In summary, the levels are as follows: Levels 1-3: data processing (validating, verifying, and storing data). Level 4: descriptive statistics (describing a sample). Level 5: inferential statistics (making inferences about the population that the sample represents).
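To illustrate the distinction drawn in paragraph 4-25c between level 4 (descriptive) and level 5 (inferential) statistics, the sketch below reports an observed probability of detection for the sample and then a 95 percent confidence interval for the population value. The numbers and the normal-approximation interval are illustrative assumptions, not a prescribed ATEC method.

```python
from math import sqrt

def detection_summary(detections, opportunities, z=1.96):
    """Level 4: describe the sample. Level 5: infer about the population
    using a simple normal-approximation 95% confidence interval."""
    p_hat = detections / opportunities                      # descriptive (level 4)
    half_width = z * sqrt(p_hat * (1 - p_hat) / opportunities)
    return {
        "observed_p_detect": round(p_hat, 3),
        "ci_95": (round(max(0.0, p_hat - half_width), 3),   # inferential (level 5)
                  round(min(1.0, p_hat + half_width), 3)),
    }

print(detection_summary(detections=41, opportunities=60))
```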

182 Table 4-3. Data Level Definitions Data Level Description Possible Sources Examples of Content Disposition Level 1 data raw data Data in their original form. Results of field trials just as recorded. Raw data. Complete data collection forms, filled questionnaires, interview transcripts, test logs. Instrumented data including audio and video records, data bus, position, location, etc. 1. All reported target detection 2. Clock times of all tests 3. Impact locat5ion of each round fired. 4. Recordings of interviews Accumulated during trials for processing. Usually discarded after use. Not published Level 2 data processed data Raw data are processed. Manual data are digitized. Data are manipulated and organized. All data fields are quality controlled and anomalous data are corrected. Invalid data are identified as such with supporting rationale. Data are housed in a relational database with appropriate links between independent and dependent variables or flat files containing all the independent and dependent variables. Calculations limited to counting and elementary arithmetic (+-*/) to generate measures. All data processing completed. Annotated and corrected data collection forms and logs. Annotated, corrected and filtered (delete/extract) instrumented data. Original raw data with No Test events identified.. Level 1 data that has been processed. 1. Record of all valid detections. 2. Start and stop times of all applicable events. 3. Elapsed times. 4. Round impact locations corrected as necessary. 5. Miss distances. 6. Confirmed interview records. Produced during processing. Made available to the DAG for their review. Stored in electronic files. Not published. Level 3 data authenticated data Data which have been checked for accuracy and arranged in convenient order for handling. Operations limited to counting and elementary arithmetic. Spreadsheet, tables, typed lists, ordered and labeled printouts, purified and ordered tape, edited film, edited magnetic tapes. 1. Counts of detections arranged in sets showing conditions under which detections occurred. 2. Elapsed times by type of event. 3. Impact points of rounds by condition under which fired. 4. Interview comments categorized by type. Usually not published but made available to analysts. Stored in institutional data banks. Level 4 data findings or summary statistics Data that have been authenticated by the DAG as representing the most credible source of information as to what actually occurred during the test. Level 2 data that has undergone the authentication process. 1.Record of all valid detections. 2. Elapsed times. 3. Missed distances. 4.Confirmed interview records. Usually not published but made available to analyst. Stored in institutional data banks ATEC Pamphlet June 2010

183 Table 4-3. Data Level Definitions (Continued) Data Level Description Possible Sources Examples of Content Disposition Level 5 data analysis or inferential statistics Data resulting from statistical tests of hypothesis or interval estimation. Execution of planned analysis data Includes both comparisons and statistical significance level. Judgments limited to analyst s selection of techniques and significant levels. Results of primary statistical techniques such as T-tests, Chisquare, F-test, analysis of variance, regression analysis, contingency table analyses and other associated confidence levels. Follow-on tests of hypotheses arising from results of earlier analysis, or fallback to alternate nonparametric technique when distribution of data does not support assumption of normality. Qualitative data in the form of prevailing consensus. 1. Inferred probability of detection with its confidence interval. 2. Significance of difference between two mean elapsed times. 3. Significance of difference between observed probable error and criterion threshold. 4. Magnitude of difference between categories of comments. Published in evaluation reports. (If evaluation report is part of test report, the level 5 analysis results are presented separately from the level 4 findings.) Level 6, data extended analysis or operations Data resulting from further analytic treatment going beyond primary statistical analysis, combination of analytic results from different sources, or exercise of simulation or models. Judgments limited to analysts choices only. Insertion of test data into a computational model or a combat simulation, aggregation of data from different sources observing required disciplines, curve fitting and other analytic generalization, or other operations research techniques such as application of queuing theory, inventory theory, cost analysis, or decision analysis techniques. 1. Computation of probability of hit based on target detection data from test combined with separate data or probability of hit given a detection. 2. Exercise of attrition model using empirical test times distribution. 3. Determination of whether a trend can be identified from correlation of flash base accuracy data under stated conditions from different sources. 4. Delphi technique treatment of consensus of interview comments. Published as appropriate in evaluation reports. Level 7 data conclusions or evaluation Data conclusions resulting from applying evaluative military judgments to analytic results. Stated conclusions as to issues, position statements, challenges to validity or analysis. 1. Conclusion as to whether probability of detection is adequate. 2. Conclusion as to timeliness of system performance. 3. Conclusion as to military value of flash base accuracy. 4. Conclusion as to main problems identified by interviewees. Published as the basic evaluative conclusions of evaluation reports Data retention and disposition To the extent practical, ATEC will retain T&E information using electronic media, using the most convenient format. Table 4-3 presents guidelines for a minimum retention period of ATEC T&E information and maximum length of time between subsequent reviews. ATEC Pamphlet June

Table 4-4. Data Retention Guidelines (Data Category: Retention)
Raw data, data in its original form (Level 1): Retained for 1 year after end of event.
Audio/video tape and film (Level 1): Retained for 1 year after end of event.
Written Level 2 data: Retained for 1 year after end of event.
Processed and smoothed automated instrumentation data (Level 2): Archived in VDL for 1 year after end of event.
Test database of record (Level 3): Archived in VDL permanently.
Plans and reports (Levels 4-7): Archived in VDL permanently.
Supplemental analyses (Levels 4-7): Archived in VDL; 3 years for non-oversight programs, 10 years for oversight programs.
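Where retention is managed with scripts, the Table 4-4 rules can be carried in a small machine-readable form. The rendering below is a hypothetical sketch only, with made-up keys; the authoritative guidance remains the table above.

```python
# Hypothetical machine-readable rendering of the Table 4-4 retention rules,
# e.g., for an archive clean-up or review-reminder script.
RETENTION_RULES = {
    "level_1_raw_data":            "1 year after end of event",
    "level_1_audio_video_film":    "1 year after end of event",
    "level_2_written":             "1 year after end of event",
    "level_2_instrumentation":     "VDL, 1 year after end of event",
    "level_3_record_database":     "VDL, permanent",
    "levels_4_7_plans_reports":    "VDL, permanent",
    "levels_4_7_supplemental":     "VDL, 3 years (non-oversight) / 10 years (oversight)",
}

for category, rule in RETENTION_RULES.items():
    print(f"{category}: {rule}")
```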

Chapter 5 Test and Evaluation Reporting 5-1. Overview A primary mission of ATEC is to provide essential information to decision makers. ATEC evaluations will provide an assessment beyond platform-level effectiveness, suitability, and survivability by extending those concepts into an evaluation of operational capabilities and limitations. ATEC evaluates and reports on a system's capabilities and limitations at multiple levels of complexity, starting with the individual system or its major subsystems and culminating with an evaluation of the contribution of the system to the unit's missions. ATEC reports both on the information obtained from specific events (DT, OT, LFT, other), in test reports for each event that document event conditions, limitations, and results, and at the system level, which is the evaluation of the system in terms of mission performance, operational effectiveness, suitability, and survivability. Figure 5-1. T&E Reporting Process Section I Test Reporting 5-2. Test reporting policy ATEC provides written test reports and authenticated event databases as input to the OAR, OMAR, and OER provided to support program decision requirements. The final DT Report (DTR) is provided to the AST and to the T&E WIPT as soon as possible. The AOTR is the normal document of record that is provided to the evaluator. It reports the data and information

186 obtained from the conduct of test and describes the conditions that actually prevailed during test execution and data collection. The DAG process is used in the evaluation process for the system. a. For each test event, a test report will be completed. This report may be called by different names depending on the event. Test report formats may be modified to accommodate any peculiar circumstances associated with the event. The test report should fully document the activities and results of the test. The test activity that conducts the test will prepare, approve, and publish the test reports in coordination with the AST. DTC test reports are provided to the test sponsor (Materiel Developer/PM) for review and comment before approval. b. ATEC does not use unit names or designations in its test reports. c. All test reports will have a disclaimer stating this is a test report and not the ATEC evaluation report. A sample disclaimer is: This test report documents findings from the (xyz) test conducted at Test Center (Z), from (date) through (date). It must not be construed as the Army Test and Evaluation Command system assessment or evaluation report for the (name of tested) system. d. For all acquisition evaluated tests, including Life Fire Tests, the following disclaimer will be placed in any section in the report where recommendations are presented (executive summary, etc.) directly preceding the recommendations. The following recommendations are suggestions for improvement based on tester observations made during this test program. They do not reflect final ATEC evaluation recommendations, which will be developed based on the results of this test, modeling and simulation outcomes, and/or other data sources, as applicable Developmental Test Report a. The DTR is provided by the activity conducting the test to the AST and to the T&E WIPT as shown in figure 5-3. b. For long tests, subtest reports may be provided in the interest of timeliness. Final, formal test reports are required prior to decision reviews. c. Test reports for contractor conducted DT are approved by the Materiel Developer. Test reports for government conducted DT in support of an evaluation are approved at DTC HQ or by another performing government test organization. d. DTC s Test Reports will be as follows: (1) An Interim Data Report (IDR) is developed as a tool to respond directly to the data requirement of the DTP. The IDR is prepared for those tests in direct support of an evaluation when the test results are on a critical path to provide the data to the system evaluator to meet the requirements to support a MS decision. The HQ test manager determines if an IDR should be developed based on the timeline requirements of each test program. The HQ-approved IDR goal is 45 days from end of test or subtest. End of test is defined as the day the last data is collected 5-2 ATEC Pamphlet June 2010

187 (i.e., last shot fired, last mile run). The HQ test manager may modify the 45-day goal for specific projects when coordinated within the AST. (2) A Final Test Report (FTR) is also prepared for all evaluated tests. If an IDR has been developed, it will be used as the basis for the FTR. The FTR may be tailored to address the needs of each individual test event. The HQ approved FTR is distributed within 75 days from end of test or the date negotiated within the AST. For non-evaluated tests of DOD acquisition items, an FTR will be prepared unless the Test Record (see below) is used. (3) The Test Record is used for ballistic test requests, customer acceptance and surveillance tests, and to report tests that use military standards and specifications. e. Test reports prepared for customer tests that provide data not expected to be used to support a MDA review may be formatted to support the customer s requirements. Customer support-only activities, such as data collection, range support, or fabrication/consultant efforts, may not require a typical report format. For these programs the Test Center, in coordination with the DTC Test Manager, will develop a suitable documentation method to summarize the support activity and accomplishments of the efforts. Typical methods may include activity reports, or a test record at conclusion of the support effort Live Fire Test Report a. DTC prepares the final Full-Up System Level (FUSL) Live Fire Test (LFT) report. The data from this report is included in the ATEC assessment or evaluation report (OMAR/OER). The FUSL LFT report is provided to DOT&E within 120 days of the end of test. b. There are many LFT&E documents. Figure 5-2 presents an outline schedule of LFT&E events that, if followed, will result in a timely and effectively executed LFT&E program. The schedule for the OTA TP, Final TR, and OER are mandated requirements. ATEC Pamphlet June

Schedule: Event (Lead)
Pre MS A: LF Working Group Formed (AEC)
MS A: Initial TEMP Input (Part of T&E Strategy) (AEC)
Before MS B: Submit/Approval of Waiver (PM)
MS B: Detailed TEMP Input; Detailed SEP Input (AEC)
E-180*: LF OTA TP (AEC)
E-60*: DTP (DTC)
E-60*: Pre-Shot Prediction Report (ARL/SLAD)
E: Live Fire Test (DTC)
E+60: Final Test Report (TR) (DTC)
E+110: OER for FRP Decision (AEC)
E+120: Final TR and OER to DOT&E (ATEC thru TEO to DOT&E)
E+180: Model Comparison Report (SLAD)
* This scheduling guideline pertains to the FUSL LFT&E phase. Timelines may vary for other LFT&E phases. The LFT&E Strategy (located in the TEMP) identifies which plans require DOT&E approval vs. DOT&E review/informal coordination.
Figure 5-2. LFT&E Documents Timeline
5-5. Operational Test Reports The results of all OTC-conducted operational tests and experiments are documented in an OTR or AOTR. The AOTR is the standard reporting document. The OTR may be used as an exception to policy if approved by the OTC Commander. The OTR contains results and conclusions for issues and criteria, supporting analysis, and other pertinent information. Data to support the conclusions are obtained during an OT, experiment, or other event conducted by OTC. An AOTR provides the description of a test and the actual conditions, test limitations and impacts, and test team observations. The AOTR does not provide detailed results, analysis, or other analytical or assessment information. A similar process applies to those operational tests conducted by the TECOs for OTC. a. OTR. The purpose of the OTR is to provide the results of a test event conducted on a system or concept that includes findings and analysis of data collected during the test event. The OTR is prepared, staffed, and finalized to meet a goal approval date of end of test (E) +60 days. In order to meet the goal of an approval date at E+60 or no later than MDR-45 for an

189 OMAR/OER, the Test Officer will provide a draft OTR no later than E+30 days. The OTR is approved and signed by the Commander, OTC or designee. (1) The OTR and the authenticated event database for the event are the primary data sources provided to the AST as input to development of the overall evaluation. (2) The OTR describes the conduct of the test to include conditions during test execution, any limitations, and any deviations from the approved OTA TP. It also includes the test officer s observations. (3) For tests not in direct support of an acquisition system, the OTR stands alone as the report of the test effort and provides detailed results to the test sponsor, to other interested activities, and to archives. b. AOTR. The purpose of the AOTR is to provide the test organization a medium for forwarding information on the test description, limitation, and observations to the evaluator. (1) The AOTR contains a description of the operational test and the actual conditions, test limitations, and final test team observations. The tester does not assess any of the data collected; provide test results, analysis, or conclusions. (2) The AOTR will be approved not later than E+20 days. c. The authenticated level-3 databases for an OTR and AOTR will be provided to the AST and designated agencies of the acquisition community under separate cover no later than end of test + 10 days (E+10). The preliminary test team observations will also be provided with the database. d. Distribution of the approved OTR or AOTR will be to the AST members, T&E WIPT members, and other activities, as appropriate Test Incident Report (TIR) TIRs are used as a means to keep the PM, Capabilities Developer, and others participating in the acquisition process informed of a systems performance during tests. A TIR is used to capture the minimum essential data on test incidents as they occur. It contains test incident (TI) data and corrective action (CA) data that are merged together by Army Test Incident Reporting System (ATIRS) at APG, Maryland. ATIRS is a restricted database that stores all test incident and corrective action information. Detailed procedures for the reporting of DT and OT results and corrective action information to the ATIRS are contained in AR 73-1 and DA Pamphlet a. TIRs contain the minimum essential data for test incidents and corrective actions. The TIR contains two types of data. One type consists of TI data prepared by the tester. The other type consists of CA data that is prepared by the PM. These two data types are merged together by ATIRS. The TIR process proceeds as follows: (1) The tester (government or contractor) prepares TIRs for all tests required in the TEMP. ATEC Pamphlet June

190 (2) The PM prepares CA data for input into ATIRS for critical and major TIRs. All TIRs are considered for corrective action, and the TIR should reflect action taken with supporting rationale. (3) TI and CA data are entered into ATIRS. ATIRS provides an Army standard method of electronically exchanging, storing, processing, and reporting data on results of testing, their corrective actions and other test-related information. (4) A corrective action review team composed of the PM, Capabilities Developer, and system evaluator review all CA data and associated TI data, to verify that proposed corrective actions are appropriate and effective. The testers are advisers to the team. b. Production and post-production tests of ammunition are excluded from submission of TIRs. Reporting procedures in AR 73-1 are used for these items. c. Since TIR data are transmitted, stored, and interactively accessed via unsecured media, care must be taken to ensure that documents provided to ATIRS contain no classified information. Classified information will be published separately in a classified TIR and distributed per the listing agreed to by the T&E WIPT members. Additionally, an unclassified TIR referencing the classified TIR is provided to ATIRS. d. Access to ATIRS databases is requested through the ATIRS administrator. ATIRS is administered by the Aberdeen Test Center located at Aberdeen Proving Ground, Maryland. As a default, Government users will have open access to ATIRS databases, unless the data is restricted by the PM or tester. The ATIRS administrator has full authority to grant access to databases not restricted by the PM or tester. All contractors are restricted to those data authorized by the PM or tester Technical notes a. Technical notes are written on the initiative of SCA personnel or on the initiative of SCAs or on the initiative of individuals, to supplement normal ATEC reports and to preserve lessons learned, analytic techniques, methodologies, and supplemental data and information on systems under test. Technical notes are also used, as needed, to support the SAR, briefings, professional forum, etc. b. The target audience of technical notes includes future testers, evaluators, and other researchers. They may also be applicable to professional, academic, and technical symposia and publications. c. ATEC policy encourages the expression of professional, academic, and technical decisions by ATEC personnel or support contractors regarding the results of a T&E event. Since they represent the command, the ATEC chain of command will approve any submissions and presentations in advance. This assures the content of any technical note either accurately represents ATEC policies and procedures, or is clearly identified as personal opinion stated prominently at the beginning of the presentation. The technical note must be thorough and professional, in order to favorably reflect upon the command. 5-6 ATEC Pamphlet June 2010

d. Technical notes require certain introductory and explanatory material to make them stand-alone documents; they vary substantially in size and form depending on the scope of the note. 5-8. Test reporting timelines Test document timelines vary. Figure 5-3 provides a no-later-than date for each report. For the test reports to be of any value, they must be completed, approved, and provided to the system evaluator in a timely manner, i.e., by the due date of the assessment or evaluation report.
Developmental Test Reports: Interim Data Report (IDR), E+45 days; Final Test Report (FTR), E+75 days.
Operational Test Reports: Database to Evaluator, E+10 days; Abbreviated Operational Test Report (AOTR), E+30 days; Operational Test Report (OTR), E+60 days.
Figure 5-3. Test Report Documents Timeline
Section II System Reporting 5-9. Evaluation reporting The policies described in this document apply to all evaluated systems regardless of ACAT and/or acquisition strategy. ATEC assessments and evaluations document ATEC findings and recommendations throughout the life cycle of a system. The reporting documents vary depending on individual system needs. ATEC employs multiple evaluation documentation processes: ACAT I, II, and OSD oversight list programs; ACAT III programs; and rapid initiatives. The AST will generally adhere to the process described but can tailor the process as appropriate to best match the conditions of individual systems. In either case, the purpose is to ensure a logical progression through a series of phases designed to reduce risk, ensure affordability, and provide adequate information for the decision-makers. ATEC position statements will be attached to the front of all reports. a. The system reporting process is characterized by development of an assessment or an evaluation report and/or briefing on evaluation results. The end product of the T&E mission is a well-written report, subsequent briefings, and coordinated, well-documented ATEC findings to the Chief of Staff of the Army (CSA) (as appropriate), with the milestone decision authority (MDA) and T&E WIPT as secondary recipients. b. Preparation of assessment/evaluation briefings should be viewed as a verbal and visual presentation of the report and not as a separate process. Presentation graphics should start with data displays from the report and especially from the report executive summary. Wording of conclusions should exactly match those in the report.

192 c. The assessment/evaluation reporting process documents ATEC s findings throughout the system s life cycle. The reports developed by the evaluator are the OTA Milestone Assessment Report (OMAR), OTA Evaluation report (OER), OTA Follow-on Evaluation Report (OFER), tailor and abbreviated OMAR/OER/OFER), OTA Assessment Report (OAR), and the system analysis report (SAR). Figure 5-1 depicts the ATEC reporting process. d. Evaluations and assessments contain data summaries, analyses, program constraints, T&E limitations on the evaluation, and conclusions with support rationale that clearly identify ATEC s position on the potential capability of the system to fulfill the HQDA approved requirements Overview of ATEC s assessment and evaluation reports a. ATEC s goal is to publish an OMAR/OER no later than MDR-45 days or within 60 days of the end date of the last data collection event, which is a primary data source for the report and is an identified test event within the TEMP. It delivers timely, factual, concise, comprehensive and accurate results of completed events to decision authorities. Each report is an executivelevel document that must be concise. The reports should only contain the data or information necessary to support the findings. For non-oversight systems, ACAT III programs the levels of evaluation process will be followed. The suggested format is provided in appendix F. b. The report is defined as mission-oriented and provides the decision authority with an independent assessment or evaluation of the system s performance and operational effectiveness, suitability, and survivability for each milestone decision review. The report document is written with respect to the milestone it is providing input to (e.g. milestone, full rate production, or materiel release). c. The evaluator has the responsibility for the development of the ATEC reports. Other AST members will provide support OTA Milestone Assessment Reports (OMARs) OMARs are written in support of milestones (MS) B and C. The format is different between MS B and MS C. For multi-service OT programs where ATEC is not the lead, the AST will provide Army unique input to the lead OTA by means of a document called Army Input to Evaluation Report (AIER). Timelines will be documented in ADSS. a. OMARs in support of MS B. The OTA Milestone B Assessment Report(s) focuses on the risk assessment of the program, and examines the current developmental risks, from component level to system of systems and integration level. The assessment also takes into consideration the PM s planned risk mitigation activities, and determines the risk at key stages of the development process as data permits. When data supports it, the OTA Milestone B Assessment Report will include a projection of the system s operational effectiveness, suitability, and survivability. (1) The objective of the risk analysis documented in the OTA Milestone B Assessment Report is to define the likelihood of occurrence of a risk and potential consequences with enough 5-8 ATEC Pamphlet June 2010

193 specificity to support decisions, to include the decision to proceed or not. Risk mitigation plans should be addressed based on available data. Risks may be known or unknown. A known risk is typically an event where the likelihood and the consequence can be defined with reasonable confidence. Unknown risks are those where either the likelihood of occurrence or potential consequences cannot be accurately estimated. (2) The first step in the risk assessment methodology process is to create a dendritic. This dendritic depicts the development and integration of the system from component level including both hardware and software, through subsystem, to system or system of systems level, including both hardware and software. The risk analyst should choose the appropriate level of detail to describe the integrated system and examine risks associated with effectiveness, suitability, and survivability, including enablers such as modeling and simulation. System of Systems System Performance Supportability & Training Survivability System Integration Integrated Simulation System Level Weapons Power System Communications Sub-system Level Hardware Software Component Level Figure 5-4. Risk Assessment Dendritic Example (3) The second step in the risk assessment methodology process is to identify the major risk areas in the program. These may include, for example, risks associated with new technology development for system components, integration of components into major assemblies or platforms, system of systems interoperability, system supportability, or development of modeling and simulation capabilities. Participation from AST members, materiel and capabilities developers is encouraged. Participation from all risk assessment stakeholders (e.g., AMSAA, ASA(ALT), ARL) is encouraged. Ultimately, however, the risk assessment OTA Milestone A or B Assessment Report is an independent view of program risk produced by ATEC. ATEC Pamphlet June

(4) The next step is to tailor the risk assessment techniques to fit the program. ATEC merges two risk assessment techniques to assign a risk rating: the likelihood of meeting requirements and the consequences of not meeting requirements. The discussion that follows provides a basic template for development of a meaningful risk assessment; not all programs will fit this structure. During risk analysis, the participating members must define the appropriate risk levels, readiness levels, and assessment criteria. The assessment techniques require the analyst to gather all relevant program and technical information available.
(a) The first technique assesses the likelihood that the risk area will satisfy requirements at a certain point in the development timeline. In this technique, the analyst determines the readiness level of the particular concern based on available information. Using table 5-1, the analyst determines the risk for each readiness level. For example, suppose a new system requires reapplication of existing technologies. If analysis or simulations indicate a high probability of meeting requirements, this readiness level is medium-low risk. After risk has been assessed for each level, the analyst must apply judgment and assign an aggregate risk assessment for this area of concern. This aggregate represents the likelihood that the system will satisfy requirements at MS B.
Table 5-1. Readiness Levels and Likely Risk Levels
Low risk: analysis/simulations show meeting requirements over most conditions; all components developed with mature technology; integration demonstrated for all subsystems; system-of-systems interoperability certified; field test at the system level; tested and met requirements over a range of conditions.
Medium-low risk: analysis/simulations show meeting requirements over the majority of conditions; the majority of components developed, with undeveloped items based on mature technology; major subsystems integrated, the rest integrated on another system; interoperability with major systems and the network; field test of major subsystems; tested and met requirements for many conditions.
Medium risk: analysis/simulations show meeting requirements over many conditions; many components developed, with undeveloped items based on mature technology; major subsystems integrated, with testing and analysis indicating full integration is possible; interoperability with component systems; field test of some subsystems; tested and met requirements for some conditions.
Medium-high risk: analysis/simulations show a marginal likelihood of meeting requirements; the majority of components undeveloped, some based on state-of-the-art technology; some components integrated into subsystems; IERs defined; field test of some components; tested and met requirements for limited conditions.
High risk: analysis/simulations show requirements cannot be met over many conditions; components push the state of the art at the laboratory level; no integration demonstrated; no system-of-systems interoperability; no field test of systems or components; requirements not tested and met.
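The aggregate likelihood rating remains an analyst judgment. As a notional illustration only (not ATEC software; the category ratings below are invented), a simple starting point is to record a rating against each table 5-1 category and flag the most pessimistic one.

```python
# Notional sketch: record per-category likelihood ratings for one risk area and
# flag the worst as a starting point for the analyst's aggregate judgment.
LEVELS = ["Low", "Medium Low", "Medium", "Medium High", "High"]

readiness_ratings = {
    "Analysis/simulations": "Medium Low",
    "Component or subsystem readiness": "Medium",
    "Integration demonstrated": "Medium",
    "System of systems": "Medium High",
    "Field test level": "Medium",
    "Tested and met requirements": "Medium",
}

worst = max(readiness_ratings.values(), key=LEVELS.index)
print(f"Starting point for the aggregate likelihood rating: {worst}")  # -> Medium High
```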

(b) The second technique assesses the consequences of not meeting requirements, including cost, schedule, or performance requirements, based on program status at that point in the development timeline. Using table 5-2 below, the analyst assesses program status at a particular point in time and determines the risk level. The risk level column reflects the higher risk from either the cost/schedule or the performance column.
Table 5-2. Consequence Risk Levels
Cost/Schedule | Performance | Risk Level
Minimal or no impact | Minimal or no impact | Low
Additional resources needed to meet need dates | Acceptable with some reduction in margin | Medium Low
Minor slip in key milestone; not able to meet need date | Acceptable with significant reduction in margin | Medium
Major slip in key milestone or critical path impacted | Acceptable, no remaining margin | Medium High
Cannot achieve major program milestone | Unacceptable | High
(c) For example, if at MS B the analyst has observed a minor slip in schedule and the program is not able to meet the MS date, the consequence to the program would be medium risk.
(5) The two assessments, the likelihood of achieving requirements and the consequences of not meeting requirements, can then be combined to determine overall risk. Using table 5-3 below, the analyst locates the intersection of the likelihood and consequence risk assessments to determine overall risk. In the chart, overall risk is indicated by color; for example, if the likelihood risk was determined to be medium low and the consequence risk was determined to be medium high, the intersection is yellow, or medium overall.
(6) This risk assessment is performed for each risk area (e.g., component design, integration, supportability).

Table 5-3. Overall Risk Assessment Guide (likelihood rating in rows, consequence rating in columns)
Likelihood \ Consequence | Low | Med/Low | Medium | Med/High | High
High | Med/Low | Medium | Med/High | High | High
Med/High | Med/Low | Medium | Medium | Med/High | High
Medium | Low | Med/Low | Medium | Med/High | High
Med/Low | Low | Med/Low | Medium | Medium | Med/High
Low | Low | Low | Med/Low | Medium | Med/High
(7) The next step in the risk assessment is to collaborate with the PM on risk mitigation strategies to enable program success. Several events over the developmental timeline should reduce or offset risk. Figure 5-5 depicts the reduction in risk that may be associated with key events in program development. In this waterfall chart, a risk area that is high at MS B may, by virtue of well-planned mitigation and developmental efforts, be projected to be low risk by MS C. This projection is based on the assumed success of planned risk mitigation activities. This information may be critical to acquisition decision makers. This risk mitigation analysis should be performed, as time and available data permit, for each risk area and be incorporated in the OTA Milestone B Assessment Report.
(8) The next step is to create a risk summary. The summary traces risk back to the developmental dendritic and depicts the assessed risk for each risk area across each of the key events. Figure 5-6 depicts an example of the risk summary.
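The lookups in tables 5-2 and 5-3 are mechanical enough to capture in a short script. The following is a minimal, notional sketch (it is not ATEC tooling); the values are transcribed from the tables above, the function names are illustrative, and the example reproduces paragraphs (c) and (5).

```python
# Minimal, notional sketch of the table 5-2 and table 5-3 lookups (not ATEC software).
LEVELS = ["Low", "Medium Low", "Medium", "Medium High", "High"]

COST_SCHEDULE = {
    "Minimal or no impact": "Low",
    "Additional resources needed to meet need dates": "Medium Low",
    "Minor slip in key milestone; not able to meet need date": "Medium",
    "Major slip in key milestone or critical path impacted": "Medium High",
    "Cannot achieve major program milestone": "High",
}
PERFORMANCE = {
    "Minimal or no impact": "Low",
    "Acceptable with some reduction in margin": "Medium Low",
    "Acceptable with significant reduction in margin": "Medium",
    "Acceptable, no remaining margin": "Medium High",
    "Unacceptable": "High",
}
# Table 5-3: rows are likelihood ratings, columns are consequence ratings (Low ... High).
OVERALL = {
    "High":        ["Medium Low", "Medium", "Medium High", "High", "High"],
    "Medium High": ["Medium Low", "Medium", "Medium", "Medium High", "High"],
    "Medium":      ["Low", "Medium Low", "Medium", "Medium High", "High"],
    "Medium Low":  ["Low", "Medium Low", "Medium", "Medium", "Medium High"],
    "Low":         ["Low", "Low", "Medium Low", "Medium", "Medium High"],
}

def consequence_risk(cost_schedule_status: str, performance_status: str) -> str:
    """Table 5-2: the higher (worse) of the cost/schedule and performance ratings."""
    ratings = (COST_SCHEDULE[cost_schedule_status], PERFORMANCE[performance_status])
    return max(ratings, key=LEVELS.index)

def overall_risk(likelihood: str, consequence: str) -> str:
    """Table 5-3: the rating at the intersection of likelihood and consequence."""
    return OVERALL[likelihood][LEVELS.index(consequence)]

# A minor schedule slip that prevents meeting the milestone date is a medium
# consequence; a medium-low likelihood with a medium-high consequence is medium overall.
print(consequence_risk("Minor slip in key milestone; not able to meet need date",
                       "Minimal or no impact"))   # -> Medium
print(overall_risk("Medium Low", "Medium High"))  # -> Medium
```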

Figure 5-5. Risk Mitigation Strategy (a waterfall chart projecting a risk area's rating from its current level at program start down to medium-low at the end of Phase 1 and low by MS C, based on planned risk mitigation activities across Phase 1 (12-14 months) and Phase 2 (36 months); key events shown along the timeline include Comp Qual/EDT, Sys Qual/PPT, PDR, CDR, and DRR)
(9) The final step in the risk assessment process is to establish measurable exit criteria for each risk area and each transition in the development timeline to track progress and manage risk throughout program development. This risk assessment may result in revisions to the acquisition strategy and program documents.
(10) Time and available data may not permit completion of this entire process; however, at a minimum the reported risk assessment will include identification of critical risk areas and the overall risk assessment based on the likelihood of achieving requirements and the consequences of not achieving requirements. Risk mitigation and assessments of risk beyond MS B should be undertaken as time and data permit.
b. OTA Milestone Assessment Report (OMAR for MS C). The OMAR documents the independent system evaluation findings and recommendations regarding a system's operational effectiveness, suitability, and survivability as well as a system's mission capability. It is provided at the MS C review and may be supported by a System Analysis Report (SAR). The SAR provides the detailed analyses that support the evaluation. The OMAR tells the essential parts of the story: the bottom line, the main arguments, and the data that support the arguments.
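Figure 5-6 (below) summarizes assessed risk for each risk area across the key program events. As a notional illustration only (not ATEC software; the area names and ratings are invented), such projections can be recorded as a simple mapping from risk area to a list of event/rating pairs.

```python
# Notional sketch of projected risk ratings across key events for a few risk
# areas, similar in spirit to the summary depicted in figure 5-6.
risk_area_projections = {
    "Component #1": [("Program start", "High"), ("MS B (now)", "Medium High"),
                     ("End of Phase 1", "Medium"), ("MS C", "Medium Low")],
    "System integration": [("Program start", "High"), ("MS B (now)", "Medium High"),
                           ("End of Phase 1", "Medium"), ("MS C", "Medium Low")],
    "Supportability and training": [("Program start", "Medium"), ("MS B (now)", "Medium"),
                                    ("End of Phase 1", "Medium Low"), ("MS C", "Low")],
}

for area, projections in risk_area_projections.items():
    trail = " -> ".join(f"{event}: {rating}" for event, rating in projections)
    print(f"{area}: {trail}")
```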

Figure 5-6. Risk Summary Example (the figure tabulates the assessed risk for each risk area at program start, now (MS B), end of Phase 1, and MS C; in the example, system-of-systems, system performance, and system integration risks fall from high or medium-high to medium-low or low, supportability and training risk falls from medium to low, and modeling and simulation risk falls from medium to low; component #1 and system integration risks are key drivers of the integrated and system-of-systems risk)

5-12. OTA Evaluation Reports/OTA Follow-on Report (OER/OFER)
The OER documents the independent system evaluation findings and recommendations regarding a system's operational effectiveness, suitability, and survivability as well as a system's mission capability. It is provided at the FRP Decision Review and is supported by a System Analysis Report (SAR). The SAR provides the detailed analyses that support the evaluation.
a. The OER tells the essential parts of the story: the bottom line, the main arguments, and the data that support the arguments.
b. For multi-service OT where ATEC is not the lead, the AST will provide Army-unique input to the lead OTA by means of a document called the Army Input to Evaluation Report (AIER). Timelines will be documented in ADSS.
c. The OFER provides additional information on the efficacy of corrective actions for system deficiencies found during the IOT. OFERs therefore are submitted to decision-making officials after the FRP decision is made.

5-13. Draft reports
The AST will prepare a draft report, whether an OMAR, OER, or OFER, to ensure the evaluation team is properly prepared to process data from the test event. The draft report will be more than just an outline shell for the final report. The following guidelines apply to this revised requirement for a draft report:
a. Since the final report is limited to 30 pages, the initial draft should conform to a similar constraint.
b. Provide brief discussions of the most important measures (no more than 30) from the SEP that are addressed within the OTA TP. Provide proposed data displays and discussions of planned analyses for each measure. Include discussions of the use of DT information within the evaluation and provide data displays for DT data as appropriate.

5-14. Tailored/abbreviated assessments and evaluations
As ATEC continues to conduct testing, assessments, experiments, and evaluations providing essential information to decision makers, it must make better use of resources, focus on systems presenting the greatest risk, increase the rigor of testing and evaluation, and produce more timely reports. To achieve this efficiency, tailored and abbreviated assessments and evaluations will be used for non-major programs. Paragraph 3-2 provides the details of the assessment process used to determine what level of evaluation is required for a non-major program.

5-15. OTA Assessment Report (OAR)
The OAR provides an evaluation of progress toward meeting system requirements at times other than milestones and the FRP decision. OARs often address only a subset of the overall evaluation requirements. The OAR may identify needed corrective actions; provide information on tactics, doctrine, organizations, and personnel requirements; project mission impacts based on what is learned; assess readiness for IOT; and evaluate the system's logistic supportability and MANPRINT design. The acquisition program structure should include assessments or evaluations beginning early in the development cycle and continuing throughout the system's life cycle. The scope of issues to be addressed by the OAR is flexible in that it may or may not cover all requirements related to performance, suitability, and survivability design. The OAR is not tied to an MDR, but is developed as required to execute the continuous evaluation process. The suggested format is provided in appendix F.
a. The OAR
(1) Is flexible in that it does not necessarily address all aspects of the system's performance, suitability, and survivability.
(2) May use data from a single event or series of events.
(3) Contains a safety confirmation provided by DTC as an appendix, if required.

b. The evaluator is the lead in development of the OAR. The other AST members provide support to the evaluator.

5-16. ATEC reports that support type classification and materiel release
AR 700-142 and DA Pamphlet 700-142 provide policies and procedures, respectively, for Type Classification and Materiel Release of equipment to be fielded to the Army. All materiel (including software) that is developed, jointly developed, acquired, procured, re-procured, distributed, fielded, delivered, used, converted, modified, upgraded, or managed by the Army must follow the policies outlined in AR 700-142 prior to fielding. Exemptions are listed in AR 700-142, paragraph 1-5.
a. ATEC provides an OMAR or OER addressing the effectiveness, safety, suitability, supportability, and survivability of assigned Army systems to ensure that the system meets all aspects of the capabilities document. The AST forwards the OMAR or OER to the PM with a cover memorandum stating the ATEC position on the proposed MR. The memorandum will address system-of-systems risks in releasing the system if it fails to meet software blocking and system-of-systems requirements.
b. ATEC will also provide an OMAR with an assessment of technical support, operational effectiveness, and survivability in support of Type Classification. A separate OMAR is not needed to support Type Classification; one OMAR may support both the Type Classification and the Materiel Release.
c. Materiel release falls into one of the following categories: full, conditional, training, and urgent. Ammunition and small arms that undergo continuous testing in their production environment usually receive a Readiness for Issue Certification (RFIC) rather than a Materiel Release. The Materiel Release Review Board (MRRB) recommends the type of materiel release after a comprehensive assessment of the total materiel system. The Commander of the appropriate AMC Major Subordinate Command (MSC) usually grants the materiel release approval for all systems. If an MRRB member (such as the independent logistician (DCS G-4), the test and evaluation organization (ATEC), or a support office) non-concurs with the recommended materiel release, and the MSC Commander cannot resolve the issue, the CG, AMC will decide the type of materiel release.
d. The types of system materiel release are:
(1) Full materiel release (FMR). A FMR is the formal certification that the materiel is safe, suitable (meets all of its performance requirements), and supportable (logistically) when used within stated operational parameters. This certification provides the authorization for a PM to proceed to a FRP decision review and to issue with no materiel conditions requiring further resolution. Criteria for a FMR are in AR 700-142, Chapter 4. ATEC is required to provide a materiel release position memorandum along with either an OER or an OAR.
(2) Conditional materiel release (CMR). A CMR results when all criteria for a FMR are not met and may occur when the AAE allows a program to proceed into FRP under a CMR; a program has no planned FRP as part of the approved acquisition strategy; a program fields LRIP materiel prior to FRP; or a post-FRP program prepares to field an upgrade that meets the applicability criteria for MR.

The ATEC response requires a position memorandum recommending conditional materiel release and must identify the conditions to be resolved before full materiel release can be considered.
(3) Urgent materiel release (UMR). A UMR is a limited certification that the materiel meets minimum safety requirements, is suitable based upon a requirement memorandum directed by an ONS or the DCS G-3/5/7, and is supportable logistically when used within stated operational parameters. The UMR approval package will specify the quantity, duration, and location of the UMR materiel. The UMR allows the PM to field the materiel rapidly to meet a capability shortfall.
(a) Receipt of a UMR request initiates action on the part of DTC. HQ ATEC DCSOPS issues an official tasking upon confirmation that the UMR is valid. Considering the need to expedite the ATEC response, HQ ATEC DCSOPS may direct the effort to begin prior to publishing an official tasking. The Close Combat Evaluation Directorate is the primary point of contact for all UMRs within AEC.
(b) DTC personnel will prepare a Safety Confirmation and coordinate as expeditiously as possible with all personnel involved. When there are limited or no data on which to base the Safety Confirmation, it will indicate the appropriate risk hazard levels. In rare cases, limited testing may be required. In all cases, expediency in assessing the safety of the system is of the highest priority.
(c) Electronic copies (e-mail, facsimile) of the Safety Confirmation or coordination statement will be forwarded to the AMC MSC Safety Office and to HQ ATEC DCSOPS through the materiel release mailbox and stored in the Vision Digital Library (VDL) by DTC.
(4) Training materiel release (TMR). A TMR is a limited certification that provides authorization for a PM to field or issue the materiel to TRADOC/GC schools and training sites for the express purpose of curriculum development and training of Soldiers. A TMR may include prototype or test materiel; materiel manufactured under conditions other than normal production; materiel that is incomplete; or materiel for which one or more of the requirements for FMR have not been met. Before a TMR approval, the PM will ensure that critical issues such as safety, availability of spare/repair parts, technical documentation, responsibility for maintenance support, and the other limitations of the materiel are identified and accepted by the trainer.
e. Software materiel release and software release. A software MR (SMR) or a software release (SR) action is required for changes in software and/or firmware, including programs, routines, and symbolic languages that control the functioning of the hardware and direct its operation (even when it is not part of a materiel modification). When the materiel is fielded through the MR process, the software associated with that materiel is simultaneously certified.
(1) When the materiel (system) and software both require MR, the software is released as part of the materiel (system).
(2) When the materiel (system) does not require a MR, but the software does, the software will undergo the SMR process on its own.

(3) Depending on the scope of the software change, software fixes (sometimes called patches) may be addressed using a SR, provided safety, suitability, and/or supportability are not affected.
f. Materiel systems approved for release must be safe, operationally effective, and logistically supportable. The existence of a residual hazard in the system shall not prevent full release if the appropriate decision authority has accepted the residual risk based on a system safety risk assessment. Additional information is located in AR 700-142.
g. ATEC's goal is to process Materiel Release requests as expeditiously as possible to decrease the time required to field a particular system to Army units. To help reduce the lag time for processing these requests and to better track them, a central address, MaterielRelease@us.army.mil, was established. The ATEC Headquarters Deputy Chief of Staff for Operations (DCSOPS) and the AEC Integrated Logistics Support (ILS) Evaluation Directorate provide points of contact to monitor incoming requests and process requests throughout the command. Upon receipt of a request, ATEC DCSOPS will issue a tasker, track progress, and deliver the final product to the requesting activity.
h. Currently, AEC ILS personnel forward mail from the Materiel Release address to the appropriate AST members listed in ADSS for action. If the system cannot be identified to an AST, the mail is forwarded to the AEC Program Specialist for the directorate that would be expected to have the lead. If the request is for an ATEC recommendation, the mail is forwarded to both ATEC and AEC DCSOPS.
i. Processing time for a Materiel Release request may vary depending on the type of request submitted. Processing times are negotiated between the project office and the AST.
j. ATEC prepares an assessment or evaluation to document evaluation results. The assessment or evaluation includes a memorandum that identifies the type of materiel release ATEC recommends and lists the conditions, if any, that would prevent a full release. It must also contain the DTC Safety Confirmation. Copies of the memorandum and supporting assessment are provided to the materiel release coordinators at the fielding AMC Major Subordinate Command (MSC), FORSCOM, TRADOC, the independent logistician, and the requesting office, as well as to the MaterielRelease@ATEC.Army.Mil mailbox.
k. An amended safety confirmation is provided when it is determined that a software change is likely to impact the safety of the total system.
l. ATEC coordinates with the independent logistician in forming the independent logistician position for materiel release.
m. ATEC participates in the materiel release process throughout the system's life cycle to ensure that the system continues to meet requirements following modifications, updates, and shelf-life extensions. AR 700-142 identifies the prerequisites for materiel release. ATEC is provided copies of all documentation (for example, the independent logistician position, safety statement, signed materiel fielding agreement, statement of supportability, statement of interoperability, and intra-Army interoperability certification).

n. A get-well plan is required for all systems that do not meet the full release criteria. It lists each condition that precluded a full release and describes the reason for the conditional MR. The plan includes each issue to be resolved, the interim solution, and the projected date for resolving each of the conditions, as well as the projected date for the full release when all conditions are eliminated. In addition, it identifies the proponent that will certify when each condition is corrected. Changes to a get-well date require MSC General Officer agreement. E-mail or memorandum concurrence is required from the condition proponent to close a condition. The AST lead or designated member e-mails a response after coordination within the AST. An AEC or ATEC memorandum is required for conversion to full materiel release after all conditions are closed. Conditions over three years old are tracked by an ASARC.
o. The Army Materiel Release Tracking System (MRTS) is used for reporting and tracking all materiel release activities. Each project office, through the supporting AMC MSC materiel release coordinator, maintains the MRTS. Several reports are available from MRTS, including forecasts for planned materiel releases and a record of approved materiel releases. It also includes get-well plans for less-than-full releases and several tailored reports to assist in tracking materiel release activity.
p. Fields are available in ADSS to assist the AST in tracking materiel release activity. Projected and completed Materiel Release Decision Dates can be added to the Milestones, Materiel Release Decisions field, which contains a date and purpose. Documents requesting or responding to materiel release activity can be added to Documents, Document Storage, which contains a published date, title, and the actual document.

5-17. System Analysis Report (SAR)
The purpose of a SAR is to provide the detailed analysis that supports ATEC findings as reported, but in a less restricted time frame than the OMAR/OER/OFER. The SAR will be produced by AEC 60 days after the OMAR/OER/OFER is completed. The SAR documents the analyses that were conducted but not presented within the OMAR/OER/OFER. The suggested format is provided in appendix F. Below are the general guidelines for the SAR.
a. The SAR is not constrained by a page requirement like the OMAR/OER/OFER. Since the SAR and OMAR/OER/OFER collectively provide the full scope of the analysis and evaluation efforts, each will include a reference to the companion report.
b. The SAR provides detailed analysis supporting an OMAR/OER/OFER. The analysis should be detailed enough to allow anyone to reconstruct the data and perform the analyses without having to rely on the original analyst's or evaluator's expertise. The SAR includes in-depth T&E analyses, causality investigations, and diagnostic excursions.
c. The SAR provides the analysis supporting an OMAR/OER/OFER only when the analysis is too detailed for inclusion in the OMAR/OER/OFER.
d. The SAR accounts for all issues and measures contained in the SEP when supporting an OER.
e. The approval authority is the AEC Director.

f. Whereas the OMAR/OER/OFER has a standard distribution schedule, the SAR will be distributed only upon individual requests submitted to ATEC.

5-18. System reporting timelines
The documents for multi-service OT&E will follow the guidance provided in the Memorandum of Agreement (MOA) between the Service OTAs. Actual dates may be adjusted based on the approved milestone dates.
OTA Assessment Report (OAR): as required.
OTA Milestone Assessment Report (OMAR) (includes tailored/abbreviated): E + 60 days (E + 90 for MOT&E) or no later than MDR - 45 days.
OTA Evaluation Report (OER) (includes tailored/abbreviated): same as above.
OTA Follow-On Evaluation Report (OFER): E + 60 days.
System Analysis Report (SAR): OER + 90 days.
Quick Look Reports: daily during OT.
Emerging Results Brief: as required.
Figure 5-7. System Assessment/Evaluation Report Timeline

5-19. ATEC Document Distribution Guidelines
The AST will make electronic notification/distribution of all POR ATEC documents. DCSOPS will make electronic notification/distribution of all CLRs, Safety Confirmations, and FOA EXSUMs. Table 5-4 lists the activities that routinely receive copies of ATEC's program-of-record documents. The AST Chair will prepare a cover memo with the 4 Ws on all oversight OMARs and OERs as well as special-interest CLRs. The cover memo will be used by the ATEC Command Group as an EXSUM when sending to selected Army leadership. Table 5-4 is to be used as a guide. DCSOPS and the SGS will maintain the address list for table 5-4.
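The offsets in figure 5-7 above, together with the reporting goal stated earlier in this chapter (no later than MDR-45 days or within 60 days of the last data collection event), reduce to simple date arithmetic. The following is a notional sketch, not ATEC software; it takes one reading of the goal (the earlier of the two dates), and the SAR offset follows figure 5-7.

```python
# Notional reporting-timeline arithmetic (illustrative only, not ATEC software).
# "E" is the end date of the last data collection event; "MDR" is the milestone
# decision review date. Offsets are calendar days.
from datetime import date, timedelta

def omar_oer_due(last_event_end: date, mdr: date, motoe: bool = False) -> date:
    """Earlier of E + 60 days (E + 90 for multi-service OT&E) and MDR - 45 days."""
    event_driven = last_event_end + timedelta(days=90 if motoe else 60)
    decision_driven = mdr - timedelta(days=45)
    return min(event_driven, decision_driven)

def sar_due(oer_published: date) -> date:
    """The SAR follows the OER by 90 days per figure 5-7."""
    return oer_published + timedelta(days=90)

print(omar_oer_due(date(2010, 3, 1), date(2010, 6, 15)))  # -> 2010-04-30
```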

Table 5-4. ATEC Distribution Guidelines
Activity | SEP | OTA TP | OMAR/OER | CLR/FOA
Vice Chief of Staff (VCSA) | No | No | Oversight | No
Director of the Army Staff (DAS) | No | No | Oversight | No
Milestone Decision Authority (MDA) | Oversight | No | All | All
Director, Operational Test and Evaluation (DOT&E) | Oversight | Oversight | Oversight | All
Director, Developmental Test and Evaluation (DDT&E) | Oversight | No | Oversight | All
Director, Test and Evaluation Office (TEO) | Oversight | Oversight | Oversight | All
Assistant Secretary of the Army (Acquisition, Logistics, and Technology) (ASA(ALT)) | No | No | Oversight | All
DCS, G-1 | Oversight | No | Oversight | All (1)
DCS, G-2 | No | No | No | All (1)
DCS, G-3/5/7 | All | No | All | All (1)
DCS, G-4 | All | No | All | All (1)
DCS, G-6 | All | No | All | All (1)
DCS, G-8 | All | No | All | All (1)
Training and Doctrine Command (TRADOC)/TRADOC Capability Manager (TCM) | All | All | All | All
Forces Command (FORSCOM) | No | No | No | All (4)
Army Materiel Command (AMC) | No | No | All | All
The Surgeon General (TSG) | All | No | Oversight | No
Joint IED Defeat Organization (JIEDDO) | No | No | No | All
REF | No | No | No | All
ATEC LNOs | All | All | All | All
PEO/PM | All | All | All | All
JITC | All | All | All | All
AMSAA | All | All | All | All
ARL (SLAD) | All | All | All | All
DTIC | All | All | All | All
ATEC Library + VDL | All | All | All | All
Notes: (1) As required. (2) Reports forwarded with a personal memorandum signed by the ATEC ED. (3) Copies provided to the ATEC Technical Library and, as required, to SCA libraries. (4) Includes FOA Team products (i.e., Quick Look, EXSUM, Quad Charts, etc.).
Legend: C&L: Capability and Limitation; DT: Developmental Test; FOA: Forward Operational Assessment; LF: Live Fire; OER: OTA Evaluation Report; OMAR: OTA Milestone Assessment Report; OT: Operational Test; OTA: Operational Test Agency; SEP: System Evaluation Plan.
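For illustration only, the guidance in table 5-4 can be encoded as a simple lookup. The sketch below is notional (not an official distribution list); the entries shown are transcribed from a few rows of the table above, and "Oversight" means oversight programs only.

```python
# Notional sketch of a table 5-4 lookup for a few activities (not an official list).
DISTRIBUTION = {
    # activity:                      (SEP,         OTA TP,      OMAR/OER,    CLR/FOA)
    "Vice Chief of Staff (VCSA)":    ("No",        "No",        "Oversight", "No"),
    "Milestone Decision Authority":  ("Oversight", "No",        "All",       "All"),
    "DOT&E":                         ("Oversight", "Oversight", "Oversight", "All"),
    "TRADOC/TCM":                    ("All",       "All",       "All",       "All"),
    "PEO/PM":                        ("All",       "All",       "All",       "All"),
}
COLUMNS = {"SEP": 0, "OTA TP": 1, "OMAR/OER": 2, "CLR/FOA": 3}

def receives(activity: str, document: str, oversight_program: bool) -> bool:
    """Return True if the activity routinely receives this document type."""
    rule = DISTRIBUTION[activity][COLUMNS[document]]
    return rule == "All" or (rule == "Oversight" and oversight_program)

print(receives("Vice Chief of Staff (VCSA)", "OMAR/OER", oversight_program=True))  # -> True
```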


207 Appendix A References Section I Required Publications The following publications are available on the APA Web site ( unless otherwise stated. DOD publications are available at AR 70-1 Army Acquisition Policy AR 73-1 Test and Evaluation Policy AR Department of the Army (DA) Information Security Program AR Manpower and Personnel Integration (MANPRINT) in the System Acquisition Process AR Type Classification, Materiel Release, Fielding, and Transfer DA Pamphlet 73-1 Test and Evaluation Guidelines DFAS-IN Manual FY Army Management Structure, Fiscal Year XX (changes every year) CJCSI G Chairman of the Joint Chiefs of Staff Instruction, Joint Capabilities Integration and Development System CJCSI D Interoperability and Supportability of Information Technology and National Security Systems, (IT/NSS) DOD R Department of Defense Financial Management Regulations (FMRs), Volumes 1-15 DODD E Information Assurance (IA) DODD Major Range and Test Facility Base ATEC Pamphlet June 2010 A-1

208 DODD Defense Acquisition System DODI Operation of the Defense Acquisition System JCIDS Manual Replaces CJCSM OMB Circular A-109 Major System Acquisitions Section II Related Publications A related publication is a source of additional information. The user does not have to read a related reference to understand this document. American National Standards Institute (ANSI) Z39.18 American National Standard for Information Sciences, Scientific and Technical Reports- Organization, Preparation, and Production AR 5-11 Management of Army Models and Simulations AR 5-12 Army Management of the Electromagnetic Spectrum AR 5-13 Total Army Munitions Requirements Process and Prioritization System AR Field Operating Agencies, Office of the Chief of Staff, Army AR Nuclear and Chemical Survivability Committee AR 25-1 Army Information Management AR The Department of the Army Freedom of Information Act Program AR 25-6 Software Configuration Management A-2 ATEC Pamphlet June 2010

209 AR Health Hazard Assessment Program in Support of the Army Materiel Acquisition Decision Process AR Policies and Procedures for the Acquisition of Medical Materiel AR 70-6 Management of the Research, Development, Test, and Evaluation, Army Appropriations AR 70-8 Soldier-Oriented Research and Development in Personnel and Training AR 70-9 Army Research Information Systems and Reports AR Fuels and Lubricants Standardization Policy for Equipment Design, Operation, and Logistic Support AR Publications and Reprints of Articles in Professional Journals AR Centers for Analysis of Scientific and Technical Information AR Use of Volunteers as Subjects of Research AR Department of the Army Sponsorship of Unclassified Scientific or Technical Meetings AR Standards for Technical Reporting AR Research, Development, Test, and Evaluation of Materiel for Extreme Climatic Conditions AR International Cooperative Research, Development and Acquisition AR Space Test Program Management AR Scientific and Technical Information Program ATEC Pamphlet June 2010 A-3

210 AR Engineering for Transportability AR Military Civilian Technical Transfer AR Airworthiness Qualification of U.S. Army Aircraft Systems AR DOD In-House RDT&E Annual Activities Report AR Independent Research and Development AR Survivability of Army Personnel and Materiel AR 71-2 Basis of Issue Plans (BOIP) and Qualitative and Quantitative Personnel Requirements Information (QQPRI) AR 71-9 Materiel Requirements AR Force Development Documentation-Consolidated Policies AR 75-1 Malfunctions Involving Ammunition and Explosives AR Environmental Protection and Enhancement AR Environmental Effects of Army Actions AR Training Devices Policies and Management AR Public Information AR Disclosure of Information and Visits and Accreditation of Foreign Nationals A-4 ATEC Pamphlet June 2010

211 AR Information Systems Security AR Productions requirements and Threat Intelligence Support to the U.S. Army AR The Army Safety Program AR Regulations for Firing Guided Missiles and Heavy Rockets for Training, Target Practice, and Combat AR Policies and Procedures for Firing Ammunition for Training, Target Practice, and Combat AR The Department of the Army Command and Control System (DACCS) AR Electronic Warfare (EW) Policy AR Operations Security (OPSEC) AR Human Factors Engineering Program AR Logistic Planning Factors and Data Management AR Integrated Logistic Support AR Inventory Management Supply Policy below the Wholesale Level AR Modification of Materiel and Issuing Safety-Of-Use Messages and Commercial Vehicle Safety Recall Campaign Directive AR Measure and Diagnostic Equipment program ATEC Pamphlet June 2010 A-5

212 ATEC Pamphlet Modeling and Simulation ATEC Regulation Instrumentation Development and Acquisition ATEC Regulation 73-1 System Test and Evaluation Policy ATEC Regulation Accreditation of Models and Simulations and Certification of Instrumentation for T&E DA Pamphlet 5-11 Verification, Validation and Accreditation of Army Models and Simulations DA Pamphlet 70-3 Army Acquisition Handbook DA Pamphlet Acquisition Management Milestone System DA Pamphlet ILS Program Assessment Issues and Criteria DA Pamphlet ILS: Developmental Supportability Test and Evaluation Guide DA Pamphlet Instructions for Preparing the ILS Plan DA Pamphlet ILS Managers Guide DA Pamphlet Instructions for Materiel Release, Fielding, and Transfer Defense Acquisition Guidebook DOD M-2 Foreign Comparative Testing DOD M-1 Data Element Standardization Procedures DOD M-x DOD Enterprise Data Model Development, Approval, and Maintenance Procedures A-6 ATEC Pamphlet June 2010

213 DODD Protection of Human Subjects and Adherence to Ethical Standards in DOD Supported Ressearch DODD Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS) DODD Defense Acquisition, Technology, and Logistics Workforce Education, Training and Career Development Program DODD Defense Acquisition University DODD DOD Modeling and Simulation (M&S) Management DODD M DOD Modeling and Simulation Glossary DODI DOD Modeling and Simulation (M&S) Verification, Validation, and Accreditation (VV&A) DODD Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) DODD Director of Defense Research and Engineering DODD Director of Operational Test and Evaluation (DOT&E) DODD Assistant Secretary of Defense for Networks and Information Integration/DOD Chief Information Officer (ASD(NII)/DOD CIO) DODD Single Managers Responsible for Military Explosive Ordnance Disposal Technology and Training (EODT&T) DODD Single Managers for Conventional Ammunition (SMCA) DODD Environmental Effects Abroad of Major Department of Defense Actions ATEC Pamphlet June 2010 A-7

214 DODD Management of Department of Defense Information DODD DOD Data Administration DODI Procedures for Interoperability, and Supportability of Information Technology (IT) and National Security Systems (NSS) DODI Joint Test and Evaluation (JT&E) Program DODI DOD Information Assurance Certification and Accreditation Process (DIACAP) Office of Management and Budget (OMB) Circular A-109 Major System Acquisitions Public Law (P.L) , 85 Statutes (STAT) 423, Title II First law that mentioned Test and Evaluation (T&E) Title 10, United States Code (USC), Section 139 Established the Director, Operational Test and Evaluation (DOT&E) 10 USC 2302 Definitions 10 USC 2302d Defines major system dollar threshold amounts 10 USC 2350a Cooperative research and development projects: allied countries 10 USC 2353 Contracts: acquisition, construction, or furnishing of test facilities and equipment 10 USC 2366 Major systems and munitions programs: survivability testing and lethality testing required before full-scale production 10 USC 2399 Operational Test and Evaluation (OT&E) of defense acquisition programs (also addresses contractor testing) A-8 ATEC Pamphlet June 2010

215 10 USC 2400 Low-rate initial production of new systems 10 USC 2430 Major defense acquisition program defined 10 USC 2432 Selected acquisition reports 10 USC 2435 Baseline description 10 USC 2681 Use of T&E installations by commercial entities U.S. Army Training and Doctrine Command (TRADOC) Pamphlet 71-9 Requirements Determination Process U.S. Army Materiel Command (AMC) Pamphlet Templates for Streamlining Acquisition ATEC s Web Site Address ATEC Pamphlet June 2010 A-9


217 Appendix B ATEC Liaison Offices B-1. Purpose As part of ATEC s early involvement strategy, ATEC HQ DCSOPS maintains a number of liaison offices. The T&E service discussed below are accessed through ATEC DCSOPS. B-2. Liaison Offices ATEC has Liaison Offices (LNOs) located throughout the continental United States. The offices are manned by experienced ATEC representatives capable of providing T&E insights into the early stages of the acquisition cycle. a. LNOs are co-located with key TRADOC and Army PEOs locations. Table B-1 provides the locations and contact information for all offices. Table B-1. LNO Locations and Contact Information TRADOC HQ LNO, Ft. Monroe, VA DSN Armor Center LNO, Ft. Knox, KY DSN /4782 Combined Arms Center LNO, Ft. Leavenworth, KS DSN PROGRAM EXECUTIVE OFFICES (PEO) PEO Ammunition LNO, Picatinny Arsenal, N.J. DSN PEO Aviation/Missiles/Space LNO, Redstone Arsenal, AL DSN PEO C3T LNO, Ft. Monmouth, N.J. DSN PEO CS & CSS LNO, Warren, MI DSN JPEO CBDP LNO, Falls Church, VA DSN PEO Ground Combat Systems LNO, Warren, MI DSN PEO IEWS LNO, Ft. Monmouth, N.J. DSN PEO Soldier/EIS, Ft. Belvoir, VA DSN PEO STRI LNO, Orlando, FL DSN Joint Forces Command, Norfolk, VA JIEDDO, Alexandria, VA Rapid Equipping Force (REF), Ft. Belvoir, VA /4244 Note: LNO locations and contact information will change due to BRAC. An update to this table will be provided when this occurs. b. LNOs roles and responsibilities are as follows: ATEC Pamphlet June 2010 B-1

218 (1) Monitor technical demonstrations (pre-ms A) for possible ATEC involvement or assistance; provide ATEC info on emerging systems and requirements; expedite entry of new program starts into ADSS/TSARC process; facilitate the flow of information between ATEC and PEO elements; and report on matters that impact T&E execution. (2) Review, comment, and assist with the preparation of T&E documentation. (3) Provide a focal point for program conflict/issue resolution. (4) Provide on-site liaison and coordination between ATEC elements and the PEO, PMs, APMs, and key staff members. (5) Provide T&E expertise, guidance and information. (6) Assist PEO/PMs with T&E Program Objective Memorandum (POM) development and justification. (7) Assist with Milestone reviews and definition of T&E acquisition regulatory requirements. (8) Provide support to ATEC elements visiting the PEO. B-2 ATEC Pamphlet June 2010

219 Appendix C Joint Capabilities Integration and Development System (JCIDS) Checklists C-1. Purpose The purpose of this appendix is to provide a standard process to review JCIDS documents by all members of the ATEC System Team. The checklists are divided into a review of the three capabilities documents: ICD (enclosure C-1), CDD (enclosure C-2), and CPD (enclosure C-3). The review should address gap discussion, mission area discussion, concept of operations (CONOPS) discussion, and detailed requirement review. For an explanation of the items listed, see CJCSM C. C-2. Comments Matrix Upon completion of the review, the AST may find it necessary to develop an effective comments matrix. This matrix will identify ambiguities and address omissions in the capabilities documents. The format for the matrix is at the discretion of the AST, but it must specify the following items: a. Page, paragraph, and sub-paragraph (if appropriate) of the text in question. b. The question. c. Rationale for the question. ATEC Pamphlet June 2010 C-1

220 Enclosure C-1 ICD Review Checklist The ICD documents the requirement to resolve a specific capability gap or a set of capability gaps for a given timeframe. It identifies possible materiel or non-materiel solutions to the gap(s). The ICD summarizes the results of DOTMLPF analysis and identifies any changes in U.S. or allied doctrine, operational concepts, organization, training, and policy, where applicable. The ICD may be developed as a single document defining required capabilities and approaches to providing those capabilities. The ICD may also be developed based on the analysis in an approved JCD combined with a completed Functional Solution Analysis (FSA) that addresses one or more of the capability gaps identified in the JCD. The ICD should include an analysis of the following areas: (1) Concept of Operations Summary. The concept of operations summary is a description of the relevant part of the Joint Operations Concepts (JOpsC) CONOPs and/or Unified Command Plan (UCP)-assigned mission to which this capability contributes. (2) Joint Functional Area. Cite the applicable functional areas, the range of military operations, and the timeframe under consideration. (3) Required Capability. Describe the capabilities required and the timeframe in which they are required as identified during the Functional Area Analysis (FAA). (4) Capability Gaps and Overlaps or Redundancies. Describe, in operational terms, the missions, tasks, and functions that cannot be performed or are unacceptably limited or when and how they will become unacceptably limited. Identify those capabilities for which there exist overlaps or redundancies. (5) Threat and Operational Environment. (a) Describe in general terms the operational environment, including joint operating environments, in which the capability must be exercised and the manner in which the capability will be employed. (b) Summarize the current and projected threat capabilities (lethal and nonlethal) to be countered. (6) Functional Solution Analysis Summary. Study ideas for non-materiel approaches (DOTMLPF analysis). Identify any changes in U.S. or allied doctrine, operational concepts, tactics, organization, training, materiel, leadership and education, personnel, facilities, or policy that are considered in satisfying the deficiency in part or in whole. (7) Integrated Architecture Products. Verify that the required architecture framework view (operational view (OV)-1) product is included. (8) Final Recommendations. Describe the best materiel and/or non-materiel approaches as determined by the FSA. C-2 ATEC Pamphlet June 2010

221 Enclosure C-2 CDD Review Checklist The CDD is the sponsor s primary means of defining authoritative, measurable, and testable capabilities needed by the Warfighters to support the EMD phase of an acquisition program. It captures the information necessary to deliver an affordable and supportable capability using mature technology within one or more increments of an acquisition strategy. The CDD must include a description of the DOTMLPF and policy impacts and constraints. The CDD will be validated and approved before Milestone B. (1) Capability Discussion. Describe the capability that the program delivers and how it relates to the characteristics of the future joint force as identified in the Capstone Concept for Joint Operations (CCJO), CONOPs, and integrated architectures. (2) Analysis Summary. Summarize all analyses (i.e., analysis of alternatives (AoA) and/or other support analysis) conducted to determine the system attributes and to identify the KPPs. (3) Concept of Operations Summary. Describe the relevant part of the JOpsC, CONOPs, and/or UCP-assigned mission to which this capability contributes; what operational outcomes it provides; what effects it must produce to achieve those outcomes; how it complements the integrated joint warfighting force; and what enabling capabilities are required to achieve its desired operational outcomes. (4) Threat Summary. Summarize the projected threat environment and the specific threat capabilities to be countered. (5) Program Summary. Provide a summary of the overall program strategy for reaching full capability and the relationship between the increment addressed by the current CDD and any other increments of the program. (6) System Capabilities Required for the Increment(s). Provide a description of each attribute and list each attribute in a separate numbered subparagraph. Include a supporting rationale for the capability and cite any analytic references. (7) Family of Systems (FoS) or Systems of Systems (SoS) Synchronization. In FoS and SoS solutions, the CDD sponsor is responsible for ensuring that related solutions, specified in other CDDs and CPDs, remain compatible and that the development is synchronized. (8) Information Technology and National Security Systems Supportability. For systems that receive or transmit information, provide an estimate of the expected bandwidth and quality of service requirements for support of the capability (on either a per-unit or an aggregate basis, as appropriate). (9) Intelligence Supportability. Identify, as specifically as possible, all projected requirements for intelligence support throughout the expected acquisition life cycle. ATEC Pamphlet June 2010 C-3

222 (10) Electromagnetic Environmental Effects (E3) and Spectrum Supportability. Define the electromagnetic spectrum requirements that the system must meet to assure spectrum supportability and the requirements for spectrum availability necessary for full use of system capabilities. (11) Logistics Supportability Assessment and Documentation. The ten Integrated Logistics Support (ILS) elements serve as a baseline to develop and document logistics supportability requirements for the materiel system. As applicable, the capability developer shall address each element when developing the CDD. Documenting logistics supportability requirements early in the developmental process is essential to assure the system s associated support structure is communicated to the materiel developer. (12) Assets Required to Achieve Initial Operational Capability (IOC). Describe the types and initial quantities of assets required to attain IOC. (13) Schedule and IOC and Full Operational Capability (FOC) Definitions. Define what actions, when complete, will constitute attainment of IOC and FOC of the current increment. (14) Other DOTMLPF and Policy Considerations. Discuss any additional DOTMLPF and policy implications associated with fielding the system that have not already been addressed in the CDD, to include those approaches that would impact CONOPs or plans within a combatant command s area of responsibility. (15) Other System Attributes. Address attributes that tend to be design, cost, and risk drivers, including environment, safety, and occupational health (ESOH); human system integration (HSI); embedded instrumentation; electronic attack (EA); information protection standards and information assurance (IA); and wartime reserve mode (WARM) requirements. (16) Program Affordability. Cost will be included in the CDD as life-cycle cost or, if available, total ownership cost. (17) Integrated Architecture Products. Verify that the required architecture framework view products include all views (AV)-1, OV-1, OV-2, OV-4, OV-5, OV-6C, systems view (SV)- 2, SV-4, SV-5, SV-6, and technical view (TV)-1. (18) Net Ready (NR) KPP requirements. Verify that the required Net Ready KPP products include (a) Net Centric Operations Warfare Reference Model (NCOW-RM) (b) NR-KPP statement. (c) IA Statement of Compliance. (d) Technical Standards/Interface Elements. C-4 ATEC Pamphlet June 2010

223 Enclosure C-3 CPD Review Checklist A CPD is finalized after post critical design review and is validated and approved before the Milestone C acquisition decision. Because a CPD is finalized after post critical design review and after the majority of capability development, it is normally not appropriate to introduce new requirements at this point. New requirements should be included in the next increment in an evolutionary program or in a future modification or upgrade. The CPD must include a description of the DOTMLPF and policy impacts and constraints. (1) Capability Discussion. Describe the capability that the program delivers and how it relates to the characteristics of the future joint force as identified in the CCJO, CONOPs, and integrated architectures. (2) Analysis Summary. Summarize all analyses (i.e., AoA and/or other support analysis) conducted to determine the system attributes and to identify the KPPs. (3) Concept of Operations Summary. Describe the relevant part of the JOpsC, CONOPs, and/or UCP-assigned mission to which this capability contributes, what operational outcomes it provides, what effects it must produce to achieve those outcomes, how it complements the integrated joint warfighting force, and what enabling capabilities are required to achieve its desired operational outcomes. (4) Threat Summary. Summarize the projected threat environment and the specific threat capabilities to be countered. (5) Program Summary. Provide a summary of the overall program strategy for reaching full capability and the relationship between the increment addressed by the current CPD and any other increments of the program. (6) System Capabilities Required for the Increment(s). Provide a description of each attribute and list each attribute in a separate numbered subparagraph. Include a supporting rationale for the capability and cite any analytic references. (7) Family of Systems or Systems of Systems Synchronization. In FoS and SoS solutions, the CPD sponsor is responsible for ensuring that related solutions, specified in other CDDs and CPDs, remain compatible and that the development is synchronized. (8) Information Technology and National Security Systems Supportability. For systems that receive or transmit information, provide an estimate of the expected bandwidth and quality of service requirements for support of the capability (on either a per-unit or an aggregate basis, as appropriate). (9) Intelligence Supportability. Identify, as specifically as possible, all projected requirements for intelligence support throughout the expected acquisition life cycle. ATEC Pamphlet June 2010 C-5

224 (10) Electromagnetic Environmental Effects (E3) and Spectrum Supportability. Define the electromagnetic spectrum requirements that the system must meet to assure spectrum supportability (11) Logistics Supportability Assessment and Documentation. The ten ILS elements (Maintenance Planning; Manpower and Personnel; Supply Support; Support Equipment; Technical Data; Training and Training Support; Computer Resources Support; Facilities; Packaging, Handling, Storage and Transportability (PHST); and Design Interface) serve as a baseline to develop and document logistics supportability requirements for the materiel system. As applicable, the capability developer shall address each element when developing the CPD. Documenting materiel system logistics supportability requirements early in the developmental process is essential to assure the system s associated support structure is communicated to the materiel developer. (12) Assets Required to Achieve Initial Operational Capability (IOC). Describe the types and initial quantities of assets required to attain IOC. (13) Schedule and IOC and Full Operational Capability (FOC) Definitions. Define what actions, when complete, will constitute attainment of IOC and FOC of the current increment. (14) Other DOTMLPF and Policy Considerations. Discuss any additional DOTMLPF and policy implications associated with fielding the system that have not already been addressed in the CDD, to include those approaches that would impact CONOPs or plans within a combatant command s area of responsibility. (15) Other System Attributes. As appropriate, address attributes that tend to be design, cost, and risk drivers, including ESOH, HSI, embedded instrumentation, electronic attack (EA), information protection standards and IA, and wartime reserve mode (WARM) requirements. (16) Program Affordability. Cost will be included in the CPD as life-cycle cost or, if available, total ownership cost. (17) Integrated Architecture Products. Verify that the required architecture framework view products include AV-1, OV-1, OV-2, OV-4, OV-5, OV-6C SV-2, SV-4, SV-5, SV-6, and TV-1. (18) Net Ready KPP requirements. Verify that the required Net Ready KPP products include (a) Net Centric Operations Warfare Reference Model (NCOW-RM). (b) NR-KPP statement. (c) IA Statement of Compliance. (d) Technical Standards/Interface Elements. C-6 ATEC Pamphlet June 2010

225 Appendix D Capability and Limitations Report (CLR) Writing Guide D-1. The format of the CLR is flexible and can be adjusted as necessary to meet the needs of individual programs. The CLR should be concisely written in plain English to facilitate understanding by nonscientific (non-t&e) personnel. The highlights of the report should be emphasized using editorial techniques such as bold text, underlines, bullets, and boxing of text. The CLR format should make it inviting and easy to read for Warfighters, who scan for information they need rather than reading cover to cover. D-2. Each CLR should contain the following major paragraphs: (1) Purpose. This paragraph states the purpose of the CLR, in order to put it in context for both the Warfighter and the acquisition community. It should read similarly to the statement below. This assessment provides a rapid report of the capabilities and limitations of the XXX System. It provides Warfighters with information, limitations, and recommendations for use in theater. It provides useful information to the acquisition authority and material developer for further system development. (2) Executive Summary (EXSUM). This paragraph can be located before or after the Purpose paragraph. When possible, it should be no longer than one paragraph and less than 150 words. It may also have a box or two with the most pertinent findings. The target audience is the Warfighter, so the summary should be clearly stated using ordinary language. Provide the bottom line up front. Include key information regarding capabilities, limitations, safety risks, important employment considerations, and critical interoperability issues. (3) Mission Need. This paragraph addresses the intended capabilities and use of the RAI system. If known, include the Operational Needs Statement (ONS) number and the requesting unit and/or organization in this paragraph. (4) System Description. This paragraph describes the system, and should include pictures or other descriptive graphics. Do not include a complete description of every subsystem or specifications. Write a summary description of what the RAI is and how it is designed to work. Do not duplicate the PM/developers description as this may give the casual reader the impression that system T&E results are being stated. This paragraph, as well as the Mission Need paragraph, should clearly state that these are the intended uses and capabilities of the system, not findings of the evaluation. (5) Data Sources. This paragraph describes the testing that was performed, if any, in support of the CLR. Since this is a report for Warfighters, it need not go into comprehensive detail. For example, state the Test Center, not the individual range on which tests were performed unless significant to the findings. Test dates may be included as a frame of reference for the reader. (6) Test Limitations. Include what could not be tested or assessed, why, and the impact. Be brief and concise. If there are no limitations then state so. ATEC Pamphlet June 2010 D-1

226 (7) Observations. Address the findings in each area. Use bulletized lists for each of these sections. List all demonstrated capabilities. Address capabilities that the Warfighter does not currently have. This section should only include what we know is true about the system. (a) Capability Observations. State the major capabilities that were observed during testing with respect to current system and mission profiles, and identify the impact on the ability of the Soldier to perform the mission using the RAI system. Be concise. If you are not sure of a capability then address why under Test Limitations. (b) Limitation Observations. List all known limitations. Ensure important limitations are included in EXSUM. Accurately describe how aspects of the system do not meet mission need. Be concise. If you are not sure of a limitation then address why under test limitations, or include information in the Unknown section. This section will only include what we know is true about the system s limitations. Write limitations in parallel, if possible, with capabilities. For example, if a system is compatible with some equipment but not others, you would have one capability bullet and one limitation bullet, ideally at the same place in each list (first bullets, second bullets, etc.) (c) Safety Observations. Include all safety issues in this section. Ensure safety information is consistent with the system safety confirmation. Even though there are very short suspense s, the AST must share draft CLR and draft Safety Confirmations within the AST to ensure that ATEC documents do not disagree. It is acceptable, however, to alert Warfighters to, and comment on, the mitigation strategies used for issue resolution in the Safety Confirmation. Explain High and Medium risks and provide options to mitigate risks. Include all High and Medium safety risks in an EXSUM. (d) Interoperability Observations. Include all known interoperability observations in this section. Address interoperability with Army, other U.S. agencies, and coalition systems. If there are unknown, critical interoperability issues state in this section AND in unknowns section. (e) Training Observations. Address any known training issues on the system, to include operator training, maintainer training, and new equipment training. Also include a statement if no or limited training has been or will be provided to the end users. If ATEC did not receive a training support package to review, include a statement similar to No system training support package has been provided for ATEC to evaluate. (f) Supportability Observations. Include RAM, logistic support, and MANPRINT, observations as well as repair capabilities with tools and equipment currently in units, or are new items required. Include maintenance support concepts including unique maintenance training/skills. (g) Survivability Observations. Include threat survivability observations. (h) Unknowns. Include key unknowns regarding system capabilities so the reader understands the full scope of the system assessment. List questions that ATEC would have explored had the resources (time, test facilities, funding, etc.) been available. An example of an unknown is whether the system is effective beyond 50 meters. D-2 ATEC Pamphlet June 2010

(8) Employment Considerations. Include important suggestions for using the system. Include operational considerations that enable Warfighters to use the system to their best advantage and mitigate risks.

(9) Recommendations. In this section, include recommendations for both the user and the acquisition community. Again, bulletized lists are the preferred format for each of these subsections. If some recommendations fit in both the user and acquisition paragraphs, list them in the main Recommendations section. Recommendations should be linked to earlier discussion, for example, a limitation or a shortfall in training or interoperability.

(a) User Recommendations. Include important recommendations to aid the user in system operation. Provide information such as employment, training, and logistical recommendations for the Combatant Commanders who will receive the equipment. For example, "Allow 10 minute system warm-up before use."

(b) Acquisition Recommendations. Include recommendations to the materiel developer that will render the system safer and more effective, suitable, survivable, interoperable, reliable, maintainable, and supportable. For instance, list product development, production, and support package recommendations that the materiel developer, Capabilities Developer, system contractor, or other acquisition community members should consider. If appropriate, add a statement recommending consideration of a legal review of the program before release of the system.

(10) References. Include all references used to create the report. Be sure to include the Safety Confirmation.

D-3. The overall highest classification of the document must be marked at the top and bottom of every page in accordance with AR 380-5, paragraphs 4-4 and 4-6, and Section 1, Marking Classified National Security Information, Oct 07.

a. In accordance with AR 380-5, paragraph 4-7, each classified document will be marked with the source of the classification on the front page of the classified CLR template, reflecting the following caveat:

Derived from: (followed by the name of the document or SCG used to classify the information)
Declassify on: As determined by SCG instructions
Date of Source: Date of SCG

b. In accordance with AR 380-5, paragraph 4-14, downgrading instructions are not required for every classified document, but they must be placed on the face of each document to which they apply. When the OCA has determined that a document will be downgraded to a lower classification upon the passage of a date or event, the document will be marked "Downgrade to SECRET on..." followed by the date or event, and/or "Downgrade to CONFIDENTIAL on..." followed by the date or event. This marking is placed immediately before the "Declassify on" line and is used in addition to, and not as a substitute for, declassification instructions. Indicated below are additional examples of derivative classifications:

**NOTE** SCGs still exist that call for use of either OADR or X1-X8 codes in the "Declassify on" line. These guides are now incorrect, since the use of these codes is no longer authorized. If the guide in question calls for the use of X1-X8, use September 23, 2028 in the "Declassify on" line unless receiving additional clarification from the Original Classification Authority (OCA). If the guide in use still calls for the use of OADR, use October 15, 2020 in the "Declassify on" line. These dates reflect 25 years from the last authorized use of these codes.

Document created before 14 Oct 95 (OADR) should be marked as follows:
Derived from: Multiple Sources
Declassify on: Source marked OADR
Date of Source: 1 September 1995

Document created prior to March 2003 (X1-X8) should be marked as follows:
Derived from: Multiple Sources
Declassify on: Source marked X2, X3, X5
Date of Source: 10 Feb 96 (*The most restrictive date must be used)

Document containing OADR and X1-X8 markings should be marked as follows:
Derived from: Multiple Sources
Declassify on: Source marked X2, X3 and OADR
Date of Source: 10 Feb 96 (Use the most recent date)

Rationale: The draft CLR coversheet does not currently reflect the classification caveat example or indicate where ATEC wants the instructions placed on the page.
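Because the legacy-code substitution in the NOTE above is a simple, mechanical rule, a short sketch can illustrate it. This is an illustrative aid only, not part of the required markings; the replacement dates are the ones stated in the NOTE, and the function and dictionary names are invented for the example.

```python
# Illustrative sketch of the legacy declassification-code rule in the NOTE above.
# OADR and X1-X8 are no longer authorized; the dates below are the substitutes
# the NOTE directs, absent additional clarification from the OCA.

LEGACY_DECLASS_DATES = {
    "OADR": "October 15, 2020",        # 25 years from last authorized use
    "X1": "September 23, 2028", "X2": "September 23, 2028",
    "X3": "September 23, 2028", "X4": "September 23, 2028",
    "X5": "September 23, 2028", "X6": "September 23, 2028",
    "X7": "September 23, 2028", "X8": "September 23, 2028",
}

def declassify_on(code: str) -> str:
    """Return the 'Declassify on' date to substitute for a legacy SCG code."""
    normalized = code.strip().upper()
    if normalized not in LEGACY_DECLASS_DATES:
        raise ValueError(f"{code!r} is not a legacy code; follow the SCG as written.")
    return LEGACY_DECLASS_DATES[normalized]

print(declassify_on("X2"))    # September 23, 2028
print(declassify_on("OADR"))  # October 15, 2020
```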

Enclosure D-1
CLR Format

Note: The CLR template pages in this enclosure are marked SECRET at the top and bottom for training purposes only.

Cover page:

ATEC
(U) Capabilities and Limitations Report for the XXX System
XX Month 20XX
<Add picture of system here>
Date:
Approved:
ROGER A. NADEAU
Major General, USA
Commanding
Prepared in support of <Enter requesting organization>

Distribution authorized to U.S. Government agencies only (Test and Evaluation) (date of determination). Other requests for this document shall be referred to Commander, ATEC, 4501 Ford Avenue, Alexandria, VA.

The use of trade names in this document does not constitute an official endorsement or approval of the use of such commercial hardware or software. This document may not be cited for purposes of advertisement. (Use this statement if trade names are used in the report. If not, delete.)

Overall guidelines:
- Follow DA Pamphlet 600-67, Effective Writing for Army Leaders.
- Avoid passive voice if possible.
- Avoid vague terms.
- Be brief and concise.
- Follow the index for major headings.
- Use Arial font throughout.

Report header: USATEC, XX Month 20XX. Produced as input to the U.S. Army's Rapid Materiel Equipping Initiatives.

Contents sidebar: Purpose; Mission Need; System Description; Data Sources; Test Limitations; Observations (Capabilities, Limitations, Safety, Interoperability, Training, Supportability, Survivability, Unknowns); Employment Considerations; Recommendations (Users).

(U) Purpose
This evaluation provides a rapid assessment of the capabilities and limitations of the system. It provides Warfighters with information, limitations, and recommendations for use in theater. Additionally, it provides useful information to the acquisition authority and materiel developer for further system development. (The author may modify this paragraph if the customer requests assessment of additional areas.)

(U) Executive Summary
- Bottom Line Up Front.
  o Limit to 150 words.
- Include key information regarding:
  o Capabilities.
  o Limitations.
  o High and Medium safety risks (highlight in separate text boxes; enter High and Medium safety risks).
  o Important employment considerations.
  o Critical interoperability issues.

(U) Mission Need
- Enter a brief summary of the mission need.
  o Include the ONS number if known.
  o Include the requesting unit/organization if known.
- Include program requirements/goals/objectives.

(U) System Description
- Include key components and parts.
- Include a brief description of system operations.
- Include pictures, diagrams, or engineering drawings, with labels as required.

(U) Data Sources
- Include ALL data sources used to create the report.

(U) Test Limitations
- Include what could not be tested or assessed and why.
- Be brief and concise.
- If there were no limitations, then state, "There were no test limitations."

(U) Observations - SYSTEM NAME

(U) Capabilities
- Use bullet statements.
- List ALL demonstrated capabilities.
- Address capabilities this system provides that the Warfighter does not currently have.
- Ensure important capabilities are included in the EXSUM.
- Accurately describe the ability of the system to achieve the mission need.
- Be concise! Avoid vague terms like "good." Avoid phrases like "seems to" or "appears to."
- If you are not sure of a capability, then address why under Test Limitations, or include the information in the Unknowns section, NOT here! THIS SECTION SHALL ONLY INCLUDE WHAT WE KNOW IS TRUE ABOUT THE SYSTEM'S CAPABILITIES!

(U) Limitations
- Use bullet statements.
- List ALL known limitations.
- Ensure important limitations are included in the EXSUM.
- Accurately describe how aspects of the system do not meet the mission need.
- Be concise! Avoid phrases like "seems to" or "appears to."
- If you are not sure of a limitation, then address why under Test Limitations, or include the information in the Unknowns section, NOT here! THIS SECTION SHALL ONLY INCLUDE WHAT WE KNOW IS TRUE ABOUT THE SYSTEM'S LIMITATIONS!

The template shows capabilities and limitations set up in columns. This is preferred, but if you have more than one page of capabilities and limitations, use standard text format. If this report addresses more than one system, list the capabilities and limitations of each system separately.

(U) Training Observations
- Include key training observations.
- Explain what the Warfighter must know to utilize the system effectively.

(U) Safety Observations
- Include all safety issues in this section.
  o Summarize the Safety Confirmation.

(U) Interoperability Observations
- Include all known interoperability observations in this section.
- Address interoperability with Army, other U.S., and coalition systems.
- If there are unknown, critical interoperability issues (for example, unknown interoperability with Blue Force comms), state them in this section AND in the Unknowns section.
- Include critical interoperability issues in the EXSUM.

(U) Supportability Observations
- Include RAM observations in this section.
- Include logistics support observations.
  o Ability to obtain and replace damaged components.
  o Repair of battle damage.
  o Whether repairs can be made with tools and equipment currently in units, or new items are required.
- MANPRINT.
- Troubleshooting/Diagnostics.
- Maintenance support concepts, including unique maintenance training/skills.

(U) Survivability Observations
- Include threat survivability observations. Examples:
  o Altitude limitations make the system susceptible to antiaircraft threats.
  o For armor protection systems, include a system diagram showing protection levels.

(U) Unknowns
- Include key unknowns regarding system capabilities so the reader understands the full scope of the system assessment. For example, "It is unknown whether the system is effective beyond 50 meters."

(U) Employment Considerations
- Include important suggestions for using the system. For example, "Recommend using System A with System B to provide 360-degree coverage."

(U) Recommendations

(U) User Recommendations
- Include important recommendations to aid the user in system operation. For example, "Allow 10 minute system warm-up before use."

(U) Acquisition Recommendations
- Include recommendations to the materiel developer that will render the system safer and more effective, suitable, survivable, interoperable, reliable, maintainable, and supportable.

(U) References
- Include all references used to create the report.
- Be sure to include the Safety Confirmation.

Enclosure D-2
CLR Staffing

1. All CLRs will complete internal ATEC development and staffing within 30 business days or less after receipt of the final database from the last test event, or completion of the last test event, required to support the CLR. Table D-1, CLR Timeline, reflects the number of business days for development and staffing, to include the number of days for each review level.

a. The AST Chair will develop the CLR in coordination with AST members, team leaders, and division chiefs as part of the development process, and will develop an emerging results brief, if required.

b. The AST Chair's directorate (lead directorate) will officially staff the developed CLR for final review and concurrence/non-concurrence. The AST Chair will also coordinate the developed CLR with the sponsor.

c. An ATEC Form 45 will be submitted with the CLR when it is officially staffed, and it will include a suspense date for each review level. The review levels are:

(1) AST Directorate Management Review, which includes the lead Army Evaluation Center (AEC) directorate; Operational Test Command (OTC); Developmental Test Command (DTC); and AEC's Survivability Evaluation Directorate (SVE), Reliability and Maintainability Evaluation Directorate (RAM), Integrated Logistics Support Evaluation Directorate (ILS), and Security.

(2) AEC Lead Directorate Management Final Review, which includes directorate management (Director/Technical Director (TD)).

(3) AEC Technical Editing Final Review.

(4) AEC Headquarters (HQ) Review.

(5) ATEC HQ Review and Approval.

d. The timeline for official staffing through AEC HQ (level 4) is 10 business days or less. Each review level will conduct its review and annotate its response (concurrence or concurrence with comments) on an ATEC Form 45 within 2 business days, unless there are critical comments.

(1) When a reviewer has critical comments, the reviewer will contact the AST Chair's directorate for resolution within 2 business days.

(2) When an OTC or DTC internal process requires a CLR to be submitted to the OTC Commander or DTC Director for review and concurrence, the OTC or DTC AST member must inform the AST Chair. The approved process will be included in the T&E Concept Brief to ATEC leadership. The ATEC Form 45 will reflect this requirement. The OTC and DTC HQs will provide their signatures within 5 business days using the ATEC Form 45.

(3) At review levels 1 and 2, the review will be conducted in parallel within and between directorates.

e. The DTC will develop the safety confirmation (SC) in time to support CLR development and staffing (see table D-1).

f. CLR updates will follow the same staffing process as that required for CLRs.

2. Procedures.

a. CLR Development.

(1) The AST Chair must develop the CLR in coordination with AST members before officially beginning internal ATEC staffing.

(2) Inputs from team leaders/division chiefs and initial technical editing are to be obtained during the development process. In addition, any assigned SCA's internal process must meet the CLR development timeline in table D-1.

b. CLR Internal ATEC Staffing. The process is described in individual steps in the following paragraphs. In addition, figure D-1 provides a visual depiction of the staffing process.

NOTE: The same ATEC Form 45 is used for all review levels and must include all required office symbols in the coordination section. See enclosure 1 for key office symbols and the ATEC Regulation for a complete listing of office symbols.

(1) Upon agreement by all AST members, the developed CLR is ready to begin the official staffing process. The AST Chair prepares an ATEC Form 45 with an appropriate suspense for each review level. The sample ATEC Form 45 provided in enclosure 1 includes instructions for preparation.

(2) AST Directorate Management Review (Level 1). The AST Chair staffs the CLR and ATEC Form 45 in parallel via SIPR/NIPR (as appropriate) to all members of the AST Directorate Management review team (Review Level 1). The AST Chair will also provide a NIPR alert to AST members that the CLR has been provided via SIPR. In addition, the AST Chair will submit the CLR to the sponsor for informal coordination. AST Directorate Managers will provide concurrence signatures within 2 business days. The AST Chair will satisfactorily address any comments from directorate management, in coordination with the AST if necessary, within 1 business day.

(3) AEC Lead Directorate Management Final Review (Level 2). The AST Chair will submit the revised CLR and ATEC Form 45 via SIPR/NIPR to his/her directorate management for final review and concurrence. The Director or TD will complete the final review, ensuring the Level 1 review comments were addressed, and concur within 2 business days. If there are any additional critical and/or substantive comments, the AST Chair will satisfactorily address them prior to the Director or TD signing the ATEC Form 45 for the directorate.

(4) AEC Technical Editing Final Review (Level 3). The AST Chair will submit the revised CLR and ATEC Form 45 via SIPR/NIPR or CD (most efficient method) with an appropriate suspense. Technical editing will edit and return the CLR with a signed ATEC Form 45 to the AST Chair within 2 business days.

(5) AST Chair Final Review. The AST Chair will conduct a final check to ensure documents are ready for AEC HQ review. The AST Chair or designee will electronically send the CLR, signed ATEC Form 45, and the signed SC to the AEC HQ administrative staff for review by the AEC Director and TD. This final check should be completed on the same day as receipt.

(6) AEC HQ Review (Level 4). AEC HQ administrative staff downloads the CLR package on the same day as receipt, when possible. The AEC HQ Director and TD will review the CLR within 2 business days. If minor administrative errors are found, the administrative staff will make the corrections to keep the document moving forward. If significant changes are required, the AEC HQ administrative staff will return the document on the same day (when possible) to the AST Chair for corrections. The corrections should be made in 1 business day or less and returned to AEC HQ. Once the AEC TD and Director or designees have signed the ATEC Form 45, the AEC HQ administrative staff will forward the documents to ATEC HQ for approval. Again, this is done on the same day that it is received.

(7) ATEC HQ Review and Approval (Level 5). If any changes are required, ATEC HQ will return the CLR to the AEC HQ administrative staff. The AEC HQ administrative staff will either correct the document or return it to the AST Chair for immediate corrections. A maximum of 1 business day is allowed for this action. The AEC HQ administrative staff resubmits the CLR package to ATEC HQ in less than 1 day. The established process for returning approved documents to the AST Chair will continue to be used.

Note 1: Directorate management will review the CLR and concur electronically when it is more efficient to do so.

Note 2: To prevent return of the CLR package by the AEC HQ administrative staff, the AST Chair's directorate must ensure the following prior to submittal:

- Technical Editing (final editing) concurrence.
- AEC Security concurrence.
- SVE, RAM, and ILS concurrence, or annotation on the Form 45 stating why it is not required (e.g., not a member of the AST, missed suspense, etc.).
- OTC concurrence or annotation on the Form 45 stating why it is not required.
- DTC concurrence or annotation on the Form 45 stating why it is not required.
- Statement on the Form 45 that the CLR was coordinated through the PM or RI sponsor.
- Director's (AST Chair's directorate) or designee's concurrence.
- SC included.
- Statement on the Form 45 that at the T&E Concept Review it was agreed that OTC HQ or DTC HQ concurrence was or was not required.
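Because the Note 2 items function as a simple pre-submittal gate, a short sketch can show one way to record them before the package goes to AEC HQ. This is purely an illustration, not an ATEC application; the checklist keys and function names are invented for the example and simply mirror the items listed above.

```python
# Illustrative pre-submittal check against the Note 2 items above.
# Not an ATEC tool; keys are invented names that mirror the listed items.

REQUIRED_ITEMS = [
    "technical_editing_concurrence",
    "aec_security_concurrence",
    "sve_ram_ils_concurrence_or_annotation",
    "otc_concurrence_or_annotation",
    "dtc_concurrence_or_annotation",
    "pm_or_ri_sponsor_coordination_statement",
    "lead_directorate_director_or_designee_concurrence",
    "safety_confirmation_included",
    "otc_dtc_hq_concurrence_agreement_statement",
]

def missing_items(package: dict) -> list:
    """Return the Note 2 items not yet satisfied for a CLR package."""
    return [item for item in REQUIRED_ITEMS if not package.get(item, False)]

if __name__ == "__main__":
    draft_package = {item: True for item in REQUIRED_ITEMS}
    draft_package["safety_confirmation_included"] = False
    print(missing_items(draft_package))  # ['safety_confirmation_included']
```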

c. CLR Timeline. The timeline for CLR development, staffing, and approval, as well as for the SC, is shown in table D-1. The CLR is to be completed and approved within 30 business days (E+30) after the last test event required to support the CLR ends.

Table D-1. CLR Timeline

Business Day | Number of Business Days | Review Level | Action
E+0 | 0 | N/A | The last agreed-upon test event to support the assessment ends.
E+17 | 17 | N/A | Emerging results brief completed in first 7 days (if required).* AEC (AST Chair) develops CLR in coordination with all AST members, including team leaders, division chiefs, and initial edit. DTC develops and publishes SC; safety findings are informally coordinated with the AST prior to publication.
E+22 | 5** | Levels 1 and 2 | CLR staffing completed at AST directorate and lead directorate management levels.
E+24 | 2 | Level 3 | Final technical edit of CLR.
E+27 | 3** | Level 4 | CLR staffed and concurred with by AEC HQ. CLR package will include the DTC-approved SC.
E+29 | 2 | Level 5 | CLR staffed and approved by ATEC HQ. CLR package will include the DTC-approved SC.
E+30 | 1 | Level 5 | ATEC posts approved CLR and SC to the ATEC GWOT site. DTC will make any additional distributions after CLR approval.

*If an emerging results brief is required, the lead directorate needs to ensure adequate support is provided so that the briefing and CLR can be worked in parallel.
**Includes 1 day for rework or other administrative actions.
Note 1: The CLR, SC, and customer delivery dates may be changed by ATEC-RRT when agreed to by the customer and AST.
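For planning purposes, the E+N offsets in table D-1 can be projected onto calendar dates with simple business-day arithmetic. The sketch below is only an illustration of that arithmetic (weekends skipped, holidays ignored); it is not an ATEC tool, and the function and variable names are invented for the example.

```python
from datetime import date, timedelta

# Illustrative sketch: projects the Table D-1 business-day offsets onto
# calendar dates for a given E+0 (last test event ends). Holidays ignored.

CLR_MILESTONES = [
    ("Levels 1-2: AST/lead directorate staffing complete", 22),
    ("Level 3: final technical edit complete", 24),
    ("Level 4: AEC HQ concurrence", 27),
    ("Level 5: ATEC HQ approval", 29),
    ("Approved CLR and SC posted", 30),
]

def add_business_days(start: date, days: int) -> date:
    """Add 'days' business days to 'start', skipping Saturdays and Sundays."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return current

def clr_schedule(last_test_event_end: date):
    """Return (milestone, projected calendar date) pairs for a given E+0 date."""
    return [(name, add_business_days(last_test_event_end, offset))
            for name, offset in CLR_MILESTONES]

if __name__ == "__main__":
    for name, due in clr_schedule(date(2010, 6, 16)):  # example E+0 date
        print(f"{due:%d %b %Y}  {name}")
```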

Figure D-1. CLR Staffing Process Map

(The figure depicts the staffing flow described in paragraph 2.b: the AST develops and agrees on the CLR; the AST Chair staffs the CLR and Form 45 in parallel to the Level 1 AST Directorate Management reviewers (DTC, OTC, ILS, RAM, SVE, Security, the RI/PM sponsor, and the lead directorate); the lead directorate Director or TD conducts the Level 2 final review; Technical Editing conducts the Level 3 final edit; the AST Chair conducts a final check and sends the package to AEC HQ for the Level 4 review; and AEC HQ forwards the package to ATEC HQ for the Level 5 review and approval. Comments at any level are returned to the AST Chair for resolution in coordination with the AST. A callout notes the reduction of staffing from 20 to 10 days: AST directorate management, 2 days (parallel staffing); lead directorate management final review, 1 to 2 days; technical editing final review, 2 days; AEC HQ, 2 days; and 2 days for re-staffing due to any rework. If AEC HQ comments are minor, the AEC TD or Director may direct the administrative staff to make the corrections.)

Enclosure D-3
Example of Timeline

The actual timeline is negotiated between the sponsor and the AST.

Business Day | Number of Days | Action
Request received | 0 | Test Support/Notification package received. Package includes at a minimum: system name, system description (unclassified version), quad chart, deliverable timeline (CLR and SC required dates), and sponsor information (office, name, phone, and e-mail).
 | 1 | RRT develops ATEC TD RI Guidance Form 45.
 | 2 | ATEC TD decides level of support required (i.e., Safety Confirmation, CLR, or waiver).
 | 1 | ATEC RI tasker package finalized and AST tasker (Form 45) released.
Req | | AST formed and ADSS entries complete.
 | 7 | AST initial test coordination meeting.
 | 10 | Develop and brief T&E Concept.
Req | | Approve T&E Concept.
 | 5 | Resourcing and TRR.
 | 4 | Funding window; MIPRs cut to SCAs.
Req | | Test window.
Req + 85 | T+7 | SC complete/signed by DTC Commander.
Req + 85 | T+7 | ERB (with chart from SC).
Req + 92 | T+21 | AST writes CLR.
Req | T+51 | CLR staffed and approved.

Calendar Days | Action
ERB + 20 | Ship to OCO.
20 | Full package (SC + CLR).
30 | Deploy for initial training.
30 | Deployment (Free Play) FOA.
30 | FOA EXSUM.

After 6 months, the CLR should be updated.

Enclosure D-4
Data Authentication Group (DAG) Procedures for RAI

1. Overview. An ATEC DAG is formed to authenticate test data. The DAG is a group of representatives from the testing, evaluation, user, and acquisition communities, formed for the purpose of authenticating data that is generated, collected, and reduced. DAG procedures are basically the same for Rapid Acquisition Initiatives as for programs of record. This enclosure is a reminder that Rapid Acquisition Initiatives may also require a DAG. DAG procedures, a generic charter, and an SOP are located in appendix M.

2. Definition. DAGs are multidisciplinary teams with membership from various agencies that have responsibility for, and a vested interest in, the system under test. The AST coordinates with the acquisition community and identifies appropriate membership for the DAGs. Membership for DAGs generally includes the evaluator, the test officer, the test analyst, the data manager, the Materiel Developer/PM, DTC, and subject-matter experts. Mandatory members are the evaluator, tester, PM/developer, and the Capabilities Developer.

a. The mission of a DAG is to authenticate that the data collected and reduced at an event is suitable for analysis and evaluation. Authentication of data depends upon an ongoing process in which it is verified that the data is complete, accurate, consistent, and representative of a system's performance.

b. An Internet application, WebDAG, was developed to allow authorized DAG members Internet access to test data at any time from any location capable of Internet connectivity. It allows unique configuration for each DAG event and may be configured for each test in accordance with identified DAG requirements. WebDAG is not appropriate for all tests, especially tests of classified systems, tests of short duration, and tests where Internet access is limited or nonexistent.

3. Considerations for forming a DAG.

a. Need for a DAG. Any event that is anticipated to provide data for use in a system assessment or evaluation is a candidate for a DAG. The AST establishes the need for a DAG to support a specific event, and that need is documented in the T&E concept and in the test plan.

b. Tailoring DAG Duties to a Specific Event. The duties for each specific event may be tailored by the DAG Chair to permit responsiveness to the requirements of the system assessment or evaluation strategy and the unique features of the event. Thus, different events may have different DAG Chairs, memberships, duties, and procedures, as documented in event-specific DAG Charters and DAG standing operating procedures (SOPs).

4. DAG timelines. DAG timelines may be changed to meet program requirements after coordination with the AST.


Appendix E
Test Schedule and Review Committee (TSARC)

E-1. The Test Schedule and Review Committee (TSARC) is a Department of the Army forum that resources testing requiring Soldier/equipment support. Test is a generic term here that includes operational tests (OT), developmental tests (DT), Logistics Demonstrations (LD), and Rapid Response Initiatives (RR).

E-2. When approved, the Five Year Test Program (FYTP) becomes a tasking document for test execution and resource allocation within existing program/budgetary constraints and Army priorities for tests scheduled for the current and budget years. The FYTP identifies, as tasking documents, the Test Resource Plans (TRPs) that the Army Commands, Army Service Component Commands, and Direct Reporting Units have agreed to support in the current and budget years. Early and effective planning is the key that permits testing to be accomplished with a minimum of disruption to the non-testing missions of support agencies and commands and ensures proper resourcing.

E-3. To maximize the use of limited resources and to minimize the impact on supporting units' operational readiness, the TSARC performs the following functions:

a. Reviews and validates coordinated Test Resource Plans (TRPs) for inclusion in the FYTP. Ensures satisfaction of the 1-year notification requirement for resource commitments (personnel and equipment) or compliance with the exception-to-policy procedures.

b. Reviews and coordinates resources to support operational T&E and troop support of developmental T&E beyond the materiel developer's resources.

c. Reviews and coordinates resources to support Logistics Demonstrations.

d. Reviews and coordinates resources to support Rapid Initiatives (Urgent Materiel Releases in support of Other Contingency Operations).

e. Reviews testing schedules to minimize the test support impact on units providing support.

f. Reviews funding for operational T&E.

g. Ensures the TSARC is synchronized with the Army Force Generation Process (ARFORGEN) and Army G-3/5/7 priorities.

h. Recommends approval of the FYTP, the final product of the TSARC.

E-4. Correspondence to the TSARC should be addressed to the Commander, U.S. Army Test and Evaluation Command (CSTE-OP), 4501 Ford Avenue, Alexandria, VA.

E-5. TSARC composition. The TSARC is composed of general officers or equivalent representatives of the Army Secretariat, HQDA staff elements, the Army Commands, Army Service Component Commands, and Direct Reporting Units. Detailed TSARC composition is provided in AR 73-1.

a. ATEC chairs the TSARC; provides an executive secretary and administrative and logistical support; records and distributes minutes of the TSARC meetings; and, after Army G-3/5/7 approval, distributes the FYTP via the ATEC web site.

b. The chair (ATEC CG/ED) may invite other staff agencies/commands to attend when test programs fall within their functional area of responsibility or involve their resources. TSARC principals may invite observers from within their agencies or commands when their presence would be beneficial to the TSARC process.

c. Funds for travel, per diem, and overtime for TSARC participation are provided by the attendee's parent organization.

E-6. The semiannual TSARC process executes three working group meetings (Initial, Mid-Cycle, and Council of Colonels (CoC)) and one General Officer (GO)/Senior Executive Service (SES) meeting. Additionally, ATEC hosts an internal meeting to refine/validate issues prior to the CoC/GO meetings.

E-7. TSARC schedule.

Table E-1. TSARC Schedule

Action | Spring Cycle | Fall Cycle
FYTP Published/Distributed | Dec | July
New/Revised TRPs to ATEC | Jan | July
Initial Working Group TSARC | Feb | Aug
Initial WG Minutes Published/Distributed | Feb | Aug
Draft FYTP Posted to ADSS | Feb | Aug
Resolution of Issues Due to ATEC | Mar | Sep
Mid-Cycle WG Meeting | Apr | Oct
Mid-Cycle WG Minutes Published | Apr | Oct
Resolution of Issues Due to ATEC | May | Nov
CoC WG Meeting | May | Nov
GO Read Ahead Provided | Jun | Dec
GO TSARC Meeting | Jun | Dec
GO TSARC Minutes Published/Distributed | Jun | Dec
FYTP Posted to ADSS | Jun | Dec
Army G3 FYTP Approval | Jul | Dec

E-8. TSARC requirements.

a. Initial/Mid-Cycle Working Group Submissions. TRPs must be coordinated within ATEC (SCAs) and with the PM prior to each submission.

(1) TRPs are routinely submitted to the TSARC at least one year prior to the date TSARC member resources are required. Prior to test execution, the system under test must have an Army-approved TEMP.

(2) ATEC (OTC/DTC/LNO) and the USAMEDBOARD will submit TRPs to support all test requirements identified in the TEMP, including Program Objective Memorandum (POM) years. Exceptions will be made on a case-by-case basis.

(3) Revised TRPs are submitted during the TSARC working group to incorporate TRP changes from a previous FYTP or to identify new requirements.

b. Out-of-cycle submissions. TRPs will be submitted out of cycle when test resources are required in less than twelve months. Out-of-cycle TRP submissions are processed using the submission process outlined below. This process is used to accommodate urgent and/or unforeseen tests.

(1) The out-of-cycle TRP must be coordinated within ATEC (SCAs) and with the PM before following the formal submission procedures discussed below.

(2) An out-of-cycle TRP requires the submitting command's GO or SES signature on the transmittal correspondence (cover memorandum), with a THRU line for ATEC HQ approval.

(3) The transmittal memorandum to all TSARC members must include the following:

(a) Rationale as to why the TRP was not submitted IAW the TSARC process and why the test is required.

(b) A statement that funding is available within existing resources.

(c) A specific calendar suspense date by which comments and concurrence/non-concurrence must be provided via the ATEC information database, the ATEC Decision Support System (ADSS).

(d) The submitting command will forward the signed out-of-cycle memorandum (with ATEC SCA concurrences) to the ATEC DCSOPS for ATEC HQ approval and TSARC staffing.

c. Should TSARC non-concurrences develop, ATEC will attempt to resolve them. If support cannot be resolved, the out-of-cycle TRP will be discussed during the next scheduled TSARC meeting. The GO TSARC Chairman may determine that there is insufficient time to present the out-of-cycle TRP to the next scheduled GO TSARC to meet the test schedule. If so, the out-of-cycle TRP will be processed as an Exception-to-Policy TRP. Exception-to-Policy TRPs will be submitted by ATEC to Army G3 for resolution, and HQDA G-3/5/7 will publish the appropriate tasking order.

E-9. Long-Range Planning. Requirements for a specific test may involve many long-lead-time functions:

a. Flying hours.

b. Ammunition.

c. Instrumentation.

d. Installation support.

E-10. TRP Preparation. The TRP is used to resource OT, DT, LD, and RR requiring Soldier support, and contains the following: information necessary to support T&E, test conditions, scope, tactical context, resource/resource requirement suspense dates, test milestone dates, and cost estimates. The assigned test activity (DTC/OTC/USAMEDBOARD) will prepare and maintain the TRP for OT/DT/RR. The AEC will update and maintain its funding and other requirements, as appropriate, for all OT and RR efforts. ATEC HQ will prepare and maintain TRPs for LDs. See paragraph E-11 for details on LD submissions.

a. TRPs are developed and processed in ADSS. The ADSS software is designed for preparing TRPs, electronically notifying offices that TRPs are ready for staffing, and notifying activities of staff actions as well as the need to revise or update TRPs for submission to the TSARC. Additionally, ADSS is the distribution mechanism for TRP information throughout the T&E community for coordination.

b. A TRP is developed at the earliest possible date for LD/DT/OT tests scheduled through the POM window and identified in the TEMP. TRPs accepted by the TSARC and included in the approved FYTP become formal tasking documents for test execution and resource allocation within existing budget and program constraints for the current and budget years. Although the FYTP is an official tasking document, organizations requiring resources are not relieved from the provisions of other applicable documents, such as Army regulations and HQDA restrictions and policies such as those pertaining to flying hours.

c. TRPs for tests beyond the first year of the POM will be as complete as possible. Integration into ARFORGEN processes requires at least 5, and preferably 6, years out from the date of execution. The TRPs should contain information to permit advance planning for critical resources and provide funding information to the PM/materiel developer.

d. In the notes area on page 1 of the TRP, TRPs will include a cost estimate detail indicator: Category 1, approved TEMP; Category 2, draft TEMP; Category 3, no TEMP (i.e., duration, unit size, total mileage/rounds, etc.). Test parameters describe the factors upon which the cost estimate is based.

E-11. Logistics Demonstrations.

a. An LD is required on all acquisition programs per Army regulation.

(1) It is a PM responsibility, performed as part of developmental testing.

(2) It is performed by representative Soldiers (MOS/ASI/grade).

(3) It includes production-representative hardware and software. The LD is typically performed during the EMD or production phase prior to IOT&E and must be completed prior to FRP and materiel release.

(4) It includes both a physical teardown and a diagnostics/prognostics demonstration for field-level maintenance tasks, including PMCS and troubleshooting.

(5) It is the primary source of data for the ATEC AEC ILS evaluator to support evaluations and assessments, to include MANPRINT, maintainability, technical publications, adequacy of the system support package, etc.

b. TSARC process.

(1) PMs will coordinate their LD requirements with their ATEC System Team chairperson.

(2) The PM will identify requirements at least 1 year in advance and provide a timeframe for the LD and specifics on the types of Soldiers needed for the LD. Requirements identified with less than 1 year lead time will be submitted using the TSARC out-of-cycle guidelines noted in paragraph E-8b above.

(3) The Chair will validate the requirement and provide it to the ATEC PEO Liaison POC for input to a TRP and submission to the TSARC.

E-12. The TSARC process has been integrated with the ARFORGEN process. The TSARC validates the test support requirements for unit/Soldier support. HQDA G-3/5/7 prioritizes the requirements and then submits them via the Integrated Requirements Priority List (IRPL) for sourcing.


Appendix F
Formats

F-1. Standard formats are provided for commonality across reports.

F-2. Formats for the RAI documents are located in appendix D. Formats addressed in this appendix include:

Enclosure F-1:
- Evaluation Determination Matrix (EDM)
- ATEC Form 45 for Low-Risk Evaluations
- Notification memorandum

Enclosure F-2:
- Early Strategy Review (ESR)
- T&E Concept In-Process Review (CIPR)

Enclosure F-3:
- System Evaluation Plan (SEP)
- Abbreviated/Tailored SEP

Enclosure F-4:
- Operational Test Agency Test Plan (OTA TP)
- LF OTA TP
- DT/OT OTA TP

Enclosure F-5:
- Detailed Test Plan (DTP)
- DT DTP
- OT DTP

Enclosure F-6:
- OTA Assessment Report (OAR)

Enclosure F-7:
- OTA Milestone Assessment Report (OMAR)
- OTA Evaluation Report (OER)
- Abbreviated OMAR/OER
- Tailored OMAR/OER

Enclosure F-8:
- System Analysis Report (SAR)

Enclosure F-9:
- OTA Follow-on Evaluation Report (OFER)

Enclosure F-10:
- Technical Note

Enclosure F-1
Evaluation Determination Matrix (EDM)

Below is an example of the EDM that will be presented for approval to the AST Chair's Director. The actual scales are provided in paragraph 3-2. Note that the scoring rule forms the first sentence of each scoring justification in the table below, followed by pertinent facts from the AST.

Table F-1. EDM Example

Factor: System Maturity
Description: Proven historical performance
Score: 4
Scoring Justification: Completely new design, unproven. No product with similar functionality and performance requirements is available either for commercial or military use.

Factor: Technical Complexity
Description: Level of technology compared to state-of-the-art
Score: 4
Scoring Justification: Pushes state of the art (new idea, step up in technology). This system is the first to use [xxx] in an [xxx] environment, intended to be used by operators with no specialized training.

Factor: Linkage to Other Systems
Description: Dependency/interoperability with other systems
Score: 3
Scoring Justification: Supports operation of a major system (other system can work in a degraded mode) OR interoperable with a few systems. This system supports the [xxx], although [xxx] will operate in a substantially degraded mode without it.

Factor: Soldier Survivability
Description: System provides protection to Soldiers (shield/armor/safety/CBRN)
Score: 1
Scoring Justification: Provides no protection to Soldiers. The [xxx] provides [xxx] rather than physical protection.

Factor: Impact on Unit Mission
Description: Ability of unit to complete mission if system fails to meet requirements
Score: 4
Scoring Justification: System critical to unit mission completion (don't do mission without this; e.g., crew-served weapon, asbestos gloves). The [xxx] is critical to mission completion as it provides [xxx].
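Since each table row pairs a scoring-rule sentence with AST-supplied facts, a small sketch can show one way to assemble a row's justification in that documented order. This is only an illustration; the data structure and field names are invented for the example, and the actual scales and decision rules remain those in paragraph 3-2 and are not encoded here.

```python
# Illustrative only: assembles an EDM row's justification in the documented
# order (scoring rule sentence first, then AST-supplied facts). The scales
# and any decision rules live in paragraph 3-2 and are not reproduced here.
from dataclasses import dataclass

@dataclass
class EdmFactor:
    name: str            # e.g., "System Maturity"
    description: str     # e.g., "Proven historical performance"
    score: int           # score per the paragraph 3-2 scales
    rule_sentence: str   # first sentence of the justification
    ast_facts: str       # pertinent facts supplied by the AST

    def justification(self) -> str:
        return f"{self.rule_sentence} {self.ast_facts}"

maturity = EdmFactor(
    name="System Maturity",
    description="Proven historical performance",
    score=4,
    rule_sentence="Completely new design, unproven.",
    ast_facts="No product with similar functionality and performance "
              "requirements is available either for commercial or military use.",
)
print(f"{maturity.name}: {maturity.score} - {maturity.justification()}")
```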

Form 45 for Confirmation of Low-Risk Evaluations

Make sure to use the most current version of ATEC Form 45.

Notification Memorandum for Low-Risk Evaluations

TEAE-xxx

MEMORANDUM FOR Program Manager X (PM X), [full mailing address]

SUBJECT: Army Evaluation Center (AEC) Evaluation for the [Program name]

1. Background. [Put necessary background here: how/why the program came to ATEC for review, current program status.]

2. ATEC Position. Independent evaluation based on formal operational or developmental tests is not warranted. Safety confirmations and releases will continue to be supported by ATEC's Developmental Test Command (DTC) as needed. ATEC will continue to be involved in requirements review as necessary. Based on current information, the system is effective, suitable, and survivable relative to its requirements for its intended use and future fielding. ATEC supports materiel release of the [Program name]. ATEC recommends that PM X continue [ongoing test or acquisition processes, safety, etc.].

[NOTE: Include the following statement and subparagraphs, or include more detailed information (perhaps the documents used in the review) as an enclosure.]

The recommendation for this evaluation is based on the following:

a. ATEC assessed the [Program name] as a low-risk program that does not require an evaluation after analysis of its:
- history and past performance
- mission impact
- technical complexity
- protection provided to Soldiers
- level of interoperability

[Key facts go in the list below; use whatever facts are pertinent to your program/evaluation.]

b. Fielding began in 2005 and continues.

c. [Program name] has a materiel release exemption.

d. Each [system name] undergoes an on-site test and safety acceptance.

e. DTC provided safety confirmations.

3. If there are significant program changes, please contact ATEC for another program evaluation. ATEC reserves the right, based on program changes, to reassert itself into the program for independent test and evaluation purposes. ATEC (DTC) will remain involved for safety testing purposes as necessary.

Enclosure F-2
Early Strategy Review and Concept In-Process Review (ESR and CIPR)

A. Combined ESR/CIPR Format

Introduction (AST Chair)
- AST composition
- System description (include any special test configurations)
- Mission description [include operational and organizational (O&O), BOI, how it's used]
- Evaluation Determination Matrix and level of evaluation (include scores)
- Acquisition strategy (quantities, etc.)
- Program schedule (milestones, etc.)
- Key documentation status (include expected products due to evaluation category)

Evaluation Concepts and Strategy
- COIs/COICs/KPPs/AIs
- Design of Experiment (DOE)
- Mission-Based T&E (MBT&E)
- Priority measures
- Evaluation structure issues (COI, AI, etc.) and key measures (kinds of information that must be captured for the evaluation: performance, RAM, survivability, ILS, MANPRINT)
- Methodologies and types of events that are necessary to capture the information
- Proposed schedule of events that support the evaluation strategy
- Data Source Matrix
- Data management
- Common instrumentation

DT Events (briefed by the DTC test manager or appropriate individual)
- DT Event 1
  - Type of event
  - Description/concept (scope, number, and types of test items; threat; location)
  - Entrance criteria
  - Measurement areas (mobility, safety, reliability, environmental, etc.)
  - Test design (sample size, factors, conditions, standards, controls)
  - Instrumentation, models/simulations
  - Event timeline (milestones, environmental document approvals, plans, TRRs, reports, etc.)
  - Event resources (cost, etc.)
  - Event limitations/concerns and actions/mitigations (high-risk areas, limitations, other concerns)
- DT Event 2
- DT Event N

OT Events (briefed by the OTC test officer or appropriate individual)
- OT Event 1
  - Type of event
  - Description/concept (tactical context, threat, scope, force structure, location)
  - Entrance criteria
  - Measurement areas (mobility, safety, reliability, environmental, etc.)
  - Test design (sample size, factors, conditions, controls)
  - Instrumentation, models/simulations
  - Event timeline (milestones, environmental document approvals, plans, TRRs, reports, etc.)
  - Event resources (cost, etc.)
  - Event limitations/concerns and actions/mitigations (high-risk areas, limitations, other concerns)
- OT Event 2
- OT Event M

Summary (AST Chair)
- Other items deemed necessary
- Unusual requirements
- Resources summary (event cost, evaluation cost, system/units under test, ammunition required, test items, etc.) (as known)
- Concerns/high-risk areas
- Evaluator concerns and actions/mitigations (high-risk areas, evaluation limitations, other concerns)
- Safety focus areas:
  - Safety-critical software components
  - Primary safety data to support the SC
  - Any adverse safety findings during DTC testing
  - Safety Release to be provided supporting OT
- Recommendation

B. ESR Format

Introduction
- AST composition
- System description (include any special test configurations)
- Mission description [include operational and organizational (O&O), BOI, how it's used]
- Evaluation Determination Matrix and level of evaluation (include scores)
- Acquisition strategy (quantities, etc.)
- Program schedule (milestones, etc.)
- Key documentation status (include expected products due to evaluation category)

Evaluation concepts and strategy
- COIs/COICs/KPPs/AIs
- DOE
- MBT&E
- Priority measures
- Evaluation structure issues (COI, AI, etc.) and key measures (kinds of information that must be captured for the evaluation: performance, RAM, survivability, ILS, MANPRINT)
- Methodologies and types of events that are necessary to capture the information
- Proposed schedule of events that support your strategy (key parts of program schedule)
- Data management
- Common instrumentation
- Safety issues

Other items deemed necessary (include those that are applicable)
- Special, long-lead instrumentation requirements
- Potential high-cost items
- Unusual or lengthy event requirements
- Evaluator concerns and actions/mitigations (high-risk areas, evaluation or test limitations, other concerns)

C. CIPR Format

Introduction (AST Chair)
- AST composition
- System description (include any special test configurations)
- Mission description (include operational and organizational (O&O), BOI, how it's used)
- Acquisition strategy (quantities, etc.)
- Program schedule (milestones, etc.)
- Data Source Matrix
- Safety focus areas:
  - Safety-critical software components
  - Primary safety data to support the SC
  - Any adverse safety findings during DTC testing
  - Safety Release to be provided supporting OT
- Key documentation status
- Summary of ESR

DT Events (briefed by the DTC test manager or appropriate individual)
- DT Event 1
  - Type of event
  - Description/concept (scope, number, and types of test items; threat; location)
  - Entrance criteria
  - Measurement areas (mobility, safety, reliability, environmental, etc.)
  - Test design (sample size, factors, conditions, standards, controls)
  - Instrumentation, models/simulations
  - Event timeline (milestones, environmental document approvals, plans, TRRs, reports, etc.)
  - Event resources (cost, etc.)
  - Event limitations/concerns and actions/mitigations (high-risk areas, limitations, other concerns)
- DT Event 2
- DT Event N

OT Events (briefed by the OTC test officer or appropriate individual)
- OT Event 1
  - Type of event
  - Description/concept (tactical context, threat, scope, force structure, location)
  - Entrance criteria
  - Measurement areas (mobility, safety, reliability, environmental, etc.)
  - Test design (sample size, factors, conditions, controls)
  - Test limitations
  - Instrumentation, models/simulations
  - Event timeline (milestones, environmental document approvals, plans, TRRs, reports, etc.)
  - Event resources (cost, etc.)
  - Event limitations/concerns and actions/mitigations (high-risk areas, limitations, other concerns)
- OT Event 2
- OT Event M

Summary (AST Chair)
- Other items deemed necessary
- Unusual requirements
- Resources summary (event cost, evaluation cost, system/units under test, ammunition required, test items, etc.) (as known)
- Concerns/high-risk areas
- Conclusions

Enclosure F-3
System Evaluation Plan (SEP)

The SEP format is flexible. Abbreviated and Tailored SEP formats are also included.

A. SEP Format

SYSTEM EVALUATION PLAN (SEP)

Cover Page
Signature Page
SF 298
Notices
AST Coordination Sheet
Table of Contents
List of Figures (if needed)
List of Tables (if needed)

1. Introduction
1.1 Purpose
1.2 Background
1.3 System Description
1.4 Key Milestones and Schedules

2. Evaluation Overview
2.1 Evaluation Strategy
2.2 Evaluation Focus Areas
2.2.1 System Contribution to Mission Capability
2.2.1.1 Evaluation Focus Area 1
2.2.1.n Evaluation Focus Area n
2.2.2 System Effectiveness
2.2.2.1 Evaluation Focus Area (n+1) (Topic, COI, or Measure)
2.2.2.m Evaluation Focus Area/Issue (n+m)
2.2.3 System Suitability
2.2.3.1 Evaluation Focus Area (n+m+1)
2.2.3.o Evaluation Focus Area (n+m+o)
2.2.4 System Survivability
2.2.4.1 Evaluation Focus Area (n+m+o+1)
2.2.4.p Evaluation Focus Area (n+m+o+p)
2.3 Evaluation Limitations and Impacts

3. Evaluation Details
3.1 Analytical Approach Description
3.2 System Contribution to Mission Capability Analysis
3.2.1 Evaluation Focus Area 1
3.2.1.1 Measure 1-1. Title
3.2.1.1.a Definition
3.2.1.1.b Required Data
3.2.1.1.c Analysis Approach
3.2.1.n Measure 1-n
3.2.2 Evaluation Focus Area 2
3.2.n Evaluation Focus Area n
3.3 System Effectiveness Analysis
3.3.1 Evaluation Focus Area (n+1)
3.3.1.1 Measure (n+1)-1
3.3.1.1.a Definition
3.3.1.1.b Required Data
3.3.1.1.c Analysis Approach
3.3.1.n Measure (n+1)-n
3.3.2 Evaluation Focus Area (n+2)
3.3.n Evaluation Focus Area n
3.4 System Suitability Analysis
3.4.1 Evaluation Focus Area (n+1)
3.4.1.1 Measure (n+1)-1
3.4.1.1.a Definition
3.4.1.1.b Required Data
3.4.1.1.c Analysis Approach
3.4.1.n Measure (n+1)-n
3.4.2 Evaluation Focus Area (n+2)
3.4.n Evaluation Focus Area n
3.5 System Survivability Analysis
3.5.1 Evaluation Focus Area (n+1)
3.5.1.1 Measure (n+1)-1
3.5.1.1.a Definition
3.5.1.1.b Required Data
3.5.1.1.c Analysis Approach
3.5.1.n Measure (n+1)-n
3.5.2 Evaluation Focus Area (n+2)
3.5.n Evaluation Focus Area n
3.6 Additional Data Requirements

4. Event Concept Overview
4.1 Overview
4.2 Event #1
4.2.1 Purpose of Event
4.2.2 Scope
4.2.3 Methodology/Factors and Conditions/Sample Sizes
4.2.4 Tactical Context/Situation
4.2.5 Test Limitations and Impacts
4.3 Event #2
4.n Event #n

Appendix A. Data Source Matrix
Appendix B. Critical Operational Issues and Criteria
Appendix C. System Level Data Element Dictionary
Appendix D. OTA TP Approval Matrix
References
Distribution List
Acronyms/Abbreviations/Glossary (Optional)
BCM (Optional)

SYSTEM EVALUATION PLAN (SEP)

Cover Page
Signature Page
SF 298
Notices
AST Coordination Sheet
Table of Contents
List of Figures (if needed)
List of Tables (if needed)

CHAPTER 1
INTRODUCTION

1.1 Purpose. The purpose of the evaluation is to provide information to decision makers. State what decisions are to be supported. Give the milestones/reviews that are to be supported. Indicate the types of reports that will be prepared and the purpose of those reports relative to the acquisition of the system.

1.2 Background. Discuss the system acquisition strategy, including any evolutionary acquisition descriptions.

1.3 System Description. Describe the system (or CBTDEV/TNGDEV product). Describe the similarities and differences between the system under test or simulation and the objective system being developed. Summarize the concept for force structure and employment. A proper review of the TEMP (if available) should provide adequate information to complete this paragraph. The description should include, as a minimum, the following:
- Describe the system, its hardware, software, etc.
- Describe the key capabilities to be provided by the system. Additionally, describe the deficiencies in current capabilities that the system will correct.
- Identify and describe the mission capabilities the system is expected to enhance or enable.
- Describe the system's concept of operation, including information from the Operational Mode Summary/Mission Profile (OMS/MP), as appropriate.
- Include a picture or diagram of the system.

1.4 Key Milestones and Schedules. List all milestones for the key T&E events leading to milestone decision reviews (MDRs) or other events supported by the T&E effort. Include a comprehensive list of all events (study and analysis milestones, model completion dates, document approval milestones, and test events) critical to the overall T&E effort.

CHAPTER 2
EVALUATION OVERVIEW

2.1 Evaluation Strategy. Describe the basic concept of the evaluation. Describe the evaluation strategy for the contribution of the system to mission capability, effectiveness, suitability, and survivability. The evaluation strategy should include, but not be limited to, the following:
- The strategy for evaluating the contribution of the system to overall mission capability, including Battlefield Functional Area capability and sustainment.
- The strategy for evaluating system effectiveness, suitability, and survivability.
- The strategy for identifying system capability limitations, assessing risks to LUT and IOT readiness, and the potential impact on mission capability.
- A discussion of program characteristics that may influence the evaluation approach: technical risk, down-select requirements, DT requirements, OT requirements, scope of system fielding, and other factors that will impact the scope of the evaluation.

2.2 Evaluation Focus Areas. For each evaluation domain (contribution to mission, system effectiveness, system suitability, and system survivability), describe the plan for addressing the key evaluation focus areas (EFAs). EFAs are the critical areas affecting the decision to acquire the system.

a. The EFAs can be stated as topics, COIs, or COICs and are documented in Chapter 3. EFAs can be derived from any source (evaluator's insight, KPP, CTP, OMS/MP, etc.) and are supported by measures.

b. This section should reflect the approved ESR and ought to include the War-Fighting System, defined to be a single System, System-of-Systems, or Family-of-Systems described in the capabilities documents, and the Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, and Facilities (DOTMLPF) considerations.

2.2.1 System Contribution to Mission Capability
2.2.1.1 Evaluation Focus Area 1
2.2.1.n Evaluation Focus Area n

2.2.2 System Effectiveness
2.2.2.1 Evaluation Focus Area (n+1) (Topic, COI, or Measure)
2.2.2.m Evaluation Focus Area/Issue (n+m)

2.2.3 System Suitability
2.2.3.1 Evaluation Focus Area (n+m+1)
2.2.3.o Evaluation Focus Area (n+m+o)

2.2.4 System Survivability
2.2.4.1 Evaluation Focus Area (n+m+o+1)
2.2.4.p Evaluation Focus Area (n+m+o+p)

2.3 Evaluation Limitations and Impacts. List all known limitations on the evaluation, including environmental impacts and those relevant to individual data sources that support the evaluation or assessment.

a. The discussion of each limitation should include a description, an explanation of why the limitation exists, which evaluation focus areas are impacted, and an assessment of the operational impact.

b. Remember that system limitations are not evaluation limitations. System limitations should be listed in paragraph 1.3.

CHAPTER 3
EVALUATION DETAILS

3.1 Analytical Approach Description. Describe the analytical approach to be used to support the integrated system evaluation strategy. Identify and describe the measures, factors, and conditions; assumptions and limitations surrounding the analysis; and any special or unique statistical analysis techniques or tools, to include Models/Simulations (M&S), and how they will be used.

a. Chapter 3 should provide the detailed methods, processes, definitions, test requirements, and other information that documents the specific information requirements for the evaluation. It provides a higher level of detail and discussion than that contained in Chapter 2. It should be considered the comprehensive plan for collecting and analyzing the information needed to support the evaluation. It should identify and describe the information that the test members require to adequately plan for and develop tests and simulations to support the evaluation. The measures should be organized by evaluation domains and evaluation focus areas.

b. Measures identify information required from tests, M&S, and other sources. Definitions and discussions of the measures, to include specific operational conditions, sample size (if known), and data precision requirements, provide testers and other personnel the information necessary to base test designs or other information requirements on. Technical specifications (such as those that measure mileage, speed, acceleration, weight restrictions, electrical power requirements, etc.) may be provided in tabular formats. The Data Source Matrix (DSM) appendix to the SEP should include all measures organized by evaluation domain and focus area, with the criteria, technical requirements, and sources for the data clearly identified. The AST should endeavor to limit the number of measures to only those critically needed for evaluation of the system.

3.2 System Contribution to Mission Capability Analysis

3.2.1 Evaluation Focus Area 1

3.2.1.1 Measure 1-1. Title

3.2.1.1.a Definition. Describe the measure in detail: what exactly you want to measure and analyze.

3.2.1.1.b Required Data. Include the tests that data will come from, any documents you need for review, and the proposed general methodology for data collection. Describe exactly the data that you need to perform your analysis. State any unique measurement requirements needed (e.g., measures to one-thousandth of an inch instead of one-tenth of an inch). Include precision requirements, factors and conditions, sample size, etc.

3.2.1.1.c Analysis Approach. Describe how you intend to analyze the data, the statistical treatments you intend to use, confidence levels desired, sample data displays, etc.

3.2.1.n Measure 1-n

3.2.2 Evaluation Focus Area 2

3.2.n Evaluation Focus Area n

3.3 System Effectiveness Analysis

3.3.1 Evaluation Focus Area (n+1)

3.3.1.1 Measure (n+1)-1

3.3.1.1.a Definition. Describe the measure in detail: what exactly you want to measure and analyze.

3.3.1.1.b Required Data. Include the tests that data will come from, any documents you need for review, and the proposed general methodology for data collection. Describe exactly the data that you need to perform your analysis. State any unique measurement requirements needed (e.g., measures to one-thousandth of an inch instead of one-tenth of an inch). Include precision requirements, factors and conditions, sample size, etc.

c. Analysis Approach. Describe how you intend to analyze the data, the statistical treatments you intend to use, confidence levels desired, sample data displays, etc.
3.3.1.n Measure (n+1)-n
3.3.2 Evaluation Focus Area (n+2)
...
3.3.n Evaluation Focus Area n

3.4 System Suitability Analysis
3.4.1 Evaluation Focus Area (n+1)
3.4.1.1 Measure (n+1)-1
a. Definition. Describe the measure in detail: what exactly you want to measure and analyze.
b. Required Data. Include the tests that data will come from, any documents you need for review, and the proposed general methodology for data collection. Describe exactly the data that you need to perform your analysis. State any unique measurement requirements (e.g., measures to one-thousandth of an inch instead of one-tenth of an inch). Include precision requirements, factors and conditions, sample size, etc.
c. Analysis Approach. Describe how you intend to analyze the data, the statistical treatments you intend to use, confidence levels desired, sample data displays, etc.
3.4.1.n Measure (n+1)-n
3.4.2 Evaluation Focus Area (n+2)
...
3.4.n Evaluation Focus Area n

3.5 System Survivability Analysis
3.5.1 Evaluation Focus Area (n+1)
3.5.1.1 Measure (n+1)-1
a. Definition. Describe the measure in detail: what exactly you want to measure and analyze.

b. Required Data. Include the tests that data will come from, any documents you need for review, and the proposed general methodology for data collection. Describe exactly the data that you need to perform your analysis. State any unique measurement requirements (e.g., measures to one-thousandth of an inch instead of one-tenth of an inch). Include precision requirements, factors and conditions, sample size, etc.
c. Analysis Approach. Describe how you intend to analyze the data, the statistical treatments you intend to use, confidence levels desired, sample data displays, etc.
3.5.1.n Measure (n+1)-n
3.5.2 Evaluation Focus Area (n+2)
...
3.5.n Evaluation Focus Area n

3.6 Additional Data Requirements. Technical specifications (such as mileage, speed, acceleration, weight restrictions, electrical power requirements, etc.) should be described in general terms in this paragraph and listed with details in tabular format.

CHAPTER 4
TEST OVERVIEW

The test executors prepare this chapter in coordination with the other AST members. This chapter provides an overview of the test/simulation event strategy in sufficient detail to provide adequate data for evaluation. It addresses the evaluation focus areas, criteria, and measures specified in the Pattern of Analysis (PoA) and defined in the DSM. Each event will be included in the DSM, and each DSM event will be included here. An event may consist of multiple tests at multiple sites. For each event, include the purpose, scope, design of experiment, factors and conditions, sample size, confidence requirements, and any known requirements for long-lead instrumentation, simulation, or stimulation. The events in this chapter are listed by DT events (contractor and Government), OT events, and other events.

4.1 Overview. Provide an overview of the events to be conducted.

4.2 Event #1. Identify the event.
4.2.1 Purpose of Event. The purpose of the test/simulation design is to record the test execution methodology required to collect and report the required data as specified in the pattern of analysis and in accordance with the provisions specified by the scope of the test/simulation. Where the SEP is written to support an evaluation with no PVT, PQT, AWE, EUE/T, LUT, IOT, or FOT as data sources, this chapter consists of a brief statement by the AST to that effect.

4.2.2 Scope. In very general terms, provide information on test/simulation location, environment, duration, and test player units. Include configuration data on player forces that will operate and maintain the system and that are required to portray appropriate supporting, supported, and adjacent forces, as well as opposing forces. Identify the types of test units or organizations planned for the test/simulation. Discuss how the units are organized and any significant requirements of the units. Include Table of Organization and Equipment (TOE) designations, as required.
4.2.3 Methodology/Factors and Conditions/Sample Sizes. Explain the concept(s) for obtaining the required data as outlined by the scope of each test/simulation issue for which data is generated. This information should give the reader an understanding of the type of test/simulation approach (such as comparison testing, testing to a standard, and/or subjective analysis/military judgment).
4.2.4 Tactical Context/Situation. Describe the military environment in which the item or concept will be tested/simulated. Matters to be considered for inclusion are the assumed level of conflict intensity, description of the threat, missions to be performed by the test unit, summary of the scenario, nature of terrain and environment, and known departures from doctrine.
4.2.5 Test Limitations and Impacts. Describe all known limitations to the adequacy of the test, such as duration, number of systems, availability of interoperable systems, test unit availability, and ITTS, and their impact on the measures. Known system limitations are not test limitations. Each limitation is presented with a description, the impact of the limitation, an explanation of why the limitation exists, and what can be done to ameliorate the limitation. A test limitation does not exist if the required test data collected will be of sufficient sample size, valid, and reliable.

4.3 Event #2. Continue the pattern established for Event #1.
4.4 through 4.n Event #n. Repeat the subparagraphs above as required for each data source or event.

Appendix A. Data Source Matrix (DSM). The DSM includes all the evaluation measures. The DSM is a crosswalk between the evaluation domains, evaluation focus areas, COI/COIC, measures, and data collection events. Under each event for each measure, identify whether the event is a primary or secondary source of data for the measure. The DSM is normally shown as a table.

Sample DSM. Event columns in the full matrix: Market Survey, CDT, GDT*, LUT, IOT, M&S, FDTE. P = primary data source; S = secondary data source. (* GDT broken down by sub-test.)

Evaluation Domain   EFA                   COI/COIC                 Measure                      Data sources
Effectiveness       2. Lethality          1-1. P kill/shot         P kill/shot                  P, S, S
                                          1-2. Time to Engage      2-1. Shot time lines
                    3. Interoperability                            3-1. Messaging with ABCS     P, P, S, P, P, P
                    4. (etc.)
                    5. (etc.)
Suitability         6. RAM                2-1. Ao                  6-1. Ao                      P, P, S, S
                                                                   6-2. EFF                     P, P
                                                                   6-3. MR                      P, P, S, S
                                          2-2. MTTR                6-4. MTTR                    P, P, S, S
Survivability       7. E3                                          7-1. Jamming                 S, P, S, S
                                                                   7-2. EMP                     S, P
                                                                   7-3. RFDEW                   S, P, S
                                                                   7-4. TEMPEST                 S, P

Appendix B. Critical Operational Issues and Criteria (COI and COIC). List all the COIs and COICs with scope and rationale, exactly as provided by the Capabilities Developer or functional proponent for the system. Technical specifications will be listed in a tabular format. Include a table that reflects the COIs and COICs mapped to the evaluation focus area or measure, such as the one below.

COI/COIC                                               EFA/Measure*
COI 1: Lethality                                       EFA 2: Lethality
  1-1: Probability of kill/shot
  1-2: Time to Engage                                  Measure 2-1: Shot time lines
COI 2: RAM                                             EFA 6: RAM
  2-1: Operational Availability (Ao)                   Measure 6-1: Ao
  2-2: Mean time to repair (MTTR)                      Measure 6-4: MTTR
COI 3: Survivability                                   EFA 7: E3
  3-1: Electromagnetic Environmental Effects (E3)
*Evaluation Focus Areas and Measures are fully described in Chapter 3 of the SEP.
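The crosswalk above can also be kept in a lightweight machine-readable form so the AST can check coverage as the DSM evolves. The short Python sketch below is illustrative only and is not part of the prescribed SEP format; the measure names, events, and primary/secondary assignments are hypothetical, loosely following the sample matrix.

# Illustrative sketch only (not part of the prescribed SEP format): a DSM
# crosswalk held as a dictionary, with helper checks an AST might run.
# Measure names, events, and P/S assignments are hypothetical.

DSM = {
    # measure: {event: "P" (primary source) or "S" (secondary source)}
    "2-1 Shot time lines":     {"GDT": "P", "LUT": "S", "IOT": "S"},
    "3-1 Messaging with ABCS": {"CDT": "P", "LUT": "P", "IOT": "S"},
    "6-1 Ao":                  {"GDT": "P", "LUT": "S", "IOT": "P"},
    "6-4 MTTR":                {"GDT": "P", "IOT": "S"},
    "7-1 Jamming":             {"GDT": "P", "M&S": "S"},
}

def measures_without_primary(dsm):
    """Measures that have no primary (P) data source assigned."""
    return [m for m, sources in dsm.items() if "P" not in sources.values()]

def measures_supported_by(dsm, event):
    """(measure, role) pairs addressed by a given data collection event."""
    return [(m, sources[event]) for m, sources in dsm.items() if event in sources]

if __name__ == "__main__":
    print("No primary source:", measures_without_primary(DSM))
    print("Supported by LUT:", measures_supported_by(DSM, "LUT"))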

Appendix C. System Level Data Element Dictionary (SLDED). The SLDED will start with the DSM.
1. DSM. Measures in the DSM are associated with events that will generate the data needed for the measures. Considerations for selecting an event include the tactical context, robustness, and fidelity required in the event to properly address the measure.
2. Pattern of Analysis. The pattern of analysis traces the evaluation focus areas to associated measures, data requirements, and data elements to be developed. The purpose is to identify the data elements required to address the measures required for the evaluation.
3. Data elements required to address the measures will be given standard names, characteristics, and definitions. The purpose is to establish a reusable, translatable, and common T&E language that serves as a resource for the T&E community. The AST uses the pattern of analysis to identify the universe of data elements to be standardized. Common data elements must be developed and used across all T&E events throughout the life cycle of the system.
4. Data elements are the lowest level of information collected. They generally record an item of information that is factual or based on observation or instrumentation and that requires no linkage with any other data elements. Examples of data elements are:
   Vehicle ID of shooter
   Shot date/time
   Test player ID of shooter
   Position/location at time of shot
   Intended target ID
   Result of shot
5. Data elements are characterized by specifying the data element type and field length, the range if the element is a continuous variable, the domain of valid values if the element is a code, and the element's structure (format) if it is made up of two or more data elements.
6. Data elements are grouped into tables of like information, with a definition of what would cause a data record to be generated in the table. The tables may be linked relationally or may be flat files; generally, relational databases are preferred.

Appendix D. OTA TP Approval Matrix. This appendix delineates the agreement between ATEC and OSD on which OTA TPs will need to be furnished to OSD for approval. It applies to all OSD oversight programs. It is the result of negotiations between the AST and the DOT&E action officer. A sample matrix is below.

Sample OTA TP Approval Matrix

Event                                      OTA TPs                                      Approval Level                 OSD
Event 1 (e.g., PQT (DTC Phase))                                                         CDR, DTC; CDR/DIR AST Chair    For information only
Event 2 (e.g., PQT (Contractor Phase))                                                  PM; CDR/DIR AST Chair          For information only
Event n                                    * SEP-designated OTA TP, for approval        CDR, OTC; CDR/DIR AST Chair    Approval
Event n+1                                  * SEP-designated OTA TP, for information     CDR, OTC; CDR/DIR AST Chair    For information only

* Transmitted by memorandum from ATEC CG/ED to DOT&E with CF: TEO.

REFERENCES
DISTRIBUTION LIST
ACRONYMS (optional)
ABBREVIATIONS (optional)
GLOSSARY (optional)

B. Abbreviated/Tailored SEP Format

SEP-A/SEP-T Format

Cover page
Signature page
SF 298
Notices
AST coordination sheet
Table of contents
List of figures (if needed)
List of tables (if needed)
1 Introduction
  1.1 Purpose
  1.2 System description
  1.3 Key milestones, events, and reports
2 Evaluation plan
  2.1 Evaluation structure/evaluation summary
  2.2 Effectiveness
      Issue (COI, AI, etc.)
          Measure title
              Measure definition (include requirements)
              Data requirements (factors and conditions) (include required sample sizes, etc.)
              Analytical/evaluative plan/treatment/strategy
  2.3 Suitability (same)
  2.4 Survivability (same)
3 Events
  3.1 Event #1 title (event description/concept/scope/long-lead instrumentation requirements) [low detail; tailor for the audience to understand the DSM and evaluation strategy prior to publication of EDPs]
Appendix A. Data Source Matrix
Appendix B. COICs/KPPs
Acronyms/Abbreviations
Distribution List

Enclosure F-4
Operational Test Agency Test Plan (OTA TP) Format

The OTA TP should provide the reader, reviewer, and decision-maker with the information on the event requirements, structure, variables, schedule, methodology, and other material necessary for understanding and visualization of how the event will occur. The OTA TP defines the methodology for collection of data to support the evaluation strategy.

A. LF OTA TP Format

OTA TEST PLAN FOR THE LIVE FIRE TEST PHASE OF THE [SYSTEM NAME]

1. Introduction
   1.1 Purpose
   1.2 Scope
   1.3 System Description
   1.4 Background
2. Event Description
   2.1 Event Summary
   2.2 Event Design
       2.2.1 Test 1
             2.2.1.1 Test 1 Item Configuration
             2.2.1.2 Test 1 Matrix
       2.2.2 Test 2
             2.2.2.1 Test 2 Item Configuration
             2.2.2.2 Test 2 Matrix
       2.2.n Test n
3. Analytical Approach and Data Requirements
   3.1 Analytical Approach
       3.1.1 Methodology 1
       3.1.2 Methodology 2
       3.1.n Methodology n
   3.2 General Data Requirements
       3.2.1 Test 1
       3.2.n Test n
Appendix A  Data Source Matrix
Appendix B  Data Definitions
References
Distribution
Acronyms/Abbreviations/Glossary

OTA TP FOR THE LIVE FIRE TEST PHASE OF THE [SYSTEM NAME]

This particular OTA TP example was prepared for an engineering test phase of a warhead. Test results will contribute to the lethality Live Fire evaluation. This test phase is NOT the Full-Up System-Level test.

1.0 INTRODUCTION

1.1 Purpose. This document defines the test scope, data requirements, and data analysis that support the U.S. Army Test and Evaluation Command's (ATEC's) lethality evaluation of the [System Name]. Describe where this test phase occurs in the program. State if this is an overarching OTA TP or if more OTA TPs can be expected as the program progresses. Reference the TEMP containing the LFT&E strategy, since it established the need for this test phase. State which decision will be supported.

1.2 Scope. State in general terms how this test will support the LFT&E evaluation. Show a program schedule and highlight this test phase, as well as future test phases that will support a LF lethality evaluation.

[Figure 1. Program Schedule with Tests Supporting Live Fire Lethality Highlighted]

1.3 System Description. Include a diagram of the test item, if available. In this example, it is a missile showing warhead components.

1.4 Background. Summarize the overall LFT&E strategy and, if necessary, any program issues necessary to help explain the particulars of the test.

1.5 Live-Fire Critical Issue (CI). Taken directly from the LFT&E strategy.

1.6 Limitations. Acknowledge any limitations which may affect test results or evaluation results. State how the limitations will be mitigated. Example: LF testing is meant to occur with production-representative configurations; testing conducted early in the program may have to be repeated if the LFT&E IPT decides the configuration has changed significantly during the course of the program.

2.0 EVENT DESCRIPTION

2.1 Event Summary. Briefly describe the tests, who will provide test articles, and who is responsible for test item configuration. (For example, some tests require live warheads, while inert warheads are sufficient for others. Some tests require tactical seekers, while seeker simulants are sufficient for others.)

2.2 Event Design. Provide a matrix of tests within this test phase and test agency responsibility (such as penetration, fragmentation, overpressure).
2.2.1 Test 1. State the purpose of Test 1. State if the test results will supplement any other tests (for example, sled velocity check-out tests may also provide penetration data).
2.2.1.1 Test 1 Item Configuration. List the test-unique characteristics of this test item (such as live vs. inert, any simulants used). Provide a diagram if available.
2.2.1.2 Test 1 Matrix. If a matrix is impractical at this time, at least state the number of tests that will be conducted and the important factors.
2.2.2 Test 2. State the purpose of Test 2. State if the test results will supplement any other tests (for example, sled velocity check-out tests may also provide penetration data).
2.2.2.1 Test 2 Item Configuration. List the test-unique characteristics of this test item (such as live vs. inert, any simulants used). Provide a diagram if available.
2.2.2.2 Test 2 Matrix. If a matrix is impractical at this time, at least state the number of tests that will be conducted and the important factors.
2.2.n Test n. State the purpose of Test n. State if the test results will supplement any other tests (for example, sled velocity check-out tests may also provide penetration data).
2.2.n.1 Test n Item Configuration. List the test-unique characteristics of this test item (such as live vs. inert, any simulants used). Provide a diagram if available.

2.2.n.2 Test n Matrix. If a matrix is impractical at this time, at least state the number of tests that will be conducted and the important factors.

3.0 ANALYTICAL APPROACH AND DATA REQUIREMENTS

3.1 Analytical Approach. State if any computer models or methodologies will be used in the evaluation. Summarize how test data will be used with these methodologies.
3.1.1 Methodology 1. Describe the model or methodology. Describe which test from Chapter 2 will provide data to be used with the methodology. Describe how/if the data will be used to develop algorithms unique to this system and how the data contributes to the VV&A process.
3.1.2 Methodology 2. Describe the model or methodology. Describe which test from Chapter 2 will provide data to be used with the methodology. Describe how/if the data will be used to develop algorithms unique to this system and how the data contributes to the VV&A process.
3.1.n Methodology n. Describe the model or methodology. Describe which test from Chapter 2 will provide data to be used with the methodology. Describe how/if the data will be used to develop algorithms unique to this system and how the data contributes to the VV&A process.

3.2 General Data Requirements. State the desired level of detail in the database provided by the tester to the evaluator (e.g., database level 1-7). State who will receive the database. Describe any test information that is common to all the tests in this test phase. For example:
   Test number and date.
   Item serial numbers.
   Photographs of the test hardware.
   Photographs of the test setup.
   Video footage of the test event.
   Test item configuration information.
   Target material description.
   Deviations from specified test setup or procedure.
   Any unusual occurrences.
3.2.1 Test 1. This should correspond to Test 1 in Chapter 2. State how the evaluator will use this data. List the data needed from this test. Provide diagrams or figures as necessary.
3.2.2 Test 2. This should correspond to Test 2 in Chapter 2. State how the evaluator will use this data. List the data needed from this test. Provide diagrams or figures as necessary.
3.2.n Test n. This should correspond to Test n in Chapter 2. State how the evaluator will use this data. List the data needed from this test. Provide diagrams or figures as necessary.
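One way to keep the common data requirements above consistent across tests is to define them once as a record structure. The Python sketch below is purely illustrative and is not an ATEC-prescribed format; the field names and sample values are hypothetical.

# Purely illustrative (not an ATEC-prescribed format): the common data items
# from paragraph 3.2 captured as a single record type. Field names and the
# sample values are hypothetical.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class CommonTestRecord:
    test_number: str
    test_date: date
    item_serial_numbers: List[str]
    target_material: str
    hardware_photos: List[str] = field(default_factory=list)    # file references
    setup_photos: List[str] = field(default_factory=list)
    video_files: List[str] = field(default_factory=list)
    configuration_notes: str = ""
    deviations: List[str] = field(default_factory=list)          # from specified setup/procedure
    unusual_occurrences: List[str] = field(default_factory=list)

record = CommonTestRecord(
    test_number="LF-2-003",
    test_date=date(2010, 6, 16),
    item_serial_numbers=["SN-0042"],
    target_material="Rolled homogeneous armor plate",
    deviations=["Sled velocity below nominal on run 2"],
)
print(record.test_number, record.target_material)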

B. DT/OT OTA TP Format

1. OTA TP Changes/Updates.
(a) Update as necessary to focus on the priority measures identified in the SEP revision.
(b) Add level 3 database design information, to include both the data element dictionary and the database design. This database information lays the foundation for the evaluation and must be jointly developed by the tester and evaluator to ensure success. Completing the level 3 database design early allows the evaluation team to develop analysis tools and identify data displays that will be used for evaluation reporting.
(c) The OTA TP will be jointly approved by the OTC Commander and the AEC Director.

2. The DTC AST member, in coordination with the whole AST, determines whether a DT OTA TP is required based on the level of detail available in the SEP. A DT OTA TP is documented by the DT member of the AST based on information provided by the system evaluator. The DT OTA TP is a single document that addresses one or more DT events for acquisition-evaluated systems. DT OTA TPs are developed when the SEP is not yet available or when established TOPs, ITOPs, and MIL-STDs do not provide sufficient test design information for the development of the DTP. The DT member is responsible for writing the DT OTA TP and coordinating it with the AST. The DT OTA TP for an acquisition-evaluated program is jointly approved by the Commander/Director (or designee) of both DTC and AEC no later than T-120. The DT OTA TP will consist of the following:
   Description of required tests.
   Description of the conditions under which the system will be tested (e.g., environmental conditions, component level versus system level, threat).
   Test criteria and test methodology.
   Data requirements, including sample size, number of shots, miles, trials, etc.
   Data presentation.
      Statistical methods (e.g., confidence levels, risk).
      Structure of the database.
   Description of how the item will be used in the IOT (design of the IOT to address readiness).
   Address modeling and simulation efforts.

3. DT/OT OTA TP Format. The basic format for either a DTC OTA TP or an OTC OTA TP is the same. Below is an example of the expected OTA TP:

OTA TEST PLAN

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)
1. Introduction
   1.1 Purpose
   1.2 Scope of Event
   1.3 System Description
   1.4 Background
   1.5 Event Limitations
   1.6 Event Milestones
2. Event Description
   2.1 Event Summary and Overall Methodology
   2.2 Event Design
   2.3 Schedule of Major Events or Phases
   2.4 Tactical Context
   2.5 Description of Players/Forces
   2.6 Control Procedures
   2.7 Instrumentation, Simulation, and Stimulation Requirements
   2.8 Validation, Verification, and Accreditation Requirements
   2.9 Training Concept
   2.10 Environmental Impacts
3. Analytic Detail
   3.1 Analytic Approach
   3.2 Statement of First EFA
       3.2.1 Measure 1-1. Title of Measure
       3.2.X Last Measure for EFA
   3.3 Statement of Second through Last EFA
4. Data Management
   4.1 Overview
       4.1.1 Data Collection Organization and Procedures
       4.1.2 Data Reduction Organization and Procedures
       4.1.3 Critical Data Process Description
   4.2 Data Authentication Group (if required)
       4.2.1 Organization
       4.2.2 Procedures
   4.3 Event Database
       4.3.1 Procedures for Development
       4.3.2 Structure of the Database
Appendix A  Pattern of Analysis
Appendix B  Data Source Matrix
Appendix C  Data Definitions
References
Distribution
Acronyms/Abbreviations/Glossary

DT/OT OTA TP Format

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)

CHAPTER 1
INTRODUCTION

1.1 Purpose. Provide an understanding of the event (name and type), how the event fits into the overall evaluation strategy, and how the results of the event will be used. Additional appropriate information includes the acquisition category of the system; whether the system is on the oversight list; the milestone decision or other review for which the event is providing input; and the product that will be produced (Test Report, OAR, OMAR/OER).

1.2 Scope of Event. Provide a brief overview of the event to include what, when (dates), where (locations), phases, and the general kinds of activities to be conducted or simulated. If the event does not support an acquisition program, state the reasons for conducting the event. The level of detail may include:
   Desired level of operational realism.
   Number of scenarios.
   Type and size of player and support units, to include the type of representative personnel who will operate and maintain the system.
   Test environment under which the system is to be employed and supported during testing; specify the maneuver area, terrain, vegetation, threat, and security.
   Plans for interoperability and compatibility testing with U.S. or Allied weapon and support systems, as applicable.
   Type of resources to be used, the threat simulators, and the simulation(s) to be employed. (Whenever models and simulations are to be used, explain the rationale for their credible use.)
   Extent to which the logistics support is representative of the true operational environment.

1.3 System Description. Provide a short overall description of the system to be tested, to include a diagram as the focal point of the description. Do not parrot the PM's description or use terms like "intended to" and "designed to". Identify any peculiar system configurations required or known system limitations. Describe any differences between the tested system and the system that will be fielded; for example, discuss the level of software maturity and the extent to which the system is integrated, interoperable, or compatible with other systems. Characterize the system (prototype, engineering development model, production-representative, or production configuration). The SEP is a good source for this information; however, update SEP information where necessary. If the system or concept is complex, provide a general description and a more detailed description in an appendix.

1.4 Background. Explain the status of the SEP, if appropriate. Discuss where the event falls in the evaluation sequence. Include a summary of significant historical data leading up to the event and a brief discussion of previous related events, identified problems, and significant results.

1.5 Event Limitations. Discuss constraints and all known differences between the event environment and the realistic operational environment as portrayed in the OMS/MP. Explain differences between the event as described in the SEP and the current scope of the event, particularly as they apply to the ability of the tester to collect the required data to address measures assigned to this event. Explain the degree to which the differences could impact the measures for evaluation. Some items that might be considered are: scenario variations from the OMS/MP, the number of systems available, instrumentation, facilities, and differences between the weather conditions described in the OMS/MP and the conditions at the event site.

1.6 Event Milestones. Provide a list of all milestones important to the success or completion of the event. For example:

TABLE 1. TEST PLANNING MILESTONES

Milestone                                                                       Responsibility            Date
Test concept(s) development completed                                           Tester/AST                T-615
Test Concept In-Process Review (CIPR) completed                                 Tester                    T-600
Draft Outline Test Plan (OTP) completed and staffed                             Tester                    T-570
OTP submitted to TSARC                                                          Tester                    T-530
Operational Test Readiness Review (OTRR) #1 conducted                           Tester                    T-270
Coordination draft OTA TP staffed                                               Tester/AST                T-225
Draft OTA TP coordination comments due to tester                                Coordinating Agencies     T-210
Final draft OTA TP submitted for approval                                       Tester/AST                T-195
OTA TP approved (Commander/Director of AST Chair and Commander of Tester)       Tester/Chair Headquarters T-120
Detailed Test Plan (DTP) update to test director in test organization           Tester                    T-090
Approved OTA TP due to Human Use Committee (HUC)                                Tester                    T-060
OTRR #2 conducted                                                               Tester                    T-060
New equipment training (NET) for player unit completed                          PM/MATDEV                 T-011
Pilot test initiated                                                            Tester                    T-010
Pilot test completed                                                            Tester                    T-005
OTRR #3 conducted                                                               Tester                    T-001
Test start                                                                      Tester                    T-000
Test end                                                                        Tester                    E-000
Final RAM scoring conference completed                                          AST                       E-010
Final Data Authentication Group (DAG) meeting completed                         DAG                       E-005
Database validation completed                                                   Tester                    E+10
Test After Action Review brief                                                  Tester                    E+10
TR approved (if TR is the test product)                                         Tester                    E+60
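Once a test start date is fixed, the T- and E- offsets in table 1 translate directly into calendar dates. The Python sketch below is illustrative only: the start and end dates are hypothetical, only a few rows of the table are shown, and it assumes the offsets are calendar days (the pamphlet does not specify calendar versus working days).

# Illustrative only: converting a few of the T-/E- offsets in table 1 into
# calendar dates. The start/end dates are hypothetical, and calendar days are
# assumed (the pamphlet does not say whether offsets are calendar or working days).
from datetime import date, timedelta

TEST_START = date(2011, 3, 1)    # hypothetical T-000
TEST_END = date(2011, 4, 15)     # hypothetical E-000

# (milestone, anchor, offset in days; negative = before the anchor)
MILESTONES = [
    ("Draft OTA TP coordination comments due to tester", "T", -210),
    ("OTA TP approved",                                   "T", -120),
    ("OTRR #2 conducted",                                 "T", -60),
    ("Pilot test initiated",                              "T", -10),
    ("Test start",                                        "T", 0),
    ("Database validation completed",                     "E", 10),
    ("TR approved (if TR is the test product)",           "E", 60),
]

for name, anchor, offset in MILESTONES:
    base = TEST_START if anchor == "T" else TEST_END
    print(f"{name}: {base + timedelta(days=offset)}")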

CHAPTER 2
EVENT DESCRIPTION

2.1 Event Summary and Overall Methodology. Provide a clear description of the events that will be performed under specified sets of conditions to generate and collect data to the degree of precision required. Include a description of the event design that enables the reader to understand and visualize the event execution (paint a picture). Describe whether the event is a comparison to a baseline or standard or whether it is exploratory in nature. Include discussion of the specified threat, unique operational conditions, environmental conditions (CBR, EMI, or other), terrain, or other operational characteristics.

2.2 Event Design. In matrix format and by event phase, describe the event variables, factors, conditions, and test controls (systematically varied, tactically varied, held constant, uncontrolled). Provide sample size matrices by major factors and conditions; if necessary, provide different matrices for each event phase. Identify the types of trials that are to be conducted and the conditions under which they must occur. Organize the trials into appropriate groupings identified as phases or subtests.
a. Event Variables (Factors and Conditions). During early planning, the AST identifies the event variables through review of the system's requirements documents. There are three types of event variables (often referred to as event factors): independent, dependent, and extraneous. During events, all three types of variables assume discrete values (or conditions). It is the tester's responsibility to control the independent variables in order to measure the response in the dependent variables. An extraneous variable is one that is not selected or cannot be controlled by the tester; however, it may have an effect on the dependent variable. One of the primary considerations in designing an event is to minimize and/or document the effects caused by extraneous variables.
b. Test Controls. The AST develops the list of event variables and adjusts the factors and conditions to meet the final event requirements. The adjustment must be based upon the data required to answer the event issues, criteria, and measures. Factors may be controlled in one of four ways: tactically varied, systematically varied, held constant, or uncontrolled. Tactically varied factors are preferred because they develop as a result of tactical operations employed in the event and enhance event realism. Systematically varied factors are used to permit examination of all required factors in sufficient quantity for effective analysis. Factors are held constant for the test when prior knowledge or testing indicates a preference or no other option for that factor is available. Uncontrolled factors should be held to a minimum. When critical factors for a system are identified, the most representative conditions for that factor are developed into the event matrices, with the number of conditions held to a minimum. An example of typical factors and conditions is shown in table 2.

TABLE 2. SAMPLE LIST OF TYPICAL FACTORS AND CONDITIONS

Factors                           Control                   Conditions
Range of engagement               Systematically varied     ..., ..., 900-1,300 meters
Light conditions                  Systematically varied     Day, night
Target movement                   Systematically varied     Moving, stationary
Threat arrays                     Systematically varied     IAW threat support package
CBR                               Systematically varied     No MOPP, MOPP 2, MOPP 4
Terrain (phase 1)                 Systematically varied     Flat, rolling
Terrain (phase 2)                 Tactically varied         Rugged, swamp
Enemy action                      Systematically varied     Attack, defend
Battlefield obscuration           Systematically varied     No smoke, smoke
EW environment                    Systematically varied     IAW threat support package
Doctrine/tactics                  Held constant             IAW D&O support package or IAW TRADOC support package
Logistics support                 Held constant             ORG, DS
Communications status             Tactically varied         Radio-digital, radio voice
Enemy target                      Tactically varied         Troops, vehicle, bunker
Weather                           Uncontrolled              Rain, dry, snow
System operating status           Uncontrolled              Fully operational, degraded mobility, degraded firepower, nonoperational

c. Combining Conditions. The selected set of test conditions is used to determine what combinations of conditions are appropriate. For example, a hypothetical system's target detection capability could be influenced by three training level conditions (untrained, average, and highly proficient), three weather conditions (clear, overcast, and precipitation), and two terrain conditions (flat and mountainous). This situation would require consideration of 18 possible test combinations (3 x 3 x 2 = 18). The radio communications capability of the hypothetical system could require consideration of only the training and terrain conditions (3 x 2 = 6 combinations) because weather conditions have little effect. A suggested technique is to draw a matrix listing the possible combinations that interact and influence system performance. Normally, systematically varied controlled factors form the basis of this matrix.
d. Number of Required Trials. The number of trials for a phase is normally dictated by statistical requirements associated with EFAs or measures. The required sample size is determined numerically by defining statistical parameters and formally calculating the sample size. Where there are no statistical criteria, test designers must determine how many test trials are necessary to average out chance differences between repetitions of specific events. Essentially, this process determines how many repetitions are required to provide confidence that the test results are valid and representative of a true operational environment. A trial matrix is developed for each phase or set of requirements to show the number of iterations necessary to achieve the desired level of data collection for each phase. An example matrix is shown in table 3.

TABLE 3. SAMPLE OF EVENT MATRIX

                        200-meter range                       700-meter range
                   Moving trial       Still trial         Moving trial       Still trial
System Type        Rounds  Number     Rounds  Number      Rounds  Number     Rounds  Number
Day     XYZ          ...     ...        ...     ...         ...     ...        ...     ...
        ABC          ...     ...        ...     ...         ...     ...        ...     ...
Night   XYZ          ...     ...        ...     ...         ...     ...        ...     ...
        ABC          ...     ...        ...     ...         ...     ...        ...     ...

NOTE: Trial numbers indicate the randomized order in which trials will be performed so that the order of events will be mixed for different crews.

2.3 Schedule of Major Events or Phases.
a. Describe the overall layout of the event to include the sequence of phases and timelines. Briefly summarize each of the event phases.
b. Event phase matrices are developed based on the factors and conditions established in paragraph 2.2. Factors are combined into the smallest number of groupings that will allow a practical and realistic exercise. Care must be taken to avoid groupings of incompatible conditions (e.g., live fire versus live targets) and unrealistic conditions (e.g., concentrated armor attack under nuclear attack). Each grouping becomes one subtest or phase. The tester identifies the types of trials that are to be conducted and the conditions under which each occurs. The tester then organizes the trials into appropriate groupings identified as phases or subtests. Different event phases generally require separate matrices. Event phase matrices may require supplemental notes, annotations, or accompanying text to clarify the information presented in the matrix. The process of grouping factors and conditions ensures appropriate operational realism, minimizes the potential for bias in the data, and provides sufficient test control to meet safety, instrumentation, and test management requirements. An example of an event phase matrix is shown in table 4.

TABLE 4. SAMPLE EVENT PHASE MATRIX

Phase              Dates        Conditions            Trials
Maneuver           5-15 Jan     Tactical, RTCA        Attack, defense, meeting engagement
Live-fire          ... Jan      Day, night            Moving, stationary targets
Transportability   ... Jan      Tactical, strategic   Aircraft, rail, wheel loading
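Paragraphs 2.2c and 2.2d lend themselves to a quick calculation check. The Python sketch below is illustrative only: it enumerates the 3 x 3 x 2 = 18 condition combinations from the hypothetical target-detection example, randomizes run order across cells in the spirit of the note to table 3, and shows one common zero-failure sample-size rule. None of the numbers are requirements.

# Illustrative only: enumerating the condition combinations from paragraph 2.2c,
# randomizing run order (as the note to table 3 suggests), and one common
# zero-failure sample-size rule for paragraph 2.2d. All numbers are hypothetical.
import itertools
import math
import random

training = ["untrained", "average", "highly proficient"]
weather = ["clear", "overcast", "precipitation"]
terrain = ["flat", "mountainous"]

cells = list(itertools.product(training, weather, terrain))
print(f"{len(cells)} combinations (3 x 3 x 2)")        # 18

REPS_PER_CELL = 3                                      # planning input, not a requirement
trials = [cell for cell in cells for _ in range(REPS_PER_CELL)]
random.seed(1)                                         # fixed seed so the order is reproducible
random.shuffle(trials)                                 # randomized run order across crews
for i, (trn, wx, ter) in enumerate(trials[:3], start=1):
    print(f"Trial {i}: training={trn}, weather={wx}, terrain={ter}")

# One common sizing rule (not the only one): smallest n of zero-failure trials
# such that R_required**n <= 1 - confidence.
R_required, confidence = 0.90, 0.80
n = math.ceil(math.log(1 - confidence) / math.log(R_required))
print(f"Zero-failure trials to demonstrate R >= {R_required} at {confidence:.0%} confidence: {n}")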

2.4 Tactical Context. The tactical context is a general description of the military environment in which the event will be performed. For operational tests, describe the degree of realism in the battlefield environmental conditions. Subjects may include the assumed level of conflict intensity (for example, mid-intensity), an abbreviated description of the threat (for example, IAW a southwest Asia scenario), the missions to be performed by the test unit, and a summary of the scenario (including reference to the applicable TRADOC standard scenario). In addition, describe the nature of the terrain and environment and departures from doctrine.

2.5 Description of Players/Forces. List the sizes and types of units (friendly and threat), how they are equipped, how they are represented (if simulated, describe the degree of realism), and any specific capabilities. Describe contractor involvement in the planned event. Discuss the operational force structure; is it in accordance with the Operational Mode Summary/Mission Profile (OMS/MP)? Explain deviations from the OMS/MP and the reason for the deviations. Identify the Military Occupational Specialty (MOS) of the Soldiers expected to operate and maintain the system and explain any deviations in the test player MOS.

2.6 Control Procedures. Describe how the test team is organized and the procedures it will use to ensure test activities are executed and data is collected. The content of the control procedures depends upon the level of simulated realism required in the event. For example, discuss whether an operations cell will be required to coordinate with the test unit or whether a checklist or mission event list will be employed. Describe the supervision and monitoring that the test officer or other controllers will perform to assure that the test item or concept is properly employed. Discuss any OPSEC requirements and describe the methods taken to meet them. Discuss any planned briefings to players about battlefield conditions and how message traffic from higher and adjacent units will be simulated. Describe any checks that will be performed to assure that misuse of the system or misunderstanding of the test requirements does not bias the event results. Discuss how visitors to the test site will be controlled and any procedures implemented to ensure that data is not released from the test site by visiting personnel.

2.7 Instrumentation, Simulation, and Stimulation Requirements. List the type and purpose of all instrumentation, simulation, and/or stimulation resources required for the event. Describe where each is used and its unique capability to provide data. Discuss how each represents either friendly and/or threat player forces, as appropriate. Many sources exist to assist the event planner in identifying available instrumentation systems and their capabilities. The event director/test officer should coordinate with the instrumentation engineers as early in the event planning phase as possible to identify requirements. Instrumentation engineers will assist the event director/test officer in the development of all instrumentation requirements for the test. If the instrumentation, simulation, or stimulation is complex, provide an appendix with detailed descriptions of each.

2.8 Validation, Verification, and Accreditation (VV&A) Requirements. Discuss VV&A requirements for all targets and simulators. Identify the accreditation authority (AA) for each simulation or stimulation used. Except when ATEC coordinates for an outside command to accredit M&S, the AA for M&S is the person who approves release of the evaluative report.

2.9 Training Concept. Briefly describe test-training requirements. Training requirements (amount/type of training and resources) consist of two major elements: training the test players and training the test organization personnel. Summarize the training to be given to player personnel or units (to include Opposing Forces (OPFOR)) in preparation for OTRS certification and testing. Summarize the training to be presented to controllers, data collectors and reducers, and other test personnel. The training concept is expanded into a Training Plan during detailed test planning.

2.10 Environmental Impacts. Reference and briefly summarize any environmental assessments. Identify any impact to the environment caused by the test and the procedures for reducing that impact. Refer to the applicable AR to determine if an environmental assessment is required for the event.

CHAPTER 3
ANALYTIC DETAIL

3.1 Analytic Approach. Describe the level of data and the analytic procedures for the event. Include a matrix that depicts the association of the measures; for each event phase, show the measures to be addressed. The analytic approach is a function of the event product and should be tailored for each event as appropriate.

3.2 Statement of First EFA. State EFA 1 as it is provided in the approved SEP; include methodology that is peculiar to the EFA but applies to all or most of its measures. Methodology will include detailed discussions concerning the following major areas: overall operational test approach, test phases, control procedures, and data collection and reduction specific to the EFA. Do not include information that will be addressed as a unique methodology relating to a measure. If phases, control mechanisms, collection procedures, and reduction methods are closely aligned for all measures associated with this EFA, the information should be stated in this paragraph.
3.2.1 Measure 1-1. Title of Measure 1-1.
a. Measure Methodology. Present the formula for the measure calculations. Discuss relationships between the data elements required to address this measure. Define any unique (or previously undefined) terms or data elements within the measure. Identify factors and conditions that apply or are unique to this measure.
b. Data Collection Method. Describe any unique procedures to be used for collecting and recording the data to address this measure. Identify the techniques to be used, such as subject matter expert observations, questionnaires, or instrumentation. If questionnaires are used, identify how often and who will respond to the questionnaires. Also, identify the type of questions and responses that will be used. If special contract support is required to collect the data to address this measure, identify the amount and type of support required. (NOTE: System contractor involvement in collection, reduction, or scoring of data is not allowed during an IOTE.)
c. Data Reduction. Identify any unique procedures for reducing the data for this measure. If a support contractor is used to collect data from instrumentation or some other contracted test activity (for example, data from the 1553 data bus), discuss how the contractor data will be combined or merged with data generated from other sources. (NOTE: System contractor involvement in collection, reduction, or scoring of data is not allowed during an IOTE.) Define the quality control procedures, such as data summaries used to identify potential inconsistencies in data collection, entry, and data reduction.
d. Data Evaluating Methods. Address how (or if) the data generated from multiple phases will be aggregated to produce a single set of data to address this measure. Discuss the analytic procedures that will be used to aggregate the data for this measure.
3.2.X Last Measure for EFA. State the measure and repeat the subparagraphs above, as necessary.

3.3 Statement of Second through Last EFA. Continue with the format above for the remainder of the EFAs and measures.

CHAPTER 4
DATA MANAGEMENT

4.1 Overview. Provide an overview of the data management organization and general procedures. If possible, use a dendritic and/or flowchart to describe the overall management structure.
4.1.1 Data Collection Organization and Procedures. Describe the data collection organization and procedures. If more than one data collection team is necessary, describe the different teams that will be required. Include a discussion of the data collection techniques to be used, such as subject matter expert observations, instrumentation, and special contract support. Discuss the methodology for recording, controlling, and ensuring the quality of data during data collection. If necessary, explain security procedures for handling classified data. Summarize plans that describe when, where, and how checks of data forms and/or instrumentation outputs will be made to ensure that data are being collected and recorded according to the definitions and formats required. Describe the chain of responsibility for the data collection. For example, some questions to answer are:
   How will the data flow from the data collector(s) to the data manager(s)?
   How will the data flow from site to site, if necessary?
   How and when will data collectors receive training?
   What type of data collection methods (instrumentation, manual, automated) will be used?
   What are the plans for recovery of data, in case data is lost or erroneously not collected?
4.1.2 Data Reduction Organization and Procedures. Describe the data reduction organization and procedures. If more than one data reduction team is necessary, describe the different teams that will be required. Discuss the process by which recorded test data is organized, reduced, controlled, and stored. If applicable, discuss how classified data will be reduced and stored. Indicate the process through which each set of collected data is to pass before reaching a level 3 database. Define the quality control procedures, such as data summaries used to identify potential inconsistencies, which will be used to detect and correct errors made in data collection, data entry, or data reduction. Some questions to answer are:
   Who will reduce the data and what general methods will be used?
   How will the data flow from the data reduction section to the data managers?
   How do the data reducers handle quality control issues with data collected manually or by automation?
   What quality control processes will be implemented during data reduction?
4.1.3 Critical Data Process Descriptions. Discuss requirements for end-to-end data runs during pilot testing. Describe the requirements for data turnaround during test events. How soon after collection is the data expected to be processed and presented to the DAG for authentication? Discuss the schedule and priority of delivery for products. If appropriate, discuss in terms of types of data (performance, RAM, or MANPRINT). If creation of a specific file requires merging of manual, automated, and/or instrumentation data, then describe the procedures and processes for developing the file.

4.2 Data Authentication Group (if required). The AST determines the need for a DAG. If not required, so state and do not include the subparagraphs below. If a DAG is required, provide the reader with a basic understanding of how the DAG functions. Discuss how the DAG will interface with the data management section. Provide a dendritic or flowchart to explain how the data flows from the data management section to the DAG (if not previously presented). Expand on how the data flows within the DAG. The DAG should be tailored to accommodate the unique requirements of the event to be conducted.
4.2.1 Organization. State who will chair the DAG and which organizations will provide members to the DAG. If the DAG will be assisted by special analysis teams, then provide a dendritic to explain how the different teams (performance, RAM, MANPRINT, etc.) will interact with the DAG.
4.2.2 Procedures. Summarize the information from the DAG charter. Discuss the procedures for investigating data anomalies. Explain the procedures the DAG will use for verifying the completeness of the data and for authenticating the data. Summarize the procedures for documenting or flagging data anomalies that are not resolved.

4.3 Event Database. The general structure for the Level 3 event database should be that described in appendix C of the SEP, the System Level Data Element Dictionary (SLDED). The fields and tables included in the event database should be only those that address measures that apply to the specific event. This chapter should describe each of the event databases for the test in terms of major file types (performance, RAM, MANPRINT, etc.). Explain any unusual agreements between the event executor and the AST for the collection, analysis, and transfer of data.
4.3.1 Procedures for Development. Discuss what types of files will be created and how the files will be structured. Explain any links established to ensure consistency between data files.

Explain any unusual agreements between the event executor and the AST regarding the development of files and file formats and structures.
4.3.2 Structure of the Database. For each of the major file types, discuss the number of files that will be created. Explain any relationships between the files. Discuss in general terms the categories of data that each file will contain; for example, a performance file may contain data on detection and engagement of threat targets. Identify the software package (and version) or format for the files (ASCII, SAS, ACCESS, etc.). If more than one software package or format is used, identify the appropriate package for each file. Identify the medium (electronic, disk, Bernoulli) for transferring each type of file to the AST.

APPENDIXES

APPENDIX A
PATTERN OF ANALYSIS

The event executor, in conjunction with the other members of the AST, normally develops the Pattern of Analysis (PoA). It is developed as the first step in a formal test design. The PoA is always required and is a major element of the OTA TP; it links the EFAs, measures, and criteria to specific data elements. The PoA accomplishes this by providing the transition from the measures to the actual data elements. This appendix lists, in dendritic style or narrative form, the COIs, associated criteria, EFAs, measures, and required data elements. The dendritic style is usually appropriate for briefing at higher levels to T&E WIPT members or DOT&E. However, for development, identification, and tracking of data elements to specific measures, the narrative form is preferred. This is mainly because the effort expended to create this appendix contributes to defining the data element dictionary for the event database. As such, the PoA serves as a tool that helps the AST determine what data needs to be collected in order to answer the measures, criteria, and issues. As a general rule, only include the issues, criteria, EFAs, and measures that will be answered by the event defined in the OTA TP. It is not necessary to include measures that will be answered through other events defined in the SEP. Upon completion, the PoA should be closely reviewed by the AST to ensure that it is complete, unbiased, and logically structured. As appropriate, include a graphic figure in dendritic format showing the decomposition of the issue to the final data elements. This will enable the reader to track the logical flow of the plan.

1. Evaluation Focus Area. State each EFA in its own major paragraph. The EFAs are stated in the form of a question for which the response will contribute to the decision-making processes. Where an EFA is also a COI or COIC, put the statement of the COI or COIC in a subparagraph.
1.1 Measure. State each measure under the EFA as the next subparagraph. State the measure as written in the SEP; the measures should track from the SEP through the OTA TP to the PoA. The measure should be answerable in quantitative terms using distributions, means, percentages, etc. Measures may also be qualitative, with data provided through questionnaires. The objective, whether quantitative or qualitative, should be to phrase the measure in measurable terms. Measures need to be defined well enough to determine the data required as well as the method for applying the data to the measure. Subjects or categories of system capabilities, such as battlefield management, digital displays, and MANPRINT, are not measures; they are evaluation subjects.
D.R. Data Requirements. List all data requirements and data elements (DE) under the measure. Define each DE in sufficient detail to preclude any ambiguity or misunderstanding. Data requirements and data elements are the exact questions that the instrumentation, data collector, or questionnaire respondent is expected to answer during an event trial. If the data collector is providing the response, the question must be worded objectively and in a clear and simple manner. To the extent possible, the data collector should be able to check a box or fill in a code, number, or time on the data collection form or questionnaire. The data collector should not have to make subjective assessments or recall information from previous activities or trials in order to answer the data element. If a test player is responding to a questionnaire, then the same philosophy applies and the questions should remain clear and simple. However, some flexibility is allowed in terms of recall, because a player may be asked to provide an assessment of system capabilities based on activities that have occurred during the event. Test players may also be asked to provide comments or narrative responses to open-ended questions.
1.m+1 Last EFA 1 Measure.
x. Last Evaluation Focus Area.
x.1 First Measure under Last EFA.
x.m+1 Last Measure under Last EFA.

APPENDIX B
DATA SOURCE MATRIX (DSM)

The DSM is taken directly from the SEP.

APPENDIX C
DATA DEFINITIONS

REFERENCES
DISTRIBUTION LIST
ACRONYMS
ABBREVIATIONS
GLOSSARY

The above items are provided as needed.
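The data elements identified through the PoA ultimately populate the Level 3 event database described in chapter 4 and in appendix C of the SEP. The Python/SQLite sketch below is illustrative only and is not an ATEC-prescribed schema; the table name, column names, types, and sample row are hypothetical, patterned on the firing-event data elements named in the SLDED discussion.

# Illustrative only (not an ATEC-prescribed schema): one relational table of a
# hypothetical Level 3 event database, patterned on the firing-event data
# elements named in the SLDED discussion. All names, types, and values are
# hypothetical; the AST's data element dictionary would govern the real design.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE firing_event (
        shot_id            INTEGER PRIMARY KEY,
        shooter_vehicle_id TEXT NOT NULL,              -- vehicle ID of shooter
        shooter_player_id  TEXT NOT NULL,              -- test player ID of shooter
        shot_datetime      TEXT NOT NULL,              -- shot date/time (ISO 8601)
        shooter_position   TEXT,                       -- position/location at time of shot
        intended_target_id TEXT,                       -- intended target ID
        shot_result        TEXT CHECK (shot_result IN ('hit', 'miss', 'unknown'))
    )
""")
conn.execute(
    "INSERT INTO firing_event VALUES (?, ?, ?, ?, ?, ?, ?)",
    (1, "VEH-07", "PLYR-23", "2011-03-05T14:32:10Z", "38.715N 77.142W", "TGT-04", "hit"),
)
for row in conn.execute("SELECT shot_id, shooter_vehicle_id, shot_result FROM firing_event"):
    print(row)
conn.close()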

Enclosure F-5
Detailed Test Plan (DTP)

DTC Detailed Test Plan. For Acquisition-Evaluated Projects. The DT DTP is the major planning document for the conduct of DT for an evaluated system. The test officer uses the test directive, SEP, TOP/ITOP, performance specification, and other relevant documentation to develop the DTP for acquisition-evaluated projects. The test officer will prepare a DTP in accordance with the directions provided in the test directive. The draft DTP is initially provided to the AST and T&E WIPT for review and comment. After all internal coordination, the DTP is finalized. The Test Center commanders/directors (LTC/GS-15 level) are charged with ensuring the quality, both content and format, and approving the test plan prior to obtaining HQ DTC approval. The DTP is approved by the DTC Commander/Director (or designee, usually the Test Division Chief). No major deviations from the approved DTP can be made without the approval of HQ DTC. Minor deviations will be reflected in the narrative portion of the test report. The guidelines and format for a DTP are shown below.

DTP Format

Front Cover
Disposition Instructions
Table of Contents
1.1 System Description
1.2 Summary
1.3 Unique Test Personnel Requirements
2.0 Subtests
2.1 Name of Subtest
    Objectives
    Criteria and Data Analysis/Procedure
    Test Procedures and Data Required
APPENDIXES
    Test Criteria
    Test Schedule
    Informal Coordination
    References
    Abbreviations
    Distribution List

Front Cover

The outside front cover and the reverse of the front cover are prepared in accordance with administrative specifications. The reverse of the front cover will contain the following information:

DISPOSITION INSTRUCTIONS

Destroy this document when no longer needed. Do not return to the originator.

The use of trade names in this document does not constitute an official endorsement or approval of the use of such commercial hardware or software. This document may not be cited for purposes of advertisement.

TABLE OF CONTENTS

If the report is to contain hyperlinks, the following note will be included:

Note: To use a hyperlink in this report, click on the blue, underlined text. To return to the previous position, click on the back arrow on the Adobe toolbar (at the bottom of the page). The back arrow is not shown until after a hyperlink has been clicked.

This table of contents lists each major section and subtest heading of the test plan.

SECTION 1. INTRODUCTION

1.1 SYSTEM DESCRIPTION

The description is derived from the SEP, TEMP, or other source documents. It should describe the test item in terms of function, technical parameters, physical characteristics, mission, and threat. If the test item consists of several major components, identify these and describe each. If test items differ from previously tested items, describe the differences in the test items. If literature on the test item exists, include appropriate extracts of the description and cite the reference. Ensure that the manufacturer's performance claims are not included as facts in the description. Listing physical characteristics rather than performance information is advisable. This description must permit full understanding of the item. Include a line drawing or photograph, if available.

1.2 SUMMARY

Briefly address the test authority, test concept, and test objective(s) in this paragraph. The test authority will be included in the test directive and referenced as an appendix. The test concept is a statement of the overall purpose of the testing actions. It is a general answer to the question "Why is the test effort being performed?" The test objectives present an overview of the types of subtests and objectives. Cite sufficient detail to allow the reader to clearly understand the extent of testing. Address any special test considerations required, such as known limitations. The DSM in the SEP details those measures for which DTC is responsible relative to the evaluation. If there is no SEP or DT OTA TP, include general test concepts and design based on the scope of work.

An example of the summary is shown below:

The U.S. Army Developmental Test Command (DTC) issued a Test Execution Directive (app C, ref 1) to the U.S. Army Aberdeen Test Center (ATC), Aberdeen Proving Ground (APG), Maryland, to conduct a [insert type of test] on one [insert system name] (SN [insert serial number]) (DTC Project No. [insert]). This [type of test] will examine [insert subtests]. Testing will be conducted from [insert test dates]. A summary of the test objectives is presented in table 1.

TABLE 1. TEST OBJECTIVES

Subtest: 2.1 Initial Inspection
Objective: To ensure that each item is undamaged, complete, and ready for testing.

Subtest: 2.2 Test Item Control and Data Management
Objective: To identify, by test item, the location to which it is to be distributed; distribute all test items to the appropriate test facilities and field locations; and retrieve all independent laboratory test results and field data and consolidate them into a final report.

Subtest: [insert subtest]
Objective: [insert objective]

1.3 UNIQUE TEST PERSONNEL REQUIREMENTS

Describe the use of SOMTE personnel in terms of critical operator and maintainer tasks or the need for Soldiers to conduct the test.

2.0 SUBTESTS

This is the most important section of the test plan. For acquisition-evaluated projects, include specific subtests necessary to provide test data to answer the evaluator's Critical Technical Parameters (CTPs) and to verify compliance with contractual specifications. State what is being measured and how the test will be accomplished; when appropriate, reference applicable elements from the SEP, OTA TP, or DTC test directive. Keep the subtest limited in scope. In general, the criteria are the guide for subtest partitioning. A short subtest discussing only one specific topic is easier to write and read. Longer subtests may be divided into supplements. Include such subtests as safety, performance, vulnerability, ballistic survivability, logistic supportability, reliability, climatic suitability, durability, nuclear and electromagnetic survivability, transportability, and human engineering.

For subtests to be satisfied by contractor/developer testing that are described in a published contractor/developer test plan, and for which the test manager has assigned certain responsibilities to the Test Center (e.g., data analysis, data reduction, or on-site monitoring), ensure that complete information on that subtest is provided. Describe all elements for that subtest from the externally prepared document and delineate the responsibilities of the Test Center.

Include sufficient detail to enable approving authorities to determine if the scope of the specific subtests will accomplish the subtest objective, to determine the justification for conducting the subtests, and to allow a tester other than the plan's author to conduct the test.

2.1 NAME OF SUBTEST

OBJECTIVES

CRITERIA AND DATA ANALYSIS/PROCEDURE

State the criteria verbatim from their sources. State the source and paragraph in parentheses following each criterion. The criteria may be derived from the SEP, requirements documents, test directive, regulations, standards, and/or operating procedures. When a portion of a listed criterion is not to be examined, underline the nonapplicable portion and add the following statement at the bottom of each page on which underlining is used:

NOTE: Underlined portion of criterion will not be addressed.

The criteria should be quantitative where possible, but qualitative criteria are acceptable if the circumstances do not lend themselves to numerical values. The criteria should be stated so they may be considered in terms of met, partially met, or not met. For a subtest with no criteria available, the test director may develop criteria, listing the Test Center as the source. The approval of the test plan will constitute approval of the subtest criteria. If no criteria are required (i.e., the subtest is designed to provide data with no assessment), state "This subtest is conducted to document technical performance."

A complete listing of criteria will be included in appendix A. In those cases when several criteria are listed, appendix A of the plan may be referenced in lieu of listing numerous criteria (e.g., "See items 2 through 10 and 17 in appendix A").

Describe how the data are to be analyzed by the Test Center and how they should be reported. For each criterion, specifically identify in a separate subparagraph how it will be addressed and what condition will lead to a determination that the criterion has been met. Address only that portion of a criterion applicable to the specific subtest. Present the analytical procedure in the same order that it was introduced in the test procedure paragraph. Specify any compression, reduction, averaging, compilation, and/or statistical treatment. Include any assumptions that are appropriate.

For analytical and/or statistical procedures, the procedure should be identified in sufficient detail to allow another individual to carry out the analysis and to allow a reviewer (test manager, evaluator, PM, etc.) to judge the adequacy of the methodology. Sufficient detail does not mean the complete equations for computations such as a single t-test, standard analysis of variance (ANOVA), or simple point estimate and confidence limits. Identification of the statistical test

name, sample size, confidence and/or risk levels (whichever is appropriate), and any distribution assumption is sufficient; an illustrative computation is sketched after the example below. Analytical procedures should provide interpretations of criteria statements where necessary.

An example of the contents for this section is as follows:

2.1 FIELD DURABILITY AND PERFORMANCE TEST (FDPT)

Objective

The objective of this test is to subject the MBS system to use in U.S. Army field environments (including tropical, temperate, and cold climates) to determine the comfort, performance, compatibility, and durability characteristics for multiple missions, terrains, and climates.

Criteria and Data Analysis

Data gathered during this test will be analyzed as presented in the table below.

TABLE. CRITERIA AND DATA ANALYSIS

Criterion (App A Item): 11 - System Performance
Remarks/Analysis: If it is verified that the MBS is constructed of materials that can provide foot protection from environmental and climatic conditions/extremes, the criterion will be met.

Criterion (App A Item): Weight Reduction
Remarks/Analysis: If the MBS achieves reductions in weight, including a weight reduction of 5 percent over current boot(s) and components, the criterion will be met.

Criterion (App A Item): 13 - Comfort, Mobility, and Traction
Remarks/Analysis: If it is verified that the MBS enhances the Soldier's comfort, mobility, and traction, the criterion will be met.

Criterion (App A Item): Support
Remarks/Analysis: If it is verified that the MBS is a lightweight (rugged), comfortable, and supportive boot system designed for prolonged foot movement (road marches) while the Soldier carries heavy loads (normally designated as 120 lbs maximum), the criterion will be met.

TEST PROCEDURES AND DATA REQUIRED

The procedures to be used in the subtest should be described in sufficient detail to allow the reader to understand what will occur. If possible, TOPs, ITOPs, or MIL-STDs should be referenced. Describe in sufficient detail how the subtest will be conducted so that another individual knowledgeable in testing of such materiel could follow the procedure and conduct the test. Describe how the test item will be operated or what it will be exposed to in order to conduct the subtest.

Include a listing of all data elements that will be recorded during the subtest. Specify the accuracies (associated instrumentation tolerances) required for the measurements. The use of standard forms, data sheets, or questionnaires is encouraged. If standard forms, data sheets, or questionnaires are to be used, include them as an appendix and reference that appendix.

An example of the Test Procedures and Data Required section is shown below:

Test Procedures and Data Required

a. General. Testing will be conducted at three separate U.S. Army locations to capture arctic, tropic, and temperate environment use for the MBS candidates. At each location, approximately 60 U.S. Army Soldiers (30 Soldiers for each of the two MBS candidates) will wear the MBS candidates for a 60-day period while conducting pre-planned garrison and field training operations. Additional trials with small groups of test participants will be conducted to focus on areas of donning, doffing, and compatibility of the MBS (TOP).

b. Test Conduct. The test team will conduct an in-briefing, sizing/fitting, new equipment training (NET), and data collection training for all military test participants during the first day of testing at each location. The 60 test participants requested at each site to support the FDPT will be properly fitted with one MBS candidate by the test team.

(1) Demographic/anthropometric data. Demographic and anthropometric data will be collected on each test participant prior to test initiation.

(2) Fitting and issue. Each test participant will be fitted with one MBS candidate. The test participant will don the MBS and be asked to conduct walking and bending movements to assess the acceptability of fit.

c. The following data are required:

A complete listing of all test participants and associated TICN.
Results of all demographic and anthropometric data for each test participant and associated percentile rankings.
Results of all averaged don/doff time trials and narrative comments of any problem areas.
Results of compatibility trials, including photographs and narrative observations.
Durability data, including wear times, missions conducted, and issues of performance.
All test incident reports (TIRs), including damage to the MBS test items, documented through the Army Test Incident Reporting System (ATIRS).
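The criteria and data analysis guidance above asks the planner to identify the statistical test name, sample size, confidence or risk level, and distribution assumptions rather than write out full equations. The sketch below (Python) is a minimal, hypothetical illustration of the kind of computation such a specification implies, using the 5 percent weight reduction criterion from the example table; the data values, sample size, and 90 percent confidence level are assumptions for illustration only and are not prescribed by this pamphlet.

    # Illustrative only: hypothetical data and criterion, not part of the DTP format.
    # Checks a "5 percent weight reduction" style criterion with a one-sided
    # lower confidence bound on the mean reduction (one-sample t procedure).
    from statistics import mean, stdev
    from math import sqrt
    from scipy.stats import t

    reductions = [6.1, 4.8, 5.9, 7.2, 5.4, 6.6, 4.9, 5.8]   # percent, hypothetical
    threshold = 5.0                                          # criterion: at least 5 percent
    confidence = 0.90                                        # confidence level stated in the plan

    n = len(reductions)
    xbar = mean(reductions)
    s = stdev(reductions)
    # One-sided lower confidence bound on the mean reduction.
    lower_bound = xbar - t.ppf(confidence, df=n - 1) * s / sqrt(n)

    print(f"n = {n}, mean = {xbar:.2f}%, lower {confidence:.0%} bound = {lower_bound:.2f}%")
    print("criterion met" if lower_bound >= threshold
          else "criterion not met at this confidence")

In the DTP itself, stating "one-sample t procedure, n = 8, one-sided 90 percent lower confidence bound, normality assumed" would convey the same information at the level of detail the guidance calls for.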

SECTION 3. APPENDIXES

A. TEST CRITERIA

Extract appropriate test criteria verbatim from the SEP, requirements document, contract specification and standards, or other sources. This format will subsequently be used in the test report format with a Remarks column added. List as shown below:

Item    Applicable Source    Test Criteria    Subtest    Remarks

When a portion of a listed criterion is not to be examined, underline the nonapplicable portion and add the following statement at the bottom of each page on which underlining is used:

NOTE: Underlined portion of criterion will not be addressed.

B. TEST SCHEDULE

Provide realistic schedules of the test effort to ensure efficient programming and utilization of resources. Front-load tests that provide input for the safety release. Every effort should be made to minimize the time required to accomplish the test. Prepare an incremental test schedule presenting an estimate of net testing time.

C. INFORMAL COORDINATION

Informal coordination is that coordination effected with a PM or MSC. Include a list of all agencies with which the draft test plan was informally coordinated. Indicate when coordination comments are not incorporated into the plan. List the agency providing the comment, the comment, and the reason for not accommodating the comment. Major disagreements or anomalies, in the judgment of the test plan preparer, will immediately be brought to the attention of the DTC test manager.

D. REFERENCES

This appendix should list all documents mentioned in the plan, in the order in which they were mentioned.

E. ABBREVIATIONS

All acronyms, brevity codes, short titles, and abbreviations used in the plan are listed alphabetically with an explanation of their meaning. Do not list commonly used terms.

F. DISTRIBUTION LIST

The distribution list will list all agencies receiving the plan in accordance with the HQ DTC test directive and internal Test Center requirements. All published and distributed technical reports, including DTC's final test report, must be provided to the Defense Technical Information Center (DTIC) according to DOD/DA requirements.

OTC DTP. OTC also uses a DTP. It is an event-level document used to outline specific, highly detailed instructions, missions, tasks, organization, and procedures on how an operational test or other event will be conducted. The OT DTP serves as a blueprint of detailed information for the test team. This information expands and supplements information contained in the OTA TP to cover all aspects of planning for a test, to include specific details on the organization, logistical requirements, missions, tasks, procedures, and other requirements for a test. The OTC DTP is not an approved document and is not staffed outside OTC. An OTC DTP outline is provided below. For additional information, see OTC's TOPM.

Detailed Test Plan

Item                                                   TAB
Memorandum
Approved System Evaluation Plan                         A
Approved Event Design Plan                              B
Test Control Plans                                      C
Scenario and Events Schedules                           D
Training Plans and Requirements                         E
Security and Protection Plans                           F
Data Management Plans, Procedures, and Documents        G
Safety, HUC, and Environmental Assessments              H
Technology Support Requirements                         I
Administrative Support Requirements                     J
Key Support Documentation and Minutes                   K
Other Topics                                            L

Enclosure F-6
Operational Test Agency Assessment Report (OAR)

OTA Assessment Report (OAR). The OAR is an assessment of progress towards meeting system requirements at times other than milestones and FRP decisions. The OAR often addresses only a subset of the overall evaluation requirements. It may identify needed corrective actions; provide information on tactics, doctrine, organizations, and personnel requirements; project mission impacts based on what is learned; assess readiness for IOT; and evaluate the system's logistic supportability and MANPRINT design. The scope of issues to be addressed by the OAR is flexible in that it may or may not cover all requirements related to performance, suitability, and survivability design. It is not tied to an MDR but is developed as required to execute the continuous evaluation process. It may use data from a single test or a series of tests. It contains a safety confirmation provided by DTC as an appendix, if required. A sample outline is provided below.

OTA Assessment Report

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)
1. Introduction
   a. Purpose
   b. Scope
   c. Data Description
   d. Limitations
2. Summary
3. Analysis
4. Recommendations
Enclosure of safety confirmation, if required.

Enclosure F-7
OTA Milestone Assessment Report (OMAR)/OTA Evaluation Report (OER)

OMAR/OER Format

The OMAR/OER is limited to approximately 30 pages. A recommended format for the OMAR/OER is provided on the following pages. An Executive Summary is not needed. Chapter 2, Conclusions, should contain more than just the conclusions. It should contain how you arrived at the conclusions and the key data that support the conclusions.

OMAR/OER

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)
1. Introduction
   1.1 Purpose of Evaluation
   1.2 Background
   1.3 Scope
2. Conclusions
3. Findings and Analysis
   3.1 Mission Capability
       3.1.1 Methodology for Evaluation of Mission Capability
       3.1.2 First Mission Capability EFA
             Measure 1-1 (Findings)
             Measure 1-2 (Findings)
       3.1.3 Second EFA through 3.1.k Last EFA
       3.1.(k+1) Discussion of Mission Capability
   3.2 Effectiveness
       3.2.1 Methodology for Evaluation of Effectiveness
       3.2.2 First Effectiveness EFA
             Measure 1-1 (Findings)
             Measure 1-2 (Findings)

       3.2.3 Second EFA through 3.2.k Last EFA
       3.2.(k+1) Discussion of Effectiveness
   3.3 Suitability
       3.3.1 Methodology for Evaluation of Suitability
       3.3.2 First Suitability EFA
             Measure 1-1 (Findings)
             Measure 1-2 (Findings)
       3.3.3 Second EFA through 3.3.k Last EFA
       3.3.(k+1) Discussion of Suitability
   3.4 Survivability
       3.4.1 Methodology for Evaluation of Survivability
       3.4.2 First Survivability EFA
             Measure 1-1 (Findings)
             Measure 1-2 (Findings)
       3.4.3 Second EFA through 3.4.k Last EFA
       3.4.(k+1) Discussion of Survivability
4. Recommendations
   4.1 Improve the System
   4.2 Modifications to Test and Evaluation Strategy
Appendix A Critical Operational Issues and Criteria
References
Distribution
Acronyms/Abbreviations/Glossary

OMAR/OER Format

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)

Chapter 1
Introduction

Six pages is the goal for the length of this chapter. The author should extract and update material from Chapter 1 of the approved SEP to complete the paragraphs of this chapter. Provide necessary information to understand the basis of both the test and the evaluation. Most of this chapter is written during the preparation of the strawman report.

1.1 Purpose of Evaluation. State the purpose of the evaluation effort. System evaluation reports provide independent conclusions of mission capability, operational effectiveness, suitability, and survivability as defined in the SEP.

1.2 Background. Provide a short system description and a brief background of the system's acquisition history, including previous testing and current status.

1.3 Scope. Provide a brief narrative of the major events which support the analyses in this report. Describe the events conducted to address all measures in the SEP. These may include literature reviews, observations, tests, and simulations; provide event, date, environmental conditions, and duration. Give the location and dates of the test events and who conducted them. Briefly describe the test execution. Be concise.

Chapter 2
Conclusions

Four pages or less is the goal for the length of this chapter; it does not need to be long. This chapter should provide the bottom line. This chapter does not need to address all EFAs, measures, or COIs, but should address the major findings, as well as the arguments (including supporting data) that led to those conclusions. The focus is to provide the conclusions, the arguments (measures) that drew you to those conclusions, and the key summarized data which support the conclusions. This chapter should discuss how the answers to the key measures contribute to mission capability, operational effectiveness, suitability, and survivability of the system.

Chapter 3
Findings and Analysis

Limit this chapter to 10 pages. Describe analytic results keyed to the EFAs and measures from the SEP. Not all measures need to be addressed in this chapter, just those with key results for the system. Use this chapter to tell the story of your evaluation; the EFAs and measures should lead the reader along the process you used to structure the evaluation, collect the data, and analyze the results, so that they understand your conclusions and what they are based on.

3.1 Mission Capability. State an evaluation statement of "is," "is not," or "is with limitations." Provide supporting evidence (i.e., findings) for the conclusion based on the description of mission capability stated in the SEP. Include appropriate and sufficient system level results, findings, or trends as necessary. As in the SEP, annotate the measures and findings that are associated with COICs or KPPs.

3.1.1 Methodology for Evaluation of Mission Capability.

3.1.2 First Mission Capability Evaluation Focus Area (EFA). Answer the issue in terms of the measures.

Measure 1-1 (Findings)
Measure 1-2 (Findings)

3.1.3 Second EFA through 3.1.k Last EFA.

3.1.(k+1) Discussion of Mission Capability.

3.2 Effectiveness. State an evaluation statement of "is," "is not," or "is with limitations." Provide supporting evidence (i.e., findings) for the conclusion based on the description of effectiveness stated in the SEP. Include appropriate and sufficient system level results, findings, or trends as necessary. As in the SEP, annotate the measures and findings that are associated with COICs or KPPs.

3.2.1 Methodology for Evaluation of Effectiveness.

3.2.2 First Effectiveness EFA. Answer the issue in terms of the measures.

Measure 1-1 (Findings)

3.2.3 Second EFA through 3.2.k Last EFA.

3.2.(k+1) Discussion of Effectiveness.

3.3 Suitability. State an evaluation statement of "is," "is not," or "is with limitations." Provide supporting evidence (i.e., findings) for the conclusion based on the description of suitability stated in the SEP. Include appropriate and sufficient system level results, findings, or trends as necessary. As in the SEP, annotate the measures and findings that are associated with COICs or KPPs.

3.3.1 Methodology for Evaluation of Suitability.

3.3.2 First Suitability EFA. Answer the issue in terms of the measures.

Measure 1-1 (Findings)

3.3.3 Second EFA through 3.3.k Last EFA.

3.3.(k+1) Discussion of Suitability.

3.4 Survivability. State an evaluation statement of "is," "is not," or "is with limitations." Provide supporting evidence (i.e., findings) for the conclusion based on the description of survivability stated in the SEP. Include appropriate and sufficient system level results, findings, or trends as necessary. As in the SEP, annotate the measures and findings that are associated with COICs or KPPs.

3.4.1 Methodology for Evaluation of Survivability.

3.4.2 First Survivability EFA. Answer the issue in terms of the measures.

Measure 1-1 (Findings)

3.4.3 Second EFA through 3.4.k Last EFA.

3.4.(k+1) Discussion of Survivability.

Chapter 4
Recommendations

Limit this chapter to 3 pages. Do NOT make acquisition recommendations, e.g., progression to the next phase of development, production, etc.

4.1 Improve the System. Base this paragraph on shortfalls of the system and the future capability of the system to fulfill the approved requirements. Include the evaluator's recommendation for fixes needed before the next milestone, planned improvements, and potential product improvements. Include the evaluator's suggestions for improvements to system design and to doctrine, tactics, organization, and training. The evaluator should state recommendations in terms of what needs to be fixed rather than how to fix it, which is the mission of the developer.

4.2 Modifications to Test and Evaluation Strategy. Based on the SEP, the AST recommends issues to address in subsequent studies, testing, or evaluations. Plan for follow-on actions to ensure correction of identified limitations. Include a summary of planned product improvements and the test and survey data required to verify the success of fixes.

APPENDIXES

Add appendixes to the SER only in the most exceptional circumstances. Normally, information suitable to be placed in appendixes will be covered in the SAR.

APPENDIX A
Critical Operational Issues and Criteria

A.1 COI 1. Answer the issue in terms of the established criteria, operational or technical, and any additional measures which determined the answer.

A.1.1 Criterion 1-1 (Finding). Reference the measures in Chapter 3 where more detailed analysis can be found, either for the COI or COIC.

REFERENCES (Optional)

ACRONYMS (Optional)

DISTRIBUTION (Optional)

GLOSSARY (Optional)

Enclosure F-8
System Analysis Report

System Analysis Report (SAR) Format. The SAR is a detailed analysis supporting the OMAR or OER. The SAR documents the analyses that were conducted but not presented within the OMAR or OER. It does not repeat the information that is already provided in the OMAR or OER.

a. SAR Format. The SAR must discuss each analysis of events conducted. A recommended format for the SAR is given in the following pages. Authors may deviate from this format to achieve clarity, if necessary.

SYSTEM ANALYSIS REPORT

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)
1. Introduction
   1.1 Purpose of Evaluation
   1.2 Background
   1.3 Scope
   1.4 Test, Simulation and Evaluation Limitations
2. Findings and Analysis
   2.1 Mission Capability
       First EFA
           First Measure for EFA
               a. Specific Methodology Used
               b. Data Utilized
               c. Analysis and Discussion of Results
           ...through Last Measure to Satisfy EFA
       Second Mission Capability EFA
       2.1.b Last Mission Capability EFA
   2.2 Effectiveness
       First EFA
           First Measure for EFA 1

               a. Specific Methodology Used
               b. Data Utilized
               c. Analysis and Discussion of Results
           ...through Last Measure to Satisfy EFA
       Second Effectiveness EFA
       2.2.b Last Effectiveness EFA
   2.3 Suitability
       First EFA
           First Measure for EFA
               a. Specific Methodology Used
               b. Data Utilized
               c. Analysis and Discussion of Results
           ...through Last Measure to Satisfy EFA
       Second Suitability EFA
       2.3.b Last Suitability EFA
   2.4 Survivability
       First EFA
           First Measure for EFA
               a. Specific Methodology Used
               b. Data Utilized
               c. Analysis and Discussion of Results
           ...through Last Measure to Satisfy EFA
       Second Survivability EFA
       2.4.b Last Survivability EFA
Appendix A Force Level Efforts
Appendix B Manpower, Personnel, Human Factors Engineering, and MANPRINT Integration
Appendix C Safety and Health Hazards
Appendix D Transportability/Deployability
References
Distribution List
Acronyms/Abbreviations/Glossary (Optional)

SYSTEM ANALYSIS REPORT

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)

Foreword

The authors will prepare a brief foreword. For historical purposes, the foreword will list all individuals (to include the names, office symbols, and organizations) having input to or knowledge of the writing of the report, to include all members of the AST. At a minimum, the evaluator, test officer, and their chief analysts will be included in the foreword.

Chapter 1
Introduction

Most of the information in this chapter is updated from the SEP. Most of this chapter can be written during the preparation of strawman reports.

1.1 Purpose of Evaluation. State the purpose of the evaluation effort. System evaluation reports provide independent conclusions of mission capability, operational effectiveness, suitability, and survivability as defined in the SEP.

1.2 Background. Provide a short system description and a brief background of the system's acquisition history, including previous testing and current status.

1.3 Scope. Provide a brief narrative of the major events which support the analyses in the OMAR or OER. Describe the events conducted to address all measures in the SEP. These may include literature reviews, observations, tests, and simulations; provide event, date, environmental conditions, and duration. Give the location and dates of the test events and who conducted them. Briefly describe the test execution. Be concise.

1.4 Test, Simulation and Evaluation Limitations. The tester-provided TRs are a partial source of information for this paragraph. List limitations on the planning and conduct of T&E. Refer to test limitations in the TR. Show the impact of the test limitations on the evaluation. State known limitations of the evaluation and those data sources (to include OT) which support the evaluation. State if there were no limitations. (Note: System limitations are not test limitations.) Based upon the impact of the reported test limitations, the AST will ascertain the adequacy of the primary data sources to be considered in answering the operational issues and preparation of the SAR.

Chapter 2

Findings and Analysis

This chapter contains test results, analyses of test data, and evaluative findings. It thoroughly describes the test data and analytic results keyed to the EFAs and measures from the SEP, leading the reader towards the conclusions. Use this chapter to tell the story of your evaluation: how you structured it, what data you collected, and what the data told you. The level 3 system-level database (including results from all applicable events) is the basis for developing this chapter. As in the SEP and OMAR or OER, highlight EFAs or measures which reflect KPPs or COICs.

2.1 Mission Capability

First EFA. State each EFA in its own major paragraph. State the EFA as written in the SEP.

First Measure for EFA 1. State each measure in turn under the EFA as the next logical subparagraph. State the measure as written in the SEP, including the numbering (e.g., Measure 1-4). Answer each measure as shown below.

a. Specific Methodology Used. Describe how data pertaining to this measure were generated, collected, and reduced. Explain significant deviations from the test methodology described in the SEP or the tester's portion of the OTA TP and the reasons for not addressing any measure listed in the DSM for the events addressed in the SAR.

b. Data Utilized (Level 4 Summary Statistics). Complete level 4 data reduction and analyze and discuss data results. Use tabular, graphic, or other appropriate formats to show the descriptive statistics information pertaining to the data collected for the measure. Show the source of the data (e.g., DT, OT, M&S, FDTE, contractor testing, study, analysis). If the measure utilizes multiple data sources, use a separate subparagraph for each data source. Provide data from the OT first. List data from other data sources in order of importance after the OT data.

c. Analysis and Discussion of Results. Use the methodology shown in the SEP to calculate the results of the measure. State how the system performed the measure. Provide any information that will increase understanding of the summary statistics data. For example, discuss how conditions existing during the test may have impacted the collection or validity of the data and the impact of the conditions on the results. State confidence levels for numerical results (an illustrative computation is sketched at the end of this enclosure). Use levels 5, 6, and 7 data to provide analyses. After calculating the results, proceed to the next measure.

...through Last Measure to Satisfy EFA.

Second Mission Capability EFA.

2.1.b Last Mission Capability EFA.

2.2 Effectiveness. (Continue the pattern established for mission capability.)

2.3 Suitability. (Continue the pattern established for mission capability.)

2.4 Survivability. (Continue the pattern established for mission capability.)

Appendixes

Each additional appendix shall be referred to in the text. If the SAR contains more than one appendix, each will be identified with a capital letter, in the order in which it is mentioned in the plan. A single appendix should be labeled "Appendix." Similar items may be combined to avoid creating unnecessary appendixes.

Appendix A. Force Level Effects.
Appendix B. Manpower, Personnel, Human Factors Engineering, and MANPRINT Integration.
Appendix C. Safety and Health Hazards.
Appendix D. Transportability/Deployability.

References

List all references required to understand the SAR. Always include the ICD, CDD, functional description, System MANPRINT Management Plan, OTP, TEMP, and SEP.

Distribution List

This list provides a permanent record of initial distribution and, in the case of classified reports, can be extremely useful because it can later be used for communicating instructions regarding handling and downgrading.

Acronyms/Abbreviations (Optional)

Glossary (Optional)
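The Data Utilized and Analysis and Discussion of Results subparagraphs above call for level 4 summary statistics by data source, with OT data listed first, and for stating confidence levels for numerical results. The following is a minimal sketch (Python) of that kind of tabulation for a single hypothetical measure; the observations, sources, and 90 percent confidence level are illustrative assumptions and not an ATEC-prescribed computation.

    # Illustrative only: hypothetical observations of one SEP measure, grouped by
    # data source. Summarizes level 4 descriptive statistics (OT reported first)
    # and attaches a two-sided confidence interval to each source's mean.
    from statistics import mean, stdev
    from math import sqrt
    from scipy.stats import t

    data = {
        "OT": [12.4, 11.8, 13.1, 12.9, 12.2, 11.9],   # hypothetical operational test data
        "DT": [12.0, 12.6, 12.8, 11.7, 12.3],         # hypothetical developmental test data
    }
    confidence = 0.90   # confidence level assumed to be stated in the SEP methodology

    for source in ("OT", "DT"):                        # OT data are listed first
        obs = data[source]
        n, xbar, s = len(obs), mean(obs), stdev(obs)
        half_width = t.ppf(0.5 + confidence / 2, df=n - 1) * s / sqrt(n)
        print(f"{source}: n={n}, mean={xbar:.2f}, sd={s:.2f}, "
              f"{confidence:.0%} CI=({xbar - half_width:.2f}, {xbar + half_width:.2f})")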

Enclosure F-9
OTA Follow-on Evaluation Report

OTA Follow-on Evaluation Report (OFER). This report is for an OT that may be necessary during or after production to refine the estimates made during IOT, provide data to evaluate changes, and verify that deficiencies in materiel, training, or concepts have been corrected. It may also provide data to ensure that the system continues to meet operational needs and that it retains its effectiveness in a new environment or against a new threat. A sample format is provided below.

OTA Follow-on Evaluation Report

Cover Page
Signature Page
Table of Contents
List of Figures (if needed)
List of Tables (if needed)
1. Introduction
   a. Purpose
   b. Scope
   c. Data Description
   d. Limitations
2. Summary
3. Analysis
4. Recommendations
Enclosure of Safety Confirmation, if required.

Enclosure F-10
Technical Note

Technical Notes Format. Technical Notes are prepared to supplement ATEC reporting documents and may be used to support the SAR, briefings, etc. They preserve lessons learned for future testers and evaluators. A recommended format for a Technical Note relating to any ATEC test, evaluation, or assessment is shown in the following pages, along with the recommended content of each paragraph. Authors may deviate from this format where necessary to achieve clarity or conciseness.

TECHNICAL NOTE FORMAT

Cover Page
Signature Page
Notices
Abstract
Foreword
1. Introduction
   1.1 Purpose of the Technical Note
   1.2 Scope
   1.3 Related T&E Events
   1.4 System Description
2. Detailed Findings and Analysis
   2.1 Title of First EFA Addressed
       Title of Measure
           Discussion of Methodology Used
           Discussion of Data Used
           Analysis and Discussion of Results
   2.1(n+1) Overall Issue Findings and Discussion
3. Conclusions
   3.1 Summarized Evaluation Approach
   3.2 Results
   3.3 Supporting Data
4. Recommendations
Appendix A General
Appendix B Supplemental Analysis
Appendix C Detailed Observations and Notes
References
Glossary
Index
Distribution

TECHNICAL NOTE FORMAT

Cover Page

Signature Page

Notices

This page is optional. Further notes on limitation of distribution and any disclaimers or warning notices may be included on the cover page. If they are too voluminous to fit on the cover page, they may be included on a Notices page.

Table of Contents

Abstract

The abstract is an optional component of the front matter of the Technical Note. The abstract presents a concise statement (approximately 200 words) of the purpose, scope, and major findings of the SER. The abstract shall be understandable independent of the rest of the SER. It shall contain no undefined symbols or acronyms, and shall make no reference by number to references, figures, or tables in the body of the SER.

Foreword

The author(s) will prepare a brief foreword. For historical purposes, the foreword will list all individuals (to include the names, office symbols, and organizations) having input to or knowledge of the writing of the report, to include all members of the AST. At a minimum, the evaluator, test officer, and their chief analysts will be included in the foreword.

List of Figures (if needed)

List of Tables (if needed)

Chapter 1
Introduction

As necessary, extract any required information from the SAR for this chapter and tailor it to the Technical Note.

1.1 Purpose of the Technical Note. State the purposes or objectives for writing the Technical Note. State the "why" of the report. Show how the report supplements the SAR or advances the general body of knowledge and state of the art for T&E practices and methodologies.

1.2 Scope of the Technical Note. State the "what" of the report. Include information on data sources used to prepare the report. Provide an overview of relevant models, analyses, tests, equipment, and any other resources that together are the basis for analysis. Address the depth of analyses in the report.

1.3 Related T&E Events. Briefly describe the T&E events to which the Technical Note is related, showing which portion of the effort is germane to the report.

1.4 System Description. Summarize the pertinent system description in sufficient detail to identify the system and to understand the report. Describe the system to include its major roles, missions, and components or characteristics. Describe differences between the system under test and the objective system under development. Include line drawings and flow charts as needed to visualize the system.

Chapter 2
Detailed Findings and Analysis

This chapter contains the "what" and "how" of the report and establishes results, analyses of data, and analytic findings. It thoroughly describes the test data and analytic results keyed to the EFAs and measures from the SAR which are pertinent to the subjects and material covered by the Technical Note. Ignore other issues and measures. The level 3 database described in appendix A of the Technical Note is the basis for developing this chapter.

2.1 Title of the First EFA Addressed. State each EFA pertinent to this report in turn in its own major paragraph (e.g., 2.1). State the issue as written in the SEP/SAR.

Title of the Measure. State each measure pertinent to this report in turn as the next logical subparagraph (e.g., 2.1.1). State the measure as written in the SEP/SAR.

Discussion of Methodology Used. Describe how data pertaining to this measure and to the subject of this report were generated, collected, and reduced. Complete level 4 data reduction and analyze and discuss data results. Use tabular, graphic, or other appropriate formats to show descriptive statistics information pertaining to the data collected for the measure.

Discussion of Data Used. Show the source of the data. If the measure requires multiple data sources, use a separate subparagraph for each data source. Provide data from the OT first. List data from other data sources in order of importance after the OT data.

Analysis and Discussion of Results. Show the effect of the measure on the topic of the report. Provide any information that will increase understanding of the summary statistics data. Discuss how conditions existing during the test may have impacted the collection or validity of the data and the impact of the conditions on the results. State confidence levels for numerical results. Use levels 5 and 6 data to provide analyses. After calculating the results, proceed to the next measure as subparagraph 2.1.2, etc.

2.1.(n+1) Overall Issue Findings and Discussion. Discuss how the issue relates to the topic of the report based on answers to the measures. Add any discussion, statistical testing, or analysis pertaining to the operational issue not previously addressed. Proceed to the next operational issue as major paragraph 2.2, etc.

Chapter 3
Conclusions

This chapter contains the bottom line of the report. Conclusions will be based on the findings in Chapter 2 and on the judgment of the author. Suggested major paragraphs are described below.

3.1 Summarize the overall approach to the evaluation and its conclusions. Show the methodologies by which the answers to the issue questions were further analyzed and consolidated to draw conclusions on the overall topic of the report. Present major results and evaluation that address the issues, and provide a narrative as to how the issue answers contribute to the conclusions.

3.2 Provide the results from all data sources and the overall assessments of the issue based on the results. Include combinations of issue conclusions considering mission, need, current and projected threat, trade-offs, planned improvements, and potential product improvements.

3.3 Address the adequacy of supporting data. Also include key technical evaluation findings that supplement or support the operational findings or which present significant information.

Chapter 4
Recommendations

Base this chapter on shortfalls of the system or the analysis. This paragraph includes the evaluator's recommendations for improvements to systems or methods. Include suggestions for improvements to system design, doctrine, tactics, organization, and training. Based on the adequacy of this OT&E event and system performance, the evaluator recommends issues to address in subsequent studies, testing, or evaluations. Plan for follow-on actions to ensure correction of identified limitations.

References

This portion of the back matter lists the mathematical and statistical sources used to develop procedures. The list of references begins on a new page. List all references required to understand the Technical Note.

Appendixes

Appendixes contain information that supplements, clarifies, or supports the text. They contain material that might otherwise interfere with an orderly presentation of ideas in the text, because placing detailed explanations, supporting data, or long mathematical analyses in appendixes shortens the text and makes it easier to read. However, information essential to the purpose of the text shall appear in the text. The appendixes to the Technical Note provide copies of supporting documents, provide information on the supporting databases, and expand report paragraphs, as needed. Appendixes provide information to the reader yet avoid interrupting the logical flow of information in the main body of the document due to the incorporation of too much detail. For this reason, the appendixes are optional.

Appendix A. Description of Database. This appendix contains the detailed description of the structure and contents of the database used to develop the report. Include automated, hard copy, and audiovisual data used in the report.

Appendix B. Supplemental Analyses. Expand on analyses included in the body of the report as required.

Appendix C. Detailed Observations and Notes. Expand as necessary to cover notes and other observations by participants or observers.

Glossary

The glossary is an optional component of the back matter of the plan. If the glossary is used, it will begin on a new page.

Index

The index is an optional component of the Technical Note back matter. Include an index if the Technical Note is exceptionally complex or voluminous.

Distribution

The distribution list is the last piece of back matter. This list provides a permanent record of initial distribution and, in the case of classified reports, can be extremely useful because it can later be used for communicating instructions regarding handling and downgrading.

Appendix G
Reliability, Availability, and Maintainability

G-1. Purpose
This guidance provides policies, responsibilities, and general procedures for Reliability, Availability, and Maintainability (RAM) data collection, analysis, and evaluation. It establishes requirements and methodology for a minimum level of RAM data collection as the standard for ATEC while maintaining the capability for comprehensive RAM data collection when required and approved by appropriate authority.

G-2. Policy
This policy will be incorporated in the next update of ATEC Regulation 73-1. This appendix will be revised at the same time.

a. RAM data and testing.

(1) The primary sources for RAM data will be

(a) Robust government developmental testing (DT).

(b) Program manager conducted or sponsored contractor testing.

(c) Operational testing.

(d) Analytical studies, models and simulations, or other sources.

(2) Government DT will be designed to collect comprehensive RAM data under conditions that duplicate operational field conditions as much as possible.

(3) RAM data collection during operational tests should focus on identifying what fails, under what general conditions, and what it takes to get the system operating again. Precision collection of detailed RAM data to compute numerous measures under a carefully controlled operational mode summary/mission profile (OMS/MP), in order to determine if the system meets numerical criteria, should be done during operational testing only as an exception. Test incident reports (TIRs) will be developed based upon the incident data recorded, classified as appropriate (e.g., information, failure, non-failure, or no test), and entered into the Army Test Incident Reporting System (ATIRS).

(4) To minimize the cost and the distraction from operational realism caused by having large numbers of RAM data collectors in the box during operational tests, system operators or crewmembers of the system under test (SUT), or other data collection personnel as an additional duty, will collect RAM data unless the complexity of maintenance actions is inconsistent with system operator collection. The test organization will ensure adequate training of the designated personnel on specific requirements for data collection.

(5) ASTs identifying RAM data requirements that cannot be provided by developmental testing or the standard RAM collection described above may request an exemption from this policy. Exemptions may be requested at the CIPR or at SEP approval.

b. Reliability growth planning curves.

(1) Background on recent reliability growth planning curves. The Defense Science Board (DSB) Task Force recognized a significant increase in the number of DOD weapon system programs evaluated as not being operationally suitable. The main reason is the lack of materiel readiness due to poor system reliability, availability, and maintainability. The DSB report attributed this to the discontinuance of reliability growth management practices under acquisition reform. In response, it is DOD policy that programs be formulated to execute a viable RAM strategy that includes a reliability growth program. Thus, ATEC is placing greater emphasis on reliability growth planning and tracking.

(2) Reliability growth planning curves will be developed for all programs to which the USD(ATL) and ASA(ALT) policies apply, in addition to other programs as determined by the Director, AEC. The ASA(ALT) policy applies to all pre-Milestone B programs that possess a Joint Potential Designator of JROC Interest. Note that not all systems will need a reliability growth program, such as commercial off-the-shelf (COTS) items and Government furnished equipment (GFE), where the Government accepts the reliability and performance characteristics as is. Also note that the ATEC System Team maintains the right to make recommendations to the Director, U.S. Army Evaluation Center (AEC), regarding the need for a reliability growth program.

(3) The ATEC workforce will henceforth discontinue use of the Military Handbook 189 planning model for developing reliability growth programs and the planning curves thereof.

(4) AMSAA's PM2 model will be utilized for all aspects of reliability growth planning going forward. This includes (but is not limited to) developing reliability growth programs, constructing reliability growth planning curves, and evaluating the effectiveness of proposed reliability growth plans.

(5) The PM2 model possesses two variants: 1) for discrete-use systems and 2) for continuous-use systems. These two versions of PM2 are sufficiently robust for constructing reliability growth plans for systems in both usage domains.

(6) PM2 reliability growth planning curves and relevant documentation will be incorporated into, and approved as part of, the Test and Evaluation Master Plan (TEMP). They will also be incorporated into ATEC's System Evaluation Plans.

c. Reliability test thresholds for the Engineering and Manufacturing Development and Demonstration (EMDD) phase.

(1) This policy requires an EMDD reliability test threshold to be established for all programs with a Joint Potential Designator of Joint Requirements Oversight Council (JROC) interest. The essence of this guidance is the creation of reliability, availability, and

maintainability entrance criteria for operational testing. Achievement of this threshold is a major focus during Design Readiness Reviews.

(2) In coordination with the T&E WIPT RAM Subgroup, develop a reliability threshold before MS B. This is required to ensure that the reliability threshold is incorporated into solicitations for the EMDD effort. The reliability threshold should be established early enough to be incorporated in the EMDD contract.

(3) When a reliability threshold is not established, the default value for the threshold will be 70% of the reliability threshold requirement(s) specified in the CDD.

(4) The threshold must be demonstrated with at least 50% statistical confidence (an illustrative computation is sketched at the end of this appendix). The threshold will be approved as part of the MS B Test and Evaluation Master Plan (TEMP).

(5) Application of the threshold to the reliability metrics that impact the Materiel Availability key performance parameter (KPP) or the Ownership Cost key system attribute (KSA) will be included in the EMDD contract.

(6) ATEC will include test and evaluation planning in the TEMP for evaluation of the reliability threshold. The program is expected to meet or exceed the reliability threshold value at the end of the first full-up, integrated, system-level DT event in EMDD.

(7) The T&E WIPT's RAM Subgroup will define what constitutes this DT event. Earlier DT or OT events may also be needed in order to surface failure modes that are difficult to identify with reliability modeling and simulation. In advance of the threshold DT event, ATEC's System Team (AST), with support from the U.S. Army Materiel Systems Analysis Activity (AMSAA), will review the materiel developer's Reliability Case documentation and test data to determine if the system is on a path to achieving the reliability threshold and will report findings to the system's T&E WIPT RAM Subgroup.

G-3. Responsibilities

a. The AST is responsible for

(1) Reviewing RAM requirements for the specified acquisition system and determining the overall requirements for RAM information, to include development of plans for obtaining data from all available data sources.

(2) Developing and submitting justifications to the ESR/T&E CIPR Chair for exemption from the standard level of RAM collection in specific OT events as required.

b. Operational Test Command (OTC) is responsible for

(1) Collecting and processing RAM data in all OTs, providing the data in an authenticated event database to designated users, and submitting or entering TIRs into ATIRS.

(2) Providing the OTC position to the ESR/T&E CIPR Chair on requests for exemptions from the standard level of RAM data collection for specific OTs.

(3) Developing and implementing detailed guidance and procedures for the standard level of RAM data collection, processing, and distribution.

c. DTC is responsible for

(1) Providing necessary guidance and support to the AST chair and AST members during the process of assessment of the level of RAM collection required for system DTs.

(2) Identifying actions, activities, or events that could be conducted with reduced resources to serve as data sources to mitigate the reduced level of RAM data collection in OT.

(3) Providing the DTC position to the ESR/T&E CIPR Chair on requests for exemptions from the standard level of RAM data collection for specific DTs.

d. AEC is responsible for

(1) Providing necessary guidance and support to the AST chair and AST members during the process of assessment of the level of RAM collection required for system evaluations.

(2) Identifying actions, activities, or events that could be conducted with reduced resources to serve as data sources to mitigate the reduced level of RAM data collection in OT.

(3) Providing the AEC position to the ESR/T&E CIPR Chair on requests for exemptions from the standard level of RAM data collection.

e. Chairs of T&E CIPRs are responsible for approving requests for RAM data collection beyond the standard level in OT.

G-4. Procedures

a. The AST, in coordination with the T&E WIPT, will review RAM requirements developed by the Capabilities Developer and specified in the system requirements or capabilities documents with the goal of identifying appropriate data sources for obtaining the required data.

b. The approved allocation of RAM requirements to data sources will be fully documented in the data source matrix of the SEP.

c. Data collected for the standard level of RAM collection will consist of the following (an illustrative roll-up of these data elements is sketched at the end of this appendix):

(1) Equipment daily status will consist of identification of the system under test (SUT), date, and start and stop time of each equipment status change (operational, non-operational) for the date.

(2) Incident data will consist of SUT identification, mission description or number, date, incident start and end time, and a detailed description.

(3) Maintenance data will consist of SUT identification, date, identification of the start and stop time of each active maintenance repair time period, and a description of the actions performed.

(4) This policy does not preclude having a limited number of RAM expert data collectors debrief system operators at the end of trials or during appropriate breaks in the operational test activity.

d. The AST should determine as soon as possible if an exemption from the standard level of RAM data collection in OT is needed. Justification for the exemption will be developed and briefed during the T&E CIPR for a decision. The AST members will coordinate and provide the positions of their respective SCAs on the exemption to the T&E CIPR Chair. Specific justification will be presented during the T&E CIPR to support the request. The justification provided will describe in detail the level of RAM data collection proposed, the rationale for the increased level of collection, the estimated cost for the collection, and the need for a scoring conference for the collected data.

e. Criteria for consideration of requirements for exemption from the standard level of RAM data collection will be based upon the following:

(1) Specific key performance parameters, critical operational issues, or other requirements data that require comprehensive (or modified comprehensive) RAM data collection to be fully addressed in the evaluation.

(2) High risk and/or new technology employment in the system that should use comprehensive RAM data collection as a necessary risk reduction action.

(3) A system with critical combat capabilities for which realistic RAM information cannot be adequately collected outside of OT.

(4) A history of RAM performance in previous data sources that is less than adequate and requires detailed RAM data collection in OT.

G-5. OT RAM Guidance

a. Additional OT RAM information is provided in OTC's Test Operating Procedure and Methodology (TOPM), located at

b. Other OT RAM references:

(1) U.S. Department of Defense (2008), Defense Science Board Task Force Report on Developmental Test and Evaluation, Washington, DC: Government Printing Office.

(2) U.S. Department of Defense (2005), DOD Guide for Achieving Reliability, Availability, and Maintainability, Washington, DC: Government Printing Office.

(3) U.S. Department of Defense (1981), Reliability Growth Management, Military Handbook 189, Washington, DC: Government Printing Office.

(4) Memorandum, Headquarters, Department of Defense (DOD), Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(ATL)), 21 Jul 08, subject: Reliability, Availability, and Maintainability Policy.

(5) Memorandum, Headquarters, Department of the Army (HQDA), Assistant Secretary of the Army for Acquisition, Logistics, and Technology (ASA(ALT)), 6 Dec 07, subject: Reliability of U.S. Army Materiel Systems.

(6) Ellner, P. M., and Hall, J. B. (2006), Planning Model Based on Projection Methodology (PM2), Technical Report, U.S. Army Materiel Systems Analysis Activity (AMSAA).

(7) Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01G, Joint Capabilities Integration and Development System, 1 Mar 09.

(8) Department of Defense Instruction (DODI) 5000.02, Operation of the Defense Acquisition System, 8 Dec 08.
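As a companion to the policy in paragraph G-2c, the sketch below (Python) illustrates one common way a reliability entrance threshold could be checked at the stated 50 percent confidence level, assuming an exponential time-between-failure model and a time-truncated DT event. The CDD requirement, test hours, and failure count are hypothetical; this is not the PM2 planning model and not an ATEC-prescribed computation.

    # Illustrative only: hypothetical test data, assuming an exponential
    # time-between-failure model and a time-truncated DT event.
    # Checks whether a demonstrated MTBF meets a reliability entrance threshold
    # (for example, 70 percent of the CDD reliability requirement) with at least
    # 50 percent statistical confidence, per paragraph G-2c.
    from scipy.stats import chi2

    cdd_mtbf_requirement = 500.0                          # hypothetical CDD threshold, hours
    entrance_threshold = 0.70 * cdd_mtbf_requirement      # default 70 percent value
    confidence = 0.50                                     # minimum demonstration confidence

    test_hours = 4200.0       # hypothetical total operating hours in the DT event
    failures = 10             # hypothetical relevant failures observed

    point_estimate = test_hours / failures
    # One-sided lower confidence bound on MTBF for a time-truncated exponential test.
    lower_bound = 2.0 * test_hours / chi2.ppf(confidence, 2 * failures + 2)

    print(f"point estimate MTBF = {point_estimate:.0f} h, "
          f"{confidence:.0%} lower bound = {lower_bound:.0f} h, "
          f"entrance threshold = {entrance_threshold:.0f} h")
    print("threshold demonstrated" if lower_bound >= entrance_threshold
          else "threshold not demonstrated at this confidence")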
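The standard-level RAM data elements in paragraph G-4c (equipment daily status, incident data, and maintenance data) lend themselves to a simple roll-up into mean time between failures, mean time to repair, and operational availability. The sketch below shows one hypothetical way such records might be structured and summarized; the field names, values, and scoring assumption (one scored failure per maintenance action) are illustrative only.

    # Illustrative only: a minimal sketch of rolling up standard-level RAM records
    # (paragraph G-4c) into summary measures. Field names and values are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class StatusPeriod:           # equipment daily status entry
        sut_id: str
        operational: bool
        hours: float              # duration of the status period

    @dataclass
    class MaintenanceAction:      # maintenance data entry
        sut_id: str
        repair_hours: float       # active maintenance repair time
        description: str

    status_log = [
        StatusPeriod("SUT-01", True, 22.0),
        StatusPeriod("SUT-01", False, 2.0),
        StatusPeriod("SUT-01", True, 20.5),
        StatusPeriod("SUT-01", False, 3.5),
    ]
    maintenance_log = [
        MaintenanceAction("SUT-01", 2.0, "replaced faulty cable"),
        MaintenanceAction("SUT-01", 3.5, "repaired mount"),
    ]

    uptime = sum(p.hours for p in status_log if p.operational)
    downtime = sum(p.hours for p in status_log if not p.operational)
    failures = len(maintenance_log)   # assumes each action corresponds to one scored failure

    mtbf = uptime / failures if failures else float("inf")
    mttr = sum(m.repair_hours for m in maintenance_log) / failures if failures else 0.0
    ao = uptime / (uptime + downtime)   # operational availability over the recorded period

    print(f"MTBF = {mtbf:.1f} h, MTTR = {mttr:.1f} h, Ao = {ao:.2%}")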

Appendix H
Survivability

H-1. Purpose
ATEC is responsible for the evaluation of the survivability of Army weapon systems, automated information systems, and other materiel throughout their life cycles in order to provide a meaningful basis for decision making. Although the exact procedures used to assess the survivability of any system may vary, the general approach is similar. The survivability approach must address the system's capabilities to avoid/evade, as well as withstand, the effects of the threat. Survivability analysis and evaluation is a process that must start early and continue throughout the system life cycle.

H-2. Policy
The system evaluator must coordinate closely with the Capabilities Developer, the Materiel Developer, and the Intelligence Communities to ensure consistency among the System Threat Assessment Report (STAR), the capabilities document, and the TEMP. With the baseline established for these system threat requirements, the evaluator can define the survivability evaluation measures in the SEP. The evaluation plans must address all of the survivability elements consistent with the system's effectiveness, suitability, and survivability requirements and issues. The analysis and test efforts focus on providing the necessary system survivability characterization data and information to support the evaluation. Survivability aspects of materiel developed for the individual Soldier must be integrated with the function(s) to be performed by the Soldier. The evaluator must guide and support the analysis and test planning, as well as the data collection efforts, to ensure that these efforts are properly focused and will provide the appropriate data to support the system evaluation.

H-3. Definition and Requirements
Survivability is the capability of a system (materiel and crew) to avoid or withstand a man-made hostile environment without suffering an abortive impairment of its ability to accomplish its designated mission. The Defense Acquisition Guidebook stipulates that, unless waived by the appropriate authority, mission-critical systems, regardless of their Acquisition Category (ACAT), shall be survivable to the threat levels anticipated in their operating environment. System (to include the crew) survivability from all threats found in the various levels of conflict shall be considered and fully assessed as early as possible in the program, usually during the Engineering and Manufacturing Development and Demonstration (EMDD) phase, formerly known as the System Development and Demonstration (SDD) phase. Survivability against the full spectrum of battlefield threats must be considered in all system acquisition programs, including new key developments, Non-Developmental Item (NDI) acquisitions, and any system upgrades that can impact the system's ability to withstand the specified threats.

a. Survivability requirements are incorporated in the planning and execution of all aspects of a system's acquisition life cycle, beginning with the earliest phases. Capabilities Developers coordinate the formulation and staffing of survivability requirements during the formulation of

the requirements documents to ensure that the requirements are reasonable and appropriate. The threat statements and operational environments specified in the Initial Capabilities Document (ICD) guide the preliminary survivability planning. The Capability Development Document (CDD) and the Capability Production Document (CPD) identify the survivability KPPs, thresholds, and objectives; define the Soldier survivability and system survivability requirements; and identify in general terms the threats to the system.

b. The system's survivability requirements are based on the current and projected threats delineated in the STAR and capabilities documents. The Capabilities Developer develops and maintains the STAR through MS B; thereafter, it becomes the responsibility of the Materiel Developer. The DA Deputy Chief of Staff, G-2 (Intelligence) approves the STAR and any revisions. Survivability requirements may include the following:
Electromagnetic Environmental Effects (E3)
Electronic Warfare (EW)
Information Assurance (IA)
Obscurants and Atmospherics
Chemical, Biological, and Radiological-Nuclear (CBR) Environments
Soldier Survivability (SSv)
Ballistic Survivability/Lethality (LFT&E)

H-4. Survivability Analyst Responsibilities
The Survivability Analyst must:
a. Serve as an ATEC System Team (AST) member.
b. Define Survivability Evaluation Focus Areas (EFAs) and measures.
c. Develop the Survivability input to the SEP and OMAR/OER.
d. Support the T&E WIPT.
e. Establish and coordinate the evaluation requirements with the combat and materiel developers and the threat community.
f. Ensure consistency among the System Threat Assessment Report (STAR), requirements documents, and TEMP regarding the Survivability threats and requirements, and the tests and analyses that must be conducted to provide input to the evaluation.
g. Guide and support Survivability analysis and test planning, as well as data collection efforts.
h. Conduct the Survivability evaluation.

i. Support milestone evaluation requirements.

H-5. Soldier Survivability
AR 602-2 established Soldier Survivability (SSv) as the seventh domain of MANPRINT as of 1 October. The other six domains are Manpower, Personnel Capabilities, Training, Human Factors Engineering, Safety, and Health Hazard Assessment. SSv is unique and is composed of six additional components: I, Reduction of Fratricide; II, Reduction of Detectability; III, Prevention of Attack; IV, Minimization of Damage; V, Minimization of Medical Injury; and VI, Reduction of Physical and Mental Fatigue. ARL's Survivability/Lethality Analysis Directorate (SLAD) is designated as the Army lead for performing the SSv assessment. As a special exception, SSv testing and analysis related to the operation of improvised explosive device (IED) defeat systems is conducted under the auspices of the Joint IED Defeat Organization (JIEDDO). SLAD is supported by ARL's Human Research and Engineering Directorate (HRED) and by the U.S. Army Medical Research and Materiel Command for assessment of ACAT I, ACAT II, and a number of the larger ACAT III programs, whereas ARL HRED (primarily their field elements) performs SSv assessments on the remaining ACAT III programs, projects, and products.

a. Issues for each MANPRINT domain are generated throughout a program's life cycle, from concept to the production phase, and again for any major modification. Each program develops a System MANPRINT Management Plan (SMMP), which contains all domain issues. Since the SMMP is not continuously updated, it may not show the latest issues or reflect which issues have been resolved and how they were resolved.

b. An SSv assessment is a roll-up of the outstanding SSv issues and is used as input in formulating an ODCSPER position in preparation for the milestone decision review. The Survivability Analyst utilizes the SSv assessment information to develop MOEs, MOPs, and data elements in the Survivability plans and reports.

c. The ATEC evaluator will obtain the SMMP from the maintainer (PM, TRADOC, and others), confirm the implications and status of these SSv issues through the ARL SLAD System Lead or the ARL/HRED Product POC, and utilize these issues in building the EFAs and measures in the Survivability portion of the SEP.

H-6. Electromagnetic Environmental Effects (E3)
E3 refers to the impact of the electromagnetic environment on the operational capability of military forces, equipment, systems, and platforms. E3 also refers to a system's electromagnetic effects environment and its impact on the system itself as well as on systems nearby. E3 encompasses electromagnetic compatibility (EMC); electromagnetic interference (EMI); electromagnetic vulnerability (EMV); electromagnetic pulse (EMP); hazards of electromagnetic radiation to personnel (HERP), ordnance (HERO), and volatile materials/fuels (HERF); electrostatic discharge (ESD); precipitation static (p-static); and the natural phenomena effects of lightning. EMC is normally divided into two areas, intra-system and inter-system compatibility. Both deal with the ability of a system to operate in its intended electromagnetic environment without causing or suffering degradation. EMI is defined as any electrical or electronic

disturbance, phenomenon, signal, or emission (man-made or natural) that causes undesirable or unacceptable responses, malfunctions, or degradation of performance. EMP is a high-magnitude, broadband energy burst that is radiated over a very short time duration. HERP deals with the effects of personnel exposure to electromagnetic fields. HERO refers to induced conditions or effects caused by external electromagnetic radiation that might lead to premature detonation or duding of any electro-explosive device. HERF refers to induced conditions or effects caused by external electromagnetic radiation that might lead to premature ignition of volatile materials or fuel vapor; e.g., a transmitting antenna may cause a static discharge of electricity and ignite fuel vapors during fueling operations. Lightning produces electromagnetic pulses and skin currents, usually split into two categories: near strike and direct strike. ESD is an electric current that can result following a build-up of static charge on a system, component, or human. P-static is a build-up of static charge on the surface of an object caused by the motion of the object in the environment (i.e., air, dust, or rain flowing over an aircraft, rotor, or missile) and the electromagnetic effects resulting from the subsequent discharges.

a. The goal of the E3 program is to ensure that Army materiel will accomplish its intended mission in the electromagnetic environment expected in peacetime and wartime. Emitters, electrical motors, and nature (e.g., lightning) typically create the E3 environment.

b. It is neither practical nor feasible to make every system/subsystem impervious to electromagnetic effects. Program sponsors, in coordination with the materiel and Capabilities Developers, must assess the E3 risk to their system and determine whether the risk is acceptable. All activities responsible for procuring subsystems, component parts, or support equipment shall ensure proper coordination with the program sponsor of the host system. The most stringent intended use of the equipment will be used to identify system shortcomings. Safety of personnel or munitions is critical. System protection is generally required to preclude unsafe situations. Program sponsors must take actions to ensure that their items are maintainable at an acceptable level of readiness to allow operation in the expected electromagnetic environment throughout the system's life cycle.

c. The generic strategies to counteract E3 are:
(1) Protection: design and manufacture electrical and electronic equipment with appropriate grounding, bonding, shielding, filtering, or protective circuitry.
(2) Operational avoidance: operate electrical and electronic equipment away from electromagnetic sources, eliminate particularly susceptible configurations/deployments, or eliminate reliance on susceptible items altogether.
(3) Proliferation: field electrical and electronic equipment in sufficient numbers to compensate for expected/known susceptibility to ensure accomplishment of the mission.
(4) Mobility and dispersion: mobilize and/or disperse electrical and electronic assets to increase survivability and compound targeting difficulties. This method is primarily employed to counter EMP and electronic attack (EA).

d. The E3 evaluation/analysis process examines the following considerations:

(1) The intended operational Electromagnetic Environment (EME).
(2) E3 waivers requested/granted.
(3) E3 and performance criteria chosen.
(4) E3 tests planned/conducted.
(5) Methodology employed.
(6) Results obtained.
(7) Recommendations made.
(8) Conclusions drawn.

e. The evaluation/analysis will look not only at what was done, but also at what was not done and the rationale behind the decision. The review and analysis process should look at the relevancy, adequacy, scope, and technical merits of the various E3 tests planned and/or conducted. Finally, the mission impact of both E3 environment-induced performance and operational degradation should be analyzed.

H-7. Electronic Warfare
a. Electronic Warfare (EW) is the use of electromagnetic or directed energy to control or deny use of the electromagnetic spectrum or to attack the enemy. EW may be categorized as follows:
(1) Electronic Attack (EA): the area of EW involving the use of electromagnetic or directed energy to attack personnel, facilities, or equipment with the intent of degrading, neutralizing, or destroying enemy combat capability. Counter radio-controlled IED systems typically fall under EA.
(2) Electronic Support (ES): the area of EW involving actions to intercept, detect, identify, and locate radiated electromagnetic energy sources for the purpose of immediate threat recognition and attack warning.
(3) Electronic Protection (EP): the area of EW involving actions taken to protect personnel, facilities, and equipment from the adverse effects of friendly or enemy employment of electromagnetic or directed energy weapons.

b. EA can deny or disrupt sensor performance by signal denial or interference, deception, and partial or complete damage. Typical EA devices are Radio Frequency (RF) jammers, Infrared/Electro-Optical (IR/EO) jammers, and RF/laser Directed Energy Weapons (DEW). High-Power Microwave (HPM) and Ultra Wideband (UWB) are subsets of RF DEW. EA tactics and techniques are scenario and system dependent.

c. Radio Frequency jammers may use a myriad of noise or deceptive jamming techniques to degrade, neutralize, or destroy enemy spectrum-dependent systems. Some common jamming techniques include barrage noise, spot noise, swept spot noise, range gate pull-off (RGPO), velocity gate pull-off (VGPO), side lobe, cross-eye, and Digital RF Memory (DRFM). Some of the effects caused by RF jamming include false alarms, reduced signal-to-noise ratios (SNR), false position/range/velocity, or tracking errors.

d. ES provides an enabling capability for intelligence collection, threat avoidance, responsive targeting, and tactical actions, including the employment of countermeasures (i.e., flares, chaff, self-protect jamming, etc.) as well as ECCM (i.e., flare rejection, home on jam, and the ability to discriminate between actual targets and decoys). Electronic Intelligence (ELINT) systems, Communications Intelligence (COMINT) systems, Radar Warning Receivers (RWRs), Laser Warning Receivers (LWRs), and other EM spectrum dependent detection systems provide this enabling capability.

e. EP includes protective measures used to counter or nullify EA or ES threats. Examples of EP intended to counter EA include adaptive null or steerable null processors/antennas, spread spectrum modulation, and signal processing. Examples of EP intended to counter ES include radiated power management and Low Probability of Intercept (LPI) or spread spectrum waveforms. Some examples of EP designed to counter both EA and ES include low side lobe antenna designs, frequency hopping/agility, deceptive radiation, and wartime reserve modes/frequencies. In addition to these EP techniques, which are technology solutions, there are operational EP techniques. For example, to counter enemy ES/EA capabilities, Emissions Control (EMCON) procedures can minimize the probability of intercept.

H-8. Information Assurance (IA)
Information superiority is one of the key goals for ensuring Warfighter access to modern information technology. Widespread use of modern information technology has led to a critical dependence of operations on information that may be vulnerable to attack. This vulnerability needs to be addressed throughout each new system's development and testing phases and on systems undergoing P3I and spiral development (evolutionary acquisition). Formalized evaluation of all systems' IA capability is required to ensure that users are aware of a system's vulnerabilities to information attacks.

a. These IA procedures apply to any system that collects, stores, transmits, or processes unclassified or classified information.

b. The applicable DOD directives, instructions, and regulations that govern IA are DODD , DODI , DODI , and the DOT&E policies for Operational Test and Evaluation of Information Assurance in acquisition programs. The Defense Acquisition Guidebook discusses IA and states, "Information assurance requirements shall ensure availability, integrity, authentication, confidentiality, and non-repudiation of critical program technology and information. This includes providing for the restoration of information systems by incorporating protection, detection, and reaction capabilities. Information assurance requirements shall be established and maintained throughout the acquisition life-cycle for all DOD-owned or controlled information systems that receive, process, store, display or transmit

DOD information, regardless of mission assurance category, classification or sensitivity." DODD S is the overarching classified document on Information Operations. The DOT&E memorandum on OT&E of IA supplements these documents by formulating OT&E policy for IA.

c. Responsibilities.
(1) The AST Chair is responsible for
(a) Overseeing the development, documentation, and execution of the IA T&E strategy for the assigned program and for reporting the results of all IA T&E conducted.
(b) Requesting assignment of an analyst from the Survivability Directorate to support the development of an IA T&E strategy immediately following the creation of the AST.
(c) Coordinating the IA Risk Assessments with members of the system IPT (AST Survivability Analyst, Capabilities Developer, Materiel Developer, and Threat Representative) during the development of the IA T&E strategy.
(d) Requesting assignment of an IA Test Team from the IA Test Cell (Electronic Proving Ground) when the system is assigned a risk level of 3 or 4. Risk level definitions are provided in paragraph H-8d.
(e) Overseeing the development of the IA T&E strategy by the Survivability Analyst (for risk level 1 and 2 systems) or by the Survivability Analyst and IA Test Team (for risk level 3 and 4 systems).
(f) Ensuring that the IA T&E strategy is coordinated with the AST, reported to the chain of command, and incorporated into all appropriate documentation to include the TEMP, SEP, OTA TPs, TRs, and OMAR/OERs/SAs.
(g) Ensuring that adequate resources are provided for in TEMPs and TRPs.
(h) Reporting readiness for IA testing at appropriate OTRRs.
(i) Ensuring that all IA test data are properly authenticated by the Data Authentication Group.
(j) Integrating IA T&E analyses/assessments into the system SA/OMAR/OER and emerging results and final report briefings.
(2) The Survivability Analyst is responsible for
(a) Developing an appropriate IA T&E strategy in accordance with the system's established Mission Assurance Category (MAC) and Confidentiality Level (CL). In the absence of such guidance, conducting an IA Risk Assessment based on inputs from the Capabilities Developer, Materiel Developer, and Threat Representative. (For risk level 3 or 4 systems, this will be supported by the IA Test Team.)

(b) Updating the IA T&E strategy as necessary to reflect changes in the operational requirements and threat system development.
(c) Developing IA test data collection and transfer requirements for inclusion in the SEP and OTA TPs.
(d) Monitoring IA test execution and data management.
(e) Participating as a voting member in the DAG for authenticating IA test data.
(f) Coordinating transfer of IA test data from the IA tester.
(g) Analyzing all data from all IA tests and other sources.
(h) Preparing the IA assessment for inclusion in OMAR/OERs/OARs.
(3) The IA Test Team is responsible for
(a) Supporting the development of the IA T&E strategy for systems that require independent IA OT&E and/or DT&E (risk level 3 and 4 systems).
(b) Developing the IA test strategy to include test policies, test procedures, and test resource requirements.
(c) Supporting the Survivability Analyst in developing requirements for formal transfer of IA test data.
(d) Preparing detailed IA test plans to include test data collection, management, authentication, reduction, and database content and format.
(e) Overseeing the test execution.
(f) Supporting the Survivability Analyst during DAG proceedings.
(g) Transferring IA test data to the Survivability Analyst for the system evaluation.
(h) Preparing IA test input to test reports, OARs, and OMAR/OERs.
(4) AEC/SED is responsible for
(a) Establishing an accredited assessment team for IA assessments.
(b) Developing detailed methodology for levels 1 through 4 assessments described in paragraph H-8d.
(c) Assigning personnel to each AST to support the IA T&E process.
(5) DTC/EPG is responsible for

(a) Establishing and resourcing an accredited IA test cell, including Red Team capabilities.
(b) Developing detailed methodology for level 3 and 4 IA testing.
(c) Developing/acquiring appropriate instrumentation and tools to conduct level 3 and 4 IA testing.
(d) Providing detailed planning for level 3 and 4 testing for each T&E program.
(e) Developing resource requirements for level 3 and 4 testing for each T&E program.
(f) Executing level 3 and 4 testing for each T&E program.
(g) Preparing the section on IA test results (level 3 and 4 IA testing) for inclusion in Test Reports.
(h) Supporting the Survivability Analyst in assessing IA test results.

d. Risk Assessment. The IA risk assessment includes the DIACAP process, the DOT&E Policy for Operational Test and Evaluation (OT&E) of Information Assurance (IA) in Acquisition Programs, and the MAC and CL assignment of IA controls. CPDs are not approved unless they list the MAC and CL assignment for the system. The DIACAP and DOT&E policy require IA events and IA controls based on the MAC and CL assignment as defined in the supporting JCIDS documentation. The IA risk assessment must review the DIACAP Scorecard and the plan of action and milestones to mitigate susceptibility. The DOT&E Policy details a six-step process and IA measures and time-based metrics that must be captured for the evaluation of protection, detection, reaction, and restoration.

e. Documentation.
(1) IA EFAs, measures, and the assessment approach will be integrated into all system T&E documents, including TEMPs, SEPs, OTA TPs, TRs, and OMAR/OER/SAs. The results of the Risk Assessment and IA T&E level assignment will also be included in the TEMP, along with descriptions of the IA T&E strategy and resource requirements.
(2) The SEP will provide greater detail of the IA T&E strategy to include precise definitions of all IA assessment measures and data required for their assessment, together with descriptions of the assessment approach. The SEP will also outline the specific events at which data are to be collected for the IA assessments and describe how the results from different events are to be aggregated.
(3) Each OTA TP will detail the IA T&E data collection requirements, including required test conditions/parameters and specifications for level 3 data handoff from the tester to the evaluator. Each will provide a detailed description of the planned IA testing and include resources specifically used for IA data collection and assessment.
(4) Each TR will include a detailed description of IA tests, all test conditions/parameters, and IA test results. Detailed descriptions of test limitations and variances to the OTA TP will be described separately. Each TR will include a data element dictionary that clearly defines all IA test data provided in the IA test database, to include field name, field length, datum type, and datum options.
(5) System assessments/evaluations will provide descriptions of all IA T&E events conducted to date, data collected at each event, and results of all IA T&E at each event. They will accrue results of the analyses with each test event and will provide cumulative IA assessments over the system's T&E life span so that a current IA assessment will be available at any time following the IA Risk Assessment. They will also include separate, detailed descriptions of all T&E limitations and variances to the original plans.

f. Coordination.
(1) Risk Assessment: The system IA Risk Assessment will be coordinated among all the appropriate IPT members. The AST Chair will take the lead for this coordination. The AST Chair, the Capabilities Developer, the Materiel Developer, and a threat representative (DIA, DCS G-2, INSCOM) will constitute a quorum for all IA T&E Risk Assessment decisions.
(2) Joint T&E Programs: For those programs falling under the DIACAP, direct coordination with the System Security Authorization Agreement (SSAA) signatories throughout the acquisition cycle is necessary to minimize duplicative effort by the OTAs. Opportunities to meet operational requirements through concurrent testing will be maximized, particularly in DIACAP Phase 2 Vulnerability Assessments and Phase 3 Security Test and Evaluation and Penetration Testing.
(3) All risk assessments and subsequent IA T&E strategies will be coordinated with the ATEC DCSOPS to assist in command-wide IA T&E resource planning and management.
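The time-based IA metrics in paragraph H-8d and the risk-level rule in paragraph H-8c(1)(d) lend themselves to simple automation during data reduction. The sketch below (Python; the timestamp names and event timeline are hypothetical illustrations, not values prescribed by this pamphlet or the DOT&E policy) shows one way an analyst might derive the detection, reaction, and restoration intervals from a single IA test event and apply the rule for requesting an IA Test Team:

# Illustrative sketch only; not an ATEC tool. The timestamp arguments
# (attack_start, detected, reacted, restored) are hypothetical stand-ins
# for the time-based metrics described in paragraph H-8d.
from datetime import datetime, timedelta

def ia_test_team_required(risk_level: int) -> bool:
    """Per paragraph H-8c(1)(d), an IA Test Team from the IA Test Cell (EPG)
    is requested when the system is assigned risk level 3 or 4."""
    if risk_level not in (1, 2, 3, 4):
        raise ValueError("IA risk level must be 1 through 4")
    return risk_level >= 3

def time_based_metrics(attack_start: datetime, detected: datetime,
                       reacted: datetime, restored: datetime) -> dict:
    """Reduce one IA test event into the intervals that support the
    protection, detection, reaction, and restoration evaluation."""
    return {
        "time_to_detect": detected - attack_start,
        "time_to_react": reacted - detected,
        "time_to_restore": restored - reacted,
    }

# Example with a notional event timeline for a risk level 3 system.
t0 = datetime(2010, 6, 16, 9, 0)
print(ia_test_team_required(3))  # True: independent IA OT&E/DT&E support
print(time_based_metrics(t0,
                         t0 + timedelta(minutes=4),
                         t0 + timedelta(minutes=10),
                         t0 + timedelta(hours=2)))

In practice the same reduction would be run over every recorded IA event and the aggregated intervals reported in the TR and the cumulative assessments described in paragraph H-8e(5).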

H-9. Obscurants and Atmospherics
Most weapons systems incorporate optical or electro-optical sensors or other sensors with coverage throughout the radio and acoustic spectrum. The use of sensors improves weapons performance (lethality) and increases a host platform's survivability through increased lethality. Sensors will often employ software to aid in target discrimination or tracking, or to initiate fusing. When the performance of sensors or their software can be degraded by aerosols or obscurants, either inherent to the operating conditions or introduced by threat operators, these conditions should be examined during test events.

a. Obscurants, natural aerosols, and atmospheric effects that may be encountered on the battlefield may affect weapons employing sensors (and the software logic). In addition to performing missions in hot, basic, and cold environments, wet and dry conditions, and urban and open terrain, sensors will need to be effective in combat environments that will include conditions where target discrimination may be more difficult. Many factors will impact sensor performance: clutter (natural or battle-induced), optical turbulence from hot roads and terrain, dust from moving vehicles or munitions, burning crude oil, man-made smokes, rain, snow, fog,

etc. To determine the effectiveness of weapons systems using sensors, the weapons systems should be analyzed for performance in realistic combat environments that include some portion of these effects.

b. The STAR and capabilities documents detail the threat and requirements for a system, often specifying the environments in which the system is expected to perform. In addition to all of the natural aerosols (rain, snow, fog, dust), battle-induced aerosols (dust, combustion debris, oil fires), and battlefield interferents in the form of vapors (ammonia, cigarette smoke, paint fumes), the threat may employ man-made obscurants. Man-made obscurants are varied and are delivered in various forms. Petroleum distillate smokes (diesel-fuel-based, glycol-based, etc.), which are effective from the UV to the near IR (1 to 2 µm) depending on the method of manufacture, are normally generated from vehicles or smoke generators. Exothermic smokes (phosphorous-based, chlorinated-hydrocarbon-based, etc.) are normally delivered as artillery munitions or self-defense grenades or through smoke pots. Other more advanced particulate smokes can be delivered in various forms to affect even the mid-to-far IR. Chaffs are delivered by munitions and are tailored to affect specific radar bandwidths.

H-10. Nuclear, Biological, and Chemical (NBC) Weapons
There are two types of nuclear effects: Nuclear Weapons Effects (NWE) and NBC effects. NWE covers the effects that occur within the first minute of a nuclear detonation; effects that occur after 60 seconds (neutron-induced gamma radiation and fallout) are addressed as residual nuclear contamination under CBR effects. CBR survivability is approached in terms of mission effectiveness by establishing a CBR defensive architecture appropriate for the system. Personnel survivability aspects are addressed by employing CBR defensive equipment and Tactics, Techniques, and Procedures (TTP) to ensure Soldier survivability. The materiel survivability aspects are addressed through CBR Contamination Survivability (CBRCS).

Figure H-1. Nuclear, Biological, and Chemical Survivability (NBC defense architecture: an integrated NBC defensive system comprising detection, individual protection, collective protection, contamination control, training, and medicine; NBC contamination survivability comprising hardness, decontamination, and compatibility)

a. Evaluations for personnel aspects of CBR survivability for Army systems are usually characterized as equipment integration issues. In most cases, the particular CBR defensive equipment has already undergone Army evaluation as an end item. Integration issues may include: Has the proper overpressurization of a vehicle or shelter been achieved? Has sufficient space been provided for detectors and alarms, decontamination equipment, and Mission Oriented Protective Posture (MOPP) IV gear, and are they readily accessible and operable? Other issues or questions are: Have the decontamination procedures been provided, demonstrated, and validated to (1) minimize the spread of contamination (contact hazards and off-gassing) and (2) achieve the goals of the decontamination process? Are revisions needed in the logistic support, the decon procedures, and/or other operational procedures in order to accept the decontaminability level of the item? If the system is not decontaminable or sufficiently hardened, can the limitation(s) be addressed through doctrine, training, or other operational considerations? What is the impact of CBRCS on a system's life cycle cost estimate (LCCE)?

b. The characteristics of CBRCS are hardness, decontaminability, and compatibility. CBRCS is required for mission-essential systems and equipment. The requirements for CBRCS can be found in the Defense Acquisition Guidebook and AR 70-75.

c. As defined in AR 70-75, Survivability of Army Personnel and Materiel, CBRCS is the capability of a system (and its crew) to withstand a CBR-contaminated environment and relevant decontamination without losing the ability to accomplish the assigned mission. It is important to note that although the definition states "a system (and its crew)," CBRCS addresses materiel survivability only. A CBRCS system hardened against CBR contamination and decontaminants is decontaminable and is compatible with individual protective equipment.

d. Department of the Army (DA) approved CBRCS criteria for Army materiel (figure H-2) are based on the philosophy that a Soldier crew surviving a CBR attack should be able to continue using mission-essential systems and equipment, in a full protective ensemble if necessary. The systems and equipment should be capable of rapid restoration to a condition such that all operations can be continued in the lowest protective posture consistent with the mission and threat, and without long-term degradation of materiel.

HARDNESS: System's ability to perform mission-essential functions; less than 20% functional degradation.
DECONTAMINABILITY: Hazard posed by system residual or desorbed agent; less than 5% casualty risk (unprotected Soldiers).
COMPATIBILITY: System's operability by Soldiers in MOPP IV; less than 15% performance degradation.
Figure H-2. DA-approved CBRCS Criteria for Army Materiel
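For analysts who automate parts of their data reduction, the figure H-2 thresholds amount to a simple screening rule. The following Python sketch is illustrative only: the measurement names are assumptions, and an actual CBRCS evaluation weighs far more than a threshold comparison. It applies the three DA-approved criteria to notional test results.

# Minimal sketch of screening measured results against the DA-approved
# CBRCS criteria in figure H-2. Measurement names and the pass/fail framing
# are illustrative assumptions, not ATEC evaluation logic.
CBRCS_CRITERIA = {
    # criterion: (quantity measured, maximum allowed value in percent)
    "hardness":          ("functional degradation", 20.0),
    "decontaminability": ("casualty risk, unprotected Soldiers", 5.0),
    "compatibility":     ("performance degradation in MOPP IV", 15.0),
}

def screen_cbrcs(measured: dict[str, float]) -> dict[str, bool]:
    """Return True for each criterion whose measured percentage falls below
    the figure H-2 threshold; missing measurements are reported as not met."""
    results = {}
    for criterion, (_, limit) in CBRCS_CRITERIA.items():
        value = measured.get(criterion)
        results[criterion] = value is not None and value < limit
    return results

# Example with notional test data.
print(screen_cbrcs({"hardness": 12.0,
                    "decontaminability": 3.5,
                    "compatibility": 18.0}))
# {'hardness': True, 'decontaminability': True, 'compatibility': False}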

e. The criteria provide the values for hardness, decontaminability, and compatibility necessary for system evaluation. Deriving the data required for evaluation requires careful planning. For instance, it is often not practical to contaminate and decontaminate a very expensive system (especially for limited production quantities or one-of-a-kind items that will be tested and fielded). The test and evaluation community, in many cases, must rely on available material databases and data from similar systems, where information may be limited or lacking.

f. A CBRCS evaluation involves understanding the Capabilities Developer's requirements and the Materiel Developer's approach to meeting those requirements. The Capabilities Developer's requirements define the mission profile(s) from which the mission-essential functions and tasks are determined. The Materiel Developer's system design (geometry, materials, and functionality) is then considered. The CBRCS evaluation (figure H-3) considers all relevant information from developmental and operational testing. Sources of data for analysis and evaluation include materials test results and databases, operational test data, and operator/observer feedback.

1. Mission profile (to determine exposure time).
2. Selected quantifiable mission-essential functions (materiel) and operation/maintenance tasks (Soldier) with associated system components.
3. Design/material/components/system review and analysis to identify agent/decontamination solution and accessible and vulnerable materials and components.
4. CB material database.
5. Material susceptibility to agent/decontaminant.
6. Specific and significant material property change (caused by agent/decontaminant).
7. Residual agent and desorption rate after contamination and decontamination.
8. Component/system agent testing, if existing data are not sufficient.
9. Time to perform tasks in MOPP IV and BDU.
10. Problems/comments noted by operators and observers.
11. System-specific decontamination procedures from the manufacturer and PM, in conjunction with the user requirements.
Figure H-3. CBRCS Evaluation Data Requirements

H-11. Nuclear Weapons Effects
During the post-Cold War era, the military faces the threat of nuclear weapons used by rogue nations seeking power in a regional war. The result will not be global annihilation; the future belongs to the survivors. "The proliferation of these horrific weapons presents a grave and urgent risk to the United States and our citizens, allies, and troops abroad. Reducing this risk is an absolute priority of the United States" (William J. Perry in Proliferation: Threat and Response, OSD, April 1996).

a. Tactical systems will not survive a direct hit from a surface burst or at ground zero (the point on the surface directly under the detonation) of a near-surface burst. However, as one proceeds outward from ground zero, the nuclear weapon effects decrease. At some point, depending on the size and height of burst of the weapon, there is a distance at which half of the Soldiers are expected to survive well enough to be able to complete their mission. The equipment must be functional for these survivors. These Soldiers may die within days or weeks. If their mission is important enough to cost their lives, the mission is important enough to complete, and therefore their equipment must be functional. The percentage of Soldiers needed to survive in order to complete the mission is set by TRADOC and used by the U.S. Army Nuclear and Chemical Agency (USANCA) in defining system-specific requirements. Air blast, thermal radiation, initial nuclear radiation, and low-altitude electromagnetic pulse are the nuclear weapon effects resulting from a surface or near-surface burst. These effects occur within the first minute following detonation.

b. High-Altitude Electromagnetic Pulse (HEMP) results when a nuclear detonation, even a relatively small one, occurs outside the Earth's atmosphere. HEMP does not affect personnel and is not concurrent with any personnel-threatening effects (in contrast to low-altitude electromagnetic pulse). HEMP, however, creates a blanket pulse that can cover a whole theater of operations. The result is theater-wide loss of all susceptible electronic equipment with no impact on the health of the Soldier. Small systems that are found on the battlefield in large numbers may need to survive just HEMP and not the other nuclear effects of a surface or near-surface burst. The idea is to prevent theater-wide loss, but accept the risk of losing some items of equipment in the case of a surface or near-surface burst.

(1) As specified in AR 70-75, USANCA has the task of defining all nuclear survivability criteria for mission-essential systems. Criteria are numerical values specifying the level of each nuclear weapon effect the system must survive. AR 70-75 also states that all mission-essential electronics must survive HEMP. HEMP criteria are not based on personnel survivability, but on U.S. MIL-STD-2169B, which has been incorporated into Quadripartite Standardization Agreement QSTAG-1031 and other international agreements. A system needing to survive at a given distance from a surface or near-surface burst has a requirement for all effects (HEMP as well as near-surface-burst effects). Specific survivability criteria for all effects are assigned in accordance with requirements. Actual values are derived from international agreements (QSTAG-1031) based on the premise that the equipment must be as survivable as the Soldier and does not need to be more survivable than the Soldier. The actual values are also influenced by the location of the equipment on the battlefield.

(2) The system response can be categorized into two main types. One is the structural or component response to the physical effects of air blast and thermal radiation. The other is the response of electronic and electrical components to the effects of HEMP, low-altitude electromagnetic pulse, and initial nuclear radiation. Initial nuclear radiation consists of neutron fluence, gamma dose rate, and gamma total dose effects.
(Note the contrast with the alpha and beta radiation of the residual contamination, which is addressed as part of CBRCS.)

c. The data required to evaluate system responses can be obtained from a mix of testing and analysis. While it is sometimes possible to completely address blast and thermal

radiation survivability through modeling, the most frequent use of modeling is to determine how to minimize necessary testing (e.g., to a questionable exposed component).

d. The quality of the evaluation depends on a clear understanding of what the system is required to do, what nuclear effects the system is required to survive, supporting analyses and testing, any modifications to criteria through the waiver process, and the battlefield impact of any open issues.

e. The evaluator must proactively work problems in any of these areas. The evaluation predominantly addresses the battlefield impact of any open issues, but most of the evaluator's time is invested in having the right information available in time to write the evaluation.

f. The SAR will define any problems uncovered by analysis or testing. It will describe any fixes, retests, waivers, or conditions of waivers. In the area of nuclear, chemical, and biological survivability, the waiver process for the Army is handled by a technical working group called the Nuclear and Chemical Survivability Committee Secretariat (NCSCS), with the recommendation going through the General Officer of the Nuclear and Chemical Survivability Committee and final decision by the DCS G-3. ATEC is a voting member of the NCSCS. If the PMO has significant problems with the standard process in terms of criteria, test, or evaluation, the problem should be brought before the NCSCS. The tester and evaluator should recommend fixes based on experience from other systems. If the tester or evaluator sees a problem that is beyond the state of the art or can only be fixed at exorbitant cost, he/she should recommend that the PMO seek the advice of the NCSCS. If appropriate channels are used at the right time, all open issues should be resolved. The OMAR/OER should be able to reflect a system that is capable of performing, as needed, following a nuclear event. The OMAR/OER bottom line will include the following nuclear survivability information:
(1) Changes in criteria via the waiver process.
(2) Changes in system use that should be recommended to overcome a problem induced by a nuclear environment [an example would be to revert to a manual (vs. automated) process, where possible, if electronics fail].
(3) Caveats or special instructions that should be included in training and training manuals (and whether such instructions are in the manuals).
(4) A summary of the system's overall capability to complete its mission following a nuclear event.
(5) If necessary, any open issues caused by lack of information to support the evaluation and the mission impact of those issues.

H-12. Ballistic Survivability/Lethality Live Fire T&E (LFT&E)
a. The ballistic survivability evaluation addresses system survivability against ballistic threats for programs that are not on the OSD LFT&E Oversight list but are manned, have ballistic requirements, or are expensive to replace if destroyed. Ballistic survivability must consider acquisition

avoidance (don't be seen), hit avoidance (don't get hit if seen), and kill avoidance (minimize damage to crew or hardware if the system is hit).

b. Acquisition avoidance will generally be captured under the E3, EW, or obscurant evaluations. Hit avoidance may be assessed under EW or obscurants if signature suppression, modification, or spoofing is employed. Hit avoidance will be assessed under ballistic survivability if active protection mechanisms are used to physically block or degrade an engagement by a threat's lethal mechanism. The ballistic survivability evaluation encompasses kill avoidance measures (assuming a hit) that include:
Protection/damage tolerance from a lethal mechanism;
Vulnerability reduction given a threat interaction;
Graceful degradation given damage (to enable the crew to remove themselves from the engagement area); and
Battle damage repair given damage (to enable the system to expeditiously return to battle).

c. Major systems will be required to undergo mandated LFT&E. The LFT&E of a covered system is an integral part of the acquisition process. Lethality LFT&E issues address the ability of missiles and munitions to defeat their targets. Survivability LFT&E issues address the ability of platforms (ships, land vehicles, aircraft) to survive attacks against them, with a primary emphasis on crew/passenger survivability. LFT&E affects design by making U.S. military warfighting materiel more lethal and survivable. LFT&E uses a building block approach to test and evaluate a system's vulnerability or lethality by beginning early in the development stage with component testing, followed by subsystem tests, and culminating in full-up system-level (FUSL) tests with test assets fully configured for combat. Although the LFT&E process is similar for all systems, it is never identical between systems due to the nature of the system.

d. Mandated and Governing Regulations.
(1) Title 10, USC, section 2366, contains requirements for vulnerability and lethality LFT of covered systems, major munitions programs, and product improvements to covered systems and major munitions programs. The legislation requires that covered systems designed to provide crew protection and covered munitions programs not proceed beyond Low-Rate Initial Production (LRIP) until LFT has been completed and a report has been submitted to the defense committees of Congress by the Secretary of Defense.
(2) The Defense Acquisition Guidebook requires the test results and the complete service evaluation of all testing identified in the LFT&E strategy to be documented and submitted to DOT&E no later than 120 days after FUSL test completion. The guidebook also specifies mandatory procedures and formats.

e. Although LFT&E programs may differ significantly in scope and timing, the level of maturity at various stages of the acquisition process is similar: By Milestone B, a decision shall be made as to whether the system meets the legislative criteria for a covered system/major

munitions program, and the TEMP must contain an approved LFT&E strategy. Initial draft strategies should identify proposed issues, present data in support of the issues, and identify Live Fire Tests to be conducted throughout the acquisition process. By Milestone B, the TEMP must contain a mature strategy. The entire LFT&E program, to include testing, evaluation, and reporting, must be completed by the full-rate production decision.

f. Live Fire Strategies.
(1) The Live Fire Strategy outlines the LFT&E for a system. ATEC is responsible for developing the strategy with the help of the Live Fire WIPT. The LF WIPT consists of members from ATEC (both the Live Fire tester for ground systems and the Live Fire evaluator), the Army Research Laboratory (ARL) (both the Live Fire tester for aviation systems and vulnerability experts), the system-appropriate U.S. Army Center and School, the Test and Evaluation Office (TEO), OSD (DOT&E/LFT), DCS G-2, and the system's PM.
(2) The strategy defines the Live Fire critical issues and the tests, documentation, responsibilities, and resources to address the Live Fire critical issues. The LFT&E strategy provides the top-level description of the LFT&E portion of the system's test and evaluation strategy and is an integral part of the TEMP. Once approved, the LFT&E strategy provides the basic roadmap for what vulnerability/lethality testing and evaluation has to be conducted before transitioning to full-rate production.
(3) The LFT&E strategies are an integral part of the TEMP as outlined in the Defense Acquisition Guidebook. The guidebook also states that the TEMP shall include a matrix that covers all tests within the LFT&E strategy, their schedules, the issues they will address, which planning documents the services propose for submission to DOT&E for approval, and which are proposed to be submitted for information and review only. The process for submission is approved by the ATEC chain of command, through TEO to DOT&E. All the LFT&E planning documents directed in the LFT&E strategy to be forwarded for approval are developed and coordinated in the LFT&E IPT (chaired by ATEC) and include those OTA TPs and/or DTPs designated in the matrix for approval.

g. Request for Waivers. A request for waiver from FUSL Live Fire Testing may be processed if such testing is unreasonably expensive and impractical. This must be prepared and staffed through DOT&E prior to MS B. Alternative plans for evaluating the vulnerability or lethality of the system shall be included with the request for waiver. Waiver authority resides in the USD(A&T) for ACAT ID programs and in the Component Acquisition Executive for less-than-ACAT ID programs. The Defense Acquisition Guidebook contains information for applying for waivers from LFT&E.

h. LFT&E Testing. LFT&E uses a building block approach, gathering data and output to evaluate the vulnerability and lethality of systems through the use of component, subsystem, and FUSL testing and through validated and accredited modeling and simulation. The LFT&E makes maximum use of existing data and capitalizes on data gathered throughout system development through the building block approach. The Defense Acquisition Guidebook repeats the requirement for Detailed Test and Evaluation Plans. ATEC provides the OTA TP or a Detailed Test Plan to describe the detailed test procedures, test conditions, data collection, and

analysis processes to be used during the conduct of each Live Fire Test. The Defense Acquisition Guidebook provides additional detail on the content of the Detailed Test and Evaluation Plans required for the Full-up System Level Live Fire Tests. For FUSL Live Fire Tests, the LF OTA TPs and DTPs will be submitted for approval according to the following guidelines. The LF OTA TP is submitted to the ATEC leadership 180 days before test initiation, and the DTP shall be sent through TEO 60 days before test initiation. (The LF OTA TP precedes the DTP because information in the LF OTA TP is required to generate the DTP.) TEO forwards the LF OTA TP and DTP to DOT&E for comment/approval at least 30 days before test initiation for those events designated in the strategy for approval at the DOT&E level. DOT&E has 15 days for submission of comments subsequent to its receipt of the plans. Note: In some cases the FUSL LF DTP may not require formal DOT&E approval. The document approval matrix located in the LFT&E Strategy (part of the TEMP) identifies which plans require DOT&E approval vs. DOT&E review/informal coordination. When DOT&E approval is not required, the DTP is approved by DTC Headquarters or HQ ATEC.
(1) FUSL LFT. The FUSL LFT&E test is conducted by DTC for ground systems and lethality, and by ARL/SLAD for aviation systems. Prior to any shots being fired, the systems involved will be thoroughly inspected by all groups involved with the test and evaluation. Testing is usually conducted on or in highly instrumented test ranges that are specifically designed for this purpose. These ranges have provisions for an enormous amount of data collection and special features to ensure the safety of personnel, test items, and the environment.
(2) Target Vehicles. The target vehicles used in FUSL LFT&E testing are, in general, fully stowed with ammunition, fuel, and supplies. The vehicles should be fully operational and actually running at the time the test shot is fired. Special exceptions are made to remove ammunition or to use a less than fully operational vehicle when the inclusion of these is expected to cause an extensive amount of damage. The test results must then be extrapolated to determine what might have happened had the system been fully operational or had the ammunition been present.
(3) Post-Shot Analysis. ARL/SLAD, in conjunction with DTC for ground systems, is responsible for the post-shot damage assessment at the conclusion of each test shot. After ARL/SLAD has conducted the preliminary analysis, the appropriate Army Center and School will then conduct a Battle Damage Assessment and Repair (BDAR) analysis/exercise to determine the system's ability to return to an operational condition. This BDAR analysis will also be used to develop new procedures to be taught, or tools and parts to be carried on the vehicle, to increase its ability to be brought back to operational status after being hit by a threat munition.
(4) The FUSL LFT&E planning documentation must contain at least the following information:
(a) A cover page providing the name of the system, the activity/agency responsible for preparation of the plan, date, classification, and applicable distribution statement.
(b) A coordination sheet containing signatures of service approval authorities.

(c) Administrative information: name, organization, telephone, and addresses of key LFT&E personnel.
(d) Description of threat weapons or targets that the system is expected to encounter during the operational life of the system, and the key characteristics of these threats or targets which affect system vulnerability and lethality; a reference to the specific threat definition document or authority; and a discussion of the rationale and criteria used to select the specific threats/targets and the basis used to determine the number of threats and targets to be tested and evaluated in LFT&E.
(e) If actual threats/targets are not available, then the plan must describe the threat/target surrogate to be used in lieu of the actual threat/target and the rationale for its selection.
(f) A statement of the test objectives in sufficient detail to demonstrate that the evaluation procedures are appropriate and adequate.
(g) A description of the specific threats or targets to be tested, including a detailed configuration and stowage plan (to include payload configuration) for each shot. Describe the rationale or operational scenarios on which the target configuration and stowage was based.
(h) A listing of any differences between the system to be tested and the system to be fielded. As specifically as possible, identify the degree to which test results from the tested configuration are expected to be representative of the vulnerability or lethality of the fielded systems.
(i) Identification of any test limitations, particularly any potential loss of realism from absence of components, arising from the use of surrogates, from the removal of fuses on stowed ammunition, or any other environmental, safety, or resource constraints. Identify the impact of these limitations on test results.
(j) A description of the shot selection process. Describe the process to be used to establish the test conditions for randomly selected shots, including any rules ("exclusion rules") used to determine whether a randomly generated shot may be excluded from testing. For engineering shots (i.e., shots selected to examine specific vulnerability or lethality issues), describe the issue and the associated rationale for selecting the specific conditions for these shots. List the specific impact conditions and impact points for each shot, and whether it is a random or engineering shot.
(k) A detailed description of the test approach, test setup, test conditions, firing procedures, damage assessment and repair process, planned test sequence, instrumentation, data collection and analysis procedures, and responsibilities for collecting and documenting test results. Include any standard forms that will be used to document test results.
(l) A prediction of the anticipated results of each shot. These predictions may be based on computer models, engineering principles, or engineering judgment. Detail should be consistent with the technique used for casualty/damage prediction.

(m) A detailed description of the analysis or evaluation plan for the Live Fire Test. The analysis or evaluation plan must be consistent with the test design and the data collected. Indicate any statistical test designs used for direct comparisons or for assessing any pass or fail criteria.
(n) A general description, including applicable references, of any vulnerability and lethality models to be used to support shot-line selection, pre-shot predictions, or the analysis or evaluation. This material should include a discussion of model algorithm or input limitations, as well as references to the sources of key model inputs.
(o) A detailed description of the approach to analyzing and mitigating the potential environmental impacts, consequences, or effects of the test activities, unless adequately described elsewhere.

i. LFT&E Reporting.
(1) Upon conclusion of the FUSL Live Fire Test, DTC or ARL/SLAD prepares the Live Fire Test report, which is completed and delivered through HQ ATEC and TEO to DOT&E within 120 days of the end of the test. It is ATEC's responsibility to prepare the OMAR/OER for the Army. This OMAR/OER is based on the information provided in DTC's Final Test Report (FTR). The OMAR/OER then expands this information and looks at the system's overall survivability or lethality. The OMAR/OER makes extensive use of models and simulations to look at a multitude of situations that the test item may encounter in combat. These simulations make extensive use of vulnerability/lethality files developed by ARL using the vulnerability/lethality tests and LFT&E tests. In most cases, the OMAR/OER then looks at how the system's performance will affect the outcome of a simulated battle using one of several combat models.
(2) After completion, the OMAR/OER is submitted through TEO to DOT&E. A copy furnished (CF) will also be sent to the DOT&E action officer at the same time. TEO has the responsibility to forward the OMAR/OER, in conjunction with the DTC or ARL/SLAD Live Fire Test Report, to OSD 45 days prior to the FRP DAB.
(3) In parallel with the OMAR/OER, OSD prepares a Beyond Low-Rate Initial Production (BLRIP) report. The BLRIP report is submitted to Congress to meet the requirements spelled out in the legislation.
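Because the FUSL planning and reporting suspenses described in paragraphs H-12h and H-12i are all fixed offsets from the test and decision dates, they can be laid out quickly on a program calendar. The following minimal Python sketch uses notional dates and is illustrative only, not an official scheduling tool; it simply applies the 180-, 60-, and 30-day pre-test offsets, the 120-day post-test report suspense, and the 45-day pre-DAB delivery described above.

# Illustrative calendar sketch for LFT&E document suspenses (paras H-12h/i).
from datetime import date, timedelta

def lfte_suspenses(test_start: date, test_end: date, frp_dab: date) -> dict:
    """Compute the nominal suspense dates from the offsets in this chapter."""
    return {
        "LF OTA TP to ATEC leadership": test_start - timedelta(days=180),
        "DTP through TEO": test_start - timedelta(days=60),
        "Plans to DOT&E (if approval required)": test_start - timedelta(days=30),
        "LF test report to DOT&E": test_end + timedelta(days=120),
        "OMAR/OER and LF report to OSD": frp_dab - timedelta(days=45),
    }

# Example with notional program dates.
for event, due in lfte_suspenses(date(2011, 3, 1), date(2011, 6, 30),
                                 date(2012, 2, 15)).items():
    print(f"{event}: {due.isoformat()}")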

Appendix I
MANPRINT

I-1. Purpose
Manpower and Personnel Integration (MANPRINT) is an engineering analysis and management process to identify and articulate requirements and constraints of human resources, human performance, and hazards to personnel to ensure the human is fully and continuously considered in the system design. The assessment of MANPRINT is an essential element of a system's evaluation strategy at each decision point. MANPRINT considerations include the ramifications to overall system status when human performance degrades over time. Also, recommendations as to how risks to system performance due to MANPRINT issues can be mitigated should be addressed. Both system design and operator/maintainer issues can be sources of MANPRINT issues. (See AR 602-2.)

I-2. Policy
The Army MANPRINT policy is documented in AR 602-2. Evaluation planning is conducted from the mission level to the system level, i.e., top-down, while evaluation execution is conducted from the system level through IOTE entrance criteria into the mission level, i.e., bottom-up. A successful MANPRINT evaluation will integrate Human Systems Interface functionality appraisals with mission level appraisals. MANPRINT consists of seven domains: Human Factors Engineering (HFE), Manpower, Personnel, Training, System Safety, Health Hazards, and Soldier Survivability. The system level MANPRINT assessment influences the mission level evaluation planning through the investigation of projected mission level impacts identified by system level concerns. The MANPRINT evaluator is part of the overall system AST.

a. System Focus Evaluation Strategy. The MANPRINT evaluator's primary MANPRINT effort during the system level evaluation is to identify MANPRINT problems and foresee their impact on the mission level evaluation. Therefore, the MANPRINT evaluator may find it necessary to encourage the developers' timely consideration of MANPRINT in anticipation of the evaluation of system effectiveness and suitability. In addition, feedback from developmental testing may well influence MANPRINT design.

b. Mission Focus Evaluation Strategy. The major concern of the MANPRINT evaluator's effort during the mission level evaluation is to address the following: Can the unit, given appropriate training, use the system as designed to successfully carry out their tasks in support of the mission? Thus, the MANPRINT component of the mission level evaluation will consist of studying the impact of the Human Systems Interface on mission accomplishment.

I-3. System Focus Evaluation Strategy Development
The Human Systems Interface is the primary concern during the system level evaluation. As such, the system level of evaluation will be the initial focus area for MANPRINT evaluations.

a. Human Factors Engineering Domain.
(1) The primary thrust of the MANPRINT evaluator's Human Factors effort will be to demonstrate that no aspect of the HFE design will impede the crew in the conduct of their mission level tasks or potentially cause any critical safety or health hazards. In other words, the layout of controls and information displays will have to be user friendly. Because equipment performance cannot be determined independently of human performance, the total system evaluation design includes hardware and software engineering (the Machine) and leaders, operators, maintainers, and support personnel (the Man). HFE studies the integration of human characteristics into system design to optimize man-machine performance under operational conditions. Consequently, most of the HFE work will be done during the DT phase, as system functionality is being developed and bending metal is still a viable option to fix system problems. As the system proceeds along the materiel acquisition cycle, the cost of metal-bending solutions increases. So, too, does the likelihood that crew procedures will become laden with workarounds in lieu of the implementation of proper HFE solutions. Because it is so important to document HFE problems as early as possible, the system evaluator shall
(a) Identify Soldier performance and other human factors problems that are correctable by design changes to hardware and software.
(b) Encourage the materiel, combat, and training developers to conduct appropriate testing with mock-ups, prototypes, or competing hardware and software early enough to influence the final system design.
(2) Human Factors Engineering. The goals in assessing HFE during the system evaluation are to:
(a) Ensure that the equipment is easy to operate, maintain, and support.
(b) Minimize the time required to accomplish mission tasks.
(c) Minimize the chance for operator error and accident.
(d) Minimize the training needed.
(3) Human Task Performance. The MANPRINT evaluator should consider the limitations and capabilities of human operators, system equipment, operator expectations of task sequence or functionality, human maintainers, and embedded fault detection and diagnostics. Job-task performance data can be obtained from checklists, interviews, questionnaires, subject matter expert (SME) observations and ratings, direct performance measures, and video (time and motion studies).
(4) Human Reliability. The MANPRINT evaluator should consider various sources of human error, as appropriate: Soldier aptitude, training, equipment configuration inducing human error, and environmental conditions inducing human error. The PM, through the MANPRINT corrective action and review process, should be encouraged to eliminate typical sources of human error through HFE design modifications. Reasons for uncompleted and/or unsuccessful

343 tasks should also be determined. Such reasons may include insufficient manpower; inadequate aptitudes; training deficiencies; human factors design deficiencies; lack of, or poor, job performance aids; and lack of feedback devices. Regardless of the domain to which a problem may be attributed (e.g., Training) there may be an HFE solution (such as making the system simpler to operate). (5) Workload. Depending on the type of system, the MANPRINT evaluator may want to study crew workload. This study should include: (a) Cognitive Workload: Information processing demands, memory requirements, learning and retention requirements, and sensory discrimination requirements. (b) Physical Workload: Task overload, biomedical considerations, strength and endurance considerations (this issue will interact with the personnel issue of Soldier physical profile), whether the number of Soldiers planned to perform critical tasks is sufficient to meet the system performance requirements, and ease of maintenance. (Does the system require major dismantling for access to frequently replaced components? Are built-in self-diagnostics feasible?) (c) Equipment and Workspace Design: The system evaluator should consider the effects of the human-computer interface, that is, interface compatibility with the capabilities of the target audience. Poor HFE interface design (or poor training, or both) could be reflected by the appearance of task repetitions, increases in error rates, excessive use of online help or system documentation, requests for assistance, and the time-honored appearance of Soldier complaints. Other factors to address are ergonomic considerations, anthropometric data, stress, heat stress, psychological stress, continuous operations, fatigue, isolation, crowding, and CBR conditions. (Can the operator perform all required tasks in the prescribed manner while wearing MOPP IV or other special equipment?) b. Manpower Domain. (1) The MANPRINT evaluator s manpower effort will study whether the crew size is adequate to accomplish the mission tasks. In most cases a new Army system shall be required to utilize no greater manpower than the system it replaces. The MANPRINT evaluator should assess the impact of excessive manpower demands on the operation of the system. study: (2) The MANPRINT evaluator should consider the following manpower focus areas for (a) The adequacy of the total manpower requirement. (b) The adequacy of the distribution of ranks/grades. (c) The adequacy of the MOS/ASI/SQI needed to operate the new system. (d) The impact of operator demand on the capability to carry out the mission tasks in the expected operational environment. ATEC Pamphlet June 2010 I-3

344 (e) Identification of labor-intensive (high-driver) tasks, which might result from either hardware or software interface design problems. (f) Whether increased workload can be designed out by increasing system performance. (g) The adequacy of the number of operators needed for 24-hour continuous operations (an illustrative calculation follows the personnel focus areas below). (h) The adequacy of the number of shifts needed to support MOPP IV operations. (i) Identification of the trade-offs between Manpower and Personnel capabilities, training, and HFE. (j) Whether any tasks currently performed on other systems are likely to be assigned to system under test (SUT) operators as additional tasks and whether those tasks are supported by the SUT. c. Personnel Domain. The MANPRINT evaluator's personnel effort will study whether there is a correct mix of MOS and skill levels needed to use and operate the system. (1) Purpose of Personnel Domain: Skill creep is a continuing problem with the emergence of new technology systems. It is necessary for the MANPRINT evaluator to assess whether the MOS and skill levels needed to use and operate the system exceed those of the old system, unless such increases have been authorized. A representative sample of the target population must be used during T&E to obtain a proper measure of system performance. If aptitude constraints affect system use, they should be identified. (2) Target Audience Focus Areas: The Target Audience Description (TAD) defines the qualifications of the users and operators. Potential focus areas for the MANPRINT evaluator to consider are: (a) Identification of the knowledge, skills and abilities required by the new system. (b) Comparison of the required knowledge, skills and abilities to the aptitudes that the target audience possesses. (c) Identification of any aptitude-sensitive critical tasks. (d) Identification of any physical limitations (color vision, acuity, hearing, strength). (e) Identification of the physical standards for the operators and the maintainers. (f) The expected mental category distribution. (g) The minimum reading grade level for the operators and maintainers. (h) Identification of the security requirements for the operators and maintainers. I-4 ATEC Pamphlet June 2010
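The following back-of-the-envelope sketch illustrates the kind of arithmetic behind manpower focus area (g) above (operators needed for 24-hour continuous operations). The crew positions, duty-hour limit, and planned crew size are hypothetical, and the calculation is illustrative only; it is not an ATEC-prescribed method.

import math

def operators_required(position_count: int, hours_per_day: float,
                       max_hours_per_operator: float) -> int:
    """Operators needed to keep every crew position continuously manned."""
    operator_hours_needed = position_count * hours_per_day
    return math.ceil(operator_hours_needed / max_hours_per_operator)

# Hypothetical system: 3 crew positions manned around the clock, with a
# planning limit of 8 duty hours per operator per day (three shifts).
needed = operators_required(position_count=3, hours_per_day=24.0,
                            max_hours_per_operator=8.0)
planned_crew = 8  # hypothetical authorized crew size
status = "adequate" if planned_crew >= needed else "shortfall"
print(f"Operators required: {needed}, planned: {planned_crew} ({status})")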

345 d. Training Domain. (1) There are a number of focus areas within the training domain: (a) The adequacy of the training of the Soldiers/operators to perform their system critical tasks. (b) The transfer of the effects of training to the performance of test Soldiers under test conditions. This helps in determining whether the training has performance validity (i.e., what is learned in the classroom is effective on the job). (c) Awkward hardware and software characteristics, and difficult operational procedures, which cause a decline in Soldier performance. (d) The instruction or education, and-on-the job or unit training needed to provide personnel their essential job skills. The system must be designed so that the specified target population can be readily trained. (e) The kind of training and proper Soldier/operator aptitudes needed to acquire the skills necessary to perform their system tasks. The various types of training are individual, crew/team, unit collective (e.g., platoon, company and higher echelons), and sustainment or refresher training. (2) Training Plans. The MANPRINT Evaluator should obtain the following information from the TRADOC Capabilities Manager (TCM): (a) How training is to be given: embedded and/or hands-on equipment; training aids devices, simulators, or simulations (TADSS), where simulations may be constructive (based on math or computer models); virtual using digital imagery; or live in the field. (b) Where training is given: school, unit organization, on-the-job (OJT), home (i.e., correspondence courses), or training centers (e.g., National Training Center, Fort Irwin, CA). (c) What training is to be given: specific system or combat tasks. (d) Who conducts the training: what instructors are needed and what special skills are required. (e) The target audience description. (f) The timeframe and length of training and how it is to be distributed. (g) The reasonableness of the training objectives. (h) The need for refresher courses and how often. be: (3) Possible Measures of Performance. Some measures of performance to consider may ATEC Pamphlet June 2010 I-5

346 (a) Percentage of tasks performed to standard. (b) Time taken to achieve qualification. (c) Number of practice trials taken to achieve qualification. (d) Percentage failures to follow procedures. (e) Percentage tasks not trained to standard. (f) Percentage tasks inadequately trained. (g) Percentage tasks incorrectly trained. (h) Adequacy of training equipment or materials. (4) Training Strategy Considerations. The evaluator will want to know the following: (a) Whether the target audience is known and documented. (b) The system s critical tasks. (c) Whether the target audience and/or trainers have participated in planning the training strategy. (d) Lessons learned during the Instructor and Key Personnel (IKP) training. e. System Safety and Health Hazard Domains. (1) The MANPRINT evaluator s system safety effort will study whether there are any system characteristics that may potentially injure the crew and whether system characteristics may be harmful to the crew s long-term health. These closely related domains, when properly studied, will ensure that the system does not harm the Soldier-user. (2) System Safety and Health Hazard Focus Areas. The primary focus of these domains is studied at the system level during developmental testing. Ensure that developmental testing includes a front-loaded test design so that safety data can be collected as early in the test as possible. Also ensure that specific safety tests are performed on critical devices or components to determine the nature and extent of hazards presented by the item. Potential mission impacts resulting from these domains should be assessed at the mission level during operational testing. MANPRINT evaluators will want to ensure that the following evaluation tasks are accomplished: (a) Monitor the fixes to hazards that were identified in earlier developmental testing and user testing. (b) Identify any undiscovered potential hazards to Soldier safety and health. (c) Assess whether the restrictions of the Safety Release are suitable for training situations and will permit adequate operational testing. I-6 ATEC Pamphlet June 2010

347 personnel. (d) Monitor the conditions during test that can identify suspected hazards to (e) Monitor the conditions that lead to hazards, as required by operational test issues and criteria and the Safety Release. (f) Encourage health and safety professionals to identify potential system hazards and operational test issues. f. Soldier Survivability Domain. (1) The focus of this domain is to determine whether the threat of fratricide is low; whether the probability of Soldier detection, attack, and injury is low; and whether Soldier fatigue is minimal. (2) MANPRINT evaluators need to identify any problems in these areas: (a) Possible attack by other U.S. or allied units because of misidentification, weapon system design characteristics, or inadequate training. (b) Easy detect ability of the system because of its prominent physical signature. (c) Appearance of the system as a high value target. (d) Minimizing crew injury if system is attacked, system ability to protect the crew from attacking weapons and from on-board hazards (e.g., munitions, fuel), as well as the ability for rapid evacuation of the injured. (e) Reduction of physical and mental fatigue and the effect of environmental stressors on the crew. I-4. Mission Focus Evaluation Strategy Development Based on what is learned during the system level evaluation and potential mission impacts from each MANPRINT domain studied, the MANPRINT evaluator should be able to develop the focus areas to be studied during the mission level. Some system level concerns will signal focus areas that require correction to facilitate mission success. Some will not. Sound judgment will be required. The MANPRINT evaluator will have to determine which system concerns are merely shortcomings and which are significant deficiencies. All system concerns that are significant deficiencies and have a high potential to negatively impact on mission success (that is, they pass the so-what test) should be pursued at the mission focus evaluation level. Some new mission level concerns might result from operational testing and will likely cause the MANPRINT evaluator to reexamine the evaluation. a. Human Factors Engineering Domain: The focus of this domain during the mission evaluation will be to demonstrate that no aspect of the systems HFE design, based on the system evaluation focus, impedes the capability of the Soldiers to perform their mission tasks. Most of the HFE work will have been done during system development as System functionality is being ATEC Pamphlet June 2010 I-7

348 developed and bending metal is still a viable option to fix system problems. During the mission level evaluation, the MANPRINT evaluator should determine whether any of the system level problems identified remain consequential to mission success. b. Manpower Domain: The focus of this domain during the mission evaluation will be to assess whether the level of Manpower was sufficient to accomplish the mission. c. Personnel Domain: The focus of this domain during the mission evaluation will be to assess whether mission level success was reduced by characteristics inherent to the MOS and skill level of the representative test players. d. Training Domain: The focus of this domain during the mission evaluation will be to assess the impacts of any training shortcomings or deficiencies, found during the system level evaluation, on the capability of the Soldiers to perform their mission tasks and consequently the impact on mission success. e. System Safety and Health Hazards Domains: The focus of these domains during the mission evaluation will be to review mission level events to ensure that no system safety or health hazard concerns have slipped through the system development process. I-5. MANPRINT Assessment Process a. MANPRINT Assessment Overview. The MANPRINT evaluator should convene several MANPRINT Assessment Conference (MAC) meetings during the T&E process. Representatives of the AEC MANPRINT evaluator, TCM, and PM should be invited to attend. The purpose of the MAC process is to identify and correct the system level problems prior to OT, and to project the impact of uncorrected system level problems on mission performance. The MAC will propose the responsible agency for any required fixes. The plans for corrective action and demonstration of fixes will be provided to the MAC. The final recommendation on each MANPRINT problem found, and the rating of its impact on mission accomplishment will be determined by the MAC members. At the OTRR, the AST Chairman will categorize the current MANPRINT status of the system readiness for test as Green, Amber, or Red. Further MAC meetings will take place as operational test (OT) data are being released by the tester. (1) Objective. The objective of the MAC process is to ensure that all MANPRINT problems are assessed correctly and receive appropriate attention. In this way, system problems are corrected and the Soldier has the best possible equipment available to accomplish his mission. (2) Scope. Incidents addressed by the MAC include all incidents that affect the ability of Soldiers to accomplish their mission tasks to use or operate the system. These incidents may involve individual or team performance, human resources, or hazards to personnel and equipment. From the perspective of the Soldier who will use the system in combat, the best solution to any human factors problem is to engineer it out of the system. Therefore, it is especially important to include Soldier performance considerations in any list of system problems or deficiencies. Problems addressed by the MAC members can come from a variety of sources. Some may come from the MANPRINT assessment, which is conducted as part of the I-8 ATEC Pamphlet June 2010

349 operational test; some may come from system hardware performance data, and others may come from MANPRINT incident reports (MIRs). The final outcome of MAC procedures will be a list of MANPRINT problems, precisely defined and ranked in terms of severity with respect to mission accomplishment. (3) Definitions. (a) MANPRINT Incident Report (MIR): Any event that indicates a characteristic of the total system that adversely affects the ability of Soldiers to operate the system. Included are problems with Soldier performance, human resources, or hazards to Soldiers or equipment. MIRs may develop at any point along the system acquisition process. They will include developmental test (DT) and operational test (OT) test incident reports (TIRs) that are attributable to one or more MANPRINT domain. (b) Incident Description: A concise and accurate statement describing the MANPRINT incident. The statement describes human or system performance that is below standard, or a hardware or software feature that has an adverse impact upon human or system performance. (c) MANPRINT Impact Rating: The AEC MANPRINT evaluator s judgment of the likely impact of a MANPRINT problem on mission accomplishment. This impact may be direct, in terms of degradation or cessation of system performance, or indirect, in terms of immediate or long-term detrimental effects upon the crew. This rating is established using the MANPRINT Incident Impact Rating Scale. This scales problems along two dimensions of subjective judgment: severity and probability, as illustrated in figure I-1. (d) MANPRINT Deficiency: An undesirable system characteristic that most likely will degrade mission accomplishment. (Impact Ratings of IA-E, IIA-D, IIIA-C, IVA-B,VA) (e) MANPRINT Shortcoming: An undesirable system characteristic that most likely will not degrade mission accomplishment. (Impact Ratings of IIE, IIID-E, IVC-E, VB-E) ATEC Pamphlet June 2010 I-9
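As a minimal sketch of how the deficiency/shortcoming split defined above (and shown in figure I-1 below) can be applied mechanically, the following function maps a severity category (I-V) and probability letter (A-E) to a classification. The function name and interface are illustrative only, not part of the pamphlet.

# Worst (most probable) probability letter that still yields a Deficiency for
# each severity category: I A-E, II A-D, III A-C, IV A-B, V A.
_DEFICIENCY_LIMIT = {"I": "E", "II": "D", "III": "C", "IV": "B", "V": "A"}
_PROBABILITY_ORDER = "ABCDE"  # A = Usually ... E = While possible, very remote

def classify_impact(severity: str, probability: str) -> str:
    """Return 'Deficiency' or 'Shortcoming' for a severity/probability pair."""
    severity, probability = severity.upper(), probability.upper()
    if severity not in _DEFICIENCY_LIMIT or probability not in _PROBABILITY_ORDER:
        raise ValueError(f"Unknown rating: {severity}{probability}")
    limit = _DEFICIENCY_LIMIT[severity]
    if _PROBABILITY_ORDER.index(probability) <= _PROBABILITY_ORDER.index(limit):
        return "Deficiency"
    return "Shortcoming"

print(classify_impact("II", "D"))  # Deficiency (IIA-D are deficiencies)
print(classify_impact("IV", "C"))  # Shortcoming (IVC-E are shortcomings)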

350
Severity:        I            II           III          IV           V
Probability A:   Deficiency   Deficiency   Deficiency   Deficiency   Deficiency
Probability B:   Deficiency   Deficiency   Deficiency   Deficiency   Shortcoming
Probability C:   Deficiency   Deficiency   Deficiency   Shortcoming  Shortcoming
Probability D:   Deficiency   Deficiency   Shortcoming  Shortcoming  Shortcoming
Probability E:   Deficiency   Shortcoming  Shortcoming  Shortcoming  Shortcoming

Severity:
I - Catastrophic: total loss of system, or fatality of crew
II - Profound: mission failure, or severe injury of crew
III - Severe: lessened ability to perform mission or injury to crew
IV - Serious: mission performance degraded or light injury to crew
V - Slight: some system capability degraded, or unnecessary strain on crew

Probability:
A - Usually
B - Frequently
C - Occasionally
D - Rarely
E - While possible, very remote

Figure I-1. MANPRINT Impact Rating Scale
(4) Procedures: The procedures for processing MANPRINT incidents and implementing the corrective action process are explained in detail below: Step 1: MANPRINT data collectors complete information on each MANPRINT incident in a MANPRINT Incident Report (MIR) as soon as the event is observed. Step 2: MANPRINT analyst back-briefs tentative MIR findings, as appropriate, to test player Soldiers, data collectors, and other Test Directorate and Evaluation Directorate personnel in a series of group debriefings. They discuss appropriate MIRs to confirm the validity of tentative findings and to assist the MAC in prioritizing the incidents. Step 3: After the group debriefings, the MANPRINT analyst compiles all the MIRs into a MANPRINT Problem Summary Sheet. This summary sheet identifies the unique I-10 ATEC Pamphlet June 2010

351 MANPRINT problems identified and serves as a worksheet or draft of the MANPRINT Report; the summary sheet includes the following information: Unique MANPRINT problem number. Problem title. MANPRINT domain. Problem description. Problem classification (shortcoming or deficiency). Contributing causes. Probable mission task consequences. Mission impact severity rating. Potential solutions. Responsible agent. Step 4: The members of the MANPRINT Assessment Conference (MAC) examine each MANPRINT problem and assist the MANPRINT evaluator to determine whether the problem has been correctly described and verified. Step 5: The MAC chairman then determines the severity of the MANPRINT problem on mission task performance. Step 6: The MAC then proposes which agency is responsible for developing a solution, and when that solution will be presented to the MAC. Step 7: When the responsible agency presents the solution to the MAC, the mission impact will be reassessed. Step 8: When no more MANPRINT fixes are possible prior to the writing of an OAR/OMAR/OER, the AST chairman rates the severity of the outstanding problems using the MANPRINT Problem Impact Rating Scale. He then enters the severity rating in the column marked Mission Impact Rating on the MANPRINT Problem Summary Sheet. Step 9: This Summary Sheet portrays the ordering of MANPRINT problems as determined by the MANPRINT evaluator. The problems are listed in order of rated severity. For simplicity, within a given severity categories the problems are listed in the sequence of their problem numbers, which, for audit trail purposes, are never changed. Step 10: The Summary Sheet is then briefed to the MAC members to see if there are any objections to the rating. If so, the objections will be discussed and the MANPRINT evaluator may choose to change the rating. Step 11: The final version of the Summary Sheet is input to the MANPRINT portion of the OAR, OMAR, or OER. ATEC Pamphlet June 2010 I-11

352 Step 12: The evaluator also incorporates the MANPRINT problem summary findings into the OMAR, and other report documents as appropriate. b. Questionnaire Administration. Good questionnaires produce useful information. As the system evaluation proceeds, the MANPRINT analyst will want to learn the consensus view among the test players. This will indicate where to go next, whom to interview, and what to ask. To get there, the analyst will need to design questionnaires from the top down. That is, start out by asking yourself what focus areas you want the questionnaires to address. Then begin to develop relevant items that will yield objective, quantifiable data. Quantifiable data lend themselves to reduction and analyses much more readily than do open-ended subjective textbased questionnaire items. Questionnaires will usually be written by the MANPRINT evaluator and administered by the tester. For more information, refer to AEC s Evaluator Handbook. c. Structured Interviews/Debriefings. These are scripted sessions, particularly useful in a follow up to a questionnaire administration, or a MANPRINT Incident or other system test anomaly. The sessions should be conducted by a subject matter expert interviewer/debriefer. These sessions are much more labor intensive than questionnaire administration, and are best suited to gleaning information from an individual or a small group. While the questionnaires will give group consensus data, the sessions will enable attention to be focused on specific issues that require investigation in depth. d. After Action Reviews. AARs will provide an opportunity to conduct informal group debriefings with the test players. Unresolved MIRs will form the basis for most of the guided discussion. I-12 ATEC Pamphlet June 2010
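The following sketch, using made-up ratings, shows the kind of simple reduction the questionnaire paragraph above (paragraph b) has in mind for quantifiable rating-scale items: counts and a median per item rather than free-text interpretation. The item names and the 1-5 scale are assumptions made for the example.

from collections import Counter
from statistics import median

# Hypothetical 1-5 ratings (1 = very poor, 5 = very good) from eight test players.
responses = {
    "Display readability": [4, 5, 3, 4, 4, 2, 5, 4],
    "Ease of reloading":   [2, 1, 3, 2, 2, 1, 2, 3],
}

for item, ratings in responses.items():
    counts = Counter(ratings)
    summary = ", ".join(f"{r}:{counts[r]}" for r in sorted(counts))
    print(f"{item}: n={len(ratings)}, median={median(ratings)}, counts [{summary}]")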

353 Appendix J Data Analysis J-1. Purpose This appendix provides policies, responsibilities, and general procedures for development of data models and for data analysis. When ATEC Regulation 73-1 is updated to include this policy, this pamphlet will be updated to delete the policy portion of this appendix. J-2. Policy a. The AST is responsible for developing and coordinating the integrated T&E (IT&E) program for both major and non-major acquisition systems under test in support of the acquisition decision making process. This IT&E process provides a structured approach to supporting this task and data collection, management, and analysis are integral components in that process. It is the analyst s role, working in concert with the tester, and evaluator, to provide the fact based underpinnings for T&E findings and recommendations. b. Data analysis, in support of the ATEC T&E mission, is a complex, collaborative process that requires detailed planning and coordination. ATEC Regulation 73-1 defines ATEC policies and responsibilities for the T&E of Army, Joint-Service, and Multi-Service systems and forms the foundation for T&E data collection, data management, and analysis. c. As ATEC becomes increasingly more dependent on information and information technology systems to perform its T&E mission, the need to ensure unambiguous data exchange grows correspondingly. A solid, complete, well-documented data model is critical to accurate data exchange, which is a primary reason why the ATEC Regulation 73-1 requires the Evaluator, with input from all AST members, to design the system data model. The ATEC Regulation 73-1 also requires data models for Early Strategy Reviews of DOD oversight programs, for all Operational Test Agency Test Plans (OTA TPs), Concept In-Process Reviews, and dummy database validations. Solid data engineering is required to perform good data modeling and in an organization as geographically and culturally diverse as ATEC, a collaboration mechanism to perform data engineering is needed. ATEC has created the ATEC Reference Repository to support and foster data engineering collaboration throughout the command. The ATEC Reference Repository, developed in accordance with the ATEC Test Technology To-Be Architecture, will contain evolving ATEC data and metadata policy, standards, guidelines, and models for use throughout the entire command. d. The ATEC Reference Repository will contain the Common Reference Model (CRM) that specifies the shared and exchanged data within the command. It is through the use of the CRM that ATEC will realize data reuse and unambiguous information exchange throughout the command. The CRM will be developed over time and will form the basis for ATEC s data engineering and data modeling activities in support of T&E. It is ATEC s intent that the CRM be maintained through input by ATEC data modelers and subject matter experts throughout the command. The details describing development and evolution of the CRM are in ATEC Test Technology To-Be Architecture, 16 October The framework for describing metadata registries is in the International Organization for Standardization (ISO)/International Electro- ATEC Pamphlet June 2010 J-1

354 technical Commission (IEC) 11179, Information Technology-Metadata Registries (MDR), dated 15 September. Table J-1 provides a definition of the seven levels of data.

Table J-1. ATEC Data Level Definitions

Level 1 - Raw Data*
Description: Data in their original form. Results of field trials just as recorded. Raw data.
Sources: Completed data collection forms, filled questionnaires, interview transcripts, test logs. Instrumented data including audio and video recordings, data bus, position/location, etc.
Examples of Content: 1. All reported target detections. 2. Clock times of all events. 3. Impact location of each round fired. 4. Recordings of interviews.
Disposition: Accumulated during trials for processing. Usually discarded after use. Not published.

Level 2 - Processed Data**
Description: Raw data are processed. Manual data are digitized. Data are manipulated and organized. All data fields are quality controlled.+ Anomalous data are corrected. Invalid data are identified as such with supporting rationale. Data are housed in a relational database with appropriate links between independent and dependent variables, or in flat files containing all the independent and dependent variables. Calculations limited to counting and elementary arithmetic (+, -, *, /) to generate measures. All data processing completed.
Sources: Annotated and corrected data collection forms and logs. Annotated, corrected and filtered (delete/extract) instrumented data. Original raw data with no-test events identified. Level 1 data that has been processed.
Examples of Content: 1. Record of all valid detections. 2. Start and stop times of all applicable events. 3. Elapsed times. 4. Round impact locations corrected as necessary. 5. Miss distances. 6. Confirmed interview records.
Disposition: Produced during processing. Made available to the DAG for their review. Stored in electronic files. Not published.

Level 3 - Authenticated Data
Description: Data that have been authenticated by the DAG as representing the most credible source of information as to what actually occurred during the test.
Sources: Level 2 data that has undergone the authentication process.
Examples of Content: 1. Record of all valid detections. 2. Elapsed times. 3. Miss distances. 4. Confirmed interview records.
Disposition: Usually not published but made available to analysts. Stored in institutional data banks.

Level 4 - Data Findings or Summary Statistics
Description: Data that have been authenticated by the DAG as representing the most credible source of information as to what actually occurred during the test.
Sources: Level 2 data that has undergone the authentication process.
Examples of Content: 1. Record of all valid detections. 2. Elapsed times. 3. Miss distances. 4. Confirmed interview records.
Disposition: Usually not published but made available to analysts. Stored in institutional data banks.

Level 5 - Data Analysis or Inferential Statistics
Description: Data resulting from statistical tests of hypothesis or interval estimation. Execution of planned analysis. Includes both comparisons and statistical significance levels. Judgments limited to analyst's selection of techniques and significance levels.
Sources: Results of primary statistical techniques such as T-tests, Chi-square, F-test, analysis of variance, regression analysis, contingency table analyses and other associated confidence levels. Follow-on tests of hypotheses arising from results of earlier analysis, or fallback to alternate non-parametric technique when distribution of data does not support assumption of normality. Qualitative data in the form of prevailing consensus.
Examples of Content: 1. Inferred probability of detection with its confidence interval. 2. Significance of difference between two mean elapsed times. 3. Significance of difference between observed probable error and criterion threshold. 4. Magnitude of difference between categories of comments.
Disposition: Published in evaluation reports. (If the evaluation report is part of the test report, the level 5 analysis results are presented separately from the level 4 findings.)

Level 6 - Data Extended Analysis or Operations
Description: Data resulting from further analytic treatment going beyond primary statistical analysis, combination of analytic results from different sources, or exercise of simulations or models. Judgments limited to analysts' choices only.
Sources: Insertion of test data into a computational model or a combat simulation, aggregation of data from different sources observing required disciplines, curve fitting and other analytic generalization, or other operations research techniques such as application of queuing theory, inventory theory, cost analysis, or decision analysis techniques.
Examples of Content: 1. Computation of probability of hit based on target detection data from test combined with separate data on probability of hit given a detection. 2. Exercise of attrition model using empirical test times distribution. 3. Determination of whether a trend can be identified from correlation of flash base accuracy data under stated conditions from different sources. 4. Delphi technique treatment of consensus of interview comments.
Disposition: Published as appropriate in evaluation reports.

Level 7 - Data Conclusions or Evaluation
Description: Data conclusions resulting from applying evaluative military judgments to analytic results.
Sources: Stated conclusions as to issues, position statements, challenges to validity or analysis.
Examples of Content: 1. Conclusion as to whether probability of detection is adequate. 2. Conclusion as to timeliness of system performance. 3. Conclusion as to military value of flash base accuracy. 4. Conclusion as to main problems identified by interviewees.
Disposition: Published as the basic evaluative conclusions of evaluation reports.

*Assumes all data required to evaluate an oversight system has been mapped to the system evaluation data model.
**Data Processing: The converting of raw data to machine-readable form and its subsequent processing (as storing, cleaning, tabulating, updating, combining, rearranging, or printing out) by a computer.
+Quality Control: The process of validating and verifying data. Data validation ensures the data represent the system's performance during test and that the test reasonably represented the system's intended operational environment, given test limitations. Data verification ensures the data are checked for completeness, accuracy, and consistency.
ATEC Pamphlet June 2010 J-3
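As an illustration of the Level 1 to Level 2 step in table J-1 (quality control, flagging invalid records with supporting rationale, and elementary arithmetic to generate measures such as elapsed time), the following sketch processes two fabricated raw event records. The field names are invented for the example and are not an ATEC data standard.

from datetime import datetime

raw_events = [  # Level 1: clock times of events, just as recorded
    {"trial": 1, "start": "10:00:05", "stop": "10:02:35"},
    {"trial": 2, "start": "10:15:10", "stop": "10:14:00"},  # stop precedes start
]

def to_level2(events):
    """Return processed records with validity flags and elapsed-time measures."""
    processed = []
    for event in events:
        start = datetime.strptime(event["start"], "%H:%M:%S")
        stop = datetime.strptime(event["stop"], "%H:%M:%S")
        elapsed = (stop - start).total_seconds()
        record = dict(event, elapsed_s=elapsed, valid=elapsed >= 0,
                      rationale=None if elapsed >= 0 else "stop time precedes start time")
        processed.append(record)
    return processed

for rec in to_level2(raw_events):
    print(rec)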

356 e. It is strongly recommended that ERwin Data Modeler be used to develop the command's data models. f. All data models developed by ATEC, in support of assessments, will be posted and will be available for download and reuse in the ATEC Reference Repository. (1) This includes system evaluation data models designed by the Evaluator, in conjunction with the Development Test Command (DTC) Test Manager and the Operational Test Command (OTC) Test Officer, who must agree with the definitions of all data elements and data relationships within the evaluation data model. (2) In addition, the ATEC Reference Repository will include those data models developed by DTC and OTC based on the system evaluation data model. DTC Test Managers and OTC Test Officers will coordinate with the Evaluator to ensure data elements and files map directly to the system evaluation data model, to include data element names and definitions consistent with ATEC published naming and definition conventions. g. The Evaluator, DTC Test Officer, and/or OTC Test Officer are responsible for approving the data model prior to posting on the ATEC Reference Repository. h. The ATEC Evaluator's Handbook will be used as the basis for data collection, data management, and analysis for all ATEC test activities. J-3. Basic Approach to Analysis a. It is important to understand that the analysts learn from the data. Before learning can begin, the data are screened and as many outliers and anomalies as possible are resolved. This gives the analysts a clear picture as they start looking at the data. First, the analysts must understand what answer(s) they are looking for from the data. This is usually stated in the EFAs. After investigating the characteristics of the data, and once the analysts are satisfied they know what the data look like, they can start to apply statistical techniques. At the completion of the analysis, the analysts should have a good idea of the answer and what factors are important. The analysts can then compute estimates of the measures. b. In conducting the analysis, it is important to examine the data for trends and to look for supporting evidence and similar results whenever possible. This can strengthen conclusions. The analysts should make optimal use of Soldier ratings and comments, external data, models, and simulations. Sometimes small-scale simulations are more helpful in finding out what the data are saying than trying to run a big combat model. Finally, develop clear data displays that stand on their own merit and clearly explain the results of the findings. c. Analysis is based on test data, modeling and simulation, reports, studies, and other appropriate sources. The analysis involves a comparison of the system's performance observed from the test data with respect to the SEP requirements that reflect the criteria in the capability documents (ICD, CDD, CPD), COIC, EFAs, and the Critical Technical Parameters (CTPs) J-4 ATEC Pamphlet June 2010

357 contained in the TEMP. The subsequent system evaluation may also compare the test results to other valid and credible event data sources. d. It is important to know the benefit of the analysis being conducted and the value it adds to the integrated system evaluation. There will be more value and greater return on the analysis, if the analyst sticks to trying to find out what the data is saying, rather than just computing a number. J-4. Analysis Considerations The analysis, albeit not the only element, is an integral and critical part of system evaluations. Analysis is also not independent of the remaining elements of the evaluation process. The key indicators of successful analyses in support of our system evaluations are timeliness and quality. a. Timeliness is defined as being able to provide an analysis product in advance of the evaluation deliverable date. Quality is defined as the degree of excellence achieved in the final products of the analysis, as demonstrated by the following characteristics: (1) The conclusions drawn are consistent with the analysis performed. (2) The test design is balanced so available statistical software programs can perform common analysis techniques. Unbalanced designs complicate the analysis, usually leading to reduced scope or special formulas that require more time and effort by the analyst. (3) The analysis is based on the least complicated model, which explains the greatest amount of variation in the data. (4) The graphical presentation of the results are simple, clear, stand on their own merit, and intuitive to the reader. b. Timeliness necessitates the availability of adequate software tools to meet the stringent time requirements to conduct quality analyses to support the evaluation. Related to this is the necessity for timely access to the data in a form (logical data base structure) and download capability to available software tools (i.e. SAS, SAS-JMP, SAS Insight, EXCEL, etc.). c. The availability of adequate software tools allows the analysis process to be conducted much more efficiently. Streamlining does not mean that the quality of our analyses is compromised. It means that whatever analysis we do to support the evaluation process be done efficiently and effectively. d. Each analysis in support of a system evaluation will be different, however certain elements will be common to all analyses. Some of these are distributional analysis (determining the shape and the form of the data), statistical outlier analysis, testing for parametric assumptions, hypothesis testing (parametric and non-parametric), point and interval estimation, correlation, analysis of variance, multivariate analysis, multifactor contingency and correspondence analysis, cluster analysis, reliability, availability, and maintainability analysis. The order and extent to which these analyses are done depends on how many measures there are ATEC Pamphlet June 2010 J-5

358 to be addressed, how many factors there are to take into consideration, and the assumptions that have to be met. e. There are many elements of the evaluation process, which can assist in the streamlining of our analysis. Just a few of them are: (1) Minimizing the number of critical measures and factors and conditions to be analyzed. (2) Ensuring an adequate and appropriate scope is put on causality analysis. (3) Making sure that a level 3 database structure, in the form compatible with interactive analysis (SAS, SAS-JMP, etc.), is available. (4) Ensuring timely and effective integration of the test data (test data base) into the level 3 database. (5) Ensuring adequate DAG support software tools are available. (6) Instituting a timely RAM scoring and RAM database updating process. (7) Conducting interim data analysis during test execution. (8) Ensuring that a strawman Independent Evaluation Briefing and SER are prepared well in advance. (9) Making maximum use of interactive graphic software tools in the development of graphs and slides to be used for briefings and presentations. J-5. Techniques of Analysis The AST analysts identify the procedures and techniques that will be used in the analysis and evaluation in the SEP. Identification of appropriate analysis techniques are based on, for the most part, the hypothesis to be tested, the type of data to be analyzed, the number of factors and conditions under which the data is to be collected, the test design, and the assumptions about the data collected. The data we collect falls into three basic categories: continuous data, count data, and comment data. Types of continuous data might be: time measurements, error measurements, distance measurements. Types of count data might be: hits or misses, successes. Several practical analysis techniques have been applied over the years and have been known to provide credible and defensible conclusions. New analysis techniques have also been discovered and are being applied where appropriate. Some of these techniques are show below: a. Count Data/Comment Data. (1) Pareto charts. (2) Correspondence analysis with computed figures of merit for count and rating data, and trend analysis. J-6 ATEC Pamphlet June 2010

359 (3) Multi-factor contingency analysis for count data under multiple factors and conditions. (4) Binomial distribution statistical analysis and hypothesis testing. b. Continuous or Count/Comment Data. (1) Graphical analysis. (2) Non-parametric analysis (e.g., median test, sign test, rank tests). (3) Logistic regression. (4) Reliability analysis (e.g., Kaplan-Meier, reliability growth, exponential, Weibull). (5) Small-scale simulations (e.g., SIMSCRIPT, which is a simulation language, not a simulation). c. Continuous Data. (1) Control charts. (2) Goodness-of-fit testing (e.g., normal, exponential, Weibull, lognormal). (3) Hypothesis testing of means, variances, and data distribution characteristics (e.g., tests for normality, equality of variance, independence, and chi-square tests). (4) Analysis of variance. (5) General linear multiple regression. (6) Log-linear modeling. (7) Correlation and cluster analysis. (8) Availability analysis (e.g., beta distribution, equipment, distributed systems, networks). J-6. Analysis Software Tools There are several analysis software tools available to the AST analysts. The more typical software analysis tools used are SAS, SAS JMP, ProDV, EXCEL, Math CAD, and Mathematica. The POC for availability and access to these tools is the ATEC Deputy Chief of Staff for Information Management. ATEC Pamphlet June 2010 J-7
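As one concrete example of the binomial count-data analysis listed in paragraph J-5a above, and of the Level 5 "probability of detection with its confidence interval" example in table J-1, the sketch below computes a point estimate and an approximate 95 percent confidence interval from hypothetical detection counts. The Wilson score interval is used here purely for illustration; an AST analyst would normally use the procedures available in the tools listed in paragraph J-6 (e.g., SAS or JMP) or an exact method.

import math

def wilson_interval(successes: int, trials: int, z: float = 1.96):
    """Wilson score confidence interval for a binomial proportion."""
    p_hat = successes / trials
    denom = 1 + z**2 / trials
    center = (p_hat + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / trials + z**2 / (4 * trials**2)) / denom
    return center - half, center + half

detections, presentations = 42, 60   # hypothetical valid detections / target presentations
low, high = wilson_interval(detections, presentations)
print(f"P(detection) = {detections / presentations:.2f}, "
      f"95% CI ({low:.2f}, {high:.2f})")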


361 Appendix K Test Types K-1. Purpose This section provides a brief description of the approved Army test types that are performed throughout the acquisition cycle as defined in AR Figure K-1 provides examples of test types during each acquisition phase. Figure K-1. Examples of Test Types during Acquisition Phase K-2. Developmental Test Types a. Pre-Full Rate Production (FRP) Developmental Testing. Pre FRP developmental testing ranges from before program initiation to the FRP decision and includes testing funded by RDTE categories (see DOD Financial Management Regulation, volume 2B, chapter 5, for information on funding categories), as follows: ATEC Pamphlet June 2010 K-1

362 (1) Research Effort/Test. A technical effort or test conducted during pre-system acquisition to determine early technical parameters, to support the research of the item, and to provide fundamental knowledge for solutions of identified military problems. (2) Technical Feasibility Test (TFT). A TFT is typically conducted during technology development to assist in determining safety, to establish system performance specifications, and to determine feasibility of alternative concepts. Testing during this phase identifies and reduces risks in subsequent acquisition phases. This test provides data for the system evaluation that supports the MS B decision. (3) Engineering Development Test (EDT). An EDT is conducted after milestone B and before milestone C to provide data on performance and safety, the achievability of critical technical parameters, refinement and ruggedization of hardware configurations, and determination of technical risks. The EDT includes testing of compatibility and interoperability with existing or planned equipment and systems and the system effects caused by natural and induced environmental conditions. (4) Production Prove-out Test (PPT). A PPT is conducted during system acquisition (post MS-B and before production testing with prototype hardware) for the selected design alternative. (5) Production Qualification Test (PQT). The PQT is a system level test conducted post MS C that ensures design integrity over the specified operational and environmental range. It must be completed using LRIP assets, when available, prior to the FRP decision review. A PQT normally uses prototype or pre-production hardware and software fabricated to the proposed production design specifications and drawings. Such tests include contractual RAM demonstration tests required prior to production release. This test provides data for the system evaluation that supports the FRP decision. (a) The objectives of the PQT are to obtain Army confirmation that the design is stable, logistically supportable, capable of being produced efficiently, and will meet the performance and user requirements, assess the performance envelope, and determine the adequacy of any corrective action indicated by previous tests and to validate the manufacturer s facilities, procedures, and processes. (b) PQT may also include tests, which are not included in the technical data package or contract (e.g., environmental extremes, test-to-failure) when necessary to obtain engineering data for corrective action verification or other purposes. PQT may be accomplished in phases (e.g., preliminary engineering, shoot off, specific problem correction, etc.). (6) Software Qualification Test (SQT). The software SQT is essentially the same as the PQT for materiel systems and may have been designated as an SQT for C4I/IT systems in the past and should be used for grandfathered systems only. A SQT is a system level test conducted by the Army developmental tester using live data files supplemented with user prepared data and executed on target hardware. Conversion procedures and special training requirements are introduced as additional elements for verification and validation. SQT objectives are to have the Government confirm that the design will meet the performance/user requirements and to K-2 ATEC Pamphlet June 2010

363 determine the adequacy and timeliness of any corrective actions indicated by previous testing. System users participate in the technical and functional aspects of the SQT. (7) Live Fire Test (LFT). For those weapons systems required by law to undergo live fire T&E, the LFT may be conducted as part of, or in conjunction with, the PQT. The LFT demonstrates the ability of the system to provide battle resilient survivability, or the munitions to provide lethality. It will provide insights into the principal damage mechanisms and failure modes occurring as a result of the munitions/target interaction and into techniques for reducing personnel casualties or enhancing system survivability/lethality. (8) Logistic Demonstration (Log Demo). The Log Demo is a system-level test to provide data for the evaluation of the supportability of the system s design and the system support package. To evaluate the supportability of the materiel design and the preliminary system support package. A nondestructive disassembly and re-assembly of equipment is conducted on a dedicated engineering prototype before the FRP decision. During a Log Demo, the achievement of maintainability goals; the adequacy and sustainability of tools, selected test program sets, built-in test equipment, associated support equipment items, technical publications, and maintenance instructions is evaluated. The adequacy of trouble-shooting procedures; personnel skill requirements; the selection and allocation of spare parts, tools, test equipment, and tasks to appropriate maintenance levels; and the adequacy of maintenance time standards is also evaluated. (9) C4/IT Interoperability Certification Test. This test applies to all Army C4I systems having interfaces or interoperability requirements with other systems and/or other Service systems (joint interoperability certification testing). The test may consist of simple demonstrations using message analysis or parsing software with limited interface connectivity, or extend to full-scale scenario-driven exercises with all interfaces connected. The U.S. Army CECOM SEC serves as the Army Participating Test Unit Coordinator (APTUC), and in that capability, supports interoperability testing of C4I systems conducted by the DISA, JITC for system certification and re-certification. The CTSF/JITC MOA states that the CTSF is responsible for Intra-Army certification; the JITC is responsible for joint interoperability certification. In addition, JITC can certify joint interoperability at the CTSF. The CECOM SEC APTUC arranges and coordinates all Joint interoperability testing with DISA and coordinates the participation of all Army elements and systems. See JITC plan 3006, Joint Interoperability Test Plan for testing Tactical Digital Link and US Message Test Format systems. (10) Software Development Test (SDT). A SDT covers the full spectrum of tests that are inherent to software development including M&S, unit, module, integration, security, stress, conversion, software certification, and full-up system testing prior to government testing. b. Production Developmental Testing. Production developmental testing is required to verify that the requirements specified in the ORD and production contracts for hardware and software are met. It also provides test data for the system assessment required for materiel release decision, ensures the product continues to meet the prescribed requirements, and provides a baseline for post-production testing. ATEC Pamphlet June 2010 K-3

364 (1) Production Verification Test (PVT). A system-level DT conducted post-FRP to verify that the production item meets critical technical parameters and contract specifications, determine the adequacy and timeliness of any corrective action indicated by previous (pre-FRP) tests, and validate the manufacturer's facilities, procedures, and processes. This test provides data for the system evaluation for materiel release decision so the evaluator can address the adequacy of the materiel with respect to the stated requirements. The PVT will also provide a baseline for the test requirements in the TDP for post-production testing. The PVT is accomplished during the first limited production or full-scale production contract. (a) The PVT may take the form of a first article test (FAT) if such testing is required in the TDP. A FAT may be required for quality-assurance purposes to qualify a new manufacturer or procurements from a previous source out of production for an extended period of time. The FAT ensures the contractor can furnish a product that conforms to all contract requirements for acceptance. Requirements for FATs may be invoked in production contracts by citation of the applicable Federal Acquisition Regulation (FAR) first article inspection and approval clause. When a FAT is specified in a contract, it may not be waived or changed without prior approval of the head of the contracting activity. FATs may be conducted at government facilities or at contractor facilities when observed by the government. Requirements for the FAT should be consistent with those of the PQT. (b) The PVT may include tests that are not included in the data package or contract (e.g., environmental extremes, test-to-failure) when necessary to obtain engineering data for corrective action verification or to support a materiel release decision. (2) First Article Test (FAT). A FAT may be required for quality assurance purposes to qualify a new manufacturer or procurements from a previous source out of production for an extended period (usually 2 years) and to produce assemblies, components, or repair parts conforming to requirements of the technical data package. (3) Comparison Test (CPT). A CPT is a test of a randomly selected sample from production. CPT is conducted as a quality assurance measure to detect any manufacturing or quality deficiencies that have developed during volume production which may have reduced effective operation of the item or resulted in item degradation. CPT is conducted or supervised by an agent independent of the producer or by Government on-site quality assurance personnel. (4) Quality Conformance (Acceptance) Inspections. Inspections that are examinations and verification tests normally prescribed in the TDP to be performed by the contractor (subject to witnessing by the on-site Quality Assurance Representative) on the item, lots of items, or services to be offered for acceptance. These examinations may include in-process and final comparisons with technical quality characteristics required to verify that materiel meets the terms of the contract and should be accepted by the Government. (5) Post Deployment Software Support (PDSS). Tests in support of PDSS are DTs that are conducted during PDSS for software-intensive materiel systems. They parallel those tests conducted pre-FRP decision, but are usually abbreviated based on the number, magnitude, and complexity of the modifications or maintenance. Tests in support of PDSS are conducted to K-4 ATEC Pamphlet June 2010

365 assure that software modifications meet requirements, do not impair existing functions or performance, can be employed by users, and are effective and suitable. (6) Service Level Test. A Service level test (SLT) is the final preparation test prior to participating as a system under test in the joint interoperability test. The U.S. Army AMCOM SED serves as the Service level test agent for Army aviation, air, and missile defense systems. A Joint C4I interoperability certification test is conducted if major hardware and software modifications to the C4I system have been made that impact on previously established joint interface requirements. Re-certification test schemes must be developed and must be commensurate with the level of changes involved in both the C4I system and the systems with which it must interoperate. The CECOM SEC APTUC arranges, coordinates, and participates in all Joint interoperability testing with DISA, JITC, and coordinates the participation of all Army elements and systems. The U.S. Army AMCOM SED interfaces with the CECOM SEC to plan and schedule the Army aviation, air, and missile defense system participation in Joint C4I interoperability certification testing. (7) C4/IT Interoperability Recertification Test. This test is conducted if major hardware and software modifications to the C4/IT system have been made that impact on previously established interface requirements. A service level test must be conducted prior to the joint interoperability re-certification test. Re-certification test schemes must be developed and must be commensurate with the level of changes involved in both the C4/IT system and the systems with which it must inter-operate. The CTSF is responsible for intra-army re-certification, and the CECOM SEC APTUC arranges and coordinates all Joint interoperability testing with DISA, JITC, and coordinates the participation of all Army elements and systems. c. Post-production Developmental Testing. Post-production developmental testing is conducted to measure the ability of materiel in the field, in storage, and following maintenance actions (reworked, repaired, renovated, rebuilt, or overhauled) to meet user s requirements (for example, conform to specified quality, reliability, safety, and operational performance standards). (1) Surveillance/Stockpile Reliability Tests. Surveillance/stockpile reliability tests include destructive and nondestructive tests of materiel in the field, depot, or extreme environmental locations. They are conducted to determine suitability of fielded or stored materiel for use, evaluate the effects of environments, measure deterioration, identify failure modes, and establish or predict service and storage life. They may be at the component-throughsystem level. System level programs may include dedicated hardware allocated for this purpose, fielded materiel, or supplies in storage. Libraries of component parts to provide a baseline for subsequent surveillance test data comparisons may be established at contractor or government facilities. (2) Reconditioning Tests. Criteria for reconditioning tests will be incorporated in depot maintenance work requirements (DMWR), modification work orders (MWO), technical manuals (TM), technical bulletins (TB), and contracts. Reconditioning tests fall into five types, the most important of which are pilot and initial. ATEC Pamphlet June 2010 K-5

366 (a) Pilot reconditioning tests are conducted to demonstrate the adequacy of the documented technical requirements, processes, facilities, equipment, and materiel that will be used during volume reconditioning activities. The pilot model will be reconditioned in strict accordance with DMWR, MWO, TM, TB, and contracts. Pilot reconditioning tests will be applied when DMWR, TMs, or TBs, are used the first time or when major changes are made. (b) Initial reconditioning tests are conducted to demonstrate the quality of the materiel when reconditioned under volume (rate) procedures and practices. These tests relate to the PVT during production. (c) Control Tests are conducted on random items from volume reconditioning operations to verify that the process is still producing satisfactory materiel. Criteria should be the same as for initial reconditioning tests. These tests relate to CPTs during production. (d) Acceptance Tests are conducted on in-process materiel and at the completion of reconditioning activities, and provide data upon which an accept/reject decision is based. (e) Baseline Evaluation Tests (BETs) are conducted simultaneously on reconditioned and new production materiel of the same configuration to provide a comparison of performance and to determine the degree of re-conditioning required. This test will be considered when the item is being reconditioned for the first time, when significant modifications affecting performance are incorporated, or to provide data upon which to base a decision on upgrading versus new procurement. d. As Required Developmental Testing. All as-required testing requirements that pertain to a specific system acquisition will be incorporated into the TEMP. e. Foreign Comparative Testing (FCT). FCT involves T&E of North Atlantic Treaty Organization (NATO) allies defense equipment. The principal objective of the FCT program is to leverage NDI of allied and friendly nations to satisfy DOD component requirements to correct mission area shortcomings. The process is dependent on a developed foreign item, user interest, a valid requirement, good procurement potential, and a successful evaluation. The FCT program is authorized by Title 10, U.S. Code, Section 2350a. Guidance can be found in DOD M 2, the DOD Federal Acquisition Regulation (FAR) supplement, and the Defense Acquisition Guidebook. K-3. Operational Test Types a. Pre-FRP Operational Testing. Operational testing for systems normally occurs from program initiation (MS B) to the FRP decision review. Pre-FRP test requirements will be incorporated into the TEMP in support of MS B or C. The types of OT conducted prior to FRP are EUT, LUT, and IOT and are defined below. Pre-FRP testing for information systems is typically conducted during the development phase and normally consists of an IOT and a LUT. The following items constitute user pre-frp tests: (1) Early User Test (EUT). An EUT is conducted before milestone B with RDTE funds. During an EUT, procedures that are described for IOT and modified as necessary by maturity or K-6 ATEC Pamphlet June 2010

367 availability of test systems and support packages may be used. During an EUT, solutions to known issues that must be addressed before milestone B are sought. (2) Limited User Test (LUT). The LUT is any type of operational test, conducted to reduce system risk, other than the IOT. LUTs normally are held between milestone B and FRP decision reviews to address a limited number of operational issues. The LUT may be conducted to provide a data source for operational assessments in support of LRIP decisions and for reviews conducted before IOT. The LUT may be conducted to verify fixes to problems discovered in IOT that must be verified prior to an FRP decision (that is, the fixes are of such importance that verification cannot be deferred to the FOT). The LUT will not be used to circumvent requirements for an IOT before a full-production decision; segment an IOT through a series of limited objective tests; or replace planned FDTE that can address the required issues. (3) LUT (Information Systems). The pre-FRP LUT for battlefield automated systems (BAS) and information systems is similar in purpose to the LUT for materiel systems. (a) It is an operational test conducted usually between milestone B and the IOT and normally addresses a limited number of operational issues. The LUT may be conducted to provide a data source for operational assessments in support of acceptance of hardware and commercial off-the-shelf software for operational test-beds and interim blocks of software functionality prior to the IOT. (b) For information systems developed under the evolutionary acquisition strategy, testing is required for each block or software release post-IOT. A LUT, scheduled for each subsequent increment or block, will address the effectiveness, suitability, and survivability and include regression testing for previously fielded blocks. Additional LUTs may be conducted to verify fixes to problems discovered in IOT. The LUT will not be used to circumvent requirements for an IOT as prescribed by DOD directives and the law. (4) Initial Operational Test (IOT). The IOT is a field test required by Title 10, U.S. Code, Section 2399. It is conducted under realistic operational conditions with a production or production-representative system (or key component of such a system) to determine its operational effectiveness and operational suitability for use by typical users in combat or when otherwise deployed. The IOT environment is as operationally realistic as possible and includes use of realistic threats. Typical users operate and maintain the system under conditions simulating actual deployment conditions. (5) IOT (Information Systems). The pre-FRP IOT for information systems is similar in purpose to the IOT for materiel systems. In addition, the IOT uses a production database and is executed on target hardware. The test should include the conversion, training, and software verification and validation processes, and ensure that the system meets the collective user and proponent needs, can be operated by users, and is ready for deployment. b. Post-FRP Operational Testing. Operational testing in the production and deployment phase supports both development and fielding subsequent to the IOT, and supports PDSS for information systems. ATEC Pamphlet June 2010 K-7

368 (1) Follow-On Operational Test (FOT). An FOT is an operational test that may be necessary during or after the production phase to refine the estimates made during IOT, provide data to evaluate changes, and verify that deficiencies in materiel, training, or concepts have been corrected. FOT may also provide data to ensure that the system continues to meet operational needs and that it retains its effectiveness in a new environment or against a new threat. For software-intensive systems, the FOT typically serves as the operational test in support of PDSS. (2) User Acceptance Test (UAT) (Information Technology). The functional proponent or Capabilities Developer will conduct a UAT for systems that support PDSS. For systems that have both a functional proponent and a Capabilities Developer, the functional proponent will conduct the UAT. The UAT is limited in scope relative to an FOT and is conducted primarily to verify the functionality of the changes to the information technology in the user environment. c. As Required Testing. All as-required testing requirements that pertain to a specific system acquisition will be incorporated into the TEMP. (1) Customer Test (CT). A CT is a test conducted by ATEC for a requesting agency external to ATEC. The requesting agency coordinates support requirements and provides funds and guidance for the test. It is not directly responsive to Army program objectives and is not scheduled for approval by the TSARC unless external sources are required for test support. It can provide valuable information throughout the acquisition cycle on safety, performance, reliability, and military utility for developers, contractors, evaluators, and users. AR 73-1, T&E Policy, states that a CT will not be used to circumvent requirements for an IOT. This applies to all programs regardless of ACAT. (2) Supplemental Site Test (SST) (Information System). An SST may be necessary for information systems that execute in multiple hardware and operating system environments if there are differences between user locations that could affect performance or suitability. The SST supplements the IOT and UAT. K-8 ATEC Pamphlet June 2010

369 Appendix L ATEC Test Facilities L-1. General a. ATEC is geographically dispersed in 23 locations throughout the United States. Each location has unique capabilities suited to its particular assigned T&E mission. This appendix provides a brief description of DTC test facilities and OTC test directorates. A detailed listing of DTC s test center test capabilities is provided in table L-1, while table L-2 provides a synopsis of OTC s core competencies. Additional detailed information may be obtained from the specific test facility/organization or its parent command. Internet links to each test command can be found on the ATEC homepage at b. The Test Facilities Register (TESTFACS) also provides detailed information on existing major test facilities, instrumentation, and test equipment. TESTFACS is maintained by the HQ ATEC Instrumentation Division (CSTE-OP-IN), and is available on CD. L-2. DTC a. DTC is headquartered at Aberdeen Proving Ground, Maryland. The DTC HQ Test Management Directorate has primary responsibility for the conduct of DTC s test mission. The directorate has four test divisions. b. The test divisions are responsible (within their commodity areas) for managing the planning, conducting, and reporting of developmental tests; interfacing with the test customers to ensure test strategies are sound; and verifying that all system safety and health issues of systems under test are identified. The test managers in these divisions represent DTC on the AST and the T&E WIPT. c. DTC s test centers are located at various sites throughout the world and provide land, ranges, test courses, climate, instrumentation, and facilities to test military hardware of all types under controlled conditions and across the full spectrum of man-made and natural environments. A brief discussion of the test centers follows: L-3. DTC s Test Centers a. Aberdeen Test Center (ATC) at Aberdeen Proving Ground (APG). ATC is located on Aberdeen Proving Ground, which is situated on the Chesapeake Bay in Maryland, approximately 60 miles northeast of Washington, D.C. It is a Major Range and Test Facility Base (MRTFB) boasting a multipurpose test center with diverse capabilities and the Army s Center of Excellence for congressionally mandated live-fire vulnerability and lethality testing. (l) ATC provides a single location where combat systems can be subjected to a full range of tests from automotive endurance and full weapons performance through induced environmental extremes, developmental testing of command, control, communications, and computers (C4) to full-scale live fire vulnerability/survivability/lethality testing using an extensive array of test ranges and facilities, simulators, and models. Testing is conducted on ATEC Pamphlet June 2010 L-1

370 both full systems and system components and includes armored vehicles, guns, ammunition, trucks, bridges, generators, material handling equipment (MHE), night vision devices, individual equipment such as boots, uniforms, and helmets, and surface and underwater naval systems. (2) ATC offers numerous exterior and interior firing ranges, world renowned automotive courses, chambers simulating various environmental conditions, two underwater explosion ponds, sophisticated non-destructive test facilities, multifunctional laboratories, the Phillips Army Airfield, and an extensive industrial complex which includes maintenance and experimental fabrication capabilities. Ammunition is prepared in on-site ammunition plants to meet customer needs. Experienced personnel also conduct and/or support tests at other locations throughout the world with extensive mobile instrumentation. b. West Desert Test Center (WDTC). WDTC is located at Dugway Proving Ground (DPG) approximately 75 miles southwest of Salt Lake City, Utah, in the Great Salt Lake Desert. This remote, isolated installation serves as the Defense Department s primary chemical and biological defense testing center. (1) WDTC conducts exploratory and developmental tests of chemical and biological defense systems, smoke and obscurant munitions and delivery systems, and incendiary devices. Testing is also conducted on all materiel commodities to assess chemical/biological hardness and contamination/decontamination survivability. (2) The WDTC facilities include indoor laboratories and test chambers, as well as outdoor test sites and extensively instrumented test grids for use with simulants. State-of-the-art chemical testing facilities support indoor testing of large-scale military vehicles and aircraft in hazardous environments as well as simulant-only testing. The Life Sciences Test Facility has the only chamber in the United States designed to test against potentially lethal agents in aerosol form. Other facilities allow testers to evaluate the environmental results from open burning and open detonation, accurately replicating real-world disposal operations. The WDTC range also includes extensive mortar and artillery firing ranges for testing smoke and illumination rounds. c. Redstone Test Center (Provisional)(RTC (P)). RTC (P) is located on Redstone Arsenal in northern Alabama, adjacent to the high technology community of Huntsville. It is an Army leader in aviation systems, aviation components, and rocket and missile systems and components testing. (1) RTC (P) conducts performance, quality assurance, and reliability testing of small rockets, missiles, rocket and missile components, and associated hardware. It is unique in its ability to test electrical, electro-optical, mechanical, and explosive components for product assurance, and verify component, subsystem, and system performance before committing to flight testing. RTC also conducts intelligence, surveillance, and reconnaissance (ISR) testing on biometrics and surveillance/reconnaissance systems/subsystems. All types of natural and operationally induced dynamic, environmental, and electromagnetic testing can also be performed. RTC (P) is also a center of excellence for testing lightning s effects on munitions and ordnance. L-2 ATEC Pamphlet June 2010

371 (2) Located in the foothills of the Appalachian Mountains, the RTC (P) open-air ranges provide an uncluttered environment that is highly instrumented. Facilities include fully instrumented flight ranges, dynamic warhead test sled tracks, static rocket motor test stands and a full range of dynamic, climatic, electromagnetic and lightning facilities for testing missiles and weapon systems. Highly automated laboratory facilities are available for testing all types of weapons components and subsystems under realistic climatic and dynamic conditions. RTC (P) tests the airworthiness of Army aircraft components and subsystems for safety, qualification, and reliability. It also operates the Army s largest rocket motor static test facility. (3) With the BRAC consolidation of the former Aviation Technical Test Center (ATTC) and Redstone Technical Test Center (RTTC), the resulting RTC (P) conducts developmental flight-testing and airworthiness qualification testing on subsonic fixed- and rotary-wing aircraft, aircraft systems and subsystems, and aviation support equipment. Flight-testing focuses on assessing system performance, system integration with the aircraft and other installed systems, system safety, Soldier/machine interface, human factors engineering, and logistics supportability. Airworthiness qualification testing, which is performed by experimental test pilots, assesses the flight characteristics and handling qualities of the aerial vehicle and its in-flight performance. Because of the test mobility inherent to aviation, RTC (P) has the capability to conduct extensive testing at off-site locations throughout the continental U.S., where specific test capabilities or climatic conditions are required. (4) RTC (P) maintains a fleet of test bed aircraft, representing the Army s fielded aviation systems. The one-of-a kind Helicopter Icing Spray System allows RTC (P) to evaluate airframe icing characteristics and deicing/anti-icing system performance in artificial icing conditions.. (5) With the BRAC consolidation and the associated gain of nearly 50 years of experience in the field of aviation developmental testing, RTC (P) is a highly flexible test organization that provides a high degree of test mobility on the total integrated aviation system. d. White Sands Test Center (WSTC). WSTC is located at White Sands Missile Range in the Tularosa Basin in south central New Mexico, near the communities of El Paso, Las Cruces, and Fort Bliss, Texas. At 2.2 million acres, it is the DOD s largest overland test range. (1) WSTC is primarily a missile range for testing ballistic and guided missiles, and air defense systems, but it also supports a variety of testing needs. These include the full range of electromagnetic effects and nuclear environments testing; artillery and associated command and control systems; aircraft (fixed-wing) armament; and temperature, shock, and vibration effects. Due to its expansive size, WSTC provides the opportunity for post-test analysis on recovered debris. (2) WSTC has more than 1,500 precisely surveyed instrumentation sites with high-speed cameras, tracking telescopes, interferometer systems, and radar and telemetry tracking/receiving stations to collect data during testing. Laboratory facilities include environments, weapon systems simulation, guidance and control, propulsion, climatic, metallographic and microbiological. The Lightning Test Facility provides direct and near strike capability for systems under test. 
In addition to on-post missile and rocket launch sites, the range has ATEC Pamphlet June 2010 L-3

372 developed facilities in New Mexico, Utah and Idaho for long-range firings which impact on WSTC. e. Electronic Proving Ground (EPG). EPG is located on Fort Huachuca, in southwestern Arizona near the foothills of the Huachuca Mountains. EPG also has field offices at Fort Lewis, Washington, and Fort Hood, Texas. (1) EPG conducts developmental testing of intelligence, surveillance, and reconnaissance (ISR) equipment and systems. It also conducts tests on electronic warfare, optical/electrooptical, unmanned/micro aerial vehicles, global positioning systems, aircraft navigation and avionics systems and biometrics and target acquisition architecture. Test capabilities include the full spectrum of electronics testing from tests of subsystems such as antennas, transceivers, or switches to the entire system and testing of command, control, communications, computers (C4). EPG has the capability to perform electromagnetic compatibility and vulnerability analyses of tactical electronic equipment and systems to include generation of realistic friendly and enemy electromagnetic battlefield environments. Instrumented range services include video and telemetry tracking, position location via radar and position location systems, air surveillance and tracking, and meteorological monitoring. (2) EPG maintains a full-service, highly instrumented test range and can track and collect data from all types of air and ground systems. Facilities include an electromagnetic environmental test facility, environmental chambers, a stress loading facility to measure the full load performance of communication systems, an EMI/EMC/TEMPEST test facility, and many unique, specialized facilities for testing of antennas, radar, unmanned aerial vehicles, and computer software. The surrounding mountain ranges create a natural and effective barrier to outside electromagnetic interference and allow the unrestricted use of a wide range of frequencies. f. Yuma Test Center (YTC). YTC is located at Yuma Proving Ground in southeastern Arizona in the Sonora Desert, approximately 24 miles northeast of the city of Yuma. In addition to its desert environment test mission, YTC also has both the Cold Regions Test Center (CRTC) and Tropic Regions Test Center (TRTC) and their corresponding test missions. It is one of a few places in the continental United States that can test military munitions and hardware in an area removed from urban encroachment, and free from noise concerns. At YTC, the combination of extensive ranges, priority on airspace use, unencumbered natural environment testing, and strong, mission oriented workforces, ensures that military equipment performs as required under the extreme climates to which it is likely to be exposed when deployed. (1) YTC is a general purpose proving ground with the additional mission of desert natural environment testing and functions as a DOD Major Range and Test Facility Base (MRTFB). YTC is located within a road, rail, and air network, offering rapid access to its testing and training areas. YTC has priority of use on the seven Restricted Airspace Areas overlying its range area and the King of Arizona (KOFA) Game Range. It includes five major types of landscape, characterized as rugged mountains, moderately rugged mountains, rugged hills, alluvial fans, and alluvial aprons and plains. The proving ground hosts the Military Free Fall School, Winter Training for the Army s Golden Knights, and numerous Army and USMC units during training exercises. 
YTC is divided into two major range areas, the KOFA Firing Range L-4 ATEC Pamphlet June 2010

373 Complex and the Cibola Range, with desert environment and desert automotive testing balanced between the two. KOFA Range is an integrated test complex for open-air testing of direct fire weapons, artillery, mortars, mines and countermines, demolitions, and small missiles. The KOFA ranges are named after the KOFA Mountains. Cibola Range is the most highly instrumented rotary-wing aircraft range in the United States. (2) The YTC desert environment and desert automotive test facilities provide the ideal location for testing individual and Soldier support equipment and automotive systems and components under harsh desert conditions. There are eight special desert terrain test courses, prepared test slopes and obstacles, and a two-and-one-half-mile paved dynamometer course available for automotive testing. These are backed by vehicle fording basins, swim testing facilities, fuel and lubricant testing, and instrumentation capabilities available for wheeled and tracked vehicles. The Mid-East test course is a grueling 22-mile desert terrain course that simulates conditions found in the world's deserts. The course is so rugged that an experienced driver will take up to 4 hours to complete it. (3) YTC's National Counterterrorism/Counterinsurgency Integrated Test and Evaluation Center (NACCITEC) and the Joint Experimental Range Complexes (JERC-1, JERC-2, and JERC-3) boast proven expertise in testing electronic countermeasures that defeat improvised explosive devices. g. Cold Regions Test Center (CRTC). The CRTC operates as a component of YTC and is headquartered at Fort Greely, Alaska. The Army Scientific and Advisory Panel Ad Hoc Group on Climatic Testing concluded that CRTC was an irreplaceable asset of primary utility for operational testing and recommended arctic natural environment testing for complete systems and components testing in cold chambers. (1) CRTC offers a full range of test capabilities and professional expertise for temperate, Basic Cold (-5°F to -25°F), and Cold (-25°F to -50°F) natural environment testing for Army systems. These include combat and tactical vehicles, infantry and special operations weapons, ammunition, missiles, clothing and individual equipment, power generation and decontamination equipment, and direct and indirect fire weapons. It operates over 670,000 acres of range, and almost all forms of individual sub-arctic environments (to include rugged mountains, tundra, glacial stream beds, deep forest, and snow and ice fields) are available within 50 miles of Fort Greely. CRTC is the only U.S. test site that realistically combines the elements of a winter battlefield with a test season long and cold enough to guarantee suitable test conditions. The winter test window runs from October to March, with the coldest temperatures usually experienced in December and January. Temperate testing, approximating the Northern European climate, is available from April through September. CRTC retains priority of use on airspace overlying its test ranges. (2) CRTC testing is centered at the Bolio Lake Test Complex, a forested bowl-like depression capable of accommodating test and test support operations. The complex contains the cold-start engine test facility as well as large maintenance and storage facilities. CRTC also has eight firing ranges for testing small arms, direct fire, artillery, small missiles, and explosives. Military Operations in Urban Terrain (MOUT) facilities are available at Fort Richardson for testing small arms. ATEC Pamphlet June 2010 L-5
Allen Army Airfield at Fort Greely provides air delivery capabilities and ATEC Pamphlet June 2010 L-5

374 regularly hosts large-scale arctic training operations. The CRTC staff provides a critical nucleus of extreme cold weather test and operations experience. The CRTC staff is augmented by safari teams and engineering and instrumentation capabilities from YTC. h. Tropic Regions Test Center (TRTC). TRTC operates as a component of YTC and is also headquartered there. Its test facilities and ranges are located in Hawaii, Panama, Suriname, and other tropic areas within Central and South America. This test center maintains an array of test areas in a variety of settings including tropic forest, jungle, open lands, and coastal environments. TRTC conducts humid tropic tests on a wide variety of military systems, materials, weapons, and equipment of all conceivable types, sizes, configurations, and uses, to determine the effects of tropic conditions on materiel, Soldier performance, and reliability. The combined factors of heat, humidity, solar radiation, insects, fungus, bacteria, and rainfall can quickly reduce the performance of both Soldier and machine and corrode materials beyond utility. The Army Research Office (ARO) study performed to validate tropic test sites indicates that Hawaii meets many of the tropic conditions required; however, certain tests, notably those dealing with sensors and communications systems, require extreme conditions such as those found in the Republic of Panama. For this reason, YTC has worked through the State Department and negotiated Cooperative Research and Development Agreements with Panamanian universities for testing and research on sensors, communications equipment, and medical operations. i. DTC s Test Capabilities. Table L-1 provides a detailed listing of DTC s test center test capabilities and is provided as a reference for test planning. Table L-1. DTC s Test Center Test Capabilities Capability ATC DPG EPG RTC WSMR YPG Active Protection Systems R P S1 Air Delivery Systems/Air Drop S1 S1 P Air/Missile Defense Systems S1 P S1 Aircraft Systems Airworthiness P Aviation Subsystems and Components P Aviation Platforms and Aviation Life Support Equipment P Aircraft Survivability Equipment (ASE) S1 P S1 Aircraft Armaments - Fixed Wing P Aircraft Armaments and Armament Systems Rotary - Integration P Aircraft Armaments and Armament Systems Integration, Rotary - Firing S1 P Aircraft Armaments - Component Level P Avionics (Navigation, Communication, and Pilotage) S1 P Propulsion Systems - Ground Test P Propulsion Systems - Aircraft Test P Antenna Testing P S1 S2 ATIRS/ADACS Development and Maintenance P L-6 ATEC Pamphlet June 2010

375 Automotive Vehicles Capability ATC DPG EPG RTC WSMR YPG System Level and Driveline Components P R Electrical System/Software Performance P R Stability, Handling, Response and Control P R Turret and Weapon Systems P S1 R C4 (Command, Control, Communications, and Computers) Automotive/Vehicle Systems P R S1 S1 Aviation Systems S1 P S2 Missile Systems S1 S1 P Navigation Systems P R S1 S1 S1 Systems Components P R S1 Information Assurance P R S2 Network P R S2 Chemical and Biological Defense Decontamination Detection Chemical and Biological Individual/Collective Protection Contamination Avoidance Directed Energy Weapons S1 P Direct Fire Systems (Non Missile/Rocket) Shoulder Fired Weapons (Non Missile/Rocket) P S1 Small Arms Systems Direct Fire Weapons Performance P S1 Fire Control P S1 Direct Fire Munitions Performance/Acceptance S1 P Electromagnetic Environmental Effects (E3) Electromagnetic Interference(EMI) Electromagnetic Compatibility (EMC) External Electromagnetic Environment (EME) Air Systems S1 P S1 Aviation Safety of Flight (ADS-37A) P Command, Control, Communications and Computers P R S1 ISR S1 P Ground Systems S1 S1 S1 P Lightning P S1 Electromagnetic Pulse (EMP) S1 P Electro-Static Charge Control/ Precipitation Static (P-Static) Measurement and Analysis P S1 Hazard of Electromagnetic Radiation to Ordnance (HERO) Measurement and Analysis P R Hazard of Electromagnetic Radiation to Fuel (HERF) Measurement and Analysis R P R Hazard of Electromagnetic Radiation to Personnel (HERP) Measurement and Analysis R R P R Electrical Bonding R R R R External Grounds R R R R Emission Control (EMCON) R R R R P P P P P ATEC Pamphlet June 2010 L-7

376 Capability ATC DPG EPG RTC WSMR YPG Electronic Countermeasures - IED S1 S1 S1 S1 P Electronic Warfare P S1 S1 Emissions Characterization (Non -RF) P S1 Engineering Equipment Construction/Material Handling Equipment (MHE) P R Mine Detection/Neutralization Capabilities S1 S2 P Bridging Systems P S2 Demolition/Munitions S1 P Watercraft and Marine Systems P S2 UXO Detection Systems and Technology P S1 Environmental Mitigation Technologies Extreme Natural Environments S1 P Indirect Fire Systems Mortars S1 P Indirect Fire Weapon Systems S1 S1 P Indirect Fire Munitions Performance/Acceptance S1 P ISR (Intelligence, Surveillance and Reconnaissance Systems) Intelligence/C2 Systems P R R Biometrics S1 P Surveillance/Reconnaissance Systems/ Subsystems S2 R P S1 Target Acquisition Architectures (Infrared Electro Optical Sensors, Radar) S1 S1 S1 P S1 Littoral Warfare P Meteorological Technology Development Electronic Countermeasures - IED S1 S1 S1 S1 P Electronic Warfare P S1 S1 Emissions Characterization (Non -RF) P S1 Engineering Equipment Construction/Material Handling Equipment (MHE) P R Mine Detection/Neutralization Capabilities S1 S2 P Bridging Systems P S2 Demolition/Munitions S1 P Watercraft and Marine Systems P S2 UXO Detection Systems and Technology P S1 Environmental Mitigation Technologies Extreme Natural Environments S1 P Indirect Fire Systems Mortars S1 P Indirect Fire Weapon Systems S1 S1 P Indirect Fire Munitions Performance/Acceptance S1 P ISR (Intelligence, Surveillance and Reconnaissance Systems) Intelligence/C2 Systems P R R Biometrics S1 P Surveillance/Reconnaissance Systems/ Subsystems S2 R P S1 P P P L-8 ATEC Pamphlet June 2010

377 Capability ATC DPG EPG RTC WSMR YPG Target Acquisition Architectures (Infrared Electro Optical Sensors, Radar) Littoral Warfare Meteorological Technology Development Missiles/Rockets S1 S1 S1 P S1 Line-of-Sight Missiles S2 P R S2 Non Line-of-Sight Missiles S1 P Missile/Rocket - Propulsion Systems P S1 Components/Subsystems (Warheads Fusing, Guidance/Seeker, etc.) P S1 Non-lethal Weapons P S1 S2 Nuclear Weapons Effects Optical and Electro-Optical Systems (e.g., Gun Sights; FLIRs; NVGs, Target Acquisition, etc.) Smoke and Obscurants SOFIMS P P R P S1 S1 Smoke and Obscurants Effectiveness S2 P S2 Smoke Generation S2 P S2 Soldier Systems Clothing and Individual Equipment P S1 S1 General Support Equipment (Tents, Power Generation Equip, Petroleum and Water Systems, Compressors, Pumps, Welding Equipment, Army P S1 Medical Systems, and Force Sustainment Systems e.g., Kitchen, Latrine, Laundry) Systems of Systems Integration ABCTM Level Live S1 S1 S1 S1 P S1 Distributed Testing - Distributed Test Control Center R R R R R Distributed Testing - Inter-Range Control Center TEMPEST P S2 Transportability Helicopter External Airlift P S1 R Lift/Tie Down/Rail Impact P R S2 Unmanned Aircraft Systems (UAS) Performance S2 S2 S2 P S2 R Weapons Integration S1 S1 P Unmanned Ground Vehicles (UGV) P S1 S2 R Vulnerability/Lethality Live Fire Vulnerability (Title 10) Armor Performance Fire Suppression Systems Live Fire Lethality (Title 10) P S1 S1 Ballistics and Ballistic Effects Blast and Blast Effects Underwater Shock Soldier Survivability P P P P P P P P P P ATEC Pamphlet June 2010 L-9

378 Advisor/Consultative Support to RDA Community Common Functions Counter-Terrorism/Counter Insurgency Systems and Technologies Develop, Acquire, and Maintain Instrumentation, Facilities, and Personnel for Assigned Test Capabilities Exploitation of Foreign Weapons Systems Fabrication Facilities/Capabilities Force Protection Systems and Integration Homeland Defense Technologies Human Factors Engineering/MANPRINT Instrumentation Development Marketing Meteorological Measurement MIL-STD-810 Environmental Testing Mobile Technical Test Services (Safari) Modeling and Simulation Prepare Test Documents, Designs, Plans and Reports Range Sustainment Reliability, Availability, Maintainability (RAM) Review/Critique Materiel Requirements Documents Safety Confirmation Recommendations Safety Release Recommendations Stewards of Natural Environment Support Operational Tests (OT) Support to Service/Joint Training System Safety Test Expertise and Representation for Committees, Boards, Symposia, Scientific/Engineering Conventions Test Planning, Execution, Reporting, and Analyses Test Policy Coordination Capability Definitions: Primary (P) - Lead test center based on depth of expertise and technical facilities Reinforcing (R) - Essential capability to meet capacity/surge needs which exceed those at the primary test center or facilitate on-site testing for commodity-specific items Supplemental (S1)- Specific capability or capability set not available at other test centers L-4. OTC a. OTC is headquartered at Fort Hood, Texas. OTC test directorates and supporting offices are located throughout the continental United States to best support test and evaluation requirements and provide for maximum utilization of command resources. Five of the command s test directorates are co-located with the headquarters. Forward test directorates are located in support of branch or functional requirements at Fort Bliss, Texas; Fort Bragg, North Carolina; Fort Huachuca, Arizona; and Fort Sill, Oklahoma. b. OTC performs a variety of test activities in support of ATEC and the Army mission. Operational testing and experimentation is conducted to support development of Army L-10 ATEC Pamphlet June 2010

379 acquisition programs and fielding of all types of equipment, weapons, command and control systems, aviation, and support systems within the Army. Force development test and experimentation is conducted to support development or refinement of non-materiel requirements such as training, doctrine, and tactical aspects of Army requirements. Additionally, OTC conducts or supports completion of two other major types of testing. Customer tests are performed to meet customer specific requirements and provide needed information for program progress or modifications. Army warfighting experiments (AWEs), advanced technology demonstrations, and advanced concept technology demonstrations are conducted to provide early identification of potential of concepts or processes. c. The majority of OTC executed tests and experiments are conducted at the locations of the assigned player units for the specific event. OTC will normally deploy the test team and supporting personnel, instrumentation, and other required support to the player unit installation or other Army Test Schedule and Review Committee (TSARC) designated location for test conduct. As a result of this methodology, OTC does not operate major fixed test facilities and ranges other than small ranges located on installations other than Fort Hood where test directorates are located. OTC instrumentation and model and simulation systems are, in most cases, also designed to be transportable to the test site for test support. Government and contractor personnel are also deployed with required systems for simulation or stimulation of the system under test (SUT) and to collect required test data for evaluation requirements. Training of player unit and data collection personnel on the operations of the SUT also normally occurs at the player unit test site. d. Historical data of OTC (and predecessor organizations) conducted events depict a broad range of testing across most of the world and with almost every conceivable type of Army related equipment or materiel. Command test personnel have deployed to Korea, the majority of countries in Europe, Egypt, Israel, Canada, and Australia, and to most of the 50 states, including Hawaii and Alaska, to Army, Air Force, Navy, and Marine Corps installations. Over the past 30 years, OTC has conducted some type of test at almost every Army installation in the continental United States and at many installations operated by the other services, as well as numerous joint or multi-service operational tests in coordination with the other services. e. While the overall methodology of operations is similar within the OTC test directorates, the areas of expertise and/or types of equipment and materiel tested vary by test directorate. The following paragraphs provide additional information on the capabilities and functions of the OTC test directorates. L-5. OTC s Test Directorates a. Airborne and Special Operations Test Directorate (ABNSOTD). ABSOTD is located at Fort Bragg, North Carolina. ABNSOTD conducts operational tests and field experiments involving doctrine, training, organization, and materiel relating to airborne equipment, procedures, and systems. This includes aerial delivery and transportation items in support of air movement operations; equipment/materiel intended for air movement in aircraft from all services of the U.S. military; and materiel, procedures and training related to Special Operations forces. ATEC Pamphlet June 2010 L-11

380 b. Aviation Test Directorate (AVTD). AVTD is located at Fort Hood, Texas, and conducts operational tests and field experiments of manned and unmanned aviation systems, subsystems and ancillary equipment. It consists of two divisions; the Cargo/Lift Test Division, which plans and conducts testing on cargo and utility helicopters, and fixed wing aircraft and the Attack/Recon Test Division, which plans and conducts testing of reconnaissance and attack helicopters, tactical trainers, aviation countermeasure programs, and unmanned aerial systems. c. Battle Command and Communications Test Directorate (BCCTD). BCCTD is located at Fort Hood, Texas, and was formerly known as the Command, Control, Communications, and Computers Test Directorate (C4TD). BCCTD conducts operational tests of systems that will process and transmit voice, data, messaging, and video information through networks at the tactical, operational, strategic, and sustaining-base levels. This directorate also assures that information storage and transmission are secured, available, and protected from hostile or accidental destruction or release. BCCTD serves as the lead directorate for the Army in testing net-centric Battle Command and Communications systems. d. Fires Test Directorate (FTD). FTD is located at Fort Sill, Oklahoma, and was formed by the merging of the former Air Defense Artillery Test Directorate (ADATD) and Fire Support Test Directorate (FSTD). FTD conducts operational tests of Army materiel, equipment, and doctrine for field artillery systems. It consists of the Artillery Test Division, Missile Test Division, and Support Division. As a result of recently absorbing the Future Integrated Test Directorate (FITD), FTD is now also responsible for operational testing of Army modernization programs related to the Spin out Early Infantry Brigade Combat Team and for operational testing of major defense acquisition programs for follow-on Brigade Combat Team modernization program. FTD serves as the lead directorate for support to Army modernization and transformation experiments and dissemination of lessons learned. It develops testing parameters that meet the requirements of Army modernization complexity and reliance on a network-centric environment and serves as subject matter experts (SMEs) for robotic employment and operations. e. Intelligence Electronic Warfare Test Directorate (IEWTD). IEWTD is located at Fort Huachuca, Arizona, and plans, conducts, and reports on assigned mission-based operational tests, assessments and experiments of intelligence, surveillance, reconnaissance (ISR), electronic warfare (EW), counter improvised explosive device (CIED), and biometric systems. IEWTD also provides robust live, virtual, and constructive synthetic operational test environments employing validated threat models and simulations under realistic battlefield conditions to test present and future systems. The directorate also conducts operational assessments at worldwide locations to support rapid acquisition initiatives, the Joint Improvised Explosive Device Defeat Organization (JIEDDO), and the Warfighter s urgent needs. f. Maneuver Support and Sustainment Test Directorate (MS2TD). MS2TD is located at Fort Hood, Texas, and was formerly known as the Engineer and Combat Support Test Directorate (ECSTD). 
MS2TD is a versatile directorate that conducts operational tests and field experiments involving doctrine, training, organization, and materiel related to engineering, chemical, quartermaster, logistics, ordnance, military police, medical, transportation, and Soldier support operations. L-12 ATEC Pamphlet June 2010

381 g. Maneuver Test Directorate (MTD). MTD is located at Fort Hood, Texas, and was formerly known as the Close Combat Test Directorate (CCTD). MTD conducts operational tests and field experiments involving doctrine, training, organization, and materiel relating to infantry, armor, and combined arms systems. It is the lead test directorate for operational testing of the weapons and systems of the Army's Transformation Force. The MTD also conducts independent operational testing for weapons, scopes, lasers, armored vehicles, and future combat rifles. h. OTC's Core Competencies. Table L-2 provides a synopsis of OTC's core competencies and their related test directorates, and is provided as a reference for test planning.
Table L-2. OTC's Core Competencies - Test Directorate
Advanced Concept Technology Demonstrations: MTD
Advanced Warfighting Experiments: MTD
Air Crew Protection Systems: AVTD
Air Defense Unit Operations: FTD
Air Delivery Equipment: ABNSOTD
Air Target Acquisition Systems: FTD, AVTD
Air Traffic Control: AVTD
Air Worthiness: AVTD
Air-to-Air Missiles: AVTD
Air-to-Ground Missiles: AVTD
Armor Systems: MTD
Army Aircraft: AVTD
Aviation Unit Operations: AVTD
Avionics: AVTD
C4I Surveillance/Reconnaissance Systems: IEWTD
C4I Systems: BCCTD
Chem/Bio Defense: AVTD, MS2TD
Chem/Bio Unit Operations: MS2TD
Combined Unit Operations and Integration: MTD
Communication Unit Operations: BCCTD
Construction/Bridging Equipment: MS2TD
Data Collection Instrumentation: ABNSOTD, FTD, IEWTD
Data Reduction Software: ABNSOTD, FTD, IEWTD
Direct Fire Guns/Ammo/Sites: MTD, ABNSOTD
Direct Fire Missiles: MTD
E3: IEWTD
Engineer Unit Operations: MS2TD
Field Artillery Unit Operations: FTD
ATEC Pamphlet June 2010 L-13

382 Field Ops/Logistics for Deployed Test Team: ABNSOTD, FTD, IEWTD
Force Combat Systems: FITD
Force Protection: MS2TD, MTD
GPS Testing: IEWTD
Indirect Fire Guns/Ammo/Fire Control Computers: AVTD, FTD
Indirect Fire Locating Radar: FTD
Indirect Fire Missiles/Fire Control Computers: AVTD, FTD
Individual Clothing and Equipment: AVTD, MTD, ABNSOTD
Infantry/Armor Unit Operations: MTD, ABNSOTD
Information Assurance: ABNSOTD, FTD, IEWTD, FITD, AVTD, BCCTD, MTD, MS2TD
Information Mission Area Systems: BCCTD
Intel Broadcast System: IEWTD
Intel M&S: IEWTD
Intel Threat: IEWTD
Intelligence Analysis Equipment: IEWTD
Intelligence Unit Operations: IEWTD
Internal/External Air Transportability: ABNSOTD, AVTD
LAN Support for Deployed Test Team: ABNSOTD, FTD, IEWTD
Low Velocity Air Drop of Equipment: ABNSOTD, AVTD
M&S: FITD, ABNSOTD, FTD, IEWTD
Maintenance/Equipment Repair Systems: AVTD, MS2TD
Mech Infantry Systems: MTD
Military Police Unit Operations: MS2TD
Military Police/Surveillance/Patrol: MS2TD
Mine/Countermine/Demolition: MS2TD
Mobility Enhancement Vehicles: MS2TD
Mortars: MTD
Non-Lethal Weapons: MTD, MS2TD
Power Generation Equipment: AVTD, MS2TD
Small Arms: MTD, ABNSOTD
Smoke/Obscurants/Illumination: AVTD, MS2TD
Special Operations Equipment (Air): ABNSOTD, AVTD, MTD, IEWTD
Special Operations Equipment (Ground): MTD, IEWTD
Surface-to-Air Guns/Ammo/Fire Control Computers: FTD
Surface-to-Air Missiles: FTD
Target Acquisition/Weapons Seekers: AVTD, FTD, IEWTD
TMDE: AVTD, MS2TD
Transportation Equipment: ABNSOTD, AVTD, MS2TD
Transportation Unit Operations: MS2TD
L-14 ATEC Pamphlet June 2010

383 Unmanned Aerial Systems (Sensor Payloads): AVTD, IEWTD
Unmanned Ground Vehicles: MS2TD, IEWTD
Water Purification Equipment: MS2TD
ATEC Pamphlet June 2010 L-15

384 This page intentionally left blank. L-16 ATEC Pamphlet June 2010

385 Appendix M Data Authentication Group (DAG) M-1. Purpose The purpose of this appendix is to provide general guidance to assist in planning for Data Authentication Group (DAG) resources. Since the DAG is an independent body established by the ATEC AST that is separate from the event execution and system assessment or evaluation functions, DAG resourcing needs to be explicitly planned and tailored for each event. Although the specifics of the resourcing process can be expected to vary from event to event, time-tested DAG resourcing principles can be relied upon to avoid unpleasant surprises. M-2. General a. A formal independent DAG or an internal ATEC group, depending on the visibility and size of an event, may accomplish data authentication for ATEC-conducted events. The DAG is a group of representatives from the testing, evaluation, user, and acquisition communities, formed for the purpose of authenticating data that is generated, collected, and reduced. As standard practice, ASTs implement formal DAGs for all OT events, regardless of Rapid or POR status, conducted for DOT&E oversight programs or multi-service programs, and for large-scale events conducted for non-oversight programs. For multi-service programs, the DAG may be Service-specific and may require a Memorandum of Agreement (MOA) or Memorandum of Understanding (MOU) to ensure coordination between Services. ATEC DAGs are generally formed for small-scale events when the AST has determined that membership for a formal DAG is not available, funding is not available, or a DAG is not required for other reasons. An ATEC DAG may also be formed to authenticate DT data. b. Formally established DAGs are multidisciplinary teams with membership from various agencies that have responsibility for and a vested interest in the system under test. The AST coordinates with the acquisition community and identifies appropriate membership for the DAGs. Mandatory members are ATEC (AEC, DTC, and OTC), the TRADOC Capability Manager, and the PM. Additional representatives from the acquisition team may also be members. DAGs formed within ATEC generally include the evaluator, the test analyst, the data manager, the test officer, and subject-matter experts. ATEC support contractors may attend DAGs if they do not have a current contractual relationship with the system under test. Because of the inherently governmental nature of the DAG function, DAG participation normally consists of government employees, unless there are exceptional circumstances requiring the participation of ATEC support contractors. In that event, a written request for exception to policy should be submitted to the OTC Commander for review and approval. PMs may bring a system contractor into the DAG to explain aspects of the system under test. This exchange of information must be documented. c. The mission of a DAG (whether formal or within ATEC) is to authenticate that the data collected and reduced at an event is suitable for analysis and evaluation. Authentication of data depends on a preceding process in which it is verified that the data is complete, accurate, consistent, and representative of the system's performance. To facilitate the verification of data, data verification teams (DVTs) may be established at the discretion of the operational tester. The ATEC Pamphlet June 2010 M-1

386 required quorum for a DVT includes at least one representative each from the operational tester and the evaluator. Other T&E WIPT members may be represented as appropriate. d. WebDAG is an Internet application that gives authorized DAG members access to test data at any time from any location with Internet connectivity. It may be uniquely configured for each DAG event and for each test in accordance with identified DAG requirements. M-3. Formation of a DAG a. Need for DAG. Any event that is anticipated to provide data for use in a system assessment or evaluation is a candidate for a DAG. The AST establishes the need for a DAG to support a specific event. It bases that need on such factors as program milestone decision points, ACAT level, extent of DOT&E oversight, type and level of system evaluation, magnitude of the event, and risk of incorrect interpretations of data. Consideration is also given to the likelihood that the event data will be required to support future modeling, simulation, assessment, or evaluation efforts. For ATEC-conducted events not associated with an acquisition program (e.g., an organizational force development test and experimentation event), the executing agency will determine the requirement for a DAG. (A notional sketch of these considerations follows paragraph M-4a, below.) b. Tailoring DAG Duties to Specific Event. The duties for each specific event may be tailored by the DAG Chair to permit responsiveness to the requirements of the system assessment or evaluation strategy and the unique features of the event. Thus, different events may have different DAG Chairs, memberships, duties, and procedures, as documented in event-specific DAG Charters and DAG standing operating procedures (SOPs). c. Use of Event Data. The DAG Chair, with input from DAG members, makes recommendations to the AST regarding the suitability of the event data for analysis. Dissenting opinions are documented in the official DAG minutes. Data from non-DAG events are authenticated by the event executor prior to being considered by the AST. M-4. DAG procedures a. Charter. See figure M-1 for a generic DAG charter and SOP. (1) The DAG Chair will establish event-specific duties by developing an event-specific charter and SOP. The Chair will submit the proposed DAG charter and SOP to the AST for approval. (2) The SEP, or the test plan for RAIs, may include appendixes containing the DAG charter and SOP that pertain to the system being assessed or evaluated, thereby providing general DAG guidance and promoting consistency of activities from event to event. The charter and SOP may be combined into one document. (3) The OTA TP for each event will document either the AST decision to establish a DAG for an event or the rationale for not establishing one. When the AST determines that a DAG is required for an event, the OTA TP for the event will include the DAG charter and SOP. M-2 ATEC Pamphlet June 2010
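The considerations in paragraphs M-2a and M-3a above amount to a simple decision flow: a formal independent DAG is standard practice for DOT&E oversight, multi-service, and large-scale events; an internal ATEC DAG is typical when formal membership or funding is not available; otherwise the AST weighs milestone points, ACAT level, event magnitude, and data risk. The Python fragment below is a notional sketch of that flow only; the EventProfile fields and the recommend_dag_approach function are illustrative names rather than part of any ATEC system, and the actual determination remains with the AST (or the executing agency for non-acquisition events).

```python
from dataclasses import dataclass

@dataclass
class EventProfile:
    """Attributes an AST might weigh when planning data authentication (illustrative only)."""
    dote_oversight: bool                 # program is on the DOT&E oversight list
    multi_service: bool                  # multi-Service program
    large_scale: bool                    # large-scale event for a non-oversight program
    formal_membership_available: bool    # formal DAG membership can be obtained
    funding_available: bool              # resources for a formal DAG are funded
    supports_assessment_or_evaluation: bool = True

def recommend_dag_approach(event: EventProfile) -> str:
    """Notional restatement of the guidance in paragraphs M-2a and M-3a.

    Returns a recommendation string; it does not replace AST judgment.
    """
    if not event.supports_assessment_or_evaluation:
        return "No DAG required (event data not used for assessment or evaluation)"
    if event.dote_oversight or event.multi_service or event.large_scale:
        return "Formal independent DAG (standard practice for these events)"
    if not (event.formal_membership_available and event.funding_available):
        return "Internal ATEC DAG (formal membership or funding not available)"
    return "AST determination: weigh milestones, ACAT level, event magnitude, and data risk"

# Example: a small, non-oversight event with limited funding.
print(recommend_dag_approach(EventProfile(
    dote_oversight=False, multi_service=False, large_scale=False,
    formal_membership_available=True, funding_available=False)))
```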

387 b. DAG resourcing. The AST will ensure that resources for a DAG are identified and documented in the event TRP. Figure M-2 provides examples of resourcing considerations. c. DAG membership. The DAG is a multi-disciplined team composed of representatives from the test, acquisition and user communities. Membership is determined by the AST and may vary from event to event. d. DAG Chair. The DAG Chair will be from within ATEC and is appointed by the test organization. The chair may vary from event to event. e. DAG Chair responsibilities. (1) Develop and submit a DAG charter or SOP to the AST for coordination and inclusion in the OTA TP. (2) Assemble resources for functioning of the DAG. (3) Provide training for DAG members. (4) Perform confirmation of data collection, reduction, quality control and authentication processes, and report results at the OTRR. (5) Perform anomaly investigations. (6) Document DAG activities in official minutes. (7) Make recommendations to the AST on the suitability of event data for analysis. (8) Provide guidance and work assignments to DAG members. (9) Classify and secure data. (10) When used, ensure the WebDAG application is properly configured with the appropriate access control to protect the data. When using WebDAG, ensure that all DAG members and any authorized viewer have access to the Web site and the database. The WebDAG provides DAG members an automated means to add comments to records and request additional information on any record. f. Membership responsibilities. (1) Participate in all DAG meetings and support the DAG Chair. (2) Keep informed of changes in test operations and in data management. (3) Participate in the confirmation of processes, event data verification and validation activities, anomaly investigation, and data authentication. (4) Where practicable, be present at a sampling of events during record test. ATEC Pamphlet June 2010 M-3

388 (5) Remain available until all data has been authenticated. g. DAG review process. (1) Review the event data requirements, data collection, data reduction, and QC procedures for deficiencies that might adversely affect the database. (2) Review the data provided by the data management team. The review may include raw data (level 1), reduced data (level 2), and ordered data (level 3). (3) Review Test Team and Player comments. (4) Perform a DAG-specific quick-look verification of event data that includes checking data element characteristics for consistency, reviewing comments and logs related to event execution, and generating and reviewing descriptive statistics for key data elements (a notional software sketch of such checks follows paragraph k, below). (5) Produce and review DAG-specific data verification reports; crosswalk and review linkage of the event's RAM TIRs, MANPRINT problem reports, and software trouble reports to event performance data; and identify and investigate event conduct and data anomalies. The DAG will not pre-score or score TIRs, or problem or trouble reports. (6) Finally, the DAG will strive for a consensus regarding the suitability of event data for analysis. h. DAG timelines. The authenticated database will normally be delivered to the Evaluator by the default date of end of test (E)+10 in the ADSS. This date may be changed to meet program requirements after coordination with the AST and with the AST members' chains of command. Any change in the E+10 date should be entered into ADSS. i. The test team will provide the DAG access to the end-to-end run, pilot test, and record test events. j. The DAG is independent of the data manager and of the quality control process. However, the data manager will develop and implement a thorough quality control (QC) system to ensure that the DAG does not assume the QC function for the database. k. Once the data is authenticated by the DAG, it is prepared and packaged for delivery to the Evaluator. An agreement is made with the AST prior to the pilot test start date as to how the data will be packaged and delivered. M-4 ATEC Pamphlet June 2010
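The quick-look review in paragraph g(4) above, and the completeness, accuracy (expected-range), and consistency (relationship-rule) checks and authentication codes described in sections 3.3 and 3.4 of the generic charter at figure M-1 below, can be illustrated with a short sketch. The Python fragment that follows is purely notional: the record fields, the 90-percent completeness figure, and the latency bounds are hypothetical examples, and the sketch is not the event-specific DAG software that the charter directs technical support personnel to develop.

```python
import statistics

# Illustrative event-level records; field names and values are hypothetical.
records = [
    {"msg_id": 1, "send_time": 10.0, "receive_time": 12.5, "latency_s": 2.5},
    {"msg_id": 2, "send_time": 15.0, "receive_time": None, "latency_s": None},
    {"msg_id": 3, "send_time": 20.0, "receive_time": 19.0, "latency_s": -1.0},
]

EXPECTED_RANGE = {"latency_s": (0.0, 600.0)}   # hypothetical accuracy bounds
COMPLETENESS_REQUIREMENT = 0.90                # hypothetical 90-percent requirement

def quick_look(records, element):
    """Descriptive statistics for one data element (paragraph g(4)-style review)."""
    values = [r[element] for r in records if r[element] is not None]
    return {"count": len(values),
            "mean": statistics.mean(values) if values else None,
            "min": min(values) if values else None,
            "max": max(values) if values else None}

def authenticate(record):
    """Assign a notional authentication code in the spirit of charter paragraphs 3.3 and 3.4."""
    # Completeness: required elements must be present.
    if record["receive_time"] is None or record["latency_s"] is None:
        return "Limited-Use"          # e.g., receipt known but receipt time missing
    # Accuracy: value must fall within the expected range (outliers get DAG review).
    lo, hi = EXPECTED_RANGE["latency_s"]
    if not (lo <= record["latency_s"] <= hi):
        return "No-Test"
    # Consistency: relationship rule between related elements.
    if record["receive_time"] < record["send_time"]:
        return "No-Test"
    return "Authenticated"

codes = {r["msg_id"]: authenticate(r) for r in records}
complete = sum(1 for r in records if r["latency_s"] is not None) / len(records)
print(codes)                                   # {1: 'Authenticated', 2: 'Limited-Use', 3: 'No-Test'}
print(quick_look(records, "latency_s"))
print("Meets completeness requirement:", complete >= COMPLETENESS_REQUIREMENT)
```

In practice, each event's data requirements would determine which elements are checked, what ranges and relationship rules apply, and how Limited-Use data are flagged for the evaluator.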

389 Generic DAG Charter and SOP
1. Purpose. This charter/standing operating procedure (SOP) provides guidance to Data Authentication Group (DAG) members and technical support staff on the responsibilities, methodology, and procedures applicable to the authentication of event data collected during Event ABC in support of an Army Test and Evaluation Command (ATEC) evaluation of System XYZ.
2. CHARTER.
2.1 Mission. The mission of the DAG is to authenticate that the data being generated, collected, and reduced at Event ABC is suitable for analysis to evaluate System XYZ and that the data is complete, accurate, consistent, and representative of event activities. The DAG also supports anomaly research and quality control (QC) functions during the authentication process. The DAG does not perform analysis for evaluating the system under test. The DAG is an independent body that is separate from the event execution and system assessment/evaluation functions. The DAG provides a forum in which agencies or commands, as members of the DAG, may express and document opinions concerning event execution, data collection, data reduction, and data authentication and their impact on the ATEC system-level database.
2.2 Membership. The DAG is a multi-disciplined team composed of a DAG Chair appointed by the test directorate and DAG members who are representatives from the test, acquisition, and user communities. Specifically, the following organizations are expected to provide representatives to support planned DAG activities: AEC, OTC, TRADOC Capabilities Manager, and the Program Manager (PM). DTC may be a member for specific tests or subtests, particularly when the test is a combined or integrated DT/OT. In addition, subject matter experts (SMEs) may be called upon as needed.
2.3 Responsibilities.
Chair.
Make recommendations to the AST with respect to the suitability of event data for analysis.
Review existing system documentation and previous DAG documentation, products, and databases to ensure that the DAG Charter and DAG SOP conform to system standards, procedures, and definitions.
Support the DAG resourcing process by making recommendations to the AST regarding resources needed by the DAG.
Assemble DAG resources per the relevant resourcing documentation to include membership, training, support personnel, facilities, computing equipment, tools, and documents.
Ensure timely access to event data and data collection, reduction, and QC procedures.
Figure M-1. Generic DAG Charter and SOP ATEC Pamphlet June 2010 M-5

390 Perform a formal end-to-end process confirmation of the event data collection, reduction, QC, and authentication processes using pilot data of each type from every type of data source.
Initiate, coordinate, and support data anomaly investigations by the event executor's data management team on an as-needed basis.
Provide guidance and assignments to DAG members.
Prepare official DAG minutes and report on DAG activities.
Members.
Attend all DAG meetings.
Support all data authentication activities.
Participate in end-to-end process confirmation activities.
Participate in event data validation activities.
Participate in event data verification activities.
Monitor event conduct in areas of expertise.
Monitor data collection processes.
Monitor data reduction processes.
Support anomaly investigations as needed.
Make recommendations to the DAG Chair with respect to the suitability of event data for analysis.
Ensure that dissenting opinions are documented in the official DAG minutes.
3. Methodology. Data authentication ensures that event data is verified as a complete, accurate, and consistent representation of the system's performance during the event and that the event activities were a reasonable portrayal of the system's performance within the constraints of event limitations. Data authentication is performed in three phases: end-to-end process confirmation, event data validation, and event data verification.
3.1 End-To-End Process Confirmation. The end-to-end process confirmation is a pre-event activity that will be completed prior to record trials. The end-to-end process confirmation ensures that the data collection, reduction, QC, and authentication processes will result in an event-level database suitable for evaluating the system. The planning and execution of the data collection, reduction, and QC processes are the responsibility of the event executor. The planning and execution of the authentication process is the responsibility of the DAG Chair.
3.2 Event Data Validation. Event data validation ensures that the event-level database provides a record of what actually occurred during the event. Event data validation is primarily a value judgment process that often requires the expertise of all DAG members.
Figure M-1. Generic DAG Charter and SOP (Continued) M-6 ATEC Pamphlet June 2010

3.3 Event Data Verification. Event data verification determines whether or not the event plan was executed as intended and whether or not the data in the event-level database are complete, accurate, and consistent. The event data verification process necessarily assumes that the planned execution will adequately generate the data needed for analysis purposes. Event data verification has three components:

3.3.1 Completeness. Completeness is a measure of how much of the required data is available for analysis. Completeness requirements are established for data elements, expressed as a percentage requirement for the proportion of the times that the data element must be present and verified as an acceptable value. The DAG technical support personnel will develop DAG-specific software to report completion by data element.

3.3.2 Accuracy. Accuracy is a measure of the correctness of individual data element values. An indicator of accuracy is a data element value that is within the expected range. Out-of-range values are reviewed to determine whether they are accurate data outliers.

3.3.3 Consistency. Consistency is a measure of the acceptability of a data element's value when compared to the values of other data elements. Data consistency can be determined by applying software algorithms that incorporate relationship rules to check for data values that might be inconsistent with the values of other data elements.

3.4 Authentication Codes. Based upon the results of event data validation and verification, anomaly research, and/or the DAG's knowledge of the system, the DAG will assign authentication codes to the data in the event-level database to reflect the DAG's assessment of the data's suitability for analysis. There are three categories of authentication codes: (1) Authenticated, (2) Limited-Use, and (3) No-Test. Authenticated data meet all guidelines established for validated and verified data. Limited-Use data meet sufficient guidelines established for validated and verified data to support some analyses (e.g., the data may indicate that a message was received but the time of that receipt is suspect or missing). No-Test data are considered either non-representative or erroneous and thus unsuitable for analysis. The DAG Chair is responsible for developing a systematic approach to using authentication codes. All No-Test data will be reported in the daily minutes of the DAG meeting by the DAG Chair.

4. Procedure. The DAG product will be an authenticated level 3 event-level database that can be integrated into the ATEC system-level database. The goal is an event-level database with no unexplained anomalies that is suitable for evaluative analyses. DAG activities correspond to the three phases of authentication: end-to-end process confirmation, event data validation, and event data verification.

4.1 End-To-End Process Confirmation. End-to-end process confirmation ensures that the data collection, reduction, and QC processes implemented by the event executor, and the authentication process implemented by the DAG, are appropriate to address evaluation issues and measures. The DAG will also review how event activities are controlled to confirm that the collection mechanisms do not adversely affect the conduct of the operations.
The DAG will also confirm that the proper operational conditions are incorporated in the event plan; that the data collection, reduction, and QC plans incorporate procedures for the capture of complete, accurate, and consistent data; that the event data resulting from subjective observations were the result of applying appropriate rules; and, finally, that the data authentication process will yield an event-level database suitable for analysis. The end-to-end process confirmation provides the DAG with the necessary confidence that the data collection, reduction, and QC processes will support the data authentication process. The end-to-end process confirmation activities are performed prior to record trials and include:

4.1.1 Review event planning, event execution, and evaluation documents to ensure an understanding of issues, criteria, data collection (manual and automated), data reduction, and event conduct limitations.

4.1.2 Review data collection, reduction, and QC procedures to confirm correct generation of data elements.

4.1.3 Develop DAG-specific procedures to properly apply authentication codes to event data.

4.1.4 Develop DAG-specific software to quality control and process event data in support of DAG activities.

4.1.5 Dry-run the data collection, reduction, QC, and authentication processes. This offers the evaluation and event execution teams the opportunity to identify problems, clarify possible misunderstandings, and take necessary corrective actions. The results will be reported at the test/event readiness review before the start of record trials.

4.2 Event Data Validation. Event data validation is a determination as to whether or not the data provides a reasonable representation of what actually occurred during event activities.

4.3 Event Data Verification. Event data verification is a determination as to whether or not the event plan was executed as intended and whether or not the data in the event-level database are complete, accurate, and consistent.

4.3.1 Verification Levels. Verification is accomplished through manual and automated data checks. The focus is on determining the reasonableness of each data element by itself and in relation to logically linked data elements. Ideally, the event executor's data reduction team conducts level one and level two verification checks and the DAG conducts level three and level four checks. The four verification levels are as follows. Level one checks are single data element reviews within a record to ensure that individual data element values are within acceptable value ranges. Level two checks are across-data-element reviews within a record to ensure that logically related data elements within the data record do not conflict. Level three checks are across-record reviews within the same data file to ensure that logically related data elements in the same file do not conflict. Level four checks are across-file reviews to ensure that logically related data elements in different data files and records do not conflict.

4.3.2 Quick Look. The DAG will perform DAG-specific quick-look verifications of event data. The DAG technical support staff will provide the DAG with reports indicating the completeness, accuracy, and consistency of event data.

4.3.3 Crosswalk. The DAG will crosswalk and review linkage of the event's RAM Test Incident Reports (TIRs), MANPRINT problem reports, and software trouble reports to event performance data, and identify and investigate event conduct and data anomalies. The DAG will not prescore or score any TIRs or problem/trouble reports.

4.3.4 Anomaly Research. The DAG will initiate research of anomalies to confirm or explain data that appear to be erroneous. A data anomaly form will be filled out by the DAG Chair and provided to the test executor for anomaly research. Data anomaly forms and research results will be cataloged and maintained by the DAG Chair. The DAG Chair may prepare a written assessment of the anomaly for Test Incident Report (TIR) consideration.

4.4 Meetings. The DAG members will arrive a week prior to the pilot test to become familiar with DAG responsibilities and processes, observe data collection and reduction, and authenticate pilot test data to support the formal end-to-end process confirmation of the data collection, reduction, QC, and authentication processes. The DAG remains onsite until the last data has been authenticated and forwarded to the analysis team. The DAG will meet at locations and times designated by the DAG Chair.

4.5 Use of Event Data.

4.5.1 The AST determines the use of event data for analysis and inclusion in the ATEC system-level database. The DAG Chair will make recommendations to the AST with respect to the suitability of the event data for analysis. DAG members will make recommendations to the DAG Chair. Dissenting opinions will be documented in the official DAG minutes.

4.5.2 DAG members may not copy or remove data from the DAG site without written permission from the DAG Chair. Furthermore, DAG members should not report performance or other data to their respective organizations unless required to support anomaly research and with the concurrence of the DAG Chair.

Figure M-1. Generic DAG Charter and SOP

DAG Resourcing Considerations

1. Purpose. The purpose of this enclosure is to provide general guidance to assist in planning for DAG resources. Since the DAG is an independent body established by the ATEC AST that is separate from the event execution and system assessment or evaluation functions, DAG resourcing needs to be explicitly planned and tailored for each event. Although the specifics of the resourcing process can be expected to vary from event to event, time-tested DAG resourcing principles can be relied upon to avoid unpleasant surprises.

2. Planning.

2.1 The key to adequately resourcing the DAG is advance planning. The significant components of that planning process consist of identifying: (1) the resources required to support the DAG activities, and (2) sources of funds to pay for those resources.

2.2 Emphasis is placed on the identification of the resources required to support the DAG activities because, historically: (1) most planning errors are a result of incorrectly identifying or scoping those resources, and (2) sources of funding are usually program and event specific.

3. Resources. DAG resources include membership, training, support personnel, facilities, computing equipment, tools, and documents.

3.1 DAG Membership.

3.1.1 Identify potential DAG members early. Consider representatives from the Army Evaluation Center (AEC), Operational Test Command (OTC), Test and Evaluation Coordination Office (TECO), Developmental Test Command (DTC), TRADOC Capability Manager (TCM), Director, Operational Test and Evaluation (DOT&E), Test and Evaluation Office (TEO), Program/Product Manager (PM), subject matter experts (SMEs), and technical support staff.

3.1.2 Proper DAG planning can and should foster an environment of synergy among DAG members by ensuring all members understand that: (1) the frequency and length of meetings are a function of data volume and the types of tools available, (2) while the DAG is meeting, the DAG members must give priority to DAG activities, and (3) continuity of membership and support staff is paramount.

3.2 Training. Plan on training DAG members on the basics of test and DAG activities, to include: schedule; roles and responsibilities; test organization; test concept; data flow; sources of data; data collection, reduction, and quality control; and tools. In addition, ensure that DAG members have access to sufficient system-specific training prior to the initial DAG meeting so that they can contribute in a meaningful manner from the start.

3.3 Support Personnel. Support staff can be categorized as technical (i.e., programmers) and clerical/administrative. Under ideal circumstances, technical support staff are familiar not only with the tools, computers, and equipment that will be used to support the DAG activities, but also with the system under test.

3.4 Facilities. Facilities planning must include all levels of detail. DAG members need room to spread out in order to work effectively and share information. Examples of items to consider are: office space (possibly a trailer), desks, work tables, telephones, copiers, safes, office supplies, sources of electrical power, surge suppressors, uninterruptible power supply (UPS) equipment, portable latrines, dumpsters, and trash pick-up.

3.5 Computing Equipment. This resource category includes items such as computers, workstations, and printers.

3.6 Tools. Start early development and testing of the software tools that will be used to explicitly support DAG activities. These usually include tools used to perform an end-to-end verification of the event data collection, reduction, and quality control process using data of each type from every type of source. They also usually include tools that allow the DAG to perform quick-look verifications of event data (checking data element characteristics for consistency and generating/reviewing descriptive statistics for key data elements); an illustrative sketch of such checks follows this figure. Some tools may be designed to perform cross-referencing of multiple data sources to ensure that they are consistent and/or to allow the DAG to select the best source of data when provided with redundant data sources. Finally, the DAG may also need access to specialized equipment to review collected data. Typically, this is a function of the type of equipment used to collect the data, for example, video and/or audio equipment.

3.7 Documents. This resource category includes working papers; system documentation; data collection, reduction, and QC procedures; and previous DAG documentation, products, and databases.

Figure M-2. DAG Resourcing Considerations
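The automated checks described in figure M-1 (paragraphs 3.3 and 4.3.1) and the quick-look tools described in figure M-2 (paragraph 3.6) can be prototyped with very little code. The following Python sketch is illustrative only and is not part of the generic charter or of any ATEC tool; the record fields, value ranges, relationship rule, and completeness threshold are assumptions chosen solely to show the pattern of level one and level two checks and of completeness reporting by data element.

```python
# Illustrative sketch only; field names, ranges, rules, and thresholds are assumed.
import statistics
from typing import Dict, List

VALUE_RANGES = {"range_km": (0.0, 40.0), "time_of_flight_s": (0.0, 300.0)}

def level_one(record: Dict) -> List[str]:
    """Level one: single data element review against acceptable value ranges."""
    findings = []
    for field, (lo, hi) in VALUE_RANGES.items():
        value = record.get(field)
        if value is None:
            findings.append(f"{field}: missing")
        elif not lo <= value <= hi:
            findings.append(f"{field}: {value} outside [{lo}, {hi}]")
    return findings

def level_two(record: Dict) -> List[str]:
    """Level two: logically related elements within a record must not conflict."""
    findings = []
    if None not in (record.get("t_fire"), record.get("t_impact")):
        if record["t_impact"] < record["t_fire"]:
            findings.append("t_impact precedes t_fire")
    return findings

def completeness(records: List[Dict], element: str, required_pct: float = 90.0):
    """Completeness reporting: percentage of records in which an element is present."""
    pct = 100.0 * sum(1 for r in records if r.get(element) is not None) / max(len(records), 1)
    return pct, pct >= required_pct

def quick_look(values: List[float]) -> Dict[str, float]:
    """Quick-look descriptive statistics for a key data element."""
    return {"n": len(values), "mean": statistics.fmean(values),
            "min": min(values), "max": max(values)}

records = [{"range_km": 12.3, "time_of_flight_s": 41.0, "t_fire": 100.0, "t_impact": 141.0},
           {"range_km": 55.0, "time_of_flight_s": None, "t_fire": 200.0, "t_impact": 195.0}]
for i, rec in enumerate(records, start=1):
    print(f"record {i}:", level_one(rec) + level_two(rec))
print("time_of_flight_s completeness:", completeness(records, "time_of_flight_s"))
print("range_km quick look:", quick_look([r["range_km"] for r in records]))
```

In practice, the DAG-specific software called for in paragraph 4.1.4 of figure M-1 would read the event-level database directly and report findings by data element and trial; the sketch above only illustrates the separation between level one screening, level two screening, and completeness reporting.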

Appendix N
Performance Measurement Enterprise System

Note: This appendix will be revised upon completion of the pilot tests currently being conducted at two Test Centers. It is a placeholder until final approval of the PMES process.

N-1. Introduction

a. The ATEC Performance Measurement Enterprise System (PMES) will provide a standardized enterprise project management system to estimate and manage test and evaluation efforts. PMES is a Microsoft Office Project Professional-based solution designed to couple with the ATEC Decision Support System (ADSS); PMES is the component of ADSS that accomplishes this integration. The system will provide the command an enterprise-level repository of relevant test and evaluation management data.

b. PMES is an analytical tool that facilitates assessment of a project's schedule and cost variances based on actual cost and schedule performance. PMES provides the ability to analyze physical performance against a detailed plan to allow for the prediction of final costs and schedule. PMES will provide managers with an early warning, allowing them to take necessary corrective action. PMES facilitates the analysis and determination of three specific areas: cost variance, schedule variance, and estimate at completion.

(1) Cost Variance (CV) is the difference between the budgeted cost of the work performed by a certain date and the actual cost incurred to that date. Cost variance answers the question: is the project under budget, on budget, or over budget?

(2) Schedule Variance (SV) is the difference between the work that was planned to be accomplished by a certain date and the work that was actually accomplished. Schedule variance answers the question: is the project ahead of schedule, on schedule, or behind schedule?

(3) Estimate at Completion (EAC) is the actual cost of the work performed to date plus the estimated cost to complete the remaining work. The EAC answers the question: will the project finish under budget, on budget, or over budget?

N-2. Policy

a. ATEC will conduct earned value management (EVM) on all Acquisition Category (ACAT) I and II and Office of the Secretary of Defense (OSD) oversight systems as requested by the customer.

b. The Microsoft Project-based PMES is ATEC's only approved tool for test and evaluation program and project management. PMES is intended to be used on all T&E projects/efforts.

c. The Microsoft Project plans will identify cost and schedule and record essential tasks, subtasks, documents, resources, task linkages, critical path, and other actions that must be accomplished, developed, or performed.

d. Costs will be reported in the following areas:

(1) Military hours.

(2) Civilian hours.

(3) Government costs (materiel, supplies, temporary duty (TDY), other).

(4) Contractor costs (labor, materials, supplies, TDY, other).

e. PMES will identify, at a minimum, all required customer reimbursable costs.

N-3. Responsibilities

a. Establishing an initial or revised baseline of costs and/or schedules requires General Officer/Senior Executive Service (or designated representative) approval when major scope changes occur within the program.

b. Responsible ATEC System Team (AST) members will maintain the approved baseline for systems, keeping them current with at least bi-weekly updates as described in the methodology section below. For developmental customer tests where there is no AST, the test manager and/or the test officer will accomplish this responsibility.

N-4. Enterprise system solution methodology

a. PMES is the follow-on effort to the rock drill, tracking execution of the system test and evaluation plan. If there is no rock drill scheduled, SCAs will use PMES to develop and resource schedules for the system. The PMES cost estimate and schedule process will begin upon approval of the initial test scope with the program manager/customer.

b. PMES emphasizes the planning and integration of cost, schedule, and performance of a program to support decisions. PMES helps manage risk areas and provides insight to report to higher management. Basically, PMES measures work performed in terms of the budget assigned to that work. The benefits of using PMES are:

(1) Projects can be managed to a controlled, project-oriented baseline.

(2) Problems can be traced to the source.

(3) More objective status can be provided.

(4) Magnitudes and impacts of cost and schedule problems can be identified.

(5) Estimates of final costs are well based.

(6) There is increased confidence in ATEC's internal management system.

c. The Organizational Breakdown Structure (OBS) for ATEC is presented in figure N-1. All elements in the OBS are required to create and track costs and schedules in Microsoft Project.

d. The first three levels of the standard ATEC Work Breakdown Structure (WBS) for all test phases are presented in figure N-2. Microsoft Project will provide a template with WBS elements to create the tailored Microsoft Project file required for tracking the progress of a system through its acquisition test phases.

e. The responsibility assignment matrix determines how OBS elements will support each element in the WBS. It defines discrete intersections between the WBS and OBS to identify control points of the program. Evaluators and Test Directors/Officers within each OBS Directorate/Test Center are the AST members responsible for developing the Microsoft Project file(s) addressing cost and schedule at the control account level. The matrix defines what work will be managed and who is organizationally responsible for the work. This will allow ATEC to see the efforts being executed by organizations to conduct events through the various acquisition phases.

f. The initial step is to use the milestone information and the approved Test and Evaluation Strategy as the basic framework for identification of event tasks and establishment of an initial timeline. The AST will use this framework to identify and place the tasks in proper relationship with other tasks. Once the basic framework is developed, the AST must war-game requirements to identify those additional tasks for each event that must be included to finalize the complete set of tasks. All tasks must be identified and included in proper sequence.

g. The second step is to identify relationships between tasks. This is essentially the process that identifies tasks that must be completed prior to another task because the first task serves as a predecessor for performance of the subsequent or succeeding task. Any individual task may have one or several predecessor tasks that must be finished before completion of the specific task. For example, before a model or simulation can be used in an OT, an accreditation plan must be developed and approved, the model or simulation must be accredited in accordance with the approved plan, and the accreditation report providing the results of the analysis of the acceptability of the model or simulation must be developed and approved. In this case, each of the steps has a mandatory predecessor task, with associated timelines and resource requirements, that must be completed prior to completion of the accreditation of the model or simulation in use.

h. The third step is to identify subtasks, with associated attributes and timelines, that must be completed in support of each event task. Some event tasks may have no associated subtasks while other event tasks may have a large number of subtasks. In rare cases, subtasks may have subordinate elements that must be identified to provide for completion of the subtask. Each must be identified, documented, and added to the overall task listing.

i. The fourth step is to identify the required dollars and other resources, including ATEC labor (military, civilian, and contract), TDY, and provisioning requirements (threat, instrumentation, modeling and simulation, etc.), needed to complete each task and subtask and ensure it is completed on schedule.

j. The fifth step in the process is a review of the completeness and accuracy of the information. This review may be conducted within the AST. Modifications to the overall data may be made at the review depending on issues that emerge during the review.
k. The final step in the process is the actual conduct of the baseline review for the designated review authority and the program manager/customer. The purpose is to conduct a review of all the information and data developed to determine whether the overall plan for ATEC support of T&E cost and schedule requirements for the acquisition program is adequate or requires modification. Additionally, overall timelines are reviewed as achievable or adjusted, resource requirements are documented, and other issues and enablers are identified, discussed, and resolved as required.

l. If the initial cost and schedule plans do not include the obvious assumptions upon which costs and schedules were derived, then a separate explanation of detailed assumptions must be provided in the notes portion of the Microsoft Project file(s). This is used to justify cost and schedule changes due to scope change.

m. OBS Directorate/Test Center costs will be entered into the Standard Operation and Maintenance Army Research and Development System (SOMARDS) at least every 2 weeks, coinciding with the end of the pay period, or more frequently if desired.

n. The OBS Directorate/Test Center AST member is responsible for updating the Microsoft Project file bi-weekly, coinciding with the end of the pay period. When a Project file is updated, it shows the work completed during that 2-week period. This is accomplished by determining the percent complete of each task worked or planned to be worked. This also includes adjusting the plan for required re-tests or increases in test scope.

o. Actual costs may be updated within the project plans at any time by the project owner. The actual costs are drawn directly from SOMARDS, using the latest load date available.

p. The PMES solution will calculate the following performance indices (an illustrative sketch of these calculations appears at the end of this appendix):

(1) Cost Performance Index (CPI). The cost efficiency factor representing the relationship between the actual costs expended and the value of the physical work performed (earned value).

(2) Schedule Performance Index (SPI). The schedule efficiency factor representing the relationship between the value of the initial planned schedule and the value of the physical work performed (earned value).

(3) Estimate to Complete (ETC). The value of the work required to complete a task from this time forward.

(4) Estimate at Completion (EAC). A value representing the projected final cost of the work when completed. The EAC equals the actual costs incurred plus the estimated costs for completing the remaining work.

q. Performance metrics will be established by ATEC ODCSOPS for PMES to flag cost and/or schedule variances within managed projects. Performance metrics will be presented to ATEC leadership within the established schedule.

r. The ODCSOPS T&E management team will be given rights and privileges to the PMES information repository.
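The variances defined in paragraph N-1b and the indices in paragraph N-4p follow the standard earned value relationships. The Python sketch below is illustrative only and is not part of PMES or Microsoft Project; the dollar figures are invented, and the (BAC - EV)/CPI form of the estimate to complete is one common convention rather than a formula prescribed by this pamphlet.

```python
# Standard earned value quantities (as commonly defined in EVM):
#   EV = budgeted cost of work performed (earned value)
#   PV = budgeted cost of work scheduled (planned value)
#   AC = actual cost of work performed
def cost_variance(ev, ac):              return ev - ac   # CV > 0 means under cost
def schedule_variance(ev, pv):          return ev - pv   # SV > 0 means ahead of schedule
def cost_performance_index(ev, ac):     return ev / ac   # CPI > 1 means cost-efficient
def schedule_performance_index(ev, pv): return ev / pv   # SPI > 1 means schedule-efficient
def estimate_to_complete(bac, ev, cpi): return (bac - ev) / cpi  # remaining work at current efficiency
def estimate_at_completion(ac, etc):    return ac + etc  # actual costs to date plus ETC

# Invented example: budget at completion (BAC) of $500K, 40% of the work earned,
# 50% planned to be complete, and $230K actually spent to date.
bac, ev, pv, ac = 500_000, 200_000, 250_000, 230_000
cpi = cost_performance_index(ev, ac)      # ~0.87: over cost
spi = schedule_performance_index(ev, pv)  # 0.80: behind schedule
etc = estimate_to_complete(bac, ev, cpi)  # ~$345K of work remains at current efficiency
eac = estimate_at_completion(ac, etc)     # ~$575K projected final cost
print(f"CV={cost_variance(ev, ac):+,.0f}  SV={schedule_variance(ev, pv):+,.0f}  "
      f"CPI={cpi:.2f}  SPI={spi:.2f}  EAC={eac:,.0f}")
```

A CPI or SPI below 1.0 is the early-warning signal described in paragraph N-1b: work is costing more, or progressing more slowly, than planned.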

Appendix O
System Safety

O-1. General
Army policy (AR ) requires that system safety be applied and tailored to all Army systems throughout their life cycle and that safety and health verifications and evaluations be an integral part of the system safety effort. As a whole, program executive officers (PEO), program/project/product managers (PM), Capabilities Developers (CBTDEV), materiel developers (MATDEV), testers, independent evaluators, and system safety engineers are responsible for

a. Conducting system safety programs to minimize risks throughout the system life cycle.

b. Conducting hazard identification, system safety composite risk management (CRM), and hazard closeout procedures during system development.

c. Identifying and managing significant hazards discovered during operation and use of fielded systems.

O-2. Key Safety Documents
The aforementioned participants, as well as others, form the System Safety Working Group (SSWG), which is chartered by the PM. This group tailors the safety documentation to the requirements of the system being developed. This is done through a variety of documents that serve as sources of information during the safety verification process.

a. System Safety Management Plan (SSMP). Prepared by the PM, the SSMP establishes management policies, objectives, and responsibilities for execution of a system safety program for the life cycle of a system. The plan also establishes the methodology by which the PM may oversee and evaluate the execution of the contractor's System Safety Program Plan (SSPP). It should be reviewed to ensure that the milestone schedule identifies the timely issuance of the Safety Assessment Report (SAR) to DTC, and that DTC is provided the results of contractor testing. It identifies system safety management issues and is incorporated as part of the acquisition strategy for all systems.

b. System Safety Program Plan (SSPP). The PM will ensure that the contractor prepares and updates an SSPP. The Safety Verification section should be reviewed to determine the adequacy of procedures for feedback of test information for review and analysis, and the adequacy of procedures established by the contractor's safety organization to ensure safe conduct of all tests. This plan is a description of the contractor's methods for implementing the tailored requirements of MIL-STD-882, including organizational responsibilities, resources, milestones, depth of effort, and integration with other program engineering and management activities as well as those of related systems.

c. Health Hazard Assessment Report (HHAR). The HHAR is prepared by the Surgeon General at the request of the PM for those systems that require medical advice or assistance for the developmental evaluation of health hazards.

The SAR makes reference to the HHAR and includes information on health hazards. The PM provides the HHAR and the Hazards Analysis/List (if available) to DTC and OTC 60 days prior to the start of all testing, with a copy furnished to the Defense Technical Information Center.

d. Safety Assessment Report. The PM provides the Safety Assessment Report to DTC. DTC will not accept a Safety Assessment Report unless it has been approved by the PM's supporting safety office. The Safety Assessment Report is a formal summary of the safety data collected during the design and development of the system and is a key source of data in preparation of the Safety Release. In it, the PM summarizes the hazard potential of the item, provides a risk assessment, and recommends procedures or other corrective actions to reduce these hazards to an acceptable level.

(1) The Safety Assessment Report is updated when changes are made that impact safety. The PM provides this report to DTC and OTC 60 days prior to the start of all testing, with a copy furnished to the Defense Technical Information Center.

(2) The Safety Assessment Report and the safety SOPs govern range safety. Before starting any test, the Safety Assessment Report must be thoroughly reviewed. For any hazardous operations, such as tests involving explosives, SOPs must be developed and approved.

e. System Safety Risk Assessment (SSRA). The SSRA provides a comprehensive evaluation of the safety risk being assumed for the system under consideration at the MDR. This document is prepared by the PM and supports the decision for accepting residual hazards.

f. Safety Release. The Safety Release is a formal document issued by HQ DTC before any hands-on testing, training, use, or maintenance by Soldiers. Copies of the Safety Release are also issued to the system evaluators, Capabilities Developers, and PMs. Operational testing, including pretest system training and DT involving borrowed troops, will not begin until the test agency, the trainer, and the commander who is providing the test Soldiers have received a Safety Release. DTC does not provide the Safety Release for systems developed by the U.S. Army Network Enterprise Technology Command (NETCOM) or the U.S. Army Medical Command (MEDCOM).

(1) The Safety Release is issued for a specific event at a specified time and location under specific conditions. It is a stand-alone document that indicates the system is safe for use and maintenance by typical user troops and describes the specific hazards of the system based on test results, inspections, and system safety analyses. Operational limits and precautions are also included.

(2) The requirement for a Safety Release also applies to testing of new or innovative procedures (doctrine, tactics, etc.) for the use of materiel that has been type classified. Safety Releases are not required for use of standard equipment in the normal prescribed manner.

(3) The test organization uses the information contained in the Safety Release to integrate safety into test controls and procedures and to determine if the test objectives can be met within these limits.

(4) When unusual health hazards exist, The Surgeon General reviews or participates in preparation of Safety Releases to ensure the safety of user troops during operational testing.

(5) For acquisition systems, the Safety Release is developed at least 30 days prior to pretest training and at least 60 days prior to all types of OT and DT that expose Soldiers to training and testing activities involving the research, development, operation, maintenance, repair, or support of operational and training materiel. This requires that pertinent data (e.g., results of safety testing, hazard classification, etc.) be provided to the Safety Release authority in sufficient time to perform this testing or to determine if additional testing is required. When a Safety Release is required for a Rapid Acquisition Initiative, the dates for the Safety Release are negotiated with the PM.

(6) The Safety Release format is reflected in DA Pamphlet .

g. Safety Confirmation. AR 73-1, AR , and AR 70-1 require that a Safety Confirmation be prepared at the end of each phase of the acquisition process and at major decision points (i.e., MSs, FRP, materiel release, and fielding). HQ DTC is responsible for providing the Safety Confirmation for all systems.

(1) The Safety Confirmation is a separate document issued by DTC that provides the materiel developer and the decision maker with DTC's safety findings and conclusions, and states whether the specified safety requirements have been met. It includes a risk assessment for hazards not adequately controlled, lists any technical or operational limitations, and highlights any safety problems requiring further testing. The Safety Confirmation is provided to AEC to be attached to the OMAR/OER. The Safety Confirmation will also be provided to the PM, the Life Cycle Management Command (LCMC) Safety Office, the U.S. Army Safety Center, the TRADOC Safety Office, and the materiel developer or PM supporting Safety Office.

(2) The Safety Confirmation is issued in memorandum format and addresses the following areas: limiting factors, evaluation results, conclusions, and recommendations.

O-3. Hazard Analysis

a. Hazard analyses are the heart of the system safety evaluation and provide the preparers of the Safety Assessment Report, Safety Release, and Safety Confirmation with a wealth of information. The types of analyses that are performed must be stated in the Safety Engineering section of the Safety Assessment Report.

b. From the beginning, a system must be designed to eliminate or control all potential and actual safety and health hazards. These hazards shall be identified in accordance with hazard evaluation techniques, and these techniques result in the various hazard analysis documents. The following documents reflect hazard evaluation techniques:

(1) The preliminary hazard analysis is an inductive process that should be conducted early in the design phase of the system life cycle to identify, in broad terms, the potential hazards associated with the proposed operational concept. The preliminary hazard analysis is prepared by the PM or contractor. It reflects the initial risk assessment of a system and identifies safety critical areas, evaluates hazards, and identifies the safety design criteria to be used.

(2) A System Hazard Analysis (SHA) is submitted by the contractor in accordance with the requirements of the contract data requirements list. It is a systematic assessment of the real and potential hazards associated with possible subsystem failure. It identifies hazards and then directs design efforts toward the elimination or control of the hazard. The SHA indicates the hazard severity and hazard probability levels as established by MIL-STD-882 (an illustrative severity-by-probability sketch appears at the end of this appendix).

(3) The Subsystem Hazard Analysis (SSHA) Report is prepared by the PM or contractor. This report identifies hazards associated with component failure modes and with the functional relationships of components and equipment comprising each subsystem. The SSHA is an inductive process that, in effect, is an expansion of, with increased complexity over, the SHA. It normally occurs during the design phase; however, it can be used during operation as an investigation to establish cause-and-effect relationships and probabilities.

(4) The Operating and Support Hazard Analysis Report is prepared by the PM or contractor. This report identifies hazards and determines safety requirements for personnel, procedures, and equipment during production, testing, installation, training, escape, and operations. It, too, provides information that can be used in preparing the Safety Release and Safety Confirmation. The Operating and Support Hazard Analysis is normally conducted on all identified hazards involving man/machine interfaces. It helps ensure that corrective or preventive measures will be taken to minimize the possibility that any human error procedure will result in injury or system damage.

c. The Preliminary Hazard Analysis List is prepared by the PM. It involves making a study during concept or early development of a system to determine the hazards that could be present during operational use.

d. TOP provides guidance for identifying and evaluating hazards associated with systems being tested by DTC. MIL-STD-882 provides uniform requirements for developing and implementing a system safety program of sufficient comprehensiveness to identify the hazards of a system and to ensure that adequate measures are taken to eliminate or control the hazards.

e. Specific safety and health tests are performed on hazardous devices, components, or byproducts to determine the nature and extent of hazards presented by the materiel. Particular attention is given to identifying and assessing special safety and health hazards presented by radioactive materials, radio frequency emitters, toxic gases, laser devices, toxic and carcinogenic materials, gaseous emissions, blast overpressure, and harmful noise sources.

f. Critical items are identified to provide visibility for immediate corrective action to prevent personal injury or system damage when a category I or II hazard (MIL-STD-882) is identified. This helps ensure that actions are taken to optimize system safety, reliability, and maintainability, and to minimize the seriousness of hazards that cannot be completely eliminated.

O-4. Testing and Evaluating

a. Adaptation. A successful system safety T&E effort requires adaptation in order to fit the particular system test. Not every test event need be performed for every system. The test agency may consult the SSWG on which tests are necessary for a particular system. The selected tests or assessments are then included in the TEMP and the safety subsection of the test design plan or detailed test plan.

b. Test integration. System safety tests for critical devices and components will be incorporated into tests required for other disciplines. This is accomplished through the T&E WIPT. The PM will ensure adequate safety representation in this group.

c. Conduct of Test.

(1) There are trade-offs between safe testing and safety tests; the trade-offs are between the benefits to be gained from safety testing and the risk and cost associated with a particular test. The safety release process should be used to resolve any conflicts in this area.

(2) Safety testing can be used to

(a) Identify hazards, determine appropriate corrective actions, and establish corrective action priorities.

(b) Determine and evaluate appropriate safety design and procedural requirements.

(c) Determine and evaluate operational, test, and maintenance safety requirements.

(d) Determine the degree of compliance with established qualitative objectives or quantitative requirements, such as technical specifications, operational requirements, and design objectives.

d. Developmental tests.

(1) DTC assists in determining the system safety testing and data requirements and ensures pertinent specification analyses and criteria are identified and reflected in ATEC's T&E documents. DTC is also responsible for the initial placement of published documents into Versatile Information Systems Integrated On-line (VISION), an ATEC initiative to integrate data across test centers and provide a common Web-based user interface, which is available online. Published documents are also distributed to DTIC.

(2) Developmental testing to provide safety data in support of the safety release is front-loaded; that is, the test is designed so that safety data can be collected as early in the test as possible. Specific safety tests are also performed on critical devices or components to determine the nature and extent of hazards presented by the materiel.

(3) In order to obtain safety-related data, testing must be completed that is safety specific (e.g., noxious fumes or toxic gases, operation at the boundary of the operating environment, software overload tests, etc.). Most safety-related data is obtained during the conduct of performance and endurance tests. Therefore, while safety-specific tests can be conducted early in the program to provide information for a Safety Release, the information reflected in the test report and safety confirmation (the system safety evaluation) addresses all testing.

(4) It is imperative that the tester obtain the HTL before starting DT. The list is used along with the SAR to identify the remedies that have been applied to correct previously identified hazards. Safety tests within DT are then performed to verify the adequacy of the remedy.

(5) During technical testing, specific safety and human health tests are also performed on critical devices or components to determine the nature and extent of materiel hazards. Requirements for such tests will be found in the TEMP and independent evaluation plans; the tests are usually performed during DT when contractor testing and data are not sufficient to make a hazard assessment. Special attention is directed to

(a) Evaluating special safety and health hazards listed in paragraph 3-7 of this pamphlet.

(b) Verifying the adequacy of safety and warning devices and other measures employed to control hazards.

(c) Analyzing the adequacy of hazard warning labels on equipment and of warnings, precautions, and control procedures in equipment publications.

(d) Verifying the adequacy of safety/health guidance and controls in the SAR, TMs, and Initial Health Hazard Assessment Report (IHHAR).

(e) Considering hazard mitigating recommendations in reviewing and/or developing test center standing operating procedures (SOPs).

(f) Including/coordinating any unique data requirements in the safety or human health test designs that are implied in test documentation (for example, the SAR, IHHAR, and so forth).

(g) Identifying new hazards in reports and test incident reports when the risk assessment is inaccurate or requires revision.

e. Operational tests.

(1) A system must be certified to be safe for troop use under the conditions or limitations specified in the safety release before any user testing begins. Therefore, OT of safety issues is less systematic and less technical than that conducted during DT. It is common, however, for unanticipated hazards to occur when a system is placed in the hands of Soldiers and put into operation. Therefore, test planning must include disciplined observation and other data collection procedures to ensure that such hazards are identified and added to the HTL.

(2) Hazards identified in previous DTs that have subsequently been corrected must be evaluated during the user test to see if the correction is adequate in an operational environment. A safety consideration unique to OT is whether any safety release restrictions imposed are so confining that the user's training needs cannot be met or an adequate user test cannot be conducted.

f. Evaluations.

(1) One element of analysis that is common to all independent evaluations is system safety. A system safety evaluation focuses on the existing status and impact of any hazards or program deficiencies in terms of the system's overall effectiveness.

(2) System safety issues enter the Continuous Evaluation (CE) process through continual dialogue among the PEO/PM/MATDEV, CBTDEV, developmental and operational testers and evaluators, and other members of the acquisition team. Key activities for input of system safety issues to CE are the developed TEMP issues and criteria. Again, the forum for coordination of acquisition team activities is the T&E WIPT. The SSWG ensures that the updated SSMP is used throughout the development process, by the T&E WIPT for updating the TEMP and by DTC for updating the System Evaluation Plan (SEP).

g. Nondevelopmental item tests.

(1) Contrary to Government system development efforts, most NDI acquisition efforts effectively preclude the Army from obtaining detailed safety engineering evaluations or assessments from the prime contractor. Safety testing will be oriented to tests that are specifically required to fill gaps that have not been satisfied by contractor data. Specific test issues will be determined during the market survey and incorporated into the TEMP. (For more information on market surveys, see AR 70-1.)

(2) Nondevelopmental item tests frequently require as much testing as a pure development item because of utilization in an unplanned environment and assembly of parts in a new configuration. SARs (when available) and safety releases are required for NDI testing.
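Paragraphs O-3b(2) and O-3f refer to the hazard severity categories and probability levels established by MIL-STD-882. The Python sketch below shows only the general severity-by-probability lookup pattern used in such assessments; the specific risk groupings in the matrix are illustrative assumptions, not values quoted from MIL-STD-882 or from this pamphlet, and an actual program would use the matrix approved in its SSMP.

```python
# Illustrative severity-by-probability lookup. Severity categories I-IV and
# probability levels A-E follow the general MIL-STD-882 pattern; the risk
# groupings below are assumed for illustration only.
SEVERITY = {"I": "Catastrophic", "II": "Critical", "III": "Marginal", "IV": "Negligible"}
PROBABILITY = {"A": "Frequent", "B": "Probable", "C": "Occasional", "D": "Remote", "E": "Improbable"}

RISK_MATRIX = {  # rows: severity category; columns: probability level (assumed groupings)
    "I":   {"A": "High",    "B": "High",   "C": "High",    "D": "Serious", "E": "Medium"},
    "II":  {"A": "High",    "B": "High",   "C": "Serious", "D": "Medium",  "E": "Low"},
    "III": {"A": "Serious", "B": "Medium", "C": "Medium",  "D": "Low",     "E": "Low"},
    "IV":  {"A": "Medium",  "B": "Low",    "C": "Low",     "D": "Low",     "E": "Low"},
}

def risk_level(severity: str, probability: str) -> str:
    """Return the assessed risk level for a hazard's severity/probability pair."""
    return RISK_MATRIX[severity][probability]

# Example: a Category II (Critical) hazard judged Occasional (level C).
print(SEVERITY["II"], "/", PROBABILITY["C"], "->", risk_level("II", "C"))  # Critical / Occasional -> Serious
```

Category I and II severities in a lookup of this kind correspond to the critical items called out in paragraph O-3f, which require visibility for immediate corrective action.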


Appendix P
Rock Drills

P-1. Purpose
A rock drill is conducted by and for ATEC personnel. It is ATEC's process to synchronize efforts across ATEC and to ensure that all involved ATEC personnel understand the sequence of events and that no actions or obstacles to success are overlooked. It will identify and document each task required for executing the test and evaluation (T&E) strategy. It will include the determination of task attributes (schedule, timelines, dollars, work involved, responsibility, resources, etc.), relationships with or links to or from other tasks, responsible agencies or personnel, and other data, as appropriate. It will also identify issues and disconnects. The rock drill process culminates in an integrated schedule that ATEC, across the command, will commit to, execute, and manage as a baseline. ATEC will conduct rock drills for all ACAT I systems and other ATEC CG/ED-designated systems. The ATEC CG/ED, or his designee, will chair the rock drills for all ACAT I systems and other systems designated by the ATEC CG/ED. Additional rock drill policies and responsibilities are provided in the applicable ATEC regulation.

P-2. Methodology
Each rock drill is developed based on the approved evaluation strategy, the nature of the system under development, the risk associated with the program, the type of threat portrayed, the number and type of tests or other events conducted, and the overall acquisition program schedule. In addition, a rock drill requires the interaction of the entire AST and Test and Evaluation Working-level Integrated Product Team (T&E WIPT) to produce a quality product that lays out the entire ATEC T&E requirement for a system. Many of the tasks that are the responsibility of a specific SCA impact greatly on tasks that are the responsibility of another SCA. Constant contact and coordination among all participants are required to resolve issues and ensure accurate sequencing of tasks and identification of required resources, attributes, and timelines. A rock drill is not a substitute for an Early Strategy Review (ESR), Concept In-Process Review (CIPR), or the SEP. The rock drill focuses on execution details, not on revising the T&E strategy.

a. The rock drill will identify and resolve issues and disconnects. The rock drill will synchronize all efforts and will identify and record all needed tasks, sub-tasks, documents, dollars, other resources, and other actions that must be accomplished, developed, or performed.

b. For planning purposes, when required, the rock drill process will begin once the SEP is approved. The rock drill will normally occur no more than 90 days after SEP approval, but must occur before the start of testing in support of the evaluation for selected systems. Specific system circumstances will dictate when the rock drill is needed. It may occur anytime after program initiation and should be completed before the start of testing in support of system evaluation.

c. The rock drill will use a reverse planning process to work the details for execution of the T&E strategy; the last task performed for the major activity is the first task identified and documented (a simplified illustration of this process appears at the end of this appendix). The task that immediately precedes the final task is identified and, based upon the task attributes, documented in relationship to the final task.

This process continues until all identified tasks for the major activity are documented and, based upon the individual attributes of each task, placed within the overall rock drill structure in proper sequential relationship with each other. This process should identify any disconnects.

d. A generalized reverse planning process is shown graphically in figure P-1. Each rectangle in the figure represents a T&E-related task that ATEC must support. The information in the figure is at a high level and does not include complete detail of all tasks and sub-tasks required for the overall system-level rock drill. Figure P-1 is intended as an example of the flow of information required for an acquisition program. An actual rock drill should result in a much more detailed list of tasks/sub-tasks.

Figure P-1. Example of Rock Drill Reverse Planning Process (major tasks, sub-tasks, and task attributes identified in reverse order to ensure identification of all requirements: start at the post-production decision review and work backward through the FRP DR, OER data, MS C, OMAR data, MS B, OMAR data, ICD/SOW review, T&E strategy, ESR/CIPR, TEMP, and SEP)

e. The number of tasks identified for the rock drill will vary with each program. The overall goal is to identify and document all tasks. Each ATEC SCA must identify those tasks that fall within its areas of responsibility and coordinate the entry of tasks into the overall rock drill data contained in Microsoft Project. In general, DTC will document developmental test (DT) and safety documentation requirements, OTC will document operational test (OT) requirements, and AEC will document evaluation requirements.

The AST must work together to ensure that all tasks are identified, relationships with other tasks are determined, and critical areas are identified for program success. Examples of key tasks that might be identified by ATEC and each SCA are:

(1) AEC. Program requirements such as acquisition milestone (MS) dates; T&E WIPT meetings; and staffing and concurrence dates for documents (such as the TEMP, ICD, CDD, Capability Production Document (CPD), and COICs). ATEC and AEC requirements such as staffing and approval dates for integrated system evaluation strategies, ESRs, CIPRs, SEPs, and OMAR/OERs; major briefings to DA and DOD agencies; and requirements for modeling and simulation events supporting evaluation tasks.

(2) DTC. All DT event planning, executing, and reporting tasks for each DT event to be conducted during development, production, and post-production of the item, including a Safety Release to support testing, a Safety Confirmation in support of milestones and materiel release decisions, and any other key tasks.

(3) OTC. All OT event planning, executing, reporting, and related tasks for each OT event conducted during the overall acquisition program development period. Events could include, but are not limited to, customer tests (CTs), early user tests (EUTs), limited user tests (LUTs), and initial operational tests (IOTs).

f. Table P-1 presents sample information that each SCA might identify on high-level tasks and associated attributes that could be placed in the system-level rock drill database. Again, the information is not all-inclusive but serves as an example of tasks with attributes organized using the reverse planning process.

g. The initial step is to use the milestone information and the approved Test and Evaluation Strategy as the basic framework for identification of event tasks and establishment of an initial timeline. The AST will use this framework to identify and place the tasks in proper relationship with other tasks. Once the basic framework is developed, the AST must war-game requirements to identify those additional tasks for each event that must be included to finalize the complete set of tasks. All tasks must be identified and included in proper sequence.

h. The second step is to identify relationships between tasks. This is essentially the process that identifies tasks that must be completed prior to another task because the first task serves as a feeder for performance of the subsequent or succeeding task. Any individual task may have one or several feeder tasks that must be finished before completion of the specific task. For example, before a model or simulation can be used in an OT, an accreditation plan must be developed and approved, the model or simulation must be accredited in accordance with the approved plan, and the accreditation report providing the results of the analysis of the acceptability of the model or simulation must be developed and approved. In this case, each of the steps has a mandatory feeder task, with associated timelines and resource requirements, that must be completed prior to completion of the accreditation of the model or simulation in use.

Table P-1. Example of Rock Drill Process
(Columns: Step; Task; SCA entering data; Task attributes)

Step 1. Identify the last task to be performed or action from which input is required.
  Task: Full-rate production decision review (FRP-DR) by milestone decision authority (MDA). SCA entering data: AEC.
  Task attributes: 1. Scheduled FRP-DR date. 2. Requires ATEC OER X-days prior.

Step 2. Identify tasks that must be performed to support or allow performance of the task identified in step 1.
  Task: Complete, approve, and submit ATEC OER to applicable agencies to support FRP-DR requirements. SCA entering data: AEC.
  Task attributes: 1. OER submission date. 2. OER approval date. 3. OER completion date. 4. Requires data from specified supporting events.

Step 3. Identify the next prior event to be completed.
  Task: Receive, process, and analyze data and information from data sources listed in the SEP. SCA entering data: AEC.
  Task attributes: 1. Established dates for submission of data from identified data sources. 2. Requires DTC/OTC/other agencies to conduct events to provide data.

Step 4. Continue to the next step.

Step N.
  Task: Complete initial operational test (IOT) report (TR). SCA entering data: OTC.
  Task attributes: 1. IOT E-date plus 60 days. 2. Requires authenticated event database.

Step N-1.
  Task: Provide authenticated event database from IOT. SCA entering data: OTC.
  Task attributes: 1. IOT E-date plus 10 days. 2. Requires processing of data and completion of test.

Step N-2.
  Task: Conduct IOT. SCA entering data: OTC.
  Task attributes: 1. IOT record test dates. 2. Approved OTA TP, safety release, and HUC approval.

Step N-3.
  Task: Complete IOT OTA test plan (OTA TP). SCA entering data: OTC.
  Task attributes: 1. Approved at IOT T-120 days. 2. Requires USAEC approval and, for major systems, DOT&E approval.

Step N-4.
  Task: Conduct IOT event planning. SCA entering data: OTC.
  Task attributes: 1. Requires approved or complete draft SEP. 2. Requires applicable test support packages, OMS/MP, and other requirement documents.

Step P-1.
  Task: Milestone (MS) C and limited-rate production decision (LRPD) by MDA. SCA entering data: AEC.
  Task attributes: 1. MS C and LRPD date. 2. Requires ATEC OMAR X-days prior.

Step R-1.
  Task: Complete limited user test (LUT) report (TR). SCA entering data: OTC.
  Task attributes: 1. LUT E-date plus 60 days. 2. Requires authenticated event database.

Step S-1.
  Task: Complete required DT and provide DT reports and data for the MS C decision. SCA entering data: DTC.
  Task attributes: 1. Reports and data provided by established dates.

Step T-1.
  Task: MS B decision by MDA. SCA entering data: AEC.
  Task attributes: 1. Scheduled MS B date. 2. Requires ATEC OMAR X-days prior.

Step U-1.
  Task: Contract SOW review.

Step W-1.
  Task: Integrated T&E strategy.

Step Z-1.
  Task: ESR/CIPR.

i. The third step is to identify the subtasks, with associated attributes and timelines, which must be completed in support of each event task. Some event tasks may have no associated subtasks while other event tasks may have a large number of subtasks. In rare cases, subtasks may have subordinate elements that must be identified to provide for completion of the subtask. Each must be identified, documented, and added to the overall task listing.

j. The fourth step is to identify the required dollars and other resources to complete each task and subtask to ensure it is completed on schedule. The other resources include ATEC labor (military, civilian, and contract), TDY, and provisioning requirements (instrumentation, M&S, etc.).

k. The fifth step in the process is a review of the completeness and accuracy of the rock drill information. This review may be conducted within the AST. Modifications to the overall rock drill data may be made at the review depending on issues that emerge during the review.

l. Once each SCA's tasks and data identification are complete, the AST must merge all the data. After merging the data, the AST must review the completed task lists for accuracy, avoidance of duplication, and completeness.

m. A key step in this process is to identify and resolve issues.

n. An example of cross-SCA interactions is the submission of data and other information to support development of evaluation briefings and documents. AEC normally will enter expected dates for development and approval of OMARs, OERs, or OARs as the ATEC input to acquisition program decision requirements. DTC and OTC expected dates for delivery of data and other information to support the evaluation requirements may require adjustment, where possible, to meet the OMAR/OER production dates; alternatively, the OMAR/OER production dates may have to be adjusted due to overall program timeline requirements.

o. The final step in the rock drill process is the actual conduct of the rock drill for the designated review authority. The purpose is to conduct a review of all the information and data developed to determine whether the overall plan for ATEC support of T&E requirements for the acquisition program is adequate or requires modification. Additionally, overall timelines are reviewed as achievable or adjusted, resource requirements are documented, and other issues and enablers are identified, discussed, and resolved as required. While this final step may provide a short review of the T&E strategy, the intent is not to completely redo the ESR/CIPR. However, the rock drill may identify flaws that could result in changes to the T&E strategy.

p. Throughout the process, data for the ACAT I or designated systems' rock drill tasks will be entered into the Microsoft Project format. This software provides the capability to display data in various forms, to include textual and graphical formats, which will facilitate understanding of the information. For clarity, a Gantt chart displaying all of the tasks from Microsoft Project should support the review.

q. The rock drill will culminate in an approved, integrated schedule. This schedule becomes the document that the AST will execute and manage to.

r. The rock drill may be revisited if major program changes occur. If conducted, the methodology for these follow-on drills is normally the same as for the initial rock drill but may be modified by guidance from the appropriate authority or as determined by the AST.

s. SCA Commanders/Directors may request an ATEC-level rock drill for a system not on the ATEC CG/ED's designated list. The chair may be delegated from the ATEC CG/ED to an SCA Commander/Director.

t. Program Managers (PMs) are invited to attend rock drills as observers. System development contractors generally will not attend; exceptions are approved by the ATEC CG/ED. Other Test and Evaluation Working-level Integrated Product Team (T&E WIPT) members may attend if desired.

413 Appendix Q Other T&E Considerations Q-1. Purpose This appendix provides a brief description of areas of study and methodologies used to address other areas that the evaluator may need to consider in the overall system evaluation. Q-2. Airworthiness (AR 70-62) a. Airworthiness is a demonstrated capability of an aircraft or aircraft subsystem or component to function satisfactorily when used within prescribed limits. Army policy requires that the airworthiness of an aircraft be determined and documented in an airworthiness release before Army aviators can operate aircraft in the performance of official duties. b. The Commander, U.S. Army Materiel Command (AMC) supervises airworthiness qualification within the Army. The Commander, U.S. Army Aviation and Missile Life Cycle Command (AMCOM), a major subordinate command of AMC, is the approving authority for airworthiness of Army aircraft for which it has materiel management. The AMCOM issues the official notice of airworthiness release for Army tests or operations, statements of interim airworthiness qualification, and statements of airworthiness qualification. The PM is responsible for funding of an airworthiness release or qualification and ensuring that the airworthiness of the aircraft system has been determined. c. Developmental T&E supports the airworthiness release by demonstrating or verifying compliance with applicable aeronautical design standards, demonstration addendum, and other technical parameters cited in contracts. DTC tests the airworthiness of fixed and rotary wing aircraft, aircraft components, and subsystems installed on any Army platform. d. The airworthiness release is required before developmental testing can begin. Operational tests using aircraft also require an airworthiness release in addition to the safety release. Q-3. Ammunition for T&E of Army Systems Acquisition (AR ) This paragraph provides a brief description of the process for obtaining and managing standard ammunition from the Army stockpile for T&E of Army systems acquisition. a. The PM is responsible for identifying ammunition requirements for developmental testing and new equipment training of Army acquisition systems in accordance with AR (1) PMs develop ammunition requirements to ensure the most effective and efficient use of Army munitions (e.g., combine tests, integrate with military training, incorporate simulations, etc.). (2) To the maximum extent possible, PMs will place requirements on the prime munitions identified in the Total Army Munitions Requirements (TAMR). ATEC Pamphlet June 2010 Q-1

414 (3) Ammunition requirements validated by HQDA G-3/5/7 are resourced in accordance with Army priorities. (4) Identification of ammunition requirements as far in advance as possible is needed to ensure that correct and adequate ammunition is available for use in T&E. b. Operational Test Command (OTC) is responsible for identification and management of ammunition required for operational tests related to acquisition of systems. c. PMs requiring munitions for testing for another Service or non-dod agency must coordinate ammunition support with that Service or agency. d. PMs are responsible for funding and acquiring developmental and non-standard ammunition. PMs developing new munitions must also obtain special packaging instructions. PMs coordinate deliveries of non-standard munitions directly with the test centers and ASPs. e. Acquiring standard ammunition. (1) ATEC is the lead agency for consolidating requirements for Army tests involving standard Army munitions for systems acquisition. PMs submit standard ammunition requirements to the ATEC Ammunition Management Office (AMO) for the current FY through the POM years during the formal data calls conducted semiannually by AMO. Requirements may be submitted outside of the data call window only on an exception basis when warranted by unanticipated circumstances. ATEC records information about the PM requirements in a database. (2) ATEC submits PM requirements to HQDA G-3/5/7 for validation. During the validation process, DA considers test requirements against all Army priorities to determine whether to provide authorizations supporting the requirements. ATEC updates their database with validated DODICs and quantities for each program. (3) HQDA G-3/5/7 and G-4 co-chair the semiannual Total Army Ammunition Authorization and Allocation Conference (TA4C), during which conventional munitions in support of valid Army requirements are resourced. ATEC, AMO represents the RDTE community at the TA4C. (4) At the conclusion of the TA4C, DA posts approved authorizations to the ATEC hierarchy in the Total Ammunition Management Information System (TAMIS). ATEC subauthorizes the approved DODICs and quantities to each PM in the ATEC TAMIS hierarchy. PMs may establish subordinate hierarchy levels in TAMIS to efficiently manage their authorizations. ATEC also records the TA4C results in Pinpoint. (5) The TA4C makes resourcing decisions based on a variety of supply considerations and sound stockpile management decisions. The ammunition actually approved for validated test programs may be different from PM requirements originally submitted to ATEC. Thus, it is imperative that PMs review the results of each TA4C in TAMIS and Pinpoint for impact on their programs. Q-2 ATEC Pamphlet June 2010

f. Managing RDTE Authorizations.

(1) PMs are responsible for requisitioning the ammunition needed for their test.

(a) Forecasting in TAMIS equates to a requisition and is the standard Army supply system process for moving authorized ammunition, with no exception requirements, to a location for use.

(b) Requisitions requiring exception data (e.g., specific lot number, date of manufacture, condition code, etc.) or shipment to an installation not listed in TAMIS (e.g., other Services, contractor/commercial facilities, Alaska, Hawaii, etc.) must be processed as an Army Electronic Product Support (AEPS) requisition. Ammunition requisitioned through AEPS will not be forecasted in TAMIS.

(2) PMs will expend only those authorizations necessary to achieve approved objectives.

(a) Munitions issued using the TAMIS e581 automatically decrement authorizations in TAMIS and are reported as expenditures. Turn-in of serviceable ammunition that can be returned to the Army B14 account will credit authorizations back to the PM. Ammunition subjected to any RDTE procedure or conditioning that makes it unsuitable for return to the Army B14 account reduces authorizations.

(b) At installations or activities where there is no electronic interface between TAMIS and the ammunition accountability system, issues and turn-ins must be manually posted to TAMIS by either the ammunition supply point (ASP) or the PM.

(c) PMs who ship ammunition to contract facilities for RDTE must post the quantity shipped to TAMIS as an expenditure using the requisition document number.

(3) PMs must provide disposition instructions for test-unique munitions and ammunition acquired with RDTE funds, or turn over the items to the Army stockpile as excess. Disposal of ammunition at the end of a test requires compliance with the Environmental Protection Agency (EPA) Military Munitions Rule and the DOD Implementation Plan for the rule.

Q-4. Climatic Testing (AR 70-38)

a. The Army requires that RDT&E of its systems realistically consider the climatic conditions that might impact the mechanical operation or function of the system in its operational environment. Climatic testing is designed and conducted to assess the suitability of an item or system when it is to be operated or used in a wide climatic spectrum, from extreme climates to normal operational climates. AR 70-38 and MIL-HDBK-310 are the basic documents describing military operating environments with emphasis on extreme conditions. MIL-STD-810G establishes standardized laboratory test methods to determine the effects of simulated and induced environmental factors on Army materiel.

b. The climatic requirements as well as the performance standards for operations, storage, and transit are stated in the capabilities documents. From the requirements document, the environmental testing parameters are derived and tailored to each specific system (in accordance

with MIL-STD-810). Because climatic requirements can substantially impact acquisition and support costs, Army materiel is designed, developed, and tested to operate under conditions less severe than the absolute extremes that may occur within the areas of intended use, thereby implying some risk of failure to operate at times. Test results may be derived from environmental chamber tests or from tests conducted in the natural environment.

c. Simulated climatic testing will be fully and creatively conducted prior to testing in natural environments unless such testing is impractical, e.g., physical limitations, mobility requirements, or Soldier/system requirements preclude its use. Laboratory testing will be designed to determine whether any basic problems exist prior to natural environment tests. If done early, laboratory tests provide an economical method to identify potential design defects, screen materials and systems, and may provide extreme testing of environmental factors.

(1) Test results from climatic chambers will not be interpreted as a total substitute for tests conducted in natural environments because they do not reproduce all of the interacting environmental factors associated with the natural environment. The TEMP will address the need for natural environmental testing.

(2) Systems that fail simulated environmental testing will not be subjected to natural environmental testing until the causes of the failure are resolved and corrections have been made.

d. The military operating environments are characterized by a combination of climate, landform, vegetation, and atmospheric obscurants. Natural environment testing can provide environments analogous to those in which the military equipment will actually be used. This offers distinct advantages for conducting tests in a real-world environment where the equipment and the operator will be affected by combined environmental factors that will challenge military systems in ways not encountered in chamber tests. The natural environment test areas operated by DTC's Yuma Test Center (cold, tropic, and desert) offer the opportunity to test complete systems in a realistic manner, with enough range to fire missiles, operate tanks, fly aircraft, operate vehicles, or detonate explosives.

e. There are four climatic design types recognized by the Army and outlined in AR 70-38 (hot, cold, severe cold, and basic), all containing some element of risk. The system evaluators will generally reflect, as one of the CTPs, the ability of the system to operate in specific climatic design types. AR 73-1 requires that, at a minimum, climatic tests under the basic climatic design-type conditions be completed prior to the system being type classified standard. In accordance with AR 70-38, potentially dangerous systems (e.g., ammunition) will be tested to all climatic design values regardless of their chance of being used in, or the requirement to operate in, those climates. For worldwide operational requirements, the system should be capable of operating in all four climatic design types: hot, basic (to include tropic, i.e., cycles B1 and B2), cold, and severe cold. Given the current military operational environment, serious consideration should be given to including the hot design type in the system requirements. The climatic design types, extracted from AR 70-38, are shown in table Q-1.

417 Table Q-1. Climatic Design Types Climatic Design Type Daily Cycle 1 Hot Basic Ambient Air Temperature 2, o C ( o F) Daily Low Daily High Hot-dry (A1) 32 (90) 49 (120) Hot-humid (B3) 31 (88) 41 (105) Constant high humidity (B1) Variable high humidity (B2) Nearly constant 24 (75) 26 (78) 35 (95) Basic hot (A2) 30 (86) 43 (110) Operational Conditions Solar Radiation, W/m 2 (Bph) 3 0 to 1120 (0 to 355) 0 to 1080 (0 to 343) Ambient Relative Humidity (%RH) 4 3 to 8 59 to 88 Negligible 95 to to 970 (0 to 307) 0 to 1120 (0 to 355) Basic cold (C1) -32 (-25) -21 (-5) Negligible Cold Cold (C2) -46 (-50) -37 (-35) Negligible Severe cold Severe cold (C3) -51 (-60) Negligible Storage and Transit Conditions Induced Air Temperature, o C ( o F) 33 to 71 (91 to 160) 33 to 71 (91 to 160) Nearly constant 27 (80) 74 to to 63 (86 to 145) 14 to to 63 (86 to 145) Tending toward saturation Tending toward saturation Tending toward saturation -25 to -33 (-13 to -28) -37 to -46 (-35 to -50) -51 (-60) Induced Relative Humidity (%RH) 1 to 7 14 to to to 75 5 to 44 Tending toward saturation Tending toward saturation Tending toward saturation NOTE: The numbers shown for the values of the climatic elements represent only the upper and lower limits of the cycles that typify days during which the extremes occur, e.g., for the hot-dry cycle, 49 o C (120 o F) is the maximum daytime temperature, and 32 o C (90 o F) is the minimum nighttime (or early morning) temperature. 1 Designations in parentheses refer to corresponding climatic categories in MIL-HDBK-310 and AR (except the A-3 category) and NATO STANAG 4370, AECTP 200, Category 230, Section 2311; (see Part One, 2.2.1, 2.2.2, and 2.3). 2 º C values (rounded to the nearest whole degree) derived from data obtained/established on º F scale. 3 Bph represents British Thermal Units per square foot per hour. 4 Sequence of RH presentation corresponds to sequence of air temperatures shown (e.g., for HOT-DRY daily cycle, 8 percent RH occurs at 32 o C; 3 percent RH occurs at 49 o C). f. Chamber testing and natural environmental testing complement each other. The natural environmental test facilities offer an opportunity to test complete systems in a realistic manner. The effects of many environmental variables can be seen at once; mission profiles can be followed. AR 73-1 requires that climatic tests under the basic climatic design type conditions be completed prior to type classification standard. Systems specifically designated for use in extreme climates will complete tests in the designated environments before type-classification standard. If the system evaluator determines that chamber or laboratory testing is not adequate to address the issues, climatic tests that are critical to the basic acceptability of the system will be conducted during developmental and operational testing at natural field environmental test sites. Q-5. Contractor Support in Testing a. Title 10 U.S.C., Section 2399, restricts the involvement of system contractors in IOT&E for MDAPs, except for those persons employed by the contractor involved in the operation, ATEC Pamphlet June 2010 Q-5

418 maintenance, and support of the system being tested when the system is deployed in combat. The Defense Acquisition Guidebook (DAG) reiterates this policy for MDAPs. Army guidance (AR 73-1) cautions against system contractor manipulation or influence during IOT&E or any activities that provide input to the system evaluation leading to a full-production decision for major defense acquisition programs. b. During IOT&E, system contractors will not participate except to the extent that they are involved in the operation, maintenance, and other support of the system when it is deployed. They will not participate in collecting, reducing, processing, authenticating, scoring, assessing, analyzing, or evaluating OT data. System contractors will not be directly involved in data authentication group sessions or R&M scoring and assessment conferences. c. Policy permits the results of DT using contractor personnel or the results of contractor inhouse testing to be considered in system evaluation and assessments along with OT data. However, with the integration of Army testing, ATEC is conducting more combined or integrated DT/OT. In some cases, the results of a combined or integrated DT/OT will produce data that the system evaluator will use in an evaluation to support the production decision. It is, therefore, extremely important that all test officers understand the restrictions imposed on system contractor involvement in IOT. d. For all combined DT/OT, the AST will identify any restrictions or limitations on system contractor participating in testing required to preserve the integrity of the test data, including access to the test site or information from testing. These restrictions will be documented in the SEP. All restrictions and limitations for this purpose will be reviewed and approved by the PM and the responsible test center/test directorate. The PM will assure that the restrictions and limitations imposed on the system contractor are consistent with the terms and conditions of the contract. The responsible test center/test directorate will incorporate these restrictions and limitations in test execution and management procedures. Any deviation from these policies will be coordinated with the PM within the T&E WIPT and documented by the system evaluator. Q-6. Environmental Impact 32CFR Part 651(AR 200-2) a. For critical environmental concerns, testing is performed to identify and quantify the emissions, effluents, and wastes produced by the system. Specific subtests designed to measure emissions, effluents, and wastes may be prescribed in the TEMP and the DTP. These environmental quality characteristics and issues will be evaluated in the OMAR/OER. b. In addition, the PM may request that the developmental tester conduct tests and measurements to support the life-cycle environmental documentation. The PM will prepare environmental documentation consistent with the requirements of AR to address the environmental effects pertaining to the use and operation of the system throughout its life cycle. The environmental documentation will be provided to the developmental and operational testers before testing commences at an Army or contractor facility or location. c. Formal environmental documentation is required by Congressional mandates to support all Federal agency actions. Therefore, prior to the initiation of any testing, environmental documentation must be provided by the PM to the developmental tester (IAW AR 200-2). Q-6 ATEC Pamphlet June 2010

419 d. Environmental documentation. There are three levels of environmental documentation that can be submitted by the PM: (1) Environmental Assessment (EA). The EA addresses new and continuing activities where the potential exists for measurable degradation of environmental quality. This document concludes with either a Finding of No Significant Impact (FNSI) or a Notice of Intent (NOI) to prepare an Environmental Impact Statement (EIS). The EA, FNSI, and NOI provide for public disclosure. (2) Environmental Impact Statement (EIS). If the EA shows that the system will impact the environment adversely, or is controversial, an EIS is prepared. It provides full disclosure to the public on all issues associated with a Federal action that has the potential to significantly impact the natural environment. If required, testing is performed to identify and quantify the environmental quality issues. (3) Record of Environmental Consideration (REC). The REC briefly describes a proposed action and contains a checklist explaining why further analysis is not necessary. It is used when a categorical exclusion (see below) applies or there does exist environmental documentation on the item/system/action. e. Categorical exclusions. Actions that do not require an EA or an EIS and have been determined not to have an individual or cumulative impact on the environment may qualify for a categorical exclusion. AR contains a list of such actions. A REC documents this decision. Q-7. Hazardous Waste (AR and AR 200-2) a. It is DOD and Army policy to prevent, mitigate, or remediate environmental damage caused by acquisition programs. Prudent investments in pollution prevention can reduce lifecycle environmental costs and liability while improving environmental quality and program performance. In designing, manufacturing, testing, operating and disposing of systems, all forms of pollution shall be prevented or reduced at the source whenever feasible. The PM must ensure that the system can be tested, operated, maintained, repaired, and disposed of in compliance with environmental regulations. According to Army policy, the use of hazardous materials will be minimized and all alternative options will be considered before using any hazardous materials. b. According to the Defense Acquisition Guidebook, all acquisition programs are required to be conducted in accordance with applicable federal, state, interstate, and local environmental laws and regulations, executive orders, treaties, and agreements. The PM is responsible for developing a hazardous material management program that considers the elimination or reduction of hazardous materials over the management of pollution created for the system s entire life cycle. The management plan should include procedures for identifying, minimizing use of, tracking, storing, handling, packaging, transporting, and disposing of such materials and equipment. NEPA is the driving force for all engineering and scientific analyses required to mitigate environmental impacts and preventing pollution from hazardous waste. c. Under the provisions of AR and AR 200-2, consideration should be given in the early stages of the acquisition cycle to reducing the use of toxic or hazardous materials when new systems are tested. ATEC Pamphlet June 2010 Q-7

Q-8. Integrated Logistic Support (ILS) (AR )

ATEC is responsible for the evaluation of the ILS characteristics of Army systems throughout the life cycle. An ILS analyst will serve as the ILS representative to the AST and as the system evaluator representative to the ILS Management Team. The ILS analyst will provide ILS input to the SEP, OTA TP, OAR, OMAR, OER, and SAR. The evaluation will consist of review and assessment of the adequacy of materiel developer ILS planning, management and execution efforts, analyses, and test results. The evaluation ensures logistics planning, repair parts, and infrastructure are in place prior to each MDR and materiel release. The ILS evaluation is prepared IAW DA Pamphlet . A list of responsibilities is provided in table Q-2 below.

a. Other responsibilities of the ILS analyst include inputting and maintaining the ILS database, providing appropriate ILS input to the TRRs, monitoring tests to ensure the integrity of ILS data on an as-needed basis, and participating in IPTs and ICTs as appropriate. The ILS analyst also

(1) Attends IPRs and MDRs with the Army Evaluator as the SME for ILS issues.

(2) Coordinates ILS findings and positions with ODCSLOG prior to attendance at IPTs, IPRs, materiel release, and other forums where an ILS position or issues must be provided.

b. If ODCSLOG concurs with ATEC's ILS issues and position, ATEC takes these agreed-to ILS issues and positions to the meeting or decision review and presents the DCSLOG ILS issues and position. Where there is significant ODCSLOG interest, ODCSLOG will attend the meeting or decision review and voice the DCSLOG ILS position. As the Army Independent Logistician, the ODCSLOG signs the TEMPs and other documents and votes at meetings where a logistician position is required.

Q-9. Interoperability (AR )

Interoperability T&E is required for virtually all DOD command, control, communications, computers, and intelligence (C4I) systems and information technology (IT) acquisition programs. The Joint Interoperability Engineering Organization (JIEO) and the Joint Interoperability Test Center (JITC) were created to certify C4I systems for DT and OT. This certification is based on testing paid for by the PM.

421 Table Q-2. ILS Responsibilities ODCSLOG ATEC Assess ILS Program Management, Execution X Review, Comment on ILSP, CDD/CPD,SSP, TEMP, MFP, IPS X Represent ILS Issues at IPTs, IPRs, Other Meetings X Assess ILS Programs X Monitor Supportability Testing X ILSMIS Database X Monthly Acquisition Program Review X X Review LD and Maintenance Demos X Periodic ILS Review for OASA(IL&E) X Identify ILS Problems, Help Resolve X Influence System Design X Identify Impacts of ILS Problems X Perform Special Projects for DCSLOG, ASA(IL&E) X Voting Logistician at IPRs X a. Interoperability is defined as the ability of systems, units, or forces to provide services to and accept services from other systems, units, or forces and to use the services so exchanged to enable them to operate effectively together. Interoperability is further defined as the condition achieved among communications-electronics systems or items of communications-electronics equipment when information or services can be exchanged directly and satisfactorily between them and/or their users. The degree of interoperability should be defined when referring to specific cases. (1) Interoperability test issues include interfaces; system development (design changes, delays, and problems); performance limitations of other systems; modeling and simulation; and is usually evaluated qualitatively. (2) Interoperability test parameters include the following: (a) Physical - pins, connectors, alignment, dimensions, volume, weight, etc. (b) Electrical - voltage, cycles, power stability, surge limits, etc. (c) Electronic - frequencies, modes, rates, control logic, telemetry, etc. (d) Environmental conditioning - heating, cooling, shock, vibration, etc. (e) Software - formats, protocols, messages, etc. (f) Data - rates, inputs, characters, codes, etc. b. Compatibility is defined as the capability of two or more items or components of equipment or materiel to exist or function in the same system or environment without mutual ATEC Pamphlet June 2010 Q-9

interference. Compatibility concerns also apply to the instrumentation and augmentation support plans as well as to the tested system. Physical compatibility (e.g., do trailer hitches match) is normally covered under mission performance. Compatibility test issues include the following:

(1) Is DT data useful for OT?

(2) Early System Assessment may identify potential problem areas.

(3) Rigorous test environments.

(4) Changes due to modifications/upgrades.

(5) M&S: use to evaluate interfaces.

(6) Complete prior to full-scale performance testing.

Q-10. Modeling and Simulation (M&S) (AR 5-11)

Considerations for M&S use in T&E:

a. Planning an integrated simulation, test, and evaluation program begins with the traditional T&E planning process in association with a system's Integrating Integrated Product Team (IIPT), the T&E IPT (T&E WIPT), and the ATEC System Team (AST). An M&S WIPT may also be formed, either as another working IPT under the IIPT or under the T&E WIPT. Models and simulations are always considered to support the developmental test, operational test, and evaluation/assessment of systems as they proceed through the life cycle. Testing is conducted to support comprehensive and effective evaluations in support of Army system acquisitions. Credible evaluations are required to support decision-making within the acquisition process. Test and evaluation are interrelated and complementary processes. Both are necessary; however, neither process alone is sufficient. Evaluations judge overall system effectiveness, suitability, and survivability based, in part, on technical and operational requirements, while reassessing as the system evolves. Test results, among other credible sources of data such as M&S, are an integral part of the system evaluation.

b. M&S are used extensively to support the system development T&E process, which includes the software development T&E process. Army systems in development are increasingly complex. Testing of such systems can be large in scope and require conditions that are difficult, if not impossible, to duplicate short of actual combat. The practicalities of costs, test range space, availability of advanced threat systems/surrogates, and safety will necessarily limit test planning and test data availability. M&S can address such limitations. System evaluation may require M&S to integrate available test data so as to extrapolate or interpolate to those conditions that could not be tested due to constraints and limitations in the test environment. While M&S are not a replacement for testing, they are complementary tools to assist in the system evaluation process.

c. The Simulation and Modeling for Acquisition, Requirements and Training (SMART) concept more closely integrates the efforts of the requirements, acquisition, and training communities by integrating M&S starting early in the acquisition process, creating a large trade

space among performance, costs, design, manufacturing, supportability, and training, with the ultimate result of providing systems with greater utility, lower cost, and less burden on the operations and sustainment budget. Planning for SMART involves developing an M&S strategy that is an interconnected part of the overall acquisition strategy for a system. Documenting the M&S strategy in a Simulation Support Plan (SSP) enables the necessary continuity and the acquisition of needed simulation support capabilities as the system matures through its life cycle.

d. The extent of M&S use, whether an existing M&S or a newly developed M&S, in conjunction with T&E is documented in the system's TEMP. All users of M&S must accredit the model for a particular use. Accreditation will be based upon the extent of validation and verification (V&V) associated with the M&S. ATEC Regulation provides additional information. M&S accreditation is especially essential prior to extrapolating, interpolating, or predicting system performance (including software, hardware, and man-in-the-loop). M&S verification, validation, and accreditation (VV&A) status, along with the degree to which M&S will augment test data to assist in system evaluation and assessments, is also documented in the TEMP.

e. Verification addresses the question "Does the M&S work as intended?" Specifically, verification is the process of determining that the M&S accurately represents the developer's conceptual description and specifications. Verification also assesses the extent to which the M&S was developed using sound, standard software engineering techniques. Verification is applied at each stage of the M&S life cycle management process to ensure that inputs and outputs are implemented accurately and properly and support the purpose of the M&S. In short, verification builds confidence in the structural integrity of the M&S.

f. Validation addresses the question "Is the model realistic?" Specifically, validation is the process of determining the degree to which an M&S accurately represents the real world from the perspective of its intended use. Thus, validation examines the concept and output thoroughly. Data obtained from the real world, or from a credible source that has been proven by a recognized expert, are used for comparison with the M&S behavior and results. In short, validation is the cornerstone on which credibility in the M&S is built.

g. All M&S used in testing for evaluation must be accredited. Accreditation is a decision by the user to use an M&S, and its results, for a particular application. However, accreditation is based on properly performed and documented V&V, which is compared against exit criteria, formally called acceptability criteria. The V&V should be conducted and documented as the M&S is being developed. The acceptability criteria are unique to each intended M&S use and provide essential insight into possible solutions. Therefore, the acceptability criteria are standards that the M&S must meet in order to be accredited (i.e., used for a specific purpose). The ATEC Commander (or designee) is the accreditation authority (AA).

Q-11. Range Safety Data (AR , AR , and AR )

a. Army weapons, munitions, and lasers require the development of range devices, as well as safety data, to ensure safe and effective testing, peacetime training, target practice, and tactical

424 employment, to minimize the possibility of accidents during firing and other uses of ammunition and explosives. b. The surface danger area diagram (safety fan) is provided by the PM; however, the test facility may assist in developing the hazardous areas. These safety fans are verified during DT. The verification must be available before OT with troops. It is critical that sufficient ammunition or explosive devices be scheduled for use in the development of these data (see AR ). Q-12. Security (including Communications Security) (AR and AR 530-1) a. Information that is properly classified or that is determined to be unclassified but sensitive (for example, proprietary information and competition-sensitive information) must be safeguarded throughout the life cycle. A comprehensive protection and technology control program shall be established for each program to identify and protect this information. b. The National Security Agency (NSA) is responsible for testing communications security equipment during the early stages of the life cycle. Q-13. Serious Incident Reporting Serious incident reporting provides timely and accurate information concerning incidents of a serious nature of command interest. An immediate telephonic report is made to the ATEC HQ, DCSOPS. A follow-up written SIR must be completed within 24 hours. See AR , Accident Reporting and Records for additional information. The following accidents or incidents are to be reported to the DCSPER, HQ ATEC. a. Class A accident. An Army accident resulting in a total cost of property damage that meets or exceeds $1,000,000; an Army aircraft or missile is destroyed, missing, or abandoned; or an injury and/or occupational illness results in a fatality or permanent total disability. A fatality will be reported regardless of the time between the initial injury and death. Any fatality of either military personnel or Army civilian employee will be reported. This includes non-appropriated fund (NAF) employees, foreign nationals employed by the Army, or non-army personnel when the accident is incurred while performing duties in a work compensable status or a result of Army operations. Personnel missing, and presumed dead, as the result of an accident will be reported as fatalities. b. Class B accident. An Army accident resulting in a total cost of property damage that meets or exceeds $200,000, but is less than $1,000,000; an injury and/or occupational illness resulting in permanent partial disability; or three or more personnel are hospitalized as inpatients as the result of a single occurrence. c. Class C accident. An Army accident resulting in a total cost of property damage that meets or exceeds $20,000, but is less than $200,000; a nonfatal injury that causes any loss of time from work beyond the day or shift on which it occurred; or a nonfatal occupational illness that causes loss of time from work (for example, 1 work day) or disability at any time (lost time case). Q-12 ATEC Pamphlet June 2010

425 d. Class D accident. An Army accident resulting in a total cost of property damage that meets or exceeds $2,000, but is less than $20,000. e. Class E aviation incident. An Army incident in which the resulting damage cost and injury severity do not meet the criteria for a Class A through D accident ($2,000 or more damage; lost time/restricted activity case). A Class E aviation incident is recordable when the mission (either operational or maintenance) is interrupted or not completed. Intent for flight may or may not exist. f. Class F incident (Foreign Object Damage (FOD) aviation incident). Recordable incidents confined to aircraft turbine engine damage (does not include installed aircraft Auxiliary Power Units (APU)) as a result of internal or external FOD, where that is the only damage. g. Damage to Army property (including Government-furnished material (GFM), or Government-furnished property (GFP), or Government-furnished equipment (GFE) provided to a contractor). h. Injuries (fatal or nonfatal) that result from criminal activity where there was intent to inflict injury. i. The arrest of military personnel or Army civilian employee by either military or civilian authorities for crimes such as rape, murder, robbery, fraud, etc. The DCSPER, HQ ATEC will be informed of the final disposition of each incident. j. The unauthorized absence of Soldiers (Soldiers in an absence without leave (AWOL) status). Soldiers will be reported to DCSPER, HQ ATEC, on the effective date of AWOL, and the effective date of return. In the event a Soldier has not returned by the 30 th day, he/she will be dropped from the rolls (DFR) in accordance with DA Pamphlet 600-8, paragraph 9-12, procedures 9-8, and DCSPER, HQ ATEC will be notified. Q-14. Threat Considerations (AR and DA Pamphlet 73-1) Threats must be identified, approved, and updated continuously throughout the life cycle. Threats fall into the category of either threats to a Blue system or targets of the Blue system. Both must be addressed, since the major threats to a Blue system may not be the targets of the Blue system. Such is the case for many Intel systems. DA-approved threat or system-specific threat definitions developed in accordance with appropriate regulations will be employed when tests are planned, designed, and conducted. Representations of threat are identified in the TEMP. a. Modeling and Simulation of the Threat. Application of M&S techniques should be considered as a means to offset the impacts of test limitations and assess the impacts of uncertainties that exist in the threat data used in the test. b. Threat Aspects. Threat aspects include current and projected capabilities of a potential enemy to limit, neutralize, or destroy the effectiveness of a mission, organization, or item of equipment under battlefield conditions at a post-ioc date. ATEC Pamphlet June 2010 Q-13

c. System Threat Assessment Report (STAR). The STAR is the basic threat document supporting system development for both major and non-major systems. It is used to define the threat environment in which a developmental system must function throughout its life cycle. The STAR should address both force-on-force and unstructured force (insurgents, etc.).

d. Threat Test Support Package (TTSP). The TTSP is derived from the STAR but is more detailed; it provides the threat scenarios to support a specific test and assesses the impacts of threat-related test limitations, including force-on-force and unstructured force.

e. Target and Threat Simulators. The PM ITTS is responsible for the engineering, development, acquisition, fielding, and capability accounting of Army targets, threat simulators, and major range instrumentation for DT and OT. ATEC and its subordinate commands develop the requirements for targets and threat simulators/simulations to support both DT and OT. In addition to the target and threat simulators developed by PM ITTS, the Joint Improvised Explosive Device Defeat Organization (JIEDDO) and the Joint Test Board (JTB) develop threats for counter-IED (CIED) tests.

f. Threat Simulator and Target Validation/Accreditation. Validation is the process used to determine whether a simulator or target provides a sufficiently realistic representation of a corresponding threat system to justify continuation of its development, use, or modification to restore or improve its capabilities to conform to current intelligence estimates. Threat Simulator and Target Validation Working Groups make this determination. They are composed of representatives of user, intelligence, and simulator/target development organizations. The PM ITTS determines when these working groups are required, informs the U.S. Army Test and Evaluation Office (TEO), and participates in the meetings. TEO charters the working groups and designates the chairman. Accreditation of ACAT I through III threat simulators and targets is accomplished through a Threat Accreditation Working Group (TAWG). ATEC is a member of the TAWG, representing the T&E community.

g. Vulnerability Assessment of Information Technology. The Army's strategy is to integrate information systems protection into the development of information systems, infrastructure, TTPs, and training. System-of-systems assessment supports a balance between systems, not overprotecting one part of the infrastructure at the expense of another.

(1) No system will proceed to a full-rate production decision without first having undergone a thorough IA vulnerability assessment and incorporating appropriate solutions. Procurement of NDI or COTS items does not negate this requirement.

(2) ATEC will assist the PM in conducting the assessment or providing recommended solutions.

Q-15. Training Aids, Devices, Simulators, and Simulations (TADSS) (AR )

a. DODD requires that acquisition programs consider the total system and not just the prime mission equipment. According to AR 73-1, the planning and execution of T&E of Army systems includes the ancillary equipment and components (such as training devices, ground support equipment, and field maintenance test sets) associated with those systems.

427 b. Army-wide life cycle management of system and non-system TADSS starts with the identification of a need and continues through fielding and final disposition (to include maintenance and logistical support). TADSS are developed and acquired to support designated tasks in developmental or approved individual or collective training programs, Soldier manuals, military qualification standards, or Army training and evaluation programs, thereby improving readiness. c. There are two categories of TADSS: (1) System TADSS are designed for use with a system, family of systems, or equipment (including subassemblies and components). They may be stand-alone, embedded, or appended. Using system-embedded TADSS is the preferred approach when practical and cost effective. (2) Non-system TADSS are designed to support general military training and non-system specific training requirements. d. The planning and programming for the acquisition of system TADSS are the responsibility of the system PM. The planning, programming, budgeting, and management of RDTE for the acquisition of non-system TADSS reside with HQDA. e. ATEC provides T&E support for both system and non-system TADSS. DTC and OTC provide DT and OT, respectively. AEC provides the system evaluation to address CTPs, performance levels, proficiency, and effectiveness as well as the training effectiveness and utility of system training devices. Q-16. Transportability (AR 70-44, AR 70-47, and AR ) a. Transportability refers to the ability of a system to be moved by towing, by selfpropulsion, or by carrier via the railway, highway, air, waterway, or helicopter, and airdrop modes of transportation utilizing the existing or planned transportation support equipment. b. Transportability is a major consideration in the T&E of Army systems, including system components and spare parts. T&E of transportability will address the end-item in its tactical and packaged or shipping configurations, as well as associated support equipment and TMDE. This focus allows the evaluator to determine if the system is deployable. c. Transportability testing is accomplished to support the transportability assessments of the Surface Deployment Distribution Command (formerly MTMC, Military Traffic Management Command) and to obtain a transportability approval from SDDC. This testing also supports the U.S. Army Soldier and Biological Chemical Command s (SBCCOM) efforts in providing certification for external air transport by rotary-wing aircraft, internal air transport by fixed-wing and rotary-wing air transport, and airdrop. These certifications are required prior to transportability approval. d. Transportability T&E is required as part of DT. The ability of a system to withstand the expected transport environment over the useful life of the system must be demonstrated in T&E before the production decision. Transportability may be a separate subtest (e.g., lifting and tie down, rail impact, aircraft test loading, and low velocity airdrop) or included in the logistic ATEC Pamphlet June 2010 Q-15

428 supportability subtest. It is also appropriate to evaluate transportability during OT. Soldiers that normally prepare the system for movement should be used during these tests under realistic conditions. e. The transportability evaluator works closely with the acquisition community through the ILSMT, T&E WIPT, and other program review forums to provide a continuous assessment of transportability for all assigned acquisition programs. Q-17. Human Research Protections Programs (HRPP) AR 70-25) a. Activities which in the past were referred to as Human Use Committee (HUC) are now referred to as Human Research Protection Programs (HRPP). Whenever ATEC operations involve the use of humans as subjects of human research, such operations will be conducted under the authority granted by the Army Human Research Protections Office (AHRPO) and issued as an Assurance for the Protection of Human Research Subjects (hereafter referred to as Assurance ). Human subjects research is defined as a systematic investigation, including research, development, testing and evaluation, designed to develop or contribute to general knowledge. A human subject is a living individual about whom an investigator conducting research obtains data through intervention or interaction with the individual or identifiable private information. b. DTC and OTC will (1) Obtain an Assurance from the AHRPO prior to conducting any human subjects research activities. (2) Comply with all provisions of their approved Assurances. (3) Develop and apply procedures, in accordance with their Assurances, to screen all proposed testing to ensure proper identification of human subjects research. (4) Ensure that any of their activities which fit the definition of human subjects research are conducted within the provisions of their supporting Assurances. Q-16 ATEC Pamphlet June 2010

Appendix R
Assessing Risk

R-1. Purpose
The purpose of this appendix is to provide additional information, references, and techniques for assessing and addressing risks associated with test and evaluation. Risk assessment provides a formal means of managing issues associated with the T&E of systems undergoing acquisition.

R-2. Assessing Risk and Decision Theory

a. In everyday life we might use risk assessment to examine the possible consequences of our actions. In T&E, we can use risk assessment to track the effectiveness, suitability, and survivability of our systems. Associated with each issue is the risk of not being able to meet expectations. Suppose, for example, there is a requirement to interoperate with another system. The risk is the inability to meet whatever standard we may set, say a 99% message completion rate. This risk can be rated as High, Medium, or Low (and color coded with the customary red, yellow, or green). Multiple systems can be tracked on the basis of their risk ratings.

b. Risk assessment can also be used to balance the need to test against operational risks. The imperative to field equipment as soon as possible forces us to tailor operational testing to operational risks. In his briefing "The Challenge of Transformation," Paul Muessig states that we should only test when the operational risks associated with fielding a new capability can be reasonably projected to be high, namely, when there is a high probability of critical mission failure, unacceptable risk to life or limb, or unjustifiable probability of damage to, or loss of, materiel. The Director, Operational Test and Evaluation (DOT&E) has produced a memorandum, Guidelines for Conducting OT&E for Software-Intensive System Increments, 16 June 2003 (see appendix S), that describes a method for determining the level or amount of testing that should be done based on known risks. It is a modification of traditional risk assessment that can be used for tailoring test events to the operational risks associated with fielding any system not required by law to undergo IOT&E.

c. The problem with risk assessment is that so much of it is based on subjective judgment. Much can be discovered through research and testing, but testing may not always be possible, or the results may not provide conclusive evidence. As a consequence, the Evaluator may be forced to make a best guess regarding the risk associated with fielding a system. The Evaluator may feel forced to make decisions and to act without knowing all the facts. Furthermore, the tools at our disposal do not effectively allow us to assess the combined impact of multiple risks. All we can really do is document each risk and use the documentation as a basis for projecting an outcome.

d. So, at best, it is often difficult to assess risks. Of course, the task is even more difficult if, in addition to assessing risks that exist today, we are asked to project risks at some future milestone. Here, there is always the danger of projecting an outcome, usually in the desired direction, for which there is very little basis.

R-3. The Likelihood of Risk and Risk Impact

a. Risk is the potential that something will go wrong. A risk event is the something that could go wrong. Suppose a PM contracts for the development of software for the XYZ system in order to send messages to the JKL system. There are several possible risk events. The potential of being unable to send a message is a risk. Elements of the development process where the ability to communicate with the JKL system is designed into the XYZ system could be considered developmental risk events. Instances where system users could not exchange messages could be considered operational risk events.

b. Risk, or the potential of failing to achieve a certain outcome, has two components: (1) the likelihood, or probability, of the undesirable outcome and (2) the likely impact, or loss, given the undesirable outcome. The likelihood of a risk is determined by assessing the risk event. Its impact is assessed based on known causal relationships. Typically, these two aspects of a risk are used to assess the level of risk.

R-4. Two Approaches
Two approaches are looked at in this section. One is based on the Defense Acquisition University's Risk Management Guide for DOD Acquisition. The second is based on rudimentary decision theory (Hays, 1963). Both approaches focus on the two components of risk, the likelihood and the impact of an undesirable outcome, and on establishing risk levels. The two approaches differ with regard to the assumptions that are made, how the likelihood of a risk is assessed, and how a risk level is determined.

a. The DOD guide assumes the Evaluator wants an estimate of the level of risk involved in producing or fielding a system. No assumptions are made about whether the Evaluator will have data on which to base his decision; the approach relies on the ratings of subject matter experts (SMEs) to determine the likelihood of a risk event and its impact. The ratings are employed to determine the level of risk (low, medium, or high) using a Likelihood/Consequence Matrix (table R-4). (The approach avoids quantifying the level of risk because ratings are made on ordinal scales with unequal intervals.) This approach specifically addresses program management risk but can be used to cope with operational risks (i.e., risks that might occur when the system is fielded).

b. The decision theory approach assumes the Evaluator will make a decision about system performance based on a set of data and wants to know the amount of risk associated with developing or fielding a system, as well as the level of risk associated with not developing or not fielding a system. The approach uses SME judgments to establish the impact of a risk but uses exact binomial probabilities to determine the likelihood of a risk. The exact binomial probabilities require the Evaluator to subjectively estimate the percent of time a system function will be successfully performed. The goodness of the risk assessment results will depend on the goodness of the estimate. The Evaluator also develops one or more decision rules that take into account available evidence and that can be used to make a decision. The level of risk associated with a decision is the product of the likelihood of occurrence of the risk and the level of its impact:

Level of Risk = (Probability of Occurrence) x (Level of Impact)
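As an illustration only, the sketch below works through the decision-theory computation just described for the interoperability example in R-2: the evaluator's 90% success-rate estimate, the 18-of-20 decision rule, and the numeric impact level are assumptions invented for the example, not values prescribed by this pamphlet.

from math import comb

def binom_cdf(k, n, p):
    """Exact P(X <= k) for X ~ Binomial(n, p); no external libraries needed."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def risk_level(p_success, n_trials, min_passes, impact):
    """Level of risk = P(decision rule is not satisfied) x level of impact.

    p_success  - evaluator's subjective estimate of the per-trial success rate
    n_trials   - number of trials in the planned test
    min_passes - decision rule: require at least this many successful trials
    impact     - SME-assigned impact level (here an assumed 1-5 numeric scale)
    """
    p_fail_rule = binom_cdf(min_passes - 1, n_trials, p_success)
    return p_fail_rule * impact

if __name__ == "__main__":
    # Estimated 90% message completion rate, 20 messages attempted,
    # rule requires 18 or more completions, impact judged 4 on a 1-5 scale.
    print(round(risk_level(0.90, 20, 18, impact=4), 2))

Tightening the decision rule or lowering the estimated success rate raises the computed risk level, which is exactly the sensitivity the Evaluator is expected to examine before settling on a rule.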

R-5. DOD Guide
The DOD guide recognizes that risk is inherent in any program and seeks to cope with risk through risk management. Risk management is the process of identifying/projecting the range of possible future events and outcomes and managing those future events to ensure favorable outcomes. It consists of five processes: risk planning, assessment, handling, monitoring, and documentation.

a. The core processes are risk assessment and risk handling. During the risk assessment process, risk events are identified and analyzed. In the identification stage, a work breakdown structure (WBS) (i.e., functional dendritic) is developed, and each WBS element or process is examined in terms of possible sources or areas of risk. These areas, along with example risks, are given in table R-1. Once risks are identified, the likelihood that each risk will occur and the likely impact on performance, schedule, and cost are rated on five-point scales by the SME. These scales are given in tables R-2 and R-3. Risks are also assigned a risk rating using the Likelihood/Consequence Matrix given in table R-4. As can be seen in table R-4, the likelihood and impact ratings are used to determine the level of risk associated with a risk event. Possible risk ratings are defined in table R-5. Risk events are then prioritized in a table. An example of a prioritized risk event is given in table R-6.

Table R-1. Risk Areas and Example Risks

Risk Area | Example Risks
Threat | The threat is unknown or poorly represented; the system is not capable of withstanding or avoiding man-made hostile environments.
Requirements | Requirements are unknown, changing, or unrealistic; expected operational environment is unknown.
Design | System will not meet user requirements; needed operator skill sets are unknown or not available; design is not cost effective or needed technologies are not available.
T&E | Testing does not address important issues, does not use production representative systems, lacks an operational environment, or is based on small sample sizes.
M&S | Models are not accredited.
Technology | Technology is immature, unproven, too complex, or too costly.
ILS | System is not supportable or logistics have not been adequately addressed.
Production/Facilities | Production process has not been adequately addressed or needed facilities are not available.
Concurrency | Coordination of needed technologies, production, and funding has not been considered.
Capability of Developer | Developer has limited experience, a history of failure, or lacks necessary personnel.
Cost/Funding | Projected costs are unrealistic; cost-performance tradeoffs have not been adequately considered; projected life-cycle costs are excessive.
Schedule | Schedule is unrealistic or resources needed to meet the schedule are unavailable.
Management | Acquisition strategy does not adequately address mission needs, T&E, required personnel, cost and funding, production, or schedule; risk assessment has not been performed.

Table R-2. Likelihood Criteria

Level | Likelihood Risk Event Will Happen
a | Remote
b | Unlikely
c | Likely
d | Highly Likely
e | Near Certainty

Table R-3. Impact Criteria
(each cell gives the magnitude of impact in that area)

Level | Performance | Schedule | Cost
A | Minimal or no impact | Minimal or no impact | Minimal or no impact
B | Acceptable, some reduction in margin | Additional resources needed, able to meet dates | <5%
C | Acceptable, significant reduction in margin | Minor slip in key milestone | 5-7%
D | Acceptable, no remaining margin | Major slip in key milestone | 7-10%
E | Unacceptable | Can't achieve key milestone | >10%

Table R-4. Likelihood/Consequence Matrix: Risk Rating as a Function of Likelihood and Consequence of the Risk Event
(rows: likelihood rating a-e; columns: consequence rating a-e)

Likelihood | a | b | c | d | e
a | L | L | L | L | M
b | L | L | L | M | M
c | L | L | M | M | H
d | L | M | M | H | H
e | M | M | H | H | H

Table R-5. Example Risk Ratings and Criteria

Risk Rating | Description
High | Major disruption likely
Moderate | Some disruption
Low | Minimum disruption
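Because the matrix in table R-4 is simply a two-key lookup, it can be encoded so that many risk events are rated consistently. The snippet below is a minimal illustration of that lookup as reconstructed above, not an ATEC application.

# Risk rating lookup per the Likelihood/Consequence Matrix (table R-4).
# Rows are likelihood levels a-e; columns are consequence levels a-e.
RISK_MATRIX = {
    "a": {"a": "L", "b": "L", "c": "L", "d": "L", "e": "M"},
    "b": {"a": "L", "b": "L", "c": "L", "d": "M", "e": "M"},
    "c": {"a": "L", "b": "L", "c": "M", "d": "M", "e": "H"},
    "d": {"a": "L", "b": "M", "c": "M", "d": "H", "e": "H"},
    "e": {"a": "M", "b": "M", "c": "H", "d": "H", "e": "H"},
}

def risk_rating(likelihood, consequence):
    """Return L/M/H for a likelihood level (a-e) and a consequence level (a-e)."""
    return RISK_MATRIX[likelihood.lower()][consequence.lower()]

if __name__ == "__main__":
    # A "highly likely" (d) risk event that would miss a key milestone (e) -> High
    print(risk_rating("d", "e"))

The same structure extends directly to the five-level AMSAA matrix in table R-8 by substituting its row and column keys and ratings.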

Table R-6. Example Presentation of Rated Risk

Priority: 1. Area/Source: Design. Location: WBS. Risk: Design not completed on time. Probability: Highly Likely. Consequence: Can't achieve key milestone. Risk Rating: High.

b. Difficulties include deriving the likelihood and impact ratings, which are often based on subjective assessments, limited data, or opinion. The problem is compounded when assessing impact because three factors of the risk event (performance, schedule, and cost) must be assessed; absent prior knowledge, the impact of a risk event on each factor may be little more than a guess. Further, the impact levels in table R-3 are roll-ups of the three ratings: the overall impact level assigned to the risk event is the highest of the three ratings.

c. In most cases, there will be multiple risks associated with a system or system component. These may be rolled up into a single overall rating by assigning to the system or component the highest risk level associated with any of its risks (see the sketch below). The approach is valuable only if the various risks assessed are well documented in terms of what they are and their causes, probabilities, and impacts.

d. Once risks are identified and rated, they are handled. The primary methods of handling risks are risk control, risk avoidance, and risk assumption. Risk control does not attempt to eliminate the source of risk; it seeks to mitigate risk by reducing the likelihood of the risk occurring or the magnitude of its impact should it occur. For example, the risk might be a failure to sell sufficient quantities of a certain product to achieve a specific cash inflow; the mitigation plan could be advertising or packaging the product in a certain way. Risk avoidance seeks to eliminate the source of the risk; here, the plan could simply be to not engage in the marketing venture. Risk assumption entails accepting the risk without attempting to mitigate or eliminate it. This approach is realistic as long as the resources (e.g., dollars, people, time) needed to overcome the risk are identified and available for use should the risk materialize.

R-6. AMSAA Risk Assessment
A very similar approach is used by the Army Materiel Systems Analysis Activity (AMSAA). This approach, described in the AMSAA Risk Assessment Primer, provides a means of determining the likelihood of a risk event. Probabilities are determined based on the maturity of the system technologies and the extent of testing that has occurred. The criteria used to assess the maturity of the technology and the extent of testing are given in table R-7. As shown in table R-8, AMSAA uses a five-point, as opposed to a three-point, scale to assess risk level. An example is given in figure R-1.
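The roll-up rules in paragraphs R-5b and R-5c can likewise be expressed as a brief sketch. The code below is illustrative only and is not drawn from the DOD guide or the AMSAA primer; names are notional.

# Illustrative sketch of the roll-up rules in paragraphs R-5b and R-5c.
# Impact levels use the letters A-E of table R-3; risk ratings use the
# ordering L (Low) < M (Moderate) < H (High) of tables R-4 and R-5.

IMPACT_ORDER = "ABCDE"          # A = minimal or no impact ... E = unacceptable
RATING_ORDER = ["L", "M", "H"]  # Low < Moderate < High

def overall_impact(performance, schedule, cost):
    """The overall impact level is the highest (worst) of the three factor ratings."""
    return max(performance, schedule, cost, key=IMPACT_ORDER.index)

def overall_risk(event_ratings):
    """The overall system or component risk is the highest rating among its risk events."""
    return max(event_ratings, key=RATING_ORDER.index)

# Example: impact ratings of B (performance), D (schedule), and C (cost)
# roll up to D; risk events rated L, M, and H roll up to H.
print(overall_impact("B", "D", "C"))   # -> "D"
print(overall_risk(["L", "M", "H"]))   # -> "H"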

Table R-7. AMSAA Likelihood Criteria

Low. Analysis/simulations: show meeting requirements over most conditions. Component or subsystem: all developed, mature technology. Integration demonstrated: all subsystems. Field test level: system. Tested and met requirements: over range of conditions.
Medium Low. Analysis/simulations: demonstrate high probability of meeting requirements. Component or subsystem: mature and developed technology. Integration demonstrated: major subsystems integrated; rest integrated on another system. Field test level: system. Tested and met requirements: for many conditions.
Medium. Analysis/simulations: show meeting requirements over majority of conditions. Component or subsystem: majority developed; those undeveloped based on mature technology. Integration demonstrated: major subsystems; testing and analysis indicates full integration possible. Field test level: major subsystems. Tested and met requirements: for some conditions.
Medium High. Analysis/simulations: show marginal likelihood of meeting requirements. Component or subsystem: majority undeveloped; some undeveloped based on state-of-the-art technology. Integration demonstrated: some components integrated into subsystems. Field test level: some subsystems and components. Tested and met requirements: for limited conditions.
High. Analysis/simulations: show cannot meet requirements over many conditions. Component or subsystem: pushing state-of-the-art components at lab level. Integration demonstrated: none. Field test level: no systems or components. Tested and met requirements: no.

Table R-8. AMSAA Likelihood/Consequence Matrix

Likelihood    Consequence Rating
Rating        L     ML    M     MH    H
L             L     L     L     ML    ML
ML            L     ML    ML    M     M
M             L     ML    M     M     MH
MH            ML    M     M     MH    H
H             ML    M     MH    H     H

Figure R-1. Derivation of Risk Level for the Joint Common Missile Seeker Dome (the figure applies the AMSAA likelihood criteria of table R-7, together with performance and schedule/cost impact criteria, to derive an overall risk assessment of Medium)

R-7. Operational Risks
The DOD Guide methodology can be modified to assess operational risks. The elements of the WBS (i.e., functional dendritic) are still assessed in terms of the risk areas given in table R-1, and the probability of a risk is determined by an SME or by using the AMSAA methodology. However, instead of assessing the impact of a risk on the program (i.e., system development), the impact of the risk on mission accomplishment is assessed. One possible scheme for assessing the impact of a risk on mission accomplishment is provided in table R-9. Impacts are rated on a four-point scale, requiring a slight change to the Likelihood/Consequence Matrix (table R-10). As indicated, a rating of Likely or higher for critical mission failure, death, or major equipment loss yields a high risk rating. A rating of Highly Likely or higher for accomplishing a mission with major difficulties, permanent disability, or small-scale major equipment damage also results in a high risk rating.
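These two high-risk rules can be expressed compactly. The following sketch is illustrative only; the consequence category labels are notional paraphrases of the table R-9 categories, and the remaining cells of the table R-10 matrix are not encoded here.

# Illustrative sketch of the two High-rating rules described in paragraph R-7.
# The consequence category labels are notional; the remaining likelihood and
# consequence combinations of table R-10 are intentionally left undetermined.

LIKELIHOOD_ORDER = ["Remote", "Unlikely", "Likely", "Highly Likely", "Near Certainty"]

# Consequences that yield a High rating at a likelihood of Likely or above.
SEVERE = {"critical mission failure", "death", "major equipment loss"}
# Consequences that yield a High rating at a likelihood of Highly Likely or above.
MAJOR = {"mission accomplished with major difficulties", "permanent disability",
         "small-scale major equipment damage"}

def is_high_operational_risk(likelihood, consequence):
    """Apply only the two High-rating rules stated in paragraph R-7."""
    rank = LIKELIHOOD_ORDER.index(likelihood)
    if consequence in SEVERE and rank >= LIKELIHOOD_ORDER.index("Likely"):
        return True
    if consequence in MAJOR and rank >= LIKELIHOOD_ORDER.index("Highly Likely"):
        return True
    return False

print(is_high_operational_risk("Likely", "death"))      # -> True
print(is_high_operational_risk("Unlikely", "death"))    # -> False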
