Student Guide Week 1 March 2017 Version


1 TST 204 Intermediate Test & Evaluation Course Student Guide Week 1 March 2017 Version


Intermediate Test & Evaluation Course Week One Student Book Table of Contents
Lesson 1 Administration / Introductions
Lesson 2 T&E Policy Exercise
Lesson 3 Current Events
Lesson 4 T&E Early Planning
Monday Homework: Software, Interoperability, & Cybersecurity Exercise
Lesson 5.1 Software T&E
Lesson 5.2 Cybersecurity T&E
Lesson 5.3 Software, Interoperability, & Cybersecurity Exercise
Lesson 6.1 TEMP Development
Tuesday Homework: MS B TEMP
Lesson 6.2 T&E Requirements
Lesson 6.3 KPP, CTP, COI, and MOE/MOS Exercise
Lesson 6.4 Test Resources
Lesson 6.5 TEMP Review Exercise
Wednesday Graded Assignment: MS B TEMP
Lesson 7 Human Systems Integration
Lesson 8.1 Risk Management
Lesson 8.2 Risk Management / ESOH Exercise
Lesson 9 Data Mgmt. & Test Scenarios Exercise
Thursday Homework (see Lesson 10): DT Test Execution
Lesson 10 DT&E Test Execution Exercise


Admin / Intro Lesson 1 Administration / Introductions


TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet
Lesson Number: Lesson 1
Lesson Title: Administration and Introductions
Lesson Time: 1.5 hours (1 hour on Monday of week 1; 30 minutes on Friday of week 1)
Terminal Learning Objective: Given instructor-led administration discussions, the student will comply with routine administrative requirements and will be introduced to the instructor(s) and other students in the class.
Enabling Learning Objectives:
Complete enrollment forms and student data sheets.
Review the course matrix.
Resolve billeting and/or non-availability issues with the site coordinator.
Provide an overview of the course assessments.
Introduce self to the class and briefly discuss T&E-related experience.
Provide information concerning DAU products and services.
Learning Method: Class discussion and participation.
Assignments: None
Estimated Student Preparation Time: None
Method of Assessment: N/A
References: Student Roster and Evaluation Feedback Form


Welcome to TST 204
Agenda: Briefing for DAU Students (200 Level); TST 204 Description, Overview, and Expectations; Student Introductions

For 100- & 200-Level Courses
Working with you to achieve mission success
Enabled to Support the Acquisition Workforce
10 USC Sec. 1746: The Secretary of Defense shall establish and maintain a defense acquisition university structure to provide for the professional educational development and training of the acquisition workforce.
DAU Mission: Provide a global learning environment to develop qualified acquisition, requirements and contingency professionals who deliver and sustain effective and affordable warfighting capabilities.

LOCATED WITH OUR CUSTOMERS
Region / Location / FY16Q1:
C/NE: Fort Belvoir, VA - 38,301
Mid-Atlantic: California, MD - 29,922
Midwest: Kettering, OH - 21,883
South: Huntsville, AL - 34,718
West: San Diego, CA - 31,633
Total: 156,457
We are part of the community, not just a place to take classes.
College of Contract Management (CCM)
Partnership between DAU and the Defense Contract Management Agency (DCMA)
CCM Mission: Design and deliver DCMA-defined training assets to further develop Agency acquisition professionals who deliver actionable acquisition insight from the factory floor to the front line around the world.
23 contract management courses fielded: contract administration, quality assurance, industrial manufacturing/supply chain, software engineering, earned value management, aircraft operations
Over 20 more courses in development: in addition to the above functional areas, pricing, systems engineering, and multi-functional courses
Several CCM courses are open to non-DCMA Defense Acquisition Workforce members

12 Defense Systems Management College (DSMC) DAU college specializing in executive-level acquisition and requirements training Objective: expand the knowledge base and training of senior acquisition leaders in support of Department of Defense initiatives Executive-level Acquisition Course International Acquisition Management Courses Requirements Management Training Acquisition Leadership Training Executive Coaching Mission Assistance DAU Acquisition Learning Model Courses Continuously improved delivery Critical thinking Understanding industry Pull-Learning When You Need It New organization Improved website (Summer 2016) More video, tools, communities Connected to courses DAU Comes to You More structured workshops e.g. source selection and should-cost More faculty involvement Connected to Workflow and Foundational Learning 12

13 DAU: TRAINING COURSES AND MORE Training Classroom & online DAWIA, Core Plus, & Executive Targeted Training - Tailored organizational training Rapid Deployment Training - On-site & online training on the latest AT&L policies Formal & informal learning at the point of need Continuous Learning CL Modules - Online, self-paced learning modules Conferences - DAU Acquisition Community Training Symposium, Hot Topic Forums Knowledge Sharing DAP - Online portal to Big A & HCI knowledge ACC - DoD's online collaborative communities Knowledge Repository and Acker Archives - Online connection to publications and DAU research resources Mission Assistance Consulting - Helping organizations solve complex acquisition problems Team Training Services Acquisition Workshops, MDAP event-driven training Executive Coaching For acquisition senior leaders 9 DAU s icatalog Most current resource for information regarding DAU courses and the Certification & Core Plus Development Guides Accessible from the DAU home page ( or directly at 13

14 Test & Evaluation Level I Certification ACQ 101 Fundamentals of Systems Acquisition Management 25 hrs, online TST 102 Fundamentals of Test and Evaluation Level II Certification SYS 202 Intermediate Systems Planning, Research, Development and TST 204 Intermediate Test & Evaluation Engineering, Part I 18 hrs, online 9.5 days classroom 9 hrs, online ENG 101 Fundamentals of Systems Engineering 35 hrs, online CLE 023 Modeling and Simulation for Test and Evaluation CLE 074 Cybersecurity Throughout DoD Acquisition Associates Degree 1 Year Experience ACQ 202 Intermediate Systems Acquisition, Part A 25 hrs, online CLE 003 Technical Reviews CLE 030 Integrated Testing CLE 035 Introduction to Probability and Statistics CLM 013 Work-Breakdown Structure ACQ 203 Intermediate Systems Acquisition, Part B 4.5 days classroom CLR 101 Introduction to JCIDS CLE 029 Testing in a Joint Environment CLE 301 Reliability and Maintainability CLM 016 Cost Estimating Bachelor Degree + 24 Hrs STEM 2 Years Experience Level III Certification TST 303 Advanced Test and Evaluation 4.5 days classroom CLB 008 Program Execution CLB 009 PPBE and Budget Exhibits CLV 016 Introduction to EVM CLL 015 Product Support Business Case Analysis (BCA) CLM 014 IPT Management and Leadership CLM 031 Improved Statement of Work Bachelor STEM Degree 4 Years Experience 11 DAU Strategic Partnerships More than 150 colleges & universities offer credit for DAU courses toward degrees and certificates Excel-erate Your Master s Degree Through this program, partner universities are offering the Defense Acquisition Workforce credit toward masters degrees for DAWIA Level II and III certification. Impact: Saves time, tuition assistance dollars and out of pocket expenses 14

15 Helping Meet Continuous Learning Requirements Providing Online Tools To Enhance Job Performance Defense Acquisition Portal A one-stop source for acquisition information and tools Ask A Professor Got an acquisition question? Go to the experts! PM Toolkit All the information a program manager could ever ask for in one convenient location Service Acquisition Mall All the tools and templates one needs to create performance-based service acquisition requirements Acquisition Community Connection Where the DoD and AT&L workforce meets to share knowledge DAU Media Video clips from senior leaders on acquisition topics Acker Library and Knowledge Repository Better Buying Power Defense Acquisition Guidebook The acquisition policy and discretionary best practice guide 15

16 Stay Current on Acquisition Policy & Research Our periodicals recognize students as subject matter experts, showcasing their original content to high-level policymakers Professional Development Opportunities The DAU Alumni Association provides a means for continuing professional growth within the defense acquisition community and helps workforce members meet their continuous learning requirements. The Association hosts the annual Acquisition Community Symposium and a number of Hot Topic Forums. If you are a government employee attending Defense Acquisition University, you are eligible for a three year free membership in the National Defense Industrial Association. 16

17 Connect With /Defense-acquisition-university /defenseacquisitionuniversity Access DAU resources on your mobile device at: Student Academic Policies & Information Students should visit the Student Policies and Information page at for information on: Student Standards of Conduct Violations of the Standards of Conduct Course Enrollment, Extensions, and Walk-ins Disenrollment, Dropping a Course, and Wait Lists Course Prerequisite/Pre-course Work Requirements Student Travel Student Assessment and Evaluation Student Attrition Codes Accommodating Students with Disabilities Transferring Students Between Career Fields (Programs) and from Other Institutions Test Reset Policy and Procedures Student Transcripts, Records Retention, and Disclosure of Student Academic Records (Privacy) Student Complaint/Grievance Procedures DAU encourages students who have a concern or issue with the learning environment to discuss it with their instructor. Students who feel their issue is not resolved satisfactorily may go to the department chair/site manager or Regional Associate Dean for Academics. 17

18 Connect with Our Learning Assets at DAU Training & CL Courses Defense Acquisition Portal Mission Assistance Acquisition Community Connection The training you get from DAU Better Buying Power DAU Knowledge Repository helps you support our warfighters. 18

TST 204 Purpose
TST 204 builds on your knowledge, skills, and on-the-job experience relating to DoD T&E policies, processes, and practices. The course structure includes lectures and practical exercises that apply T&E concepts and principles.
Course topics include: the role of T&E in systems acquisition; T&E program planning (including TEMP development); managing a T&E program; and planning, conducting, and processing the results of T&E events.
The intent is to provide you with perspectives on T&E that you may not otherwise get within your current job assignment. The more you know about other T&E stakeholders' concerns, the better we can work together in planning and executing test programs.
Course Overview
Course Matrix & Materials; Course Overview; Student Book; Student Reference CD (see the CD description at the end of this lesson); TST 204 Acronym List & DAU Glossary (on student CD); Homework, most evenings (see your class schedule)
Course Evaluation: Pre-Work Submissions (10 points); Individual Presentations/Briefings (20 points total) - each student must present at least two briefings; Two Exams (50 points total); Two Graded Assignments (20 points total). Passing grade is at least 80 points (of 100 possible points).
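As a quick arithmetic check of the course evaluation above, the following minimal Python sketch (illustrative only, not part of the official course materials) totals the four graded components and applies the 80-point passing threshold. The component names and maximum point values come from the course overview; the helper function and example scores are hypothetical.

```python
# Illustrative tally of the TST 204 grading structure described above.
# Maximum points per component are from the course overview slide;
# the example student scores are hypothetical.
MAX_POINTS = {
    "pre-work submissions": 10,
    "individual briefings": 20,
    "exams": 50,            # two exams, 25 points each
    "graded assignments": 20,
}
PASSING_SCORE = 80

def course_result(scores: dict) -> str:
    """Sum earned points (capped at each component's maximum) and report pass/fail."""
    total = sum(min(scores.get(name, 0), cap) for name, cap in MAX_POINTS.items())
    status = "PASS" if total >= PASSING_SCORE else "FAIL"
    return f"{total} of {sum(MAX_POINTS.values())} points - {status}"

if __name__ == "__main__":
    print(course_result({"pre-work submissions": 9,
                         "individual briefings": 18,
                         "exams": 42,
                         "graded assignments": 16}))   # 85 of 100 points - PASS
```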

20 Expectations Be present and participate in all Class sessions. Stay engaged and learn from each other. Expected result: Increase your knowledge and skills and be better prepared to plan and execute future test programs 23 Course Admin Local Training Command Point of Contact & Local Required Forms Facilities and Billeting Administrative Forms Student Data Sheet / Master Roster End-of-Course Evaluation Note: Pre-Work files are on student CD (top folder) 24 20

Course Administration
Class hours: except the second Friday; one hour for lunch each day. Any unexcused absence from class may result in dismissal, including tardiness to class, in accordance with DAU Directive 704.
Emergency/Safety/Weather Procedures. Dress Code: Business Casual.
Facilities (site specific): rest rooms; telephones (silence personal and work cell phones); computer terminals/network access; student areas; coffee mess; course materials (student books, electronic on disk and Blackboard); lunch facilities/restaurants.
Schedule and Attendance
The class schedule is a guide. Instructors will adjust as required, depending on the class, to achieve the required learning objectives. Instruction will generally follow a 50-minute class and a 10-minute break, except during exercises, which students will manage.
DAU Policy: You are expected to be in class during the full two weeks. DAU Directive 704 is available on the DAU website. DAU policy only allows instructors to excuse short-duration absences under VERY limited emergency circumstances. These will be handled on a case-by-case basis only and must be approved by the instructors. Students will be required to perform make-up work. You will NOT be excused to attend meetings, attend other training, or otherwise be absent for work-related issues during authorized training. If you know in advance you will need to miss class time, please see the instructors and consider self-withdrawal. Returning late from breaks and lunch will count against this policy.

Academic Integrity
A student shall not:
Misrepresent his or her work.
Fraudulently or unfairly advance his or her academic position, e.g., by obtaining unauthorized assistance on exams or other work.
Remove or copy any exam or exercise solutions for outside use.
Be a party to another student's failure to maintain academic integrity.
Violate the principle of academic integrity in any other manner.
See the Student Academic Policy Memorandum (DAU Directive).
Non-Attribution
We encourage and expect full and candid discussions during class. Our objective is to enable students and instructors to express their views freely and without concern for possible attribution or embarrassment.
Use discretion with sensitive points/privileged information.
Don't repeat content or connect a speaker with the views expressed.
Don't embarrass another by repeating personal views such as support or criticism.

23 Course Evaluation Exams & Graded Assignments Two graded assignments worth a total of 20 points towards your final grade Instructors will generally assign these assignments to be done during class, as a team effort However instructors may assign these assignments as homework, if they desire Two exams worth 25 points each (total of 50 points) towards your final grade Exams are open book, open notes, but NOT a team effort! Exams include multiple choice, fill-in-the-blank, and short answer questions Exams are NOT cumulative 29 TST 204 Week 1 Schedule Time Monday Tuesday Wednesday Thursday Friday 0800 Admin / Intro with Software T&E T&E Requirements HSI Exam Review DAU Info 0830 DT&E Test Execution 0900 Risk Management Exercise 0930 T&E Policy Cybersecurity T&E CTP / MOE / MOS 1000 Exercise Exercise Risk Mgmt / ESOH 1030 Exercise 1100 Lunch 1130 Lunch Lunch Software, Interop & Lunch Lunch 1230 Current Events Cyber Exercise Test Resources Data Management DT&E Exercise 1330 Early T&E Planning TEMP Review & Test Scenarios (Continued) 1400 & Mini Exercises TEMP Development Exercise Exercise HW - TEMP Exer. 1st Exam - 25 points Point Graded 1700 HW - SW/Cyber Exercise Assignment HW - DT test execution 28 23

24 TST 204 Week 2 Schedule Time Monday Tuesday Wednesday Thursday Friday 0800 STAT Part I Reliability T&E Modeling &Simulation T&E Through FRP OT&E Exercise 0830 and Mini-Exercise & Mini-Exercise and Mini-Exercise and Exercise (continued) STAT Part II (DOE) 1000 and Exercise nd Exam - 25 points Lunch Lunch Lunch Lunch Graduation! OT&E Reporting Maint & Availability T&E Successes & Exam Review Grading: 1330 Exercise, and & Mini-Exercise Best Practices OT&E Test Execution Prework - 10 pts point graded Exercise Exercise 2 Assnmts - 20 pts 1430 assessment T&E Dilemmas DT&E Assessment 2 Exams - 50 pts 1500 Exercise Exercise 2 Briefings - 20 pts Total pts 1630 HW - T&E Dilemmas HW - DT Assessment HW - CPD Exercise 31 Course Evaluation Pre-Work & Student Presentations Pre-Work Submissions (10 Points) Individual Presentations/Briefings (20 Points total) Each student must present at least two briefings (NOTE: In smaller classes you may need to brief more than 2 briefings) Students should plan to brief one of the Star briefings each week

TST 204 BRIEFING CHECKLIST (Each Student Must Brief At Least Twice)
A briefing (10 points total, for each briefing) consists of the following:
1. Briefer introduces self, and introduces the case or exercise (1 pt)
2. Briefer accurately briefs their team's results (1 pt) - briefer used the correct terminology and was able to answer questions (at the 200 level, based on prerequisite knowledge)
3. Briefer demonstrates an understanding of the content, and explains their team's assumptions in putting the brief together (1 pt)
4. Briefer uses clear and understandable visual aids, and the briefing is audible to the entire class (1 pt)
5. Briefer adequately covers all exercise material assigned to their team, and presents logical information/conclusions based on their team's work (5 pts)
6. Briefer stays within the allotted time limit (10 minutes maximum, unless otherwise directed by an instructor). Note that questions (Q&A) don't count against the 10-minute limit. (1 pt)
T&E Successes Briefing - Wednesday, Week Two
Volunteer for this briefing if you (or your organization) have T&E-related successes / best practices / lessons learned that you'd like to share. This counts as an official briefing. Students are encouraged to prepare their slides ahead of time for this briefing. We can do 4-5 briefings; plan your briefing time to allow for discussion (Q&A).

TST 204 Student Book
TST 204 books (if provided) are yours to keep; many DAU locations are electronic only. We suggest you follow along in the slides, take notes, and highlight key points.
Electronic versions of both week books are on the student CD-ROM and posted on the Virtual Campus. Full-sized PDF slide files for each lesson are also provided, so you can follow along on your laptop if desired. Pre-work files, acronyms, and DAU Info slides are on the student CD, which is posted on the Virtual Campus.
You will be asked for feedback at the end of the course, so your notes can also help you recall points for the survey.
Lessons and exercises cover some of the T&E activities that are done in each acquisition phase.
CAUTIONARY NOTE: Realize that this book and its reference material are only current at this point in time. Be sure you get the most current references for use back at work.
Introductions
Student and Instructor Introductions (< 1 minute each): Name; Organization and Location; Years of T&E-related experience; Current T&E-related project

27 T&E Policy Exercise Lesson 2 T&E Policy Exercise 27


TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet
Lesson Number: Lesson 2
Lesson Title: T&E Policy Exercise
Lesson Time: 2.0 hours
Terminal Learning Objective: Given current local, national, and/or international DoD T&E policy and events, the student will discuss potential impacts to the DoD T&E acquisition community and associated stakeholders based upon current policy changes and trends. (TLO #2)
Enabling Learning Objectives:
Given local, national, and international T&E policy and events, determine potential impacts on the external environment and end user. (ELO #2.1)
Given T&E policies, principles, procedures, requirements, and regulations, determine potential impacts to program decision makers. (ELO #2.2)
Learning Method: Class discussion and participation; group exercise.
Assignments: None
Estimated Student Preparation Time: None
Method of Assessment: Group presentations.
References: Student CD-ROM (dozens of files containing T&E policy)


31 T&E Policy Exercise Overview Review / discuss T&E directives and T&E Policy, at the DOD and Service-specific levels Student Exercise Note: Your student CD-ROM has numerous directives and policy documents - in addition to the ones listed on the following slides 2 31

Public Laws, Policies and Regulations
Test and Evaluation activities are not only important to the success of a system under development; they are also mandated in several ways: Department of Defense (DoD) policy, Service T&E policy, and public law requirements.
Each branch of service provides specific documentation pertaining to its own T&E policy and guidance. Guidance documents often contain information on how to implement policy documents. Service organizations also provide T&E policy and guidance that apply specifically to that organization.
Student References Contents:
0 Course Files
1 References
  1 DAU Reference
  2 DoD and Joint Guidance
  3 JITC
  4 USAF Guidance
  5 USA Guidance
  6 USMC Guidance
  7 USN Guidance
  8 DOE & Statistics Info
  9 Reports and Studies
  10 Reliability
  11 Sample Docs and Templates
  12 Software, Cybersecurity, and IT Info
  13 Statutes
  14 Technical review, M&S, and DoDAF fact sheets
2 Student Exercise Files

33 DOD & Joint Guidance DoDD Defense Acquisition System DoDI Operation of Defense Acquisition System DoD MRTFB DoDI Joint T&E DoDD Director OT&E DoDI DASD(DT&E) CJCSI I JCIDS MIL-STD-882E System Safety 5 Other DOD & Joint Publications DOT&E TEMP Guidebook 3.0 Defense Acquisition Guidebook Joint T&E Handbook Risk, Issues and Opportunities Management Guide Reliability, Availability, & Maintainability (RAM) Guide T&E Management Guide Study of Commercial Industry Best Practices in DT&E 6 33

34 Army Policy & Procedures AR 70-1 Acquisition Policy AR 73-1 T&E Policy DA Pamphlet 70-3 Acquisition Procedures DA Pamphlet 73-1 T&E Support for System Acquisition 7 Navy Policy & Procedures SECNAV Instruction E System Acquisition & JCIDS SECNAV M Acquisition and Capabilities Guidebook COTF Operational Test Director s Manual 8 34

35 Marine Corps Policy & Procedures USMC Integrated Test And Evaluation Handbook MCOTEA Operational Test & Evaluation Manual 9 Air Force Policy & Procedures AFI Acquisition and Sustainment Life Cycle Management AFI Capabilities Based T&E AFI Joint T&E Program AFI MRTFB & Test Resource Planning AFOTEC Test Director s Toolkit AFOTEC OT&E Guide 10 35

36 Multi-Service Actions Role of Lead Service Multi-Service OT&E Memorandum of Agreement Joint T&E is different than Multi-Service T&E Test Integrated Product Team (IPT) actions 11 Student Exercise If you were in charge what T&E regulations and/or policies would you change, and why? Pick one or two T&E policies Refer to policy documents (CD-ROM or www), if you want Local, Service, and/or DoD policies are all fair game You DON T need to say which policy documents apply Discuss solutions (if applicable) Present a 10 minute brief to the class You have 30 minutes to work 12 36

37 TST 204 Student Reference CD-ROM Note that the TST 204 student CD-ROM has 100s of reference files; on T&E, Systems Engineering, and Acquisition-related topics. Although most can be obtained on google they are conveniently compiled for your use here. 0 Course Files: Has directories containing copies of the Pre-work files, PDF copies of student book, a directory of full sized pdf slides for each lesson and a directory containing useful references for each of the lesson exercises where needed. 1 References: contains the following directories: 1. DAU Reference contains many DAU publications including the 2017 PDF catalog, and many other useful references. 2. DoD, Multi-service, and Joint Guidance Folder: Has loads of files Latest versions of Regulations and guidebooks (including DOT&E TEMP Guidebook and TE Management Guide), examples, memos, and briefs, Mil-Stds referenced in the course, and information on Better Buying Power. 3 7 JITC, Air Force, Army, Marine Corps, and Navy Guidance Folders: Has service and JITC guidance documents (directives, instructions, etc.). Also has several T&E guides. 8. DOE and Statistics Folder Has reference documents on Design of Experiments and statistics topics. 9. Defense Science Board (DSB), Inspector General (IG), and General Accounting Office (GAO) reports Folder: Has reports and testimony to Congress on T&E and acquisition topics. 10. Reliability Information Folder: Has reference documents, and the Reliability Solver tool. 11. Sample Documents Folder: Has examples of TEMP, Performance Specification, SEP, Production Acceptance Testing, CDRL, DID, FD/SC, and many other docs & templates. 12. Software, Cybersecurity & IT Info Folder: Has info on software, software T&E, cybersecurity, and other IT-related topics. 13. Statutory Folder: Has copies of laws, and other info (laws of interest to testers and systems engineers). 14. Technical Reviews, M&S, DoDAF Fact Sheets: Has factsheets and info on technical & program reviews, modeling & simulation, and DoD Architecture Framework (DoDAF). 37


39 Current Events Lesson 3 Current Events 39


TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet
Lesson Number: Lesson 3
Lesson Title: Current Events
Lesson Time: 1.0 hours
Terminal Learning Objective: Given current local, national, and/or international DoD T&E policy and events, the student will discuss potential impacts to the DoD T&E acquisition community and associated stakeholders based upon current policy changes and trends. (TLO #2)
Enabling Learning Objectives:
Given local, national, and international T&E policy and events, determine potential impacts on the external environment and end user. (ELO #2.1)
Given T&E policies, principles, procedures, requirements, and regulations, determine potential impacts to program decision makers. (ELO #2.2)
Learning Method: Class discussion and participation.
Assignments: None
Estimated Student Preparation Time: None
Method of Assessment: Written examination.
References: Student Roster and Evaluation Feedback Form; Test and Evaluation Management Guide, 6th ed., 2012 version, chapter 1


Current Events
NOTE: This lesson is a placeholder. Instructors will update this material as changes occur.
Lesson Topics (these items will significantly influence both Acquisition and T&E):
1. DoDI 5000.02 Change 2, signed 2 Feb 2017
2. Defense Acquisition Guidebook, released 21 Feb 2017
3. Recent Statutory Changes
4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs, updated
5. Other Acquisition Initiatives

44 EV1 EV2 DoDI Chng 2 dated 2 Feb 2017 Lesson Topics: 1. DoDI Change 2 Signed 2 Feb Defense Acquisition Guidebook Released 21 Feb Recent Statutory Changes 4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs Updated 5. Other Acquisition Initiatives 3 Slide 3 EV1 To replace or add to bri Ed Verchot, 11/7/2016 EV2 Ed Verchot, 11/7/

45 Change 1 & 2, DoDI Issued on January 26 & February 2, 2017 Modifies and clarifies roles & responsibilities for MDA approval Multiple changes to Milestone & Phase Information Requirements (Table 2) DAB Planning Meetings SECDEF authority to waive acquisition law/regulation Enclosure 12 modified, removed from , and incorporated into DoDI , Business Systems Requirements And Acquisitions (Change 2) Urgent Capability vs. Rapid Enclosure 14 incorporates and cancels Directive Type Memo related to Cybersecurity in the Defense Acquisition System 4 Change 1 Comparison to 7 Jan 2015 Version (How much has changed?) Instruction (Basic Process Description) Enclosures 1. Acquisition Program Categories and Compliance Requirements 2. Program Management 3. Systems Engineering 4. Developmental Test and Evaluation (DT&E) 5. Operational and Live Fire Test and Evaluation (OT&E and LFT&E) 6. Life-Cycle Sustainment 7. Human Systems Integration (HSI) 8. Affordability Analysis and Investment Constraints 9. Analysis of Alternatives (AoA) 10. Cost Estimating and Reporting 11. Requirements Applicable to All Programs Containing Information Technology (IT) 12. Acquisition of Defense Business Systems (DBS) (change 2) 13. Urgent Capability Acquisition (title change) 14. Cybersecurity in the Defense Acquisition System (added) LEGEND Extensive Changes Moderate Changes Little Changes 45

TEMP Changes
TEMP guidance was not significantly changed in Changes 1 and 2. Some of the applicable guidance from DoDI 5000.02 follows; for a complete list of guidance see the DOT&E TEMP Guidebook and the Defense Acquisition Guidebook.
Milestone A TEMP (vice TES)
Identify each major DT phase and event as contractor or government DT
More cybersecurity T&E required
A TEMP annex with the component's rationale for requirements
More software maturity metrics and software testing
TEMP approval authorities clarified
TEMP Guidance (cont.)
More TEMP guidance:
A developmental evaluation framework that shows the correlation/mapping between test events, key resources, and the decisions supported
An OT evaluation overview, including a synopsis of the intended analysis for each major OT test phase or event
A table of variables (factors and conditions) that may have a significant effect on operational performance
Starting at MS B, the levels of these variables, and the methods of controlling them during test events, must be stated

47 T&E Reports to Congress & Congressional Notification DT&E Exception Reports (compiled and submitted annually by USD(AT&L) to Congress). Case 1: When an MDAP proceeds with implementing a TEMP that includes a developmental test plan disapproved by DASD(DT&E). Case 2: When an MDAP proceeds to IOT&E following an assessment by DASD(DT&E) that the program is not ready for operational testing. Must explain what, why, and any mitigating actions MDAP decision to conduct DT&E without an approved TEMP Congressional notification (done by USD(AT&L)) Must explain why, & timeline for getting an approved TEMP DoDI Encl. 1, and FY13 NDAA Lesson Topics: Defense Acquisition Guidebook Released 21 Feb DoDI Change 2 Signed 2 Feb Defense Acquisition Guidebook Released 21 Feb Recent Statutory Changes 4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs Updated 5. Other Acquisition Initiatives 9 47

48 Defense Acquisition Guidebook (DAG) Update 27 Feb 2017 New Interface Updated chapters for all areas. T&E now Chapter 8 Engineering now Chapter 3 10 Recent Statutory Changes Lesson Topics: 1. DoDI Change 2 Signed 2 Feb Defense Acquisition Guidebook Released 21 Feb Recent Statutory Changes 4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs Updated 5. Other Acquisition Initiatives 11 48

49 FY17 NDAA Changes Affecting Acquisition and T&E After significant debate during which the Senate proposed to merge DASD DTE and DOT&E and the House did not agree, Congress declined to direct organizational changes regarding the duties DASD (DTE-TRMC). HOWEVER, they directed a study to reduce overlap between them. (see next slide for details) Congress directed that USD(AT&L) be split into USD(Research and Engineering) and USD(Acquisition and Sustainment) effective Feb THE (CONFERENCE) COMMITTEE RECOMMENDS THAT THE PANEL ADDRESS THE FOLLOWING QUESTIONS: (a) How can the Director of Operational Test and Evaluation and the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation (DASD DT&E) at the Office of the Secretary of Defense approach oversight within the system development cycle to avoid overlap but be mutually supporting without sacrificing the independence of either organization? (b) Does participation with and assessment of program progress during phases prior to operational test and evaluation bias the independent objectivity of the operational test and evaluation organization? (c) Are there specific test and evaluation activities that should be realigned for management within OSD or the services to promote effectiveness and efficiency of those programs? (d) Overall are the developmental and operational test and evaluation organizations effectively carrying out the missions as described in title 10, United States Code, and are there impediments to meeting those responsibilities? In addition, are they engaged in activities outside their mission areas? (e) Are the activities of the test and evaluation organizations constructive, not duplicative or disruptive, to support the acquisition goals of the military departments and defense agencies? (f) What staffing authorities and other resources are needed to support effective and efficient oversight of both the developmental and operational phases of testing commensurate with the effort to each relative to the portion of the programs that their oversight entails? 49

50 FY17 NDAA Changes Affecting Acquisition and T&E After significant debate during which the Senate proposed to merge DASD DTE and DOT&E and the House did not agree, Congress declined to direct organizational changes regarding the duties DASD (DTE-TRMC). HOWEVER, they directed a study to reduce overlap between them. (see next slide for details) Congress directed that USD(AT&L) be split into USD(Research and Engineering) and USD(Acquisition and Sustainment) effective Feb Effective Feb 2018 FY17 NDAA CHANGES TO AT&L The position of Undersecretary of Defense for Acquisition, Technology, and Logistics is eliminated The position of Undersecretary of Defense for Research and Engineering is established with precedence immediately below DepSecDef The position of Undersecretary of Defense for Acquisition and Sustainment is established to set Acquisition Policy and act as oversight (Milestone Decision Authority) for DoD Acquisition Programs. Precedence is immediately below USD (R&E) Specific organization beneath each of the new USDs was left to the DoD to recommend and implement by Feb 2018 including appropriate assignments of Deputy Assistant Secretaries. 50

51 Upcoming Changes Jan 2017 Office of Chief Innovation Officer has been created. With new administration new direction and interpretations can be expected. Recommendations for FY 17 Appointee positions not yet forthcoming or approved. NDAA specifically allows for USD (AT&L) movement to new USD (R&E). 16 Risk, Issue, Opportunity Management Guide Lesson Topics: 1. DoDI Change 2 Signed 2 Feb Defense Acquisition Guidebook Released 21 Feb Recent Statutory Changes 4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs Updated 5. Other Acquisition Initiatives 17 51

52 Risk, Issue, and Opportunity Management Processes Technical Events Programmatic Events Business Events What can go wrong? What has or is certain to go wrong? What can be improved? Risk Management Issue Management Opportunity Management Consequences: Both positive and negative impacts to cost, schedule, and performance Discussed in Risk Lesson 18 Revised Risk and Issue Management Process Risk Monitoring How has the risk changed? Risk Planning What is the program s risk management process? Process Planning What are the program s risk and issue management processes? Identification What has, can, or will go wrong? Risk Handling Should the risk be accepted, avoided, transferred, or mitigated? Communication and Feedback Risk Identification What can go wrong? Monitoring How has the risk or issue changed? Communication and Feedback Analysis What is the likelihood of the risk and the consequence of the risk or issue? Risk Analysis What are the likelihood and consequence of the risk? Mitigation / Correction What, if anything, will be done about the risk or issue? June 2015 June 2016 (Released Feb 2017) 19 52

53 Risk Management Closer Look Process Planning What is the program s risk mitigation process? Risk Identification What can go wrong? Communication and Feedback Risk Monitoring How has the risk changed? Risk Analysis What are the likelihood and consequence of the risk? Risk Mitigation Should the risk be mitigated? If so, how? 20 Issue Management Process Fig 3.12 Issue Process Planning What is the program s issue management process? Issue Identification What has or will go wrong? Issue Monitoring How has the issue changed? Communication and Feedback Issue Analysis What is the consequence of the issue? Corrective Action What, if anything, should be done about the issue? 21 53

54 Opportunity Management Process Fig 4.2 Opportunity Process Planning What is the program s opportunity management process? Opportunity Identification What can be improved? Opportunity Monitoring How has the opportunity changed? Communication and Feedback Opportunity Analysis What is the business case analysis of the opportunity? Opportunity Management Should the opportunity be pursued, reevaluated, or rejected? If so, how? 22 Opportunity Management tied to Should Cost Risk Mitigation enables meeting Will Cost and Should Cost Risk Mitigation Activities Cost, Schedule, and Performance thresholds Baseline Opportunity Mitigation Activities Opportunity Mitigation enables achieving Should Cost 23 54
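To make the risk / issue / opportunity distinction above concrete, here is a minimal Python sketch of a program register entry. It is an illustration only, not DoD guidance or a DAU tool: the classifying questions and the risk-handling options (accept, avoid, transfer, mitigate) come from the process descriptions above, the 1-5 likelihood and consequence scales follow a common risk-matrix convention, and the class, field names, and example entries are hypothetical.

```python
# Hypothetical register illustrating the risk / issue / opportunity split
# described above: risks are potential future events, issues have already
# occurred (or are certain to), and opportunities are potential improvements.
from dataclasses import dataclass

RISK_HANDLING = {"accept", "avoid", "transfer", "mitigate"}  # options named in the guide

@dataclass
class RegisterEntry:
    title: str
    category: str     # "risk", "issue", or "opportunity"
    likelihood: int   # 1 (not likely) .. 5 (near certainty / has occurred)
    consequence: int  # 1 (minimal) .. 5 (severe) impact on cost, schedule, or performance
    handling: str     # risks: accept/avoid/transfer/mitigate; issues: corrective action; opportunities: pursue/reevaluate/reject

    def exposure(self) -> int:
        """Likelihood x consequence product, used here to rank entries."""
        return self.likelihood * self.consequence

entries = [
    RegisterEntry("Radar software integration slips", "risk", 3, 4, "mitigate"),
    RegisterEntry("Test range unavailable next quarter", "issue", 5, 3, "corrective action"),
    RegisterEntry("Reuse an existing M&S environment", "opportunity", 2, 2, "pursue"),
]

for entry in sorted(entries, key=lambda e: e.exposure(), reverse=True):
    if entry.category == "risk":
        assert entry.handling in RISK_HANDLING
    print(f"{entry.category:12s} {entry.title:36s} exposure={entry.exposure()}")
```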

55 Emphasis on Joint Risk Management A Joint Risk Management Board addresses common program risks Contractor Risks Contractor RMB Mutual Risks Joint Risk Management Board (JRMB) Government Risks Government RMB Roles in Mitigating Risks Vary with Contract Type Fig A 1 Government and Contractor Joint Risk Management Boards 24 Risk, Issues, and Opportunity Management Focus of New Guide is to Clear up Confusion, add clarity and to build on the foundation of the 2015 Guide

56 Lesson Topics: Other Acquisition Initiatives 1. DoDI Change 2 Signed 2 Feb Defense Acquisition Guidebook Released 21 Feb Recent Statutory Changes 4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs Updated 5. Other Acquisition Initiatives 26 Affordability & Should-Cost Pre-Life Cycle CBA MDD Mission/Capability Portfolio MSA TMRR EMD P&D O&S AoA MS A -Affordability Target Unit Acquisition Cost Annual Unit O&S Cost MS B SE trade analyses to define cost-effective design point. Focus on affordability of Design (unit acquisition cost & sustainment cost) Apply should-cost to control program overhead & unproductive expenses w/o sacrificing sound investment in product affordability -Present SE Trade-offs -Establish Affordability Requirement -Will-Cost captured in APB/ADM -MDA approved Should-Cost Baseline -Program schedule justification/approval -BCA to support tech data rights strategy MS C -Establish Economic Production Rate Range FRPD Will-Cost based on ICE/SCP/POE of affordable design drives resource planning and budgeting. Apply Should-Cost to drive down all elements of program cost, including acquisition and sustainment costs of the product design. Should-Cost baseline drives program execution. 56

57 Affordability Affordability will continue to be a design constraint Anticipated future budget will be the basis for the constraint Affordability analysis submitted at milestone reviews If caps are breached, costs must be reduced or else program cancellation can be expected What is an Open System Architecture (OSA)? OSA is a strategic Business and Technical acquisition approach that leverages the commercial market place in a way to control and optimize design features to ensure that a level field of competition provides the best valued product for our war fighter in a timely basis. Key design features include: BUSINESS Create a Competition focused Environment (A CULTURE of Competition) Open Design Disclosure for All Stakeholders (Data Rights) Enterprise Strategy Ensure Government Access to Data for Reduced Life Cycle Sustainment Costs TECHNICAL Use a Modular Design (Loose Coupling with High Cohesion) Use of Open Standards (Public, Published and Popular (The Three P s)) A successful OSA implementation allows for competition and ease of change that provides the best value to our war fighters. A Successful Open System Architecture can be; Added to Replaced Supported Modified Removed... by different vendors throughout the life cycle ENG-301 Leadership in Engineering Defense Systems 29 57

58 Quick Review of Data Rights 100% Private Development Funding 100% Govt < LR or RR Limited Rights (LR) or Restricted Rights (RR) Government Purpose Rights (GPR) Unlimited Rights (UR) > UR (Title or Ownership) * * Operations, Maintenance, Installation, ENG-301 Training; Leadership Form, in Engineering Fit, Function; Defense Computer Software Documentation Systems Page 30 Acq. Workforce Professionalism Continued investment in developing the acquisition workforce Establishing qualification requirements for Key Leadership Positions (KLPs) KLP qualification boards Going beyond DAWIA certification to qualification Increase recognition of top performers Increase the cost consciousness of the workforce 58

59 Lesson Topics These items will significantly influence both Acquisition and T&E. 1. DoDI Change 2 Signed 2 Feb Defense Acquisition Guidebook Released 21 Feb Recent Statutory Changes 4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs Updated 5. Other Acquisition Initiatives 32 59

60 Lesson Topics These items will significantly influence both Acquisition and T&E. 1. DoDI Change 2 Signed 2 Feb Defense Acquisition Guidebook Released 21 Feb Recent Statutory Changes 4. DoD Risk, Issue and Opportunity Management Guide for Defense Acquisition Programs Updated 5. Better Buying Power 3.0 Emphasis Continues 34 60

61 T&E Early Planning Lesson 4 T&E Early Planning 61


TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet
Lesson Number: Lesson 4
Lesson Title: T&E Early Planning
Lesson Time: 3.0 hours (includes several mini exercises)
Terminal Learning Objective: Given a system description, the student will correctly assess a program T&E strategy. (TLO #3)
Enabling Learning Objectives:
Determine impacts of multi-service T&E on a program T&E strategy. (ELO #3.1)
Determine impacts to the T&E strategy commonly experienced by the DoD T&E community. (ELO #3.2)
Determine methods to develop a T&E strategy for cost-effective T&E. (ELO #3.3)
Given a system description, assess a Milestone A TEMP for required content to support a system development process. (ELO #3.4)
Given a system description, determine the resources required to ensure that the T&E strategy is executable, supports the overall program plan and T&E strategy, and that necessary resources are leveraged where possible and are available when needed. (ELO #3.5)
Recognize the advantages and disadvantages of specific verification and T&E methods. (ELO #3.6)
Contrast the risks and benefits of using integrated T&E, and how CT, DT, OT, and LFT&E fit together during systems development. (ELO #3.7)
Identify the T&E WIPT/ITT/Combined Test Team needed to address T&E issues and documentation, to support the T&E strategy, approach, and overall program plan. (ELO #3.8)
Learning Method: Class discussion, class participation, and mini exercises.
Assignments: None
Estimated Student Preparation Time: None
Method of Assessment: Written examination, class participation, and class discussions.
References: DoDD 5000.01, The Defense Acquisition System; DoDI 5000.02, Operation of the Defense Acquisition System; Defense Acquisition Guidebook; TST 204 Student Reference Disc; Test & Evaluation Management Guide, 6th ed., 2012 version, chapters 2, 3, 5, 6, 8, 21

65 T&E Early Planning The following continuous learning modules apply to this lesson: - CLE023 M&S for T&E - CLE029 Testing in a Joint Environment 1 Lesson Objectives Determine impacts to the T&E strategy, commonly experienced by the DoD T&E community. Identify the T&E WIPT/ITT/Combined Test Team needed to address T&E issues and documentation, to support the T&E strategy, approach, and overall program plan. Determine impacts of multi-service T&E on a program T&E strategy. Determine methods to develop a T&E strategy for costeffective T&E. Recognize the advantages and disadvantages of specific verification and T&E methods. Contrast the risks and benefits of using integrated T&E; and how CT, DT, OT, and LFT&E fit together during systems development. 2 65

66 Typical T&E Products MS A TEMP Draft MS B TEMP MS B TEMP DT&E Assessment MS C TEMP IOT&E Report LFT&E Report FRP FD Uses & Impacts of T&E Results T&E is a principal tool to measure progress in system development Results are used to improve system design (particularly contractor DT results) Provides risk mitigation information to MDA Conducted to Facilitate learning Assess technical maturity & interoperability Facilitate integration into fielded forces Confirm performance Reduce risk 4 66

67 What is a T&E WIPT? The T&E Working-level Integrated Product Team (WIPT) is responsible to support the T&E WIPT Chair, and other program WIPTs, on all aspects of a program s T&E effort. This includes T&E program strategy, design, development, oversight; and the analysis, assessment & reporting of test results. The T&E WIPT must be established as soon as practicable after the MDD decision (DoDI , Encl. 4, par 3.e) Early T&E WIPT involvement in program strategy discussions and plans is highly desired. The PM may form lower level functional working groups, who report to the T&E WIPT, to focus on specific areas such as integrated test planning, cybersecurity, SW T&E, reliability scoring, M&S development and Verification, Validation, and Accreditation (VV&A), and threat support. 5 T&E WIPT Membership T&E WIPT Chair: PM, DPM, or Chief Developmental Tester. Must have T&E experience relative to the product/system DASD(DT&E) & DOT&E representatives (if oversight program) Lead DT&E Organization Invited SMEs relative to the WIPT issue / topic. (SE, Logistics, Financial, Contractor / Vendor, etc.) Other services, for Joint or Multi-Service programs T&E WIPT should involve all Key stakeholders that have impact / influence on the program s T&E planning, execution, and assessment. HQs/Sponsor/PEO representatives. Can speak for sponsoring service/agency Combat Developer and Evaluator Test Ranges and Facilities Contracting Officer & other support staff, at proper times when needed OTA(s) and other organizations involved in the program, especially if a SoS/FoS program (i.e., JITC)

68 T&E WIPT in Support of Integrated Testing Provides a forum for all key organizations to be involved in the T&E effort Develops and maintains the program s T&E strategy Participates in the development of the Acquisition Strategy, and uses the principles of integrated testing Develops and maintains the Integrated Test Schedule Develops and documents a quality TEMP as quickly and efficiently as possible (depending on the Milestone) Necessitates that all stakeholders are given an opportunity to contribute to the TEMP development T&E WIPT Products The primary T&E planning document The T&E Master Plan (TEMP) The TEMP document is first due at Milestone A; and is updated at Milestone B and Milestone C. The T&E WIPT will fully document the T&E Strategy within the TEMP, to include the integrated testing concept DOT&E TEMP Guidebook 3.0 (Nov 2015) is the principle guide for TEMP development. (It is also endorsed by DASD (DTE-TRMC). DAG Chapter 8 provides additional TEMP guidance and templates

69 Differences Between DT&E and OT&E - Individual Exercise Take 3 minutes, and write down as many differences between DT&E and OT&E as you can think of DT&E OT&E 9 T&E Approaches for Hardware Products Prototypes: They can be for a system, subsystem, or component Testing is often done to determine whether the concept works and/or is feasible (risk reduction prototypes) Competitive prototyping (two or more competing contractor teams) is a statutory requirement for MDAPs, and a regulatory requirement for all other programs Engineering Development Models (EDMs): They can be for a system, subsystem, or component DT&E is often conducted on EDMs, to test very specific parameters LRIP Articles: DT&E, LFT&E (including full-up system level LFT&E), and IOT&E are conducted on LRIP articles 10 69

70 Integration Testing Integration Testing Testing in which software and/or hardware components are combined and tested progressively until the entire system has been integrated Risks in integrating components built by different vendors: Poor system performance & reliability (middleware and/or adaptors are sometimes needed, to obtain adequate performance) If integration problems occur, it can be expensive and/or cause schedule delays to fix the problems It can be difficult to determine (based on test results) whether the problem is in the interface, or in the components It may be more difficult to define the test methodology Multi-Service Testing A multi-service test program occurs when A system is to be acquired for use by more than one service A system must interface with equipment of another service (Joint program) May include DT, OT, or both The lead service has overall responsibility for the program T&E is conducted according to lead service regulations A memorandum of agreement for multi-service OT&E can be accessed on your student CD-ROM T&E Management Guide Chapter 21 has additional info. (See Student Reference Disc for Management Guide) 12 70

71 Challenges of Multi-Service Testing - Individual Exercise Take 3 minutes, and write down... The three most challenging (or most difficult) things (in your opinion), about Multi-Service Testing (as compared with Single-Service Testing) If you ve never done Multi-Service Testing, SWAGs are acceptable 13 Joint & Multi-Service T&E Lessons Learned The T&E effort is harder to coordinate Establish a joint T&E WIPT with best experts from each service & with continuous up the chain communications The T&E WIPT should act as the forum for test planning, conduct, evaluation & reporting issues & procedures Leadership, joint processes & methods of resolving disagreements need to be established early Goals, schedules, performance levels, logistics issues & CONOPS are different for each service All these issues need to be worked out early Test strategies (TEMP) should contain each service s funding profile & resource requirements for testing 14 71

Contractor DT
A contract typically includes a specification. Section 3 of the specification contains design and performance requirements. Section 4 of the specification contains verification methods for the requirements listed in section 3 (i.e., contractor DT). Verification methods include analysis, inspection, demonstration, and test. A Verification Cross Reference Index (or Matrix) shows the verification methods for each requirement listed in section 3.
Note: Government DT is performed by government organizations, whereas contractor DT is performed by contractor organizations (under contract to the government).
Govt.'s Role in Contractor DT
The government's role is to oversee contractor DT and efficiently integrate contractor DT with government DT. Specific T&E requirements for contractor DT should start with the RFP: if T&E requirements are not in the RFP, they won't be in the contract... and if they are not in the contract, they won't be delivered.
The following publication provides details about getting T&E requirements into DoD contracts: Incorporating Test and Evaluation into Department of Defense Acquisition Contracts, October 2011. This publication is on the Student CD (in the DoD folder).

73 Govt. s Role in Contractor DT (cont.) It is important for the govt. to understand the pedigree of all test data and performance data To accomplish this, the govt. should approve the contractor s proposed verification methods (spec, section 4) - including test planning, test processes, test resources, test schedules and test reports The govt. may also choose to witness some parts of the contractor s DT program The govt. may then build upon contractor DT results, to do whatever govt. DT is necessary 17 Verification and T&E Methods Inspections Examples? Analysis CAD/CAM Comparison to similar systems What others can you think of? Lab Testing Environmental Component EMI/EMH Software Testing What others can you think of? Simulation (Live, Virtual, Constructive LVC) War Gaming Modelling and Simulations Hardware in the Loop Testing Other examples? Testing Developmental Testing (DT) Operational Testing (OT&E) Combined DT and OT&E Live Fire Testing Systems Level Test and Evaluation Ground Testing Sea Trials Flight Testing 18 73
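Since the Verification Cross Reference Index (or Matrix) described above is essentially a mapping from each specification section 3 requirement to its section 4 verification method, a small sketch may help. The four verification methods (analysis, inspection, demonstration, test) come from the lesson; the requirement IDs, descriptions, dictionary layout, and check function are hypothetical.

```python
# Hypothetical Verification Cross Reference Matrix (VCRM): maps each
# specification section 3 requirement to its section 4 verification method(s).
# Method names are from the lesson; requirement IDs and text are made up.
VERIFICATION_METHODS = {"analysis", "inspection", "demonstration", "test"}

vcrm = {
    "3.2.1 Maximum gross weight": ["inspection", "analysis"],
    "3.4.2 Detection range":      ["test"],
    "3.5.1 Operator alerts":      ["demonstration"],
}

def check_vcrm(matrix: dict) -> list:
    """Return requirements whose listed methods are missing or unrecognized."""
    problems = []
    for requirement, methods in matrix.items():
        if not methods or any(m not in VERIFICATION_METHODS for m in methods):
            problems.append(requirement)
    return problems

if __name__ == "__main__":
    print("Requirements needing attention:", check_vcrm(vcrm) or "none")
```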

74 Tailoring for Cost Effective T&E Individual Exercise Take 3 minutes, and write down things that can be tailored (things that can be done), for cost effective T&E. Write down as many things as you can think of. 19 Tailoring Methodologies For Cost Effective T&E Tailoring can occur concerning the type of testing to conduct Benefits and disadvantages of each type of test Information gained from each type of test Cost, schedule, resources, and risk implications Other things that can be tailored: Evaluation methodologies Amount, timing, and sequencing of testing Amount of data collected Test design & test conditions 20 74

75 Event Driven T&E Strategy An event driven acquisition strategy links program decisions to demonstrated accomplishments in development, testing, and production. (Missile Defense Agency definition) Develop an event-driven T&E Strategy, rather than a schedule-driven one, to enable program success Don t test until ready If technical problems arise, consider restructuring the schedule to add additional time & resources Don t drop test events to save time Issues associated with an event-driven (vs. schedule driven) T&E Strategy: Availability of funds Availability of facilities, test articles, manpower, and other assets Political issues and schedule dates Technical, and other risks 21 Event Driven Vs Schedule Driven Individual Exercise Take 3 minutes and write down 3 benefits of an Event-Driven T&E Strategy (Event-Driven Test Events, instead of Schedule-Driven Test Events) 3 risks / limitations / disadvantages of an Event-Driven T&E Strategy 22 75

76 Impacts to T&E Strategy T&E Related or Non-T&E functional areas: Engage specialists early Information assurance, interoperability, HSI, software, reliability, etc. Mission environments: Introduce operational realism & early user involvement in DT&E, where practical Test in extreme environments, to understand system capabilities & limitations 23 Impacts to T&E Strategy (Cont.) Mission objectives: In planning for IOT&E, focus on the missions that will be accomplished, and identify the critical operational capabilities How will the system accomplish its missions? Consider: organizational structure; tactics, techniques & procedures (TTP); training; and any required supporting systems Policy decisions or limitations: Potential test impacts on the environment Use of M&S to augment testing (NOT to replace all testing) Use of contractors in support of IOT&E 24 76

Impact of Technical Maturity on T&E Strategy
New technology and/or technical maturity may impact T&E in the following ways:
Until the system design is finalized, the T&E strategy and test planning may be subject to change (T&E representation on the systems engineering IPT may help mitigate this)
Test results for prototype, subsystem, and component level testing may be quite different from test results for production representative systems
Test planning assumptions (for later test phases) may be incorrect
Problems with system maturity may cause program delays, resulting in insufficient test time and/or canceling of test events
Technology Readiness Levels (TRLs) (used to measure/assess technical maturity):
TRL 9 - Actual system proven through successful mission operations
TRL 8 - Actual system completed and qualified through test and demonstration
TRL 7 - System prototype demonstration in an operational environment
TRL 6 - System/subsystem model or prototype demonstration in a relevant environment
TRL 5 - Component and/or breadboard validation in a relevant environment
TRL 4 - Component and/or breadboard validation in a laboratory environment
TRL 3 - Analytical and experimental critical function and/or characteristic proof-of-concept
TRL 2 - Formulation of technology concept or application
TRL 1 - Basic principles observed and reported
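The TRL definitions above lend themselves to a simple lookup table. The level descriptions below are transcribed from the table; the small helper wrapped around them is illustrative only.

```python
# Technology Readiness Level (TRL) definitions transcribed from the table above,
# wrapped in a small lookup helper (the helper itself is illustrative only).
TRL = {
    9: "Actual system proven through successful mission operations",
    8: "Actual system completed and qualified through test and demonstration",
    7: "System prototype demonstration in an operational environment",
    6: "System/subsystem model or prototype demonstration in a relevant environment",
    5: "Component and/or breadboard validation in a relevant environment",
    4: "Component and/or breadboard validation in a laboratory environment",
    3: "Analytical and experimental critical function and/or characteristic proof-of-concept",
    2: "Formulation of technology concept or application",
    1: "Basic principles observed and reported",
}

def describe_trl(level: int) -> str:
    """Return the definition for a TRL from 1 to 9."""
    if level not in TRL:
        raise ValueError("TRL must be an integer from 1 to 9")
    return f"TRL {level}: {TRL[level]}"

if __name__ == "__main__":
    # TRL 6 is often cited as the maturity expected entering Milestone B.
    print(describe_trl(6))
```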

78 Various T&E Strategies Combined Testing Any combination of two or more types of testing Integrated testing requires the collaborative planning and collaborative execution of test phases and events to provide shared data in support of independent analysis, evaluation, and reporting by all stakeholders, particularly the systems engineering, developmental (both contractor and government) and operational T&E communities. Concurrent Testing DT and OT take place at the same time as two parallel, but separate & distinct, activities Incremental Testing Testing in support of an incremental acquisition approach 27 Integrated Testing Goals Conduct a seamless test program that produces credible qualitative and quantitative data useful to all evaluators Address developmental and operational issues early for decision makers Allow for the sharing of test events, where a single test point or mission can provide data to satisfy multiple objectives, without compromising the test objectives Attain synergy of effort among all T&E stakeholders including CT, DT&E, and OT&E, interoperability, Cybersecurity, and certification testing in order to maximize use of available test resources and infrastructure. Test by one, use by all 28 78

79 Major Steps in Integrated Testing Collaborative planning: PM must establish an integrated test planning group (all applicable stakeholders) DoDI , Encl 4, Par 2c Contractor and Government test entities meet early in program lifecycle with data requirements and determine areas of overlap Develop test plans that generate data to meet shared requirements Collaborative execution As appropriate, multiple groups participate in each test event to ensure data generated meets needs Shared data Common data base providing data with agreed upon fidelity reduces cost and improves efficiency Independent Evaluation Use common data base to derive organization-specific conclusions Integrated Testing is NOT Integrated Test and Evaluation 29 T&E Strategies Individual Exercise Take 3 minutes and write down 3 benefits of a Combined/Integrated Testing Strategy (Combined/Integrated Test Events) 3 risks/limitations of a Combined/Integrated Testing Strategy 30 79

80 Entrance & Exit Criteria (Acquisition Phases) Entrance criteria are applicable to all programs Describe in general terms what accomplishments are needed to enter (or proceed further) in a particular phase Within DoD 5000-series policy documents, exit criteria are defined as program-unique gates within each acquisition phase Must be specific and demonstrable during that phase Must be substantially achieved to proceed to the next phase Listed in the Acquisition Program Baseline (prepared by the PM and approved by the MDA) Entrance or exit criteria may include objectives or results that must be demonstrated or verified by the T&E community 31 Entrance & Exit Criteria (Test Phases) Each major developmental test phase or event (including Test Readiness Reviews) will have test entrance and exit criteria. The developmental test completion criteria (customer needs) will dictate what data are required from the test event. (DoDI , Encl 4, Par 5a(4)) Each major test phase or event should have test entrance and test completion criteria. (DoDI , Encl 5, Par 5e(1)) 80

81 Importance of Entrance & Exit Criteria Importance of entrance criteria: Helps ensure the system under test is mature enough to proceed to the next test phase; helps reduce risk. Scarce resources can be wasted if the criteria needed to successfully conduct the next phase of testing have not been met. Failure to meet entrance criteria is grounds to postpone testing. Importance of exit criteria: Helps ensure all critical tests were conducted, critical data was collected, and critical test objectives were assessed. That phase of testing should continue until all exit criteria have been met. DT&E Entrance & Exit Criteria Examples Typical entrance criteria (for a major DT phase): Approved documents, test procedures and plans, and safety certifications (as needed); No major deficiencies (that would preclude or limit testing); Previous test phase(s) satisfactorily completed; Test Readiness Review satisfactorily completed (if one is required prior to that test phase). Typical exit criteria: Tests (or critical tests) have been completed at an acceptable level, with necessary data collected and authenticated; RAM scoring conference completed; CTPs that the MDA has designated as exit criteria and/or directed to be demonstrated (during that phase of testing) have been demonstrated and/or assessed 34 81

82 OT&E/OA Entrance & Exit Criteria Examples Typical entrance criteria: Approved documents (including test readiness statements), test procedures and plans, OT support packages, and safety certifications (as needed); No major deficiencies (that would preclude or limit testing); Previous test phase(s) satisfactorily completed; Test Readiness Review satisfactorily completed (if one is required prior to that test phase). Typical exit criteria: Tests (or critical tests) have been completed at an acceptable level, with necessary data collected and authenticated; RAM scoring conference completed; COIs that the MDA has designated as exit criteria and/or directed to be demonstrated (during that phase of testing) have been demonstrated. Note: all COIs must be demonstrated and/or assessed to exit from IOT&E 35 Summary T&E Informs The System Development Process T&E WIPTs Are Essential to Support All Aspects of a Program's T&E DT&E, OT&E, and LFT&E Can Bring Different Challenges to Program T&E Multi-Service and Joint T&E Programs Can Add Complexity to a Program. Tailoring Verification and T&E is Essential for Cost-Effective T&E Event-Driven T&E Strategies Are Essential to Effective T&E Management

83 Monday Night Homework Read the JECSS Milestone A TEMP document. This document will be used during tomorrow's Software, Interoperability, and Cybersecurity Exercise. Note: This TEMP is based on a real program. Names and other details have been changed. 83

84 THIS PAGE INTENTIONALLY LEFT BLANK 84

85 Monday Homework Software, Interoperability, & Cybersecurity Exercise Monday Homework 85

86 THIS PAGE INTENTIONALLY LEFT BLANK 86

87 For Training Use Only Test and Evaluation Master Plan Joint Enterprise Combat Support System (JECSS) Developed by: USN/JECSS Joint Enterprise Combat Support System (JECSS) Program Office Date: 9 Dec 2011 ACAT IAM This document was designed for classroom purposes only. Format reflects current acquisition policy. Details related to existing systems and threats may not be factual. 87

88 For Training Use Only SUBMITTED BY: _//Signed// Wallace T. Bell III Navy Program Manager, JECSS Date CONCURRENCE: Rear Admiral Edward A. Fairfield Program Executive Officer, Integrated Warfare Systems Rear Admiral Karen S. Goodbar Commander COMOPTEVFOR Date Date COMPONENT APPROVAL: Troy S. Krackel, Dr. Assistant Secretary of the Navy for Acquisition Secretary of the Navy Date OSD APPROVAL: John D. Smart Deputy to the ASD (NII) C3ISR & IT Acquisition Acting Ben M. Dell Director, Operational Test and Evaluation Date Date 88

89 For Training Use Only TABLE OF CONTENTS
PART 1. INTRODUCTION
  PURPOSE
  MISSION DESCRIPTION
  SYSTEM DESCRIPTION
  SYSTEM THREAT ASSESSMENT
  PROGRAM BACKGROUND
  KEY CAPABILITIES
  KEY INTERFACES
  SPECIAL TEST REQUIREMENTS
  SYSTEMS ENGINEERING REQUIREMENTS
PART 2. TEST & EVALUATION PROGRAM MANAGEMENT AND SCHEDULE
  T&E MANAGEMENT
  T&E DATA STRATEGY
  INTEGRATED TEST PROGRAM SCHEDULE
PART 3. TEST AND EVALUATION STRATEGY
  T&E STRATEGY INTRODUCTION
  EVALUATION FRAMEWORK
  DEVELOPMENTAL EVALUATION APPROACH
  DEVELOPMENTAL TEST OBJECTIVES
  MODELING AND SIMULATION
  TEST LIMITATIONS
  OPERATIONAL EVALUATION APPROACH
  MISSION-ORIENTED APPROACH
  OPERATIONAL TEST OBJECTIVES
  M&S
  TEST LIMITATIONS
  FUTURE T&E
PART 4. TEST AND EVALUATION RESOURCE SUMMARY
  INTRODUCTION
  TEST ARTICLES
  TEST SITES AND INSTRUMENTATION
  TEST SUPPORT EQUIPMENT
  TEST AND EVALUATION FUNDING SUMMARY
APPENDIX A - BIBLIOGRAPHY
APPENDIX B - ACRONYMS

90 For Training Use Only 1. INTRODUCTION 1.1 PURPOSE This Test and Evaluation Master Plan (TEMP) for the Navy-led Joint Enterprise Combat Support System (JECSS) was produced in conjunction with the JECSS Acquisition Strategy Report (ASR) and the Initial Capabilities Document (ICD) in support of a Milestone A decision. JECSS is currently in the pre-acquisition process. Milestone A will authorize the USN Project Manager to select and purchase appropriate commercial-off-the-shelf (COTS) software to support JECSS requirements, and to establish a contract with an Enterprise Resource Planning (ERP) system integrator. The system integrator will devise a strategy for implementing the COTS ERP solution and proceed with JECSS blueprinting and planning. Specific schedules, including the order in which the solution will be implemented, are notional and will not be fully known until developed by the system integrator in preparation for Milestone B. Note: A waiver has been obtained to allow the use of a Lead Systems Integrator (LSI) with future JECSS contracts.
1.2 MISSION DESCRIPTION The Navy initiated the Logistics Enterprise Architecture (LogEA) to bring together major disciplines such as Accounting and Finance, Acquisition, Human Resources, Installations and Environment, Strategic Planning and Budgeting, and Technical Architecture into a single integrated logistics enterprise for DoD. The LogEA is closely tied to the Department of Defense (DoD) Battlefield Enterprise Architecture Logistics (BEA-LOG), which provides higher-level guidance for developing enterprise architectures. The expectation of LogEA is to develop and employ an enterprise-wide logistics architecture that encompasses both operations and systems. LogEA will meet DoD guidelines for its Future Logistics Enterprise as spelled out in Joint Vision 2020 and will fully support the Navy and other services by adopting transformational management and execution practices that are not bound by existing processes, legacy systems, geography, or organization. The mission of the Joint Enterprise Combat Support System (JECSS) is to support LogEA by providing an integrated tool-suite to be used by service logistics enterprises to deliver timely logistics support to joint war fighters. The integration provided by the JECSS will allow the services to manage logistics from an enterprise-wide perspective focused on meeting the war fighter requirements effectively and in a cost-effective manner. The JECSS will allow joint logistics to become rapid in its response, dynamically re-configurable in its structure, integrated across all its processes, and common in its application. The JECSS will deliver reliable, time-certain, and effective support and information based on operational requirements. It will have network-centric operations leveraging centralized planning and decentralized execution with real-time joint Command, Control, and Communications (C3).
1.3 SYSTEM DESCRIPTION The JECSS will be an IT enabler for Logistics transformation by using commercially available products. These products will consist of a Commercial Off-the-Shelf (COTS) Enterprise Resource Planning (ERP) basic package based on commercial best 90

91 For Training Use Only practices that provides most of the capability found in the current logistics legacy systems, and COTS bolt-on(s), as necessary, to provide specific capabilities. The ERP is defined as a set of application software that brings maintenance, material management, financials, procurement, and other business practices into balance. It includes a set of business process solutions using a single integrated relational database system to manage enterprise operations. Common data and practices will be shared across the enterprise, providing real-time information for decision-making and performance measurement. The JECSS ERP product and system integrator will be selected separately. The ERP product acquisition will be conducted first to allow the Navy to select the product that best meets joint requirements. The system integrator will then be selected based on its approach for implementing the chosen ERP, bolt-ons, and other transition/integration tasks. The system integrator will be responsible for development, configuration, developmental testing, migration planning from legacy systems, and fielding of the JECSS. The JECSS will connect with the classified secret Marine Combat Mission System (MCMS) and the SAP/SAR Global Combat Support System-Air Force (GCSS-AF) Integration Framework (IF). These two connections will require Type 1 encryption. The JECSS is a functional Major Automated Information System (MAIS) that combines the MCMS and GCSS-AF into a System of Systems (SoS). Connections to the MCMS and GCSS-AF will require both Principal Accrediting Authority (PAA) and National Security Agency (NSA) involvement/approval and NSA's approval of JECSS Type 1 encryption. The JECSS will be made up of the following integrated modules:
Advanced Planning & Scheduling
Materials Management & Contracting
Facilities Management
Configuration & Bill-of-Materials (BOMs)
Decision Support
Document Management
Repair & Maintenance Management
Customer Relationship Management (CRM) / Order Management
Quality Control
Distribution & Transportation
Budgeting
It is envisioned that each module may require multiple iterations of configuration, development, integration, testing, and user training after Milestone B to deliver the functionality that will comprise each module. The system integrator will propose the best solution for delivering a total joint system based on ERP product requirements and the best approach to legacy transitioning. The system integrator will be responsible for determining data requirements for each module and/or iteration on an ongoing basis. This will include an accurate mapping of existing interface requirements (Legacy to Legacy) and all new interface requirements (Legacy to ERP and resultant ERP to Legacy modifications). Further details of all functions and modules are provided in the Key Capabilities section (1.3.3). 91

92 For Training Use Only SYSTEM THREAT ASSESSMENT JECSS is a software application on the classified Marine Combat Mission System (MCMS) and the classified Global Combat Support System-Air Force (GCSS-AF). Connections to the MCMS and GCSS-AF will require Type 1 encryption and require NSA involvement/testing. The Joint Mission Needs Statement (MNS) for GCSS-AF, dated 10 September 2010, describes this requirement. It states, "MCMS and GCSS are needed to provide a classified and real-time multidimensional view of the battle space and the joint war fighter with a near-real-time Focused Logistics. MCMS and GCSS will serve as enablers to achieve not only Focused Logistics in the joint environment but also to implement the other operational concepts of Joint Vision 2010/2020." The XXXX Threat Assessment dated XXX is required by DoDI . Test activities, to include encryption and penetration testing, could be handled by the National Security Agency (NSA). Only properly tested and approved products will be used. Lastly, integration of COTS components will be evaluated for compliance with industry best security practices, using research from preceding government projects. (Further Details Omitted for Training Purposes)
PROGRAM BACKGROUND The Global Combat Support System Air Force (GCSS-AF) Operational Requirements Document (ORD), dated 23 December 2010, serves as the Initial Capabilities Document (ICD) for JECSS. A JECSS Capabilities Development Document (CDD) that will contain Key Performance Parameters (KPPs) is being prepared for Milestone B. Although the KPPs are currently tentative, they will include, as a minimum, security and cybersecurity parameters to provide for the protection of unclassified, critical sensitive, classified secret, and SAP/SAR information, and net-ready parameters to ensure interoperability. The KPPs that are developed in the CDD will be incorporated into the Milestone B Test and Evaluation Master Plan (TEMP) as Measures of Effectiveness and Suitability, as appropriate. The user community may further refine the KPPs as the acquisition proceeds. All activity interfaces, services, policy-enforcement controls, and data-sharing of the Net Centric Operations and Warfare Reference Model (NCOW RM) and GIG Key Interface Profiles (KIPs) will be satisfied to the requirements of the specific Joint integrated architecture products (including data correctness, data availability, and data processing), specified in the threshold (T) and objective (O) values. Threshold is 100 percent of interfaces; services; policy-enforcement controls; and data correctness, availability, and processing requirements designated as enterprise-level or critical in the Joint integrated architecture. Objective is 100 percent of interfaces; services; policy-enforcement controls; and data correctness, availability, and processing requirements in the Joint integrated architecture. IAW CJCSI E, this program will comply with the taxonomy and lexicon of Net Centric Operations and Warfare (NCOW) concepts and terms, and architectural descriptions of NCOW concepts. It will also comply with the NCOW RM activities, services, and standards required to establish, use, operate, and manage the net-centric enterprise information environment to include: the generic user-interface, the intelligent- 92

93 For Training Use Only assistant capabilities, the net-centric service capabilities (core services, Community of Interest (CoI) services, and environment control services), and the enterprise management components. The declaration is the list of the Key Interface Profiles (KIPs) that apply to the system. The KIPs declaration alone does not ensure interoperability; a system must also be designed against the appropriate architectures, most current version of the DOD Information Technology Standards Registry (DISR) and IA standards. A checklist for KIP applicability assessment is at Appendix D. JECSS relies on the GCSS-AF Integration Framework (IF) for the key interface profile perspective. NOTE: Architecture products included in the draft JECSS CDD were developed in the absence of Joint Integrated Architectures and completed Key Interface Profiles. The architecture products included in the draft JECSS CDD have been developed in compliance with currently available NR-KPP products (CRDs, KIPs, NCOW RM, etc) and the products included herein provide the baseline for solution design and test. As new KIPs are released they will be incorporated into the TEMP which would then become the baseline for test. The JECSS PMO has reviewed all appropriate cybersecurity policy and guidance, and has addressed the implementation of these cybersecurity considerations in the JECSS Program Cybersecurity Strategy. In accordance with DODD , enclosure 4, paragraph E.4.2, IT System Procedures states MAIS programs must comply with the Clinger-Cohen Act requirement to have a cybersecurity strategy. The JECSS PMO understands cybersecurity requirements shall be addressed throughout the system life cycle in accordance with DoD Instruction , Cybersecurity, and DoD Instruction , Risk Management Framework for DoD Information Technology. Currently, the scope of JECSS is for unclassified, critical sensitive, classified secret, and Special Access Program/Special Access Required (SAP/SAR) information. Marine Corps use will be classified secret and Air Force use will be up to SAP/SAR. The Cybersecurity Strategy is an integral part of the program s overall acquisition strategy and identifies the technical, schedule, cost, and funding issues associated with executing requirements for cybersecurity. The JECSS Cybersecurity Strategy will leverage C&A features and processes provided by the MCMS Marine and GCSS-AF program offices to ensure adequate cybersecurity in a streamlined manner. Although this document is at the strategy level, the PMO is focused on how JECSS will completely integrate cybersecurity into all aspects of the program throughout the lifecycle. The JECSS cybersecurity foundation is built on up-front integration of applicable laws, guidance and directives. This approach will help ensure adequate cybersecurity from cradle to grave. This means cybersecurity requirements are addressed from initiation and source selection -- by including cybersecurity requirements as part of source selection criteria -- through decommissioning of the system -- by including Discretionary Access rules in the data migration to the system's evolutionary replacement. The JECSS Program Management Office (PMO) will ensure program cybersecurity, through the PMO assigned Authorizing Official (AO), is a synergistic checks-and-balances partnership with the MCMS and GCSS-AF PMOs; and that authorization decisions (ATO, IATT) are made in accordance with the cybersecurity risk management framework. 93

94 For Training Use Only KEY CAPABILITIES - (OMITTED FOR TRAINING PURPOSES)
KEY INTERFACES The JECSS will encompass Navy business management processes related to service provider, engineering, supply chain management, expeditionary command and control, maintenance, repair, and overall work force management. The Navy-led JECSS must also interface with a number of external joint systems to support these processes across the DoD. These external systems cross the spectrum of ERP functionality and will include Defense Finance and Accounting Service (DFAS) accounting systems as well as the Defense Contract Management Agency (DCMA) systems, logistics systems, and DoD HR systems. The JECSS will utilize the Enterprise Application Integration (EAI) services of the GCSS-AF IF to reuse existing external interfaces established to sustain joint operations while enabling the retirement of existing logistics and procurement legacy systems that are presently satisfying those Information Exchange Requirements (IER). The JECSS will provide a common operating environment and single integrated business operations picture that supports current and future war fighter logistics needs. The JECSS must efficiently interface with external service providers, oversight organizations, and joint customers. As a result, interoperability is established through internal and external Navy interfaces. These interfaces will be handled by the MCMS and GCSS-AF IF Enterprise Application Integration (EAI) services for information sharing and to expose unique logistics data for Joint Task Force consumption. The JECSS program manager is responsible for interfaces to the MCMS and GCSS-AF. The JECSS will comply with the appropriate and applicable standards within the Global Information Grid (GIG), Network Centric Enterprise Services (NCES), and the Joint Technical Architecture (JTA). The JECSS will also comply with NSA cross-domain requirements for a classified SoS. The JECSS Capability Development Document (CDD) will list the Operational Information Exchange Matrix, Operational View-3 (OV-3), and the System Information Exchange Matrix, System View-6 (SV-6). The JECSS program office is responsible for interfaces to the MCMS and GCSS-AF, as well as for strategies to meet GIG, NCES, and JTA requirements.
SPECIAL TEST REQUIREMENTS OMITTED FOR TRAINING PURPOSES
SYS ENGINEERING REQUIREMENTS OMITTED FOR TRAINING PURPOSES 94

95 For Training Use Only 2. TEST & EVALUATION PROGRAM MANAGEMENT AND SCHEDULE 2.1 T&E MANAGEMENT Organizational roles and responsibilities as they relate to test and evaluation of the JECSS are as follows: JECSS Program Manager (PM). Responsible for overall management of the DT&E program. Coordinates and publishes the Test and Evaluation Master Plan (TEMP). Supervises the System Integrator (SI) and is responsible for all Developmental Test and Evaluation (DT&E). Provides a test officer who chairs the T&E WIPT. Provides technical engineers and subject matter experts to support these activities. JECSS Chief Engineer (CE). Coordinates and publishes the Test and Evaluation Master Plan (TEMP). Oversees the overall JECSS cybersecurity strategy; and has been assigned, by the JECSS PM, as the Risk Management Framework (RMF) Authorizing Official (AO). Responsible for the JECSS Interim Authority to Test (IATT) and Authority to Operate (ATO). Test and Evaluation Working Level Integrated Product Team (T&E WIPT). A Navy and Air Force cross-functional test team, chaired by the Navy PM and the embedded Air Force test director, whose main purpose is to create and manage the JECSS T&E strategy throughout the life of the program. The T&E WIPT conducts high level T&E planning, advises the JECSS Navy PM on T&E matters, and authors the TEMP. Members may include PM and OTA representatives, developers, the joint user communities, and the T&E community. COMOPTEVFOR. The independent Operational Test Agency (OTA) for the Navy s Marine Combat Mission System (MCMS). Responsible for OT portions of the TEMP. COMOPTEVFOR concurs on and signs the TEMP, provides a representative on the T&E WIPT, and participates in combined DT/OT. Prepares detailed OT&E plans and performs independent OT&E, including operational assessments (OAs), Multiservice OT&E (MOT&E), Qualification OT&E (QOT&E), and Follow-on OT&E (FOT&E). Prepares OT&E reports that evaluate the system s operational effectiveness, suitability, and survivability. Air Force XXX Test Squadron. A government developmental test support organization responsible for overseeing and/or conducting classified DT&E in support of the JECSS Program Office. Will maintain insight into System Integrator (SI) activities as it pertains to classified testing and oversee DT&E activities in support of the JECSS program office test group. Responsible for inputs to the JECSS Information Support Plan (ISP) and Part III of the TEMP. The ISP will be developed in EMD. Joint Interoperability Test Command (JITC). JITC is the DoD chartered agency for certifying interoperability. JITC recommends specific interface and system Joint Interoperability Certification after viewing test events, collecting data where appropriate, and analyzing test results. JITC will review JECSS program documentation to include the CDD, C 4 ISP, ISP, SSS and TEMP to ensure that they meet guidelines identified in CJCSI H/10 Jan 12 and CJCSI E/15 Dec 08 (IAW DoDI May 95

96 For Training Use Only 03). JITC will collect data with the XXX Test Squadron during DT and COMOPTEVFOR during MOT&E to eliminate duplication of resources. JITC will test the JECSS information transfer effectiveness to other specifically designated external systems in an operational environment. JITC will ensure that data elements and resulting test dealing with Joint Interoperability are valid and sufficient for joint and external interoperability certification purposes. JITC, via consultation and coordination with the XXX Test Squadron and COMOPTEVFOR, will compile and analyze applicable DT and OT interoperability test results. JITC will provide the Joint Staff J8 and Program Manager an Interoperability Certification memorandum for J8 approval when sufficient joint interoperability-related data are available to analyze and support such a certification. System Integrator (SI). Contractor responsible for integration of the COTS ERP solution. Plans and conducts DT&E (or its ERP equivalent) in coordination with the Lead Developmental Test Organization (LDTO). The LDTO will be determined prior to Milestone B. Responsible for meeting contractual requirements for the readiness of JECSS for OT&E. Collaborates with the LDTO on Part III of the TEMP. Assistant Secretary of the Navy for Acquisition. User representative. Marshals joint requirements and produces coordinated JECSS requirements documentation. Assists in providing representative users for DT/OT and MOT&E. Concurs on and signs TEMP. Assistant Secretary of Defense (Networks and Information Integration) [ASD(NII)]. Exercises overall OSD oversight for JECSS and is the Milestone Decision Authority (MDA). Chairs the Information Technology Acquisition Board (ITAB) and issues Acquisition Decision Memorandums (ADMs) as appropriate. Provides the staff and chairmanship of the IT Overarching Integrated Product Team (OIPT) and the Integrating Integrated Product Team (IIPT). The IT OIPT Leader signs the TEMP along with DOT&E to provide OSD approval. National Security Agency (NSA). The NSA will be involved for type 1 encryption only, starting at Milestone B. Deputy Assistant Secretary of Defense for Developmental Test and Evaluation [DASD(DT&E)]. Exercises DT&E oversight for selected acquisition systems. Monitors DT&E and assesses compliance with DoD system engineering and DT&E policies and procedures. Staffs the TEMP at the OSD level and develops a recommendation regarding TEMP approval for the Deputy to the ASD(NII) for C3ISR & IT Acquisition (the IT OIPT Leader). Director, Operational Test and Evaluation (DOT&E). Exercises OT&E oversight for selected acquisition systems. Reviews and approves all plans for operational testing including the TEMP, combined DT/OT plans, OA plans, and MOT&E plans. Provides an independent assessment of operational effectiveness, suitability and survivability, and reports the results to the MDA and to the Congress. 2.2 DATA STRATEGY OMITTED FOR TRAINING PURPOSES 96

97 For Training Use Only 2.3 INTEGRATED TEST PROGRAM SCHEDULE The following JECSS Integrated Program Schedule (Figure 1) depicts the currently planned functional areas and their notional schedules: Figure 1 JECSS Integrated Program Schedule Each functional area will be developed as one-to-many iterations defined at milestone B, each following a seamless testing approach. Integration testing will be performed, but much of the integration is yet to be determined because of the dependencies upon the software product(s) chosen. We anticipate the integration points will start coming to light during the pathfinder activities. (The pathfinder activities are existing acquisition programs that are different than the JECSS Program.) The integration test strategy will be further defined within the Test and Evaluation Master Plan (TEMP) prior to Milestone B. 97

98 For Training Use Only 3. TEST & EVALUATION STRATEGY 3.1 T&E STRATEGY INTRODUCTION The JECSS will apply Navy Seamless verification principles throughout its testing program in order to integrate developmental and operational test objectives to the maximum extent possible and to ensure a quality application implementation. Developmental Test (DT) for a COTS ERP solution will require a non-traditional approach, given that the solution purchased is already a proven product. DT as it applies to the ERP product implementation will include testing of all newly developed Reports, Interfaces, Conversions, Extensions (RICE) Objects and requirements validation of the configuration and integration of the ERP software modules. This testing will not apply to validating the COTS product itself, which has already been done by the COTS vendor, nor will the JECSS PM be validating the MCMS or GCSS-AF IF, which are the responsibilities of these two service-specific program offices. This validation will be based on the business processes captured during the blueprinting process and documented in the Capability Development Document (CDD) for each iteration. Throughout this phase, the system's performance, functional, and interoperability characteristics will be validated through business process, system integration, and independent government testing to ensure that all system capabilities and requirements are satisfactory. Each test within DT will verify the status of business process integration within the existing software, identify risks and provide information to mitigate design risks, and substantiate the achievement of system functionality and performance requirements. Configuration management and Problem Reporting (PR) (as required) will be the responsibility of the system integrator during DT, with JECSS PMO/LDTO oversight. Developmental testing activities will provide excellent opportunities to verify interoperability. JITC, as DoD's Joint Interoperability Certifier, will engage as early as possible in the acquisition process to identify and exploit opportunities to collect technical interoperability data. To the greatest extent possible, JITC will craft an Interoperability Test Plan (ITP) in concert with the JECSS PMO test group to avoid redundancy and provide adequate test coverage.
3.2 EVALUATION FRAMEWORK OMITTED FOR TRAINING PURPOSES
3.3 DEVELOPMENTAL EVALUATION APPROACH The DT environment will attempt to stay ahead of both the MCMS and GCSS-AF IF software migration plans such that when an iteration is ready to move into Pre-Production, the MCMS and GCSS-AF IF Pre-Production will have been recently upgraded to the DT environment versions. The objective of the JECSS DT phase is to ensure that the delivered product satisfies the functional and technical requirements of the system for a given iteration. DT must determine whether the iteration has met the DT exit criteria and is ready to be submitted for OT. Operational Assessments (OA) will be conducted by COMOPTEVFOR in parallel with DT. As a joint program, OA results will only be provided to the Program Office. A DT report will be prepared certifying readiness to proceed to OT. The DT report will document the results of DT to date, the satisfaction of the DT exit criteria, the JECSS PM's risk assessment, and the JECSS PM's recommendation to proceed to OT 98

99 For Training Use Only based on acceptance of the identified risk. As the AO, the JECSS Chief Engineer will provide approvals as to connectivity of JECSS with all other unclassified and/or classified systems. Table 1 shows the DT exit criteria for release to OT.
Table 1. DT Exit Criteria for JECSS Iteration Release to OT
DT requirements: System configuration baselined and managed. JECSS technical requirements met; any requirements not met or requirements with open deficiencies are identified with impact analyses provided. Demonstrate the functionality required. Interoperability interfaces to other systems demonstrated. No open Priority 1 or 2 problem/discrepancy/change reports against requirements. Priority 3 problems must be documented with appropriate impact analyses completed and work-arounds must be identified. These impact analyses must focus on the problems' potential impact to the system's mission capability and the ability to resolve the affected Critical Operational Issues (COIs).
Additional program requirements: Approved DoD acquisition documentation in place; the PM's DT report certifies readiness to proceed to OT. COMOPTEVFOR's OA reports provide input to the JECSS Program Office.
Table 2 lists the discrepancy report categories (DR) to be used in classifying problems, as identified in the JECSS Configuration Management Plan (CMP) and Annex J, Section J.3 of IEEE/EIA Standard , April.
Table 2. Discrepancy Report Categories
1a: Prevents the accomplishment of an operational or mission critical capability
1b: Jeopardizes safety, security, or other requirement that is designated critical
2a: Adversely affects the accomplishment of an operational or mission essential capability and no workaround solution is known
2b: Adversely affects technical, cost, or schedule risk to the project or to life cycle support of the system, and no workaround solution is known
3a: Adversely affects the accomplishment of an operational or mission essential capability but a workaround solution is known 99

100 For Training Use Only 3b: Adversely affects technical, cost, or schedule risks to the project or to the life cycle support of the system, but a workaround solution is known
4a: Results in user/operator inconvenience or annoyance but does not affect a required operational or mission essential capability
4b: Results in inconvenience or annoyance for development or support personnel, but does not prevent the accomplishment of those responsibilities
5: Any other effect
Developmental Test Objectives The JECSS test team will coordinate DT with those designated to perform operations. Business processes will be tested through scenarios that will be based on the outcome of blueprinting. DT for the JECSS is based upon commercial ERP best practices and will include separate business process and system integration testing of COTS configuration, data migration testing, performance/stress testing, and security/cybersecurity testing. Details of predominant DT are as follows: Unit (module) testing. Unit testing is the separate business process testing (thread, assembly, or string testing) and is the initial level of testing for individual business processes by themselves and in isolation from other business processes. Unit testing will be performed on JECSS units/modules by the System Integrator before DT. Functional testing. Functional testing is primarily concerned with testing the transactions within a module that implements area-specific business processes. Test data needed for transactions will be generated from the data migration team in conjunction with the testing functional team. The data generated by the transaction execution will be available for use by the succeeding function within related functional tests. System integration testing. System integration testing is a test of the business processes and their associated process scenarios, interfaces, middleware, and development programs. System integration testing will validate that the integrated solution satisfies the requirements documented in the Capability Development Document (CDD). It will use the data resulting from real data conversion, and will show that the COTS solution meets JECSS business requirements. Integration testing is the level of testing that involves the combining of separate business processes, bolt-ons, if required, and RICE objects into a complete system. Testing of the JECSS COTS solution, at the business transaction level, ensures that all business events can support the JECSS joint mission and that end-to-end business processes function properly. It tests a collection of functionally and operationally linked separate business processes. System integration testing includes all currently identified JECSS interfaces, reports and forms, and formal end-to-end tests of business processes using a series of controlled test cycles and test scenarios. Upon conclusion of system integration testing, a software test report of system test results will be prepared and submitted to the JECSS PMO. System testing will include an opportunity for JITC to observe integration testing processes. In addition, the system integration test will demonstrate individual transaction performance 100

101 For Training Use Only against operational data, as well as support continued performance tuning of the solution. Additional components considered to be part of the system integration test include the data migration test, system performance test, information assurance test(s), and initial joint interoperability tests. During test planning, the combination of integrated scenarios, test data, test scripts, the execution schedule, and expected results will be defined and documented. Details for specific tests that may make up the system integration test are as follows: Data migration test. The data migration test, building on the data conversion unit test, will test the entire data conversion process from current system extraction to the target software load. It will use operational data from pre-defined site or sites. This test will verify that the components built can be used to convert the production data. The results of this test will be used in the system integration test and will provide initial conversion program run times for cutover planning. The data project team will develop the quality control procedures for data conversion. The validation criteria (i.e. record count, value count) will be determined for each data object. Performance testing. Performance testing will be conducted, to the extent possible, in the developmental environment to evaluate the throughput and response times for the application and system. This test determines whether the application is fast enough to process the volume of work expected. Performance testing entails identifying all application areas that can be overloaded, determining the threshold for each area, designing a test to exceed the threshold, and verifying that overload handling works as expected. The simultaneous entry of a high-volume of online transactions, combined with the expected batch volume (scheduled jobs, reports, etc.) will be used to evaluate the effect of a simulated peak load on application performance. Automated testing tools will be used to generate automated transaction scripts and to emulate a multitude of users running those scripts simultaneously. Performance issues of the MCMS and GCSS-AF Integration Frameworks (the SoS) are beyond the scope of the JECSS effort. Cybersecurity testing. Cybersecurity testing will verify that both the MCMS and GCSS-AF security tiers ensure adequate protection of all system information. Testing will verify compliance with network security policies, access controls, auditing, and other security requirements as specified in DoDI , DoDI , and DoDI Joint interoperability testing. The primary basis and measure for joint interoperability that defines the interoperability threshold and objective requirements is the Information Exchange Requirements (IER). The JITC will facilitate (by reviewing test artifacts) or conduct data gathering to support Joint Interoperability Certification determination at every opportunity. The JECSS Chief Engineer will contact JITC after 101

102 For Training Use Only the Preliminary Design Review (PDR). Opportunities to generate and collect data will occur during DT activities. Once the DT exit criteria are satisfied, the application will be validated by both the MCMS and GCSS-AF Pre-Production systems. The application will be delivered to MCMS and GCSS-AF with a complete deployment package, to include automated test scripts. The MCMS and GCSS-AF program offices will be responsible for installing the application, running the test scripts to verify integration functionality, and then running regression test scripts to ensure there is no negative impact on other applications already deployed on the MCMS and GCSS-AF IF. When all the OT entrance criteria are satisfied, MOT&E can begin. Table 3 shows the OT entrance criteria. 102

103 For Training Use Only Table 3. OT Entrance Criteria (responsible party: criteria)
OTA: DOT&E-approved Operational Test and Evaluation Plan and concept. User test participants trained on their testing roles. Data collectors in place, ready to support testing.
PM: DoD acquisition documents approved. System configuration frozen and configuration managed. Cybersecurity Authority to Operate issued. User and site operations manuals and training provided. System Standard Operating Procedures and business processes defined. System ready for test.
Test sites: User test participants trained and ready to support testing. System configurations and site databases verified functional and necessary data conversion complete. Site/command approval to proceed with test granted.
MCMS / GCSS-AF: Successful MCMS and GCSS-AF Pre-Production Integration and Regression Test.
JITC: Sufficient, appropriate, technical Joint Interoperability Certification-supporting data elements identified and sufficient technical data gathered, under configuration management, to demonstrate IER satisfaction to technical threshold values. Detailed OT strategy, data element identification, and test planning in place to gather sufficient Joint Interoperability Certification-supporting data, generated by users using JECSS as a tool, in their environment, to do their jobs, to measure effectiveness and suitability. Interim Interoperability Certification granted, with OT data requirements to support full certification identified.
M&S OMITTED FOR TRAINING PURPOSES
Test limitations OMITTED FOR TRAINING PURPOSES
3.4 Operational Evaluation Approach COMOPTEVFOR, as the lead Operational Test Agency (OTA), will be responsible for planning, conducting, and reporting OT&E activities for the JECSS program. 103
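For classroom discussion only: the entrance criteria in Table 3 can be thought of as a checklist grouped by responsible party, with MOT&E starting only when every party has closed its items. The Python sketch below is an illustrative tracker; the party names follow Table 3 as shown above, while the abbreviated criteria strings and status values are invented placeholders and do not reflect any actual program tool.

```python
# Illustrative OT entrance-criteria tracker, loosely following Table 3 above.
# Criteria are abbreviated and status values are placeholders.
ot_entrance_criteria = {
    "OTA": {"DOT&E-approved OT&E plan": True, "Data collectors in place": True},
    "PM": {"Acquisition documents approved": True, "ATO issued": False,
           "System configuration frozen": True},
    "Test sites": {"Users trained": True, "Site databases converted/verified": True},
    "MCMS/GCSS-AF": {"Pre-Production integration & regression test passed": True},
    "JITC": {"Interim interoperability certification granted": False},
}

def ready_for_motae(criteria_by_party):
    """Return (ready?, list of open items) across all responsible parties."""
    open_items = [f"{party}: {item}"
                  for party, items in criteria_by_party.items()
                  for item, met in items.items() if not met]
    return (len(open_items) == 0, open_items)

ready, open_items = ready_for_motae(ot_entrance_criteria)
print("Ready for MOT&E:", ready)
for item in open_items:
    print("  open:", item)
```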

104 For Training Use Only Operational testing activities will occur during development and at the final production operational representative system both during DT&E and OT&E to provide seamless verification. The purpose of OT is to determine whether or not JECSS is operationally effective and operationally suitable when operated by typical trained users in representative, operational, and joint environments. OT will employ a spectrum of Navy, Marine, Army, and Air Force logistics users and other designated users in scenarios that exercise the functions, processes, procedures, doctrine, logistics and maintenance support concepts planned for use when the system is fielded. During early test phases, contractor test, DT and OT, combined test will be considered and used where possible to maximize resources utilization and condense overall testing time by achieving multiple test objectives with each testing event. JECSS OT activities will fall into one of two categories, 1) Operational Assessments (OA)/Early Operational Assessments (EOA) and 2) Multiservice Operational Test and Evaluation (MOT&E) depending on the development/acquisition stage of the system. JITC will conduct assessments and recommend specific interfaces and system interoperability certifications by viewing test events, collecting data, and analyzing test results for interoperability. NSA could conduct cross-domain and encryption assessments and recommend security certifications by viewing test events, collecting data, and analyzing test results for systems security and encryption if necessary/required. Previously coordinated DT data will provide JITC and COMOPTEVFOR with technical data and other information on the administrative and technical joint JECSS interface aspects. Accordingly, JITC will participate with COMOPTEVFOR during the various OT&E activities to ensure that at least the minimum amount of data is collected and that it is sufficient for interoperability assessments, Joint Interoperability Certification and OT&E purposes. JITC will assist COMOPTEVFOR in testing information transfer effectiveness in an operational environment. JITC will work in consultation and coordination with COMOPTEVFOR to compile and analyze applicable test data to support the JECSS DT, OT, and Joint Interoperability Certification. NSA could be requested to assist the program office with security and encryption certifications and/or conduct an independent penetration test. If required, penetration testing will be conducted by an NSA support contractor. Data from this testing will be reviewed by NSA and provided to the program office. JITC will provide the program office and COMOPTEVFOR with a specific interface or a system Interoperability Certification Memorandum and submit that memorandum to the Joint Staff J8 to concur and issue the Joint Interoperability Certification. The JECSS Chief Engineer will be responsible for providing systems security approvals and encryption certifications as required Mission-Oriented Approach The JECSS Critical Technology Elements (CTEs) are assessed at Technology Readiness Level (TRL) 4 at this time, and leading into Milestone A. The program office has conducted a proof of concept. Appropriate market research has been conducted prior to technology maturation and risk reduction to reduce duplication of existing technology and products, and JECSS will demonstrate a high likelihood of accomplishing its intended mission. The program office will ensure that the JECSS 104

105 For Training Use Only CTEs are assessed in accordance with current laws and regulations, prior to TRL certification. It is anticipated that the program will progress to TRL 6 just before Milestone B Operational Test Objectives With the objective of assessing the extent to which JECSS is operationally effective and suitable, COMOPTEVFOR will conduct EOA/OA on the JECSS at various stages of development and integration. The JECSS T&E WIPT, as a consensus, will recommend to COMOPTEVFOR when the JECSS is ready for an EOA/OA. COMOPTEVFOR will recommend to the JECSS PM at what stages any type of EOA/OA would be appropriate. The purpose of the EOA/OA is to provide the JECSS PM and OTA feedback as to the system s level of maturity and readiness for MOT&E. The EOA/OA will, to the extent possible, involve joint functional users early within developmental, integration, prototyping, and integrator site test environments to assess the system s progress towards satisfying stated operational requirements, as defined in the JECSS CDD. The EOA/OA findings will assist the JECSS PM in the identification and management of programmatic risk. With the objective of determining the extent to which JECSS is operationally effective and operationally suitable, COMOPTEVFOR will conduct MOT&E to support the MDA in the Full Rate Production (FRP) decision for the JECSS. The MOT&E will involve typical trained users exercising production representative JECSS System configurations in a realistic manner to determine the degree to which JECSS satisfies the stated operational requirements in the CPD. COMOPTEVFOR will rely on test events, performance measurements, observations, and user questionnaire findings collected throughout OT activities to evaluate the JECSS s COIs. Following Navy Program Executive Officer (PEO) certification of readiness for MOT&E, COMOPTEVFOR will conduct an MOT&E of the first JECSS release intended for operational use, with additional OT events conducted as determined by risk assessments conducted in accordance with DOT&E s XXXX dated XXX. COMOPTEVFOR will provide an MOT&E final report to DOT&E. DOT&E will consider these findings and recommendations in its independent assessment to the MDA in support of the MDA s deployment decisions. COMOPTEVFOR will conduct the MOT&E for the JECSS as soon as the system is certified ready for dedicated OT. Testers will examine JECSS capability to perform its required functions and generate requisite reports and files when operated by users in the intended operational environment. System interfaces will be exercised during the MOT&E to verify interoperability. The MOT&E will consist of typical Logistics and possibly other external users performing normal day-to-day processes in their normal operational environment and will include both automated and manual tasks. Users will be required to perform functional tasks during the MOT&E that may include both Logistical processes and the non-core Logistical processes. At the conclusion of the MOT&E, COMOPTEVFOR will prepare a final report describing the MOT&E and summarizing the results in support of the full-rate production and fielding decision. 105

106 For Training Use Only M&S OMITTED FOR TRAINING PURPOSES Test Limitations OMITTED FOR TRAINING PURPOSES 3.5 Future Test and Evaluation OMITTED FOR TRAINING PURPOSES 4. RESOURCE SUMMARY 4.1 Introduction OMITTED FOR TRAINING PURPOSES Test Articles OMITTED FOR TRAINING PURPOSES Test Sites and Instrumentation OMITTED FOR TRAINING PURPOSES Test Support Equipment OMITTED FOR TRAINING PURPOSES Threat Representation OMITTED FOR TRAINING PURPOSES Test Targets and Expendables OMITTED FOR TRAINING PURPOSES Operational Force Test Support OMITTED FOR TRAINING PURPOSES Simulations, Models and Testbeds OMITTED FOR TRAINING PURPOSES Joint Mission Environment OMITTED FOR TRAINING PURPOSES Special Requirements OMITTED FOR TRAINING PURPOSES 4.2 Test and Evaluation Funding Summary - See note 1, below. Note 1: Draft T&E funding requirements estimates have been developed, and are currently undergoing review. It is anticipated that funding will be obtained via Congressional plus-ups for the first three years, as well as through the PPB&E process. Recent Program Budget Decision PBDXYZ has resulted in an across the board 30% budget cut for the JECSS program. Replacement funding has not been identified. APPENDIX A BIBLIOGRAPHY - OMITTED FOR TRAINING PURPOSES APPENDIX B ACRONYMS - OMITTED FOR TRAINING PURPOSES 106

107 Software T&E Lesson 5.1 Software T&E 107

108 THIS PAGE INTENTIONALLY LEFT BLANK 108

109 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 5 Lesson Title Software, Interoperability, and Cybersecurity T&E Lesson Time 4.5 hours (includes one exercise) Terminal Learning Objective Given a T&E strategy, the student will correctly assess issues related to T&E of Information Technology Systems. (TLO #4) Enabling Learning Objectives Recognize T&E's role in translating requirements documents (cybersecurity strategy, program protection plan, information support plan) to identify evaluation criteria to support T&E planning efforts. (ELO #4.1) Given a T&E strategy, develop issues pertaining to cybersecurity, systems engineering, and IT system development. (ELO #4.2) Given a T&E strategy, critique the adequacy of the cybersecurity strategy. (ELO #4.3) Given a T&E strategy, critique the adequacy of T&E approaches for software intensive systems. (ELO #4.4) Given a T&E strategy, critique T&E methodologies for assessing system interoperability. (ELO #4.5) Recognize how software and cybersecurity T&E fit into system development. (ELO #4.6) Given a T&E strategy, analyze issues related to hardware and software component interoperability and issues related to integration (to function and perform properly) in a large system of systems. (ELO #4.7) Learning Method Class discussion, participation, and group exercise. Assignments Homework assignment (read the JECSS Milestone A TEMP). 109

110 TST 204 Intermediate Test & Evaluation Estimated Student Preparation Time 90 minutes Method of Assessment Written examination, group presentations. References DoDD , The Defense Acquisition System DoDI , Operation of the Defense Acquisition System Defense Acquisition Guidebook, Chapter 9 TST 204 Student Reference Disc Test & Evaluation Management Guide, 6th ed., 2012 version, chapters 14, 15, 20,

111 Software T&E The following continuous learning modules apply to this lesson: - CLE041 Software Reuse - CLB023 Software Cost Estimating - CLE060 Practical Software & Systems Measurement DAU also offers Software Acquisition Management (SAM) and Information Resource Management (IRM) courses Outline Software T&E Mission Software Test Planning Software Test Trouble Reporting Software Test Exit and Maturity Criteria Software IOT&E COTS and NDI Software Lessons Learned and Best Practices 2 111

112 Software T&E Mission Objectives: Demonstrate performance of the whole system; Assist in fault detection & correction. How: Incremental test strategy; ID & correct SW errors early; Provide scientifically based metrics to measure progress & quality; Provide data to support acquisition decisions. 3
Example Typical Costs of Software Fixes*
Lifecycle SW Development Activity: Initial $ Spent / Errors Introduced / Errors Found / Relative Cost
Requirements Analysis: 5% / 55% / 18% / 1.0
Design Activities: 25% / 30% / 10% / -
Testing Activities: 60% / 10% / 50% / -
Documentation: 10% / - / - / -
Post-Deployment Software Support (PDSS): ---* / 5% / 22% / -
*Once a system is fielded, PDSS costs are typically 50-70% of total system lifecycle costs
B. Boehm, Software Engineering Economics 4 112
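The table above makes a simple arithmetic point: most errors are introduced early (requirements and design) but most are found late (testing and post-deployment), and the later a fix is made the more it costs. The sketch below just walks through that arithmetic. The "errors found" percentages follow the table; the relative-cost factors other than 1.0 and the total defect count are invented for illustration only, since those cells are not shown above.

```python
# Hypothetical arithmetic: where errors are found drives rework cost.
errors_found_pct = {"Requirements": 0.18, "Design": 0.10, "Testing": 0.50, "PDSS": 0.22}
relative_fix_cost = {"Requirements": 1.0, "Design": 3.0, "Testing": 10.0, "PDSS": 40.0}  # placeholders

total_defects = 1000  # assumed defect count for the example

weighted_cost_units = sum(total_defects * errors_found_pct[p] * relative_fix_cost[p]
                          for p in errors_found_pct)
baseline_cost_units = total_defects * 1.0  # if every defect were caught at requirements-phase cost

print(f"Cost units with late discovery: {weighted_cost_units:,.0f}")
print(f"Cost units if all caught early: {baseline_cost_units:,.0f}")
print(f"Penalty multiplier for finding errors late: {weighted_cost_units / baseline_cost_units:.1f}x")
```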

113 [Figure: Software Development V-Model] The left leg of the V (top to bottom) shows the development activities: Operational Requirements (user needs) & Project Plans; System Requirements; System Design; Software Requirements; Software Architectural Design; Software Detailed Design; Code. The right leg (bottom to top) shows the corresponding test activities, planned concurrently: Unit Test; Unit Integration & Test; Software Subsystem Integration & Test; Software Item Test; Software/Hardware Integration & Test; Government Developmental & System Tests; Operational & Acceptance Tests. 5
Software Test Plan Relationships
Test & Evaluation Master Plan (TEMP): Documents the overall structure and objectives of the T&E program... it includes critical operational effectiveness and suitability parameters for [among other items]... software and COMPUTER RESOURCES... it addresses SOFTWARE MATURITY and the degree to which SOFTWARE DESIGN has stabilized. (Defense Acquisition Guidebook)
Software Development Plan (SDP): Coding & Unit Testing; Unit Integration & Test; SI Qualification Testing; SI/HI Integration & Test; System Qualification Testing; Software Quality Assurance; Corrective Action Process. (DID DI-IPSC )
Software Requirements (DT: substantiate system technical performance; OT: meet CDD requirements)
Software Test Plan (STP): IDs Test Items (SIs, units); IDs Personnel Resources; IDs Test Environment; Provides Test Schedules; Traces to Requirements. (DID DI-IPSC ) 6 113
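The Software Test Plan's "traces to requirements" relationship can be checked mechanically: every SRS requirement should map to at least one test case, and untested requirements should be flagged for later testing. The sketch below is a minimal illustration with made-up requirement and test-case identifiers; it is not a depiction of any specific tool or DID format.

```python
# Minimal requirements-traceability check with invented SRS and test-case IDs.
srs_requirements = {"SRS-001", "SRS-002", "SRS-003", "SRS-004"}

test_cases = {
    "TC-010": {"SRS-001"},
    "TC-011": {"SRS-001", "SRS-002"},
    "TC-012": {"SRS-004"},
}

covered = set().union(*test_cases.values())
untested = sorted(srs_requirements - covered)
orphan_tests = sorted(tc for tc, reqs in test_cases.items() if not reqs & srs_requirements)

print("Requirements with no test case:", untested)        # ['SRS-003'] -> identify for later testing
print("Test cases tracing to no requirement:", orphan_tests)
```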

114 [Figure: Spectrum of Typical Software Tests] The spectrum runs from human-based quality activities to computer-based, process-driven testing activities (test & integration planning and software-item-oriented qualification testing). Among the human-based activities: desk checking is largely ineffective but better than nothing (mental tests); walkthroughs may have defined procedures and recorded results, may lead to SW metrics, and remove around 40% of defects; formal inspections use trained teams and peer reviews, follow a formal process with entry/exit criteria where team attitude and preparation are critical, provide a basis for SW metrics and a genesis for process improvement, and remove around 70% of defects; joint reviews are high-level reviews, frequently abridged, and may or may not be high leverage. 7
J-STD-016 Classification Schemes
Software Testing Problem Reporting Priority Classification Scheme:
Priority 1: Prevents mission accomplishment or jeopardizes safety or other critical requirement
Priority 2: Adversely affects mission accomplishment or cost, schedule, performance, or software support, and no workaround exists
Priority 3: Adversely affects mission accomplishment or cost, schedule, performance, or software support, and a workaround exists
Priority 4: User/operator or support inconvenience
Priority 5: All other problems
Category classifications: Project Plans; Operational Concept; System/Software Requirements; Design; Code; Databases/data files; Test Plans, Descriptions, Reports; User, Operator, or Support Manuals; Other Software Products
Alternate category and priority schemes may be used by developers if approved by the acquirer (J-STD-016, Annex M) 8 114
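The five-level priority scheme above turns on two questions: what is impacted (mission, safety/security, cost/schedule/performance/support, or just convenience) and whether a workaround exists. A small sketch of that decision logic follows; it is a classroom illustration of the scheme as summarized on this page, not an implementation of the standard itself.

```python
def classify_priority(prevents_mission: bool,
                      jeopardizes_safety_or_critical: bool,
                      adverse_mission_cost_schedule_perf_support: bool,
                      workaround_exists: bool,
                      inconvenience_only: bool) -> int:
    """Illustrative mapping of the priority scheme shown above to Priority 1-5."""
    if prevents_mission or jeopardizes_safety_or_critical:
        return 1
    if adverse_mission_cost_schedule_perf_support:
        return 3 if workaround_exists else 2
    if inconvenience_only:
        return 4
    return 5

# Example: an adverse impact on performance with a known workaround -> Priority 3
print(classify_priority(False, False, True, True, False))
```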

115 System DT&E Decision Exit Criteria: Test activities (based on the test plan) demonstrate that the system meets all critical technical parameters & identify technological & design risks. Ready to perform system OT&E: no open Priority 1 or 2 problems; acceptable degrees of requirements traceability / stability, computer resource utilization, design stability, breadth & depth of testing, fault profiles, reliability & interoperability, and software maturity. 9
Software OT Maturity Criteria* (software maturity must be demonstrated prior to OT&E):
Software Problems: No Priority I or II problems; impact analysis for Priority III.
Functionality: Functionality available before OT&E; DT completed; external interfaces functionally certified; PMO identifies unmet critical parameters.
Management: Acquisition Executive must certify (and the Operational Tester must agree) that SW requirements & design are stable, depth & breadth of testing is adequate, DT has demonstrated the required functions, and a deficiency reporting process is in place.
CM Process: Software CM system in place; system is baselined; Test Agency has access to the CM system.
Fixes: All changes completed.
*OSD (OT&E) Memo, Subject: Software Maturity Criteria
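As a rough illustration of the "no open Priority 1 or 2 problems" exit criterion and the Priority 3 impact-analysis condition above, a hedged OT-readiness sketch (the problem-report record fields are assumptions, not a defined data format):

# Illustrative readiness check against the maturity criteria above.
def ready_for_ot(open_problem_reports):
    """open_problem_reports: iterable of dicts with 'id', 'priority' (1-5),
    and 'impact_analysis_done' (bool, meaningful for Priority 3)."""
    blockers = []
    for pr in open_problem_reports:
        if pr["priority"] in (1, 2):
            blockers.append(f"Open Priority {pr['priority']} problem: {pr['id']}")
        elif pr["priority"] == 3 and not pr.get("impact_analysis_done", False):
            blockers.append(f"Priority 3 problem without impact analysis: {pr['id']}")
    return (len(blockers) == 0), blockers

ok, issues = ready_for_ot([
    {"id": "PR-041", "priority": 3, "impact_analysis_done": True},
    {"id": "PR-058", "priority": 2, "impact_analysis_done": False},
])
print(ok, issues)   # False, with one open Priority 2 blocker listed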

116 Software Evaluations During IOT&E: Does the software support system performance? Is the software mature & reliable? Usable by typical operators? Sustainable/maintainable? Testers' bottom line: in all cases of software testing, reproducible failures facilitate software fixes. 11
Software Test Readiness Review (TRR)
Inputs/Entry Criteria: Identified tested requirements (SRS); traceability of test requirements to the SRS & IRS has been established; all software item test procedures have been completed; test objectives identified; applicable documentation, procedures, and plans are complete and under control; method of documenting and dispositioning test anomalies is acceptable.
Outputs/Exit Criteria: Software Test Descriptions are defined, verified & baselined; testing is consistent with any defined incremental approaches; test facilities are available and adequate; tested software is under CM; lower-level software tests are done; software metrics indicate readiness to start testing; the software problem report system is defined and implemented; the software test baseline is established; development estimates are updated; nontested requirements at the SI level are identified for later testing.

117 Information Technology (IT) Trends IT trends that may affect T&E (all of these tend to add more complexity to software testing): Larger & more complex software packages More emphasis on software reuse More challenges / more emphasis on cybersecurity More challenges / more emphasis on interoperability (part of the Net-Ready KPP) Rapid pace of technology advances Open Architecture 13 IT Issues IT issues that may affect the T&E strategy: Size & complexity of the software. Degree of software reuse. Amount of COTS software. Role the software will play (safety, security risks, activities to be performed by the software, critical or high risk functions, technology maturity) Number of interfaces / complexity of integration / number of outside systems requiring interoperability Cost & schedule available for software testing, previous test results Software development strategy, stability of software requirements

118 COTS/NDI SOFTWARE TESTING ISSUES & RISKS Unit level testing/component testing is generally impossible Documentation of many COTS products is not complete or robust Complex, non-standard interfaces abound in COTS Market leverage may not exist to force vendor bug fixes Component understanding depends mostly on vendor claims Formal requirements documents as such generally not available Current COTS use may not match its original design environment 15 COTS/NDI SOFTWARE TESTING ISSUES & RISKS (cont.) Real-time performance may be marginal Robustness and reliability of many COTS products are generally lower when compared with custom code Higher COTS use in a system generally implies more difficult system level integration testing Evaluation of product suitability occurs long before formal testing Frequent market-driven releases complicate regression testing Slip-streaming is common making version numbers meaningless

119 Software T&E Methodologies Software T&E methodologies vary depending on software type Embedded software that operates only on specific equipment Aircraft, ground vehicles, cell phones, submarines, etc. Applications software that may be procured separately from the operating equipment Timekeeping, personnel, government travel systems, I.D. card systems, etc. 17 T&E of Embedded vs. Applications Software Basic test methods are similar for embedded and applications software Identify required schedule, materiel and expertise Use metrics to evaluate Resource Management, Technical Requirements and Product Quality, including Reliability Support evaluation from requirements definition through unit, integration and system test phases across the life cycle (see the software V ) Maintain rigorous data and configuration management Use appropriate models and simulations Test emphasis is very different

120 Embedded vs. Applications Software Test Emphasis Embedded software emphasis is on Does the hardware/software system function correctly in its intended operating environment? System safety & mitigation of key risks Interoperability with other known systems Reliability usually critical Applications software emphasis is on Does the software function correctly (often in an office environment on varying hardware) Cybersecurity is often a key risk Interoperability through standards compliance Reliability may not be as critical 19 Software T&E Considerations for MAIS Programs Large defense unique software intensive programs (such as command & control systems, and combat support systems): Multiple software builds are typically needed to achieve a deployable capability. Each build has allocated requirements, resources & software testing. Defense Business Systems (DBS) expected to have a life-cycle cost above $1M: MDA shall not approve MS A decision if IOC cannot be achieved (fielded) within 5 years (P.L , section 811) This may necessitate multiple increments, with several builds & fieldings for each increment Each limited fielding results from a specific build, and provides the user with mature and tested sub-elements of the overall capability

121 Software Testing Lessons Learned (from a real-life experience in testing a 1,000,000+ SLOC Ada program, STC):
Formulate a test strategy prior to contract award that accommodates cost/schedule constraints. The test strategy should be able to: verify all critical software requirements of the system; test in a way that isolates faults; phase testing to focus on qualification testing, operational thread testing, and performance/stress testing.
Resolve the requirements vs. design information argument early on.
Plan ahead for: adequate schedule; a test regression strategy; timing and format of deliverables; accommodating incremental builds.
Understand the test--be prepared to prioritize test cases. Be flexible and attuned to end-of-schedule pressures. Cop an attitude--know when to fall on your sword. Understand the politics of the acquisition.
One Example of Software Mgmt. Best Practices* - 16 Best Practices:
Project Integrity (PM/SPO): Adopt Continuous Risk Management; Estimate Cost and Schedule Empirically; Use Metrics to Manage; Track Earned Value; Track Defects against Quality Targets; Treat People as the Most Important Resource.
Construction Integrity (Developer): Adopt Life Cycle Configuration Management; Manage & Trace Requirements; Use System-Based Software Design; Ensure Data & Database Interoperability; Define & Control Interfaces; Design Twice/Code Once; Assess Reuse Risks and Costs.
Product Integrity & Stability (Tester): Inspect Requirements and Design; Manage Testing as a Continuous Process; Compile & Smoke Test Frequently.
*As developed by the Software Program Managers Network 121
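One of the practices above, "Track Defects against Quality Targets," can be as simple as comparing open defect counts per build against a planned burn-down. A minimal sketch; the build names and counts are invented for illustration.

# Sketch of "use metrics to manage / track defects against quality targets".
open_defects = {"Build 1": 180, "Build 2": 140, "Build 3": 120, "Build 4": 95}
target       = {"Build 1": 200, "Build 2": 150, "Build 3": 100, "Build 4": 50}

for build in open_defects:
    status = "on track" if open_defects[build] <= target[build] else "OVER TARGET"
    print(f"{build}: open={open_defects[build]:4d}  target={target[build]:4d}  {status}")
# Builds 3 and 4 exceed the target, flagging a trend worth management attention.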

122 Lesson Summary Software T&E Is Critically Important to Modern Program Success Disciplined Process of Test Planning, Trouble Reporting, and Application of good Test Exit and Maturity Criteria is Key to Success Software IOT&E is Growing in Importance COTS and NDI Software Present Additional Challenges Pay Attention to Lessons Learned and Best Practices

123 Cybersecurity T&E Lesson 5.2 Cybersecurity T&E 123

124 THIS PAGE INTENTIONALLY LEFT BLANK 124

125 Cybersecurity T&E References: - DoDI , Cybersecurity March, DoDI , Risk Management Framework (RMF) for DoD IT March, Cybersecurity TE Guidebook 2015 July 1 - DOT&E Memo, Procedures for Operational Test and Evaluation of Cybersecurity in Acquisition Programs, 1 August Defense Acquisition Guidebook, Chapter 8, T&E - DoD Program Manager s Guidebook for Integrating the Cybersecurity RMF into the System Acquisition Lifecycle, Sep 2015, Version 1.0 This lesson will cover the following topics: 1. The Cybersecurity T&E Process 2. The RMF Process 3. Cybersecurity T&E 4. Integrated Architectures 5. Backup Slides Lesson Topics 2 125

126 Cybersecurity T&E
This lesson provides guidance to the T&E community for developing an approach to cybersecurity T&E. Compliance with traditional information assurance policy has proven insufficient to ensure that systemic vulnerabilities are addressed in fielded systems used on the battlefield. A broader cybersecurity T&E approach that focuses on military mission objectives and their critical supporting systems is needed to fully address the cyber threat. Cybersecurity is an integral part of developmental and operational T&E. Cybersecurity T&E planning, analysis, and implementation form an iterative process that starts at the beginning of the acquisition lifecycle and continues through maintenance of the system. Cybersecurity T&E is performed in conjunction with the Risk Management Framework (RMF) as defined in DoDI , Risk Management Framework (RMF) for DoD Information Technology (IT). Additional guidance and best practices are provided in the Cybersecurity T&E Guidebook, V1.0, 1 July 2015. 3
Cybersecurity T&E Overarching Guidelines
Test activities should integrate RMF security controls assessments with tests of commonly exploited and emerging vulnerabilities early in the acquisition lifecycle. The Test and Evaluation Master Plan (TEMP) should detail how testing will provide information to assess cybersecurity and inform acquisition decisions. The goal of cybersecurity DT&E is to identify issues related to resilience of military capabilities before MS C. The goal of cybersecurity OT&E is to ensure that the system under test can withstand realistic, threat-representative cyber-attacks and return to normal operations in the event of a cyber-attack. The Cybersecurity T&E Process represents a shift left because it requires early developmental T&E involvement. The Cybersecurity T&E process requires the development of mission-driven cybersecurity requirements, which requires systems engineering collaboration. To reduce discovery late in the acquisition lifecycle, test in mission context, against a realistic threat, and... Shift Left! 4 126

127 The Cybersecurity T&E Process
Lesson Topics: 1) The Cybersecurity T&E Process 2) The RMF Process 3) Cybersecurity T&E 4) Integrated Architectures 5) Backup Slides
References: - DoDI , Cybersecurity March, DoDI , Risk Management Framework (RMF) for DoD IT March, Cybersecurity TE Guidebook 2015 July 1 - DOT&E TEMP Guidebook Nov - Defense Acquisition Guidebook, Chapter 8, T&E - DoD Program Manager's Guidebook for Integrating the Cybersecurity RMF into the System Acquisition Lifecycle, Sep 2015, Version 1.0
Policy in DoDI , Change 2, Feb 2, 2017:
The DT&E program will... support cybersecurity assessments & authorization. (Enclosure 4)
The Program Manager will develop a strategy and budget resources for cybersecurity testing. The test program will include, as much as possible, activities to test and evaluate a system in a mission environment with a representative cyber-threat capability. (Enclosure 4)
Beginning at Milestone A, the TEMP will document a strategy and resources for cybersecurity T&E. (Enclosure 5)
Beginning at Milestone B, appropriate measures will be included in the TEMP and used to evaluate the operational capability to protect, detect, react, and restore to sustain continuity of operations. (Enclosure 5)
Note: DAG Chapter 8 reflects the DoDI changes and provides a lot of additional information.

128 Cybersecurity T&E Overview: A key feature of cybersecurity T&E is early involvement in test planning and execution. Beginning at Milestone A, the Test and Evaluation Master Plan (TEMP) will document a strategy and resources for cybersecurity T&E. The cybersecurity T&E phases are iterative, i.e., phases may be repeated several times throughout the lifecycle due to changes in the system architecture, new or emerging threats, and changes to the system environment. The first four phases are DT&E; the last two phases are OT and are defined in the DOT&E August 1, 2014 Memo. [Lifecycle figure: MDD through MS A, MS B, MS C, and the Full Rate Production Decision Review into O&S, showing CDD validation, the Dev RFP Release Decision, TEMP updates (Draft, MS A, MS B, MS C), design reviews (ASR, SRR, SFR, PDR, CDR, TRR, SVR), IATT/ATO, DT&E assessments, OTRR, and IOT&E, with the six cybersecurity T&E phases overlaid.] T&E Phases: (1) Understand Cybersecurity Requirements; (2) Characterize Cyber Attack Surface; (3) Cooperative Vulnerability Identification; (4) Adversarial Cybersecurity DT&E; (5) Cooperative Vulnerability and Penetration Assessment; (6) Adversarial Assessment.
Key Cybersecurity T&E Documents: The Cybersecurity Strategy (CS) is STATUTORY for mission critical or mission essential IT systems, regulatory for all programs containing IT, and also an appendix to the PPP. The DoD Chief Information Officer (DoD CIO) is approval authority for ACAT ID/IA programs; the Component CIO is approval authority for all other ACATs. The Risk Management Framework (RMF) Security Plan (SP), signed/approved by the assigned Authorizing Official (AO), is also required at MS A. The PPP, Cybersecurity Strategy, and Security Plan will help guide cybersecurity testing; they also support the IATT (for DT&E events) and ATO (for OT) approved by the AO. [Lifecycle figure repeated, showing where the Program Protection Plan, Cybersecurity Strategy, and RMF Security Plan are developed relative to the milestones and TEMP updates.] Reference DoDI , Jan 2015, Table 2, for the Cybersecurity Strategy and PPP. Reference DoDI and DoD Instruction for the RMF Security Plan. 128

129 Summary of Changes to Cybersecurity Roles & Responsibilities
DIACAP role (DODI , 2007) -> RMF role (DODI ):
Designated Accrediting Authority (DAA) -> Authorizing Official (AO)
Certifying Authority -> Security Control Assessor (SCA)
No explicit role -> Information System Owner (ISO)
Information Assurance Manager (IAM) -> Information System Security Manager (ISSM)
Information Assurance Officer -> Information System Security Officer (ISSO)
Responsibilities (reference DoDI for a complete definition of roles and responsibilities): The AO ensures all appropriate RMF tasks are initiated and completed, with appropriate documentation, for assigned ISs and PIT systems; monitors and tracks overall execution of system-level POA&Ms; and promotes reciprocity. The SCA is the senior official with authority and responsibility to conduct security control assessments. In coordination with the information owner (IO), the ISO categorizes systems and documents the categorization in the appropriate JCIDS document (e.g., CDD). The ISSM maintains and reports IS and PIT systems assessment and authorization status and issues, provides ISSO direction, and coordinates with the security manager to ensure issues affecting the organization's overall security are addressed appropriately. The ISSO is responsible for maintaining the appropriate operational security posture for an information system or program.
Authorizations - Types of Authorizations (DoDI , Encl. 6):
Interim Authorization to Test (IATT): Limited permission to operate and/or connect to a network for a specific period of time, solely to test your system.
Authorization to Operate (ATO): Your system may operate and/or connect to the GIG. Basically, a three-year lifecycle.
Authorization to Operate with conditions: For mission critical systems with Very High or High risk non-compliant security controls. Permission must be obtained from the DoD Component Chief Information Officer. Only valid for one year. Corrective actions must be completed and reviewed by the AO within 6 months of the authorization date.
Can't operate systems without a current ATO or IATT. Testers need to plan ahead, coordinate with the necessary people, and POM for the necessary actions, so ATOs / IATTs are received in time to conduct necessary T&E activities. 129

130 Cybersecurity T&E Process, Phase 1 Understand Cybersecurity Requirements
Purpose: Understand the program's cybersecurity requirements and develop an initial approach and plan to conduct cybersecurity T&E.
Schedule: Typically initiated prior to MS A. Must be performed regardless of where the program is in the acquisition lifecycle.
Major Tasks: Establish the T&E WIPT; compile the list of cybersecurity requirements; identify cyber threats; review the PPP, CS, and RMF SP and document cybersecurity activities in the TEMP; develop the initial evaluation framework and include cybersecurity activities; coordinate RMF artifacts with the AO (for IATT/ATO) during TEMP development; prepare a DT&E analysis (of cybersecurity T&E results to date) in support of PDR; provide input to EMD RFP development. 11
Cybersecurity T&E Process, Phase 2 Characterize Cyber Attack Surface
Purpose: Identify cybersecurity requirements by characterizing the cyber-attack surface. The goal is to identify opportunities an attacker might use, and to plan testing to evaluate those opportunities.
Schedule: Ideally starts prior to EMD, during TMRR (activities must be performed wherever the program enters the acquisition lifecycle). Will be revisited at each milestone and may be iterated as design changes (which may introduce new vulnerabilities) are made.
Major Tasks: Identify the cyber-attack surface. Examine the system architecture (e.g., SV-1, SV-6 viewpoints) to identify interfacing systems, services, and data exchanges that expose the system to potential exploits, including GIG, temporary, and unused connections, critical components and technology. The system architecture will also be reviewed by the AO's Security Control Assessor. Analyze the attack surface (use SMEs to assist in this area). Consider the host environment. Review security artifacts to help identify the attack surface and T&E strategies.
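A hedged sketch of the Phase 2 analysis: starting from an SV-6-style list of data exchanges, flag everything that crosses the system boundary as attack surface and mark unencrypted external exchanges for early vulnerability testing. The interface records below are hypothetical, not from any real program or viewpoint product.

# Sketch of attack-surface characterization from a notional data-exchange list.
data_exchanges = [
    {"interface": "SATCOM link",     "external": True,  "encrypted": True},
    {"interface": "Maintenance USB", "external": True,  "encrypted": False},
    {"interface": "1553 data bus",   "external": False, "encrypted": False},
    {"interface": "GIG connection",  "external": True,  "encrypted": True},
]

# Anything that crosses the system boundary is part of the attack surface;
# unencrypted external exchanges get flagged for early vulnerability testing.
attack_surface = [d for d in data_exchanges if d["external"]]
high_interest  = [d["interface"] for d in attack_surface if not d["encrypted"]]

print([d["interface"] for d in attack_surface])
print("Plan cooperative vulnerability testing first for:", high_interest)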

131 Cybersecurity T&E Process, Phase 3 Cooperative Vulnerability Identification
Purpose: To analyze, test, and assess how an adversary may obtain access to critical mission systems and the subsequent actions the adversary may be able to perform. The goal is to identify and mitigate vulnerabilities and determine measures to improve resilience.
Schedule: Begins after Milestone B, with the Blue Team testing results and cybersecurity kill chain analysis performed in this phase providing input to the Critical Design Review (CDR) and preparation for the TRR.
Major Tasks: Finalize the system testing environment; review available RMF artifacts; perform a vulnerability assessment (Blue Team); perform a cybersecurity kill chain analysis; verify preparation for the 4th phase, adversarial cybersecurity DT&E. 13
Blue and Red Teams
Vulnerability Assessment (Blue Team): Comprehensive; identifies any/all known vulnerabilities present in systems; reveals systemic weaknesses in the security program; focused on adequacy & implementation of technical security controls and attributes; multiple methods: hands-on testing, interviewing personnel, or examination of relevant artifacts; feedback to developers and system administrators for system remediation and mitigation; conducted with full knowledge and cooperation of system administrators; no harm to systems.
Threat Representative Testing (Red Team): Exploits one or more known or suspected weaknesses; attention on a specific problem or attack vector; develops an understanding of inherent weaknesses of the system; both internal and external threats; models actions of a defined internal or external hostile entity; report at the end of the testing; conducted covertly with minimal staff knowledge; may harm systems and components & require clean-up. 131

132 Cybersecurity T&E Process, Phase 4 Adversarial Cybersecurity DT&E
Purpose: Evaluation of the system's cybersecurity in a mission context, using realistic threat exploitation techniques, while in a representative operating environment. The goal of this phase is to evaluate how critical mission objectives: will be impacted if data is altered due to cyber-attack; will be compromised if required data is unavailable; will be compromised if mission data is exploited in advance of mission execution.
Schedule: Conducted before Milestone C.
Major Tasks: Complete resource planning; complete threat-representative test planning; conduct the assessment using a representative threat; develop the DT&E assessment. 15
Cybersecurity Operational T&E
The cybersecurity T&E phases for operational test are applied to all acquisition programs on DOT&E oversight that send or receive digital information via: direct or indirect connections to external networks; wireless or radio frequency connections; physical ports (e.g., USB) or removable data cards; non-Internet-Protocol-based data buses (e.g., 1553); or any system with two-way data transfer capabilities to external networks. DOT&E will evaluate the level of test required for other systems on a case-by-case basis. OTAs are encouraged to apply the procedures to all information handling systems, regardless of oversight.
Note: Refer to TEMP Guidebook 3.0 for specific examples and guidance for incorporating cybersecurity T&E into TEMPs. 132

133 Cybersecurity T&E Process, Phase 5 Cooperative Vulnerability and Penetration Assessment Purpose - This phase consists of an overt and cooperative review of the system to characterize operational cybersecurity status and determine residual risk as well as readiness for adversarial assessment (Phase 6). It includes an OT&E event. Schedule Should begin after the system under test has received an Authorization to Operate (ATO) or an Interim Authorization to Test (IATT) in operationally representative network(s). Will preferably occur before Milestone C, but may occur after Milestone C under certain circumstances. If approved, may be integrated testing, but regardless of whether integrated or not, should make use of all relevant DT data. Major Tasks Test Planning Coordination with a cybersecurity vulnerability assessment team Ensure sufficient post-test availability for correction/mitigation of testdiscovered vulnerabilities. 17 Cybersecurity T&E Process, Phase 6 Adversarial Assessment Purpose A full OT&E of the system s defensive cyberspace performance in the operational environment (including network defense services) to withstand threat representative cyber-attacks, detect and react to those attacks, and to return to normal operations in the event of a successful cyber-attack. All major vulnerabilities (discovered previously) should be corrected or remediated prior to entering this phase. Schedule Conducted before the Full Rate Production or Full-Deployment Decision. The Cyber Operational Resiliency Evaluation can be conducted during or in support of the IOT&E. Duration will depend upon the details of the system design and cyber threat, but a minimum of one to two weeks of dedicated testing is a nominal planning factor with potentially a longer preparation period for threat reconnaissance and research activity. Major Tasks Test Planning Coordination with the Operational Test Agency team

134 The Risk Management Framework (RMF) Process Lesson Topics: 1) The Cybersecurity T&E Process 2) The RMF Process 3) Cybersecurity T&E 4) Integrated Architectures 5) Backup Slides References: - DoDI , Cybersecurity March, DoDI , Risk Management Framework (RMF) for DoD IT March, Cybersecurity TE Guidebook 2015 July 1 - DOT&E TEMP Guidebook Nov Defense Acquisition Guidebook, Chapter 8, T&E - DoD Program Manager s Guidebook for Integrating the Cybersecurity RMF into the System Acquisition Lifecycle, Sep 2015, Version RMF Process Overview

135 [Figure (DoDI , Figure 4): RMF process shown against the acquisition lifecycle, including the Technology Maturation & Risk Reduction phase.]
DoDI , Cybersecurity, extends applicability to all DoD IT processing DoD information. [Figure: DoD Information Technology (IT) categories, including Information Systems (Major Applications, Enclaves); Platform IT (PIT), PIT Systems, and PIT Services; internal and external services; and products (software, hardware, applications).] Cybersecurity applies to all IT that receives, processes, stores, displays, or transmits DoD information.

136 Lesson Topics: 1) The Cybersecurity T&E Process 2) The RMF Process 3) Cybersecurity T&E 4) Integrated Architectures 5) Backup Slides Cybersecurity T&E References: - DoDI , Cybersecurity March, DoDI , Risk Management Framework (RMF) for DoD IT March, Cybersecurity TE Guidebook 2015 July 1 - DOT&E TEMP Guidebook Nov Defense Acquisition Guidebook, Chapter 8, T&E - DoD Program Manager s Guidebook for Integrating the Cybersecurity RMF into the System 23 Acquisition Lifecycle, Sep 2015, Version 1.0 T&E Responsibilities in , , & DOT&E Memo For the developmental test community: Cybersecurity assessments must be integrated into DT&E Cybersecurity planning, implementation, testing, and evaluation must be incorporated in the DoD acquisition process Adequate DT&E to support cybersecurity must be planned, resourced, documented, and executed in a timely manner Coordination is required with the DoD Test Resource Management Center for establishment of DT&E specific cybersecurity architectures & requirements. For the operational test community, the policy: Requires that programs perform cybersecurity assessments as part of operational test assessments. Requires DOT&E to conduct independent cybersecurity assessments during OT&E for systems under acquisition oversight; and that DOT&E oversee cybersecurity assessments by test agencies during both acquisition and exercise events

137 Cybersecurity in the TEMP Section 1: RMF categorization (if completed) Threat information from the STAR, the PPP, cyber attack surface analysis Section 2: Overall T&E Schedule defines cybersecurity T&E events Section 3: Cybersecurity events in the Developmental Evaluation Framework High Level Test Plans for Cybersecurity DT and OT, including, at a minimum, DT cooperative vulnerability testing DT adversarial testing OT cooperative vulnerability and penetration assessment OT adversarial assessment Section 4: Initial definition of resources required for cybersecurity T&E Cyber Ranges Cybersecurity SMEs Cybersecurity T&E Must Begin Before MS A and Must Be Addressed in MS A TEMP! 25 Overview of the Developmental Evaluation Framework The Developmental Evaluation Framework articulates a logical evaluation strategy that informs decisions: How acquisition, programmatic, technical and operational decisions will be informed by evaluation How system will be evaluated How test and M&S events will provide data for evaluation What resources are required to execute test, conduct evaluation, and inform decisions This information is put into a matrix format. Decisions Define Evaluation Define Define Test Inform Data Execute Resources / Schedule Guidance for the Developmental Evaluation Framework and its inclusion in the TEMP is provided in DAG Chapter 9 or directly from DASD DTE-TRMC. 137

138 Developmental Evaluation Framework
Decisions: Decision points within the program are listed across the top row of the table, with Decision Support Questions (DSQs)* that support the decisions defined directly beneath. Identify the major decision points for which testing and evaluation phases, activities, and events will provide decision-supporting information.
Evaluation: High-level evaluation measures are referenced in the far left columns. The evaluation measures are referenced from the Systems Engineering Plan, PPP, and requirements documentation.
Test: Test events that feed the decision are defined in the cells corresponding to decisions, DSQs, and evaluation measures. Cells contain a description of the data source to be used for evaluation information, for example: 1) a test event or phase (e.g., CDT1); 2) an M&S event or scenario; 3) a description of the data needed to support the decision; 4) another logical data source description.
Resources / Schedule: Resources and schedule are defined in the TEMP, linked to the decisions and test events included in the matrix.
*Decision Support Questions (DSQ): Questions capturing the essence of the information needed to make informed decisions. For example, the decision to move forward with system integration may be informed by DSQs such as: (1) Are the components to be integrated performing as required? (2) Are the basic platform capabilities performing as required?
[Generic DEF matrix: developmental evaluation objectives (functional evaluation areas and system capability categories - performance, interoperability, cybersecurity, reliability) and their technical measures down the left side; decisions and DSQs across the top; cells cite data sources such as DT#1, IT#2, M&S#2, SW development assessments, RMF control assessments, and Blue Team / Red Team vulnerability assessments.]
[Example DEF for a rotorcraft program: decisions / DSQs include RFP Release/MS B (Block 2 ECP content known - are sub-system PDR results satisfactory for inclusion in Block 2?); PQA (bench qualification complete - does sub-system bench qualification indicate readiness to integrate?); Final TRR (integration complete, ready for first flight - does ground test demonstrate the needed capability, is software clearance achieved, is airworthiness certification received, what are the contractor first-flight assessment results?); OTRR (does the PM certify the system is ready to enter OT&E?); and MS C (are the EMD exit criteria satisfied?). Developmental evaluation objectives cover performance (airframe, drive system, electrical system, rotor system, fuel system, CAAS software, DAFCS control system, ACRB rotor blades, LCTA actuators) with technical measures such as max design gross weight, fatigue life, static structure, maneuver load, torque, emergency power (time), efficiency, usage spectrum, capacity, weight reduction, complete/accurate/timely aircraft state, navigation and communications (digital/voice), handling qualities, lift, and flying quality; interoperability (NR KPP); cybersecurity (system/SW assurance, RMF compliance, vulnerability assessment); and reliability (RAM: MTBF, MTBMA, SW reliability). Cells cite data sources such as PDR results, component qualification tests (CQT), ground test, AQS, SIL results, contractor assessments of ground test/AQS/SIL results, government DT flight test, RAM data collection, and DT flight test and LUT results.] 138
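One way to think about the DEF is as a simple lookup from (measure, decision) to the data source that will inform it. A minimal sketch using the notional measure, decision, and event names from the generic matrix above; none of these entries come from a real program.

# Sketch of a DEF held as data: rows are evaluation measures, columns are
# decisions, cells name the data source (test event, M&S run, assessment).
DEF = {
    ("Technical Measure #1",  "Decision #1"): "DT#1",
    ("Technical Measure #1",  "Decision #3"): "M&S#2",
    ("Vul Assess Measure #1", "Decision #2"): "Blue Team",
    ("Vul Assess Measure #2", "Decision #4"): "Red Team",
}

def events_informing(decision):
    """Return {measure: data source} for every cell under one decision."""
    return {measure: source
            for (measure, dec), source in DEF.items() if dec == decision}

print(events_informing("Decision #2"))   # {'Vul Assess Measure #1': 'Blue Team'}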

139 Cybersecurity in the DEF
[Generic DEF matrix repeated from the previous page, with the cybersecurity rows highlighted: SW/System Assurance measures (PPP) evaluated through SW development assessments; RMF compliance measures evaluated through control assessments; and vulnerability assessment / adversarial assessment measures (including interoperable and exploitable vulnerabilities) evaluated through Blue Team and Red Team events, each mapped to the decisions they support.]
Cybersecurity T&E events are included in the Developmental Evaluation Framework (DEF) to support acquisition decisions. The DEF is included in the MS B and MS C TEMPs and ideally is drafted in the MS A TEMP. 29
Operational Evaluation Framework (Example from TEMP Guidebook)
Test Goal: Specific capability and COI being evaluated.
Response Variable: Specific evaluation metrics tied to each mission.
Test Design: Test design for the specific test and variables required.
Resources / Schedule: Resources and schedule are defined in the TEMP, linked to the decisions and test events included in the matrix.

140 Notional Cybersecurity in the OEF (Example adapted from TEMP Guidebook) Specific Cybersecurity mission oriented measures should be planned and included in the Operational approach and by MS B in the OEF. 31 Cybersecurity T&E Penetration T&E Can the system withstand attempts to circumvent system security? Tests may include penetration attempts from inside & outside the organization The National Security Agency (NSA) has a support contractor specifically responsible for accomplishing penetration testing Though Penetration testing may result in destruction to the system under test, it is NOT the intent during DT&E to default to this practice. Penetration testing may be used to better understand the risk to the system under test. Security T&E Validates the correct implementation of audit, identification & authentication, access controls, object reuse, trusted recovery, and network connection rule compliance Test plans & procedures should address all security requirements / features; and should assess amount of residual risk May Require Classified Annex to your TEMP

141 Types of Cybersecurity Testing
Systems Integration Lab (SIL) - conducted by the Program Office. Supports the IATT or ATO; predecessor to other security testing.
TEMPEST Testing - conducted by the Program Office. Goal is to detect & prevent the compromise of information through emissions such as radio or electrical signals, sounds, or vibrations. Compromising emissions can be minimized by distance, shielding, or other features.
Anti-Tamper / Tamper Detect Tests - conducted by the Services. Will the anti-tamper features function if the system falls into enemy hands? DAU online course CLE022 provides more detail.
Types of Cybersecurity Testing (cont.)
Type 1 Encryption - conducted by NSA. Type 1 = highest protection level. Tests that plaintext is correctly converted to cipher text with an algorithm/key.
Cross Domain Testing - conducted by DISA. Required when multiple classification levels exist within a system.
Penetration Testing - conducted by an NSA subcontractor. Tests whether security features can be circumvented. 141
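The Type 1 test objective above ("plaintext correctly converted to cipher text with an algorithm/key") boils down to a round-trip property check. The sketch below uses a throwaway XOR cipher purely as a stand-in, since real Type 1 algorithms are NSA-certified and cannot be shown here; the check pattern, not the cipher, is the point.

# Toy illustration of an encrypt/decrypt round-trip check. The XOR "cipher"
# is a placeholder only and provides no real protection.
def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

toy_decrypt = toy_encrypt   # XOR with the same key is its own inverse

message, key = b"FLASH TRAFFIC", b"\x5a\xc3\x17"
ciphertext = toy_encrypt(message, key)

assert ciphertext != message                     # plaintext actually transformed
assert toy_decrypt(ciphertext, key) == message   # round-trip recovers plaintext
print("round-trip check passed")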

142 Introduction to Cyber Ranges Cyber Ranges provide operationally relevant, reasonably realistic, cybercontested / cyber-safe environments to support T&E of systems, segments and platforms with minimal risk to operational networks. Adequate DT&E, OT&E and assessments may require testing on cyber ranges due to one or more of the following reasons: Testing cannot or should not occur on open operational networks (e.g., testing related to offensive cyber operations) Advanced cyber adversarial tactics, techniques, and procedures cannot be realistically represented Scaling of the operational environment (number of users or hosts, and/or amount of network traffic) cannot be realistically achieved for test purposes The level of operational complexity and associated mission risk are such that impact to operational networks should be avoided Using Cyber Ranges The program office should work with the Lead DT&E Organization to: Identify testing that will occur on a cyber range Identify cyber events that should be integrated with DT&E, OT&E and assessment activities Support development of linkages between the cyber range and developmental and operational networks Plan for integration of Blue Team & Red Team emulations on the cyber range Coordinate with cyber range staffs to ensure they understand the test objectives and the planned test scenarios Coordinate with intel community support to ensure adversarial threats and targets emulated on the range are realistic and representative Verify that Blue targets and Blue offensive capabilities emulated on the 36 range are realistic and representative 142

143 Range Capabilities DoD Cybersecurity Range National Cyber Range (NCR) Joint IO Range (JIOR) C4 Assessment Division DOD Cyber Range Resources HQMC C4 in the role of Executive Agent and Service Sponsor in partnership with DISA PEO-MA and OUSD has enabled the DoD Cybersecurity Range operationally realistic Tier 1 environment to support the training, exercise and test and evaluation communities. The Cybersecurity Range provides a generic DoD Tier I, Tier II, and Tier III capability. The CC/S/A s with their individual cyber environments can connect into the Cybersecurity Range through the Information Operations (IO Range) or via Virtual Private Network (VPN) over Internet and Defense Research Engineering Network (DREN). Cyber testing and cyber training asset (facility, tools, trained staff). Capabilities include: Security architecture that enables a common infrastructure to be partitioned into MILS and leverage real malware Toolkit that automates the lengthy process of creating high fidelity test environments Accessible remotely via the Joint IO Range. Unique combination of expertise in cyber domain, cyber testing, cyber range management and cyber test tools Closed-loop, secure, distributed network that forms a realistic and relevant live-fire cyberspace environment supporting CCMD, Service, Agency and Test Community training, testing, and experimentation across the Information Operations and Cyberspace mission areas. JIOR meets CCJO intent and provides a critical Joint Force cyberspace training and testing environment. It is the only live-fire Range supporting Cyberspace and IO related objectives in the Joint Training Enterprise. Conduct assessments of existing and emerging C4 capabilities in a persistent C4 environment to achieve interoperable and integrated solutions that satisfy joint operational requirements. Replicates Joint Warfighter C4 systems and addresses the interoperability of those systems. Lesson Topics: 1) The Cybersecurity T&E Process 2) The RMF Process 3) Cybersecurity T&E 4) Integrated Architectures 5) Backup Slides Integrated Architectures and Cybersecurity 37 -Interoperability and effective management of security content will be achieved through adherence to DoD cybersecurity architectures as issued. All DoD Components must commit to these architectures to facilitate sharing of information necessary to achieve mission success, while managing the risk inherent in interconnecting systems. DoDI , Encl. 3, par 4e

144 Integrated Architectures Architecture: structure of components, their relationships, and principles & guidelines governing design and evolution over time. IT Architecture for computer systems includes hardware and software components, interfaces, and their execution concept. Why Architecture? Provides a level of risk reduction in identifying shortfalls, capacities, interoperability gaps/enhancements Provides the top level vision that guides the engineering effort that makes it all work Provides a degree of stability in a changing environment Enables cybersecurity and effective management of security content 39 Integrated architectures are mandatory (CJCSI F) Three Attributes of the NR-KPP 144

145 Supporting NR-KPP Architecture DATA - Yikes!
[Matrix mapping JCIDS and related documents (DCR, CONOPS, ICD, CDD, CPD, and IC documents) to the DoDAF views each must provide: AV-1, AV-2, CV-1 through CV-6, DIV-1, DIV-2 (OV-7), DIV-3 (SV-11), OV-1, OV-2, OV-3, OV-4, OV-5a/b, OV-6a/c, PV-2, SV-1/SvcV-1, SV-2/SvcV-2, SV-4/SvcV-4, SV-5a/SvcV-5, SV-6/SvcV-6, SV-7/SvcV-7, SvcV-10a/b/c, StdV-1 (TV-1), and StdV-2 (TV-2). Cells are marked X (required), O (optional), or R (recommended); CDDs, CPDs, and IC documents require most views, while DCRs, CONOPS, and ICDs require or recommend a smaller set.]
Legend: X = Required; O = Optional; R = Recommended.
Notes: The PM needs to check with their Component for any additional architectural/regulatory requirements for CDDs and CPDs (e.g., HQDA requires the SV-10c, USMC requires the SV-3, IC requires the SvcV-10a and SvcV-8). The AV-1 must be registered and must be public and released at the lowest classification level possible in DARS for compliance. The technical portions of the StdV-1 and StdV-2 are built using GTG-F DISR standards profiling resources and, within six months of submitting JCIDS documentation, must be current and published for compliance; use of non-mandated DISR standards in the StdV-1 must be approved by the PM or other duly designated Component cognizant official and documented by a waiver notification provided to the DoD CIO. Intelligence Community (IC) requirements are IAW the IC Enterprise Architecture Program Architecture Guide and development phase, which clarifies the IC Policy Guidance for Acquisition. Some entries apply to Service Views (SvcV) only.
Responsibilities: (1) The Sponsor* and the Program are jointly responsible for the AV-1, AV-2, CV-1, CV-2, CV-3, CV-4, CV-5, CV-6, and SV-6 or SvcV-6. (2) The Sponsor* is responsible for the development of the architecture data for the OV-1, OV-2, OV-4, OV-5a, OV-6c, DIV-2, and the SV-6 or SvcV-6. (3) The Program is responsible for the development of the architecture data for the DIV-1, DIV-3, OV-3, OV-5b, OV-6a, PV-2, SV-1 or SvcV-1, SV-2 or SvcV-2, SV-4 or SvcV-4, SV-5a or SvcV-5, SvcV-10a, SvcV-10b, SvcV-10c, StdV-1, and StdV-2. *Operational user (or representative). The NR-KPP Measures data is captured in the SV-7 or the SvcV-7.
DoDAF v2.0 - Eight Viewpoints (8 Collections of Views, 52 Views)

146 DoDAF Integrated Solution Architecture Development Flow (Notional) Overview Summary AV-1 OV-1 High-Level Operational Concept Operational Event Trace Diagram START OV-6 System Event/Trace Diagram OV-4 SV-10c Organizational Relationships Operational Activities/Tasks SV-4 OV-5 System Functions System Measures Matrix Operational Node Connectivity Needlines (Resource Exchanges) SV-7 Activities to Functions Map SV-5 Links Needlines to System Interfaces SV-2 OV-2 SV-1 System Logical Interface Diagram Operational information Exchange Matrix System Physical Interface Diagram DIV-2 OV-3 Logical Data Model System Data Exchange Matrix StdV-1 SV-6 Integrated Dictionary Standards Profile OV-2 &OV-5 used to develop OV-3 DIV-3 ASR SRR SFR PDR Initial Drafts Complete StdV-2 AV-2 Physical Data Model SV-1 & OV-2 used to develop SV-6 Standards Forecast

147 Impacts of IT Architectures on T&E An architecture-based approach can allow for easier testing & computer based testing, because protocols & interfaces are defined Testing can occur earlier, and at a higher level, by examining the architecture itself. Potential performance problems can often be detected in the architecture, before coding begins. The architecture can facilitate or impede cybersecurity, software coding, data representation, functionality, performance, and reusability The DoD Architecture Framework (DODAF) promotes interoperability throughout the DoD, & also provides a framework for interoperability testing. 45 Summary Cybersecurity T&E Process activities begin early, e.g., pre- Milestone A, and continue throughout the Acq. Lifecycle Cybersecurity T&E process should exercise critical operation missions in a relevant cyber threat environment The Test and Evaluation Master Plan (TEMP) details: How testing will provide the information needed to assess cybersecurity and Inform Systems Engineering, Risk Management and Acquisition Decisions. Test activities integrate RMF security controls assessments and tests of commonly exploited and emerging vulnerabilities Cybersecurity T&E Process represents a shift left
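Because an architecture defines the protocols and interfaces up front, interface conformance checks can be automated early, before full system integration. A minimal sketch, assuming a notional data-exchange specification; the field names and types are invented for illustration, not drawn from any DoDAF product.

# Sketch of computer-based interface testing enabled by a defined architecture:
# validate a message against the fields and types the (assumed) spec calls out.
INTERFACE_SPEC = {            # notional data-exchange definition
    "msg_id":    int,
    "timestamp": float,
    "latitude":  float,
    "longitude": float,
    "track_id":  str,
}

def validate_message(msg: dict):
    errors = []
    for field, expected_type in INTERFACE_SPEC.items():
        if field not in msg:
            errors.append(f"missing field: {field}")
        elif not isinstance(msg[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors

print(validate_message({"msg_id": 7, "timestamp": 120.5,
                        "latitude": 36.8, "longitude": -76.3}))
# ['missing field: track_id']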

148 Backup Slides Lesson Topics: 1) The Cybersecurity T&E Process 2) The RMF Process 3) Cybersecurity T&E 4) Integrated Architectures 5) Backup Slides Note: Most of the backup slides deal with the Cybersecurity T&E Process 47 Cybersecurity T&E is a Collaborative, Integrated Process Cybersecurity-related information and actions should be integrated throughout the acquisition lifecycle. Integration and collaboration: Includes relationships between all communities to align artifacts and activities within acquisition milestones and events. Involves collaboration with Users, PM, Systems Engineers, Security Controls Assessors. Begins prior to MS A in conjunction with existing Engineering and Systems Security Engineering. Is executed in an incremental and iterative manner to verify security requirements, expose vulnerabilities and evaluate resilience to cyber attacks

149 T&E Community Must Understand Cybersecurity Requirements Specified Requirements Requirements clearly identified in program documentation, e.g., ICDs/CDDs, CONOPs, Product Specifications PPP Requirements mandated by DoD regulations, such as DIACAP IA or RMF Controls Essential Requirements Define capabilities that must be achieved to provide sufficient resilience to support mission accomplishment in the presence of cyber attack Objective of Step 3 is analysis of potential kill chain activities to identify essential cybersecurity requirements necessary to improve resilience of the operational system to cyber attack. Implied Requirements AKA Derived Requirements Requirements driven by operational capabilities Requirements driven by acquisition approach and/or technology choices, e.g. COTS/GOTS Technical requirements that enable the capabilities defined in CONOPS, etc. Includes the Cyber Threat environment Objective of Step 2 is to characterize the cyber attack surface to identify the additional implied cyber requirements. T&E Community collaborates with stakeholders to confirm testability, identify needed test resources and plan T&E Events! 49 Simple Example: Analyses of Automotive Attack Surfaces Modern automobiles are pervasively computerized Engine, Transmission, Body, Airbag, Antilock Brakes, HVAC, Keyless Entry Control, etc. Attack surface is extensive Telematics: Blue Tooth, Cellular, Wi-Fi, Keyless Entry Attack Surface is easily exploited OBD Diagnostics, CD players, Bluetooth Cellular radio/ Wi-Fi allow Long distance vehicle control, location tracking, in-cabin audio exfiltration Aug 2011: Comprehensive Experimental Analyses of Automotive Attack Surfaces Source: University of California, San Diego, University of Washington We protect our similar military Platform IT systems using appropriate Cybersecurity measures 149

150 Urban Assault Vehicle Early System Concept Example Phase 1: Understanding Cybersecurity Requirements/Develop T&E Approach Plan Cybersecurity T&E to Engage with SE Team Early Engage with SE/SSE Activities/Processes Requirements Reviews, Contracting, SETRs etc. Plan Verification DT&E to close Attack Surface Conduct Kill Chain Vulnerability Assessments (Blue Team and Red Team) to evaluate mission performance Verify Production Readiness at MS C OT&E post MS C Example Requirements Resources CONOPS Capabilities Documents Information Support Plan Systems Requirements Documents Program Protection Plan Cybersecurity Strategy RMF Packages Contract Specs/Technical Requirements Documents MITRE: Cyber Attack Lifecycle Example Phase 2: Characterize the Attack Surface Stakeholders Identify Vehicle Attack Surface 1. Vehicle to Vehicle Comms 2. Telematics 3. Keyless Entry 4. OBD II 5. Radio 6. Anti Theft Urban Assault Vehicle Attack Surface Refine T&E Strategy to Understand All systems interfaces Likelihood of attack? What happens if/when exploited? Approach to close/mitigate vulnerabilities Adequacy of Cybersecurity T&E Approach Aug 2011: Comprehensive Experimental Analyses of Automotive Attack Surfaces Source: University of California, San Diego, University of Washington 150

151 Example Phase 3: Vulnerability Identification Vehicle Attack Surface 1. Deny Vehicle/Vehicle Comms 2. Intercept Telematics 3. Clone Keyless Entry 4. Corrupt OBD-II 5. Monitor Radio 6. Disable Anti-Theft T&E Activities Verify/Exercise Critical Missions Cooperative Kill Chain Vulnerability Assessments (Blue Team) ID potential exploits, exposed vulnerabilities/mission impact Urban Assault Vehicle Attack Surface Aug 2011: Comprehensive Experimental Analyses of Automotive Attack Surfaces Source: University of California, San Diego, University of Washington PPP Criticality Analysis Vehicle SV-6 Systems Data Exchange Requirements Cyber Attack Lifecycle Example Phase 4: Adversarial Cybersecurity DT&E Exercise Critical Missions 1. Tx/RX Vehicle/Vehicle Comms 2. Cellular Phone Calls 3. Use Keyless Entry 4. Upload/Download OBD II Data 5. Tune Radio 6. Anti Theft Urban Assault Vehicle Autobahn Mission Simulated/Lab Environment/Cyber Range T&E Actions Verify/Exercise Critical Missions Adversarial Kill Chain Vulnerability Assessments (Red Team) ID exposed vulnerabilities/mission impact Develop DT&E Assessment EMD Article 151

152 Example Phase 5: Vulnerability and Penetration Assessment Exercise Critical Missions 1. Tx/RX Vehicle/Vehicle Comms 2. Cellular Phone Calls 3. Use Keyless Entry 4. Upload/Download OBD II Data 5. Tune Radio 6. Anti Theft 7. Bullet proof windows, Run flat tires Urban Assault Vehicle Autobahn Mission Operational Environment & Cyber Range & Blue Team T&E Activities Establish Representative Cyber Environment with Threats and Users Conduct Vulnerability Assessment (Blue Team) Evaluate Test Data Determine readiness for OT&E LRIP/Production Article Example Phase 6: Adversarial Assessment Exercise Critical Missions 1. Tx/RX Vehicle/Vehicle Comms 2. Cellular Phone Calls 3. Use Keyless Entry 4. Upload/Download OBD II Data 5. Tune Radio 6. Anti Theft 7. Bullet proof windows, run flat tires T&E Activities Establish Representative Cyber Environment with Threats and Users Conduct assessment using representative threat (Red Team) Understand Mission Impacts Evaluate Test Data Produce OT&E Assessment Urban Assault Vehicle Autobahn Mission Operational Environment & Red Team LRIP/Production Article

153 Software, Interoperability, & Cybersecurity Exercise Lesson 5.3 Software, Interoperability, & Cybersecurity Exercise 153

154 THIS PAGE INTENTIONALLY LEFT BLANK 154

155 Software, Interoperability, and Cybersecurity Exercise MDD CDD Validation Dev. RFP Release A B C FRP IOC FOC Materiel Solution Analysis Tech Maturation & Risk Reduction Engineering and Manufacturing Development Production & Deployment Operations & Support Draft MS A TEMP YOU ARE HERE 1 Milestone A Test and Evaluation Master Plan (TEMP) TEMP is first due prior to the Milestone A decision (updated at MS B and MS C) TEMP Guide 3.0 explains required content. TEMP development requires early involvement of testers, evaluators, and others Establishes early consensus among T&E WIPT member organizations For programs on the OSD T&E oversight list, TEMP is approved by DOT&E and DASD(DT&E) For programs not on the OSD T&E Oversight List, the CAE, or designated representative, approves the TEMP 155

156 Milestone A TEMP MS A TEMP should address: Identification & management of technology risk Evaluation of system design concepts against preliminary mission and sustainment requirements resulting from the AoA Risk reduction and competitive prototyping Early demonstration of technologies in relevant environments Development of an integrated test approach including an initial evaluation framework (include Cybersecurity activities). T&E concepts throughout the program life cycle Software, Interoperability, and Cybersecurity Exercise Read the provided JECSS Milestone A TEMP. Then based on the JECSS TEMP, answer the questions assigned to your team. Note: This TEMP is based on a real program. Names and other details have been changed. 156

157 Backup Slides on Interoperability Test Methods Interoperability Test Methods - Networks Networks Tie together platforms, simulators, etc. Allows test articles (at geographically dispersed locations) to interact over the network Often spoof the test articles into believing they are all playing on a common mapsheet Can incorporate man-in-loop, HWIL, operator training Can test interoperability between real & simulated platforms Distributed testing is more cost efficient than bringing all test assets to a single location Example - Defense Research and Engineering Network (DREN) Robust, high-capacity, low-latency network DoD, academia, industry virtually any CONUS location, Alaska, Hawaii, selected OCONUS sites Dedicated internet-like connectivity. Provides digital, imaging, video, audio data transfer. Allows encryption ELO

158 Interoperability Test Methods - Exercises Annual Defense Interoperability Communication Exercise (JITC, Services, FEMA, Coast Guard, Allies) Interoperability testing, training, assess transformation initiatives / emerging technologies Replicated geographically dispersed joint task force environment (network), representative of real-world operations DICE05 had 33 test events, including 13 certification tests, 10 assessments, 6 demonstrations, 4 warfighter-support tasks has lessons learned Trident Warrior / FORCEnet Annual C2 exercise run by Naval Network Warfare Command Net-centric strike group across U.S. and coalition forces - Navy, Marine Corps, Allied participants Many other testing & training exercises Often cost effective (many different platforms assembled) ELO 6.2 Interoperability Test Methods Testbeds & Simulations Testbed A system representation consisting of part actual hardware and/or software - and part computer models or prototype hardware/software. A simulated platform upon which experimental tools and products may be deployed and allowed to interact in real-time. Example: Testbeds are often used to examine information exchanges between systems - Correct data format, message send & receive, etc. Simulation A method for implementing a model (a model is a representation of the real world). The process of conducting experiments to understand system behavior (modeled under selected conditions), or of evaluating various strategies for system operation. Simulation may include the use of analog or digital devices, laboratory models, or testbeds Example: PLGR Emulator provides simulated PLGR (GPS location) signal to Army platforms or systems ELO

159 Student Questions for the Software, Interoperability, and Cybersecurity Exercise: 1. T&E Strategy a. Discuss the strengths and weaknesses of the JECSS T&E strategy. What recommendations do you have, concerning the T&E strategy? b. Is the T&E schedule realistic? Will test objectives likely be met within the schedule? Why or why not? Consider all the various types of testing as part of your answer. c. Does the JECSS have a good balance among T&E cost, schedule, and performance requirements? Why or why not? 2. Integration a. Describe the integration approaches (system, subsystem, and component level) as specified in the JECSS TEMP. What are the integration challenges for this program? b. In your opinion, is the schedule for achieving integration realistic? (Are the integration objectives likely to be met, within the allotted amount of time? Why or why not?) c. What improvements would you recommend concerning the integration approach? Provide your recommendations. 3. Interoperability a. Discuss what the JECSS will connect to/with. What is included in the JECSS System of Systems? b. What are the interoperability challenges for the JECSS? Is the interoperability effort likely to be easy or hard for the JECSS program? Explain and/or justify your answer. c. In your opinion, is the schedule for achieving interoperability realistic? (Are the interoperability objectives likely to be met, within the allotted amount of time? Why or why not?) 4. Cybersecurity a. Is the Cybersecurity effort likely to be easy or hard for the JECSS program? Explain and/or justify your answer. b. Identify the cybersecurity developmental and operational test events that must be included to bring the program into compliance with DT and OT policy and guidance. c. If the JECSS program were to connect to Coalition/Allied systems What Cybersecurity challenges or issues would exist? 5. Software a. List and discuss the engineering challenges and issues for a joint program, compared to a single service program. Which of these challenges/issues might impact, or apply to the software effort? b. List and discuss the challenges and issues in developing a test and evaluation strategy for a joint program, as compared to a single service program. Which of these challenges/issues might impact, or apply to the software effort? c. What impacts are likely to arise (or may arise) because of the use of COTS software? Consider impacts to the JECSS program, the T&E strategy, and/or any other impacts. 159

160 This Page is Intentionally Blank 160

161 Lesson 6.1 Test & Evaluation Master Plan (TEMP) Development TEMP Development 161

162 THIS PAGE INTENTIONALLY LEFT BLANK 162

163 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 6 Lesson Titles TEMP Development, TEMP Requirements, T&E Resources Lesson Time 8.5 hours (Includes two exercises) Terminal Learning Objective Given a scenario and DoD policy, the student will develop Test and Evaluation Master Plan (TEMP) content. (TLO #5) Enabling Learning Objectives Given a scenario, assess whether the capability requirements are well defined, can be measured and/or assessed during testing, and are relevant to the operational mission. (ELO #5.1) Given a scenario, assess whether planned tests support the test objectives/system requirements; and whether data collected will support established effectiveness, suitability, and survivability metrics. (ELO #5.2) Recognize T&E s role during development of requirements documents (Initial Capabilities Document, Capability Development Document, Capability Production Document, System Threat Assessment Report, and Operational Mode Summary/Mission Profile). (ELO #5.3) Given DoD policy, critique a TEMP and develop required content to support a system's technical requirements and acquisition strategy; and common DoD policies, practices, and procedures. (ELO #5.4) Given a TEMP, determine necessary resources and T&E infrastructure requirements and shortfalls (people/knowledge, funding, facilities/ranges, instrumentation and associated support, software systems integration labs, and modeling and simulation). (ELO #5.5) Identify organizations with roles and responsibilities in providing for, or overseeing the T&E strategy and TEMP. (ELO #5.6) Recognize where environmental, interoperability, cybersecurity, and mission level testing should fit into system development. (ELO #5.7) 163

164 TST 204 Intermediate Test & Evaluation Enabling Learning Objectives (Continued) Given a TEMP, construct a Developmental Evaluation Framework Matrix, and discuss how collected data supports the evaluation framework. (ELO #5.8) Recognize processes for evaluating a system's survivability and lethality. (ELO #5.9) Discriminate between LFT&E of lethality and vulnerability, and OTA evaluation of survivability. (ELO #5.10) Given a scenario, develop Critical Technical Parameters (CTPs), Measures of Effectiveness (MOEs), Measures of Suitability (MOSs), and data requirements to support assessment and evaluation of system performance requirements, Key Performance Parameters (KPPs), Key System Attributes (KSAs), and Critical Operational Issues (COIs). (ELO #5.11) Learning Method Expository discussion and two group exercises. Assignments 1. Students read the SPAW TEMP as a homework assignment the evening before the exercises. 2. Students complete a written graded assignment (after the exercises) on parts III and IV of the MS B TEMP. Estimated Student Preparation Time 90 minutes Method of Assessment Written examination, written graded assignment, and group presentations. References DoDD , The Defense Acquisition System DoDI , Operation of the Defense Acquisition System Defense Acquisition Guidebook, Chapter 9 TST 204 Student Reference Disc Test and Evaluation Management Guide, 6 th ed., 2012 version, chapters 12, 13,

165 Test and Evaluation Master Plan (TEMP) Development Test & Evaluation Master Plan Documents the overall structure & objectives of the entire T&E program: Developmental Test and Evaluation (DT&E), Operational Test and Evaluation (OT&E), and Live Fire Test and Evaluation (LFT&E) Provides a framework to develop detailed T&E plans Documents T&E schedule & resource requirements Considered a contract among the PM, OSD, and T&E activities Note: The PM and Test WIPT should use the DOT&E TEMP Guidebook 3.0 for format and content as guidance in formulating T&E plans

166 Importance of the TEMP The most important part of TEMP planning is the logical thinking that leads up to what testing is needed. When done properly, a TEMP should ensure that: All planned tests are actually required All test data collected are used for something (no waste) Data is collected in the most cost effective method (e.g., M&S vs. open-air test) Enough data is collected so that if there is a failure, causes and fixes can be determined Accurate T&E Funding estimates are input into the POM 3 Operational Mode Summary/ Mission Profile (OMS/MP) Prior to the completion of Materiel Solution Analysis phase, the DoD Component combat developer will prepare an Operational Mode Summary/Mission Profile (OMS/MP) document. OMS/MP will include the operational tasks, events, durations, frequency, operating conditions & environment in which the recommended materiel solution (the system) is to perform each mission and each phase of a mission. OMS/MP will be provided to the Program Manager, and will inform development of the plans for the next phase including: acquisition strategy, test planning, and capability requirements trades. Paraphrased from DoDI , Par. 5d(2)(b)

167 Test and Evaluation Master Plan (TEMP) Evolution TEMP is first due prior to the Milestone A decision (updated at MS B and MS C) TEMP Guidebook lists required information & suggested format TEMP development requires early involvement of testers, evaluators, and others Establishes early consensus among T&E WIPT member organizations For programs on the OSD AT&L Engagement List for DT&E or DOT&E s oversight list for either OT&E or LFT&E, TEMP is reviewed by DOT&E and DASD(DT&E) For programs not on the oversight, the CAE, or designated representative (usually the MDA), approves the TEMP Operational Mode Summary/ Mission Profile (OMS/MP) Prior to the completion of Materiel Solution Analysis phase, the DoD Component combat developer will prepare an Operational Mode Summary/Mission Profile (OMS/MP) document. OMS/MP will include the operational tasks, events, durations, frequency, operating conditions & environment in which the recommended materiel solution (the system) is to perform each mission and each phase of a mission. OMS/MP will be provided to the Program Manager, and will inform development of the plans for the next phase including: acquisition strategy, test planning, and capability requirements trades. Paraphrased from DoDI , Par. 5d(2)(b)

168 Evaluation Methodology and Evaluation Overview Starting at MS A, the PM will... describe a (developmental) evaluation methodology in the TEMP that will: Provide essential information on programmatic & technical risks Provide information for major programmatic decisions Starting at Milestone B, the evaluation methodology will include a (developmental) evaluation framework More information on the Developmental Evaluation Framework (DEF) will be presented in the upcoming TEMP Inputs lesson Starting at Milestone B, every TEMP will include an (operational) evaluation overview. More information on the Operational Evaluation Framework Matrix (OEFM) will be presented in the upcoming TEMP Inputs lesson Paraphrased from DoDI Major Test Phases & Events Starting at Milestone A, the TEMP should document T&E for acquisition phase completion (major test events required for milestone exit and entrance criteria). Each major test phase or event should have test entrance and test completion criteria. Each major test phase or event should have a synopsis of the intended analysis. Synopsis should indicate how the required data for test completion will contribute to one or more standard measures of program progress (COIs, KPPs, CTPs, KSAs) Paraphrased from DoDI , Encl 4, Par

169 Table of Independent Variables Every TEMP will include a table of independent variables (or conditions, parameters, factors, etc.) that may have a significant effect on operational performance. Starting at MS B, the updated table of variables will include: anticipated effects on operational performance the range of applicable values (or levels, settings, etc.) the overall priority of understanding the effects of the variable the intended method of controlling the variable during test (uncontrolled variation, hold constant, or controlled systematic test design) Paraphrased from DoDI , Encl 4, Par. 5 9 Milestone A TEMP Content The MS A TEMP will include sufficient information to describe in detail the T&E approach for execution during the TMRR Phase. The TEMP should include, at a minimum, the following information: For additional information, see the TEMP guides (posted on DASD(DT&E) and DOT&E websites). Description of the evaluation methodology that provides essential information on programmatic, technical risks, and major programmatic decisions. Documentation of the T&E for phase completion, that includes major test events required for milestone exit and entrance criteria. Description, within each test phase or event, of the overview of the intended analysis that includes: COIs, KPPs, KSAs, and CTPs. Inclusion of a table of independent variables that may have significant effect on operational performance. Components rationale for the requirements in the draft CDD. Documentation of the strategy and resources for cybersecurity T&E
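The table of independent variables described above is, in effect, the factor list for a designed experiment. The sketch below is purely illustrative - the factor names, levels, and control methods are invented for the example and do not come from any program TEMP - and it simply enumerates the full-factorial set of test conditions; a real T&E WIPT would normally apply a scientific test and analysis technique to prioritize factors and reduce the run count.

    from itertools import product

    # Hypothetical independent variables (factors) and their levels -- illustrative only,
    # not drawn from an actual TEMP.
    factors = {
        "threat_type":   ["ballistic", "cruise"],      # controlled via systematic test design
        "clutter":       ["low", "high"],               # controlled via systematic test design
        "crew_training": ["baseline", "refresher"],     # could be held constant or varied by phase
    }

    # Enumerate the full-factorial set of test conditions.
    runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]

    for i, run in enumerate(runs, start=1):
        print(f"Run {i}: {run}")

    print(f"Total runs (full factorial): {len(runs)}")  # 2 x 2 x 2 = 8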

170 Milestone A TEMP Content (cont.)
For software acquisitions, the lead OTA will conduct an analysis of operational risk to mission accomplishment covering all planned capabilities or features in the system. This analysis will include commercial and non-developmental items. Initial analysis will be documented in the TEMP.
Identification of the resources required to execute planned T&E activities.
Documentation of the test infrastructure, tools, and VV&A strategy.
Documentation of the T&E program & master schedule for major T&E events.
Description of the interoperability assessment and resources.
For MDAPs and MAIS, identification of the Chief Developmental Tester and Lead DT&E Organization.
Description of the initial understanding of all T&E.
Identification of the plan for evaluating prototypes, technology, etc.
Description of the general approach supporting engineering activities, certifications, and system evaluations.
Discussion of the T&E implications of the CONOPS, including test resource implications.
The TEMP must include T&E activities to demonstrate the maturity of the critical technologies.
Technology Readiness Levels (TRL):
TRL 9 - Actual system proven through successful mission operations
TRL 8 - Actual system completed and qualified through test and demonstration
TRL 7 - System prototype demonstration in an operational environment
TRL 6 - System/subsystem model or prototype demonstration in a relevant environment
TRL 5 - Component and/or breadboard validation in a relevant environment
TRL 4 - Component and/or breadboard validation in a laboratory environment
TRL 3 - Analytical and experimental critical function and/or characteristic proof-of-concept
TRL 2 - Formulation of technology concept or application
TRL 1 - Basic principles observed and reported

171 TEMP Evolution
[TEMP Evolution chart: CDD, T&E WIPT, Critical Operational Issues, MS-B TEMP at Milestone B]
Prior to MS-C the TEMP must be updated based on the CPD to focus on remaining LFT&E and OT&E
TEMP Format
Part I - Introduction
Part II - Test Program Management and Schedule
Part III - Test and Evaluation Strategy
Part IV - Resource Summary

172 TEMP Part I - Introduction 1.1 Purpose 1.2 Mission Description 1.3 System Description Program Background Key Interfaces Key Capabilities System Threat Assessment Systems Engineering (SE) Requirements Special Test or Certification Requirements Previous Testing 15 The Evolving Threat Threat assessment (TA) docs are typically only valid for a few years Threats can evolve rapidly (for example, the changing IED threat) TEMP, test scenarios, procedures, etc. may NOT be based on the most recent TA As threats change, Follow-On Operational Test and Evaluation (FOT&E) should be considered to assess current mission performance & inform operational users New threat or target resources may be needed (current resources may not be threat representative) System may not be designed against the most recent threats The user probably wants T&E for any significant threats

173 Threat Equipment & Simulators As much as practical, actual threat systems should be used as targets or simulators during testing Where actual threat systems are not available, VV&A threat simulators should be used Sources of threat equipment: Automated Joint Threat Systems Handbook Published by the Threat Systems Office (DOT&E) accessible via the SIPRNET An information retrieval database Lists threat simulators, facilities, targets, M&S, ranges, and foreign assets 17 Threat Simulator Validation & Accreditation Threat representations for use in OT (targets, threat simulators, M&S) require validation & accreditation, unless coordinated with DOT&E (DAG para ) Threat representations for use in DT generally require validation & accreditation (see Service regs. for guidance) Validation of the threat representation: Establishes & documents a baseline comparison with the associated threat (based on current, DIA-approved threat data) Determines the extent of operational & technical performance differences between the two Typically conducted by the DoD component responsible for the threat representation Results are documented by a validation report

174 Threat Simulator Validation & Accreditation (cont.) Accreditation of the threat representation, for use in a specific test: Conducted by the organization in charge of the test (OTA, PM/PEO, etc.) Looks at the validation results to determine differences between the threat & threat representation (if any), and to determine the impact on the test Is the threat representation adequate & suitable for a specific test? Note: if representative threat representation cannot be obtained, obtain any necessary waivers and report this as a test limitation 19 TEMP Part II - Test Program Management and Schedule 2.1 T&E Management T&E Organizational Construct 2.2 Common T&E Database Requirements 2.3 Deficiency Reporting 2.4 TEMP Updates 2.5 Integrated Test Program Schedule Figure Integrated Test Program Schedule

175 User Involvement In T&E Benefits: Increased operational realism Greater chance of finding system shortcomings Improved test design & execution Helps ensure user needs are represented in the development of the system Relatively low cost (TDY/travel only, for active duty military) Risks: Difficult to obtain More/expensive training may be required Here today, gone tomorrow Note: User involvement is desired for DT, but required for OT 21 Contractor Involvement in T&E DT&E - minimal restrictions concerning contractor involvement The program should identify each developmental test phase or major developmental test event in the TEMP as contractor or government DT&E Contractor involvement within DT&E should be identified in both the program RFP and TEMP OT&E - Title 10, U.S.C. places restrictions on the use of contractors in support of IOT&E Contractors may only participate in IOT&E of major defense acquisition programs to the extent they will participate when the system is deployed in combat These limitations don't apply to DoD support contractors

176 OTA Interactions With Other Agents OTA interactions with the program office and other T&E organizations: T&E WIPT often includes representatives from each organization involved with the test program, including PM & OTA OTA rep handles all items related to OT&E OTA can give input to other organizations via the WIPT DOT&E TEMP approval, OT results, etc., for programs on the OT or LFT&E Oversight List See DoDI , Encl. 5 for more information With other OTAs MOA on Multi-Service Testing (On Student Disc) 23 TEMP Part III T&E Strategy 3.1 T&E Strategy Decision Support Key 3.2 Developmental Evaluation Approach Developmental Evaluation Framework Test Methodology Modeling and Simulation (M&S) Test Limitations and Risks 3.3 Developmental Test Approach Mission-Oriented Approach Developmental Test Events and Objectives 3.4 Certification for OT&E 3.5 Operational Evaluation Approach Operational Test Events and Objectives Operational Evaluation Framework Modeling and Simulation Test Limitations 3.6 Live Fire Evaluation Approach Live Fire Test Objectives Modeling and Simulation Test Limitations 3.7 Other Certifications 3.8 Future Test and Evaluation

177 Developmental Evaluation Framework Matrix (DEF) Starting at MS B, TEMP Part III will include a Developmental Evaluation Framework that: Identifies key data (that contributes to assessing progress on) KPPs, KSAs, CTPs, DT objectives, interoperability and cybersecurity requirements, reliability growth, maintainability attributes, and others (as needed) Shows the correlation/mapping between test events, key resources, and decisions supported The DEF will support an MS B assessment of planning, schedule, and resources; and an MS C assessment of performance, reliability, interoperability, and cybersecurity Paraphrased from DoDI Encl 4 par 5a(11) 25 DEF Content & Format DEF entries are requirements grouped into 4 critical evaluation areas (Performance, Reliability, Interoperability, Cybersecurity) Each functional evaluation area should list the significant decision points supported (major milestones and other program-unique decision points) A Developmental Evaluation Framework will include elements (columns, rows or cells) bearing essential information A sample DEF (the Time-Phased NEW Radar Example) is included in this section of the TST 204 student book This sample DEF shows required DEF content (see the DAG on the Student CD-ROM for a detailed list & descriptions of the required content) DEF format should be tailored to the needs of the individual programs (This sample DEF is meant merely to suggest format)
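One notional way to keep the essential information for each DEF element straight before it is formatted into the matrix is to capture it as a simple record per row. This sketch is an illustration only - the field names are assumptions based on the content described above, not a format prescribed by the DAG or the TEMP Guidebook.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DefEntry:
        """One row of a notional Developmental Evaluation Framework."""
        evaluation_area: str            # Performance, Reliability, Interoperability, Cybersecurity
        question: str                   # developmental evaluation question
        technical_measure: str          # CTP / TPM and required benchmark
        resources: List[str]            # ranges, targets, M&S, cyber teams, etc.
        decisions_supported: List[str]  # milestones or program-unique decision points
        test_events: List[str] = field(default_factory=list)

    row = DefEntry(
        evaluation_area="Performance",
        question="Does the radar detect threat missiles?",
        technical_measure="SNR > 8 dB at specified range",
        resources=["surrogate targets", "digital threat M&S"],
        decisions_supported=["MS C", "Platform DT&E Ready"],
        test_events=["DT #1 (digital M&S)", "DT #7 (sea range)"],
    )
    print(row.evaluation_area, "-", row.technical_measure)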

178 Operational Evaluation Framework Starting at Milestone B, every TEMP will include an evaluation overview. The overview will show how the major test events and test phases link together to form a systematic, rigorous, and structured approach to evaluating system performance across the applicable values of the independent variables. Test resources will be derived from the evaluation overview. Paraphrased from DoDI Encl 5 par 5e 27 Evaluation Overview Content & Format TEMP Part III will include an Evaluation Overview, with the following info.: Test Goals Mission-oriented T&E measures Test design info. (factors, scientific and statistical methods & measures, etc.) Test period (OA, IOT&E, FOT&E, etc.) High level resources summary (time, people, places, and things) needed to execute an adequate test The Evaluation Overview also aids Integrated Testing by identifying opportunities for using DT data for OT evaluation
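As a concrete illustration of the "scientific and statistical methods & measures" that the Evaluation Overview summarizes, consider a single binary measure such as probability of mission success. A minimal sketch, with assumed numbers that are not tied to any program: the approximate number of trials n needed to estimate a success proportion p to within a half-width E at a given confidence (z-value) is

    n = \frac{z^2 \, p(1 - p)}{E^2} \approx \frac{(1.28)^2 (0.8)(0.2)}{(0.10)^2} \approx 27

so estimating an expected 80 percent success rate to within plus or minus 10 points at 80 percent confidence would take on the order of 27 trials. Actual operational test designs generally size the test across the factors in the evaluation framework (using power analysis or design-of-experiments tools) rather than for a single proportion.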

179 Decision Support Key A separate summary of decision points and the information needed to support them should be included in a table, to serve as a quick reference for evaluations in the TEMP See the Decision Support Key example, in this section of the TST 204 Student Book 29 What is Live Fire Test and Evaluation (LFT&E)? LFT&E evaluates the ability of systems to meet and defeat expected battlefield threats. Assessment of lethality and/or survivability of covered systems LFT&E seeks to affect design as early in the acquisition cycle as possible LFT&E results are integrated with OT results to evaluate overall system effectiveness, suitability and survivability

180 Lethality Testing Testing of a production representative munition or missile, for which the target is representative of the class of systems that includes the threat and the target and test conditions are sufficiently realistic to demonstrate the lethal effects the weapon is designed to produce. Note: Lethality is primarily addressed by LFT&E 31 Survivability Includes the elements of Susceptibility (assessed in OT&E) Vulnerability (assessed in LFT&E) Recoverability (primarily assessed in LFT&E) Is an important contributor to Operational Effectiveness & Suitability Survivability assessment should be conducted for all systems under OSD OT&E oversight, that may be exposed to threat weapons Whether the system is designated for LFT&E oversight or not

181 Survivability Includes assessments of Susceptibility How likely is the system to be hit? Function of: - Speed/altitude - Agility - Tactics - Use of CMs - Stealth - Etc... Vulnerability Will the system still work if hit? Function of: - Armor plating - Redundancy - Damage control procedures - Etc... Recoverability Once damaged, what needs to be done to Prevent loss Reduce casualties Regain mission capable status ELO # OTA Evaluations of Survivability Survivability is assessed as part of the OT&E process Test events are conducted, to assess the system s survivability against typical threats Other methods such as M&S, analysis, & military judgment can be used to augment test data Examples of survivability MOPs: Distributional statistics (such as median) of time the launcher spent on the firing point Demonstrated launch angle Missile firing visual signature effects
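The relationship among these elements is often summarized with a simple kill-chain decomposition. This is a simplified, single-engagement sketch (the notation and example numbers are illustrative, not an evaluation requirement):

    P_K = P_H \cdot P_{K|H}, \qquad S = 1 - P_K

where P_H is the probability the system is hit (susceptibility), P_{K|H} is the probability it is killed given a hit (vulnerability), and S is the resulting probability of surviving the engagement. For example, with assumed values P_H = 0.3 and P_{K|H} = 0.4, P_K = 0.12 and S = 0.88. Recoverability then addresses how quickly a damaged-but-not-killed system prevents further loss and regains mission capability; it is typically evaluated separately rather than folded into this product.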

182 Waiver from Full-Up, System-Level (FUSL) LFT&E A system on the OSD oversight list for LFT&E may not proceed past LRIP until a report on the results of FUSL LFT&E is submitted to Congress If FUSL LFT&E is unreasonably expensive & impractical, a waiver package must be sent to the Congressional defense committees (via service officials, DOT&E, etc.) prior to Milestone B Or, if the program is initiated at MS B (or MS C), a waiver package must be submitted as soon as practical after MS B (or MS C) Note: the waiver package includes certification by USD(AT&L) or the Component Acquisition Executive; and a DOT&E-approved alternative plan for conducting LFT&E in the absence of FUSL testing (Paraphrased from 10 USC 2366) TEMP Part IV - Resource Summary 4.1 Introduction 4.2 Test Resource Summary Test Articles Test Sites Test Instrumentation Test Support Equipment Threat Representation Test Targets and Expendables Operational Force Test Support Models, Simulations and Test-beds Joint Operational Test Environment Special Requirements 4.3 Federal, State, Local Requirements 4.4 Manpower/Personnel Training 4.5 Test Funding Summary This information will be covered in the upcoming Test Resources lesson

183 TEMP Appendices Appendix A - Bibliography Appendix B - Acronyms Appendix C - Points of Contact Appendix D - Scientific Test & Analysis Techniques Appendix E - Cybersecurity Appendix F - Reliability Growth Plan Appendix G - Requirements Rationale (required by DoDI ) Additional Appendices as Needed 37 TEMP Development Summary Documents the overall structure & objectives of the entire T&E program: Developmental Test and Evaluation (DT&E), Operational Test and Evaluation (OT&E), and Live Fire Test and Evaluation (LFT&E) Provides a framework to develop detailed T&E plans Documents T&E schedule & resource requirements Considered a contract among the PM, OSD, and T&E activities Format for the TEMP is in the DOT&E TEMP Guidebook

184 THIS PAGE INTENTIONALLY LEFT BLANK 184

185 Developmental Evaluation Framework Matrix: N.E.W. Radar (sample; continued on the next page)
Header labels: Decisions Supported; Functional Evaluation Area (major functional areas, from Functional/Allocated Baseline products when available); Decision Support Key / Question; System Requirements and T&E Measures (Technical Requirements Document reference and description; technical measures, i.e., CTP, TPM, required benchmark); Resources (i.e., MRTFB, cyber range, security team); Cross Reference (in this order: KPP #, KSA #, COI # and/or Critical Requirement #). Decision columns: Enter Software Build (SW) Phase; Begin Computer-in-the-Loop Development; Begin Hardware-in-the-Loop Integration; Begin Near Field Chamber Integration/Test; Begin Far Field Range Integration/Test; Begin Land-based Test Site Integration/Test; Platform DT&E Ready.
Guidance for the decision columns: Identify major decision points for which T&E phases, activities, and events will provide decision-supporting information. Decision points and supporting information may be acquisition, programmatic, technical, or operationally related. Display descriptive information in the cells in an abbreviated manner similar to the following format: 1) Test event or phase (e.g., CDT1...); 2) Test method, technique, parameters...; 3) Description of data needed to support decision; 4) Other.
Performance - Ballistic Missile Defense (BMD): Detection (developmental issue). Question: Does the N.E.W. Radar detect threat missiles? Measure (3.X.X.X): Signal-to-Noise Ratio at 1000 NM range, > 8 dB. Resources: BMD surrogate targets, digital threat M&S. Cross reference: KPP 1, COI 1, COI 2. Decision-cell entries (in the order shown): 1) DT #1, 2) Digital M&S, 3) > 2 dB; 1) DT #1, 2) Digital M&S, 3) > 2 dB; 1) DT #3, 2) Injected simulated tracks, 3) > 6 dB; 1) DT #4, 2) Model-controlled RF signal, 3) > 8 dB; 1) DT #7, 2) Sea range and surrogate BMD targets, 3) > 8 dB.
Performance - Air Warfare (AW): Air Tracking. Question: Can the N.E.W. Radar detect aerial vehicles? Measures (3.X.X.X): Transmitter Power Output; Minimum Detectable Signal (MDS). Resources: Transmitted power meter. Cross reference: KPP 2. Decision-cell entries: 1) DT #3, 2) Injected simulated tracks; 1) DT #4, 2) Model-controlled RF signal; 1) DT #5, 2) Far field measurements; 1) DT #6, 2) Live radar measurements; 1) DT #7, 2) Live radar measurements.
Performance - Air Warfare (AW): Self Defense. Question: Can the N.E.W. Radar support Air Warfare versus aircraft and cruise missiles? Measures (3.X.X.X): Angular Resolution < 0.005 radians (resources: threat-representative aircraft and ASCM surrogates; cross reference: KPP 4, COI 4) and Range Resolution < 150 ft (resources: threat-representative aircraft and ASCM surrogates; cross reference: KPP 4, COI 4). Decision-cell entries for Angular Resolution: 1) DT #1, 2) Digital aircraft and ASCM M&S; 1) DT #2, 2) Digital aircraft and ASCM M&S; 1) DT #3, 2) Injected simulated tracks; 1) DT #4, 2) Model-controlled RF signal; 1) DT #5, 2) Far field aircraft and ASCM measurements, 3) < rad; 1) DT #6, 2) Live radar aircraft and ASCM measurements, 3) < rad; 1) DT #7, 2) Sea range and surrogate aircraft and ASCM targets, 3) < rad. Decision-cell entries for Range Resolution: 1) DT #1, 2) Digital aircraft and ASCM M&S; 1) DT #2, 2) Digital aircraft and ASCM M&S; 1) DT #3, 2) Injected simulated tracks; 1) DT #4, 2) Model-controlled RF signal; 1) DT #5, 2) Far field aircraft and ASCM measurements, 3) < 900 ft; 1) DT #6, 2) Live radar aircraft and ASCM measurements, 3) < 300 ft; 1) DT #7, 2) Sea range and surrogate aircraft and ASCM targets, 3) < 150 ft.
Interoperability (continued on the next page) 185

186 Developmental Evaluation Framework Matrix: N.E.W. Radar (sample, continued)
Interoperability - Net-Ready (developmental issue). Question: Is the N.E.W. Radar interoperable with interfacing networks? Measure (3.X.X.X, Information Operations (IO)): IEEE Spectrum of Interoperability Model (SoIM) Level 6. Resources: SW lab, CEC DDS, Combat Systems LAN, Air Traffic Control system interface. Cross reference: NR KPP. Decision-cell entries (in the order shown): 1) DT #1, 2) Digital network simulations; 1) DT #2, 2) Digital network simulations; 1) DT #3, 2) Injected simulated tracks, network simulations; 1) DT #4, 2) Model-controlled RF signal, network simulations; 1) DT #5, 2) Far field measurements, network stimulators; 1) DT #6, 2) Live radar measurements, live network interfaces; 1) DT #7, 2) Live radar measurements, live network interfaces, 3) Level 6.
Cybersecurity - Cyber Operations Security. Question: Does the system meet cybersecurity standards? Measure (3.X.X.X): Ability to withstand cybersecurity attacks; all controls assigned, in place, configured, and adequate. Resources: NCR, Red Cell Team. Cross reference: RMF C&A. Decision-cell entries: 1) Pen Test 1, 2) LBTS (Nat'l Cyber Range) penetration testing, 3) Cyber range report, vulnerability assessment; 1) Pen Test 2, 2) Platform penetration testing, 3) Red team report.
Cybersecurity - Network Vulnerability Assessment. Question: Does the system meet cybersecurity standards? Measure (3.X.X.X): Free of network vulnerabilities, open ports, trapdoors, inactive interfaces; all controls assigned, in place, configured, and adequate. Resources: Suitable or certified test software, personnel, and test environment. Cross reference: RMF C&A. Decision-cell entries: 1) DT #4, 2) Network security scan, 3) Scan audit; 1) DT #5, 2) Network security scan, 3) Scan audit; 1) Pen Test 1, 2) Network security scan, 3) Scan audit; 1) Pen Test 2, 2) Network security scan, 3) Scan audit.
Reliability - Reliability. Question: Is the N.E.W. Radar materially reliable? Measure (3.X.X.X, Reliability): Probability of System Materiel Readiness over 24 hrs. Resources: N.E.W. Radar components and full-up integrated testing. Cross reference: Reliability Growth Curves. Decision-cell entries: 1) DT #10, 2) Reliability data, failure modes, 3) > 16 hrs; 1) DT #11, 2) Reliability data, failure modes, 3) > 24 hrs.
186
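The signal-to-noise benchmark in the sample matrix (greater than 8 dB at a 1000 NM range) can be tied back to a physical model. As an illustration only - the symbols are generic and no values are implied for the example radar - the single-pulse radar range equation is

    SNR = \frac{P_t \, G^2 \, \lambda^2 \, \sigma}{(4\pi)^3 \, R^4 \, k \, T_s \, B_n \, L}

where P_t is transmit power, G the antenna gain, \lambda the wavelength, \sigma the target radar cross section, R the range, k Boltzmann's constant, T_s the system noise temperature, B_n the noise bandwidth, and L the system losses. Because SNR falls off as R^4, halving the range raises SNR by a factor of 16 (about 12 dB), which is one reason evaluators insist that detection benchmarks be stated at an operationally meaningful range; the relaxed interim values in a DEF (such as the > 2 dB and > 6 dB entries above) generally reflect the maturity and fidelity of early test events rather than the physics alone.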

187 Operational Evaluation Framework Matrix (assumes 100% efficiencies)
Column headers, with the template's descriptive entries:
Goal of the Test - Mission / Capability; COIs
Mission-oriented Response Variables - Effectiveness / Survivability / Suitability
Test Design - STAT Methodology and Operational Context
Resources - People, Places, Things
Test Period - e.g., LUT, OA, IOT&E, etc.
187

188 Decision Support Key (CIRCM example)
Columns: Decision | Description | Developmental Information Needed
Pre-EMD Review | Ensure EMD request for proposals includes appropriate work tasks and deliverables to complete technology maturation | The B-kit weight and ability of the CIRCM prototype to output sufficient laser energy to defeat a threat missile
Milestone B | Shift focus from technology maturation to system development and platform integration | Maturity of critical CIRCM technology elements, weight of key CIRCM components, and CMWS or JATAS integration risks
First Flight | Transition from the lab to flight testing | The size, weight, and power must be understood sufficiently to enable safe integration onto an aircraft platform. In addition, sufficient functionality must be present in the CIRCM to enable operation and initial determination of system capability.
LRIP 1 Long Lead | Commit to long lead procurement of material and components | Stability of the design elements associated with the long lead components; ability to integrate CIRCM onto at least one rotary wing platform; and any significant (Cat I) deficiencies identified should be corrected
Milestone C (LRIP 1) | Decide when to begin low rate production of the CIRCM | Demonstrated capability of the integrated CIRCM system, status of the system development effort, including software maturity, and supportability of the CIRCM, including system reliability
LRIP 2 | Tie additional progress in development (after MS C) to the second lot of production (reduce risk that excessive systems will be produced prior to completing design and development) | Current status of system development, including whether additional design issues have been identified, and if fixes to significant deficiencies deferred from MS C have been validated in flight test
AW Certification | Airworthiness is required prior to operational deployment | Information on the aerodynamic and system integration effects on the aircraft platform's ability to safely and routinely attain, sustain, and terminate flight with CIRCM installed and operating within limitations
First Deployment | Readiness for first deployment | CIRCM capabilities, limitations, and operating procedures for the intended usage and missions
OTRR | Proceed to operational testing | Ability of a CIRCM system, integrated onto an aircraft platform, to complete the dedicated operational test event (IOT&E) with no significant issues affecting operational effectiveness, operational suitability, or survivability
FRP | Proceed with full rate production | Production processes are mature and no significant design deficiencies remain
IOC | Declare initial operational capability | Sufficient quantity of systems and spares in place to support routine operations; the T&E knowledge required at IOC will focus on the supportability of the CIRCM system
188

189 Tuesday Homework Tuesday Homework MS B TEMP 189

190 THIS PAGE INTENTIONALLY LEFT BLANK 190

191 FOR TRAINING USE ONLY Test and Evaluation Master Plan for Joint Self-Propelled Artillery Weapon (J-SPAW) March

192 FOR TRAINING USE ONLY TEST & EVALUATION MASTER PLAN SIGNATURE PAGE FOR THE SELF-PROPELLED ARTILLERY WEAPON (SPAW) ****************************************************************************** SUBMITTED BY Product Manager, Army Airborne Command and Control System DATE CONCURRENCE Project Manager, Armor Brigade Combat Team DATE APPROVAL Program Executive Officer, Ground Combat Systems DATE 2 192

193 FOR TRAINING USE ONLY T&E IPT TEMP COORDINATION SHEET FOR THE SELF-PROPELLED ARTILLERY WEAPON (SPAW) SIGNATURE DATE I. M. Awesome T&E IPT Chairman Concur/Nonconcur Date C. Developer Combat Developer Concur/Nonconcur with comments ATEC (AST Chair) Concur/Nonconcur Independent Logistician Concur/Nonconcur Threat Integrator (FIO) Concur/Nonconcur Joint Interoperability Test Command Concur/Nonconcur Note to students: The Table of Contents and Annexes have been omitted for training purposes 3 193

194 FOR TRAINING USE ONLY 1. PART I INTRODUCTION 1.1 PURPOSE This Test and Evaluation Master Plan (TEMP) establishes the framework for conducting Test and Evaluation (T&E) of the Self-Propelled Artillery Weapon (SPAW). 1.2 MISSION DESCRIPTION MISSION OVERVIEW There is a critical need to provide supporting fires to the maneuver forces that are capable of ranges in excess of 32,000 meters. The Russian MSTA-X 152 mm self-propelled howitzer and the Chinese PLZ05-X 152 mm self-propelled howitzers are both projected to have a maximum range in excess of 32,000 meters. This performance puts US artillery forces at a distinct disadvantage. The range of the US M109A1 howitzer is 25,000 meters. With upgrades the M109 can reach ranges of 30,000 meters. These weapons will still be vulnerable to enemy counter-battery fires. The SPAW will replace the existing systems with a single system that has the ability to perform both the light and heavy artillery missions for U.S. Army and Marine Corps use. The SPAW is expected to provide increased range, speed, maneuverability, firing rate, integration, communications, and the ability to logistically support these improvements in a battlefield environment. The SPAW will be used to provide armored combat support. It will be able to be air transported into the theater of operations and operate in combat with tanks and the Bradley Fighting Vehicle. It requires excellent ground mobility allowing the ability to move and maneuver to increase personnel survivability. It requires the ability to fire in a 360 degree circle with its primary and secondary armaments. The system requires the ability to provide both direct (line of sight) and indirect (non-line of sight) firing. CONCEPT OF OPERATIONS The SPAW will provide armored combat support. The SPAW will replace existing systems with a single system that has the ability to perform both the light and heavy artillery missions for the U.S. Army and Marine Corps. The SPAW is an extended range howitzer based on a concept similar to the existing M-109A1 self-propelled howitzer. The SPAW is highly mobile and capable of providing highly lethal fire in any type of operation. The SPAW consists of a 155 mm cannon mounted on a track vehicle chassis. The cannon is manufactured using new material technology and has a design range of 34,000 meters. The vehicle will have provisions to carry 35 rounds of conventional artillery ammunition. The vehicle also contains command/control and target acquisition electronics suites to provide precision targeting, location, and threat information. The crew of the SPAW will consist of a section chief, driver, one cannoneer who loads and fires the weapon, and two ammunition handlers who handle the ammunition. The SPAW system is made up of the SPAW and a Field Artillery Ammunition Support Vehicle (FAASV). The SPAW system will be deployed in battery-sized units consisting of six weapon systems. The battery would also include support vehicles. One tracked vehicle would be devoted to command and control functions. One tracked vehicle 4 194

195 FOR TRAINING USE ONLY would be devoted to fire direction functions and subsystems. All vehicles would contain compatible command/control and target acquisition suites. Each SPAW and tracked support vehicle is transportable in the field by C-17. OPERATIONAL USERS This section will describe the intended users of the system, how they will employ the system, and important characteristics of the users. Section will be updated prior to MS B. 1.3 SYSTEM DESCRIPTION Descriptions of the SPAW and FAASV follow: SPAW Description The M109-series medium, self-propelled howitzer is a full-tracked, armored combat support, internally loaded, air-transportable vehicle powered by an eight-cylinder diesel engine. The system is capable of both direct (line of sight) and indirect (non-line of sight) firing. The hull and cab assemblies protect the crew and equipment against small arms fire. The vehicle is divided into three sections: driver's compartment, engine compartment, and fighting compartment. Features include an on-board ballistic computer, secure communications, enhanced position and navigation system, an integrated muzzle velocity system (MVS), new turret, improved cannon and mount, improved ballistic and nuclear, chemical, and biological protection, automotive improvements, built-in test equipment (BITE), and driver's night vision capability. The SPAW is equipped with a defensive armament machine gun system with .50 caliber munitions. Employed as part of a firing battery providing massed fires, the SPAW fully integrates with the maneuver forces and is able to receive and compute fire missions from all fielded target acquisition sources and command and control systems. The indirect fire support includes sensor directed actions to engage or operate under Battalion Headquarters direction (Threshold) and Allied/Coalition (Objective) force direction. Two versions of the SPAW, referred to as Increment 0 (Inc 0) and Increment 1 (Inc 1), will be designed and tested.

196 FOR TRAINING USE ONLY Figure 1: SPAW Concept Characteristics PROGRAM BACKGROUND Describes the Acquisition Strategy and other key information from Analysis of Alternatives process (omitted for student exercise) KEY INTERFACES Interfaces with existing or planned system architectures. Key DODAF views should be included here. (omitted for student exercise) KEY CAPABILITIES The SPAW Measures of Effectiveness and Suitability shown in Table 1-1 are specific SPAW Performance Capabilities selected from the SPAW CDD. The following table also denotes the KPPs and supporting KPP requirements identified in the validated CDD for the SPAW. The Army will refine KPPs, including measurable values, before Milestone C. Any changes to the KPPs will be included in the SPAW CPD, which supports Milestone C

197 FOR TRAINING USE ONLY Table 1-1: SPAW Measures of Effectiveness and Suitability
Columns: Operational Capability | Parameter | Threshold Capability | Objective Capability | Reference
Deployability | Transportable worldwide by air, sea, highway, and rail modes (KPP 3: Transportability) | Transport by C-17 | Transport by C-17 | CDD
Agility and Versatility | Speed | 25 mph on unimproved roads | 30 mph on unimproved roads |
Agility and Versatility | Cant | 10 degrees of cant without improving position | 15 degrees of cant without improving position |
Agility and Versatility | Fording | The howitzer will be capable of fording 4 feet of water | The howitzer will be capable of fording 4.5 feet of water |
Lethality | Range (KPP 1: Maximum Firing Range) | 34,000 meters | 37,000 meters | CDD
Lethality | Firing | Sustained rate of fire 3 rd/min; fire without using spades | Sustained rate of fire 5 rd/min; fire without using spades | CDD; CDD
Sustainability/Reliability | Operational availability (Ao) (KPP 6: Availability) | Mission defined critical systems, 85% (Ao) | Mission defined critical systems, 95% (Ao) | CDD
Sustainability/Reliability | Reliability | Howitzer/ammunition combination 0.98 | Combination 0.99 | CDD
Sustainability/Reliability | Intermediate Mean Repair Time | Not to exceed 2 hours | Not to exceed 1.5 hours | CDD
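The availability and repair-time thresholds in Table 1-1 are related through the standard operational availability expression. A minimal worked sketch, using assumed numbers that are not SPAW requirements:

    A_o = \frac{\text{Uptime}}{\text{Uptime} + \text{Downtime}} \approx \frac{MTBM}{MTBM + MDT}

For example, if the mean time between maintenance (MTBM) were 170 hours and the mean downtime (MDT) were 30 hours, then A_o = 170 / (170 + 30) = 0.85, which would meet the 85% threshold but not the 95% objective; reaching 0.95 at the same MTBM would require holding MDT to roughly 9 hours. Evaluators typically use this kind of relationship to check that the reliability, maintainability, and availability values in the CDD are mutually consistent before they are written into the TEMP.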

198 FOR TRAINING USE ONLY CRITICAL TECHNICAL PARAMETERS All CTP values are expected to be at Threshold value at the FRP decision. For all other decisions, the appropriate maturation value for that point will be considered. Reliability CTPs will use growth curves as part of the maturation determination.
Table 1-2: Critical Technical Parameters
Columns: Supported Capability | Developmental Technical Parameter | Test Stage/Event | Threshold Value | Decision Supported
Deployability | Transportable | Prod Qual Test | O = C-17 | MS C
Agility & Versatility | Cross-country sustained speeds | IQT | T = 10 mph | MS C
Agility & Versatility | Cross-country sustained speeds | PVT | O = 15 mph | FRP
Agility & Versatility | Unimproved road sustained speeds (no more than +2% grade) | IQT | T = 25 mph | MS C
Agility & Versatility | Unimproved road sustained speeds (no more than +2% grade) | PVT | O = 30 mph | FRP
Interoperability | Receive and compute fire missions from multiple sources | IQT, LUT | T = SPAW Battalion, Joint levels | MS C
Survivability & Force Protection | Minimum time when moving to stop and first round after fire order | IQT | T = 30 seconds | MS C
Survivability & Force Protection | Compute own firing data | PVT | T = provide limited tactical fire direction for the rest of the battery when required | FRP
SYSTEM THREAT ASSESSMENT A recent capability gap has emerged in the area of counter-battery fire; therefore, procurement of the Self-Propelled Artillery Weapon has become essential to the US maneuver forces. Recent developments and proliferation of threat artillery have rendered US artillery forces extremely vulnerable. There is a critical need to provide supporting fires to the maneuver forces that are capable of ranges in excess of 32,000 meters. This performance puts US artillery forces at a distinct disadvantage. The range of the US M109A1 howitzer is 25,000 meters. With upgrades the M109 can reach 8 198

199 FOR TRAINING USE ONLY ranges of 30,000 meters. These weapons will still be vulnerable to enemy counter-battery fires. The Defense Intelligence Agency (DIA)-validated System Threat Assessment Report (STAR) dated Feb 2005 lays out the threat to the SPAW in detail. This STAR outlines specific representative threats, which U.S. forces could encounter in a regional conflict anywhere in the world, out to the 2022 timeframe. It does not necessarily reflect the most advanced or technologically feasible threats in all areas, or those only fielded in potentially small numbers in a few countries. The SPAW will be employed worldwide, wherever U.S. interests are threatened. To this end, potential threat forces will be armed with various mixes of increasingly sophisticated weaponry. They will include small arms and automatic individual/crew-served weapons, antitank (AT) weapons to include antitank guided missiles (ATGM), medium caliber cannon (20-75mm), hand-held high explosive antitank (HEAT) weapons, and land mines. Regardless of its location on the battlefield, SPAW-equipped forces will be threatened by indirect fire. As part of a digitized force, the SPAW will be subject to electronic warfare, threat information operations, and directed energy. The SPAW will potentially operate in a nuclear, biological and chemical (NBC) environment, which could include weaponized agents, toxic industrial hazards and battlefield residues. (Further discussion omitted) There are two primary threats to US medium range artillery systems. The most prolific system is the recently-fielded Russian MSTA-X 152 mm self-propelled howitzer. This system has a maximum range of 32,000 meters. It is being widely exported and current intelligence estimates place it in 10 countries. (Further discussion omitted) The second threat system is the Chinese PLZ05-X 152 mm self-propelled howitzer. It has a maximum firing range of 31,500 meters. It has been fielded within the People's Liberation Army and has been exported to Vietnam and North Korea. (Further discussion omitted) SYSTEMS ENGINEERING (SE) REQUIREMENTS This section would include SE-based information including Technical Performance Measures from the SEP and Reliability Growth Guidance. (Further discussion omitted for exercise) SPECIAL TEST OR CERTIFICATION REQUIREMENTS This section would include all required certifications and special test requirements. This could include Risk Management Framework (RMF), Cybersecurity Guidance, and many other areas. (Further discussion omitted for exercise) PREVIOUS TESTING 9 199

200 FOR TRAINING USE ONLY This section would discuss results of previous tests that apply to or affect the test strategy. (Further discussion omitted for exercise)

201 FOR TRAINING USE ONLY 2. PART II TEST PROGRAM MANAGEMENT AND SCHEDULE 2.1 T&E MANAGEMENT Conduct of the development, execution, and management of the SPAW T&E program will comply with the Army's Integrated T&E process. This process implements a T&E WIPT for development of the TEMP. The following paragraphs present a brief description of the T&E IPT primary members. Combined Test Organization (CTO) Coordinates and manages all developmental test planning, support, and reporting Plans and coordinate T&E VV&A activities Prepares, coordinates, distributes, and maintains the TEMP, as directed by PM SPAW Coordinates and integrates technical test execution, data collection, data authentication, results reporting, and documentation at all levels (component, subsystem, system, and platform) PM SPAW Materiel Developer Establishes and chairs the T&E WIPT Directs and supports the activities of the Lead System Integrator Develops and provides System Support Package; spare and repair parts; technical literature; training package to include, as required, new equipment training support packages and coordination of instructor and key personnel training; special tools and test measurement and diagnostic equipment; and unique software Provides the DT report to Office of the Secretary of Defense (OSD) assessing potential of system and recommendation on readiness to proceed to IOTE Provides M&S to satisfy wrap-around requirements for support of T&E Accredits M&S used in support of major program milestones/events/activities except those used for operational test and evaluation Ensures the maintenance of all certifications including interoperability, and the DoD Information Assurance Certification and Accreditation Process (DIACAP) throughout the life cycle Guides and provides funding and oversight for DT&E, M&S and LFT&E activities Prime Contractor Designs, fabricates, tests, and manages the configuration of SPAWs, Provides Integration Laboratory distributed test capability, which supports DT and OT events Provides test planning and documentation to include TEMP input, the Integrated Test Evaluation Plan (ITEP), and Detailed Test Plans (DTPs)

202 FOR TRAINING USE ONLY Conducts developmental testing, Integrated Logistics Support Testing, and training for these events with support provided by the government Supports the conduct of Limited User Tests (LUT) Provides data management capability Maintains the contractor facilities and resources that are required to support DT and OT events Provides contractor staffing of the CTO to accomplish integrated test planning and oversight, and management of test execution TRADOC, US Army Artillery Center and School, Fort Sill, OK Combat Developer (user representative) Develops and coordinates the CDD; Critical Operational Issues (COIs); Doctrine and Organization Test Support Package (D&O TSP); TTP; System Training Plan; Capability Production Document; Threat Test Support Package (TTSP); Training Developer Test Support Package, and other test products as required Coordinates with the materiel developer and system evaluator, the need, schedule, and resources for test, experimentation, and M&S to support SPAW Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, Facilities, and Policy (DOTMLPF-P) products Plans, resources, and conducts FDEs Serves as the user representative for all matters related to the CDD Responsible for development of test threads necessary for Intra-Army Interoperability Certification (IAIC) testing Deputy Undersecretary of the Army for Test and Evaluation (DUSA-TE) U.S. Army Test and Evaluation Office Primary focal point for coordination of the TEMP and resolution of test issues at the Department of the Army (DA) and OSD level Advocate for Independent Validation of Army M&S OSD Deputy Assistant Secretary of Defense for DT&E Responsible for DT&E oversight within OSD OSD TEMP coordination OSD DOT&E Responsible for OT and LFT&E oversight within OSD. Oversees OT&E and LFT&E activities OSD TEMP coordination and approval Responsible for the approval of all operational test plans Prepares the Beyond LRIP and LFT&E reports T&E ORGANIZATIONAL CONSTRUCT Represents all organizations that are stakeholders of, or support the SPAW T&E program

203 FOR TRAINING USE ONLY The T&E WIPT will form additional sub-IPTs as necessary to work specific areas such as the Reliability IPT, Simulation IPT, Threat Working Group, Cybersecurity IPT, Software IPT, and the Live Fire IPT 2.2 COMMON T&E DATABASE REQUIREMENTS Sufficient and appropriate data shall be collected in all test activities to ensure accurate determination of exit criteria compliance. Data shall be consistent with appropriate applicable standards. Maximum coordination and sharing among participating organizations shall be assured. The Program and Developmental Test Leads will ensure that DASD (DT&E) and DOT&E have timely access to all records, test data, and reports, including classified and proprietary information, as appropriate to carry out their independent assessments. 2.3 DEFICIENCY REPORTING Details omitted for training purposes. 2.4 TEMP UPDATES This TEMP will be updated at each major future milestone as required by DoD policy. Between major required updates, the Program Manager will keep the TEMP up to date and will inform the Milestone Decision Authority as appropriate. 2.5 INTEGRATED TEST PROGRAM SCHEDULE The SPAW program is event-driven. Testing provides data to systems engineering that demonstrates the level of compliance with the specifications and allows evaluators to support assessment of the capabilities achieved. Significant events include an MS C decision (FY 2010), a demonstrated IOC (FY 2012), and an FRP Decision (FY 2014). These schedules are for reference only and present the scheduled activities as currently anticipated. The Integrated Master Schedule maintains the real-time program schedules, which will be revised as changes occur. 3. PART III TEST & EVALUATION STRATEGY 3.1 T&E STRATEGY Details omitted for training purposes. 3.2 DEVELOPMENTAL EVALUATION APPROACH Details omitted for training purposes. DEVELOPMENTAL EVALUATION FRAMEWORK A draft CDD has been developed. The requirements in the draft CDD reflect the Analysis of Alternatives. The SPAW PMO will engage contractors to design, fabricate, and test required prototypes during the TMRR phase. A top level

204 FOR TRAINING USE ONLY evaluation framework matrix containing primary capabilities (as defined in the CDD) will be provided as the materiel solutions are identified, matured, and reflected in the TEMP. (Framework will be completed as a student exercise) Table Developmental Evaluation Framework TEST METHODOLOGY This section will describe each capability and key functional area and address the test methodology required for those areas. (Further discussion is omitted for exercise) MODELING AND SIMULATION (M&S)

205 FOR TRAINING USE ONLY The SPAW T&E plans must integrate closely with the use of M&S to reduce program cost and schedule risk. Key objectives of the SPAW M&S efforts are to: Provide SPAW system, subsystem and mission module emulators in the System Integration Laboratory (SIL) Develop and deliver SPAW test articles (simulations) to interface/integrate with the SIL Provide SPAW design performance and characteristics data for generating SPAW representations to be embedded in the onboard operating system Cooperative development of a complete list of M&S necessary to support major program decisions. This list will prioritize SPAW models and simulations to meet funding and schedule constraints Simulation-Emulation-Stimulation (SES) Using the SES process, developing and maintaining an integration and test environment will facilitate testing and performance validation of SPAW variants hardware/software components before installation into the actual vehicle. This process will permit simulation of LRU functions, emulation of the LRU physical interfaces and system data flow over the Global Information Grid, and stimulation of the system through user inputs. The SES approach allows LRU emulation to be interchangeable with the respective hardware to simulate the vehicle system for the software real-time test bed environment. Testing vehicle configuration with simulation/emulation in a lab environment permits validation and verification of system functions and capabilities through appropriate stimulation and measurement System Integration Laboratory (SILs) System Integration Laboratories will conduct SPAW integration and testing, initially with simulations. The follow-on testing at SPAW level will utilize the simulation environment and emulations connected to the SIL. To support the spiral life cycle software development, SIL development will phase builds. The SIL will generate a virtual prototype that supports initial prototype testing and will continue to update it through all phases, and until fielding the objective force. The SPAW SILs will develop an iterative use of M&S to support the LSI phased T&E incremental phase upgrade approach. With the increasing availability of the high-fidelity models during the time phase of the EMD program, SILs will acquire the models from the corresponding IPTs and integrate the models to support the system/subsystem integration and test (I&T), software I&T, as well as the incremental update of the SoS integrated test/user test (IT/UT) simulation models Test Limitations and Risks No test limitations or risks have been developed for these tests. 3.3 DEVELOPMENTAL TEST APPROACH

206 FOR TRAINING USE ONLY MISSION-ORIENTED APPROACH This section is expected to describe the approach to test system performance within a mission context including integration with operational testing and use of actual users to support human factors assessments. (Further details omitted for training purposes.) DEVELOPMENTAL TEST EVENTS The design of the SPAW test program supports a strategy of continuous evaluation through the life of the program. The evaluation strategy is based on assessing the Critical Technical Parameters (CTPs) of the SPAW. The CTPs were derived from the requirements as contained in the JROC Validated/Approved SPAW CDD, (2006). Evaluation of the CTPs involves discerning whether the materiel performance is sufficient to enable completion of mission critical tasks when acted upon by elements within the operational environment such as threat, terrain, weather, etc. The evaluation strategy enables early detection of gaps in the SPAW capabilities, identification of design attributes that have sufficient robustness to enable acceptable performance in a degraded state, and reinforces Human Systems Integration considerations INTEGRATED QUAL TEST (IQT) OVERVIEW NOTE FOR DAU STUDENTS: IQT is SPAW testing conducted during the EMD phase. A combined government/contractor effort will conduct the IQT to verify that the resulting prototype delivered systems meet the performance criteria set out in the systems specifications and CDD requirements, and ATEC/AEC independent evaluation criteria. IQT will include subsystem- and system-level performance testing, interoperability testing, E3 and induced environmental testing, transportability testing, mobility testing, delivery accuracy testing, supportability demonstrations, and will address RAM, human factors, survivability, lethality, and system safety. The SPAW and FAASV will have common components and subsystems to the maximum extent possible. Using this commonality where reasonable will reduce the amount of technical testing during IQT that all unique components and subsystems without points of commonality would require SPAW Common IQT Events- TBD SPAW Increment 0 (Inc 0) Early testing of the eight SPAW Inc 0 vehicles will reduce risk for all SPAW prototypes. SPAW Inc 0 vehicles will undergo contractor shakedown testing prior to transitioning into IQT with a focus on evaluation of mobility capability and firing operations. Events to

207 FOR TRAINING USE ONLY be accomplished during IQT are presently scheduled for APG and YPG, but the locations are subject to change based on host command capabilities at particular test sites. As the program matures, updates to the estimated required test times and test locations will occur. The IQT will include the following:
- Shoot: demonstrate indirect fire capabilities at combat weight
o Time on Target, When Ready, At My Command, Multiple Round Simultaneous Impact, Target outside traverse limit
o Fire all current ammunition types (full manual firing for Excalibur pending fuse setter, ammunition, and SW development)
o Perform Technical Fire Control
o Projectile Tracking System crew tasks
o Check Fire procedures
- Demonstrate mobility capabilities at Fully Combat Capable weight
o Cross-country and road speeds
o Obstacles and terrain
o Fuel consumption
o Limited digital map display of surroundings to be refined (TBR)
o Limited Warfighter Machine Interface (WMI) and crew displays to support primary operations
o Draft operator manual
- Survive: demonstrate control of ambient crew environment
o Environmental control system
- Sustain: demonstrate limited resupply operations
o Demonstrate limited transport capabilities at Fully Combat Capable weight
o Rail and highway transport configuration
- Train: initiate development of SPAW FTT concept
o EBCT, TTP development
After completion of IQT in October 2010, the EBCT will receive refurbished Inc 0 vehicles along with a Support Package of spare parts and peculiar support equipment. The Evaluation Brigade Combat Team (EBCT) will support development of Tactics, Techniques, and Procedures (TTPs) by demonstrating selected aspects of cannon operation from battery to section tasks, with priority going to Mission Essential Tasks.

SURVIVABILITY TEST
A combination of testing, demonstrations, M&S, and analysis will collect survivability data for each SPAW variant. This data will support the SoS survivability evaluations.
Electromagnetic Environmental Effects (E3) Details omitted for training purposes
Chemical, Biological, and Nuclear (CBN) Details omitted for training purposes
Cybersecurity Details omitted for training purposes

208 FOR TRAINING USE ONLY
Signatures Details omitted for training purposes
Ballistic Vulnerability Details omitted for training purposes
Soldier Survivability (SSv) Details omitted for training purposes
Fire Survivability Testing Details omitted for training purposes

PRODUCTION & DEPLOYMENT (PD) TESTING
Production Verification Test (PVT)
The SPAW PVT will finalize specification compliance and qualification testing using integrated LRIP vehicles. Testing will consist of activities to validate the manned system's ability to meet specifications and performance requirements. Additionally, PVT will validate production manufacturing and process changes implemented during LRIP. Soldiers will participate during this testing. Test activities will focus on RAM, areas where problems were uncovered and corrected during EMD, and will include natural environment and mobility/durability tests. Expansion of environmental tests during the PD phase will test the systems in the field under various environments including tropic, arctic, and desert conditions.

3.4 CERTIFICATION FOR INITIAL OPERATIONAL TEST AND EVALUATION (IOT&E)
Section should discuss how the system will be certified ready for IOT&E, including entry criteria and how DT&E addresses those criteria. (Further details omitted for training purposes.)

3.5 OPERATIONAL EVALUATION APPROACH
The overall OT&E objective is to evaluate the operational Effectiveness, Suitability, and Survivability (ESS) of the SPAW. An integrated T&E approach will merge developmental and operational T&E and M&S whenever practical to avoid redundancy. The operational T&E strategy will evaluate the system based on all COIs found in Section 3.6. ATEC will provide system evaluation reports as inputs to acquisition decisions. The OT&E of the SPAW will consist of LUT and IOT&E. Enhanced lethality and assured maneuver of the SPAW demonstrate operational effectiveness. RAM, interoperability, and support burden demonstrate operational suitability. RAM evaluation will use data collected from DT and OT combined. The capability of the SPAW system to survive ballistic threats (to include direct and indirect fires), resist electronic attack, and survive within NBC environments demonstrates survivability.
DRAFT Critical Operational Issues and Criteria (COIC) have been developed, and are pending approval by HQDA. The DRAFT COIs are provided below. For the complete COIC (along with MOEs, MOSs, and MOPs), refer to the System Evaluation Plan.
- Can the SPAW deliver accurate indirect fire on the battlefield?

209 FOR TRAINING USE ONLY
- Is the SPAW mobility adequate to support wartime and peacetime operations?
- Can the SPAW be rapidly inserted into world-wide theatres of operation?
- Can the SPAW operate in the joint battle space environment?
- Does the SPAW logistics footprint adequately support wartime and peacetime operations?
- Is the SPAW safe to operate in wartime and peacetime?
- Do the SPAW personnel requirements support wartime and peacetime operations?
- Is the SPAW survivable on the battlefield?
- Is the SPAW availability sufficient to meet wartime and peacetime needs?

OPERATIONAL TEST EVENTS AND OBJECTIVES
Limited User Test (LUT)
The SPAW prototypes will participate in LUT as part of existing operational exercises. It is anticipated that all LUT events will occur at Fort Bliss, Texas. NOTE FOR DAU STUDENTS: LUT will occur during EMD, and LUT results will feed an Operational Assessment.
Configuration Description
The LUT will include three Increment 1 SPAW systems with Build X software, requisite resupply vehicles, a Battery Tactical Operations Center, and all other subsystem components in quantities necessary to support the scope of the LUT.
Objectives
The LUT will provide an initial estimate of effectiveness and suitability of the SPAW platform, verify system functionality in an operational environment, and gather data from the user's perspective on platform system/subsystem design performance. The evaluation will determine the capability of the SPAW crew to provide indirect fire support that enables accurate, cooperative, and autonomous fires. The LUT will examine the ability of the cannon section to maneuver effectively and accurately engage targets. Another objective is to test the system capability to reach operational readiness, to include system responsiveness, emplacement, and displacement of the system. The LUT will also gather data for sustainment, Human Systems Integration, and non-ballistic attributes.
Scope Phase I
The LUT Phase I will focus on individual (thread) SPAW networked capabilities and limitations when employed by small units conducting tactical exercises in an operational environment. A SPAW cannon section will conduct battery command collaborative assessments and course of action analysis. Specific tasks to the SPAW section include:

210 FOR TRAINING USE ONLY The SPAW will conduct indirect fires to influence the will, destroy, neutralize, or suppress enemy forces in support of maneuver forces. During the LUT, the SPAW unit will conduct missions based on vignettes derived from the SPAW Concept of Operations. Vignettes will target evaluation of SPAW fire and maneuver capabilities. Missions will take place during day/night operations, which will stress sensor capabilities (i.e., a UAV identifying a target). Tasks will include first round response time while emplaced and when a fire order is received while moving. A typical mission will include a reported target that meets the Commander s criteria and the SPAW battle command selecting the best munition and unit to fire. The SPAW fire mission will follow along with a UAV sending images back to the network to assess the effectiveness of the targeting. Subsequently, shared target damage assessment happens throughout the network. The SPAW will: o Conduct precision fires to destroy high priority targets o Conduct protective fires to suppress and provide obscurants to enable maneuver o Conduct cooperative, accurate and autonomous indirect fire missions o Conduct networked engagements as well as detecting and maintaining a track on ground targets within the assigned battle space Scope Phase II The SPAW cannon section will participate in conducting force-on-force missions in an environment that will provide opportunities for full spectrum operations to the extent possible. The mission plan will determine the ability of the SPAW section to maneuver (mobility). This ability will include the ability to shoot and then move to another location in conjunction with the maneuver plan. Virtual representations in the simulation will represent the majority of the SPAW battery team; the data from the live test players and other system prototypes will provide the primary source of data for the assessment. TRADOC will develop scenarios for use by OTC to design the test structure PRODUCTION & DEPLOYMENT (PD) LUT The purpose of PD LUT is to provide data to assess the additional capability provided by the latest software build to SPAW, and assess the full capability of the battle command network. Additionally, PD LUT will provide the opportunity for modifications to hardware and software following the prior LUT. The combat team will be organized and equipped in the same manner as the prior LUT. System Configuration The test will include prototype SPAW hardware and LRIP software within the framework of a full SPAW battery. Test Configuration

211 FOR TRAINING USE ONLY
SPAW testing during PD LUT will be performed on three prototype SPAW vehicles with the latest (LRIP configuration) software build.
Events
The scope of the PD LUT will build upon that of the final EMD LUT. The ability to contribute to the networked fires process will be determined. A SPAW platoon will conduct battalion collaborative assessments and course of action analysis tasks. It will also focus on the additional capability provided by the latest SPAW software build to the SPAW platform. During PD LUT, the SPAW unit will conduct missions derived from the SPAW Operational plan. PD LUT will examine the ability of the artillery platoon to effectively maneuver and accurately engage targets under the SPAW network. In addition to EMD LUT capabilities, SPAW will:
- Conduct area suppression in support of companies and platoons
- Provide tactical and technical fire direction to other systems
IOT&E
The purpose of the IOT is to provide the framework to test and evaluate the SPAW battery in conducting full spectrum operations in a tactically realistic, operational live environment. The focus of evaluation will be to determine the effectiveness and suitability of the SPAW. IOT will occur at Fort Bliss, Texas.
Entrance Criteria:
- Validated threat representation
- SPAW system safety certificate
- All ammunition (rounds/fuses/charges) qualified for use in the cannon
- No degradation to mission critical/essential functions caused by radiated susceptibilities and conducted emissions has been demonstrated
- Successfully completed a platform Logistics Demonstration
- ATO obtained and JITC assessment completed
- Software FQT successfully conducted and all major software deficiencies found during the FQT resolved
Test Configuration Description
SPAW testing during the IOT will occur with a SPAW battalion's (18 cannon systems) production vehicles with the latest software build.
Objectives
Testing during IOT will assess the Effectiveness, Suitability, and Survivability in an operational environment, and gather data from the user's perspective on platform system/subsystem design performance. Evaluation will assess the ability of the SPAW

212 FOR TRAINING USE ONLY to support the full Brigade within the context of full BDE operations, to include sustainability and maneuverability.
Scope
M&S (live, virtual, and constructive) will create an operationally realistic distributed environment. A distributed test environment allows for linkage of live, virtual, and constructive environments for test and evaluation activities. The missions performed based on the scenarios selected will employ the full spectrum of countermeasures, counter-countermeasures and obscurants, electronic warfare, and information assurance attack. The Threat Coordination Group will develop the threat used, and the Threat Accreditation Working Group will perform accreditation. A force protection evaluation will take place on non-direct combat systems based on the ATEC interim guidance concerning this issue. Missions will occur in the framework of overarching Defense Planning Scenarios. This test will occur in two phases, as follows.
Scope Phase I
This event consists of a deployment, confirming the SPAW battalion TTP for deployability of a representative SPAW-equipped battalion sample by various modes of transportation. This phase will demonstrate the ability and time to go from essential to full combat configuration.
Scope Phase II
The SPAW will play in SPAW brigade and SPAW battalion missions in a realistic environment during several engagements that will examine the SPAW role in the networked fires process. Tests will also assess the SPAW contribution to assured mobility and the SPAW contribution to the lethality of joint fire missions. This phase will determine the SPAW contribution to target acquisition, location, identification, and designation. This phase will also examine the deployability of the SPAW as a part of the SPAW-equipped battalion/brigade. Specific missions of the SPAW to support tactical maneuver in mission scenarios include:
- Fire for Effect (FFE): The SPAW will conduct destructive fires to determine its contribution to shaping engagements with precise and/or area fires, thus allowing tactical friendly maneuver. This will also occur in conjunction with direct fires to determine its contribution to multiple lethal modes
- Protective/Suppression fires: The SPAW will conduct protective fires supporting maneuver formations, including danger-close and final protective fires, to facilitate ground maneuver. Testing will also determine the ability to fix or isolate a target
- Special purpose fires: The SPAW will employ special purpose fires including obscurants and illumination to aid in maneuver and counter-mobility

OPERATIONAL EVALUATION FRAMEWORK

213 FOR TRAINING USE ONLY The operational framework reflects the SPAW plan for the test design for strategically varying the factors across the operational envelope. Table Operational Evaluation Framework Matrix MODELING AND SIMULATION (M&S) Details omitted for training purposes TEST LIMITATIONS If representative soldiers and maintainers cannot be obtained, trained contractor personnel will be employed. 3.6 LIVE FIRE TEST AND EVALUATION APPROACH - Details omitted for training purposes. 3.7 OTHER CERTIFICATIONS - Details omitted for training purposes. 3.6 FUTURE TEST AND EVALUATION - Details omitted for training purposes

214 FOR TRAINING USE ONLY 4. PART IV T&E RESOURCE SUMMARY 4.1 INTRODUCTION Details omitted for training purposes. 4.2 TEST RESOURCE SUMMARY Details omitted for training purposes TEST ARTICLES Test article requirements by variant are specified in Table 4-1. The number of test articles for each test event will remain under review, and will be updated as required. In building this test article matrix, several key assumptions were made. Specifically, that SPAW tests have top priority of use of government test facilities and ranges, and M&S at a high level of maturity (to include availability and VV&A). If any of these assumptions change in validity, more hardware test articles may be required. Performance testing will be augmented through the use of additional types of test assets beyond the test articles indicated below. Automotive test rigs, surrogate hardware, and emulators will be part of the LSI test suite. Use of these alternatives is the key to minimizing the number of test articles needed for EMD IQT, and enables maximum participation of test articles in OT events. In addition to the listed test articles, ballistic hulls and turrets (BH&Ts) in support of ballistic survivability are required. Table 4-1: SPAW Family of Systems Test Articles Prototype Initial Production Items Ballistic Survivability LUT LFT PVT PD LUT Model IQT SPAW (Increment 0) SPAW (Increment 1) FAASV IOT&E Total TEST SITES Developmental Test Sites Potential test sites to support the developmental test events identified in this TEMP include, but are not limited to, the following: Aberdeen Proving Ground (APG) Yuma Proving Ground (YPG) White Sands Missile Range (WSMR) Dugway Proving Ground (DPG) Fort Greely

215 FOR TRAINING USE ONLY Redstone Arsenal Fort Rucker Fort Bliss Fort Sill Fort Irwin Final test site determinations will be made as further detailed test planning matures. Final test site selections will be made to most effectively use the resources available as well as to employ the Evaluation Brigade Combat Team. Early technical testing of components, sub-assemblies, and subsystems will be conducted at system contractor facilities with government participation. Integration testing will be accomplished in the LSI SIL locations as well as at the C4ISR SIL. Valid environmental statements exist or will be developed for all test sites Operational Test Sites It is anticipated that conduct of the operational test events identified in Part III of this TEMP will occur at Fort Bliss, TX. Fort Bliss provides suitable maneuver space, terrain, frequency allocations, intra-site and external connectivity to the LSI SIL and government sites using the ATEC Test Integration Network (ATIN) and the Defense Engineering and Research Network. Fort Bliss also provides support for a battalion or greater force along with a comparable Threat force TEST INSTRUMENTATION REQUIREMENTS Instrumentation to support SPAW testing will require a combination of embedded, appended, and enhanced range instrumentation. An embedded instrumentation (EI) trade study has been initiated to determine the data which can be provided by EI, compare it to evaluation data requirements, and identify data shortfalls to be supported with appended instrumentation. Appropriate instrumentation projects have been initiated to address SPAW instrumentation requirements based on reviews from Commanding General, ATEC and PM SPAW, and coordination with the U.S. Army Test and Evaluation Management Agency. A continuous review process exists to ensure the required capabilities are achieved and remain synchronized with the SPAW test program. Refer to the SPAW Test Instrumentation Plan for additional information TEST SUPPORT EQUIPMENT It is anticipated that all SPAW test locations will require some form of test support equipment. This may include test measurement and diagnostic equipment, calibration equipment, frequency monitoring devices, or other test support devices that are not included under the instrumentation requirements. Some of the support and equipment needed to conduct SPAW testing will be acquired as Government Furnished Equipment (GFE). A process is currently in place for requesting GFE on the SPAW program, and as the SPAW program is better defined, the listing of requested GFE will be presented in this section of the TEMP

216 FOR TRAINING USE ONLY Test Related Government Furnished Equipment Test Related Government Furnished Equipment (GFE) is defined as Government Furnished Equipment, Property, Information, Facilities, and Services that are made available to the contractor within the contract. Test Related GFE requirements are in support of system tests (IQT, PVT, Live Fire Test, and some specialty testing). Included in Test Related GFE are test engineering support from ATEC, unique instrumentation, and test range support. The PM SPAW has designated the CTO to manage the test related GFE to include budget, cost, and interaction with the respective Government provider and the contractor. If the CTO is unable to resolve any disagreements that arise relative to the spending of the Government provided funding for Test Related GFE, the matter will be referred to the PM SPAW for resolution THREAT REPRESENTATION General A detailed evaluation of possible SPAW threat environments is provided in the SPAW STAR, but this report does not and cannot provide an all-inclusive threat assessment. The scope of types and sources of available threat systems makes it impractical and prohibitive, both in time and cost, for testing against all known or emerging threats to a SPAW. Therefore, the Threat-Intelligence Community (TIC) (includes TRADOC Deputy Chief of Staff for Intelligence, Headquarters Department of the Army Deputy Chief of Staff (DCS) G-2, and Defense Intelligence Agency for threat validation) using the Threat Coordination Group and the Threat Test Support Package (Threat TSP) process will validate threat system/simulation requirements. The TIC will identify the threat requirements for types, quantities, level of fidelity, and scheduled due dates for availability in test to provide for the potential of a real threat system requirements for SPAW testing to ensure Army validation and ATEC accreditation. Model-to-Model, Model-to-Real, and Real-to Model testing will be used throughout the development cycle to provide comprehensive threat system evaluations for the survivability, lethality, and effectiveness determinations of SPAW in either a cooperative or isolated engagement. The TIC will specify in the Threat TSP for each test event the environments in which threat and threat representations will operate Threat Force Requirements New Defense Planning Guidance and TRADOC Standard Scenarios will also be reviewed and assessed for new threat systems testing requirements throughout SPAW development cycle. The CTO will update the TEMP based on information provided by the TIC to reflect new threat requirements or changes to existing requirements. Table 4-2 identifies threat representative units and personnel that will be required, both live and virtual, during T&E events. Appropriate threat command and control elements will be required and utilized in both live and virtual environments. The scope of the T&E

217 FOR TRAINING USE ONLY event will determine final threat inventory for SPAW T&E. Coordination between the TIC and ATEC- AEC/OTC will finalize threat representation quantities for a valid threat portrayal recommended for test. Table 4-2: Threat Unit Equipment, Personnel, and Targets EQUIPMENT IQT EMD LUT PVT IOT&E PD LUT M&S LIVE M&S LIVE M&S LIVE M&S LIVE M&S LIVE T-72S/72Z/54 20/20 20/20 20/20 20/2 20/2 0/3/0 0/4/0 /0 /0 /4 0/4 0/4 0/4/0 T BMP APC ATGM LCHER ATGM MP ANTI MATERIAL HOW SP GUN TWD HOW TWD MRL SP TRANSPORT ADA GUN ADA MSL LNCHER SP ADA MSL MP MINES TRK 2.5T TRK PRIME TRK TRACT TLR TRK LARGE CIV TRK/VAN CIV CAR C2 VEHICLE, NFI VAN, COMM, NFI RADIO RELAY RADAR, BS RADAR, CM/CB ARTY JAM (HF, VHF) GPS Jammer (MF) Decoys 3:1 ratio per system (high and low resolution) Tanks (high and low resolution)

218 FOR TRAINING USE ONLY EQUIPMENT IQT EMD LUT PVT IOT&E PD LUT M&S LIVE M&S LIVE M&S LIVE M&S LIVE M&S LIVE IFV/APC wheel/track Arty Gun/How MRL Mine Field (300 mines) ADA systems and sites TEST TARGETS AND EXPENDABLES Test targets and expendables include threat representations and ammunition in support of test. PM SPAW and ATEC are currently developing test target sets to support SPAW testing. Additionally, PM SPAW has developed ammunition forecast (see Table 4-3) to support DT during the EMD portion of this program OPERATIONAL FORCE TEST SUPPORT Details omitted for training purposes MODELS, SIMULATIONS AND TEST-BEDS Details omitted for training purposes JOINT OPERATIONAL TEST ENVIRONMENT Details omitted for training purposes SPECIAL REQUIREMENTS All T&E efforts will comply with federal, state, and local environmental regulations. Current permits and appropriate agency notifications will be maintained regarding all test efforts. Although the PM is ultimately responsible for overall environmental compliance of the weapon system, it is the responsibility of the specific test sites to ensure all applicable environmental requirements are met for their test sites. 4.3 FEDERAL, STATE, AND LOCAL REQUIREMENTS Details omitted for training purposes. 4.3 MANPOWER/PERSONNEL AND TRAINING 4.4 MANPOWER/PERSONNEL AND TRAINING MANPOWER/PERSONNEL Manpower requirements for the EBCT are currently being generated for each program test event. An initial estimate is included in Table 4-4. Validated requirements will be included in subsequent updates of this TEMP

219 FOR TRAINING USE ONLY
Table 4-4: Operational Force Test Support
OT Test Event | Force Strength | Date Required
FDE | 222 | FY08
Inc 0 LUT | 222 | FY08
Inc 0 IOT&E | 2500 | FY10
FDE | 300 | FY10
Inc 1 LUT | 545 | FY12
FDE | 545 | FY14
PD LUT | 485 | FY14
FDE | 3285 | FY16
Inc 1 IOT&E | 3285 | FY
TRAINING
The SPAW prime contractor is required to provide individual and crew training and instructional materials to government operator/maintainer personnel. Collective task training for test participants will be conducted by TRADOC along with the contractor as required.
4.5 T&E FUNDING SUMMARY
Funding requirements by fiscal year are outlined in Table 4-5 (omitted for training purposes). Currently, there are no identified funding shortfalls. ATEC has provided cost estimates for developmental testing based upon the Test Related GFE requests and the agreed upon Consolidated Test Asset Requirements Matrix. These estimates will be revised and incorporated in program budget justification at each major Planning, Programming, Budgeting, and Execution System update. The Army will ensure the T&E program is resourced as required.

220 THIS PAGE INTENTIONALLY LEFT BLANK 220

221 Lesson 6.2 T&E Requirements 221 T&E Requirements

222 THIS PAGE INTENTIONALLY LEFT BLANK 222

223 T&E Requirements The following continuous learning modules apply to this lesson: - CLR252 Developing Requirements - CLL003 Supportability T&E - CLE062 Human Systems Integration Definitions Capability - The ability to execute a specified course of action under specified conditions and level of performance. Capability Gap Assessment (CGA) -- A deliberate assessment of the Future Years Defense Program that reviews CCMD IPLs and other issues and perspectives from the Services and other DoD Components, relative to fielded materiel and nonmateriel capability solutions, and development efforts that may already be underway to address capability gaps. DOTmLPF-P = Doctrine, Organization, Training, materiel, Leadership & education, Personnel, 2 Facilities, and Policy CJCSI I, Jan,

224 3 Departmental Process Interactions 4 224

225 Requirements Tradeoffs Finding the balance between: CCMD near-term requirements to support CONPLANs and current missions and Services long range vision & investment plans Versatile, joint systems and Systems optimized for service missions Growing demands and Fiscal & political constraints Geographic specificity and Worldwide applicability Ambitious requirements and Achievable acquisition strategy Quantity matters and Quality (High-end capabilities) COST and PERFORMANCE (acceptable risk) 5 10 USC 181 (JROC) 225

226 Process Lanes 7 JCIDS Guidance Documents 8 226

227 Requirements Acquisition Interaction 9 Desired End-State Take the Lead in Shaping the Force Back to JROC and CJCS Title 10 Debate the difficult issues and make difficult choices earlier Better upfront fidelity on cost/schedule/performance tradeoffs More analytic rigor and risk/portfolio analysis Stronger emphasis on prioritizing capability requirements Better end-to-end traceability to facilitate decision making: Missions Requirements Acquisition and DOTmLPF-P Budget. More dynamic/iterative process throughout a program s lifecycle. (Revisit as necessary strategy shifts, threat changes, etc.) Make the difficult choices throughout the requirements continuum

228 2015 Changes (part 1 of 2) Consolidated Guidance: CJCSI (JROC Charter), CJCSI (JCIDS), and the JCIDS Manual are still the core products CJCSI (Intelligence Certification), CJCSI (Net-Ready KPP), and JWSTAP Charter (Weapon Safety Endorsement) cancelled, with content absorbed into the three core documents Significant revision of Intelligence Certification content Roles/Responsibilities: Expanded guidance for stakeholder roles/responsibilities in CJCSI 5123 Developing Requirements: Refined CBA guidance Focus on leveraging DODAF to streamline development activities ICD Attributes: Initial Objective Values vice Minimum Values Enable more robust leverage of S&T efforts to satisfy requirements Introduces the Capability-Mission Lattice as a framework for traceability to operational missions Increased focus on ensuring attributes are measurable and testable Changes (part 2 of 2) Documents: Streamlines document formats (starts with June 2012 alternate formats) Extends IT Box construct to IS CDD Aligns affordability sections of CDDs/CPDs with new DODI Content/Endorsement guides for Mandatory KPPs, Weapon Safety endorsement, DOTmLPF-P endorsement, and Intelligence Certification Requires validation page to be combined with JCIDS documents Staffing: Merges JSDs of Joint Information and Independent Integrates Common gatekeeping with DCMO for Defense Business Systems Enhances guidance for submission and review of higher classification documents/issues, including SAP/SAR and ACCM Portfolio Management: Consolidates post validation processes and prioritization guidance into the portfolio management guidance

229 Joint Capabilities Integration and Development System (JCIDS) Changes
January, 2015 JCIDS instruction (CJCSI I) & JCIDS Manual continue many changes started in 2012:
- Expedited staffing & validation procedures for urgent operational needs
- A joint prioritization process
- A pre-Milestone A review of the AoA study, and cost, schedule, performance recommendations to the MDA
- Clear formats for requirements documents, and directed page limits
- Draft CDD due prior to Milestone A
JCIDS docs are on student CD-ROM 13
Five Categories of JCIDS Documents
1. ICD - documents new capability requirement(s)/gap(s); and the intent to address capability gap(s) with a non-materiel and/or materiel solution
2. Joint DCR - documents the intent to address capability requirement(s)/gap(s) with a non-materiel solution, recommending changes in one or more DOTmLPF-P areas.
3. CDD - defines KPPs, KSAs, and performance attributes necessary to design systems & establish programmatic baselines.
4. CPD - defines KPPs, KSAs, and performance attributes for the acquisition program's Production and Deployment phase
5. UON, JUON, or JEON - documents capability requirement(s)/gap(s) which would result in unacceptable loss of life or critical mission failure. Expedited procedures are used to initiate rapid acquisition efforts

230 CDD Validation & RFP Release Decisions (DoDI )
CDD Validation Decision is the point at which major cost & performance trades have been completed, and enough risk reduction has been achieved to commit to the requirements that will be used for preliminary design
All non-KPP requirements (when delegated by the requirements validation authority) are subject to cost-performance trades & adjustments to meet affordability constraints
CDD validation precedes the Development RFP Release Decision Point
Development RFP Release Decision is extremely important (so much is based on the EMD RFP) 15
DoDI Hardware Intensive Model

231 Key Performance Parameters (KPPs) Those attributes or characteristics of a system that are considered critical or essential to the development of an effective military capability Expressed in Thresholds & Objectives Based on analytical efforts & studies used to develop an ICD Few in number (generally eight or fewer) Must be testable CDD KPPs are included, verbatim, in APB 17 Performance Thresholds / Objectives Thresholds & objectives may change whenever capabilities documents are issued or updated Changes may occur because of: New / revised war fighting needs New threats or technology Inability to attain existing requirements Cost or schedule constraints Safety issues, or to correct deficiencies Impact of requirements changes on T&E: Changes to test planning, scenarios, test and/or analysis methods Revisions to TEMP & other documents New instrumentation, infrastructure, threat systems, or other resources may be required Cost & schedule growth; or reduction of other planned testing to accommodate the new testing 231

232 JCIDS Publications & KPPs
Up to 6 Mandatory Key Performance Parameters (KPPs):
1. Force Protection (FP-KPP)
2. System Survivability KPP (JCIDS Manual 12FEB2015)
3. Sustainment KPP
4. Net-Ready (NR-KPP)
5. Training KPP
6. Energy KPP
Sponsors must address the mandatory KPPs in all CDDs and CPDs. In cases where a KPP is not appropriate, the Sponsor shall justify why the KPP is not appropriate. 19
Mandatory KPPs (JCIDS MANUAL, Enclosure D)
Force Protection KPP Required for manned systems or systems enhancing personnel survivability, when systems will be used in an asymmetric threat environment.
System Survivability KPP Required for all manned systems. Selectively applied for unmanned systems.
Sustainment KPP and two mandatory supporting KSAs (Reliability and O&S Cost) Required for all ACAT I programs.
Net-Ready KPP (NR-KPP) Required for all systems used to enter, process, store, display or transmit DoD information, regardless of classification or sensitivity.
Training KPP Required for all ACAT I programs.
Energy KPP Required for all systems where the logistics supply chain for energy may be involved/impacted

233 Mandatory Navy KPPs
Mandatory Navy KPPs (see Aug 2013 memo on student CD)
Cost KPP (procurement cost per unit)
Schedule KPP (IOC date is the threshold)
Space, Weight, Power and Cooling Margins KPP
Required for platforms that carry payloads, such as weapons or modular sensors
Ensure platform's ability to accommodate evolving payloads over its life
Goal is to improve Navy's focus on attributes that will be important in current / emerging fiscal & security environment 21
Key System Attribute (KSA)
An attribute or characteristic considered crucial in support of achieving a balanced solution/approach to a KPP or some other key performance attribute deemed necessary by the sponsor
KSAs provide decision makers with an additional level of capability performance characteristics below the KPP level
KSAs require a sponsor 4-star, Defense agency commander, or Principal Staff Assistant to change
Performance attributes of a system considered important to achieving a balanced solution/approach to a system, but not critical enough to be designated a KPP. 22

234 System Threat Assessments (DoD , Encl. 1) Capstone Threat Assessment: Serve as the analytical foundation for STARs Project technology & adversary capability trends over next 20 yrs. Maintained by the responsible production center, & updated every 2 years. Validated by DIA Initial Threat Environment Assessment: Required for anticipated ACAT I & IA programs Supports the MDD and AOA Provides ability to assess mission needs & capability gaps System Threat Assessment Report (STAR): For ACAT ID & IAM programs, STAR is validated by the DIA. All other programs, STAR is validated by the DoD component. MDAP, MAIS, and programs on DOT&E oversight list require a unique, system specific STAR. T&E threat representation is based on the STAR STAR is typically validated for only a few years 23 System Threats Impact of evolving threats on T&E: May lead to changes in test procedures threat equipment, CONOPS changes, etc. System requirements may change to deal with emerging/changing threats In making the Full Rate Production Decision or the Full Deployment Decision, the MDA will consider any new validated threat environments that might affect operational effectiveness, and may consult with the requirements validation authority as part of the decision making process to ensure that capability requirements are current. DoDI Par 5d(12)(a)

235 Responsibilities The Sponsor (Service/DoD Component) is responsible for preparation of the ICD & CDD/CPD; and usually conducts the AoA Sponsor (Combatant Command/DoD Component) also prepares UONs, JEONs, JUONs The intelligence community is responsible for preparing the STAR The PM is responsible for translating the ICD & CDD capabilities/attributes (requirements) into a system design & specs (technical performance specifications) 25 Role of T&E In Capabilities Analysis As Member of the Acquisition IPT: Advise on the testability of capabilities Advise on the risk of testing attributes Determine if STAR threat can be portrayed As Member of the T&E WIPT: Prepare the Test & Evaluation Master Plan (TEMP), and Integrated Master T&E Schedule The TEMP supports the SEP The TEMP also supports (and must be consistent with) contracting docs such as the RFP & SSP

236 Requirements & Test Perspective
(Chart) DT perspective: System Performance Specification; System / Item Detailed Specification; Verification Methods; CTPs; TPMs; MOPs; DT data elements; DT/OT data. Who does DT? - Government - Contractors
OT perspective: JCIDS CDD & CPD have KPPs, KSAs, and additional performance attributes; EOAs / OAs; Critical Operational Issues; Measures of Effectiveness; Measures of Suitability; MOPs; OT data elements; DT/OT data. Who does OT? - OTAs - MAJCOMs - Fielded Forces

Requirements & Test Terms
Term | Responsibility Of | Reference
KPP (Key Performance Parameter) | User | CJCSI 3170
KSA (Key System Attribute) | User | CJCSI 3170
Additional Performance Attributes | User | CJCSI 3170
CTP (Critical Technical Parameter) | T&E WIPT | DAG
TPM (Technical Performance Measure) | PM's Systems Engineers | DAG
COI (Critical Operational Issue) | User (Army) or OTA (USAF/Navy) | DAU Glossary & Service Docs
MOE (Measure of Effectiveness) | OTA | DAG
MOS (Measure of Suitability) | OTA | DAG
MOP (Measure of Performance) | OTA | DAG

237 Development of CTPs
CTPs should focus on critical design features or risk areas that, if not achieved, will preclude delivery of required operational capabilities
Examples of CTPs: technical maturity, or RAM issues
CTPs will likely evolve/change as the system matures during EMD
Evaluation of CTPs is important in projecting system maturity; and in determining whether the system is on schedule, and will likely achieve operational capabilities
CTPs provide a basis for entry or exit criteria for DT phases
CTPs unresolved prior to LRIP must have an action plan to resolve them prior to FRPDR
Chief Developmental Tester coordinates the CTP process (along with the Chief or Lead Systems Engineer, OTA, and SMEs as needed)
Paraphrased from DAG
Dendritic Approach to T&E
(Chart) Generic structure: a Test Objective leads to a Critical Issue, supported by MOE 1 through MOE N, each supported by MOP 1 through MOP N, each supported by Data Element 1 through Data Element N.
Radar example: Test Objective - Determine the effectiveness of the radar X for the CAP mission. Critical Issue - Is radar X satisfactory for the CAP mission? MOEs - Detection capability (Pdet); Tracking capability. MOPs - Mean and variance of detection range; Mean and variance of the number of false targets per scan; % time operators judged operation of controls & displays satisfactory. Data elements - Measured/observed detection range; Operator's comments; Observation of false alarms; Observation of scans.
T&E WIPT is responsible for developing the evaluation framework
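Where a program wants to check dendritic traceability mechanically, the evaluation framework can be captured as a simple data structure. The sketch below is a minimal illustration only; the classes, names, and radar-style entries are hypothetical and are not drawn from the DAG or the course material. It simply shows one way to record COI-to-data-element linkage so gaps can be flagged early.

```python
# Illustrative sketch only: a simple way to capture a dendritic evaluation
# framework (COI -> MOE/MOS -> MOP -> data elements) so traceability can be
# checked mechanically. Names and values are hypothetical, not from the TEMP.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MOP:
    name: str
    data_elements: List[str]          # lowest-level data needed to compute the MOP
    supports_kpp_or_ctp: str = ""     # optional traceability tag (e.g., "KPP: Combat Radius")

@dataclass
class Measure:                        # an MOE or MOS
    name: str
    mops: List[MOP]

@dataclass
class COI:
    question: str
    measures: List[Measure] = field(default_factory=list)

# Hypothetical radar-style example mirroring the dendritic chart above
coi = COI(
    question="Is radar X satisfactory for the CAP mission?",
    measures=[
        Measure("Detection capability (Pdet)", [
            MOP("Mean/variance of detection range",
                ["Measured detection range", "Observation of scans"]),
            MOP("Mean/variance of false targets per scan",
                ["Observation of false alarms", "Observation of scans"]),
        ]),
    ],
)

def incomplete_mops(c: COI):
    """Flag MOPs that lack the data elements needed to evaluate them."""
    return [m.name for meas in c.measures for m in meas.mops if len(m.data_elements) < 1]

print(incomplete_mops(coi))   # [] -> every MOP has at least one data element
```

A check like this is only bookkeeping, but it mirrors what the T&E WIPT does by hand when it builds the evaluation framework: every critical issue should bottom out in data elements that a test event can actually collect.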

238 Operational Effectiveness The overall degree of mission accomplishment of a system when used by representative personnel in the environment planned or expected for operational employment of the system considering: Organization.....Doctrine...Tactics Survivability.....Vulnerability Threat (Including countermeasures, initial nuclear weapons effects, and NBC contamination) Mobility Lethality etc. 31 Operational Suitability The degree to which a system can be placed satisfactorily in field use, with consideration being given to: Reliability, Maintainability, Availability Compatibility, Interoperability, Integration Safety, Human Factors Transportability, Logistics Supportability Manpower Supportability, Documentation & Training Requirements Wartime Usage Rates Natural Environmental Effects and Impacts Note: COMOPTEVFOR Suitability COIs RAM, & Logistics Supportability
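As one concrete illustration of how the suitability factors above are quantified, operational availability is often computed directly from uptime and downtime over the period of interest. The sketch below uses the standard textbook form of that ratio with made-up hours; it is not drawn from the course slides.

```python
# Illustrative sketch only (numbers are hypothetical): operational availability,
# one common way reliability and maintainability roll up into a single
# quantitative suitability measure.
def operational_availability(uptime_hours: float, downtime_hours: float) -> float:
    """Ao = uptime / (uptime + downtime) over the period of interest."""
    return uptime_hours / (uptime_hours + downtime_hours)

print(round(operational_availability(uptime_hours=900.0, downtime_hours=100.0), 2))  # 0.9
```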

239 Tuesday Night Homework Read the SPAW Milestone B TEMP Note: This is the J-SPAW TEMP, submitted in support of Milestone B All students should read the ENTIRE TEMP When you analyze the TEMP, consider whether the DT, OT, LFT&E and resources are appropriate? the TEMP provides enough information to address Milestone B, Milestone C, FRP-DR, and beyond? Take notes on any TEMP mistakes / discrepancies Analyze the TEMP (for errors in content, format, etc.) In the TEMP Exercise, each team will prepare and present a briefing to the class on the results of their analysis Recommend you categorize the errors you found by where you found them (TEMP Part I, Part II, etc.) 239

240 THIS PAGE INTENTIONALLY LEFT BLANK 240

241 KPP, CTP, COI, and MOE/MOS Exercise Lesson 6.3 KPP, CTP, COI, and MOE/MOS Exercise 241

242 THIS PAGE INTENTIONALLY LEFT BLANK 242

243 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 6, Part 3 Lesson Title KPP, CTP, COI, and MOE/MOS Exercise Lesson Time Terminal Learning Objective 2.0 hours Given a scenario and DoD policy, the student will develop Test and Evaluation Master Plan (TEMP) content. (TLO #5) Enabling Learning Objectives Given a scenario, assess whether the capability requirements are well defined, can be measured and/or assessed during testing, and are relevant to the operational mission. (ELO #5.1) Given a scenario, assess whether planned tests support the test objectives / system requirements; and whether data collected will support established effectiveness, suitability, and survivability metrics. (ELO #5.2) Given a scenario, develop Critical Technical Parameters (CTPs), Measures of Effectiveness (MOEs), Measures of Suitability (MOSs), and data requirements to support assessment and evaluation of system performance requirements, Key Performance Parameters (KPPs), Key System Attributes (KSAs), and Critical Operational Issues (COIs). (ELO #5.11) Assignments As part of the pre work, students have already read the SPAW CDD (prior to this exercise). Assessment Class participation; oral presentation. Related Lessons Lesson 3: T&E Early Planning 243

244 TST 204 Intermediate Test & Evaluation
Self Study References
DoDD 5000.01, The Defense Acquisition System
DoDI 5000.02, Operation of the Defense Acquisition System
Defense Acquisition Guidebook, Chapter 9 244

245 KPP, CTP, COI, and MOE/MOS Exercise MDD CDD Validation Dev. RFP Release A B C FRP IOC FOC Materiel Solution Analysis Tech Maturation & Risk Reduction Engineering and Manufacturing Development Production & Deployment Operations & Support CDD YOU ARE HERE Note: Some student groups will work on part 1 of the exercise, while other student groups work on part 2. 1 This lesson will cover the following topics: 1. Exercise Part 1 2. Exercise Part 2 Lesson Topics DT&E activities will start when requirements are being developed to ensure that key technical requirements are measurable, testable, and achievable. DoDI , Encl. 4 par 4a 2 245

246 Lesson Objectives Given a scenario, assess whether the capability requirements are well defined, can be measured and/or assessed during testing, and are relevant to the operational mission. Given a scenario, assess whether planned tests support the test objectives / system requirements; and whether data collected will support established effectiveness, suitability, and survivability metrics. Given a scenario, develop Critical Technical Parameters (CTPs), Measures of Effectiveness (MOEs), Measures of Suitability (MOSs), and data requirements to support assessment and evaluation of system performance requirements, Key Performance Parameters (KPPs), Key System Attributes (KSAs), and Critical Operational Issues (COIs). 3 KPP, CTP, COI & MOE/MOS Exercise Part 1 Lesson Topics: 1) Exercise Part 1 2) Exercise Part 2 Note: Some student groups will work on part 1 of the exercise, while other student groups work on part

247 KPP, CTP, COI & MOE/MOS Exercise (Part 1) Given: Acquisition documents (TES, Draft CDD) & key operational, technical, & programmatic reqts. Objective: Develop Critical Technical Parameters (CTPs) Overview (4 tasks) Task 1. Identify and build a list of SPAW top-level functions. Task 2. Allocate capabilities (e.g. requirements from the CDD) for three of the SPAW functions ID d. Task 3. From the acquisition documents, select (4) KPPs and develop (4) CTPs. (The CTPs must be related to the KPPs, and must be different than the KPPs.) Task 4. Present a briefing to the class. 5 Task 1. Identify Functions & Build a List of Top Level Functions Figure 1. Sample List of Functions for UAV System. mobility UAV System Functions (partial) communicate sensing survivability maintenance training firepower FUNCTION: What must the system do? (Normally expressed as a verb.) 6 247

248 Task 2. Allocate Capabilities For Three Functions
Example Functions and Capabilities, UAV System
UAV Functions and Capabilities
Function: Mobility. Capability: 1. Travel at not less than 100 miles per hour 2. Conduct reconnaissance from an altitude of 5000 feet AGL 3. Have a combat radius of 75 miles
Function: Communicate. Capability: 1. Transmit images of battlefield with resolution of 1 meter 2. Receive flight control commands from the ground 3. Travel to commanded geographical location
Function: Firepower. Capability: 1. Deliver 500-pound warhead with Circular Error Probable (CEP) of 10 meters or less 2. Have weapon reliability of 0.95 or higher. 3. Carry all air deliverable ammunition.
Capability: How well; in what environment; under what conditions; interfacing with. 7
Task 3. Select KPPs
Key Performance Parameters
* Combat Radius. The Unmanned Aerial Vehicle must be capable of operating with a Combat Radius of 75 nautical miles (T) 125 nautical miles (O). This KPP is critical because there is a requirement to provide surveillance at a range that is beyond the engagement range of the major enemy threat.
* Second KPP. Third KPP. Fourth KPP. 8 248
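Because each capability must be measurable and testable, a KPP written with threshold (T) and objective (O) values can be scored directly against a demonstrated test result. The sketch below is a hypothetical illustration using the combat radius numbers above; the rating labels and the 98 nm demonstrated value are assumptions for the example, not course material.

```python
# Illustrative only: scoring a KPP expressed as threshold (T) / objective (O)
# values against a demonstrated test result. Values echo the UAV example above.
def rate_kpp(demonstrated: float, threshold: float, objective: float) -> str:
    """Return a simple rating for a 'larger is better' performance attribute."""
    if demonstrated >= objective:
        return "meets objective"
    if demonstrated >= threshold:
        return "meets threshold"
    return "below threshold"

# Combat Radius KPP: 75 nm (T), 125 nm (O); 98 nm is a made-up demonstrated value
print(rate_kpp(demonstrated=98.0, threshold=75.0, objective=125.0))  # meets threshold
```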

249 Task 3. Develop CTPs Critical Technical Parameters * Fuel Consumption. Fuel consumption will not exceed eight gallons per hour (T). This CTP is critical because fuel consumption and fuel capacity directly impact the range, payload available, and operating cost of the system. * Airspeed. The UAV will be capable of cruising at no less than 100 nautical miles per hour. This CTP is critical to ensure that accurate surveillance at the maximum combat radius is available to ensure acceptable reaction time for friendly troops from the XXX enemy weapon system. Third CTP. Fourth CTP. Note: each CTP must support one of the KPPs; and each CTP must be different than the KPPs. 9 DAG Defining CTPs: T&E programs will have hundreds or thousands of technical parameters needing capture to support data analysis and evaluations; however, every technical parameter is not a CTP. CTPs measure critical system characteristics that, when achieved, enable the attainment of desired operational performance capabilities in the mission context. CTP do not simply restate the KPPs and/or KSAs. Each CTP must have a direct or significant indirect correlation to a KPP and/or KSA that measures a physical characteristic essential to evaluation of the KPP or KSA. Note: some services and programs do this differently 249
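To see how CTPs of this kind can be sanity-checked against the KPP they support, the short sketch below runs the arithmetic for the fuel consumption and airspeed examples above. It is illustrative only: the 20-gallon usable fuel capacity and the no-reserve, out-and-back mission profile are assumptions made for the example, not values from the course material.

```python
# Illustrative back-of-the-envelope check (not from the course material) of how
# the fuel-consumption and airspeed CTPs above trace to the Combat Radius KPP.
# The 20-gallon usable fuel capacity is a hypothetical assumption.
fuel_capacity_gal = 20.0        # ASSUMED usable fuel, gallons
burn_rate_gph     = 8.0         # CTP: fuel consumption threshold, gal/hr
cruise_speed_kts  = 100.0       # CTP: cruise speed, nautical miles/hr

endurance_hr = fuel_capacity_gal / burn_rate_gph          # 2.5 hr
combat_radius_nm = cruise_speed_kts * endurance_hr / 2.0  # out-and-back, no reserve

print(f"Endurance: {endurance_hr:.1f} hr, radius: {combat_radius_nm:.0f} nm")
# -> Endurance: 2.5 hr, radius: 125 nm
```

Under those assumptions the two CTPs are consistent with the 75 nm (T) / 125 nm (O) Combat Radius KPP; a smaller tank or a higher burn rate would show up immediately as a shortfall, which is exactly the kind of early insight a well-chosen CTP is meant to provide.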

250 Mandatory KPPs
Up to 6 Mandatory Key Performance Parameters (JCIDS Manual, Encl. B):
1. Force Protection (FP-KPP)
2. Survivability KPP
3. Sustainment KPP / Availability KPP
4. Net-Ready (NR-KPP)
5. Training KPP
6. Energy KPP
For systems where the above KPPs aren't mandatory, they are selectively applied (sponsor determines applicability)
One team will have exercise questions that pertain to the Mandatory KPPs, for the SPAW system
Exercise - Part 1 Timeline
Task 1. Identify and build a list of top level functions.
Task 2. Allocate capabilities for three functions.
Task 3. From the acquisition documents, select (4) KPPs and develop (4) CTPs; one CTP for each KPP selected. Note: each CTP must support one of the KPPs, and each CTP must be different than the KPPs.
Task 4. Present your results to the class in a briefing. You will have 45 minutes to prepare your briefing.
Note: Some student groups will work on part 1 of the exercise, while other student groups work on part 2.

251 KPP, CTP, COI & MOE/MOS Exercise Part 2
Lesson Topics: 1) Exercise Part 1 2) Exercise Part 2
Note: Some student groups will work on part 1 of the exercise, while other student groups work on part 2.
KPP, CTP, COI, and MOE/MOS Exercise (Part 2)
Given: COIs assigned by the instructor
Objective: Identify & develop MOE, MOS, MOP & Data Elements
Overview (5 tasks):
Task 1. Using the (2) COIs assigned by the instructor, identify (2) MOEs or (2) MOSs for each COI.
Task 2. Develop (2) MOPs for each MOE or MOS.
Task 3. Develop and identify at least (2) Data Elements required to fully assess each MOP.
Task 4. Highlight within the master Dendritic the KPPs and CTPs.
Task 5. Present your results to the class in a briefing.

252 Task 1. Link COI s with MOE / MOS Sample COI / MOE linkage for a Race Car (Effectiveness example shown) MOE Speed COI Can this car win the race? Acceleration Maneuverability Fuel Consumption 15 Task 2. Develop MOP for MOE & MOS Sample MOE and MOP for selected COI Select One COI for Effectiveness, One for Suitability that best address Risks MOP Maximum (Hot Day) MOE Speed Maximum (Cold Day) Avg (Straight Away) COI Can this car win the race? Acceleration Average (Lap) 0 to 100MPH 100 to 200MPH 200 to max MPH Steering (New Tires) Maneuverability Steering (Used Tires) Braking Fuel Consumption Max Speed Average

253 Task 3. Identify MOP Data Elements Sample Race Car MOP Data Elements MOP Data Elements Maximum (Hot Day) MOE Speed Maximum (Cold Day) Avg (Straight Away) COI Can this car win the race? Acceleration Average (Lap) 0 to 100MPH 100 to 200MPH Maneuverability 200 to max MPH Steering (New Tires) Steering (Used Tires) Braking Break out force Sensitivity Steering wheel force per G in turn Steering wheel mount position Fuel Consumption Max Speed Average 17 Task 4. Highlight KPPs & CTPs COI 1 COI 3 MOE 1 MOE 2 MOE 3 MOS 1 MOS 2 MOS 3 MOP 3: DATA 3, 9, 27, etc KPP MOP 4: DATA 4, 6 MOP 9: DATA 8, 12 CTP MOP 10: DATA 3,

254 Examples MOE, MOS, MOP, and Data Elements
(Chart) Effectiveness example: MOE (high level) - Mobility; MOP (lower level) - Speed on improved roads; Data elements (lowest level) - Time, Distance Traveled, Road Surface, Weather, etc.
Suitability example: MOS (high level) - Reliability; MOP (lower level) - MTBF; Data elements (lowest level) - Total System Time (the on time), Total # Failures, Types of Failures, Environmental Conditions, etc. 19
Exercise - Part 2 Timeline
Task 1. Using the (2) COIs assigned by the instructor, identify (2) MOEs or (2) MOSs for each COI
Task 2. Develop (2) MOPs for each MOE or MOS.
Task 3. Develop and identify at least (2) Data Elements required to fully assess each MOP.
Task 4. Highlight within the master Dendritic the KPPs and CTPs.
Task 5. Present your results to the class in a briefing. You will have 45 minutes to prepare your briefing.
Note: Some student groups will work on part 1 of the exercise, while other student groups work on part 2.
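The examples above roll raw data elements up into MOPs; the short sketch below shows that roll-up as two hypothetical calculations (average speed from time and distance, and MTBF from total on-time and failure count). All values are made up for illustration and are not part of the exercise materials.

```python
# Minimal illustrative sketch (values are made up): rolling raw data elements up
# into the two MOPs named above -- speed on improved roads and MTBF.
def avg_speed_mph(distance_miles: float, time_hours: float) -> float:
    """MOP: speed on improved roads, from the Distance Traveled and Time data elements."""
    return distance_miles / time_hours

def mtbf_hours(total_on_time_hours: float, total_failures: int) -> float:
    """MOP: Mean Time Between Failures, from Total System Time and Total # Failures."""
    return total_on_time_hours / total_failures

# Hypothetical test data
print(avg_speed_mph(distance_miles=120.0, time_hours=3.0))       # 40.0 mph
print(mtbf_hours(total_on_time_hours=500.0, total_failures=4))   # 125.0 hours
```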

255 Critical Operational Issues 1. Can the SPAW deliver accurate fire on the battlefield? (E) 2. Can the SPAW be rapidly inserted into worldwide theatres of operation? (S) 3. Is the SPAW survivable on the battlefield? (E) 4. Is the SPAW availability sufficient to meet wartime and peacetime needs? (S) 5. Is the SPAW mobility adequate to support wartime and peacetime operations? (E) 6. Does the SPAW logistics footprint adequately support wartime and peacetime operations? (S) Summary Given a scenario, assess whether the capability requirements are well defined, can be measured and/or assessed during testing, and are relevant to the operational mission. (ELO #5.1) Given a scenario, assess whether planned tests support the test objectives / system requirements; and whether data collected will support established effectiveness, suitability, and survivability metrics. (ELO #5.2) Given a scenario, develop Critical Technical Parameters (CTPs), Measures of Effectiveness (MOEs), Measures of Suitability (MOSs), and data requirements to support assessment and evaluation of system performance requirements, Key Performance Parameters (KPPs), Key System Attributes (KSAs), and Critical Operational Issues (COIs). (ELO #5.11)

256 Template - Students can use this for Part 2 of the Exercise, if they want COI MOE or MOS MOE or MOS MOP MOP MOP MOP Two Data Elements Two Data Elements Two Data Elements Two Data Elements

257 KPP, CTP, COI, and MOE/MOS Exercise Student Exercise Questions Part 1A (One team will complete Part 1A of the Exercise): Task 1. Identify (make a list) of top level functions for the SPAW. Task 2. Allocate capabilities for three functions. Task 3. Select four (4) Key Performance Parameters (KPPs) and develop four (4) Critical Technical Parameters (CTPs); one CTP for each KPP selected. Note: Each CTP must support one of the KPPs; and each CTP must be different than the KPPs. CTPs DON T need to come from the Capability Development Document (CDD). (You are NOT limited to what is written in the CDD Please think about things that are NOT in the CDD.) Task 4. Present your results to the class in a briefing. Part 1B (One team will complete Part 1B of the Exercise): Enclosure B of the January, 2012 JCIDS Manual lists six mandatory or selectively applied KPPs Force Protection KPP, Survivability KPP, Sustainment KPP/Availability KPP, Net Ready KPP, Training KPP, and Energy KPP. Enclosure B of the January, 2012 JCIDS Manual is on your student CD ROM. The SPAW draft CDD is in this section of your student book. Task 1. For each of the six mandatory/selectively applied KPPs Is that KPP currently required (as a KPP) in the SPAW draft CDD? If there is currently a KPP for the SPAW, does that KPP cover what is required by Enclosure B of the JCIDS Manual? Task 2. For each of the six mandatory/selectively applied KPPs According to the criteria given in Enclosure B of the JCIDS Manual, should that KPP be required (as a KPP) in the SPAW draft CDD? Why or why not? Task 3. Select four (4) Key Performance Parameters (KPPs) from the existing KPPs in the SPAW draft CDD and develop four (4) Critical Technical Parameters (CTPs); one CTP for each KPP selected. Note: Each CTP must support one of the KPPs; and each CTP must be different than the KPPs. CTPs DON T need to come from the Capability Development Document (CDD). (You are NOT limited to what is written in the CDD Please think about things that are NOT in the CDD.) Task 4. Present your results to the class in a briefing. 257

258 Part 2 (Three teams will complete Part 2 of the Exercise):
Task 1. Using the two (2) Critical Operational Issues (COIs) assigned by the instructor, identify two (2) Measures of Effectiveness (MOEs) or two (2) Measures of Suitability (MOSs) for each COI.
Task 2. Develop two (2) Measures of Performance (MOPs) for each MOE or MOS.
Task 3. Develop at least two (2) Data Elements that are needed to assess each MOP.
Task 4. Highlight the KPPs and CTPs within your dendritics.
Task 5. Present your results to the class in a briefing.
Note: You are NOT limited to what is written in the CDD. Please think about things that are NOT in the CDD.
Critical Operational Issues (COIs) for Part 2 of the exercise:
1. Can the SPAW deliver accurate fire on the battlefield?
2. Can the SPAW be rapidly inserted into worldwide theatres of operation?
3. Is the SPAW survivable on the battlefield?
4. Is the SPAW availability sufficient to meet wartime and peacetime needs?
5. Is the SPAW mobility adequate to support wartime and peacetime operations?
6. Does the SPAW logistics footprint adequately support wartime and peacetime operations?

259 FOR TRAINING USE ONLY
UNCLASSIFIED
DRAFT CAPABILITY DEVELOPMENT DOCUMENT FOR
Self-Propelled Artillery Weapon (SPAW)
Increment: 1
ACAT: IC
Validation Authority: JROC
Approval Authority: JROC
Milestone Decision Authority: US Army Service Acquisition Executive
Designation: JROC Interest
Prepared for Milestone B Decision
Date: 1 September 2006
[Note: This document was designed for classroom exercise purposes only. Format reflects current acquisition policy. Details related to weapons systems and threats are not factual.]

260 Executive Summary (omitted)
Revision History (omitted)
Table of Contents (omitted)
Points of Contact (omitted)
FOR TRAINING USE ONLY
Self-Propelled Artillery Weapon (SPAW)
1. Capability Discussion. A recent capability gap has emerged in the area of counter-battery fire; therefore, procurement of the Self-Propelled Artillery Weapon has become essential to the US maneuvering forces. Recent developments and proliferation of threat artillery have rendered US artillery forces extremely vulnerable. The Russian MSTA-X 152 mm self-propelled howitzer and the Chinese PLZ05-X 152 mm self-propelled howitzer are both projected to have a maximum range in excess of 32,000 meters. This performance puts US artillery forces at a distinct disadvantage. The range of the US M109A1 howitzer is 25,000 meters. With upgrades the M109 can reach ranges of 30,000 meters. These weapons will still be vulnerable to enemy counter-battery fires. Development and fielding of the SPAW will return superiority of medium-range artillery weapons to the US.
1.1. Applicable Initial Capabilities Document (ICD). Next Generation Fire Support ICD, March 2004 (see Appendix C).
1.2. Range of Military Operations. The SPAW weapon system directly impacts and is an integral part of three classes of military operations. They are discussed in the following sections: Conventional Warfare (discussion omitted), Strikes (discussion omitted), and Unconventional Warfare (discussion omitted).
1.3. Applicable Joint Concepts. The SPAW is highly suitable for acquisition as a joint weapon system. Both the US Army and the US Marine Corps employ medium-range artillery weapons. The capability gap discussed above applies to both services. The SPAW directly applies to the following joint functional, operational, and integrating concepts as identified in the previously published Next Generation Fire Support Initial Capabilities Document (ICD) (see Appendix C).
Applicable Joint Functional Concepts (JFC). The SPAW is a significant component of the two JFC discussed below (detailed discussion omitted): Force Application (omitted); Protection (omitted).

261 FOR TRAINING USE ONLY Applicable Joint Operational Concepts (JOC). The SPAW is a major component of the four JOC discussed below. (detailed discussion omitted) Major Combat Operations (omitted) Stability Operations (omitted) Strategic Deterrence (omitted) Homeland Security (omitted) Applicable Joint Integrating Concepts (JIC). The SPAW is a significant component to the two JIC discussed below. (detailed discussion omitted) Joint Forcible Entry (omitted) Global Strike (omitted) 1.4. Operating environment. Since the SPAW will be deployed to worldwide operational areas; it must have the capability to be fully functional in all anticipated climatic conditions (hot, temperate, and cold). Expected operational deployment areas include climates from sub-arctic to tropical and terrain from desert to mountainous JCIDS related documents (omitted) 2. Analysis Summary. An Analysis of Alternatives (AoA) was conducted to determine the optimum indirect fire weapon required to address the capability gap identified in Section 1. The AoA analysis was further extended to establish system Key Performance Parameters (KPP), system attributes, and criteria for each parameter. The AoA also incorporated the results of several ancillary studies done to determine the effectiveness of various weapon characteristics. The studies are included in Appendices A-D (omitted). The results of these analyses indicate that a new-design, self-propelled, 155 mm howitzer is required for the maneuver forces. The performance parameters are listed in Section CONOPS Summary. (1) The Army is the centerpiece of the nation's conventional capability for land warfare and conventional deterrence. The Army provides a unique contribution to the National Command Authority by providing land forces which are capable of decisively fighting and winning the nation's wars, engaging the process to promote peace and stability by being a rapidly deployable and credible land force, and by providing crisis response. (2) The Army's responsibility to satisfy 21 st Century requirements for effective full spectrum strategic responsiveness demands an improved capability for the rapid deployment of highly-integrated, combined arms forces possessing overmatching capabilities, exploiting the power of information and human potential, and combining the advantages of both light and mechanized forces, across the full range of military 261

262 FOR TRAINING USE ONLY operations. To support these objectives, there is a need for self-propelled artillery that can operate independently and from on the move, can receive a fire mission, compute firing data, select and take up its firing position, automatically unlock and point its cannon, fire and move out - all with no external technical assistance. (3) As with legacy self-propelled artillery units, it is anticipated that the SPAW will be deployed in battery-sized units of six howitzers. Each SPAW battalion will be attached to a division level force in order to provide timely, accurate indirect fires. Units will normally be located in firing platoon areas under the control of the platoon Fire Direction Center and with support centralized in a battery support area. Paired howitzers will operate in 1-kilometer diameter battle positions. However, howitzers may operate in individual battle positions. Tactical and technical fire control will be maintained through interface with the Battery Computer System (BCS), Tactical Fire Direction System (TACFIRE), and Advanced Field Artillery Tactical Data System (AFATDS). 4. Threat Summary. The SPAW will be employed worldwide, wherever U.S. interests are threatened. To this end, potential threat forces will be armed with various mixes of increasingly sophisticated weaponry. They will include small arms and automatic individual/crew served weapons, antitank (AT) weapons to include antitank guided missiles (ATGM), medium caliber cannon (20-75mm), hand held high explosive antitank (HEAT), and land mines. Regardless of its location on the battlefield, SPAW equipped forces will be threatened by indirect fire. As part of a digitized force, the SPAW will be subject to electronic warfare, threat information operations and directed energy. The SPAW will potentially operate in a nuclear, biological and chemical (NBC) environment, which could include weaponized agents, toxic industrial hazards and battlefield residues. (Further discussion omitted) (1) There are two primary threats to US medium range artillery systems. The most prolific system is the recently-fielded Russian MSTA-X 152 mm self-propelled howitzer. This system has a maximum range of 32,000 meters. It is being widely exported and current intelligence estimates place it in 10 countries. (Further discussion omitted) (2) The second threat system is the Chinese PLZ05-X 152 mm self-propelled howitzer. It has a maximum firing range of 31,500 meters. It has been fielded within the People s Liberation Army and has been exported to Viet Nam and North Korea. (Further discussion omitted) 5. Program Summary. This program has Joint potential or interest. A supportability strategy and system support packages will be developed and updated throughout the acquisition process. (1) Maintenance Planning. Logistics and maintenance support will be accomplished using a "replace forward, repair rear" concept, representing two levels of maintenance. When feasible, Weapon System Replacement Operations (WSRO) will be used to fully replace a system. 262

263 FOR TRAINING USE ONLY (2) Support Equipment. New or unique equipment to support SPAW will be kept to a minimum. (3) C4I/Standardization, Interoperability, and Commonality. SPAW will have the inherent capability to access information databases, in a "push/pull" mode. Information pulled from database(s) shall assist in planning and supporting military operations. It must support and interface with existing and emerging Army, Joint and Allied/Coalition C4I systems via the Joint Variable Message Format (JVMF). The transfer of battle command information shall be automated over tactical data and voice communications systems. (4) Electromagnetic Environmental Effects and Spectrum Supportability. All sheltered components of the SPAW shall be designed to be mutually compatible with other electric or electronic equipment within the system s expected operational electromagnetic sheltered environment. All spectrum dependent equipment must have a frequency supportability assessment conducted and conform to the frequency spectrum certified for Army use worldwide. (5) Future Development. The capability to fire on-the-move, while not a current requirement, is being considered as a priority for future development. Fire-on-themove, C4I upgrades, and improved range performance are three technology improvements that are planned for, as future incremental efforts. 6. Capabilities Required 6.1. Key Performance Parameters. The following system capabilities have been designated as Key Performance Parameters. They are summarized in Table 1. Discussion and rationale for each parameter follows. Note to students: Some of the mandatory / selectively applied Key Performance Parameters have been omitted (from this CDD), for training purposes. See the JCIDS manual, Enclosure B for more information on the mandatory / selectively applied KPPs. 263

264 FOR TRAINING USE ONLY
Table 1. Key Performance Parameter Summary (Development Threshold / Development Objective)
- Maximum Firing Range: 34,000 meters / 37,000 meters
- Net Ready: Support execution of critical Joint operational activities as identified in the system's integrated architecture products and satisfy the technical requirements for transition to Net-Centric military operations to include: 1) Implementation of GIG IT standards as identified in the SPAW TV-1, 2) Compliance with GIG KIPs as identified in the system's KIP declaration table / Threshold = objective
- Transportability: Transport by C-17 / Threshold = objective
- Survivability (Shoot & Move): Relocate 200 meters within 5 minutes / Relocate 200 meters within 3 minutes
- Force Protection: Crew survival from 152mm HE/HEDP detonating from xx meters (classified) / No crew incapacitation from 152mm HE/HEDP detonating from >xx meters (classified)
- Availability: Operational availability 0.85 / Operational availability 0.95
Maximum Firing Range. The maximum firing range of the howitzer shall be at least 34,000 meters (T) with a desired range of 37,000 meters (O).
Rationale. The current range of US artillery systems is 25,000 meters. Threat artillery ranges are approximately 32,000 meters. A firing range in excess of 32,000 meters is required to adequately defend against enemy counter-battery fires. A maximum range differential advantage over enemy artillery threats directly improves friendly artillery survivability.
Net Ready. The SPAW must support execution of critical Joint operational activities as identified in the system's integrated architecture products and satisfy

265 FOR TRAINING USE ONLY the technical requirements for transition to Net-Centric military operations to include implementation of Global Information Grid (GIG) Information Technology (IT) standards as identified in the SPAW Technical View (TV)-1, and compliance with GIG Key Interface Protocols (KIP) as identified in the system's KIP declaration table. (Further details omitted) Rationale. Artillery fires can be requested by a variety of US and NATO forces. Targeting data may be available from a variety of sensors and sources. The SPAW must be able to communicate with each potential communications node in the various DoD and NATO networks. The SPAW system must be interoperable with all information exchange networks employed by US and NATO forces. C4ISR should be based on the ABCS systems. The systems should be used/fully compatible with the appropriate ABCS systems such as ASAS, MCS, AFATDS, FBCB2 as well as the associated communications and positioning equipment (SINCGARS, MSE, EPLRS, NTDR, GPS) Transportability. The SPAW will be moved into theater primarily via C-5 and C-17. Intra-theater deployability will be by air (C-17), the vehicle itself, rail, sea and other ground transportation vehicles. The SPAW must be capable of rapid deployment/displacement to critical areas immediately upon landing/insertion and have the ability to rapidly relocate to meet emerging threats and to shape the battlefield. It must be capable of being transported by the C-17. Rationale. The current battlefield environment requires the ability to rapidly redeploy supporting artillery assets long distances in short times from remote locations. An Analysis of Alternatives (see Appendix C) has indicated that the C-17 airlift platform represents the optimum vehicle available for performing this function Survivability. The SPAW system must be able to break down after firing and move at least 200 meters with five minutes (T), with a goal of three minutes (O). Rationale. The SPAW system must have a capability to vacate a firing position quickly after firing, to avoid counter-battery fires Force Protection. The SPAW system must be able to protect the crew from the primary threat blast and explosive fragments from 152mm High Explosive (HE) and High Explosive Dual Purpose (HEDP) rounds impacting at a range of (classified) meters from the SPAW. The threshold is crew survival from 152mm HE and HEDP rounds detonating at a range of (classified) meters from the SPAW. The objective is to have no crew incapacitation from 152mm HE and HEDP rounds detonating at a range greater than (classified) meters from the SPAW. Rationale. The current threat analyses indicate that the primary threat is from enemy field artillery. The HE and HEDP rounds are the most lethal to the SPAW Availability. The SPAW shall have an operational availability of 0.85(T); 0.95 (O). Rationale. Studies conducted in support of the Functional Needs Analysis (reference omitted) focused on technical factors that drive operational suitability. Modeling and Simulation and war-gaming have indicated that the minimum acceptable operational availability for a medium howitzer is Analysis to date indicates that availability above 0.96 might not be cost effective. 265
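The Availability KPP above is stated as an operational availability (Ao) of 0.85 (T) and 0.95 (O). The sketch below is illustrative only and is not part of the draft CDD; it applies one common definition, Ao = uptime / (uptime + downtime), to invented evaluation-period hours.

```python
# Hypothetical check of the Availability KPP (threshold Ao = 0.85, objective Ao = 0.95).
# One common definition: Ao = uptime / (uptime + downtime); the hours below are invented.
uptime_hours = 1900.0    # time the howitzer was mission capable during the evaluation period
downtime_hours = 220.0   # maintenance, supply, and administrative downtime

ao = uptime_hours / (uptime_hours + downtime_hours)
print(f"Observed Ao = {ao:.3f}")                 # 0.896
print("Meets threshold (0.85)?", ao >= 0.85)     # True
print("Meets objective (0.95)?", ao >= 0.95)     # False
```

In practice the evaluation would also specify exactly which downtime categories count against Ao, since that definition drives the result as much as the raw hours do.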

266 FOR TRAINING USE ONLY
6.2. Additional Performance Attributes. The following system capabilities have been designated as significant performance attributes for the SPAW system. They are summarized in Table 2. Discussion and rationale for each attribute follows.
Table 2. Additional Performance Attributes (Development Threshold / Development Objective)
- Max firing rate: 4 rd/min for 5 min / 6 rd/min for 5 min
- Sustained rate of fire: 3 rd/min / 5 rd/min
- Speed: 25 mph on unimproved roads, 10 mph cross-country / 30 mph on unimproved roads, 15 mph cross-country
- Crew: 5 personnel / 3 personnel
- Reliability: Howitzer/ammunition combination = 0.98 / Combination = 0.99
- Safety: Not unduly hazardous / Threshold = objective
- Mean time between failure (automotive): 750 hours at 80% confidence level / 850 hours at 80% confidence level
- Intermediate mean repair time: Not exceed 2 hours / Not exceed 1.5 hours
- Environmental protection: Nuclear, biological, chemical protection for crew / Threshold = objective
- Supportability: Supportable using current logistic procedures in combat / Threshold = objective
- Fuel consumption: Not exceed 40 gallons per hour / Not exceed 35 gallons per hour
- Fording: Four feet of water without kit / 4.5 feet of water without kit
- Suspension system: Firing without using spades / Threshold = objective
- Cant limits: 10 degrees of cant without improving position / 15 degrees of cant without improving position
- Accuracy (at a range of XXX meters): CEP 40 meters / CEP 30 meters
Maximum Firing Rate. The howitzer will provide a maximum firing rate of at least four rounds per minute (T); six rounds per minute (O) for a five-minute period.
Rationale: Analysis has indicated that the maximum firing rate has been shown to be the second greatest contributor (just behind maximum range) to survivability when facing enemy counter-battery fire. Firing rate has also been shown to be one of the three most important contributors to a medium howitzer's effectiveness as a

267 FOR TRAINING USE ONLY weapon, along with range and accuracy. Four rounds per minute was a recommendation from the AoA Sustained Firing rate. The howitzer will provide a sustained rate of fire of three (T); five (O) rounds per minute. Rationale: Analysis has indicated that a rate of three rounds per minute represents the optimum compromise between weapon effectiveness and technological cost. Increasing the sustained rate of fire directly improves the weapon s mission effectiveness Speed. The howitzer must be capable of traveling on unimproved roads at speeds of 25 mph (T); 30 mph (O) and at 10 mph (T); 15 mph (O) when traveling cross country. Rationale: Current doctrine and supporting analysis has indicated that rapid mobility is required to support major combat operations on today s battlefield. War gaming analysis (see AoA) indicates that speeds of 25 mph/10 mph on unimproved roads and cross country are the minimum required for howitzers to maintain pace with the maneuver forces Crew. The weapon must be capable of being operated by a crew of no more than five (T); three (O) personnel. Rationale: Ergonomic studies were conducted in support of the Functional Needs Analysis (reference omitted). These studies have indicated that a crew of between three and five personnel is the optimum compromise between technology and manpower requirements Reliability. The reliability of the howitzer and ammunition combination shall be 0.98 (T); 0.99 (O). Rationale: Technical studies conducted in support of the Functional Needs Analysis (reference omitted) focused on technology and operational suitability. Modeling and Simulation and war gaming have indicated that the minimum acceptable reliability for a medium howitzer is Analysis to date indicates that reliability above 0.99 might not be cost effective Safety. The weapon shall not be unduly hazardous to the crew during loading and firing. (T=O) All safety hazards will be eliminated or reduced to an acceptable level of risk. The SPAW must not have any uncontrolled safety or health hazards that may adversely impact upon the health or safety of the operator, maintainer, trainer, or handler. A System Safety Assessment (SSA) will be completed as part of the design process to ensure the system is free from conditions which can cause death, injury, or illness to the target audience soldier. The SSA will be updated prior to each Milestone Decision Review. Rationale: Compliance with Systems Engineering principles contained in DoD requires a system design that provides a safe operating environment for the crew. DoD requires Human Factors Engineering to be a fundamental component of every weapon system design and development Mean Time Between Failures (MTBF). Mean time between failures (automotive) will be at least 750 hours (T); 850 hours (O) at an 80% confidence level. 267
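The reliability attribute above is levied on the howitzer/ammunition combination (0.98 T / 0.99 O). The sketch below is illustrative only and is not part of the draft CDD; it treats the howitzer and the round as independent elements in series, which is an assumption made here for illustration, and the individual allocations are invented.

```python
# Illustrative only: if the howitzer and the ammunition are treated as independent,
# series elements (an assumption, not a CDD statement), the combined mission
# reliability is the product of the two. The allocations below are invented.
r_howitzer = 0.995   # invented allocation for the howitzer
r_ammo = 0.990       # invented allocation for the round

r_combination = r_howitzer * r_ammo
print(f"Combined reliability = {r_combination:.4f}")    # 0.9850
print("Meets 0.98 threshold?", r_combination >= 0.98)   # True
print("Meets 0.99 objective?", r_combination >= 0.99)   # False
```

A calculation like this is one way a tester can see how much margin each element needs for the combined requirement to hold.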

268 FOR TRAINING USE ONLY Rationale: Analytical studies conducted in support of the Functional Needs Analysis (reference omitted) focused on technical factors that drive life cycle cost. MTBF was identified as a key driver in that influences operations and maintenance cost and also has a significant impact on operational suitability. Analysis has indicated that the minimum acceptable MTBF for a medium howitzer is 750 hours at a 80% confidence level Intermediate Mean Repair Time. Intermediate mean repair time shall not exceed 2 (T); 1.5 (O) hours. Easy access to prime power train components must be provided. Power pack and primary components should be capable of replacement by organizational maintenance with crew assistance using organic tools and equipment. Rationale: Analytical studies were conducted in support of the Functional Needs Analysis (reference omitted). Intermediate mean repair time was identified as a significant contributor to campaign level operational suitability. The analysis further indicated that the minimum acceptable intermediate level repair time was no greater than two hours Environmental Protection. The crew must be protected from nuclear, biological, and chemical (NBC) attack (T = O). Carry on-board items necessary to conduct immediate and operational decontamination as developed and defined in appropriate SPAW operational and technical manuals and conducted by crew within 15 minutes. Be operationally decontaminable to a level to prevent spread of contaminates to operators during critical functions. Reduce personnel MOPP levels (T) for personnel performing critical functions. Decontamination to negligible risk levels following thorough decontamination is not required, but desired (O). Rationale: Current doctrine calls for the maneuvering forces to be able to conduct military operations in an NBC environment Supportability. The system must be supportable using current logistic procedures in a combat situation (T = O). Diagnostic capability must be compatible with the Army Diagnostic Improvement Program. Rationale: A Logistic Support Analysis (LSA) for several weapon concepts was conducted in parallel with the Functional Needs Analysis. The LSA indicated that a new self-propelled howitzer must be capable of being supported through the existing logistic support system under combat conditions. Any significant deviations from this; e.g. new resupply/repair processes or support equipment, would make life cycle costs prohibitive Fuel Consumption. Fuel consumption will not exceed 40 (T); 35 (O) gallons per hour. Rationale: Design trade studies and modeling and simulation results recommend that forty gallons per hour is the maximum acceptable fuel consumption rate. These analyses used optimizing techniques that considered sizing of fuel tanks, vehicle gross weight, un-refueled range, and vehicle speed Fording. The howitzer will be capable of fording 4 (T); 4.5 (O) feet of water without a fording kit. 268
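The MTBF attribute above is stated with an 80% confidence level. One common way to compute a one-sided lower confidence bound on MTBF from a time-terminated test, assuming exponentially distributed failure times, uses the chi-square distribution. The sketch below is illustrative only and is not part of the draft CDD; the test hours and failure count are invented.

```python
# Illustrative only: one-sided lower confidence bound on MTBF for a
# time-terminated test, assuming exponentially distributed failure times.
# Test hours and failure count are invented; this is not SPAW test data.
from scipy.stats import chi2

total_test_hours = 3000.0   # accumulated automotive operating hours (hypothetical)
failures = 2                # relevant automotive failures observed (hypothetical)
confidence = 0.80           # matches the 80% confidence level in the attribute

# Standard time-terminated formula: MTBF_lower = 2T / chi2(confidence, 2r + 2)
mtbf_lower = 2 * total_test_hours / chi2.ppf(confidence, 2 * failures + 2)
print(f"80% lower bound on MTBF = {mtbf_lower:.0f} hours")       # about 701 hours
print("Demonstrates the 750-hour threshold?", mtbf_lower >= 750)  # False
```

The example shows why confidence statements matter for resource planning: a point estimate of 1,500 hours (3,000 hours / 2 failures) is well above 750, yet the 80% lower bound is not, so more test time would be needed to demonstrate the threshold at that confidence.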

269 FOR TRAINING USE ONLY Rationale: Operational analysis efforts considered a number of representative scenarios which were based on actual geographical data. Simulations were conducted which involved various river and stream crossing requirements. Results indicated that the ability to ford four feet of water without a kit was a significant contributor to operational effectiveness Suspension System. The howitzer suspension system will be such that it can be fired without using spades (T = O). Rationale: Operational analysis was conducted to identify those weapon characteristics which had the greatest effect on timeliness and accuracy of delivering supporting fires to friendly troops. One factor that emerged from these studies was the significance of the capability of the suspension system. Having a suspension system that did not rely on spades (e.g. does not need to be dug-in ) was a significant contributor to timeliness of artillery fires Cant Limits. The weapon will be capable of absorbing a maximum of 10 (T); 15 (O) degrees of cant without requiring improvement of the firing positions. Rationale: Operational analysis was conducted to identify those weapon characteristics which had the greatest effect on timeliness and accuracy of delivering supporting fires to friendly troops. A factor that emerged from these studies was the significance of the capability to fire from a non-level position. Analysis indicates that the ability to accommodate up to 10 degrees of cant significantly improves both timeliness and accuracy of artillery fires Accuracy. Accuracy of conventional munitions (M107 High Explosive projectile) fired by the howitzer shall be less than or equal to 40 meters (T); 30 meters (O) CEP, at a minimum firing range of XXX meters. Rationale: Operational analysis was conducted to identify those weapon characteristics which had the greatest effect on lethality and survivability. CEP of less than or equal to 40 meters was a recommendation from the AoA. 7. FoS/SoS. The SPAW will join previously fielded equipment which is fully integrated into the Advanced Field Artillery Tactical Data System (AFATDS). Additionally, it will be fully compatible with the appropriate ABCS systems such as ASAS, MCS, AFATDS, FBCB2 as well as the associated communications and positioning equipment (SINCGARS, MSE, EPLRS, NTDR, GPS). 8. IT and NSS Supportability (omitted) 9. Intel Supportability (omitted) 10. E3 and Spectrum Supportability. All sheltered components of the SPAW shall be designed to be mutually compatible with other electric or electronic equipment within the system s expected operational electromagnetic sheltered environment. All spectrum dependent equipment must have a frequency supportability assessment conducted and conform to the frequency spectrum certified for Army use worldwide. 269
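The accuracy attribute above is expressed as a CEP (circular error probable), the radius of a circle centered on the aim point that is expected to contain half of the impacts. The sketch below is illustrative only and is not part of the draft CDD; it uses the simple empirical estimate of the median radial miss distance, and the impact coordinates are invented.

```python
# Illustrative only: estimate CEP as the median radial miss distance from the
# aim point. Impact coordinates (meters, relative to the aim point) are invented.
import math
import statistics

impacts = [(12, -8), (-25, 30), (5, 18), (-40, -22), (33, 7),
           (-15, -35), (22, 41), (-9, 12), (48, -20), (-28, 16)]

radial_misses = [math.hypot(x, y) for x, y in impacts]
cep_estimate = statistics.median(radial_misses)

print(f"Estimated CEP = {cep_estimate:.1f} meters")      # about 35.9 m
print("Meets 40 m threshold?", cep_estimate <= 40)        # True
print("Meets 30 m objective?", cep_estimate <= 30)        # False
```

With only ten rounds this estimate is crude; an actual accuracy evaluation would fire many more rounds and typically fit a distribution to the impact data rather than rely on a raw median.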

270 FOR TRAINING USE ONLY 11. Assets required to achieve IOC. IOC requires 18 SPAW, 18 Field Artillery Ammunition Resupply Vehicles (FAASV), six battery command centers, one battalion fire direction center, and at least one maintenance/support vehicle. 12. Schedule and IOC/FOC Definitions. IOC is defined to be the fielding of the SPAW in one division artillery battalion and is projected for 1QFY12. FOC is projected for 2QFY Other DOTMLPF-P Considerations Doctrine (omitted) Organization (omitted) Training. Initial individual training will be conducted during new equipment training (NET) in conjunction with introduction of the SPAW during unit fielding and in the institution. The institution will train system operation and/or familiarization to initial entry and professional development officer and enlisted personnel course attendees. Collective training on the system will be conducted in the unit. The unit commander will be responsible for system proficiency through sustainment and transition training, and ensure training time and assets are available to train required tasks to standard. The SPAW System Training Plan (STRAP) details specifics. It is essential that all requisite training products be developed by the materiel developer consistent with the delivery of SPAW. Task analysis and training product development will be performed using the Automated Systems Approach to Training (ASAT) database software, with software and ASAT training provided as Government Furnished Equipment (GFE). Training products will be prepared in accordance with the TRADOC Systems Approach to Training (SAT) and TRADOC Regulation Materiel (omitted) Leadership and education (omitted) 13.6 Personnel. The development threshold for crew size is five personnel. The development objective is three crewmembers. The reliability, availability, MTBF, and supportability Performance Attributes (Section 6.2) are intended to reduce the overall required size of the maintenance force. As a minimum, no increase in total force structure for operators and maintainers will be required by introduction of the SPAW. Appropriate analysis will be conducted to identify force structure impacts, if any. Results of this analysis will be incorporated into a Manpower, Personnel, & Training Assessment (MPTA) to be submitted at each Milestone Decision Review (MDR) as part of the MANPRINT Integration Report (MIR) and considered at program reviews as appropriate Facilities (omitted) 13.8 Policy (omitted) 14. Other System Attributes 270

271 FOR TRAINING USE ONLY Design, risk, cost drivers. Significant program risk is involved in the development of a main gun capable of increased range and increase rate of fire. Three technical risk drivers have been identified as follows The development of full length tube cooling and an integral muzzle brake present minor technical issues that may delay program development Shock testing of the breech mounted laser igniter will be required due to increased loads and stresses on the breech Fatigue life testing of the breech closure assembly will be required to determine safe fatigue lives for the high-pressure components at the breech end of the cannon and breech assembly Security needs (omitted) 15. Program Affordability Program cost for 500 systems is $1.35 billion in constant FY06 dollars. One system consists of one SPAW and one FAASV. This includes an average unit production cost per vehicle of approximately $1.2M. Appendix A, ICD/CDD Crosswalk (omitted) Appendix B, Integrated Architecture Products (omitted) Appendix C, References Next Generation Fire Support ICD (omitted) SPAW Analysis of Alternatives (under separate cover) 271

272 FOR TRAINING USE ONLY
Appendix D, Acronym List
ABCS - Army Battle Command Systems
AFATDS - Advanced Field Artillery Tactical Data System
AMDWS - Air and Missile Defense Workstation
AoA - Analysis of Alternatives
ASAS - All Source Analysis System
ASAT - Automated Systems Approach to Training
AT - Anti-Tank
ATGM - Anti-Tank Guided Missile
BCS - Battery Computer System -or- Battle Command System
BCS3 - Battle Command Sustainment and Support System
C4ISR - Command, Control, Communications, Computers, Intelligence, Surveillance, & Reconnaissance
CDD - Capability Development Document
CJCSI - Chairman of the Joint Chiefs of Staff Instruction
CONOPS - Concept of Operations
CPD - Capability Production Document
CPOF - Command Post of the Future
CSEL - Combat Survivor Evader Locator
CTIS - Combat Terrain Information Systems
DoD - Department of Defense
DOTmLPF-P - Doctrine, Organization, Training, Materiel, Leadership & Education, Personnel, Facilities, and Policy
DTSS - Digital Topographic Support System
E3 - Electromagnetic Environmental Effects
EMD - Engineering and Manufacturing Development
EPLRS - Enhanced Position Location Reporting System
FBCB2 - Force XXI Battle Command, Brigade & Below
FDC - Fire Direction Center
FOC - Full Operational Capability
FUE - First Unit Equipped
GCCS - Global Command and Control System
GFE - Government Furnished Equipment
GIG - Global Information Grid
GPS - Global Positioning System
HEAT - High Explosive Anti-Tank
ICD - Initial Capability Document
IMETS - Integrated Meteorological System
IOC - Initial Operational Capability
ISYSCON - Integrated System Control
JCIDS - Joint Capabilities Integration and Development System
JDAM - Joint Direct Attack Munition
JFC - Joint Functional Concepts
JIC - Joint Integrating Concepts
JOC - Joint Operational Concepts
JROC - Joint Requirements Oversight Council
JVMF - Joint Variable Message Format
KIP - Key Interface Profiles
KPP - Key Performance Parameter
LSA - Logistics Support Analysis
MCS - Maneuver Control System
MIDS - Multifunctional Information Distribution System
MOE - Measure of Effectiveness
MOP - Measure of Performance
MOPP - Mission Oriented Protective Posture
MOS - Measure of Suitability
MOSAIC - Multifunctional On The Move Secure Adaptive Integrated Communications
MRFSW - Medium Range Fire Support Weapon
MSE - Mobile Subscriber Equipment
MTBF - Mean Time Between Failure
NATO - North Atlantic Treaty Organization
NBC - Nuclear, Biological, Chemical
NET - New Equipment Training
NTDR - Near-Term Digital Radio
SAT - Systems Approach to Training
SINCGARS - Single Channel Ground & Airborne Radio System
SOP - Standard Operating Procedure
SPAW - Self-Propelled Artillery Weapon
SSA - System Safety Assessment
STRAP - System Training Plan
TACFIRE - Tactical Fire Direction Center
TAIS - Tactical Airspace Integration System
TRADOC - Training and Doctrine Command
UAS - Unmanned Aircraft System
WSRO - Weapon System Replacement Operations

273 Test Resources Lesson 6.4 Test Resources 273

274 THIS PAGE INTENTIONALLY LEFT BLANK 274

275 Test Resources
The following continuous learning modules apply to this lesson:
- CLE038 Time-Space-Position Information
- CLE037 Telemetry
Test Resources Definition
Test resources encompass all resources needed to conduct T&E of any kind. Test resources include, but are not limited to, test ranges, facilities, capabilities, air, space and ground support equipment, T&E-related manpower and pay, T&E-related travel and training, research, development, aircraft, flying hours, threat systems development and operations, modeling and simulation (M&S), distributed assets (live, virtual and constructive), targets, instrumentation, communications, range safety, information technology, data management and security. AFI99-109, May

276 TEMP Format
- Part I: Introduction
- Part II: Test Program Management and Schedule
- Part III: Test and Evaluation Strategy
- Part IV: Resource Summary
TEMP Parts I through III were covered in the previous lesson. This lesson covers TEMP Part IV, and additional (related) information on test resources.
TEMP Part IV - Resource Summary
4.1 Introduction
4.2 Test Resource Summary (Test Articles; Test Sites; Test Instrumentation; Test Support Equipment; Threat Representation; Test Targets and Expendables; Operational Force Test Support; Models, Simulations and Test-beds; Joint Operational Test Environment; Special Requirements)
4.3 Federal, State, Local Requirements
4.4 Manpower/Personnel Training
4.5 Test Funding Summary

277 TEMP Part IV
- All key test resource requirements should be stated in TEMP Part IV. Include items such as unique instrumentation, threat simulators, surrogates, targets, and test articles.
- Because the first TEMP must be prepared for program initiation, initial test resource planning must be accomplished very early as part of the TEMP preparation process. Refinements and reassessments of test resource requirements are included in each TEMP update.
- Once the test resource requirements are identified, the PM/OTA must then work within the Service headquarters and range management structure to ensure that the assets are available when needed.
- More detailed listings of required test resources are generated in conjunction with the detailed test plans.
TEMP Resource Estimates
- Must match resource estimates against the schedule, and justify by TEMP analysis.
- Resource estimates (quantities of test articles, targets, expendables, threat simulations, operational forces, etc.) will be derived from defensible statistical measures of merit (power and confidence) associated with the coverage of the factors in a quantification of test risk. The TEMP must discuss and display the calculations done to derive the content of testing and to develop the associated resource estimates.
- Test infrastructure and tools to be used in operational tests must undergo verification, validation, and accreditation by the intended user or agency. Test infrastructure, tools, and the VV&A strategy will be documented in the TEMP, including the associated required resources.
- DOT&E will approve the quantity of test articles for OT events, for systems under DOT&E oversight. The OTA will determine the quantity for programs not under DOT&E oversight.
(Paraphrased from DoDI Encl 5 par 9)
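One simple illustration of how power and confidence drive test quantities is the classic zero-failure (success-run) demonstration: the number of independent, failure-free trials needed to demonstrate a reliability R at confidence C is n = ln(1 - C) / ln(R). The sketch below is illustrative only; the R and C targets are invented, and real TEMP resource calculations would normally come from a fuller design-of-experiments or reliability analysis rather than this single formula.

```python
# Illustrative only: zero-failure (success-run) demonstration sizing.
# n = ln(1 - C) / ln(R) failure-free trials demonstrates reliability R at
# confidence C. The R and C values below are invented examples.
import math

def success_run_trials(reliability: float, confidence: float) -> int:
    """Minimum number of failure-free trials to demonstrate `reliability` at `confidence`."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

for r, c in [(0.90, 0.80), (0.95, 0.80), (0.95, 0.90)]:
    print(f"R = {r:.2f} at {c:.0%} confidence -> {success_run_trials(r, c)} failure-free trials")
    # 0.90 @ 80% -> 16 trials; 0.95 @ 80% -> 32 trials; 0.95 @ 90% -> 45 trials
```

Even this simple case shows why tightening either the reliability target or the confidence level can multiply the number of test articles, targets, or trials the TEMP must resource.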

278 Coordinating T&E Resources Resources must be identified in the TEMP, and tasked to the appropriate agencies early The Army supplements the TEMP with a Test Resource Plan, which is used as the tasking document to participating test support units. The Air Force sometimes uses a Test Resource Plan (TRP) to supplement resource requirements from the TEMP. The Universal Documentation System (UDS) A series of documents used to identify & coordinate resources and support requirements needed by test range users T&E Management Guide Chapter 13 has additional info. (This document is on the student CD-ROM) 7 Early Coordination of T&E Resources Reasons why some T&E resources require early coordination: New instrumentation or test infrastructure may need to be developed or procured Threat representations may need to be obtained; threat simulators may need to be developed For huge test efforts requiring a lot of test range time, availability may need to be coordinated years in advance M&S (in support of T&E) sometimes takes a long time to develop 8 278

279 T&E Resources (slide 1 of 4) TEMP Part IV Summarizes all key T&E resources, both government & contractor, that will be used during the course of the acquisition program. 1. Test articles- The number of test articles (system, subsystem, component). Key support equipment & technical information. 2. Test sites The specific test ranges & facilities for each type of testing. 3. Test Instrumentation that must be acquired. 4. Test support equipment Required test support equipment. 9 T&E Resources (slide 2 of 4) 5. Threat representation* - Number, type, availability, and fidelity requirements 6. Test targets & expendables* - Number, type, and availability for targets, weapons, flares, chaff, sonobuoys, smoke generators, acoustic countermeasures, etc. 7. Operational force test support Type & timing of aircraft flying hours, ship steaming days, on-orbit satellite contacts/coverage, and other critical operating force support *Requires a Validation Process

280 T&E Resources (slide 3 of 4) 8. Models, simulations & test beds Identify the models & simulations to be used, and resources required for accreditation 9. Joint Operational Test Environment The live, virtual or constructive components or assets needed to evaluate system performance against joint requirements 10. Special requirements Any significant resources, such as: special databases, unique mapping products, extreme environmental conditions, or restricted / special use air / sea / landscapes 11 T&E Resources (slide 4 of 4) Federal, state and local requirements Describe how environmental compliance (and other federal, state, and local requirements) will be met, including NEPA requirements. Manpower/personnel training Manpower/ personnel and training requirements and limitations. T&E funding summary Funding required to pay cost of testing. (Summarized by FY, DT and OT dollars, and separated by major events and phases.)

281 Key Considerations in Selecting a Test Site Cost (including TDY costs) Programs will use govt. T&E capabilities (vice contractor) unless an exception can be justified as cost-effective to the govt. DoDI Encl 4 par 5b Availability of range time / ability to schedule the test facilities when needed Does that test site have the necessary test facilities/infrastructure, instrumentation, range & air space, personnel, and support services? Organizational politics Climate & terrain at the test site Other issues such as security, encroachment, spectrum availability, or environmental considerations DoD Test Facility Capabilities The Test Resource Management Center (TRMC) strategic plan contains much information on DoD test facilities and capabilities At least once every 2 fiscal years, TRMC completes a strategic plan with respect to DoD T&E facilities and resources, to include: An assessment of the current state of available T&E facilities and resources A comprehensive review of DoD T&E requirements; and the adequacy of the T&E facilities and resources to meet those requirements over a horizon of 10 fiscal years Discussion & rationale for proposed new T&E capability improvements DoD Directive D has in-depth info. on capabilities of the DoD MRTFB (on Student Disc)

282 TRMC Range Capability Directory TRMC is currently developing a directory of T&E facilities & ranges, and associated physical & technical characteristics DoD, non-dod federal agency infrastructure Commercial sector infrastructure (later) Available locations, capabilities, workload capability Information available to customers who need T&E infrastructure info. (need-to-know basis) TRMC contact information: TRMC@osd.mil (703) DSN Note: DASD(DT&E) is also TRMC Director (dual hatted) Major Range & Test Facility Base (MRTFB) 23 Sites: Army - 8; Navy - 6; Air Force - 7; DoD Agency - 2 Cold Regions Test Center Keyport Nevada Test & Training Range West Desert Test Center Utah Test & Training Range PMRF Aberdeen Test Center DISA Reagan Test Site NAWC-AD Pax River 30 th Space Wing NAWC-WD Point Mugu NAWC WD China Lake 412 th Test Wing Yuma Test Center Electronic Proving Ground DISA, JITC White Sands Test Center 96 th Test Wing Arnold Engineering Development Complex 45 th Space Wing Atlantic Undersea T&E Center 282

283 Non-DoD Test Facilities Other U.S. Government (non-dod) test ranges, facilities and labs: TRMC Strategic Plan has some info on selected NASA, DOE & DOT facilities U.S. Commercial/Contractor test ranges, facilities and labs: TRMC may evaluate commercial test capabilities in future strategic plans Info may be obtained (upon request) directly from contractors International test facilities: Info is available from DOT&E and/or TRMC 17 T&E Science & Technology Program Program Goal: To develop or exploit new technologies (required to test future warfighting capabilities); & expedite their transition to the T&E community Work closely with test capability developers (CTEIP projects, Service Range I&M projects) to mature & reduce risk in new test technologies Leverage heavily from ongoing efforts in industry, academia, & DoD S&T community Central oversight (by TRMC) Distributed execution (Services) Tri-Service working groups (validate requirements, evaluate proposals, facilitate technology transition) Current T&E/S&T test technology areas: High Speed Systems Test Spectrum Efficient Technology Unmanned/Autonomous Systems Test Advanced Instrumentation Systems Technology Cyberspace Test Directed Energy Test Electronic Warfare Test Net-Centric Systems Test

284 Central T&E Investment Program (CTEIP) The Central T&E Investment Program (CTEIP): Develops or improves high priority test capabilities that resolve Joint or multi-service requirements shortfalls Promotes common test infrastructure and standard test methodologies Minimizes duplicative engineering development and life cycle support costs Partnerships with Services and Agencies: Services and Defense Agencies propose and execute CTEIP projects Service T&E Executives prioritize needs OSD centrally manages and provides funding Service field activities lead development 19 Three CTEIP Categories Joint Improvement & Modernization (JIM) - $ M/year 3-5 year requirement horizon Provide major test capabilities; must address joint requirements Administered by TRMC Services & agencies budget for O&M over life-cycle of delivered capabilities Resource Enhancement Project (REP) - $18-20M/year 1-2 year requirement horizon Provide instrumentation needed to address emergent requirements Must address OT shortfalls; coordinated with DOT&E Threat Systems Project (TSP) - $3-5M/year 1-2 year requirement horizon Provide target capabilities (address shortfalls in threat systems representation) Coordinated with DOT&E

285 Instrumentation Challenges Start instrumentation planning early and document in the TEMP Find commonality among different data collection requirements Share DT / OT test instrumentation Same trial but separate data streams Alternate DT & OT trial scenarios Carry over appropriate DT instrumentation components for stand alone OT tests Reduce cost & schedule delays for deinstrument/re-instrument cycles 21 Onboard vs. Off-Board Instrumentation Issues concerning onboard vs. off-board instrumentation: Size, weight, and power requirements / constraints for onboard instrumentation Onboard data storage capability and/or telemetry data transfer rates Accuracy, cost, and reliability of onboard vs. offboard instrumentation (might want both types as back-ups for each other) Intrusiveness of onboard instrumentation
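When weighing onboard storage against telemetry, a quick sizing calculation helps frame the size, weight, power, and data-transfer trades listed above. The sketch below is illustrative only; the channel count, sample rate, word size, and mission length are invented numbers, not course requirements.

```python
# Illustrative only: rough sizing of an instrumentation data stream.
# Channel count, sample rate, word size, and mission length are invented.
channels = 200           # measured parameters
sample_rate_hz = 1000    # samples per second per channel
bits_per_sample = 16     # word size

data_rate_bps = channels * sample_rate_hz * bits_per_sample
mission_hours = 2.0

storage_bytes = data_rate_bps / 8 * mission_hours * 3600
print(f"Data rate: {data_rate_bps / 1e6:.1f} Mbit/s")                              # 3.2 Mbit/s
print(f"Onboard storage for a {mission_hours:.0f}-hour mission: {storage_bytes / 1e9:.1f} GB")  # 2.9 GB
```

Comparing numbers like these against available telemetry bandwidth and recorder capacity is one way to decide early whether onboard, off-board, or both kinds of instrumentation are needed.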

286 Test Article Planning Test article planning should include: Realistic testing of the complete system, including hardware, software, people, and all interfaces. Get the user involved from the start, and understand user limitations. Ensure that sufficient time and test articles will be available. Test parts, then subsystems, and finally systems to ensure that they work as prescribed before incorporating them into the next higher assembly. 23 T&E Cost Estimates For ACAT I & IA programs, the Cost Analysis Requirements Description (CARD) is used to formally describe the acquisition program for purposes of preparing both the DoD Component and independent cost estimates. For MDAPs, the CARD is prepared in support of major milestone decision points (MS B, MS C, or the FRPDR). For MAIS programs, the CARD is prepared in support of major milestone decision points, and whenever an economic analysis is required. For other acquisition programs, the preparation of a CARD or an abbreviated CARD-like document, is strongly encouraged to support a credible Life-Cycle Cost estimate. The CARD is prepared by the program office and approved by the DoD Component PEO. DoD M Chapt. 1provides further guidelines for CARD content

287 T&E Funding T&E funding is identified in the system acquisition cost estimates, Service acquisition plans, and the TEMP. Funding for contractor and Government DT&E is programmed and budgeted by the materiel developer (Service PM). OT&E funds are usually programmed and budgeted by the PM. The Air Force funds OT&E through dedicated funding for AFOTECconducted OT, and through MAJCOM funding for MAJCOM-funded testing. Testers should: Ensure the test program is sufficiently defined for an adequate cost estimate. Review the cost estimates resulting from the CARD to ensure funding is reasonable, and is included in the Resources section of the TEMP. Review the Acquisition Strategy, TEMP, and budgeting documents regularly, to ensure that adequate testing funds are identified

288 THIS PAGE INTENTIONALLY LEFT BLANK 288

289 TEMP Review Exercise Lesson 6.5 TEMP Review Exercise 289

290 THIS PAGE INTENTIONALLY LEFT BLANK 290

291 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 6, Part 5 Lesson Title TEMP Review Exercise Lesson Time 2.5 hours Terminal Learning Objective Given a scenario and DoD policy, the student will develop Test and Evaluation Master Plan (TEMP) content. (TLO #5) Enabling Learning Objectives Given DoD policy, critique a TEMP and develop required content to support a system's technical requirements and acquisition strategy; and common DoD policies, practices, and procedures. (ELO #5.4) Given a TEMP, determine necessary resources and T&E infrastructure requirements and shortfalls (people/knowledge, funding, facilities/ranges, instrumentation and associated support, software systems integration labs, and modeling and simulation). (ELO #5.5) Identify organizations with roles and responsibilities in providing for, or overseeing the T&E strategy and TEMP. (ELO #5.6) Recognize where environmental, interoperability, cybersecurity, and mission level testing should fit into system development. (ELO #5.7) Given a TEMP, construct a Developmental Evaluation Framework Matrix, and discuss how collected data supports the evaluation framework. (ELO #5.8) Assessment Class participation; oral presentation, 15 point graded assignment. Assignments Students read the SPAW TEMP as a homework assignment the evening before the exercise. Students should have noted any mistakes they found in the TEMP. 291

292 TST 204 Intermediate Test & Evaluation Estimated Student Preparation Time 90 minutes Related Lessons Prior lesson results. Self Study References DoDD , The Defense Acquisition System DoDI , Operation of the Defense Acquisition System Defense Acquisition Guidebook, Chapter 9 Test and Evaluation Management Guide, 6 th ed., 2012 version 292

293 TEMP Review Exercise MDD CDD Validation Dev. RFP Release A B C FRP IOC FOC Materiel Solution Analysis Tech Maturation & Risk Reduction Engineering and Manufacturing Development Production & Deployment Operations & Support YOU ARE HERE As homework, you read the SPAW MS B TEMP and wrote down any mistakes you noticed. Remember the J-SPAW is an ACAT 1C, JROC Interest Program and is on the OSD T&E Oversight List for DT&E / OT&E / LFT&E. 1 TEMP Review Exercise Given Key CDD requirements J-SPAW TEMP submitted in support of Milestone B Objective: Analyze a Milestone B TEMP Overview Analyze a Milestone B TEMP (for errors in content, format, etc.) Prepare and present a briefing to the class on the results of your analysis Categorize the errors found by where you found them (TEMP Part 1, Part 2, etc.) Each team will brief the errors they found from only one part of the TEMP 2 293

294 Limited User Test (LUT) For the J-SPAW TEMP we gave you: Assume that a LUT is the same thing as an Operational Assessment (OA) LUT Phase I occurs during EMD Production & Deployment (PD) LUT occurs prior to IOT&E Typical OA objectives: Estimate potential operational effectiveness & suitability Examine / assess key risk areas, & COIs that need to be resolved earlier than IOT&E Help identify best design; examine operational aspects of system design, adequacy of CONOPS, etc. OAs often support the MS C Review (the SPAW PD LUT supports decisions concerning readiness for IOT&E and/or readiness for production) 3 What is a TEMP? Executive level strategy and primary T&E planning and management document for entire life cycle - Contract Between All Stakeholders Overall T&E structure, major elements, and objectives Consistent with Acquisition Strategy Sufficient detail to permit planning for timely availability of test resources required to support the T&E program Road Map for integrated simulation, test, and evaluation plans; resource requirements; and schedules Note: The Program Manager will use the DAG TEMP format and content as guidance in formulating DT&E plans, per DoDI , Encl 4, par 5a 4 294

295 TEMP Suggested Content Should include the following items: Acquisition strategy tie-in Time-phased threats to mission accomplishment Technical risk reduction testing (new or critical technologies from Technology Development Strategy) Component and sub-system developmental testing Critical operational and live fire issues Scope and structure of the operational and live fire evaluations Major T&E design considerations T&E schedule Anticipated M&S used for future system evaluations T&E funding estimates for programming and budgeting Rigorous use of experimental design within the development of the test design (See the TEMP Guidebook TEMP Format, which is in this section of your student book.) 5 TEMP Development Process Develop the TEMP via the T&E WIPT process Provide sufficient details about future DT&E activities (in Part III) Be specific while eliminating the fluff A TEMP is a master strategy document, not a detailed test plan To reduce the size of TEMPs, use pointers to other documents where more details are present, such as: Software development plans Modeling and simulation plans Logistic support plans Include T&E WIPT Subgroups (that is, RAM, Supportability M&S, Threat, Live Fire, etc.) in the T&E planning Leverage all testing 6 295

296 Deadly Sins Affecting TEMP Approval
- Premature Submission: Submitting a TEMP for approval when an issue(s) precluding approval exists
- Failure to Pre-Coordinate: OSD (DT&E and DOT&E)
- Incomplete: Requirements/Test Crosswalk Matrix missing; classified data not integrated (submitted separately); too many TBDs in Part IV (Resources)
- Size: Body of plan, excluding annexes, not kept to a manageable length
- Annexes: Not kept to a minimum
- Reference Documents: ICD/CDD/CPD & STAR not approved / unavailable
TEMP Do's and Don'ts
- Do have strict configuration control over the TEMP
- Do have Part IV (Resources) based upon Part III
- Do have the Requirements/Test Crosswalk Matrix as Attachment 1
- Do know the TEMP Approval Authority before developing the TEMP
- Do treat the TEMP as a Contract among all Stakeholders
- Do ensure that Core T&E WIPT members agree with any change(s) required to obtain TEMP approval after providing their signature on the TEMP's T&E WIPT Coordination Sheet

297 TEMP Do's and Don'ts (cont.)
- Do check to ensure the TEMP agrees with key documents such as contracting docs (RFP, SSP), SEP, etc.
- Don't attempt to gain TEMP approval before obtaining COIC approval
- Don't allow "Concur with Comment" on final T&E WIPT Coordination
- Don't have the TEMP simply refer to the System Evaluation Plan to avoid unnecessary tiering
TEMP Guidance - What Reviewers are Looking For
The following guidance/documents are on the Student CD-ROM:
- DASD(DT&E) TEMP Review Guides (in the DoD folder)
- DOT&E Memos (in the DoD, IT, & Statistics folders): Use of Design of Experiments in OT&E, Cybersecurity, and several other memos
- DTM on Reliability (in the Reliability folder)
- Service-specific guidance (in the Air Force, Army, Marine Corps, and Navy folders)

298 DASD(DT&E) TEMP checklist DASD(DT&E) TEMP Tenets Checklist (on student CD) has: 1.Evaluation framework with testable and measurable criteria 2.Adequate scope and reasonable phasing 3.Reliability growth 4.Certifications 5.Analytical basis (STAT) 6.Achievable schedule 7.Exit / entrance criteria for TRR and major T&E phases 8.T&E WIPT / Integrated Test Team 9.Resources & resource mapping 10.Management (Chief Developmental Tester, Lead Govt. DT&E Organization identified) 11 TEMP Exercise Tasks Read the J-SPAW MS B TEMP Analyze the TEMP for errors in content, format, etc. This TEMP was submitted in support of Milestone B Ignore the sections that say omitted for training purposes Are the DT, OT, LFT&E & resources appropriate? Does the TEMP provide enough information to address Milestone B, Milestone C, FRP-DR and beyond? Suggest that one person from each team compare the J- SPAW TEMP to the DAG TEMP format in Student Guide In the real world, the TEMP would be compared to the CDD, and other sources of information. For the sake of time, we will NOT do that during this exercise. Prepare and present a team briefing to the class on the results of your analysis (errors, adequacy, etc.) Present by TEMP paragraph #s, followed by your comments 298

299 TEMP Exercise Tasks (cont.) Each team will analyze a part of the TEMP in their team briefing: Part I (including any mistakes on Tables 1-1 and 1-2) Part II and all three cover / signature pages Part III (sections 3.1 through 3.4) Part III (section 3.5 through end of Part III) Part IV You will have 60 minutes to develop your team briefing

300 THIS PAGE INTENTIONALLY LEFT BLANK 300

301 TST 204 Intermediate Test & Evaluation Course TEMP Review Exercise In this lesson, you will use the results of the previous lessons and the knowledge you ve gained from the classroom lectures and discussions to analyze a SPAW MS B TEMP. Student taskings: Task 1: Read the SPAW MS B TEMP. (This should have been done as homework.) Task 2: Analyze the SPAW TEMP for appropriate content, format, etc. Refer to the DAG Chapter 9 (which is printed in this section of your Student Guide) for TEMP format and content. A key consideration in developing or reviewing a TEMP is to be familiar with information from a number of other sources. Due to limited class time, we will NOT do this during exercise 4. However in the real world, testers and other personnel would need to review the following information sources, and compare them with proposed TEMP content. Critical technologies EMD exit criteria Cost and schedule constraints Capability Development Document (particularly Key Performance Parameters and System Attributes) Live fire lethality & vulnerability test requirements The recommended steps for you to use in this exercise are as follows: 1) Analyze the SPAW MS B TEMP. Does the TEMP provide enough information (i.e., key points or issues) to address Milestone B and beyond decisions? What mistakes did you find in the TEMP? 2) Compare the SPAW MS B TEMP with the DAG TEMP format. Does the SPAW TEMP contain all needed information, per the DAG TEMP format? 3) Prepare a briefing to the class that: - Lists and explains any assumptions made by your team. - Briefs the errors your team found in the TEMP. Note: Each person will read and analyze all parts of the MS B TEMP as homework (you should have already done this). However, each team will only analyze and brief one part of the TEMP to the class. The instructor will determine which TEMP part each team is assigned. 301

302 THIS PAGE INTENTIONALLY LEFT BLANK 302

303 Wednesday Graded Assignment Wednesday Graded Assignment MS B TEMP 303

304 THIS PAGE INTENTIONALLY LEFT BLANK 304

305 Name Team # TST 204 Graded Assignment 15 points total Part 1 (8 Points) Complete the Developmental Evaluation Framework Matrix (DEFM) for the SPAW on the back side of this page. There are 16 blank cells in the matrix to complete. Each cell is worth ½ point. It will help you complete this assignment if you refer (as necessary) to the SPAW CDD, the SPAW MS B TEMP, and your results from previous exercises. For a DEFM example and for more information, please refer to the DEFM information in the TEMP Inputs section of your student book. Note: The Developmental Evaluation Framework Matrix on the back side of this page has been PARAPHRASED from the DRAFT DAG Chapter 9, May 2014 version (which is on your student CD-ROM). If you plan to develop an actual DEFM, always check for the latest DEFM format. You can download the latest DAG Chapter 9 at this website: Part 1 instructions: 1. Each Functional Evaluation Area must be a different area. 2. If a particular cell is already filled in, you don't need to do anything else for that cell. 3. All information in a particular row must relate to the Functional Evaluation Area (and to all of the other information) in that row. 4. Any Critical Technical Parameters (CTPs) in a particular row must support, and must be different than, any KPPs in that row. 5. The Technical Measure(s) and Resource(s) in a particular row must directly relate to the Key System Requirement/Description in that same row. Part 2 (7 Points) Compare the SPAW MS B TEMP Part III to the SPAW MS B TEMP Part IV. List seven specific resources that are NOT currently listed in Part IV of the TEMP and need to be added to Part IV of the TEMP. The SPAW MS B TEMP is in the Tuesday Homework section of your student book

306 THIS PAGE INTENTIONALLY LEFT BLANK 306

307 Decisions Supported Omitted for Training Purposes Omitted for Training Purposes Omitted for Training Purposes Omitted for Training Purposes Omitted for Training Purposes Developmental Evaluation Framework Matrix: Self-Propelled Artillery Weapon (SPAW) Functional Evaluation Area (major area) Decision Support Question (Developmental issue) Technical Reqs. Document Reference Key System Requirements/ Description Technical Measures (CTP, TPM, or required benchmark) Resources Cross Reference (KPP, KSA, COI, and/or Critical Requirement) Performance 3.X.X.X 3.X.X.X Interoperability Net-Ready Is the SPAW interoperable with interfacing systems? 3.X.X.X Information Operations IEEE Spectrum of interoperability model Level 6 Software Lab, Combat Systems LAN NR-KPP Cybersecurity Cyber Operations Security Does the SPAW meet Cybersecurity standards? 3.X.X.X Ability to withstand cybersecurity attacks All controls assigned, in place, configured, and adequate National Cyber Range, Red Cell Team RMF C&A Reliability Reliability Is the SPAW reliable? 3.X.X.X 307


309 Lesson 7 Human Systems Integration 309 Human Systems Integration

310 THIS PAGE INTENTIONALLY LEFT BLANK 310

311 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 07 Lesson Title Human Systems Integration Lesson Time 1.0 hours Terminal Learning Objective Given a system case, the student will correctly generate OT&E planning, execution, and reporting documentation. (TLO #17) Enabling Learning Objectives Apply appropriate evaluation criteria for effectiveness, suitability, and Human Systems Integration elements to T&E planning and execution. (ELO #17.4) Describe the areas/domains encompassed by operational effectiveness, operational suitability, and human systems integration. (ELO #17.9) Learning Method Class discussion and participation. Assignments None. Estimated Student Preparation Time Method of Assessment Related Lessons None. Written examination. Lesson 10 (DT&E Assessment), Lesson 20 (OT Test Execution). References Student CD ROM. Army MANPRINT Handbook Air Force Human Systems Integration Handbook 311

312 THIS PAGE INTENTIONALLY LEFT BLANK 312

313 Human Systems Integration (HSI) The following Resources apply to this lesson, and are on the student CD-ROM: The Army MANPRINT Handbook The Air Force Human Systems Integration Handbook DoDI , Enclosure 7 The crash raises questions about over-reliance on computers 2 September 2006 / DAU 313

314 Definition The integrated and comprehensive analysis, design, and assessment of requirements, concepts, and resources for system manpower, personnel, training, environment, safety, occupational health, habitability, personnel survivability, and human factors engineering. 3 September 2006 / DAU Purpose A plan for HSI shall be in place early in the acquisition process to.. Optimize total systems performance Minimize total ownership costs Ensure that the system is built to accommodate the characteristics of the user population Operators Maintainers Support Personnel Note: DoDI , Enclosure 7 requires HSI to be addressed for all acquisition programs 4 September 2006 / DAU 314

315 Scope HSI domains: Manpower Personnel Training Environment, Safety and Occupational Health Habitability Survivability Human Factors Engineering These domains are interdependent and mutually supporting September 2006 / DAU 5 Manpower The consideration of the net effect of systems on overall human resource requirements and authorizations (the spaces) to ensure that each system is affordable from the standpoint of manpower. Includes the number of people (military, civilian and contractor) needed to Operate Maintain Support a system in peacetime and war. 6 September 2006 / DAU 315

316 Electronic Warfare Manpower Example EA-6B Prowler required a pilot and 3 Electronic Counter-measures Officers (ECMO s). Replaced by EA-18G Growler, which requires only two crew members. September 2006 / DAU Personnel The consideration of human aptitudes (i.e. cognitive, physical and sensory capabilities), knowledge, skills, abilities and experience levels that are needed to properly perform job tasks across the military, civilian and contractor work force to Operate Maintain Support a system in peacetime and war. These are the faces that fill the authorized spaces. 8 September 2006 / DAU 316

317 Human Characteristics COGNITIVE: Aptitude, Knowledge, Ability. PHYSICAL: Gender, Senses, Size, Strength, Stamina. PSYCHOMOTOR: Coordination, Dexterity. EXPERIENCE: Civilian, Military, Education, Interest. Armed Forces Qualification Test Score (AFQT) [table of AFQT categories I, II, IIIA, IIIB, IVA*, IVB*, IVC*, and V** (0-9) with their corresponding AFQT percentile ranges] * By law, no more than 20 percent accessions in CAT IV in any year. ** By law, no CAT V recruits could be accepted. (Except T&E Instructors.) 317

318 Aircraft Cockpit Personnel Example A-4 Skyhawk had a simple cockpit with primary flight instruments. F-35 Lightning has the most advanced cockpit ever designed, includes helmet mounted display and sensor fusion technology. September 2006 / DAU Training The use of analyses, methods and tools to ensure systems training requirements are fully addressed and documented by system designers and developers to achieve a level of individual and team proficiency that is required to successfully accomplish tasks and missions. Includes options for individual, collective and joint training for Operators Maintainers Support Personnel 12 September 2006 / DAU 318

319 F/A-18 Pilot Training Example Primary training consists of six stages: Familiarization (FAM), Basic Instruments, Precision Aerobatics, Formation, Night FAM, and Radio Instruments. Continue to Strike training where pilots learn strike tactics, weapons delivery, air combat maneuvering, and receive their carrier landing qualification. Then assigned to fleet replacement squadron where pilots learn the basics of air-to-air and air-to-ground missions, culminating in day/night carrier qualification and subsequent assignment to fleet Hornet squadrons. September 2006 / DAU Includes three aspects Environment Safety Occupational Health Environment Safety & Occupational Health Environment Affects concepts of operation and the requirements to protect systems from the operational environment (i.e. shock, vibration, extreme temperatures, etc.) and the natural environment (i.e. water, land, air and space) from the systems operations, sustainment and disposal. 14 September 2006 / DAU 319

320 Environment, Safety & Occupational Health (cont.) Safety The consideration and application of system design characteristics that serve to minimize the potential for mishaps causing death or injury to operators and maintainers or threaten the survival and/or operation of the system. Occupational Health Those system design features that minimize the risk of injury, acute or chronic illness, or disability; and/or reduce job performance of personnel who operate, maintain or support the system. September 2006 / DAU 15 F-22 ESOH Example 2002 Raptor Aeromedical Working Group (RAW- G) Experts in life support, avionics, physiology, and systems safety; along with F-22 aircrew and maintainers Founded by members of the F-22 community concerned about how unique demands of the aircraft could affect pilots F-22 can evade radar and fly faster than sound w/o afterburners Flies higher than its predecessors Has a self-contained On-Board Oxygen Generation System (OBOGS) to protect pilots from chemical or biological attack A problem that surfaced: Raptor cough Fits of chest pain and coughing dating to 2000 that stem from the collapse of overworked air sacs in the lungs 16 September 2006 / DAU 320

321 F-22 ESOH Example (cont.) RAW-G concluded that the F-22 s OBOGS was giving pilots too much oxygen, causing the coughing The more often and higher the pilots flew after being oxygen-saturated, the more vulnerable pilots would be to other physiological incidents RAW-G recommended more tests & that the F-22 s oxygen delivery system be adjusted (less oxygen at lower altitudes) through a digital controller and software upgrade RAW-G members spent 2 years pushing for the change, but the software upgrade never came through The cost was considered prohibitive in light of other items that people wanted funded for the F-22. The cost was estimated to be $100,000 per aircraft. (Kevin Dyers, former USAF physiologist & RAW-G member, 2007.) 17 September 2006 / DAU Habitability The consideration of system related working conditions and accommodations that are necessary to sustain the morale, safety, health, and comfort of all personnel. Establish requirements for Physical environment Personnel services (if appropriate) Living conditions 18 September 2006 / DAU 321

322 Not Your Grandfather's Navy! [photos contrasting berthing in today's Navy with a circa-1943 sleeping cot structure that holds 5 sailors (top cot just visible)] B-2 Bomber Habitability Example B-2 bombers based at Whiteman AFB, Missouri conduct non-stop intercontinental missions. 322

323 Survivability The consideration of the characteristics of a system (e.g. life support, body armor, helmets, plating, egress/ejection equipment, air bags, seat belts, electronic shielding, etc.) that reduce susceptibility of the total system to mission degradation or termination. Goal Reduce detectability of the war fighter Prevent attack if detected Prevent damage if attacked Minimize medical injury (if wounded or injured) Reduce physical and mental fatigue September 2006 / DAU 21 MV-22 Survivability Example MV-22 Osprey was built to transport Marines to and from the battlefield, but did not include a defensive weapon. What other attributes of the aircraft contribute to survivability? September 2006 / DAU 323

324 Human Factors Engineering The consideration and application of human capabilities and limitations throughout system definition, design and development to ensure effective human-machine integration for optimal total system performance. System designs shall minimize or eliminate system characteristics that Require excessive cognitive, physical or sensory skills Entail extensive training or work-load intensive tasks Result in mission-critical errors Produce safety or health hazards September 2006 / DAU HFE is the only discipline that relates humans to technology. How is HSI Tested / Assessed? The various requirements documents (ICD, CDD, CPD, etc.) contain HSI or HSI-related requirements. Additional HSI requirements are found in DoDI , Enclosure 8. These requirements are tested / assessed as part of the T&E process (Govt. DT&E, Govt. OT&E, contractor testing, etc.) A lot of the HSI testing / assessment can probably be done along with other (already scheduled) test events; however, dedicated HSI test events may be required 24 September 2006 / DAU 324

325 September 2006 / DAU Example of Assessing HSI (From an Army OA) Additional Issue: How well do MOS qualified soldiers having received individual and collective training, perform mission tasks to existing standards under operationally realistic conditions for the XYZ system while maintaining safety requirements? In the Army, an Additional Issue (AI) is similar to a COI In the Army, COIs are written by the user representative (TRADOC), and AIs are written by the Evaluator AIs are developed for those aspects of the system, that are not supported by the COIs (DA PAM 73-1, Par 5-9) Seven HSI-related Additional Sub-issues supported (came under) the above Additional Issue. The Additional Issue and Sub-issues were assessed along with the test events, conducted during the OA 25 HSI Example Seven HSI-related Additional Sub-issues: Sub-issue 1: MANPOWER. The recommended manning levels for operators and maintainers must be sufficient to meet workload requirements for continuous operations. Sub-issue 2: PERSONNEL. The XYZ System must not require new and unique aptitudes beyond which the Target Audience possesses. Sub-issue 3: TRAINING. The combination of instruction, education, on-the-job and self-development training and training products developed for the XYZ System must enable Target Audience Soldiers to be easily trained to perform to standard. Sub-issue 4: HUMAN FACTORS ENGINEERING. The XYZ System must demonstrate that integrated software and hardware design features are optimized for mission performance by Target Audience Soldiers. 26 September 2006 / DAU 325

326 HSI Example (cont.) Seven HSI-related Additional Sub-issues: Sub-issue 5: SYSTEM SAFETY. The XYZ System must demonstrate design features and operating characteristics that minimize or eliminate the potential for human or machine error or failure that cause injurious accidents. Sub-issue 6: HEALTH HAZARDS. The XYZ System must demonstrate design features and operating characteristics that minimize or prevent health hazard risks for bodily injury or death. Sub-issue 7: SOLDIER SURVIVABILITY. The XYZ System must demonstrate design features and operating characteristics that can reduce fratricide, detectability and probability of being attacked; as well as minimize system damage, soldier injury, and cognitive and physical fatigue. September 2006 / DAU 27 Summary HSI is not just human factors engineering or cockpit integration HSI is the effective integration of these interdependent, mutually supporting domains Manpower Personnel Training Environment, Safety and Occupational Health Habitability Survivability Human Factors Engineering 28 September 2006 / DAU 326

327 Risk Management Lesson 8.1 Risk Management 327

328 THIS PAGE INTENTIONALLY LEFT BLANK 328

329 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 08 Lesson Title Risk Management and Environment, Safety and Occupational Health (ESOH) Lesson Time 3.0 hours (includes an exercise) Terminal Learning Objectives Given a system description and DoD guidance, the student will develop procedures and constraints for test operations to comply with safety, environmental, and risk management policies. (TLO #10) Enabling Learning Objectives Given a system description, evaluate T&E risk factors, along with likelihood and consequences of occurrence. (ELO #10.1) Given a system description, develop information for risk mitigation/risk management plan for specific T&E risk factors. (ELO #10.2) Recognize the role of ESOH and risk management to support T&E planning efforts. (ELO #10.3) Assignments None. Estimated Student Preparation Time None. Assessment Written examination, class participation, oral presentation. Related Lessons Prior lesson results. 329

330 TST 204 Intermediate Test & Evaluation Self Study References DoDD , The Defense Acquisition System DoDI , Operation of the Defense Acquisition System Defense Acquisition Guidebook, Chapter 9 Risk Management Guide for DoD Acquisition (August 2006) MIL-STD-882E, System Safety 330

331 T&E Risk Management DoD Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs Redefines Risk, Issue and Opportunity Management Process Provides detailed guidance on key activities: Process Planning Risk Identification Root Cause Determination Risk Analysis Risk Mitigation Risk Monitoring Processes for Risk, Issues, and Opportunities are parallel Available on-line through the Risk Management website on the ACC at

332 Risk, Issue, and Opportunity Definitions Risks are future events or conditions that may have a negative effect on achieving program objectives for cost, schedule, and performance. Risks are defined by (1) the probability(greater than 0, less than 1) of an undesired event or condition and (2) the consequences, impact, or severity of the undesired event, were it to occur. Issues are events or conditions with negative effect that have occurred (such as realized risks) or are certain to occur (probability of 1) in the future that should be addressed. Opportunities are potential future benefits to the program s cost, schedule, and/or performance baseline, usually achieved through reallocation of resources. Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Risk Management Defined Risk Management is an integral part of program management and systems engineering. A program must align risk appetite with organizational capacity to manage and handle risks and apply informed judgment to allocate limited resources to the best effect. Sound judgment to achieve this balance is at the core of program management. This applies to both the Acquisition Program and the Test Program 4 332

333 Overview of Potential Sources of Program Risks, Issues, and Opportunities Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Risk Management vs. Issue Management Risk management applies resources to mitigate potential future negative events and their consequences Issue management applies resources to address and resolve current problems If an event is described in the past tense, or is certain to occur, it is an issue to be resolved, not a risk

334 DoD Risk Management Process Steps Process Planning What is the program s risk mitigation process? Risk Identification What can go wrong? Communication and Feedback Risk Monitoring How has the risk changed? Risk Analysis What are the likelihood and consequence of the risk? Risk Mitigation Should the risk be mitigated? If so, how? Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Risk Process Planning When establishing risk processes and procedures: Assign roles, responsibilities, and authorities Select and document overall approach: Process and procedures Risk analysis criteria for likelihood and consequences Risk mitigation procedures Establish traceability of risk to technical requirements and overall program objectives Align government and contractor roles, responsibilities, tools, and information exchange Determine risk management resources, to include budget, facilities, personnel, schedule Determine risk management battle rhythm Process Planning What is the program s risk mitigation process? Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June

335 Risk Identification - What can go wrong? Risk Identification What can go wrong? When identifying risks: Understand the nature of the product and the requirements that shape the product Use various risk ID methodologies: Independent assessments SOW requirements Brainstorming sessions with SMEs Interviews with IPT leads, Systems Command/Center competencies Review of similar/historical programs Trade studies Review analysis of Technical Performance Measures, resource data, life cycle cost information, WBS/IMS/EVM data trends, and progress against critical path Assess technical performance at all levels: component, subsystem, integrated product, external interfaces. How big a gap? How challenging to cross it? Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Risk Identification Risk Identification asks: What can go wrong? or What is uniquely hard or difficult? This step involves examining the program to determine risk events and associated cause(s) that may have negative cost, schedule, and/or performance impacts. While the root cause of some risks may not be known or be determinable at the time the risk is evaluated, the program should attempt to drill down far enough to understand underlying root cause(s) to inform risk analysis and the development of handling strategies

336 Risk Identification (cont.) Typical risk sources include Test and Evaluation: adequacy and capability of the test and evaluation program to assess attainment of significant performance specifications and determine whether the system is operationally effective, operationally suitable, and interoperable Modeling and Simulation (M&S): adequacy and capability of M&S to support all life-cycle phases of a program using verified, validated, and accredited models and simulations Ask the why question multiple times to get past the symptoms and find the root cause 11 Risk Analysis - How big is the risk? When analyzing risks: Quantify the cost, schedule, and performance impacts: Risk Analysis What are the likelihood and consequence of the risk? RDT&E costs Procurement costs O&S costs Performance thresholds Schedule thresholds Affordability caps Assess the likelihood of the risk being realized Conduct analysis periodically to support cost, schedule, and performance risk assessments Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June

337 Risk Analysis The intent of risk analysis is to answer the question "How big is the risk?" by: considering the likelihood of the event occurrence; identifying the possible consequences in terms of performance, schedule, and cost; and identifying the risk level using the Risk Reporting Matrix (likelihood vs. consequence). Levels of Likelihood Criteria The level of likelihood of each root cause is established using specified criteria. For example, if the root cause has a 50% probability of occurring, the corresponding likelihood is Level 3 (Likely). 337

338 Recommended Consequence Criteria Level Cost Schedule Performance 5 Critical Impact 10% or greater increase over APB objective values for RDT&E, PAUC, or APUC Cost increase causes program to exceed affordability caps Schedule slip will require a major schedule rebaselining Precludes program from meeting its APB schedule threshold dates Degradation precludes system from meeting a KPP or key technical/supportability threshold; will jeopardize program success 2 Unable to meet mission objectives (defined in mission threads, ConOps, OMS/MP) 4 Significant Impact 3 Moderate Impact 5% - <10% increase over APB objective values for RDT&E, PAUC, or APUC Costs exceed life cycle ownership cost KSA 1% - <5% increase over APB objective values for RDT&E, PAUC, or APUC Manageable with PEO or Service assistance Schedule deviations will slip program to within 2 months of approved APB threshold schedule date Schedule slip puts funding at risk Fielding of capability to operational units delayed by more than 6 months 1 Can meet APB objective schedule dates, but other non- APB key events (e.g., SETRs or other Tier 1 Schedule events) may slip Schedule slip impacts synchronization with interdependent programs by greater than 2 months Degradation impairs ability to meet a KSA. 2 Technical design or supportability margin exhausted in key areas Significant performance impact affecting System-of System interdependencies. Work-arounds required to meet mission objectives Unable to meet lower tier attributes, TPMs, or CTPs Design or supportability margins reduced Minor performance impact affecting System-of System interdependencies. Work-arounds required to achieve mission tasks 2 Minor Impact Costs that drive unit production cost (e.g., APUC) increase of <1% over budget Cost increase, but can be managed internally Some schedule slip, but can meet APB objective dates and non-apb key event dates Reduced technical performance or supportability; can be tolerated with little impact on program objectives Design margins reduced, within trade space 2 1 Minimal Impact Minimal impact. Costs expected to meet approved funding levels Minimal schedule impact Minimal consequences to meeting technical performance or supportability requirements. Design margins will be met; margin to planned tripwires Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Risk Reporting Matrix and Criteria Level Likelihood Probability of Occurrence 5 Near Certainty > 80% to 99% 4 Highly Likely > 60% to 80% 3 Likely > 40% to 60% 2 Low Likelihood > 20% to 40% 1 Not Likely > 1% to 20% Likelihood High Moderate Low Consequence
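To make the likelihood criteria above easier to apply, here is a minimal illustrative sketch (ours, not from the DoD guide): it bins a probability of occurrence into the five likelihood levels listed above and then looks up a rating in a caller-supplied 5x5 matrix. The EXAMPLE_MATRIX cell assignments are hypothetical placeholders; a real program would populate the dictionary from its approved Risk Reporting Matrix.

```python
# Illustrative sketch only (not from the DoD guide). Converts a probability of
# occurrence to the likelihood level defined above, then looks up a rating in a
# matrix supplied by the program. EXAMPLE_MATRIX is a hypothetical placeholder,
# NOT the official Risk Reporting Matrix coloring.

def likelihood_level(probability: float) -> int:
    """Map a probability of occurrence to likelihood level 1-5 (table above)."""
    if not 0.0 < probability < 1.0:
        raise ValueError("a risk has probability greater than 0 and less than 1")
    if probability <= 0.20:
        return 1   # Not Likely      (>1% to 20%)
    if probability <= 0.40:
        return 2   # Low Likelihood  (>20% to 40%)
    if probability <= 0.60:
        return 3   # Likely          (>40% to 60%)
    if probability <= 0.80:
        return 4   # Highly Likely   (>60% to 80%)
    return 5       # Near Certainty  (>80% to 99%)

def risk_rating(likelihood: int, consequence: int, matrix: dict) -> str:
    """Look up the rating for a (likelihood, consequence) cell; both are 1-5."""
    return matrix[(likelihood, consequence)]

# Hypothetical placeholder cells -- replace with the program's approved matrix.
EXAMPLE_MATRIX = {(lik, con): "Low" for lik in range(1, 6) for con in range(1, 6)}
EXAMPLE_MATRIX.update({(4, 4): "Moderate", (3, 5): "Moderate", (5, 3): "Moderate"})
EXAMPLE_MATRIX.update({(5, 5): "High", (5, 4): "High", (4, 5): "High"})

if __name__ == "__main__":
    lvl = likelihood_level(0.50)                       # 50% probability -> Level 3 (Likely)
    print(lvl, risk_rating(lvl, 5, EXAMPLE_MATRIX))    # rating depends on the placeholder cells
```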

339 T&E Example of Risk Reporting Circuit Card Availability (S) Aggressive development project may not deliver circuit cards in time to support development testing. Develop interim test bench and test methods to support integral development and test activity until full capability is available. 5 Likelihood Consequence 17 Risk Mitigation - What s the plan? Risk Mitigation Should the risk be mitigated? If so, how? When mitigating individual risks: Consider the accept, avoid, and transfer options, not just the control option Choose the best mitigation options, then select the best implementation approach for those options Ensure appropriate peers and stakeholders are informed about high risk items; elevate as needed Include cross program risks in order to consider the impact of risk mitigation actions on other programs Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June

340 Risk Handling Options Control the cause or consequence. Use the design process (including testing, Design of Experiments, M&S, etc.) to lower risk to acceptable levels. Avoidance by eliminating the root cause and/or consequence. Force the problem to go away; never give it an opportunity to occur. (Ex: change the requirements or design concepts.) Acceptance (Assumption) of the level of risk and continue on the current program path. (The risk is low in probability and/or consequence.) Transfer the risk. Move the problem from one area of design to another, or from one organization to another. (Ex: hardware to software, or Government to Contractor, e.g., a warranty or a change in contract type.) Seeker Dome Risk Mitigation Timeline Diagram: [burn-down chart, event line not to scale, plotting risk level from program start through Phase 1 (12-14 mos), Phase 2 (36 mos), and MS C; seeker dome risk steps from Medium at program start to Medium, Medium/Low, and Low by MS C as mitigation events complete: proposal deliveries, down-select to one contractor (two domes), producibility effort initiated, lab testing, environmental testing, tower testing and CFT of RRS, HWIL, ManTech results, final design selected (most producible dome), tower test/CFT/DBF, EDT flight tests, initial E3 testing, production prove-out, CFT flight tests, component qual/EDT, system qual/PPT, and program events RFP, CA, CTV, PDR, CDR, GTV, DRR] 340

341 Risk Monitoring - How has the risk changed? When monitoring risks: Track the implementation and progress of the risk mitigation activities, not just the development and planning of the selected strategy Include Technical Performance Measures as an integral activity when monitoring risks after selecting the appropriate risk mitigation strategy Conduct regular status updates to monitor risks for changes to likelihood and/or consequences Document risks that can be retired as well as risks that are still being mitigated to prevent an unnoticed relapse of the retired risk Keep lines of communication open to notify management when ability to mitigate the risk is ineffective Risk Monitoring How has the risk changed? Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Value of T&E in Risk Management OSD DT&E Study 1999 and 2002: Industry sees test as a risk reduction Some people within DoD see test as a risk Government weapon programs do not have the same market-created measures as in the private sector to demonstrate the value of testing such as warranties, recalls and class action law suits that are real in the private sector and that provide a cost risk to industry which testing helps reduce. Thomas Christie, DOT&E before the Senate Armed Services Committee, May

342 Value of T&E in Risk Management (cont.) Risk management is the means by which the program areas of vulnerability & concern are identified & managed. T&E is the discipline that helps to illuminate those areas of vulnerability As the GAO stated in their July 2000 report GAO recommends that acquisition managers structure test plans around the attainment of increasing levels of product maturity, orchestrate the right mix of tools to validate these maturity levels, & build & resource acquisition strategies around this approach. 23 Issues 100% Now/Future OSD has found that program issues are, too often, mistakenly characterized as risks. This practice is reactive and tends to blind the program to true risk management. Risk management applies resources to lessen the likelihood, or in some cases, the consequence, of a future event. Issue management, on the other hand, applies resources to address and resolve a past or occurring event and its related consequences. When a negative event has been identified and has a past or present impact to the cost, schedule, or performance of a program, it is not a risk. These events should be cataloged as issues and should be addressed within the program s normal issue management process. In addition, even though an issue may introduce a likely future consequence, this does not make it a risk. To ensure issues and risks are properly identified, programs should have an issue management approach to identify problems and track associated closure plans. Programs should also assess whether issues are spawning prospective risks

343 DoD Issue Management Process Steps Issue Process Planning What is the program s issue management process? Issue Identification What has or will go wrong? Issue Monitoring How has the issue changed? Communication and Feedback Issue Analysis What is the consequence of the issue? Corrective Action What, if anything, should be done about the issue? Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Issues Identification Issues are best identified before the beginning of a new project or contract and should be updated and reviewed periodically throughout the life cycle of the program. Unlike opportunities and risks, there is no assessment of their likelihood because issues have either already occurred or are in the process of occurring. Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June

344 Issues Corrective Action Options Ignore: Accept the consequences without further action, based on the results of a cost/schedule/performance business case analysis; or Control: Implement a plan to reduce issue consequences and residual risk to as low a level as practical, or to minimize impact on the program. This option typically applies to high and moderate consequence issues. Less common options include Avoid: Eliminate the consequence of the event or condition by taking an alternate path. - Examples may involve changing a requirement, specification, design, or operating procedure. Transfer: Reassign or reallocate the issue responsibility from one program to another, between the government and the prime contractor, within government agencies, or across two sides of an interface managed by the same organization. Opportunities An opportunity is the potential for improving the program in terms of cost, schedule, and performance. Opportunity management supports USD(AT&L) Better Buying Power initiatives (currently BBP 3.0) to achieve should-cost objectives. In Better Buying Power 2.0, the USD(AT&L) implemented should-cost management, stating, "Our goal should be to identify opportunities to do better and to manage toward that goal. Managers should scrutinize each element of cost under their control and assess how it can be reduced without unacceptable reductions in value received." 344

345 DoD Opportunity Management Process Steps Opportunity Process Planning What is the program s opportunity management process? Opportunity Identification What can be improved? Opportunity Monitoring How has the opportunity changed? Communication and Feedback Opportunity Analysis What is the business case analysis of the opportunity? Opportunity Management Should the opportunity be pursued, reevaluated, or rejected? If so, how? Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Opportunity Management Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June

346 Opportunity Register Opportunity Likelihood Cost to Implement Monetary Return on Investment RDT&E Procurement O&M Schedule Performance Program Priority Management Strategy Owner Expected Closure Opportunity 1: Procure Smith rotor blades instead of Jones rotor blades. Mod $3.2M $4M 3 month margin 4% greater lift #2 Reevaluate Summarize the plan Mr. Bill Smith March 2017 Opportunity 2: Summarize the opportunity activity. Mod $350K $25K $37 5K #3 Reject Ms Dana Jones N/A Opportunity 3: Summarize the opportunity activity. High $211K $0.4M $3.6 M 4months less longlead time needed #1 Summarize the plan to realize the opportunity Ms. Kim Johnson January 2017 Source: Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs, June Summary Life-Cycle Risk Management is a structured process Process for risk identification and risk analysis Risk Management is required and is smart business A Robust T&E Program is Essential for Effective Risk Identification and Handling. Provides a foundation for program risk management execution
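Since the opportunity register above is just a structured record per opportunity, a team could keep it under configuration control as data. The sketch below is purely illustrative: the class and field names are ours, taken from the register's column headings rather than any DoD-prescribed format, and because the transcribed table is flattened, the example values (and which return-on-investment column the dollar figure belongs to) are approximate.

```python
# Illustrative sketch only: one opportunity register entry, with field names
# taken from the register's column headings above. The class itself and the
# example values are our own illustration, not a DoD-prescribed format.

from dataclasses import dataclass
from typing import Optional

@dataclass
class OpportunityEntry:
    opportunity: str                          # description of the opportunity
    likelihood: str                           # e.g., "Mod" or "High"
    cost_to_implement: str                    # e.g., "$3.2M"
    roi_rdte: Optional[str] = None            # monetary return on investment
    roi_procurement: Optional[str] = None
    roi_om: Optional[str] = None
    schedule_benefit: Optional[str] = None    # e.g., "3 month margin"
    performance_benefit: Optional[str] = None # e.g., "4% greater lift"
    program_priority: Optional[str] = None    # e.g., "#2"
    management_strategy: Optional[str] = None # e.g., pursue / reevaluate / reject
    owner: str = ""
    expected_closure: Optional[str] = None    # e.g., "March 2017"

# Approximate transcription of the first register row above (the column
# assignment of the dollar figures is uncertain in the flattened table).
entry = OpportunityEntry(
    opportunity="Procure Smith rotor blades instead of Jones rotor blades.",
    likelihood="Mod",
    cost_to_implement="$3.2M",
    roi_procurement="$4M",
    schedule_benefit="3 month margin",
    performance_benefit="4% greater lift",
    program_priority="#2",
    owner="Mr. Bill Smith",
    expected_closure="March 2017",
)
print(entry.opportunity, entry.program_priority)
```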

347 Risk Management / ESOH Exercise Lesson 8.2 Risk Management / ESOH Exercise 347

348 THIS PAGE INTENTIONALLY LEFT BLANK 348

349 Environment, Safety, and Occupational Health (ESOH), and Exercise The following continuous learning modules are relevant to this lesson: - CLE039 Environmental Issues in T&E - CLE009 ESOH in Systems Engineering - CLR030 ESOH in JCIDS ESOH Community of Practice: ESOH Implications for T&E Planning ESOH risk mgmt. is part of the risk mgmt. process Potential risks include: Impacts & adverse effects from routine testing, test failures, and mishaps Scope of risks includes: HAZMAT use & hazardous waste generation Safety (including explosives safety, ionizing & nonionizing radiation) Human health (chemical, physical, biological, ergonomic, etc.) Environmental & occupational noise Impacts to the environment (air, water, soil, flora, fauna) The T&E community must comply with all ESOH statutory & regulatory requirements 349

350 ESOH Risk Management For Environmental, Safety, and Occupational Health (ESOH) risks, the PM shall: Integrate ESOH risk management into the Systems Engineering process; Eliminate ESOH hazards where possible (and manage the risks that can't be eliminated); Use the MIL-STD-882 methodology; Accept residual risk prior to exposing people, equipment, or the environment. Residual risk acceptance authorities: High risks - CAE; Serious risks - PEO; Medium and low risks - PM. DoDI , Encl 3 ESOH System Safety Process Element 1: Document the System Safety Approach; Element 2: Identify and Document Hazards; Element 3: Assess and Document Risk; Element 4: Identify and Document Risk Mitigation Measures; Element 5: Reduce Risk; Element 6: Verify, Validate, & Document Risk Reduction; Element 7: Accept Risk and Document; Element 8: Manage Life-Cycle Risk. 350

351 Document the System Safety Approach The PM shall document the system safety approach for managing hazards as an integral part of the SE process. Minimum requirements for the approach include Describing the risk management effort ID & document the prescribed & derived requirements applicable to the system Define how hazards & associated risks are formally accepted by the appropriate risk acceptance authority Document hazards with a closed-loop Hazard Tracking System (HTS) Source: MIL-STD 882E, page Identify and Document Hazards Hazards are identified through a systematic analysis process that includes system hardware & software, system interfaces & the intended use/environment Consider Mishap data Relevant environmental & occupational health data User physical characteristics, knowledge skills & abilities Lessons learned from legacy/similar systems 6 351

352 Assess & Document Risk (Severity/Consequences) SEVERITY CATEGORIES (MIL-STD-882E):
Catastrophic (1) - Could result in one or more of the following: death, permanent total disability, irreversible significant environmental impact, or monetary loss equal to or exceeding $10M.
Critical (2) - Could result in one or more of the following: permanent partial disability, injuries or occupational illness that may result in hospitalization of at least three personnel, reversible significant environmental impact, or monetary loss equal to or exceeding $1M but less than $10M.
Marginal (3) - Could result in one or more of the following: injury or occupational illness resulting in one or more lost work day(s), reversible moderate environmental impact, or monetary loss equal to or exceeding $100K but less than $1M.
Negligible (4) - Could result in one or more of the following: injury or occupational illness not resulting in a lost work day, minimal environmental impact, or monetary loss less than $100K.
Assess & Document Risk (Probability) PROBABILITY LEVELS (description for an individual item / for the fleet or inventory*):
Frequent (A) - Likely to occur often in the life of an item. / Continuously experienced.
Probable (B) - Will occur several times in the life of an item. / Will occur frequently.
Occasional (C) - Likely to occur sometime in the life of an item. / Will occur several times.
Remote (D) - Unlikely, but possible to occur in the life of an item. / Unlikely, but can reasonably be expected to occur.
Improbable (E) - So unlikely, it can be assumed occurrence may not be experienced in the life of an item. / Unlikely to occur, but possible.
Eliminated (F) - Incapable of occurrence. This level is used when potential hazards are identified and later eliminated.

353 Table A-II Example of Probability Levels (adds a Quantitative column to the probability levels; descriptions given for an individual item / for the fleet or inventory*):
Frequent (A) - Likely to occur often in the life of an item. / Continuously experienced.
Probable (B) - Will occur several times in the life of an item. / Will occur frequently.
Occasional (C) - Likely to occur sometime in the life of an item. / Will occur several times.
Remote (D) - Unlikely, but possible to occur in the life of an item. / Unlikely, but can reasonably be expected to occur.
Improbable (E) - So unlikely, it can be assumed occurrence may not be experienced in the life of an item. / Unlikely to occur, but possible.
Eliminated (F) - Incapable of occurrence. This level is used when potential hazards are identified and later eliminated.
Assess & Document Risk (Risk Matrix) RISK ASSESSMENT MATRIX (MIL-STD-882E), probability vs. severity - Catastrophic (1) / Critical (2) / Marginal (3) / Negligible (4):
Frequent (A): High / High / Serious / Medium
Probable (B): High / High / Serious / Medium
Occasional (C): High / Serious / Medium / Low
Remote (D): Serious / Medium / Medium / Low
Improbable (E): Medium / Medium / Medium / Low
Eliminated (F): Eliminated
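The risk assessment matrix above is a straightforward lookup from severity category and probability level to a risk level, so it can be captured directly in code. The sketch below is illustrative only (the function and variable names are ours); the cell values are transcribed from the matrix on this page.

```python
# Illustrative sketch (function name is ours): MIL-STD-882E Risk Assessment
# Matrix as a lookup from severity category (1-4) and probability level (A-F).
# Cell values are transcribed from the matrix above.

RISK_ASSESSMENT_MATRIX = {
    #     Catastrophic(1)  Critical(2)   Marginal(3)   Negligible(4)
    "A": {1: "High",    2: "High",    3: "Serious", 4: "Medium"},   # Frequent
    "B": {1: "High",    2: "High",    3: "Serious", 4: "Medium"},   # Probable
    "C": {1: "High",    2: "Serious", 3: "Medium",  4: "Low"},      # Occasional
    "D": {1: "Serious", 2: "Medium",  3: "Medium",  4: "Low"},      # Remote
    "E": {1: "Medium",  2: "Medium",  3: "Medium",  4: "Low"},      # Improbable
}

def mil_std_882e_risk(severity: int, probability: str) -> str:
    """Return High / Serious / Medium / Low / Eliminated for a hazard."""
    probability = probability.upper()
    if probability == "F":        # Eliminated -- incapable of occurrence
        return "Eliminated"
    return RISK_ASSESSMENT_MATRIX[probability][severity]

# Example: a Critical (2) hazard assessed as Occasional (C) is a Serious risk,
# which the earlier ESOH acceptance-authority slide routes to the PEO.
print(mil_std_882e_risk(2, "C"))   # -> Serious
```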

354 Identify & Document Risk Mitigation Measures The goal should always be to eliminate the hazard if possible. When the hazard cannot be eliminated Reduce the risk to the lowest acceptable level Within cost, schedule & performance constraints Applying system safety design order of precedence: Eliminate hazards Design changes Incorporate engineered features or devices Provide warning devices Signs, procedures, training, PPE 11 Reduce Risk Mitigation measures are selected and implemented to achieve an acceptable risk level Present the current hazards, their associated severity & probability assessments & status of risk reduction efforts at technical reviews and test readiness reviews

355 Verify, Validate & Document Risk Reduction Verify the implementation & validate the effectiveness of all selected risk mitigation measures through Appropriate analysis Testing Demonstration Inspection Document the verification & validation in the Hazard Tracking System 13 Accept Residual Risk, and Document Before exposing people, equipment, or the environment to known system related hazards, the risks shall be accepted by the appropriate authority as defined in DoDI , Encl. 12. The user representative shall be part of this process throughout the life-cycle of the system; and shall provide formal concurrence before all Serious & High risk acceptance decisions. Each risk acceptance decision shall be documented in the Hazard Tracking System (HTS)

356 Manage Lifecycle Risk After the system is fielded, the program office uses the system safety process to identify hazards & maintain the HTS throughout the system's lifecycle. Test-related risks (such as those in the upcoming exercise) need to be managed throughout the T&E process. In addition, DoD requires program offices to support system-related Class A and B mishap investigations. Software Contribution to System Risk The assessment of risk for software, and consequently software-controlled or software-intensive systems, cannot rely solely on risk severity & probability. Another approach shall be used for the assessment of software's contributions to system risk, one that considers the potential risk severity & the degree of control that software exercises over the system. 356

357 Software Control Categories MIL-STD-882E, Table IV Level Name Description 1 Autonomous (AT) 2 Semi- Autonomous (SAT) 3 Redundant Fault Tolerant (RFT) Software functionality that exercises autonomous control authority over potentially safety significant hardware systems, subsystems, or components without the possibility of predetermined safe detection and intervention by a control entity to preclude the occurrence of a mishap or hazard. (This definition includes complex system/software functionality with multiple subsystems, Interacting parallel processors, multiple interfaces, and safety-critical functions that are time critical.) Software functionality that exercises control authority over potentially safety-significant hardware systems, subsystems, or components, allowing time for predetermined safe detection and intervention by independent safety mechanisms to mitigate or control the mishap or hazard. (This definition includes the control of moderately complex system/software functionality, no parallel processing, or few interfaces, but other safety systems/mechanisms can partially mitigate. System and software fault detection and annunciation notifies the control entity of the need for required safety actions.) Software item that displays safety-significant information requiring immediate operator entity to execute a predetermined action for mitigation or control over a mishap or hazard. Software exception, failure, fault, or delay will allow, or fail to prevent, mishap occurrence. (This definition assumes that the safety-critical display information may be time-critical, but the time available does not exceed the time required for adequate control entity response and hazard control.) Software functionality that issues commands over safety-significant hardware systems, subsystems, or components requiring a control entity to complete the command function. The system detection and functional reaction includes redundant, independent fault tolerant mechanisms for each defined hazardous condition. (This definition assumes that there is adequate fault detection, annunciation, tolerance, and system recovery to prevent the hazard occurrence if software fails, malfunctions, or degrades. There are redundant sources of safety-significant information, and mitigating functionality can respond within any time-critical period.) Software that generates information of a safety-critical nature used to make critical decisions. The system includes several redundant, independent fault tolerant mechanisms for each hazardous condition, detection and display. 4 Influential Software generates information of a safety-related nature used to make decisions by the operator, but does not require operator action to avoid a mishap. 5 No Safety Impact (NSI) Software functionality that does not possess command or control authority over safety significant hardware systems, subsystems, or components and does not provide safety significant information. Software does not provide safety-significant or time sensitive data or information that requires control entity interaction. Software does not transport or resolve communication 17 of safety-significant or time sensitive data. Software Safety Criticality Matrix Severity Safety Criticality Matrix Software Control Category Catastrophic (1) Critical (2) Marginal (3) Negligible (4) 1 SwCI 1 SwCI 1 SwCI 3 SwCI 4 2 SwCI 1 SwCI 2 SwCI 3 SwCI 4 3 SwCI 2 SwCI 3 SwCI 4 SwCI 4 4 SwCI 3 SwCI 4 SwCI 4 SwCI 4 5 SwCI 5 SwCI 5 SwCI 5 SwCI 5 MIL-STD-882E, Table V

358 Software Level of Rigor (LOR) Tasks (MIL-STD-882E, Table V):
SwCI 1 - Program shall perform analysis of requirements, architecture, design, and code; and conduct in-depth safety-specific testing.
SwCI 2 - Program shall perform analysis of requirements, architecture, and design; and conduct in-depth safety-specific testing.
SwCI 3 - Program shall perform analysis of requirements and architecture; and conduct in-depth safety-specific testing.
SwCI 4 - Program shall conduct safety-specific testing.
SwCI 5 - Once assessed by safety engineering as Not Safety, no safety-specific analysis or verification is required.
Consult the Joint Software Systems Safety Engineering Handbook and AOP 52 for guidance on required software analyses.
Relationship Between SwCI, Risk Level, LOR Tasks, and Risk (MIL-STD-882E, Table VI):
SwCI 1 (Risk Level: High) - If SwCI 1 LOR tasks are unspecified or incomplete, the contributions to system risk will be documented as HIGH and provided to the PM for decision. The PM shall document the decision of whether to expend the resources required to implement SwCI 1 LOR tasks or prepare a formal risk assessment for acceptance of a HIGH risk.
SwCI 2 (Risk Level: Serious) - If SwCI 2 LOR tasks are unspecified or incomplete, the contributions to system risk will be documented as SERIOUS and provided to the PM for decision. The PM shall document the decision of whether to expend the resources required to implement SwCI 2 LOR tasks or prepare a formal risk assessment for acceptance of a SERIOUS risk.
SwCI 3 (Risk Level: Medium) - If SwCI 3 LOR tasks are unspecified or incomplete, the contributions to system risk will be documented as MEDIUM and provided to the PM for decision. The PM shall document the decision of whether to expend the resources required to implement SwCI 3 LOR tasks or prepare a formal risk assessment for acceptance of a MEDIUM risk.
SwCI 4 (Risk Level: Low) - If SwCI 4 LOR tasks are unspecified or incomplete, the contributions to system risk will be documented as LOW and provided to the PM for decision. The PM shall document the decision of whether to expend the resources required to implement SwCI 4 LOR tasks or prepare a formal risk assessment for acceptance of a LOW risk.
SwCI 5 (Not Safety) - No safety-specific analyses or testing is required. 358
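The Software Safety Criticality Matrix and the SwCI-to-risk-level relationship above are likewise simple lookups. The sketch below (names are ours; an illustration, not part of MIL-STD-882E) encodes both tables so a reviewer can check a software function's SwCI from its software control category and worst-case severity.

```python
# Illustrative sketch (names are ours, not part of MIL-STD-882E): encodes the
# Software Safety Criticality Matrix and the SwCI-to-risk-level relationship
# shown above. Software risk is driven by severity and the software control
# category (SCC), not by probability.

SW_SAFETY_CRITICALITY_MATRIX = {
    # SCC: {Catastrophic(1), Critical(2), Marginal(3), Negligible(4)} -> SwCI
    1: {1: 1, 2: 1, 3: 3, 4: 4},   # Autonomous (AT)
    2: {1: 1, 2: 2, 3: 3, 4: 4},   # Semi-Autonomous (SAT)
    3: {1: 2, 2: 3, 3: 4, 4: 4},   # Redundant Fault Tolerant (RFT)
    4: {1: 3, 2: 4, 3: 4, 4: 4},   # Influential
    5: {1: 5, 2: 5, 3: 5, 4: 5},   # No Safety Impact (NSI)
}

# Risk level documented for the PM if the LOR tasks for a SwCI are unspecified
# or incomplete (Table VI above).
SWCI_RISK_LEVEL = {1: "High", 2: "Serious", 3: "Medium", 4: "Low", 5: "Not Safety"}

def software_criticality_index(control_category: int, severity: int) -> int:
    """Return the SwCI (1-5) for a software function."""
    return SW_SAFETY_CRITICALITY_MATRIX[control_category][severity]

# Example: semi-autonomous (SCC 2) software whose worst credible mishap is Critical (2).
swci = software_criticality_index(2, 2)
print(swci, SWCI_RISK_LEVEL[swci])   # -> 2 Serious
```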

359 Risk Management / Environment, Safety & Occupational Health (ESOH) Exercise MDD Materiel Solution Analysis CDD Validation Dev. RFP Release A B C Tech Maturation & Risk Reduction Engineering and Manufacturing Development FRP Production & Deployment IOC FOC Operations & Support YOU ARE HERE 21 Risk Management / ESOH Exercise Given: Acquisition documents COIs and CTPs Objective: Identify Safety and Environmental Issues and prepare mitigating strategies Overview (4 steps) 1. Review the system hazard assessment white paper 2. Select one safety and one environmental test hazard 3. Analyze and rate hazards 4. Develop mitigating strategies

360 Hazard Assessment (DoD Risk Management Guide) [likelihood vs. consequence matrix: red = High, yellow = Moderate, green = Low] Example: Test hazard - flutter test, structural failure, destroyed UAV. Likelihood = 3; Consequence = 5; Assessment: High (see matrix). Safety / Environmental Hazards (Sample Format):
Hazard 1 (for example, destroying the UAV) - Rating: Po = 3, C = 5 (H, M, or L) - Mitigating actions: Action 1 - M&S to predict conditions where structural failure is likely to occur; Action 2 - Change the UAV design, to decrease likelihood; Action 3 - Change test methods, to decrease consequences (use a scale model in a wind tunnel instead of the UAV).
Hazard 2 - Rating: Po = X, C = X (H, M, or L) - Mitigating actions: Action 1; Action 2; Action 3.
Should they have used the MIL-STD-882 approach (vice the DoD Risk Mgmt Guide)? 360

361 ESOH Exercise - Timeline Four Tasks (30 minutes): 1. Review system hazard assessment white paper 2. Select one safety and one environmental test hazard 3. Analyze and Rate Hazards (Make educated guesses, as necessary) 4. Develop at least 3 mitigating actions for each hazard

362 THIS PAGE INTENTIONALLY LEFT BLANK 362

363 TST 204 Intermediate Test & Evaluation Course Environment, Safety and Occupational Health (ESOH) Exercise J-SPAW Test Hazard Assessment (White Paper) A Test Plan Working Group (TPWG) was convened for the J-SPAW. The TPWG assessed potential hazards associated with anticipated DT and OT for the J-SPAW. The following types of testing are projected for the J-SPAW: 1. Maximum firing range tests 2. Maximum rate of firing 3. Maximum sustained firing rate 4. Ballistics accuracy firing tests 5. Transportability testing with C-17 and C-5 6. Communications testing 7. Vehicle road testing 8. Vehicle cross country testing 9. Fuel consumption testing 10. Vehicle fording testing 11. Cant limit testing 12. Infrared and radar cross section signature testing The TPWG assessed potential test hazards (both safety and environmental) that could be encountered during the test program. Their preliminary hazards list is as follows: Safety Hazards: 1. Firing tests: a. Catastrophic failure of the cannon tube b. Rounds falling outside the impact area c. Crew injury due to ammunition handling d. Crew injury due to burns from howitzer overheating e. Weapon rollover from cant testing f. Crew hearing loss g. Crew injury due to concussion 2. Transport tests: a. Injury to crew while on-loading / off-loading b. Static electricity shock c. Failure of tie-down lines or links d. Failure of deck plating in transport aircraft 3. Mobility testing: a. Vehicle rollover b. Engine overheat failure c. Damage to precision components from vibration d. Crew injuries due to night operation in blackout conditions 4. Signature testing: a. Crew injury from laser special test equipment b. Radiation burns from radar measurements 363

364 TST 204 Environmental Hazards: Intermediate Test & Evaluation Course 1. Firing tests: a. Dud rounds on range b. Unburned white phosphorous on range from ammunition c. Noise pollution d. Wildlife habitat disturbed from noise, shock waves e. Cordite (gun powder) left on range from unused powder bags f. Range fire due to burning unexpended powder bags g. Range fire from special ammunition (smoke, white phosphorous, etc.) 2. Mobility testing: a. Fuel spills from vehicle accidents b. Fuel spills while refueling c. Fording tests vehicle swamps, oil or fuel contaminates stream 3. Signature testing: a. Wildlife disturbed from excessive radiation 364

365 Data Mgmt & Test Scenarios Exercise Lesson 9 Data Management & Test Scenarios Exercise 365

366 THIS PAGE INTENTIONALLY LEFT BLANK 366

367 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 9 Lesson Title Data Management and Test Scenarios Exercise Lesson Time 2.5 hours Terminal Learning Objectives Given a scenario and DoD guidance, the student will develop information for a data management plan in support of test and evaluation. (TLO #7) Given key requirements of a notional weapon system, the student will develop an operational or developmental test scenario (highlevel test plan) that addresses the COI/CTP and supports the overall program plan. (TLO #8) Enabling Learning Objectives Recognize DoD policy on T&E data management, including data security, and archiving and releasing test data. (ELO #7.1) Describe the data authentication process of verifying and validating the test data set, protecting the integrity of test data, and ensuring validity of collected data to meet test objectives. (ELO #7.2) Recognize the need for measurable, high quality, timely, and cost effective data; to enable unbiased T&E results. (ELO #7.3) Describe the processes for data failure definition and scoring; including reliability, availability and maintainability scoring conferences. (ELO #7.4) Develop information for a data management plan in support of test and evaluation. (ELO #7.5) Given key requirements of a notional weapon system, develop a test scenario (high level test plan); including identification of test conditions, and controlled and uncontrolled variables. (ELO #8.1) Given key requirements of a notional weapon system, develop a test scenario (high level test plan) that supports the overall program plan, including opportunities for combined DT/OT. (ELO #8.2) 367

368 TST 204 Intermediate Test & Evaluation Assignments None. Estimated Student Preparation Time None. Assessment Class participation, oral presentation, and written examination. Related Lessons Prior lesson results. Self Study References DoDD , The Defense Acquisition System DoDI , Operation of the Defense Acquisition System Defense Acquisition Guidebook, Chapter 9 Test and Evaluation Management Guide, 6 th ed., 2012 version TST 204 Student Reference Disc 368

369 Data Management & Test Scenarios Exercise MDD CDD Validation Dev. RFP Release A B C FRP IOC FOC Materiel Solution Analysis Tech Maturation & Risk Reduction Engineering and Manufacturing Development Production & Deployment Operations & Support YOU ARE HERE 1 Learning Objectives Recognize DoD policy on T&E data management, including data security, and archiving and releasing test data. Describe the data authentication process of verifying and validating the test data set, protecting the integrity of test data, and ensuring validity of collected data to meet test objectives. Recognize the need for measurable, high-quality, timely, and cost-effective data; to enable unbiased T&E results. Describe the processes for data failure definition and scoring; including reliability, availability and maintainability scoring conferences. Develop information for a data management plan in support of test and evaluation. Given key requirements of a notional weapon system, develop a test scenario (highlevel test plan); including identification of test conditions, and controlled and uncontrolled variables. Given key requirements of a notional weapon system, develop a test scenario (highlevel test plan) that supports the overall program plan, including opportunities for combined DT/OT

370 This lesson will cover the following topics: 1. Data Management 2. Student Exercise Lesson Topics 3 Data Management Lesson Topics: 1) Data Management 2) Student Exercise 4 370

371 T&E Data / Data Management Policy Data must be collected that will contribute towards assessing: key performance parameters, critical technical parameters, key system attributes, interoperability requirements, cybersecurity requirements, reliability growth, maintainability attributes, developmental test objectives, and others as needed. Paraphrased from DoDI 5000.02, Encl 4 par 5a(11) Note: the service T&E regulations give additional information concerning data management. Additional information can be found in the Service folders, on the student CD-ROM. Data Requirements Base data requirements on: MOEs / MOSs / MOPs Test variables to be measured Sample sizes Evaluation Plan contents Identify agency responsible to collect the data Determine data source, type & format before starting test Exercise sound judgment in determining type & amount of data to be collected Data should be high-quality, measurable, timely, and cost-effective; and should enable unbiased T&E results 6 371

372 Data Analysis, Collection, and Management Plans Purpose of the plans Provide detailed procedures for the collection, reduction, quality assurance, collation, analysis, storage and disposition of data gathered to support evaluations. Objectives of the plans Eliminate duplication of efforts Provide guidance to collection/analysis effort Provide adequate and timely analytical info Manage resources: Instrumentation Data transmission, reduction & storage Data analysis teams (Analysis, Evaluation, Reporting) 7 Elements Of A Test Database The following are essential elements for designing a test database: Accessible to all stakeholders Used for all T&E data for the organization and/or the system under test Ease of use, and ease of data mining Fields for all necessary data Appropriate choice of software Traceability to the originator / generator of the data Current status of the data (for approval, for info., etc.), version / control number, and date Security of the database Permissions (read / write vs. read only) and other controls 8 372
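The database elements listed above can be made concrete with a short, notional sketch. Python is used purely for illustration; the field names, status values, and user roles are hypothetical examples, not a fielded DoD schema.

# Minimal sketch of one record in a T&E test database, illustrating the
# elements listed above (traceability, status, versioning, and access control).
# Field names and status values are hypothetical, not from any service system.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TestDataRecord:
    record_id: str                 # unique key for data mining and retrieval
    originator: str                # traceability to whoever generated the data
    system_under_test: str
    status: str                    # e.g., "draft", "for approval", "authenticated"
    version: str                   # version / control number
    record_date: date
    classification: str            # security marking of the data
    read_write_users: list = field(default_factory=list)   # permissions
    read_only_users: list = field(default_factory=list)
    payload: dict = field(default_factory=dict)            # the measured data itself

record = TestDataRecord(
    record_id="RMD-0001",
    originator="Team 3 data collector",
    system_under_test="CATAPULT",
    status="for approval",
    version="1.0",
    record_date=date(2017, 3, 1),
    classification="UNCLASSIFIED",
    read_write_users=["test_director"],
    read_only_users=["evaluator", "PM"],
    payload={"shot": 1, "munition": "WIMPE", "rmd_inches": 3.5},
)
print(record.status, record.payload)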

373 Archiving Data Data from all T&E phases must be stored and archived to support both current and future uses (such as future T&E efforts) When practical, use electronic media to store data Set up databases for ease of use and ease of data mining Provide for periodic reviews of the database Follow your organization's guidance concerning data retention, disposition, and disposal The Program Manager and test agencies for all programs will provide the Defense Technical Information Center (DTIC) with all reports and the supporting data and metadata for the test events in those reports. DoDI 5000.02, Encl 5, par 10c(5) 9 Archiving Data Example From ATEC Reg (March 2006), by data category and retention:
Raw data (data in its original form, Level 1): retained for 1 year after end of event
Audio/video tape and film (Level 1): retained for 1 year after end of event
Written Level 2 data: retained for 1 year after end of event
Processed & smoothed automated instrumentation data (Level 2): archived for 1 year after end of event
Test database of record (Level 3): archived permanently
Plans and reports (Levels 4-7): archived permanently
Supplemental analyses (Levels 4-7): archived for 3 years (non-oversight), 10 years (oversight)

374 DoD Policy For Accessing Test Data The acquisition chain of command, including the Program Manager, and the DASD(T&E) and their designated representatives will have full and prompt access to all ongoing developmental testing, and all developmental test records and reports... Data may be preliminary and will be identified as such. DoDI 5000.02, Encl 4 par 6c(1) DOT&E, the Program Manager and their designated representatives who have been properly authorized access, will all have full and prompt access to all records, all reports, and all data... Data may be preliminary and will be identified as such. DoDI 5000.02, Encl 5 par 10c(1) Releasing Test Data Within DoD: Test organization commanders determine processes & release authority for reports & information under their control Classified information must be handled per DODD , and associated documents Outside the DoD: Freedom of Information Act requests (from individuals or private industry) should be processed according to DoD Regulation , and service policy Report news media or civic organization requests to the Public Affairs Officer of the appropriate agency Follow service guidance concerning information released to Congress, the GAO, the DoD Inspector General, and similar agencies Follow service guidance concerning release of info to foreign governments, foreign liaison officers, or foreign nationals

375 Data Authentication & Scoring Prior to testing, the procedure & rules for data / test authentication must be developed Data Authentication Group (DAG) determines the validity of test events & test data Prior to testing, it must also be determined what constitutes a failure (DT&E) or a mission failure (OT&E) This information typically comes from the requirements documents and/or failure definition & scoring process Scoring conference(s) Assigns the reason(s) for test failures 13 Data Authentication Process The services/organizations have processes for data authentication. A typical process includes: Data Authentication Group (DAG) charter and standard operating procedures are developed prior to the start of testing After the test data has been collected, the DAG determines whether the data is valid and/or acceptable Whether the test was a valid test Whether the data represents what really happened (instrumentation error, for example) Once the DAG process has been completed, the DAG releases an authenticated event database 14 375

376 Failure Definition & Scoring The services / organizations have processes for failure definition & scoring. A typical process includes: Failure Definition and Scoring Criteria (FD/SC) are developed prior to the start of testing The FD/SC typically lists detailed descriptions of what constitutes a failure, for each essential function. Classification (for example, in which essential function or nonessential function did the failure occur?) Chargeability of test incidents: the cause(s) of the failures. (For example, accident, crew, HW CFE, HW GFE, SW CFE, SW GFE, maintenance, support equipment, tech docs/manuals, training, secondary failure, or unknown) Scoring conferences occur after test data has been authenticated FD/SC are used to determine classification & chargeability of test incidents that occur during R&M testing: which failures count against R&M, and which don't? Test Scenarios & Data Management Exercise Lesson Topics: 1) Data Management 2) Student Exercise
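To make the FD/SC discussion concrete, the sketch below scores a handful of notional test incidents against the chargeability categories listed above and computes a demonstrated MTBF from only the incidents charged to the system. The incident data, operating hours, and the rule that only hardware/software failures are charged are assumptions for illustration only.

# Illustrative sketch only: scoring notional test incidents against notional
# FD/SC chargeability categories, then computing MTBF from the incidents that
# are chargeable to the system under test.
SYSTEM_CHARGEABLE = {"HW CFE", "HW GFE", "SW CFE", "SW GFE"}   # assumed scoring rule

incidents = [
    {"id": 1, "chargeability": "HW GFE"},            # counts against R&M
    {"id": 2, "chargeability": "crew"},              # operator error - not charged
    {"id": 3, "chargeability": "SW CFE"},            # counts against R&M
    {"id": 4, "chargeability": "secondary failure"}, # not charged
]

operating_hours = 512.0
charged = [i for i in incidents if i["chargeability"] in SYSTEM_CHARGEABLE]
mtbf = operating_hours / len(charged) if charged else float("inf")
print(f"Chargeable failures: {len(charged)}, demonstrated MTBF: {mtbf:.0f} hours")
# With 512 hours and 2 chargeable failures, demonstrated MTBF = 256 hours, which
# would exceed the notional 128-hour MTBF CTP mentioned later in this lesson.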

377 Test Scenarios and Data Management Exercise Given: Key operational, technical, and programmatic requirements Objective: Develop a developmental or operational test scenario, along with Data Collection and Data Management Plans Overview: Task 1. Identify mission objective What's the focus? (The instructor will assign you a CTP or COI) Task 2. Identify test variables (controlled / uncontrolled). Task 3. Develop an operational or developmental test scenario. (Develop test conditions to satisfy variables, and develop a DT or OT scenario) Task 4. Determine information for at least two of your data elements. (This information is needed for the Data Collection, Data Analysis, & Data Management Plans). Task 5. Identify opportunities to combine DT/OT. Are there any opportunities for combined DT/OT in your test scenario? 17 Operational Mission Scenarios Impact on Test Planning Operational mission scenarios allow the following to be identified (which facilitates test planning): Needed test resources (platforms, users, support personnel, instrumentation, range time, etc.) Cost and schedule Necessary terrain & weather conditions Environmental or safety restrictions

378 Test variables diagram: Independent variables are either controlled (primary factors, plus background factors that are held constant, grouped naturally, or randomized) or uncontrolled (background factors that are measured or not measured). Dependent variables are the observations (the test results). Difference Between Test Mission Plans & Detailed Test Plans: Test Mission Plans are high level, with an issue focus (COI / CTP); Detailed Test Plans are at the detail level, with a data focus (detailed info on data collection, data analysis, data mgmt., etc.)

379 Data Collection, Analysis, & Data Management Planning Some Data Collection, Data Analysis & Data Mgmt. info. is typically developed along with the Test Scenario: Important data (specific data elements) to be collected Purpose of the data (it will be analyzed to determine what?) Data accuracy and estimated sample sizes needed, for the data elements For the purposes of this exercise, you may state high, medium, or low data accuracy & sample sizes Data collection methods / instrumentation needed, for the data elements Note: more detailed Data Collection/Analysis/Mgmt. planning is typically done later, along with the detailed test plans Exercise Tasks and Timeline Task 1: Identify controlled / uncontrolled variables for your assigned CTP or COI. Task 2: Develop test conditions for one test scenario / mission plan Task 3: Outline your test scenario / mission plan (Note: you DON'T need to assess the entire CTP or COI) Task 4: Determine the following information for at least two data elements that you plan to collect: Purpose of the data, data accuracy & sample sizes needed, and data collection method/instrumentation needed. Task 5: Identify opportunities for Combined DT/OT in your scenario / mission plan (40 minutes to complete all five tasks)

380 COIs and CTPs COIs Can the SPAW be rapidly inserted into the combat environment? Can the SPAW deliver sufficient and accurate fire on the battlefield? Is the SPAW survivable on the battlefield? CTPs Must protect the crew (90% probability of crew survival) against AT mine blast beside or under the platform. MTBF of 128 hours. 23 Thursday Night Homework Read the course material, for the DT&E Test Execution Exercise Read the slides (starting with the DT&E Test Execution Exercise slide) Read the four checklists & supplemental information for the exercise As you read the material, think about how you might write a DRAFT test plan, using the template provided in your book Your team will write a DRAFT test plan, as part of this exercise

381 DT&E Test Execution Exercise Lesson 10 DT&E Test Execution Exercise 381

382 THIS PAGE INTENTIONALLY LEFT BLANK 382

383 TST 204 Intermediate Test & Evaluation Lesson Assignment Sheet Lesson Number Lesson 10 Lesson Title DT&E Test Execution Exercise Lesson Time 6.5 hours Terminal Learning Objectives Given a system description, the student will correctly demonstrate DT&E planning, rehearsing (pilot test), and execution in an organized fashion; including data collection, analysis, evaluation, after action review, and reporting. (TLO #9) Enabling Learning Objectives Given a system description, document a T&E strategy/test plan that integrates policy, program requirements, cost and resource estimates, evaluation framework, and the T&E schedule to accomplish program goals. (ELO #9.1) Given a system description, assess T&E related factors (including resources, product maturity, and personnel). (ELO #9.2) Correctly plan a test readiness review. (ELO #9.3) Given a system description, monitor safety compliance (such as people and the item/system under test) and environmental requirements and constraints, to protect resources and comply with established policies. (ELO #9.4) Given a system description, commit resources needed to complete T&E activities / events with consideration for financial cost estimates for T&E support and resources. (ELO #9.5) Given a system description, analyze raw data into organized and meaningful data products, to support planned analysis, evaluation, and reporting. (ELO #9.6) 383

384 TST 204 Intermediate Test & Evaluation Enabling Learning Objectives (Continued) Given a system description, deliver T&E presentations (quick-look, test, analysis, and evaluation reports) to capture test background, methodology, limitations, results, evaluation, and recommendations to support decision making. (ELO #9.7) Given a system description, develop information on labor and other T&E support and resources needed to support a test project. (ELO #9.8) Given a system description, conduct a test readiness review to determine system/test article readiness. (ELO #9.9) Assignments Homework: read the exercise and think about how to write a DT test plan. Estimated Student Preparation Time 1 hour Assessment Class participation, oral presentation, and written examination. Related Lessons Prior lesson results. Self Study References DoDD 5000.01, The Defense Acquisition System DoDI 5000.02, Operation of the Defense Acquisition System Defense Acquisition Guidebook 18th Flight Test Squadron (AFSOC) Test Directors Handbook Test and Evaluation Management Guide, 6th ed., 2012 version, chapter 4 384

385 DT&E Test Execution Exercise 1 Lesson Objectives Given a system description, document a T&E strategy/test plan that integrates policy, program requirements, cost and resource estimates, evaluation framework, and the T&E schedule to accomplish program goals. Given a system description, assess T&E related factors (including resources, product maturity, and personnel). Correctly plan a test readiness review. Given a system description, monitor safety compliance (such as people and the item/system under test) and environmental requirements and constraints, to protect resources and comply with established policies. Given a system description, commit resources needed to complete T&E activities / events with consideration for financial cost estimates for T&E support and resources. Given a system description, analyze raw data into organized and meaningful data products, to support planned analysis, evaluation, and reporting

386 This lesson will cover the following topics: Lesson Topics 1. Impact of Test Design 2. Overview of Test Phases 3. DT&E Test Execution Exercise 3 Impact of Test Design Lesson Topics: 1) Impact of Test Design 2) Overview of Test Phases 3) DT&E Test Execution Exercise 4 386

387 Impact of Test Design & Analysis Methods on Test Results The following factors may have a large impact on test results and conclusions: 1. Test validity 2. Test design 3. Confidence level & statistical power - Type I and Type II errors 4. Sample size 5. Analysis methods (some of these factors are covered in these slides, the rest in later slides) 5 3 Types of Validity Statistical validity: B changed as A changed. Is this effect statistically significant? Design validity: A alone caused B. Did anything else cause the results? External (operational) validity: A's effect on B is expected to occur in actual operations. Were test conditions realistic enough that the results will apply to the real world? Testers must consider (and attempt to achieve) all three types of validity Note: Operational significance is also important. Does this matter, in the real world? 6 387
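The slide above lists confidence level, statistical power, and sample size among the factors that drive test conclusions. The sketch below shows one standard way these interact, using the normal-approximation sample-size formula for detecting a mean shift; the sigma and delta values are illustrative assumptions, not course requirements.

# Sketch of how confidence level (Type I error), statistical power (Type II
# error), and sample size interact, using the normal-approximation formula
# n = ((z_alpha/2 + z_beta) * sigma / delta)^2 for detecting a mean shift delta.
from math import ceil
from statistics import NormalDist

def sample_size(sigma: float, delta: float, alpha: float = 0.05, power: float = 0.80) -> int:
    z = NormalDist()                      # standard normal distribution
    z_alpha = z.inv_cdf(1 - alpha / 2)    # two-sided Type I error
    z_beta = z.inv_cdf(power)             # power = 1 - Type II error
    return ceil(((z_alpha + z_beta) * sigma / delta) ** 2)

# Example: detect a 2-inch shift in mean miss distance when sigma = 3 inches.
print(sample_size(sigma=3.0, delta=2.0))              # about 18 trials at 95% confidence, 80% power
print(sample_size(sigma=3.0, delta=2.0, power=0.90))  # asking for more power requires more trials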

388 Checklist For Obtaining Validity By designing & executing test events to avoid common sources of error, testers can obtain better test results The detailed validity and significance checklist on your Student Reference CD-ROM might be useful as reference material for improving the validity of your test events 7 Test variables diagram (repeated): independent variables, either controlled or uncontrolled, and dependent variables (the observations) 8 388

389 Pickup Truck Example Fuel efficiency (the dependent variable) depends on test conditions The following independent variables could affect fuel efficiency: Driving speed (0-30 m.p.h.; m.p.h.) Weight (fully loaded, empty) Tire pressure (under pressure, over pressure) Wind Direction (front, rear) Terrain (hills, flat) Road surface (smooth, bumps) Driving technique ("lead foot", normal) 9 Dependent Variables Dependent variables are often specifications (such as range, speed, altitude, or fuel efficiency) Dependent variables are the results (system outputs) of the test, i.e., the observations
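One way to turn the pickup truck example into concrete test conditions is to enumerate a full-factorial matrix over the controlled variables. The sketch below uses only three of the listed factors, an assumption made to keep the matrix small; a real design would also decide which factors to hold constant or randomize.

# Sketch: enumerating test conditions (a full-factorial matrix) for a subset of
# the pickup truck example's controlled independent variables.
from itertools import product

factors = {
    "weight": ["fully loaded", "empty"],
    "tire_pressure": ["under pressure", "over pressure"],
    "terrain": ["hills", "flat"],
}

test_matrix = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for run, condition in enumerate(test_matrix, start=1):
    print(run, condition)   # 2 x 2 x 2 = 8 test conditions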

390 Sources of Dependent Variables: ICD; CDD & CPD; MOEs; MOSs; MOPs; System Specification 11 Independent Variables (controlled, using the pickup truck example): Primary factors: driving speed (hi/low), weight (hi/low). Background factors held constant: day / night. Natural group: weather (clear, rain). Random: terrain or driving technique

391 Independent Variables (cont.): Uncontrolled background factors (environment: wind, visibility) are either measured (e.g., wind speed & direction, visibility) or not measured 13 Sources of Independent Variables: ICD; CDD & CPD; AoA; CONOPS; Operational experience; Previous test results; Sensitivity analysis

392 Control of Variables: Holding constant: test at one level; results are valid for that one level. Natural groups of trials (boundaries may not be easily distinguished): hi / lo speed, day / night, offense / defense. Random levels: naturally occurring frequency 15 Choosing Levels of Controlled Independent Variables: DT&E: build up from the center of the envelope to the extremes to verify specifications. OT&E: test early at the most probable portion of the operational envelope; later do the edges. Choose levels that are operationally significant

393 Randomization Minimizes Order Effects Randomization is the scrambling of the data collection sequence to minimize order effects Order effects are changes which occur over time: Operator learning or tiring Equipment wear or calibration Weather Randomization is often a nuisance Much easier to run experiments in logical order Takes stronger organizational skills Sometimes impossible or cost-prohibitive Failing to randomize introduces risk of failure From 53rd Wing, Eglin AFB, FL 17 Baseline Comparisons Baseline comparisons can be conducted for: Two or more different systems, subsystems, components, training methods, maintenance procedures, manufacturing methods, TTPs, etc. Characteristics of initial prototypes vs. later versions of the system (reliability growth, for example) Purposes / benefits include: Information on cost / schedule / performance / risk of various options Comparison of new vs. existing (is the new system better than the existing system, for example)
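The randomization discussion on the slide above can be illustrated with a short sketch that scrambles a planned run order; the run list is a notional continuation of the pickup truck matrix, and the fixed seed is an assumption so the shuffled order can be reproduced and audited.

# Sketch of randomizing run order to minimize the order effects described above
# (operator learning, equipment wear, weather).
import random

runs = [f"run-{n:02d}" for n in range(1, 9)]   # e.g., the 8 factorial conditions sketched earlier
random.seed(20170301)                          # fixed seed so the order is reproducible/auditable
random.shuffle(runs)                           # scramble the data collection sequence
print(runs)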

394 Overview of Test Phases Lesson Topics: 1) Impact of Test Design 2) Overview of Test Phases 3) DT&E Test Execution Exercise 19 Typical Test Phases Typical test phases for individual tests, and/or series of test events: Phase I: Longer term test planning Phase II: Near term test coordination & preparation Phase III: Test execution & evaluation of data Phase IV: Test reporting

395 Pre-Test Activities & Constraints Longer term test planning: Test & evaluation plans, methods, procedures, objectives Select / procure test ranges, facilities, instrumentation, data collection & reduction facilities, test articles, etc. Safety, environmental & security arrangements/considerations Traceability of tests to requirements & user needs Near term test coordination and preparation: Coordinate test support requirements (range support, logistics & administrative requirements, test articles, threats & targets, instrumentation, etc.) Safety release Train & brief test participants Finalize test plans & test cards Conduct Test Readiness Review if required. (Entrance criteria met, test preparations complete, approval to conduct testing) 21 Safety Releases The Program Manager, working with the user and the T&E community, will provide safety releases... to the developmental and operational testers prior to any test that may impact safety of personnel. DoDI 5000.02, Encl. 5 The services/organizations have processes to obtain safety releases, flight clearances, airworthiness certifications, etc. Army Safety releases (provided by ATEC) are required before pretest training, for tests using soldiers as test players The release states whether the item is safe for troop operation, and any necessary restrictions Safety confirmations are also required at major decision points Safety confirmations indicate whether the system is safe for operation, any technical or operational limitations or precautions, any hazards or safety problems that require further investigation & testing See ATEC Pamphlet 73-1, App K for more info

396 Measurement System Analysis Our ability to assess the performance of a system is only as good as our ability to measure it. There are several factors that contribute to measurement system errors: Lack of discrimination (or resolution) - inability to detect small changes in characteristics being measured Bias - difference between observed average value of measurements and true value Lack of stability - measurements changing or drifting over time Excessive measurement system variability: Repeatability - variation in measurements of the same characteristic by the same operator (equipment or gage variability) Reproducibility - variation in measurements of the same characteristic by different operators (operator variability) 23 Measurement System Analysis (cont.) Diagram: Total observed variation combines product variation (the system performance, which is unknown) with measurement system variation. With a bad measurement system, observed system performance is worse than actual performance (but you'll never know); with a good measurement system, observed system performance is representative of actual performance
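The measurement system analysis slides rest on a simple variance relationship: because independent variances add, observed variation is the true product variation plus the measurement system variation (repeatability plus reproducibility). The sketch below works that arithmetic with illustrative numbers that are assumptions, not course values.

# Sketch of the variance relationship behind measurement system analysis:
# observed variance = product variance + measurement system variance, where the
# measurement system variance is repeatability plus reproducibility.
from math import sqrt

sigma_product = 2.0          # true system (product) standard deviation
sigma_repeatability = 0.5    # same operator, same gage
sigma_reproducibility = 0.3  # operator-to-operator

var_measurement = sigma_repeatability**2 + sigma_reproducibility**2
var_observed = sigma_product**2 + var_measurement
print(f"Measurement system std dev: {sqrt(var_measurement):.2f}")
print(f"Observed std dev: {sqrt(var_observed):.2f} vs. true {sigma_product:.2f}")
# A noisy ("bad") measurement system inflates observed variation, making the
# system under test look worse than it actually is.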

397 Elements of a Test Plan Cover letter Introduction: purpose, background, system description Scope of testing: COIs (OT&E) or CTPs (DT&E), test objectives, test scenarios / methodology, data collection methodology & procedures, evaluation criteria, limitations, resource requirements Administrative & other information: responsibilities, points of contact, visitor control, disclosure policy, training, safety, environmental impact, security, list of acronyms, reports Note: Test plan format and content varies, depending on the type of testing, and T&E goals/requirements T&E Management Guide Chapter 4 has additional info Evaluation Plans - Use / Purpose Purpose of evaluation plans: To determine and describe how data will be collected / managed, combined (from the various data sources), analyzed, and evaluated, to produce the final system evaluation To determine and describe how each of the CTPs, COIs, and/or test parameters will be evaluated or assessed (along with any limitations or impacts) To identify and plan for needed resources To shorten the time needed for writing reports, by writing portions of the report ahead of time

398 Evaluation Plans - Content Introduction: System description, purpose, background, key milestones & events Evaluation Overview: Approach & concept, methodology, limitations & impacts, resources needed Evaluation Details: Design & procedures, including MOPs and criteria for each MOE / MOS Test / Simulation Execution Strategy: Overview of test events & schedule; purpose, description & scope of the test events; data management; limitations & impacts; schedule & cost; training concept; environmental impacts; instrumentation, simulation, simulation requirements, VV&A Note: Test plan format and content varies, depending on the type of testing, and T&E goals/requirements 27 Test Cards Test cards are tools used during an actual test to organize notes, events, data & requirements Test cards typically contain: Header information (for example, aircraft type, tail number, flight date, test program, mission number, run number, security classification of data) Crew composition (names and positions) System conditions (for example, flight parameters, heading, altitude, airspeed, flap settings, etc.) Test item conditions (for example, switch settings, mode of operation, software version number) Step-by-step procedures of required events Go / No Go criteria for each run Data collection information (collection parameters and criteria, start/stop times, recording equipment) Room for notes on each card

399 Test Readiness Review (TRR) A TRR is usually used for system-level DT, but can be used prior to other test phases Testers often conduct TRRs at various predetermined points, leading up to the start of test Some organizations conduct a TRR prior to every scheduled test event PMs typically chair & execute TRRs for major developmental test events (such as the TRR prior to the start of a major test phase) Chief Developmental Testers also chair TRRs, in some organizations 29 Test Readiness Review (cont.) Objective: Assess the readiness of the system, concept, or force development product; support packages; instrumentation; test planning; evaluation planning; and any other area required to support the successful conduct of the test or experiment. Members: Minimum membership includes the PM / Materiel Developer, the operational and developmental testers, and the system evaluator. Four principal components of a TRR: System under test Test plan Test resources Pre-test training Reference: ATEC Pamphlet 73-1, April 2004

400 Army TRR Working Group Example TRR objective is to determine what actions are required to assure resources, training, and test hardware will be in place to support the successful conduct of the test, and to ensure that T&E planning, documentation, design maturity/configuration, and data systems have been adequately addressed. TRR working group is typically composed of the principal T&E WIPT members / stakeholders: 1. PM / Materiel Developer / Chief Developmental Tester. 2. Requirements / user community representative (TRADOC Capability Manager, for the Army) 3. Lead DT&E Organization. 4. Operational Tester. 5. Test Analyst / System Evaluator. 6. Logistician. 7. Trainer. 8. Others, as required. (Info on this slide is paraphrased from several sources, including ATEC Pamphlet 73-1.) 31 TRR Procedures & Products The chairperson provides each member with a TRR agenda and package The TRR working group reviews documentation (the TRR package) and planning efforts to ensure: Resources, training & test hardware will be ready for test conduct T&E planning and documentation, design maturity/configuration, and data systems have been adequately addressed The TRR package is evaluated, and a decision is made, concerning the system's readiness to enter T&E The decision, recommendations, and any action items that must be satisfied before testing begins, are documented in the minutes References: ATEC Pamphlet 73-1, April 2004 DA PAM 73-1, May

401 TRR Package TRR package consists of the following: (1) Coordinated TEMP. (2) Test Plans. (3) Safety Assessment Report (60 days prior to start of test). (4) Environmental Impact Documents (120 days prior to start of test). (5) Description of test item configuration. (6) RAM Failure Definition/Scoring Criteria. (7) Status of System Support Package (SSP), New Equipment Training, MANPRINT, Instrumentation, Data Collection/Reduction Facilities. (8) Supportability IPT approved Supportability Strategy. (9) Airworthiness release or statement, if required. (10) Status of software. (11) Safety Release. (12) Contractor or other test data. (13) Test milestones. Reference: ATEC Pamphlet 73-1, April 2004 TRR Student Exercise & Examples Each student team will conduct a TRR, as part of preparation for the upcoming DT&E Test Execution exercise A sample TRR presentation format is provided, as part of the exercise DoD TRR and OTRR Checklists are on the student CD-ROM (in the Technical Reviews folder). Various service / organization TRR checklists are also on the student CD-ROM (in the Service folders). See the Phase II Test Coordination and Preparation student exercise checklist, for more information

402 Suggested Test Execution Processes Conduct a pre-brief, prior to each test event: Disseminate key information, changes to the test plan, test-unique information such as operating procedures for that test event, etc. Execute the test event Closely monitor test progress & status, and data validity & accuracy, to avoid unnecessary delays Regularly apprise supervisors of the test status, problems encountered, and action taken to overcome problems Debrief ASAP after each test event, to capture key information Lessons learned, action items & responsibility, changes to future test events (from lessons learned), safety issues, questionnaires, etc. Conduct daily quality checks Was all needed data collected, transmitted, & stored? Have the data collection sheets been filled out correctly & completely? Any modifications needed to future test events, based on the data? All key information documented? Validity of test events & test data? 35 Issues Related to Test Execution The following will increase the chance of successful test execution: Realistic cost & schedule estimates Event driven testing, vice schedule driven Adequacy of test planning Proper choice of test methods and analysis methods Availability of test resources (personnel, test articles, test infrastructure, range time, funding, training) Test realism (essential for IOT&E) User participation Effective teamwork and communication at all levels Pilot tests (dry runs) Proper emphasis on safety and risk management Stable, well-defined, realistic test requirements

403 Pilot Tests (Dry Runs) Pilot tests are typically conducted prior to the start of record trials, as a dry run An end to end check of the entire process, if possible Check test procedures, instrumentation, data collection & transmission, data reduction & storage, etc. Issues associated with pilot tests: Does the data count towards the record trials? Conduct pilot tests far enough in advance to make changes & fixes, if needed All personnel should participate if possible (testers, data collectors, analysts, support personnel, etc.) 37 Criteria to Stop Testing Services/Organizations have criteria & procedures for canceling, postponing, or stopping testing System deficiencies or performance problems, safety issues, wasting Fleet services, etc. Corrective action & recertification may be required, prior to resumption of OT Deficiency or anomaly report process may be required

404 Post-Test Activities Data authentication & scoring conferences (if needed) After Action Reviews (if needed) Data analysis Findings, recommendations Test reports & post-test briefings 39 After Action Review (AAR) Purpose - to provide performance feedback from a test event What went right & wrong (what happened & why) How well or poorly did the system perform How well was the test conducted (did the instrumentation work, did the simulation work, etc.) What improvements (if any) can be made to future test events Note: AARs are also done for training events (particularly by the Army)

405 DT&E Test Execution Exercise Lesson Topics: 1) Impact of Test Design 2) Overview of Test Phases 3) DT&E Test Execution Exercise 41 DT&E Test Execution Exercise MDD CDD Validation Dev. RFP Release A B C FRP IOC FOC Materiel Solution Analysis Tech Maturation & Risk Reduction Engineering and Manufacturing Development Production & Deployment Operations & Support YOU ARE HERE Note: This is a DT test, conducted during the Engineering and Manufacturing Development (EMD) acquisition phase

406 Test Execution Exercise Given: Test execution checklists Information on the WIMPE and HADES systems (including test objectives and simplified evaluation parameters) Objective: Prepare for and execute a test; collect and evaluate test data; and report test results and conclusions Overview (5 steps) 1. Phase I (Test Planning) This phase is complete 2. Phase II (Test Coordination and Preparation) and Test Readiness Review 3. Phase III (Test Execution and Evaluation) 4. Phase IV (Test Reporting) 5. Team briefings and class discussion 43 Munitions Technologies The crew of the SPAW needs a new launcher-delivered anti-personnel / anti-equipment munition The current WIMPE munition (which explodes upon impact) has inadequate accuracy and lethality The new HADES munition (which explodes upon impact) is larger and could be more accurate, more lethal; though it will be more expensive The CATAPULT System is on the OSD oversight list for DT&E, OT&E, and LFT&E The system consists of the Catapult, plus the munitions WIMPE = Worthless Individual Munition for Personnel & Equipment HADES = High Accuracy Delivery Explosive System

407 Munitions Technologies (cont.) Delivery accuracy is determined by: Launching a WIMPE and a HADES at a target from a distance of 10 feet Measuring the Radial Miss Distance (RMD) (only the z distance from the target) Analyzing the data (Excel template. The template is on the Student CD-ROM.) For the Record Trials - Group members launch (one warhead at a time) up to 30 WIMPE & 30 HADES warheads using the Catapult launcher One person can launch all the warheads, or warhead launches can be divided among the team members (for example, 5 WIMPE & 5 HADES per person) 45 Catapult diagram: labeled components include the band attach point, tension pin, release angle, and stop pin

408 DT Test Objectives Critical Technical Parameter (CTP): Radial Miss Distance (RMD) Threshold = At least 50% of the WIMPEs will have a RMD of 5 inches or less Threshold = At least 50% of the HADES will have a RMD of 8 inches or less Note that WIMPE has a smaller blast radius than HADES It is highly desirable to reduce variability of the primary factors (independent variables) To reduce sources of error, which could bias our results. Were our test results caused by inherent characteristics of the munitions or by other factors? 47 Simplified Evaluation Parameters Circular Error Probable (CEP) - Radius of a circle within which 50% of the rounds will impact. Median = value of the central data point, in rank-ordered data The Excel spreadsheet calculates the median RMD For purposes of this exercise, you may assume median RMD = CEP
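A minimal sketch of the simplified evaluation described above follows: take the median radial miss distance of the sample as the CEP and compare it to the CTP thresholds. This is not the course Excel template; the RMD values are made-up illustrative data.

# Sketch only: evaluate the RMD CTP by treating the sample median as the CEP,
# per the simplified evaluation parameters above. Data values are notional.
from statistics import median

thresholds = {"WIMPE": 5.0, "HADES": 8.0}   # inches, from the CTP thresholds above
rmd_inches = {
    "WIMPE": [3.2, 6.1, 4.8, 2.9, 5.5, 4.1],
    "HADES": [7.4, 9.2, 6.8, 8.1, 5.9, 7.7],
}

for munition, data in rmd_inches.items():
    cep = median(data)                    # assume median RMD = CEP, per the slide
    met = cep <= thresholds[munition]     # at least 50% of rounds within the threshold
    print(f"{munition}: CEP = {cep:.1f} in, threshold = {thresholds[munition]} in, met: {met}")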

409 Characterize Your Catapult Prior to conducting any pilot tests or record trials, you must first characterize your Catapult This data will be used for this DT exercise, as well as for an upcoming OT exercise next week Characterize the launcher (record launcher settings) For the WIMPE: What release angle provides a 10 ft. range? For the HADES: What release angles provide a 4 ft. range, 6 ft. range, 8 ft. range, and 10 ft. range? For the HADES: Next week's exercise involves a rapid shoot and scoot, from several firing locations (different ranges). As you are characterizing your launcher, think about how the warfighters will use the launcher next week. Important: Make sure you keep track of which catapult you used today so you can use the same catapult next week (keep your characterization data) Or you will need to re-characterize your catapult next week 49 Conduct a Pilot Test Prior to formal test execution, all teams will conduct a pilot test Do an end to end check Test procedures, instrumentation, data collection & transmission, data reduction & storage, etc. All personnel should participate Testers, data collectors, analysts, support personnel, etc. Identify changes & fixes needed for formal test

410 Pilot Test - Sources of Variation Seek to identify and eliminate sources of variation in the data not attributed to the characteristics of the munitions. Potential sources of variation: Catapult movement during launches Observer errors spotting the hit point How much variation is being introduced, relative to the difference you are trying to detect between the munitions? Are the variations significant? Consider the requirements for WIMPE and HADES. Variations (not attributed to munitions characteristics) need to be well less than 5 inches 51 Pilot Test Procedures Using your team's launch procedures, fire dummy rounds at a downrange target that is 10 ft. from the launcher. Have at least one team member observe the launcher during setup and firing. Note any issues with the launcher, the spotters, and/or any other issues that would affect the test results. Gather at least 5-10 data points, then look at the results. Are you getting good results? Why or why not? Make improvements, and collect data for the improvements that you made. Diagram: the radial miss distance is measured from the target (x) to the hit point (o)

411 Pilot Test Data Collect data for both the initial and revised methods The Excel spreadsheets calculate mean and standard deviation Data sheet: for each Shot #, record the RMD (inches) for Method A and Method B; then compute the Mean and Std Dev for each method. Observations (Method A): Observations (Method B): 53 Test Execution Checklists Phase I Test Planning This phase is complete Phase II Test Coordination and Preparation - Complete the remaining items on the checklist Conduct the Test Readiness Review - The test team will obtain instructor approval, before proceeding to Test Phase III Phase III Test Execution and Evaluation - Complete all items on the checklist Submit Quick Look Report to instructor Phase IV Test Reporting Write Final Test Report Prepare for team briefings Checklists in this exercise are modeled after the AFSOC 18th Flight Test Squadron Test Team Execution Checklists
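The pilot test data reduction above (comparing the initial and revised collection methods) can be sketched as follows; the shot values are illustrative placeholders, not course data.

# Sketch of the pilot test comparison: compute mean and standard deviation of
# RMD for the initial (Method A) and revised (Method B) collection methods.
from statistics import mean, stdev

method_a = [4.5, 6.2, 3.8, 7.0, 5.1]   # RMD in inches, initial procedure (notional)
method_b = [4.0, 4.6, 3.9, 4.8, 4.3]   # RMD in inches, after improvements (notional)

for name, shots in (("Method A", method_a), ("Method B", method_b)):
    print(f"{name}: mean = {mean(shots):.2f} in, std dev = {stdev(shots):.2f} in")
# A noticeably smaller standard deviation for Method B suggests the procedural
# change reduced variation not attributable to the munition itself.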

412 Test Execution Exercise - Timeline Exercise Overview & Timeline (5 steps) 1. Phase I (Test Planning) This phase is complete 2. Phase II (Test Coordination and Preparation) and Test Readiness Review 3. Phase III (Test Execution & Evaluation) 4. Phase IV (Test Reporting), and prepare for team briefings 5. Team briefings and class discussion Note: Each team will brief one of the following: (1) Test Plan & TRR results (2) Cost Estimate & Resource Requirements (3) After Action Review results (for the pilot test) (4) After Action Review results (for the record trials) (5) Test Report (6) Post Test Briefing information 55 Test Execution (Quick Look) Exercise Results: a class results matrix with one row per group (GRP #1, #2, #3, #4, #5, ...) and columns for Test Cost ($) plus the Mean, Median, Std Dev, Min, Max, and CEP of RMD for WIMPE (spec = 5 inches) and for HADES (spec = 8 inches)

413 Introduction and Background TST 204 DT&E Test Execution Exercise In prior exercises, you examined the key requirements driving the test program, developed critical technical parameters, developed appropriate measures of performance and data requirements, and developed a test mission scenario. In this exercise, you will prepare for and execute a test; collect and evaluate test data; and report test results and conclusions pertaining to system performance and comparison of the performance of two similar systems. As part of this exercise, four test execution checklists will be used to guide students concerning the assignments and deliverables that correspond with each of the four test phases. The checklists in this exercise are modeled after the AFSOC 18th Flight Test Squadron Test Team Execution Checklists. (Students have a copy of the AFSOC Handbook on their Student Reference CD.) Although the AFSOC checklists are different from those used in this exercise, checklist realism was preserved as much as possible in this exercise. In addition to the test execution checklists, a series of templates will be used in the exercise (one template per deliverable.) The appropriate templates are located in this section of the student book, along with the corresponding test execution checklists. An Excel spreadsheet (found on the Student Reference CD) may be used to analyze the data collected during this exercise. Assignments 1. As part of Phase II (Test Coordination and Preparation): - Coordinate test support (test articles, targets, instrumentation, range facilities, data collection and analysis equipment, operators, support personnel, etc.) - Finalize the test plan, and submit it to the approval authority (an instructor) as part of the Test Readiness Review. - Finalize the test cost estimate / resource requirements, and submit this to the approval authority as part of the Test Readiness Review. - Conduct a Test Readiness Review (an instructor will grant approval to proceed to the next test phase.) 2. As part of Phase III (Test Execution and Evaluation): - Characterize your Catapult - Conduct pilot tests - Execute your planned DT test events (record trials). - Conduct a quick data assessment (check data quality; will the data satisfy the test objectives?) - Analyze the data - Submit a quick look report to the instructor (within 10 minutes of data analysis completion) - Conduct After Action Reviews 413

414 3. As part of Phase IV (Test Reporting): - Write a final report. - Prepare for the class discussion and team briefings - Conduct a Post-Test briefing 4. One person per team will present a briefing, and discuss their team's results with the class. 414

415 Phase I - TEST PLANNING Critical Events Establish test team Obtain system documentation (TEMP, ICD, CDD, etc.) Test resources and cost analysis Draft test plan Determine safety requirements, and make preliminary arrangements Determine environmental requirements, and make preliminary arrangements Determine test security requirements, and make preliminary arrangements Exit Criteria Research and studies complete Draft test plan disseminated Test timeline established Test support requests submitted Checklist Complete Test Director: T. D. Approval Authority: D.A.U. Date Completed: 04/05/06 Note 1: Phase I is complete. You may proceed to Phase II. Note 2: The test resources and cost analysis results from Phase I (on the next two pages) will be used during Phase II. 415

416 Phase I Test Resources and Cost Analysis Results An analysis of DT&E resources and costs was performed, to research and identify support needed to conduct upcoming test events. As part of the study, requirements and options were identified and preliminary selections and arrangements were made. In addition, responsibilities were assigned to ensure necessary resources and support will be in place for upcoming test events. The following information and results were obtained from the study: 1. Information and issues for candidate test ranges were researched. Research topics included range costs, scheduling requirements, range and asset availability, data capabilities, operating environments supported, types of testing supported, and safety / security / environmental considerations. (Details omitted.) 2. Rough order of magnitude cost estimates were prepared for each of the candidate ranges. (Cost estimates omitted.) 2.1 Cost data for the Great Outdoors Test Range is as follows: (Note: deployment costs will be incurred, if using the Great Outdoors test range.) a. Deployment costs (per person): $0.2K b. Range time (cost for ten minutes of range time): (1) Concrete or pavement: $2K per ten minute increment (2) Grass or weeds: $1K per ten minute increment (3) Dirt, sand, gravel, mulch, or mud: $1K per ten minute increment c. Range facilities, launcher facilities, and support (this is a mandatory cost to use the Great Outdoors test range): $0.5K 2.2 Cost data for the Classroom Test Range is as follows: (Note: the Classroom test range includes all spaces inside the classroom building, or any other building. No deployment costs will be incurred for using an indoor space.) a. Range time (cost for ten minutes of range time): (1) Carpet: $1.5K per ten minute increment (2) Tile or hard surface: $2.5K per ten minute increment b. Range facilities, launcher facilities, and support (this is a mandatory cost to use the Classroom test range): $0.5K 416

417 3. The following potential resources and test support were identified and researched: Test articles, targets, instrumentation, range facilities and support, ground support, personnel requirements, data collection equipment, data analysis equipment / software, and deployment and training requirements and costs. (Details omitted) 3.1 Cost data was obtained for the following resources: (Note: training and randomization are both optional; however if these occur, the stated costs will be incurred.) a. Training: $0.2K per person. (This purchases practice shots for the WIMPE, and for the HADES.) b. Randomization: $1K (one-time cost if you choose to randomize.) c. Munitions (Test Teams may purchase between one and five inert munitions of each type; munitions may be re-used for subsequent test runs, unless the munitions are lost or damaged during testing.) (1) WIMPE: $0.2K for each of the munitions. Must purchase at least 1. (2) HADES: $0.5K for each of the munitions. Must purchase at least 1. d. Instrumentation (tape measure or measuring device): $0.5K each e. Data collection equipment: No charge. (The test team has already obtained the necessary data collection equipment.) f. Data analysis equipment and software: No charge. (The test team has already obtained the necessary computer equipment and software.) g. Targets: $0.5K per target. (This cost is for any type of target. Every test requires a target; even if that target is masking tape or a piece of paper. One target is all that is needed since the munitions are inert and will not destroy the target) h. Operators, data collectors, and data analysis personnel: a. Govt Operators: $225/man-hour or $40 per 10 man-minutes. b. Contract Operators: $195/man-hour or $35 per 10 man-minutes. c. Govt Data collectors: $150/man-hour or $30 per 10 man-minutes. d. Contract data collectors: $190/man-hour or $35 per 10 man-minutes. e. Govt Data analyst: $300/man-hour or $60 per 10 man-minutes. f. Contract Data analyst: $260/man-hour or $45 per 10 man-minutes. g. Military Personnel: NO COST 417
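As a worked arithmetic example of how the rates above roll up into a test cost estimate, the sketch below totals one hypothetical set of choices (Classroom range on carpet, 30 minutes of range time, a government operator and data collector, two of each munition, and randomization). It is illustrative only, not a recommended or school-solution estimate.

# Worked cost-estimate sketch using the rates listed above; the selections are
# one hypothetical set of choices, not a recommended solution. Values are in $K.
costs_k = {
    "range time (carpet, 3 x 10 min @ $1.5K)":       3 * 1.5,
    "range & launcher facilities":                   0.5,
    "munitions (2 WIMPE @ $0.2K)":                   2 * 0.2,
    "munitions (2 HADES @ $0.5K)":                   2 * 0.5,
    "instrumentation (tape measure)":                0.5,
    "target":                                        0.5,
    "govt operator (30 man-min @ $40/10 min)":       3 * 0.040,
    "govt data collector (30 man-min @ $30/10 min)": 3 * 0.030,
    "randomization":                                 1.0,
}
total = sum(costs_k.values())
for item, cost in costs_k.items():
    print(f"{item:48s} ${cost:.2f}K")
print(f"{'TOTAL':48s} ${total:.2f}K")   # $8.61K for this hypothetical set of choices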

418 Phase II - TEST COORDINATION and PREPARATION Critical Events Coordinate test support (test articles, targets, instrumentation, range facilities, data collection and analysis equipment, operators, support personnel, etc.) Finalize test plan, and submit to approval authority Finalize test cost estimate / resource requirements, and submit to approval authority Finalize safety, environmental, and test security arrangements Finalize test preparations (logistics and administrative requirements, deployment arrangements, training and briefing test participants, etc.) Obtain test certifications, approvals, and safety release Conduct Test Readiness Review Exit Criteria Test coordination and preparations complete Test plan approved Test cost estimate / resource requirements approved Test Readiness Review conducted, and permission granted to conduct T&E Checklist Complete Test Director: Approval Authority: Date Completed: Note 1: For purposes of this exercise, test plan submission & approval, and cost estimate submission & approval will occur during the Test Readiness Review. Note 2: You must obtain instructor approval (as part of the Test Readiness Review), prior to proceeding to Phase III. Note 3: Normally we would crosswalk test plans with other documents (TEMP, CDD, SOW, SEP, etc.) to ensure we have designed the test event per request, and per higher level guidance. Due to time limitations, we won't do that during this exercise. 418

419 1. Introduction DT&E Test Plan (for Pilot Tests and Record Trials) 1.1 System Description 1.2 System Purpose (Omitted) 1.3 Background (Omitted) 2. Scope of Testing 2.1 Test Objectives and Purpose 2.2 Critical Technical Parameter(s) 2.3 Test Description and Methodology 2.4 Data Collection Methodology 2.5 Evaluation Approach and Methodology 2.6 Resource Requirements (Omitted) 2.7 Risks and Limitations (Omitted) 2.8 Deliverables (i.e., test report, data, other) (Omitted) 3. Administrative and Other Information (Omitted) 419

420 Test Cost Estimate / Resource Requirements 1. Range costs: a. Which test range will be used? b. Deployment costs (if any) c. Cost for range time (range time must be purchased for launcher characterization, pilot tests, record trials, etc.) d. Range facilities, launcher facilities, and support 2. Manpower costs: (manpower must be purchased for launcher characterization, pilot tests, record trials, etc.) a. Operators b. Data collectors c. Data analysis personnel 3. Costs for other resources: a. Training costs (if any) b. Randomization costs (if any) c. Munitions costs, WIMPE* d. Munitions costs, HADES* e. Instrumentation costs f. Data collection equipment g. Data analysis equipment and software h. Targets Total Test Costs: * Note that the munitions may be re-used, with NO ADDITIONAL cost 420

421 Test Readiness Review (TRR) Presentation As part of the Test Readiness Review, Student Teams will justify to the decision-maker (an instructor), whether the following items are ready to go to test. Testing will NOT commence, until each item has either been waived by the decision-maker, or verified as ready to go to test. Make any assumptions you want (you must justify your assumptions to the decision-maker.) TRR Review Elements: (1) Introduction (TRR purpose, program overview, how planned tests support the overall program, and traceability of planned tests to program requirements) (2) Status of risks (what are the risks associated with planned testing; and how are the risks being mitigated?) (3) Test program overview, including the test schedule (4) Test program staffing (organization structure/chart/roles and responsibilities) (5) Preliminary / informal test results (identify any testing that has already been conducted; identify any outstanding discrepancies as a result of previous testing) (Note: #5 is omitted for training purposes) (6) Test requirements (a) Have required test resources been obtained? (Personnel, facilities, instrumentation, funding, training, test documentation, test procedures, test environment, support equipment, logistics, test assets, etc.) Are available test resources adequate? (b) Have test certifications, approvals, and safety release been obtained? (7) Recommendation on the readiness to commence testing 421

422 Phase III - TEST EXECUTION and EVALUATION Critical Events Characterize your Catapult (do this first) Conduct pilot tests (pilot tests must be completed, prior to record trials) Conduct DT test events (record trials) Data assessment (check data quality; will the data satisfy the test objectives?) Data analysis Submit quick look report to approval authority (within 10 minutes of data analysis completion) After Action Review for the pilot tests After Action Review for the record trials Exit Criteria Data analysis completed Quick look report submitted to approval authority After Action Reviews complete Checklist Complete Test Director: Approval Authority: Date Completed: Note 1: For purposes of this exercise, approval from the Approval Authority is not needed, upon completing Phase III. However, you do need to submit a Quick Look Report to the instructor (the Approval Authority), within 10 minutes of the completion of your data analysis. A handwritten report is fine. Note 2: You may start on Phase IV, prior to completing Phase III. 422

423 Characterize Your Catapult Done at 4 ft., 6 ft., 8 ft., and 10 ft. Ranges Before conducting any pilot tests or record trials, you should first characterize your Catapult, by collecting the following data. This data will be used for both the DT&E and OT&E Test Execution exercises. Note that you will collect WIMPE data ONLY at a 10 foot range; whereas HADES data will be collected at 4 ft., 6 ft., 8 ft., and 10 ft. ranges.
Release Angle (record for each range):
4 foot range: HADES ______
6 foot range: HADES ______
8 foot range: HADES ______
10 foot range: WIMPE ______ HADES ______
IMPORTANT: Make sure you save this data for next week's OT Test Execution Exercise.
Record your launcher settings here. Other than the different release angles, the same launcher settings must be used for all of the above test runs.
Stop Pin Location:
Band Attach Location:
Tension Pin Location: 423


More information

Middle Tier Acquisition and Other Rapid Acquisition Pathways

Middle Tier Acquisition and Other Rapid Acquisition Pathways Middle Tier Acquisition and Other Rapid Acquisition Pathways Pete Modigliani Su Chang Dan Ward 26 Jun 18 Contact us at accelerate@mitre.org Approved for public release. Distribution unlimited 17-3828-3.

More information

DoD Instruction dated 8 December Operation of the Defense Acquisition System Statutory and Regulatory Changes

DoD Instruction dated 8 December Operation of the Defense Acquisition System Statutory and Regulatory Changes DoD Instruction 5000.02 dated 8 December 2008 Operation of the Defense Acquisition System Statutory and Regulatory Changes Karen Byrd Learning Capabilities Integration Center April 2009 Changes to the

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE SECRETARY OF THE AIR FORCE AIR FORCE POLICY DIRECTIVE 90-16 31 AUGUST 2011 Special Management STUDIES AND ANALYSES, ASSESSMENTS AND LESSONS LEARNED COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

More information

2016 Major Automated Information System Annual Report

2016 Major Automated Information System Annual Report 2016 Major Automated Information System Annual Report Logistics Modernization Program Increment 2 (LMP Inc 2) Defense Acquisition Management Information Retrieval (DAMIR) UNCLASSIFIED Table of Contents

More information

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION J-8 CJCSI 8510.01C DISTRIBUTION: A, B, C, S MANAGEMENT OF MODELING AND SIMULATION References: See Enclosure C. 1. Purpose. This instruction: a. Implements

More information

DoD Cloud Computing Strategy Needs Implementation Plan and Detailed Waiver Process

DoD Cloud Computing Strategy Needs Implementation Plan and Detailed Waiver Process Inspector General U.S. Department of Defense Report No. DODIG-2015-045 DECEMBER 4, 2014 DoD Cloud Computing Strategy Needs Implementation Plan and Detailed Waiver Process INTEGRITY EFFICIENCY ACCOUNTABILITY

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE SECRETARY OF THE AIR FORCE AIR FORCE INSTRUCTION 16-1002 1 JUNE 2000 Operations Support MODELING AND SIMULATION (M&S) SUPPORT TO ACQUISITION COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

More information

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION J-6 CJCSI 5116.05 DISTRIBUTION: A, B, C MILITARY COMMAND, CONTROL, COMMUNICATIONS, AND COMPUTERS EXECUTIVE BOARD 1. Purpose. This instruction establishes

More information

JITC Joint Interoperability Test, Evaluation, and Certification Overview

JITC Joint Interoperability Test, Evaluation, and Certification Overview JITC Joint Interoperability Test, Evaluation, and Certification Overview November 2016 UNCLASSIFIED 1 Agenda Joint Interoperability Policy and Guidance Requirements and Evaluation Framework Joint Interoperability

More information

UNCLASSIFIED. R-1 ITEM NOMENCLATURE PE N: Consolidated Afloat Network Ent Services(CANES) FY 2012 OCO

UNCLASSIFIED. R-1 ITEM NOMENCLATURE PE N: Consolidated Afloat Network Ent Services(CANES) FY 2012 OCO Exhibit R-2, RDT&E Budget Item Justification: PB 2012 Navy DATE: February 2011 COST ($ in Millions) FY 2010 FY 2013 FY 2014 FY 2015 FY 2016 To Program Element 46.823 63.563 12.906-12.906 15.663 15.125

More information

2016 Major Automated Information System Annual Report

2016 Major Automated Information System Annual Report 2016 Major Automated Information System Annual Report Army Contract Writing System (ACWS) Defense Acquisition Management Information Retrieval (DAMIR) UNCLASSIFIED Table of Contents Common Acronyms and

More information

Technical Data (an Output of Systems Engineering) in the Context of the LCMC

Technical Data (an Output of Systems Engineering) in the Context of the LCMC Technical Data (an Output of Systems Engineering) in the Context of the LCMC 9 th Annual Systems Engineering Conference 23-26 October 2006 San Diego, CA R. Miller 1 Today s Agenda Statement of Purpose

More information

Acquisitions and Contracting Basics in the National Industrial Security Program (NISP)

Acquisitions and Contracting Basics in the National Industrial Security Program (NISP) Acquisitions and Contracting Basics in the National Industrial Security Program (NISP) Lesson 1: Course Introduction Contents Introduction... 2 Opening... 2 Objectives... 2 September 2015 Center for Development

More information

System Safety in Systems Engineering DAU Continuous Learning Module

System Safety in Systems Engineering DAU Continuous Learning Module System Safety in Systems Engineering DAU Continuous Learning Module NDIA Systems Engineering Conference October 25, 2005 Amanda Zarecky Booz Allen Hamilton 703-604-5468 zarecky_amanda@bah.com Course Context

More information

2016 Major Automated Information System Annual Report

2016 Major Automated Information System Annual Report 2016 Major Automated Information System Annual Report Deliberate and Crisis Action Planning and Execution Segments Increment 2B (DCAPES Inc 2B) Defense Acquisition Management Information Retrieval (DAMIR)

More information

Information Technology

Information Technology September 24, 2004 Information Technology Defense Hotline Allegations Concerning the Collaborative Force- Building, Analysis, Sustainment, and Transportation System (D-2004-117) Department of Defense Office

More information

Prepared for Milestone A Decision

Prepared for Milestone A Decision Test and Evaluation Master Plan For the Self-Propelled Artillery Weapon (SPAW) Prepared for Milestone A Decision Approval Authority: ATEC, TACOM, DASD(DT&E), DOT&E Milestone Decision Authority: US Army

More information

GUARDING THE INTENT OF THE REQUIREMENT. Stephen J Scukanec. Eric N Kaplan

GUARDING THE INTENT OF THE REQUIREMENT. Stephen J Scukanec. Eric N Kaplan GUARDING THE INTENT OF THE REQUIREMENT 13th Annual Systems Engineering Conference Hyatt Regency Mission Bay San Diego October 25-28, 2010 Stephen J Scukanec Flight Test and Evaluation Aerospace Systems

More information

Institutionalizing a Culture of Statistical Thinking in DoD Testing

Institutionalizing a Culture of Statistical Thinking in DoD Testing Institutionalizing a Culture of Statistical Thinking in DoD Testing Dr. Catherine Warner Science Advisor Statistical Engineering Leadership Webinar 25 September 2017 Outline Overview of DoD Testing Improving

More information

Headquarters U.S. Air Force

Headquarters U.S. Air Force 23 May 07 1 Headquarters U.S. Air Force I n t e g r i t y - S e r v i c e - E x c e l l e n c e 4094 Integrating ESOH Risk Management into Acquisition Systems Engineering Mr. Sherman Forbes Office of the

More information

UNCLASSIFIED R-1 ITEM NOMENCLATURE FY 2013 OCO

UNCLASSIFIED R-1 ITEM NOMENCLATURE FY 2013 OCO Exhibit R-2, RDT&E Budget Item Justification: PB 213 Navy DATE: February 212 COST ($ in Millions) FY 211 FY 212 Total FY 214 FY 215 FY 216 FY 217 To Complete Total Total Program Element 1.613 1.418 1.56-1.56

More information

Defense Science Board Task Force Developmental Test and Evaluation Study Results

Defense Science Board Task Force Developmental Test and Evaluation Study Results Invited Article ITEA Journal 2008; 29: 215 221 Copyright 2008 by the International Test and Evaluation Association Defense Science Board Task Force Developmental Test and Evaluation Study Results Pete

More information

FY 2017 Annual Report on Cost Assessment Activities. February 2018

FY 2017 Annual Report on Cost Assessment Activities. February 2018 A FY 2017 Annual Report on Cost Assessment Activities February 2018 This page intentionally left blank. FY 2017 Annual Report on Cost Assessment Activities Director, Cost Assessment and Program Evaluation

More information

Joint Interoperability Certification

Joint Interoperability Certification J O I N T I N T E R O P E R B I L I T Y T E S T C O M M N D Joint Interoperability Certification What the Program Manager Should Know By Phuong Tran, Gordon Douglas, & Chris Watson Would you agree that

More information

DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS 3000 MARINE CORPS PENTAGON WASHINGTON, DC

DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS 3000 MARINE CORPS PENTAGON WASHINGTON, DC DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS 3000 MARINE CORPS PENTAGON WASHINGTON, DC 20350-3000 MCO 3100.4 PLI MARINE CORPS ORDER 3100.4 From: To: Subj: Commandant of the Marine Corps

More information

2016 Major Automated Information System Annual Report

2016 Major Automated Information System Annual Report 2016 Major Automated Information System Annual Report Defense Enterprise Accounting and Management System-Increment 1 (DEAMS Inc 1) Defense Acquisition Management Information Retrieval (DAMIR) UNCLASSIFIED

More information

Mission Based T&E Progress

Mission Based T&E Progress U.S. Army Evaluation Center Mission Based T&E Progress Christopher Wilcox Deputy/Technical Director Fires Evaluation Directorate, US AEC 15 Mar 11 2 Purpose and Agenda Purpose: To review the status of

More information

DEPARTMENT OF DEFENSE

DEPARTMENT OF DEFENSE DEPARTMENT OF DEFENSE Key Leadership Position Joint Qualification Board Standard Operating Procedures Version 4 April 6, 2015 Contents 1. Scope and Purpose... 3 2. Applicable Documents... 3 3. Definitions...

More information

Mission-Based Test & Evaluation Strategy: Creating Linkages between Technology Development and Mission Capability

Mission-Based Test & Evaluation Strategy: Creating Linkages between Technology Development and Mission Capability U.S. Army Research, Development and Engineering Command Mission-Based Test & Evaluation Strategy: Creating Linkages between Technology Development and Mission Capability NDIA Systems Engineering Conference

More information

Middle Tier Acquisition and Other Rapid Acquisition Pathways

Middle Tier Acquisition and Other Rapid Acquisition Pathways Middle Tier Acquisition and Other Rapid Acquisition Pathways Pete Modigliani Su Chang Dan Ward Colleen Murphy 28 Jul 18 Contact us at accelerate@mitre.org Approved for public release. Distribution unlimited

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE SECRETARY OF THE AIR FORCE AIR FORCE INSTRUCTION 99-103 6 APRIL 2017 Test and Evaluation CAPABILITIES-BASED TEST AND EVALUATION COMPLIANCE WITH THIS PUBLICATION IS MANDATORY ACCESSIBILITY:

More information

Overview of the Chemical and Biological Defense Program Requirements Process

Overview of the Chemical and Biological Defense Program Requirements Process Overview of the Chemical and Biological Defense Program Requirements Process 14 March 2012 Director, Joint Requirements Office for Chemical, Biological, Radiological, and Nuclear Defense J-8, The Joint

More information

DEPARTMENT OF DEFENSE Developmental Test and Evaluation FY 2013 Annual Report

DEPARTMENT OF DEFENSE Developmental Test and Evaluation FY 2013 Annual Report DEPARTMENT OF DEFENSE Developmental Test and Evaluation FY 2013 Annual Report MARCH 2014 The estimated cost of this report for the Department of Defense is approximately $521,000 in Fiscal Years 2013 2014.

More information

A udit R eport. Office of the Inspector General Department of Defense. Report No. D October 31, 2001

A udit R eport. Office of the Inspector General Department of Defense. Report No. D October 31, 2001 A udit R eport ACQUISITION OF THE FIREFINDER (AN/TPQ-47) RADAR Report No. D-2002-012 October 31, 2001 Office of the Inspector General Department of Defense Report Documentation Page Report Date 31Oct2001

More information

DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS 3000 MARINE CORPS PENTAGON WASHINGTON DC

DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS 3000 MARINE CORPS PENTAGON WASHINGTON DC DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS 3000 MARINE CORPS PENTAGON WASHINGTON DC 20350-3000 Canc: Jan 2018 MCBul 3900 CD&I (CDD) MARINE CORPS BULLETIN 3900 From: Commandant of the

More information

Advanced Simulation Course for Army Simulation Management Professionals

Advanced Simulation Course for Army Simulation Management Professionals Advanced Simulation Course for Army Simulation Management Professionals Gene Paulo Department of Systems Engineering Naval Postgraduate School eppaulo@nps.edu (831)656-3452 Introduction In 2009 NPS was

More information

2011 Ground Robotics Capability Conference. OSD Perspective

2011 Ground Robotics Capability Conference. OSD Perspective 2011 Ground Robotics Capability Conference OSD Perspective Jose M. Gonzalez OUSD (Acquisition, Technology & Logistics) Deputy Director, Portfolio Systems Acquisition, Land Warfare and Munitions Discussion

More information

Headquarters U.S. Air Force

Headquarters U.S. Air Force Headquarters U.S. Air Force Processes and Enabling Policy for Compliance with Materiel International Standardization Agreements (ISA) Mr. Chris Ptachik SAF/AQRE Contractor Engineering Policy Branch 23

More information

UNCLASSIFIED R-1 ITEM NOMENCLATURE FY 2013 OCO

UNCLASSIFIED R-1 ITEM NOMENCLATURE FY 2013 OCO Exhibit R-2, RDT&E Budget Item Justification: PB 213 Army DATE: February 212 COST ($ in Millions) FY 211 FY 212 FY 214 FY 215 FY 216 FY 217 To Complete Program Element 125.44 31.649 4.876-4.876 25.655

More information

DoD Analysis Update: Support to T&E in a Net-Centric World

DoD Analysis Update: Support to T&E in a Net-Centric World Session C: Past and Present T&E Lessons Learned 40 Years of Excellence in Analysis DoD Analysis Update: Support to T&E in a Net-Centric World 2 March 2010 Dr. Wm. Forrest Crain Director, U.S. Army Materiel

More information

Joint Distributed Engineering Plant (JDEP)

Joint Distributed Engineering Plant (JDEP) Joint Distributed Engineering Plant (JDEP) JDEP Strategy Final Report Dr. Judith S. Dahmann John Tindall The MITRE Corporation March 2001 March 2001 Table of Contents page Executive Summary 1 Introduction

More information

UNCLASSIFIED R-1 ITEM NOMENCLATURE

UNCLASSIFIED R-1 ITEM NOMENCLATURE Exhibit R-2, RDT&E Budget Item Justification: PB 2014 Office of Secretary Of Defense DATE: April 2013 COST ($ in Millions) All Prior FY 2014 Years FY 2012 FY 2013 # Base FY 2014 FY 2014 OCO ## Total FY

More information

Department of Defense INSTRUCTION. Non-Lethal Weapons (NLW) Human Effects Characterization

Department of Defense INSTRUCTION. Non-Lethal Weapons (NLW) Human Effects Characterization Department of Defense INSTRUCTION NUMBER 3200.19 May 17, 2012 Incorporating Change 1, September 13, 2017 USD(AT&L) SUBJECT: Non-Lethal Weapons (NLW) Human Effects Characterization References: See Enclosure

More information

DOD INSTRUCTION OPERATION OF THE DOD FINANCIAL MANAGEMENT CERTIFICATION PROGRAM

DOD INSTRUCTION OPERATION OF THE DOD FINANCIAL MANAGEMENT CERTIFICATION PROGRAM DOD INSTRUCTION 1300.26 OPERATION OF THE DOD FINANCIAL MANAGEMENT CERTIFICATION PROGRAM Originating Component: Office of the Under Secretary of Defense (Comptroller)/Chief Financial Officer, DoD Effective:

More information

DOD DIRECTIVE DOD SPACE ENTERPRISE GOVERNANCE AND PRINCIPAL DOD SPACE ADVISOR (PDSA)

DOD DIRECTIVE DOD SPACE ENTERPRISE GOVERNANCE AND PRINCIPAL DOD SPACE ADVISOR (PDSA) DOD DIRECTIVE 5100.96 DOD SPACE ENTERPRISE GOVERNANCE AND PRINCIPAL DOD SPACE ADVISOR (PDSA) Originating Component: Office of the Deputy Chief Management Officer of the Department of Defense Effective:

More information

Development Planning Working Group Update

Development Planning Working Group Update Development Planning Working Group Update Ms. Aileen Sedmak Office of the Deputy Assistant Secretary of Defense for Systems Engineering 16th Annual NDIA Systems Engineering Conference Arlington, VA October

More information

Test and Evaluation and the ABCs: It s All about Speed

Test and Evaluation and the ABCs: It s All about Speed Invited Article ITEA Journal 2009; 30: 7 10 Copyright 2009 by the International Test and Evaluation Association Test and Evaluation and the ABCs: It s All about Speed Steven J. Hutchison, Ph.D. Defense

More information

Department of Defense

Department of Defense Department of Defense DIRECTIVE NUMBER 5105.84 May 11, 2012 DA&M SUBJECT: Director of Cost Assessment and Program Evaluation (DCAPE) References: See Enclosure 1. PURPOSE. This Directive: a. Assigns the

More information

ADMINISTRATIVE INSTRUCTION

ADMINISTRATIVE INSTRUCTION Director of Administration and Management Deputy Chief Management Officer of the Department of Defense ADMINISTRATIVE INSTRUCTION NUMBER 101 July 20, 2012 Incorporating Change 1, April 19, 2017 WHS/HRD

More information

Test and Evaluation (T&E) is essential to successful system

Test and Evaluation (T&E) is essential to successful system Test and Evaluation Myths and Misconceptions Steve Hutchison, Ph.D. Test and Evaluation (T&E) is essential to successful system acquisition. For the last 43 years, the Office of the Secretary of Defense

More information

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION J-8 CJCSI 3170.01C DISTRIBUTION: A, B, C, J, S JOINT CAPABILITIES INTEGRATION AND DEVELOPMENT SYSTEM References: See Enclosure C 1. Purpose. The purpose

More information

(c) DoD Instruction of 11 March 2014 (d) SECNAVINST D (e) CNO WASHINGTON DC Z Apr 11 (NAVADMIN 124/11)

(c) DoD Instruction of 11 March 2014 (d) SECNAVINST D (e) CNO WASHINGTON DC Z Apr 11 (NAVADMIN 124/11) DEPARTMENT OF THE NAVY OFFICE OF THE CHIEF OF NAVAL OPERATIONS 2000 NAVY PENTAGON WASHINGTON DC 20350-2000 OPNAVINST 1320.6 N13 OPNAV INSTRUCTION 1320.6 From: Chief of Naval Operations Subj: 1,095-DAY

More information

Evolutionary Acquisition an Spiral Development in Programs : Policy Issues for Congress

Evolutionary Acquisition an Spiral Development in Programs : Policy Issues for Congress Order Code RS21195 Updated April 8, 2004 Summary Evolutionary Acquisition an Spiral Development in Programs : Policy Issues for Congress Gary J. Pagliano and Ronald O'Rourke Specialists in National Defense

More information

Test and Evaluation Policy

Test and Evaluation Policy Army Regulation 73 1 Test and Evaluation Test and Evaluation Policy UNCLASSIFIED Headquarters Department of the Army Washington, DC 16 November 2016 SUMMARY of CHANGE AR 73 1 Test and Evaluation Policy

More information

DOD MANUAL ACCESSIBILITY OF INFORMATION AND COMMUNICATIONS TECHNOLOGY (ICT)

DOD MANUAL ACCESSIBILITY OF INFORMATION AND COMMUNICATIONS TECHNOLOGY (ICT) DOD MANUAL 8400.01 ACCESSIBILITY OF INFORMATION AND COMMUNICATIONS TECHNOLOGY (ICT) Originating Component: Office of the Chief Information Officer of the Department of Defense Effective: November 14, 2017

More information

Department of Defense INSTRUCTION

Department of Defense INSTRUCTION Department of Defense INSTRUCTION NUMBER 8330.01 May 21, 2014 Incorporating Change 1, December 18, 2017 DoD CIO SUBJECT: Interoperability of Information Technology (IT), Including National Security Systems

More information

2016 Major Automated Information System Annual Report

2016 Major Automated Information System Annual Report 2016 Major Automated Information System Annual Report Tactical Mission Command (TMC) Defense Acquisition Management Information Retrieval (DAMIR) UNCLASSIFIED Table of Contents Common Acronyms and Abbreviations

More information

Test and Evaluation of Highly Complex Systems

Test and Evaluation of Highly Complex Systems Guest Editorial ITEA Journal 2009; 30: 3 6 Copyright 2009 by the International Test and Evaluation Association Test and Evaluation of Highly Complex Systems James J. Streilein, Ph.D. U.S. Army Test and

More information

JOINT TEST AND EVALUATION METHODOLOGY (JTEM) PROGRAM MANAGER S HANDBOOK FOR TESTING IN A JOINT ENVIRONMENT

JOINT TEST AND EVALUATION METHODOLOGY (JTEM) PROGRAM MANAGER S HANDBOOK FOR TESTING IN A JOINT ENVIRONMENT JOINT TEST AND EVALUATION METHODOLOGY (JTEM) PROGRAM MANAGER S HANDBOOK FOR TESTING IN A JOINT ENVIRONMENT Approved By: Maximo Lorenzo Joint Test Director JTEM JT&E APRIL 17, 2009 DISTRIBUTION STATEMENT

More information

2016 Major Automated Information System Annual Report

2016 Major Automated Information System Annual Report 2016 Major Automated Information System Annual Report Integrated Personnel and Pay System-Army Increment 2 (IPPS-A Inc 2) Defense Acquisition Management Information Retrieval (DAMIR) UNCLASSIFIED Table

More information

DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS WASHINGTON, DC MCO C C2I 15 Jun 89

DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS WASHINGTON, DC MCO C C2I 15 Jun 89 DEPARTMENT OF THE NAVY HEADQUARTERS UNITED STATES MARINE CORPS WASHINGTON, DC 20380-0001 MCO 3093.1C C2I MARINE CORPS ORDER 3093.1C From: Commandant of the Marine Corps To: Distribution List Subj: INTRAOPERABILITY

More information

Department of Defense Investment Review Board and Investment Management Process for Defense Business Systems

Department of Defense Investment Review Board and Investment Management Process for Defense Business Systems Department of Defense Investment Review Board and Investment Management Process for Defense Business Systems Report to Congress March 2012 Pursuant to Section 901 of the National Defense Authorization

More information

Department of Defense INSTRUCTION

Department of Defense INSTRUCTION Department of Defense INSTRUCTION NUMBER 6015.17 January 13, 2012 Incorporating Change 1, November 30, 2017 SUBJECT: Military Health System (MHS) Facility Portfolio Management References: See Enclosure

More information

Analyzing Sustainment and Maintenance Alternatives. Moderator Ms. Lisha Adams Deputy Assistant Secretary of Defense for Material Readiness

Analyzing Sustainment and Maintenance Alternatives. Moderator Ms. Lisha Adams Deputy Assistant Secretary of Defense for Material Readiness Analyzing Sustainment and Maintenance Alternatives Moderator Ms. Lisha Adams Deputy Assistant Secretary of Defense for Material Readiness Mr. Christopher Lowman Director for Maintenance Policy, Programs

More information

Department of Defense INSTRUCTION

Department of Defense INSTRUCTION Department of Defense INSTRUCTION NUMBER 5000.70 May 10, 2012 Incorporating Change 2, October 25, 2017 USD(AT&L) SUBJECT: Management of DoD Modeling and Simulation (M&S) Activities References: See Enclosure

More information

Report No. DoDIG June 13, Acquisition of the Navy Organic Airborne and Surface Influence Sweep Needs Improvement

Report No. DoDIG June 13, Acquisition of the Navy Organic Airborne and Surface Influence Sweep Needs Improvement Report No. DoDIG-2012-101 June 13, 2012 Acquisition of the Navy Organic Airborne and Surface Influence Sweep Needs Improvement Additional Copies To obtain additional copies of this report, visit the Web

More information

DoD Joint Federated Assurance Center (JFAC) 2017 Update

DoD Joint Federated Assurance Center (JFAC) 2017 Update DoD Joint Federated Assurance Center (JFAC) 2017 Update Thomas Hurt Office of the Deputy Assistant Secretary of Defense for Systems Engineering 20th Annual NDIA Systems Engineering Conference Springfield,

More information

Fact Sheet: FY2017 National Defense Authorization Act (NDAA) DOD Reform Proposals

Fact Sheet: FY2017 National Defense Authorization Act (NDAA) DOD Reform Proposals Fact Sheet: FY2017 National Defense Authorization Act (NDAA) DOD Reform Proposals Kathleen J. McInnis Analyst in International Security May 25, 2016 Congressional Research Service 7-5700 www.crs.gov R44508

More information

DARPA BAA HR001117S0054 Posh Open Source Hardware (POSH) Frequently Asked Questions Updated November 6, 2017

DARPA BAA HR001117S0054 Posh Open Source Hardware (POSH) Frequently Asked Questions Updated November 6, 2017 General Questions: Question 1. Are international universities allowed to be part of a team? Answer 1. All interested/qualified sources may respond subject to the parameters outlined in BAA. As discussed

More information

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION

CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION CHAIRMAN OF THE JOINT CHIEFS OF STAFF INSTRUCTION J-8 CJCSI 3170.01H DISTRIBUTION: A, B, C, S JOINT CAPABILITIES INTEGRATION AND DEVELOPMENT SYSTEM References: See Enclosure B 1. Purpose. In support of

More information

Suitability... at what cost?

Suitability... at what cost? Suitability... at what cost? 5-minute warm-up act for the T&E Service Exec Panel Talk about 3 things: 1. New Materiel Availability KPP 2. DAU Suitability Research Project 3. Announce NDIA/DAU TST-301 2007

More information

April 17, The Honorable Mac Thornberry Chairman. The Honorable Adam Smith Ranking Member

April 17, The Honorable Mac Thornberry Chairman. The Honorable Adam Smith Ranking Member April 17, 2015 The Honorable Mac Thornberry Chairman The Honorable Adam Smith Ranking Member Armed Services Committee 2126 Rayburn House Office Building Washington, D.C. 20515 Dear Chairman Thornberry

More information

2016 Major Automated Information System Annual Report

2016 Major Automated Information System Annual Report 2016 Major Automated Information System Annual Report Global Combat Support System - Army Increment 2 (GCSS-A Inc 2) Defense Acquisition Management Information Retrieval (DAMIR) UNCLASSIFIED Table of Contents

More information