Basics of PSP and TSP for Systems Engineering


Pittsburgh, PA 15213-3890

Basics of PSP and TSP for Systems Engineering
James McHale
November 2006
Sponsored by the U.S. Department of Defense
© 2003-06 by Carnegie Mellon University, Version 1.0

Agenda
- Why PSP and TSP for Systems Engineering?
- Things That Change, Things That Don't
- Time Logging Exercise
- The TSP Launch
- The TSP Management Framework
- TSP Quality Management

Team Software Process

The Team Software Process (TSP) is an engineering development process originally developed for software teams. TSP addresses common engineering and management issues (the same ones addressed by CMMI):
- cost and schedule predictability
- productivity and product quality
- process improvement

TSP
- truly empowers teams and team members
- is a complete, mature, operational process
- provides immediate and measurable results

Improved Predictability

Effort and schedule deviation are dramatically improved.

Average schedule deviation:
- Typical industry: 100%+
- Study baseline (pre TSP/PSP): 27% to 112%
- With TSP/PSP: < 10%

Average effort/cost deviation:
- Typical industry: 100%+
- Study baseline (pre TSP/PSP): 17% to 85%
- With TSP/PSP: < 5%

Source: CMU/SEI-2000-TR-015
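The deviations above are straightforward to compute from a team's planned and actual figures. A minimal sketch — the numbers are illustrative, not from the cited study:

```python
# Schedule/effort deviation as a percentage of plan: (actual - planned) / planned.
# Illustrative numbers only.
def deviation_pct(planned: float, actual: float) -> float:
    """Positive means overrun, negative means under plan."""
    return 100.0 * (actual - planned) / planned

# A team that planned 40 staff-weeks and spent 44 deviated by 10%.
print(deviation_pct(planned=40, actual=44))  # -> 10.0
```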

Improved Productivity
- A nine-person TSP team from the telecommunications industry developed 89,995 new LOC in 71 weeks, a 41% improvement in productivity.
- A TSP team from the commercial software industry, developing an annual update to a large shrink-wrapped software product, delivered 40% more functionality than initially planned.
- A TSP team within the DoD, developing a new mission planning system, delivered 25% more functionality than initially planned.

Improved Quality

An analysis of 20 projects in 13 organizations showed TSP teams averaged 0.06 defects per thousand lines of new or modified code. Approximately 1/3 of these projects were defect-free.

Defects/KLOC by maturity level:
- Level 1: 7.5
- Level 2: 6.24
- Level 3: 4.73
- Level 4: 2.28
- Level 5: 1.05
- TSP: 0.06

Source: CMU/SEI-2003-TR-014

Accelerated Process Improvement

TSP addresses or supports most of the capabilities expected of a project team through CMMI Level 5. It provides either a starting point or a next step.

[Chart: percentage of CMMI specific practices directly addressed, supported, partially addressed, not addressed, or unrated by TSP, shown for maturity levels 2 through 5.]

Using TSP as a starting point, three organizations have advanced from ML1 to ML4 in less than 3 years.

TSP Results: NAVAIR AV-8B
- Mar. 2000: Began current CMM-based improvement effort (now a CMMI-based effort)
- Oct. 2000: Began PSP/TSP introduction sequence
- Jan. 2001: First TSP team launched
- May 2001: CBA-IPI: CMM level 2; 3 KPAs satisfied at level 3; level 4/5 observations on TSP
- June 2001: Received draft of CMM-TSP gap analysis (levels 2 and 3 only, minus SSM and TP) to help guide improvement efforts
- Feb. 2002: Received late-model gap analysis (including TP at level 3 and levels 4 and 5)
- June 2002: Launched second TSP team
- Sep. 2002: CBA-IPI: CMM level 4 (16 months from L2!)

See Crosstalk, Sep. 2002, "AV-8B's Experiences Using the TSP to Accelerate SW-CMM Adoption," Dr. Bill Hefley, Jeff Schwalb, and Lisa Pracchia; and Crosstalk, Jan. 2004, "The AV-8B Team Learns Synergy of EVM and TSP and Accelerates Software Process Improvement."

AV-8B CMMI Quick Look Profile

[Appraisal matrix: ratings for the specific and generic goals and practices of each CMMI process area (RM, RD, TS, PI, VE, VAL, CM, PPQA, MA, CAR, DAR, OEI, OPD, OPF, OID, OT, OPP, PP, PMC, IPM, QPM, SAM, RSKM, IT). Legend — practices: FI Fully Implemented, LI Largely Implemented, PI Partially Implemented, NI Not Implemented, NR Not Rated; goals: S Satisfied, U Unsatisfied, NR Not Rated.]

Source: NAVAIR

NAVAIR P-3C Journey

[Timeline slide, February 2002 to May 2004: PSP/TSP training; tools; Process Action Teams (PATs); HPO Process Improvement Group kick-off; documenting SSA processes; defined web requirements; CMMI level rating — CMM Level 4 by May 2004, via SCAMPI (Risk Management, Measurement & Analysis).]

Source: NAVAIR

Improved Quality of Work Life
- "A more disciplined process allowed me to do a better job, and allowed me to balance my job with other aspects of my life. This project ended up a lot less stressful than other projects."
- "Promotes a less stressful environment. Can track that the project is on schedule. Fewer defects are seen positively in the organization. It is nice to be associated with a project that had few defects."
- "I liked the level of detail that went into the initial plan, and the constant awareness of the schedule. Allowed us to make adjustments as the project went on, instead of waiting for a major milestone."
- "It was nice that management finally allowed the team to create the schedule."

Adoption

Organizations that are using, piloting, or preparing to pilot the TSP: ABB ABC Informatica Activision Advanced Information Services Advanced Maturity Services, Inc. Alan S. Koch Consultants Ambient Consulting AMRDEC Boeing Centre De Investigacion En Matamaticas Census Bureau CQG, Inc. CRSIP / STSC / DRAPER Davis Systems DOE / Los Alamos DOE / Naval Reactors DPC Cirrus Dynamics Research Corp. EDS Halex Associates Heath Solutions, Inc. Helsana Honeywell IBM Intuit* Iomega I.Q. Inc. KPMG L. G. Electronics Lockheed Martin / KAPL* LogiCare Los Alamos National Laboratory M/A-Com Private Radio Systems, Inc Magellan Navigation* Microsoft* Motiva NASA Langley NCR/Teradata NCS Pearson Northern Horizons Northrop Grumman Oracle* Prodigia S.A. de C.V. PS&J Consulting / Software Six Sigma QuarkSoft Respironics Rockwell Collins SAIC Samsung SDS Siberlink STPP, Inc. STSC Trilogy TYBRIN Corporation - Air Logistics University of Alabama / Huntsville University of Queensland US Army / AMRDEC US Navy / NAVAIR* US Navy / NAVOCEANO* US Navy / NAVSEA* Xerox

*Organizations we are currently working with

TSP for Systems Engineering

NAVAIR and other organizations have discussed the possibilities of adapting TSP for systems engineering use for several years. Late in 2005, an effort was launched to extend TSP practice to systems engineers working in NAVAIR organizations, beginning with those that have had success using TSP for software development. Several organizations, including at least one within NAVAIR, are forging ahead with their own TSP adaptations.

Building High-Performance Teams

TSP builds high-performance teams from the bottom up.
1. Teaming Skills: process discipline; performance measures; estimating & planning skills; quality management skills
2. Team Building: goal setting; role assignment; tailored team process; detailed balanced plans
3. Team Management: team communication; team coordination; project tracking; risk analysis

Personal Software Process

The PSP is a process designed for individual use that applies to structured personal tasks. PSP builds the teaming skills required for the TSP. With PSP, developers learn how to use a defined process and how to measure, estimate, plan, and track their work. This leads to
- better estimating, planning, and tracking
- protection against over-commitment
- a personal commitment to quality
- personal involvement in process improvement

PSP-TSP Process Evolution
- PSP0: current process; time recording; defect recording; defect type standard
- PSP0.1: coding standard; size measurement; process improvement proposal (PIP)
- PSP1: size estimating; test report
- PSP1.1: task planning; schedule planning
- PSP2: code reviews; design reviews
- PSP2.1: design templates
- TSP: team development

PSP Improves Performance

Estimation accuracy:
- fewer underestimates
- more accurate estimates
- estimates balanced around zero

[Histograms of effort estimation accuracy (-200% to +100%) at PSP0, PSP1, and PSP2.]

Quality:
- yield improves by 2X to 3X
- fewer defects in unit test, integration test, system test
- COQ is flat or reduced

PSP Quality Results

[Chart: mean number of defects per KLOC removed in compile and test, plotted by program number (0 through 11) as students progress through the PSP levels; the mean declines steadily as the PSP level increases.]

Agenda
- Why PSP and TSP for Systems Engineering?
- Things That Change, Things That Don't
- Time Logging Exercise
- The TSP Launch
- The TSP Management Framework
- TSP Quality Management

Non-Software Disciplines

Many software-intensive projects have significant non-software components in terms of
- requirements and test support activities
- customer deliverables

The ways that these other activities are planned, staffed, and managed are reflected in organizational structure:
- separate departments for systems engineering, test, documentation, etc. (often depends on the size of the organization and the size of the typical project)
- multi-disciplinary teams
- matrixed project teams

Introduction to Personal Process

SEI teaches a two-day class, Introduction to Personal Process, which begins the individual quality journey by raising the issues of size measures and process and defect definitions for intellectual work other than software development. It makes both economic and technical sense to extend the formal definitions of such work so that it may be planned and tracked with TSP methods. NAVAIR has been a leader in adapting PSP and TSP to non-software work, and is actively engaged with SEI to formalize this work.

Process Improvement for Others

Applying TSP practices to disciplines other than software engineering can be relatively straightforward:
- many teams are already doing it successfully
- TSP is based on CMM originally, which was based roughly on Crosby's five-level model of the manufacturing quality journey
- planning and tracking mechanisms are not software-specific
- but size and defect definitions (by default) are rooted in the software-specific examples from PSP training!

In order to adapt PSP for use by other disciplines, size measures and defect definitions must be addressed.

Size Measures

For a size measure to be useful, it must be
- useful for planning
- precisely defined
- directly countable in an intermediate or final product
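For software, lines of code are the canonical example of such a measure, and "precisely defined" means the counting rules are written down. A sketch of a counter whose rules (non-blank, non-comment lines) are an assumption for illustration, not a PSP mandate:

```python
# Hypothetical size counter: counts non-blank, non-comment lines of Python.
# The counting rules here are illustrative; PSP requires only that whatever
# rules you choose be precisely defined and applied consistently.
def count_loc(source: str) -> int:
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            count += 1
    return count

program = """\
# read two numbers and print their sum
a = 1
b = 2

print(a + b)
"""
print(count_loc(program))  # -> 3
```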

Defect Definitions

A defect is anything in an interim or finished product that must be changed for the product to be used as intended. Defects in test procedures, requirements analyses, specifications, or user documentation can all adversely affect a customer's use of the delivered product. Defect definitions must make sense to the people who must correct them. Defect correction is sometimes called rework.

Building High-Performance Teams

TSP builds high-performance teams from the bottom up.
1. Teaming Skills: process discipline; performance measures; estimating & planning skills; quality management skills
2. Team Building: goal setting; role assignment; tailored team process; detailed balanced plans
3. Team Management: team communication; team coordination; project tracking; risk analysis

Team Management Framework

The TSP team management framework helps the team meet their planned commitments by providing support for
- team communication and coordination
- project tracking and status reporting
- requirements management
- change management
- risk management

Team members gather data and manage their personal plans. These data are consolidated at the team level and used by the team to manage the team's plan.

TSP Base Measures
- Size
- Effort
- Quality
- Schedule

Source: CMU/SEI-92-TR-019

TSP Project Tracking

[Diagram: each engineer enters time by task, the week each task is completed, defects by component and phase, and size by component. These roll up into product summaries, task status, a quality summary, and schedule status, yielding updated team and engineer task, schedule, and quality plans and a team task and schedule summary.]

Tracking with TSP Measures

The TSP base measures can be combined to provide a number of derived measures for managing projects.

TSP derived measures:
- Estimation accuracy (size/time)
- Prediction intervals (size/time)
- Time in phase distribution
- Defect injection phase distribution
- Defect removal phase distribution
- Productivity
- %Reuse
- %New Reusable
- Cost performance index
- Planned value
- Earned value
- Predicted earned value
- Defect density
- Defect density by phase
- Defect removal rate by phase
- Defect removal leverage
- Review rates
- Process yield
- Phase yield
- Failure cost of quality
- Appraisal cost of quality
- Appraisal/Failure COQ ratio
- Percent defect free
- Defect removal profiles
- Quality profile
- Quality profile index
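Several of these derived measures follow directly from the base measures. A sketch with made-up numbers (the formulas shown are the conventional definitions; treat the specifics as illustrative):

```python
# Deriving a few TSP-style measures from the base measures.
# All input numbers are invented for illustration.
new_loc = 2000          # size: new and changed LOC
hours = 100.0           # effort: on-task hours
defects_entering = 40   # defects present entering a phase
defects_removed = 30    # defects removed in that phase
test_defects = 4        # defects found in test

productivity = new_loc / hours                        # LOC per hour
defect_density = 1000.0 * test_defects / new_loc      # defects/KLOC in test
phase_yield = 100.0 * defects_removed / defects_entering  # percent removed

print(productivity, defect_density, phase_yield)  # -> 20.0 2.0 75.0
```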

TSP Weekly Tracking

TSP teams track their status weekly using a defined process and the weekly status summary in the TSP support tool.

Earned Value Management

TSP teams review progress at the weekly meeting using earned value tracking provided by the TSP support tool.

[Chart: percent complete by week, 8/30/2004 through 4/25/2005, showing cumulative planned value, cumulative earned value, cumulative predicted earned value, and baseline cumulative plan value.]
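The bookkeeping behind such a chart can be sketched simply: each task's planned value is its share of total planned hours, and a task earns its full value only when complete (the task data below are invented; the TSP tool also computes predicted earned value, omitted here):

```python
# Minimal planned-value / earned-value bookkeeping, TSP-style.
# Each tuple is (planned_hours, completed?); numbers are illustrative.
tasks = [
    (10.0, True),
    (20.0, True),
    (30.0, False),
    (40.0, False),
]

total_hours = sum(h for h, _ in tasks)
# Planned value: each task's percentage share of the total plan.
planned_value = [100.0 * h / total_hours for h, _ in tasks]
# Earned value: tasks earn their full planned value only on completion.
earned_value = sum(pv for pv, (_, done) in zip(planned_value, tasks) if done)

print(round(earned_value, 1))  # -> 30.0 (percent complete)
```

Crediting value only on completion is deliberate: it prevents the "90% done" illusion that partial-credit tracking invites.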

Resource Management

TSP teams review resource utilization at the weekly meeting using analyses provided by the TSP support tool.

[Chart: cumulative planned hours vs. cumulative actual hours by week, 8/30/2004 through 4/25/2005.]

Quality Management

TSP teams use the Quality Profile as an early warning indicator of post-development defects. The quality profile uses five software quality benchmarks: design/code time, design review time, code review time, compile defects/KLOC, and unit test defects/KLOC. Satisfied criteria are plotted at the outside edge of the chart.

[Charts: Component 2, a high-quality component with all five criteria satisfied, vs. Component 5, a poor-quality component. Inadequate design review time results in design defects escaping to test and production.]
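The five benchmark checks can be sketched as simple threshold tests. The thresholds below are commonly cited TSP guidelines (design time at least equal to coding time, review time at least half of the corresponding work time, and caps on compile and unit-test defect density); treat the exact values as assumptions for illustration:

```python
# Sketch of the five quality-profile checks. Times are in hours,
# defect densities in defects/KLOC; thresholds are assumed guidelines.
def quality_profile(design, design_rev, code, code_rev,
                    compile_dk, unit_test_dk):
    return {
        "design_vs_code":    design >= code,          # enough design time
        "design_review":     design_rev >= 0.5 * design,
        "code_review":       code_rev >= 0.5 * code,
        "compile_defects":   compile_dk < 10.0,
        "unit_test_defects": unit_test_dk < 5.0,
    }

# A component with skimped code reviews and high compile defect density
# fails two of the five criteria.
profile = quality_profile(design=8, design_rev=4, code=8, code_rev=2,
                          compile_dk=12.0, unit_test_dk=3.0)
print(profile)
```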

Defect Removal Profile

TSP teams use the Defect Removal Profile to track
- plan and actual defects removed by phase
- early vs. late defect removal

[Chart: plan and actual defects removed by phase for assembly SYSTEM, across REQ Inspection, HLD Inspection, DLD Review, DLD Inspection, Code, Code Review, Compile, Code Inspection, Unit Test, Build and Integration Test, and System Test.]

Agenda
- Why PSP and TSP for Systems Engineering?
- Things That Change, Things That Don't
- Time Logging Exercise
- The TSP Launch
- The TSP Management Framework
- TSP Quality Management

Exercise Objectives

The PSP is the foundation for the TSP. This exercise provides
- an understanding of the baseline process, PSP0
- familiarity with the basic measurement forms used in the PSP

Similar measures and forms are used in the TSP.

Basic Process Elements
- A process script and basic measures
- A project plan summary form
- A time recording log
- A defect recording log
- A defect type standard

Basic Process Measures -1

The reason to measure a process is to understand it:
- how much time is spent in various activities
- what is produced at various times
- how many defects are injected and removed, and when

With these data, engineers can better
- plan and estimate the work to be done
- evaluate the results
- improve the process for the next project

Basic Process Measures -2

To measure the process, the work is divided into defined activities called phases. Each phase consists of
- the task to be done during the phase
- the entry criteria, or the items required before the work can start
- the exit criteria, or the items that must be produced by the end of the phase
- verification steps to ensure that the work is properly done

Basic Process Measures -3

The measures for each phase are
- time spent (in minutes) in that phase
- defects injected in that phase
- defects removed in that phase

The program size is also measured, but only during the postmortem phase at the end of the project. These measures provide the foundation for all PSP measurements, analyses, and planning.

Baseline Process Phases
- Planning
- Development: Design, Code, Compile, Test
- Postmortem

A Process Script: PSP0

Purpose: To guide you in developing module-level programs.

Entry criteria: problem description; PSP0 Project Plan Summary form; Time and Defect Recording Logs; Defect Type Standard; stopwatch (optional).

1. Planning: Produce or obtain a requirements statement. Estimate the required development time. Enter the plan data in the Project Plan Summary form. Complete the Time Recording Log.
2. Development: Design the program. Implement the design. Compile the program and fix and log all defects found. Test the program and fix and log all defects found. Complete the Time Recording Log.
3. Postmortem: Complete the Project Plan Summary form with actual time, defect, and size data.

Exit criteria: a thoroughly tested program; completed Project Plan Summary form with estimated and actual data; completed Defect and Time Recording Logs.

PSP0 Project Plan Summary

The project plan summary holds project data in summary form:
- header: student, date, program, program #, instructor, language
- planned, actual, to-date, and to-date % data
- time in phase (min.): Planning, Design, Code, Compile, Test, Postmortem, Total
- defects injected: Planning, Design, Code, Compile, Test, Total Development
- defects removed: Planning, Design, Code, Compile, Test, Total Development, After Development

Time Recording Log

Engineers use the time recording log to record
- the time when they start on a project phase
- the time when they stop work on a phase
- the interruption time
- the elapsed time less interruption time
- comments
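The "elapsed time less interruption time" column (often called delta time) is simple arithmetic. A sketch assuming HH:MM times within one day:

```python
# Time-log delta time: elapsed minutes between start and stop,
# minus interruption minutes. Assumes start and stop fall on the same day.
from datetime import datetime

def delta_minutes(start: str, stop: str, interruption_min: int = 0) -> int:
    fmt = "%H:%M"
    elapsed = (datetime.strptime(stop, fmt)
               - datetime.strptime(start, fmt)).seconds // 60
    return elapsed - interruption_min

# Six minutes of planning with no interruptions:
print(delta_minutes("8:00", "8:06"))        # -> 6
# 90 minutes of coding minus a 15-minute phone call:
print(delta_minutes("9:00", "10:30", 15))   # -> 75
```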

Defect Recording Log

Engineers use the defect recording log to record information about all defects found in reviews, compiling, and test:
- the defect number
- the defect type
- the phase in which it was injected
- the phase in which it was removed
- the time to find and fix the defect
- a brief description of the defect

If the defect was injected while fixing a defect, that defect's number is recorded.
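A defect-log entry is just a record with the fields listed above. A sketch (the field names are illustrative, not the official PSP form layout):

```python
# One defect-log record; field names are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DefectRecord:
    number: int
    defect_type: str          # category from the defect type standard
    injected_phase: str
    removed_phase: str
    fix_time_min: int         # time to find and fix, in minutes
    description: str
    fix_ref: Optional[int] = None  # number of the defect being fixed when
                                   # this one was injected, if any

d = DefectRecord(number=1, defect_type="Function",
                 injected_phase="Design", removed_phase="Test",
                 fix_time_min=12, description="off-by-one in loop bound")
print(d.injected_phase, d.fix_ref)  # -> Design None
```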

Exercise Instructions -1
- Read through the PSP0 process scripts (in the workbook) so that you understand the entry and exit criteria for each phase.
- Read JD's scenario for program 1A and fill out the time log. The defect log and project plan summary are already filled out for you.
- Refer to the instructions for each form to determine what information goes in each field.

Exercise Instructions -2

When did JD start? When did he finish? Was he interrupted? What process phase is this? Where should this information be recorded?

JD begins work on assignment 1A [8:00] by reviewing the requirements in the assignment package, including the test requirements, to be sure he understands them. He copies the requirements to his note pad. Then, based on the data presented on past student performance and JD's feeling about his own performance, he estimates this assignment will take 3 hours and writes this on his note pad [8:06].

Results
- How long did the project take?
- How many defects were removed?
- In what phase did JD spend the most time?
- What percent of JD's time was spent in compile + test?
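The last question reduces to compile and test minutes over total minutes. A sketch with invented per-phase numbers (not JD's actual data):

```python
# Percent of time in compile + test, from per-phase minutes.
# The minutes below are made up for illustration, not from the exercise.
phase_min = {"Planning": 6, "Design": 25, "Code": 40,
             "Compile": 10, "Test": 30, "Postmortem": 9}

total = sum(phase_min.values())
pct_compile_test = 100.0 * (phase_min["Compile"] + phase_min["Test"]) / total
print(round(pct_compile_test, 1))  # -> 33.3
```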

Exercise Summary

The baseline personal process is simple and easy to use. The PSP forms simplify data collection and provide a convenient reference for planning future projects. The basic PSP time, size, and defect measures provide the data for the TSP.

HOMEWORK: For systems engineering in your organization, how would the Plan Summary change? What phases of development would you define?

Agenda

- Why PSP and TSP for Systems Engineering?
- Things That Change, Things That Don't
- Time Logging Exercise
- The TSP Launch
- The TSP Management Framework
- TSP Quality Management

Building High-Performance Teams

TSP builds high-performance teams from the bottom up:
1. Teaming Skills: process discipline, performance measures, estimating and planning skills, quality management skills
2. Team Building: goal setting, role assignment, tailored team process, detailed balanced plans
3. Team Management: team communication, team coordination, project tracking, risk analysis

TSP Structure and Flow -1

In the TSP, each major project cycle or phase begins with a launch or relaunch. The Launch is a defined team planning process that also facilitates team-building. The team reaches a common understanding of the work and the approach, and produces a detailed plan to guide the next development phase or cycle. Each cycle ends with a postmortem.

[Figure: Launch -> Cycle 1 -> Postmortem -> Relaunch -> Cycle 2 -> Postmortem]

TSP Structure and Flow -2

TSP has four principal development phases: requirements, high-level design, implementation, and integration and test (the TSP default), or a project-defined lifecycle. Each phase begins with a launch or relaunch and ends with a postmortem.

TSP projects can start or end on any phase:
- from requirements through system test
- requirements only
- high-level design only
- as needed to do the work

TSP Structure and Flow -3

The TSP phases can and should overlap. The TSP development strategy encourages
- incremental development
- iterative development
- multiple builds or cycles
- work-ahead

TSP permits whatever process structure makes the most business and technical sense to the team.

[Figure: a launch and successive relaunches driving Iterations 1-4, each ending in a postmortem]

TSP Process Elements

Checklists, specifications, standards, and other process assets (22), including
- TSP introduction sequence
- launch planning guidance
- executive tools, such as checklists for planning assessment and quarterly reviews

TSP role specifications (12), including
- meeting roles and responsibilities
- inspection roles and responsibilities
- customer interface manager role and responsibilities
- process manager role and responsibilities

Forms (22), including
- Time Recording Log
- Defect Recording Log
- Inspection Report
- Process Inventory
- Quality Summary

Process scripts (30), including
- overall development and enhancement process
- overall maintenance and enhancement process
- launch process
- test defect handling

The Launch Process Meetings

Day 1
1. Establish Product and Business Goals
2. Assign Roles and Define Team Goals
3. Produce Development Strategy

Day 2
4. Build Top-down and Next-Phase Plans
5. Develop the Quality Plan
6. Build Bottom-up and Consolidated Plans

Day 3
7. Conduct Risk Assessment
8. Prepare Management Briefing and Launch Report

Day 4
9. Hold Management Review
PM. Launch Postmortem

The TSP Launch Artifacts

Inputs: business needs, management goals, product requirements.

The launch answers six questions (What? How? When? Who? How well? What if?), producing
- team goals, conceptual design, planned products, size estimates
- team strategy, team process
- task hour plan, schedule plan, earned-value plan
- team roles, task plans, detailed plans
- quality plan
- risk evaluation, alternative plans

TSP Project Tracking -1

Project tracking in the TSP is based on the principles and measures used in the PSP. The detailed team and individual plans facilitate precise project tracking.

Each team member is responsible for
- gathering data on their work
- tracking status against their personal plan
- keeping the team informed
- the quality of the work they produce

TSP Weekly Meeting

Manager's report (team leader)
- new issues and developments

Role reports (8, more or less)
- customer/requirements, design, implementation, test, planning, process, quality, support

Risk report
- status and changes in assigned risks
- impending flag dates and required actions

Project status
- individual and team (planning manager)

Next week's plans
- individual tasks
- dependencies (e.g., reviews needed)
- task, hour, and EV goals

Agenda

- Why PSP and TSP for Systems Engineering?
- Things That Change, Things That Don't
- Time Logging Exercise
- The TSP Launch
- The TSP Management Framework
- TSP Quality Management

TSP Project Tracking -2

Project tracking in TSP is based on the team's plan:
- task hour and task completion data
- plan and earned value

Individual plans facilitate precise project tracking. Team members are each responsible for
- gathering data on their work
- tracking status against their personal plans
- the quality of the work that they produce
- keeping the team informed of their progress

Individual team member data are consolidated each week so that the team can assess progress against goals.

The WEEK Summary

The weekly team meeting is the forum that the team uses to
- track progress against the plan
- track the status of the project's issues and risks
- communicate with each other

TSP Week Summary - Form WEEK
Name: Consolidated Team Plan   Date: 2/7/2000
Team: Security System Upgrade  Status for Cycle Week: 5   Week Date: 1/31/2000

Weekly Data                          Plan    Actual   Plan/Actual
Project hours for this week          80.0    69.0     1.16
Project hours this cycle to date     400.0   344.8    1.16
Earned value for this week           10.3    3.1      3.37
Earned value this cycle to date      40.2    30.0     1.34
To-date hours for tasks completed    293.0   303.8    0.96

Tasks completed (assembly, phase, task, resource, plan hours, actual hours, earned value, planned week, plan/actual hours):
SYSTEM  REQ  Write SRS general sections              tmc  14.0  12.0  1.4  4  1.17
SYSTEM  REQ  Weekly requirements analysis meeting 5  tma   4.0   4.0  0.4  5  1.00
SYSTEM  REQ  Weekly requirements analysis meeting 5  tmb   4.0   4.0  0.4  5  1.00
SYSTEM  REQ  Weekly requirements analysis meeting 5  tmc   4.0   4.0  0.4  5  1.00
SYSTEM  REQ  Weekly requirements analysis meeting 5  tmd   4.0   4.0  0.4  5  1.00

Tasks due through week 7:
SYSTEM  REQ  Review SRS general sections             tmc   5.0   -    0.0  4  -
SYSTEM  STP  Complete Validation Test Plan           tmd   8.0   8.5  0.0  4  0.94

Maintaining the Team's Schedule

The team manages its commitments by using the data it collects. The team determines how it is doing against its plan.

If the team is falling behind, it determines
- the likely cause
- what the team can do to maintain its commitment

The team informs management if the commitment cannot be maintained or if management help is needed.

Determining Status Against Plan -1

Two things are important here:
- the team's current project status
- the team's projected completion date

Current status is determined using data on the WEEK form (in the example, plan EV to date is 40.2, actual EV to date is 30.0, and the current week is 5):

weeks behind = (plan EV to date - actual EV to date) / (actual EV to date / current week)
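The formula divides the EV shortfall by the average EV the team has actually earned per week. A sketch with the WEEK form numbers; the function name is just for illustration:

```python
def weeks_behind(plan_ev, actual_ev, current_week):
    """EV shortfall divided by the average EV actually earned per week."""
    ev_per_week = actual_ev / current_week
    return (plan_ev - actual_ev) / ev_per_week

# WEEK form data: plan EV 40.2, actual EV 30.0, at week 5.
print(round(weeks_behind(40.2, 30.0, 5), 1))  # 1.7
```

At the current earning rate of 6 EV per week, the 10.2-point shortfall puts the team about 1.7 weeks behind plan.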

Determining Status Against Plan -2

Projected completion date can be determined using data on the WEEK form and the original planned weeks:

weeks to go = (100 - actual EV to date) / (actual EV to date / current week)

weeks behind at completion = (weeks to go + current week) - original planned weeks
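Continuing the same WEEK form numbers: the remaining EV (out of 100) divided by the weekly earning rate gives the weeks still needed. The 12-week original plan below is an assumption for illustration; the slides do not state it:

```python
def weeks_to_go(actual_ev, current_week):
    """Remaining EV divided by the average EV actually earned per week."""
    return (100 - actual_ev) / (actual_ev / current_week)

def weeks_behind_at_completion(actual_ev, current_week, planned_weeks):
    return weeks_to_go(actual_ev, current_week) + current_week - planned_weeks

# 30.0 EV earned by week 5; original plan assumed to be 12 weeks.
print(round(weeks_to_go(30.0, 5), 2))                    # 11.67
print(round(weeks_behind_at_completion(30.0, 5, 12), 2)) # 4.67
```

Earning 6 EV per week, the team needs about 11.7 more weeks, so a 12-week plan would finish roughly 4.7 weeks late.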

Identifying Estimating Problems

The cost performance index (CPI) shows how the team is performing with respect to the effort estimates in the plan:

CPI = (plan hours for completed tasks) / (actual hours for completed tasks)

The CPI is available on the WEEK form: in the example, to-date hours for completed tasks were planned at 293.0 against 303.8 actual, giving a CPI of 0.96.

Interpreting the CPI

A CPI of 1 means the sum of the effort estimates for the completed tasks equals the sum of the actual effort for those tasks.
- What does this imply about the accuracy of the individual estimates?
- Assuming the team is achieving the planned task hours, what does this imply about schedule performance?

What does a CPI of 0.5 imply about
- effort estimates?
- schedule performance (assuming the team is achieving the planned task hours)?

Interpreting the CPI (continued)

What does a CPI of 2 imply about
- effort estimates?
- schedule performance (assuming that the team is achieving the planned task hours)?

What general characterization can be made about schedule performance based on the CPI?

Schedule growth (due to effort estimates) = 1 / CPI
Projected schedule = original plan weeks / CPI
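These two formulas can be sketched directly. The WEEK form numbers (293.0 plan vs. 303.8 actual hours for completed tasks) come from the example; the 12-week original plan is an assumed figure for illustration:

```python
def cpi(plan_hours_completed, actual_hours_completed):
    """Cost performance index: plan over actual hours for completed tasks."""
    return plan_hours_completed / actual_hours_completed

def projected_weeks(original_plan_weeks, cpi_value):
    """Schedule growth due to effort estimates is 1/CPI."""
    return original_plan_weeks / cpi_value

c = cpi(293.0, 303.8)
print(round(c, 2))                       # 0.96
print(round(projected_weeks(12, c), 1))  # 12.4  (assuming a 12-week plan)
```

A CPI below 1 means completed tasks are taking longer than estimated, stretching the schedule by the factor 1/CPI.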

Interpreting Task Hour Data

The task hour data is on form WEEK and can be interpreted similarly to the effort-for-completed-tasks data:

Weekly Data                          Plan    Actual   Plan/Actual
Project hours for this week          138.0   69.0     2.00
Project hours this cycle to date     689.6   344.8    2.00
Earned value for this week           10.3    3.1      3.37
Earned value this cycle to date      80.4    30.0     2.68
To-date hours for tasks completed    293.0   303.8    0.96

If (plan hours to date) / (actual hours to date) = 2, what does it mean? What is the effect on schedule performance?

Interpreting Task Hour Data (continued)

If (plan hours to date) / (actual hours to date) = 0.5, what does it mean? What is the effect on schedule performance?

What general characterization can be made about schedule performance based on the plan/actual task hours?

Schedule growth (due to task hours) = plan / actual
Projected schedule = original plan weeks * (plan / actual)
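A task-hour shortfall stretches the schedule by the plan/actual ratio. Using the numbers from the preceding table (689.6 plan vs. 344.8 actual hours to date); the 12-week original plan is again an assumption for illustration:

```python
def task_hour_ratio(plan_hours_to_date, actual_hours_to_date):
    return plan_hours_to_date / actual_hours_to_date

def projected_weeks_from_task_hours(original_plan_weeks, ratio):
    """Schedule growth due to task hours is plan/actual."""
    return original_plan_weeks * ratio

r = task_hour_ratio(689.6, 344.8)
print(round(r, 2))                                  # 2.0
print(round(projected_weeks_from_task_hours(12, r)))  # 24  (assuming 12 weeks)
```

Getting only half the planned task hours doubles the projected schedule, independent of how accurate the effort estimates were.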

Improving Task Hours

Average task hours per developer per week were improved from 9.6 hours to 15.1 hours (+57%) through quiet time, process documentation, more efficient meetings, etc.

[Figure: average task hours per week, April 1998 through June 1999, rising in steps from 9.6 to 12.6 to 13.3 to 15.1; series show weekly and per-phase averages. Source: Allied Signal]

Agenda

- Why PSP and TSP for Systems Engineering?
- Things That Change, Things That Don't
- Time Logging Exercise
- The TSP Launch
- The TSP Management Framework
- TSP Quality Management

What is Quality?

Basic definition: meeting the user's needs.

There are three categories of product quality:
- functionality
- properties (e.g., safety, security, privacy, usability)
- defects

A software-intensive product can't be safe or secure until it is nearly defect-free. Most current software-intensive processes are preoccupied with removing defects; little or no time is left for the other aspects of quality.

The System Quality Problem

Software quality problems are largely caused by defects. Defects are injected by the product's developers; even experienced and capable developers inject many defects. Each defect is a potential system failure. A significant fraction of software defects can be avoided or mitigated by effective systems engineering.

Current practices often rely on testing to remove these defects. Testing is necessary but, for finding and fixing defects, it is
- time-consuming
- expensive
- ineffective

The Defect Problem

Programs are complex products. Small programs have thousands of instructions; large programs have millions. These instructions are individually produced, and each must be precisely correct, beginning with the problem statement.

Software effort has a multiplying effect on systems engineering defects. On average, even experienced programmers inject a defect about every 10 to 12 instructions.

Testing

A single test
- exercises the product under one set of conditions
- produces correct or incorrect results

If there is a problem, developers must find the defect, fix it, and then test the fix. For products with many possible operating conditions, many tests are required. How many of these tests are defective?

Projects that rely on testing for quality spend a lot of time and money on testing.

Testing Takes a Long Time

[Figure: cumulative defects found in the Magellan spacecraft software (22,000 LOC) over roughly 118 weeks of testing, approaching 200 defects; separate curves for all, non-critical, and critical defects]

Testing Effectiveness

Large complex systems cannot be exhaustively tested.
- It is impossible to test every operating condition.
- Testing must focus on only the most frequent conditions.
- Extensive user testing finds even more defects.

Testing finds a percentage of the defects in a product, usually less than 50%. To get a quality product out of test, you must put a quality product into test.

Testing is Ineffective

[Figure: the tested (shaded) region of the operating space is the safe and secure region; the untested (unshaded) region is unsafe and insecure. The space's dimensions include overload, configuration, resource contention, hardware failure, operator error, and data error.]

Reviews and Inspections Save Time

Time to remove a defect, in minutes (log scale; source: Xerox):

Defect-removal Phase   Minutes
Design Review          5
Design Inspection      22
Code Review            2
Code Inspection        25
Unit Test              32
System Test            1405

System test is the least efficient phase in which to remove defects.

Why TSP is Faster and Better

With TSP
- most defects are removed by reviews and inspections
- few defects are left for testing
- testing takes relatively little time

By using TSP, organizations can
- cut testing times by 80% or more
- shorten schedules
- reduce costs
- produce better products

Testing should verify that the development process worked well, rather than fix its exported problems.

Measuring Quality

To produce a quality system, the quality of all its parts must be measured and managed, and these measures must be made at every step in the process.

With TSP and the underlying PSP principles, developers use quality measures to manage the quality of their work. The developers
- inject fewer defects
- remove most defects soon after injecting them

TSP Quality Measures

There are many potential quality measures. With the TSP, every product element and every process step can be measured.

Product quality measures
- Total defect density: the number of defects found in development, per unit of size
- Compile defect density: the number of defects found in compile, per unit of size
- Test defect density: the number of defects found in test, per unit of size
- Percent defect free: the percent of system modules or components that had no defects in a defect-removal phase

Process quality measures
- Phase yield: the percent of defects in a product that are found during the phase
- Review rate: the volume of code or design that is reviewed per hour
- Defect removal rate (defects/hour): the hourly rate at which defects are removed in reviews or inspections
- Quality profile: a composite picture of a module's process quality
- Process quality index (PQI): a composite value representing the five quality profile dimensions
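Two of the simplest measures above are straight ratios. A sketch with made-up illustrative numbers (36 defects in a 12 KLOC module, 28 of 40 defects removed in a phase); the helper names and the KLOC size unit are assumptions:

```python
def defect_density(defects, size_kloc):
    """Defects per KLOC (KLOC assumed as the unit of size here)."""
    return defects / size_kloc

def phase_yield(removed_in_phase, present_in_phase):
    """Percent of the defects present that the phase removed."""
    return 100.0 * removed_in_phase / present_in_phase

print(defect_density(36, 12.0))     # 3.0 defects/KLOC
print(round(phase_yield(28, 40)))   # 70 percent
```

Composite measures like the quality profile and PQI combine several such ratios into one indicator, so a team can scan many modules at once.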

Quality Implications

With proper training, guidance, and motivation, most developers can produce near-defect-free programs. Does the same hold true for systems engineers?

With essentially defect-free products
- testing times are sharply reduced
- delivered products work
- maintenance costs are reduced

The key is the engineer's ability to produce defect-free products:
- measure quality
- manage quality
- personal quality commitment

Quality Goals and Plans

With data, TSP teams can
- set measurable quality goals
- make quality plans to meet these goals
- estimate the defects injected and removed in each phase
- track the work to see if they are meeting their quality plans

The TSP Defect Model

At each step of development, defects are injected, removed, or possibly both. For each step:

Defects Out = Defects In + Defects Injected - Defects Removed

where
- Defects In = Defects Out from the previous step
- Defects Injected = a function of time in production activities
- Defects Removed = a percentage (usually much less than 100%) of Defects In + Defects Injected
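The model chains from step to step: each step's Defects Out becomes the next step's Defects In. A minimal sketch; the injection counts and removal yields below are illustrative assumptions, not TSP benchmark data:

```python
def phase_defects_out(defects_in, injected, removal_yield):
    """Defects Out = Defects In + Injected - Removed, where Removed is a
    yield fraction of everything present in the phase."""
    present = defects_in + injected
    removed = removal_yield * present
    return present - removed

# Chain three phases (e.g. design, code, test) with assumed yields.
defects = 0.0
for injected, y in [(100, 0.7), (200, 0.6), (0, 0.5)]:
    defects = phase_defects_out(defects, injected, y)
print(round(defects, 1))  # 46.0 defects escape the last phase
```

Because each phase removes only a fraction of what is present, defects left behind early compound through the process, which is why early removal dominates the quality outcome.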

Example: Planning for Quality -1

A TSP team plans to develop 20 KLOC. The goal is a design review yield of at least 70%.
- The plan shows 442 hours in detailed design.
- Data show that developers inject 1.3 defects per hour in detailed design.
- Data show that they remove 3 defects per hour in detailed design reviews.

What is the minimum design review time required to remove these defects?

Example: Planning for Quality -2

Defects injected
- 442 hours of design at 1.3 defects injected per hour
- 1.3 * 442 = 574.6 defects injected

Defect removal
- 574.6 defects total, at 3 defects removed per hour
- 574.6 / 3 = 191.5 hours to remove them all in design review

Removing every defect would take 191.5 hours of review time. To achieve at least a 70% yield, the team must plan at least 0.7 * 191.5 = 134.1 hours of design reviews.
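The arithmetic of the example, spelled out; all numbers come from the slide:

```python
design_hours = 442
inject_rate = 1.3   # defects injected per hour of detailed design
review_rate = 3.0   # defects removed per hour of design review

injected = design_hours * inject_rate
print(round(injected, 1))        # 574.6 defects injected

hours_all = injected / review_rate
print(round(hours_all, 1))       # 191.5 hours to remove every defect

hours_70 = 0.7 * hours_all
print(round(hours_70, 1))        # 134.1 hours for a 70% review yield
```

In other words, the team's quality plan must budget at least about 134 review hours against 442 design hours to hit its 70% yield goal.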

Example: Planning for Quality -3

Assume that no design reviews are done, and that
- half of the design defects (0.5 * 574 = 287) can be found by integration testing at 5 hours/defect
- half of the remaining defects (0.5 * 287 = 144) can be found in system testing at 10 hours/defect

How much time will integration and system testing take? How much time will be saved by doing design reviews? How many design defects will likely remain for your customers to find?
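Working the no-review scenario through (numbers from the slide; the comparison to review hours uses the 191.5-hour figure from the previous slide):

```python
injected = 574                       # design defects, with no design reviews
it_found = injected // 2             # 287 found in integration test
st_found = round((injected - it_found) / 2)   # 144 found in system test

test_hours = it_found * 5 + st_found * 10     # 5 and 10 hours per defect
print(test_hours)                    # 2875 hours of integration + system test
print(injected - it_found - st_found)  # 143 defects left for customers
```

Roughly 2,875 test hours versus about 192 hours of design review — about a fifteenfold difference — and even then well over a hundred design defects escape to the field.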

Maintain Process Discipline

To produce quality systems, every part must be of high quality. This is possible only if every developer consistently follows a quality process.

To consistently follow a quality process, each member of the development team must
- be properly trained (with the PSP or equivalent)
- work on a disciplined team (with the TSP or equivalent)
- have coaching support and management guidance

Management Support

People do not naturally do disciplined work. To ensure disciplined work, management must
- train and support the developers
- ensure that the developers' work is guided and monitored
- provide coaching assistance

Management must also
- build and maintain effective teams
- ensure that all team members are trained and willing to follow the process
- recognize and reward quality work

TSP Quality Messages

- High-quality processes produce high-quality products.
- Quality work is not done by accident; it requires discipline, commitment, management, and measurement.
- Quality work saves time and money.
- The cornerstone of a high-quality software process is early defect removal. TSP shows teams how to efficiently remove defects at the earliest possible point in the process.

Your Organization is Unique

...but most organizations share common problems. An organization can change under duress, or it can change in response to leadership. Duress can lead to undesirable consequences since, by definition, the organization is trying to get away from whatever is causing the duress. Only leadership can take an organization reliably in a desired direction.

Where will you lead your organization?

Thank you!

Contact information: jdm@sei.cmu.edu

Contact a PSP or TSP transition partner:
http://www.sei.cmu.edu/collaborating/partners/trans.part.psp.html

Contact SEI customer relations:
Carnegie Mellon University
Pittsburgh, PA 15213-3890
Phone, voice mail, and on-demand FAX: 412/268-5800
E-mail: customer-relations@sei.cmu.edu