Quality Assurance (QA) Work Plan for the State of Washington Department of Corrections Advance Corrections Initiative

Prepared by bluecrane
Table of Contents

1. QA Objectives
2. QA Approach
   2.1 Assessments and Reporting
       2.1.1 Monthly Reports
       2.1.2 Ongoing Assessment and Reporting
       2.1.3 Initial Assessment
       2.1.4 Assessments of DOC Submissions to WaTech for Stage/Gate Reviews
   2.2 Review Structure for Assessments and Reporting
3. QA Work Plan
1. QA Objectives

This Quality Assurance (QA) engagement provides professional services from bluecrane for assessing the Advance Corrections Initiative (ACI) Project for the Washington State Department of Corrections (DOC). The QA effort will focus on providing an independent assessment of ACI Project activities and progress, with the primary aim of supporting the project executive sponsors, the project sponsors, the program manager, the project managers and other PMO staff, and the project team in achieving successful project execution.

Key stakeholders include:

Executive Sponsor: Dan Pacholke
Project Sponsor: Jody Becker-Green
Assistant Secretary, Administrative Services Division: Julie Martin
CIO: Ira Feuer
Project Director: Amy Seidlitz
Implementation Manager: Mark Kucza
Project Managers: Kelley Barnard and Bob Billings
Enterprise Project Director: Jeanette Sevedge-App

The QA Team's efforts are centered on risk analysis and risk avoidance. The key to reducing and, to the maximum extent possible, avoiding risk is this: Operational Implementation must be aligned with Business Strategy and Operational Success Factors. Proper alignment reduces risk by saving money, validating business strategy, and meeting operational goals.

Our approach to QA services delivery is founded on interaction with the project sponsor, project manager, members of the project team, and other stakeholders. Frequent contact with the project team and project manager fosters a healthy working relationship for achieving the primary objective of all involved: the successful delivery of the project's deliverables within approved budget, schedule, and scope parameters. The QA Team will provide its assessments via monthly reports, including a QA Dashboard.
2. QA Approach

2.1 Assessments and Reporting

2.1.1 Monthly Reports

The bluecrane QA Team will develop monthly reports that include a summary of the overall project; a detailed discussion of significant issues and risks identified during the reporting period; recommendations for resolving the highest-priority issues and addressing the highest-priority risks; and the potential impact to the project if the issues and risks are not addressed. The QA Team will provide draft monthly reports electronically to the ACI project sponsors, program manager, and project managers. After review by these individuals and a briefing by bluecrane consultants, the ACI project sponsors will approve release of the monthly reports to the contracted vendors (Sierra-Cedar and Assessments.Com [ADC]) and the Washington State Office of the Chief Information Officer (OCIO) at WaTech. The meetings to review the draft reports are intended to confirm the factual basis for the findings and observations in the reports, ensure there are no surprises for the DOC management teams, ensure alignment of the management teams with respect to on-going risk response efforts, and continue to support the number one objective of ensuring project success. The reviews will in no way compromise QA findings or any other contents of the reports.

2.1.2 Ongoing Assessment and Reporting

The QA Team will monitor and evaluate project processes, activities, and deliverables to ensure that the project is being managed in a manner that will result in the successful delivery and deployment of the ACI Project on schedule and within budget. The QA Team will participate in weekly meetings with the ACI sponsors, program manager, project managers, and project teams. At these meetings, the QA Team will review activities performed during the previous week.
The QA Team will perform the following on-going assessments:

Adequate Project Staffing: Perform on-going assessments of the project staffing required to successfully deliver ACI. Provide recommendations for the required number and classifications of project staff to reduce the risk of having staff without the required skill sets and experience, and the risk of increased costs incurred to fill the gaps if not corrected in a timely manner.

Feasibility of the Project Schedule: Perform on-going analysis of the accuracy and feasibility of project activities identified in the project schedule and provide recommendations for maximizing utilization of the project team to reduce the risk of schedule delays.
Project Progress: Perform on-going assessments of the project's progress towards the on-time, on-budget completion of project deliverables. Provide recommendations to reduce the risk of schedule delays, increased costs, and unmet expectations.

Integration of Quality: Perform on-going assessments of the quality of project deliverables and the quality management processes used by the project, and provide recommendations to reduce the risk of low-quality project deliverables.

Alignment of Stakeholders: Perform on-going assessments of the alignment of stakeholders with the project, and make recommendations for stakeholder strategies and organizational change management activities to reduce the risk of lack of project support and involvement by key stakeholder groups and to ensure that the expectations of key stakeholder groups are met.

Effective Communications: Perform on-going assessments of project communications to ensure that the project is communicating effectively to stakeholders using meetings, reports, and websites, as appropriate; that project teams are communicating and working well together; that project meetings are run effectively; and that project documentation is clear, concise, and easily retrieved from the project library (parts or all of which may be electronic).

Risk and Issue Management: Perform on-going assessments of the project's ability to identify and mitigate risks and address issues in a timely manner. Provide recommendations to reduce the risk of project delays and increased costs from unmanaged risks and issues.

Software Development Lifecycle (SDLC) Methodology: Perform on-going assessments to determine the level of compliance with SDLC standards. Provide recommendations to reduce the risk of inadequate system design and testing, unmet requirements, and unmet expectations.
The level of detail of these assessments for work being done by DOC, Sierra-Cedar, and ADC will depend, to some extent, on the level of access that we are permitted. We anticipate that assessments of the external vendors' level of compliance with the SDLC will be based primarily on review of artifacts and deliverables at various stages, while our assessment of DOC compliance may also draw on first-hand observation.

Testing and Results: Perform on-going assessments of testing approaches and results, with a particular emphasis on provider testing.

Integration with Enterprise Architecture: Perform on-going assessments to determine the level of compliance with enterprise architecture standards. Assessments will focus primarily on adherence to DOC standards and guidelines, while also taking into account any relevant State standards. Provide recommendations to reduce the risk of rework of interfaces and system implementation failures.

2.1.3 Initial Assessment

Our draft initial assessment will be complete by. It is our goal to publish the final initial assessment before December 31, but we will schedule the draft report review around
DOC staff holiday schedules and will publish the final report shortly after the New Year, if necessary.

2.1.4 Assessments of DOC Submissions to WaTech for Stage/Gate Reviews

Our team will additionally provide reviews and assessments of DOC documentation for WaTech Stage/Gate reviews. We will provide a written report on our findings and recommendations for each Stage/Gate review.
2.2 Review Structure for Assessments and Reporting

We began our QA engagement for the ACI Project by developing an understanding of the project at a macro level. We started by analyzing the following five "Project Areas":

Project Management and Sponsorship
People
Application
Data
Infrastructure

It is not our practice to duplicate project management activities by following and analyzing each task and each deliverable that our clients are tracking in their project management software (such as Microsoft Project). Rather, we identify those groups of tasks and deliverables that are key signposts in the project. While numerous tasks may slip a few days or even weeks, or be rescheduled, without major impact on the project, there are always a number of significant task groups and deliverables that should be tracked over time, because any risk to those items in terms of schedule, scope, or cost has a potentially significant impact on project success.

We decompose the five categories listed above into the next lower level of our assessment taxonomy, which we refer to as the "area of assessment" level. The list of areas of assessment grows over the life of the project. The following list is provided as an example of typical areas of assessment:

Project Management and Sponsorship
o Governance
o Scope
o Schedule
o Budget
o Communication
o Staffing and Project Facilities
o Change Management
o Risk Management
o Issue Management
o Quality Management
People
o Stakeholder Engagement
o Business Processes/System Functionality
o Contract Management/Deliverables Management
o Training and Training Facilities
o Organization Preparation
o User Support
Application
o Application Architecture
o Requirements Management
o Implementation
o Application Interfaces
o Application Infrastructure
o Reporting
o Testing
o Tools
Data
o Data Preparation
o Data Conversion
o Data Security
Infrastructure
o Technical Infrastructure
o Technical Help Desk

For each area of assessment within a Project Area, we document in our QA Dashboard our observations, any issues and/or risks that we have assessed, and our recommendations.
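The two-level taxonomy above can be sketched as a simple nested structure. This is a hypothetical illustration only; the structure and names follow the work plan, but the code itself is not a DOC or bluecrane artifact.

```python
# Hypothetical sketch of the two-level assessment taxonomy: each of the five
# Project Areas maps to its current list of "areas of assessment". The list
# is expected to grow over the life of the project.
ASSESSMENT_TAXONOMY = {
    "Project Management and Sponsorship": [
        "Governance", "Scope", "Schedule", "Budget", "Communication",
        "Staffing and Project Facilities", "Change Management",
        "Risk Management", "Issue Management", "Quality Management",
    ],
    "People": [
        "Stakeholder Engagement", "Business Processes/System Functionality",
        "Contract Management/Deliverables Management",
        "Training and Training Facilities", "Organization Preparation",
        "User Support",
    ],
    "Application": [
        "Application Architecture", "Requirements Management",
        "Implementation", "Application Interfaces",
        "Application Infrastructure", "Reporting", "Testing", "Tools",
    ],
    "Data": ["Data Preparation", "Data Conversion", "Data Security"],
    "Infrastructure": ["Technical Infrastructure", "Technical Help Desk"],
}

def areas_for(project_area: str) -> list[str]:
    """Return the areas of assessment currently tracked under a Project Area."""
    return ASSESSMENT_TAXONOMY[project_area]
```

Keeping the taxonomy as a single structure makes it straightforward to add new areas of assessment as they are identified, without restructuring the dashboard.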
Assessed status is rated at a macro level using the following scale:

Extreme Risk: a risk that project management must address or the entire project is at risk of failure; these risks are show-stoppers.
Risk: a risk that is significant enough to merit management attention but not one that is deemed a show-stopper.
Risk Being Addressed: an item that was formerly red or yellow but, in our opinion, is now being addressed adequately; it should be reviewed at the next assessment with an expectation that it becomes green at that time.
No Identified Risk: all systems go for this item.
Not Started: this particular item has not started yet or is not yet assessed.
Completed or Not Applicable: this particular item has been completed or has been deemed not applicable but remains a part of the assessment for traceability purposes.

We recognize that simultaneously addressing all risk areas identified at any given time is a daunting task and not advisable. Therefore, we prioritize risk items in our monthly reports as:

1. Very Urgent Consideration
2. Urgent Consideration
3. Serious Consideration

We define these priority ratings in terms of impact for each phase of the project being assessed as the project lifecycle progresses. Rating risks at the macro level using the assessed status and urgency scales described above provides a method for creating a snapshot that project personnel and executive management
can review quickly, getting an immediate sense of project risks. The macro-level ratings are further refined by describing in detail what each risk/issue is and what remedial actions are being taken, or should be taken, to address it. The result is a framework for DOC management to evaluate project risks in terms of business objectives and traditional project management tasks.
3. QA Work Plan

The detailed QA Work Plan below defines the specific tasks that will be undertaken to provide QA services for the ACI Project. Each entry lists the WBS number and task name, with the duration/milestone shown in parentheses where one is defined.

1    Quality Assurance Project Start-Up
1.1  Complete Work Plan for ACI Quality Assurance (within 30 days of contract start)

2    On-going Project Management and Sponsorship Assessments (assess and report monthly)
2.1  Assess Governance planning, execution ("practice"), and
2.2  Assess Scope Management planning, execution, and
2.3  Assess Schedule Management planning, execution, and
2.4  Assess Budget Management planning, execution, and
2.5  Assess Communication planning, execution, and
2.6  Assess Staffing and Project Facilities planning, execution, and
2.7  Assess Change Control Management planning, execution, and
2.8  Assess Organizational Change Management planning, execution, and
2.9  Assess Risk Management planning, execution, and
2.10 Assess Issue Management planning, execution, and
2.11 Assess Quality Management planning, execution, and
2.12 Assess other areas of Project Management as identified and needed over the course of the ACI Project

3    On-going People Assessments (assess and report monthly)
3.1  Assess Stakeholder Engagement planning, execution, and
3.2  Assess Business Processes/System Functionality planning, execution, and
3.3  Assess Contract Management/Deliverables Management planning, execution, and
3.4  Assess Training and Training Facilities planning, execution, and
3.5  Assess Organization Preparation planning, execution, and
3.6  Assess User Support planning, execution, and
3.7  Assess other areas of People as identified and needed over the course of the ACI Project

4    On-going Application Assessments (assess and report monthly)
4.1  Assess Application Architecture planning, execution, and
4.2  Assess Requirements Management planning, execution, and
4.3  Assess Development/Configuration planning, execution, and
4.4  Assess Application Interfaces planning, execution, and
4.5  Assess Application Infrastructure planning, execution, and
4.6  Assess Reporting planning, execution, and
4.7  Assess Testing planning, execution, and
4.8  Assess Tools planning, execution, and
4.9  Assess other areas of Application as identified and needed over the course of the ACI Project

5    On-going Data Assessments (assess and report monthly)
5.1  Assess Data Preparation planning, execution, and
5.2  Assess Data Conversion planning, execution, and
5.3  Assess Data Security planning, execution, and
5.4  Assess other areas of Data as identified and needed over the course of the ACI Project

6    On-going Technical Infrastructure Assessments
6.1  Assess infrastructure planning, execution, and, as appropriate
6.2  Assess Technical Help Desk planning, execution, and, as appropriate
6.3  Assess other areas of Technical Infrastructure as identified and needed over the course of the ACI Project
Note: The balance of this WBS is based on likely areas of assessment related to specific activities and deliverables of the ACI Project. The specific areas for assessment, other than the areas of on-going assessment identified above, will evolve as the ACI Project's specific solution and project plans develop in more detail.

7    Assess Definition Activities (assess and report in planning and execution)
7.1  Assess ACI Hardware/Software Plan
7.2  Participate in and assess effectiveness of Definition Meetings
7.3  Assess Preliminary Implementation Specification
7.4  Assess training plan and preparation of training materials (cross-reference with WBS item #12.1)

8    Assess Base Configuration Activities (assess and report in planning and execution)
8.1  Assess Preliminary Baseline for Scope
8.2  Assess Preliminary Baseline Activities
8.3  Participate in and assess effectiveness of Verification Sessions
8.4  Assess activities to revise Implementation Specification
8.5  Review finalized Implementation Specification

9    Assess Development/Configuration Activities (assess and report in planning and execution)
9.1  Assess conduct and testing of development/configuration activities
9.2  Assess Report and Correspondence development/configuration activities
9.3  Assess Interface Design activities and documents
9.4  Assess plan for Application Security

10   Assess Conversion Activities (assess and report in planning and execution)
10.1 Assess Data Store Inventory activities
10.2 Assess Conversion Plan
10.3 Assess Mock Conversions
10.4 Assess Data Verification planning and activities

11   Assess Testing (assess and report in planning and execution)
11.1 Assess Testing Plan
11.2 Assess System Testing execution and
11.3 Assess Converted Data Testing execution and
11.4 Assess Performance Testing execution and
11.5 Assess End-to-End and User Acceptance Testing execution and

12   Assess Training Activities (assess and report in planning and execution)
12.1 Assess Training plan and approach (cross-reference with WBS item #7.4)

13   Assess Rollout (assess and report in planning and execution)
13.1 Assess Operations and Support Plan
13.2 Assess Disaster Recovery Plan updates
13.3 Assess Business Cutover Checklist
13.4 Assess Project Cutover Checklist
13.5 Assess Plan for Help Desk/Desk-side Support

14   WaTech/OCIO Stage/Gate Submissions (review and advise in submissions)
14.1 Provide advice and counsel (including a written report of findings and recommendations) to DOC

15   Presentations (develop and/or review and advise in presentations)
15.1 Provide support to DOC for presentations and/or make presentations on DOC's behalf, as needed