DOE Contractor Assurance

November 2016
EFCOG Contractor Assurance Working Group (CAWG) Update: Maturity Evaluation Tool (MET)
Patricia Allen, Chair
Jita Morrison, SRR CAS Lead
Savannah River Site, Aiken, SC | www.SRRemediation.com
"We do the right thing."

MET Purpose

DOE O 226.1B, Source Requirement (2.a): "The contractor must establish an assurance system that includes assignment of management responsibilities and accountabilities and provides evidence to assure both the Department of Energy's (DOE) and the contractor's managements that work is being performed safely, securely, and in compliance with all requirements; risks are being identified and managed; and that the systems of control are effective and efficient."

The Maturity Evaluation Tool enables DOE contractors to apply a graded approach and prioritize improvement efforts.

Evaluation Process

Two maturity levels are established:
- Level 1 (Meets Requirements)
- Level 2 (Enhanced)

The evaluation is subjective in nature and focused on understanding the details of current performance and determining where to improve. It is performed annually, or as needed, to enable discussion and improvement. Best practices are noted along the way.

MET Categories

DOE O 226.1B contractor requirements:
- CAS Management and Scope (Att. 1, Section 2.a)
- CAS components that must be included (Att. 1, Section 2.b):
  - CAS effectiveness validation
  - Self-assessment and feedback process
  - Structured issues management process
  - Feedback and improvement
  - Timely communication to the DOE Contracting Officer
  - Metrics

- CAS Program Implementation and Monitoring (Att. 1, Sections 2.c and 2.d)

CAS Evaluation Structure

[Diagram: the 226.1B CRD elements map to evaluation criteria and sub-elements. Elements shown: CAS Management & Scope; Effectiveness Validation (e.g., performance metrics, central/line organization reviews, management reviews, external audits, certification); the program components (Issues Management Process, Self-Assessments and Feedback, Feedback & Improvement/Timely Communication, Performance Analysis & Metrics); and Program Implementation & Monitoring.]
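The two-level scoring described in the Evaluation Process (Level 1 criteria answered Y/N, Level 2 criteria scored 1-10) could be tallied along these lines. This is a hypothetical sketch, not part of the EFCOG tool; the criterion names and the "lowest three scores" focus heuristic are invented for illustration.

```python
# Hypothetical tally of a MET evaluation: Level 1 criteria are Y/N,
# Level 2 criteria are scored 1-10. Names and heuristics are illustrative.

def summarize_met(level1, level2):
    """level1: {criterion: bool} Y/N answers; level2: {criterion: int} 1-10 scores."""
    unmet = [c for c, ok in level1.items() if not ok]
    avg = sum(level2.values()) / len(level2) if level2 else 0.0
    # Lowest-scoring Level 2 criteria are natural candidates for improvement focus.
    focus = sorted(level2, key=level2.get)[:3]
    return {"meets_requirements": not unmet, "unmet": unmet,
            "level2_average": round(avg, 1), "improvement_focus": focus}

result = summarize_met(
    {"Roles and responsibilities defined": True,
     "Performance metrics established and reviewed": True},
    {"Metrics reviewed centrally and by line orgs": 7,
     "Metrics reassessed for effectiveness": 4,
     "Management reviews lead to improvement actions": 8},
)
print(result["meets_requirements"], result["level2_average"])
```

A summary like this supports the tool's stated purpose of prioritizing improvement efforts rather than producing a pass/fail grade.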

CAS Management & Scope / Effectiveness Validation

CAS Management and Scope - Source Requirement (2.a): "The contractor must establish an assurance system that includes assignment of management responsibilities and accountabilities and provides evidence to assure both the Department of Energy's (DOE) and the contractor's managements that work is being performed safely, securely, and in compliance with all requirements; risks are being identified and managed; and that the systems of control are effective and efficient."

CAS Effectiveness Validation - Source Requirement (2.b): "The contractor assurance system, at a minimum, must include the following: (2.b(1)) A method for validating the effectiveness of assurance system processes. Third-party audits, peer reviews, independent assessments, and external certification may be used and integrated into the contractor's assurance system to complement, but not replace, internal assurance systems."

Level 1 - Meets Requirements (Y/N, with comments):
- Roles and responsibilities for CAS are clearly defined
- CAS processes are efficient and effective for assuring that work is performed safely, securely, and in compliance with requirements, and that risks are understood and managed
- Performance metrics are established and reviewed
- Formal review meetings are held with management
- Independent assessments are conducted

CAS Management & Scope / Effectiveness Validation Evaluation Criteria

Level 2 - Enhanced (score 1-10, with comments):
- Performance metrics are reviewed centrally and by line organizations (see example)
- Performance metrics are periodically reassessed for effectiveness, including benchmarking
- Performance metrics are chosen based on their ability to be predictive of future performance
- Management reviews are interactive and lead to improvement actions
- Management reviews of performance data periodically drill down to greater detail in select functional areas
- Management reviews periodically include DOE and/or corporate managers
- Effectiveness reviews are integrated with other process/system reviews (ISMS, QA, etc.)
- Independent assessments are planned and scheduled to ensure all areas of risk are periodically reviewed
- Independent assessments utilize subject matter experts
- Multiple external review formats are employed (third-party audits, certifications, parent reviews, etc.)

SRR Nuclear Safety Culture Dashboard, July 2016 - September 2016

[Dashboard: nuclear safety culture metrics grouped under Management Leadership, Organizational Learning, Senior Management Performance, and Employee Engagement, including number of alleged retaliations, number of employee concerns, timeliness of employee concern evaluations, Conops/disciplined operations events (ORPS reported), timeliness of ORPS characterization, integrated assessment plan vs. schedule, assessment quality evaluation, self-identified versus event-response corrective actions, corrective action program timeliness, training hours in nuclear safety (Ops/Maintenance), management field observations (MFOs), new issue actions identified (STAR CTS), safety meeting attendance, and behavior-based safety observations. Key: Good / Marginal / Unsatisfactory / No Data. Updated October 11, 2016.]

Self-Assessment & Feedback

Self-Assessment and Feedback - Source Requirement (2.b(2)): "Rigorous, risk-informed, and credible self-assessment and feedback and improvement activities. Assessment programs must be risk-informed, formally described and documented, and appropriately cover potentially high consequence activities."

Level 1 - Meets Requirements (Y/N, with comments):
- Rigorous self-assessment activities are performed
- A risk-informed basis for self-assessment selection is used
- Self-assessments are credible
- A formally described and documented assessment program exists
- Self-assessments appropriately cover potentially high consequence activities

Self-Assessment and Feedback Process Evaluation Criteria

Level 2 - Enhanced (score 1-10, with comments):
- An annual plan is developed to integrate various inputs and ensure that all functional areas and facilities are periodically assessed (see example)
- Assessors are trained in effective assessment techniques
- The quality of assessments is evaluated, and the results are used to promote improvement (see example)
- The assessment plan includes required and management-directed assessments
- The assessment plan is periodically updated to address emergent issues based on performance monitoring
- Risk determinations include safety basis impact, level of hazard, potential impact to mission, and the degree of change associated with the activity
- Assessment program health is monitored, including the ratio of issues found during assessments versus the number identified from events and external assessments, the frequency of related ORPS issues, and the number of improvement issues per assessment
- Management supports the assessment program by participating themselves and by freeing staff to participate as well (see example)
- Effective software is used to facilitate the performance of assessments and the processing of assessment results
- Assessments are viewed by the staff as a path to improvement rather than a necessary evil

Integrated Assessment Plan Performance

Self-Assessment Grading

MFO Performance

Issues Management

Issues Management - Source Requirement (2.b(3), 2.b(3)(a)): "(2.b(3)) A structured issues management system that is formally described and documented and that: (2.b(3)(a)) Captures program and performance deficiencies (individually and collectively) in systems that provide for timely reporting, and taking compensatory corrective actions when needed."

Issues Management - Source Requirement (2.b(3)(b)): "Contains an issues management process that is capable of categorizing the significance of findings based on risk and priority and other appropriate factors that enables contractor management to ensure that problems are evaluated and corrected on a timely basis. For issues categorized as higher significance findings, contractor management must ensure the following activities are completed and documented: (2.b(3)(b)(1)) a thorough analysis of the underlying causal factors is completed; (2.b(3)(b)(2)) timely corrective actions; (2.b(3)(b)(3)) effectiveness review; (2.b(3)(b)(4)) documentation of the analysis process and results; (2.b(3)(b)(5)) communication to senior management (not included here; see Performance Analysis)."

Issue Management Process Evaluation Criteria

Level 1 - Meets Requirements (Y/N, with comments):
- The process is formally described and documented
- It captures program and performance deficiencies
- It categorizes the significance of findings based on risk and priority
- It tailors the response for higher significance findings to ensure: (1) a thorough causal analysis; (2) timely corrective actions; (3) effectiveness reviews after actions are performed; (4) documentation of the analysis process and resolution of issues; and (5) communication of issues, performance trends, and analysis results to senior management

Issue Management Process Evaluation Criteria

Level 2 - Enhanced (score 1-10, with comments):
- The number of systems used for issue tracking is minimized, ideally to a single system, to enhance the effectiveness of the process

- Screening teams are staffed with trained and knowledgeable personnel who make reliable determinations
- Causal analysts are well trained and proficient
- The management team is familiar with causal analysis techniques and understands the value of the process
- Corrective action plans for high-significance issues are reviewed by senior management to ensure effectiveness and timeliness
- Effectiveness reviews are performed for all higher significance issues an appropriate time after corrective actions are completed and institutional changes are established
- Management promotes the use of the issues management process as a core part of the business, encourages a low threshold for issue reporting, and positively recognizes personnel for reporting issues
- Metrics related to issues management are routinely reviewed and evaluated by the senior management team to monitor the health of the program (see example)
- Issues management software supports the monitoring of action resolution, including notices to assignees and management for overdue issues
- Plans are developed for benchmarking reviews to enhance effectiveness

Corrective Action Closure: Significance Category 1 and 2 Issues

Corrective Action Closure: Significance Category 3 Issues

Corrective Action Closure: Significance Category T Issues

Feedback & Improvement / Timely Communication

Feedback & Improvement - Source Requirement (2.b(5)): "Continuous feedback and improvement, including worker feedback mechanisms (e.g., employee concerns programs, telephone hotlines, employee suggestion forms, labor organization input), improvements in work planning and hazard identification activities, and lessons learned programs."

Timely Communication - Source Requirement (2.b(4)): "Timely and appropriate communication to the Contracting Officer, including electronic access of assurance-related information."

Level 1 - Meets Requirements (Y/N, with comments):
- Methods for continuous feedback and improvement, including worker feedback, are established
- Timely and appropriate communication is provided to the Contracting Officer, including electronic access to assurance-related information

Feedback & Improvement Evaluation Criteria

Level 2 - Enhanced (score 1-10, with comments):
- In addition to standard feedback processes such as the employee concerns, work planning feedback, and lessons learned programs, other feedback methods are also employed, such as hotlines, labor organization input, and employee suggestion programs
- Managers routinely spend time in work areas observing, interfacing, and coaching work; the results of these interactions are documented, and observed issues are captured in the issues management system (see example)
- Non-threatening fact-findings are conducted soon after events to provide the information needed for causal analysis
- Lessons learned experts are designated within the organization and, through training or experience, know how to create and access lessons learned information

- The use of lessons learned information is monitored and trended to understand the cost and benefit of the program
- Managers, supervisors, causal analysts, and work planners are familiar with the lessons learned database and can access and report relevant information
- Responsible managers and subject matter experts participate in industry working groups and benchmarking reviews to share experiences and bring improvement opportunities back to their programs
- Continuous learning is valued by the management team, as evidenced by the strong application of internal and external lessons learned
- Managers are trained in techniques that support a strong safety culture, including a safety conscious work environment
- Partnership meetings are held with the field office and include review and discussion of CAS data to promote a joint focus on overall mission success

Findings / Improvements from MFOs

Contractor Assurance Performance Evaluation, Savannah River Remediation LLC: Findings / Improvements from MFOs, 12-month period ending September 30, 2016.

[Chart: monthly counts of management field observations (MFOs) performed within SRR, Oct-15 through Sep-16, broken into Tier I and Tier II+III (workplace visit and evolution evaluation), plotted alongside findings, opportunities for improvement (OFIs), and totals.]

Analysis: The number of MFOs performed this month significantly increased over the previous month. The simple return on investment for the MFO process was 28% (total issues identified divided by the total number of evaluation MFOs), 1% less than the previous month. Findings from MFOs were evenly split between COTS (30) and SC 3 (32) issues. There is no trigger or target associated with this performance indicator.
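The "simple return on investment" above is a plain ratio, and a minimal sketch makes the arithmetic explicit. The function name and the input counts below are illustrative assumptions; the slide reports only the resulting 28%, not the underlying monthly numbers.

```python
# Sketch of the MFO "simple return on investment" metric described above:
# total issues identified divided by the number of evaluation MFOs.
# The counts below are illustrative, not actual SRR monthly data.

def mfo_roi(findings, ofis, evaluation_mfos):
    """Return issues-per-evaluation-MFO as a percentage."""
    if evaluation_mfos == 0:
        return 0.0
    return round(100 * (findings + ofis) / evaluation_mfos, 1)

# e.g., 62 issues identified across 221 evaluation MFOs (illustrative counts)
# would land at roughly the 28% reported for the period.
print(mfo_roi(30, 32, 221))
```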

Action: Continue to monitor MFO documentation to identify missed findings or OFIs included in the summary of the report. No grading is assigned to this indicator; it is for trend/performance evaluation only. (KPI Manager: P. E. Shedd, 803.208.8287; SME: D. L. Lester, 803.208.8743)

Performance Analysis & Metrics

Performance Analysis (Communication and Use of Metrics) - Source Requirements (2.b(3)(b)(5) and 2.b(6)): "(2.b(3)(b)(5)) Communicates issues and performance trends or analysis results up the contractor management chain to senior management using a graded approach that considers hazards and risks, and provides sufficient technical basis to allow managers to make informed decisions and correct negative performance/compliance trends before they become significant issues. (2.b(6)) Metrics and targets to assess the effectiveness of performance, including benchmarking of key functional areas with other DOE contractors, industry, and research institutions."

Level 1 - Meets Requirements (Y/N, with comments):
- Issues and performance trends are communicated to senior management using a graded approach

- The analysis results include sufficient technical basis to allow managers to make informed decisions and correct negative performance/compliance trends before they become significant issues
- Metrics and targets are used to assess the effectiveness of performance

Performance Analysis & Metrics Evaluation Criteria

Level 2 - Enhanced (score 1-10, with comments):
- Management review teams routinely monitor issues and corrective actions, with a higher degree and level of oversight provided for more significant issues
- Trend codes are assigned during the review of issues to support the analysis of performance data
- A metrics dashboard provides management a tool for highlighting program elements with degrading performance (see example)
- Personnel performing trending and analysis are well trained and have the tools needed to perform effectively
- Statistical analysis techniques are understood by the management team and are employed in the review of performance data as appropriate
- Analysts review performance data to identify early indications of adverse trends
- Management understands, supports, and values the use of performance trending through metrics and analysis
- Key performance data is periodically reviewed and evaluated by corporate management
- The project routinely benchmarks its performance against other projects and industry standards
- Performance analysis and metrics processes are periodically assessed and refreshed to ensure effectiveness

FY 2016 POMC Dashboard

[Dashboard: fiscal-year-to-date performance, 10/1/2015 through 8/31/2016. Indicators include the Safety Index, Cumulative Dose, Assessments Graded, Radiological Control Severity Index, Environmental Compliance Index, and NSC Dashboard, each with an FY16 goal (e.g., Safety Index > 80%, Cumulative Dose < 12.5% variance from ALARA target, Assessments Graded > 25% per month, NSC Dashboard YTD average > 8.0) and FYTD values.]

Program Implementation & Monitoring

Program Implementation - Source Requirement (2.c): "The contractor must submit an initial contractor assurance system description to the Contracting Officer for DOE review and approval. That description must clearly define processes, key activities, and accountabilities. An implementation plan that considers and mitigates risks should also be submitted if needed and should encompass all facilities, systems, and organization elements. Once the description is approved, timely notification must be made to the Contracting Officer of significant assurance system changes prior to the changes being made."

Program Monitoring - Source Requirement (2.d): "To facilitate appropriate oversight, contractor assurance system data must be documented and readily available to DOE. Results of assurance processes must be analyzed, compiled, and reported to DOE as requested by the Contracting Officer (e.g., in support of contractor evaluation or to support review/approval of corrective action plans)."

Level 1 - Meets Requirements (Y/N, with comments):
- An initial contractor assurance system description is submitted to the Contracting Officer for DOE review and approval
- CAS data is documented and readily available to DOE to facilitate appropriate oversight

MET - Program Implementation & Monitoring

Level 2 - Enhanced (score 1-10, with comments):
- CAS description documents are periodically updated, and changes are reviewed with the site DOE office prior to submittal to ensure concurrence

- A partnership agreement is established with DOE that promotes the healthy interaction and exchange of performance information regarding issues and trends
- All levels of management are trained in Human Performance Improvement (HPI) tools and champion their use
- Self-critical metrics are used to monitor the effectiveness of management oversight (see example)
- Managers understand, value, and routinely use CAS tools for monitoring and improving their programs
- The central CAS organization provides expert support and facilitation to the management team for effective and efficient use of the CAS tools (see cover sheet example)
- CAS-related performance metrics are routinely reviewed and evaluated by senior management and discussed with DOE and corporate management
- Assessments, including reviews of CAS elements, are periodically performed at or by other organizations
- CAS elements are integrated to enhance the efficiency and effectiveness of the tools
- Management frequently reinforces expectations for a strong safety culture to ensure employees are encouraged to raise issues and report their own mistakes

ORPS Normalized Score

Contractor Assurance Processes, Savannah River Remediation LLC: ORPS Normalized Score (SRR only), one-year period ending September 30, 2016.

[Chart: monthly ORPS normalized scores, Oct-15 through Sep-16, plotted against "Excellent," "Good," "Investigate," and "Corrective Action Plan" grading bands; monthly values range from 0.51 to 10.99.]

Description: The ORPS Normalized Average is a lagging indicator used to compare the number of ORPS events recorded at each of the DOE EM facilities across the complex. Data normalization is achieved by comparing hours worked against a 200,000-hour baseline. However, the indicator does not factor in other considerations, such as the amount of SS/SC equipment, the number of nuclear facilities, or operational status, each of which plays a considerable role in developing a truly normalized score for the evaluation of a number of the NOCs.

Analysis: Two ORPS events were reported in September. Events remain sporadic; no significant trend exists for this indicator. As demonstrated on the histogram chart, absent the eight events which occurred in June, the majority of months graded out as "Excellent" performance versus the established management variances.

Grading: The histogram to the right of the chart outlines the DOE-EM designated grading for this indicator. The DOE-EM grade is based on the normalized average score for all DOE-EM facilities, currently with "Excellent" performance defined as less than 2.75. SRR has not adjusted the other grading bands, awaiting standard deviation values from DOE-HQ.

Action(s): Continue to monitor. (KPI Owner: P. E. Shedd, 803.208.8287; SME: D. L. Lester, 803.208.8743)

Management Concern ORPS Events

Contractor Assurance System, Savannah River Remediation LLC: comparison trend of ORPS NOC 10 Management Concerns, two-year period ending September 30, 2016.

[Chart: monthly counts of reported NOC 10 events (0-3 per month) over Periods 1 and 2, with 3-month average, 12-month average, mean, and linear trendline overlays.]

Description: The purpose of this metric is to monitor ORPS reports in the NOC 10 (Management Concerns) category. There is no specified goal or target associated with this KPI, since its intent is to track issues that management deems important enough to share through the ORPS process across the complex. No scale degradation is associated with this indicator; the rolling three-month average number of events is used to gauge it.

Analysis / Actions: One NOC 10 event was reported by SRR facilities this month. The number of events in Period 2 is 7 greater than in Period 1, as shown by the trendline. Events are identified as sporadic and do not represent a significant negative trend or a hazard to workers, the environment, or the public. Other than continued monitoring, no additional actions are recommended pending completion of the issue evaluations for the most recent events.
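The 200,000-hour normalization used by the ORPS Normalized Score is a standard event-rate calculation: events are scaled to a common hours-worked baseline so differently sized workforces can be compared. A minimal sketch follows; the function name and input figures are illustrative assumptions, not SRR's actual implementation or data.

```python
# Sketch of the ORPS normalization described above: events are scaled to a
# common 200,000-hour baseline so facilities with different workforce sizes
# can be compared. Inputs below are illustrative, not actual SRR data.

def orps_normalized_score(events, hours_worked, baseline=200_000):
    """Events per `baseline` hours worked."""
    if hours_worked <= 0:
        raise ValueError("hours_worked must be positive")
    return round(events * baseline / hours_worked, 2)

# e.g., 2 events over 385,000 hours worked in a month:
score = orps_normalized_score(2, 385_000)
print(score)  # about 1.04, which would fall in the "Excellent" band (< 2.75)
```

As the slide notes, this normalization accounts only for hours worked, not for the amount of safety-class equipment, the number of nuclear facilities, or operational status.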

(KPI Owner: P. E. Shedd, 803.208.8287; SME: D. L. Lester, 803.208.8743)

Periodic Reviews with Corporate and DOE Management
