2023 Edition

Performance and Quality Improvement Introduction

Purpose

 An agency-wide performance and quality improvement system effectively engages staff, persons served, and other stakeholders in advancing the agency’s mission and achieving strategic goals through continuous, integrated, data-driven efforts to improve service delivery and administrative practice. 

Introduction

COA’s Performance and Quality Improvement (PA-PQI) Standards provide the framework for an agency-wide PQI system that increases agency capacity to make data-informed decisions that support the achievement of performance targets, program goals, individual and family outcomes, and staff and consumer satisfaction. Building and sustaining a comprehensive, mission-driven PQI system is dependent upon the active engagement of staff at all levels, persons served, and other stakeholders throughout the improvement cycle. 


Note: Please see the PA-PQI Reference List for the research that informed the development of these standards.


Note: Please see the PQI Toolkit for additional guidance on these standards.


Note: For information about changes made in the 2020 Edition, please see PQI Crosswalk.



Performance and Quality Improvement (PA-PQI) 1: PQI Infrastructure

The PQI system has an infrastructure that gives the agency capacity to:
  1. ensure the integrity of measurement practices, including data collection and analysis; 
  2. identify agency-wide and region- and program-specific areas of strength and areas for improvement; and
  3. identify, implement, and monitor improvement strategies.
Rating (1): Full Implementation, Outstanding Performance
A rating of (1) indicates that the agency's practices fully meet the standard and reflect a high level of capacity.
  • All elements or requirements outlined in the standard are evident in practice, with rare or no exceptions; exceptions do not impact service quality or agency performance.

Rating (2): Substantial Implementation, Good Performance
A rating of (2) indicates that the agency's infrastructure and practices are basically sound, but there is room for improvement.
  • The majority of the standard's requirements have been met and the basic framework required by the standard has been implemented.
  • Minor inconsistencies and not yet fully developed practices are noted; however, these do not significantly impact service quality or agency performance.

Rating (3): Partial Implementation, Concerning Performance
A rating of (3) indicates that the agency's observed infrastructure and/or practices require significant improvement.
  • The agency has not implemented the basic framework of the standard, but instead has in place only part of this framework.
  • Omissions or exceptions to the practices outlined in the standard occur regularly, or practices are implemented in a cursory or haphazard manner.
  • Service quality or agency functioning may be compromised.
  • Capacity is at a basic level.

Rating (4): Unsatisfactory Implementation or Performance
A rating of (4) indicates that implementation of the standard is minimal or there is no evidence of implementation at all.
  • The agency's observed administration and management infrastructure and practices are weak or non-existent, or show signs of neglect, stagnation, or deterioration.
Self-Study Evidence
County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity
  • PQI Plan
  • PQI operational procedures
  • PQI meeting/activity schedule for the next 12 months
  • Document or chart detailing the agency's PQI structure including committees and work groups with member lists, as appropriate
State Administered Agency (Regional Office)
  • Regional PQI plan
  • Regional PQI operational procedures
  • Regional PQI meeting/activity schedule for the next 12 months
  • Document or chart detailing the region's PQI structure including committees and work groups with member lists, as appropriate
On-Site Evidence
No On-Site Evidence

On-Site Activities
County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity
  • Interviews may include:
    1. Agency leadership
    2. Program field personnel
    3. Community stakeholders
    4. PQI Director
    5. IM/data analysis manager
    6. Contracted providers
    7. Persons served
State Administered Agency (Regional Office)
  • Interviews may include:
    1. Regional Director
    2. Program field personnel
    3. Administrative personnel (HR, Training, PQI, IM)
    4. Community stakeholders
    5. Contracted providers
    6. Persons served

 

PA-PQI 1.01

The PQI plan and procedures: 
  1. cover every program or service area within each of the agency’s regions or sites;
  2. articulate the agency's approach to quality improvement including specific models and methodologies it employs;
  3. describe the PQI system's structure and outline all major PQI activities;
  4. define staff roles and assign responsibility for implementing and coordinating the PQI process;
  5. identify the core areas of performance being measured and the purpose or goals for measuring these areas;
  6. define measurement practices including data collection and analysis methods, process, and applicable timeframes; and
  7. reflect how the PQI system is evaluated.
CFS Interpretation: For child and family services agencies, the PQI system must include all the jurisdictions in which the services included in the Child and Family Services Plan are provided. 

Interpretation: For state-administered agencies, regional PQI plans should directly correlate with the agency-wide plan while also being responsive to the region's specific needs. The regional plan and accompanying procedures explain the structure for carrying out PQI activities in the region, including any region-specific committees, processes, and performance measures. Additionally, if the region contains any unique programs, the regional PQI plan should address the PQI activities and measures for those programs.

Interpretation: The agency’s PQI plan, as the guiding document for implementing and refining the PQI system, is distinct from time-sensitive, actionable plans that the agency develops to target improvement activities such as the Child and Family Services Review Program Improvement Plans.  
 
Examples: Among other things, evaluation of the PQI system can include assessing how well PQI activities align with best practices in measurement, including whether the agency is asking the right performance questions, how well the chosen metrics answer those questions, and whether the agency is accurately measuring change over time. 

Note: In regard to element (5), please see the Person-Centered Logic Model Core Concept in each assigned Service Standard for additional information on program outputs and individual outcomes to be included in the PQI plan.


 

PA-PQI 1.02

The PQI plan:
  1. defines a broad range of internal and external stakeholder groups; and
  2. specifies how these stakeholder groups will be involved in the PQI process.
Interpretation: Stakeholder involvement is fundamental to any well-designed PQI system and is crucial to a public agency’s ability to achieve its mission and elicit public trust. Stakeholders are often thought of in terms of categories or groups of people—sometimes referred to as communities of interest. 

Common stakeholder categories include: 
  1. persons served, including children and youth when applicable;
  2. community-based organizations and neighborhood associations; 
  3. service providers, particularly contracted providers;
  4. community-based business entities;
  5. public agency partners including other social service agencies and the court system; 
  6. statewide or national advocacy organizations; and 
  7. academic, learning, and research institutions. 

Stakeholders should participate in a broad range of activities, including PQI planning; activities that gather qualitative data on the experience of receiving services or of providing services as a contracted provider; reviewing and interpreting summary data; and identifying, implementing, and monitoring solutions.
Note: Throughout the PA-PQI standards, staff are intentionally differentiated from other stakeholders in order to highlight their unique involvement in the PQI process.

 

PA-PQI 1.03

The PQI plan outlines the flow of information between frontline workers and those responsible for implementing and coordinating the agency's PQI process to ensure:
  1. staff at all levels of the agency receive information on PQI evidence and findings;
  2. frontline staff and their supervisors have timely access to the information they need to clarify expectations and implement practice improvements; and
  3. timely, effective delivery of data and feedback to PQI system administrators.  

Performance and Quality Improvement (PA-PQI) 2: Roles and Responsibilities

Staff at all levels of the agency participate in, conduct, and sustain performance and quality improvement activities.
Self-Study Evidence
All Agencies
  • PQI training curricula table(s) of contents broken down by job category including:
    1. staff coordinating the agency's PQI system
    2. supervisors, program directors, and senior managers
On-Site Evidence

All Agencies
  • Training curricula and materials
  • Documentation tracking staff completion of required PQI trainings
On-Site Activities

All Agencies
  • Interviews may include:
    1. PQI staff
    2. Managers and program directors
    3. Staff at all levels

 

PA-PQI 2.01

Staff responsible for implementing and coordinating the agency's PQI process are trained on, or demonstrate competency in, sound measurement practices including: 
  1. identifying indicators of quality practice for the programs being evaluated; 
  2. implementing internal and external evaluation methods, such as benchmarking, appropriate to the programs being evaluated;
  3. collecting, analyzing, and interpreting data from a range of sources; and
  4. communicating evidence and findings to staff and other stakeholders in a manner that facilitates their active engagement.

 

PA-PQI 2.02

Staff receive ongoing training in PQI activities including, as appropriate to individual roles and responsibilities:
  1. the goals, relevance, and inherent value of the PQI process; 
  2. the roles of all staff in implementing the PQI process; 
  3. data collection tools and forms;  
  4. the key decision-making junctures in their work and how data should inform decisions; and
  5. case review forms and processes.

 

PA-PQI 2.03

Supervisors of direct service staff, program directors, and senior managers are trained on, or demonstrate competency in: 
  1. collecting, monitoring, and interpreting data and using this evidence to evaluate and discuss performance as it relates to outcomes;
  2. targeting areas of improvement;
  3. supporting staff in ensuring data collection and integrity; and
  4. supporting staff in using data as evidence to inform casework and operational decision-making. 

Performance and Quality Improvement (PA-PQI) 3: Measures and Indicators

The agency identifies and utilizes measures and indicators for evaluating the following within the agency and with any contracted providers: 
  1. the impact of services on individuals and families; 
  2. the quality of service delivery; and
  3. management and operational performance.
Examples: Measures, indicators, and tools required by regulation can be utilized to go beyond measuring compliance by engaging staff and other stakeholders to: 
  1. review data that is important for their work or interest;
  2. use data to benchmark results with other agencies providing the same funded services; or
  3. compare data with other data collected by the agency that is not covered by contractual requirements in order to improve services.
Examples: Agencies providing child and family services are encouraged to integrate the Child and Family Services Review (CFSR) Outcome Measures and Systemic Factors, particularly those identified in Performance Improvement Plans, into their overall PQI system. 
 
Self-Study Evidence

County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity

  • See PQI plan (PA-PQI 1) for a description of what is being measured, including:
    • outcomes measures
    • outputs
    • data sources
    • performance indicators
    • performance targets
  • See outcomes information provided in the Person-Centered Logic Model Core Concept in each Service Standard
  • Agencies seeking re-accreditation only

State Administered Agency (Regional Office)

  • See Regional PQI plan for region-specific measures or indicators, if appropriate
  • See outcomes information provided in the Person-Centered Logic Model Core Concept in each Service Standard
  • Agencies seeking re-accreditation only

On-Site Evidence

All Agencies

  • Documentation of staff/stakeholder involvement in ongoing review of measures, indicators, data sources, and performance targets
  • Regulatory/licensing or other external reviews/reports
  • Documentation that COA Stakeholder Surveys were distributed (e.g. email chains, Stakeholder Survey Recipient Reporting Form, etc.)
On-Site Activities

All Agencies
  • Interviews may include:
    1. PQI staff
    2. Relevant staff
    3. Other relevant stakeholders

 

PA-PQI 3.01

Staff throughout the agency and stakeholders, including contracted providers, participate in the ongoing review of outputs and outcomes, and related:
  1. quantitative and qualitative indicators;
  2. data sources including measurement tools and instruments for each identified measure; and
  3. performance targets.

Interpretation: Agencies are encouraged to use standardized or recognized outcomes evaluation tools when such tools are available and appropriate. Functional assessments permit the analysis of an individual or family’s status over time and, in the aggregate, this case-level data can inform the analysis of trends and relationships to correlating service delivery components.


Interpretation: Program outputs and individual outcomes must be identified in the logic model submitted in the Person-Centered Logic Model Core Concept in each assigned Service Standard. 


Interpretation: Agencies should assess variation in service population, service area, staffing and other factors in order to develop baselines, performance targets, and benchmarks that are tailored to the local area or program.

Examples:

Outputs are what the program delivers. Examples of program outputs include:

  1. number of educational or clinical sessions provided;
  2. total number of individuals served over a specified period of time; and
  3. number of housing placements made.

Outcomes are the observable and measurable effects of a program's activities on its service recipients. Examples include:

  1. improved functioning as measured by the Children's Functional Assessment Rating Scale (CFARS); 
  2. number/percent of homeless and runaway youth who are reunited with family during the past quarter;
  3. reduction in criminal justice system involvement; and
  4. improved family/community involvement.

 

PA-PQI 3.02

To evaluate the quality of its service delivery practices, the agency identifies and uses outcome measures related to the following:
  1. training and supervision of program staff; and
  2. consumer satisfaction.

 

PA-PQI 3.03

To evaluate management and operational performance, the agency identifies and uses outcome measures across the agency, and with contracted providers when applicable, to:
  1. measure progress toward achieving its strategic goals and objectives;
  2. evaluate operational functions that influence service delivery; and
  3. identify and mitigate risk.
CFS Interpretation: For child and family services agencies, implementation of this standard includes an examination of relevant systemic factors assessed by the Child and Family Services Reviews (CFSRs).

Examples: Examples of outcome measures related to operations and management can include:

  1. efficiency in the allocation and utilization of its human and financial resources to further the achievement of agency objectives;
  2. effectiveness of risk prevention measures;
  3. staff retention/turnover and satisfaction;
  4. service delivery costs versus benefits derived by persons served;
  5. achievement of budgetary objectives;
  6. effectiveness of public education and outreach; 
  7. efforts to diversify the leadership or workforce; and
  8. staff fidelity to the process and quality standards set by the agency.

Examples: Agencies that use contracted providers may also measure important contract oversight and system integration processes, such as:

  1. the proportion of services that are meeting defined outcomes for persons served;
  2. the proportion of services that are evidence-based or meet nationally recognized treatment guidelines developed by consensus groups;
  3. the integration of performance and outcomes data across the system;
  4. the integration and coordination of service provision processes across the system including ease of access to services;
  5. the effectiveness of contractor training and technical assistance efforts;
  6. the satisfaction of stakeholders, such as high volume referral agents (e.g., judges, court workers, schools, and law enforcement); and
  7. results of case reviews, including the percentage of charts in which a placement decision includes an appropriate application of clinical criteria.

 

PA-PQI 3.04

Findings and recommendations from external review and monitoring processes are integrated into the organization’s PQI system.
Interpretation: When agencies are involved in litigated third-party oversight, such as consent decrees, strategic plans and PQI plans (agency-wide plans or jurisdiction-specific plans) should indicate how the overall PQI system balances pursuit of compliance with the larger quality improvement agenda.

CFS Interpretation: For child and family services agencies, the PQI system must incorporate the findings of the Child and Family Services Review and support implementation of the strategies outlined in its Program Improvement Plan.
 
Examples: External review and monitoring processes can include:
  1. reviews related to federal, state, and local requirements;
  2. litigated third party oversight, including consent decrees;
  3. government audits;
  4. accreditation reviews; and 
  5. other reviews, where appropriate.

Performance and Quality Improvement (PA-PQI) 4: Case Review

The agency maintains case review processes for each of its services that inform performance and quality improvement activities by evaluating: 
  1. the impact of service delivery on each program's service population;
  2. the quality and effectiveness of service delivery practices; and
  3. the quality of documentation and data entry.

NA The agency is only assigned the Early Childhood Education (PA-ECE) and/or Out-of-School Time Services (PA-OST) standards.

 

NA The agency provides only non-clinical group, crisis intervention, and/or information and referral services.

Note: The case review processes addressed in this standard produce aggregate qualitative and quantitative data from across each service area that can be used to evaluate the impact of the agency’s service delivery practices on the outcomes of its service populations and to inform system-wide improvements, when indicated. These reviews are distinct from the case-level, supervisory review that is conducted for individual cases on a quarterly or more frequent basis to assess service plan implementation and the individual’s progress towards meeting his or her service goals and desired outcomes.

Self-Study Evidence
County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity
  • Procedures for:
    1. Qualitative case reviews
    2. Case record reviews
  • Qualitative case review scoring tool(s)
  • Quantitative case record review scoring tool
  • Sampling methodologies
  • Aggregate reports from the most recent case review processes
State Administered Agency (Regional Office)
  • Aggregate reports from the most recent case review processes
On-Site Evidence

All Agencies
  • Results of external case record audits, if applicable
On-Site Activities

All Agencies
  • Interviews may include:
    1. PQI staff
    2. Relevant staff

 

PA-PQI 4.01

The agency implements an annual qualitative case review (QCR) process on a sample of cases to evaluate the quality and effectiveness of services provided.

Interpretation: Jurisdictions can implement this standard by utilizing the CFSR protocol in between federal reviews, choosing some other nationally recognized QCR process such as the Quality Service Review, or developing their own local QCR process.  


Interpretation: Sample sizes for qualitative case reviews will vary depending on the caseload size of each jurisdiction or site. Reducing the sample size may be one way to reduce the overall cost and increase the feasibility of implementing an effective QCR process. 

Examples: Qualitative case reviews monitor the quality and effectiveness of services provided by evaluating the following, as appropriate to the program: 
  1. safety, well-being, and/or progress of the individual or family;
  2. timeliness and comprehensiveness of the completed assessment;
  3. appropriateness of the service plan and related service decisions for the individual or family;
  4. family engagement; 
  5. collaboration with external service provider(s);
  6. achievement of service goals; and
  7. level to which service implementation and results are being monitored, evaluated, and modified.
Examples: Qualitative data obtained from case reviews can provide greater insight into the underlying practices causing a change in the quantitative data (PA-PQI 4.03). Conversely, the quantitative data can be used to determine the scope or breadth of a practice concern (e.g. system wide, regional, worker, etc.). As such, both qualitative and quantitative data have a critical role to play in any effective PQI system.

 

PA-PQI 4.02

Annual, qualitative case reviews include: 

  1. reviewing selected case records;  
  2. conducting case-specific interviews with persons served, workers, and other stakeholders involved with the case; and 
  3. providing feedback to individual caseworkers. 

 

PA-PQI 4.03

Quarterly reviews of case records are conducted to:
  1. evaluate the presence, timeliness, clarity, quality, continuity, and completeness of required documents;
  2. monitor compliance with regulatory, funding, and accreditation requirements; and
  3. minimize risk associated with case record completeness and documentation.

 

PA-PQI 4.04

Case review processes include:
  1. staff at all levels of the agency including frontline staff;
  2. a stratified, random sample of both open and closed cases; 
  3. uniform scoring tools to ensure consistency and permit comparison of information; 
  4. measures to minimize conflict of interest such as ensuring that reviewers do not review cases in which they have been directly involved as a provider, supervisor, or consultant;
  5. measures to maintain process integrity such as third party quality assurance checks; and 
  6. mechanisms to address safety concerns identified in cases under review. 
Interpretation: 
Sampling: The chosen sample must reflect all of an agency's regions and/or sites, each of its programs and service areas, and the various types of record reviews the agency conducts. Agencies should choose a sampling method that satisfies any applicable regulatory requirements and is appropriate to their size and agency structure. 

Closed cases: COA does not define the percentage of closed cases that must be included in the sample. The majority of cases the agency reviews should be open, but the agency must include a sample of closed cases to evaluate documentation related to discharge planning, case closing, and aftercare.
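As an illustration of the sampling considerations above, the following is a minimal, hypothetical sketch of drawing a stratified random sample of open and closed cases by region and program. It is not a COA-prescribed method; the field names, per-stratum sample size, and share of closed cases are assumptions an agency would replace with values that satisfy its own regulatory requirements and structure.

```python
import random
from collections import defaultdict

# Illustrative case list; in practice this would come from the agency's
# information management system. Field names are assumptions for the example.
cases = [
    {"id": 1, "region": "North", "program": "Foster Care", "status": "open"},
    {"id": 2, "region": "North", "program": "Foster Care", "status": "closed"},
    {"id": 3, "region": "South", "program": "In-Home Services", "status": "open"},
    {"id": 4, "region": "South", "program": "In-Home Services", "status": "closed"},
]

def stratified_sample(cases, per_stratum=2, closed_share=0.25, seed=42):
    """Draw a random sample from each region/program stratum.

    Most sampled cases are open, but a share of closed cases is included so
    discharge, case closing, and aftercare documentation can be reviewed.
    """
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible and auditable
    strata = defaultdict(lambda: {"open": [], "closed": []})
    for case in cases:
        strata[(case["region"], case["program"])][case["status"]].append(case)

    sample = []
    for (region, program), groups in strata.items():
        n_closed = max(1, round(per_stratum * closed_share)) if groups["closed"] else 0
        n_open = per_stratum - n_closed
        sample += rng.sample(groups["closed"], min(n_closed, len(groups["closed"])))
        sample += rng.sample(groups["open"], min(n_open, len(groups["open"])))
    return sample

print([c["id"] for c in stratified_sample(cases)])
```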

Performance and Quality Improvement (PA-PQI) 5: Gathering Data and Communicating Information

The agency’s data management practices facilitate the collection, analysis, communication and interpretation of data.
Self-Study Evidence
County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity
  • Policies and/or procedures for:
    1. Data management
    2. Reviewing and aggregating data
  • Most recent aggregate data reports and additional summary documents (e.g., performance dashboards, reports of gains made against goals, annual scorecards, etc.)
  • Documentation of reporting to:
    1. staff, oversight entities, and stakeholders at least annually
    2. the public
  • Documentation of:
    1. Decisions made at the agency level based on PQI findings (corrective actions, initiatives, etc.)
    2. Tracking the impact of decisions made (data reports that link to areas named in PIP, annual reports, etc.)
  • Current agency and/or program improvement plans
State Administered Agency (Regional Office)
  • Aggregate data reports and additional summary documents (e.g., performance dashboards, reports of gains made against goals, annual scorecards, etc.)
  • Documentation of reporting to:
    1. staff, oversight entities, and stakeholders at least annually
    2. the public
  • Documentation of:
    1. Decisions made at the worker, program, and regional level based on findings (corrective actions, initiatives, etc.)
    2. Tracking the impact of decisions made (data reports that link to areas named in PIP, annual reports, etc.)
  • Current regional and/or program improvement plans
On-Site Evidence

All Agencies
  • Documentation of stakeholder review and discussion of PQI summary reports
  • PQI meeting minutes for the past six months
  • Leadership team, management, and staff meeting schedules, agendas, and minutes from the past six months
On-Site Activities

All Agencies
  • Interviews may include:
    1. Relevant staff
    2. External stakeholder groups
  • Observe systems for collecting, analyzing, and communicating data

 

PA-PQI 5.01

Procedures for collecting, reviewing, and aggregating data:

  1. ensure data integrity and reliability;
  2. protect personally identifiable information (PII) in data reports;
  3. engage staff at all levels of the agency including frontline staff; and
  4. facilitate the development of useable reports for analysis and interpretation.

Interpretation: The aggregation of data reduces the risk of disclosing PII in most instances; however, risk of disclosure still exists, particularly when data is disaggregated and unique or easily observable characteristics might allow someone to be identified in the data set. As such, data collection and reporting procedures should include mechanisms for avoiding such disclosure, such as data suppression, rounding, reporting in ranges rather than exact counts, and combining sub-groups into larger groups.
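To make the disclosure-avoidance mechanisms above concrete, here is a minimal, hypothetical sketch of small-cell suppression applied to aggregate counts before they are reported out. The minimum cell size of 10 and the group labels are illustrative assumptions, not COA requirements; rounding or reporting in ranges could be applied at the same point in the reporting process.

```python
# Minimal sketch of small-cell suppression before publishing aggregate counts.
# The threshold and category labels are illustrative assumptions.
MIN_CELL_SIZE = 10

counts_by_group = {
    "Region A / reunified": 42,
    "Region A / not reunified": 7,    # small cell: could risk identifying individuals
    "Region B / reunified": 118,
    "Region B / not reunified": 25,
}

def suppress_small_cells(counts, minimum=MIN_CELL_SIZE):
    """Replace any count below the minimum cell size with a suppression flag."""
    return {
        group: (count if count >= minimum else f"suppressed (<{minimum})")
        for group, count in counts.items()
    }

for group, value in suppress_small_cells(counts_by_group).items():
    print(f"{group}: {value}")
```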


 

PA-PQI 5.02

The agency analyzes disaggregated PQI data to:

  1. track and monitor identified measures;
  2. identify patterns and trends; and
  3. compare performance over time.


Interpretation: Agencies should disaggregate data to identify patterns of disparity or inequity that can be masked by aggregate data reporting. Common characteristics used to disaggregate data include:

  1. race and ethnicity/country of origin;
  2. generation status;
  3. immigrant/refugee status;
  4. age group;
  5. sexual orientation; and
  6. gender/gender identity.
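
As a simplified illustration of the disaggregation described above, the sketch below computes an outcome rate for each value of one characteristic so that group-level differences are visible rather than masked by the overall rate. The record fields, values, and outcome measure are hypothetical.

```python
from collections import defaultdict

# Hypothetical case-level outcome records; field names and values are illustrative only.
records = [
    {"race_ethnicity": "Black", "goal_achieved": True},
    {"race_ethnicity": "Black", "goal_achieved": False},
    {"race_ethnicity": "White", "goal_achieved": True},
    {"race_ethnicity": "Hispanic/Latino", "goal_achieved": True},
    {"race_ethnicity": "Hispanic/Latino", "goal_achieved": False},
]

def rate_by_group(records, characteristic, outcome_field):
    """Compute the outcome rate for each value of a disaggregation characteristic."""
    totals = defaultdict(lambda: [0, 0])  # group -> [achieved, total]
    for r in records:
        totals[r[characteristic]][1] += 1
        if r[outcome_field]:
            totals[r[characteristic]][0] += 1
    return {group: achieved / total for group, (achieved, total) in totals.items()}

overall = sum(r["goal_achieved"] for r in records) / len(records)
print("Overall rate:", round(overall, 2))
for group, rate in rate_by_group(records, "race_ethnicity", "goal_achieved").items():
    print(group, round(rate, 2))
```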



 

PA-PQI 5.03

Summary reports of PQI information: 
  1. are distributed and discussed with staff and stakeholders in a timeframe and format that facilitates review, analysis, interpretation, and timely corrective action;
  2. reflect multiple data sources, when appropriate, including quantitative and qualitative data and formal and informal information gathered; 
  3. enable the comparison of data against the results of similar programs, internal or external benchmarks, etc.; and
  4. facilitate compliance with regulatory data reporting requirements. 
Interpretation: The content and format of PQI summary reports should take into account the needs of regional and/or local offices to ensure the data is presented in a useful way that facilitates corrective action at the worker and program level.

Interpretation: Timely corrective action should include ensuring information is distributed early enough that regional and local offices can evaluate and implement changes prior to the next round of internal or external reviews. 

In regard to element (4), in addition to the data itself, child and family services agencies participating in the Child and Family Services Reviews must be prepared to provide the federal government with:
  1. the data source; 
  2. the methodology for calculating or analyzing the data;
  3. the scope of the data (i.e. geographic, population, etc.); 
  4. the time period applicable to the data;
  5. information pertaining to the completeness, accuracy and reliability of the data; and 
  6. other known limitations of the data.
Examples: Methods for sharing findings can include:
  1. performance dashboards, report cards, or other types of summary reports;
  2. using monthly reports of key service delivery outputs and outcomes in staff supervision activities; 
  3. conducting focus groups and presentations at community meetings; 
  4. soliciting feedback via interviews or surveys;
  5. providing quarterly reports to the oversight entities, stakeholder advisory groups, and leaders on important data related to key operations and management functions; and
  6. quality review activities that engage community providers.
Examples: In regard to element (1), discussions with staff and stakeholders about PQI information can include:
  1. areas of strength and quality practice;
  2. areas for improvement; and
  3. how to prioritize targeted areas, identify interventions, and monitor the effectiveness of interventions over time.
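
As a simple illustration of element (3) of PA-PQI 5.03 above, the sketch below compares current performance on a few measures against internal targets and flags measures that may warrant corrective action in a summary report. The measure names, values, targets, and flagging rule are hypothetical.

```python
# Illustrative comparison of current performance against internal targets
# for a PQI summary report. Names and numbers are hypothetical.
measures = [
    {"measure": "Timely initial assessments (%)", "current": 87.0, "target": 90.0},
    {"measure": "Placement stability (%)", "current": 93.5, "target": 92.0},
    {"measure": "Worker visits within 30 days (%)", "current": 78.0, "target": 85.0},
]

for m in measures:
    gap = m["current"] - m["target"]
    status = "meets target" if gap >= 0 else "below target: review for corrective action"
    print(f'{m["measure"]}: current {m["current"]}, target {m["target"]} ({status})')
```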

 

PA-PQI 5.04

The agency has a mechanism for reporting, at least annually, to oversight entities, stakeholders, and staff on:
  1. key PQI activities that are ongoing, have been resolved, or that need further intervention;
  2. issues that require continued monitoring within the PQI system; and  
  3. PQI priorities and goals for the coming year.

 

PA-PQI 5.05

The agency shares PQI information with the public as part of its public outreach and education strategy.
Note: See PA-AM 4.01 for more information on developing a public outreach and education strategy.

 

PA-PQI 5.06

The agency:
  1. acts on PQI findings at the worker, program, region/community, agency, and system level; and
  2. monitors the effectiveness of interventions and adjusts interventions, as needed.
Interpretation: Information generated by the PQI system serves as evidence for identifying interventions in relation to:
  1. fulfilling the mission and meeting legal mandates;
  2. monitoring progress toward strategic plans and long-term goals;
  3. managing programs and operations efficiently and effectively;
  4. supporting direct service staff to meet program goals, make informed case-level decisions, and have a positive impact on persons served; and
  5. meeting regulatory requirements. 
Examples: Agencies can use PQI findings and feedback to:
  1. develop solutions;  
  2. replicate good practice;  
  3. recognize and motivate staff; 
  4. update staff training and other professional development activities;
  5. improve organizational systems, processes, policies, and procedures; and
  6. eliminate or reduce identified problems.

 

PA-PQI 5.07

The agency develops improvement plans when issues have been identified that will involve coordinated and ongoing activities and monitoring.
Interpretation: Improvement plans formally lay out the actions that will be taken to address areas in need of improvement that are identified by staff and stakeholders as crucial to meeting the agency’s goals and delivering quality services. Improvement plans should be implemented when it is necessary to monitor and address the issue over time. 
 

Interpretation: State-administered agencies should manage a statewide and regional performance improvement action planning process in order to take system-wide action and also allow for targeted PQI activities based on regional context.
 

 

PA-PQI 5.08

Agency leaders, senior managers, program directors, and supervisors:
  1. keep PQI on the agenda of management and staff meetings;
  2. integrate data discussions and outcomes monitoring into case reviews, supervision, performance review, and contract monitoring; 
  3. regularly evaluate the need for and uses of data at the worker, program, region/community, agency, and system level; and
  4. evaluate the PQI infrastructure, processes, and procedures.

Performance and Quality Improvement (PA-PQI) 6: Contracting Practices

The agency enters into contracts as a purchaser of services with due regard for practices that promote positive service recipient outcomes and efficient use of resources.
Interpretation: The standards in PA-PQI 6 apply to all contracts entered into by the agency in which it acts as a purchaser of (1) social and human services or (2) staff training and other personnel development services.  This includes contracts with provider organizations as well as contracts with independent contractors.
NA Contracting is managed by an external department.

NA State-administered agency regional office
Self-Study Evidence
County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity
  • Contracting and procurement policies, procedures, and applicable regulations
  • List of applicable contracts
  • Sample of three applicable contracts
State Administered Agency (Regional Office)
  • Evaluated at the Central Office Only
On-Site Evidence

County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity
  • Additional contracts
State Administered Agency (Regional Office)
  • Evaluated at the Central Office only
On-Site Activities

County/Municipality Administered Agency, State Administered Agency (Central Office), or other Public Entity
  • Interviews may include:
    1. Agency head
    2. Contract manager(s)
    3. Contracted providers, including independent contractors
State Administered Agency (Regional Office)
  • Evaluated at the Central Office only

 

PA-PQI 6.01

The pursuit of contracts is:
  1. consistent with the agency’s mission and practice model; 
  2. aligned with, and supportive of, the agency’s service array and resource development goals; and
  3. responsive to the identified needs and desired outcomes of persons served.

 

PA-PQI 6.02

The agency:
  1. establishes a system of standardized contracting practices; and
  2. conducts due diligence in contracting activities including review of possible risks.

 

PA-PQI 6.03

The agency has a process for verifying that prospective contractors:
  1. have sufficient human and financial resources to fulfill the terms of the contract; 
  2. are licensed or otherwise legally authorized to provide the contracted services; 
  3. employ appropriately qualified staff; and 
  4. have a history of satisfactory performance under previous contracts with the agency, as applicable.
Interpretation: The agency should have a process for verifying the qualifications of independent contractors or personnel employed by contracted providers including confirmation that providers:
  1. possess relevant licenses and/or credentials;
  2. have the desired expertise and competencies for the contracted service, including cultural responsiveness and sufficient experience delivering services to the population served; and
  3. receive appropriate supervision.

Performance and Quality Improvement (PA-PQI) 7: Contract Monitoring and Quality Improvement

The agency monitors, evaluates, and enhances the quality and effectiveness of services purchased from other provider organizations or independent contractors.
Interpretation: Contracting of services does not relieve the public agency of its responsibility to ensure that high quality, effective services are being delivered. Contract monitoring practices ensure contracted providers are in compliance with applicable law and regulation, providing high quality services, achieving identified deliverables, and meeting desired outcomes. 
 

Interpretation: Public agencies must have a well-defined monitoring process that is laid out in their contract monitoring procedures. For state-administered agencies, this includes identifying the role of regional offices in implementing each of the contract monitoring and quality improvement activities identified in this Core Concept. For example, when case responsibility is shared by the regional office or when the contract originates at the regional office, it may be appropriate for the region to be more directly involved in contractor monitoring and quality improvement.
 
Self-Study Evidence
County/Municipality Administered Agency, State Administered Agency (Central Office) or other Public Entity
  • Contract monitoring procedures
  • Sample of three contract monitoring plans
  • Contract monitoring tools and scoring mechanisms
  • Information provided to contractors
  • Sample of three contractor improvement plans
  • Sample of three contractor progress reports
State Administered Agency (Regional Office)
  • No Self-Study Evidence
On-Site Evidence

County/Municipality Administered Agency, State Administered Agency (Central Office) or other Public Entity
  • See contracts in PA-PQI 6
  • Additional contract monitoring plans
  • Documentation of technical assistance to contracted providers
  • Training curricula for contract manager(s)
  • Sample of three job descriptions for contract manager(s)
  • Documentation tracking contract managers' completion of required trainings
State Administered Agency (Regional Office)
  • Documentation of reporting/information sharing between the region and the central office regarding the quality of services from contracted providers
On-Site Activities

County/Municipality Administered Agency, State Administered Agency (Central Office) or other Public Entity
  • Interviews may include:
    1. Agency leadership
    2. In-house counsel
    3. Contract manager(s)
    4. PQI personnel
    5. Contracted providers
State Administered Agency (Regional Office)
  • Interviews may include:
    1. Regional Director
    2. PQI personnel
    3. Contracted providers

 

PA-PQI 7.01

Written contracts contain all significant terms and conditions in accordance with applicable law.
Interpretation: “Significant terms” can include, as appropriate to the type of contract:
  1. roles and responsibilities of participating agencies;
  2. services to be provided;
  3. service authorization including eligibility criteria;
  4. provisions and/or requirements for provider training and technical assistance, as necessary;
  5. duration of contract including delineation of follow-up services;
  6. policies and procedures for sharing information including access to case record provisions;
  7. methods for resolving disputes;
  8. utilization management protocols;
  9. performance and quality improvement responsibilities;
  10. a plan and procedure for timely payment and consequences for failure to pay;
  11. documentation necessary for, and means of reporting to, funding or oversight bodies; 
  12. required levels of insurance; and
  13. conditions for termination of the contract.
NA Contracting is managed by an external department.

 

PA-PQI 7.02

The agency integrates contract monitoring into its performance and quality improvement activities by developing a plan for monitoring contractor progress that:
  1. is developed in partnership with the provider and tailored to the service being provided;
  2. establishes goals and performance measures for service quality, consumer satisfaction, and outcomes; 
  3. specifies monitoring activities including frequency and responsible parties;
  4. establishes specific requirements for provider participation in performance and quality improvement activities including qualitative and quantitative data reporting and corrective action; 
  5. outlines how performance data will be  monitored and reported out; and 
  6. establishes mechanisms for ongoing, regular communication between the public agency and the contracted provider. 
Interpretation: In regard to element (4), the collection, analysis, and distribution of contract monitoring data should be aligned with the agency's performance and quality improvement system, ensuring that incoming data is used to inform continuous quality improvement of purchased services.
Examples: In regard to element (5), in addition to sharing findings with relevant staff within the public and private agency, the agency may also wish to tailor reports for additional stakeholder groups that have an impact on, or vested interest in, performance achievement such as the public, courts, provider networks, citizen review boards, and legislators.

Examples: Monitoring activities include, but are not limited to:
  1. review of performance reports from contracted providers to track progress and identify trends/concerns;
  2. case reviews;
  3. meetings; and
  4. visits to the program.

 

PA-PQI 7.03

Contracted providers receive information on:
  1. agency mission, principles, logic models, and system-wide performance indicators;
  2. relevant service-delivery policies and procedures;
  3. relevant federal and state requirements;
  4. technical assistance procedures; 
  5. the conflict resolution and provider appeal process; and
  6. other information necessary to establish consistent practice and policy implementation. 
NA Contracting is managed by an external department.
Examples: Technical assistance can include providing the support needed to:
  1. use the information management system for data reporting; 
  2. understand how data will be used to track performance;
  3. ensure service continuity and quality; and
  4. support implementation of system-wide practice initiatives.

 

PA-PQI 7.04

Systems are in place to collect and respond to contractor performance concerns identified by public agency staff at all levels, including frontline staff and supervisors, and when areas of concern are identified, the agency:
  1. develops an improvement plan in conjunction with the contractor;
  2. ensures contractor follow-up and remediation; and
  3. terminates contracts if contractors do not comply with improvement action/remediation plans.

 

PA-PQI 7.05

A qualified staff member is assigned to oversee and monitor each contract and is trained and supervised on: 
  1. facilitating partnership and collaboration;
  2. understanding and using data collection and monitoring tools;
  3. the relationship between the PQI system, contract monitoring, and quality service delivery;
  4. report writing; and
  5. contract requirements.
Interpretation: When monitoring responsibilities are spread across divisions, personnel should work collaboratively to ensure their efforts are aligned, findings are shared, and duplication of effort is minimized.
Copyright © 2024 Council on Accreditation