Aboriginal Courtwork Program, Formative Evaluation

3. Findings – Performance Measurement

One of the purposes of this evaluation was to assess the implementation of the ACW performance measurement strategy and determine whether the performance information required for the summative evaluation would be available. This chapter presents the findings relating to the performance measurement strategy. Annex D provides an overview of the actual performance measurement requirements outlined in the program's RMAF. The performance measurement information is necessary to meet the federal accountability requirements and demonstrate the value of the program when the Terms and Conditions are renewed (by March 31, 2008).

3.1. Extent of Implementation

The document review indicates that the key decisions and deadlines around the implementation of the performance reporting requirements were as follows:

Pilots of the judiciary and client surveys are under way in British Columbia, Nova Scotia and Alberta.

P/T officials primarily had concerns with performance measures 7 to 12. Specific P/T concerns were as follows:

In the North, territorial officials expressed specific concerns about:

In this regard, it should be noted that recent DOJ reviews of access to justice services in the North[13] call for an integrated approach to collecting performance measurement information and, in some cases, suggest what these measures should be. The reviews also identify the capacity constraints of the Legal Services Boards, which could affect data collection in the North.

According to the SDA, several factors facilitate the preparation of the performance measurement reports, including the pilots under way, established reporting procedures, and the SDA's motivation to ensure accountability. Challenges included achieving compliance and consistency among a large, geographically dispersed ACW team; the inflexibility of the federal reporting template; and the lack of a dedicated information technology expert to troubleshoot problems. Some SDA reported inconsistency between their information management systems and DOJ reporting requirements.

SDA insisted that key operational terms (e.g., service delivery and client) must be defined and standardized before reliable reporting is possible. They also questioned the practicality of long-term follow-up and of obtaining the collaboration of criminal justice system officials (e.g., judges, lawyers) in collecting performance information. Finally, they were concerned that a lack of capacity might inhibit their ability to meet the reporting requirements and that the requirements did not capture all courtworker activity.

The SDA made several recommendations:

To date, the performance information received by the Department has been of varying quality. The Department has also held workshops in each jurisdiction to refine the logic model to reflect courtworker activities. The Department is aware of the P/T and SDA challenges and concerns, but views the performance measurement information as necessary to meet federal accountability requirements and to demonstrate the value of the program when the Terms and Conditions are renewed by March 31, 2008.

Courtworkers were asked about the performance measurement information they were required to provide to their SDA. Almost all courtworkers (90%) indicated that they had been providing their SDA with statistical information on their clients.[14] Of these, 81 percent provided information monthly; 24 percent reported weekly or daily.

Conclusions: In anticipation of the December 31, 2006 reporting deadline for the first performance report, there are varying degrees of implementation among the SDA. The requirements for core measures 1 to 6 do not appear problematic, although definitions are still needed for the terms "client" and "service delivery".

Pilot judiciary and client surveys are under way in British Columbia, Nova Scotia and Alberta. P/T officials are concerned about the capacity of smaller SDA to conduct the judiciary and client surveys, especially without additional resources and training. Several data utility and quality issues need to be addressed, and an information management system should be developed that allows for community-based analyses. There are also concerns about confidentiality and privacy, literacy and response-rate challenges for the surveys, and the lack of training and resources in the SDA.

3.2. Adequacy of Tools and Resources to Assist in Meeting Performance Reporting Requirements

To help identify and develop the performance measurement reporting requirements, the TWG SC, through the Department, contracted a consulting firm to conduct performance measurement consultations with provincial officials and SDA staff in all provinces, and to use those findings to develop a performance measurement and reporting guide for the P/T and SDA. The consultants held workshops with provincial officials and SDA staff in 2004-05 to develop an ACW logic model specific to each jurisdiction, to identify and refine the core performance measures, and to identify means of addressing data collection challenges. The Department subsequently conducted similar workshops with Ontario and two territories in 2005, and with one territory in 2006.

The Performance Measurement and Reporting Guide for the provinces was finalized in September 2005, the guide for NWT was completed in October 2005, and that for Yukon in May 2006.  The guide for Nunavut had not been finalized at the time of this evaluation. The Performance Measurement and Reporting Guide contains reporting templates for use by SDA, and a sample year-end performance report developed by DOJ.

Although P/T officials and SDA found the consultation workshops, the guide and the templates useful, additional clarity, training, resources and time are needed to implement the performance measurement requirements successfully.

Specifically, P/T officials identified the following additional supports as needed:

All SDA had received tools (such as templates, guides, forms, samples or notes) to help them meet the performance measurement requirements. Most SDA expressed concern that the tools were adequate only from a funder's perspective and were not useful to the SDA themselves, though others felt that the logic models and performance measurement guide were helpful, in part because they provided good real-life examples. Some SDA felt that more standardization was required, or at least agreement on what data would be most useful. A few SDA stressed the need to make the tools simpler and more understandable.

Some SDA said they needed more human resources to collect, manage and report the data; others suggested training, updates, communication, or opportunities for SDA to share information and experiences. Two SDA suggested establishing a database that would match the performance measurement requirements.

Almost all courtworkers surveyed (94%) indicated that they had been given tools to collect the required performance information.  The most frequently mentioned tools were:

About three-quarters of the courtworkers indicated they had the tools they needed. Many of those who required more tools said they needed a computer, a better computer, or a laptop.

Conclusions: Although the various resources have been useful, more tools and resources are required, including survey training, a hands-on operational guide, a clear definition of clients, and more funding to support the added burden on SDA of carrying out the surveys. Some jurisdictions suggested that a national database would be beneficial. SDA in particular would like the performance measurement requirements to better reflect their management information needs.

3.3. Factors Facilitating or Impeding the Collection and Reporting of Performance Information

Table 5 summarizes the factors affecting data collection and reporting, as cited by SDA, P/T officials, departmental officials and courtworkers.

Table 5 – Factors facilitating and impeding performance measurement

SDA
  Facilitating factors:
  • Many are collecting similar information
  • People are motivated
  • Well-established working relationships
  • Infrastructure already in place
  Impeding factors:
  • Need more involvement
  • Needs to be more SDA-driven (the information or the collection process)
  • Need a better understanding of how the information will be used
  • Need to ensure a balance between service delivery and reporting requirements (the burden is high)

P/T
  Facilitating factors:
  • Will help with accountability and demonstrate the value of the program
  • Training and tools have helped
  Impeding factors:
  • SDA not well equipped to deal with confidentiality and privacy issues
  • A lot of accountability for a little money, particularly compared to other DOJ programming (e.g., AJS)
  • Need a national database
  • Lack of an integrated approach to performance measurement in the territories under the Access to Justice Services Agreements
  • Lack of capacity in most SDA (mostly for the survey measures)
  • Survey training needed
  • Potential response bias
  • Fear of what will happen if performance information is negative

DOJ
  Facilitating factors:
  • Logic model and performance measurement workshops
  • Performance Measurement Guide and reporting templates
  Impeding factors:
  • Inconsistent terminology
  • Capacity issues
  • Implementation delayed by differences in the dates agreements were signed with the P/T

Courtworkers
  Facilitating factors:
  • The tools provided are helpful
  • Ability to follow up with clients
  • Equipment available (computers, telephones, fax machines)
  • Access to the Internet
  • Access to people (supervisor, other courtworkers, etc.)
  Impeding factors:
  • The time it takes and the lack of resources to do it
  • Inability to access tools off site
  • Problems with or lack of equipment (no laptop, network problems, Internet problems, phone service problems, etc.)

Nearly half of the SDA believed that their jobs had been made easier by having the infrastructure in place and being reasonably familiar with the reporting requirements. One SDA believed that well-established working relationships with key stakeholders had also helped. Nearly half of the SDA indicated that challenges arose because the requirements had not been accompanied by adequate resources. At least two SDA were hindered by a lack of agreement between the Department and the SDA over how the data would be used. Others identified the following challenges: securing information in a timely manner, team turnover in the field or within the Department, lack of information about roles and responsibilities, and changing expectations. At least one SDA questioned how information could be obtained from those with no obligation to respond, such as criminal justice system officials.

One SDA indicated that a well-designed national database would help illustrate the big picture provincially and nationally, and would promote understanding of program priorities. In fact, just over half of the SDA recommended creating a standardized national electronic database that would be relevant for the SDA and would meet the federal reporting requirements. The remaining SDA, however, especially those struggling with data collection, wanted to meet to review accuracy, answer questions and exchange information. Other recommendations included a more streamlined information system, more human resources to populate the database, and a policy analyst to analyse the data. Two SDA believed that more time was needed to discuss system development at the TWG.

The Department was aware that SDA had capacity concerns about core performance measures 7 to 12, and tried to address these concerns through extensive consultation on performance reporting and the development of the Performance Measurement and Reporting Guide. Departmental officials also acknowledged that SDA capacity constraints are more severe in the North.[15]

Courtworkers named several factors that facilitated performance reporting: the tools provided, their contacts with clients, the equipment available (including telephones and fax machines), and access to the Internet and to people (supervisor, other courtworkers, etc.). Impeding factors included the inability to follow up with clients (i.e., if the information is not collected in the first visit, it may be impossible to get), the time it takes to follow up and the lack of resources with which to do it, the lack of off-site access to the tools, and equipment problems.

Conclusion:  Several factors facilitate or impede data collection and reporting. Facilitating factors include:

Impeding factors include: