Evaluation methodology

To guide the evaluation, a methodology and evaluation matrix were developed that drew on the flexibility afforded by the Treasury Board’s Policy on Results (2016). The scope of the evaluation focused on relevance and performance, and the evaluation addressed seven main questions, as summarized below.

Relevance

  1. Do the Centres continue to address a demonstrated need within Justice?

Effectiveness and Efficiency

  1. What is the current design of each Centre and to what extent has each Centre been implemented according to plan?
  2. How effective and efficient is the design and delivery of the Centres, including the mandate and service delivery model?
  3. To what extent do the Centres work collaboratively to provide high-quality and consistent legal and policy advice?
  4. How do the Centres contribute to Justice’s Vision of client-centric strategic partnerships? Are there any areas for improvement?
  5. Are there any opportunities (i.e., good practices or lessons learned) that could be implemented across other Centres to enhance their design or service delivery?
  6. Are there common elements related to success among the Centres?

The evaluation included multiple data collection methods, including a document review, administrative data review, process mapping, key informant interviews, and focus groups. Each of these methods is described below.

3.1 Document Review

The document review provided descriptive information on each Centre, as well as information responding to the evaluation questions. It covered the following documentation:

3.2 Administrative Data Review

Administrative data were obtained from Justice’s Departmental Business Analytics System (i.e., Explore). Data were extracted between July 2020 and January 2021 from Explore’s Data Warehouse via Tableau, which draws on iCASE, LEX, the IFMS, and the Human Resources Management System. The data review focused on files to which Centre timekeepers recorded time between FY 2015-2016 and FY 2019-2020. It considered both the number of actively managed files and the hours recorded on those files, broken down by file type (excluding corporate files), client name, risk and complexity rating, and type of work conducted.

3.3 Process Mapping

To ensure that the evaluation was based on a complete and accurate understanding of the design and delivery of each of the Centres, an early data collection task included a process mapping exercise. An Evaluation Working Group representative for each Centre was asked to assist with identifying participants for the process mapping sessions. This exercise involved one three-hour in-person session with each Centre, with the goal of describing the flow of work, how processes were carried out, and where potential constraints existed or where improvements could be made.

3.4 Key Informant Interviews

Interviews were conducted with key informants representing the following groups: senior personnel from the Centres, other areas of Justice (i.e., LSUs, NLS, and other Sectors or Portfolios as applicable), and representatives from client departments and agencies. The interviews with Centre legal counsel covered a broad range of questions regarding the relevance, design and delivery, and effectiveness and efficiency of the Centres. Interviews were also conducted with users of the Centres, both within Justice and in client departments and agencies, who worked directly with the Centres. These interviews focused on the legal needs addressed by the Centres, the users’ experiences working with the Centres (e.g., when and how they engaged with the Centres), and the quality, consistency, and efficiency of the services received.

The evaluation included 60 small group interviews, involving 128 individuals. Table 4 provides further details on the distribution of key informants per Centre, with a further breakdown by stakeholder group provided in Table 7 in Appendix B.

Table 4: Distribution of key informants interviewed per Centre

Centre (number of key informants interviewed):

  • CLEL: 21
  • CLS: 16
  • ALC: 17
  • CIPL: 21
  • CAILS: 12
  • HRLS: 15
  • OLAD: 15
  • CoEPL: 11
  • Total: 128

3.5 Focus Groups

Focus groups were conducted with legal counsel working in the Centres.8 They centered on the strengths and limitations of the current mandate and work processes, as well as the quality of the support offered by the Centres. A total of 15 focus groups, involving 48 individuals, were held across the Centres (see Table 8 in Appendix B for a more detailed breakdown of the number of participants per Centre).

3.6 Limitations, Challenges and Mitigation Strategies

The evaluation encountered a few limitations and challenges, with mitigation strategies implemented accordingly (see Table 5).

Table 5: Summary of limitations, challenges, and mitigation strategies

Limitation or challenge: Interviews and potential for bias

  • While a significant number of interviews and focus groups were held as part of this evaluation, they covered eight fairly unique Centres. As such, the number of stakeholders consulted under each category for each Centre remained limited.
  • There is a potential for bias due to the sampling approach, the voluntary nature of participation, self-reporting, and the possible desire to affect outcomes.

Mitigation strategy:

  • Multiple lines of evidence (e.g., Justice’s Client Feedback Survey, other relevant surveys, and administrative data) and triangulation were used to confirm results where possible.
  • Efforts were made to include a representative sample of participants with diverse perspectives:
  • a wide range of stakeholders (i.e., client departments and agencies, LSUs, NLS, other areas within Justice, and legal counsel within the Centres); and
  • stakeholders who engaged with the Centres to varying degrees (i.e., large, medium, and small users).

Limitation or challenge: Reliability and validity of administrative data captured in iCASE/LEX

  • The amount of time recorded to various file types may not accurately reflect the type of work being conducted. Several Centres noted that recording in iCASE/LEX may have been inconsistent (i.e., some work may have been recorded to incorrect categories).
  • Due to these data entry and consistency issues, it was challenging to obtain more detailed information in some areas:
  • Risk and complexity ratings were often missing, or reflected the risk and complexity of the file as a whole rather than the work the Centre was doing on the file.
  • Centres that provided legal advice to LSUs were not always given the file numbers for that advice. In some cases, Centres opened their own files to record their time, which could break the link between the original file information (e.g., risk and complexity level, client department for whom the legal advice was provided) and the new file opened by the Centre.

Mitigation strategy:

  • Working sessions were held with each Centre to fully assess any potential data management issues and to determine the best approach to appropriately represent the information in the reports.

Footnotes

8 Senior personnel from the Centres who already participated in interviews were excluded from the focus groups.