Evaluation methodology
To guide the evaluation, a methodology and evaluation matrix were developed in keeping with the flexibility afforded by the Treasury Board’s Policy on Results (2016). The scope of the evaluation focused on relevance and performance. The evaluation addressed seven main questions, summarized below.
Relevance
- Do the Centres continue to address a demonstrated need within Justice?
Effectiveness and Efficiency
- What is the current design of each Centre and to what extent has each Centre been implemented according to plan?
- How effective and efficient is the design and delivery of the Centres, including the mandate and service delivery model?
- To what extent do the Centres work collaboratively to provide high-quality and consistent legal and policy advice?
- How do the Centres contribute to Justice’s Vision of client-centric strategic partnerships? Are there any areas of improvement?
- Are there any opportunities (i.e., good practices or lessons learned) that could be implemented across other Centres to enhance their design or service delivery?
- Are there common elements related to success among the Centres?
The evaluation included multiple data collection methods, including a document review, administrative data review, process mapping, key informant interviews, and focus groups. Each of these methods is described below.
3.1 Document Review
The document review provided descriptive information on each Centre, as well as information that responds to evaluation questions. It covered the following documentation:
- Program documents: Key background documents provided by the Centres were reviewed. In some cases, these included internal data (e.g., training sessions, monitoring trends, and number of requests).
- Publicly available departmental and other government documents: Publicly available departmental documents, as well as relevant documents on federal priorities, were also reviewed.
- Surveys: The Department of Justice Canada Client Feedback Survey is administered by the Corporate Planning, Reporting, and Risk Division as part of its overall performance management agenda. The purpose of the Survey is to obtain feedback on the degree to which Justice legal services respond to the needs of client departments and agencies. Qualitative feedback from Cycle III (2016-2019) was available for some of the Centres (i.e., CLEL and CoEPL). In addition, results from the Client Satisfaction Survey conducted by CoEPL were reviewed.
3.2 Administrative Data Review
Administrative data were obtained from Justice’s Departmental Business Analytics System (i.e., Explore). Data were extracted from Explore’s Data Warehouse via Tableau, which includes data from iCASE, LEX, the IFMS, and the Human Resources Management System. The data were extracted between July 2020 and January 2021. The review focused on files to which Centre timekeepers recorded time between FY 2015-2016 and FY 2019-2020, and considered both the number of actively managed files and the hours recorded on those files, broken down by file type (excluding corporate files), client name, risk and complexity rating, and type of work conducted.
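The type of filtering and aggregation described above can be sketched as follows. This is a hypothetical illustration only: the column names, values, and grouping choices are assumptions for demonstration, not the actual Explore/iCASE data model.

```python
# Hypothetical sketch of the administrative data review: keep records with
# time recorded between FY 2015-2016 and FY 2019-2020, exclude corporate
# files, and summarize active files and hours by file type.
import pandas as pd

# Illustrative records; not actual Explore data.
records = pd.DataFrame({
    "fiscal_year": ["2015-2016", "2019-2020", "2020-2021", "2018-2019"],
    "file_type": ["Advisory", "Litigation", "Advisory", "Corporate"],
    "hours": [10.5, 7.0, 3.0, 5.0],
})

# Restrict to the evaluation period and drop corporate files.
in_scope = records[
    records["fiscal_year"].between("2015-2016", "2019-2020")
    & (records["file_type"] != "Corporate")
]

# One row per file record here, so size() counts active files.
summary = in_scope.groupby("file_type").agg(
    active_files=("hours", "size"),
    total_hours=("hours", "sum"),
)
print(summary)
```

The same grouping could be repeated by client name or by risk and complexity rating, as described above.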
3.3 Process Mapping
To ensure that the evaluation was based on a complete and accurate understanding of the design and delivery of each of the Centres, an early data collection task included a process mapping exercise. An Evaluation Working Group representative for each Centre was asked to assist with identifying participants for the process mapping sessions. This exercise involved one three-hour in-person session with each Centre, with the goal of describing the flow of work, how processes were carried out, and where potential constraints existed or where improvements could be made.
3.4 Key Informant Interviews
Interviews were conducted with key informants representing the following groups: senior personnel from the Centres, other areas of Justice (i.e., LSUs, NLS, other Sectors or Portfolios as applicable), and representatives from client departments and agencies. The interviews with Centre legal counsel covered a broad range of questions regarding the relevance, design and delivery, and effectiveness and efficiency of the Centres. Interviews were also conducted with users of the Centres, including those within Justice and other client departments and agencies who worked directly with the Centres. These interviews focused on the legal needs addressed by the Centres, the users’ experiences working with the Centres (e.g., when and how they engaged with the Centres), and questions related to the quality, consistency, and efficiency of the services received.
The evaluation included 60 small group interviews, involving 128 individuals. Table 4 provides further details on the distribution of key informants per Centre, with a further breakdown by stakeholder group provided in Table 7 in Appendix B.
| Centres | Number of key informants interviewed |
|---|---|
| CLEL | 21 |
| CLS | 16 |
| ALC | 17 |
| CIPL | 21 |
| CAILS | 12 |
| HRLS | 15 |
| OLAD | 15 |
| CoEPL | 11 |
| Total | 128 |
3.5 Focus Groups
Focus groups were conducted with legal counsel working in the Centres.8 They centered on the strengths and limitations of the current mandate and work processes, as well as the quality of the support offered by the Centres. A total of 15 focus groups, involving 48 individuals, were held across the Centres (see Table 8 in Appendix B for a more detailed breakdown of number of participants per Centre).
3.6 Limitations, Challenges and Mitigation Strategies
The evaluation encountered a few limitations and challenges, with mitigation strategies implemented accordingly (see Table 5).
| Limitation or Challenge | Mitigation Strategy |
|---|---|
Footnotes
8 Senior personnel from the Centres who already participated in interviews were excluded from the focus groups.