Central Agencies Portfolio Evaluation

3. Methodology

The evaluation drew on four lines of evidence to support robust findings: a document and data review, key informant interviews, a survey of legal counsel, and case studies.

The evaluation matrix (which links the evaluation questions, indicators, and lines of evidence) and the data collection instruments were developed with the input of the CAP evaluation working group. The evaluation matrix is included in Appendix B, and the data collection instruments are in Appendix C.

Each of the evaluation methods is described more fully below. This section also includes a brief discussion of methodological challenges.

3.1. Document and Data Review

The document and data review was conducted both to inform the development of data collection instruments and to address the majority of the evaluation questions.

Documents reviewed were obtained from internal, external, and publicly available sources. Departmental documents reviewed included Departmental Performance Reports, Reports on Plans and Priorities, results from Public Service Employee Surveys (PSES),Footnote 14 and Client Feedback Survey results.Footnote 15 Internal Portfolio documents, as well as publicly available information such as Budgets and Speeches from the Throne, were also reviewed.

In addition to documents, the evaluation reviewed iCase data from fiscal years 2010-11 to 2014-15; iCase is the Department’s integrated case management, timekeeping, document management, and reporting system.

3.2. Key Informant Interviews

The key informant interviews conducted for this evaluation addressed the majority of evaluation questions and were a key line of evidence on the need for the Portfolio and the effectiveness of its activities. A list of potential key informants was prepared, and interview guides tailored to each key informant group were developed in consultation with the evaluation working group. Interviews were conducted with a total of 31 key informants. The specific categories of key informants are shown in Table 3.

A qualitative scale was applied when reporting on the relative frequency of key informant responses.

Table 3: Key Informant Interviews
Category Number of Key Informants
ADMO and the CAP LSUs 14
Other areas of the Department of Justice (regional offices, specialized sections within the Public Law Sector, and the LSB) 7
Client departments or agencies 10
TOTAL 31

3.3. Survey of Counsel

To gather the input of all Portfolio counsel, the evaluation included a confidential web-based survey. The survey was online for approximately two weeks, from September 23 to October 8, 2015. During this period, two reminders were sent to potential participants to increase the response rate. Invitations were sent to 64 counsel, but three counsel were away for the entire survey period.Footnote 16 In total, 39 respondents completed the survey, for a response rate of 64%. Once the survey closed, answers to open-ended questions were coded and the survey data were analyzed using SPSS, a statistical software package.
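The reported response rate excludes the three counsel who were unavailable for the entire survey period, so it is calculated over 61 eligible invitees rather than all 64. A minimal sketch of that arithmetic (the function name is illustrative, not from the evaluation):

```python
def response_rate(invited: int, unavailable: int, completed: int) -> float:
    """Response rate calculated over eligible invitees (invited minus unavailable)."""
    eligible = invited - unavailable
    return completed / eligible

# CAP counsel survey: 64 invited, 3 away for the whole survey period, 39 completed.
rate = response_rate(invited=64, unavailable=3, completed=39)
print(f"{rate:.0%}")  # 39/61 rounds to 64%
```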

Table 4 provides a profile of survey respondents. Generally, respondents were representative of the population of Portfolio counsel in terms of level, years with the Department, and where they work within the Portfolio.

Table 4: Comparison of the CAP and Survey Respondent Profiles

Note: Some totals do not sum to 100%, due to rounding.

What is your current classification?
Characteristics CAP (n=64) Survey Respondents (n=39)
Number % Number %
Counsel: LP-01 17 27% 6 15%
Counsel: LP-02 30 47% 24 62%
Counsel: LP-03 12 19% 5 13%
Counsel: LP-04 2 3% 1 3%
Counsel: LC-02 2 3% 2 5%
Counsel: LC-03 1 2% 1 3%
When did you first join the Department?
Characteristics CAP (n=64) Survey Respondents (n=39)
Number % Number %
Less than a year ago 1 2% -- --
Between 1 and 5 years ago 17 27% 10 26%
Between 6 and 10 years ago 16 25% 8 21%
More than 10 years ago 30 47% 21 54%
Where do you work?
Characteristics CAP (n=64) Survey Respondents (n=39)
Number % Number %
ADMO 2 3% 2 5%
Finance – GLS 8 13% 5 13%
Finance – TCD 6 9% 5 13%
FINTRAC LSU 2 3% -- --
OSFI LSU 2 3% 1 3%
PSC LSU 8 13% 5 13%
TBS LSU 36 56% 21 54%
Types of services performed regularly or frequently in work for Portfolio LSU (table note i)
Characteristics CAP (n=64) Survey Respondents (n=39) (table note ii)
Number % Number %
Advisory services n/a n/a 31 79%
Litigation services n/a n/a 7 18%
Legislative drafting n/a n/a 4 10%
Other n/a n/a 1 3%
Table note i: Information not available for all CAP counsel in the survey sample.
Table note ii: Multiple responses allowed in survey; total sums to more than 100%.
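The representativeness comparison in Table 4 amounts to converting each group's counts to within-group percentages and inspecting the gap between the population and respondent distributions. A minimal sketch using the classification counts from Table 4 (function and variable names are illustrative):

```python
def percent_profile(counts: dict) -> dict:
    """Convert category counts to rounded within-group percentages."""
    total = sum(counts.values())
    return {k: round(100 * v / total) for k, v in counts.items()}

# Classification counts from Table 4 (CAP population vs. survey respondents).
cap = {"LP-01": 17, "LP-02": 30, "LP-03": 12, "LP-04": 2, "LC-02": 2, "LC-03": 1}
respondents = {"LP-01": 6, "LP-02": 24, "LP-03": 5, "LP-04": 1, "LC-02": 2, "LC-03": 1}

cap_pct = percent_profile(cap)            # e.g. LP-01 -> 27
resp_pct = percent_profile(respondents)   # e.g. LP-01 -> 15
gaps = {k: resp_pct[k] - cap_pct[k] for k in cap}
print(gaps)  # percentage-point differences by classification
```

The largest gaps (LP-01 under-represented, LP-02 over-represented) are visible directly in Table 4; the sketch simply makes the comparison explicit.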

3.4. Case Studies

Nine case studies of Portfolio files were conducted to explore best practices and lessons learned. The files included a mix of advisory, litigation support, and legislative services files: two each for the TBS LSU, Finance – GLS, and the PSC LSU, and one each for Finance – TCD, the FINTRAC LSU, and the OSFI LSU. For each case study, a file review template was completed. In addition, a total of 17 telephone interviews were conducted with LSU counsel in the Portfolio, counsel in other areas of Justice, and clients to supplement documented information and to allow for a more in-depth assessment of how each file was handled and of the effectiveness of the working relationships among the Portfolio, other areas of Justice (i.e., other LSUs, regional offices, and specialized sections within headquarters), and client representatives.

3.5. Limitations

The evaluation faced a few methodological limitations. These are listed below by line of evidence.

Review of documents and data. In the planning stages of the evaluation, it was anticipated that iCase data could provide information on trends in legal risk and complexity of CAP files, and that it could support an analysis of the effectiveness of process optimization efforts related to file assignment based on legal risk and complexity levels. However, few files were given numeric legal risk and complexity assessments in iCase. As a result, the evaluation could not use administrative data to address these evaluation issues, and had to rely on the perceptions of CAP counsel and clients.

Interviews, case studies, and the survey. The interviews with key informants and case study participants, as well as the survey of counsel, are subject to potential self-reported response bias and strategic response bias. Self-reported response bias occurs when individuals report on their own activities and may therefore want to portray themselves in the best light. Strategic response bias occurs when participants answer questions with the aim of influencing outcomes.

Mitigation strategy. The mitigation strategy for these methodological limitations was to use multiple lines of evidence drawn from different stakeholder groups, as well as different types of evidence. For example, the evaluation gathered information from the Portfolio as well as from clients. In addition, the evaluation used both quantitative and qualitative data collection methods to answer the evaluation questions. By triangulating findings from these different sources, the evaluation was able to strengthen its conclusions despite the limitations.