Public Safety, Defence, and Immigration Portfolio Evaluation

3. Methodology

This section provides a brief description of the methodology used to evaluate the PSDI Portfolio.

3.1. Evaluation Approach

The approach to both the data collection and the analysis of evaluation findings reflects the nature of the activities undertaken by the PSDI Portfolio. Legal services provided by legal counsel operating within the operational framework of the Portfolio are expected to support the ongoing management and operations of a number of federal departments and agencies. Unlike more typical programs or initiatives, these services are not meant to rectify an identified need or gap, or to fundamentally alter the conditions of a targeted group of individuals or communities. Rather, in many scenarios, these services are provided so that client departments and agencies can proceed with programs and initiatives that respond to identified needs or priorities of the federal government. In that sense, it could be argued that PSDI activities assist federal departments and agencies in carrying out their respective mandates.

Consequently, the data collection and analysis carried out as part of this evaluation assess the extent to which client departments and agencies are receiving the support they require to successfully carry out their activities. For instance, the ability of the Portfolio to respond in a timely and consistent manner to client requests, and its ability to communicate legal risks in such a way as to allow client departments and agencies to make informed decisions, are some of the indicators of success that the methodological approach has attempted to assess. It is important to emphasize that the evaluation was never meant to assess the quality of legal opinions developed by legal counsel. This would be well beyond the scope of this evaluation, and would detract from its fundamental goal of determining the extent to which client departments and agencies are receiving the legal support they require.

Both the data collection and analysis conducted as part of this evaluation align with the overall framework provided by the federal government’s Policy on Evaluation, which expects the evaluation to support ongoing accountability, to inform government decisions on resource allocation, and to support the ongoing management and improvement of the program. Footnote 11

Finally, all research activities undertaken as part of this evaluation were administered in accordance with normal practices in the field of program evaluation, including the guidelines provided in the Code of Ethics and the Evaluation Standards of the Canadian Evaluation Society. Footnote 12

3.2. Research Methods

In order to assess the relevance and the performance of the Portfolio as detailed in the evaluation matrix included in Appendix A, the evaluation included a number of data collection activities that are described in this subsection.

3.2.1. Administrative file and document review

The administrative file and document review covered both internal administrative records and publicly available information, including the 2013 PSDI Portfolio Evaluation Framework.

The list of publicly available information includes Departmental Performance Reports, Reports on Plans and Priorities, Budget Speeches, and Speeches from the Throne.

3.2.2. Key informant interviews

During the initial stage of the evaluation process, two preliminary group interviews were conducted with representatives of the PSDI Portfolio. The purpose of these interviews was to obtain contextual information that would complement the written information available at that point in time.

Table 2: Distribution of Key Informant Interviews
Source Number of Interviews
DLSUs 24
Regional Offices 10
Client Departments 19
ADAG Office 6
Other stakeholders 9
Total 68

As part of the main data collection process, 68 interviews involving 83 individuals were conducted (Table 2). These included interviews with senior managers and staff from DLSUs, regional offices, client departments, the ADAG Office, and other stakeholders (central agencies, the Public Law Sector, and the Policy Sector).

All key informants first received a letter from Justice Canada, in both official languages, describing the purpose and nature of the research and inviting their participation. A follow-up was then conducted to confirm a time and date for the interview. Once the interview was confirmed, key informants received the interview guide in advance, allowing them to prepare accordingly. The findings from the key informant interviews were analyzed, with the support of NVivo software, to identify trends and divergences among the selected groups.

3.2.3. Survey of legal counsel

An anonymous and confidential survey of legal counsel from DLSUs and regional offices was conducted to obtain information on the Portfolio’s performance, key issues and impacts. The survey was conducted online for a total of three weeks. During this period, two reminders were sent to potential participants to increase the response rate. Using a stratified random sampling approach, 500 legal counsel received an email invitation that included a direct and protected link to the online survey questionnaire. As indicated in Table 3, 216 individuals completed the survey questionnaire, for a response rate of 46%.

Table 3: Survey methodology summary
Sample method Stratified random sampling
Survey method Online
Pretest date October 8, 2014
Survey dates Oct. 28–Nov. 14, 2014
Total invitations 500
Undeliverable 33
Net usable invitations 467
Completed surveys 216
Response rate Footnote 13 46%
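As a check on the figures in Table 3, the response rate is computed against net usable invitations (total invitations less undeliverable ones), not against the full invitation list. A minimal sketch using the values reported above:

```python
# Reproduce the response-rate arithmetic from Table 3.
invitations = 500     # total email invitations sent
undeliverable = 33    # invitations that bounced
completed = 216       # completed survey questionnaires

net_usable = invitations - undeliverable        # 467 net usable invitations
response_rate = completed / net_usable          # ~0.4625

print(net_usable)                   # 467
print(round(response_rate * 100))   # 46 (%)
```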

Once the survey was completed, open-ended responses were coded and the survey data was analyzed using SPSS, a statistical software package.

Table 4 includes a profile of respondents to the legal counsel survey. Overall, survey respondents are well distributed among the various subgroups of the Portfolio.

Table 4: Profile of survey respondents
Subgroups Number Percentage
DLSUs
CBSA 17 16%
CIC 14 13%
CSC 7 7%
CSE 4 4%
ND/CAF 21 20%
NSLAG 17 16%
PBC 2 2%
PS 7 7%
RCMP 17 16%
Subtotal 106 100%
Regional offices
Atlantic 11 10%
Quebec 19 17%
Ontario 17 15%
Prairies 29 26%
British Columbia 26 24%
Northern 8 7%
Subtotal 110 100%
Total 216  
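Note that the percentages in Table 4 are calculated within each subgroup (against the DLSU subtotal of 106 or the regional subtotal of 110), not against the overall total of 216 respondents. A short sketch using the DLSU figures above illustrates the calculation:

```python
# Within-group shares from Table 4: each DLSU percentage is taken
# against the DLSU subtotal, not the overall total of 216.
dlsu = {"CBSA": 17, "CIC": 14, "CSC": 7, "CSE": 4, "ND/CAF": 21,
        "NSLAG": 17, "PBC": 2, "PS": 7, "RCMP": 17}

subtotal = sum(dlsu.values())                               # 106
shares = {k: round(100 * v / subtotal) for k, v in dlsu.items()}

print(subtotal)           # 106
print(shares["ND/CAF"])   # 20 (%)
```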

3.2.4. Case studies

Ten case studies were conducted as part of the evaluation. The purpose of these case studies was to illustrate how the Portfolio works in practice, particularly how legal counsel and client departments and agencies collaborate in the management of legal matters. In order to ensure that solicitor-client privilege was respected, a Department of Justice employee reviewed the legal files and completed standardized data collection templates, which are included in Appendix C. The interviews were based on the summary provided in these documents.

Case studies were selected to ensure a meaningful representation of the litigation and advisory work done by the Portfolio, and included files involving multiple clients and the Office of the ADAG. These case studies focused largely on closed files and included a review of the case file, followed by key informant interviews with both the legal counsel involved in the case (from DLSUs, regional offices, and the ADAG Office, as applicable) and representatives from client departments and agencies.

3.3. Limitations

The evaluation faced a few methodological limitations, which are briefly described in this subsection.

3.3.1. iCase data

iCase is the Department’s web-based national application supporting the practice of law and the management of legal services provided by the Department of Justice Canada. For the purposes of this evaluation, iCase data was used to document a number of evaluation indicators and to measure several dimensions of the work performed by the Portfolio. In doing so, however, some limitations of the iCase data had to be considered.

First, DLSUs do not use a consistent approach to opening advisory files, particularly for files that require limited work: some legal counsel recorded this work under general advisory files, while others opened a new, specific advisory file. Second, the litigation support work provided by DLSUs does not appear to be recorded consistently; some DLSUs recorded this work under the advisory category, whereas others recorded it under the litigation category. Such inconsistencies in data collection practices across the Portfolio can undermine the overall usefulness of the data collected. Footnote 14

Third, data for legislative services only began to be collected in a comprehensive manner in 2010/11. In addition, late in the evaluation reporting process, the evaluation discovered a problem with the coding of hours on legislative files in at least one DLSU. As a result, the evaluation examined aggregate legislative data rather than data by department.

Finally, as a result of recent changes made to iCase, it was not possible at the time of the evaluation to access data on the level of risk and complexity of advisory files. To mitigate some of these challenges, the evaluation reports advisory work largely on the basis of hours worked, as opposed to the number of files actively managed, particularly when considering historical trends.

3.3.2. Interviews and survey

Interviews with key informants and case study participants, as well as the survey of PSDI legal counsel, are subject to self-reported response bias and strategic response bias. Self-reported response bias occurs when individuals reporting on their own activities portray themselves in the best light. Strategic response bias occurs when participants answer questions with the intent of influencing outcomes. To mitigate this limitation, multiple lines of evidence were used. In particular, the evaluation included client perspectives through key informant and case study interviews, in addition to Client Feedback Survey performance data.

3.3.3. Selection of case studies

During each year covered by the evaluation, the Portfolio managed well over 20,000 active files. In this context, it was not feasible to select a representative sample of files for case study purposes. Instead, the evaluation relied on the guidance of a working group to select files representing various dimensions of the work performed by the Portfolio. Here again, the mitigation strategy consisted of using multiple lines of evidence and of ensuring that case study findings were properly contextualized and used for illustrative purposes.