Understanding the Development and Impact of Child Advocacy Centres (CACs)

3. CAC Study—Methodology

Objectives

The Department identified the research objectives and questions, and contracted Proactive Information Services Inc. to conduct the study. The project’s goals were as follows:

  1. better understand how Canadian CACs are developing and operating;
  2. measure client satisfaction with CACs;
  3. measure client satisfaction with the criminal justice system’s process and outcomes; and
  4. measure how CACs meet the following FVS objectives:
    1. increasing access to victim services;
    2. enhancing capacity to deliver appropriate and responsive services to victims; and
    3. reducing financial and non-financial hardships for victims.

Research questions

Objective #1: Better understand how Canadian CACs are developing and operating

Research Question: What services are provided by the CAC and how does it operate?

  1. How long has the CAC been operating? At what stage is it in its strategic plan? What elements are left to be realized? What is the timeline for completion?
  2. What are the CAC’s objectives?
  3. How does the CAC operationalize its objectives?
  4. Where is the CAC located (e.g., neutral facility, court house, hospital)?
  5. What services are available on site?
  6. What are the CAC’s policies and procedures on the following:
    1. multi-disciplinary team response;
    2. child and family-friendly facilities;
    3. forensic interviewing;
    4. victim advocacy and support;
    5. specialized medical treatment and evaluation;
    6. specialized mental health treatment;
    7. training, education, and support for workers;
    8. community education;
    9. case training and review; and
    10. cultural competency and diversity?
  7. How do clients come into contact with the CAC (e.g., referrals)?
  8. How does the CAC’s internal referral process function (e.g., are referrals for medical examinations standardized/automatic or made on the judgment of other staff members)?
  9. What lessons learned and/or best practices can be shared with other CACs?

Research Question: What are the characteristics of cases at the CAC?

The information collected should include, but not be limited to, the following:

  1. incident characteristics and allegations;
  2. victim and family characteristics;
  3. characteristics of the accused;
  4. services referred by staff and accessed by victim and non-offending caregivers;
  5. investigation details (e.g., what examinations were performed and where? How was the forensic interview conducted and who was involved? How many times was the child interviewed?);
  6. charges laid (recommended);
  7. court outcomes;
  8. sentencing; and
  9. elapsed time.

Objective #2: Measure client satisfaction with CACs

Research Question: How satisfied were the child and his/her non-offending caregiver(s) with the services received through the CAC and the techniques/procedures that were used to deliver these services (e.g., forensic interviewing, referrals, locations of services, access to/availability of services, culturally sensitive services, and services available in language of choice)?

Objective #3: Measure client satisfaction with the criminal justice system’s process and outcomes

Research Question: How satisfied were the child and his/her non-offending caregiver(s) with the criminal justice system (e.g., the length of time it took to lay charge(s), the charge(s) laid, the court process, the length of the trial, the court decision, and sentencing)?

Objective #4: Measure how CACs meet the FVS objectives

Research Question:

  1. How does the CAC attempt to mitigate financial and non-financial hardships for the victim and his/her non-offending caregiver(s)?
  2. In the caregiver’s opinion, were the hardships reduced by the CAC? In what ways? What else could reduce hardships (e.g., enhanced capacity for the delivery of appropriate and responsive victim services)?
  3. To what extent has the funding received through the FVS enabled CAC capacity enhancements (e.g., additional staff, tools, knowledge and training, and access to information/resources)?
  4. How has the CAC increased access to services in response to victims’ needs and gaps in services (e.g., hours, location, types of services provided, culturally sensitive services, and translation/language of choice)?

How the study’s scope changed

The Department originally selected the following five sites to participate in the study:

  1. Caribou Child and Youth Centre;
  2. SeaStar CYAC;
  3. Koala Place CYAC;
  4. Project Lynx; and
  5. Regina Children’s Justice Centre.

CACs were selected to reflect a variety of governance structures and were chosen from different regions of the country. Originally, the Department proposed 600 interviews with child/youth victims and non-offending caregivers over a three-year period, as well as approximately 60 interviews with multi-disciplinary teams (MDTs) to understand how the CACs were operating.

However, because four of the five CACs were still in the development phase, the projected number of client interviews was not feasible. The study period was therefore extended to five years so that the CACs could become established and a sufficient number of interviews could be conducted (approximately 200, one-third of the original target of 600). The researchers also shifted their focus to documenting how the CACs were developing to meet clients’ needs. Consequently, researchers conducted 109 MDT interviews (with 125 individuals) rather than 60.

A sixth CAC, Sophie’s Place in Surrey, British Columbia, was also brought into the study in year three when it became evident that one of the five CACs would not be able to include interviews with victims and non-offending caregivers.

Data sources

Three main data sources informed this report:

  1. case file data from the CACs;
  2. client interviews (child/youth victims and non-offending caregivers); and
  3. MDT interviews.

Researchers also interviewed CAC stakeholders, including members of boards of directors and local politicians, and conducted a criminal justice system satisfaction survey. These tools formed the basis of a research and evaluation resource that the Department of Justice created for CACs in 2015.Footnote 24

1. Case file data

Caribou Child and Youth Centre collected data from 320 anonymized case files between January 1, 2014 and September 30, 2016 using an online Fluid Survey instrument developed by the Department. It reported data in Excel and SPSS formats. Data featured distinct files for victims, witnesses, and family members; and variables in the following domains: alleged offences, characteristics of victims and family, characteristics of the accused, services provided by the CAC, and number of forensic interviews.

SeaStar CYAC collected data from 511 case files between January 1, 2014 and September 30, 2016. It reported data in Excel format. It was limited to reporting aggregate data, as opposed to distinct files for victims, witnesses, and family members. Data featured variables in the following domains: case information, demographics, CAC participation, types of abuse, services, and outcomes.

Koala Place CYAC collected data from 319 anonymized case files between January 1, 2014 and September 30, 2016 using Fluid Survey. It reported data in Excel and SPSS formats. Data featured distinct files for victims, witnesses, and family members; and variables in the following domains: screening information, client information, incident information, case information, trial information, forensic interview, forensic medical exam, mental health clinical assessment, multi-disciplinary team case review, information provided to client/caregiver, court accompaniment, services provided to family members, closing file, and follow-up.

Project Lynx collected data from 82 anonymized case files between January 1, 2014 and September 30, 2016 using an Excel template. It reported data in Excel format. Data featured variables in the following domains: client information, services, case information, testimonial aids and other measures, and services provided to family members.

Regina Children’s Justice Centre collected data from 107 anonymized case files between January 1, 2014 and September 30, 2016 using Fluid Survey. It reported data in Excel and SPSS formats. Data featured approximately 79 variables in the following domains: screening information, client information, incident information, case information, trial information, forensic interview, forensic medical exam, mental health clinical assessment, MDT case review, information provided to client/caregiver, court accompaniment, services provided to family members, closing file, and follow-up.

Sophie’s Place CAC collected data from 470 anonymized case files between April 1, 2014 and September 30, 2016 using an Excel template developed by Sophie’s Place. It reported data in Excel format. Data included approximately 28 variables.

From all six sites, the researchers were able to identify 15 key variables to aggregate for the national report (Table 1).

Table 1: Case File Variables for National Reporting
Domain                    Variable                                    # of CACs reporting
Client                    Type                                        5
                          Gender                                      6
                          Age                                         6
                          Ethnicity                                   6
Services                  Referral source                             4
                          Joint investigation and forensic interview  6
                          Location of forensic interview              6
                          Interpretation required                     5
                          Advocate support                            5
                          Forensic medical exam                       5
                          Other referrals offered/accepted            4
Alleged offence/offender  Alleged offence                             6
                          Relationship to offender                    6
                          Age of alleged offender                     5
                          Outcome of police investigation             3

2. Client interviews

Researchers conducted 123 in-person interviews with 26 child victims (aged five to 11), 17 youth victims (aged 12 to 19), five adults who had been victims as children (i.e., deemed historical cases), and 75 non-offending caregivers (Table 2).

Table 2: Child/Youth and Non-Offending Caregiver Interviews by Site and Year
Year/Respondent  Caribou    SeaStar    Koala  Lynx          RCJC       Sophie’s Place  Total
2013/14
  Child          -          -          -      -             8 (5F/3M)  -               8 (5F/3M)
  Youth          -          -          -      -             3 (3F)     -               3 (3F)
  Caregiver      -          -          -      -             10 (10P)   -               10 (10P)
2014/15
  Child          -          2 (1F/1M)  -      2 (1F/1M)     -          -               4 (2F/2M)
  Youth          -          2 (1F/1M)  -      3 (3F)        -          -               5 (4F/1M)
  Caregiver      -          3 (3P)     -      8 (5P/2G/1C)  7 (7P)     -               18 (15P/2G/1C)
2015/16
  Child          4 (3F/1M)  -          -      -             3 (3F)     -               7 (6F/1M)
  Youth          -          -          -      2 (2F)        3 (3F*)    1 (1F)          6 (6F)
  Historical     -          -          -      -             2 (2M)     -               2 (2M)
  Caregiver      3 (3P)     2 (2P)     -      5 (4P/1G)     8 (8P)     1 (1P)          19 (18P/1G)
2016/17
  Child          5 (5F)     -          -      1 (1M)        -          -               6 (5F/1M)
  Youth          1 (1F)     -          -      2 (2F)        1 (1F)     -               4 (4F)
  Historical     -          -          -      -             3 (3F)     -               3 (3F)
  Caregiver      4 (4P)     6 (6P)     -      3 (3P)        8 (7P/1G)  7 (5P/2O)       28 (25P/1G/2O)
Total            17         15         -      26            56         9               123
F= female, M= male, P= parent, G= guardian, C= caregiver, and O= other relative

Proactive Information Services Inc. drafted the interview instruments, which were reviewed by the Department and CAC sites. Interview instruments were standardized (i.e., they included questions that were asked at all sites) and separate guides were developed for each of the respondent groups: child victims, youth victims, adult historical cases, and non-offending caregivers.

In addition, while the interviewers asked questions, children were invited to draw something that they liked about the CAC. At the end, the interviewer transcribed the child’s description of his/her drawing to ensure an accurate interpretation. Five drawings were relevant to the child’s CAC experience. While other children drew unrelated subjects (e.g., a family friend or the family dog), the activity nevertheless helped the children relax. To reimburse expenses associated with participating in this study (e.g., child care and parking), each family received $30.

3. MDT interviews

Researchers conducted interviews with 125 MDT members (Table 3). Although most MDT interviews were in-person, some were conducted by phone. Proactive Information Services Inc. drafted the interview instrument, which was reviewed by the Department and CAC sites. While the interview instrument was standardized, only questions relevant to each member’s role were asked. Executive directors, program coordinators, and/or victim advocates were interviewed multiple times at all sites.

Table 3: MDT Member Interviews by Site (2013 - 2017)
Site            Number of Interviews  Number of Individuals Interviewed
Caribou         20                    25
SeaStar         23                    25
Koala           10                    12
Lynx            15                    19
RCJC            19                    20
Sophie’s Place  22                    24
Total           109                   125

Researchers interviewed 11 executive directors, 18 program coordinators, 26 police officers, 17 child protection workers, 11 Crown prosecutors, two family support advocates, 16 victim services workers, five victim/witness coordinators, one Crown witness coordinator, four doctors, three therapists, two clinical supervisors, one mental health worker, two dog handlers, one court worker, one domestic violence coordinator, one board chair, three board members, one foundation CEO, and one executive assistant to a mayor.

Criminal justice system satisfaction survey

Proactive Information Services Inc. also developed a survey to measure client satisfaction with the criminal justice system, which was reviewed by the Department and CAC sites. One questionnaire was for children (aged five to 11) and their non-offending caregivers to complete together, while another was for youth victims (aged 12 to 19). CAC staff mailed surveys to clients following a justice system outcome, accompanied by a letter explaining the survey’s purpose and assuring anonymity. Respondents also received a postage-paid, addressed return envelope. However, only one survey was returned, due in part to the length of time it takes for a case to move through the criminal justice process to an outcome. Given the single return, researchers instead assessed clients’ satisfaction with the criminal justice system during interviews, wherever possible.

Limitations of the study

A number of limitations of the study are identified below, along with their mitigation strategies.

Although the purpose of the research was to provide an overview of CACs operating in Canada, it is important to note that the results are not generalizable beyond the six sites included in the study. However, the study offers valuable lessons learned and best practices that other CACs, in Canada and in other countries, can adopt.

A second limitation concerns the reporting of case file data. Since each CAC collected and reported its case file data differently, it was not possible to analyze all of the data across CACs. To mitigate this variability, the researchers and the Department identified 15 variables to analyze across the sites. Even for these 15 variables, some sites could not provide a complete accounting. In addition, because one CAC could report only aggregate case file data, researchers could not isolate victim-only data for that site. Finally, the Fluid Survey tools were inconsistent (e.g., different response categories for the same question). Although the case file data that can be reported at the national level is limited, it provides a picture of the types of cases referred to and handled by the six CACs.

A third limitation was that the youth interview instrument proved too lengthy and difficult to answer. As a result, the child and youth interview instruments were merged early in the study, which led to some missing data on youth victims. However, the integrated instrument produced more consistent and comparable data between children and youth.

A fourth limitation was the recruitment of client interviewees. One CAC declined to participate in client interviews but shared the results of a client satisfaction questionnaire it had developed.Footnote 25 Additionally, participation rates among victims and non-offending caregivers were low. Some families were excluded because the victim advocate felt they were too vulnerable to participate, while other families simply declined. Other factors included one CAC’s lengthy protocol for contacting families and delays in receiving a research ethics board’s approval to conduct interviews. In response, a sixth CAC was brought into the study and the study period was extended from three to five years. So although the study did not achieve the 600 client interviews originally projected, the 123 interviews with victims and non-offending caregivers over the five years provide valuable insight into the experiences of CAC clients.

A final limitation of the study was in assessing client satisfaction with the criminal justice system. Although a client satisfaction survey was developed and sent to clients once their case reached an outcome in the criminal justice process, only one survey was returned and included in the study. To address this limitation, the researchers asked questions about satisfaction with the process during the interviews. However, this captured satisfaction only at the beginning of clients’ experience with the system and did not capture their participation after their involvement with the CAC ended.

Research lessons learned

Relationships are important in multi-year projects: One member of the research team liaised with CACs throughout the project, which built and maintained trust and collaborative relationships. As a result, site visits went smoothly and information flowed easily in both directions.

Research involving child/youth victims and their families is challenging: Researchers were not able to interview as many victims or caregivers as first intended, or even as many as the revised target set during the study. Many families and one CAC declined to participate. Families coping with trauma may not want to be interviewed again. Researchers were most successful in obtaining interview participation when the victim advocate explained the purpose of the research to the caregiver and accompanied families to interviews (e.g., entertaining the child while the caregiver was interviewed).