Q: What’s the problem with schools publicly releasing student data if students’ names aren’t attached to it? Is that the only risk to student privacy?

A: The U.S. Department of Education’s privacy guidelines require schools to remove “direct identifiers,” such as names, from data being released, but that step alone “DOES NOT constitute adequate de-identification.” In other words, removing just the names from student-level data may not be enough to protect students’ identities. The U.S. Family Educational Rights and Privacy Act (FERPA) considers student data to be identifiable if “a reasonable person in the school community” without personal knowledge of the circumstances could “identify the student with reasonable certainty.”

Apart from the risk of school districts publicly releasing such data, simply keeping the data and using it to run the program presents privacy risks. One district, in response to a public-records request by The Seattle Times for documents that include the term “Check Yourself,” provided emails between its SBIRT specialist and therapists that discussed screening results for some students, including one student’s suicidal thoughts.

There’s also the ever-present threat of cyberattacks. The U.S. Government Accountability Office found that thousands of students were affected by data breaches between 2016 and 2020. King County said in a 2020 audit that it is “very likely” the county will experience a breach within five years.

Q: What does it mean for a screening tool to be validated?

A: A validation study gauges how accurately a tool measures what it intends to measure. A validated screening tool should reliably flag students who are at risk and rule out those who aren’t, much as a COVID-19 test detects the presence or absence of the coronavirus.


Organizations like the American Academy of Pediatrics and the National Association of School Psychologists recommend using validated screening tools, and some experts in school screening consider doing so a best practice. Other experts say it’s more important for schools to use questionnaires that get at the issues they really want to know about than to limit themselves to validated tools. King County officials stress that their aim is to start conversations, not to make diagnoses.

The version of Check Yourself used in King County schools includes a handful of questions drawn or adapted from validated screeners (such as the GAD-2, for anxiety, and the PHQ-2, for depression), but most of its questions come from other sources. Before the SBIRT program launched, a local randomized trial in King County involving Check Yourself used different questions for an older adolescent population.

Though the trial was not a validation study, a former county physician points to its data as relevant, saying that the school questionnaire included some of the same questions as the trial version and maintained the “general approach methodology.” The version of Check Yourself used in middle schools has been revised at least twice.

Q: Given that an institutional review board (IRB) determined the SBIRT evaluation isn’t research, why is there debate over this? Why does it matter?

A: When IRBs examine whether a project is research, they are applying a technical and somewhat vague definition from the federal code: “a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge.” Exactly what constitutes “generalizable knowledge” isn’t defined, so there is a gray area that scholars have debated for years.

The team evaluating King County’s SBIRT program informed the IRB that the findings “will be used solely to inform program improvement for this SBIRT program and will not be generalizable beyond this project.” Still, some experts point out that part of the evaluation’s goal (to determine whether SBIRT is “an appropriate model for youth in middle school”) and its findings about the program’s promise could affect a much broader population than the students who participated, giving the project the feel of research.


Several scholars emphasized that such disagreements are common in this field. As one expert put it, determining whether a project is research is “a continuum, not a dichotomy.”

Had an IRB determined the program met the definition of research, it would have required oversight that can be expensive and time-consuming for researchers. That oversight generally comes with the requirement that participants — or parents of minors — give informed consent, a high bar that often significantly limits who participates.

Q: What changes might enhance protections for student privacy?

A: One interesting counterpoint to King County is Massachusetts, which passed a law in 2016 requiring schools to verbally screen students for substance use. The state uses a validated screening tool known as the CRAFFT-II. The same state law prohibits creating any record that includes information that could identify a student, a legal mandate that doesn’t exist in Washington.

Other experts concerned about data privacy suggested that King County school districts omit which school a student attends when transmitting data to Tickit’s central platform. That would effectively place anonymized students in a much larger population (the district rather than their school) and offer them greater privacy.