Times Watchdog stories dig deep to hold power accountable, right wrongs and create change. This work is made possible by The Seattle Times Investigative Journalism Fund. Donate today to support watchdog journalism in our community.

Middle school students across King County opened their school email, clicked a hyperlink to fill out a questionnaire and began confiding some of the most sensitive parts of their lives to the screen.

In Auburn, 22 students ages 10 to 12 reported using alcohol, marijuana, tobacco or vaping. In Northshore, 39 students said they were questioning their gender identity. In Seattle, 119 students said they had thought seriously about suicide in the past year.

Asked what they experience at home, some students typed paragraphs about their anguish. “please don’t share this with anyone cause my parents might get mad at me,” one student wrote.

The data collection is part of a roughly $30 million initiative funded by King County to screen middle- and high-school students for mental health, substance use and other risks. After screening more than 20,000 students over four years, county officials say the program is delivering on its promise, identifying previously unknown needs and saving student lives.

But no one tells kids or their parents that they have helped fine-tune a commercial screening tool whose accuracy hasn’t been rigorously tested. No one tells them that contractors evaluating the program have a broader research agenda for the screening tool. And no one tells them that the information schools gather, without student names but with potentially identifying information, could become public, an investigation by The Seattle Times has found.


The questionnaire collects students’ age, grade, race, language, gender identity, sexual orientation, school, and other details they include in open-ended responses — an unusually intimate record for a survey that isn’t anonymous.

This data is transmitted without student names to a for-profit software firm, which grants access to Seattle Children’s Research Institute — both contracted by King County. Schools, which can link student names to their answers, follow up with students to offer support.

With such sensitive information in hand, schools must determine whether kids’ screening results are part of their education record. If so, parents would have a right under federal law to see things that their children might not want them to know. If not, the information isn’t protected by a federal privacy law — raising the prospect that it could be released publicly and compromise student privacy, experts say.

The 12 school districts participating in screening disagree about whether the federal privacy law applies, but nearly all said that the data they share with King County has no information that could identify kids. To test how well schools protect this data, The Times requested it under Washington’s Public Records Act.

Only one district so far, Highline, declined to provide the data, saying it is part of students’ education records. Three districts provided anonymous student responses with many redactions, admitting that the data shared with King County and its contractors contains identifying student information. Five districts provided student responses with limited or no redactions.

Some students responded to open-ended questions with names of family members, friends and pets. (The Times did not attempt to identify any students.)


Even without names, each piece of demographic information — such as the grade, ethnicity and school of a student who reports being captain of the soccer team — can dramatically shrink the universe of students to the point that they could be identified, according to privacy experts.

“That feels to me like a significant gap that can result in real harm to students, especially considering the sensitivity of the questions that were asked and the responses that were elicited,” said Linnette Attai, a privacy consultant and author.

King County officials say that the data is “subject to robust security safeguards” and that they haven’t had any problems or received any reports of harm from students. The data is purged from a central database every summer, according to the county, though it acknowledged that schools can export and save their own copies.

Katie Rogers, a county spokesperson, said that the questionnaire “builds on proven approaches” and that four years of collecting data demonstrates the program’s effectiveness.

The program is voluntary and its risk is minimal, the county says, especially considering the crisis in youth mental health.

That crisis is well documented, with the suicide rate for kids ages 10 to 14 doubling from 2011 to 2020, according to the U.S. Centers for Disease Control and Prevention.


Some students have said that the screening shows their schools care about them. Schools have credited the program with revealing many cases of students who were suffering in private.

“For whatever reason, they’re not reaching out to adults, but they’re saying on the screen, ‘I need some support’ and we’re able to follow up with that, and then reach them early,” said Margaret Soukup, who coordinates the program for King County.

Interviews with 35 experts in screening questionnaires, research ethics and privacy yielded a complex mosaic of views on the program. Experts in screening youth generally lauded King County’s approach as innovative and thoughtful, while privacy professionals voiced alarm about the data collection. Several ethics scholars questioned whether the program crosses into research — despite a Seattle Children’s panel decision that it didn’t — which generally would require informed consent from students or their parents.

Others questioned why the county hired Seattle Children’s to evaluate the program when the institution has ownership rights to the screening tool at the heart of the program. That “should have raised some red flags,” said John Baumann, associate vice president for research compliance at Indiana University. “Why would they do that?”

A researcher and a conflict

As King County laid the groundwork for its new screening program in 2017, Soukup consulted an expert on the subject, a professor at Seattle Children’s Research Institute named Cari McCarty.

There are many free or low-cost questionnaires for screening kids on issues from mental health to substance use that have been scientifically validated for accuracy. McCarty suggested such a questionnaire, but King County wanted one that covered a broader range than the off-the-shelf options.


Flush with cash from the Best Starts for Kids levy — which voters last year expanded to $872 million over six years — and the Mental Illness and Drug Dependency sales tax, the county had ample means to build its own program.

King County fashioned its model on an approach known as SBIRT (Screening, Brief Intervention and Referral to Treatment, pronounced “ESS-birt”), which traditionally focuses on substance use. Soukup’s team settled on a questionnaire called “Check Yourself” that hasn’t been validated but is customizable.

McCarty knew Check Yourself well. She’d helped to create it.

Seattle Children’s owns the copyright to Check Yourself, according to a hospital lawyer’s email, and licensed it to Tickit Health, a Canadian firm that adapted the questionnaire for electronic use. McCarty herself stood to receive royalties from Tickit’s sales of Check Yourself, according to disclosures on academic papers.

“I am personally excited that you are recommending Check Yourself,” McCarty emailed Soukup in January 2018.

King County authorized two no-bid contracts: purchasing a subscription to Check Yourself from Tickit, and hiring McCarty’s team to evaluate it, for at least $240,000 and $588,000, respectively. As principal investigator, McCarty would effectively evaluate her own work.


McCarty said in an email, “I have never received any compensation in any form” related to the licensing of Check Yourself, and that she had given up rights to any future compensation. She didn’t elaborate or respond to interview requests or written questions.

Sandy Whitehouse, Tickit’s co-founder, said the company has never paid McCarty or Seattle Children’s “in connection with our work with King County.”

An initial evaluation published in 2020 highlighted apparent successes, mentioning McCarty’s role as a co-creator of Check Yourself but not disclosing any conflicts.

“They’re bound to be biased because they have a stake in the outcome,” Peter York, an expert in program evaluations as a principal at the consulting firm BCT Partners, said of the McCarty-led team. That could undermine a good program, he said, adding, “In all likelihood, a program like this, I think is going to make a difference.”

Seattle Children’s implemented a plan to manage McCarty’s interest in Check Yourself, and county officials were satisfied with it. In response to written questions, King County officials wrote, “We have full confidence in this evaluation model and the expertise of Seattle Children’s on this program.”

To evaluate the program, McCarty and her team submitted their plan to the Seattle Children’s Institutional Review Board (IRB), an oversight panel that looks out for the welfare of participants in research.


At the time, she was involved in at least four IRB-approved studies involving Check Yourself. The analysis for King County schools was different, according to McCarty’s plan, because her team was trying to improve the county’s program and not seeking to conduct research. The IRB was persuaded, but some experts disagree.

“They’re trying to learn whether it works, whether it is able to identify at-risk middle school students and is therefore a useful and appropriate tool to use in middle schools,” said Michael Carome, director of Public Citizen’s health research group and a former senior official at the U.S. Office for Human Research Protections. “Those are factors that strongly point to this being research.”

Casey Egan, a Seattle Children’s spokesperson, declined to respond to questions regarding the IRB’s determination.

McCarty and her team submitted a paper last year to an academic journal about their work in King County. “I am very surprised,” one reviewer wrote, that the authors’ work was “determined to not be human subjects research and IRB approval was not needed.”

“It is hard to walk the line”

As schools began administering Check Yourself in 2018, some raised questions about the evidence behind it.

“We are unsure if we are being asked to field test a research or diagnostic tool,” staff at Tukwila School District wrote in an initial progress report to King County.


Amid the pushback, Margaret Cary, a county physician, acknowledged in an email with colleagues in 2019 that “it is hard to walk the line of ‘experimenting’ on kids vs expanding the evidence base and trying to improve on existing efforts.”

Cary, who has since left King County, said that the program is “both evidence informed and an important innovation to fill a gap.” She added, “There was universal advocacy from multiple stakeholders that Check Yourself was the best fit.”

Based on students’ experiences, King County, Seattle Children’s and Tickit made a series of tweaks to Check Yourself. Among the changes: giving students the option not to answer some questions, tweaking language on sexual attraction, and removing questions about trauma that they decided shouldn’t be asked.

Many schools have warmed to the benefits of screening.

“We’ve been able to reach so many kids who otherwise might have fallen under the radar,” an Auburn School District counselor wrote.

Highline School District created a “Queer Connections” support group after using SBIRT data to uncover “an unsafe school environment for students who identify as LGBTQ+.”

In the Tahoma School District, a straight-A student confided in Check Yourself that he had thoughts of suicide. Following an intervention, the boy stopped harming himself.


Even as the program notches successes, schools have had to reckon with questions from parents about the content and the privacy of screening.

Parents in most cases receive a general notification about the program with the option to opt out, but not detailed information about sensitive questions on topics like gender and sexuality. Seattle Children’s explores these topics because of their links to health.

“I felt like it’s completely none of their business and totally irrelevant,” a Lake Washington School District parent, who asked that her name be withheld to protect the privacy of her kids, said of the gender and sexuality questions. She said that the screening quickly became the talk of the school, with students speculating about why their peers were pulled from class to meet with a counselor.

A parent in the Snoqualmie Valley School District went a step further: asking to see what records the school was collecting.

The Snoqualmie parent emailed the school principal in March 2020 to ask for all the child’s SBIRT records for a “data privacy research project,” according to correspondence the district provided in response to a public-records request.

School staff were not sure what to do. “We tell the students that this is not going to be shared with anyone, except for the counseling team,” one staffer emailed colleagues.


Snoqualmie Valley told The Times that student responses to Check Yourself are “not identifiable by name,” and “staff do not maintain records of the student responses.” Yet three months after the Snoqualmie parent asked for the SBIRT records, the district provided a copy.

A district spokesperson had no explanation for how officials provided a record they claim not to keep, but LeRoy Rooker did.

“They know they have to,” said Rooker, a former head of the U.S. Department of Education’s Family Policy Compliance Office, who along with three other education experts believes student responses to Check Yourself are part of students’ education record.

Of greater concern, he said, is that some schools provide the anonymous screening data publicly, believing that removing names is enough to protect student identities.

“If I were a parent,” he said, “I would be very disturbed by that.”


Students, for their part, have reported that Check Yourself is easy to use. Tickit Health says its questionnaires are designed — through language, colors and icons — to build trust with various audiences and elicit more accurate information, and the approach appears to be effective with some students.

“Thanks for making us do this survey,” a Tahoma student wrote in response to an open-ended prompt. The student had initially felt uncertain, but after opening the questionnaire, “I felt like I could tell you guys anything that’s bothering me.”

Engagement reporter Taylor Blatchford contributed to this article.

A deeper look at student screening, research and privacy

— — —

What’s the problem with schools publicly releasing student data if it doesn’t have their names attached to it? Is that the only risk to student privacy?

The U.S. Department of Education’s privacy guidelines require schools to remove “direct identifiers” — such as names — from data being released, but that alone “DOES NOT constitute adequate de-identification.” In other words, removing just the names from student-level data may not be enough to protect their identities. The U.S. Family Educational Rights and Privacy Act (FERPA) considers student data to be identifiable if “a reasonable person in the school community” without personal knowledge of the circumstances could “identify the student with reasonable certainty.”
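The re-identification risk experts describe can be made concrete. The sketch below, using entirely made-up records (no real student data), shows how each added demographic attribute — a “quasi-identifier” — shrinks the group of students a record could belong to, until some records match only one person:

```python
from collections import Counter

# Hypothetical, invented records: names removed, but quasi-identifiers
# (grade, ethnicity, school) retained, as in the screening data.
records = [
    {"grade": 7, "ethnicity": "Asian", "school": "Lakeview MS"},
    {"grade": 7, "ethnicity": "Asian", "school": "Lakeview MS"},
    {"grade": 7, "ethnicity": "White", "school": "Lakeview MS"},
    {"grade": 8, "ethnicity": "Asian", "school": "Cedar MS"},
    {"grade": 8, "ethnicity": "White", "school": "Cedar MS"},
]

def anonymity_set_size(records, keys):
    """For each record, count how many records share its exact values
    on the given quasi-identifier keys (the record's 'anonymity set')."""
    counts = Counter(tuple(r[k] for k in keys) for r in records)
    return [counts[tuple(r[k] for k in keys)] for r in records]

# Grade alone leaves each student blended into a group of peers...
print(anonymity_set_size(records, ["grade"]))  # [3, 3, 3, 2, 2]

# ...but combining attributes leaves three students unique (set size 1),
# and thus potentially identifiable by "a reasonable person in the
# school community" even with names removed.
print(anonymity_set_size(records, ["grade", "ethnicity", "school"]))  # [2, 2, 1, 1, 1]
```

In real districts the counts start far larger, but the mechanism is the same: open-ended answers mentioning a sports team, a sibling or a pet can narrow the set just as demographic fields do.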

Apart from school districts publicly releasing such data, even keeping it and using it to run the program presents certain privacy risks. One district, in response to a public-records request by The Seattle Times for documents that include the term “Check Yourself,” provided emails between its SBIRT specialist and therapists that discussed screening results for some students, including one’s suicidal thoughts.

There’s also the ever-present threat of cyberattacks. The U.S. Government Accountability Office found that thousands of students were affected by data breaches between 2016 and 2020. King County said in a 2020 audit that it is “very likely” that the county could experience a breach within five years.

What does it mean for a screening tool to be validated?

A validation study gauges how accurately a tool measures what it intends to measure. A validated screening tool should be able to accurately flag at-risk students and rule out those who aren’t at risk, much as a COVID-19 test detects the presence or absence of the coronavirus.

Organizations like the American Academy of Pediatrics and the National Association of School Psychologists recommend using validated screening tools, and it is considered a best practice by some experts in school screening. Other experts say it’s important for schools to use questionnaires that get at issues they really want to know about, rather than limiting themselves to validated tools. King County officials stress that their aim is to start conversations, not to make diagnoses.

The version of Check Yourself used in King County schools includes a handful of questions drawn or adapted from validated screeners (such as the GAD-2, for anxiety, and the PHQ-2, for depression); most of its questions come from other sources. Before the SBIRT program launched, Check Yourself was tested in a local randomized trial in King County, which used different questions and an older adolescent population.

Though that trial was not a validation study, a former county physician points to its data as relevant, saying that the school questionnaire included some of the same questions as the trial version and maintained the “general approach methodology.” The version of Check Yourself used in middle schools has been revised at least twice.

Given that an institutional review board (IRB) determined the SBIRT evaluation isn’t research, why is there debate over this? Why does it matter?

When IRBs examine whether a project is research, they are applying a technical and somewhat vague definition from the federal code: “a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge.” Exactly what constitutes “generalizable knowledge” isn’t defined, so there is a gray area that scholars have debated for years.

The team evaluating King County’s SBIRT program informed the IRB that the findings “will be used solely to inform program improvement for this SBIRT program and will not be generalizable beyond this project.” Still, some experts point out that part of the evaluation’s goal — to determine whether SBIRT is “an appropriate model for youth in middle school” — and findings about its promise have the potential to affect a much broader population than those who participated in the program, giving it the feel of research.

Quite a few scholars emphasized that disagreements are common in this field. As one expert put it, determining whether a project is research is “a continuum, not a dichotomy.”

Had an IRB determined the program met the definition of research, it would have required oversight that can be expensive and time-consuming for researchers. That oversight generally comes with the requirement that participants — or parents of minors — give informed consent, a high bar that often significantly limits who participates.

What changes might enhance protections for student privacy?

One interesting counterpoint to King County is Massachusetts, which passed a law in 2016 requiring schools to verbally screen students for substance use. The state uses a validated screening tool known as the CRAFFT-II. The same state law prohibits creating any record that includes information that could identify a student, a legal mandate that doesn’t exist in Washington.

Other experts concerned about data privacy suggested that King County school districts omit the school that students attend when transmitting data to Tickit’s central platform. That would effectively place anonymous students in a much larger population — the district rather than their school — and offer them greater privacy.

— Daniel Gilbert, Seattle Times staff reporter