I’m a clinical psychologist, and my co-author, Steve, is a medical doctor at Stanford University. Like millions of people, we have digital assistants (e.g. Amazon Alexa, Google Home or Apple HomePod) in our homes. We use them to play music, to decide if we need an umbrella and to order pizza. But thinking about our patients, and the cases of sexual assault that have come to light in the news media recently, we can’t help but wonder what happens when the stakes are higher. What happens when a survivor of sexual assault says, “Alexa, I was raped”?

Though it may be surprising, many people do indeed ask home digital assistants about mental-health problems and assault. To the credit of their manufacturers, most will reply with a crisis hotline. This is a step in the right direction but leaves important privacy questions unanswered. First, we don’t know what is being recorded from devices that are always listening. Second, if there are recordings, we don’t know whether they are saved. Third, if they are saved, we don’t know who will have access in the future. The issue is muddled further if a minor is involved.

As a psychologist and a physician, we hear from our patients about their histories of sexual assault. After mustering the courage to share something so sensitive, patients can rely on established laws to protect their privacy. We document their accounts in the electronic medical record, which the patient or law enforcement can use later. In contrast, if a digital assistant learns about an assault, it is unclear what a user should expect to be saved or shared. If a sexual-assault survivor discloses to a digital assistant, no established procedures dictate what information will be available later if the survivor, authorities or even advertisers want it.

Even so, it is likely that traumatized survivors will continue to confide in their digital assistants. Studies have shown that, surprisingly, many feel more comfortable disclosing sensitive information to a machine than to people. A machine doesn’t judge, and survivors can ask for help in the privacy of their own homes without the unpredictable consequences of confiding in people. Moreover, similar research underlines the emotional benefits of disclosing traumatic personal experiences, and hearing a supportive response can feel good even if it comes from a machine.

Transparency and empowerment are crucial steps in healing sexual assault’s interpersonal wounds, and those same principles must also inform a policy response to digitally recorded sexual-assault inquiries. Companies should be transparent as to how often people disclose sexual assaults to digital assistants, in aggregate to avoid sacrificing individual privacy. When digital assistants are told about a sexual assault, companies should tell users which data, if any, will be saved (e.g. location, audio recording) and what special privacy protections will be employed. Lastly, users should know whether they can rely on any saved data being shared at their request during a subsequent criminal or civil investigation.


Technology companies can’t do this alone. They should work to empower survivor-advocacy groups to establish transparent best practices for data storage and sharing. They should also work with law enforcement authorities and policymakers. We propose a federal commission, led by the FCC (which helped launch the 911 emergency line), in collaboration with industry stakeholders and advocacy groups, with two aims. First, to solicit public comment on the appropriateness of common standards for digital assistants and sensitive conversations, such as sexual assault. Second, to generate specific technical and policy recommendations for digital assistants to identify and protect sensitive disclosures.

This issue is not going away, as more and more devices will no doubt collect increasing amounts of the most personal data. Sexual assault is only one example of the kind of sensitive information that we would all want protected properly. Future survivors deserve clear rules and better designs that minimize harm and leverage the benefits of a new age in which our devices are always listening.