BOSTON (AP) — Filmmaker Beeban Kidron, a member of Britain’s House of Lords, began advocating for online child protection after directing “InRealLife,” a 2013 documentary about kids and the internet. She has been a driving force behind a U.K. law, which takes effect Sept. 2, that sets a code of conduct for online services designed to shield the under-18 crowd.
The so-called Age Appropriate Design Code comprises 15 standards for ensuring that children’s best interests are the primary consideration in the design of online services. It is the first law of its kind, and because the internet is global, the tech sector is already responding as the one-year transition period expires. Violators face the same penalties as under the U.K. General Data Protection Regulation: fines of up to 4% of global annual revenue or 17.5 million pounds, whichever is greater.
Kidron has combined intense lobbying in Silicon Valley and Washington, D.C., with research and advocacy through the nonprofit she founded, 5Rights Foundation. She predicts we’re only just “in the foothills” of reform and expects more action from online services in the next few weeks.
The questions and answers in her recent interview with The Associated Press have been edited for length and clarity.
Q: You had a profound realization when you were making “InRealLife” that set you on this course?
A: The people who consider themselves the internet’s founders were proud of a vision to treat all users equally. But that meant treating a child as an adult. I saw that kids couldn’t cope with the adult world. Pornography and violence and unwanted contact were part of it. But it’s much more. It’s the hostility, the fake news, the popularity matrix, the commercialization and commoditization of childhood — a very reactionary and regressive force against the notion of children and childhood.
Q: U.K. Information Commissioner Elizabeth Denham, in advocating the systemic change the design code aims to bring, has said the digital world is not currently a safe place for kids to learn, explore and play. How does the code change that?
A: It says children must be guaranteed a higher bar of privacy and consideration. So, for example, you must not reveal their exact location. That’s dangerous for a kid. You also can’t economically exploit what you know about them (from surveilling their online activity). In the past few weeks we’ve seen some related action. TikTok and Instagram have stopped direct messaging by unknown adults to children under the age of 16. YouTube introduced age verification for adult content (among other changes).
Q: Facebook’s child-focused changes for Instagram — initially being applied in the U.S., Britain, Australia, France and Japan — include narrowing the scope of targeted ads that teens receive. It says they will now only get ads based on their age, gender and location. Is that enough? Instagram still collects data on teens’ social interactions online.
A: No, we want more. But I don’t think you can underestimate the huge shift that has occurred already. This is a very complex global industry and there will be many pieces of follow-on legislation. We’ve proven that the online world can be redesigned on principles.
Q: Facebook says it is going ahead with an Instagram for kids under 13 that won’t include ads, despite the objections of 44 U.S. state attorneys general. Is that OK with you?
A: No. I’ve told Facebook and a U.S. congressional subcommittee that I oppose Instagram for kids. It’s not that I don’t want kid-appropriate services. There aren’t enough of them. Kids need inventive and creative spaces to play and learn and socialize. And whilst the advances Facebook is making should be recognized, it has not proven itself a good babysitter. I think Facebook has got a little bit to go before they’re a trusted brand for children.
Q: What more should the industry do?
A: I welcome individual companies taking individual measures, but that is not the end game. Online services need to get tougher in protecting children — shielding them from direct messages by adults they don’t know, enforcing age restrictions on adult material and not surveilling children. We’re looking for industry norms of high privacy by default. I want to see a race upwards, not to the bottom, which is what we’ve seen for the last decade.