Social media and game platforms often use recommendation algorithms, find-a-friend tools, smartphone notices and other enticements to keep people glued online. But the same techniques may pose risks to scores of children who have flocked to online services that were not specifically designed for them.

Now California lawmakers have passed the first statute in the nation requiring apps and sites to install guardrails for users younger than 18. The new rules would compel many online services to curb the risks that certain popular features — like allowing strangers to message one another — may pose to child users.

The bill, the California Age-Appropriate Design Code Act, could herald a shift in the way lawmakers regulate the tech industry. Rather than wade into heated political battles over online content, the legislation takes a practical, product-safety approach. It aims to hold online services to the same kinds of basic safety standards as the automobile industry — essentially requiring apps and sites to install the digital equivalent of seat belts and air bags for younger users.

“The digital ecosystem is not safe by default for children,” said Buffy Wicks, a Democrat in the state Assembly who co-sponsored the bill with a Republican colleague, Jordan Cunningham. “We think the Kids’ Code, as we call it, would make tech safer for children by essentially requiring these companies to better protect them.”

The state Senate passed the bill Monday evening by a vote of 33-0. The state Assembly had already approved a version of the bill. It now requires approval by Gov. Gavin Newsom, who has not taken a public stance on the measure.

The new rules tap into a national debate over the potentially deleterious effect that social media platforms may have on the mental health and body images of some young people.

Instagram in particular has come under heightened scrutiny. Last fall, members of Congress examined how the social network’s automated recommendation engine had served graphic images of self-harm to teenage girls as well as content promoting eating disorders to younger users. Soon after, President Joe Biden called for greater child safety on social media.

Some companies have faced criticism for exploiting children’s data. In 2019, Google and the operators of Musical.ly, the popular video-sharing app now known as TikTok, each agreed to pay multimillion-dollar federal fines to settle charges that they had illegally collected personal information from children without parental permission.

Federal regulators said Google had profited by using children’s data to target them with ads on YouTube. Separately, regulators complained that Musical.ly had made children’s profile photos and other sensitive details public by default, saying the practice could have enabled adult strangers to contact younger users.

Proponents of the California bill say the new rules should reduce such risks while promoting children’s autonomy and well-being online.

Critics in the industry say the legislation is overly broad and could subject many more online services than necessary to burdensome rules.

The scope of the California legislation far exceeds federal safeguards for youngsters online. A federal law, the Children’s Online Privacy Protection Act of 1998, narrowly protects the privacy of users younger than 13 — and then only when they use online services aimed at youngsters, such as children’s video apps.

California is already a pioneer in children’s online safety and data privacy, enacting protections over the last decade that dozens of other states have replicated. Now it has become the first state to pass a bill requiring general-audience sites and apps “likely to be accessed” by children to install basic protections for users younger than 18.

“Children should be afforded protections not only by online products and services specifically directed at them,” the statute reads, “but by all online products and services they are likely to access.”

The California bill would require online services for general audiences to proactively design their products and features to protect child users. In practice, that means apps and sites must analyze and mitigate the risks that their services may pose to minors — like exposing them to explicit content or using manipulative techniques to prod them to spend hours on end online.

The legislation also requires online services to turn on the highest privacy settings by default for minors. And it prohibits online platforms from collecting children’s precise locations without “providing an obvious sign to the child” while their whereabouts are being tracked.

The new rules, which would take effect in 2024, could prompt some online services to introduce nationwide changes, rather than treat minors in California differently.

The California statute takes many of its cues from Britain, where regulators put comprehensive online protections for minors into effect in 2021. British officials have said their effort, called the Children’s Code, was intended to set baseline safety standards, like preventing adult strangers from contacting children or disabling social media features that could show a child’s exact location on a map to other users.

Designers of the British code said they also wanted to limit manipulative practices — like barraging children with notifications at all hours or automatically playing videos one after the other — that could get young users hooked on social media and game platforms.

“We all as a society have to start actually setting a floor,” said Beeban Kidron, a member of the House of Lords who spearheaded the British effort and is the founder of the 5Rights Foundation, a digital rights group for children. “Let’s stop introducing adults to children or putting children on a map so you can see where they are. Don’t notify kids all through the night. Turn off autoplay.”

With the new British rules looming last year, YouTube, Instagram and other popular services bolstered their safeguards for younger users worldwide. Some said they had begun developing the product changes well before the British code took effect.

Last summer, YouTube said it would make uploads private by default for users ages 13 to 17 worldwide so only followers approved by teenagers may view their videos. It also turned off autoplay by default for minors.

TikTok said it had made all existing accounts registered to users 13 to 15 private by default, while Instagram has made new accounts private by default for users younger than 16. Snapchat, where all accounts are set to private by default, recently took steps to hinder adult strangers from interacting with younger users, as have Instagram and TikTok.

Google has turned on SafeSearch, a feature that can hide explicit search results, by default for users younger than 18 worldwide. It has also disabled location history for minors globally.

The California code could apply to many other online services that children are likely to use: game platforms, connected toys, voice-activated digital assistants and virtual reality apps. The bill could also affect popular education services like Google Classroom, a school assignment portal used by millions of children, whose privacy policy says it collects information about users’ locations.

Opponents of the children’s code said the wide mandate could pose problems for businesses. Among the most visible critics: the California Chamber of Commerce and TechNet, a tech industry association whose members include Amazon, Apple, Cisco, Google, Oracle, Pinterest, Snap and Meta, the social media giant formerly known as Facebook.

Industry groups pressed California lawmakers to narrow the bill’s definition of a “child” to a person younger than 16 — rather than a minor younger than 18. They also argued that the scope of the bill was too broad and its provisions too vague to carry out.

“The requirement that companies consider the ‘best interests’ of children is incredibly difficult to interpret,” TechNet and the Chamber of Commerce wrote in a letter to legislators in April. In a similar letter in June, industry groups said the bill’s broad focus on online services “likely to be accessed” by children would subject “far more websites and platforms than necessary” to the bill’s requirements.

Civil liberties experts raised concerns about another issue: consumer privacy. In particular, they warned that the requirement for general-audience sites to provide greater protections for children could lead to unintended consequences for adults.

“Such a system would likely lead platforms to set up elaborate age-verification systems for everyone, meaning that all users would have to submit personal data and submit to more corporate surveillance,” the Electronic Frontier Foundation, a digital rights group, wrote to legislators in April.

The News/Media Alliance, a trade group representing 2,000 publishers including The New York Times, has also lobbied for changes, saying the language of the bill could require newspapers and magazines to undertake costly changes like instituting age verification for online readers or creating different versions of articles for minors.

Legislators have made some changes to accommodate industry concerns. For one thing, they added a provision giving companies a grace period to fix violations after receiving notice from regulators. But the most disruptive aspect of the broad children’s online safety effort may lie in its “first, do no harm” philosophy. That proactive stance could usher in a new approach for regulating tech companies in the United States — even as it challenges the build-it-first-and-beg-forgiveness-later startup ethos of Silicon Valley.

Indeed, the bill explicitly instructs companies to “prioritize the privacy, safety and well-being of children” over commercial interests.

“We design playgrounds to be reasonably safe and a lot of fun,” said Baroness Kidron, the House of Lords member. “We design medicine to be reasonably safe and appropriate to your size. And we need to design the digital world to be reasonably safe and appropriate to your age anytime.”