Google says it plans additional privacy measures to protect teenage users on YouTube and its search engine, becoming the latest technology giant to adopt tougher standards in the face of criticism that companies are not doing enough to protect children.

In a blog post Tuesday, Google said that videos uploaded to YouTube by users 13-17 years old would be private by default, allowing the content to be seen only by the users and people they designate.

Google will also start to allow anyone younger than 18, or a parent or guardian, to request the removal of that minor’s images from Google Image search results, the company said. It is unclear how easy or responsive this process will be, given Google’s historical reluctance to remove items from search results.

In addition, Google said it would turn off location history for all users younger than 18 and eliminate the option for them to turn it back on.

The company plans to roll out the changes in the “coming weeks,” it said.

There is growing bipartisan support in Washington to press technology companies to do more to protect children. In the past few months, two pieces of legislation, one in the House and one in the Senate, have been introduced that seek to update the Children’s Online Privacy Protection Act. The 1998 law, known as COPPA, restricts the tracking and targeting of children younger than 13, and the bills would extend those protections to teenagers.

Google has repeatedly faced scrutiny over its handling of data related to children. In 2019, it agreed to pay a $170 million fine for violating COPPA by collecting children’s data without parental consent.

Google’s announcement comes on the heels of changes unveiled last month by Facebook to protect teenage users on Instagram. Among the changes to its advertising and privacy policies, Instagram said it would make accounts created by children younger than 16 private by default.

Both Facebook and Google said they were limiting the ability of marketers to target teenagers with advertising, but in slightly different ways. Facebook said advertisers would be able to target people younger than 18 based only on their age, gender and location — and not on their interests or their activity on other apps and websites.

Google said it would block personalized ads targeted at people younger than 18 based on their age, gender or interests. It will still allow ads based on context, such as a person’s search requests.