Parents will now be able to have images of their children removed from Google search results, the company has said.
It came as Google announced a range of changes to child-safety measures across several of its products.
It will also remove “overly commercial content” from the children’s version of YouTube and change what kind of adverts can be targeted at under-18s.
Several major technology companies have introduced similar measures amid scrutiny from governments and safety advocates.
For example, Instagram has made under-16s’ accounts private by default, as it battles opposition to plans to introduce a children’s version of the app.
And in recent months, China has cracked down on teenage gaming habits and sued technology company Tencent over one of its messaging apps’ Youth Mode.
Google has itself come under fire for its YouTube Kids product, with US politicians calling it a “wasteland of vapid, consumerist content”.
And in a separate post to the YouTube blog, the company said it would now be changing its approach on “commercial content for kids”.
“We’ve never allowed paid product placements in YouTube Kids,” it said.
“In the coming weeks, we’ll also begin to remove overly commercial content from YouTube Kids, such as a video that only focuses on product packaging or directly encourages children to spend money.”
YouTube will also turn off autoplay by default for children, in both the Kids and mainstream apps, to prevent unsuitable videos playing automatically.
Allowing users to remove image-search results of children would afford “more control over their digital footprint”, Google said.
“Of course, removing an image from search doesn’t remove it from the web but we believe this change will help give young people more control of their images online,” it said.
The other changes include:
- stopping ad targeting based on children’s age, gender or interests
- preventing “age sensitive” types of adverts being shown to younger users
- changing the default privacy setting for videos uploaded by children to “the most private option”
- turning on the adult-content filter SafeSearch for minors
- preventing young people from using Location History, the feature that continuously tracks and logs a phone’s location
- new parental advice on the Google Play app store
“Some countries are implementing regulations in this area – and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens,” it said.
Many young people may lie about their age to circumvent such controls, however.
And age-verification measures are not widely used beyond the online sale of products such as nicotine and alcohol.
But planned legislation – such as the UK’s Online Safety Bill – may place the onus on the technology giants to ensure potentially harmful content cannot be accessed by minors.