Google Introduces New Child Safety Features To Protect Minors

After Apple and Facebook, Google has now introduced new child safety features on its platforms. While Facebook made changes to its advertising algorithms, and Apple is going to use automated tools to scan for child sexual abuse material (CSAM), Google’s changes are more widespread and include changes to its policies. The company will limit the advertising capabilities of those targeting children, and will also change its products to limit children’s exposure to explicit content. Most of Google’s new policies will come into effect in the ‘coming weeks’, though Google didn’t share an exact timeline.

The company said it will now allow anyone under the age of 18, or their parents or guardians, to request the removal of their images from Google Image search results. “Of course, removing an image from Search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online,” noted Mindy Brooks, General Manager, Kids and Families at Google, in a blog post.

The company is also making changes to YouTube, Search, the Google Assistant, the Google Play Store and Location History on Google Accounts. It will set the default upload setting for YouTube videos to private for users aged 13 to 17, and will “prominently surface” digital wellbeing features, which encourage users to cut down on their screen time.
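Google hasn’t described how this new default is implemented internally, but the effect is the same as an upload whose privacy status is explicitly set to private. The sketch below is an illustration only, using the public YouTube Data API (v3); the OAuth token file, video file and metadata are placeholders, not anything from Google’s announcement.

```python
# Illustrative sketch only: uploading a video with its privacy status
# explicitly set to "private" via the public YouTube Data API v3.
# The OAuth token file, video file and metadata are placeholders.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

creds = Credentials.from_authorized_user_file("oauth_token.json")  # placeholder credentials
youtube = build("youtube", "v3", credentials=creds)

request = youtube.videos().insert(
    part="snippet,status",
    body={
        "snippet": {"title": "My video", "description": "Uploaded as private"},
        # "private" mirrors the default Google says it will apply to
        # uploads from users aged 13 to 17.
        "status": {"privacyStatus": "private"},
    },
    media_body=MediaFileUpload("video.mp4", resumable=True),
)

# Standard resumable-upload loop used with the google-api-python-client.
response = None
while response is None:
    _, response = request.next_chunk()
print(response["id"], response["status"]["privacyStatus"])
```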

YouTube will also have “safeguards and education” about commercial content, and Google will turn off the autoplay feature for children. Autoplay keeps YouTube videos playing continuously, moving from one video to the next based on algorithmic recommendations. Users under 18 will also get take-a-break and bedtime reminders when spending time on YouTube.

Google Search will have SafeSearch turned on by default for users under the age of 18, as long as they are signed into their Google account. The feature filters out explicit content that may otherwise show up in search results. It will also be applied to the Google Assistant and to the Google Workspace accounts of K-12 institutions.
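The announcement concerns the default for signed-in users under 18, but Google already exposes the same SafeSearch filter as an explicit option on its public Custom Search JSON API. As a rough illustration of the filtering behaviour, the sketch below requests it explicitly; the API key and search engine ID are placeholders.

```python
# Illustrative sketch only: explicitly requesting SafeSearch filtering via
# Google's public Custom Search JSON API. The API key and search engine ID
# (cx) are placeholders; the announced change concerns Google-account
# defaults, not this API.
from googleapiclient.discovery import build

service = build("customsearch", "v1", developerKey="YOUR_API_KEY")  # placeholder key

results = (
    service.cse()
    .list(
        q="example query",
        cx="YOUR_SEARCH_ENGINE_ID",  # placeholder custom search engine ID
        safe="active",  # "active" enables SafeSearch filtering; "off" disables it
    )
    .execute()
)

for item in results.get("items", []):
    print(item["title"], item["link"])
```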

Google will not allow children under the age of 18 to turn on Location History on their devices, as long as they have supervised accounts. Supervised accounts let parents add a child account under their own, giving them a level of control over the child’s online activity. Location History is the feature that lets Google track virtually every place a user visits for advertising purposes.

Topping off the changes to Google’s apps is a new safety section, which Google announced last week. This is similar to Apple’s app transparency rules and requires app developers to disclose what kind of user data they access and how they use it.

Like Facebook, Google is also limiting the kinds of advertising activities that can be performed using data from kids’ accounts. The company said it will block ad targeting based on the age, gender or interests of people under 18. This approach is somewhat different from Facebook’s, which allows only age, gender and location data to be used by advertisers targeting children on its platform. The blog post said these changes will arrive in the “coming months”.
