Weeks after Instagram rolled out increased protections for minors using its app, Google is doing the same for its services, including Google Search, YouTube, YouTube Kids, Google Assistant, and others. The company this morning announced a series of product and policy changes, some that will give younger users more privacy and protection online and others that will limit ad targeting. The changes in Google's case are even more expansive than those Instagram announced, as they span an array of Google's products instead of being limited to a single app.
Though Congress has been pressing Google and other tech companies on the negative impacts their services may have on children, not all changes being made are being required by law, Google says. “While some of these updates directly address upcoming regulations, we’ve gone beyond what’s required by law to protect teens on Google and YouTube,” a Google spokesperson told TechCrunch. “Many of these changes also extend beyond any current or upcoming regulation. We’re looking at ways to develop consistent product experiences and user controls for kids and teens globally,” they added. In other words, Google is building in some changes based on where it believes the industry is going rather than where it is right now.
On YouTube, Google says it will "gradually" start adjusting the default upload setting to the most private option for users ages 13 to 17, limiting the visibility of videos to only the user and those they directly share with, not the wider public. These younger teen users won't be prevented from changing the setting back to "public," necessarily, but they will now have to make an explicit and intentional choice when doing so. YouTube will then provide reminders indicating who can see their video, the company notes. YouTube told us the changes would only apply to new uploads — it will not retroactively set any existing videos to private.
YouTube will also turn on its "take a break" and bedtime reminders by default for all users ages 13 to 17 and will turn off autoplay. Again, these changes are related to the default settings; users can turn off the digital well-being features if they choose. On YouTube's platform for younger children, YouTube Kids, the company will also add an autoplay option, which is turned off by default, so parents will have to decide whether or not they want to use autoplay with their children. Later, parents will also be able to "lock" their default selection. The change puts the choice directly in parents' hands after complaints from child safety advocates and some members of Congress, who suggested such an algorithmic feature was problematic.
YouTube will also remove "overly commercial content" from YouTube Kids, following increased pressure from consumer advocacy groups and childhood experts, who have long argued that YouTube encourages kids to spend money (or rather, beg their parents to do so). How YouTube will draw the line between acceptable and "overly commercial" content is less clear. Still, the company says it will, for example, remove videos that focus on product packaging — like the famous "unboxing" videos. This could impact some of YouTube's more prominent creators of videos for kids, like multimillionaire Ryan's Toy Review.
In addition to product packaging, it also says it will remove any content that "incites viewers to buy a product" and "content focused on the excessive accumulation or consumption of products."

Elsewhere on Google, other changes impacting minors will also begin rolling out. In the weeks ahead, Google will introduce a new policy allowing anyone under 18, or a parent or guardian, to request the removal of their images from Google Image search results. This expands upon the existing "right to be forgotten" privacy policies in the E.U. but will introduce new products and controls globally for kids and teenagers. The company will also make several adjustments to user accounts for people under 18.