Google is making several much-needed changes to protect the privacy of your kids. Not only will the company prevent ad targeting based on kids’ age, gender, or interests, but it will let kids request that their photos be removed from Images. Plus, Google will launch a privacy-focused Play Store category and enable Safe Search and privacy settings on kids’ accounts by default.
Yeah, it sucks to find out that Google has been tracking your kid to sell data to advertisers, and I’m not sure why kids’ Google accounts weren’t set to Safe Search from the get-go, but at least a change is coming. Google is even setting kids’ YouTube videos to private by default—a small detail most people would never think of!
But is this a compassionate stance from everybody’s favorite mega-corporation? In its blog post announcing these changes, Google states the following:
Some countries are implementing regulations in this area, and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens globally.
There’s your answer! It isn’t clear which countries or regulations Google is referring to, though the company was sued in the UK and EU last year over the “unlawful use of children’s data” and is a constant target for European lawmakers.
In an interview with TechCrunch, a Google spokesperson clarified that the company has “gone beyond what’s required by law” and that many of these changes “extend beyond” any regulations. That’s nice, but given that Facebook implemented similar child-protecting rules just last week, we have to ask whether Google is looking out for kids or looking out for itself. These changes are necessary, but they should have happened over a decade ago.