Thursday, December 26, 2024

Google and Meta ran a secret ad campaign targeting teens


Photo: Sheldon Cooper/SOPA Images/LightRocket (Getty Images)

Google and Meta ran a secret ad campaign targeting teens, in violation of Google’s own rules, the Financial Times reports. The ads on YouTube were intended to bring more 13- to 17-year-olds to Instagram as TikTok’s dominance rises.

The companies had intended to expand the campaign abroad, but Google investigated and shut down the project after being approached by the Financial Times.

Google told Quartz that the campaign was “small in nature” but that it has “thoroughly reviewed the allegations with regard to circumvention of our policies” and is taking “appropriate steps.” Specifically, the company said it will refresh its training to ensure that sales representatives understand they are prohibited from helping advertisers target sensitive audiences. Google said it has a history of company initiatives aimed at protecting kids and teens online. Meta did not immediately respond to Quartz’s request for comment but denied wrongdoing in statements to the FT.

Why it matters

Meta has been under major scrutiny for its failure to protect teen users. CEO Mark Zuckerberg publicly apologized for those failures at a Senate hearing in January. While advertisements designed to get teens to use Instagram are a far cry from the much more extreme issue of “sexploitation” on Meta’s platforms, ads targeting teens can still cause harm and have been linked to negative health outcomes. And the industry at large is facing heat for profiting from advertising directed at children.

The U.S. Senate just passed legislation designed to hold technology giants accountable for any harm their platforms may cause minors. One of the bills, the Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, bans targeted advertising to minors and the collection of their data without consent. It gives parents and children the option to delete their information from social media platforms.

The second bill, the Kids Online Safety Act, requires tech companies to design online platforms in ways that mitigate or prevent harm to users, including cyberbullying, sexual exploitation, and drug use. The bill would require platforms to limit adult users’ ability to communicate with minors and to offer parental tools that allow guardians to manage minors’ privacy settings.
