Saturday, February 8, 2025

Lawmakers Demand Answers From Ad Tech Vendors Allegedly Monetizing CSAM


On Friday, Sens. Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) sent co-written letters to Amazon, Google, Integral Ad Science, DoubleVerify, the MRC and TAG notifying the companies that they have been identified as “responsible for serving or certifying” ads on pages hosting child sexual abuse material (CSAM).

The letters request answers to questions about how these ads slipped by online advertising safeguards and why the content itself went unidentified and unreported.

Blackburn and Blumenthal, along with legal authorities and child protection organizations in the US and Canada, were alerted to the issue by Adalytics in a report also published on Friday.

The background

Adalytics was working on a separate and unrelated project involving URLscan.io when it uncovered examples of programmatic ads running against CSAM.

URLscan.io is a scanning service that analyzes websites for malicious behavior and also logs the ads served on the pages it scans.
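To make that concrete, the sketch below shows how a researcher might query URLscan.io's public search API for prior scans of pages on a given domain. This is a minimal illustration, not Adalytics' methodology: the endpoint is the documented public search API, but the exact response fields ("task", "page", "url") are assumptions based on that documentation, and "example.com" is a placeholder rather than any site named in the report.

```python
# Minimal sketch: look up prior URLscan.io scans for pages on a given domain.
# Response field names ("results", "page", "url", "task", "time") are assumptions
# based on the public API documentation and may differ in practice.
import requests


def search_scans(domain: str, limit: int = 100) -> list[dict]:
    """Return recent URLscan.io scan results for pages on `domain`."""
    resp = requests.get(
        "https://urlscan.io/api/v1/search/",
        params={"q": f"domain:{domain}", "size": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])


if __name__ == "__main__":
    # "example.com" is a placeholder, not one of the sites named in the report.
    for result in search_scans("example.com"):
        page = result.get("page", {})
        print(result.get("task", {}).get("time"), page.get("url"))
```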

Adalytics identified an archive of CSAM on two free file-sharing websites: imgbb.com and ibb.co. These sites allow users to anonymously create links that host photos and videos. Users can also prevent links from being indexed by search engine crawlers and set links to auto-delete after a few days.

Both ibb.co and imgbb.com are familiar to the National Center for Missing & Exploited Children (NCMEC). In recent years, NCMEC has sent dozens of notifications to imgbb.com regarding hosted links containing CSAM, according to Adalytics.

The content and ads documented by Adalytics in the report were from 2021, 2022 and 2023. Some of the links had been set to auto-delete, while other CSAM content was removed by the sites after takedown orders were sent by US and Canadian authorities.

In the past, Adalytics has received pushback on some of its research, because the methodology it typically uses involves generating ad impressions to study in real time. For example, in 2023, Adalytics demonstrated how advertisers could end up on sanctioned Iranian sites or on pornography sites via the Google Search Partner Network in response to benign search queries. Although Adalytics produced screenshots, it was not clear how many people were actually using those sites to look for new jobs, say, or search for groceries.

The letters



The letters sent by Blackburn’s and Blumenthal’s offices demonstrate a knowledge of the ad tech ecosystem and are tailored to the recipients. These are not cookie-cutter questions.

For example, Amazon and Google are pressed on why their ad tech products (DV360, Google Ads and Google’s Performance Max get specific callouts) do not report page-level URL information for where ads are served. The letters also request details on the exact amounts the platforms have refunded advertisers or the United States government for ads served on the two sites.

In response to these accusations, Amazon released a statement: “We regret that this occurred and have swiftly taken action to block these websites from showing our ads. We have strict policies in place against serving ads on content of this nature, and we are taking additional steps to help ensure this does not happen in the future.”

A Google spokesperson, meanwhile, told AdExchanger that the company has “zero tolerance when it comes to content promoting child sexual abuse and exploitation and both of the accounts in question are terminated. Our teams are constantly monitoring Google’s publisher network for this type of content and we refer information to the appropriate authorities.”

For TAG and the MRC, Blackburn and Blumenthal want to know whether any vendors have previously reported CSAM content. Both groups are also being asked to document past examples of certifications that have been revoked or suspended for noncompliance, to explain what specifically they will consider in this case involving DoubleVerify and IAS, and to share “what specific audits, monitoring or oversight mechanisms” they employ to ensure certified companies comply with their standards.

DoubleVerify and Integral Ad Science are also being asked to provide URL-level transparency, as well as whether they have ever reported instances of CSAM content online.

“While the impression volume for our customers on this site was very small, we take this issue seriously,” according to a blog post in response by DoubleVerify. “The vast majority of these ads appeared alongside neutral content, in large part due to the pre-bid controls used by many of DV’s customers.”

The letter addressed to IAS CEO Lisa Utzschneider includes a particularly biting reproof.

“How does your company ensure comprehensive monitoring and vetting of websites in the ad supply chain?” the senators wrote. “Why were your systems unable to identify and block overtly unlawful websites hosting CSAM?”

Check your logs

The fact that ads were served against CSAM at all is a damning indictment of ad tech infrastructure.

Although some imgbb.com or ibb.co pages were deliberately cloaked and set to expire quickly, others were about as blatant as can be, with obvious CSAM text and images clearly visible on the page.

AdExchanger spoke with two brand marketers, two agency buyers and one paid media consultant who advised Adalytics and scoured their own log files for instances of ads served to imgbb.com or ibb.co.

Their consensus is that most of the ads observed being served to CSAM or adult pornography on imgbb.com or ibb.co were served by Amazon ad tech – which trafficked ads to the sites as a DSP and SSP – as well as by Google’s DV360 DSP.

To be fair, a long list of third-party ad tech vendors, including Criteo, TripleLift, Beeswax, PubMatic, Quantcast, Sharethrough, Zeta Global and Infillion, also allegedly served ads against explicit content, based on the archive identified by Adalytics.

But Amazon’s and Google’s ad tech is where the real problem lies.

Although, yes, third-party ad tech vendors were observed serving ads on imgbb.com, according to one agency buyer, it was “literally a few.”

Yet the fallout will likely be toughest for third-party ad tech, he said, because those vendors generally provide log files, even when the logs show that ads were served against explicit content, if not outright CSAM.

But the walled garden platforms “almost certainly served more – we just can’t tell,” the agency buyer said, because they don’t share log file data.

For instance, another agency buyer told AdExchanger they were able to document that The Trade Desk and other open DSPs they use had not served ads on imgbb.com or ibb.co, let alone against CSAM content, because they could see the log files.

But procuring that information for Google Performance Max campaigns is a major hassle, they said, and Amazon’s AI-based ad network, Performance+, only provides domain-level information, not URL-level detail.

In other words, an advertiser can see whether its ads ran on “imgbb.com,” but not specific known URLs hosted by the site or exactly how often a URL was targeted.
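As a concrete illustration of that gap, the hypothetical sketch below filters a log-level export for the flagged domains and tallies impressions at both the domain and URL level. The CSV layout and the "page_url" column are assumptions for illustration, not any DSP's actual log schema; the point is simply that URL-level logs let a buyer pinpoint specific pages, while domain-level reporting only shows that impressions ran somewhere on the site.

```python
# Hypothetical log-file audit, assuming a CSV export with a "page_url" column.
# The file name and schema are illustrative only, not any vendor's actual format.
import csv
from collections import Counter
from urllib.parse import urlparse

FLAGGED_DOMAINS = {"imgbb.com", "ibb.co"}


def audit_log(path: str) -> tuple[Counter, Counter]:
    """Count impressions on flagged domains, at domain and URL granularity."""
    domain_hits, url_hits = Counter(), Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            url = row.get("page_url", "")
            domain = urlparse(url).netloc.lower().removeprefix("www.")
            if domain in FLAGGED_DOMAINS:
                domain_hits[domain] += 1  # domain level: all that some platforms report
                url_hits[url] += 1        # URL level: what open DSP log files can show
    return domain_hits, url_hits


if __name__ == "__main__":
    domains, urls = audit_log("dsp_impression_log.csv")
    print("Impressions by flagged domain:", dict(domains))
    print("Impressions by specific URL:", dict(urls))
```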

As bad as bad gets

For programmatic advertisers, these latest findings by Adalytics are a somber reminder of the danger in thinking that you can truly solve a problem simply by checking a box.

One brand marketer and one agency buyer, for example, confirmed to AdExchanger that their ads had been served on imgbb.com and ibb.co in 2022 and 2023, including next to graphic content.

All of these impressions were marked as either “brand safe” or “brand suitable” by DoubleVerify or Integral Ad Science, even with post-bid brand safety tech in place, the marketer told AdExchanger.

No doubt many brands and agencies will respond to this news by becoming even more skittish about spending on the open web, said a former agency brand safety product leader who advised Adalytics on its report. Instead, budgets will shift to Instagram and TikTok, which are also plagued by instances and monetization of CSAM, he said, but which are rarely held accountable by advertisers.

Ad tech is no stranger to scandals, from fraud, bots and invalid traffic to made-for-arbitrage ad schemes and other programmatic pestilences. But missing the mark so badly on CSAM has more sobering consequences than wasted ad spend.

Several of the links reported by Adalytics that contain CSAM – and programmatic ad units – show children as young as four to six years old, according to NCMEC in the US and the Canadian Centre for Child Protection (C3P), which both reviewed the material.

One known missing child has been identified in the material by C3P, according to two sources familiar with the Adalytics report, and that identification is now part of an investigation.
