Tuesday, November 5, 2024

Google’s Search AI Makes Disgusting Recommendation for Parents of Toddlers


Should parents smear human feces on balloons to teach their kids a lesson? Google’s AI Overview says yes.

Google searches regarding toilet training tactics for children repeatedly returned suggestions from the company’s “AI Overview” search feature advising us to — apologies in advance — smear poop on a balloon.

Consider a search for “how to teach wiping poo during potty training,” a perfectly reasonable query for parents struggling with transitioning their kids away from diapers.

“Make it fun,” the AI told us in response. “You can try a balloon bathroom activity where you put a little poo on a balloon and tape it to a chair. Have your child practice reaching around to wipe the balloon.”

The “balloon” method of toilet training is a real method of showing kids how to wipe themselves after using the bathroom. Basically, the idea is that a parent puts a bit of fake waste, simulated with shaving cream or peanut butter, on one or two inflated balloons. They then tack the balloon or balloons to the back of a chair, and the child practices reaching around to wipe clean.

Emphasis on the shaving cream or peanut butter, because it would be obviously unsanitary and disgusting to use actual human feces for the exercise.

Unfortunately, Google’s AI Overview clearly didn’t get that memo.

Many related queries returned the same bizarre, unhygienic advice. Asking “how to teach wiping poo,” for instance, returned a similar suggestion.

“To make wiping fun, you can tape a balloon with a little bit of poo on it to the back of a chair,” the AI encouraged. “Have your child practice wiping the balloon as if they were on the toilet.”

And when we asked Google point-blank whether it was a “good idea to wipe poop on a balloon,” the AI didn’t miss a beat.

“Yes,” it said. It then broke down the balloon method in detail, telling us that we should “use a little bit of poop.”

“Start with a small amount of poop on the balloon,” it clarified.

All of these queries cited the same primary source: a 2022 YouTube video from an Australian pediatric occupational therapy practice, in which two therapists demonstrate how the balloon method works. After explaining that they’ll be using shaving cream, they cheekily refer to it as “poo” during the rest of the video.

“So all you’re going to need is a balloon, some shaving cream, some toilet paper, a chair, and just a little piece of sticky tape as well,” one of the Australian therapists explains to the camera. “So to start off with, I’m going to put some ‘poo’ on the balloon,” she adds, dabbing on a bit of shaving cream.

As any adult human can understand from the clip, when the person in the video uses the word “poo,” she’s not suggesting that parents use actual feces. Referring to the shaving cream as “poo” is just a way of explaining the training method and why it’s a useful technique.

But Google’s AI Overview completely missed that context, instead suggesting that parents go the literal — not to mention horribly unsanitary — route.

As such, the AI’s misunderstanding is a perfect example of how tech companies like Google often deploy still-unreliable AI: they roll it out to a huge number of users, it attempts to interpret complex information on the web and inevitably butchers it (remember when the same search AI recommended that users put glue on pizza and eat small rocks for their health?), and then the company manually fixes bad responses piecemeal as people point them out.

It’s a prime example of a large language model-powered AI tool mangling information as it parses it, missing important nuances in the source material. The result is a worse final output that offers no real service to the human searcher on the other side.

“I checked the source link and it’s a cutesy video about teaching your child to wipe properly by practicing with a balloon, toilet paper, and shaving cream,” a Bluesky user who first caught the AI’s error wrote in a Monday post. “AI just managed to make it so much worse.”

“The magic trick the AI companies pulled is pretty amazing,” another Bluesky user wrote back. “They took what amounts to ‘an interesting tech demo’ and turned it into a gigantic, world-consuming grift that produces nothing of value.”

In response to questions, a spokesperson for Google told us that “AI Overviews are dynamic,” and “in this specific instance, some overviews are missing quotation marks that would help contextualize the tip a bit better, while some include the term in quotes.”

The spokesperson also said that “AI Overviews are built to only show information that’s backed up by top web results with links so people can click through to learn more,” adding that the “accuracy rate” of the AI-embedded feature is “on par with” longtime Google Search features like Featured Snippets.

“When AI Overviews misinterpret language or miss some context,” the spokesperson continued, “we use those examples to improve, as we do with all Search features.”

After its AI told users to eat rocks and put glue on pizza, Google pared back the tool and removed certain responses. That was in May, but judging by its poop balloon suggestions, the AI still has a long way to go before it can reliably reason through contextually layered information.

The stakes for shoddy AI go beyond bad search results, too. The state of Nevada is currently in the process of launching a system, powered by Google’s AI, that will recommend which applicants should get unemployment benefits after losing their jobs.

More on Google’s AI Overview: Google Admits Its AI Search Feature Is a Dumpster Fire, Says It Will Scale Back the Tool
