I had no problem taking out the vaping couple on a beach towel. They’re in the background of one of my vacation photos, and they’re really mucking up the vibe. The Google Pixel’s Magic Editor feature takes them right out with a few taps, and you know what? I felt fine about that. It’s drawing the line that I’m having a hard time with.
My taken-on-an-iPhone, edited-on-the-Pixel-8-Pro vacation photos are just the kind of thing Google’s generative AI editing features are designed for. We went to a beach on Lake Michigan for sunset, and I had an adorable photo of my child sitting on my husband’s shoulders. It’s the kind of moment you want to crystallize and keep in a jar for the whole rest of your life.
Except for the couple of dinguses in the background. Three taps in Magic Editor, and they’re gone. But while I’m there, I start to consider the other stuff in the scene I could change. How about the handful of cars in the parking lot behind them? What about the trash can in the distance? Maybe I could emphasize the glow of the sunset a little more?
I messed around with the AI tools and discovered that, yeah, I can do all of that stuff. But could I even call it a photo of our vacation with all of those changes? Or have I tipped into “it’s a memory, not a photo” territory? That’s when I got kind of queasy and noped right out of the app.
Things are about to get even weirder. The Pixel 9 series launches on August 22 and will come with generative AI tools on a whole new level that let you “reimagine” entire portions of your photo. You’ll be able to use AI to add objects and scenery to images with text prompts or get everyone into a group photo by merging two different frames. You won’t just be able to tweak the background and the lighting of your vacation photos; you’ll be able to change the location entirely. Scrubbing out a few parked cars is nothing compared to what’s coming in a matter of days.
Like me, not everyone will have the stomach for it. In fact, some people are running in the opposite direction as fast as they can. iPhone camera app maker Halide just released a new mode called Process Zero that skips the AI and multiframe processing, rolling back the clock to the early days of phone cameras, before computational photography. And Gen Z is fueling something of a vintage digital camera revival, seeking a grittier, lo-fi aesthetic that you don’t get from a modern phone camera app optimized to boost shadows, pump the saturation, and brighten faces.
Personally, I’d rather stick with the native camera app and get the most I can out of every pixel. But it’s a telling reaction to the prevailing AI-heavy technology, and it’s not unlike the backlash to Google’s recent Summer Olympics faux pas. The company put out an ad that featured a father using Gemini to help his daughter write a fan letter to her track star idol. That didn’t sit well with a lot of people, who argued that actually writing a letter like that is sort of the point. Google eventually pulled the ad.
Imperfection is sometimes the point
The thing is, imperfection is sometimes the point. Putting in the work to write a heartfelt letter, one word after another, is what makes it meaningful. Smoothing out the edges takes something essentially human away from the final product. I think Gen Z’s gravitation toward disconnected, “dumb” digital cameras reflects a similar impulse. When everything looks too good, it feels less personal.
As with digital processing, we’re all going to find our own comfort level with generative AI photo edits because these tools definitely aren’t going away anytime soon. And for some kinds of photos, I guess I do like having the option of brushing out a distraction in the background. But I don’t need every photo to look polished and ready for the Christmas card, the same way I wouldn’t write a letter to a friend the way I’d write, I don’t know, a college application essay. Sometimes a little bit of grit is just perfect.