For the better part of a decade, Google has owned the mobile photography space. Arguably since the very first Pixel in 2016, and certainly by its successors a year later, Google’s in-house smartphone ambitions felt born out of a desire to shake up what a standard smartphone camera could do. Even as sensors were reused year after year, Google found the right touches in its software and in the processing applied to every single shot, leading to a world where Apple and Samsung seemingly couldn’t compete.
In 2025, though, I have to admit I’m starting to sour on Google’s current camera quality. There are plenty of reasons why the Pixel lineup no longer feels indisputably at the top of its photographic game, ranging from a lack of manual control over how my photos look to a crutch-like reliance on Ultra HDR processing. As much as I might like Google’s overall flavor of Android, I’m just not sure I see the camera as a real selling point anymore.
Google’s obsession with AI-powered perfection is really turning me off
Hey Google, where’d my contrast go?
Let’s be clear about one thing: I don’t think the Pixel 9 series — regardless of model — takes bad photos. When I first reviewed the Pixel 9 Pro last August, I (rightfully) heaped plenty of praise onto its camera system:
Looking at my earliest samples taken for this review, photographs taken on Pixel remain my absolute favorite captured by a smartphone. Google’s devices are a reliable way to point, shoot, and capture memories, which can’t be said about every smartphone. I don’t think of myself as a particularly strong photographer, but the Pixel 9 Pro makes me feel like one.

Looking back through my photo samples, I can completely understand how I reached that conclusion. From capturing the wedding of two close friends to taking photos in the park at dusk, I’ve taken plenty of really excellent photos with the Pixel 9 Pro (and, to a lesser extent, the Pixel 9 Pro Fold, which holds up well against its foldable competition). But despite all of the hype surrounding Google’s computational photography skills and AI-assisted tools, I can’t remember the last time I saw a major improvement in how Google’s photos actually look.
Google has always taken a slow approach to hardware upgrades — it essentially used the same camera sensor from the Pixel 2 through the Pixel 5, with nothing but the most minor of changes. In the 2010s, relying on better algorithms to process what that exact same sensor captured was more than enough, especially with photography wizard Marc Levoy at the wheel. Levoy left Google just shy of five years ago, in March of 2020, a very chill month in which so much was happening that it took until May for anyone to even notice.
I’ll make it easy on you — here are my Pixel 9 Pro photo samples, as seen in last year’s review.
We’ve seen a handful of hardware changes since then, but none big enough or meaningful enough to really have an impact on how end users think about the Pixel. It’s still the “best smartphone camera you can buy,” which means a very different thing in the US than it does in the vast majority of the rest of the world. Instead, everything now is about AI.
Since the launch of the Pixel 6 series, we’ve seen Google push all of its photographic chips in on the concept of machine learning. Each step of the process has pushed the boundaries of what we are or are not comfortable with when it comes to automated photo edits. Want to erase something in the background of your shot? Magic Eraser has your back. Want to move someone closer into the group? Magic Editor takes things to the next level. Want to ditch the silly faces your kids made while riding a merry-go-round at Disney? With AI, anything is possible, so long as every photo looks as sterile as possible.
Go back and read the rest of the camera section in my Pixel 9 Pro review, everything I wrote under the blurb I included above. Google’s thrown so much at the wall this generation that I barely have time to talk about how the actual photos look before I have to dive deep into some of the very tools I laid out above. Meanwhile, when it comes to the overall quality of the images, they’re… fine? Certainly in the upper echelon of what you can find at your local Verizon store, but a far cry from the leaps and bounds by which Google outpaced its competition in the earliest days of Pixel.
While it’s easy to blame Google’s focus on AI for this change — it’s certainly to blame for plenty of other problems — I also think the company, like Apple and Samsung, has decided that crafting as flat an image as possible is what consumers want. That’s how you end up with shots where the shadows are lifted so aggressively that they look nearly as bright as the sunlight around them. Sure, those images are fine for posting to social media in one of your many photo dumps, but they lack the natural contrast and depth that make a truly great photo pop.
At night, things get even worse, as Google’s processing chooses to brighten entire scenes rather than work with the existing light sources. Yes, there’s technically more detail than ever — it’s not hard to zoom in on a night shot and find details in bushes or leaves that you probably wouldn’t have seen from your Pixel 2 — but much of the time, the final result ends up looking overprocessed and fake. Comparing photos from the Pixel 9 to its predecessors, I can’t deny that Google’s image quality has obviously improved. But do I like these shots more than what I was getting in 2017, 2018, and even 2019? Not particularly.
Google can win me back, but it’s going to take effort
And I’m not sure the company cares
In fact, I’m editing more than ever. While the samples in my review are untouched — as they are in every review, aside from rotating the occasional sideways shot and shrinking file sizes — once that review wrapped, I found myself tapping the edit button in Photos far more than I ever had before. It’s possible to get some truly jaw-dropping shots out of the Pixel 9 Pro, but in almost every case, you’ll need to tweak the color settings in the gallery app of your choice. I’m talking white balance, saturation, and above all else, contrast — all things, I think, that could be handled behind the scenes.
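For what it’s worth, here’s a rough sketch in Python (using the Pillow library) of the kind of global tweaks I keep making by hand, the sort of adjustments a camera pipeline could plausibly apply behind the scenes. The file names and the specific contrast, saturation, and warmth values are placeholders of my own, not anything pulled from Google’s actual processing.

```python
# A rough sketch of the global edits I keep applying by hand.
# The factors below are arbitrary examples, not Google's pipeline.
from PIL import Image, ImageEnhance

def tweak(path: str, out_path: str,
          contrast: float = 1.15,    # >1.0 deepens shadows a touch
          saturation: float = 1.10,  # >1.0 adds a little color
          warmth: float = 1.03):     # >1.0 nudges white balance warmer
    img = Image.open(path).convert("RGB")

    # Contrast and saturation are simple global multipliers in Pillow.
    img = ImageEnhance.Contrast(img).enhance(contrast)
    img = ImageEnhance.Color(img).enhance(saturation)

    # Crude warmth shift: boost the red channel, trim blue by the same factor.
    r, g, b = img.split()
    r = r.point(lambda v: min(255, int(v * warmth)))
    b = b.point(lambda v: int(v / warmth))
    img = Image.merge("RGB", (r, g, b))

    img.save(out_path)

# Hypothetical file names, just for illustration.
tweak("pixel_sample.jpg", "pixel_sample_tweaked.jpg")
```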
If Google wants to make its photo efforts feel a little more exciting to those of us who know our way around Photoshop or Lightroom, it has a few moves it can make. On the software side, taking a page out of the OnePlus playbook could make for a pretty compelling experience. While the Hasselblad partnership started off as a marketing gimmick, it has since transformed into, in my eyes, one of the best examples of color science on a smartphone. The OnePlus 13 leans into contrast in practically every shot it takes, all without losing detail, creating shots that feel vibrant in a way the Pixel’s photos do not.
OnePlus’s color science is now leagues ahead of what I’ve seen from other companies, frankly, and it’ll only get better on future devices. I’m not saying Google should go out and strike some kind of Hasselblad-esque partnership of its own. Rather, rethinking the color science behind every photo — instead of simply trying to make everything in the frame visible — could result in a huge improvement. At the very least, I think it’s worth Google’s time to reconsider how these final photos look once processing is done.
On the hardware side, I’d love to see Google meaningfully improve the sensors it selects for new phones. Google is among the brands whose flagships have climbed well past the four-figure mark, and I’d like to see its specs match that price tag. Why not pair its AI optimizations — which, for better or worse, are the best in the game — with better hardware? What’s stopping Google from being the first company to ship a 1-inch sensor in a phone officially sold in the US? Why shouldn’t the Pixel team push boundaries in a space that has grown increasingly stagnant throughout this decade?
I’d also like to echo what my fellow AP editor Taylor Kerns wrote last month about the need for something like Photographic Styles on Android. Apple launched its updated Photographic Styles feature with the iPhone 16 last year, and while actively using it is about as user-hostile as most of Apple’s current-day software — get better, Apple — it’s the perfect middle ground, letting those of us who want extra contrast or a warmer look in our photos get it without having to edit every single shot.
I still love the Pixel series, and I’m excited for the Pixel 10 Pro (not to mention the Pixel 10 Pro Fold, which will likely take the OnePlus Open 2’s spot as the foldable I’m particularly hyped to see this year). But if Google wants to keep me around as a user outside of review periods, it needs to refocus on producing better shots, and not just tossing in more AI-powered editing tricks. Until then, I’m left wondering whether the Pixel 9 Pro has really earned a spot in my pocket, or if that upcoming mini OnePlus 13T might take it for good.