The iPhone camera has a software problem and this video proves it

Back of the iPhone 14 Pro in Space Black
(Image credit: Joe Wituschek)

If you've been looking at the photos your iPhone 14 Pro is taking and wondering what's going on with them, you aren't alone. Something's amiss, and YouTuber Marques Brownlee (MKBHD) thinks he has the answer.

Brownlee recently ran his annual smartphone camera awards to see which device people think takes the best photos and, as is often the case, Apple's flagship found itself floundering in the middle of the pack. But given the new 48-megapixel camera, you'd be forgiven for expecting better.

So what's going on?

It's all about the software

In a new YouTube video looking at the iPhone 14 Pro's camera performance, Brownlee points to the Google Pixel lineup as one example of where Apple needs to do some work.

You should definitely watch the full video to see what's going on, but Brownlee points out that Google historically relied on the same camera sensor across most of its Pixel devices, using software to tweak the resulting photos into looking pretty great. The combination of the same sensor and gradually improving software made for some awesome photos, even if the video performance sometimes left a little to be desired.

But Google found that when it switched to a new 50-megapixel sensor, things didn't look right. The software was doing too much work, creating images that looked artificial and overworked. And Apple now has the same problem.

For years, Apple's iPhones all used the same 12-megapixel camera sensor with Apple layering its own software on top to work out any kinks. Just like Google, that allowed Apple to iterate and refine that software, taking great photos along the way.

But things changed with the iPhone 14 Pro. Apple is now using a much higher resolution sensor: a 48-megapixel one, no less. But Brownlee believes Apple's software is going overboard, working just as hard as it did with that 12-megapixel sensor when in reality it no longer needs to.

The result? An artificial-looking photo. One that appears over-processed and just...off.

So what happens next? That's down to Apple and, in all likelihood, this will get fixed in software. Apple just needs to dial things back a bit and let that new sensor do the work, rather than its software.

Whether that'll come to the iPhone 14 Pro or not, we'll have to wait and see. Will Apple's best iPhone get a camera software update or will we all have to buy a new iPhone 15 Pro to see the fruits of Apple's labor?

Oliver Haslam
Contributor

Oliver Haslam has written about Apple and the wider technology business for more than a decade with bylines on How-To Geek, PC Mag, iDownloadBlog, and many more. He has also been published in print for Macworld, including cover stories. At iMore, Oliver is involved in daily news coverage and, not being short of opinions, has been known to 'explain' those thoughts in more detail, too.

Having grown up using PCs and spending far too much money on graphics cards and flashy RAM, Oliver switched to the Mac with a G5 iMac and hasn't looked back. Since then he's seen the growth of the smartphone world, backed by iPhone, and new product categories come and go. Current expertise includes iOS, macOS, streaming services, and pretty much anything that has a battery or plugs into a wall. Oliver also covers mobile gaming for iMore, with Apple Arcade a particular focus. He's been gaming since the Atari 2600 days and still struggles to comprehend the fact that he can play console-quality titles on his pocket computer.