Is the iPhone camera a seller or a settler?

I'm reading Vlad Savov's piece on The Verge about how the iPhone camera used to be a selling point but is now a settling point.

I like Vlad's writing a lot. Mostly because his point of view is often so different than my own. It forces me to think, to challenge my own preconceptions. And, regardless of whether I end up agreeing or disagreeing, to learn through the process.

Rather watch than read? Hit play on the video above!

It's here and now, and Vlad writes that the iPhone camera has fallen behind, and it's now something users tend to accept rather than anticipate.

It's 2009 and Phil Schiller is announcing video recording, coming for the first time to the iPhone 3GS.

It's 2010 and Steve Jobs is showing off the iPhone 4, giving Apple their first modern camera keynote, and us, arguably, their first really modern camera.

It's here, now, Vlad writes that in the past couple of years every phone company has worked overtime to secure a lead in this all-important area. Well, every company except for Apple and Samsung, it seems. And, if you're in a limited market like the US where they're the only options you see, you could start to believe that mobile cameras have stagnated.

It's September 2016, I'm sitting in the Bill Graham Civic Center as Apple's Senior Vice President of Worldwide Marketing, Phil Schiller, announces the iPhone 7 Plus, its dual camera system, 2x optical zoom, and Portrait Mode.

It's a week or so later, I'm in New York City with Serenity Caldwell and Michael Fisher, MrMobile, walking across the Brooklyn Bridge well after midnight, testing the low light capabilities of the 7 Plus, experiencing something on a phone I've only ever experienced with a DSLR — seeing more light in a photo than I did with my naked eye.

It's here, now, Vlad asks where is the iPhone?

It's September 2017, I'm sitting in the Steve Jobs theater, Phil Schiller is back on stage. He's introducing the TrueDepth camera system on iPhone X. It doesn't just do Face ID, it does Portrait Selfies, Portrait Lighting, and in the Clips app, Portrait Green Screen. A computational trifecta, if one that's still far from perfect. But more, augmented reality. Sure, Animoji and Snapchat get all the stage time, but it's boiling the water for facial tracking and the emotive AR avatars of the future.

Selfies on iPhone (Image credit: iMore)

It's November 2017, I'm back in New York City for the iPhone X launch, at the Apple Store on Fifth Avenue, talking to the people in line. One of them, a doctor I think, it's a blur, says he buys the new iPhone every year just for the camera. It's his main camera and he uses it to take photos of his kids. He can never go back in time and take better photos, so he always wants the photos he's taking now to be as good as they can possibly be.

It's here, now, Vlad writes Google came out with the Pixel camera in 2016 and raised the bar of expectation for mobile photography a couple of notches above the iPhone.

It's December 2017, my Pixel 2 XL arrives. I ordered it after watching Mario Queiroz show it off on the Made by Google stream. I take it out, eager to see how Google is doing Portrait Mode with a single camera, a segmentation mask, and some of the best algorithms in the business. I search for it but can't find it. Finally, I do, buried in the menu system. I turn it on. I see… nothing. I think I must be doing something wrong. I toggle it again. And again I take a photo. No depth. I think it's broken. Or that I'm an idiot. I take another photo. No depth. I start cursing. Literally cursing. And then, after a few long seconds, the depth appears. It wasn't real-time like Apple had been doing for over a year. It was an after effect. Why didn't I know that? I check all the Pixel 2 reviews I can find. Only a couple of them even mention it in passing.

Still, the algorithm is outstanding. It can misfire, but so can Apple's. Either way, it's evident: We're fully in the age of computational photography.

It's here, now, and it echoes, where is the iPhone?

It's September 2018, Phil Schiller is once again pacing the stage, introducing the iPhone XS camera with its image signal processor bound to its neural engine, delivering Smart HDR, but also Portrait Mode bokeh with a virtual lens system, giving it the character of real-world glass. And the iPhone XR, with an entirely differently modeled virtual lens system for its Pixel-like Portrait Mode. What's more, a video camera that's not only 4K and 60fps, but when shooting at 30fps or under, interleaves enhanced dynamic range data in between the frames, and with stereo sound.

iPhone XS Depth Change (Image credit: Rene Ritchie / iMore)

It's October 2018 and I'm trying to order a Pixel 3. It's just been shown off by Liza Ma and I've ordered almost every Google phone since the Nexus One, which was one of my all-time favorite phones, but I can't get the order to go through. Because Canada. Or Google. Or whatever. I have a lot of other stuff to do so I figure I'll get back to it later. But, I really — I mean really — want to try out its Night Sight feature which uses very clever hardware and software synchronization to suck in so much light it can literally turn midnight into, if not midday, then at least mid-afternoon.

It's praised as the wonder camera. The best ever. Shaming all others. At least at launch.

It's here, now, Vlad writes the iPhone, even while maintaining an edge in video quality and improving its stills photography in small ways every year, has been surpassed by faster-moving competitors.

It's January 2019, I'm in Las Vegas for CES. I have an iPhone XS with me. I'm shooting everything with it, stills and video. Two of my colleagues and friends who work for Android Central are with me. They have Pixel 3s. They love the photos it produces but complain that it often takes some random, variable amount of time to launch and they're missing shots.

It's so bad Andrew Martonik writes about how fed up he is. Droid Life says it's pissing them off. Artem from Android Police tweets. MKBHD makes a video. The list grows and grows.

Maybe it's Android on 4GB of memory. Maybe it's failing to clear enough of that memory to launch. Maybe there's more to great software than just phenomenal algorithms. Theories abound. I'm 12 percent less upset my Pixel 3 order never went through. An argument could be made for 15.

It's here, now, Vlad writes It wasn't that long ago that we looked to Apple to be the leader in popularizing — if not necessarily inventing — new creative technologies.

It's there, then, I settle in and watch High Flying Bird. It's by Steven Soderbergh and it's shot on iPhone. His second film to do that.

It's February 2019, Samsung is unpacking the Galaxy S10. The camera improvements seem solid, especially the LG-style ultra-wide third lens and the video camera improvements. The color science has matured. Become less garish and more real. But, strangely, it doesn't get as much attention as I expected. Not the video stuff, of course. Most of tech media has been ignoring that on the iPhone for years so ignoring it on Samsung is just par for the stills course.

And not even because of the Pixel 3, which the notorious DxOMark has conspicuously ranked lower than the simple release order most of its other rankings fall under, quality or consulting notwithstanding.

But maybe because of Huawei, which has stepped boldly out of its me-too shadow to push the boundaries of photography hard. Massive AI modes and multiple lenses hard.

It's here. Now. Vlad. Huawei didn't settle for merely raising the bar for low-light photography.

It's March 2019 and I'm watching my friends and colleagues watch Huawei make the announcement. It's got a time of flight sensor, a periscope assembly, and other hardware that, as usual, seems to be part bleeding edge and part sci-fi.

I want to talk about it, so I bring a friend and colleague who covered it onto the show, Daniel Bader, Managing Editor of Android Central, and he confirms what everyone's saying about how aggressive and impressive the hardware is. But he also confirms the other thing everyone is saying — the software and interface are still complex and challenging, and it takes two hands to do what Apple enables far more simply with just one.

It's not quite an LG or HTC story. They both had been doing interesting cameras for years, with ultra wide angles and ultra low apertures, but with that one part never quite overcoming the lackluster sum of their whole. But it's not completely unlike it either.

I'm sitting in on a social strategy seminar when a slide comes up that says you should always post to Instagram from an iPhone. A few of the Android users in attendance object until the hosts rattle off a list of missing features and bugs or inconsistencies in how Android cameras hit Instagram, to the point where they simply consider it unusable.

I mention it to a friend of mine who works in marketing for a non-Apple company. He confirms. The dirty little secret is that almost all of them also carry and use an iPhone. I ask if that's why we see so many slip-ups, where Android brands and endorsers post from iPhone by mistake. He laughs. I picture the smiling crying emoji.

It's here. Now. Vlad. Apple's innovative edge has blunted.

I notice YouTuber Tyler Stalman's comment on my video. He's a pro photographer and videographer, and he says 90% of the other pros he encounters are all still primarily on iPhone as well. It's not inertia. It's trust. It's the same reason that doctor was upgrading to take pictures of his children.

It wasn't about the number of cameras or the cleverness of the algorithms, though nobody was dismissing either of those things. It was simply because, from pocket to shot to share, the iPhone, for them, was still the camera to beat.

Combine that with a company you could trust to push out updates for years and years, not just abandon one phone as soon as the carrier asked for the next, and it was an easy sell.

And one that designed everything from the silicon on up, including the custom storage controller, to make sure every part of the pipeline, every burst, every frame, was saved, every time. It's not the kind of innovation that gets noticed on stage or in the press, but it's part of the same chain of trust, of reliability, that keeps so many people coming back.

It's here. Now. Vlad. The iPhone, for most of its existence, has been the standard-setting phone for mobile photography. Yes, the Nokia Lumia 1020 and 808 PureView happened, but they never put the pieces of usability and quality together quite like Apple's phone did.

The last decade flashes in front of me. Matthew Miller is calling the 808 the best camera ever on a phone. Daniel Rubino is extolling the virtues of the giant, face-hugger-like sensor on the 1020…

Optically, better than anything Apple or Android offered at the time, but they never found much appeal among the mainstream, even those for whom phones were increasingly cameras first.

It's here. Now. Vlad echoes again. Nokia never put the pieces of usability and quality together quite like Apple's phone did.

It's 2015 and Vlad is getting outshot by an iPhone 6s at CES and he's writing To beat the iPhone, you have to beat the iPhone's camera.

It's 2016 and Apple's introducing the iPhone 7, not only with portrait mode and 2x zoom, but with a full DCI-P3 wide gamut imaging pipeline, from capture to color science and management to screen calibration.

It's 2017 and we're getting TrueDepth, 4K 60fps video, and a new image format called HEIF. Custom encode and decode blocks carry the load at the hardware level.

It's 2018, Smart HDR, virtual lenses while others are still doing disc blurs, and real-time Depth Control on Portrait Mode photos while others still can't do real-time Portrait Mode.

Internally, a neural engine three years in the making debuts with the A11 and is expanded and integrated with the image signal processor in the A12. A custom storage controller makes sure every burst, every frame, is saved every time, so nothing is skipped, nothing is lost.

It's here. Now. Vlad. This could always be the quiet before the storm of new groundbreaking designs and innovations from Apple.

It's 2017 and Google is announcing Portrait Mode a year after Apple. But it's not real time. It's 2018, Google is announcing again. It's still not real time. But nobody cares because they have Night Sight and Apple doesn't.

Precisely zero new iPhones have come out since then but Apple is still being accused of failing to counter it.

It's September 2019 and I'm waiting for the keynote to start and I'm still wondering if all the low light expectational debt built up by all the nerds is going to be satisfied when nerds have seldom if ever been Apple's target.

Because Google and Huawei are certainly doing it, but they're certainly not doing it in real time either. They're doing it in post. Like an after effect. Like a filter.

And real-time seems critically important to Apple. They still seem to be treating the iPhone camera as close to a real camera as possible and they simply don't seem to want to do anything in post process at all. Portrait Mode is live. Portrait Lighting is live. Portrait Green Screen (my name, not theirs) is live. Depth Effect is live. Smart HDR is, I think, simulated live, but it's so hard to tell and so instant on tapping on the pic, that it may as well be live.

Night Sight isn't. So will Apple lose the live religion and push out an after effects filter that does similar or the same? The lights dim. The music fades away.

But low light is clearly the current battleground. Some people try to pooh-pooh that and say it's not important and Apple doesn't need to address it, but not me, because it absolutely is. It was that way for DSLRs and it's that way for phones. So, Apple will have to find some way to address it, even if it's a very different, very silicon, way to do it.

Tim Cook takes the stage. He smiles. Waves.

Apple has stuck to their everyday photography philosophy for as long as they've been serious about photography. Leaving manual modes, RAW processing, depth APIs, and the like to third-party pro app developers, and keeping the built-in camera experience strictly tuned to capturing the best image possible, in as wide a range of situations as possible, for as many people as possible, as fast and easy as possible, without too many obstacles getting in the way.

"Good morning… Thank You…"

Tim has just announced the next iPhone and Phil is on the stage to tell us all about it. Will it be the rumored three-camera system and time of flight sensor? Will it be the rear-mounted, TrueDepth AR camera I've heard whispered about for years? Will it be the next giant computational leap forward, with full environmental ingestion for real-time lens and angle change and virtual reconstruction? I still don't know. I don't even know if it's 2019 any more, or the year after, or the year after that. The future streams out like the past.

It's here. Now. Vlad. Where is the iPhone camera?

I like his writing. A lot of people, probably more than you might guess, simply churn out the latest Android review while still using an iPhone as their daily driver. Vlad isn't like that. I don't have the benefit of knowing him well, but his appreciation for and expertise in Android are evident. He seems to really love it. Sincerely. Excitedly. But not blindly. Which is the best way to do it.

It's why I value his point of view even and especially when it's often different than my own.

It's 2015 and Vlad's writing that Speed kills, and the iPhone goes from 0 to a good picture faster than anything else.

It's 2019 and I'm seeing rants from dedicated Android experts that the Pixel 3 camera is insufferably slow to launch and Huawei's camera app is cumbersome to use. And I'm shooting all the b-roll and taking all the thumbnails and posting all the social at the speed of tap from my iPhone.

It's 2020, we have the Pixel 4 Ultra, Galaxy S11, the P40 Pro, that portless, buttonless killer camera phone from that vendor nobody expected, and Phil Schiller flips to the iPhone 12 camera slide, and…

It's here. Now. And I'm apologizing to Alan Moore for so mangling his writing style, and nevertheless asking you to hit like, hit subscribe if you haven't already, because it really helps out the channel, and then I'm thanking Vlad for making me think and thanking you all so much for reading this little experiment.

○ Video: YouTube
○ Podcast: Apple | Overcast | Pocket Casts | RSS
○ Column: iMore | RSS
○ Social: Twitter | Instagram

Rene Ritchie

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.