iPhone 11 rumors have been floating around since… well, roughly 42 nanoseconds after the iPhone XS was introduced. But this year, perhaps more than any other, the iPhone 11 rumors just keep coming and almost all of them focus on the new camera.
The iPhone XS has a damn good camera and a terrific video camera, but since it was announced, Google has released the Pixel 3 with Night Sight, Samsung has put out the Galaxy S10 with a three-camera system, and Huawei has escalated with a periscope lens and a time-of-flight system.
Competition is fierce.
Rather watch than read? Hit play on the video above!
A (brief) history of iPhone cameras
The original iPhone shipped with a perfunctory 2-megapixel sensor. It took until the iPhone 3GS to even get video. But, for me, it was Steve Jobs' introduction of the iPhone 4 that really signaled Apple's shift towards being not just a mobile phone company but a mobile camera company. We got the first front-facing camera, but we also got the first really good camera, at least for its time.
Since then, Phil Schiller has introduced steady improvements year after year, almost all of them designed to bring real, higher-end features to the iPhone. Focus Pixels, which is Apple's name for phase-detection autofocus, deep trench isolation, and other physical advances came first.
Then, computational photography — bits and algorithms that could far exceed the atoms and optics. Portrait Mode, Portrait Lighting, Portrait green screen, which is my name for the Clips effect.
Now, we're in the age of augmented reality and neural engines — AR and AI — and we're seeing face-tracking Animoji and 3D stickers and scenes, and the image signal processor getting looped in to provide Smart HDR.
And with Huawei proving that optical advances are still possible, and Google pushing hard on the computational side, it's obvious there's far more to come.
Well, obvious in light of the rumors...
iPhone 11 camera: The rumors
iPhone 11 camera rumors started flying around really early. Like, really early. Long before the iPhone XS was even announced.
From the Taipei Times, May 7, 2018:
Apple added a second camera to the iPhone 7 Plus back in 2016: an effective 56mm telephoto alongside the existing effective 28mm wide angle. In addition to letting you easily switch between and shoot with both focal lengths, it made the telephoto feel and act like a 2x optical zoom, and it provided enough depth information for the first Portrait Mode.
Triple cameras, already present on Huawei, Samsung, and other phones, add, obviously, a third camera. In some cases that's been a monochrome lens, but what's been even more exciting, and what I'd really like to see, is an ultra-wide angle like the ones LG, Samsung, and Huawei have been fielding.
Samsung's is effectively 12mm and Huawei's, 15mm. It's proven hard to do well, but if Apple nails it, we'll be able to zoom out and get everything in a shot as easily and elegantly as we've been zooming in to grab just what we want with the telephoto.
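For a sense of what those numbers mean in practice, effective focal length translates directly into field of view. A quick sketch, assuming standard 35mm-equivalent geometry with a 36mm sensor width; the focal lengths are the ones mentioned above, not confirmed Apple specs:

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view (degrees) for a 35mm-equivalent focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Effective focal lengths mentioned above: Samsung's 12mm and Huawei's 15mm
# ultra-wides, versus the 56mm telephoto.
for f in (12, 15, 56):
    print(f"{f}mm -> {horizontal_fov(f):.1f} degrees wide")
```

A 12mm ultra-wide takes in roughly three times the horizontal angle of the 56mm telephoto, which is exactly that "get everything in the shot" effect.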
And, with Apple, I'm counting on proper and consistent color regardless of which camera we're using.
May 28, 2018, from Economic Daily:
November 1, 2018. From Kuo Ming-Chi, via AppleInsider:
Time-of-flight is a technology that uses lasers to more accurately measure the distance of objects in the scene. Think of it as depth effect on Hulk serum. In other words, it should let the rear camera do more of what the front TrueDepth camera has been doing.
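The underlying math is simple: light travels out to the subject and back at a known speed, so the round-trip time gives you distance. A toy sketch of the principle (real ToF sensors typically measure the phase shift of modulated infrared light rather than timing raw pulses, but the arithmetic is the same):

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def tof_distance_m(round_trip_seconds):
    """Distance to the subject, given a light pulse's round-trip time."""
    # Divide by two: the light covers the camera-to-subject gap twice.
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A subject about one meter away bounces the pulse back in ~6.7 nanoseconds.
print(f"{tof_distance_m(6.67e-9):.3f} m")
```

Doing that per pixel is what lets a ToF sensor feed far more accurate depth maps to portrait effects and AR than parallax between two cameras alone.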
I've been hearing about Apple's rear-facing AR camera for a few years now, and that it's really cool, but we haven't seen it ship yet. If this year is finally the year, Apple's augmented world could get its first real test run on existing devices.
January 6, 2019 saw Steve Hemmerstoffer, @OnLeaks, show off what he claimed was the iPhone 11 camera system, at least in EVT form.
EVT is engineering validation test, which is followed by DVT, or design validation test, and PVT, or production validation test. In other words, an early step on the long road from prototype to release.
Now, there's only one way to say this: that big square bump is hella ugly. Uglier, for my money, than the horizontal strip on the back of the Galaxy S10 or the vertical strip, and little square ToF sensor, on the back of the Huawei P30 Pro. Uglier even than the Mate 20 Pro's camera square, which is center-aligned instead of corner-aligned, with its lenses in an even grid instead of a staggered array. And I say that as someone who thought the early iPhone 4 leaks were ugly and ended up absolutely loving that now-classic design.
January 11, 2019. The Wall Street Journal:
This comes off as both late to the rumor and off on the approach. Only having the triple camera on the 11 Max, and not the 11, would be a regression. In the days of the Plus, people had to choose between a smaller phone or a better camera. With the XS and XS Max, you no longer had to choose. Doing everything possible to fit the bigger camera system into the smaller phone would keep that very welcome trend going.
January 30, 2019, Mark Gurman and Debby Wu for Bloomberg as well:
Three cameras on the iPhone XS updates and two on the iPhone XR update make more of the kind of sense that does.
And this is also more of what I was referring to when I said the bits could transcend the atoms. We've seen depth effect, we've seen night vision, but photo and video repair is yet another fascinating use case for multiple cameras beyond just multiple focal ranges.
That's a significant increase and not just for depth effect but for computer vision, environment ingestion, and yeah, augmented reality.
March 15, 2019, Mac Otakara:
The Big Mac O's renders don't look as awkward as OnLeaks', but they don't look as realistic either. But renders are renders. God and the devil are in the production details.
March 28, 2019, Mr. White on Twitter showed off something labeled "New iPhone XR" that really looked far more like a schematic of the rumored iPhone 11 camera array, square, staggered layout and all.
April 4, 2019, from Weibo via Ben Geskin on Twitter, we see what looks to be the part based on the previous schematic.
If accurate, it's yet more evidence of both the placement and the layout of the new camera system.
April 8, 2019, Mac Otakara:
If true, and Mac Otakara has gotten more than its fair share of things right in the past, this is really good news. Yes, camera bumps are hideously ugly and people hate them, but, periscopes aside, bigger sensors are better, and so are deeper lens systems.
It's the only real way to improve the physical optics of a camera system that includes several lenses of different focal lengths. I know some will repeat the old "just make the whole phone that thick and fill it with battery" cliche, but that'll just cause me to repeat the equally old response: everybody says they want bigger batteries, but no one buys or enjoys heavier phones. It's why Apple makes a separate smart battery case you can put on and take off, so you get the best of both battery worlds.
And, hey, the fugly battery hump makes the fugly camera bump mostly disappear. Mostly.
April 18, 2019, Kuo Ming-Chi via MacRumors:
Over the years, we've gotten closer and closer to that singular object that seems to be the inevitable endpoint of Apple and Jony Ive's minimalist design tendencies.
I, personally, like the industrial look of exposed lenses, but if Apple can make them blend close to or completely into the new frosted glass and metal that's rumored to make up the chassis, then that certainly seems like something the company would do.
Especially if it makes the giant, Home screen icon-shaped camera bump look even a little less giant.
iPhone 11: The potential
What about night vision?
If the rumors about bigger sensors are true, that should help out optically. Computationally, like what Google's doing, we'll have to wait and see.
The other factor, beyond just science, is philosophy. So far, Apple's been almost religious about keeping camera effects real-time. It's what they've done with Portrait Mode, Portrait Lighting, and even Depth Control.
Apple wants that photo to be live in preview and ready and waiting the moment you finish capture, no additional processing needed.
Google, from their Portrait Mode to Night Sight, seems super happy to handle heavy computational photography as, essentially, an after-effect. You don't see it live and, after capture, you're perfectly welcome to wait a while as it processes.
When Apple does do stuff like that, like the Live Photo-derived Boomerang-style loops, Apple stuffs it in Photos instead of Camera. But that really wouldn't meet expectations for enhanced low light, would it?
Huawei does its version of night mode less after the fact and more as part of the process, talking you through it the way Apple has long done with Panorama mode.
That seems like a great way to bend but not break the rules, and to meet expectations better than post-process.
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.