The mobile camera war is being fought in glass, in chips, in apps, and in the cloud. Where should the iPhone 6 be?

With the iPhone 6 — presumably due sometime this fall — Apple will no doubt continue its focus on photography and on making the best "everyday" cameras in the business. But how will they keep doing that? There are several possibilities. They can add a bigger piece of glass that takes up a lot more space but captures far more light and detail. They can continue to improve their already amazing image signal processor (ISP), which makes the most of whatever information the sensor captures. They can keep improving the software that analyzes and processes those pixels into finished photos. And they can introduce cloud services that take whatever comes out of the phone and apply serious server-side power to it. In other words, they can amp up the glass, the chip, the apps, and/or the cloud. The question is, which way will the iPhone 6 go?

Big glass bumps

With the iPhone 5s, Apple has done an amazing job of squeezing high-quality optics into very little z-axis — that is, thickness. What glass there is in the iPhone 5s is composed of five discrete, precision-aligned elements, paired with an 8-megapixel backside-illuminated (BSI) sensor with 1.5-micron pixels and an f/2.2 aperture. Considering the casing is only 7.6mm thin, that's pretty close to an engineering miracle.

Nokia, by contrast, put a 41-megapixel sensor behind a six-element Carl Zeiss lens on its Lumia 1020. It's also BSI and f/2.2, with 1.12-micron pixels. In addition to the giant sensor, it adds optical image stabilization (OIS): it physically floats the lens to counteract hand shake, so the shutter can stay open longer and drink in a ton of light. All of that takes up a lot of physical space, however. Not only does it make the Lumia 1020 10.4mm thick, it forced a face-hugger-sized bump on the back of the phone.

The original HTC One also had OIS and a thickness of 9.3mm but managed to avoid the bump by rounding the entire back. HTC went with just 4 megapixels of resolution but larger, 2.0-micron pixels and an f/2.0 aperture. The new HTC One M8 adds a second lens on the back that records distance information so the focal point and depth of field can be adjusted after the fact.
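HTC hasn't published how the M8's refocusing works, but the general idea is straightforward once per-pixel distance information exists: pick a focal plane, then blur each pixel in proportion to how far its depth sits from that plane. Here's a toy sketch in Python; the image, depth map, and parameters are hypothetical inputs, not HTC's actual algorithm:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(image, depth, focal_depth, max_sigma=8.0, layers=8):
    """Toy synthetic refocus. image: HxWx3 floats in [0, 1];
    depth: HxW map normalized to [0, 1]; focal_depth: the depth
    value (e.g. where the user tapped) to keep sharp."""
    depth = np.clip(depth, 0.0, 1.0 - 1e-6)
    result = np.zeros_like(image)
    weight = np.zeros(depth.shape)
    edges = np.linspace(0.0, 1.0, layers + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = ((depth >= lo) & (depth < hi)).astype(float)
        mid = (lo + hi) / 2.0
        # Blur grows with distance from the chosen focal plane.
        sigma = max_sigma * abs(mid - focal_depth)
        if sigma > 0:
            # Blur the layer and its mask with the same kernel so the
            # boundaries between depth layers stay soft.
            layer = gaussian_filter(image * mask[..., None], sigma=(sigma, sigma, 0))
            mask = gaussian_filter(mask, sigma=sigma)
        else:
            layer = image * mask[..., None]
        result += layer
        weight += mask
    return result / np.maximum(weight, 1e-6)[..., None]
```

A real implementation needs far better depth estimates and occlusion handling, but the principle of choosing focus after the fact and blurring everything else is the same.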

Samsung, for its part, continues to wage the megapixel war, going to 16 megapixels for the Galaxy S5. With those megapixels comes a mega-bump, however.
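Those spec numbers trade off against one another in a simple way: for a given sensor generation, the light gathered per pixel scales roughly with pixel area, which goes as the square of the pixel pitch. A quick back-of-the-envelope comparison using the figures above (ignoring microlenses, quantum efficiency, and aperture, all of which also matter):

```python
# Rough light-per-pixel comparison: pixel area scales with pitch squared.
pitches_um = {
    "iPhone 5s (1.5 um)": 1.5,
    "Lumia 1020 (1.12 um)": 1.12,
    "HTC One (2.0 um)": 2.0,
}

baseline = pitches_um["iPhone 5s (1.5 um)"] ** 2
for name, pitch in pitches_um.items():
    area = pitch ** 2  # square microns per pixel
    print(f"{name}: {area:.2f} um^2 per pixel, "
          f"{area / baseline:.2f}x the iPhone 5s")
```

By this crude measure, each of HTC's 2.0-micron pixels gathers nearly 1.8x the light of an iPhone 5s pixel, which is exactly why HTC accepted 4-megapixel resolution, and why Samsung's 16 megapixels imply either smaller pixels or a bigger, bumpier sensor.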

There's a lot Apple could do to improve the optics on the iPhone, but would they do it if the cost was a thicker, bumpier product?

The signal processor

What the iPhone lacks in lens size it more than makes up for in image signal processing (ISP) power. Most recently, that's thanks to the supercharged Apple A7 chipset. The ISP handles focus, white balance, exposure, and other processing so the image comes out looking as good as possible. Some of this occurs pre-capture, like detecting multiple faces. Some of it occurs post-capture, like auto (digital) image stabilization (AIS).
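Apple doesn't document the A7 ISP's internals, but every ISP runs a broadly similar pipeline over the sensor data. Here's a deliberately simplified sketch of two post-demosaic stages, white balance and gamma encoding, with illustrative (made-up) gain values:

```python
import numpy as np

def toy_isp(raw_rgb, wb_gains=(2.0, 1.0, 1.6), gamma=2.2):
    """Deliberately simplified ISP stages on linear RGB data in [0, 1].
    Real pipelines add demosaicing, lens-shading and noise correction,
    tone mapping, and sharpening; the gains here are illustrative only."""
    img = raw_rgb * np.asarray(wb_gains)  # per-channel white balance
    img = np.clip(img, 0.0, 1.0)
    img = img ** (1.0 / gamma)            # gamma encode for display
    return img
```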

The A7 is so fast that capturing high-dynamic-range (HDR) images — something that took a few seconds on older iPhones — can now be done instantly and even automatically on the iPhone 5s. The A7 can also handle a 10fps burst mode, dynamic exposure for panoramas, and more. It can even handle AIS, which requires four short exposures to be taken in rapid succession and then combined to produce as little noise, subject motion, and hand motion as possible.
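Apple hasn't said exactly how its AIS works, but the underlying technique of taking several short exposures, aligning them, and merging them is standard computational photography. Here's a minimal sketch, assuming grayscale frames and purely global (whole-frame) shifts, which real hand shake rarely is:

```python
import numpy as np

def align_shift(ref, frame):
    """Estimate the integer (dy, dx) shift of `frame` relative to
    `ref` via phase correlation on grayscale (2D) images."""
    f1 = np.fft.fft2(ref)
    f2 = np.fft.fft2(frame)
    cross = f1 * np.conj(f2)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts into the signed range.
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return dy, dx

def merge_exposures(frames):
    """Align each frame to the first and average. Averaging N frames
    cuts sensor noise by roughly sqrt(N) -- four frames, half the noise."""
    ref = frames[0]
    acc = ref.astype(np.float64)
    for frame in frames[1:]:
        dy, dx = align_shift(ref, frame)
        acc += np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    return acc / len(frames)
```

Apple's version surely does per-region alignment and smarter weighting to reject moving subjects, but even this crude average shows why burst-and-merge works: four frames yield roughly half the noise, with no extra lens hardware.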

Other manufacturers also have dedicated image signal processors, yet none of them can do as much with as little as the iPhone 5s. If rumors are true and Apple is readying an Apple A8 processor for the iPhone 6, even more image signal processing may be possible.

AIS can stabilize some amount of subject motion, which OIS can't do, but it's not as good for hand motion and doesn't allow the kind of long exposures for low light that OIS makes possible. Likewise, dynamic depth-of-field control currently requires a second lens to gather distance information.

The A8 ISP will no doubt be better than the A7, but how much better can it realistically be?

The camera app

The iPhone has always had a good camera app. With iOS 7, the old, heavily chromed, shutter-effect-laden version was abandoned in favor of a newer, cleaner, blur-transitioning version, but the functionality has remained much the same. What new features there are serve to support the Apple A7 ISP. Auto-HDR can be toggled on and off. Burst mode photos can be fully explored. Filters can be applied live and post-capture. Images can be auto-enhanced and have red-eye removed. And that's about it. Everything else, from more manual controls to more in-depth editing to special effects, is left to App Store apps, including Apple's own iPhoto.

Almost all other manufacturers have been more aggressive about adding functionality to their built-in camera apps. BlackBerry 10 offers Time Shift, which lets you move backward or forward in time to find the perfect picture, almost like a continuous, selective burst mode. HTC and Samsung offer a host of image manipulations and effects. Some manufacturers, including Nokia, will even let you access the RAW image data.
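BlackBerry hasn't detailed Time Shift's implementation, but features like it are typically built on a rolling pre-capture buffer: the camera streams frames into a fixed-size queue, and pressing the shutter freezes that queue so you can scrub backward and forward through it. A minimal sketch, with the camera feed stubbed out:

```python
from collections import deque

class TimeShiftBuffer:
    """Rolling buffer of the most recent frames; old frames are
    evicted automatically once capacity is reached."""
    def __init__(self, capacity=30):  # e.g. ~2 seconds at 15 fps
        self.frames = deque(maxlen=capacity)

    def on_frame(self, frame):
        self.frames.append(frame)     # called for every preview frame

    def shutter(self):
        # Freeze a copy so the user can scrub back/forward through it.
        return list(self.frames)

buf = TimeShiftBuffer(capacity=30)
for i in range(100):                  # stand-in for the live camera feed
    buf.on_frame(f"frame-{i}")
captured = buf.shutter()
print(captured[0], "...", captured[-1])  # frame-70 ... frame-99
```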

Some of the effects are silly and, because of the complexity they add, I'd argue many of them are better left to optional third-party apps. However, some are useful and some are just plain fun. Would Apple be willing to integrate even more options into the Camera app?

The cloud

Apple currently limits their iCloud imaging services to Photo Stream. Originally a way to temporarily store photos and help propagate them across iOS and OS X devices, Photo Stream more recently gained sharing. Now you can create albums for friends, family, and coworkers, like each other's photos, and leave comments. It's... quaint.

Google, on the other hand, is awesome. Auto Awesome. If you're willing to let them suck all your photographs into the G+ cloud, they'll put the full weight of their massive server farms behind them and perform effects from the functional to the whimsical to the silly. Google also bought Nik Software, maker of Snapseed and of famous filter and effects packages. They're putting considerable talent and resources behind their photography efforts.

Apple has a fantastic Pro Apps team, including those working on iPhoto and Aperture. Could they take those services to iCloud?

The bottom line

Apple already makes the best image signal processors in the industry, and their software is simple and solid. It would be great to see them do some of the dynamic focus stuff and a few of the specialty modes like multiple exposure, but otherwise they seem to really be in the zone when it comes to post-capture, on-device processing.

That leaves better glass and server-side processing as the major opportunities. Could Apple make a better iPhone 6 camera by upping the optical ante? Almost certainly. Would they be willing to increase the thickness of the iPhone 6 to do it? Almost certainly not. While a thicker iPhone would also allow for things like longer battery life, it would make for a heavier device, like the Lumia 1020, and that's something Apple has thus far avoided at all costs. Likewise, it's tough to see Apple doing anything that would result in an unsightly bulge. A slightly raised camera, like on the iPod touch, is one thing. (Especially if the trade-off is an even thinner casing, which is what happened with the iPod touch.) A big honking protrusion is quite another.

Services aren't typically Apple's forte either. They also don't like manipulating people's data in the cloud, preferring to keep things local when they can. iWork for iCloud is a recent exception, so maybe there's some hope for an iPhoto for iCloud. It would be great to have a viable competitor to the Google and Facebook photo clouds. Storage and backup aside, though, there seems to be little advantage to Apple doing in the cloud what they can do on-device.

All of that adds up to the iPhone 6 likely getting what we typically get from Apple — slightly better optics, impressively better ISP, a slightly better Camera app, and small improvements to cloud services. Camera nerds will want more, but it's a strategy that best serves the everyday photographer Apple is focused on.

Still, we're a long way from September so let me know — what would you like to see from the iPhone 6 camera?
