iPhone camera wishlist: ND filter, Macro, Smart Zoom

Last year, I asked for three main things when it came to iPhone 11 photography: an ultra-wide camera, a Night Mode, and Portrait Video. And, yeah, for Apple to fix the weird skin tones and textures in Smart HDR.

We didn't get Portrait Video, but three out of four wasn't just not bad; it was enough to convince all the doomsayers, bored-nows, and must-skippers out there that the iPhone 11 and its cameras were far closer to must-have.

You know, something anyone reading this column had known long before they were even announced.

But time and expectations are relentless. And while it sounds like what we're getting this year with the iPhone 12 is the time-of-flight sensor and augmented reality camera (basically TrueDepth on the back) that I've been hearing about and looking forward to seeing for a few years now, there are still lots of other things Apple could do to improve and round out iPhone photography, even if it's in years to come.

ND Filter

Last year, one of the rumors from Ming-Chi Kuo and other supply-chain looky-loos was that Apple would adopt a special black lens coating that would hide the cameras on the iPhone 11. Sort of like tinted windows.

Obviously, that didn't pan out. In fact, we got the opposite: a three-camera system so wickedly all up in our faces that Apple leaned all the way into it and put it up on billboards.

But, right now, this week, at CES 2020, OnePlus is showing off a McLaren concept phone that does something remarkably similar: It uses electrochromic glass, like you'd find in McLaren supercars, to switch between transparent and translucent-verging-on-opaque states.

When the cameras are off, all you see is smooth, smooth black glass. But, launch the camera app and, 0.7 seconds later, cameras all the way down.

Cosmetically, it looks great. As we keep adding more and more cameras (and it feels like Nokia is up to a baker's dozen already), the backs of our phones keep looking stranger and stranger. Like spider eyes. Like moonscapes.

That's why flattening it all out and hiding it all away holds so much appeal.

Sure, right now I'm all about the iPhone 4-style squared-off design, and the squared-off bump could still hide lenses, if not, you know, the bump itself. But, if Apple ever wants to go the other way and return to the curves of the iPhone 3G design, even the bump could be made to look gone.

Functionally, though, things get even more interesting. The electrochromic glass also works as a neutral density or ND filter.

ND filters are typically used on bright days when you're shooting outside and want to limit the amount of light that gets captured, so you can avoid things like blowing out the sky while still keeping the colors accurate. You know, like sunglasses for your camera.
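If you want a feel for the math, ND strength is usually quoted as a factor like ND8, and the number of stops of light it cuts is just the log base 2 of that factor. Here's a minimal Swift sketch of that arithmetic; the function names and the ND8 example are mine for illustration, not anything from Apple or OnePlus.

import Foundation

// Illustrative ND filter arithmetic: an ND8 filter passes 1/8 of the light,
// which is 3 stops (2^3 = 8). These helpers are hypothetical, just to show the math.

/// Fraction of light an ND filter of the given factor lets through.
func transmission(ndFactor: Double) -> Double {
    1.0 / ndFactor                       // ND8 -> 0.125
}

/// How many stops of light the filter removes.
func stopsCut(ndFactor: Double) -> Double {
    log2(ndFactor)                       // ND8 -> 3 stops
}

/// The shutter speed that gives the same exposure once the filter is on.
func equivalentShutter(baseShutter: Double, ndFactor: Double) -> Double {
    baseShutter * ndFactor               // 1/1000 s becomes 1/125 s with ND8
}

print(stopsCut(ndFactor: 8))                                        // 3.0
print(equivalentShutter(baseShutter: 1.0 / 1000.0, ndFactor: 8))    // 0.008, i.e. 1/125 s

In other words, a few switchable levels of electrochromic dimming would give the camera the same kind of latitude a photographer gets from carrying a small set of ND filters.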

Now, Smart HDR on the iPhone already does a pretty great job at handling bright, outdoor captures. It basically exposes for both light and dark. But, with a few levels of ND available, it could work even better in an even wider range of conditions.

Especially if, unlike OnePlus where you have to dive into Pro settings to enable it, it just came on when needed, like Night Mode. A Bright Mode, so to speak.

Macro Camera

The iPhone camera system works that way because Apple treats it that way: like a system rather than a bunch of separate parts. It's not a camera body where you have to manually screw a bunch of different virtual lenses on and off just to get a range of useful shots. It's meant to be more like a point-and-shoot, where you just click the button and it does everything it can, using every different discrete camera and computational mode, to deliver the best possible capture of whatever you're aiming it at in the moment.

Ultra-wide-angle, Deep Fusion: they're all just implementation details that, for now, Apple uses as super exciting marketing terms to make them sound all sci-fi and sexy. But I imagine they'll fade away sooner rather than later, the way telephoto and deep trench isolation have.

Still, any system is only as good — or should I say as useful — as the sum of its parts.

Last year we got a new part — that ultra-wide-angle. I'd been seeing all the great photos all my friends with LG and other phones had been getting using ultra-wide-angle cameras and I was hella jealous.

You can fake telephoto well enough by stepping up or cropping in, but you can't fake ultra-wide anywhere nearly as well. Computationally, there's just no data to draw from, and sneaker zoom, well, doesn't help if stepping back means leaving the room or losing the view you're trying to shoot.

For everyone saying not having an ultra-wide was fine, that everything was fine: literally everyone else was out enjoying shooting with their ultra-wides.

This year, though, I'm all about the macro.

See, ultra-wide is about getting the big picture but macro… macro is about getting the smallest detail.

One of my favorite lenses on my big camera is the Canon 100mm macro. I use it for way more than just macros but I love, love using it especially for macros.

And they're no longer just the province of big cameras, either. We're starting to see them on other phones now as well. Even the just-announced Samsung Galaxy S10 Lite has one. And for the same reason we started seeing ultra-wides years ago: they're FUN.

Getting super close to a flower or insect or Lego minifig or backplane, or being able to see the PenTile subpixel layout on an OLED display. All of it, super FUN.

First, we had wide-angle, then telephoto, then ultra-wide-angle. Macro just feels like what's next. From all the way out to ALL the way in.

Just thinking about being able to go from near fish-eye to near full contact using the iPhone camera app interface makes me giddy.

Smart Zoom

Which brings me to computational zoom. Which, unlike the previous two, is something I do think we could see as soon as this year and the iPhone 12.

The benefit of having a point-and-shoot over a body-and-lens system is that you can do most things without having to change lenses. But, in order for that to work, the camera has to let you do those most things.

I've been shooting with Google's Pixel 4 for a few months now, and it's also a fun camera. But, by and large, I haven't really needed to shoot with it as much as I thought I would. The iPhone 11 crushes it in video, which is a big part of my workflow. I prefer Apple's color science, quirky as it can be at times, to Google's constant solving for the same cool, cool look. And Night Mode keeps up with Night Sight, even if each has its own strengths and weaknesses.

But what the Pixel 4 has that I miss every time I pick up the iPhone 11 is Super Res Zoom.

It's built off of what a lot of computational photography is built off of — using big silicon and big algorithm to make up for the lack of big sensor and big glass.

More specifically, by taking a bunch of frames incredibly quickly, then chewing through them, keeping only the best bits, and presenting you with a final image better than any of its under-, over-, or OIS-shifted parts.

So, for zoom, instead of just giving you a gross, pixel-embiggened end product, it sifts through all the data from all the ever-so-slightly different frames to piece together one single, crisp photo far better than anything purely digital zoom has ever produced before.
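If you squint, the core of that multi-frame trick is simple: align a burst of slightly shifted frames, then merge them so noise averages out and real detail reinforces. Here's a deliberately naive Swift sketch of that idea; a real pipeline does sub-pixel, per-tile alignment and much smarter weighting, and everything here (the type, the function names, the plain average) is my own toy illustration, not Google's Super Res Zoom or anything Apple ships.

import Foundation

typealias Frame = [[Double]]   // grayscale pixel grid, row major (hypothetical stand-in for a real image buffer)

/// Shift a frame by whole pixels (dx, dy), clamping at the edges,
/// standing in for real sub-pixel alignment.
func aligned(_ frame: Frame, dx: Int, dy: Int) -> Frame {
    let h = frame.count, w = frame[0].count
    var out = frame
    for y in 0..<h {
        for x in 0..<w {
            let sy = min(max(y + dy, 0), h - 1)
            let sx = min(max(x + dx, 0), w - 1)
            out[y][x] = frame[sy][sx]
        }
    }
    return out
}

/// Merge aligned frames by averaging each pixel across the burst,
/// which knocks down noise relative to any single frame.
func merge(_ frames: [Frame]) -> Frame {
    let h = frames[0].count, w = frames[0][0].count
    var out = Frame(repeating: [Double](repeating: 0, count: w), count: h)
    for frame in frames {
        for y in 0..<h {
            for x in 0..<w {
                out[y][x] += frame[y][x] / Double(frames.count)
            }
        }
    }
    return out
}

// Usage (with hypothetical `burst` frames and per-frame `offsets` from an alignment step):
// let result = merge(burst.enumerated().map { aligned($0.element, dx: offsets[$0.offset].dx, dy: offsets[$0.offset].dy) })

The interesting part is that the tiny hand shake between frames is a feature, not a bug: each frame samples the scene at a slightly different offset, which is exactly what gives the merge step extra detail to work with.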

It's not optical zoom quality, just like portrait mode depth of field still isn't optical depth of field perfect, but it's similarly impressive for glass flat enough to be welded onto the back of a phone.

Apple already has similar technology in Smart HDR; they're just not yet applying it to a Smart Zoom. And I really wish they would, because it remains one of the few things that not only the Pixel camera, but also traditional cameras, still do better.

Add that in and fix the light aberrations we're seeing in some shots, including Night Mode shots, at the same time, and it would be outstanding.
