
#BeautyGate Explained: What iPhone XS is and isn't doing to your selfies

iPhone XS Depth Change (Image credit: Rene Ritchie / iMore)

Our protests have been heard and our long international selfie nightmare will soon be over: Apple will be "fixing" the computational photography algorithms used for selfies to eliminate the smoothing and warming that led to the misconception that a "beauty filter" was being applied to our faces.

Nilay Patel first reported the good news on The Verge:

Apple told me that the forthcoming iOS 12.1 update, currently in public beta, will address the issue of the front camera appearing to smooth out skin by picking a sharper base frame for Smart HDR, but I wasn't able to test it yet.

I've heard the same. So, if you hated the old look, get ready for a new look as soon as iOS 12.1 drops... maybe at or following Apple's October 30 event?

I've covered the new iPhone XS imaging system in both my initial review and my 3-weeks-later review, but one particularly bad bit of intel just keeps making the rounds, so I wanted to do a real explainer to lay it to rest once and for all.

So, #BeautyGate or #SmoothSelfieGate began as most quote-unquote gates do: A combination of people with legitimate questions and concerns and those who are super eager to amplify anything they can in order to get attention, even and especially if it does nothing to answer those questions or address those concerns. Add internet and… wildfire.

But, if you dig into why people are concerned and go beyond those whose sole intent is to sensationalize, you get to something truly fascinating:

The ongoing evolution from camera system to computational photography system.


Computational photography

If you're not familiar with computational photography, you're probably familiar with two of its biggest watershed moments: Google's Auto Awesome, which sucked up photos from a dizzying array of different hardware and processed them in the cloud to give you the best possible result, and Apple's iPhone 7 Plus Portrait Mode, which simulated the bokeh of big, prime-lens glass using two separate, tiny phone cameras.

In other words, using the near-limitless potential of custom silicon and machine learning to go far beyond the physical limits of sensors and lenses.

And with iPhone XS and Smart HDR, Apple is using computational photography to a far greater degree than ever before. So much so that we're seeing #smoothgate or #selfiegate as a result.

So, to get into it: iPhone XS has not just a new camera system this year but a whole new imaging pipeline.

The iPhone XS camera

iPhone XS (Image credit: iMore)

It starts with the wide angle on the back. It's still an f/1.8 lens with a 12-megapixel sensor, but it's got bigger pixels, up from 1.22 to 1.4 microns, to drink in more light, and deeper pixels, up from 3.1 to 3.5 microns, to keep that light from getting cross-contaminated. It's also got more focus pixels (Apple's name for phase-detection autofocus), so it can latch on to your subject twice as fast as before.

It results in a sensor that, according to John Gruber's calculations on Daring Fireball, is 30% larger and a lens that moves from a focal equivalent of 28mm to 26mm.
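If you want to check that math yourself, the pixel-pitch figures above are enough for a back-of-the-envelope estimate. Here's a minimal sketch using only the numbers quoted in this article; it's a rough sanity check, not Gruber's actual calculation:

```swift
import Foundation

// Back-of-the-envelope check of the "roughly 30% larger" sensor claim, using
// only the pixel pitches quoted above (1.22 µm vs. 1.4 µm). With the same
// 12-megapixel count, sensor area scales with the square of the pixel pitch.
let oldPitch = 1.22 // microns (iPhone X wide-angle)
let newPitch = 1.40 // microns (iPhone XS wide-angle)

let areaRatio = (newPitch * newPitch) / (oldPitch * oldPitch)
print(String(format: "Sensor area is about %.0f%% larger", (areaRatio - 1) * 100))
// Prints roughly "Sensor area is about 32% larger", in the same ballpark as
// the ~30% figure from Daring Fireball.
```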

It's so different it's led to some speculation that Apple switched from Sony to Samsung as its supplier, with all the changes in characteristics that come with it.

Now, computational front-facing selfies have little to do with rear-facing hardware, but I wanted to highlight just how much and how deeply everything has changed this year.

The RGB camera in the TrueDepth array also has a new sensor on iPhone XS. Apple's only said that it's twice as fast as last year, but that's just burying the lede.

The biggest change is that Apple is tying the new 8-core neural engine in the A12 Bionic into the image signal processor to not only do more, but to do more faster.

Back to the rear camera for a moment, only because Apple has provided more detail on what it's doing there, but I don't believe the processes are that dissimilar.

From the moment you open the camera, it starts buffering so that there's zero shutter lag when you go to take the photo. Like I said in my review, that's not new. That it can buffer 4 frames now in order to better isolate and capture motion is new. At the same time, it's also capturing underexposed versions of each frame to preserve highlight details. And, once you take the shot, it's capturing a long exposure as well so you can get even greater details from the shadows.
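To make that concrete, here's a deliberately tiny sketch of the general idea: pick a sharp base frame, pull highlight detail from an underexposed capture, and pull shadow detail from a long exposure. It's a toy over flat luminance arrays with made-up thresholds, not Apple's actual Smart HDR pipeline:

```swift
// Toy illustration of multi-frame HDR merging. Frames are flat arrays of
// luminance values in 0...1; the 0.9 and 0.1 thresholds are invented here
// purely to show which capture contributes where.
func naiveHDRMerge(base: [Double],
                   underexposed: [Double],
                   longExposure: [Double]) -> [Double] {
    var merged = [Double]()
    merged.reserveCapacity(base.count)
    for i in base.indices {
        let b = base[i]
        if b > 0.9 {
            merged.append(underexposed[i])   // highlights near clipping: use the underexposed frame
        } else if b < 0.1 {
            merged.append(longExposure[i])   // deep shadows: use the long exposure
        } else {
            merged.append(b)                 // midtones: keep the sharp base frame
        }
    }
    return merged
}

let merged = naiveHDRMerge(base: [0.95, 0.5, 0.05],
                           underexposed: [0.7, 0.3, 0.02],
                           longExposure: [1.0, 0.6, 0.2])
// merged == [0.7, 0.5, 0.2]: highlight detail from one capture,
// shadow detail from another, the sharp base everywhere else.
```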

By the way, it's also doing something similar for up to 30 frame-per-second 4K video by capturing extended dynamic range data in between each of those frames and seconds.

So, #BeautyGate...

It's the all-new, all-better optics, combined with the huge advance in computational photography that Apple is calling Smart HDR, that's leading to the new selfie look.

Specifically, what we're seeing with #beautygate is the result of the new noise reduction and extended dynamic range.

Matthew Panzarino, former pro photographer, current sneaker aficionado, and editor-in-chief of TechCrunch, tweeted it this way:

https://twitter.com/panzer/status/1046870351563505664

Sebastiaan de With, former Apple stitched-leather enabler, DoubleTwist Pentile Anti-Aliasing inventor, and current Leica shooter and Halide designer, did an amazing deep dive on how the new, higher dynamic range creates images very different from traditional contrast-based sharpening filters. From the Halide blog:

It's important to understand how our brains perceive sharpness, and how artists make things look sharper. It doesn't work like those comical CSI shows where detectives yell 'enhance' at a screen. You can't add detail that's already been lost. But you can fool your brain by adding small contrasty areas.

Put simply, a dark or light outline adjacent to a contrasting light or a dark shape. That local contrast is what makes things look sharp.

To enhance sharpness, simply make the light area a bit lighter near the edge, and the dark area a bit darker near the edge. That's sharpness.

The iPhone XS merges exposures and reduces the brightness of the bright areas and reduces the darkness of the shadows. The detail remains, but we can perceive it as less sharp because it lost local contrast. In the photo above, the skin looks smoother simply because the light isn't as harsh.

Observant people noticed it isn't just skin that's affected. Coarse textures and particularly anything in the dark — from cats to wood grain — get a smoother look. This is noise reduction at work. iPhone XS has more aggressive noise reduction than previous iPhones.
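To make that "local contrast" point concrete, here's a minimal unsharp-mask sketch over a single row of luminance values, the classic trick Seb describes of lightening the light side of an edge and darkening the dark side. It's a toy illustration, not the XS pipeline:

```swift
// A 1D unsharp mask: perceived sharpness comes from boosting local contrast
// at edges, not from recovering lost detail. `amount` controls how strong
// the light/dark halo around each edge becomes.
func unsharpMask(_ row: [Double], amount: Double = 1.0) -> [Double] {
    // A simple 3-tap blur stands in for the low-pass "mask".
    let blurred = row.indices.map { i -> Double in
        let left = row[max(i - 1, 0)]
        let right = row[min(i + 1, row.count - 1)]
        return (left + row[i] + right) / 3.0
    }
    // Add back the difference between original and blur: light pixels near
    // an edge get lighter, dark ones get darker, clamped to 0...1.
    return zip(row, blurred).map { original, blur in
        min(max(original + amount * (original - blur), 0), 1)
    }
}

// A soft edge from dark (0.2) to light (0.8):
let edge = [0.2, 0.2, 0.2, 0.5, 0.8, 0.8, 0.8]
print(unsharpMask(edge))
// The value just below the edge dips under 0.2 and the one just above it
// rises above 0.8: that overshoot is what our eyes read as "sharper".
```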

Seb goes on to note that it's a result of just how fast iPhone XS is taking photos now, in both shutter speed and ISO, and that the noise that comes with that speed requires new and different kinds of reduction.

On the rear-facing camera, with its big, bright sensor, even in low light, it's not as noticeable. On the much smaller front-facing camera sensor, it's more noticeable.
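A rough way to see why: photon shot noise grows with the square root of the light collected, so signal-to-noise also scales with that square root, and a smaller pixel at a faster shutter simply collects less. The photon counts in this sketch are made up purely for illustration, not measured from any iPhone:

```swift
import Foundation

// Photon shot noise follows Poisson statistics: for N collected photons the
// noise is sqrt(N), so SNR = N / sqrt(N) = sqrt(N). Less light collected
// means a lower SNR, which then gets hidden with stronger noise reduction.
func shotNoiseSNR(photons: Double) -> Double { photons.squareRoot() }

let rearPixel  = 4000.0 // hypothetical photons collected by a big rear pixel
let frontPixel = 1000.0 // hypothetical photons for a smaller, faster selfie shot

print(String(format: "Rear SNR ~ %.0f, front SNR ~ %.0f",
             shotNoiseSNR(photons: rearPixel),
             shotNoiseSNR(photons: frontPixel)))
// Roughly 63 vs. 32: the noisier front image needs more aggressive smoothing.
```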

So, it all comes down to this, which is what it always comes down to: Design is compromise and engineering is trade-offs.

With iPhone XS, you get much better exposure, much better dynamic range, detail in highlights and shadows, which means fewer blow-outs and less banding, and a much, much higher tolerance for back, side, or just plain bad lighting. Which means more better selfies in more badder places. But it comes at the cost of what's traditionally been seen as edge detail and image texture.

And, yeah, it has nothing to do with beauty filters or faces — the iPhone XS camera treats all similar objects similarly with this new pipeline.

It's similar if not the same as what Austin Mann mentioned in his iPhone XS camera review: That the dynamic range is now so good he's finding it nearly impossible to shoot silhouettes anymore. Every step forward, dammit, leaves something behind.

Bring the noise

The bad news is, if you hate the way selfies look on iPhone XS, since there's no beauty mode, there's no way for Apple to add an on/off switch for the smoothing. It's an integral part of the entire process. Also, if you shoot RAW, as Seb also explains in his deep dive, you're in for a world of computationally-optimized hurt as neural networks and ISPs become increasingly, inextricably coupled together.

Apple, for its part, really, truly, deeply believes the new imaging pipeline is better than the previous one and better than what anyone else is doing today. If you disagree — and when it comes to the selfie results, I personally disagree hard — or soft, or smooth, or whatever — it's important to let Apple know. A lot. Because pipelines can and will be tweaked, updated, and improved over time.

And, like I said, if they can detect and preserve fabric, rope, cloud, and other textures, why not skin texture as well?

For now, if you want to avoid it, and you want to shoot in traditional RAW, you're going to have to get a third-party app and go manual.
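Under the hood, that roughly means asking AVFoundation for a RAW (DNG) capture instead of the processed result. A minimal sketch, assuming you've already set up an AVCaptureSession and added an AVCapturePhotoOutput to it (permissions and error handling omitted):

```swift
import AVFoundation

// Sketch: request a Bayer RAW (DNG) capture from an already-configured
// AVCapturePhotoOutput, sidestepping the processed pipeline.
func captureRAW(from photoOutput: AVCapturePhotoOutput,
                delegate: AVCapturePhotoCaptureDelegate) {
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        print("This camera/output doesn't offer a RAW format")
        return
    }
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}

// In your AVCapturePhotoCaptureDelegate, photo.fileDataRepresentation()
// then hands you the DNG data to save or process yourself.
```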

Rene Ritchie
Contributor

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.

20 Comments
  • A well explained article, but it avoids the elephant in the room: the Pixel 2 also uses a similar system of HDR and doesn't make every selfie look like it's taken in a wax museum.
  • It's a different algorithm of noise cancellation. In some pictures it removes noise that the Pixel 2 doesn't, but it is too aggressive on faces
  • It's automatic skin smoothing to make selfies look air-brushed. This has absolutely nothing to do with image stacking and HDR. The image stacking is actually designed to retain details while reducing noise, not smooth out skin and zero out fine details. Go use a Pixel Phone, FFS. Anyone who tries to use that excuse is simply ignorant of the technology and how it works. It isn't even new. Apple has been using image stacking for years, as has Google and Samsung. Only now have they started doing Beauty Mode on the images. Compare an iPhone 8 Plus to an iPhone XS - Front AND Back Camera. This is not an HDR/Image Stacking thing. The reason why Beauty Mode is a thing is because it's popular in some Asian Markets. It doesn't look bad with Asian Skin Tones, especially females who are wearing (possibly heavy) makeup, anyways. In that case, it "works" as intended. They tend to have relatively smooth, even skin, anyways... so the effect doesn't stand out as much. Where it does, is when you're male with stubble and a few wrinkles, or African, etc. Those people are going to notice this effect almost immediately, on the first photo. It's the reason why so many people complain about Samsung pictures, but Apple's Beauty Mode is stronger than the strongest setting on a Note9. It's really drastic. There is no #BeautyGate. It is what it is. It's pretty obvious what's happenings and it has nothing to do with HDR Image Stacking; unless Apple's developers are just that bad... This is being added to more and more phones as companies push harder into the Asian Markets, because the NA and European Markets are "sort of" plateauing. Also, there is aggressive noise reduction, which further works to decrease details. This has always been an issue on iPhones, and why they've tended to produce "water color" images in mixed-low light. That doesn't help things. However, the whole purpose of using image stacking is to reduce noise while maintaining details; so this makes very little sense "technologically," unless Apple's implementation is simply super weak in comparison to Google and Samsung's (who has Beauty Mode, but the iPhone's looks stronger and worse in comparison).
  • This^. This "explanation" was rather embarrassing. It's pretty clear that there is no beauty mode on the Pixel 2 XL and iPhone X even though they both use the same camera technology, as you've explained above, yet beauty mode is suddenly there on the iPhone XS. It's clear as day on the selfies I take on my white, stubbly face.
  • The funny thing about taking selfies with the Pixel 2 XL is that sometimes it captures your skin a little too well and can make a person feel a bit self conscious. And I say that as someone who loves the phone, especially the camera.
  • It's not a beauty mode though, it's the noise cancellation algorithm being different on the XS, which I agree looks bad
  • There's no "Beauty Mode" on the iPhone, it's simply the aggressive noise reduction as you stated
  • In camera settings, you can choose to turn smart HDR off and/or "keep normal photo"
  • Embarrassing explanation that shows a lack of understanding the technologies being used. Cmon.
  • What's embarrassing about this explanation? I thought it was pretty thorough
  • What's embarrassing is that he's unaware of how the technology works. This excuse literally only makes sense to people like you.
  • At least explain what part(s) is wrong, because all you've said is that it shows a lack of understanding, without mentioning anything else
  • Rene to apple's rescue again.... Come on lol.
  • It's called an explanation.
  • What a joke. Imore from hundreds of comments now has only 3 to 7 per article at best. Rene isn’t happy. He is working hard to make this number 0. Then no more Apple invitations and free devices for the poor Rene. Aren’t you tired of licking Apple balls man?
  • I haven't really noticed the comments going down, are you sure you're not referring to articles where the old comments get brought over because the article is updated rather than being started fresh?
  • This can't be true. Rene spent so much energy demystifying this and they now fix it by software update?
  • Yes? There's nothing strange about that. The TL;DR version is basically that Apple changed the noise cancellation algorithm which created this undesired effect, so Apple will have either changed it back to the old algorithm, or improved the new one.
  • I thought it was working as intended, though? The tune changes swiftly around these parts. Even those camera app people were vomiting up excuses for this, and quoted here.
  • Nice report, really. Beyond noise reduction and picking the sharpest buffered frame, IMO it does not explain why skin tones and white balance change as soon as a face is detected during the capture. This happens whether you're capturing photos from the front camera, the rear camera, or even shooting a video. Just try to cover your face with your hand during a selfie and pay attention to how your skin color changes. Such white balance and weird skin tones happen whether Smart HDR is active or not. It happens whether manual HDR is active or not. So iOS 12 is doing something in the background. And yes, iOS 12.1 still does. Looking at the pictures some time later, the orangey faces can look kind of normal, but it's just not fine to me, far enough from real colors that it causes a kind of uncanny valley effect. Why should HDR vary white balance? Why is it especially noticeable in faces? This effect IMO is contributing to the porcelain effect some users have complained about. Furthermore, I noticed erratic global white balance during camera preview that I didn't observe in previous models, at least with the 5s and 6s that I have, which has resulted in quite a few photos over the last days with evident magenta-biased coloration. Curiously, I found the same issue mentioned in just a single post on the web in recent weeks. I have the XR model, BTW.
    I hope Apple rectifies these weird camera issues. To me it is unacceptable at the price of the X-family devices, to such an extent that I'm seriously thinking of going for an older model and saving some bucks. Cheers