
iPhone X vs. Pixel 2: Portrait Selfie Showdown

Portrait Selfies are my favorite thing happening in smartphones right now. I can feel your eyes rolling as you read this, but hear me out for a moment. On a technical level, we're seeing a massive push to improve the quality of the front-facing camera while applying depth maps and color correction so a photo can be instantly transformed into something new. It's fascinating and fun, and in these early days of the feature, two massively different methods are being used to create the effect.

Apple's TrueDepth camera features multiple sensors to create a depth map in real time on the iPhone X. Google's solution relies entirely on machine learning and an impressive single camera to accomplish the same effect on the Pixel 2. Here's a look at how these two methods compare!
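However the mask is produced — hardware depth sensing on the iPhone X, learned segmentation on the Pixel 2 — the final compositing step is conceptually similar: blur the whole frame, then blend between the sharp and blurred pixels using the mask. Here's a minimal toy sketch in Python (2D grayscale lists standing in for a real image; all function names are hypothetical and this is not either company's actual pipeline):

```python
def box_blur(img, radius=1):
    """Naive box blur over a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def portrait_composite(img, mask):
    """Blend a sharp foreground with a blurred background.

    mask[y][x] is 1.0 where the subject is (keep sharp) and 0.0
    for background (use the blurred pixel); fractional values
    feather the edge. A too-hard mask produces the "cut-out
    sticker" look; a mask that bleeds produces out-of-focus hair.
    """
    blurred = box_blur(img, radius=1)
    return [
        [m * p + (1.0 - m) * b for p, b, m in zip(row, brow, mrow)]
        for row, brow, mrow in zip(img, blurred, mask)
    ]

# Toy 3x3 frame with a bright "subject" pixel in the center.
frame = [[10.0, 10.0, 10.0],
         [10.0, 90.0, 10.0],
         [10.0, 10.0, 10.0]]
# Mask keeps only the center sharp.
mask = [[0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0]]
result = portrait_composite(frame, mask)
```

The interesting differences between the two phones live almost entirely in how `mask` gets made — and in how its mistakes look afterward.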

Left: Google Pixel 2. Right: Apple iPhone X.

The first thing you need to understand when comparing these cameras is where Apple and Google place their priorities in the photos being produced. On a technical level, these two phones have some of the best photo capabilities in the world right now; the only real difference for most people is which parts of the photo, and which features, take priority.

In these shots, two things become quickly apparent. The background in the iPhone X shot is blown out in an attempt to make sure my face is well lit and properly colored. The background in the Pixel 2 shot is better balanced with the rest of the photo thanks to Google's HDR+, but as a result my face is noticeably darker and takes on a slightly reddish tint you don't see in the other shot.

There are also substantial differences in the blurring effect used to create Portrait Selfies. Google's software applies a more aggressive blur, and it makes imperfections in the depth map stand out quite a bit. It almost looks like a cut-out sticker of me has been applied to this photo. Apple's blur looks a little more natural, right up until you get to my hair and parts of me start to look a little out of focus. Neither shot really "nails" the picture due to their respective faults, but the flaws are subtle enough that you still get a pair of fun photos.

Here's another example, only this time it's an outdoor shot zoomed in to 100%.

This pair of shots leads to a lot of the same conclusions. Apple better lights my face, in particular my eyes, while Google better lights the background and makes the whole photo feel richer. Apple's blurring extends too deep into my hair and makes part of me look out of focus, while Google's blur is so severe it actually edited out some of my hair that was sticking out of place. Without Portrait Mode it's clear the iPhone X took the better photo, but with that photo mode enabled it's easier to appreciate the effect from Google's software.


There are some natural limitations to both of these photo modes, especially in these early days. The iPhone X will flash warnings in Portrait Mode if the background is too bright or if objects in the background are too far away to grab a proper depth map. Google's algorithm has a nasty habit of editing out corners of prescription glasses or objects in the foreground that aren't attached to a person. There's a bit of trial and error in getting both to work correctly, which in theory gets less frequent over time as Apple and Google continue to improve this tech.

The biggest thing Apple has over Google in this Portrait Mode right now has to be Portrait Lighting. Being able to see the possible edits to the photo in real time as you go to take the picture is a big deal, and the ability to continue editing Portrait Lighting after the photo has been taken is incredible. It's a very hit-or-miss feature right now, especially when you start looking at Stage Lighting, but right now only the iPhone has it.

As far as editing the photos goes, Google has something Apple currently doesn't and probably should. When you take a Portrait Mode photo with a Pixel 2, you get a pair of photos. One photo is edited with the blurred areas, and the other is just a plain selfie with the front camera. This gives you the ability to edit both photos and see which you prefer. Apple keeps everything as a single photo, which in my opinion makes it a little less convenient to compare edits between the two modes. On the other hand, it's one fewer photo in your Camera Roll, so this is probably just personal preference.

There's a lot to be said about the "right" way to take a photo, and even more to be said about the virtues of color accuracy and feature effectiveness. Apple and Google seem to be approaching this photo technique from opposing directions, and it seems likely what we'll see in the not-too-distant future is some kind of meeting in the middle. Right now, it's not immediately clear that Apple's hardware-based approach is measurably better than Google's software-based approach to Portrait Selfies. All of the things that make the iPhone X camera take a great selfie already exist in the iPhone 8 and 8 Plus. At the same time, it's not clear how much better Google can make its photo editing software with a single camera and a lighting preference for the whole photo instead of the face.

One thing is absolutely clear — we're nowhere near the end of Apple and Google giving us more and better selfie features.

Russell is a Contributing Editor at iMore. He's a passionate futurist whose trusty iPad mini is never far from reach. You can usually find him chasing the next tech trend, much to the pain of his wallet. Reach out on Twitter!

  • I'm just here to say incredible weight loss transformation, dude. Look great! Oh and I like the iPhone X selfie cam better :)
  • Pixel 2 is better.
  • Love how everyone does just a camera showdown. Does nothing else matter? Love all the Samsung and Google fans saying iPhone is behind and does not compare, but they are all pretty close and it depends on what shot is being made. Plus most people are not going to have the phones next to each other. For me it is the entire phone that matters, and the ecosystem. So hands down, iPhone wins for me. Wish I had the funds to do these tests, because I have a feeling that some people fudge the results. I know when I have a friend with an Android their pics never look as good.
  • We're at a point now where not much new hardware gets added on a phone, so essentially you get a better version of what's already there. Faster CPU/GPU, better screen, higher quality camera, etc. For the general consumer, the most important change out of these is the camera, and it's probably also the most noticeable change as well. But you are right, the whole phone should matter, and there are many people who don't care about the camera on their phone and are more focused on other features.
  • It's worth noting that portraits where the hair and ears are blurred out have long been a staple of close-up professional portrait photography. It's not the wrong way to approach portraiture in that sense. The difference between the traditional way (large aperture telephoto lens and proper lighting) and these new computational methods is the falloff from in-focus to out-of-focus areas. On the former, if done correctly, it's a gradual, creamy slope. On the Pixel 2 and iPhone X images, it's like it simply fell off a cliff.