iPhone 8 screen resolution vs. screen technology

Rene using an iPhone (Image credit: Rene Ritchie)

Confession: I hadn't even been considering the screen resolution of iPhone 8 (D22). I'm no stranger to such considerations. I went more than a little overboard running the resolution options on a 4-inch iPhone back before iPhone 5 shipped. Likewise imagining the resolution of a 5-inch iPhone before iPhone 6 and iPhone 6 Plus shipped.

This year, though, John Gruber kicked my ass. From Daring Fireball:

Using similar logic, and considering all of the rumors and purported part leaks, I have a highly-educated guess as to the dimensions of the D22 display: 5.8 inches, 2436 × 1125, 462 PPI, true @3x retina with no scaling.

I think the reason I didn't do the math this year is that, back then, scaling iOS to different screen sizes was a great unknown. It had been bound by pixel perfection to the original size, then the original size plus a row, for years. Breaking those bonds was a big problem for Apple to solve, and imagining how Apple might solve big problems excites me.

There's a great story about iPad mini and how Apple solved for scaling iOS down:

Originally, Steve Jobs was adamantly opposed to a smaller iPad. An argument led by Eddy Cue, though, ultimately changed his mind. So, an iPad mini was put on the release schedule. That created two problems: How would Apple scale the interface, and how would the already overwhelmingly busy team do it in what was the smallest turnaround window of any iOS product in its history?

It turns out, they got lucky. Instead of having to redesign and implement a custom interface, they could simply scale down the existing 9.7-inch UI: applied to the new 7.9-inch display, it matched iPhone pixel density and touch target size almost exactly. It wouldn't be too small to "read" or to interact with. Indescribable relief for all involved, including customers, who then got the benefit of a consistent experience across products.
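
The arithmetic behind that luck is easy to check. Here's a quick sketch using the published screen specs (the densities land within a couple percent of each other, which is why the scaled-down UI just worked):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Original 9.7-inch iPad: 1024 x 768
ipad_ppi = ppi(1024, 768, 9.7)       # ~132 PPI

# The same 1024 x 768 grid shrunk onto the iPad mini's 7.9-inch panel
ipad_mini_ppi = ppi(1024, 768, 7.9)  # ~162 PPI

# Original (non-Retina) iPhone: 480 x 320 at 3.5 inches
iphone_ppi = ppi(480, 320, 3.5)      # ~165 PPI

print(round(ipad_ppi), round(ipad_mini_ppi), round(iphone_ppi))
```

Shrinking the panel while keeping the pixel grid fixed pushed the iPad's density up into iPhone territory, so every existing touch target stayed a comfortable physical size.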

Scaling iOS up from 4 inches to not just 4.7 inches but 5.5 inches as well was a much bigger problem to solve. Apple couldn't simply make the UI bigger because it would lose information density and touch targets would become oversized.

So, the teams eventually turned to Auto Layout, a technology that was ported to iOS from the Mac, and built out Size Classes, Dynamic Type, and other frameworks that allowed for design and scaling beyond the absolute coordinates of the pixel grid. (And would go on to enable things like side-by-side Split View apps on iPad Pro as well.)

It made iOS, if not resolution independent, effectively resolution non-dependent. It still requires developers to implement the technologies and to apply just as much care and craft to design as they did in the era of pixel perfection, but it's essentially a solved problem.

That's the reason I stopped thinking about it, but it's not an excuse.

John again:

So we know that iOS 11 has support for a 2436 × 1125 iPhone display. We know that 462 PPI is the "natural" (no scaling) resolution for @3x retina on iPhone. We know that a 2436 × 1125 display with 462 PPI density would measure 5.8 inches diagonally. We know that all rumors to date about the D22 iPhone claim it has a 5.8-inch display. We know that a 5.8-inch display with a 2.17:1 aspect ratio (2436/1125), combined with 4-5mm bezels on all sides, would result in a phone whose footprint would be just slightly taller and wider than an iPhone 7. And we know that all rumors to date say that D22 is slightly bigger than an iPhone 7.
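You can sanity-check those figures with a few lines of Python. The numbers are Gruber's; the math is just the Pythagorean diagonal:

```python
import math

# Rumored D22 panel: 2436 x 1125 at 462 PPI (true @3x, no scaling)
w_px, h_px, density = 2436, 1125, 462

diagonal_in = math.hypot(w_px, h_px) / density  # ~5.8 inches
aspect = w_px / h_px                            # ~2.17:1

# Physical width of the active display area, before bezels
width_in = h_px / density                       # ~2.44 inches

print(f"{diagonal_in:.2f} in diagonal, {aspect:.2f}:1, {width_in:.2f} in wide")
```

The diagonal comes out to about 5.81 inches and the aspect ratio to 2.17:1, so the rumored resolution, density, and physical size are all mutually consistent.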

There's also this: The next big break point for phone resolutions is far beyond anything Apple ships today.

Right now, Retina — the density at distance where you no longer see pixels — is tuned for reading distance. The only reason it needs a major improvement is for virtual reality (VR), where it leaps to lens distance. That's going from how far away you typically hold a book to how far away you typically place your sunglasses.
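
One way to put rough numbers on "density at distance" is the common one-arcminute rule of thumb for visual acuity. This is a sketch, not Apple's definition, and the viewing distances are my own ballpark figures:

```python
import math

# ~1 arcminute: a common threshold below which individual pixels blend together
ARCMIN = math.radians(1 / 60)

def retina_ppi(distance_in: float) -> float:
    """PPI at which one pixel subtends one arcminute at the given distance."""
    return 1 / (distance_in * math.tan(ARCMIN))

reading = retina_ppi(12)  # ~286 PPI at a typical ~12-inch reading distance
vr_lens = retina_ppi(2)   # ~1719 PPI at a rough ~2-inch lens distance

print(round(reading), round(vr_lens))
```

By that rule of thumb, today's ~300-460 PPI phones are already past "Retina" at arm's length, but lens distance would demand several times the density of anything shipping now.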

Getting a phone to that high a density, at that high a frame rate, is a significant challenge — assuming Apple is even considering it, given the company's current focus on augmented reality. (Which only requires classic Retina levels of density given its applicable distance.)

In other words, it would depend on whether or not Apple's plans for iPhone include strapping it over your eyes.

But that's the future. Fascinating as it one day might be, I'm currently more intrigued by how Apple might solve the problems of the present. And that's less about display resolution and more about display technology.

For instance:

  • How will Apple handle color management with OLED in the mix? There's no world where they'll take the Samsung cop-out of forcing users to manually switch color profiles (yes, like animals) — so how will the company maintain its currently top-notch color matching across devices, new and old?
  • Similarly, Apple and developers alike will want to provide consistent visuals between devices that, as of this fall, will have very different screen technologies. How will Apple, internally and externally, help make sure the different elements and geometries all work out to the same experience?
  • Once I see Apple's implementation of OLED, how will it feel to go back to LCD on other devices?
  • What's the new 3D Touch technology going to feel like? The old 3D Touch measured deformation of the screen using sensors built into the LCD backlight to determine pressure. OLED has no backlight so, rumor has it, Apple is using a film-based technology instead. Will the change be unnoticeable to users or will it, combined with the potential for new taptics, create an entirely new experience?
  • Burn-in is more common on OLED than on LCD. What, if anything, will Apple do to mitigate that for persistent on-screen controls like the theoretical virtual home button? I've seen enough persistent Poké Balls burned into Samsung phones to know I don't ever want to see one while I'm trying to watch a video on an OLED iPhone.

As to that last part, while I love the idea of a virtual home button and function area — and loved the gesture area on webOS back in the day — a literal virtualization doesn't seem long-lived to me.

Of course, nothing is final or confirmed until Tim Cook or another Apple executive holds it up on stage. Until then, though, it's fun to consider all the myriad possibilities.

What do you want to see?

Rene Ritchie

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.