Apple camera execs talk iPhone 12 in a brand-new interview

iPhone 12 Pro review (Image credit: Daniel Bader / Android Central)

What you need to know

  • The cameras in Apple's iPhone 12 lineup are the best yet.
  • Two of the people responsible for them have been talking to photography site PetaPixel.
  • Francesca Sweet, Apple's Product Line Manager for iPhone, and Jon McCormack, its Vice President of Camera Software Engineering, discussed the cameras.

The launch of Apple's iPhone 12 and iPhone 12 Pro models brings the best cameras ever attached to an iPhone, and the company has a ton of people working to make sure they're as impressive as can be. Two of those people, Francesca Sweet, Apple's Product Line Manager for iPhone, and Jon McCormack, its Vice President of Camera Software Engineering, have been speaking to photography site PetaPixel.

Straight out of the gate, it's clear that Apple doesn't think of a camera as just the lenses that make the magic happen. Instead, the report notes that Apple sees camera development as something that spans everything about the iPhone, including the A14 Bionic.

In the interview, both made clear that the company thinks of camera development holistically: it's not just the sensor and lenses, but everything from Apple's A14 Bionic chip to the image signal processing to the software behind its computational photography.

Apple's work to close the computational photography gap between the iPhone and Google's Pixel phones has been noticeable over the last couple of years, not least with the addition and steady improvement of Night mode.

The pair also note that the iPhone tries to do what a photographer would normally do in post. It applies machine learning to produce a finished image without the need to take shots into an editing app afterwards.

"We replicate as much as we can to what the photographer will do in post," McCormack continued. "There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there."

On the "develop" side, that's accomplished by breaking an image down into its components so that machine learning can get to work on each one.

"The background, foreground, eyes, lips, hair, skin, clothing, skies. We process all these independently like you would in Lightroom with a bunch of local adjustments," he explained. "We adjust everything from exposure, contrast, and saturation, and combine them all together."

The full interview is absolutely worth a read, with everything covered in far more detail. There are some gorgeous sample shots showing off what these cameras are capable of, too.

Oliver Haslam
Contributor

Oliver Haslam has written about Apple and the wider technology business for more than a decade with bylines on How-To Geek, PC Mag, iDownloadBlog, and many more. He has also been published in print for Macworld, including cover stories. At iMore, Oliver is involved in daily news coverage and, not being short of opinions, has been known to 'explain' those thoughts in more detail, too.

Having grown up using PCs and spending far too much money on graphics cards and flashy RAM, Oliver switched to the Mac with a G5 iMac and hasn't looked back. Since then he's seen the growth of the smartphone world, backed by iPhone, and new product categories come and go. Current expertise includes iOS, macOS, streaming services, and pretty much anything that has a battery or plugs into a wall. Oliver also covers mobile gaming for iMore, with Apple Arcade a particular focus. He's been gaming since the Atari 2600 days and still struggles to comprehend the fact that he can play console-quality titles on his pocket computer.