In a new interview, Apple's Jony Ive, Tim Cook, and other company executives sat down with Bloomberg to discuss 3D Touch on the new iPhone 6s and iPhone 6s Plus. In the interview, the execs touch on how 3D Touch came to be, including the long, winding road of development taken to get the experience just right.
According to Apple design chief Jony Ive, 3D Touch on the iPhone has been a long time in the making, spanning multiple years of development. Phil Schiller expands on the point, noting the difficulty of engineering the experience:
"Engineering-wise, the hardware to build a display that does what [3D Touch] does is unbelievably hard," says Schiller. "And we're going to waste a whole year of engineering—really, two—at a tremendous amount of cost and investment in manufacturing if it doesn't do something that [people] are going to use. If it's just a demo feature and a month later nobody is really using it, this is a huge waste of engineering talent."
Apple's Craig Federighi expanded on this point, delving into the technical hurdles involved in engineering such a unique experience, like interpreting force while accounting for gravity and the orientation of the device:
It starts with the idea that, on a device this thin, you want to detect force. I mean, you think you want to detect force, but really what you're trying to do is sense intent. You're trying to read minds. And yet you have a user who might be using his thumb, his finger, might be emotional at the moment, might be walking, might be laying on the couch. These things don't affect intent, but they do affect what a sensor [inside the phone] sees. So there are a huge number of technical hurdles. We have to do sensor fusion with accelerometers to cancel out gravity—but when you turn [the device] a different way, we have to subtract out gravity. … Your thumb can read differently to the touch sensor than your finger would. That difference is important to understanding how to interpret the force. And so we're fusing both what the force sensor is giving us with what the touch sensor is giving us about the nature of your interaction. So down at even just the lowest level of hardware and algorithms—I mean, this is just one basic thing. And if you don't get it right, none of it works.
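To make the two fusion steps Federighi describes a little more concrete, here is a toy sketch in Python: subtract the gravity component reported by the accelerometer from the raw force reading, then adjust for the larger contact patch a thumb presents to the touch sensor. Every name, constant, and threshold below is an illustrative assumption, not Apple's actual implementation.

```python
# Illustrative sketch only: all constants are made-up calibration values.
GRAVITY_BIAS = 0.18   # assumed gravity contribution to the force reading at 1 g
THUMB_AREA = 110.0    # assumed contact-patch size above which we treat the touch as a thumb
THUMB_SCALE = 0.85    # assumed correction: a thumb registers "heavier" for the same intent

def fused_force(raw_force, accel_z, contact_area):
    """Estimate the intended press force from raw sensor readings.

    raw_force:    force measured normal to the display (arbitrary units)
    accel_z:      accelerometer component along the display normal, in g
                  (+1 face-up on a table, -1 face-down, ~0 held upright)
    contact_area: touch-sensor contact patch size (arbitrary units)
    """
    force = raw_force - accel_z * GRAVITY_BIAS  # cancel out gravity for this orientation
    if contact_area > THUMB_AREA:               # fuse in the touch sensor's reading
        force *= THUMB_SCALE                    # reinterpret the heavier thumb press
    return force
```

The point of the sketch is the dependency Federighi calls out: the same physical press yields different raw numbers depending on orientation and digit, so the force sensor alone cannot be trusted without the other sensors.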
The whole interview is a fascinating read, and even includes some tidbits about setting up the event keynote and Jony Ive's promotion to design chief. For much more, be sure to check out the full interview from Bloomberg at the source link below.