It would be surprising, though not entirely unexpected, to see Apple use time-of-flight sensors only on the back of the next iPhone. Lots of augmented reality hardware already uses these sensors, including Microsoft's HoloLens and Google's Tango platforms.
Where the current TrueDepth sensor does a great job gathering data up close, the smaller time-of-flight sensors necessary for the back of a phone would be better suited to gathering data at room scale. Being able to see how far the wall or the couch is from the phone means augmented reality apps could integrate with the environment more naturally, instead of asking the user to find a suitable space to play in like many ARKit apps currently do.
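The underlying principle is simple: a time-of-flight sensor emits a pulse of light and times how long it takes to bounce back, and since light's speed is known, that round trip gives a distance. A rough sketch of the math (the function name and example timing are illustrative, not anything from Apple's hardware):

```python
# Time-of-flight depth sensing: time how long emitted light takes to
# reflect off a surface and return, then convert that to a distance.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a surface given the round-trip time of a light pulse.

    The pulse travels out to the surface and back, so halve the path.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A wall roughly 3 meters away returns the pulse in about 20 nanoseconds.
print(round(tof_distance_m(20e-9), 2))
```

The nanosecond-scale timing involved is why these sensors need dedicated hardware, but it also means a single pulse can range an entire room far faster than structured light can.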
As with other AR platforms, what we will likely see from this supposed research is a new TrueDepth sensor that combines time-of-flight with the existing structured-light technique for a more complete picture of the world around the iPhone. Either way, an iPhone with better depth sensing on the back is great news for the future of ARKit and a clear indicator of how important Apple thinks this tech is going to be moving forward.