Apple is expected to keep the TrueDepth system, so future iPhones will have both front and rear-facing 3-D sensing capabilities. (Bloomberg/Michael Short)
Apple Inc. is working on a rear-facing 3-D sensor system for the iPhone in 2019, another step toward turning the handset into a leading augmented-reality device, according to people familiar with the plan.
Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, the people said. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3-D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
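The time-of-flight principle described above reduces to a simple calculation: distance is half the measured round trip multiplied by the speed of light. A minimal illustrative sketch (not any vendor's actual firmware; the function name and sample timing are hypothetical):

```python
# Illustrative only: recovering distance from a time-of-flight round trip.
# A ToF sensor times how long emitted light takes to return from a surface;
# the object's distance is half the round trip times the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """Convert a measured round-trip time into a distance in metres."""
    return SPEED_OF_LIGHT * seconds / 2.0

# Light returning after roughly 6.67 nanoseconds implies an object
# about one metre from the sensor.
print(round(distance_from_round_trip(6.67e-9), 2))  # -> 1.0
```

The nanosecond scale of these round trips is why the approach depends on a fast, specialized image sensor rather than on precisely positioned optics.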
The company is expected to keep the TrueDepth system, so future iPhones will have both front and rear-facing 3-D sensing capabilities. Apple has started discussions with prospective suppliers of the new system, the people said. Companies manufacturing time-of-flight sensors include Infineon Technologies AG, Sony Corp., STMicroelectronics NV and Panasonic Corp. Testing of the technology is still in the early stages, and it may not end up in the final version of the phone, the people said. They asked not to be identified discussing unreleased features. An Apple spokeswoman declined to comment.
The addition of a rear-facing sensor would enable more augmented-reality applications on the iPhone. Apple Chief Executive Officer Tim Cook considers AR potentially as revolutionary as the smartphone itself. He’s talked up the technology on Good Morning America and gives it almost as much attention during earnings calls as sales growth. “We’re already seeing things that will transform the way you work, play, connect and learn,” he said in the most recent call. “AR is going to change the way we use technology forever.”
AR on the iPhone X. (Bloomberg/File)
Apple added a software tool called ARKit this year that made it easier for developers to build AR apps for the iPhone. The tool is good at identifying flat surfaces and placing virtual objects or images on them. But it struggles with vertical planes, such as walls, doors or windows, and lacks accurate depth perception, which makes it harder for digital images to interact with real things. So if a digital tiger walks behind a real chair, the tiger is still drawn in front of the chair, destroying the illusion. A rear-facing 3-D sensor would help remedy that.
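The occlusion problem the tiger example describes comes down to a per-pixel depth comparison: with a rear sensor supplying a depth value for each camera pixel, a renderer can draw a virtual object only where it is nearer than the real scene. A toy sketch under that assumption (this is not ARKit code; the function and values are hypothetical):

```python
# Toy sketch (not ARKit): per-pixel occlusion using sensed depth.
# With a depth value for each camera pixel, the renderer keeps the
# virtual object's pixel only where it is closer than the real surface.

def composite_pixel(virtual_depth_m: float, real_depth_m: float,
                    virtual_color: str, camera_color: str) -> str:
    """Draw the virtual pixel only if it is nearer than the real scene."""
    return virtual_color if virtual_depth_m < real_depth_m else camera_color

# A digital tiger 3 m away walking behind a real chair 2 m away:
# the chair pixel correctly hides the tiger.
print(composite_pixel(3.0, 2.0, "tiger", "chair"))  # -> chair
```

Without per-pixel depth, a renderer has no basis for this test and must paint the virtual object on top of the camera image, which is exactly the broken-illusion case described above.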
The iPhone X uses its front-facing 3-D sensor for Face ID, a facial-recognition system that replaced the fingerprint sensor used in earlier models to unlock the handset. Production problems with the sensor array initially slowed manufacturing of the flagship smartphone, partly because the components must be assembled to a very high degree of accuracy.
While the structured-light approach requires lasers to be positioned very precisely, time-of-flight technology instead relies on a more advanced image sensor. That may make time-of-flight systems easier to assemble in high volume.
Alphabet Inc.’s Google has been working with Infineon on depth perception as part of its AR development push, Project Tango, unveiled in 2014. The Infineon chip is already used in Lenovo Group Ltd.’s Phab 2 Pro and Asustek Computer Inc.’s ZenFone AR, both of which run on Google’s Android operating system.