November 14, 2017

Apple Reportedly Working on 3D Sensor System for Rear Camera in 2019 iPhones

by John_A

Apple is developing 3D depth-sensing technology for the rear-facing cameras in its 2019 iPhones, according to a new report from Bloomberg on Tuesday. The 3D sensor system will be different from the one found in the iPhone X’s front-facing camera, and is said to be the next big step in turning the smartphone into a leading augmented reality device.

Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, the people said. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
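For a sense of how the time-of-flight approach works, here is a minimal Swift sketch of the underlying distance calculation. The values are illustrative assumptions, not measurements, and this is the general principle rather than Apple's implementation:

```swift
import Foundation

// Time-of-flight principle: distance is half the round-trip time of a light
// pulse multiplied by the speed of light. Values below are illustrative only.
let speedOfLight = 299_792_458.0      // metres per second
let roundTripTime = 6.67e-9           // seconds; a pulse returning from roughly 1 m away

let distance = speedOfLight * roundTripTime / 2.0
print(String(format: "Estimated distance: %.3f m", distance))  // ≈ 1.000 m
```

Repeating this measurement for every pixel of a dedicated image sensor is what yields the three-dimensional picture of the environment described in the report.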

The existing TrueDepth camera would continue to be used in the front-facing camera of future iPhones in order to power Face ID, while the new system would bring the more advanced “time-of-flight” 3D sensing capability to the rear camera, according to the sources cited. Discussions with manufacturers are reportedly already underway, and include Infineon, Sony, STMicroelectronics, and Panasonic. Testing is said to still be in the early stages, and the technology could end up not being used in the phones at all.

With the release of iOS 11, Apple introduced the ARKit software framework, which allows iPhone developers to build augmented reality experiences into their apps. The addition of a rear-facing 3D sensor could, in theory, let virtual objects interact more convincingly with real-world environments and enhance the illusion of solidity.
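For context, this is roughly the kind of ARKit session an iOS 11 app starts today, which relies on camera frames and motion data rather than a dedicated rear depth sensor. The class and property names here are illustrative; only the ARKit calls (ARSCNView, ARWorldTrackingConfiguration) come from Apple's framework:

```swift
import UIKit
import ARKit

// Minimal ARKit sketch: world tracking plus horizontal plane detection,
// which is what lets virtual objects appear to rest on real surfaces.
class ARViewController: UIViewController {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking estimates the device's pose from camera frames and motion data;
        // plane detection finds flat surfaces to anchor virtual content on.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

A rear-facing time-of-flight sensor would presumably feed this kind of session with true per-pixel depth instead of estimates inferred from parallax and motion.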

Apple was reportedly beset with production problems when making the sensor in the iPhone X’s front-facing camera, because the components used in the sensor array have to be assembled with a very high degree of accuracy. According to Bloomberg, while the time-of-flight technology uses a more advanced image sensor than the existing one in the iPhone X, it does not require the same level of precision during assembly. That fact alone could make a rear-facing 3D sensor easier to produce at high volume.

Late last month, oft-reliable KGI Securities analyst Ming-Chi Kuo claimed that Apple is unlikely to expand its front-facing 3D sensing system to the rear-facing camera module on iPhones released in 2018. Kuo said the iPhone X’s 3D sensing capabilities are already at least one year ahead of Android smartphones, therefore he believes Apple’s focus with next year’s iPhone models will be ensuring an on-time launch with adequate supply.

Related Roundup: iPhone X
Tags: bloomberg.com, ARKit, TrueDepth
Buyer’s Guide: iPhone X (Buy Now)
Discuss this article in our forums
