
November 15, 2017

What could make the iPhone’s dual lenses more awesome? Why, lasers of course

by John_A

Apple’s dual cameras brought features such as DSLR-like bokeh to smartphones, but the next camera tech the company is focusing on could make those features look old school in comparison. An anonymous source inside the company recently told Bloomberg that Apple is working on a 3D laser system that expands the camera’s capabilities — and increases the realism of augmented reality effects.

The tech works similarly to the iPhone X’s front-facing sensor, which lets users unlock the phone with their face and apply fake studio lighting to selfies, but brings the idea to the rear-facing camera. While similar in concept and in its use of lasers, the rear-facing system would instead measure how long each laser pulse takes to bounce off a surface and return, using those timings to build a depth map of the scene. By timing that round trip, the iPhone could create a much more detailed depth map than the two offset lenses of the dual camera can.
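The core of the time-of-flight idea is simple arithmetic: a pulse travels to a surface and back at the speed of light, so halving the round-trip distance gives the depth. Here is a minimal sketch of that relation; the function name is illustrative and has nothing to do with Apple's actual implementation.

```python
# Illustrative sketch of time-of-flight depth: depth = c * t / 2,
# where t is the round-trip time of a laser pulse. Not Apple's code.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a surface given a laser pulse's round-trip time."""
    # The pulse covers the distance twice (out and back), so halve it.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A surface 1.5 m away returns the pulse after about 10 nanoseconds.
round_trip = 2 * 1.5 / SPEED_OF_LIGHT_M_PER_S
print(depth_from_round_trip(round_trip))  # ≈ 1.5
```

The nanosecond-scale timings in the example hint at why this needs dedicated sensor hardware rather than the ordinary camera pipeline.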

A better depth map could in turn be a big boon for augmented reality. With more data about where everything is in the scene, the future iPhone could create a more realistic placement of AR objects. If the smartphone understands the scene in three dimensions, the placement of virtual objects, and even movement, could be adjusted to fit within that specific space.

The unnamed source suggested that the feature could arrive in 2019, but warned that it may not make it into a final product if the 3D laser system doesn’t perform well in testing. Since Apple doesn’t discuss unreleased products, the feature joins a growing list of iPhone rumors. The company is rumored to be releasing a more budget-friendly iPhone alongside a larger, pricier model in 2018. While initial reports suggested those models would use the same TrueDepth tech that enables Face ID, analysts later said the company would instead focus on new models with that feature in the front-facing camera, a rumor that supports the claim that the rear-facing system is coming in 2019.

Depth information is becoming increasingly available in high-end smartphones, but the tech behind the feature varies. Apple’s dual-camera iPhones (including the 7 Plus and the X) compare the views from two different lenses to add depth effects. The Google Pixel 2 instead compares the views from opposite sides of a single pixel, a variation that works with its dual-pixel autofocus system.
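The dual-lens approach relies on parallax: the farther away a point is, the less its position shifts between the two views. The classic stereo relation below is a minimal sketch of that idea, with made-up example numbers, not the actual pipeline used by any of these phones.

```python
# Illustrative stereo-depth sketch: depth = focal_length * baseline / disparity.
# Parameter values below are invented for the example.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth of a point from its pixel shift (disparity) between two views."""
    # Larger disparity means the point shifted more between lenses,
    # which means it is closer to the camera.
    return focal_px * baseline_m / disparity_px

# 1000 px focal length, lenses 1 cm apart, 10 px shift → about 1 m away.
print(depth_from_disparity(1000, 0.01, 10))  # ≈ 1.0
```

The tiny baseline between two smartphone lenses is why disparity-based depth maps are coarse at a distance, and why a time-of-flight system could be a meaningful upgrade.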
