Apple's 2019 iPhone Will Be Able To Shoot Lasers With New 3D Sensor
The iPhone X hasn’t been on the market for two weeks and there are already rumors swirling about the 2019 iPhone. A new report from Bloomberg claims Apple is already developing a new “3D sensor system” for the 2019 iPhone to further improve its augmented reality capabilities.
This new 3D sensor system will apparently be placed on the back of the 2019 iPhone. The sensor would work by firing lasers out of the device and measuring how long it takes for the reflection to return. Using time-of-flight algorithms, this would allow the iPhone to measure depth in real-world environments, people familiar with the matter told Bloomberg.
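To make the time-of-flight principle concrete: because the laser pulse travels to the object and back, the estimated distance is the speed of light multiplied by half the measured round-trip time. The Swift sketch below only illustrates that arithmetic with made-up numbers; it is not Apple's implementation.

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Convert a measured round-trip time (in seconds) for a reflected
/// laser pulse into a depth estimate (in meters). The pulse travels
/// to the object and back, so the one-way distance is half the total.
func depthFromRoundTrip(seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Example: a reflection that returns after roughly 10 nanoseconds
// corresponds to an object about 1.5 meters away.
let depth = depthFromRoundTrip(seconds: 10e-9)
print(String(format: "Estimated depth: %.2f m", depth)) // ~1.50 m
```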
This would be an entirely new technology, distinct from the TrueDepth camera system housed in the notch on the front of the iPhone X. The TrueDepth system projects 30,000 infrared dots onto a user’s face to generate an accurate 3D image, and it is an essential part of the iPhone X that allows Apple’s Face ID facial recognition technology to work.
Besides Face ID and the TrueDepth camera system, Apple already has experience gathering depth information, as pointed out by The Verge. Last year’s iPhone 7 Plus came with a dual-camera setup capable of taking portrait photos: the two cameras compare their slightly offset views of the scene to estimate depth, separating the subject from the background so the background can be blurred. The same technology is still used on the iPhone 8 Plus and the iPhone X. It’s very likely that the new 3D sensor system will work in tandem with a set of dual cameras.
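For context on how two offset cameras yield depth, the standard stereo model makes depth inversely proportional to the disparity between matching points in the two images. The Swift sketch below uses hypothetical focal length, baseline and disparity values purely for illustration; they are not Apple's actual camera parameters.

```swift
import Foundation

/// Simplified pinhole stereo model: depth = focalLength * baseline / disparity.
/// focalLengthPixels and disparityPixels are in pixels, baselineMeters in meters.
func stereoDepth(focalLengthPixels: Double, baselineMeters: Double, disparityPixels: Double) -> Double? {
    guard disparityPixels > 0 else { return nil } // no parallax, depth unknown
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Example with hypothetical values: a 2800-pixel focal length, a ~1 cm
// lens separation and a 20-pixel disparity put the point at about 1.4 m.
if let depth = stereoDepth(focalLengthPixels: 2800, baselineMeters: 0.01, disparityPixels: 20) {
    print(String(format: "Estimated depth: %.2f m", depth)) // ~1.40 m
}
```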
Bloomberg’s report claims that Apple plans to keep the TrueDepth system, meaning future iPhones will have both front and rear-facing 3D sensing capabilities. Sources say that Apple has already begun looking for possible suppliers for the new 3D sensor, including Infineon Technologies AG, Sony Corp., STMicroelectronics NV and Panasonic Corp. Infineon stands out from the pack since it already worked with Google on the AR-centric Project Tango back in 2014. Infineon’s depth-sensing technologies are already used in Lenovo’s Phab 2 Pro and the Asus ZenFone AR.
Apple’s plan with this new 3D sensor is to enable more advanced augmented reality apps for future iPhones. Apple has already made a huge bet on augmented reality by developing ARKit, the tech giant’s own augmented reality platform that’s natively part of iOS 11.
The introduction of ARKit allowed developers to create AR apps that can identify flat surfaces, but it still has trouble with walls, doors and windows, and it cannot measure depth accurately. The planned 3D sensor could solve those issues: it would help AR apps better understand real-world environments, letting digital objects interact more convincingly with real ones.
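For reference, this is roughly how an iOS 11 app turns on ARKit's plane detection today; the shipping API only detects horizontal planes such as floors and tables, which is why walls, doors and windows go unrecognized. A minimal sketch, assuming an ARSCNView already set up in a storyboard:

```swift
import UIKit
import SceneKit
import ARKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    // Assumed to be connected in the storyboard for this sketch.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking with plane detection enabled. In iOS 11, ARKit
        // only detects horizontal planes, not vertical surfaces like walls.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit anchors a newly detected flat surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // planeAnchor.extent gives the estimated size of the detected surface.
        print("Detected plane of size \(planeAnchor.extent.x) x \(planeAnchor.extent.z) m")
    }
}
```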
Apple is still in the early stages of developing the new 3D sensor system, and it might not even make it into the final version of the 2019 iPhone, the sources said.