According to a Bloomberg report released on Tuesday, Apple is developing 3D depth-sensing technology for the rear-facing camera of iPhones to be launched in 2019. The new sensor system would work quite differently from the iPhone X's front camera, and is said to be Apple's next major step toward turning the smartphone into a leading augmented reality device.
According to people familiar with the matter, Apple is evaluating a 3D scene-sensing technology that is entirely different from the one used in the iPhone X's front-facing TrueDepth sensor system. The existing system relies on structured light: it projects a pattern of 30,000 laser dots onto the user's face and measures the distortion of that pattern to generate an accurate 3D image for authentication.
The planned rear sensor would instead use an approach called time-of-flight, which measures how long it takes laser pulses to bounce back from surrounding objects, building a three-dimensional image of the environment.
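The measurement principle behind time-of-flight is simple geometry: light travels at a known speed, so the round-trip time of a reflected pulse directly encodes distance. The sketch below is a toy illustration of that relationship only, not a description of Apple's or any vendor's actual sensor pipeline; the function name and example timing are invented for the purpose of the illustration.

```python
# Toy illustration of the time-of-flight principle (not any real
# sensor's implementation): distance is inferred from the round-trip
# time of an emitted light pulse.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance to an object, given the time a
    light pulse took to travel out to the object and back."""
    # Divide by 2 because the pulse covers the distance twice.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after about 20 nanoseconds implies an object
# roughly 3 metres away.
print(round(distance_from_round_trip(20e-9), 2))
```

A real depth camera repeats this measurement for every pixel in the sensor array at once, which is why it can produce a full 3D map of a scene rather than a single range reading.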
According to the same sources, the existing TrueDepth camera will remain on the front of future iPhones to support face recognition across various features and applications, while the new system would add more advanced time-of-flight 3D sensing to the rear. Apple has reportedly begun discussions with manufacturers including Infineon, Sony, STMicroelectronics and Panasonic, and early testing is under way, but whether the technology will ultimately be used in a phone remains to be seen.
With the release of iOS 11, Apple introduced the ARKit software framework, which lets iPhone developers build augmented reality experiences into their applications. In theory, adding a rear-facing 3D sensor would allow virtual scenes to interact with the real world more precisely, producing a more convincingly immersive experience.
Apple was reportedly plagued by production problems with the sensors for the iPhone X's front camera, because the components in the sensor array must be assembled to very tight tolerances. According to Bloomberg, although time-of-flight technology uses a more advanced image sensor than the iPhone X's, it does not demand the same precision in assembly. If Apple does add a rear-mounted 3D sensor to a new iPhone, mass assembly should therefore be considerably simpler.
Late last month, KGI Securities analyst Ming-Chi Kuo said Apple is unlikely to extend its front-facing 3D sensing system to the rear camera module of the new iPhones to be released in 2018. Kuo argued that the iPhone X's 3D sensing technology is already about a year ahead of Android smartphones, so he believes Apple will instead focus on securing sufficient supply of the existing system for next year's iPhones.