Lu believes the triple-lens camera system will enable advanced 3D sensing via stereoscopic vision, with two of the sensors able to capture images of a single object from different angles. A triangulation method would then be used to obtain the distance between the iPhone and the object.
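The report does not describe Apple's actual math, but the triangulation principle Lu refers to is the standard stereo-vision formula: for two rectified cameras, depth is the focal length times the baseline (distance between the lenses) divided by the disparity (how far the object shifts between the two images). A minimal sketch, with entirely hypothetical numbers for the lens spacing and focal length:

```python
# Illustrative sketch of stereo triangulation -- not Apple's implementation.
# All parameter values below are hypothetical assumptions.
def stereo_depth_mm(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth = (focal length in pixels * baseline) / disparity, for rectified views."""
    if disparity_px <= 0:
        raise ValueError("object must appear shifted between the two views")
    return focal_px * baseline_mm / disparity_px

# Hypothetical example: lenses 10 mm apart, 2800 px focal length,
# object shifted 40 px between the two captured images.
depth = stereo_depth_mm(focal_px=2800, baseline_mm=10, disparity_px=40)  # 700 mm
```

Note the inverse relationship: the smaller the disparity, the farther the object, which is why closely spaced phone lenses are most accurate at short range.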
The report suggests that, like the TrueDepth system on the front of the iPhone X, the rear 3D sensing would be used for augmented reality purposes. The prediction lines up with a report from Bloomberg's Alex Webb, who last year said future iPhones will be equipped with rear-facing 3D sensing capabilities.
At the time, Webb reported that Apple was evaluating a time-of-flight approach, which measures how long a laser pulse takes to bounce off surrounding objects in order to build a 3D image of the environment. Lu, however, believes Apple will instead use the triple-lens approach for the rear camera system.
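For context on how the two rumored approaches differ: time of flight derives distance directly from timing rather than from image disparity, via distance = (speed of light × round-trip time) / 2. A hedged sketch of that principle (the pulse timing here is a made-up example, not a measured value):

```python
# Sketch of the time-of-flight principle described in the report.
# distance = (speed of light * round-trip time) / 2
C_MM_PER_NS = 299.792458  # speed of light in millimeters per nanosecond

def tof_distance_mm(round_trip_ns: float) -> float:
    """Distance to a surface given the laser pulse's round-trip time."""
    return C_MM_PER_NS * round_trip_ns / 2

# Hypothetical reading: a pulse returning after ~6.67 ns implies
# an object roughly one meter away.
distance = tof_distance_mm(6.67)
```

The timing requirement is the hard part: resolving depth to the millimeter means resolving the return pulse to a few picoseconds, which is one reason such sensors are specialized hardware rather than ordinary camera modules.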
Apple CEO Tim Cook has repeatedly expressed his profound interest in augmented reality. Last year, Apple released its ARKit platform, enabling developers to create augmented reality apps for iPhones and iPads on iOS 11.
Meanwhile, the third lens would likely have a longer focal length for enhanced zoom capabilities, according to Lu.
The long-focus lens would likely bring 3x optical zoom to an iPhone for the first time, letting users magnify the image in the viewfinder by up to three times without the loss of quality that digital zoom introduces. For comparison, the iPhone 7 Plus, iPhone 8 Plus, and iPhone X feature 2x optical zoom.
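An optical zoom figure like "3x" is simply the ratio of the telephoto lens's focal length to the wide-angle lens's. A small illustration, using an assumed 28 mm-equivalent wide lens (a common smartphone value, not a confirmed Apple specification):

```python
# Illustration of how an optical zoom factor relates to focal length.
# The 28 mm-equivalent wide-angle figure is an assumption, not a spec.
WIDE_EQUIV_MM = 28.0

def zoom_factor(tele_equiv_mm: float, wide_equiv_mm: float = WIDE_EQUIV_MM) -> float:
    """Optical zoom = telephoto focal length / wide-angle focal length."""
    return tele_equiv_mm / wide_equiv_mm

# Under that assumption, a 3x telephoto lens would be roughly
# an 84 mm-equivalent lens (28 * 3).
factor = zoom_factor(84.0)
```

Because the magnification comes from the lens itself, every sensor pixel still covers real scene detail, unlike digital zoom, which crops and upscales.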
Huawei’s new P20 Pro smartphone became the first with a triple-lens rear camera system, including a 40-megapixel lens, a 20-megapixel monochrome lens, and an 8-megapixel telephoto lens with up to 3x optical zoom.
Yuanta Securities analyst Jeff Pu also expects Apple to release at least one iPhone with a triple-lens rear camera next year.