According to foreign media reports (fastcompany.com), Apple acquired the 3D-sensing technology company PrimeSense in 2013, and three years later introduced a dual camera with the iPhone 7 Plus. A dual camera gives the phone the ability to perceive spatial depth, which is a prerequisite for capturing 3D information. Add to this that Apple CEO Tim Cook has more than once expressed his enthusiasm for augmented reality (AR) to the media, and it is hard not to wonder: is the iPhone 7 Plus Apple's first step toward AR?
The speculation is not groundless. VR and AR have been buzzwords for some time, and many of Apple's peers have already staked out positions in this market. Microsoft's HoloLens and Lenovo's PHAB2 Pro both offer quite presentable AR features; only Apple has been surprisingly calm and has taken no action. To say the dual-camera iPhone 7 Plus is aimed at AR therefore fits the public's expectations.
The iPhone 7 Plus is equipped with one lens at a 28mm equivalent focal length and another at a 56mm equivalent focal length. Together, the two lenses can estimate the distance between a subject on screen and its background. However powerful these two lenses may seem, though, they fall well short of AR requirements, both in their internal components and in the distance between them.
Dual-camera limitations
According to foreign media, the camera module used in the iPhone 7 was almost certainly developed by LinX, the Israeli company Apple acquired last year. Before the acquisition, LinX claimed that the camera modules it developed were comparable to digital SLR cameras while occupying only a very small footprint, claims that match, almost word for word, the language Apple used to tout the iPhone 7 at this year's launch event.
The ability to record depth-of-field information has long been a flagship LinX dual-camera technology, and the iPhone 7 Plus dual camera is its best embodiment so far. In Apple's sample photos, the person in the foreground is sharply outlined while the background dissolves into an attractive blur. Apple calls this "Portrait mode", though hands-on tests by Apple fans suggest the depth effect is just as effective when shooting non-human subjects.
To create this depth-of-field effect, the two cameras each analyze the image they capture. Once the system has recognized the subject in the frame, the two lenses shoot the scene at their different focal lengths, and software then synthesizes the final shallow-depth-of-field photo.
However, this recognition only works when there is enough feature data. Point the phone at a smooth white wall, for example, and there is not enough information in the images to complete the match, so the depth effect simply fails.
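The principle behind this kind of two-camera depth estimation can be illustrated with a minimal block-matching sketch (this is a generic stereo-vision technique, not Apple's actual algorithm; the focal length and baseline figures below are illustrative assumptions, not iPhone specifications). Note how a featureless region, like the white wall above, gives every candidate match the same cost, so disparity there is undefined:

```python
import numpy as np

def disparity_1d(left, right, window=3, max_disp=8):
    """Estimate per-pixel disparity between two 1D image rows by
    sum-of-absolute-differences (SAD) block matching."""
    half = window // 2
    n = len(left)
    disp = np.zeros(n)
    for x in range(half, n - half):
        patch = left[x - half:x + half + 1]
        best_cost, best_d = np.inf, 0
        for shift in range(0, min(max_disp, x - half) + 1):
            cand = right[x - shift - half:x - shift + half + 1]
            cost = np.abs(patch - cand).sum()
            if cost < best_cost:           # keep the lowest-cost match
                best_cost, best_d = cost, shift
        disp[x] = best_d
    return disp

# Synthetic scene: one bright feature, shifted 4 px between the views.
left = np.zeros(32);  left[20] = 1.0
right = np.zeros(32); right[16] = 1.0    # same feature, 4 px of disparity

d = disparity_1d(left, right)
print(d[20])                             # 4.0 at the feature
# On the flat (textureless) parts of the row, every shift matches
# equally well, so the disparity there is meaningless -- the white-wall
# failure mode in miniature.

# Depth from disparity: Z = f * B / d (focal length f in pixels,
# baseline B in meters; both values here are illustrative).
f_px, baseline_m = 600.0, 0.01
print(f_px * baseline_m / d[20])         # depth of the matched feature, m
```

The synthesized blur then comes from applying stronger blur to pixels whose estimated depth is far from the focused subject.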
All of this falls short of what AR demands of a camera. Reconstructing 3D graphics and scenes requires high-accuracy, high-stability image capture. Besides detecting planar coordinates, the camera must also collect depth information, and even when the phone's body shakes, it must be able to keep the captured scene stable relative to the environment, which is a stabilization problem.
The lenses are too close together
The two cameras on the iPhone 7 Plus also sit too close together, which casts further doubt on the phone's AR capabilities. A common rule of thumb says the usable depth range of a stereo pair is roughly ten times the distance between the two lenses. On the iPhone 7 Plus that distance is only about one centimeter, so if the two cameras were identical, the pair could capture depth only out to about 10 cm. The iPhone 7 Plus, however, pairs two different cameras: a wide-angle (28mm) and a portrait lens (56mm). An AR developer told reporters that with software compensation, the iPhone 7 Plus may be able to detect depth out to around 50 cm.
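Why the baseline matters can be seen from the standard stereo depth-error model found in computer-vision textbooks (not from the article itself; the focal length and matching uncertainty below are assumed illustrative values): depth error grows with the square of distance and shrinks with the baseline.

```python
# Standard stereo error model: a disparity uncertainty of delta_d pixels
# produces a depth error of roughly  dZ = Z**2 * delta_d / (f * B).
f_px = 600.0      # assumed focal length in pixels (illustrative)
delta_d = 0.5     # assumed half-pixel matching uncertainty (illustrative)

for baseline_m in (0.01, 0.10):          # ~1 cm (phone-scale) vs 10 cm
    for z in (0.10, 0.50, 1.0):          # object distances in meters
        err = z**2 * delta_d / (f_px * baseline_m)
        print(f"B={baseline_m*100:.0f}cm  Z={z:.2f}m  error≈{err*100:.1f}cm")
```

With a one-centimeter baseline, the error at one meter is already several centimeters, which is consistent with the article's point that such a pair is only usable at short range.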
Even so, 50 cm is still stretched thin for building AR applications. From this perspective, if Apple really wanted to break into AR with the iPhone 7, it would not have laid out the two lenses this way.
The PrimeSense sensor is missing
AR devices such as Microsoft's HoloLens usually rely on active sensing, using infrared light to obtain accurate information about objects. The device emits infrared light and calculates distance from the time the light takes to reach an object and return.
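The distance calculation this describes is the generic time-of-flight principle, which can be written in a few lines (a textbook formula, not the specification of any particular device):

```python
# Time-of-flight ranging: distance = speed_of_light * round_trip_time / 2.
# The division by 2 is because the light travels out AND back.
C = 299_792_458.0                       # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to an object, given the round-trip time of a light pulse."""
    return C * round_trip_s / 2.0

# A pulse returning after ~6.67 nanoseconds traveled to an object about
# one meter away and back -- which shows why ToF sensing needs extremely
# precise timing electronics.
print(tof_distance(6.671e-9))           # ≈ 1.0 m
```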
LinX, the company Apple acquired, also owns such technology, which it calls "structured light". But this technology did not appear in the iPhone 7 or 7 Plus: the iPhone 7 Plus still uses the same passive sensing solution as previous generations, relying on ambient light to gauge distance, with accuracy that is far from ideal.
Cook may have dropped a hint
A remark Cook made on "Good Morning America" may reveal part of the answer. "AR allows people thousands of miles apart to feel as if they are face to face, and to 'see with their own eyes' things that do not exist. AR could present something we are talking about right here, or even someone who is not actually present."
Cook's words leave no doubt about Apple's pursuit of AR. Compared with VR like the Oculus, which isolates the user completely from reality, Apple clearly has a soft spot for AR. The remark also hints that Apple may not want to deliver AR on the flat screen of a phone; it might instead arrive as a head-mounted device or glasses.
To sum up, Apple's passion for AR is beyond doubt, but the iPhone 7 Plus is not its gateway into AR. Perhaps several AR prototypes are already being polished in some room at Apple headquarters, and perhaps at the next Apple event we can look forward to seeing Apple bring new AR products.