Ed. Note: This article first appeared in Business Insider
There is a lot of pressure on Apple’s next iPhone to deliver, both in product excellence and sales.
Chatter from Apple’s factories in Asia points to a completely redesigned iPhone coming out this fall, perhaps with a better screen, longer battery life, and a $1,000 price tag.
But perhaps the most interesting feature currently rumored for the so-called iPhone 8 is a new 3D camera.
The 3D camera would be the first step towards a technology called augmented reality, which integrates the internet with the real world.
It is a technology Apple CEO Tim Cook is fond of talking about in public.
In fact, earlier this year, Cook said it could “be as big as the smartphone,” a technology that accounts for two-thirds of Apple’s sales.
Apple has bought several augmented reality startups, including FlyBy Media and Metaio, and has integrated some of them into a special projects camera group, BI reported.
The ultimate goal is for augmented reality to inject computer graphics and important information into everyday situations — imagine raising your phone on the street and seeing directions to your next appointment, or a Pokémon hiding behind a bush.
A 3D camera would allow the device to see how far away things are from it, which is critical for placing computer graphics in the real world so they blend in seamlessly.
Eventually, a pair of smart glasses might replace your phone altogether to seamlessly integrate graphics and the real world.
Apple never publicly comments on future products. But over the past few weeks, several Wall Street analysts have shared what they think they know about Apple’s new “3D camera,” and what it could mean for Apple investors and future Apple products — especially the high-end iPhone expected to launch this fall and likely to include the rumored 3D camera.
Steven Milunovich, UBS
UBS’ Apple analyst wrote in a Feb. 28 note that “investors could be surprised at how AR could reinvigorate the iPhone/iPad and possibly result in new products,” adding that Apple has 1,000 engineers working on “a project in Israel that could be related to AR.”
He writes that Apple has an advantage over companies like Magic Leap, Google, and Microsoft for AR for four reasons:
- “hardware expertise and superior hardware/software integration;
- consistent updated releases with most customers on the latest version of iOS compared with the fragmented Android OS base;
- an installed base of iPhone and iPad customers that can use AR rather than starting from scratch; and
- a cloud infrastructure that facilitates data gathering.”
Milunovich also writes that “Apple’s experience designing its own chips could present an opportunity.”
Here’s how he sees Apple’s potential AR timeline:
In the short term, Milunovich believes Apple may integrate facial recognition and early developer tools for “basic AR experiences” into the next version of the iPhone or iPad, and that the features could improve retention rates and upgrade rates.
Andrew Gardiner and team, Barclays
Gardiner’s research note, dated Feb. 22, focuses more on the specific hardware that Apple could introduce with upcoming versions of the iPhone or iPad, citing “research into the supply chain.”
“Comments from a number of optical and light sensing component companies over the last quarter or two have reignited discussion of the additional functionality to come in smartphones, in particular the so-called iPhone 8/X/Pro flagship device,” Gardiner wrote.
Gardiner spends a lot of time identifying which companies make the parts that go into the iPhone and which ones could stand to gain from a next-generation “3D camera.”
He believes chipmakers like AMS, STMicroelectronics, Lumentum, and II-VI may be big winners as Apple includes a 3D sensor in tens of millions of iPhones. A source familiar with Lumentum’s business previously told Business Insider that the company would be supplying parts for the upcoming iPhone.
Ultimately, the big question for Gardiner is whether Apple will use a time-of-flight sensor or a structured light sensor. He thinks Apple will use a combination of the two, and his note includes a chart explaining the difference between them.
Both approaches have advantages and disadvantages, Gardiner writes; current iPhones already include a time-of-flight sensor.
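To give a sense of what a time-of-flight sensor actually measures: it times how long an emitted pulse of light takes to bounce off an object and return, then converts that delay into a depth. The sketch below is purely illustrative (it is not Apple's or any supplier's implementation) and shows why the timing has to be extraordinarily precise.

```python
# Illustrative sketch: how a time-of-flight sensor turns a measured
# round-trip light delay into a depth estimate. Not a real driver.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_depth(round_trip_seconds: float) -> float:
    """Depth is half the distance light travels during the round trip."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Light reflecting off a subject 1 meter away comes back in roughly
# 6.67 nanoseconds, which is why ToF hardware needs timing precision
# on the order of picoseconds to resolve millimeter-scale depth.
delay = 6.671e-9  # seconds
print(round(tof_depth(delay), 3))  # roughly 1.0 (meters)
```

A structured light sensor, by contrast, projects a known pattern of dots and infers depth from how the pattern deforms on surfaces, trading timing precision for geometric calibration.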
Morgan Stanley
Morgan Stanley’s research looks more broadly at 3D sensing technology, rather than at Apple’s interest in it specifically. Its analysts write that 3D sensing will initially be limited to high-end smartphones, a market in which Apple holds a 70%-75% share.
The report looks at how 3D sensing works and which parts are generally needed: a light source, controlling optics, an image sensor, and a firmware chip.
Their report includes a diagram from PrimeSense, an Israeli 3D-sensing company that Apple bought in 2013.
Morgan Stanley cites AMS, a current Apple supplier, for potential applications for this technology, including “augmented reality and virtual reality, gesture, face recognition biometrics, improved 2D and video images, body measurement as well as autonomous driving.”
But Morgan Stanley questions the consumer appeal of AR and the viability of the requisite 3D camera technology, including whether smartphones will have the necessary horsepower to get the job done.
Many people in the AR industry have told me that the power consumption and heat generated by processors are critical roadblocks.
And it’s not a given that the initial games and other AR experiences will click with consumers.
Pokémon Go, an early AR app that had huge engagement, saw a sharp drop in usage after the initial novelty wore off.
Ming-Chi Kuo, KGI Securities
In two notes issued in February, Kuo, widely seen as the top expert on Apple’s supply chain, laid out everything he’s hearing about Apple’s 3D sensor, which he says is the result of “1-2 years spent on related-component R&D.”
He believes that the 3D camera on the upcoming phone will actually take the place of the front sensor, and will be composed of three modules.
The application for this 3D camera is an “innovative user experience.”
“3D modeling offers a wide array of potential applications, for example replacing the head of a character in a 3D game with a user’s selfie (which contains 3D data), or using a 3D selfie (with depth information, unlike an existing 2D image) for AR/ MR applications,” Kuo wrote.
In a previous note dated November 1, 2016, Kuo explained how he believes this fits in with Apple’s product plans (emphasis ours):
AR is related to all Apple’s current businesses; the key is that AR is an innovative human-machine interface that could be used in various devices & applications. All of Apple’s past successes were related to human-machine interfaces, such as mouse for Mac, click wheel for iPod, and multi-touch for iPhone and iPad. Assuming Apple successfully develops AR, we predict the firm will enjoy the following competitive advantages:
(1) redefining existing key products and leading competitors by three to five years. For instance, this could happen for iPhone, iPad and Mac
(2) eliminating obstacles of Apple Watch and Apple TV by offering an innovative user experience
(3) entering new business fields, such as autonomous driving system. We expect Apple to generate preliminary results for AR in the next 1-2 years at the earliest and working with iPhone may be the first step.
A diagram in his Feb. 20 note lays out which companies he thinks will supply parts for the iPhone camera, largely lining up with the Barclays research.
Tim Cook, Apple CEO
But the biggest hints about Apple’s augmented reality ambitions may come from its CEO himself. Here are the takeaways from his comments:
- He thinks that it could be a huge market opportunity: “I regard it as a big idea like the smartphone. The smartphone is for everyone. We don’t have to think the iPhone is about a certain demographic, or country or vertical market; it’s for everyone.” Independent, Feb. 2017
- But it may take a while to come to fruition: “AR is going to take a while, because there are some really hard technology challenges there.” Public speech, Oct. 2016
- First, Apple will integrate it into its existing platforms: “It will be enabled in the operating systems first, because it’s a precursor for that to happen for there to be mass adoption of it.” Public speech, Oct. 2016
- More a core technology than a standalone product: “It’s a core technology, not a product per se.”
- But when it does happen, he says it will be “profound”: “Augmented reality will take some time to get right, but I do think that it’s profound. We might … have a more productive conversation, if both of us have an AR experience standing here, right?” Buzzfeed, Oct. 2016
Posted by: The Trust Advisor