The iPhone 11 models Apple launched this week were more camera than phone.
Sure, they can still send texts, download apps, and make video calls, but Apple spent a lot of time and effort marketing its new phones as powerful photography machines.
It’s the latest sign that Apple believes camera technology is where it can make the biggest improvements to the iPhone, and where it can distinguish itself from rivals such as Samsung and Huawei, whose cameras have been competitive with, or in some respects better than, the iPhone’s, according to technology industry analysts.
Apple spent 13 minutes of its 100-minute press conference talking about the high-end iPhone 11 Pro camera — more time than it spent on launching Apple Arcade, Apple TV+, or the new iPad, and just slightly less time than the 16 minutes it spent talking about new Apple Watch models, according to an analysis from independent Apple analyst Neil Cybart. Apple also spent seven minutes discussing the lower-end iPhone 11 camera, according to a transcript of the event.
“Camera, Camera, Camera is the new iPhone 11,” Deutsche Bank analyst Jeriel Ong said in a note earlier this week reacting to the launch.
“Apple is embracing what we can refer to as the camera plateaus, positioning them as the iPhone’s distinguishing feature,” Cybart wrote in the Above Avalon newsletter on Thursday.
Cameras are “what consumers still care about most and where most expect innovation to happen,” said Gartner analyst Annette Zimmerman. “It is a bit of an easy benchmark for users, as it is easy to observe.”
Phil Schiller, Apple’s top marketing executive, specifically called out the camera as his favorite part of the iPhone on Tuesday. Apple CEO Tim Cook has said there are 800 people dedicated to improving the iPhone’s camera. Aside from new colors, the biggest physical difference in this year’s iPhone 11 models is they simply have more camera sensors and lenses.
‘A social signal’
The entry-level iPhone 11 comes with two cameras: one with an “ultrawide” lens that fits more of a scene into the frame, and one with a more conventional lens. The two “Pro” iPhones also have a zoom lens that can take a photo closer to the subject without the photographer physically moving closer.
One major reason that Apple is focusing on camera improvements this year is simple: Consumers care about camera quality and take a lot of photos with their phones. Apple’s iPhone is the most used camera on Flickr, according to an analysis the website does based on photographs uploaded to it.
“Consumers upgrade for better cameras – among other reasons,” Forrester analyst Julie Ask said, citing displays as another key feature that consumers look for when upgrading. (This year’s iPhones have the same screens as last year’s.)
Since the design of this year’s phones matches last year’s, highlighting the change in its physical camera design is one of the few ways to distinguish an iPhone 11 from an iPhone X on billboards and other advertisements.
“In a weird way, the number of cameras on the back of your iPhone will become a social signal,” Cybart predicted last week.
Apple also needs to keep up with rivals like Samsung and Huawei, which are also releasing phones with multiple cameras and using machine learning techniques to combine photos for better results, analysts said.
One example is night mode, a new feature that uses machine learning to brighten photographs taken in low light. A very similar feature called Night Sight was a key part of Google’s Pixel phones last year, and was featured in advertisements run in the United States.
Analysts also highlighted how Apple is using artificial intelligence and machine learning to create photographs that are actually multiple photographs combined to make a better shot than is possible with a single camera. Apple calls its feature “Deep Fusion,” and while it’s not shipping with the new iPhones, it will be added in a future software update.
“It shoots nine images. Before you press the shutter button, it’s already shot four short images, four secondary images,” Schiller said. “When you press the shutter button it takes one long exposure, and then in just one second the neural engine analyzes the fused combination of long and short images, picking the best among them, selecting all the pixels, and pixel by pixel, going through 24 million pixels to optimize for detail and low noise.”
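Schiller’s description amounts to a weighted, per-pixel merge of several exposures. Apple has not published how Deep Fusion actually works, so the sketch below is purely illustrative: the frame counts, exposure-quality weighting, and blending rule are all invented for the example, but they show the general shape of multi-frame fusion in a few lines of NumPy.

```python
import numpy as np

def fuse_frames(short_frames, long_frame):
    """Toy per-pixel fusion of several short exposures with one long one.
    A drastic simplification, loosely inspired by (not a reimplementation
    of) computational-photography pipelines like Deep Fusion."""
    stack = np.stack(short_frames).astype(np.float64)
    # Treat pixels near mid-gray as well exposed; weight falls toward
    # zero for pixels that are crushed (near 0) or blown out (near 255).
    weights = 1.0 - np.abs(stack / 255.0 - 0.5) * 2.0
    weights = np.clip(weights, 1e-6, None)
    # Per-pixel weighted average across the short exposures.
    fused_short = (stack * weights).sum(axis=0) / weights.sum(axis=0)
    # Lean on the long exposure in dark regions, short frames elsewhere.
    darkness = 1.0 - fused_short / 255.0
    fused = darkness * long_frame + (1.0 - darkness) * fused_short
    return np.clip(fused, 0, 255).astype(np.uint8)
```

A real pipeline would also align the frames, reject motion artifacts, and use a learned model rather than a hand-tuned heuristic to choose pixels, which is where the “neural engine” Schiller mentions comes in.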
Apple’s “‘secret sauce’ is more in the software and how Apple is leveraging AI for photography,” Zimmerman said. “They presented the Deep Fusion feature that helps with the small details in a shot all driven by the AI capability of the A13 Bionic.”
“It takes a lot of resources and only a few can do this, so Apple is among the top 5 but were they the first to market with a triple-camera system leveraging AI and high-end hardware? No,” she continued, noting that both Samsung and Huawei currently have triple-camera systems on the market.
The rise of “pro” cameras on smartphones has led to a sharp drop in standalone camera sales, according to stats from the Camera and Imaging Products Association analyzed by venture capitalist Om Malik. There were 20 million digital cameras sold in 2018, down from 120 million in 2008. (Apple sold nearly 218 million iPhones in its fiscal 2018 and research firm IDC estimates there were 1.4 billion smartphones sold that year.)
“Because hundreds of millions of phones are sold every year, it is possible for companies like Apple, Google, Samsung and Huawei to pour billions into researching and improving their phone cameras, not to mention the software and algorithms,” Malik wrote. “Thanks to this cocktail of better chips, better processing, better sensors and ever-improving algorithms, the future belongs to computational photography.”
All Rights Reserved for Kif Leswing