Only yesterday, Apple presented the winners of its "Shot on iPhone" competition. The ten pictures, taken with smartphones from the iPhone 7 to the iPhone XS Max, look so fantastic that you no longer want to call them "mobile phone photos". Brilliant colors, razor-sharp details, and creative tricks with background blur make for great photographic art.
And other top-of-the-line smartphones, from the Google Pixel 3 to the Huawei Mate 20 Pro, would undoubtedly deliver similarly excellent pictures in such a competition – even though they, like the iPhone, are at a severe hardware disadvantage compared with "real" cameras.
No room for elaborate optics
Sony's current full-frame camera, the A7R III for 3,100 euros, is 7.3 centimeters thick – almost exactly the thickness of ten iPhone XS stacked on top of each other. And that is without a lens, which Sony doesn't even include. Undoubtedly, the A7R III still shoots better pictures than a smartphone. It shoots faster and captures more detail at higher resolutions. And it copes far better with poor lighting conditions.
But the lead is melting – especially when smartphone cameras can play to their strengths in good light. Then the photos are often hard to tell apart. And that although iPhone & Co. offer hardly any space for elaborate optics and imaging technology. The A7R III's photo sensor, which captures the light, measures 862 square millimeters. In the iPhone XS, whose sensor also comes from Sony, it is 40.6 square millimeters – not even five percent of that area.
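The area comparison behind that "not even five percent" can be checked in two lines, using the sensor figures quoted above:

```python
# Sensor areas quoted in the article: Sony A7R III (full frame) vs. iPhone XS
full_frame = 862.0   # mm^2
iphone_xs = 40.6     # mm^2

ratio = iphone_xs / full_frame
print(f"{ratio:.1%}")  # prints "4.7%" - indeed not even five percent
```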
And the iPhone, like other smartphones, leaves so little room for the lenses that Apple had to resort to the dreaded camera bump on the back just to gain another millimeter. Seemingly miserable conditions for good pictures, then. Artificial intelligence ensures that it works anyway.
The secret of smartphone cameras
Machine learning that recognizes image content and sorts photos accordingly has been in use at Google since the launch of its Google Photos image service in 2015. The Verge explains how the algorithm learns, for example, how a panda's black-and-white pattern differs from that of a Holstein cow.
But the breakthrough was only the next step – using this artificial intelligence in real time in smartphone cameras. Since that became possible, the cameras literally "see" what they are photographing and optimize the images accordingly.
A milestone was Apple's dual camera in the iPhone 7 Plus. In 2016, it made it possible for the first time to set off portraits and other subjects against a blurred background – an attractive effect ("bokeh") that was previously the domain of single-lens reflex and other high-quality cameras. The software recognizes the subject in the foreground and blurs the background. In "real" cameras, the lens with its elaborate glass construction produces this effect.
In smartphones, a pure software trick is behind it, delivering a simulated "bokeh" – which on current iPhones can even be adjusted after the fact. How this "Depth Control" works is shown in an amusing new promotional clip from Apple, in which a mother gets upset that her child has been "bokeh'd".
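The principle of this software trick can be sketched in a few lines: given an image and a depth map, keep near pixels sharp and replace far pixels with a blurred copy. This is a minimal toy illustration, not Apple's actual pipeline – real phones estimate depth from dual cameras or machine learning and use far more sophisticated, lens-like blur:

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian weights, cut off at three standard deviations.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma=3.0):
    # Separable Gaussian blur: convolve columns, then rows.
    k = gaussian_kernel(sigma)
    out = np.apply_along_axis(np.convolve, 0, img.astype(float), k, mode="same")
    out = np.apply_along_axis(np.convolve, 1, out, k, mode="same")
    return out

def fake_bokeh(img, depth, threshold=0.5, sigma=3.0):
    """Synthetic "bokeh": keep pixels whose depth is below the threshold
    (the foreground) sharp, blur everything behind them."""
    return np.where(depth < threshold, img, blur(img, sigma))

# Toy scene: a bright square (the "subject") on a dark background.
img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0
depth = np.ones((32, 32))      # background is "far" ...
depth[12:20, 12:20] = 0.1      # ... the square is "near"
out = fake_bokeh(img, depth)   # square stays sharp, surroundings are blurred
```

On a phone the depth map is not hand-made as here, but estimated per shot; "Depth Control" then simply re-runs the blur with a different aperture setting.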
The iPhone 7 Plus is just two and a half years old. And since then, the quality of smartphone images has continued to increase at breathtaking speed. Because lenses and photo sensors are increasingly hitting the limits of physics in the limited space of a phone, artificial intelligence is primarily responsible for this progress – together with new hardware such as the "Neural Engine" in Apple's current A12 Bionic chip, which takes over the calculations for the artificial intelligence.
Experts speak of the Neural Processing Unit (NPU) – one of the magic words for the photography of tomorrow, which now goes by the name "computational photography".
With the extremely powerful processors, NPUs, and artificial intelligence of current top smartphones, classic cameras have long been unable to keep up – and could therefore fall behind in many areas of photography in the next few years.
In principle, even a top camera could be equipped with this complex hardware and software. But then it would hardly be affordable. And it would quickly fall behind technically in the extremely fast release cycle of the smartphone manufacturers.
The Google effect
Two lenses on the back, three, or even five like the new Nokia 9 PureView? The race is in full swing. Only Google has held back on hardware so far, undauntedly sticking with a single camera in its current smartphone, the Pixel 3. That the Google phone still lands near or at the very top in all tests shows, according to The Verge, "that smartphone cameras have long depended more on software than on hardware."
Even the cropping of subjects, for which Apple still needs a dual camera, the Pixel 3 manages with a single lens. The trick: Google's artificial intelligence splits the pixels of a photo into left and right halves, as it were. That is enough to compute a depth estimate and thus a decent bokeh.
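The idea behind the left/right split can be illustrated with classic stereo block matching: for each pixel, search how far a small patch has to be shifted between the two half-views until it matches. That shift – the disparity – is large for near objects and small for far ones, which yields a depth estimate. This is a deliberately crude sketch, not Google's actual algorithm; the Pixel's depth estimation is a learned pipeline on dual-pixel data:

```python
import numpy as np

def disparity_map(left, right, max_disp=8, patch=5):
    """Brute-force block matching: for each pixel of the left view, find the
    horizontal shift into the right view with the smallest patch difference.
    Disparity is roughly inversely proportional to depth."""
    h, w = left.shape
    half = patch // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            errors = [np.sum((ref - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]) ** 2)
                      for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(errors))
    return disp

# Demo: a textured scene whose right view is shifted by 3 pixels,
# as if everything were at the same (near) depth.
rng = np.random.default_rng(0)
left = rng.random((20, 40))
right = np.roll(left, -3, axis=1)
disp = disparity_map(left, right)   # interior values recover the shift of 3
```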
Google's latest AI masterpiece for smartphone images is called "Night Sight" – a night-vision mode that sees more than the human eye. Night Sight now works on all Pixel smartphones, but delivers especially stunning results on the Pixel 3. From a series of exposures, it creates a photograph in which artificial intelligence optimizes colors and light until the scene appears almost as if in daylight.
However, that no longer has much to do with the reality the eye perceives. There is even a technical term for this: "fauxtography". The scholar Stephen D. Cooper speaks of images "that give a questionable or completely false impression of what they seemingly photograph."
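The multi-exposure principle behind such night modes can be sketched as frame averaging: shoot many short, noisy exposures, average them (the noise shrinks roughly with the square root of the frame count), then brighten the result digitally. A deliberately simplified stand-in – the real Night Sight also aligns frames against hand shake and applies learned white balance and tone mapping:

```python
import numpy as np

def night_mode_toy(frames, gain=4.0):
    """Merge a burst of dark, noisy exposures and brighten the result.
    Averaging N frames reduces the noise by about a factor of sqrt(N)."""
    merged = np.mean(frames, axis=0)
    return np.clip(merged * gain, 0.0, 1.0)

# Demo: a dim scene, captured 16 times with simulated sensor noise.
rng = np.random.default_rng(1)
scene = np.linspace(0.05, 0.2, 100)                 # true (dark) signal
frames = scene + rng.normal(0.0, 0.05, (16, 100))   # 16 noisy exposures
single = np.clip(frames[0] * 4.0, 0.0, 1.0)         # one brightened frame
stacked = night_mode_toy(frames)                    # visibly cleaner result
```

The stacked result sits much closer to the brightened true signal than any single frame, which is exactly why these modes "see" more than one exposure ever could.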
The future of smartphone photos
The smartphone's rise as the world's dominant camera is unstoppable. According to current studies, almost 90 percent of all photos are already taken with cell phones, and the share keeps rising. "Real" cameras are becoming a niche for enthusiasts – like turntables and vinyl LPs today.
The next generation of smartphone cameras can already be seen at MWC 2019 in Barcelona. With its five Zeiss lenses, the Nokia 9 PureView is the first to produce bokeh of SLR quality. Oppo is showing the first prototype of a lossless tenfold optical zoom for smartphones: as in a periscope, the light reaches the photo sensor via mirrors, allowing a focal length range of 16 to 160 millimeters.
Sony is now cooperating with the company Light, maker of the revolutionary L16 camera, which with its 16 lenses was supposed to deliver true SLR quality in smartphone size. The L16 failed not least because of the miserable quality of its software, but the concept is still considered brilliant. Sony now wants to implement it for smartphones and achieve another quantum leap.
Apple is expected to launch its first triple camera in the iPhone XI this fall. And even Google will probably not get around a multi-lens system in the Pixel 4. This hardware, paired with Google's artificial intelligence – classic camera makers like Canon and Nikon had better brace themselves.