Apple finally sent out an invite for an event happening September 7, almost certainly to launch the iPhone 7. Though the invite is very sparse, there’s a big clue as to what you might be able to expect from the new cameras.
And no, we’re not just jokingly overanalyzing it:
We already know the iPhone 7 Plus will probably sport two rear cameras, but it wasn’t totally clear how it would be using them – even though we know pretty much all the specs. It could have taken the LG approach of featuring two different focal lengths, for instance, but instead Apple seems to be working more along the lines of Huawei/Honor, using the second camera for artificial depth of field.
Notice all the colorful little circles? Photographers call that ‘bokeh,’ a Japanese word for the way a lens renders out of focus points of light and highlights.
It’s an aesthetic quality that emerges from shallow depth of field, and the laws of optics make it virtually impossible to achieve a significant amount of it without the combination of a large sensor and a bright lens, like those in DSLRs and mirrorless cameras (macro shots aside). Shallow depth of field is also an important tool for isolating your subject and directing the viewer’s eye, especially against a busy background.
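To put some rough numbers on why phones struggle here, the standard thin-lens approximation says total depth of field grows as 2·u²·N·c / f² (subject distance u, f-number N, circle of confusion c, focal length f), valid well inside the hyperfocal distance. The figures below are assumed, ballpark values, not specs for any particular camera:

```python
def depth_of_field_mm(u_mm, f_number, coc_mm, focal_mm):
    """Approximate total depth of field via the thin-lens formula:
    DoF ~ 2 * u^2 * N * c / f^2, valid well inside the hyperfocal distance."""
    return 2 * u_mm**2 * f_number * coc_mm / focal_mm**2

# A full-frame 50mm f/1.8 lens vs. a typical phone camera, subject at 2 m.
# Circle-of-confusion values are rough, assumed figures.
dslr  = depth_of_field_mm(2000, 1.8, 0.030, 50)   # ~173 mm in focus
phone = depth_of_field_mm(2000, 1.8, 0.002, 4)    # ~1800 mm in focus
```

Even at the same f-number, the phone's tiny focal length leaves roughly ten times more of the scene in focus, which is why the blur has to be faked in software.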
But mainly, it just looks really pretty.
HTC was the first to try to replicate DSLR-like bokeh, back with the One M8 in 2014, but it was really slow. Google and Samsung then attempted the same feature with a single camera using focus or parallax tricks, without much better results.
But the Huawei P9 has the best implementation of it I’ve seen so far, working very quickly and allowing you to adjust focus after a shot’s been taken. Sometimes the results from the P9 look passably realistic at first glance (mediocre photos aside, the effect is shown off well):
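Refocusing after the fact is possible because the phone keeps a depth map alongside the photo, then blurs each pixel according to how far it sits from the chosen focal plane. A toy sketch of that idea (a crude box blur standing in for a real lens kernel; all names and values here are illustrative, not Huawei's actual pipeline):

```python
def synthetic_bokeh(image, depth, focus_depth, max_blur=4):
    """Blur each pixel in proportion to its distance from the focal plane.
    'image' and 'depth' are equal-sized 2-D lists of floats."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # blur radius grows with distance from the plane of focus
            r = min(max_blur, round(abs(depth[y][x] - focus_depth) * max_blur))
            ys = range(max(0, y - r), min(h, y + r + 1))
            xs = range(max(0, x - r), min(w, x + r + 1))
            vals = [image[yy][xx] for yy in ys for xx in xs]
            out[y][x] = sum(vals) / len(vals)
    return out

# toy scene: a single bright point, with a close-up subject patch around it
image = [[0.0] * 9 for _ in range(9)]
image[4][4] = 1.0
depth = [[1.0] * 9 for _ in range(9)]      # background at depth 1.0
for y in range(3, 6):
    for x in range(3, 6):
        depth[y][x] = 0.2                  # subject region is close
sharp = synthetic_bokeh(image, depth, focus_depth=0.2)  # focus on subject
```

Because the blur is driven entirely by the depth map, any error in that map shows up directly in the picture, which is exactly where the artifacts discussed below come from.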
It’s one of my favorite features in a modern smartphone camera, and it makes me consider the P9 the best smartphone for photography professionals and eager amateurs.
Unfortunately, it’s still far from perfect, often leading to strange artifacts around complex edges, especially if your subject is moving. The effect is also simply too exaggerated by default, making it look obviously artificial to any photography enthusiast. That’s partly because the bokeh is a bit too perfect, with completely round, uniform circles.
But this is Apple we’re talking about; with some exceptions, it tends not to implement half-baked features. Moreover, it bought a company last year that specializes in depth-sensing cameras; you can bet that purchase has been put to use here.
The most important bit will be making shallow depth of field intuitive. The feature is an optional setting on the Huawei P9, and relies too much on the user having a good eye or knowledge of photography to keep bokeh from looking fake.
I don’t have high hopes I could replace my mirrorless camera with an iPhone, but knowing Apple, I’m at least expecting an easy-to-use, effective application. For instance, I wouldn’t be surprised if the iPhone 7 tried to automatically calculate a realistic amount of depth of field based on your subject and its distance. The iPhone’s cameras have always been about simply pointing and shooting, after all, and no one will use the feature if the implementation is obtuse.
While there’s nothing to suggest Apple will use a dedicated monochrome sensor like Huawei’s, I also imagine Apple will copy the company in using the second camera to increase image sharpness and reduce noise by applying some clever algorithms. Current spec leaks suggest the standard iPhone 7 – which has only one camera – has a larger sensor than the one in the 7 Plus, so Apple will likely compensate by combining detail from the bigger iPhone’s two sensors.
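Apple hasn't said how (or whether) it will fuse the two sensors' output, but the basic principle behind multi-sensor noise reduction is simple: noise that's independent between the two sensors partially cancels when you average their frames. A minimal sketch, with assumed noise levels:

```python
import random
import statistics

def fuse(frame_a, frame_b):
    """Average two exposures of the same scene pixel by pixel.
    Averaging N frames with uncorrelated noise cuts the noise
    standard deviation by a factor of sqrt(N)."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

random.seed(0)
scene = [100.0] * 5000                              # a flat grey patch
noisy_a = [p + random.gauss(0, 10) for p in scene]  # sensor one, sigma = 10
noisy_b = [p + random.gauss(0, 10) for p in scene]  # sensor two, independent
fused = fuse(noisy_a, noisy_b)

sigma_single = statistics.pstdev(noisy_a)
sigma_fused = statistics.pstdev(fused)   # roughly sigma / sqrt(2)
```

Real pipelines have to align the two frames first, since the lenses sit a few millimeters apart, but the payoff is the same: a cleaner image than either sensor alone.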
Of course, I could be totally wrong about all of this. Maybe I am overanalyzing, out of hope that someone finally figures out shallow depth of field on smartphones.
But I doubt Apple’s designers chose the look they did on the invite for no reason. It won’t be long until we find out.