Shot on my iPhone XS using CameraPro in RAW, processed in Lightroom.
Unless you have been living under a rock this last week, you will know that Apple announced some new iPhones at its annual event: the iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max. The two higher-end models, the 11 Pro and 11 Pro Max, boast new chips that Wired claims will make a significant difference to pro-level photography. The hardware changes make sense to me; the software changes, for me, make little sense. All I need in a smartphone camera is the ability to capture in raw and to make some exposure adjustments in situ as needed. A live histogram helps too. Apple's new phones take multiple exposures and use software and neural networks to composite a single image from up to nine pictures. Here's Wired's description of the hardware changes.
Under its glass and metal exterior, each iPhone has a new A13 Bionic processor, which should offer a decent speed upgrade. Apple claimed that the new chip has the fastest-ever CPU and GPU in a smartphone, and wowed the crowd at Tuesday's event with a show of big numbers to back up the claim. Per Apple, the new chip is capable of 1 trillion operations per second, and holds 8.5 billion transistors.
What this means for photographers is the ability to use more computational photography. That is useful in a tight situation, say a war zone or a wedding, where experimenting with third-party apps like ProCamera and Halide, or fiddling with exposure settings, is not feasible. Still, I make most of my digital images by exposing to the right and correcting in post, so this superpower of computational photography is a bonus, not a feature my images would live or die by.
Still, computational photography has been a thing for some time now. It's not going away any time soon, so it will be interesting to see where it goes from here.
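As a toy illustration of the multi-frame compositing idea (not Apple's actual pipeline, which leans on neural networks and alignment tricks), simply averaging several noisy exposures of the same scene already reduces random sensor noise. This sketch simulates a flat grey patch, "captures" it nine times with Gaussian noise, and compares the noise of one frame against the composite:

```python
import random
import statistics

def capture_frame(true_scene, noise_sigma=10.0, rng=None):
    """Simulate one noisy exposure of a 1-D 'scene' of pixel values."""
    rng = rng or random
    return [p + rng.gauss(0, noise_sigma) for p in true_scene]

def composite(frames):
    """Naive multi-frame composite: per-pixel mean across all frames."""
    return [statistics.fmean(px) for px in zip(*frames)]

rng = random.Random(42)
scene = [128.0] * 1000  # a flat grey patch, true pixel value 128

single = capture_frame(scene, rng=rng)
stacked = composite([capture_frame(scene, rng=rng) for _ in range(9)])

# Averaging N frames shrinks random noise by roughly sqrt(N), so ~3x here.
noise_single = statistics.pstdev(single)
noise_stacked = statistics.pstdev(stacked)
print(f"single frame noise: {noise_single:.1f}")
print(f"9-frame composite noise: {noise_stacked:.1f}")
```

Real phone pipelines do far more (aligning handheld frames, weighting exposures, merging highlights and shadows), but the noise-averaging principle is the same.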
The above raw file in Lightroom: left, before processing; right, after.