Introduction.
Smartphone manufacturers have embraced photography and innovated impressively to overcome the inherent disadvantages of small camera sensors, achieving image quality that approaches that of dedicated cameras. Rather than attempting to increase sensor size or add longer focal-length lenses, they have focused on achieving image flexibility through computational techniques that approximate effects previously obtainable only with larger-format cameras and interchangeable lenses. In particular, the use of multiple cameras on smartphones has opened up a whole new set of algorithmic solutions that encroach significantly on the previously exclusive realm of DSLRs. Examples include the iPhone "Portrait Mode", where background blur (bokeh), lighting effects and similar looks are now possible. These effects are made possible by the creation of a depth map alongside every image. The depth map is an approximate, per-pixel estimate of each pixel's position in z-space (depth). Depth maps are typically created by imaging a scene with two (or more) slightly displaced cameras; the differences between the images provide enough information to calculate depth and hence generate a depth map. Recently, depth maps have also been used for other applications such as re-lighting of images: computationally generated lights can be placed arbitrarily in the 3D "scene" and used to direct lighting to particular areas. Examples of apps supporting this capability include Apollo, Focos and others.
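To make the idea concrete, below is a minimal sketch of disparity-based depth estimation using OpenCV's block matcher on a pair of already-rectified views. It is purely illustrative, not the pipeline any phone vendor actually uses; the file names, focal length and baseline values are placeholders.

```python
# Minimal sketch of stereo depth estimation from two horizontally offset views.
# Illustrative only; "left.png" / "right.png" and the calibration numbers are placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds, for each pixel, how far it shifted between the two views
# (the disparity). Larger disparity means the object is closer to the cameras.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth follows from similar triangles: depth = focal_length_px * baseline / disparity.
focal_px, baseline_m = 2800.0, 0.012  # placeholder calibration values
depth = np.zeros_like(disparity)
valid = disparity > 0
depth[valid] = focal_px * baseline_m / disparity[valid]

# Save a viewable 8-bit depth map.
depth_vis = cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("depth_map.png", depth_vis)
```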
The pace of innovation in this space has accelerated, driven largely by the competition between Apple and Google for market share and leadership in smartphones (Android versus iOS). The rapid development of AI techniques aimed at imaging has also contributed significantly. In addition, companies such as Qualcomm, which supply the image signal processors and associated circuitry to the phone manufacturers, are developing ever more advanced technologies at a rapid pace. With these advances, the smartphone, already the dominant camera platform for the masses, promises to encroach even further into the realm of the DSLR, relegating more traditional cameras to largely specialized roles. [get some data to support this claim]
Despite this rapid development of smartphones, the areas that remain the near-exclusive territory of high-end professional cameras are portrait, fashion and marketing applications. In these markets, not only is image quality of the highest importance, but the camera's ability to support complex lighting set-ups is essential, both for the photographer to create the "look" they desire and for the client to receive a product they have flexibility in working with. In these applications, in addition to high-quality imaging, integration with the lighting "ecosystem" delivered by multiple manufacturers is essential. The elements include strobe lighting, light modifiers, assorted props, backdrops and a variety of light stands and other paraphernalia necessary to create the desired lighting effects. Not only do these set-ups cost a significant amount, but they also require skilled staff to set them up and operate them on behalf of the photographer. In complex shoots, it is not unusual to have 40-60 staff supporting the lighting and staging of a photo shoot. [how much does the media industry spend on photoshoots and related activities (editing, retouching, etc.) per year?]
Experiment #1. Align images from the iPhone X and SONY A7R III.
To determine whether images from different cameras can be aligned, I picked the Apple iPhone X and the SONY A7R III. They have different sensors, resolutions, lens characteristics, everything.
Here's the process:
A7R III:
Image quality to RAW
Lens focal length to 56mm
ISO 100
Aperture priority, record shutter speed
iPhone X:
Use the LR CC camera app set to DNG
Set to 2X (56mm equivalent focal length)
Set aspect ratio to same as A7R III (3:2)
ISO auto, or play with manual controls to get as close to the A7R III as possible (see the exposure-matching sketch after this list).
Tripod mount.
iPhone X is mounted on the A7R III hot shoe; cameras are aligned horizontally but offset vertically.
Use remote release on the iPhone X (Apple Watch) and the A7R III.
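To make "as close to the A7R III as possible" concrete, one simple check is to convert each camera's settings to an ISO-adjusted exposure value and compare them. The sketch below uses placeholder numbers, not measured values from the shoot; when both cameras expose the same scene correctly, these values should roughly match.

```python
# Sketch: compare exposure settings between the two cameras by converting each
# (aperture, shutter, ISO) triple to an ISO-adjusted exposure value (EV at ISO 100).
# The specific numbers below are placeholders for illustration.
import math

def exposure_value(f_number: float, shutter_s: float, iso: float) -> float:
    # EV = log2(N^2 / t); subtracting log2(ISO / 100) normalizes for sensitivity.
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

sony_ev = exposure_value(f_number=4.0, shutter_s=1 / 125, iso=100)
iphone_ev = exposure_value(f_number=2.4, shutter_s=1 / 350, iso=100)
print(f"Sony EV: {sony_ev:.2f}  iPhone EV: {iphone_ev:.2f}  delta: {sony_ev - iphone_ev:.2f}")
```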
In Post:
Import the iPhone X DNGs from LR CC to LR Classic CC
Apply lens correction. Adjust WB and exposure in LR to attempt to match the "look" between both cameras
Open the DNGs in PS CC and adjust the image size to match the SONY (7952 x 5304); make fine adjustments if necessary. The image sizes must be identical. Save back to LR as a TIFF
Open the SONY image in PS and save it back to LR as a TIFF
Open both TIFFs in PS as layers
Align the layers and look at the result (a scripted equivalent of the resize-and-align steps is sketched after this list).
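For reference, the resize-and-align steps could also be scripted. The sketch below uses OpenCV's ECC alignment as a rough stand-in for Photoshop's Auto-Align Layers; the file names are hypothetical, and the manual Photoshop workflow above is what was actually performed.

```python
# Sketch: scripted stand-in for the Photoshop resize-and-align step.
# File names are placeholders; the manual PS workflow is what was actually used.
import cv2
import numpy as np

sony = cv2.imread("sony_a7riii.tif")    # reference frame, 7952 x 5304
iphone = cv2.imread("iphone_x.tif")

# Match pixel dimensions to the Sony file (the "Image Size" step in PS).
iphone = cv2.resize(iphone, (sony.shape[1], sony.shape[0]), interpolation=cv2.INTER_LANCZOS4)

# Estimate an affine warp between the two frames with ECC maximization
# (at full 42 MP this is slow; downscale both frames first for a quick check).
sony_gray = cv2.cvtColor(sony, cv2.COLOR_BGR2GRAY)
iphone_gray = cv2.cvtColor(iphone, cv2.COLOR_BGR2GRAY)
warp = np.eye(2, 3, dtype=np.float32)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
_, warp = cv2.findTransformECC(sony_gray, iphone_gray, warp, cv2.MOTION_AFFINE,
                               criteria, None, 5)

# Warp the iPhone frame onto the Sony frame and save the aligned result.
aligned = cv2.warpAffine(iphone, warp, (sony.shape[1], sony.shape[0]),
                         flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
cv2.imwrite("iphone_aligned.tif", aligned)
```

A single affine warp only approximates the parallax introduced by the vertical offset between the two cameras; a feature-based homography (or Photoshop's own alignment, as above) is another reasonable choice.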