iPhone X Takes on M43

We all assume that a mirrorless or DSLR camera is going to pulverise an iPhone when it comes to image quality, but there are degrees of pulverisation.  Questions, my dear reader, come into play, like…

Are we talking about shooting in good light or poor light?

Are we comparing a fixed focal length lens to the equivalent on the iPhone X?

How big are we going to print?

……and so on.  It’s not a straightforward comparison to make.

On the surface it seems ridiculous to even consider making this comparison; the outcome is evident before we even start… or is it?

When I was testing the telephoto lens on the iPhone X and 8 models, I concluded pretty quickly that these dinky little glass constructions were mighty fine from an optical perspective, which raised my initial question: how might the iPhone in telephoto mode compare to my M4/3 camera at the same focal length?

Which then led to another question, well then what about the standard wide-angle lens….

Which begat the idea of what if I shot a mild panorama and compared that to a wide angle on the M4/3…….

Which naturally led to, yeah, but how about cropping the telephoto shot and seeing how it compares to M4/3 with a 40mm lens…….

And on it goes, so many questions, so many possibilities.

So here we are with my, “let’s shoot the same test pics at my trusty test location on both devices and see what flows from that”.

Test image iPhone X telephoto, full frame, Goulburn Railway Station.
Goulburn Railway Station, my regular lens and camera test location. Problems show up very quickly due to all the straight lines, lots of fine detail and the high-contrast subject. This test frame, taken with the iPhone X telephoto lens for a previous article, looks good, but let’s see how the iPhone X compares to M4/3.

Students and iPhones

I’ve had more than a few students in my classes claiming their iPhones produced photos as good as their DSLR and mirrorless cameras. I don’t discount their claims either, but like you, I want to see some hard evidence.

To be fair, we can’t make comparisons under very low light (though I have, but that’s another story); that would be kind of meaningless, as we can take it to the bank that a small-sensor camera like the iPhone will be severely disadvantaged and underwhelming in low light. But what about in reasonable to good light, you know, the sort of light most folks have when they take their holiday snaps? I never get students making grand claims about shooting starfields with their iPhones. All of these “iPhone boosters” are talking about regular daylight-type stuff.

NOTE:  I have run a series of tests on the low-light options, including stacking, HDR etc. If you want a sneak peek at one of the single-frame capture comparisons, have a look at this article on my other blog:

https://braddlesphotoblurb.blogspot.com.au/2018/03/low-light-and-high-contrast-m43-and.html

Of course, my students are referring to JPEG output. I’m more interested in absolutes, so it’s DNG/RAW only here, guys, but a tremendous RAW result should translate to a good JPEG result unless the camera maker is doing something completely crazy in the processing department. (We shouldn’t discount the power of all the computational imaging methods used for iPhone compressed files either, but that’s too much to deal with in one article.)

Obviously, a comparison like this is not about depth-of-field rendering abilities. If you’re seeking shallow depth-of-field control, you wouldn’t be using a smartphone in the first place, so I’m not going to entertain any arguments/comments about DOF.

I chose to compare the iPhone X tele to my Olympus EM5 Mk 2, which is an M4/3 format device.  Lens choice on Mr Oly was the 12-40mm f2.8 zoom, a well-known player regarded as a stellar performer.  The Olympus was shot at its lowest regular-range ISO of 200 and the iPhone at ISO 16/25. In other words, I tried to make it optimal for both devices.

All tests have been carried out using RAW/DNG files, and the processing of those files was handled by Iridient Developer on my 5K Mac. Iridient was also used for images that needed to be either upsized or downsized.  Generally, I think I’ve done a reasonable job of making the comparison fair.

Aperture-wise I chose to use f5 on the Oly to give roughly the same depth of field look regardless of the focal length for both devices, the shutter speeds on each camera generally ended up being pretty similar, within 1/2 stop typically, so it makes for a fair comparison from a utility perspective.

Both devices shoot natively in a 4:3 aspect ratio, and I chose to compare the results for 12mm, 14mm, 17mm, 25mm and 40mm on the Olympus; these correlate to 24/28/35/50/80mm on full-frame cameras.  To make the comparisons, some of the iPhone images had to be cropped and upsized or, in the case of the 12mm equivalent, a two-frame panorama stitch made.  The chosen focal length range represents what most people would shoot with a kit lens on a DSLR or mirrorless camera. I must emphasise, however, that the Olympus 12-40mm is a much higher quality lens than any kit lens, so the results for that side of the equation represent a best-case scenario for M4/3.
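For anyone who wants the arithmetic behind those pairings, here’s a minimal sketch using the standard 2x crop factor for M4/3. The function name is mine for illustration; note that 2 x 17mm is 34mm, which is conventionally quoted as 35mm.

```python
# Full-frame equivalents for the M4/3 focal lengths used in this test.
# M4/3 has a 2x crop factor relative to full frame; this matches the
# 12mm -> 24mm and 40mm -> 80mm pairings quoted in the article.
M43_CROP_FACTOR = 2.0

def ff_equivalent(m43_focal_mm: float) -> float:
    """Return the full-frame equivalent focal length for an M4/3 lens."""
    return m43_focal_mm * M43_CROP_FACTOR

for f in (12, 14, 17, 25, 40):
    # 17mm works out to 34mm, usually rounded to the standard 35mm.
    print(f"{f}mm on M4/3 ~ {ff_equivalent(f):.0f}mm full frame")
```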

It’s hard to get colours and tonal renderings identical, but overall I was more interested in the lens performance and detail. Nonetheless, I did try to get a similar white balance and colour rendering.

There’s an additional complication: the Olympus is 16 megapixels and the iPhone 12 megapixels, so I chose to test it both ways.  First, upsize the iPhone files in the RAW converter to match the Olympus; second, downsize the Olympus in the converter to match the iPhone.  Neither approach is perfect, but what else can you do? In the end, I have presented the files here at the Oly’s 16Mp size, which gives the Oly a bit of a built-in advantage, I guess.
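As a rough sketch of what those two resizes involve, assuming the usual native pixel dimensions (4608 x 3456 for the 16Mp Olympus, 4032 x 3024 for the 12Mp iPhone; these numbers are my assumption, not from the test files):

```python
# Linear scale factors needed to match the two sensors' output sizes.
# Pixel dimensions assumed: 4608x3456 (16Mp Olympus, 4:3) and
# 4032x3024 (12Mp iPhone, 4:3).
oly_w, oly_h = 4608, 3456
iph_w, iph_h = 4032, 3024

oly_mp = oly_w * oly_h / 1e6   # ~15.9 megapixels
iph_mp = iph_w * iph_h / 1e6   # ~12.2 megapixels

upsize = oly_w / iph_w         # upsize iPhone to match Oly: ~1.14x linear
downsize = iph_w / oly_w       # downsize Oly to match iPhone: ~0.875x

print(f"{oly_mp:.1f}Mp vs {iph_mp:.1f}Mp; "
      f"upsize {upsize:.3f}x or downsize {downsize:.3f}x")
```

Either way the linear change is modest, around 14% up or 12.5% down, which is why neither direction is hugely punishing on detail.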

Finally, I had to think about my target rendering; I could have gone for the web, small prints etc.  In the end, I felt that judging each file’s ability to produce an excellent 8 x 11-inch print would suit best, as this would easily cover most bases for most folk.

Initial Comparison

Alright, the obvious question out of the way first!

Yes, of course, the iPhone shows a bit more noise.

Would it matter? Probably not, because I normally couldn’t see the noise differences at a 50% view on my 5K Mac screen; you have to zoom in to a 100-200% view to pick it easily. Even then, the noise in the iPhone images was in no way objectionable and purely of the luminance variety.

And here’s a fun fact for you: many pro printers and editors actually add noise to a file to make the image print more organically, just the sort of noise the iPhone DNG files exhibit, in fact.  Trust me, noise is not necessarily the enemy!

Surprises were in store, however. First up, looking at the uncropped standard and telephoto iPhone images on my iMac, it’s easy to see that the cross-frame sharpness of the iPhone tele and standard lenses is a little more even than that of the equivalents on the Olympus 12-40mm f2.8.  That’s a serious wrap, because the near $900 Oz dollar Oly lens is very highly regarded for its consistency, especially in the mid focal length range.

oly14mmweir

iphone4mmweir
So which one is the iPhone and which is the Olympus 12-40mm? OK, the lower one is the iPhone. At this size they look pretty close; there are tonal and colour differences, but the differences are not vast.

Even more impressively, both iPhone lenses actually seem to natively resolve a little more fine detail.  I’m very confident that if you were sitting next to me looking at the image pairs on screen at 100% views, you’d pick the iPhone shots as slightly more detailed and sharper (though a little noisier).  I realise that sounds heretical, but I stand by those words 100%.

But here’s the clincher: the uncropped iPhone shots are often sharper whether the iPhone is up-rezzed to the Oly’s 16Mp or the Oly is down-rezzed to the iPhone’s 12Mp. I imagine there are quite a few folk who did not want to hear that; I was one of them.

Now I can hear a whole bunch of Sony, Nikon and Canon fans out there saying, so what, APS-C is bigger than M4/3 and it will eat your puny iPhone tele and Oly for lunch and then take your afternoon tea as well.  Mmmm, I wouldn’t be too sure about that. I didn’t test it, but I have tested enough of all of those brands and their kit lenses to make a couple of comments that just might have some bearing on this comparison.

Yes, the APS-C models will have lower noise, but in good light like this at low ISOs, the difference between APS-C and the M4/3 sensor is small, small enough to be a non-issue.  More importantly, the lenses for APS-C are very rarely as good as the 12-40mm Oly; if you were comparing any Nikon, Sony or Canon kit lens then forget it, I’ve tested several and they’re not even close. You’d have to be talking about a pro-grade lens.

I’d suggest in the washup the central core of the APS-C images will be comparable, maybe a tad better, but the outer edges and corners will be worse than with the iPhone X lenses.  But I’m happy to be proved wrong if someone wants to do some rigorous testing (using correctly processed RAW files, of course).

APS-C cameras also have a different aspect ratio, so in an “apples to apples” comparison you need an 18Mp APS-C sensor to directly compare with the 16Mp M4/3 regarding matching the vertical dimension.  Frankly, there’s very little difference between the 20 and 18Mp sensors in performance or resolution, so I expect it would only be with the 24Mp sensors that an APS-C camera might gain some resolution ground, and only if the lenses were really up to snuff.
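The arithmetic behind that 18Mp figure can be sketched quickly. The pixel dimensions below assume the usual 4608 x 3456 frame for a 16Mp 4:3 sensor; they are my assumption for illustration, not something measured in the test:

```python
# Why matching the M4/3 vertical dimension implies ~18Mp on APS-C.
# Assumes a 4:3 M4/3 frame of 4608x3456 and a 3:2 APS-C aspect ratio.
m43_h = 3456                 # vertical pixels of the 16Mp M4/3 frame
apsc_w = m43_h * 3 // 2      # a 3:2 frame with the same vertical: 5184 px
apsc_mp = apsc_w * m43_h / 1e6

print(f"APS-C frame {apsc_w}x{m43_h} = {apsc_mp:.1f}Mp")
```

The 3:2 frame with the same 3456-pixel height works out to roughly 17.9 megapixels, i.e. the "18Mp" quoted above.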

So if the iPhone lenses are possibly sharper, what about the dynamic range, vignetting, chromatic aberration and all that other stuff everyone pixel peeps and gets excited about?

Oly14mmweiredge

Edge crops to show resolution difference between iPhone X and M4/3, daylight at Marsden Weir Goulburn.
100% edge crops. Again the iPhone is on the bottom; detail-wise there’s very little in it. The iPhone seems a little more resolved, but you trade that off for a bit of luminance noise, which would be invisible in print. Both original files have been output at the Olympus EM5’s 16 megapixels, which should disadvantage the iPhone.

Chromatic Aberration.

The Olympus 12-40mm f2.8 is an excellent performer for CA; it shows very little if any at most focal lengths, and what there is, is easily corrected. In short, it’s vastly better than most kit lenses, many of which are dismal CA performers only made good by extensive software correction in the processing stage.

But take a look at the sample frames made from the left side of the 14mm shot. These are uncorrected RAWs; if you compare the Oly and the iPhone X 4mm lens, it’s clear who is eating whose lunch as far as CA goes.

The iPhone X lenses are as close to CA-free in their native state as I’ve ever seen; you have to zoom in to 200% or so to see any CA at all, and correcting it is not really required.

I know photographers gloss over CA performance, saying it’s easily fixed in software, but you’ll always get better results from a lens that requires no correction, as the CA fix softens the edge and corner detail a little.  Anyhow, the iPhone X lenses look to be gems of modern lens design in this regard.

These two crops also show that if you factor the iPhone’s higher noise out of the equation, it displays slightly more detail on the edges of the frame than the Oly does at 14mm in the native RAW state.

rawcropoly14mm

Comparing native chromatic aberration of iPhone x 4mm with M4/3 Pro lens.
You’re looking at 100% views of uncorrected Raw images taken from the left side of the 14mm frames, yep, this is how your RAWs/DNGs look under the hood. The top one is the Olympus 12-40mm and the bottom the iPhone 4mm. Disturbing telltale CA is visible in the Olympus frame whereas the iPhone has no visible CA and needs no correction.

Vignetting/Distortion

To see the vignetting in pics from either device you have to turn off the RAW converter’s built-in profile, resulting in the somewhat rubbish-looking images below, which reveal the unvarnished truth.  For this test, the Oly is set at 14mm and the iPhone X on the 4mm standard lens.

Those two pics are straight extractions with everything zeroed out except for a little brightness/gamma boosting to make things a bit clearer for you all to see.

Neither of the devices shows any issues with vignetting at 25mm on the Oly or with the iPhone X’s 6mm tele, so I have not included samples here.

It’s an interesting exercise which reveals a couple of significant points. First, the Olympus lens is very even in the scheme of things; most kit lenses show vastly more vignetting than this in their native state.  On the other hand, the iPhone X 4mm lens is not exactly a weak performer either. Sure, there’s vignetting, but it’s only in the sky areas that it’s really obvious and, regardless, it’s well within the easily fixable range.

Next, we can see that the Olympus lens has a much broader angle of view when the lens profile is disabled. This is normal, by the way; it allows for software correction of distortion, and folks, we do have some reasonably noticeable barrel distortion here.

The iPhone X 4mm has virtually zero distortion; I tried flicking the profile on and off and couldn’t really see any difference, and I can’t see any noticeable curvature of straight lines in this sample.  That’s great news because it means the edges and corners of the image don’t get degraded by the correction process, all of which probably explains why I found the processed 14mm shots from the Oly were not quite as well resolved as the iPhone X 4mm along the edges and into the extreme corners.

Uncorrected RAW frame of Olympus 12-40mm f2.8 at 14mm, showing distortion and vignetting in native file.
The Olympus 12-40mm f2.8 at 14mm without any corrections/adjustments applied to the RAW file. Vignetting is quite low in comparison to most lenses. Barrel distortion is evident, and the angle of view is wider than the iPhone as the Olympus uses the extra angle of view to allow for the needed distortion corrections. Once corrected the coverage is the same as the iPhone X at 4mm.
The iPhone X 4mm shows more vignetting than the M4/3 Pro Lens but is devoid of distortion. It is possible the iPhone does some distortion correction to the file in writing the DNG file, but I have no documentation of this. The native file does look far more subdued than the M4/3 file, but this has little effect on the final edited colour reproduction.

Dynamic Range

Well, oddly there’s very little difference between them, at least for the shots I took on both cloudy and sunny days, (note: cloudy days are more demanding on the dynamic range than you might think when white cloudy skies are included in the frame). All of the files have similar levels of tonal recoverability, you just get a little more shadow noise with the iPhone shots.

The unprocessed images above show basically the same RGB numbers in both the highlights and deep shadows, though obviously the mid-tone tonality and saturation are far punchier in the native Oly files.

Importantly your editing choices will have an enormous effect on the results so your mileage may vary depending on your skills and software, but either device will render files that are about equally malleable.

I’ve no doubt any full frame camera would eat both alive and spit out the bones, but many APS-C cameras, especially older sensor Canons will be little if any better in this area.

Ultimately I think the limit for both of the X lenses is the noise on the sensor; there are undoubtedly no optical issues to be concerned about, and the native dynamic range is really not too bad at all.

What about Crops?

17mm Equivalent.

This test surprised me. The iPhone X didn’t win, that would be crazy, but it was nothing like a walkover either.  Consider that to get the equivalent of the 17mm lens you have to crop the iPhone down to just 8 megapixels and then blow it up to equal the Olympus EM5 Mk2’s 16 megapixels!

The path I took was slightly more circuitous: I cropped the file in the RAW state, upscaled it 200% on export, tweaked it in Photoshop CC and then resized it to match the Oly’s 16Mp. But my friends, facts are facts; if the pixels and detail aren’t good to start with, no process will make any difference, you’ll just end up with more mush than you started with.
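If you’re wondering where that 8-megapixel figure comes from, here’s a back-of-the-envelope sketch using the full-frame equivalent focal lengths quoted earlier (28mm for the iPhone wide lens, 35mm for the target framing) and treating the crop as purely geometric:

```python
# How much of the 12Mp iPhone frame survives a crop from the 28mm-eq
# wide lens to a 35mm-eq framing (the 17mm M4/3 comparison).
def cropped_megapixels(native_mp: float,
                       native_eq_mm: float,
                       target_eq_mm: float) -> float:
    """Megapixels left after cropping a wider lens to a longer framing."""
    linear = native_eq_mm / target_eq_mm   # linear crop ratio
    return native_mp * linear ** 2         # area shrinks by the square

mp = cropped_megapixels(12.2, 28, 35)
print(f"~{mp:.1f}Mp left after the crop")  # roughly the 8Mp quoted
```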

At a 100% view, the Oly looks a little more resolved, hell, it would be the digital equivalent of walking on water if it didn’t, but here’s the thing: at a 50% view or for an 8 x 11-inch print size the images look almost identical regarding detail. Differences?  The Oly shows less luminance noise in the blue sky and very dark greys, and the tonality of the iPhone shots looks more filmic and, to my eyes, nicer. (Yes, I could probably get a perfect match, but oddly the Oly files just don’t seem quite as flexible.)

There would be another way to skin this digital cat: you could shoot a two-frame stitch with the telephoto lens on the iPhone. Judging from extensive tests I have made with that lens, I’m pretty sure the iPhone would then wipe the smile right off Mr Oly’s face.

Honestly, I reckon most people could shoot the iPhone X wide angle in DNG and crop the image to 17mm equivalent (or 35mm for you FF shooters) and get a result that is absolutely fine for 95% of non-pro needs.

oly17mm

comparison of 17mm focal length with iPhone X cropped to match
At the top is the Olympus 12-40mm at 17mm and below is the iPhone 4mm cropped and upsized to match. This would equal 35mm for full-frame shooters. There’s little apparent difference in clarity and tonality, though there are some visible differences in colour reproduction which are easily tuned in editing if desired.

17mmOlycrop

Centre crops comparing 17mm equivalent output for iPhone X at 4mm and Olympus EM5 mk 2, 12-40mm f2.8, the iPhone X is much better than expected.
It’s only in these 100% centre crops that any real advantage for the M4/3 is visible; the larger sensor resolves a little more effortlessly, though whether you could see this in print is open to debate. Considering the iPhone image is both cropped and upscaled from its native 12 megapixels to match the Oly’s 16 megapixels, the result is mighty impressive.

80mm Equivalent.

Whoa there, now we are really stretching the bounds of credibility because to pull this trick off you are turning just 5.2 megapixels of iPhone X tele into 16 megapixels. Doesn’t sound like much of a contest, does it?

So you have two options here to get your 80mm full-frame equivalent, upsize the RAW file after cropping before export and make the most of a single frame or take a multi-frame capture and then stack and blend them in Photoshop.  I tried both.

Now, neither of the above approaches gave a result that looked close to the Oly’s 40mm shot at a 100% view, but I was surprised that a 50% view seemed about the same in detail, and I kind of preferred the cropped, non-stacked version of the X file.  This held true for the 8 x 11-inch test print crop as well.

The stacked version comes tantalisingly close for detail, but it just looks a little more forced and digital at a 100% view.

In the end, I’d have no qualms at all in cropping the iPhone X frames to the equivalent of 30-35mm, I feel the 40mm crop is excellent for web stuff and moderate size prints, but if I need to blow the image up, I’d much rather start with the Oly.

I imagine that if you saw any of these versions in isolation, you’d be perfectly fine with the print or screen image, so from a practical point of view, yes, you could shoot your iPhone X RAW in the tele mode and severely crop for that longer tele framing.

40mmOly

40mmiphonecrop

iPhone 6mm X lens DNG cropped to compare with m4/3 40mm lens.
OK, so now we are pushing the envelope. The top image is the Olympus at 40mm, the middle one the iPhone 6mm X tele lens cropped severely to match the Olympus. This image is OK but lacks the detail of the Olympus, as expected, and the noise level is easily noticeable. The bottom frame is a 4-frame image stack which was blended in Photoshop; it is much less noisy and competes reasonably well with the Oly version. The stacked version has ghosting on the quickly moving foliage, it was a very windy day, but I actually quite like the effect.

40mmcropOly

iphone x crop to match 40mm M4/3 image using DNG files.
Top, is the 100% crop from the Olympus at 40mm and below we have the iPhone 6mm which has been cropped and upsized to match the Oly’s 16 megapixels. The differences are apparent, but the iPhone is better than I expected, the stacked version, not shown here is much closer to the Olympus.

Say what about 100mm?

Got to be kidding, right?

You’re surely not going to get a great result with a crop like this from a single frame, but I did try an 8-frame stack, which involved cropping the RAWs, exporting at 16 megapixels and then blending the 8 frames in Photoshop.  This is not as radical as it might seem; there are several iPhone apps that do something very similar on the iPhone in either JPEG or TIFF.

So what did that taste like?

Well, not perfect by any stretch, but the result would be excellent for a 5 x 7-inch print or for web use, and with the right stacking/blending method the result is likely better than you might expect.
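The noise benefit that stacking delivers can be illustrated with a toy simulation. This is emphatically not the Photoshop blend itself, just the statistical idea underneath it: averaging N frames of the same scene cuts random noise by roughly the square root of N, so an 8-frame stack should show about a third of the single-frame noise.

```python
# Toy mean-stack: average N noisy "captures" of the same pixel value and
# compare the spread against a single capture. Values are invented for
# illustration only.
import random
import statistics

random.seed(42)
TRUE_VALUE = 128          # the "real" brightness of one pixel
NOISE_SD = 10             # per-frame sensor noise (standard deviation)
FRAMES = 8

def capture() -> float:
    """One noisy reading of the pixel."""
    return TRUE_VALUE + random.gauss(0, NOISE_SD)

# Simulate many pixels, each as a single shot and as an 8-frame average.
single = [capture() for _ in range(5000)]
stacked = [statistics.mean(capture() for _ in range(FRAMES))
           for _ in range(5000)]

print(f"single-frame noise sd ~ {statistics.stdev(single):.2f}")
print(f"8-frame stack noise sd ~ {statistics.stdev(stacked):.2f}")
```

With 8 frames the spread drops to roughly 10 / sqrt(8), about 3.5, which is why the stacked crops hold together so much better at extreme enlargements.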

100mm equivalent stacked frame image from iPhone x 6mm DNG, railway station test image.
Yes, this is a pretty serious crop. The image was created with an 8 frame stack using the 6mm iPhone X lens, the movement in the trees is due to the stacking though this is possible to avoid with a different stack method. This image would easily make a nice 5 by 7-inch print, maybe even larger, the option could get you out of trouble in a pinch. I find these stack images give a lovely long tonal range look.

And the Panorama Option?

I’ve got all sorts of lenses in several formats, but the one thing I don’t have is an ultra wide M4/3 jobbie, 12mm is as wide as I can go, which by the way is usually just fine.  Sometimes I want more, and I’ve become pretty adept at making holiday panoramas that cover the 8 to 12mm range with my M4/3 gear using stitch options.

Simulating the equivalent of 12mm with the iPhone is simple enough: just shoot two frames in the vertical orientation with about 20% overlap and stitch them in your favourite editing application.  I prefer Photoshop CC, but many other apps will also do a great job.
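As a rough sanity check on why two overlapped vertical frames from the 28mm-equivalent lens land near a 24mm-equivalent (12mm M4/3) field of view, here’s a flat-stitch approximation. Real stitching software reprojects the frames, so treat these numbers as ballpark only:

```python
# Horizontal angle-of-view comparison, all on a full-frame (36x24mm) basis.
import math

def hfov(eq_focal_mm: float, frame_width_mm: float) -> float:
    """Horizontal angle of view in degrees for a given frame width."""
    return 2 * math.degrees(math.atan(frame_width_mm / 2 / eq_focal_mm))

# Vertical orientation: the short 24mm side of the frame becomes the width.
one_frame = hfov(28, 24)
# Two frames with 20% overlap: effective width ~ 24 + 0.8 * 24 = 43.2mm.
stitched = hfov(28, 24 + 0.8 * 24)
# Target: a 24mm-equivalent lens used in landscape orientation.
target = hfov(24, 36)

print(f"one frame {one_frame:.1f} deg, "
      f"stitch {stitched:.1f} deg, 24mm lens {target:.1f} deg")
```

The stitch comes out at about 75 degrees horizontally against roughly 74 degrees for a 24mm-equivalent lens, close enough that the two-frame method genuinely covers the 12mm M4/3 framing.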

The more obvious alternative is to use the Panorama mode on your iPhone via the standard app. Again, this is super easy, but the results are nowhere near as good as those you can achieve via the DNG/RAW file image stitch pathway. The attached pics make the differences easy to pick.

The iPhone Panorama mode produces nicely sharp results, surprisingly sharp in fact, but there are a few deficits compared to the two other options.  First, you get an image which doesn’t have straight lines, and while it’s possible to correct this in editing, it’s by no means easy to do well.

Second, even if you’re very careful and keep everything entirely on the level, you’ll still get small stitching errors which mar the result, though you probably won’t see these in small prints and on-screen views.

The last issue with the panorama mode relates to the rendering of highlight tones and colours.  Highlights are often severely bleached/clipped and colours sometimes a bit over-cooked.  You can control the exposure, but it’s usually impossible to get an exposure level that gives an optimal result for the entire image, especially skies.

Ultimately you could use the Panorama mode as an alternative in many situations, provided you wanted a quick and dirty result, but post editing an in-iPhone panorama to get a top-drawer final result is likely far more work than the other two RAW options, and remember the panorama mode is a JPEG once exported.

The Olympus 12-40mm f2.8 is really stellar at the 12mm setting; you’d have to be super picky to find fault, and honestly, for an overall result it wins the contest, but it’s very close.  In the centre of the image, there’s nothing to pick between the iPhone two-frame stitch and the Oly at 12mm for clarity.  It’s only on the edges of the frame, where the necessary panorama transformations and slightly higher iPhone noise level conspire, that the Oly wins the contest, but again you’d need to be pixel peeping at a 100% view to see it.  I seriously doubt you’d notice any differences in clarity between them in an 11 x 8-inch print.

Tonally for this, I preferred the look I got from the iPhone DNGs, the highlights look nicer though the shadows are an even contest, meaning, the dynamic range seems to be about the same, heretical I know but I stand by that statement.

In the end, the only downsides in shooting the wide shots with the iPhone X are that it doesn’t suit moving objects, you lose the instant “in-camera” result, and there’s a little extra work in the editing phase.  On the other hand, you probably have the iPhone on you anyway, the results give next to nothing away in detail and actually the process is more flexible as you can easily go much broader if you want from the same two captured frames.  Personally, I wouldn’t hesitate to go down this path when I need something wider than my M4/3 gear offers or don’t have the Oly with me.

12mmOlyfull

iphonepan

comparison of 12mm wide angle image from M4/3 to iPhone X panorama and two frame stitch.
Three versions of 12mm, the top image is the Olympus 12-40mm, the middle one is the iPhone Panorama version and bottom, the iPhone two frame stitch. There will always be differences in the perspective, but generally, the two frame stitch is more flexible and gives nothing away in quality.

12mmOlycentre

Centre crops iPhone two frame stitch and Olympus 12-40mm at 12mm
Finally centre crops from the 12mm Oly and iPhone X two frame stitch. Oly at the top and iPhone bottom, there is really nothing in it, either is excellent and suitable for any practical purpose.

What to Choose and Why?

So there it is, both lenses on the iPhone X are terrific and offer excellent imaging potential for DNG shooters and the iPhone 8 Plus is very similar.  That the X competes favourably with my M4/3 gear in reasonable to good light is quite extraordinary.

My iPhone X won’t replace my M4/3 gear, I didn’t expect it would, to do that, low light performance would need to improve by another couple of steps. However, the potential shooting envelope is much wider than with previous iPhone models and the gap between the two formats in “reasonable to good” light is very much closer than I expected and probably closer than you expected also.

And let’s not forget, you’ve been looking at RAWs from both devices processed in a state-of-the-art RAW converter known to extract more detail than most converters. JPEGs from either camera are far less well resolved than the samples shown here, and most folks are just fine with the JPEGs; that said, the iPhone gains far more from the RAW/DNG option than the Olympus does.

It’s the addition of the tele lens that makes all the difference for the iPhone; I couldn’t consider the single-lens models a complete potential “personal camera” replacement, hence I always needed to have my regular camera with me at all times while on holidays and at family events. The tele lens bridges an enormous gap, and the improved image quality of the new iPhone models in RAW/DNG strengthens that bridge.

The most significant impediment for serious shooters would be the lack of depth-of-field control with the iPhone rather than image quality deficiencies.  Truth be told, the majority of photos most folk shoot are not dependent on shallow depth of field to work, and ultimately computational imaging methods may well render this a non-issue within just a few years; the portrait mode on the iPhone and other smartphones is indeed promising.

Ultimately with RAW/DNG files, it’s the potential of the file that matters, if the file is malleable, well resolved, contains all the data you need to push and prod the tones into shape, the results can be excellent.  A combination of excellent optics and remarkable sensor performance on the iPhone X and 8 provide a great starting point, if you don’t get good or even great results you’ve either fluffed the capture or haven’t yet nailed the processing end of things.

For now my M4/3 wins, sort of, but in another couple of years, the result could well be a tie or even a loss. In the interim, I’ve no worries about using my iPhone X for a wide array of photographic tasks in full confidence that, quality-wise, it will deliver.

iPhone X Tele Lens Part 2: The RAW Truth

Most people reading this will be familiar with the specs for the iPhone X tele lens, but just in case you missed them, here they are.

  • 6 Elements
  • 56mm equivalent
  • 12 megapixels
  • Fixed Aperture f2.4
  • Image stabilised
  • Autofocus via focus pixels
  • Body and face detection

Note 1:  It has not been possible to find a final spec on the sensor dimensions or pixel size. However, all information from Apple and my own tests indicate it is superior in all ways to the sensor in the iPhone 7 models.

Note 2: The sensor in the 8 and X models may be the same with only the lens and image stabilisation being different, though Apple has indeed hinted that the X version is superior. Hopefully, I can provide a definitive answer to this in the third part of the series when I compare the two with real-world images.

Note 3: No portraits in this article, yes I know lots of people will use this lens for portraits, but they tell you next to nothing about chromatic aberration, edge performance, cross-field resolution etc.  Check out the previous article if portraits are your game.  Anyhow lots of people will use the tele lens for all sorts of regular images, so this is for you folk.

A Request: If anyone really does know the exact sensor size I’d be keen to know also as it will help me to calculate exactly how the depth of field pans out, please drop me a line. I can make some estimations based on the focal length, but I don’t know for sure that the reported 6mm of the X tele lens is precise either.
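For what it’s worth, here is the kind of estimation I mean, taking the reported 6mm actual and 56mm-equivalent focal lengths at face value. Both inputs are rounded figures, so the result is rough at best:

```python
# Estimate the tele module's sensor diagonal from the reported actual
# and full-frame-equivalent focal lengths. Both inputs are rounded
# published figures, so treat the output as approximate.
import math

FF_DIAGONAL = math.hypot(36, 24)   # ~43.27mm, full-frame diagonal

actual_mm = 6.0                    # reported actual focal length
equivalent_mm = 56.0               # reported full-frame equivalent

crop_factor = equivalent_mm / actual_mm
sensor_diag = FF_DIAGONAL / crop_factor

print(f"crop factor ~ {crop_factor:.2f}, "
      f"sensor diagonal ~ {sensor_diag:.2f}mm")
```

On those numbers the crop factor is about 9.3 and the sensor diagonal somewhere near 4.6mm, but small rounding errors in either focal length shift the answer noticeably, which is exactly why a confirmed sensor spec would help.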

This review of the iPhone X tele lens only uses RAW/DNG files, no JPEGs; basically, I want you to see what the lens/sensor module is really capable of.  JPEG shooters will not obtain the same results, but I assume you already knew that.

The files were captured via ProCamera, but the results should be pretty typical of what’s possible via any decent app capable of DNG capture. Test files were processed in Iridient Developer on my Mac or Lightroom CC on my iPhone X.

Remember that RAW/DNG files can be adjusted to provide pretty much any colour rendering or white balance setting you desire, so stating that the RAW files produce a more mellow colour is meaningless, it all depends on how you choose to cook the files’ ingredients.

Ok, let’s dive into the good stuff.

Track repair machine at Coolamon railway station NSW, taken iPhone X using telephoto lens, wide tonal range with rendered highlights and shadows.
TLC image (see note at the end of document), Track Repair Machine in Coolamon NSW. This is an extreme contrast situation, yet the file holds information from deep shadows through to bright sun-bleached whites, it is also very highly resolved.

 

Resolution

Many photographers get hung up on resolution and judge a lens purely by how much detail it can record. I find that approach a little simplistic, but for all you pixel peepers, the answer is yes.

Yes, what?

Yes, this iPhone telephoto lens resolves very nicely indeed.

To judge resolution, I take several photos of real-world 3D subjects; I certainly don’t go shooting flat charts, brick walls and printed artwork. Testing lenses that way doesn’t tell you much at all unless you’re working with macro lenses, and it quite surprises me that people think close-up flat-field subject rendering is a credible way to test a lens. Anyhow, we have lots of photos here of real-world stuff at varying distances.

Next, all the tests have to be done in DNG (RAW); otherwise, I’d be just testing the JPEG processing of the device rather than the optics and sensor.

I shoot at the lowest ISO to ensure that if the device is adding noise reduction to the file, it’s minimal and also so I can get the best result in the RAW converter.

I check the files out using Iridient Developer with all noise reduction turned off; I also post-edit the extracted files in Photoshop to determine their malleability.

I try to find the interpolation algorithm that works best for the file at hand; Iridient Developer offers several options, and they give slightly different results and looks. In the case of the iPhone X files, V3 Ultimate and V3 Detail+ seem to give the best results.

Finally, I view the image at different percentages: 200, 100 and 50%.

So moving on, what have I found?

 

Goulburn railway station, standard test shot by Brad Nichol, iPhone X tele lens, wide tonal range, shows high-performance result.
UniWB capture shot at my standard lens test location. Even in this web-sized image, the even edge-to-edge definition, wide tonal range capability and solid resolution are obvious enough. This location really tests the mettle of a lens and camera with its combination of high scene contrast, fine details and straight lines. In short, the iPhone X tele passes with flying colours, no excuses needed.

 

The central resolution is about as high as you could reasonably expect for a small 12-megapixel sensor; it has no problem resolving fine textural information or low contrast detail.

I reckon if you were to look at the resolved detail and compare it with any decent 12 or 16-megapixel camera/lens combo offering the same angle of view, you’d be unlikely to find anything to mark the iPhone X down on (more on that later).

Furthermore, there is sufficient resolution to uprez to larger-than-native print sizes, and the files offer a reasonable degree of crop-ability for most purposes.

 

Edge Definition

The results here are pleasantly surprising; there’s little drop-off in detail as you move towards the edges and corners, and I’m confident no one is going to have an issue with soft corners in any application.

In particular, the very even cross frame resolution would make for high-quality image stitching results.

I’ve examined a wide array of images; even landscapes with very fine grass out in the corners show high resolution. The only trade-off I can see is a slight loss in corner contrast, but most of that would be due to vignetting and the resulting correction.

 

Green finish box on country race track, Coolamon NSW, iPhone X telephoto lens.
Another TLC frame of Coolamon Race Track in country NSW. The image was captured at 16 ISO and shows excellent detail rendering. Also evident is the high level of corner definition, there’s almost no fall-off in detail even at the extreme corners. There is some residual vignetting, but it’s not colour shifted as was the case with earlier iPhones.

 

Cropped image of test frame, "Coolamon Race Track", showing fine detail and dynamic range under very bright sunlight, iPhone X telephoto lens module, TLC, edited image.
This crop represents a 100% view of the above image (assuming you open it full size for viewing). A couple of small things can be gleaned from this: the lens and sensor have no problem resolving high levels of micro detail, the edges are pretty much as good as the centre of the frame, and you can get blooming/flare on very bright elements such as the white fence, but it’s well controlled and easily kept in check by holding the exposure back just a little.

 

Contrast

In the dim past, many smartphone lenses showed rubbish native contrast, especially if you pointed them towards bright light sources.  This behaviour was due to a combination of poor lens coatings and the effects of the front cover glass.

Some early iPhone lenses scratched up badly after mild use. These days the front cover lens is very tough and highly scratch resistant. Additionally, the coatings seem to be vastly better.  Even when the iPhone is pointed towards light sources contrast remains commendably high, this is especially the case with the telephoto lenses on both of the new iPhone X and 8 models.

The advantage of the higher contrast is seen in the shadows, which respond nicely to editing; provided you have good exposure, files can be coaxed into displaying excellent tonal gradation in the shadows.

Note, as always, the lens needs to be scrupulously clean for best results, something often overlooked. I’ve had many students who ended up with hand cream and fingerprints all over their smartphone, obliterating any semblance of resolution. That said, the telephoto lens is a little more forgiving of dust and small amounts of lens-borne pollution.

One nice side benefit of the newer iPhones is the better sealing of the phone and its internal workings. Older models sometimes built up dust under the front lens cover, requiring a pull-down to fix; I’ve not seen this on an iPhone 6 or later model. Still, it’s something to look out for if you are buying a second-hand phone, as it plays havoc with contrast rendering.

 

Goulburn Railway Station, section of test file, iPhone X telephoto lens module, 16 ISO UniWB capture, fully processed.
Here is another 100% crop, this time taken from the Goulburn Railway Station shot earlier in the article. There is some very fine filmic noise even at 16 ISO, but it actually makes prints and on-screen images look a little more organic, and you really need to look very hard to see it at all. Even very fine details are well resolved, and there is a world of difference between the RAWs and JPEGs in what can be extracted from the files. The light sources on the platform do not show significant flare, nor is there any substantial micro flare around white items in the scene.

Flare

Flare is closely related to contrast; any lens can be made to flare if you try hard enough, usually when shooting into the sun or towards some specular light source. Most older smartphones show horrible flare, particularly when shooting against bright white cloudy skies.

It is easy to confuse flare with localised over-exposure, which often occurs around light sources. Here the immediate area around the light source loses gradation and bleaches out; this is usually a result of sensor design and technical limits, but it may be made worse by poor lens coatings and designs that cause internal reflections.

I really pushed the tele lens to see if I could induce flare, and frankly, unless I did something idiotic, flare was a non-issue; specular highlights remained specular, and no flare-fests were evident.

The good news is that you will get excellent flare-free results without having to shield the lens, which has not always been the case. Certainly, my previous iPhone 6S often needed a little hand shielding in many situations.

 

Corner crop, iPhone X telephoto lens, test image Goulburn Railway Station.
This frame is an extreme blow-up of the corner of the main Railway test image; it’s from the bottom right side. We can see the fine textural noise, which really is of no consequence, but more importantly, the resolution is excellent. You’d be very picky to find any issue, and most kit lenses on regular cameras would be far less well resolved in the outermost corners.

Vignetting

It can be a little tricky to test vignetting with iPhone RAW/DNG files because all of the converters apply an auto-correction under the hood before you do any editing to the image.

Does this matter? Yep, it does; vignetting causes flow-on effects further down the editing chain, and less real vignetting makes for a more pushable file.

To see the native vignetting, you have to remove the effect of the base profile from the image, which I can do in Iridient Developer.

I can tell you that the native vignetting level is minimal in smartphone terms, and the fall-off is quite even, showing no sudden banding. The extreme corners certainly show a bit of extra darkening, but it’s easily corrected and unlikely to show up as an issue in real-world images. In short, the native vignetting performance is the best I have seen from any smartphone lens I’ve tested so far.
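If you fancy putting a number on it yourself, the check is simple once the profile is disabled: compare the mean luminance of a centre patch with the darkest corner patch and express the ratio in stops. Here’s a rough sketch of the idea; the patch size and the synthetic test frame are purely illustrative assumptions, not anything Iridient Developer does internally.

```python
import numpy as np

def vignetting_stops(image):
    """Corner fall-off in EV for a linear-luminance 2D array:
    mean of a centre patch versus the darkest corner patch."""
    h, w = image.shape
    p = max(h, w) // 20  # patch size, ~5% of the long edge
    centre = image[h//2 - p:h//2 + p, w//2 - p:w//2 + p].mean()
    corners = [image[:p, :p], image[:p, -p:], image[-p:, :p], image[-p:, -p:]]
    darkest = min(c.mean() for c in corners)
    return float(np.log2(centre / darkest))

# Synthetic flat-field frame with gentle radial fall-off, for illustration.
yy, xx = np.mgrid[0:200, 0:300]
r2 = ((xx - 150) / 150) ** 2 + ((yy - 100) / 100) ** 2
flat = 1.0 - 0.15 * r2
fall_off = vignetting_stops(flat)  # a bit under half a stop here
```

The one trap is that the measurement must be made on linear data; run it on a gamma-encoded export and the stops figure will be meaningless.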

 

Colour Shift

Most older Apple smartphones in RAW show a significant red colour shift as you move towards the outer edges of the frame, and this is made worse by underexposure or raising the ISO setting.

It’s not so much a product of the lens but rather a sensor issue which is exacerbated by vignetting, which of course means the exposure in the corners is always less.

You will almost never see this effect in JPEGs or indeed most RAW conversions because it’s dialled out early in the processing chain as part of the profile, but it has a flow-on effect for colour reproduction, colour noise and to a smaller degree, sharpness of the edges and corners. It can also make high-quality panorama stitching very difficult.

Naturally then, less native red-shift means a better-edited result. Again, the news is excellent: the tele lens module shows minimal colour shift; you can just see it in cyan-blue skies or if you have neutral greys out towards the corners. Finally, I can say the red-shift is a non-issue, and a big hooray for that.

Goulburn Railway Toilet Block test frame, iPhone X telephoto lens module.
This pic of the Toilet Block on Goulburn Railway Station has not had the profile that fixes the corner red-shift applied, nor has the colour been fine-tuned. The pic looks a little pink, but I don’t think many people would be too perturbed. One side aspect of note: the fine mesh on the walkway above the railway rendered nice and clear, and the highlights of the sky have not etched into the details, which indicates good flare control.

 

Chromatic Aberration

Honestly, I couldn’t find any; I tried all sorts of photos and subjects, and I looked at images at magnifications up to 500% with the input profile disabled. Nada, nothing. What can I tell you? Well, only that this is an impressive performance.

You may see a tiny bit of bluish fringing on white highlights if the exposure is set high enough, but this isn’t CA; instead, it’s the effect of very localised blooming, and even then I had to enlarge the image to 300% to notice anything.

Chromatic aberration test, Goulburn Rail Platform overpass bridge, iPhone X, telephoto lens module.
This pic is a side crop from the above image; if you are going to see chromatic aberration, this is where it will show up. Basically, there isn’t any; the only thing you may notice that might confuse you is some moiré on the wire mesh, but even that is pretty minimal.

Distortion

This is a little tricky; the distortion level is minimal, low enough that I doubt anyone would ever notice it unless you were into the masonry arts and shooting brick walls to get your jollies.

On a couple of occasions I thought I could notice a tiny bit of pincushion distortion and then on another occasion I felt it was barrel distortion.  It may change a little as the focus distance changes, but really there’s no distortion worth thinking about.

The great news is that if you wanted to create stitched panoramas, the telephoto lens should make the job very easy. Oh and just in case you were wondering, yes you can use the 2X lens to take panoramas using the standard Apple camera app.

 

Coolamon Main Street NSW, iPhone X Telephoto lens test. TLC, film like colour and tone rendering.
I was running a quick and dirty test to see if I could find any distortion in regular pics using the iPhone X telephoto lens; none of the test shots showed up anything of bother. This test frame is a TLC image, and one thing it does demonstrate is the extreme tonal range rendering of TLC files on the iPhone X. The shot is in direct and very intense afternoon summer sun with deep shadows under the awnings and windows; other than a little bit of clipping on some near-white painted areas, the image holds detail throughout the entire tonal range. Usually, I’d need an HDR approach to get this result, which in this case is really quite like what you’d expect from colour neg film.

Noise

To test the noise level, you need to be able to see how the RAW/DNG files look without any noise reduction applied. Different applications handle noise reduction in various ways, and some, like Iridient Developer, provide several options to deal with noise.

It might surprise you to learn that some noise reduction methods can actually make iPhone images look noisier because they alias the noise. It also depends on whether you’re after a clean down-scaled web image, a screen image for the iPhone itself or something optimal for large or small prints.

Ultimately the only way to get a handle on things is to judge the inherent noise in the unprocessed raw file and then extrapolate how that might translate for different output needs.

In the case of the dual-lens iPhones, we need to do this for each lens because the sensors are an integral part of each module and, I suspect, different in design.

Ideally, you’d want to use the iPhone at the lowest ISO setting to obtain the lowest noise levels, which is what I have done for this section; the next part deals with what happens when you ramp the ISO to higher levels.

 

Abandoned Fire Truck at Coolamon Track, iPhone x Telephoto lens test, TLC, texture.
Abandoned Fire Truck at Coolamon Track. I held back the exposure to ensure the highlights were fully recorded and used the TLC method for capture. The lens is excellent at capturing fine textural information, which is just perfect for this type of image and makes it an ideal tool when you want to convert images from colour to monochrome. Again, there are no apparent issues with cross-frame clarity, vignetting, CA or anything else. At first, I thought the picture showed some barrel distortion, but I now feel it was an issue with the subject and the lining up of the shot.

 

Crop of Fire Truck shot above, showing fine texture and clarity, iPhone X telephoto lens test.
A section crop of Fire Truck above. The crop shows excellent texture recording and clarity leaving plenty of room for contrast boosting and other editing tweaks. Basically, we have a nicely pushable file.

For this test I used ProCamera; I also took shots using both UniWB and TLC methods for comparison and examined the images initially with all profile settings off, in other words with no sharpening or noise reduction applied at either the early or later stage.

The lowest ISO option using ProCamera is just 16, though the app does not actually list this.  It’s not uncommon to be able to shoot within the range of ISO 16 through to 32 under bright sunlight.

TLC shots in bright sunlight at ISO 16 will give a shutter speed in the range of 1/100 to 1/250 sec, which is hardly a challenge on the stabilised iPhone X. To see any significant noise, you’ll need to zoom into a 300% view or greater; most of the noise will be fine luminance noise, but the deepest shadows will show some chrominance noise that is slightly blue-shifted.

Essentially the TLC files are noise free, and all noise reduction could be disabled, meaning you can then apply whatever sharpening you need to extract the detail level you want.

UniWB files typically end up with a better level of RAW exposure because the histogram is no longer acting like a filthy lying mongrel; you may get as much as two EV of extra exposure this way. The idea is to set the exposure so the brightest required details are just about to clip. Unlike the TLC versions, you’ll get some minor chrominance noise, which will tend to cut in from the middle grey tones and increase as you go darker down the tonal scale.

My take is that optimally exposed UniWB files show very little noise to worry about at the lowest or moderate ISO settings; you could just dial in a minimal amount of noise reduction and then sharpen as needed. Provided you don’t do something completely stupid, the image will look great.

Just so you know, “completely stupid” might mean turning the saturation up to eleven, applying crazy amounts of medium-radius sharpening, or trying to recover deep shadows that were meant to be left well and truly in the dark.

Now, most people won’t shoot either TLC or UniWB; they’ll just let the app do whatever it likes, so what exactly might they, or you, expect?

For regular exposures, you’ll find 100 ISO is the point where noise becomes a bit annoying and needs cleaning up; 50/64 ISO will show fine filmic noise that, for prints and downsized images, could be left alone and might actually be helpful; anything over 400 ISO will undoubtedly be a bridge too far.

Low contrast edited 16 ISO test frame, iPhone X telephoto module, garden scene, mainly blue and green.
Here is an edited version of the 16 ISO test frame, the DNG extraction, like most I do, was done with post editing in mind and therefore set low in contrast. It really demonstrates that the iPhone X Tele module can deliver the goods providing a pathway to full tonal rendering and a look that belies its smartphone origins.

 

Extreme crop 24 mp version iPhone X telephoto module/lens, garden scene, blue sky and tree with roof and bricks.
Now here is something extreme, this is a small section crop of the above image, but it is a 24-megapixel version of it. Yes, you get a little bit of luminance noise, but overall the resolution is pretty terrific when you consider that it is upsized to twice the native resolution!

Underexposure vs ISO Gain

With many regular DSLR and Mirrorless cameras shooting in RAW, it’s possible to shoot using the lowest ISO and merely brighten the image in the RAW converter. You will often get a result little different than what you would have achieved by ramping the ISO in the camera.  The main benefit is that such images often retain a better level of highlight detail.
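In RAW terms, that brightening is just a linear gain applied after capture, much like the gain the camera applies when you ramp the ISO, except the clipping decision is deferred to the converter. A toy sketch of the idea follows; the sample values are made up, and real RAW data would be a demosaicked linear array normalised to sensor full scale.

```python
import numpy as np

def push_exposure(linear_raw, stops):
    """Apply a post-capture gain of `stops` EV to linear RAW
    values, clipping at sensor full scale (1.0)."""
    return np.clip(linear_raw * 2.0 ** stops, 0.0, 1.0)

# A two-stop push of a deliberately underexposed low-ISO frame:
# tones below a quarter of full scale survive intact, while the
# 0.30 sample clips, just as it would have had the gain been
# applied in-camera by raising the ISO.
frame = np.array([0.02, 0.10, 0.20, 0.30])
pushed = push_exposure(frame, 2)  # -> [0.08, 0.40, 0.80, 1.00]
```

The practical difference comes down to where that clip happens: pushed low-ISO files keep highlight data that in-camera gain would have thrown away, at the cost of more visible shadow noise.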

I thought it would be a neat experiment to try this with the Tele lens on the iPhone X and then see what could be drawn out using a desktop RAW file converter.

The fundamental noise levels were pretty much the same either way, but I did notice a couple of anomalies: in bright conditions, the underexposed images seem to be sharper and more detailed than the higher ISO versions at the same shutter speeds.

I suspect the iPhone may be applying noise reduction to the DNG files as the ISO is ramped up, but I cannot prove this, and Apple don’t, of course, document it anywhere.

The loss of clarity really turns nasty at around 320 ISO and beyond 400 looks horrible in almost every respect to my easily offended eyes.

On the other hand, under low-light conditions where slow shutter speeds are needed to grab those fleeting photons, the higher ISO DNG files seem to render better results than severely underexposed low ISO frames. I took many images in the 1/30 sec and lower range, and in most cases, with custom processing, I could get slightly lower noise, more shadow detail and better clarity from the higher ISO captures when compared to the underexposed low ISO versions. Just note, the shutter speeds for each of the test pairs were identical, meaning the real exposure at the sensor level was the same in each case.

The extreme highlight detail was usually better with the low ISO frames once you applied the optimal processing settings, but not to the same degree as in shots taken in bright daylight conditions.

What became apparent in this process was that the optimal sharpening and noise-control settings differ considerably with each change in ISO, and it takes quite some effort to find the best ones. But in all cases, the final result was wildly better than the equivalent JPEGs from the camera, which were all pretty mushy and “watercolour-like”.

Under regular daylight (at higher shutter speeds) it seems the higher ISO ramps up the noise reduction enough to require considerably more sharpening in comparison to the lower ISO frames. While it’s entirely possible to obtain reasonable levels of resolution from the images shot at 200 ISO plus, the final results tend to look more forced and digital than the optimally processed but under-exposed low ISO frames.

For lower light levels, I found I could get an excellent compromise by choosing an ISO of 160; highlight detail was still very recoverable, the shadows were fine, and the noise looks comfortably filmic.

One final note on this: many folks would think the native colour noise (chrominance noise) from the higher ISO frames would be horrible. I found that not to be the case; sure, there’s colour noise, but it’s nowhere near as objectionable as it often looks in many of the processed DNGs and JPEGs I’ve seen on the web. The real issue is the processing methods, many of which exaggerate the noise, turning the fine colour noise into ugly clumpy blobs of colour with underlying rainbow clouds.

16 ISO no noise reduction, unedited
16 ISO Garden Test image, no noise reduction, no post-RAW extraction edit. It shows a little more colour saturation natively than the higher ISO versions.

 

32 ISO iPhone X telephoto garden scene test
32 ISO, no noise reduction. In this case, the tonality is a little more natural out of the box; you can only see noise in the sky, and it’s fine unless you want to blow the image up.

 

50 ISO Garden Test shot, iPhone X, Telephoto lens.
50 ISO, with the noise reduction zeroed; we start to see a bit more noise in the sky, but the file is still perfectly editable and shows plenty of fine textural detail.

 

100 ISO test frame Garden, iPhone X
At 100 ISO we start to trade off fine detail if we want to turn the noise reduction on. Interestingly, the shadows are more easily worked, but the highlights in the sky are tending towards clipping, though they are still OK and don’t show any colour distortion. Once you proceed beyond this point to 160 ISO, you need to compromise far more on shadow and highlight rendering, accept small highlight colour shifts, and add detail-sapping noise reduction processes.

ISO Range Performance

I find it hard to draw conclusions here as there are so many variables at play: the shutter speed used, the amount of light, whether you are targeting highlights or shadows and, ultimately, how you process the files. In other words, it’s nowhere near as simple as with JPEGs, where you can do a straight ISO ramp-up, point your finger at the results and give the definitive nod of approval.

But you want answers, so I will make some generalisations that will hopefully help you get optimal results.

  • Perfect focus matters more than with the standard lens, and under low light the iPhone sometimes hunts a bit and settles on the wrong focus distance; if the focus is out a little, you’ll probably try to ramp up the sharpening, and that makes the noise look way worse.
  • The tele lens needs to be held much steadier than the wide-angle one; you’re far more likely to get motion blur unless you pay a bit more attention. I found that 1/30 sec was the point where problems start to crop up occasionally, but everyone is different; your experience may be better or worse.
  • Anything over 400 ISO is going to look rubbish regardless of what you do in the processing.
  • 100 to 160 ISO is good for low light, even if you underexpose to keep the shutter speed practical.
  • Under bright light the lowest ISOs (16-32) can be radically under-exposed (like three stops or more) and still give terrific results; the main advantage is that this allows for extended highlight rendering without the hassles or deficits of HDR. A “2 stops underexposed” image shot at 16-32 ISO can do a very nice impression of colour neg film with sympathetic editing.
  • Under bright/moderate light there is a general drop in potential sharpness at ISO 125 and above, but it will be fine for many needs.

 

TLC Tractor
A tractor shot deep inside a shed at 100 ISO using TLC at the Coolamon showground. Minimal noise reduction was applied, which adds a bit of filmic grain but, importantly, keeps the micro details perfectly recorded on the grille and the like. Overall, the pic looks a lot like it was shot on colour neg film, and the file remains easily tweakable.

 

Postscript

Finally, I was fascinated by the excellent results delivered by the telephoto lens on the iPhone X and decided to do a little further exploration and see how it compared to a regular high-quality camera.

I chose to compare it to my Olympus E-M5 Mark II paired with the Olympus pro-grade 12-40mm f2.8 lens.

In the end, the comparison grew bigger than I planned: I also made test shots with the standard iPhone X lens and tried cropped and panorama shots on the iPhone to compare with a regular zoom on a mirrorless camera. All images on both devices were shot in RAW.

I guess, at this point, you are assuming the M4/3 reigned supreme; you might want to hold off on that assumption. Anyhow, the article will be posted soon, and then all will be revealed.

 

  • TLC files are captured using pre-light filtering; this is fully explained in my book “Ultimate iPhone DNG”, available on the iBooks store.
  • UniWB capture increases the overall level of exposure to the maximum; again, this is covered in “Ultimate iPhone DNG”.


iPhone X Tele Lens, How Good? Part One

Most casual iPhone followers think the telephoto lenses in the X and 8 series are the same; they’re not. The aperture is wider on the X (f2.4 versus f2.8), and the lens/sensor module on the X has image stabilisation. Those differences alone are important and should make for better images in many situations, especially when the light level is low. Note, however, that when shooting with the standard camera app, both the iPhone 8 and X will revert to a cropped version of the image from the standard lens when the light level really drops off a cliff.

I had several questions I wanted answered regarding the telephoto lenses and options on the iPhone 8 and X models.

Regarding the comparison between the two models I want to know:

Are the lenses optically identical?

Is there any difference in dynamic range between the two lens/sensor modules?

How does the native “depth of field” rendering compare?

Moving beyond those basics,  I also want to know:

How well resolved are the corners?

Does either lens show significant native vignetting?

Are there any issues with Chromatic Aberration?

How good or poor is the image noise?

Eventually, I will answer all of these questions in detail, but for today let’s look at the portrait mode.

In a more general sense, the portrait mode is fascinating, and I thought it would be good to initially look a little deeper into this option. I’ve seen a few comparisons online between the iPhone and Android versions of the concept, and though no smartphone camera has come close to perfectly simulating the shallow depth of field you get with a DSLR or Mirrorless camera, the results for most casual needs are more than passable.

I must add that at this point we are looking at first and second generation products; I’ve no doubt the next five years will see enormous gains in the final quality provided by portrait mode options. Your DSLR is certainly not yet a dead end, but it will increasingly become a threatened species.

Just so we are clear, some of my questions relate to the compressed HEIF or JPEG files produced by the regular camera apps, specifically the portrait mode option, but I also wanted to uncover the unvarnished truth and determine the true potential of the iPhone X telephoto lens when used optimally; for that, we must look to the DNG/RAW files.

This is going to be an in-depth examination so I’ve decided to break the coverage into three parts.

Part 1 Using the portrait mode

Part 2 Shooting DNG Telephoto on the iPhone X

Part 3 Comparison of the iPhone 8 and X telephoto lenses

 

Let’s get started then…

 

Old peoples home, artwork by Sun Yuan captured with iPhone X in portrait mode, monochrome.
Hyper Real Exhibition, National Gallery Of Australia, “Old Peoples Home” 2007 by Sun Yuan, in this case, the portrait mode worked perfectly.

 

Is the Tele Lens any good?

Cutting to the chase first, for those who have not used an iPhone with a tele lens option, the simple answer is yes: it’s a terrific option, and I couldn’t imagine going back to an iPhone without one. The lens quality is solid and produces sharp results, while the focal length helps me, at least, to create photos in situations where I would otherwise have chosen to use a regular camera.

Some testers have pointed out that you’re limited by the tele lens in low-light situations because its aperture is smaller (f2.4) than the wide-angle lens (f1.8). True, but it’s nothing like a deal breaker for most needs.

I far prefer the natural perspective the tele lens provides, and in particular I value the ability to compose more selectively by avoiding the background distractions that tend to encroach more easily when I shoot with the standard wide-angle lens.

Technically, the extra lens is not one you’d normally classify as a telephoto, coming in at around a 56mm full-frame equivalent. A 70mm equivalent lens would have suited me better, but strict physical limits apply when your camera is only a few mm thick. On the other hand, the 56mm equivalent focal length is long enough to suit a great many needs; there’s a whole bunch of reasons why the 50mm focal length became the most-used choice for film photographers… it basically works better more often.

Just in case you are wondering whether the tele lens might give a better result than one of those add-on lenses you’ve seen on eBay, or even one of the more expensive ones, trust me, there’s no comparison; the iPhone tele eats them alive, especially in the corners of the frame. Not one of the add-on tele lenses I have tried offered anything like sharp corner resolution; maybe there is a perfect option floating around out there, but I very much doubt it. By the time you pay for a good quality one, plus the mount/case needed to adapt it to your iPhone, you’re getting much closer to the price of the iPhone 8 dual-lens model, and let’s not even bother getting into the practical aspect of a chunky bit of glass added to your slip-in-the-pocket device.

 

10 month old baby, iPhone portrait mode, close up head shot, taken in shade.
A lovely snap taken by my wife Wendy of our Grandson, Milton, using portrait mode on the iPhone 8; a result that would no doubt please most nannas and poppas.

The Portrait Mode

My wife Wendy is a great fan of portrait mode, she feels it gives her noticeably nicer results when taking family pics and especially so when she takes photos of our Grandson, Milton.  I’ve included some of her pics in this article as examples.

Wendy is not what you would call a picky photographer; she just loves to take photos of the things in her life that matter or need to be recorded, just like most iPhone camera users. I could, of course, point out to her the technical deficiencies of the mode, and she’d just as likely say “so what”; she’s just a happy iPhone camper.

Here’s the point: most consumers won’t care about the techy stuff that hobbyists and serious photographers fret over; they just want nice-looking portraits with a minimum of fuss, and for them, portrait mode using the standard iPhone camera app delivers. For those of us who do sweat the small stuff, well, the tech is only going to improve, and we have other options if we want to use them.

On the other hand, I’m sure most users would notice the benefits once the mode is further refined and the results improved, but in the interim, most owners have no deal-breaking issues with portrait mode.

Me? I find it OK, really I don’t expect perfection, I am just grateful that it works at all.

 

10 month old boy on play gym, smiling broadly, teeth and tongue showing, iphone 8 portrait mode.
My wife captured this pic of Milton whilst he was in action on a play gym, there’s a little bit of blur but the result is really lovely, again the portrait mode has worked very nicely.

But the articles on this site are for those who want to take things to a higher level, so critique the portrait mode I must, and along the way offer some advice to help you get more out of it.

How does it work?

First up, let’s differentiate between the two parts involved in the option. The basic function is to simulate the shallow depth of field effect you’d expect if you’d shot using a DSLR or Mirrorless camera with a telephoto lens and wide aperture.

The second part is to apply lighting effects to give the portrait a look that channels studio lighting etc.  Currently, there are five options: Natural, Studio, Contour, Stage and Stage Light Mono; no doubt Apple or other third-party app developers will add to this set as time goes by.

At this point I’d say the mode is much better at the former aspect, the lighting effects are very patchy in practice, sometimes the result is great, but often it’s just awful and even casual users, my wife included, have noticed this.

There’s some confusion here because it seems most people have assumed that portrait mode refers to the combination of lighting and depth of field effects; not true, they are separate, and the depth of field effect can be turned off “in post” with the lighting effects still applied.  To do this you simply hit the edit option at the top of the screen in the Photos app and then tap on the yellow “portrait” tab; the DOF effect is removed but the Lighting effect option is still active.

Remember that none of this is destructive; the original image is still intact, we are just applying an instruction set to the image so it looks like the image has been adjusted when we view it….in other words, no pixel is harmed in the process!  What is critical, however, is that the file is shot using the HEIF format, because that’s the only way we can save the depth map needed to create the shallow DOF simulation.

You can read about the HEIF format here: 

https://www.cultofmac.com/487808/heif-vs-jpeg-image-files/

Ok, all good info, but how does it work?  There are two lenses on the iPhone 7 Plus, 8 Plus and X models, and when you shoot in portrait mode both of these are used.  The telephoto is used for the image, and a combination of the standard wide angle and tele lens is used to create a depth map of the image.  Say what! Depth Map!

Yes, that needs a little more explanation.

So we have two eyes; this allows us to gain a great perception of depth when viewing our world, because each eye sees a slightly different view, which means we can sort of see around things.  The upshot of it all is that our very clever brain is able to take these two images and blend them together so we see in 3D.

Our sense of 3D is excellent for close up objects, but as the distance increases the effect disappears; this is easy to, well, see.  Close one eye and look at a scene, and straight away you have lost the sense of 3D; open them both and we have 3D, that’s obvious, isn’t it?

Now, look at a distant object and close one eye, it looks pretty much identical because both eyes see it from almost exactly the same viewpoint.  Now look at something about a metre away and do the same; the difference between what each eye sees is quite profound.  It’s simple really: the closer the object, the more pronounced the difference between the views seen by each eye.

With me so far? Good.  Now it should be obvious that the views seen by each eye will not line up if you layered them over the top of one another; closer objects would show more misalignment than mid-range objects, and distant objects would show virtually no misalignment at all.

Now if we measured the misalignment differences we would be able to work out how far from the camera the scene objects are.  This is the principle behind rangefinder camera focus systems so there is nothing new under the sun here.
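That rangefinder relation can be written down directly: distance equals focal length times the separation between the two viewpoints, divided by the measured misalignment (the "disparity"). A minimal sketch follows; the numbers are purely illustrative, as Apple doesn't publish the real figures.

```python
# Textbook stereo-triangulation relation: depth = (focal_length * baseline) / disparity.
# All values below are hypothetical, chosen only to show the principle.

def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Estimate object distance (mm) from the pixel misalignment between two views."""
    if disparity_px <= 0:
        return float("inf")  # no measurable misalignment => effectively "far away"
    return focal_px * baseline_mm / disparity_px

focal_px = 2000.0    # hypothetical focal length expressed in pixels
baseline_mm = 10.0   # lens separation of roughly a centimetre, as on the iPhone

# Nearby objects misalign a lot, distant ones barely at all:
near = depth_from_disparity(focal_px, baseline_mm, 20.0)  # 20 px shift -> 1 metre
far = depth_from_disparity(focal_px, baseline_mm, 2.0)    # 2 px shift -> 10 metres
print(near, far)  # 1000.0 10000.0
```

Notice how a halving of the disparity doubles the estimated distance, which is exactly why the tiny baseline between the iPhone's lenses makes far-away distances so hard to measure precisely.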

Here’s the cool bit: if we can work out the distance of those objects, we can create a map of the scene which can be overlaid on the image and used to control or mask a blur filter.  In other words, near objects don’t get blurred and more distant objects or scene elements do.
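To make the masking idea concrete, here’s a toy sketch in Python (numpy only, greyscale, and in no way Apple’s actual pipeline): blend a sharp copy and a pre-blurred copy of the image, with the depth map deciding the mix at each pixel.

```python
import numpy as np

def fake_bokeh(sharp, blurred, depth, focus_depth, falloff=1.0):
    """Blend sharp and blurred copies of an image using a depth map as the mask.

    Pixels near focus_depth keep the sharp image; pixels further away fade
    toward the blurred copy.  All arrays are HxW greyscale for brevity.
    """
    weight = np.clip(np.abs(depth - focus_depth) / falloff, 0.0, 1.0)
    return sharp * (1.0 - weight) + blurred * weight

# Toy 1x3 "image": subject pixels at 1.0 m depth, background pixel at 3.0 m.
sharp = np.array([[1.0, 1.0, 1.0]])
blurred = np.array([[0.5, 0.5, 0.5]])
depth = np.array([[1.0, 1.0, 3.0]])

out = fake_bokeh(sharp, blurred, depth, focus_depth=1.0)
# Subject pixels come through untouched; the distant pixel takes the blurred value.
```

The `falloff` parameter (my invention for this sketch) hints at why the real thing is hard: choosing how quickly sharpness fades with distance is one of the many judgment calls the processing has to make.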

It all sounds simple enough, but in practice it’s very complicated because a whole bunch of parameters have to be determined: at what distances does the blur stop and start, how much do you blur the image, how do you blur without cutting into fine details like hair, leaves and the edges of fabric?  It’s enough to make a tech-head’s cranium spin.

Creating the depth map is the simple part, the tricky part is the post-capture processing and I have no doubt Apple will significantly improve this given time.

 

Hyper Real, Mark Sijan, Embrace 2014, monochrome image of sculpture, taken Brad Nichol, iPhone X portrait mode.
Photo of an exhibit at National Gallery of Australia, Hyper Real exhibition, Mark Sijan, “Embrace” 2014. The Portrait mode was just perfect for this image, the monochrome edit was done in Snapseed over a coffee in the NGA cafe.

Currently, there is a fair degree of AI used in the processing of Portrait mode images, and I expect Apple will increase this over time; it will make a significant difference, and the potential is well demonstrated by the Google Pixel 2 XL.  The Pixel has only one lens and relies to a much greater degree on AI for processing; it produces great results, sometimes better in fact than the tele portrait option on the iPhone.

At the capture level, the iPhone is hamstrung somewhat by the distance between the two lenses; it’s not much more than a centimetre, which limits the accuracy of the depth map.  Basically, the further apart the two capture lenses, the more accurately object distances can be measured.  Now you know why the wide gap between the two viewing windows on a Leica rangefinder is so important: it allows for very precise focus setting, which is especially important when using wide aperture lenses.

A second complication is that the two lenses are not identical, so I presume the image from the wide angle has to be cropped internally to build the depth map, which would mean the system is working with a lower resolution image.  Bear in mind Apple doesn’t exactly spell all this out, so who really knows?

If Apple were to increase the distance between the two lenses, the accuracy of the depth map would be better, and combined with smart processing this would probably make the system less prone to blurring fine details and other errors.  But I doubt Apple will venture down this path in future; it would compromise the practicalities of the iPhone too much.

 

Hyper real exhibit, Mark Sijan, Cornered, 2011, portrait mode, iPhone X, Brad Nichol
Another NGA exhibit by Mark Sijan, “Cornered”. Portrait mode has little problem with simple subjects against plain backgrounds like this and produces a nice result, again this was edited in Snapseed with the vignette applied in post, which is a much better look than that obtained with the “Stage Light Mono” option.

There are other ways of building depth maps.

Take a lot of photos at different distances in very quick succession and then selectively blend the images together using depth information.

Use infrared beams to map the scene and create a depth map from that information, which is exactly what the iPhone X does with the front-facing camera.

Use artificial intelligence to analyse scene elements and make depth judgments by comparing the relative sizes of those elements, refined with machine learning.  The latter is the method used by the Google Pixel 2 XL, which also uses information gained from the phase detection focus pixels.

Beyond that, we have sonar, which was used to determine focus with the old but lovely Polaroid SX70. It should also be possible to create a very accurate depth map by taking the actual photo then moving the camera left and right to capture additional frames which could be used for the map but not the photo, there have been examples of cameras doing a version of this too.

Basically, there are many ways to skin this digital cat but for smartphones, practicality is almost always an over-riding issue, in the end, it is likely to be a combination of methods that wins the day.

I suspect the next high-end iPhone will use the iPhone X front camera method on the rear camera to build a more accurate depth map for portrait mode; I’m sure Apple didn’t go to all that trouble on the front-facing camera just so we could unlock our phones and take better selfies!  I also suspect that long-term the iPhone’s portrait mode will use much more machine learning and probably more sophisticated multi-shot capture methods.  Remember, all of these options currently exist in some form or other; we’re not talking about reinventing the wheel but rather combining current technologies in more sophisticated ways.

 

Baby in bath with toys, looking up and smiling, sepia low saturation colour tint, taken iPhone X portrait mode.
Milton loves his bath and the portrait mode worked perfectly here. The trick is to keep the exposure “just right” so as not to burn off any of the highlights. The edits were in Snapseed, which included a vignette and sepia tint with low saturation. Honestly, this would look lovely as a large scale canvas print.

The main current limitation of the built-in portrait mode is that it only works in a range of around one to two and a half metres, due to the small distance between the two lenses, and you have no control over the blur effect.  There are third-party apps which now use the depth map to create blur effects, so watch this space; I’ve no doubt that even with the current hardware the results could be improved by better processing algorithms and perhaps even capture options.

The new APIs (application programming interfaces) that Apple opened up with iOS 11 mean that developers are free to create new apps that work with the portrait mode and depth maps.  If you want to try the better current alternatives, which by the way allow you to vary and control the DOF effect, check out these options:

Focos

Slør

Anamorphic

infltr

These apps really need their own articles, there’s some serious power on offer, but for today I’ll stick with the standard option as much as possible.

 

Is “Portrait Mode” just for portraits?

In short, no.  You can use the mode to capture objects other than heads, and a lot of folks are doing just that, some even creating passable results for “product” photography needs.

The front facing camera can also be used to shoot objects but it seems more attuned to portraits and it’s very difficult to get it to work on static objects reliably.

The main limitation is the mode does not work if you go too close, which in the scheme of things is not very close at all, basically about a metre.  I’ve found this close distance is quite variable; sometimes it will work a little closer up.  You’ll know if it’s going to work before the shot because it tells you right there on the screen: “move further away”.  This limit has nothing to do with the minimum focus distance of the telephoto lens, which is around 35cm.

 

Photo of wooden steering wheel on vintage car taken with iPhone X portrait mode.
Although portrait mode is not really intended for non-portrait images, it does an excellent job on occasions, this being one of them, but the results can be very patchy. Fortunately, the effect can be turned off after the shot is taken.

In theory, the mode should work even better at close range, as the disparity between the two lens positions is greater at close shooting distances than further away, so I’m not really sure why the close distance limit is so far out.

The good news is the effect is shown in live view, so you know up front how the result will look.  The bad news is the effect is often too strong, blurring parts of the objects you want left alone, and when editing in the standard camera app you can only turn the effect on or off; there is no way to paint on the mask (which you do not see at all) to moderate the effect.  (Remember, there are third-party apps that can solve this limitation.)

I’ve seen some pretty good photos of flowers, cameras, hands etc taken in portrait mode, so it’s definitely worth trying, but presently the mode will probably prove frustrating when you want consistency, and that’s my main gripe; much of this probably comes down to the limits of the AI currently used.  I can accept that it might not be perfect, but I have not been able to work out exactly when it’s likely to succeed or fail, it’s just so damned random, and the one thing I hate in photography is randomness of results; I like to know that if I do certain things I will get a predictable result.  With more usage I may develop an intuition for the option, but I’m not convinced that’s how it will pan out; I suspect portrait mode is still a bit, shall we say, quirky for things other than portraits, which is no surprise.

 

Using Halide on iPhone for Depth map, photo of car
I took a shot of my Lexus using Halide to see if I could get any depth effect at all, you certainly can’t with the standard Apple app. If you look closely, you can see that the background is indeed blurred a bit and the leaves on the tree closer to the camera are more resolved. Sure it’s not blurred in the way you would get with a DSLR, but the effect is better than nothing at all, so a handy trick to have up your iPhoneography sleeve.

So What are the main Issues?

Only works really well with faces.
Only works in the range of approximately 1 to 2.5 meters.
Blur is not adjustable. (with the Apple app)
There is a tendency to get blurred edges and fuzzed up fine details, especially hair.
Works best if the background is well behind the subject without any encroaching details from the sides.
The lighting effects are hit and miss and often far too strong.
Still not a replacement for your DSLR. (like that is a surprise)
You cannot zoom within the mode.
All of the above leads to the next question: is there anything we can do to make the results better and more predictable?  Or to put it another way, Brad, can you give me some killer tips?

I think I can, try these on for size.

Exposure!

Be very careful with the exposure; it is not a RAW file and doesn’t respond nearly as well to tonal adjustments “after the fact”.  This especially applies to the highlights, which once lost are impossible to recover and often end up with weird colour shifts.

Portraits live and breathe by their ability to properly render lighter skin tones and texture without clipping; bleached white patches on the skin just look wrong.  There are three things you need to attend to.

One, make sure you have the HDR option enabled; with iOS 11 this is done within the “Settings” menu, not within the app itself.  Just make sure you have the Auto HDR option enabled.

Two, once you have focused, slide your finger up and down on the screen to adjust the overall exposure, and ensure that the skin tones you want recorded with detail actually are before you press the shutter.  The newer iPhones are much better at controlling noise than previous models; a dark photo can be easily lightened in editing without turning into a noisy mess, and frankly, the pics often look more analogue when treated this way.

Last, if you are going to use the lighting effects, pay attention to how the effect is interacting with the exposure; it’s way too easy to overcook things at this point.  Note: you can turn the effect off after the shot is taken, so that’s not the problem; the real issue is that it makes it much harder to accurately judge the optimal exposure level.

My advice is to shoot without any lighting effects and then add the effect in post if you want it.

 

indoor portrait iPhone X portrait mode, tungsten light, Korean restaurant.
My son grabbed this shot with my iPhone X; the colour is a little too intense, being taken under tungsten light, but the effect of the portrait mode blur is quite lovely. Ideally, the exposure would need to be shorter to prevent the clipping of the highlights on Wendy’s shirt and my shirt collar. This shot would likely edit fine; the tip is to allow for the editing when shooting, as sometimes a darker initial capture is better.

Colour?

Stupidly, the iPhone still offers no way to control the white balance in the standard camera app.  It’s not normally a drama, as the iPhone is surprisingly good at setting an accurate auto WB most of the time, in fact far better than most DSLRs and Mirrorless cameras.  But it can still mess up under odd lighting conditions or if the subject is dominated by one colour range, especially scenes with lots of yellow or red and sometimes blue in them.

There’s no easy workaround because the focus/exposure/white balance are all set with the same tap point, though the camera does examine the whole image for WB.  You could point it at a grey object and then engage the AF/AE lock, but that will also lock the focus, so unless the subject and the grey object are at the same distance from the camera your subject may well end up out of focus.

Why has Apple failed to provide a white balance option?  Probably they just wanted to keep it simple, but other brands of phone certainly don’t dumb things down this much for those with a desire for a bit of WB tweaking.

The answer would be to use another app that allows you to adjust the White Balance and still capture a depth map.  In this case, you’d capture the portrait shot using the 2X camera with the depth map option turned on, the file would then be saved to the “Photos” app and edited as normal, where you can apply the Depth of Field and Lighting effects.  The downside being you would not see the DOF effect when you take the photo, in that respect, it would be a bit like the Google Pixel 2 XL, where the image must be processed first.

Currently, there are few apps that can record a depth map, Halide is an example of one which can, but none I’ve tried allow you to adjust the WB when using the depth map option.  I hope Halide addresses this soon as it would be an easy option for more serious shooters, better yet Apple how about you just give us a WB option…please.

Ultimately unless your WB is way off kilter it can be adjusted in post editing, if it looks like the colour is going to be beyond help then consider shooting against a less intensely coloured background and keep a close eye on the highlight exposure level.

Vintage vehicle interior against bright background, iPhone X portrait mode, bright colours, some background clipping, good shallow depth effects.
In this case, the portrait mode worked well, despite the non-portrait subject. Generally, the mode performs poorly on regular subjects when you go closer than about 1.25m. One issue I’ve found is that often the colour is too ramped up by default under bright light, but this can be adjusted in editing so long as you are careful not to clip the colour data, in other words, err on the under-exposure side if in doubt. The cyan on the car in the background is quite over-cooked compared to real life. The main fault here is the mirror on the far side of the red car, it is far too blurred.

 

Extending the Focus Range

Like I said, the standard range is around 1 to 2.5 metres; this cannot be changed, as the iPhone just switches automatically to regular shooting mode if the subject distance does not comply with the design parameters.  However, if you use “Halide” with Depth Map enabled, it seems able to extend the close focus down to around 40cm or so, maybe even a bit closer.  If you go too close, the editing in “Photos” seems to mess up the blurring and cause all manner of weird artefacts, but it is useful for times when you want to shoot product photos.

Halide also seems able to get the depth option working at a greater distance than the standard camera app does, so it represents a nice addition to your iPhoneography toolset.  Note, you cannot blur foreground objects when you focus on something more than, say, 2.5 metres out, and generally the blur in front of the subject is less convincing than the background blur.

Unfortunately, exposure compensation is the only control available when in Depth mode in Halide, but that’s better than nothing.

Another option for creating blur effects is to transfer the HEIF files into Photoshop and then use it to create the blur from the depth map; that would need a whole extra article, so I won’t go any further here.

Just in case anyone is wondering, and I know you were, you cannot yet shoot RAW in portrait mode; in other words, there is no depth map option with RAW because the depth map can only be recorded using the HEIF format.  An option could be to record both RAW (DNG) and HEIF together; so far I haven’t come across this option, and it would require more complicated post-processing to implement.

 

Radiator gauge on antique vehicle, red, iphone X portrait mode, good result, nice background blur, hupmobile.
Sometimes with close up non-portrait subjects the portrait mode nails it, bear in mind I took a few similar shots and only about half were acceptable. My frustration lies with the randomness of the results, I’m sure this will rapidly improve over the next couple of years.

Adjusting the Blur

Presently if you wish to adjust the degree of blur you will need to use another App such as the ones I mentioned earlier or Photoshop to process the image. You have options, but there is no way around the on/off choice if you stick with the regular app. Once you start using third-party apps you can apply any degree of blur you wish.

 

Blurry Edge Details

This one is tricky; as I explained, it’s a combination of a less than perfect depth map and the intelligence of the algorithms used to apply the blur effect.

One thing you can do is avoid shooting against very detailed or messy backgrounds; for example, portraits taken against blue skies with a bit of cloud tend to work well, as do portraits taken where there is a big distance between the foreground and background. It’s no accident that Apple’s sample shots tend to conform to this style.

The results also look better if there is a good level of subject/background contrast; when the tones and colours in both are very similar, the algorithms seem to get confused and randomly blur the wrong bits.

Another scene element type that causes grief is foreground objects with cutouts in them which allow you to see through to the background; in this case, the areas inside the cutout often fail to blur as you might expect.  Of course, this is not an issue with human heads, heaven forbid we should have holes in our craniums, but it does mess with product photography work.  Obviously, it’s hard to avoid the issue, but being aware and picking it up before you press the shutter button will at least give you a chance to rejig the shot before you shoot.

 

Antique motorcycle hand oil pump, iPhone X, portrait mode.
Results with complicated close-up shots can be very random, often background areas that appear within voids within the subject fail to blur, in this case, that’s only an issue in a couple of the small regions. The actual blur (Bokeh Effect) is nice, though the fall-off into blur is quite abrupt, other apps are able to give much better control over this so falloff control is not beyond the capability of the technology.
Metal sculpture NGV, iPhone X portrait mode, failure to work properly, blur in wrong spots.
Sometimes for no apparent reason, the portrait mode fails miserably, in this case, it blurred one side of the sculpture in the middle, a monochrome version without the effect is below. Ideally, I would like the background blur, but it would need to be done in post, I made several attempts to coax the portrait mode into working with this pic but nothing succeeded.


 

Better Lighting Effects.

Basically, they are all a bit too much for my delicate and easily offended eyeballs, but I do have a workaround.  This might sound tricky, but you can do it so long as you have an editing app that can work with layers; I suggest Snapseed, which is both excellent and free.

Take the photo using portrait mode without any lighting effects applied, and then duplicate the image in the “Photos” app.  Open the first image and apply the lighting effect you want, and leave the second one alone.

Now open the first in Snapseed and then, using the “Double Exposure” tool, open the second; you can now blend the two together as you see fit.  You’ll need to have a look at the tutorials for Snapseed to see how to do this, but rest assured the results can be pretty good.
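Under the hood, a blend like this is essentially just a per-pixel weighted mix of the two versions. Conceptually it looks something like the sketch below (a toy illustration only, nothing to do with Snapseed’s real internals):

```python
import numpy as np

def blend(plain, lit, opacity):
    """Mix the untouched shot with the lighting-effect version.

    opacity can be a scalar (uniform blend) or an HxW mask if you want the
    effect stronger in some areas than others.
    """
    return plain * (1.0 - opacity) + lit * opacity

plain = np.array([[0.8, 0.6]])   # portrait without the lighting effect
lit = np.array([[0.4, 0.2]])     # same frame with the effect applied
half = blend(plain, lit, 0.5)    # dial the effect back to 50%
# roughly [[0.6, 0.4]] -- each pixel lands halfway between the two versions
```

That’s why the duplicate-and-blend trick works: by choosing the blend strength yourself, you get the in-between intensities that the on/off lighting effects don’t offer.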

Of course whilst you are in Snapseed you have a whole bunch of other terrific effects on offer to use, so double the joy for those with a sense of adventure. There are of course a wide array of apps that can work with layers, Snapseed just represents a nice easy option for those not deeply into editing.

 

Modify the Light

Want to really get great results for portraits?  Why not try using light modifiers just like we do with any other regular camera; reflectors, blockers, diffusers and artificial light sources all work with iPhones too, you know.

Think about it, isn’t this what Apple is trying to synthesise with the Lighting FX anyway?  Well, why not try the real thing!

Lady in library, Victoria state library, iPhone X portrait mode, blurred foreground to match background, monochrome.
I took this pic of Wendy at the Victoria State Library in Melbourne; it’s one of those times the portrait mode worked well, producing a very nice background blur effect, and I applied a little blur to the foreground in Snapseed to match. Therein lies a lesson: often you need to treat the image as a starting point, as most seem to benefit from a little careful tweaking, especially if you’re after a nice monochrome look.

 

10 month old boy, colour, iPhone X portrait mode, looking left of frame and up, grey shirt.
No, this isn’t one of the pre-potted lighting effects; it is a product of the lighting on the subject, the apple of my eye and my Grandson, “Milton”. The pic has been edited in Snapseed to add some vignetting and a filmic simulation. The key is not to over-expose the original capture, or you will lose those delicate skin tones.

 

Define Your focus.

I’ve noticed that most times when people use portrait mode they don’t actually set focus, they just leave it up to the iPhone to decide, which is often perfectly OK.  But the blur effect can be significantly different depending on where exactly you set the focus, and this can be seen on the screen when using the standard iPhone app.

Accurate focus setting is especially important for “product” style photography but also applies to portraits.

Try it, just move the focus point a bit and tap, you might be surprised to find the effect looks better, or perhaps worse.

Crop Away

No, you cannot use the zoom when in Portrait mode but you can certainly crop the image in post which can be helpful for product shots or when you want a more closely framed portrait. Chances are you will still end up with 6 to 8 megapixels in the cropped image which is more than enough for regular prints and way beyond what you need for most social media pics.

It’s not a DSLR

Well Doh Einstein, of course not.

No, you’re missing my point, we have all sorts of cameras and they are suited to some needs way better than others, I think many cameras get a bad rap when people try to use them for something they were not designed for in the first place.

See I get a little bit testy with camera reviewers saying things like iPhones are rubbish cameras because they can’t shoot wildlife, distant sports, studio portraits etc.  Well sure they won’t excel at that, but then your DSLR makes a pretty crap pocket camera too.

It all boils down to this: is it sufficient for the job you have in mind, or is it the wrong tool for the job?  Only you can decide.  All I’m saying is don’t expect your iPhone to be something it isn’t, a DSLR replacement; work within its design parameters and I’m sure you’ll find it fine for the task.

Concluding.

I’m sure that we’re going to see massive development in the area of “portrait” mode on smartphones over the next few years, really the technology is just in its infancy and probably 5 years short of full market maturity.

I can see that over the next year or so we will have many apps coming to market that leverage the depth map option and possibly do a much better job than the standard iPhone app does.

In the meantime, it remains a great feature that handily expands the shooting envelope for most smartphone users, and the results, while not perfect, are superior to what you get by using the standard wide angle lens for portraits and product shots.

Next up I will look at the DNG files taken with the Tele lens on the X; I think you will be pleasantly surprised.

 


Gone to Ginza with an iPhone

Modern buildings in Ginza strip, showing glass, stainless steel, acrylics and concrete. Taken at night, traffic light and ginza sign.

When on holidays I take both my iPhone and my Olympus M4/3 camera kit; the Oly gets a workout on the more serious stuff, mainly when I need more reach or better low light capability, and the iPhone serves for most everything else.  I thought some of my readers might be interested in seeing a little photo story of the Ginza district in Tokyo, captured on my iPhone while my wife and I wandered the shiny streets of Ginza.  I have another version of this article on my regular site with some extra pics taken on the Olympus EM5 MK 2; you can check that out here:

https://braddlesphotoblurb.blogspot.com.au/2017/11/ginza-district-of-Tokyo-photo-story.html

Ginza is 87 hectares of high-end, over-the-top, consumption-worshipping retail nirvana for Japanese shoppers with money to burn and a need to proclaim their superior economic status. The greater Ginza area provides a fascinating insight into the culture of modern Japan and presents photographers with a veritable feast of options, both for the tummy and the lens.

My wife and I, along with our son Aaron and his partner Jain, spent 6 days in Tokyo recently, staying in a hotel in Ginza: https://www.gardenhotels.co.jp/eng/millennium-tokyo/ .  The lodgings were superb and ideally located for access to the Ginza district, the subway system and great eateries where you can exercise your gastronomic muscles.

Like many high-end shopping precincts around the world, Ginza is dripping with the usual brands, except perhaps the presentation is a little more excessive than usual. Considering that Ginza is home to some of the world’s most expensive retail real estate in “dollars per square metre” terms, that excessiveness becomes all the more impressive, especially when you compare the retail space sizes to the residential spaces of Japanese units and homes.

Ah Ginza, it’s all “be-on-neon, sidewalk fashion parade and busy with a purpose”.  But, my friends, in case you think it would be like, say, Times Square or some similar location in other parts of the western world, be assured that Ginza has a flavour that is entirely different and in many ways uniquely Japanese, which is what makes it so fascinating.

 

 

Ginza strip at Dusk, city lights and neon
Ginza Strip at Dusk.

 

First the familiar: Ginza is devoted to the church of conspicuous consumption, and the brands of choice are the same as almost everywhere else, Cartier, Hermes, Prada, Gucci and all the other usual suspects. Most of the shoppers are women, indeed most of the stores are aimed at women, and of course there are a lot of very nicely dressed people parading under the bright evening lights.

As always, the store window displays are works of art, though not dissimilar to the same stores’ displays in other locations around the world, as you might expect in these days of corporate uniformity and branding.

 

Window displays of Ginza, Halloween period, aliens
Aliens on Harumi-Dori, part of a large contingent who have descended upon Tokyo for Halloween.

 

The whole Ginza edifice is built on the concept of consumption rather than materialism; the joy is in the shopping, browsing, touching, and ultimately parading the high-end bags along the streets post-purchase.  Of course, most non-food purchases fall into the category of a “declaration of status” rather than fulfilling any real need for body covering, personal hygiene or life’s practical necessities.

I read an interesting article yesterday on the issue of consumerism and materialism; it’s well worth a look if you have the time, and it probably typifies the drive behind Ginza more than anywhere else in the world except perhaps Dubai.

https://www.theguardian.com/business/2017/oct/30/to-cure-affluenza-we-have-to-be-satisfied-with-the-stuff-we-already-own

But now for something completely different. Ginza is also home to some incredible Japanese department stores that sell brands and foods that are uniquely Japanese, examples being Mitsukoshi, Matsuya and Wako.  You may not wish to buy anything at all, but I promise a walk through the food halls alone will leave the average westerner agog at the quality and presentation of the foods and even more impressed at the range on offer.

 

Multi level high end fashion display in Ginza strip
Uniqlo, fashion store, Ginza style.

 

Beyond the department stores, you have speciality shops that are uniquely Japanese, such as the G.Itoya stationery store and the Hakuhinkan toy store (or more accurately, emporium).

It is possible to explore Ginza at a subterranean level, moving from shop to department store via the subway passages, which is convenient during typhoons, and many folk choose this option to avoid traffic and crossings. Move out onto the streets and you’ll notice several other things.  First, while there are a few high-end European cars, the vast majority of vehicles are taxis, and almost all of them are black old-school Toyota Crowns that seem ideally suited to their purpose and are immaculately clean.  In fact, all vehicles in Tokyo, including commercial trucks, seem fresh from the carwash; a little thing perhaps, but quite striking compared to cars in most cities around the world.  The link below will give you some insight into the Tokyo taxis.

http://autoweek.com/article/car-life/unsung-taxi-heroes-tokyo-toyota-crown-sedan-and-crown-comfort.

 

Lexus 430 on Ginza, air bags, gold, parked, jacked up
Modified Lexus LS430 on the Ginza Strip, the LS series are very popular with Japanese Modders.

 

Nissan display for 2017 Tokyo Motor show, Nissan Corner Ginza
Nissan hyper sports concept car waiting to pounce from behind glass on Nissan Corner, Ginza.

 

Private passenger vehicles in Ginza tend to be high-end Toyotas and Lexuses; there are few other brands on display, maybe the occasional high-end Nissan, and frankly I think about half the world’s fleet of Lexus LS600s must reside in Ginza alone.

The most distinctively Japanese vehicles you’ll see in Ginza are the Toyota Century sedans, Japan’s most prestigious car and almost always chauffeur-driven.  The conservative but exquisitely built Century is the vehicle of choice for CEOs, government officials and the very wealthy; it’s the ultimate Japanese automotive statement.  Oddly, a Century with the chauffeur in situ seems able to be parked anywhere with complete immunity from harassment by police or parking officers.  The Century looks bland in photos, but in reality, upon the Ginza pavement, a Century seems imposing, regal and stylish in an old-school way.

 

Toyota Century waiting on Ginza Strip, dark blue, series 2, V12
Deep blue Toyota Century in waiting on the Ginza Strip, the ultimate Japanese automotive status symbol, a luscious V12 provides the motivation in almost complete silence.

 

Your ears will notice, or should that be not notice, something else.  For such a busy place the traffic seems remarkably quiet: no loud exhausts and definitely no horns.  In our entire time there I only recall hearing a car horn on a couple of occasions.  Generally, cars are driven in a calm, sedate and orderly fashion, the complete opposite of what you might experience in, say, Rome.

The streets are a combination of broad avenues and narrow thoroughfares, and most are one-way, but regardless of width they are spotlessly clean, free of buskers, beggars, pavement furniture, advertising boards and other physical impediments.

People move with purpose but in an orderly fashion; there’s no pushing and shoving, no talking loudly on phones or aloud to one another.  You could say those manners are pivotal to daily life in Ginza, but that’s true of Japan generally.

Regarding fashion, Ginza is conservative.  Japanese women do not flaunt sexuality but rather dress immaculately in beautiful materials, exquisitely cut and tastefully trimmed with discreet jewellery; refinement is the word that sums up the style.  Men tend towards the universal black suit, black shoes and white shirt, in other words, the typical business uniform one would expect to see in the financial districts of Manhattan.

Of course, Ginza is not all about shopping; there’s much eating to be done as well.  From the food halls in the basements of the department stores through to the myriad of speciality restaurants, there’s an option for almost every palate, except perhaps for those looking for typical American-style fast food.  KFC and McDonald’s are present but much rarer than in other cities.

 

teashop in Mitsukoshi department store, three staff, fancy packaging.
Typical of the displays you see in the food halls in the basements of most Japanese department stores, in this case the tea counter. Japanese packaging is exquisite.

 

One constant, however, is coffee shops; there are Starbucks and equivalent-style shops on every block, but I’d say for coffee-culture-loving Aussies like ourselves the coffee is generally a disappointment, except at a few specialist coffee shops.

Ginza is close to many of the other Tokyo delights such as the Imperial Palace and Gardens, the Fish Market and Tokyo Tower, plus a wealth of other attractions.  The metro system is highly efficient and cheap, placing you within striking distance of almost anything you could wish to see within around 30 minutes or less.  For Aussies used to the vagaries of Sydney trains and buses, forget everything you have ever experienced; Tokyo, despite its massive 24 million population, just works, on time, every time!

 

Hibiyakoen, a park near Ginza, Tokyo, gardens, ponds, zen style.
Hibiya Park, a short walk from Ginza, sitting on the edge of the government agency district.  Almost universally, Japanese parks are places of serenity and always spotlessly clean.

 

East Garden of the Imperial Palace, Tokyo, Japan, pond, autumn.
East Garden of the Imperial Palace, you need to walk a little further afield, but it’s well worth the time and effort.

 

Just to finish up on the technical side of things, the pics are mostly DNG captures, but there are some JPEGs shot on the standard app when it suited, and the multiple exposures were all JPEGs shot using Average Cam Pro.  As always, the DNG files were extracted in Lightroom Mobile (now known as Lightroom CC), and I’ve done a little fine-tuning in Snapseed.  The frames were created in Photoshop on the PC, and in some cases a few small selective edits were made while there.

 

Composite image, Ginza strip, neon lights and cars, modern architecture.
Bright Lights Ginza Strip.

 

Multiple exposure, Tokyo plaza crossing, Tokyo Ginza district.
Crossing in front of Tokyu Plaza, Ginza, the intersection of Sotobori and Harumi-Dori.

 

Abstract art, window display, orange tones, Ginza Tokyo, Japan
Abstract art created from window display opposite Nissan Corner, Ginza.

 

Rain in Typhoon season, Ginza, Japan, from High rise.
Yes, it rains in Tokyo; the second typhoon of our visit made for damp times. Taken from level 9 of the G.Itoya stationery store.

 

Lady standing in street under umbrella, Ginza, Japan, during Typhoon season, roads clear of cars.
No cars on many Ginza streets on a Sunday. Wendy, my wife, takes shelter under the standard clear umbrella as the typhoon rains soak the city.

 

Fish market near Ginza, Japan, seller, dried fish product.
The Fish Market area, a short walk from Ginza, is home to a vast array of restaurants and specialist shops; wall-to-wall people on most days.

 

Modern buildings in Ginza strip, showing glass, stainless steel, acrylics and concrete. Taken at night, traffic light and ginza sign.
Stainless steel, glass, and acrylics are used to stunning effect throughout the Ginza area.

 

lady in rain in Ginza lane in drenching rain, running, neon door surrounds in background.
Drenching rain in a laneway, keeping dry in Typhoon season can be a challenge.

 

Tea on table in high end tea cafe, Ginza, includes timer, cup, pot and gold strainer.
It’s not often that your tea comes with a golden strainer and a timer. Ginza Miyuki-Kan, Ginza Gochome, one of the more specialised coffee and tea houses.

 

G.Itoya stationery store in Ginza, papers on display.
The Japanese take paper very seriously; I have never seen such an array of paper types in my life. G.Itoya has a whole floor devoted just to papers, and photo papers also come in a bewildering variety of options not seen in most other parts of the world.

 

Don’t forget you can learn how to make your RAW, DNG iPhone files rock by purchasing my “Ultimate iPhone DNG” book from the iBooks store.  It’s the most comprehensive eBook around on the use of DNG on the iPhone and is the first in a series of 6 planned iPhone Photography publications from Zero One Imaging.

Buy it on the iBooks Store, click on this link:

https://itunes.apple.com/us/book/ultimate-iphone-dng/id1274334884?ls=1&mt=11


https://en.wikipedia.org/wiki/Ginza

How Good is DNG on the iPhone 8 Plus?

iPhone 8Plus DNG, test shots, Iridient Developer, High Contrast,

A few weeks back I ran some tests on the RAW files taken with the latest iPad Pro, you can read about that here.

https://iphoneraw.com/2017/07/18/dng-on-the-ipad-pro/

Frankly, I was pretty impressed; the quality was indeed considerably better than what’s possible on my soon-to-be-replaced iPhone 6S Plus.  Those test results got me quite excited because I fully expected the DNGs produced by the new iPhone 8 Plus to be a further small step up the quality ladder.

As far as I can tell the camera modules on the iPad Pro and iPhone 8 Plus are pretty similar, save for the lack of stabilisation on the iPad, but like all things Apple it can be quite difficult to get any definitive answers on what’s going on under the hood.  Anyhow, I’d have been happy if the iPhone 8 Plus DNG files were merely as good as the iPad Pro’s; since they actually seem a bit better, I’m pretty chuffed. For comparison, the shot below is one of the test images I took with the iPad Pro, converted to monochrome; the overall quality is rather nice.

iPad-Pro-10.5in-DNG-Goulburn-Railway-Station-Bridge
Sometimes test shots work out nicely in themselves, and I quite like this one; perhaps it is the layered effect. Anyway, it shows how the deep shadows (under the bridge) hold up pretty well. Nothing is clipped either.

And so here we are, just a few days after the iPhone 8 release, with a peek beneath the DNG hood.   Up front, consider this a preliminary test; it’s my wife’s iPhone and it only arrived Friday morning, so my time with it was very limited, basically an hour or so on Sunday afternoon.  Frankly, dragging any new Apple device from Wendy’s hot hands when she’s in the first blush of Apple love is harder than getting our Border Collie to give up a bone.  But Wendy, being a lovely lady and terrific wife, agreed to let me have a little free time with her new 8 Plus baby.

Note also that I only tested the wide-angle lens, not the telephoto; there’s no point comparing apples to oranges and then coming up with grapes, and the 6S Plus has no telephoto lens option.

You still can’t shoot DNGs using the standard iPhone camera app; I imagine Apple decided the great majority of iPhone shooters will just want to deal with compressed, finished JPEGs, except of course they’re not JPEGs anymore but rather the new HEIF format, and a big hooray for that. It’s certainly long past time for that clunky, chunky old JPEG format to be replaced with something much more modern.

If you want to know about the HEIF format here is a link for you to check out.

https://www.cultofmac.com/487808/heif-vs-jpeg-image-files/

This review is not about the fancy-schmancy modes the standard app offers; you’ll find plenty of info in other places if you want to know the ins and outs of the portrait mode or that cool portrait lighting mode, suffice to say I reckon they are pretty cool.  Wendy gave those headline features a big workout over the weekend with our 8-month-old grandson Milton, and apart from having a lot of fun she found the results were actually pretty good most of the time.

This test is just about the potential of the DNG files; later I intend to explore the other options in depth, once I get my own iPhone X in a couple of months.

I actually think the iPhone 8 Plus DNG files have more relevance than those of the previous versions because the general capabilities of the new cameras are much better all round. Now that might sound an odd thing to say, since traditionally we shoot RAW/DNG to overcome the limitations of JPEG capture, but bear with me.  I reckon a lot more people are going to choose the iPhone as their only camera; I can easily see DSLRs and mirrorless cameras being left behind sulking in cupboards whilst their owners pop off for two weeks of R and R.   That improved shallow depth-of-field effect will be enough to sway the choice for many casual and semi-serious users; most folk care little about how the result is achieved and just love the fact it can be done at all, much to the chagrin of many old-school shooters.

I can also imagine a lot of folk will still take their DSLRs on holidays and then faced with the choice of carting the gear around some foreign city for a day will decide…nah….leave it in the motel room, I’ll just slip the iPhone into my pocket.  Next holiday the DSLR won’t even make it to the front door!

Think about that for a moment: iPhone pics have been fine for many needs for years now, but the new features and HEIF format raise the bar to a point where many more photographers will see the iPhone as “perfectly sufficient”.  What else out there combines lighting effects, great panorama modes, synthetic depth of field, slow-mo, great 4K video, time-lapse, perfect connectivity and so on in the regular camera world… anybody… c’mon… and of course you can shoot pretty good DNGs as well if you want an imaging edge.

At times more serious users will certainly want the wholesome goodness and flexibility that DNG capture offers, which brings us back to the question at hand: just how good or bad are the iPhone 8 Plus DNGs?

Whilst the following test pics are neither fully comprehensive nor great works of art (but then what can you do when you only have the device for an hour or so), I reckon they give a pretty solid insight into the iPhone 8 Plus DNG option and its potential.

I always test with everything locked down, with optimal exposure and focus control; when we test, we should test “exactly” what we say we are testing, which means eliminating the variables as much as possible.  You can be pretty confident these pics are a valid representation of what you can expect from the DNG files if you take care shooting and spend some time doing proper edits.

As for the shooting, I use and recommend two applications, Lightroom Mobile and ProCamera (both of which are covered in detail in my iBook “Ultimate iPhone DNG”, available on the iBooks store); between these two apps you can do pretty much anything you’d reasonably expect to be able to do when shooting DNG on your iPhone.

The exposures were optimal, and some were captured using a UniWB method on both the iPhone 6S Plus and 8 Plus; read the book if you want to know about that.

The editing?  First, Lightroom Mobile, to get an insight into what’s possible using only the raw converter on the iPhone, and then some post-DNG tuning in the new version of Snapseed (which is very nice, by the way).  On the Mac I used Iridient Developer, followed by some Photoshop CC time to check the edit-ability of the iPhone DNGs.

Just so you know, nothing extracts detail from files like Iridient does; it represents the ultimate, and additionally there is an absolute plethora of ways the files can be processed within the application, including alternative noise reduction and sharpening methods.  I came up with a few workflows for the files based on what I’ve done in the past with iPhone DNGs; these worked a treat, but it’s worth adding that given some serious exploration time I could probably get a little more out of the DNGs using more refined workflows.

My general principle with Iridient is to render out a result that can be fine-tuned in Photoshop.  Some folk might say my approach is not relevant to them; maybe true, but they can always use Lightroom Mobile.  On the other hand, if like me you really want to know what the iPhone 8 Plus DNG limits are, then this is the way to do it.

You’ll find I refer to the iPhone 6S Plus as a comparison; I think that’s totally valid, as most people buy their iPhones on two-year contracts or keep them for the two-year period, meaning the most likely customers for the new 8 series iPhones will be the 6 series updaters who’ve skipped the 7 series models.

Alrighty, let’s get down to it….

iPhone 8Plus DNG, test shots, Iridient Developer, High Contrast,
A very high contrast scene, but the 8 Plus DNG holds detail throughout the entire range. The lens shows no obvious distortion despite the relatively wide view.

Angle of View

The focal length of the lens on the 8 Plus is slightly shorter than on the 6S Plus; I assume the actual sensor dimensions are the same (maybe not, I haven’t been able to track down a definitive answer).  From the comparison pics it looks like the 8 Plus has a slightly wider angle of view, but I’d need to lock both down on a tripod and shoot them side by side to be sure.

The 6S Plus lens was a 29 mm equivalent and I’d say the 8 Plus is around a 27.5 mm equivalent, but I’ll confirm this with future tests.
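To put those estimates in perspective, here’s a minimal sketch of the diagonal angle-of-view arithmetic for full-frame-equivalent focal lengths. The 29 mm figure and the estimated 27.5 mm figure are from above; the 43.27 mm full-frame diagonal is the standard 36 x 24 mm sensor diagonal, nothing iPhone-specific.

```python
import math

FF_DIAGONAL_MM = 43.27  # diagonal of a standard full-frame (36 x 24 mm) sensor


def diagonal_aov_degrees(equiv_focal_mm):
    """Diagonal angle of view for a full-frame-equivalent focal length."""
    return math.degrees(2 * math.atan(FF_DIAGONAL_MM / (2 * equiv_focal_mm)))


aov_6s_plus = diagonal_aov_degrees(29.0)  # ~73.4 degrees
aov_8_plus = diagonal_aov_degrees(27.5)   # ~76.4 degrees (estimated focal length)
```

So if the 27.5 mm estimate holds, the 8 Plus sees roughly three degrees more across the diagonal, which would match the slightly wider view in the comparison pics.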

Depth of Field

Whilst the difference is not great, the wider aperture on the 8 Plus does seem to give slightly more separation when you shoot scenes that include near and distant objects.  This is to be expected of course, but it looks a little more pronounced than I had anticipated.

I assume the higher level of overall lens/sensor performance, in all measurable parameters, does more to change that apparent depth-of-field rendering than the wider aperture does.  Basically, slightly out-of-focus areas always look more out of focus if the in-focus areas are rendered truly sharply in comparison, which they are with the 8 Plus camera module.

Distortion

The distortion characteristics of the 8 Plus are benign; that is to say, I couldn’t see any change when I turned the lens correction on/off in Lightroom Mobile, and even in Iridient Developer I couldn’t really see any distortion in the uncorrected files.

I’d need to run further tests on a tripod with fixed straight-edged subjects to say with conviction that there’s no distortion, but at this point that looks to be the case, which is quite impressive.

Compared to the iPhone 6S Plus

The 6 series modules have some pincushion distortion which is just visible in uncorrected files, so a win to the 8 Plus; I’m just not sure by exactly how much.

iPhone 8 Plus colour rendition for DNG, neutral colour renderings.
The DNG has been edited in Iridient Developer for a slightly filmic look; the iPhone 8 Plus doesn’t seem to have strong colour biases, making it easy to liberate any rendering style you want.

Colour

With raw files the colour rendering is mainly a product of the choices you make when extracting the files; the white balance, tint, saturation and vibrance are all adjustable, but it’s also true that the sensor design and the processing chain will have an effect on how the final files respond.

Of all the criteria this is the hardest to qualify.  I think Lightroom Mobile produces lovely colour with the 8 Plus, but it’s pretty terrific on the 6S Plus files as well.

Colour can be fine-tuned in raw converters or photo editors in post, and the rendering of colours is not baked into DNG/RAW files the way it is with compressed formats; at best I can make a couple of comments on how the files look and how they responded when edited.

If anything, the yellows are a little more dominant than ideal, blues are slightly cyan-shifted, and greens can end up a little yellow-green.  All colours seem to accept selective editing really well, and fine-tuning white balance is very easy.  Really, I’d need an opportunity to shoot a wide array of images, including portraits, indoor lighting and colour checker charts, to make any meaningful judgement.

I did try a mixed-lighting shot in my daughter’s kitchen that had filtered window light and tungsten, and overall the resulting image looked rather good; in other words, the tungsten-lit elements were warm but controlled and the window-lit areas not overly blue. Generally, the result was much better than what I saw with the 6S Plus.

Compared to the iPhone 6S Plus

The 8 Plus seems a little less prone to accentuating certain colours; basically, it’s easy to get neutrals looking neutral, and artificial light sources don’t seem to cause “runaway” colour tints.  I’d judge the 8 Plus an improvement, but I need to investigate further.

mixed light kitchen
I took this rough shot in my daughter’s kitchen to see how the iPhone 8 Plus DNGs handle mixed colour temperature lighting. Very well, in fact; the tungsten is not overpowering and the filtered blue daylight through the windows is not overly blue. It’s a win.

Noise

I expected the noise levels to be reduced compared to the iPhone 6S Plus, as the 7 series modules produce raw files that are definitely better in this regard.

So what did I find?  No competition here at all.

For those shooting in the standard camera formats, JPEG and now HEIF, noise is usually a non-issue, as the iPhone processing pretty much blurs all the noise away along with the fine detail. On the other hand, with DNGs we have total control and can play the trade-offs against one another; that alone could be reason enough to shoot DNG.

The 8 Plus DNG noise is much lower than that of the older modules, especially the 6 series; you can see the difference everywhere in the image, but it’s especially obvious in blue skies and shadows.  If the file is correctly exposed at the lowest ISO (as a raw file, not as if it were a compressed processed file) you can turn off all noise reduction in Iridient, no qualms at all.

When noise initially appears it’s low-level chrominance noise showing up in neutral-toned areas, but I found it very acceptable at the low test ISOs, and there’s almost zero luminance noise in smooth-toned areas if the file is optimally exposed, i.e., at ISO 20.

Compared to the iPhone 6S Plus

No contest, the 8 Plus easily bests the 6S Plus, and importantly this means you can push the sharpening and micro-tonal contrast adjustments more aggressively.

Derelict home taken with iPhone 8 Plus DNG, test shot, edited in Iridient Developer, tuned in Photoshop CC
This home has seen better days. Looking at this downsized image it’s obvious that the clarity across the entire frame is excellent, detail is held right out to the corners and the tonal range looks natural; it looks like it could have been taken with any high-quality camera. It should be noted too that the afternoon light was highly contrasty.
Crop from verandah test image, iPhone 8 Plus DNG.
This small 100% crop from the verandah shot earlier in the article gives a good idea of the sort of detail the iPhone 8 Plus DNG files liberate. Textural information, in particular, is well expressed and should look nice in print.
100% view of iPhone DNG files, iPhone 8 Plus
Here we have a 100% crop (approx) of the home; the detail rendering is excellent with the DNG files, and you can even see the twist in the barbed wire on the top of the fence. Look even closer and you can see the nail holes in the timber on the side of the verandah. Detail and resolution are certainly not an issue.

Detail

The DNG files from the 6S Plus are vastly better than the JPEGs, which always show unpredictable mushiness, a lack of very fine detail and sometimes a very watercolour-like look.  I expected the new HEIF format to be much less damaging to the files, and thus that the difference between DNG and compressed capture under normal shooting would not be as significant.  So how did that assumption pan out?

Well, although not covered in this test, I did look at the compressed standard files, and there’s no doubt they hold much better fine detail than the old mushy JPEGs of the 6S Plus; there’s far less of that watercolour rubbish I detest.

Frankly, I was not expecting a big improvement in detail rendition with the DNG files on the 8 Plus; the 6S DNGs were already capable of very well resolved results provided the exposure was nailed, and DNGs converted in Iridient extract about as much detail as you could ever reasonably expect from any 12-megapixel image. So are the 8 Plus files better? In the centre of the frame it’s a pretty close call; the native files showed little difference in detail, but the win goes to the 8 Plus… just.

But there is much more to it.  The 8 Plus shows a higher level of clarity across the entire frame because the lens is simply better, and more importantly the native noise levels are much lower, meaning you can apply correspondingly higher levels of image sharpening without the noise becoming obvious and degrading the image.

The lower noise level pays off particularly when applying very low-radius sharpening to bring out textural details.  With earlier models you really had to back off early or you’d get a combination of ugly colour flecking and rough grain.  The 8 Plus files beg to be texture-sharpened and respond really well to it.
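For readers unfamiliar with how that kind of sharpening works, here’s a tiny illustrative sketch, not Iridient’s implementation, just the general unsharp mask idea in one dimension: the sharpened value is the original plus some amount of the difference between the original and a blurred copy. The same arithmetic that exaggerates edges also exaggerates noise, which is why the cleaner 8 Plus files tolerate more of it.

```python
def unsharp_1d(signal, amount=1.0):
    """Minimal 1-D unsharp mask: original + amount * (original - blurred).

    A 3-tap box blur stands in for the low-radius blur a real raw converter
    would use; edge pixels are clamped rather than wrapped.
    """
    n = len(signal)
    blurred = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]


# A soft edge gets overshoot on both sides, which reads as extra sharpness:
edge = [10, 10, 10, 50, 50, 50]
sharpened = unsharp_1d(edge)  # values dip below 10 and rise above 50 at the edge
```

Any noise speckle in `signal` gets the same overshoot treatment, which is exactly the colour flecking and rough grain the older, noisier modules produced when pushed.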

Compared to the iPhone 6S Plus

Better, but not a chalk-and-cheese difference.  In the end you have more sharpening flexibility with the new camera, which will be a big bonus for those wanting to crop frames or blow them up to larger sizes; in particular, the improvements in the corners of the frame are obvious.

Dynamic Range

The raw files on the iPhone 6S Plus have considerably better dynamic range than the JPEGs, especially if they are captured using optimal UniWB exposure (read about that in “Ultimate iPhone DNG”).  I’ve always felt iPhone DNGs did a much better job with the highlight end than the shadows, which despite all sensible efforts usually still ended up lacking good detail and tonality.  Ultimately highlights are far more important than shadows, so it was a fair trade-off, but now I don’t have to trade anything… cool!

I really need to run some comprehensive tests on this, but I’m confident the iPhone 8 Plus will provide both better highlights and much-improved shadow detail under almost all conditions.  It boils down to this: even if the true dynamic range were no wider (and I think it is wider), the shadows have far less noise and record more recoverable detail than the 6S Plus ever did, meaning for DNG captures you can reduce the exposure to protect the highlights, knowing you’ll be able to brighten the image in post without turning it into an ugly noise-fest.
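That expose-down-and-push idea can be sketched with a few lines of arithmetic. All the numbers here are purely illustrative assumptions (a hypothetical 10-bit linear raw scale, not measured iPhone values):

```python
RAW_CLIP = 1023  # hypothetical 10-bit linear raw ceiling


def capture(scene_value, ev_offset):
    """Simulate a linear raw capture; ev_offset is exposure compensation in stops."""
    return min(round(scene_value * 2 ** ev_offset), RAW_CLIP)


highlight, midtone = 1400, 200  # the highlight is too bright for nominal exposure

# At nominal exposure the highlight clips to the raw ceiling and its detail is lost.
clipped = capture(highlight, 0)     # 1023, clipped

# One stop down, both values fit within the raw range...
dark_high = capture(highlight, -1)  # 700
dark_mid = capture(midtone, -1)     # 100

# ...and a +1 EV push in post restores the true tones. The cost is amplified
# shadow noise, which is exactly where lower native noise buys extra headroom.
recovered_high = dark_high * 2      # 1400, highlight detail kept
recovered_mid = dark_mid * 2        # 200
```

The push multiplies noise along with signal, so the cleaner the shadows start out, the further you can safely underexpose to protect the highlight end.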

The 8 Plus will still clip highlights, it is a small sensor after all, but I noticed that the highlight recovery tools in Iridient worked a little better with the 8 Plus files, tending to give a more neutral colour rendering and avoiding the harsh tonal break-up I’m used to seeing with clipped 6S Plus files.

Compared to the iPhone 6S Plus

The 8 Plus is better, probably mainly due to the lower shadow noise levels. Neither device is going to be as flexible as a regular DSLR or mirrorless camera, but if you’ve only ever shot JPEGs on a smartphone you’ll be quite amazed at how good these DNG files can be.

Crop colours iPhone 8 Plus DNG file, detail and tonal rendering
This 100% crop from the tea towel shot above gives a pretty good idea of the degree of detail and micro-tonality on offer with the DNG files. Really, there is nothing to make you think this was shot with a smartphone camera.
iPhone 8 Plus DNG test image, late daylight conditions, green field and blue sky with trees in the distance.
Taken late in the afternoon just before sunset near Gundagai NSW. Even in this downsized version you can see the DNG file holds a lot of fine detail in the grass; that ability makes photos look more three-dimensional. Clarity in the close corners looks spot on too.

Lens Quality

I thought the lens quality of the 6S Plus was pretty good, though mine at least would sometimes render corners randomly soft; it might be the top right in one shot, the bottom left in another, and so on.  I suspect this is due to weird interactions with the 6S Plus image stabilisation, but I’ve never been able to prove that conclusively.  Generally, the 6S Plus edges and corners resolve noticeably less well than the centre.

The iPad Pro lens with its 7 series camera module performs much better than the 6S Plus; this might be due to less diffraction as a result of the wider aperture, or it could just be a better design. Regardless, the lens on the iPad Pro eats the 6S Plus for dinner, resolving very well across the entire image, and my iPad Pro doesn’t show any uneven corner softness at all.

It makes more sense in this case to compare the lens performance of the 8 series to the 7 series module, as it’s a given the 8 Plus will easily best the 6S Plus version.

So the answer?  The DNG files look to have slightly better edge definition on the iPhone 8 Plus than on the 7 series modules.  Like most lenses, the corners aren’t exactly equal in resolution; the bottom left is the softest on the test sample, but honestly it’s still very, very good.  Let me put it this way: the cross-frame performance is much better than any kit lens I’ve ever tested on DSLRs or mirrorless cameras set at the wider end of the range, and you certainly don’t look at the 8 Plus DNGs and think, “damn, I wish that corner was sharper”.

Compared to the iPhone 6S Plus

The improved corner definition compared to my 6S Plus is very obvious, especially when you look at the DNG files in their native state; no competition here, a knockout for the 8 Plus, and it looks a bit better than the 7 series modules as well, though this could be down to other processing-chain factors rather than optics.

Image of bare tree displaying corner performance of iPhone 8 Plus lens and DNG files.
Taken from the extreme top right corner of the derelict home image, a couple of points are obvious. First, there is no chromatic aberration, and second, very little purple fringing; bear in mind this is exactly the circumstance where you would expect to see both. Additionally, the shadows hold tonality, and with selective editing more detail could be brought out. Also note that there is no noise in the blue sky, and this image has been processed with all noise reduction turned off! It’s really only the very outside corner area where clarity falls off a bit, but honestly this is quite excellent compared to pretty much any lens, and who really pixel-peeps the extreme corners anyway?

Chromatic Aberration

Just so we’re clear, we’re talking about magenta/green and yellow/blue colour fringing here, not the purple fringing you can see around dark lines set against bright light sources; the latter is not regular CA and has a different cause.

I’m very sensitive to CA; I find it visually disturbing, and even a little bit of it gets me queasy.  CA messes with the colour as you move towards the edges and corners of the frame and also reduces peak sharpness.  Most photographers will argue, “yeah, but it can be fixed in post”. That’s true, but a CA-fixed image will never be as sharp as one created with a lens that exhibits no CA at all. Give me optically corrected CA any day.
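
To make that trade-off concrete, here’s a tiny numpy sketch of my own (purely illustrative, not what any RAW converter actually does). It simulates lateral CA on a single scanline as a ~0.2% magnification difference between the red and green channels, then “fixes it in post” by resampling red back onto green’s grid. The correction helps enormously, but the interpolation leaves a small residual error, which is exactly why a software-corrected file never quite matches an optically corrected lens.

```python
import numpy as np

# Simulate lateral CA on one scanline: the red channel is magnified ~0.2%
# relative to green, so red detail lands slightly outward from centre.
x = np.arange(-500, 500, dtype=float)
green = np.sin(x / 7.0)             # stand-in for fine image detail
red = np.sin((x * 1.002) / 7.0)     # the same detail, radially stretched

# "Fix in post": resample red back onto green's grid (linear interpolation)
red_fixed = np.interp(x / 1.002, x, red)

print(np.abs(red - green).max())        # big mismatch towards the frame edges
print(np.abs(red_fixed - green).max())  # tiny, but non-zero, residual after the fix
```

The residual is small but never exactly zero; that interpolation loss is the softness penalty of profile-based CA correction.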

Now, up front I have to say that iPhone lenses since the 5S have been pretty good in this regard; each iteration seems to have reduced CA a bit, and frankly, it’s never the bothersome issue it is on most regular camera lenses (even expensive ones).

And now, I present, with great fanfare…tadaa….the first lens I have ever tested where I could not actually find any chromatic aberration when zoomed in to a 200% view on an uncorrected RAW/DNG file.   Just pause for a second and ponder that: I said none, nada, nothing.

Yes, you’ll get a little purple fringing if you push the exposure hard enough, but that’s a horse of an entirely different colour, literally. Regardless, the purple fringing is really well controlled, basically a non-issue, all of which tells me the lens must have pretty high native contrast, excellent coatings and a superb design.

Anyhow folks, you can stop worrying about chromatic aberration, and be confident that any residual purple fringing, when it shows up, can easily be sorted in the RAW converter or Photoshop (or something similar) with a fringing-correction tool. Lightroom on the desktop does a great job of sorting this, for example.

Compared to the iPhone 6S Plus

The 6S Plus always performed well in this area, but zoomed in, the 8 Plus is clearly better; in particular, high-contrast purple fringing is not as well controlled on the 6S Plus.

Just One Thought

Killing chromatic aberration through lens design alone is very difficult for a whole array of technical reasons. Most kit lenses don’t even get close to sorting the CA within the lens itself; that’s done in software when making the JPEG, or via a profile in the RAW converter.

I checked the DNG files without any corrections enabled and found zero CA, which raises a question I can’t answer. Have Apple found some way to perfectly correct the CA before the DNG file is written, bypassing profile corrections in the RAW converter later on, or is the lens simply incredibly well corrected for CA?  I don’t know, but the results are great.

Colour Banding

Colour banding has been a real bugbear of mine with iPhone images since the first iPhone I owned, a 3GS.   I hate banding, loathe it, detest it, I don’t like it and it makes me want to throw up, well not quite…. but you get the idea.  Banding is also devilishly hard to correct in post editing without causing other flow-on problems.

Banding, or posterization, particularly shows up in blue skies and on bright skin tones, but it’s also common on yellow objects with many cameras, including iPhones.  What complicates the matter is that some of the visible banding in the past was due not to issues with the files and inadequate bit depth, but rather to the display panel.  I often found that apparently banded images were quite OK when extracted and viewed on my desktop 5K Mac.

The iPhone 8 Plus has a much better display; not as good as the X promises to be, but still much better than the 6S Plus. In fact, as soon as you look at images on the 8 Plus it’s obvious the display is far better, so I expect that display-induced banding will cease to be an annoyance.

It’s a bit early for me to pass definitive judgement; I really need the chance to shoot a lot more photos with large areas of blue sky, yellow cars, portraits in bright light and so on to be sure….but it certainly looks like the banding issues are significantly reduced, or even eliminated.  None of the quick test pics showed any tendency towards banding or breakup, no matter how hard I looked or how hard I pushed them in editing!

Compared to the iPhone 6S Plus

Again, it’s hard to be sure, but the DNG performance looks to be much better. The real test will be when I can get the phone back off my wife later this week and torture test for banding using Lightroom Mobile’s HDR feature; I’m quietly confident that the “banding is on the run”, both for the files and the display!  BTW, it’s pretty easy to get the “bands” when pushing 6S Plus files in editing.

Vignetting

All iPhone/iPad RAW/DNG files I’ve tested have shown red-tinted vignetting in their native state.  JPEG shooters are likely unaware of this, as the standard processing deals with it automatically; most RAW converters also deal with the worst of the issue via a built-in profile, but sometimes you still see it in skies and smooth-toned areas near the edges of the frame.

The red vignetting shift is mainly caused by issues within the sensor’s design and the way it interacts with the lens, and it’s diabolically tricky to eliminate if present.  In the case of past iPhones, the red shift in the DNG files became much worse as the ISO was raised.

The truth of the vignetting matter is revealed by taking DNG files and viewing them with all profile adjustments turned off. You can’t do this on the iPhone, nor is it possible with many desktop editing apps, but it’s easily done in Iridient Developer.

Does it matter? Absolutely. That vignetting not only causes colour shifts in the corners but also increases noise levels, reduces corner shadow detail and limits your ability to get the best possible results from the files.  For example, you’d need to dial back the sharpening and increase the noise reduction if you don’t want messy corners and edges, and you’d need to perform advanced radial edits to get the most out of your DNGs. Red-shifted vignetting might not be a big issue to many of you out there in interweb land, but to me, it’s a massive PIA.
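
For the curious, those radial edits amount to multiplying each channel by an inverse falloff model. The sketch below is a hypothetical illustration — the 1 + k·r² gain and the per-channel strengths are made-up numbers, not measured iPhone profiles — but it shows both halves of the story: boosting red harder in the corners is what neutralises a red-shifted vignette, and that same multiplication is why corner noise gets amplified along the way.

```python
import numpy as np

def correct_vignetting(img, strength_rgb=(0.35, 0.25, 0.25)):
    """Boost the corners of a float RGB image (values in 0..1) by a radial
    gain of 1 + k * r^2, r being normalised distance from centre.
    The k values are illustrative; extra red strength counteracts a
    red-shifted vignette. Note the gain also amplifies corner noise."""
    h, w, _ = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = (((yy - h / 2) / (h / 2)) ** 2 + ((xx - w / 2) / (w / 2)) ** 2) / 2
    out = np.empty_like(img)
    for ch, k in enumerate(strength_rgb):
        out[..., ch] = img[..., ch] * (1 + k * r2)  # centre untouched, corners lifted
    return np.clip(out, 0.0, 1.0)
```

On a uniform grey frame this leaves the centre alone, brightens the corners, and brightens the red corners most — the digital equivalent of the profile correction baked into the converters.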

So….the iPhone 8 Plus has much less native corner vignetting than the 6S Plus, and a little less than the 7-series modules as well. Additionally, the vignetting is far more colour neutral; there’s a very slight colour shift, but nothing like the horrible red shift on previous models, and it’s only seen on plain blue skies, if at all.  With the 6S module, you could see it on every uncorrected frame, and it even ran well in towards the centre of the image if the ISO was raised just a bit.

Unprocessed iPhone 6S Plus DNG image, analysis of real RAW image quality.
This is what your iPhone 6S Plus DNG file looks like when you turn off all adjustments and profiles in the RAW converter; in this case, I used Iridient Developer on my Mac. A couple of points to note: the DNGs were shot with the exposures set right to the clip point using UNiWB in ProCamera; for the iPhone 6S Plus this renders a darker image, as the sensor cannot accept as much light before clipping.  Next, have a look at the vignetting; it’s far greater than in the following 8 Plus frame and also shows a significant red shift in the corners, which gets much worse as the ISO is raised.
iPhone 8 Plus DNG test file, unprocessed image of derelict home, shows good vignette performance and clarity.
This is the unprocessed DNG from the iPhone 8 Plus. Apart from being lighter, the most obvious difference is that the file has much lower vignetting and very little red corner shift; it also looks a bit better resolved.
Test frame iPhone 6S plus showing red/magenta vignetting shift in unprocessed files.
This is a basic extraction of the iPhone 6S Plus DNG file using Iridient Developer. I’ve left the lens profile turned off so you can see just how much that red/magenta colour shift affects the corners of the frame; it really has a pronounced effect right in towards the central 30% of the frame. Even properly processed files (and that includes JPEGs) will often display this red-shift problem, especially if the ISO is raised beyond about 100.

Response to Editing

This is where the rubber hits the road for DNG files. JPEGs are just so damned brittle: push the tones and colours, or try to re-sharpen, and all sorts of nasty things happen. I haven’t tested the HEIF files for editability, but the specs of the format tell me it should be pretty flexible.

The 8 Plus files edit very well, both in Lightroom Mobile and on the computer in Iridient Developer. Shadows can be pushed, highlights recovered and selective edits applied without horrible tonal breakup.  The files can be sharpened easily, and the much lower noise means greatly improved options for shadow recovery.
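
Shadow recovery of this kind boils down to a tone curve that brightens the dark end while leaving the highlights alone. Here’s a minimal sketch of the idea — my own toy curve, not what Lightroom or Iridient actually implements:

```python
def lift_shadows(x, amount=0.6):
    """Toy shadow-recovery curve for a tone x in the range 0..1.
    Blends toward a gamma-brightened copy, weighted so that the
    darker the tone, the more it moves."""
    lifted = x ** (1 / (1 + amount))   # gamma < 1 brightens
    weight = (1 - x) ** 2              # weighting is strongest in the shadows
    return x * (1 - weight) + lifted * weight

print(lift_shadows(0.1))   # a deep shadow gets a healthy lift
print(lift_shadows(0.9))   # a highlight barely moves
```

The catch, of course, is that any such lift also amplifies whatever noise is hiding in the shadows, which is why the 8 Plus’s lower native noise translates directly into more recovery headroom.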

As a little test, I shot an image along an old railway bridge in Gundagai NSW after sunset. It’s an extremely contrasty lighting situation, and the phone wasn’t level either, as I was shooting through a crooked wire fence.

Looking at the original DNG capture, you could easily decide all is lost; it looks hopelessly dark, and honestly, if this were a DNG shot on the 6S Plus there’d be no hope. But take a look at the recovered, edited and cropped image: it actually looks pretty reasonable.

The processing was done via a combo of Iridient Developer and Photoshop CC. Yes, it has some luminosity noise, but truly it’s far better than I expected.

High contrast iPhone 8 Plus DNG torture test. Dark bridge taken against light from setting sun, very dark exposure.
The DNG Torture Test.  I shot this image straight into the light just after sunset and exposed to try to keep some colour and detail in the highlights. By the way, I did the same with the 6S Plus, but that image was beyond recoverable.  You’re looking at the unprocessed image; all I did was open it in Iridient Developer and then click export JPEG. Yep, it’s pretty terrible.
Shadow recovery test using Iridient Developer on grossly under-exposed iPhone 8 Plus DNG file
Extracted File. This is the image that came out of Iridient Developer once I had tweaked and fiddled to get the shadows recovered. I left the noise reduction turned down low, as I was interested in seeing just how terrible it could be. This version looks much better but not great, and don’t you just love the crooked stance!
Fine-tuned version of shadow recovery DNG test file taken with iPhone 8 Plus.
So after a spin through Photoshop CC and some selective edits we get this; oh, and of course I straightened it a bit as well, though it could use more. Now, this is quite acceptable, and certainly much better than I expected would be possible. This is a downsized pic, but even the full-size version is nicely sharp and nowhere near as noisy as you might expect from such an extreme edit. The HDRs taken in Lightroom Mobile should work really well with the 8 Plus; this test also shows why the lighting modes on the 8 Plus work as well as they do. Basically, the shadow recovery is much better, and that makes for a more flexible post-capture approach.
Monochrome conversion from iPhone 8 Plus DNG, vacant shop interior, Coolamon NSW
Just to finish off on the editability aspect, this monochrome image was extracted in Lightroom Mobile and then turned into mono in Snapseed. I added a film-grain effect whilst in the app. Anyway, I found the files easily converted to monochrome and provided plenty of creative flexibility; that is not always the case with smartphone images. And just in case you are wondering, it’s a vacant shop in the main street of Coolamon NSW. Coolamon has lots of vacant shops.

Compared to the iPhone 6S Plus

Not even in the same ballpark. Net result: the 8 Plus DNG files simply edit better, period!

Where to From Here?

This is just the first in what will be a full battery of tests on the DNG files; some of it will likely make its way into an update of my “Ultimate iPhone DNG” eBook.

So next I will explore performance at the various ISO settings, try out the telephoto lens with DNG, run some comprehensive DNG colour tests, try different sharpening and noise-reduction processes in Iridient Developer, run some HDRs in Lightroom Mobile and probably a few other things as well.

Do come back again, and if you really want to get the most out of your DNG captures on your iDevices, why not pop over to the iBooks Store and buy a copy of my 400-page “Ultimate iPhone DNG”.

https://itunes.apple.com/us/book/ultimate-iphone-dng/id1274334884?ls=1&mt=11

Cheers

Brad


Do Image Stackers Really Work?

Image stacking apps for creative iPhoneography and improved image quality.

The theory behind image stacking to reduce image noise and increase resolution is solid; we’ve been doing this for years in Photoshop. But can it be done automatically inside your iPhone?

These days there are several iPhone photography apps claiming they can produce much better photos by stacking several quickly captured frames on top of one another and then blending them down; even the iPhone’s standard camera app uses a version of this idea.

Do these image stacking options really deliver on the promise?

The answer is sometimes……but you can find out a little more by watching this short video.
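
The underlying maths, at least, is straightforward: averaging N aligned frames of the same scene leaves the signal alone while dividing random noise by roughly √N. A minimal numpy sketch with synthetic data (real stacking apps must also register the frames before blending, which is where they often come unstuck):

```python
import numpy as np

rng = np.random.default_rng(7)
scene = np.full((100, 100), 0.5)                    # the "true" image
frames = [scene + rng.normal(0, 0.05, scene.shape)  # 16 noisy hand-held frames
          for _ in range(16)]

stacked = np.mean(frames, axis=0)                   # the blend-down step

single_noise = np.std(frames[0] - scene)
stacked_noise = np.std(stacked - scene)
print(single_noise, stacked_noise)  # stacked noise is roughly single / sqrt(16)
```

Sixteen frames buys you about two stops of noise improvement on paper; in practice, alignment errors and subject movement are what decide whether the promise holds.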


Buy Ultimate iPhone DNG from the iBooks Store:

https://itunes.apple.com/us/book/ultimate-iphone-dng/id1274334884?ls=1&mt=11

Check out the video here:

Depth of Field and iPhone DNG

Yes, yes, it’s true: the iPhone doesn’t have the capacity for shallow depth-of-field rendering the way your DSLR or mirrorless camera does, unless of course you go very close to your subject.

However, there are some differences between what you can expect in terms of depth-of-field rendering from iPhone DNG and JPEG versions. Here’s a short video that discusses the differences; it may cause you to re-evaluate the accepted wisdom a little.
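
A quick worked example of why the iPhone renders so deep: plug the numbers into the standard thin-lens depth-of-field formulas. The focal length, aperture and circle-of-confusion figures below are rough, assumed values (about 4 mm f/1.8 with a ~0.0035 mm CoC for an iPhone-sized sensor, versus a 28 mm f/1.8 full-frame lens with a 0.03 mm CoC), not measured specs:

```python
def depth_of_field(f_mm, n, subject_m, coc_mm):
    """Classic thin-lens DoF: returns (near, far) limits in metres."""
    s = subject_m * 1000.0
    hyperfocal = f_mm ** 2 / (n * coc_mm) + f_mm          # in millimetres
    near = s * (hyperfocal - f_mm) / (hyperfocal + s - 2 * f_mm)
    far = (s * (hyperfocal - f_mm) / (hyperfocal - s)
           if s < hyperfocal else float("inf"))
    return near / 1000.0, far / 1000.0

# Subject at 2 m, both lenses wide open at f/1.8
print(depth_of_field(4, 1.8, 2, 0.0035))  # iPhone wide: metres of DoF
print(depth_of_field(28, 1.8, 2, 0.03))   # full-frame 28 mm: far shallower
```

At a 2 m subject distance the tiny sensor keeps many metres of the scene sharp, while the full-frame lens holds well under a metre of it, which is exactly why you have to get very close to an iPhone subject before the background softens.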

If you want to know more you can always buy my “Ultimate iPhone DNG” eBook on the iBooks store.

https://itunes.apple.com/us/book/ultimate-iphone-dng/id1274334884?ls=1&mt=11

Check out the video here: