iPhone X Tele Lens, How Good? Part One

Most casual iPhone followers think the telephoto lenses in the X and the 8 Plus are the same. They’re not: the aperture is wider on the X (f/2.4 versus f/2.8) and the X’s lens/sensor module adds image stabilisation. Those differences alone are important and should make for better images in many situations, especially when the light level is low. Note, however, that when shooting with the standard camera app, both the iPhone 8 Plus and X will revert to a cropped version of the image from the standard lens when the light level really drops off a cliff.

I had several questions I wanted answered regarding the telephoto lenses and options on the iPhone 8 Plus and X models.

Regarding the comparison between the two models, I want to know:

Are the lenses optically identical?

Is there any difference in dynamic range between the two lens/sensor modules?

How does the native “depth of field” rendering compare?

Moving beyond those basics,  I also want to know:

How well resolved are the corners?

Does either lens show significant native vignetting?

Are there any issues with Chromatic Aberration?

How good or poor is the image noise?

Eventually I will answer all of these questions in detail, but for today let’s look at the portrait mode.

In a more general sense, the portrait mode is fascinating, and I thought it would be good to initially look a little deeper into this option. I’ve seen a few comparisons online between the iPhone and Android versions of the concept, and though no smartphone camera has come close to perfectly simulating the shallow depth of field you get with your DSLR or mirrorless camera, the results for most casual needs are more than passable.

I must add that at this point we are looking at first and second generation products; I’ve no doubt the next five years will see enormous gains in the final quality of portrait mode options. Your DSLR is certainly not yet a dead end, but it will increasingly become a threatened species.

Just so we are clear, some of my questions relate to the compressed HEIF or JPEG files produced by the regular camera apps, specifically the portrait mode option. But I also wanted to uncover the unvarnished truth and determine the true potential of the iPhone X telephoto lens when used optimally, and for that we must look to the DNG/RAW files.

This is going to be an in-depth examination so I’ve decided to break the coverage into three parts.

Part 1: Using the portrait mode

Part 2: Shooting DNG Telephoto on the iPhone X

Part 3: Comparison of the iPhone 8 Plus and X telephoto lenses

 

Let’s get started then…

 

Old peoples home, artwork by Sun Yuan captured with iPhone X in portrait mode, monochrome.
Hyper Real exhibition, National Gallery of Australia: “Old Peoples Home” (2007) by Sun Yuan. In this case the portrait mode worked perfectly.

 

Is the Tele Lens any good?

Cutting to the chase first, for those who have not used an iPhone with a tele lens option, the simple answer is yes, it’s a terrific option and I couldn’t imagine going back to an iPhone without one. The lens quality is solid and produces sharp results, while the focal length helps me, at least, to create photos in situations where I would otherwise have chosen to use a regular camera.

Some testers have pointed out that you’re limited by the lens in low light situations because its aperture (f/2.4) is smaller than the wide angle lens (f/1.8). True, but it’s nothing like a deal breaker for most needs.

I far prefer the natural perspective the tele lens provides, and in particular I value the ability to compose more selectively, by virtue of avoiding the background distractions that tend to encroach more easily when I shoot with the standard wide angle lens.

Technically the extra lens is not one you’d normally classify as a telephoto, coming in at around a 56mm full-frame equivalent. A 70mm equivalent lens would have suited me better, but strict physical limits apply when your camera is only a few millimetres thick. On the other hand, the 56mm equivalent focal length is long enough to suit a great many needs; there’s a whole bunch of reasons why the 50mm focal length became the most used choice for film photographers… it basically works better more often.

Just in case you are wondering whether the tele lens might give a better result than one of those add-on lenses you’ve seen on eBay, or even one of the more expensive ones, trust me, there’s no comparison: the iPhone tele eats them alive, especially in the corners of the frame. Not one of the add-on tele lenses I have tried offered anything like sharp corner resolution; maybe there is a perfect option floating around out there, but I very much doubt it. By the time you pay for a good quality one, plus the mount/case needed to adapt it to your iPhone, you’re getting much closer to the price of the dual lens iPhone 8 Plus, and let’s not even bother getting into the practical aspect of a chunky bit of glass added to your slip-in-the-pocket device.

 

10 month old baby, iPhone portrait mode, close up head shot, taken in shade.
A lovely snap taken by my wife Wendy of our grandson, Milton, using portrait mode on the iPhone 8 Plus; a result that would no doubt please most nannas and poppas.

The Portrait Mode

My wife Wendy is a great fan of portrait mode; she feels it gives her noticeably nicer results when taking family pics, and especially so when she takes photos of our grandson, Milton. I’ve included some of her pics in this article as examples.

Wendy is not what you would call a picky photographer; she just loves to take photos of the things in her life that matter or need to be recorded, just like most iPhone camera users. I could, of course, point out to her the technical deficiencies of the mode, and she’d just as likely say, “so what?” She’s just a happy iPhone camper.

Here’s the point: most consumers won’t care about the techy stuff that hobbyists and serious photographers fret over; they just want nice looking portraits with a minimum of fuss. For them, portrait mode in the standard iPhone camera app delivers, and for those of us who do sweat the small stuff, well, the tech is only going to improve and we have other options if we want to use them.

On the other hand, I’m sure most users would notice the benefits once the mode is further refined and the results improved, but in the interim most owners have no deal-breaking issues with portrait mode.

Me? I find it OK. Really, I don’t expect perfection; I am just grateful that it works at all.

 

10 month old boy on play gym, smiling broadly, teeth and tongue showing, iphone 8 portrait mode.
My wife captured this pic of Milton whilst he was in action on a play gym; there’s a little bit of blur, but the result is really lovely. Again the portrait mode has worked very nicely.

But the articles on this site are for those who want to take things to a higher level, so critique the portrait mode I must, and along the way I’ll offer some advice to help you get more out of it.

How does it work?

First up, let’s differentiate between the two parts of the option. The basic function is to simulate the shallow depth of field effect you’d expect if you’d shot with a DSLR or mirrorless camera using a telephoto lens at a wide aperture.

The second part applies lighting effects to give the portrait a look that channels studio lighting. Currently there are five options: natural, studio, contour, stage and stage light mono. No doubt Apple or third-party app developers will add to this set as time goes by.

At this point I’d say the mode is much better at the former; the lighting effects are very patchy in practice. Sometimes the result is great, but often it’s just awful, and even casual users, my wife included, have noticed this.

There’s some confusion here because most people, it seems, have assumed that portrait mode refers to the combination of lighting and depth of field effects. Not true: they are separate, and the depth of field effect can be turned off “in post” with the lighting effects still applied. To do this you simply hit the edit option at the top of the screen in the Photos app and then tap the yellow “Portrait” tab; the DOF effect is removed, but the lighting effect option is still active.

Remember that none of this is destructive; the original image is still intact. We are just applying an instruction set to the image, so it looks adjusted when we view it… in other words, no pixel is harmed in the process! What is critical, however, is that the file is shot using the HEIF format, because that’s the only way the depth map needed to create the shallow DOF simulation can be saved.

You can read about the HEIF format here: 

https://www.cultofmac.com/487808/heif-vs-jpeg-image-files/
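For the technically curious, the depth map travels inside the HEIF/HEIC container as auxiliary image data, and since iOS 11 any app can read it back using Image I/O and AVFoundation. Here’s a minimal sketch (the file URL is simply whichever portrait mode HEIC you point it at):

```swift
import AVFoundation
import ImageIO

// A minimal sketch: pull the disparity map that portrait mode stores
// as auxiliary data inside a HEIC file. Returns nil if the file
// carries no depth information.
func loadDisparityData(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: info)
}
```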

OK, all good info, but how does it work? There are two rear lenses on the iPhone 7 Plus, 8 Plus and X models, and when you shoot in portrait mode both of them are used. The telephoto captures the image, while a combination of the standard wide angle and tele views is used to create a depth map of the scene. Say what? Depth map?

Yes, that needs a little more explanation.

So, we have two eyes. This allows us to gain a great perception of depth when viewing our world, because each eye sees a slightly different view, which means we can sort of see around things. The upshot of it all is that our very clever brain is able to take these two images and blend them together so we see in 3D.

Our sense of 3D is excellent for close-up objects, but as the distance increases the effect disappears. This is easy to, well, see: close one eye and look at a scene, and straight away you have lost the sense of 3D; open both and the 3D is back. Obvious, isn’t it?

Now, look at a distant object and close one eye; it looks pretty much identical, because both eyes see it from almost exactly the same viewpoint. Next look at something about a metre away and do the same; the difference between what each eye sees is quite profound. It’s simple really: the closer the object, the more pronounced the difference between the views seen by each eye.

With me so far? Good. Now it should be obvious that the views seen by each eye will not line up if you layer them over the top of one another: closer objects will show more misalignment than mid-range objects, and distant objects will show virtually no misalignment at all.

Now, if we measured those misalignment differences we would be able to work out how far the scene objects are from the camera. This is the principle behind rangefinder camera focus systems, so there is nothing new under the sun here.
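If you like seeing the idea in concrete terms, the maths is a one-liner. This is just the textbook pinhole stereo model, an illustrative sketch rather than Apple’s actual implementation:

```swift
// Classic stereo/rangefinder geometry, an illustrative sketch only:
// distance is inversely proportional to the measured misalignment
// (the "disparity"). f is the focal length in pixels, b the lens
// separation (baseline) in metres, d the disparity in pixels.
func subjectDistance(focalPixels f: Double,
                     baselineMetres b: Double,
                     disparityPixels d: Double) -> Double {
    return f * b / d   // near subject -> big disparity -> small distance
}
```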

Here’s the cool bit: if we can work out the distance of those objects, we can create a map of the scene which can be overlaid on the image and used to control or mask a blur filter. In other words, near objects don’t get blurred and more distant objects or scene elements do.
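Core Image even ships a filter made for exactly this job, CIMaskedVariableBlur, which scales the blur radius by the brightness of a mask image. Feed it a normalised depth map (near = dark = sharp, far = bright = blurred) and you have the essence of the effect. A minimal sketch:

```swift
import CoreImage

// Depth-masked blur: the mask controls how much of the maximum
// radius is applied at each pixel, so near objects stay sharp
// while distant ones melt away.
func simulateShallowDOF(image: CIImage,
                        depthMask: CIImage,
                        maxRadius: Double) -> CIImage? {
    guard let blur = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(depthMask, forKey: "inputMask")
    blur.setValue(maxRadius, forKey: kCIInputRadiusKey)
    return blur.outputImage
}
```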

It all sounds simple enough, but in practice it’s very complicated, because a whole bunch of parameters have to be determined: at what distances does the blur start and stop, how much do you blur the image, how do you blur without cutting into fine details like hair, leaves or the edges of fabric? It’s enough to make a tech-head’s cranium spin.

Creating the depth map is the simple part; the tricky part is the post-capture processing, and I have no doubt Apple will significantly improve this given time.

 

Hyper Real, Mark Sijan, Embrace 2014, monochrome image of sculpture, taken Brad Nichol, iPhone X portrait mode.
Photo of an exhibit at the National Gallery of Australia’s Hyper Real exhibition: Mark Sijan, “Embrace” (2014). The portrait mode was just perfect for this image; the monochrome edit was done in Snapseed over a coffee in the NGA cafe.

Currently there is a fair degree of AI used in the processing of portrait mode images, and I expect Apple will increase this over time, which will make a significant difference. The potential is well demonstrated by the Google Pixel 2 XL: the Pixel has only one lens and relies to a much greater degree on AI for processing, yet it produces great results, sometimes better in fact than the tele portrait option on the iPhone.

At the capture level, the iPhone is hamstrung somewhat by the distance between the two lenses; it’s not much more than a centimetre, which limits the accuracy of the depth map. Basically, the further apart the two capture lenses, the more accurately object distances can be measured. Now you know why the wide gap between the two viewing windows on a Leica rangefinder is so important: it allows very precise focus setting, which is especially important when using wide aperture lenses.
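To put some rough numbers on why the baseline matters: depth error from a stereo pair grows with the square of the distance and shrinks with the baseline. The figures below are my own illustrative assumptions, not Apple’s specifications:

```swift
// Back-of-envelope depth uncertainty for a stereo pair, assuming
// disparity can be measured to about +/- half a pixel. Purely an
// illustrative model; none of these numbers are Apple specs.
func depthUncertainty(distanceMetres z: Double,
                      baselineMetres b: Double,
                      focalPixels f: Double,
                      disparityErrorPixels e: Double = 0.5) -> Double {
    return (z * z * e) / (f * b)
}

// With an assumed f of ~2800 px and b of ~0.01 m, that's roughly
// +/- 2 cm at 1 m but about +/- 11 cm at 2.5 m, and worse beyond.
```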

A second complication is that the two lenses are not identical, so I presume the image from the wide angle has to be cropped internally to build the depth map, which would mean the system is working with a lower resolution image. But bear in mind Apple doesn’t exactly spell all this out, so who really knows?

If Apple were to increase the distance between the two lenses, the accuracy of the depth map would improve, and combined with smart processing this would probably make the system less prone to blurring fine details and other errors. But I doubt Apple will venture down this path in future; it would compromise the practicalities of the iPhone too much.

 

Hyper real exhibit, Mark Sijan, Cornered, 2011, portrait mode, iPhone X, Brad Nichol
Another NGA exhibit by Mark Sijan, “Cornered” (2011). Portrait mode has little problem with simple subjects against plain backgrounds like this and produces a nice result. Again this was edited in Snapseed, with the vignette applied in post, which is a much better look than that obtained with the “Stage Light Mono” option.

There are other ways of building depth maps. You can take a lot of photos at different focus distances in very quick succession and then selectively blend the images together using the depth information. You can use infrared beams to map the scene and create a depth map from that information, which is exactly what the iPhone X does with the front-facing camera. Or you can use artificial intelligence to analyse scene elements and make depth judgments by comparing the relative sizes of those elements via machine learning; this last option is the method used by the Google Pixel 2 XL, which also uses information gained from its phase-detection focus pixels.

Beyond that, we have sonar, which was used to determine focus on the old but lovely Polaroid SX-70. It should also be possible to create a very accurate depth map by taking the actual photo, then moving the camera left and right to capture additional frames used only for the map; there have been examples of cameras doing a version of this too.

Basically, there are many ways to skin this digital cat, but for smartphones practicality is almost always the overriding issue; in the end it is likely to be a combination of methods that wins the day.

I suspect the next high-end iPhone will use the iPhone X’s front camera method on the rear camera to build a more accurate depth map for portrait mode; I’m sure Apple didn’t go to all that trouble on the front-facing camera just so we could unlock our phones and take better selfies! I also suspect that long term the iPhone’s portrait mode will use much more machine learning and probably more sophisticated multi-shot capture methods. Remember, all of these options currently exist in some form or other; we’re not talking about reinventing the wheel, but rather combining current technologies in more sophisticated ways.

 

Baby in bath with toys, looking up and smiling, sepia low saturation colour tint, taken iPhone X portrait mode.
Milton loves his bath, and the portrait mode worked perfectly here. The trick is to keep the exposure “just right” so as not to burn off any of the highlights. The edits were done in Snapseed and included a vignette and a low-saturation sepia tint. Honestly, this would look lovely as a large scale canvas print.

The main current limitation of the built-in portrait mode is that it only works in a range of around one to two and a half metres, due to the small distance between the two lenses, and you have no control over the blur effect. Other third-party apps now use the depth map to create blur effects, so watch this space; I’ve no doubt that even with the current hardware the results could be improved by better processing algorithms and perhaps even better capture options.

The new APIs (application programming interfaces) that Apple opened up with iOS 11 mean that developers are free to create new apps that work with the portrait mode and depth maps; there’s a minimal capture sketch after the app list below. If you want to try the better current alternatives, which by the way allow you to vary and control the DOF effect, check out these options:

Focos

Slor

Anamorphic

infltr

These apps really need their own articles; there’s some serious power on offer, but for today I’ll stick with the standard option as much as possible.
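For context, here’s roughly what those iOS 11 depth APIs look like from the developer’s side: a bare-bones sketch of asking the dual camera to deliver depth data with each photo, with the session plumbing and error handling omitted:

```swift
import AVFoundation

// A bare-bones sketch of iOS 11 depth capture: use the wide+tele
// pair ("builtInDualCamera") and ask the photo output to deliver
// depth data alongside each shot.
func configureDepthCapture(session: AVCaptureSession,
                           photoOutput: AVCapturePhotoOutput) {
    guard let camera = AVCaptureDevice.default(.builtInDualCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input),
          session.canAddOutput(photoOutput) else { return }
    session.addInput(input)
    session.addOutput(photoOutput)
    // Must be enabled on the output, and again on each
    // AVCapturePhotoSettings when you actually take the shot.
    photoOutput.isDepthDataDeliveryEnabled =
        photoOutput.isDepthDataDeliverySupported
}
```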

 

Is “Portrait Mode” just for portraits?

In short, no. You can use the mode to capture objects other than heads, and a lot of folks are doing just that, some even creating passable results for “product” photography needs.

The front-facing camera can also be used to shoot objects, but it seems more attuned to portraits, and it’s very difficult to get it to work reliably on static objects.

The main limitation is that the mode does not work if you go too close, which in the scheme of things is not very close at all: basically about a metre. I’ve found this close distance is quite variable; sometimes it will work a little closer up. You’ll know before the shot whether it’s going to work, because it tells you right there on the screen: “move further away”. This limit has nothing to do with the minimum focus distance of the telephoto lens, which is around 35cm.

 

Photo of wooden steering wheel on vintage car taken with iPhone X portrait mode.
Although portrait mode is not really intended for non-portrait images, it does an excellent job on occasions; this is one of them, but the results can be very patchy. Fortunately, the effect can be turned off after the shot is taken.

In theory, the mode should work even better at close range, as the offset between the two lenses should be better suited to closer shooting distances than to more distant ones, so I’m not really sure why the close distance limit is so far out.

The good news is the effect is shown in live view, so you know up front how the result will look. The bad news is the effect is often too strong, blurring parts of the objects you want left alone, and when editing in the standard camera app you can only turn the effect on or off; there is no way to paint on the mask (which you never see) to moderate the effect. (Remember, there are third-party apps that can solve this limitation.)

I’ve seen some pretty good photos of flowers, cameras, hands etc taken in portrait mode, so it’s definitely worth trying, but presently portrait mode will probably prove frustrating when you want consistency, and that’s my main gripe. Much of this probably comes down to the limits of the AI currently being used. I can accept that it might not be perfect, but I have not been able to work out exactly when it’s likely to succeed or fail; it’s just so damned random, and the one thing I hate in photography is randomness of results. I like to know that if I do certain things I will get a predictable result. With more usage I may become more intuitive with the option, but I’m not convinced that’s how it will pan out; I suspect portrait mode is still a bit, shall we say, quirky for things other than portraits, which is no surprise.

 

Using Halide on iPhone for Depth map, photo of car
I took a shot of my Lexus using Halide to see if I could get any depth effect at all; you certainly can’t with the standard Apple app. If you look closely, you can see that the background is indeed blurred a bit, and the leaves on the tree closer to the camera are better resolved. Sure, it’s not blurred the way it would be with a DSLR, but the effect is better than nothing at all; a handy trick to have up your iPhoneography sleeve.

So what are the main issues?

Only works really well with faces.
Only works in the range of approximately 1 to 2.5 metres.
Blur is not adjustable. (with the Apple app)
There is a tendency to get blurred edges and fuzzed up fine details, especially hair.
Works best if the background is well behind the subject without any encroaching details from the sides.
The lighting effects are hit and miss and often far too strong.
Still not a replacement for your DSLR. (like that is a surprise)
You cannot zoom within the mode.
All of the above leads to the next question: is there anything we can do to make the results better and more predictable? Or to put it another way, “Brad, can you give me some killer tips?”

I think I can; try these on for size.

Exposure!

Be very careful with the exposure; this is not a RAW file, and it doesn’t respond nearly as well to tonal adjustments “after the fact”. This especially applies to the highlights, which once lost are impossible to recover and often end up with weird colour shifts.

Portraits live and breathe by their ability to properly render lighter skin tones and texture without clipping; bleached white patches on the skin just look wrong. There are three things you need to attend to.

One, make sure you have the HDR option enabled. With iOS 11 this is done within the Settings menu, not within the app itself; just make sure the Auto HDR option is enabled.

Two, once you have focused, slide your finger up and down on the screen to adjust the overall exposure, and ensure the skin tones you want recorded with detail actually are before you press the shutter. The newer iPhones are much better at controlling noise than previous models; a dark photo can easily be lightened in editing without turning into a noisy mess, and frankly the pics often look more analogue when treated this way.

Last, if you are going to use the lighting effects, pay attention to how the effect interacts with the exposure; it’s way too easy to overcook things at this point. Note: you can turn the effect off after the shot is taken, so that’s not the problem; the real issue is that it makes it much harder to accurately judge the optimal exposure level.

My advice is to shoot without any lighting effects and then add the effect in post if you want it.

 

indoor portrait iPhone X portrait mode, tungsten light, Korean restaurant.
My son grabbed this shot with my iPhone X. The colour is a little too intense, being taken under tungsten light, but the effect of the portrait mode blur is quite lovely. Ideally the exposure would need to be shorter to prevent the clipping of the highlights on Wendy’s shirt and my shirt collar. This shot would likely edit fine; the tip is to allow for the editing when shooting, as sometimes a darker initial capture is better.

Colour?

Stupidly, the iPhone still offers no way to control the white balance in the standard camera app. It’s not normally a drama, as the iPhone is surprisingly good at setting an accurate auto WB most of the time, in fact far better than most DSLRs and mirrorless cameras. But it can still mess up under odd lighting conditions, or if the subject is dominated by one colour range, especially scenes with lots of yellow or red and sometimes blue in them.

There’s no easy workaround, because focus, exposure and white balance are all set with the same tap point (though the camera does examine the whole image for WB). You could point the camera at a grey object and then engage the AE/AF lock, but that also locks the focus, so unless the subject and the grey object are at the same distance from the camera, your subject may well end up out of focus.

Why has Apple failed to provide a white balance option? Probably they just wanted to keep it simple, but other phone brands certainly don’t dumb things down this much for those with a desire for a bit of WB tweaking.

The answer would be to use another app that allows you to adjust the white balance and still capture a depth map. In this case you’d capture the portrait shot using the 2X camera with the depth map option turned on; the file would then be saved to the Photos app and edited as normal, where you can apply the depth of field and lighting effects. The downside is that you would not see the DOF effect when you take the photo; in that respect it would be a bit like the Google Pixel 2 XL, where the image must be processed first.

Currently there are few apps that can record a depth map. Halide is one that can, but none I’ve tried allow you to adjust the WB when using the depth map option. I hope Halide addresses this soon, as it would be an easy option for more serious shooters. Better yet, Apple, how about you just give us a WB option… please.
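In fairness to the app developers, the hooks do exist: AVFoundation has offered manual white balance control for a while, so a depth-aware app could expose it. A hedged sketch of what that looks like (the temperature and tint values are whatever you’d dial in):

```swift
import AVFoundation

// A sketch of the manual white balance control AVFoundation already
// exposes to third-party apps; Apple's own camera app simply doesn't
// surface it. Gains must sit between 1.0 and the device maximum.
func lockWhiteBalance(on device: AVCaptureDevice,
                      temperature: Float,
                      tint: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    let target = AVCaptureDevice.WhiteBalanceTemperatureAndTintValues(
        temperature: temperature, tint: tint)
    var gains = device.deviceWhiteBalanceGains(for: target)
    let clamp = { (g: Float) in min(max(g, 1.0), device.maxWhiteBalanceGain) }
    gains.redGain = clamp(gains.redGain)
    gains.greenGain = clamp(gains.greenGain)
    gains.blueGain = clamp(gains.blueGain)
    device.setWhiteBalanceModeLocked(with: gains, completionHandler: nil)
}
```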

Ultimately, unless your WB is way off kilter it can be adjusted in post. If it looks like the colour is going to be beyond help, consider shooting against a less intensely coloured background and keep a close eye on the highlight exposure level.

Vintage vehicle interior against bright background, iPhone X portrait mode, bright colours, some background clipping, good shallow depth effects.
In this case the portrait mode worked well, despite the non-portrait subject. Generally the mode performs poorly on regular subjects when you go closer than about 1.25m. One issue I’ve found is that the colour is often too ramped up by default under bright light, but this can be adjusted in editing so long as you are careful not to clip the colour data; in other words, err on the side of under-exposure if in doubt. The cyan on the car in the background is quite overcooked compared to real life. The main fault here is the mirror on the far side of the red car; it is far too blurred.

 

Extending the Focus Range

Like I said, the standard range is around 1 to 2.5 metres. This cannot be changed; the iPhone just switches automatically back to regular shooting mode if the subject distance does not comply with the design parameters. However, if you use Halide with Depth Map enabled, it seems able to extend the close focus down to around 40cm or so, maybe even a bit closer. If you go too close, the editing in Photos seems to mess up the blurring and cause all manner of weird artefacts, but it is useful for times when you want to shoot product photos.

Halide also seems able to get the depth option working at a greater distance than the standard camera app does, so it represents a nice addition to your iPhoneography toolset. Note: you cannot blur foreground objects when you focus on something more than, say, 2.5 metres out, and generally the blur in front of the subject is less convincing than the background blur.

Unfortunately, exposure compensation is the only control available in Halide’s depth mode, but that’s better than nothing.

Another option for creating blur effects is to transfer the HEIF files into Photoshop and then use it to create the blur from the depth map; that would need a whole extra article, so I won’t go any further here.

Just in case anyone is wondering, and I know you were, you cannot yet shoot RAW in portrait mode; in other words, there is no depth map option with RAW, because the depth map can only be recorded using the HEIF format. An option could be to record both RAW (DNG) and HEIF together; so far I haven’t come across this option, and it would require more complicated post-processing to implement.

 

Radiator gauge on antique vehicle, red, iphone X portrait mode, good result, nice background blur, hupmobile.
Sometimes with close-up, non-portrait subjects the portrait mode nails it; bear in mind I took a few similar shots and only about half were acceptable. My frustration lies with the randomness of the results; I’m sure this will rapidly improve over the next couple of years.

Adjusting the Blur

Presently, if you wish to adjust the degree of blur you will need to use another app, such as the ones I mentioned earlier, or Photoshop, to process the image. You have options, but there is no way around the on/off choice if you stick with the regular app. Once you start using third-party apps you can apply any degree of blur you wish.

 

Blurry Edge Details

This one is tricky. As I explained, it’s a combination of a less than perfect depth map and the intelligence of the algorithms used to apply the blur effect.

One thing you can do is avoid shooting against very detailed or messy backgrounds. For example, portraits taken against blue skies with a bit of cloud tend to work well, as do portraits taken where there is a big distance between the foreground and background. It’s no accident that Apple’s sample shots tend to conform to this style.

The results also look better if there is a good level of subject/background contrast; when the tones and colours in both are very similar, the algorithms seem to get confused and randomly blur the wrong bits.

Another scene element that causes grief is a foreground object with cutouts that let you see through to the background; often the areas inside the cutout are not blurred as you might expect. Of course this is not an issue with human heads, heaven forbid we should have holes in our craniums, but it does mess with product photography work. Obviously it’s hard to avoid the issue, but being aware of it and picking it up before you press the shutter button will at least give you a chance to rejig the shot before you shoot.

 

Antique motorcycle hand oil pump, iPhone X, portrait mode.
Results with complicated close-up shots can be very random; often background areas that appear within voids in the subject fail to blur, though in this case that’s only an issue in a couple of the small regions. The actual blur (bokeh effect) is nice, though the fall-off into blur is quite abrupt; other apps are able to give much better control over this, so fall-off control is not beyond the capability of the technology.
Metal sculpture NGV, iPhone X portrait mode, failure to work properly, blur in wrong spots.
Sometimes, for no apparent reason, the portrait mode fails miserably; in this case it blurred one side of the sculpture in the middle. A monochrome version without the effect is below. Ideally I would like the background blurred, but it would need to be done in post; I made several attempts to coax the portrait mode into working with this pic, but nothing succeeded.

The monochrome version of the sculpture, without the portrait mode effect.

 

Better Lighting Effects.

Basically, they are all a bit too much for my delicate and easily offended eyeballs, but I do have a workaround. This might sound tricky, but you can do it so long as you have an editing app that can work with layers; I suggest Snapseed, which is both excellent and free.

Take the photo using portrait mode without any lighting effects applied, and then duplicate the image in the Photos app. Open the first image and apply the lighting effect you want, and leave the second one alone.

Now open the first in Snapseed and then, using the “Double Exposure” tool, open the second; you can now blend the two together as you see fit. You’ll need to look at the Snapseed tutorials to see exactly how this is done, but rest assured the results can be pretty good.

Of course, whilst you are in Snapseed you have a whole bunch of other terrific effects on offer, so double the joy for those with a sense of adventure. There is, of course, a wide array of apps that can work with layers; Snapseed just represents a nice easy option for those not deeply into editing.

 

Modify the Light

Want to get really great results for portraits? Why not try using light modifiers, just as we do with any other regular camera: reflectors, blockers, diffusers and artificial light sources all work with iPhones too, you know.

Think about it: isn’t this what Apple is trying to synthesise with the lighting effects anyway? Well, why not try the real thing!

Lady in library, Victoria state library, iPhone X portrait mode, blurred foreground to match background, monochrome.
I took this pic of Wendy at the State Library of Victoria in Melbourne. It’s one of those times the portrait mode worked well, producing a very nice background blur effect; I applied a little blur to the foreground in Snapseed to match. Therein lies a lesson: often you need to treat the image as a starting point, as most seem to benefit from a little careful tweaking, especially if you’re after a nice monochrome look.

 

10 month old boy, colour, iPhone X portrait mode, looking left of frame and up, grey shirt.
No, this isn’t one of the pre-potted lighting effects; it is a product of the actual lighting on the subject, the apple of my eye and my grandson, Milton. The pic has been edited in Snapseed to add some vignetting and a filmic simulation. The key is not to over-expose the original capture, or you will lose those delicate skin tones.

 

Define Your Focus

I’ve noticed that most times when people use portrait mode they don’t actually set the focus; they just leave it up to the iPhone to decide, which is often perfectly OK. But the blur effect can be significantly different depending on exactly where you set the focus, and this can be seen on the screen when using the standard iPhone app.

Accurate focus setting is especially important for “product” style photography but also applies to portraits.

Try it: just move the focus point a bit and tap. You might be surprised to find the effect looks better, or perhaps worse.

Crop Away

No, you cannot use the zoom when in portrait mode, but you can certainly crop the image in post, which can be helpful for product shots or when you want a more closely framed portrait. Chances are you will still end up with 6 to 8 megapixels in the cropped image, which is more than enough for regular prints and way beyond what you need for most social media pics.

It’s not a DSLR

Well Doh Einstein, of course not.

No, you’re missing my point. We have all sorts of cameras, and some are suited to certain needs far better than others; I think many cameras get a bad rap when people try to use them for something they were never designed for in the first place.

See, I get a little bit testy with camera reviewers saying things like “iPhones are rubbish cameras because they can’t shoot wildlife, distant sports, studio portraits” etc. Sure, they won’t excel at that, but then your DSLR makes a pretty crap pocket camera too.

It all boils down to this: is it sufficient for the job you have in mind, or is it the wrong tool for the job? Only you can decide. All I’m saying is don’t expect your iPhone to be something it isn’t, a DSLR replacement; work within its design parameters and I’m sure you’ll find it fine for the task.

Concluding.

I’m sure we’re going to see massive development in smartphone “portrait” modes over the next few years; the technology is just in its infancy and probably five years short of full market maturity.

I can see that over the next year or so we will have many apps coming to market that leverage the depth map option and possibly do a much better job than the standard iPhone app does.

In the meantime, it remains a great feature that handily expands the shooting envelope for most smartphone users, and the results, while not perfect, are superior to what you get by using the standard wide angle lens for portraits and product shots.

Next up I will look at the DNG files taken with the tele lens on the X; I think you will be pleasantly surprised.


Depth of Field and iPhone DNG

Yes, yes, it’s true: the iPhone doesn’t have the capacity for shallow depth of field rendering the way your DSLR or mirrorless camera does, unless of course you go very close to your subject.

However, there are some differences between what you can expect in terms of depth of field rendering from the iPhone’s DNG and JPEG versions. Here’s a short video that discusses the differences; it may cause you to re-evaluate the accepted wisdom a little.

If you want to know more, you can always buy my “Ultimate iPhone DNG” eBook on the iBooks Store.

https://itunes.apple.com/us/book/ultimate-iphone-dng/id1274334884?ls=1&mt=11

Check out the video here:

 

 

Shooting Seriously With the iPhone in DNG?

Custom motor cycle seat construction

 

Here’s a quick question for you. Considering we’re spoilt for choice when it comes to amazing cameras and uber lenses able to render the hair follicles and as-yet-unborn zits on supermodels at 50 paces, why would you even think about using a mobile phone for anything other than a casual snap of that coffee and cake you had for morning tea?

Serious photos with an iPhone? Are you serious? That just sounds totally contradictory!

Well, if you disregard the differences in depth of field and the iPhone’s wide angle perspective, you soon realise that when shooting iPhone DNG, the dynamic range, sharpness and colour are not too bad at all for many needs.

And here’s the thing: not all serious photos are taken by serious people with serious gear. These days lots of people who are not photographers are given orders from on high: “go and get me some pics of that event, make it snappy, and when ya done get it posted to our Instagram account… oh, and I want it on Facebook before you leave the office tonight or you’re fired.”

Fact is, and I know this will hurt the ears, feelings and egos of many of the sensitive photographic souls reading this, I get lots, and I mean “lots”, of people turning up in my workshops on Lightroom, Photoshop, photo composition, iPhoneography etc who’ve been made “Chief Executive of Multi-Media, Instagram and Facebook, Resident Communications Dynamo, iPhone Wielding Guru” for their business or organisation.

 

Storage area for upholstery materials.
Whatever the material needed for your seat, Mick probably has a roll of it somewhere.

Often these folks don’t actually edit the pics; others further down or up the stream might do that, but sometimes they’re also expected to be the resident “Photoshop genius”, with all the impossible expectations that implies.

These people are not actually photographers; heck, they never intended to be photographers, but that’s what they’re now expected to do, and damn it, those pics had better be good!

So here we are in 2017: the iPhone has DNG and all the extra goodness that confers upon one’s image options, and we have an increasing number of non-photographers, and indeed even some professional photographers, who now use the tool for serious work.

When I was planning my new Ultimate iPhoneography series of eBooks, it soon became obvious that one of those books should look at what photographers and non-photographers with serious needs could do with their iPhones. There’s definitely strong demand for some wholesome but easily digestible information on how you might actually get the job done and, importantly, how to avoid the myriad potential pitfalls.

Well, that particular book’s still a way off; I’ve six planned for the whole series. The first book, “Ultimate iPhone DNG”, is already up on the iBooks Store and the others are all well into the production phase, but I thought it could be fun to show one of the sets of pics I’ve created while preparing the upcoming “Ultimate Professional iPhoneography” book.

Let’s just come back to the question of why you’d shoot serious (work) stuff with the iPhone. I reckon there are several solid reasons.

The iPhone may be the only camera you or your workplace owns; maybe you or the workplace have decided that using a DSLR is just too complex. (I wouldn’t necessarily agree with that, but I well understand how many feel about this situation.)

An obvious one is the need for rapid turnaround and the benefits of instant sharing, and no doubt for a great many such uses the quality deficits are less relevant. You can crop the images severely and still have enough pixels for social media needs, and honestly, regardless of how much traditional photographers protest, only a very small proportion of images shot for promotional purposes find their way into print at anything larger than, say, 5 by 7 inches. Even allowing for reproduction at 300 PPI, that 5 by 7 inch print only equates to around 3 megapixels.

But I think you can make a case for iPhone shooting that transcends the traditional convenience and resolution-sufficiency arguments; a case where the iPhone might sometimes technically be a great choice (assuming we are shooting in DNG).

Yep, I know, right about now there are virtual knives and spears being thrust into computer monitors in the hope of impaling me, or at least banishing my presence to the outer reaches of the interweb, but please humour me; I’m just a country lad from a place no-one much knows about.

 

Industrial sewing machine sewing embossed leather.
The industrial sewing machine had no issues at all in sewing heavy materials.

So what might those technical benefits of the iPhone be?

Well, depth of field is enormous; it’s pretty easy to get everything in focus, and sometimes that’s just what you need. This might seem a little surprising to many who have come to the photography table since the advent of digital, but once upon a time getting deep depth of field was a challenge, something professional photographers went to all sorts of lengths and contortions to achieve.

Related to the depth of field rendering, the iPhone can easily get really close-up photos nicely sharp and yet still retain quite nice separation between the subject and background elements.

Going further, the lens is wide angle, but it’s actually sharp right out to the corners, which is not a given with many regular wide angle lenses on DSLRs and mirrorless cams.

Better yet, the lens/sensor size combination enables some interesting perspective renderings that are impossible with larger sensor sizes without image stacking.

Now, yes, the iPhone is potentially a noisy little blighter, but actually the luminance noise, when shot in the DNG format at slightly elevated ISOs, is rather filmic and has a certain artistic appeal that works nicely for some types of images, especially monochrome.

Another aspect few people will have considered is that it’s relatively easy to get totally deep focus rendering, from very near to distant objects, by using focus shift techniques with only two or three frames.

So that’s not a bad list, and for the working photographer a tool only needs to excel in one specific aspect to make it viable for some selected shooting needs. No-one’s claiming the iPhone is the perfect portrait device, the ideal copy camera, the most powerful landscape tool, or the last word (or even the first word) in sports photography, but then it doesn’t need to be.

On the other hand, let’s face it, most DSLRs and mirrorless cameras are still pretty hopeless when it comes to social media duties, and many are very difficult to use for close-up work unless you have lots of other macro bits to go with them.

So, on to the pics. Our subject is Mick McCarthy of MJM Vehicle Trimming in my hometown of Goulburn, NSW. Mick is well known for making the best custom motorcycle seats around, for people who really want to plant their butts on something more comfortable than the average plastic board with inverted nails that seems to pass for a motorcycle seat these days. He still does some regular motor trimming for folks with special cars, but basically motorbike seats are his gig. He also happens to be my neighbour and a friend.

I’ve gotta say it amazes me that all of the bike manufacturers can produce machines which are brilliant in every way, yet they still can’t make a seat that soothes the average butt for more than 20 kilometres. Maybe we’re not actually meant to ride modern bikes; maybe we’re supposed to take them to the coffee shop, park them and then proceed to talk all kinds of BS about how great my bike is compared to your bike. Anyhow, Mick does great seats; the one he did for me totally changed the way I felt about my bike, that’s for sure.

Mick is a bit of a “bike n car nut” and has a nice little collection of his own, and he really is a terrific bloke who loves to shoot the breeze about all those mechanical things that we fellas get excited about.

I wanted to create a set of images that gave a good account of the man and his craft; something his family might treasure, but also something that told the story of MJM.

 

sewing leather material on industrial sewing machine.
Mick hard at work sewing a motorbike seat.

I think shooting DNG files on the iPhone worked a treat. I was able to get tight close-up shots, super deep depth of field renderings (which I then dialled back to taste), a lovely filmic look and, with appropriate editing, some terrific shallow DOF stuff, all with more than passable quality for most regular usages. And I have to say I love the fact you can get into tight confines with the iPhone; it really is easy peasy with the iPhone on a selfie stick.

Lots of people get really hung up on the issue of noise, but honestly the noise when you shoot in iPhone DNG isn’t too bad, and importantly it can be fine-tuned and even used creatively. The key is not to shoot at high ISOs, and before someone starts furiously typing a “full and well-expressed rebuttal on the folly of only having a low ISO option”, think about it for a moment or two. The iPhone lens is fixed at f/2.2 or f/1.8, and you don’t need to stop down to, say, f/5.6 or f/8 to get clarity or depth of field, meaning you don’t need high ISOs all that often, provided of course you have the camera properly supported. Some of the shots I took in his shed were at 1/5 sec or so, but it all worked out fine.

And if the light is MIA and you need to bring in some artificial light via LEDs or tungsten, you don’t need nearly as much of it to lift the brightness levels to something workable when using f/2.2 at, say, ISO 100. If you needed to use f/5.6 to f/8 at ISO 100 with a DSLR, you’d need about 8 to 16 times as much lighting power, meaning either more lights, more expensive lights, or a non-continuous light source; in other words, flash.

Yes, of course you could shoot your DSLR at a longer shutter speed, but then you’d risk subject movement; or you could raise the ISO to 800 or 1600, but then the quality difference between the two devices (when shooting iPhone DNG) would not be nearly as wide as you might expect. In any case, this article is aimed at those who are shooting with the iPhone and really don’t want to use a DSLR.

And just so you know, yep, of course these shots have been lit, but in keeping with the concept of making it practical for those who need to use the iPhone for work stuff, I kept it simple: just a couple of LED work lights on a pair of cheap stands, plus a couple of bits of foam core board. All up, the lighting stands and other bits represent about a $200.00 investment, which most businesses would pay out of petty cash.

Of course, if you want to compare JPEG outputs then all bets are off; those iPhone JPEGs are variable at best, and the attainable quality level is nothing like that offered by the DNGs, so please don’t send me any arguments based on the JPEGs, I’d just be nodding my head in full agreement.

The big advantage of the DNGs over JPEGs is the pushability of the files: you can dodge and burn, sharpen and blur, crop and blow up in ways the brittle JPEGs never allow.

 

cutting vinyl with scissors for motor trimming needs
Freehand cutting vinyl for a motorbike seat.

I find the idea of shooting with the iPhone, then working out the depth of field rendering in post, quite appealing. I’d normally choose a different camera if I wanted the shallow DOF look, but the approach can work pretty well.

Sure, it takes a bit of work, but with practice you get pretty quick at it. More importantly, it allows me to create DOF renderings that would be difficult or impossible if shot with regular DSLRs or mirrorless cameras. In some ways, and I know this will prove a challenging statement, sorting the DOF out in post is close in terms of flexibility to using a view camera with tilts, shifts, rise and fall, except without all the chemical and scanning stuff. Yes, yes, I know it will not be as detailed, but we are not producing full-page spreads and billboards; basically most stuff goes straight to the web these days, and honestly this approach looks fine for social media and, I reckon, looks quite a bit better than the effects you get using the iPhone 7 Plus portrait mode.

Going further with the depth of field simulation option, you can create looks that would not be possible with regular lenses. It’s easy, for example, to simulate the look of a lens with significant field curvature, or of tilted focal planes such as you’d get with a tilt/shift lens; bokeh can be whatever you want, and importantly you can create sharpness fall-off characteristics that would be impossible with almost any regular camera. Ultimately, if you start with an image that has overall sharpness, in other words deep depth of field, you can blur it into anything your heart desires (given enough time and skill); on the other hand, you can’t start with a shallow DOF image and then find clarity that was never recorded in the first place.

Sure, this is a different way of working and it won’t suit everyone, but like most techniques in photography, it’s just another option that might suit some specific needs. I imagine those photographers who are fixed on the idea of photography being “what comes out of the camera”, with a strong belief that “editing is the devil’s work”, will choke on the DOF simulation concept, but… a great many of us just accept and embrace editing as an integral part of the whole process.

I chose a sepia monochrome look for this shoot, but the colour versions are fine despite the basic light sources used; I’ve also added a little noise to give a more filmic feel.

iPhone DNGs can give quite different looks depending on how they are extracted; in this case I used Iridient Developer with the noise reduction turned off. As you might expect, that makes the files a little noisier, but it means they also look more film-like and, more importantly, they work really nicely with DOF simulation processes in Photoshop when you’re including added noise in the blurring process.

You might think, well, sure, the pics look OK on the web, but surely the prints would be poor. Not so: 11 by 14 prints look rather nice, and I long ago worked out that if you can make a good 11 by 14 you can pretty much print a file at any size you want once you take the increased viewing distance into account.

I’ve put together a nice layout for Mick that he can frame and put up on his wall, and despite the 36 by 34-inch size, the resolution is absolutely perfect.

Anyhow, thanks for reading; I hope it has provided a little inspiration. Oh, and if you want to know how to really shoot and deal with those iPhone DNGs, check out my book “Ultimate iPhone DNG” on the iBooks Store. You can also have a look at some other pics on my dedicated iPhoneography Instagram.

Just look for zerooneimaging or iphoneraw01 on Instagram.

Buy Ultimate iPhone DNG from the iBooks Store:

https://itunes.apple.com/us/book/ultimate-iphone-dng/id1274334884?ls=1&mt=11

 

Custom motor cycle seat construction
Measuring the seat for a Moto Guzzi Magni

 

Matching automotive seat materials.
Checking the match for new seat covers for a very collectible Ford Falcon 351 GT.

 

Motor trimming fasteners
Mick has pretty much any trim fastener you could need in his drawers, it sometimes takes a little searching though.

 

Keeping the dust off a motorcycle.
Michael’s low-kilometre MV often needs a little dusting time.

 

MV Agusta in motor trimming shop
The MV takes pride of place in the customer service area.

 

Restoring E type Jag seats; close up of staple gun in use.
Working on restoring the seats for an E type Jag.

 

Boss in Front of his motor trimming shop
Mick, proud owner of MJM, in front of his kingdom.


Depth of Field Simulation for iPhone Pics

simulated depth of field with iPhone photo on crane winding drum.

Why would you simulate depth of field on iPhone photos? After all, we all love a nice bit of fast glass; the bragging rights delivered by a nice 85mm f/1.4 are just brilliant when you have a leg up on the brass and a coldie in hand at the local watering hole. Of course, we all know that if we want that “dreamy creamy bokeh bonanza”, fast glass is the way to go… or is it?

There are a few downsides to all that bokeh-driven madness; let’s count them.

First, you actually need to have the camera and aforementioned heavy bokehlicious lens with you at the time. Funnily enough, some of us are just plain slack, and we baulk at the idea of carrying such a bulky rig with us everywhere we go.

You know how it is: juggling the dog lead and doggy treats while carrying the uber DOF master rig, all at the same time, whilst trying to stop your furry face-licker from all manner of canine misadventures.

And then there’s the lack of camera parking room on your favourite coffee table at your favourite cafe, oh, and not to mention your aching neck and shoulders, and the bruising on your “one pack” from that DOF meister rig bouncing around like Bjork at a rave as you clumsily shuffle about.

Yep, the best camera really is the one you have with you, which for me, and a significant proportion of other shooters, is more often than not the iPhone. It doesn’t mean I want to sacrifice all that bokeh shallow DOF goodness on the altar of convenience though; hell no, I want it all.

But dear reader, and I really must say this in hushed tones (just hold on a minute whilst I put on my chain mail, fireproof suit and motorbike helmet, dum de dum, ah, there you go, all done)… sometimes you can get a better result by actually doing the DOF sim shuffle.

Ouch, who threw that, I saw you!

Tawny Frogmouth in gumtree taken with iPhone and depth of field simulated in post
Simulated depth of field on this close-up iPhone shot of Tawny, our resident frogmouth; he lives part time in our backyard, is very tame and is quite happy to be shot close up with the iPhone. The DOF-simmed look is nice and makes him stand out rather well, especially considering frogmouths are normally the masters of disguise.

See, it’s like this: just maybe you actually don’t want the four eyelashes, three nose hairs and one bloated magenta zit on the right cheek look. Like, dude, maybe you want something a little more sophisticated, such as, oh I don’t know, a whole face in focus and a gently diminishing background blur that’s just a tad softer at the corners and super dooper soft on the most distant objects. Yeah, I know, I’m hard to get along with.

Maybe you actually want those “in focus bits” to be truly ruly sharp, not just sort of glowy sharp.

It could even be that you want a DOF look that’s not actually technically possible using regular aperture adjustments on a regular camera.

And what about bokeh rendering? Well, what about it? Well, maybe you want something that your DOF monster 300mm f/0.95 won’t actually deliver. (Sorry, I was getting a bit silly there, but you know what I mean.)

So shoot me (whoops, just ducked in time), but you know what, you can always start with a sharp image and get down and boogie, um, I mean bokeh, but you cannot start out with a creamy dreamy bokeh bonanza and find details that went MIA at shooting time.

Now sure, DOF simming’s not for everyone. Some folk just want to press the shutter and go home to a nice warm cocoa and lie down with a good book; some folk think their camera is a machine gun and need to expend 1000 rounds to get coverage for every possible shot. DOF simming will never float these folks’ boats, I get that.

Now, just so you know, yep, I’ve also got full frame, half frame, quarter frame and bloody big film frame, and more lenses “than my wife knows about”, so it’s not like I don’t have the so-called sensible DOF choices if I want to use them.

Shooting for me… well, I’m pretty selective when it comes to taking shots; I prefer to take a few selective shots and then nicely edit them to suit my tastes. I long ago came to the conclusion that more is often, well, just more, and less is a lot less work. But when you do more with less, well, that’s bess… I mean best.

So putting aside the time to have a blurry old time in Photoshop on a few pics is no hardship; mind you, I doubt I’ll ever do it this way for a big commercial shoot… well, not unless someone really wants to pay me to do so, then all bets are off. Money talks, you know!

ducati bevel drive single iPhone simulated depth of field
A rather lovely Ducati bevel drive motor, taken at the Ducati Museum in Bologna, Italy. The subtle depth of field effect works a charm and accentuates the simple beauty of the bevel housing.

This is not a “how to” article, though one day I may get a rush of blood to the cranium and be tempted to make a little YouTube clip on my methods, furnished with a few special secret sauce killer tips. But first I’d need to find some hot bikini-clad ladies (apparently compulsory in almost all “tube” photography lesson clips), or get some cats (also popular and near-compulsory), learn some banter from the YouTube bros and drop a few pounds. Still, I can drop a few tips here that might help you would-be DOF-meisters.

(Note: since I wrote this I have embraced the world of “Tube”, but without the models and cats… just me.)

I don’t get all carried away with masks, depth maps etc; I just use multiple layers blurred to different degrees and brush it all in freehand. No sir, there are none of your fancy schmancy pen tool selections and all that crafty caper. I’ve got reasonably handy with the brush tools over the years, and whilst I’m happy to spend quality time in Photoshop, I also want to get the job done efficiently and hopefully reasonably quickly.
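If you’d rather see the layered idea expressed in code than in Photoshop clicks, here’s a rough Core Image equivalent: several blur strengths blended in through masks. In Photoshop my masks are brushed in freehand; in the sketch below they’re simply whatever mask images you supply:

```swift
import CoreImage

// Layered blur, Core Image style: start from the sharp base and
// successively blend in progressively blurred copies through masks
// (white = use the blurred layer, black = keep what's underneath).
func layeredBlur(base: CIImage,
                 levels: [(radius: Double, mask: CIImage)]) -> CIImage {
    var result = base
    for level in levels {
        guard let blurFilter = CIFilter(name: "CIGaussianBlur"),
              let blendFilter = CIFilter(name: "CIBlendWithMask") else { continue }
        blurFilter.setValue(base.clampedToExtent(), forKey: kCIInputImageKey)
        blurFilter.setValue(level.radius, forKey: kCIInputRadiusKey)
        guard let blurred = blurFilter.outputImage?.cropped(to: base.extent)
        else { continue }
        blendFilter.setValue(blurred, forKey: kCIInputImageKey)
        blendFilter.setValue(result, forKey: kCIInputBackgroundImageKey)
        blendFilter.setValue(level.mask, forKey: kCIInputMaskImageKey)
        result = blendFilter.outputImage ?? result
    }
    return result
}
```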

I also make use of several types of sharpening methods: high radius, low radius, ultra-low radius, blur-blend sharpening, high-pass filter and add-noise filters. We’re all good friends, you know, and we play nicely with one another.

simulated depth of field,model steam pump, wellington museum new zealand.
Subtle depth of field simulation applied to a mechanical exhibit in the Wellington Museum, taken in dim available light using Cortex Cam on the iPhone 6S Plus.

The real secret sauce is actually in the shooting. First, regardless of what I’m shooting, I’m very precise with my technique, and shooting in DNG is super important.

If I think the image is going to be DOF-simmed, I try to shoot it so there’s at least some separation between the subject and the background, and I especially try to keep the backgrounds unobtrusive and not too busy. Honestly, that last bit can be hard to do, and sometimes a busy background, when DOF-simmed, can have a charm all of its own.

I also look for the right light; in other words, light that has some direction but is not too harsh. I’m not afraid to ask myself or the subject to move to get the right light, assuming, of course, the subject is human, canine or otherwise mobile. I wouldn’t bother asking cats to move, of course, cause you know exactly what cats are like… which is probably why I haven’t made any YouTube clips yet. (With cats in them.)

This is a little hard to explain, but trust me: depth of field rendering and apparent separation have a hell of a lot more to do with getting maximum image sharpness on the planes that should be, well, sharp than with just adding big blur. Blur looks a lot more blurry if the sharp bits are actually really sharp. DNG’s da bomb here, because with the right methods the images are just soooo much more detailed in the first place; that, and the fact that I can precisely control the noise signature with DNGs.

Serious JPEG iPhone shooters will be very familiar with the terms mushy, soft, plastic, watercolour-like, flat, smudged; you get the idea. iPhone DNG is nothing like this!

Last, and definitely not least, I have some special capture methods that really make those DNGs sing. No clues, I’m afraid, but you can always buy my book if you want the inside running on that aspect.

Buy Ultimate iPhone DNG on the iBooks store: 

https://itunes.apple.com/us/book/ultimate-iphone-dng/id1274334884?ls=1&mt=11