Sunday, June 29, 2014

Responding to a friend about "unnatural" photographs.

Why does a picture look unnatural?


My friend Ron sent me an email with an excerpt about a photographer who had been asked how he got his pictures to look the way they did. What kind of filter did he use? Why don't the pictures look like the home pictures the questioner takes?

It got me thinking (Ron often does that to me, by the way). Here's my response. I thought I'd post it since I did put a little effort into explaining some of my thoughts regarding this subject. 

High Dynamic Range Processing


There's a technique called High Dynamic Range processing, or HDR for short, that can produce what look like exaggerated images. I didn't see the image being discussed in the snippet Ron sent, but HDR pics can look unreal. At least, they can certainly look different from a lot of "normal" pictures. It may be, though, that HDR pictures just take some getting used to; for now, we're used to seeing "normal" pictures from limited cameras.

The problem with most cameras and photography, including printed images, is that they can't reproduce the range of brightness or color levels that the human eye can see. The human eye can see perhaps up to 24 EV, or f-stops, worth of dynamic range. Most DSLRs can only capture about 8-12 EV. In other words, the range from the darkest dark (black) to the brightest bright (white) is about 8-12 EV on most cameras. A camera isn't capturing the full range of what the eye can see, only a portion of it, perhaps about half.
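A quick back-of-the-envelope sketch in Python, using the rough EV estimates above (each EV, or f-stop, is a doubling of light, so the recordable contrast ratio works out to 2 raised to the EV count):

```python
# Each EV (exposure value, or f-stop) is a doubling of light, so a
# device with N EV of dynamic range spans a contrast ratio of 2**N.
def contrast_ratio(ev_stops):
    return 2 ** ev_stops

for label, ev in [("typical DSLR (low end)", 8),
                  ("typical DSLR (high end)", 12),
                  ("human eye (rough estimate)", 24)]:
    print(f"{label}: {ev} EV = {contrast_ratio(ev):,}:1")
```

The jump from 12 EV to 24 EV isn't a factor of two in light, it's a factor of 4,096, which is why a camera can feel so limited next to the eye.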

Using a Graduated Neutral Density Filter



Chiricahua Canyon
If you look at a scene like I saw in the Chiricahuas, where it's dark in the canyon but the tops of the cliffs are brightly lit because the sun is hitting them, it may be a range of 16 EV.

Since my camera can only record about 12 EV, I can expose a picture for the canyon floor, but if the cliff tops are in the frame they'll be "blown out" and render as pure white. If I expose for the cliff tops, parts of the canyon floor that my eye can still see will go black in the picture.

The picture to the right had a lot of processing to try to bring out detail in the cliff tops, but I think it shows that they've been processed. Note that this was shot without any graduated neutral density filter (this outing prompted me to pick up the Singh-Ray, though).

The Nikon D610 has about 14 EV of range. It's better, but still can't always record what your eye sees.

To overcome this limitation you can use filters or HDR processing.

I got a filter specifically designed to handle this situation a few months ago, called a graduated neutral density filter. It's rectangular, with one half tinted neutral gray, the other half clear, and a gradual transition between the two.

With this filter you can slide the gray portion up or down so that, in my example above, it provides a darkening effect for the cliff tops to bring them more within the range of the camera, allowing you to expose correctly for more of the scene.
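As a rough illustration of what sliding the gray half over the bright part of the scene does, here's a sketch in Python. This is not a model of any real filter; the two-stop strength and the transition band are made-up numbers, and the "scene" is just a toy array of linear brightness values:

```python
import numpy as np

def apply_grad_nd(image, stops=2.0, transition=(0.3, 0.5)):
    """Darken the top of a linear-light image the way a graduated ND
    filter would: full strength at the top edge, fading to clear
    across the transition band (given as fractions of image height).
    The parameter values are illustrative, not from any real filter."""
    h = image.shape[0]
    start, end = (int(t * h) for t in transition)
    factor = np.ones(h)
    factor[:start] = 2.0 ** -stops  # fully darkened region
    factor[start:end] = 2.0 ** -(stops * np.linspace(1, 0, end - start))
    return image * factor[:, np.newaxis]

# A toy "scene": bright cliff-top rows above a dim canyon floor.
scene = np.vstack([np.full((5, 4), 4096.0),   # sunlit cliff tops
                   np.full((5, 4), 16.0)])    # shaded canyon floor
filtered = apply_grad_nd(scene)
```

After the filter, the bright rows come down by two stops (a factor of four) while the canyon floor is untouched, squeezing the scene's range closer to what the sensor can record.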


Using HDR Software


To use HDR, you take a number of pictures of the same scene, each exposed properly for a different part of it, and then combine them with software. You could expose one picture for the cliffs and another for the canyon floor, then let the software merge the well-exposed portions of each to get more dynamic range in the final image than the camera is capable of in a single shot.

I've done a few HDR shots but haven't had as much luck with them as some people have. I think a lot of that is because I got a "limited" version of the HDR software I use (I have Photomatix Essentials).

Here's what I think is a nice HDR shot: http://hdrsoftware.com/wp-content/uploads/2011/02/Florida-HDR.jpeg

Looking at that image, another thing to keep in mind is that the human eye doesn't really hold a scene like this in focus from the foreground to the background. This shot has the foreground out of focus, which is kind of weird to look at. If we looked down at the rocks, our eyes would shift focus and the rocks would be sharp. We can't actually inspect the parts of a scene where our eyes AREN'T focused to see that those areas are blurry. The camera CAN be made to keep more of a scene in focus than the eye would on its own. That's good in that we can look around the picture and see its different parts in focus, but bad in that the result can look unnatural.

As for color saturation, yes, that can easily be bumped up beyond what the scene actually presented to a normal eye. I tend to do that a lot myself; I just think pictures look better with the saturation boosted.

All of these decisions end up being what can turn a picture into a personal vision, or art. That doesn't mean everyone will like the picture or consider it art. But an artistic photo can be different from a true-to-life representation, just as a painting can.

Even simple things like shooting pictures during the early morning can seem unreal to some people because they never go look at a place in the early morning during the "golden hour" or "blue hour". I like my pictures of Encanto Park for that reason - the sunrise and silhouettes are unusual but, other than maybe bumping up the color saturation, that's the way it looked.
