Photograph colour casts and perception
I've covered this briefly before, but having finally managed to watch the Horizon programme about colour, and having had to explain the concept to my godparents, I think it's worth looking at again.
One of the main problems with digital photography is taking a shot, looking at it and then stating "But why is it so orange/blue/red when it clearly isn't like that when I look at the scene in front of me?" The short answer is - because that really is what the scene looks like and the camera is correctly reproducing it.
If a person were to walk into a room with overly blue lighting they would not be surprised that objects in that room possessed a blue hue; yet take the same person outside, take a photograph under a clear blue sky, and they'd express surprise that things have an overall blue cast, because, after all, it doesn't look that way to them.
This is due to the processing performed by our brains - in the outdoor case our brain knows that the light is blue and adjusts our perception accordingly, re-balancing all the other colours. This happens without our knowledge and so seamlessly that we don't even notice.
So take a photo outside where the light is blue and everything acquires a blue cast; take a shot inside and everything may turn orange (or more yellow), because that's the colour of light being thrown out by the lamps. We don't notice it because we've filtered it out, but the camera, which does no such processing, simply records what is there.
As this is a known 'problem' most cameras will try to adjust for lighting conditions; the camera detects a prevalence of one colour and re-adjusts the others to compensate. Sometimes it does a good job; sometimes it doesn't. When it doesn't, it's time to delve into the settings for something called "White Balance".
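For anyone curious what that compensation might look like, here's a rough sketch in Python (my own simplification of the classic "grey world" idea, not any camera maker's actual algorithm - the function name is made up): assume the average colour of the whole frame ought to be a neutral grey, and scale each channel until it is.

```python
import numpy as np

def grey_world_balance(image):
    """Rough 'grey world' auto white balance sketch.

    image: float array of shape (height, width, 3), RGB values in 0..1.
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)   # average R, G and B over the frame
    gains = channel_means.mean() / channel_means        # boost the weak channels, tame the dominant one
    return np.clip(image * gains, 0.0, 1.0)             # apply the gains, keep values in range
```

A frame dominated by blue sky gets its blue channel pulled down and its red pushed up - which also hints at why auto white balance goes wrong when the scene genuinely is mostly one colour.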
Most of the time this is set to Auto, but if that had worked there would be no need to be looking at these settings. Depending on the camera, or just how snooty it is, there may be presets such as Daylight, Tungsten, or Fluorescent, or a colour temperature range from which the user is supposed to assess the temperature of the light (presumably using some kind of colour temperature meter). Much easier is a custom setting - point the camera at something known to be white and tell the camera "This is white".
To get the best from this it's important to make sure the white object is in the same sort of lighting as the scene you want to take - so not directly under the light source, not in a shadow, etc. Do this and the colours of the photo should match the colours as perceived by the filtering 'software' at work in the human brain.
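In code terms the custom setting boils down to something like this (again a hedged sketch of the idea rather than what any particular camera does, and the function name and example numbers are mine): measure the R, G and B the sensor records for the 'white' object, then compute per-channel gains that make that sample neutral.

```python
import numpy as np

def custom_white_balance(image, white_sample):
    """Apply a custom white balance from a reference white object.

    image: float array (height, width, 3), RGB values in 0..1.
    white_sample: the average (R, G, B) recorded for the object we are
                  declaring to be white, e.g. np.array([0.55, 0.62, 0.80])
                  for a sheet of paper photographed under a blue sky
                  (illustrative numbers only).
    """
    gains = white_sample.max() / white_sample   # scale so the sample ends up equal in R, G and B
    return np.clip(image * gains, 0.0, 1.0)     # every pixel gets the same correction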
Now hopefully any reader will spot that the light changes depending on where the photo is being taken, so a white balance set up inside under artificial lighting can produce quirky results when taking photos outside under natural light. As such it's important (if a manual one is set) to re-set the white balance every time the lighting conditions change.
So no blaming the camera, blame your brain.
2 comments:
What I can't figure out is… if the human visual system filters the image of the world around us correctly, why doesn't it also filter photographs of that world?
It does. Look at the photo outside and then look at it inside, and the latter should look bluer, because outside the brain is filtering out some of the blue.
Think of the light paths. Outside, the blue light hits the white object and reflects blue light to our eyes, which is filtered back to white because the brain knows the light is blue. In the photo the blue light hits the 'blue' object and reflects blue light to our eyes; the brain filters out the blue of the light, but the object itself was recorded as blue, so it still looks blue.