So, you’ve seen the coverage blanketing the internet about a dress which is reputed to be white and gold, or blue and black. People have extremely strong opinions about it. A lot of ‘science’ has been talked about it, or, rather, journalists have approached some scientists and written down some of their remarks. The reality is a lot simpler, and is something that anyone serious about taking photographs had to come to grips with long ago.
Essentially, the human eye is a combination of a bio-optical lens and retina and some highly developed brain functions that interpret the retinal signal, working closely with short- and long-term memory to interpret images correctly. If you look at eye motion studies, you see the way the eye’s (very shallow) focus point darts around a scene, building up a picture in memory which becomes what you ‘see’.
A camera, by contrast, merely captures whatever falls onto the sensor, be it exposed film or RGB photo-diodes. If you have a digital SLR and you work with RAW files, then the RAW files contain the exact information captured by the sensor.
Even a RAW image isn’t quite as raw as you might imagine. Naturally, no computer image can be ‘seen’ by the naked eye without interpretation — it exists, after all, only as electronic impulses which would normally be interpreted into 1s and 0s. When you ‘see’ a JPEG file, or a TIFF, or a RAW, you are seeing the results of a very sophisticated process of interpretation.
A JPEG or a TIFF has the interpretation ‘baked in’ — the Red-Green-Blue values translate straight to the screen, which means that if the screen’s colour space is different from the file’s colour space (more on that in a moment), you will see a colour shift. A RAW image, though, also contains information about white balance, which includes colour temperature and tint.
What is colour temperature?
If you take a black body and heat it to a particular temperature, it glows with a particular colour. The temperatures involved are high: ordinary daylight, the kind emulated by electronic flash, corresponds to 5600 K — a temperature you would never normally encounter on Earth. However, because it gives an objective definition, colour temperature (in K) is used to define how reddish or bluish the illuminating light is. In addition to colour temperature, you also get tint shifts — early fluorescent lights, for example, always turned everything green.
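To make the scale concrete, here is a rough Python sketch of the well-known Tanner Helland curve fit from colour temperature to an approximate sRGB colour. The function name and the fit itself are illustrative only; no camera uses exactly this.

```python
import math

def kelvin_to_rgb(temp_k):
    """Very rough sRGB colour of a black body at temp_k (1000-40000 K),
    using Tanner Helland's published curve fit. Illustration only."""
    t = temp_k / 100.0
    r = 255.0 if t <= 66 else 329.698727446 * (t - 60) ** -0.1332047592
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda x: int(max(0.0, min(255.0, x)))
    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(2700))   # incandescent bulb: strongly orange
print(kelvin_to_rgb(5600))   # daylight/flash: close to neutral white
print(kelvin_to_rgb(10000))  # blue sky in shadow: distinctly blue
```

Running it shows the reddish-to-bluish progression: low temperatures come out orange, 5600 K is nearly white, and very high temperatures drift blue.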
Using its automatic settings, your camera should correctly record the right white balance, which is then transferred with the RAW file, and baked into the JPEG. If you’re using a smartphone camera, you will never touch the RAW — the only file you will get is the JPEG.
It should, but, quite often, it doesn’t.
Your eye (bio-optical + brain + memory) is highly adept at adapting to the illuminating light and thus correctly seeing the ‘true’ colour of something. Even so, the golden light of the hour before sunset and after dawn is golden because your eye does not entirely adapt. This is partly because your eye is taking more than a thousand samples a minute as it roves around, building up not only a picture of what you are looking at, but also the other things around it.
The camera only gets one go, and it has to guess, using its memory, what parts are ‘white’ and what parts aren’t. A sophisticated dSLR will have a substantial number of pre-programmed scenes to help it in this task. A smartphone, no matter how sophisticated, has fewer. Even the best dSLR — I’m thinking of a Nikon D800 — doesn’t get it right all the time, which is why many photographers prefer to shoot in RAW and fix the images in post-processing if the white balance isn’t right.
What kinds of things cause poor white balance?
Essentially, a combination of mixed lighting and difficult-to-identify surfaces. If you are in a room with the light on, and there is also sunlight streaming through the window, you won’t notice anything odd. On the other hand, if you are driving home and the windows of the houses are lit, you will see the cheery glow of yellow-orange light. In the room, your eye has correctly assessed all the different kinds of shadows, and used memory to identify the ‘true’ colours of things. Driving home, you see the much bluer-than-normal light coming from the evening sky, and the windows in cheerily contrasting yellow-orange, because the colour temperature of incandescent lights is much lower than that of evening light, and even modern LEDs are balanced to replicate it.
If you take a picture in a room with mixed lighting, you will see that areas illuminated by sunlight will appear blueish, and areas illuminated by electric light appear yellowish. This is most obvious in the multi-coloured shadows that you get, which can be yellow, blue or even green, depending on what kinds of lights you have, and how strong they are.
The camera has to make a guess at what the ‘correct’ colour temperature is. What it can’t do is assess the entire situation and see the colours ‘correctly’ notwithstanding the predominant light falling on them. The eye gets that right, the camera doesn’t.
You can fix this, if you have the patience, in Lightroom (for shadows), Capture One, or, best of all, DxO, which has a mixed-lighting feature. You can even mask out layers in Photoshop.
The other problem the camera faces is that it has to guess what white is. Unlike your eye, which knows what colour the curtains are, the camera tries to make an assessment based on predominance. If the entire image appears to the camera to be yellowish, then it will lower its reference colour temperature, in the belief that the room is yellowish because of a yellowish light. In my living room, with its butter-coloured wallpaper, yellow Flemish sofa and gold Flemish curtains, it’s not likely to get it right. More sophisticated software can guess based on more information — skin tone and sky tone, for example — but it’s never perfect.
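The predominance guess can be sketched as the classic ‘grey world’ algorithm: assume the scene averages out to neutral grey and scale each channel to make that true. The function and pixel values below are illustrative; real cameras use far more elaborate heuristics.

```python
import numpy as np

def gray_world(img):
    """Grey-world white balance: assume the scene averages to neutral
    grey, and scale each channel so the channel means come out equal."""
    img = img.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means
    return np.clip(np.rint(img * gains), 0, 255).astype(np.uint8)

# A scene that really is yellowish (hypothetical pixel values for a
# butter-coloured wall) fools the guess:
yellow_room = np.full((4, 4, 3), (200, 180, 100), dtype=np.uint8)
print(gray_world(yellow_room)[0, 0])  # the wall is 'corrected' to grey
```

The algorithm dutifully drains the genuine yellow out of the wall, which is exactly the failure mode described above.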
What about The Dress?
Some people see The Dress as gold and white, others as black and blue. What’s the truth? The truth is that you can’t possibly tell from that picture which it is. If you load it into Photoshop, you will see very clearly that the colours which appear ‘white’ to some people are blue, and the colours which appear ‘black’ to other people are brown, mud, or gold. The dress is blue and gold, at least, according to the picture.
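You can verify this kind of thing without Photoshop. The sample values below are hypothetical, of the sort you might pick from the two stripes with an eyedropper tool; they are not the photo’s actual pixels.

```python
# Hypothetical eyedropper samples from the two stripes of the dress
# (illustrative values only, not the actual photo):
dress_light = (120, 125, 160)   # reads as 'white' to many viewers
dress_dark  = (110, 90, 60)     # reads as 'black' to many viewers

def dominant_channel(rgb):
    """Crude hue check: which of R, G, B is strongest?"""
    return "RGB"[rgb.index(max(rgb))]

print(dominant_channel(dress_light))  # B: the 'white' is actually blue
print(dominant_channel(dress_dark))   # R: the 'black' is brown/gold
```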
But — the eye doesn’t like that. Your memory has many dresses stored within it, and also many objects seen in different lights. In direct sunlight, all shadows are blue, but the eye corrects them to the right colour. Why? Because sunlight has a much lower colour temperature than blue sky, but is much stronger. Things appear to be in shadow when the sun’s light does not fall on them directly, but only as reflected through the sky. Your eye is well-used to correcting this, which is why things don’t appear to change colour as they move in and out of shadow, only luminosity.
Lacking reference cues, some people’s eyes light on the ‘white’ of the dress and interpret this in the same way as white in shadow. This causes the eye to interpret the brown part as a rich gold. From that perspective, the dress is then white and gold.
However, the eye is not satisfied with that. It continues to look at the picture, and notes that the colour cues of the other lights in the picture don’t suggest that the dress is in shadow out of doors, but rather is indoors, in low-colour temperature lighting. This means (to your eye) that the ‘white’ area is not white at all, but a much stronger blue than it would appear. In that case, the ‘brown’ must be much bluer as well, which moves it into black.
If you stare at it long enough, you may well see the dress shift from gold and white to black and blue, and back again, as the eye struggles to get the information it wants from the rest of the image.
This is made worse by the fact that the colour in the image is fairly obviously degraded. Fifty years ago your eye wouldn’t have known what to do with that, but in the internet age we’ve seen enough bad JPEGs for our visual memory to be busy making sense of them.
Solving it with colour workflow
Assuming you actually wanted to represent the colours correctly, for example because you were doing a fashion advertisement, how would you go about this?
With some difficulty. Every step of the way is fraught with colour danger.
However, this is how it’s done.
First, before you shoot the picture, you shoot a Grey Card — not just any grey card, but a specially printed and frequently replaced card which is a known value of grey. If you have the time, you actually then set the camera’s white balance to that. If the lighting changes, you shoot the grey card again.
Actually, there’s a step before that, which is to get rid of all mixed lighting, or, if that’s not possible, to put gels on your lights so that all the lights balance. On a photo-shoot, you’ll quite often see orange gels over windows to bring the daylight down to match the lamps, or blue gels over incandescent lamps to match daylight. Additionally, studio lights are usually sufficiently powerful to overpower indoor lighting, and are colour balanced to match outdoor light.
Second, you shoot in RAW, and when you come to process the images, you first set the white balance using your image of the grey card, and then apply that same balance to all images, irrespective of what the camera recorded.
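The arithmetic of that second step is simple. Here is a sketch in NumPy with made-up numbers; no RAW converter exposes exactly this function, but this is the idea.

```python
import numpy as np

def balance_from_grey_card(img, card_patch):
    """Compute per-channel gains that make the grey-card patch neutral,
    then apply them to the whole image (float RGB arrays)."""
    card_rgb = card_patch.reshape(-1, 3).mean(axis=0)
    gains = card_rgb.mean() / card_rgb
    return img * gains

# Under tungsten light, a grey card might record (180, 140, 90):
card = np.full((2, 2, 3), (180.0, 140.0, 90.0))
balanced_card = balance_from_grey_card(card, card)
print(balanced_card[0, 0])  # now equal in all three channels
```

The same gains are then reused for every frame shot under that light, irrespective of what the camera guessed at the time.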
You are still not out of the woods. When you come to view the images, you need to view them on a colour-calibrated monitor capable of displaying the colour space they are saved in. Colour space? That’s an electronic description of how the screen and other devices are to interpret the values, because each kind of device has different kinds of colour it’s good at displaying. Photographers might prefer to shoot in ProPhoto, but the screen likes to show things in sRGB. You just have to bear this in mind, and have the profiles set up correctly.

Calibrating the monitor, though, is something you need a measuring device for. X-Rite makes the one I use, and I calibrate my monitor and my printer fairly regularly. Not only do you have to calibrate the monitor, but you have to calibrate it for the light in the room you are in. Again, your measuring device will do this if you specify it, but, if you turn the light on (unless it’s daylight balanced) you’ll need to recalibrate, or use a different, saved, calibration.
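The word ‘profile’ hides real maths. Even the most common space, sRGB, defines a non-linear transfer curve between the stored value and the physical light the screen should emit; a monitor that applies a different curve shows a different image from the same file. The standard sRGB decode looks like this:

```python
def srgb_to_linear(v):
    """sRGB decode: stored value v (0.0-1.0) -> linear light intensity,
    per the standard sRGB transfer function."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

print(srgb_to_linear(0.5))  # mid-grey on screen is only ~21% linear light
```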
You are still not out of the woods. If you want to send the image to a friend, just give up on the idea of colour fidelity. Unless their monitor is also colour calibrated, they will see whatever their combination of monitor and viewing conditions produces, which almost certainly won’t be what you’re seeing. Worse, if they view it on a mobile phone, the colour gamut the phone can display may be very small, or the colours oversaturated, or too bright, or too dark.
If you want to print it on your laser or inkjet printer, you need to calibrate that first. The same measuring device should do this. If you’re sending it out to be printed for a leaflet or magazine, or (even worse) a billboard, then you are unlikely to get an opportunity to do a proper calibration of their output (though, if you’re clever, you might calibrate from a previous document you did with them). If you also want to use your photo on TV, then that’s a different calibration again.
Print is particularly difficult because it’s done in Cyan, Magenta, Yellow and Black dots, rather than in Red, Green and Blue luminous pixels. The computer will handle the conversion, but it’s never exact, and CMYK print will always look duller than the image on your screen. Acclimatising yourself by comparing a previous piece of print with the photos used in it on screen is still the best way to get over this — let your eye, with its vastly superior processing power, do the work.
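The structural change is easy to see in the naive textbook conversion. Real conversions go through ICC profiles and are far more involved; this sketch only shows why additive and subtractive colour can never line up exactly.

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB (0-255) -> CMYK (0.0-1.0) conversion. Real,
    profile-based conversion is far more sophisticated."""
    if (r, g, b) == (0, 0, 0):
        return (0.0, 0.0, 0.0, 1.0)         # pure black: just the K plate
    c, m, y = 1 - r / 255, 1 - g / 255, 1 - b / 255
    k = min(c, m, y)                        # pull shared grey into black ink
    return tuple((x - k) / (1 - k) for x in (c, m, y)) + (k,)

print(rgb_to_cmyk(255, 0, 0))   # pure red = full magenta + full yellow
```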
Even then, the viewing conditions may still skew the colour for you.
All this is what is known as Colour Workflow, and it’s bread and butter for anyone involved in photography, graphic design or print.
Back to The Dress
Needless to say, absolutely none of that work was done for The Dress. For a start, it’s mixed lighting. Then, the exposure is fairly far off, though you can’t tell whether the image is over- or under-exposed. The camera’s software has worked hard to interpret the raw sensor data and put it into a JPEG, but this just makes things worse, as JPEG compression discards information based on its own guesses about the image. Finally, the image has been viewed on millions of smartphones and other devices, of which only a handful will be calibrated. Turn the brightness up, and it’s going to look more like gold and white; turn the brightness down, and it’s more blue and black. If you’ve spent a lot of time looking at images, you’ll see two things: first, the real colour of the image is blue and brown, and, second, the real colour of the dress could be pretty much anything, since the possibility of a tint shift means it could actually be yellow and orange, or green and mauve.
For reasons which I generally only understand for the five minutes after I’ve just read the technical articles explaining it, digital cameras just don’t capture mauve or purple at all well, usually shifting them across to blue.
One thing only is certain: the makers of that dress are going to be getting a lot more sales.