Gamma correction is an integral part of virtually every digital imaging system, yet a lot of people have never heard of it! It plays a role in all imaging devices: cameras, camcorders, monitors, video players, and so on. In essence, it defines the relationship between a pixel’s numerical value and its actual luminance. Now wait a minute, why would they be different? Isn’t a pixel’s numerical value supposed to correspond directly to its luminance? Well, not really! Without gamma, the shades captured by digital cameras wouldn’t appear as they did to our eyes. If we understand how gamma works, we can improve our exposure technique and make the most of image editing. So what is it all about? Why do we need gamma correction at all?
Why do we care about it?
The main thing to understand here is that our eyes do not perceive light the way cameras do. The human visual system is an extremely refined system. It does a lot of work in the background and makes stuff look prettier. Let’s consider a digital camera for a moment. The way it works is that photons hit the sensor, the camera receives a signal, and an image is formed. So when thrice the number of photons hit the sensor, it receives three times the signal. Pretty logical, right? It’s a simple linear relationship! But that’s not how our eyes work. When thrice the number of photons hit our eyes, we perceive the light as being only a fraction brighter. This effect becomes increasingly more prominent at higher light intensities. This indicates a nonlinear relationship.
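We can sketch this difference in a few lines of Python. The cube-root curve below is only a rough stand-in for human lightness perception (similar in spirit to the CIE lightness formula), not an exact model of the eye:

```python
# Sketch: linear sensor response vs. (approximate) perceived brightness.
# The cube-root curve is a rough approximation of human lightness
# perception, not an exact model.

def sensor_response(photons):
    """A camera sensor is (roughly) linear: 3x photons -> 3x signal."""
    return photons

def perceived_brightness(photons):
    """Our eyes are nonlinear: 3x photons looks far less than 3x brighter."""
    return photons ** (1.0 / 3.0)

print(sensor_response(3.0))       # 3.0   -> the camera sees triple the signal
print(perceived_brightness(3.0))  # ~1.44 -> we see it only ~44% brighter
```

So tripling the light triples the camera's signal, but to our eyes it looks less than half again as bright.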
Have you ever noticed how we tend to miss out on smaller details when an image is really bright? The reason is that we are much more sensitive to changes in dark tones than to similar changes in bright tones. There’s actually a biological reason for this peculiarity: it enables our vision to operate over a broader range of luminance. Otherwise, the typical range in brightness we encounter outdoors would be too overwhelming. Our visual system is really smart that way!
But how does all of this relate to this “gamma” thing?
We are getting there! The reason we are discussing all this is because we need to understand how we perceive luminance in order to understand why we need gamma correction. Gamma basically establishes a relationship between our eye’s light sensitivity and that of the camera. When a digital image is saved, it’s therefore “gamma encoded”. This way, twice the value in a file more closely corresponds to what we would perceive as being twice as bright.
The reason we do gamma encoding is that gamma-encoded images store tones more efficiently. Since gamma encoding redistributes tonal levels closer to how our eyes perceive them, fewer bits are needed to describe a given tonal range. The number of bits you allocate to something dictates the level of detail you can store about it. Since our visual system doesn’t care much about fine gradations in the brighter regions, we don’t have to waste bits storing information about them. The bits that are saved can instead be devoted to describing the darker tones, where our eyes are most sensitive.
In the picture here, you can see how linear encoding uses insufficient levels to describe the dark tones. Since the human eye is sensitive to dark tones, we need a sufficient level of detail to represent those regions. If we distribute the levels uniformly across the range, we end up with an excess of levels describing the bright tones, which is not necessary at all! The gamma-encoded gradient, on the other hand, distributes the tones roughly evenly across the entire range, which is perceptually more uniform.
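We can make this concrete by counting levels. The sketch below assumes a pure power-law encoding with gamma = 1/2.2 (the real sRGB curve adds a small linear toe near black, which we ignore here) and asks how many of the 256 levels in an 8-bit file end up describing the darker half of the luminance range:

```python
# Sketch: how many of the 256 levels in an 8-bit file describe the
# darker half of the luminance range? Assumes a pure power-law
# encoding with gamma = 1/2.2 (real sRGB adds a linear toe).

def decode(value, gamma=2.2):
    """Convert a stored (gamma-encoded) value in [0, 1] back to linear luminance."""
    return value ** gamma

# Linear encoding: level n stores luminance n/255 directly.
linear_dark = sum(1 for n in range(256) if n / 255 < 0.5)

# Gamma encoding: level n stores luminance (n/255)**2.2.
gamma_dark = sum(1 for n in range(256) if decode(n / 255) < 0.5)

print(linear_dark)  # 128 levels for the dark half
print(gamma_dark)   # 187 levels for the same dark half -> more shadow detail
```

With linear encoding, exactly half the levels (128) cover the shadows; with gamma encoding, about 73% of them (187) do, which is where our eyes want the detail.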
How do we do gamma correction?
So far, we have seen why we need gamma correction. But despite all of these advantages, we cannot avoid the fact that gamma encoding adds a layer of complexity to the whole process of recording and displaying images. A gamma-encoded image has to have “gamma correction” applied when it is viewed, which effectively converts it back to the original scene. This means that the purpose of gamma encoding is for recording the image, not for displaying the image. Fortunately, this second step is automatically performed by your monitor and video card. Gamma actually appears at several steps in the pipeline:
- Image Gamma: This is applied either by your camera or by RAW development software. Whenever an image is captured, it is converted into a standard JPEG or TIFF file. When the software does that, it redistributes the native camera tonal levels into ones that are more perceptually uniform. This way, we make the most efficient use of a given bit depth. We only have a certain number of bits to represent an image, and we need to make the best of them. To use them efficiently, the software devotes more levels to the darker tones and fewer to the brighter ones.
- Display Gamma: This refers to the net influence of your video card and display device. The main purpose of the display gamma is to compensate for a file’s gamma, ensuring that the image isn’t unrealistically brightened when displayed on your screen. A higher display gamma results in a darker image with greater contrast. You can see this in the image above: the image on the right has a higher display gamma, and is therefore darker.
- System Gamma: This represents the net effect of all gamma values that have been applied to an image, i.e. the combination of image gamma and display gamma. For faithful reproduction of a scene, this should ideally be close to a straight line (gamma = 1.0). A straight line ensures that the input is the same as the output, i.e. the original scene is the same as what’s being displayed on your screen. However, the system gamma is sometimes set slightly greater than 1.0 in order to improve contrast. As we all know, the human eye loves contrast!
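The three steps above can be sketched as a round trip, assuming the common values of 1/2.2 for the image gamma and 2.2 for the display gamma:

```python
# Sketch of the full pipeline: encode with the image gamma (~1/2.2),
# then display with the display gamma (~2.2). The two cancel out,
# giving a system gamma of 1.0, so the displayed luminance matches
# the original scene.

IMAGE_GAMMA = 1.0 / 2.2    # applied by the camera / RAW software
DISPLAY_GAMMA = 2.2        # applied by the monitor + video card

def encode(scene_luminance):
    return scene_luminance ** IMAGE_GAMMA

def display(encoded_value):
    return encoded_value ** DISPLAY_GAMMA

scene = 0.18  # middle-gray luminance on a 0-1 scale
shown = display(encode(scene))
print(round(shown, 6))  # 0.18 -> system gamma of 1.0 reproduces the scene
```

If you instead wanted a system gamma slightly above 1.0 for extra contrast, the display exponent would simply be made a touch larger than the reciprocal of the image gamma.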
Does it affect the monitors?
All the images and videos we view today are viewed on some kind of monitor. So this whole gamma correction thing is an integral part of those monitors.
- CRT Monitors: The interesting thing to note is that the native gamma of a CRT is about 2.5. This is almost the inverse of our eye’s sensitivity! So values from a gamma-encoded image file could be sent nearly straight to the screen, and the CRT’s natural response would largely undo the encoding by itself. However, a small additional correction of roughly 1/1.1 still needs to be applied to bring the CRT’s native 2.5 down to the standard overall display gamma of 2.2. This is usually already handled by the manufacturer’s default settings, but can also be set during monitor calibration.
- LCD Monitors: LCDs, unlike CRTs, don’t have a naturally gamma-like response, so substantial corrections are often needed to achieve an overall display gamma of 2.2. LCDs therefore rely on a look-up table (LUT) to ensure that input values are depicted using the intended display gamma. But that’s a whole new Pandora’s box we cannot afford to open in this blog post!
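The idea behind such a LUT is simple enough to sketch, though: precompute the output for every possible input level once, and then displaying a pixel is just an array lookup. This toy version assumes a pure power-law target of 2.2 and 8-bit values; a real monitor LUT is calibrated per device and often has higher precision:

```python
# Sketch of a 1-D gamma look-up table, conceptually like the ones
# LCD monitors (or their video cards) use. Assumes a pure power-law
# target gamma of 2.2 and 8-bit input/output values.

TARGET_GAMMA = 2.2

# lut[n] = output level the display should produce for input level n.
lut = [round(255 * (n / 255) ** TARGET_GAMMA) for n in range(256)]

print(lut[0])    # 0   -> black stays black
print(lut[128])  # 56  -> the mid input level maps to a much darker output
print(lut[255])  # 255 -> full white stays full white
```

Once the table is built, no per-pixel exponentiation is needed at display time, which is exactly why hardware favors LUTs.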
Basically, gamma correction is yet another piece of the puzzle in the world of digital imaging, and it’s everywhere. The human visual system is a spectacular marvel of engineering, and every imaging device aspires to replicate it. That’s why we dance to its tune: to capture all the subtle nuances and bake them into our digital imaging systems!