Gamma

Definition

Gamma describes the nonlinear relationship between the numerical code value of a pixel in a digital image and the actual brightness of that pixel when displayed. Because human perception of brightness is not linear (we are more sensitive to differences in dark tones than in bright tones), digital imaging systems use gamma encoding to allocate more code values to the tonal ranges where the eye is most sensitive. Different gamma standards serve different applications: Rec. 709 HD video is mastered for a reference display gamma of approximately 2.4 (per BT.1886), computer displays commonly use a gamma of 2.2 (the sRGB convention), and digital cinema projection uses a gamma of 2.6 (DCI). Log gamma curves (such as ARRI LogC or Sony S-Log) are used in professional cameras to capture the widest possible dynamic range.
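The encode/decode relationship described above is, at its simplest, a power function. A minimal sketch (the function names and the 2.4 exponent here are illustrative assumptions, not taken from any one standard):

```python
def gamma_encode(linear, gamma=2.4):
    """Compress linear scene light into a display-referred code value.

    The power curve spends more of the code-value range on the shadows,
    matching the eye's greater sensitivity to dark-tone differences.
    """
    return linear ** (1.0 / gamma)

def gamma_decode(encoded, gamma=2.4):
    """Invert the encoding to recover linear light."""
    return encoded ** gamma

# 18% mid-gray lands near the middle of the encoded range,
# even though it is numerically dark in linear light.
mid_gray = gamma_encode(0.18)   # ~0.49
```

Real transfer functions (sRGB, BT.1886) add a linear toe segment near black, but the power curve captures the core idea.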

Contextual Usage

The colorist explains a technical issue to the director: "The footage looks flat and washed out on your laptop because your laptop is displaying it with the wrong gamma. The footage was shot in ARRI LogC, which is a flat log gamma designed for maximum dynamic range. It needs to be converted to Rec. 709 gamma before it looks correct on a standard display. I'll send you a properly converted review file."
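The conversion the colorist describes begins by linearizing the log footage. A minimal sketch of that first step, using the widely published ARRI LogC3 (EI 800) constants (treat the constants as an assumption and verify them against ARRI's own documentation before relying on them):

```python
import math

# ARRI LogC3 (EI 800) constants as widely published;
# illustrative only -- confirm against ARRI's documentation.
CUT = 0.010591
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F = 5.367655, 0.092809

def logc_encode(x):
    """Linear scene light -> LogC code value (log segment above CUT)."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F

def logc_decode(t):
    """LogC code value -> linear scene light (inverse of logc_encode)."""
    if t > E * CUT + F:
        return (10 ** ((t - D) / C) - B) / A
    return (t - F) / E

# 18% gray encodes to a code value near 0.39, which is why
# ungraded LogC footage looks flat and gray on a Rec. 709 display.
```

A full review conversion also applies a display gamma and a color-space transform on top of this linearization, which in practice is done with a LUT rather than by hand.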