Color depth, or bit depth, is a computer graphics term describing the number of bits used to represent the color of a single pixel in a bitmapped image or video frame buffer. The concept is also known as bits per pixel (bpp), particularly when quoted with a specific value (e.g., 8 bpp). Higher color depth gives a broader range of distinct colors.
With relatively low color depth, the stored value is typically a number representing an index into a color map or palette. The colors available in the palette itself may be fixed by the hardware or modifiable within its limits (for instance, both color Macintosh systems and VGA adapters typically ran at 8-bit depth due to limited VRAM, but while the best VGA systems offered only an 18-bit (262,144-color) palette from which colors could be chosen, all color Macintosh video hardware offered a 24-bit (16 million color) palette). Modifiable palettes are sometimes referred to as pseudocolor.
- 1-bit color (2¹ = 2 colors) monochrome, often black and white.
- 2-bit color (2² = 4 colors) CGA, gray-scale early NeXTstation, color Macintoshes.
- 3-bit color (2³ = 8 colors) many early home computers with TV-out displays.
- 4-bit color (2⁴ = 16 colors) as used by EGA and by the least-common-denominator VGA standard at higher resolution, color Macintoshes.
- 5-bit color (2⁵ = 32 colors) Original Amiga chipset.
- 6-bit color (2⁶ = 64 colors) Original Amiga chipset.
- 8-bit color (2⁸ = 256 colors) most early color Unix workstations, VGA at low resolution, Super VGA, AGA, color Macintoshes.
- 12-bit color (2¹² = 4,096 colors) some Silicon Graphics systems, color NeXTstation systems, and Amiga systems in HAM mode.
- 16-bit color (2¹⁶ = 65,536 colors) later color Macintoshes, called "thousands of colors."
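Each entry in the list above follows from the same arithmetic: n bits per pixel index 2^n palette entries. A quick sanity check in Python:

```python
# Number of distinct colors available at each indexed color depth.
for bits in (1, 2, 3, 4, 5, 6, 8, 12, 16):
    print(f"{bits}-bit color: {2 ** bits:,} colors")
```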
Old graphics chips, particularly those used in home computers and video game consoles, often feature an additional level of palette mapping in order to increase the maximum number of simultaneously displayed colors. For example, in the ZX Spectrum, the picture is stored in a two-color format, but these two colors can be separately defined for each rectangular block of 8×8 pixels.
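The memory savings of the ZX Spectrum scheme are easy to verify from the machine's 256 × 192 screen (a back-of-the-envelope sketch, not code for real hardware):

```python
width, height = 256, 192

# 1 bit per pixel for the bitmap...
bitmap_bytes = width * height // 8           # 6144 bytes

# ...plus one attribute byte per 8 x 8 cell holding that cell's two colors.
attr_bytes = (width // 8) * (height // 8)    # 768 bytes

total = bitmap_bytes + attr_bytes
print(total)  # 6912 bytes, versus 147456 bytes for 24-bit truecolor
```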
As the number of bits increases, the number of possible colors becomes impractically large for a color map. So, at higher color depths, the color value typically directly encodes the relative brightnesses of red, green, and blue to specify a color in the RGB color model.
8-bit direct color
A very limited but true direct color system: there are 3 bits (8 possible levels) each for the R and G components, and the two remaining bits in the pixel byte are assigned to the B component (4 possible levels), enabling 256 (8 × 8 × 4) different colors. The normal human eye is less sensitive to the blue component than to the red or green, so blue is assigned one bit fewer than the others. Used, among others, in the MSX2 series of computers in the mid-1980s.
This should not be confused with an indexed color depth of 8 bpp (although 8-bit direct color can be simulated on such systems by loading a suitable palette).
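The 3-3-2 packing described above can be sketched as a pair of helper functions (hypothetical names; real systems do this in hardware):

```python
def pack_rgb332(r, g, b):
    """Pack full 8-bit R, G, B values into one 3-3-2 byte, keeping the
    top 3, 3, and 2 bits of each channel respectively."""
    return ((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6)

def unpack_rgb332(p):
    """Expand a 3-3-2 byte back to rough 8-bit values by shifting each
    field into the high bits of its channel."""
    return (((p >> 5) & 0x7) << 5, ((p >> 2) & 0x7) << 5, (p & 0x3) << 6)

print(hex(pack_rgb332(255, 255, 255)))  # 0xff
print(unpack_rgb332(0xE0))              # (224, 0, 0), a bright red
```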
12-bit direct color
In 12-bit direct color, there are 4 bits (16 possible levels) for each of the R, G, and B components, enabling 4,096 (16 × 16 × 16) different colors. This color depth is sometimes used in mobile devices with a color display, such as mobile telephones and other equipment.
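A 4-4-4 pixel packs cleanly into three hex digits, one per channel; a minimal sketch (function name is illustrative):

```python
def pack_rgb444(r, g, b):
    """Pack R, G, B values in 0..15 into one 12-bit value, 4 bits each."""
    return (r << 8) | (g << 4) | b

# Each channel is one hex digit, so the packed value reads as R-G-B directly.
print(hex(pack_rgb444(15, 0, 8)))    # 0xf08
print(pack_rgb444(15, 15, 15) + 1)   # 4096 distinct colors
```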
High color, or HiColor, is considered sufficient to provide life-like colors, and is encoded using either 15 or 16 bits:
- 15-bit color uses 5 bits each to represent red, green, and blue. Since 2⁵ = 32, there are 32 levels of each color, which can therefore be combined to give a total of 32,768 (32 × 32 × 32) mixed colors.
- Much 16-bit color uses 5 bits to represent red and 5 bits to represent blue, but (since the human eye is more sensitive to green) uses 6 bits to represent 64 levels of green, sometimes known as the 565 format. These can therefore be combined to give 65,536 (32 × 64 × 32) mixed colors. Sixteen-bit color is referred to as "thousands of colors" on Macintosh systems. Some formats instead use 5 bits for each of the colors and the last bit for a 1-bit alpha value. There is another format that uses 4 bits for each color and alpha, known as the 4444 format.
- Some cheaper LCD displays use dithered 18-bit color (64 × 64 × 64 = 262,144 combinations) to achieve faster transition times, without sacrificing truecolor display levels entirely.
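The 5:6:5 layout described above can be sketched the same way as the other direct color formats (a hypothetical helper that truncates 8-bit inputs):

```python
def pack_rgb565(r, g, b):
    """Pack 8-bit R, G, B into 16 bits: 5 bits red, 6 bits green,
    5 bits blue, keeping the top bits of each channel."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_rgb565(255, 255, 255)))  # 0xffff
print(hex(pack_rgb565(0, 255, 0)))      # 0x7e0: green gets the extra bit
```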
Truecolor can mimic far more of the colors found in the real world, producing over 16.7 million distinct colors. This approaches the level at which megapixel monitors can display distinct colors for most photographic images, though image manipulation, monochromatic images (which are restricted to 256 levels, owing to their single channel), large images or “pure” generated images reveal banding and dithering artifacts.
- 24-bit truecolor uses 8 bits each to represent red, green, and blue. 2⁸ = 256 levels of each of these three colors can therefore be combined to give a total of 16,777,216 mixed colors (256 × 256 × 256). Twenty-four-bit color is referred to as "millions of colors" on Macintosh systems.
Video cards with 10 bits per color channel, or 30-bit color, started coming onto the market in the late 1990s. An early example was the Radius ThunderPower card for the Macintosh, which included QuickDraw extensions and plugins to support editing 30-bit images.
"32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term “32-bit color” is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (I.E.: alpha
data), or sometimes even to plain 24-bit data.
Systems using more than 24 bits in a 32-bit pixel for actual color data exist, but most of them opt for a 30-bit implementation with two bits of padding so that they can have an even 10 bits of color for each channel, similar to many HiColor systems.
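A 10-10-10-2 layout of the kind described above might be packed like this (a sketch; real APIs differ in where they put the two padding bits):

```python
def pack_rgb30(r, g, b):
    """Pack R, G, B values in 0..1023 (10 bits each) into the low 30 bits
    of a 32-bit word, leaving the top 2 bits as padding."""
    return (r << 20) | (g << 10) | b

print(hex(pack_rgb30(1023, 1023, 1023)))  # 0x3fffffff, fits in 30 bits
```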
While some high-end graphics workstation systems, and the accessories marketed for use with them, such as those from SGI, have always used more than 8 bits per channel, such as 12 or 16 (36-bit or 48-bit color), such color depths have only recently worked their way into the general market.
While practically every consumer brand of printer, scanner, and digital camera on the market since the late 1990s offers 10-, 12-, or 16-bit DACs/ADCs, the ability to edit or view these colors onscreen has not seeped into modern PC systems. On the software front, working with high color depths in native RAW files is very difficult under Mac OS X, Microsoft Windows, and most PC-based UNIX-likes, and many programs such as Photoshop or Final Cut are generally incapable of performing all operations cleanly at higher bit depths. On the hardware front, no consumer video adapter chipset has DACs better than 8-bit, and the integrated DACs in newer digital (LCD, PDP, etc.) monitors are often only 6-bit or worse.
As bit depths climb above 24, some systems use the extra room to store data nonlinearly, most commonly by storing more data than can be displayed at once, as in extended dynamic range imaging, including high dynamic range imaging (HDRI). Floating-point numbers are used to describe values in excess of 'full' white and black, allowing an image to accurately describe the intensity of the sun and deep shadows in the same color space, with less distortion after intensive editing. Various models describe these ranges, many employing 32-bit accuracy per channel. A newer format is ILM's "half", which uses 16-bit floating-point numbers; this appears to be a much better use of 16 bits than 16-bit integers, and is likely to replace integer formats entirely as hardware becomes fast enough to support it.
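Python's struct module supports the same IEEE 754 half-precision layout as ILM's 16-bit "half" (format code 'e'), which makes the key property easy to demonstrate: values well beyond 'full' white still round-trip through just 2 bytes per channel.

```python
import struct

# Encode an HDR sample 8x brighter than "full" white into half precision.
packed = struct.pack('<e', 8.0)
value = struct.unpack('<e', packed)[0]

print(len(packed))  # 2 bytes per channel
print(value)        # 8.0 survives intact (exactly representable in half)
```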
The ATI FireGL V7350 graphics card supports 40-bit and 64-bit color.
Most of today's TVs and computer screens form images by varying the intensity of just three primary colors: red, green, and blue. Bright yellow, for example, is composed of equal parts red and green, with no blue component. However, this is only an approximation, and is not as saturated as actual yellow light. For this reason, recent technologies, such as those from Texas Instruments, augment the typical red, green, and blue channels with up to three others: cyan, magenta, and yellow. Mitsubishi, among others, uses this technology in some TV sets. Assuming that 8 bits are used per color, such six-color images would have a color depth of 48 bits.