Pixel Color and Intensity, Color Depth and the Color Palette
Each pixel of the screen image is displayed on a monitor using a combination of three different color signals: red, green and blue. This is similar (but by no means identical) to how images are displayed on a television set. Each pixel's appearance is controlled by the intensity of these three beams of light. When all three are at maximum intensity the result is white; when all three are at zero the pixel is black; intermediate combinations produce every other color.
The amount of information that is stored about a pixel determines its color depth, which controls how precisely the pixel's color can be specified. This is also sometimes called the bit depth, because the precision of color depth is specified in bits. The more bits that are used per pixel, the finer the color detail of the image. However, increased color depths also require significantly more memory for storage of the image, and also more data for the video card to process, which reduces the possible maximum refresh rate.
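To make the memory cost concrete, here is a small sketch of the frame-size arithmetic described above. The resolution used (800x600) is just an illustrative example, not a figure from this guide:

```python
# Sketch: video memory needed to hold one frame at various color depths.
# The 800x600 resolution is an assumed example for illustration.

def frame_bytes(width, height, bits_per_pixel):
    """Return the number of bytes one frame occupies in video memory."""
    return width * height * bits_per_pixel // 8

for bpp in (8, 16, 24):
    kib = frame_bytes(800, 600, bpp) / 1024
    print(f"{bpp:2d} bits per pixel: {kib:7.1f} KiB per frame")
```

Doubling the color depth doubles the memory the frame consumes, which is why a card with limited video memory may support true color only at lower resolutions.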
This table shows the color depths used in PCs today:

Color Mode     Bits Per Pixel     Number of Colors
256-Color      8                  256
High Color     16                 65,536
True Color     24                 16,777,216
True color is given that name because three bytes of information are used, one for each of the red, green and blue signals that make up each pixel. Since a byte has 256 different values, each color can have 256 different intensities, allowing over 16 million different color possibilities. This allows for a very realistic representation of images, with no compromises necessary and no restrictions on the number of colors an image can contain. In fact, 16 million colors is more than the human eye can discern. True color is a necessity for those doing high-quality photo editing, graphical design, etc.
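The three-bytes-per-pixel layout can be sketched as simple bit packing, with one byte each for red, green and blue (the byte ordering here is one common convention, not something this guide specifies):

```python
# Sketch: packing and unpacking a true-color (24-bit) pixel,
# one byte each for red, green and blue.

def pack_rgb24(r, g, b):
    """Combine three 0-255 intensities into a single 24-bit value."""
    return (r << 16) | (g << 8) | b

def unpack_rgb24(pixel):
    """Split a 24-bit value back into (r, g, b) intensities."""
    return (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

white = pack_rgb24(255, 255, 255)
print(hex(white))               # 0xffffff
print(unpack_rgb24(0x336699))   # (51, 102, 153)
```

Because each channel keeps its full 8 bits, the round trip is lossless, unlike the high-color modes described next.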
Note: Some video cards actually have to use 32 bits of memory for each pixel when operating in true color, due to how they use the video memory. See here for more details on this.
High color uses two bytes of information to store the intensity values for the three colors. This is done by breaking the 16 bits into 5 bits for blue, 5 bits for red and 6 bits for green. This means 32 different intensities for blue, 32 for red, and 64 for green. This reduced precision causes a loss of visible image quality, but the loss is very slight--many people cannot see the difference between true color and high color images unless they are looking for it. For this reason high color is often used instead of true color: it requires 33% (or 50% in some cases) less video memory, and it is also faster for the same reason.
Note: Some video modes use a slight variation on high color, where only 15 bits are used. This means 5 bits for each color. The difference is not noticeable at all.
In 256-color mode the PC has only 8 bits to use; this would mean something like 2 bits for blue and 3 for each of green and red. Choosing between only 4 or 8 different values for each color would result in rather hideously blocky color, so a different approach is taken instead: the use of a palette. A palette is created containing 256 different colors. Each one is defined using the standard 3-byte color definition that is used in true color: 256 possible intensities for each of red, blue and green. Then, each pixel is allowed to choose one of the 256 colors in the palette, which can be considered a "color number" of sorts. So the full range of color can be used in each image, but each image can only use 256 of the available 16 million different colors. When each pixel is displayed, the video card looks up the real red, green and blue values in the palette based on the "color number" the pixel is assigned.
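The lookup the video card performs can be sketched as follows; the palette entries and image contents here are made-up examples:

```python
# Sketch: a 256-color palette lookup. The palette maps each 8-bit
# "color number" to a full-precision (r, g, b) definition; the image
# itself stores only the small numbers. Colors here are invented examples.

palette = [(0, 0, 0)] * 256
palette[0] = (135, 206, 235)   # a sky blue
palette[1] = (255, 255, 255)   # white
palette[2] = (128, 128, 128)   # gray

# The image is just one byte (a palette index) per pixel.
image = [0, 0, 1, 2, 1, 0]

# On display, each index is replaced by its full-color palette entry.
displayed = [palette[index] for index in image]
print(displayed[:3])   # [(135, 206, 235), (135, 206, 235), (255, 255, 255)]
```

The image data stays small (one byte per pixel), while the palette carries the full 3-byte color definitions.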
The palette is an excellent compromise: it allows only 8 bits to be used to specify each color in an image, but allows the creator of the image to decide what the 256 colors in the image should be. Since virtually no images contain an even distribution of colors, this allows for more precision in an image by using more colors than would be possible by assigning each pixel a 2-bit value for blue and 3-bit values each for green and red. For example, an image of the sky with clouds (like the Windows 95 standard background) would have many different shades of blue, white and gray, and virtually no reds, greens, yellows and the like.
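One simple way to map a full-color image onto a hand-picked palette is to assign each pixel the nearest palette entry. This is a minimal sketch with an assumed toy palette; real conversion tools select the palette from the image's own color distribution and often apply dithering as well:

```python
# Sketch: mapping a true-color pixel to the nearest entry of a small
# palette by squared distance in RGB space. The palette is an invented
# example; real converters build it from the image and may dither.

def nearest_index(pixel, palette):
    """Return the index of the palette color closest to `pixel`."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(palette)), key=lambda i: dist2(palette[i], pixel))

palette = [(0, 0, 0), (255, 255, 255), (135, 206, 235), (128, 128, 128)]
print(nearest_index((140, 200, 230), palette))   # 2 -> the sky-blue entry
```

For a sky image whose palette is dominated by blues, whites and grays, most pixels land on a very close match, which is why a well-chosen palette looks far better than a fixed 8-bit color assignment.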
256-color is the standard for much of computing, mainly because the higher-precision color modes require more resources (especially video memory) and aren't supported by many PCs. Despite the ability to "hand pick" the 256 colors, this mode produces noticeably worse image quality than high color: most people can tell the difference between high color and 256-color mode.