Sunday, January 01, 2006
Color Depth
If you're running Windows, you probably run your desktop at a color depth of 32 bits. That 32-bit designation is a bit misleading: only 24 bits are actually used for color. The OS may still allocate a full 32 bits per pixel, leaving 8 bits unused, because 4 bytes per pixel keeps each pixel word-aligned, which can help memory performance. At my resolution of 1280x800, though, that's nearly a full megabyte of wasted memory.
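The wasted-byte arithmetic works out like this (a quick sketch; the resolution is mine from above):

```python
width, height = 1280, 800

# At 32 bpp, one byte per pixel carries no color information.
wasted_bytes = width * height * 1
wasted_mib = wasted_bytes / 2**20

print(wasted_bytes)        # 1024000
print(round(wasted_mib, 2))  # ~0.98 MiB, i.e. "nearly a full megabyte"
```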
In 24-bit and the so-called 32-bit modes, one byte is designated for each of the three primary colors - red, green, and blue. That gives 256 possible values for each channel (0-255), and every color your computer displays is some combination of those three. This is simple stuff - elementary school art class primary colors plus a little elementary math - except the actual primaries are the additive ones for light, not the subtractive ones we learned in art class, and the reason why is a topic for another post that I'm probably not going to write. Do a Google search if you really care to know.
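One byte per channel means a pixel is just three bytes packed together. Here's a minimal sketch of the common 0xRRGGBB convention (the actual byte order in a framebuffer varies by platform and driver, so treat the layout here as an illustration, not a spec):

```python
def pack_rgb(r, g, b):
    # Each channel must fit in one byte (0-255).
    assert all(0 <= c <= 255 for c in (r, g, b))
    return (r << 16) | (g << 8) | b

def unpack_rgb(pixel):
    # Reverse the packing: pull each byte back out.
    return (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

white = pack_rgb(255, 255, 255)   # 0xFFFFFF - all channels maxed
print(hex(white))
print(unpack_rgb(0x123456))       # (0x12, 0x34, 0x56)
```

Three channels of 256 values each gives 256**3, about 16.7 million colors - and that total is identical whether the OS calls the mode 24-bit or 32-bit.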
The point is that there is effectively no difference between 24-bit and 32-bit color modes.
But my (partially) automatically generated xorg.conf file had an entry for 32-bit color mode in the server section, which is rather uncommon - 24-bit is usually the max. Knowing there's no real difference, I changed the config to default to 32-bit mode instead of 24. That change caused an unexpected problem, which you can read about in my Gentoo Tweaking post (written before this one, though you'll probably read this one first). So I switched the config back to 24-bit mode and everything works just fine.
You can stop reading now, unless you still doubt that 32-bit and 24-bit color modes are the same.
There really is no difference: both designate one byte (8 bits) for each of the primary colors (8x3=24). I think Windows just calls both modes 32-bit and makes no distinction - or just wastes the 1 MB or so of unused bytes.
Now, 16-bit mode is a bit more interesting. You still have 3 primary colors - no simple way around that - but 3 does not divide evenly into 16. Some cards/drivers handle this by using 5 bits for red, 6 bits for green, and 5 bits for blue (green gets the extra bit because the eye is most sensitive to it). Others use 5 bits for each channel with 1 bit for padding.
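The 5-6-5 layout can be sketched by taking the top 5, 6, and 5 bits of each 8-bit channel and packing them into one 16-bit word (again just an illustration of the bit layout, not any particular driver's implementation):

```python
def pack_rgb565(r, g, b):
    # Keep the high 5 bits of red, 6 of green, 5 of blue,
    # then shift them into a single 16-bit value: RRRRRGGGGGGBBBBB.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_rgb565(255, 255, 255)))  # 0xffff - all 16 bits set
print(hex(pack_rgb565(255, 0, 0)))      # 0xf800 - red in the top 5 bits
```

Note what this costs: 5 bits per channel means only 32 (or 64 for green) distinct levels instead of 256, which is why 16-bit mode shows visible banding in smooth gradients.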
So, I just use 24 bit mode - it works great.