Tech Talk: 8-bit vs. 10-bit Color – Do we need a Billion colors?

(Header image: 4K-Malas-image-2)

Most people who don’t keep up with display technology have never really heard of, or thought much about, different levels of color depth. However, you may have noticed that some TV and monitor manufacturers have been throwing around the term “HDR” or “High Dynamic Range” and touting it as producing better, more accurate color. So, what does this really mean, and is it something you should factor in when considering a new TV or display?

In the past, if you noticed this at all, you may have seen color presented in Windows as something like “High Color” or “True Color.” True Color may confusingly have been referred to as 32-bit, which is really three 8-bit values for Red, Green, and Blue plus another 8 bits for the alpha (transparency) channel. By this naming system, we now have something known as “Deep Color” that has begun appearing on newer displays, most likely touting itself as 36-bit. But that naming gets confusing when you look at most technical specs these days, which count the bits per color channel rather than per pixel.

Current color values for the vast majority of TVs and monitors out there would be reported as 8-bit, which means that each Red, Green, and Blue sub-pixel can produce 256 separate shades (2 to the 8th power). This equates to 16.77 million distinct colors that these screens can display, which really does sound like an awful lot of separate colors, doesn’t it? Yet, with the emergence of 4K displays, and especially newer-generation 4K HDR displays, manufacturers have begun taking color presentation to the next level, referred to as 10-bit color. Do those two extra bits matter? Well, if you think about it in terms of 2 to the 10th power, you’ll quickly see that each R, G, and B sub-pixel can now generate 1,024 separate shades, resulting in a total of 1.07 billion possible distinct colors!
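
If you want to double-check that math, here’s a minimal Python sketch (just the arithmetic above, not tied to any particular display hardware) that works out the shades per sub-pixel and the total color count for a given bit depth:

    # Shades per sub-pixel and total displayable colors for a given bit depth.
    def color_stats(bits_per_channel):
        shades = 2 ** bits_per_channel   # e.g. 2^8 = 256, 2^10 = 1,024
        total = shades ** 3              # one independent value each for Red, Green, and Blue
        return shades, total

    for bits in (8, 10):
        shades, total = color_stats(bits)
        print(f"{bits}-bit: {shades:,} shades per channel, {total:,} total colors")

    # 8-bit:  256 shades per channel, 16,777,216 total colors (~16.77 million)
    # 10-bit: 1,024 shades per channel, 1,073,741,824 total colors (~1.07 billion)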

Now, a lot of people would likely say they’re quite satisfied with how their computer screen or TV looks, and that having nearly 17 million colors to display sounds like plenty. So what is the value of all these extra color shades, and why might you want to consider a screen that supports 10-bit (or, eventually, even higher) color when it comes time to upgrade?

(8-bit image: Dice_8bit)

(10-bit image: Dice_10bit)

Since this may still seem a bit abstract, a graphical comparison is a good way to illustrate it. Look at the two images above: the first uses an 8-bit color range while the second is 10-bit. While both look vibrant, the 10-bit one shows far less color “banding,” which is especially noticeable in the shading on the white table. In the 8-bit picture you can see distinct lines where the color jumps from one shade to the next, while in the 10-bit one the transition is much smoother because there is more color data to work with. You can see the same effect in the picture at the very beginning of this article, in how the sunlight is rendered.
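
If you’d like a rough feel for why those lines appear, here’s a small Python sketch using NumPy (a deliberately simplified model that ignores gamma, dithering, and everything else a real display pipeline does) that quantizes one smooth gradient to 8-bit and 10-bit shades and counts how many distinct steps are left:

    import numpy as np

    # A perfectly smooth brightness ramp, one sample per pixel across a 4K-wide screen.
    gradient = np.linspace(0.0, 1.0, 3840)

    for bits in (8, 10):
        levels = 2 ** bits - 1
        # Snap each sample to the nearest shade the display can actually show.
        quantized = np.round(gradient * levels) / levels
        steps = len(np.unique(quantized))
        print(f"{bits}-bit ramp: {steps} distinct shades across the screen")

    # 8-bit ramp:  256 distinct shades
    # 10-bit ramp: 1024 distinct shades

Each of those steps has to cover a stripe of pixels, and the fewer and coarser the steps, the wider and more obvious the stripes, which is exactly the effect you see on the white table in the 8-bit shot.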

One thing to consider, though, is that not everyone will “need” this much color data in a display. If the source content is standard 8-bit color, there’s only so much a display can do in terms of color scaling. Some displays might attempt to fill in the blanks where colors are “crushed” or less vibrant, while others will just show the image as-is. Most movies and TV broadcasts are still delivered in 8-bit color today, but that is in the process of changing, and you’ll start to see more content (especially film, but also YouTube videos and the like) that supports HDR and looks noticeably more vibrant, crisp, and realistic.

In the meantime, though, 10-bit color certainly has a use in computer displays. Photographs, for example, can have very deep color depths (12 or even 16 bits per channel) depending on the type of camera sensor used, so having a screen that can more accurately show what the sensor captured, without approximating anything, makes life much easier for professionals who need to edit those pictures. The same will likely be true for editing footage shot with an HDR-capable camera. For computer games, you’ll be able to render scenes with a bit more vibrancy and accuracy, and as game assets continue to develop, more and more textures will take advantage of the extra colors you can put on the screen.
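
As a purely hypothetical illustration of why that matters for photo editing, here’s a short Python sketch that round-trips a single 16-bit sensor value through an 8-bit and a 10-bit display and measures how far the nearest displayable shade ends up from the original (the sample value is arbitrary):

    # Round-trip a 16-bit tone value through a lower-bit-depth display and
    # report how far the nearest displayable shade is from the original.
    def display_error(sample_16bit, display_bits):
        scale = (2 ** display_bits - 1) / 65535  # 16-bit values run from 0 to 65,535
        shown = round(sample_16bit * scale)      # nearest shade the display can produce
        recovered = shown / scale                # map that shade back into the 16-bit range
        return abs(sample_16bit - recovered)

    sample = 32100                               # an arbitrary mid-tone from a 16-bit photo
    for bits in (8, 10):
        print(f"{bits}-bit display: off by about {display_error(sample, bits):.0f} of 65,535 steps")

    # Worst case, an 8-bit display lands about 128 of those 16-bit steps away from
    # the true value, while a 10-bit display lands within about 32 of them.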

Right now, you’ll mainly see 10-bit color in the TV market on 4K UHD sets. There are a few exceptions, such as some reference panels from TCL (the company behind Roku TVs) as well as Sceptre, but most mainstream 1080p TVs have standard 8-bit color. In fact, early-generation 4K TVs also mainly have 8-bit panels, so if you want 10-bit color in your 4K TV, you’ll want to look at newer models.

In the computer display market, a few 1080p monitors support this higher color depth, but you mainly start to see it in 2560×1440 (QHD) monitors and above, especially models from the past couple of years. Naturally, some “budget” 1440p or 4K monitors may only support the standard 16.77 million colors, so if you’re watching your wallet you may have to decide whether resolution or color depth is more important to you.

This is really only a very basic introduction to the topic, but it should be enough to point you in the right direction. In upcoming Tech Talk pieces, I plan to cover high refresh rates (such as 120 Hz) and 4K, and break down whether those are also features you need to look out for when making your next display upgrade!

Jessica Brown

Retro Games and Technology Editor. She'll beat pretty much every Mega Man game without breaking a sweat.