16-bit colour gives us 65,536 colours; 24-bit colour gives us the millions mentioned above.
Also, raw footage is described as 10-bit, not 30-bit.
So is it called 24-bit or 8-bit? I feel like most monitors have 8-bit color and the fancy ones have 10-bit, not 24 and 30.
It’s just the sum. Monitors have 8 bits per color channel, making 24 bits per pixel, which gives the millions mentioned. 16-bit is typically 5 bits per channel with an extra bit given to green (5-6-5). But this has downsides, as explained in the article, when going from a higher bit depth to a lower one.
HDR is 10 bits per color channel, and more for extreme uses. So it’s sort of true that they are 24- or 30-bit, but that usually isn’t how they are described; people normally talk about the bit depth of the individual color channels.
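A quick sketch of the arithmetic, assuming the usual channel layouts (5-6-5 for 16-bit, 8 bits × 3 for 24-bit, 10 bits × 3 for 30-bit):

```python
# Count the distinct colors a pixel format can represent.
# Total colors = product of 2^bits over the R, G, B channels.

def color_count(bits_per_channel: tuple[int, int, int]) -> int:
    total = 1
    for bits in bits_per_channel:
        total *= 2 ** bits
    return total

print(color_count((5, 6, 5)))     # 16-bit "High Color": 65,536
print(color_count((8, 8, 8)))     # 24-bit "True Color": 16,777,216 (the "millions")
print(color_count((10, 10, 10)))  # 30-bit (10 bits per channel): 1,073,741,824
```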