10-bit or 8-bit?!


col.campbell 15 1.3k 4 United Kingdom
3 Feb 2017 4:45PM
Hello again everyone.

I was canvassing for opinion a short while ago on computer specs; I haven't bought yet but I'm homing in on a final spec which I'll put together myself (well, with some help, actually). Looking forward to getting in amongst that.

In the meantime I'm racking my brain on the question of colour depth. I bought a monitor believing it to be 10-bit capable. It's the BenQ GW2765HT - thanks for the recommendation Chris_L. But as far as I can gather it's actually an 8-bit panel with dithering/frame rate control (FRC) to 'interpolate' colours (for want of a better expression).
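To put rough numbers on what's at stake (a quick sketch, assuming standard RGB with one value per channel):

```python
# Shades per channel and total RGB combinations at each bit depth.
for bits in (8, 10):
    levels = 2 ** bits       # distinct shades per colour channel
    colours = levels ** 3    # R x G x B combinations
    print(f"{bits}-bit: {levels:,} levels/channel, {colours:,} colours")

# 8-bit:  256 levels/channel,   16,777,216 colours
# 10-bit: 1,024 levels/channel, 1,073,741,824 colours
```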

I'm currently trying to choose a graphics card. Would this type of monitor benefit from a 10-bit signal? I might go for a fairly low-end Quadro model. Or would it only read an 8-bit signal and carry on dithering, in which case I'd perhaps be better off looking at a general-purpose 8-bit card?

I've tried contacting BenQ but so far, no joy. Anyone clued up on these matters?


Fma7 4 914 United Kingdom
3 Feb 2017 5:25PM
I suppose with an 8-bit signal it would not have to dither, whereas with 10-bit you would get the dithered colours.
col.campbell 15 1.3k 4 United Kingdom
3 Feb 2017 5:36PM
Sorry if I haven't explained it well. 8-bit with dithering is what I'm hoping to improve upon with 10-bit - if it's a noticeable improvement.
ttiger8 3 159 United Kingdom
4 Feb 2017 3:18PM
Personally I would go for the 10-bit card. If you later decide to get a monitor that can natively cope with 10 bits then you won't have to upgrade the card.
banehawi Plus
14 2.0k 4001 Canada
4 Feb 2017 3:26PM
That 10-bit would be so you could see something slightly better (if at all) on your screen. Everything you post on the web will be seen in 8-bit. A waste as far as I'm concerned if you mainly post on the web.
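To illustrate, a minimal NumPy sketch (the 16-bit gradient here is hypothetical, standing in for a high-bit-depth working file being knocked down to 8-bit for the web):

```python
import numpy as np

# Hypothetical 16-bit working image: a smooth gradient.
grad16 = np.linspace(0, 65535, 4096).astype(np.uint16)

# Web export: keep only the high byte of each value.
grad8 = (grad16 >> 8).astype(np.uint8)

print(np.unique(grad16).size)  # 4096 distinct tones in the working file
print(np.unique(grad8).size)   # 256 distinct tones once it's on the web
```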
Chris_L 5 5.0k United Kingdom
5 Feb 2017 7:40AM
It really is a non-issue unless you had to match a particular shade exactly - for example, a corporate colour for a brochure in a printing job. Even then, the vast majority of Mac and PC plus monitor combos used in the industry don't bother with it. Software support is scant too.

This thread should put your mind at rest.

By the way, it's worth pointing out that to get the best out of that monitor you may be better off using a DisplayPort or DVI cable rather than HDMI, unless it's a decent HDMI cable that can comfortably go higher than Full HD 1080p. With some of the cheap old cables from my box of cables I couldn't get it to reach full resolution - 2560 x 1440.
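A back-of-the-envelope check, using nominal link rates rather than anything measured (blanking intervals are ignored, so real requirements run a little higher):

```python
# Uncompressed video data rate: pixels/frame x frames/s x bits/pixel.
def gbit_per_s(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(f"1920x1080 @ 60 Hz: {gbit_per_s(1920, 1080, 60):.2f} Gbit/s")
print(f"2560x1440 @ 60 Hz: {gbit_per_s(2560, 1440, 60):.2f} Gbit/s")

# Approximate published video bandwidths for comparison:
#   older HDMI (1.2-era)  ~3.96 Gbit/s -> 1080p60 only
#   HDMI 1.3/1.4          ~8.16 Gbit/s -> 1440p60 fits
#   DisplayPort 1.2      ~17.28 Gbit/s -> 1440p60 with headroom
```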
col.campbell 15 1.3k 4 United Kingdom
5 Feb 2017 11:43AM
Yep, I had clocked that. On paper at least, the HDMI should offer full resolution (with the motherboard I was looking at), but at a much lower frame rate.

My thinking now is that the display is 8-bit, with built-in processing giving pseudo-10-bit colour. However nice the displayed image is, it's not derived from a 10-bit signal. In that respect the Quadro card would be redundant.
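A toy simulation of what that FRC trick amounts to (the figures are purely illustrative):

```python
import numpy as np

# A 10-bit level with no exact 8-bit equivalent: 513/4 = 128.25,
# i.e. it sits between 8-bit levels 128 and 129.
target_10bit = 513
print(target_10bit / 4)   # 128.25

# FRC alternates the two nearest 8-bit levels over successive frames
# so the eye averages them: here 128 on three frames, 129 on one.
frames = np.array([128, 128, 128, 129])
print(frames.mean())      # 128.25 -- the in-between shade, by flicker
```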

I have no real aspiration to change monitor any time soon, so I don't think there's much value in choosing the Quadro for 'later'.

So, I think both would serve Photoshop perfectly adequately; the choice boils down to which would be more versatile. I'm not a big 'gamer' (don't even like the word!) although I'd maybe dabble. I'll probably get more amusement from 3D rendering, which I suspect would be better served by the Quadro.
col.campbell 15 1.3k 4 United Kingdom
5 Feb 2017 4:32PM
Hah. That was until I discovered this. I realise the website is probably hugely biased towards game-playing, but there isn't a category, including rendering, in which the 960 doesn't romp home. Am I missing something and misinterpreting the results of this test, or (10-bit colour aside) is the GTX 960 far superior?
col.campbell 15 1.3k 4 United Kingdom
5 Feb 2017 5:38PM
(And then I saw that the 1060 offers a considerable increase in performance for a modest price increase... if the GTXs are a viable option.)
Chris_L 5 5.0k United Kingdom
5 Feb 2017 10:51PM
I've had Quadros in the past and find now that regular GeForces with plenty of CUDA cores serve my needs for After Effects, Cinema 4D, Ps and Premiere perfectly.
col.campbell 15 1.3k 4 United Kingdom
6 Feb 2017 8:16AM
Thanks Chris, I think that's the decision made... finally!
