The GeForce GTX 980/970 are great graphics cards, and one of their coolest features is support for HDMI 2.0. Why is HDMI 2.0 cool? Because it lets you drive an insane 3840×2160 resolution at 60Hz on a 4K TV. Working on a 49-inch monitor at 4K / 60Hz is a crazy experience!
To display a 4K resolution at 60Hz, the video interface (HDMI, DisplayPort) needs to transfer 12 Gbit/s of data:
– bits_per_channel = 8
– num_channels = 3 (RGB)
– resolution = 3840 × 2160
– refresh rate = 60Hz (60 frames per second)
– bandwidth = 3840 × 2160 × 8 × 3 × 60
– bandwidth 4K@60Hz = 11.94 Gbit/s
– bandwidth 4K@30Hz = 5.97 Gbit/s
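The arithmetic above can be sketched in a few lines of Python (a minimal sketch; the helper name is mine):

```python
# Uncompressed video bandwidth = width * height * bits_per_channel * channels * refresh rate
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    bits_per_second = width * height * bits_per_channel * channels * refresh_hz
    return bits_per_second / 1e9  # convert to Gbit/s

print(round(video_bandwidth_gbps(3840, 2160, 60), 2))  # 11.94
print(round(video_bandwidth_gbps(3840, 2160, 30), 2))  # 5.97
```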
Here is a quick recap of HDMI and DisplayPort (DP) bandwidths:
- HDMI 1.2: up to 4.95 Gbit/s
- HDMI 1.4: up to 10.2 Gbit/s
- HDMI 2.0: up to 18.0 Gbit/s
- DisplayPort 1.2: 17.28 Gbit/s
- DisplayPort 1.3: 32.4 Gbit/s
As you can see, 4K @ 60Hz requires 12 Gbit/s, so only HDMI 2.0 or DisplayPort 1.2 (and higher) can offer such bandwidth. In 2014 and 2015, all new graphics cards should support HDMI 2.0 and DP 1.2, and all new TV screens should have HDMI 2.0 inputs. Otherwise it’s a… rip-off!
I recently received this GTX 970 from MSI, and one of the first tests I did (after installing the latest R344.75 driver) was to connect it to the LG 49UB850V. It worked like a charm:
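A quick way to check which interface from the recap above can carry a given mode (a sketch; the dictionary simply encodes the bandwidth figures listed above, and the helper name is mine):

```python
# Maximum bandwidth per interface, in Gbit/s (figures from the recap above)
INTERFACES = {
    "HDMI 1.2": 4.95,
    "HDMI 1.4": 10.2,
    "HDMI 2.0": 18.0,
    "DisplayPort 1.2": 17.28,
    "DisplayPort 1.3": 32.4,
}

def interfaces_for(required_gbps):
    """Return the interfaces whose bandwidth covers the required data rate."""
    return [name for name, gbps in INTERFACES.items() if gbps >= required_gbps]

# 4K @ 60Hz needs ~11.94 Gbit/s:
print(interfaces_for(11.94))  # ['HDMI 2.0', 'DisplayPort 1.2', 'DisplayPort 1.3']
```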
The NVIDIA control panel has two choices: 4k @ 30Hz (Ultra HD section) and 4k @ 60Hz (PC section):
LG 49UB850V HDMI inputs:
One thing remains a bit mysterious to me: it’s not clear whether it is 4K @ 60Hz with a 4:4:4 chroma subsampling ratio, or rather 4:2:0 chroma. According to some sources (like this one) on the Net, the LG UB850 supports 4K @ 60Hz 8-bit 4:4:4:
When it comes to native 4K content, the UB8200 supports 4K @ 60Hz, albeit only 8-bit 4:2:0, on all HDMI inputs. The UB8500, on the other hand, supports 4K @ 60Hz feeds with higher color depth (10 and 12-bit) and chroma subsampling ratio (4:2:2), as well as 8-bit 4:4:4, albeit only on the HDMI 3 input (the remaining HDMI inputs support 4K @ 60Hz, 8-bit 4:2:0).
I didn’t find this information in the TV settings. I tested the GTX 970 on all HDMI ports and didn’t notice any difference in image quality between them. However, there is a clear improvement in image quality when I set 4K @ 30Hz in the NVIDIA control panel. That’s why I have some doubts about which chroma subsampling ratio is used at 4K @ 60Hz: the image is very slightly degraded compared to 30Hz…
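Chroma subsampling would explain that difference: with 4:2:0, the two chroma channels are stored at a quarter of the luma resolution, so each pixel averages 12 bits instead of 24, halving the required bandwidth. A rough sketch of the numbers (my own helper, using the average bits-per-pixel of each scheme):

```python
# Average bits per pixel for 8-bit video under common chroma subsampling schemes:
# 4:4:4 -> 8 (Y) + 8 (Cb) + 8 (Cr) = 24 bits/pixel (full chroma)
# 4:2:2 -> 8 + 4 + 4 = 16 bits/pixel (chroma halved horizontally)
# 4:2:0 -> 8 + 2 + 2 = 12 bits/pixel (chroma quartered)
BITS_PER_PIXEL = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}

def bandwidth_gbps(width, height, refresh_hz, subsampling="4:4:4"):
    return width * height * refresh_hz * BITS_PER_PIXEL[subsampling] / 1e9

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(scheme, round(bandwidth_gbps(3840, 2160, 60, scheme), 2))
```

Note that 4K @ 60Hz 4:2:0 needs only about 5.97 Gbit/s, which fits within HDMI 1.4’s 10.2 Gbit/s; that is precisely why a "4K @ 60Hz" mode on its own doesn’t prove the link is running 4:4:4.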
10 thoughts on “(Tested) GeForce GTX 970 and 4k @ 60Hz on HDMI 2.0”
I would be very happy if you could check on other OSes whether it’s working (I mean 4K@60Hz 4:4:4).
I say this because it seems people were having trouble getting it working on Linux and Mac (hackintosh).
Linux: not working with the launch drivers (343.22)… perhaps it works with the 346.16 drivers…
see threads like these:
“Internally tracking this issue under Bug 200043559 : linux : HDMI 2.0:2nd generation maxwell:GTX 970/980: getting no-signal or blank-screen when attempted 4k resolution @60hz ”
It would also be interesting to know if you can get a hackintosh working:
I posted the question here and a guy answered that it’s not working:
I’ll try the GTX 970 + hackintosh asap.
You should activate the “HDMI UHD Deep Colour” setting in the LG 49UB850V (or 49UB8500) menu to get 4:4:4 colour (with 4K 3840×2160 @ 60Hz).
Hi everyone, my LG 55UB950V, connected on port 3 (the one declared HDMI 2.0) to a GTX 770, detects the 4K resolution, but in the NVIDIA control panel, in the Ultra HD TV section, it only gives me 30Hz! And the image quality is nothing short of fantastic. Switching to the PC resolutions brings it up to 60p, but with image degradation. Also, enabling LG’s Deep Colour function changes the screen’s reported native resolution from Full HD to Quad HD without changing anything else, and shortly after the TV tells me the device doesn’t support this technology! Another problem I have is 4K video playback: the videos aren’t smooth and they freeze! I’m asking for help from someone more experienced!
Hi Andrea, it seems the videos aren’t smooth because the computer isn’t fast enough.
The other problem seems normal:
According to the GTX 770 specs… no 60Hz on that screen over HDMI, because you have to use DisplayPort to get above 30Hz:
3840×2160 at 30Hz or 4096×2160 at 24Hz supported over HDMI. 4096×2160 (including 3840×2160) at 60Hz supported over Displayport. Support for 4k tiled MST displays requires 326.19 driver or later.
I’m totally confused because I’ve contacted NVIDIA and they assured me that GeForce cards do NOT support 8-bit. So I’m pleased I’ve found your post!
Can you tell me, is 8-bit only possible because you have it through a TV and not a monitor? I’m looking for a monitor suitable for editing and find most are 8-bit or even 10-bit. Would 8-bit also be possible on the 960?
Let’s be straight. You will need a 4K projector or UHD TV with an HDMI 2.0 input, an AV receiver with HDMI 2.0 inputs/outputs, and a Blu-ray player with an HDMI 2.0 output. And if you want to download UHD torrents and watch them on a 4K projector, guess what, you will need a computer with HDMI 2.0 (when the time comes, that is). In the meantime there will of course also be HDMI 2.1, 2.2, etc. Discussion about current HDMI cables, whether they are compatible or not, is pointless. The chips in today’s consumer electronics are HDMI 1.4 and are obsolete. HDMI 2.0 is a whole new world: new processors that can handle 4K 60p. Today’s HDMI 2.0 software is just scratching the surface, and it’s too soon to buy something that is in the early stages of construction. Unless you want to run games at 4K 30Hz. Who does?
P.S. There is another thing to consider. Will all this money spent on new equipment with HDMI 2.0 (receiver, 4K projector or LCD/OLED TV, Blu-ray player, computer, cameras, PS or Xbox, etc.) be outweighed by the production available on the future market? Not at all. TV shows are bad or worse (excluding some tech shows, geographic and documentary programs), movies are teenager-oriented (the easiest way to plunder money), and the games are, you know, endlessly repeating remakes. Has anyone mentioned UHD 6K or 8K? In 8 to 10 years it will all happen again: new codec, new HDMI standard, brand new better processors, etc. And last but not least: it lets big corporations make lots and lots of money, and turns consumers, once again, into spending zombies.
I have a quick question:
Why can’t I use my LG UB8200 with my GTX 970 at 2K @ 60Hz? 4K @ 60Hz is working, but I’m not sure my graphics card can handle new games in 4K… That’s why I would like to use 2K @ 60Hz, but I just can’t select it in the settings.
Any ideas why?
Simply because there is no separate “2K” option: the real 2K is 1080p.
1920×1080 used to be called Full HD, but they’ve decided to change that and name resolutions by the horizontal pixel count instead of the number of lines.
So 2K = 1920×1080
and 4K = 3840×2160.
This is why you can’t play in 2K: at 1080p you already are in 2K 😛
Has anyone verified 10-bit output from GTX 9xx series cards over HDMI 2.0?
Thanks for your help.