8K resolution - ready for prime time ? worth it ? YES!

so after a lot of research, planning and finally fighting bugs in Nvidia drivers and Sony firmware, the system is running at glorious 8K resolution - and it was 100% worth it.

so i have two systems now - the new system is an 8K 75" screen in the living room and the previous system is a 4K 55" screen in the office / study room. switching from the big system to the small one is like going from desktop to laptop. it’s a BIG difference whether you’re watching YouTube videos, browsing Twitter or playing Fortnite - you feel the difference in pretty much any scenario.

although the new system is upgraded over the old one in every way, you only really FEEL the difference in two things:

1 - the screen
2 - the graphics card

if i simply used a new screen and graphics card on the old system with the old processor, ram etc i would probably get almost exactly the same experience while saving thousands by not having to build a new system.

the reason i built a new system is because my parents have a dying laptop so i’m giving them mine and thus i needed an extra computer. plus i like to have spare computers in case one of them dies.

otherwise you can just use your existing PC, get a new RTX GPU and an 8K screen, and get pretty much the same benefit. you simply won’t have a spare computer in case shit hits the fan.

technical issues i ran into when designing my 8K system:

1 - Quadro cards don’t have HDMI outputs, and a DisplayPort 1.4 to HDMI adapter only supports 8K at 30 Hz, not 60 Hz ( see the ballpark bandwidth math after this list ) …

2 - Sony’s 8K HDMI mode is not compatible with GeForce RTX drivers …
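
by the way, here is the ballpark bandwidth math behind problem 1 ( a rough python sketch assuming 8-bit color and ignoring blanking / protocol overhead, so treat the numbers as approximate ):

```python
# rough uncompressed video bandwidth: pixels per second * bits per pixel
def gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel / 1e9

# DP 1.4 payload tops out around 25.92 Gbps (HBR3 x 4 lanes, no DSC)
print(gbps(7680, 4320, 30, 12))  # 8K30 4:2:0 -> ~11.9 Gbps, fits easily
print(gbps(7680, 4320, 60, 12))  # 8K60 4:2:0 -> ~23.9 Gbps, too tight once overhead is added
print(gbps(7680, 4320, 60, 24))  # 8K60 4:4:4 -> ~47.8 Gbps, needs HDMI 2.1 FRL (48 Gbps)
```

so without DSC support in the adapter, 8K at 60 Hz just doesn’t fit through DP 1.4 and you get capped at 30 Hz …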

the first problem more or less ruled out a Dell workstation for me because those aren’t offered with GeForce cards, which i need for the HDMI output …

the Dell Threadripper workstation does have a “no graphics” option where you supply your own card, but in practice only a blower-style card will fit, and Nvidia banned blower-style GeForce cards, so you can only get a used one like a 3070 or 3080 from eBay to use in that Dell workstation …

at that point you have a $3,000 workstation that is limited to an old graphics card from eBay, which is stupid …

so i had to go the custom build route …

but once i had set it up, to my horror i realized there is a known bug that prevents Sony TVs from properly doing the HDMI handshake with GeForce RTX cards …

after 2 weeks of despair, and almost sending my RTX back to replace it with an AMD Radeon, i finally found a solution on the AVS forum …

here is the solution by “nussbi35” that worked for me:

I own the Z9J and like everyone else, only VRR mode works. The Enhanced mode is also broken. In addition, I have black screen dropouts in VRR mode. This behavior is driving me crazy, especially the black screen dropouts. My PS5 and Xbox Series X work perfectly in 4K@120 without any issues in both TV modes (Enhanced and VRR).

The bandwidth at 4K@120 for the PS5 is limited to 32 Gbps 4:2:0, for the Xbox to 40 Gbps 4:4:4. It could be that the 2021 and 2022 models are not really capable of handling 48 Gbps. This is just a guess. If you read the EDID information with CRU you can see that the TV reports a bandwidth of 48 Gbps. The problems have existed since the ALLM and VRR update.

Long story short: I tried to change the EDID information.

My configuration
-Win11 latest version
-RTX 4090
-Nvidia Driver 527.5
-Ultra Speed Certified HDMI Cable

The following instructions solved all my problems:

  1. download cru-1.5.2
  2. run "reset-all.exe"
  3. run "restart64.exe"
  4. reboot PC (important, otherwise the changes will not be applied)
  5. after reboot run CRU
  6. select the SONY TV in CRU
  7. open the CTA-861 information under Extension Blocks
  8. under HDMI 2.1 Support set Max FRL Rate to "40 Gbps" and disable "Auto low-latency mode"
  9. close everything with OK
  10. run "restart64.exe" again
  11. reboot PC (important, otherwise the changes will not be applied)

After these changes, Enhanced mode works for me without problems. I can switch between the TV modes and change resolutions without any issues. Maybe this tutorial will help someone else.

once i got it to work using the solution above, all the trouble and expense was worth it.
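
side note - if you’re curious what FRL rate your TV’s EDID actually advertises ( before or after the CRU override ), you can parse the raw EDID yourself. here is a minimal python sketch, assuming the standard HF-VSDB layout ( HDMI Forum OUI C4-5D-D8, Max_FRL_Rate in the upper nibble of byte 7 of the block, same as the linux kernel parses it ); "edid.bin" is a hypothetical dump, e.g. exported from CRU:

```python
# minimal EDID walk: find the CTA-861 extension block, then the HDMI Forum
# vendor-specific data block (HF-VSDB), and read its Max_FRL_Rate field
FRL_GBPS = {0: 0, 1: 9, 2: 18, 3: 24, 4: 32, 5: 40, 6: 48}  # 0 = FRL not supported (TMDS only)

edid = open("edid.bin", "rb").read()        # hypothetical raw EDID dump

for ext in range(128, len(edid), 128):      # extension blocks are 128 bytes each
    block = edid[ext:ext + 128]
    if block[0] != 0x02:                    # 0x02 = CTA-861 extension tag
        continue
    i, end = 4, block[2]                    # data blocks sit between byte 4 and the DTD offset
    while i < end:
        tag, length = block[i] >> 5, block[i] & 0x1F
        payload = block[i + 1:i + 1 + length]
        # tag 3 = vendor-specific block; the HDMI Forum OUI C4-5D-D8 is stored LSB-first
        if tag == 3 and length >= 7 and payload[:3] == bytes([0xD8, 0x5D, 0xC4]):
            max_frl = payload[6] >> 4       # upper nibble of HF-VSDB byte 7
            print("Max_FRL_Rate =", max_frl, "->", FRL_GBPS.get(max_frl), "Gbps")
        i += 1 + length
```

on a TV patched per the instructions above you should see 40 Gbps instead of the stock 48 …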

even though i was already getting a big improvement over plain 4K by running a 4K signal to the TV and using the TV’s internal AI upscaling, there are limits to upscaling …

for example upscaling will clean up ( make smoother and sharpen ) the edges of text but it won’t make the text thinner …

with real 8K you don’t just get cleaner text but the font is thinner … more stylish

with 4K upscaled to 8K the font is sharp and smooth enough but it looks like an inflated balloon - not like something carved with a sharp knife …
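
here is a toy illustration of the balloon effect ( just my own sketch, not any TV’s actual scaler - real scalers are much fancier, but they can’t cheat geometry either ):

```python
import numpy as np

stroke_4k = np.array([0, 0, 1, 0, 0])   # a 1-pixel-wide stroke in a 4K scanline

# nearest-neighbour 2x upscale: every source pixel becomes 2 panel pixels,
# so the stroke is now 2 physical pixels wide ( bilinear smears it even wider )
upscaled = np.repeat(stroke_4k, 2)       # [0 0 0 0 1 1 0 0 0 0]

# a native 8K renderer is free to use a true 1-pixel stroke at half the width
native_8k = np.array([0, 0, 0, 0, 1, 0, 0, 0, 0, 0])
```

the upscaler can smooth and sharpen those 2 fat pixels, but it can never turn them back into 1 …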

and on YouTube it seems that YouTube’s own upscaling looks better than Sony’s, so watching high quality 4K video on YouTube over an 8K link from PC to TV … it looks REALLY good … better than using a 4K link from PC to TV and having the TV do the upscaling …

bottom line - DO IT

i mean, don’t throw your old TV out, but if you’re going to build a new system anyway, 100% go with 8K - don’t even think about it

a lot of idiot pundits are saying you don’t need 8K because you can’t see it from your couch watching a movie from 12 feet away, plus no movies are even shot above 4K anyway …

yes that is true …

but if you’re using it as a computer screen and you’re 4 feet away from the screen it makes ALL the difference in the world … and even makes a difference while watching 4K video …

let alone while watching real 8K content like READING TEXT or playing video games, assuming you have a beefy graphics card …

i’m sitting 4 feet away from a 75" screen at 8K and my scaling in Windows 11 Pro is set to 400%
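
( for the math-inclined: 7680 / 4 = 1920 and 4320 / 4 = 1080, so 400% scaling gives you the layout of a regular 1080p desktop where every logical pixel is drawn with a 4x4 block of physical pixels )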

everything looks GREAT

i have a separate thread on how i would have built my system differently if i was doing it again here:

8K is nice, but there really isn’t content in 8K and probably won’t be for some time. All you can do is game in 8K. The problem is that games are designed for 4K and don’t really look any better in 8K because of texture resolution and whatnot. As a matter of fact, games barely look any better on a $5000 gaming PC than they do on a $500 PS5. Developers don’t care about the tiny percentage of the market with expensive gaming PCs.

Movies are shot in 4K or edited in 4K if they aren’t. CGI is rendered in 4K. This means that most movies can’t get 8K rereleases even if the demand were there.

HDMI cables are also increasingly unreliable at that bit rate. Even cables that are supposed to work fail quite frequently.

8K is going to be harder to get off the ground than 4K for sure.

what you probably failed to consider is pixel structure.

here is an actual picture of subpixel domains in operation:

[image: macro photo of a TV panel’s subpixel structure]

there is a variety of reasons why this is done, from shadow detail to viewing angles, but the net result is that pixels look nothing like uniform squares of varying brightness and color

if they did then 4K upscaled to 8K would not look any better than regular 4K

but because of how uneven real pixels are, 8K looks NOTICEABLY better even when it is simply an upscale from 4K …

a real-world pixel has its effective center shift left and right depending on the color displayed, and up and down depending on the brightness displayed. upscaling reduces this even without AI, because the same image gets drawn with smaller pixels so the shifts shrink - and most new TVs use AI upscaling, which can sharpen the picture from the base 4K image even further.
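
to make the “effective center shift” concrete, here is a toy model ( assuming a plain RGB stripe layout and Rec. 709 luma weights - real panel geometries are weirder, which is the whole point of the photo above ):

```python
# toy model: RGB stripe subpixels at x = -1/3, 0, +1/3 of a pixel width;
# the perceived center is the luminance-weighted average of the three
SUBPIXEL_X = (-1/3, 0.0, +1/3)      # R, G, B positions
LUMA = (0.2126, 0.7152, 0.0722)     # Rec. 709 luma weights

def perceived_center(r, g, b):
    w = (LUMA[0] * r, LUMA[1] * g, LUMA[2] * b)
    return sum(wi * xi for wi, xi in zip(w, SUBPIXEL_X)) / sum(w)

print(perceived_center(1, 1, 1))    # ~-0.05  white sits near the middle
print(perceived_center(1, 0, 0))    # -0.33   pure red shifts a third of a pixel left
print(perceived_center(0, 0, 1))    # +0.33   pure blue shifts a third of a pixel right
```

split that same light over smaller 8K pixels and the shifts shrink proportionally …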

i am not talking theoretically here but from using both back to back every day

you can see the difference in almost all content, including 4K video.

all you need to see the difference is to be close enough to the screen. you won’t see it from your couch, but you will definitely see it from your computer chair for a screen of about 65 inches or larger. and there aren’t any smaller 8K TVs anyway.

here is a simple experiment you can do yourself.

watch a 1080p YouTube video on 1080p screen.

and watch the same 1080p YouTube video on a 4K screen.

it will look noticeably cleaner on the 4K screen.

the same exact benefit applies when you watch 4K video on 8K screen.

of course you don’t buy an 8K screen to watch 4K videos. you buy it for all the OTHER types of stuff you can display on it that are natively 8K, like web pages.

and as for games - they adjust detail level based on what GPU you have and what settings you use. the game itself can easily take advantage of an 8K screen if the GPU allows it.

MKBHD once did a video demonstrating a racing game in 8K where inside every drop of water on the car’s windshield you could see a reflection of the entire world … he was running an RTX 3090 at the time, and my 4080 has roughly the same power …

by the way that video itself is an 8K YouTube video, which i can play in 8K on my computer …

in fact if it wasn’t for this video i wouldn’t have built my system because most people are skeptics like you who think this shit isn’t going to work or isn’t worth it etc. but it does work and it is worth it.

8K definitely had some teething pains but we’re mostly in the clear now. i would say the time to switch to 8K was probably around 2 years ago - when that MKBHD video was made, when i watched it, and when i started planning this current build that i finished a few weeks ago.

if you’re going to game in 8K though you probably want the RTX 4090 … i got the 4080 because i didn’t realize i would actually be gaming, but then i got hooked …

so i’ve been struggling with ugly fonts in 8K and couldn’t figure out what was causing it …

just fixed it today. it was the “game mode” on the Sony TV … changed it to “graphics mode” and the text was instantly beautiful …

except now i can actually see reflections between different layers of the screen. the text itself is displayed sharp and clean in terms of pixels, but then it reflects through the various wide-angle viewing diffusers and such, and in the end i see a ghost version offset by maybe 1/4 millimeter, depending on what angle i’m looking from.

if i’m looking dead on at 90 degrees the offset disappears and text is razor sharp. if i’m looking at an angle the text is doubled.

of course when you sit close to a large screen you can only see maybe 1/4 of the screen head-on, while most of the screen will actually be at an angle.

so it’s an issue you will probably want to check for when getting an 8K TV …

overall there are many such weird issues that pop up with 8K - another is the dither applied for viewing angles, for example, and so on …

basically manufacturers assume, just like @kimkardashian does, that nobody actually sees 8K, so they do all sorts of fuckery at the pixel level, either to improve viewing angles or to decrease processing latency and so on …

but you can actually see some of that shit when you sit 4 feet away from the screen like me …

once you find the right settings though, even though it still may not be perfect, it’s a noticeable upgrade from 4K …

still nowhere near as sharp as your phone’s screen though … we’re talking roughly 117 pixels per inch on a 75" 8K screen versus 400+ pixels per inch on a phone or about 260 pixels per inch on a tablet
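
the density math, if you want to check your own setup ( straight geometry from the diagonal and pixel counts; the phone and tablet numbers below are just typical examples ):

```python
import math

def ppi(diagonal_in, horizontal_px, vertical_px):
    # physical width from the diagonal and the aspect ratio, then pixels per inch
    aspect = horizontal_px / vertical_px
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    return horizontal_px / width_in

print(ppi(75, 7680, 4320))    # ~117 ppi - 75" 8K TV
print(ppi(6.1, 2556, 1179))   # ~460 ppi - a typical recent phone
print(ppi(11, 2388, 1668))    # ~264 ppi - a typical tablet
```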

i can’t see pixels on a phone from any distance. i can barely see something not quite perfect on a tablet. on a 75" 8K screen, when it is working properly, i can see artifacts up to about 3 feet from it. when it is not set up right, or when there are some unintended reflections between films etc., those can be seen from maybe up to about 6 feet away.

of course i actually sit 4 feet away so the only artifact i’m seeing right now is those reflections between internal films. after switching to “graphics” mode i can’t see the pixel level weirdness anymore from my viewing distance.

i am still looking forward to 16K screens. with 16K you would be able to sit 3 feet away from a 100 inch screen and have it perfectly sharp. can’t do that with 8K.
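
the acuity math behind that claim ( using the usual 20/20 benchmark of roughly 60 pixels per degree - individual eyes vary ):

```python
import math

def pixels_per_degree(ppi, distance_in):
    # how many pixels one degree of visual angle covers at a given distance
    return ppi * distance_in * math.tan(math.radians(1))

# a 100" 16:9 panel is ~87" wide, so 8K is ~88 ppi and 16K would be ~176 ppi
print(pixels_per_degree(88, 36))    # ~55 ppd  - 8K at 3 feet: just under 20/20
print(pixels_per_degree(176, 36))   # ~111 ppd - 16K at 3 feet: comfortably past it
```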

for 16K to make practical sense though it would probably need to curve around you, and for that to make sense the aspect ratio would probably need to be wider.

in other words realistically i only want maybe 70% more lines of resolution vertically and horizontally if everything stays the same except size. but with a wide, curved display i could make use of maybe as much as 250% more lines of resolution horizontally - i would have to turn my head to see the edges of the screen, but that is fine.

i would also want a 600 Hz refresh rate instead of 60 Hz.

i give it another 20 years and we will get there.

The 8K panels being produced have a BGR subpixel structure, which makes text harder to read. Monitors normally have an RGB subpixel structure. Maybe that is what you are seeing.

BGR vs RGB is something Windows ClearType is supposed to fix
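
( side note: if you do end up with a BGR panel, re-running the windows ClearType tuner - cttune.exe - after hooking up the TV is supposed to let it re-detect the subpixel order )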

my problem was GAME mode on the TV. switching to GRAPHICS mode fixed it 100%.

i just keep forgetting how much processing these 8K TVs are doing - being in the right mode can make a big difference, because what actually differs between the modes is the processing applied.

these modes aren’t just saved presets for your brightness, contrast etc. they actually work differently.

i am an idiot for not trying to switch modes sooner. i guess i have a fatalistic mindset where i just assume i’m doomed from the get go and that causes me not to try potential solutions.

I have to put my Samsung into game mode to get it to display 8K 60 Hz natively without any weird processing. The downside is that it reduces the number of dimming zones, and the blooming is significantly worse than in the other picture modes.

I think that is the major limitation, or at least it was when I bought mine. The TV can natively display an 8K 60 Hz signal, but it doesn’t have the processing power to use all the dimming zones. Backlight processing is very demanding.

The obvious solution is to wait for 8K OLEDs. I would never use one myself as a monitor, but you could probably avoid burn-in long enough to get several years out of it.

it isn’t so much about power but rather …

the picture must be delayed to keep the backlight from flickering and seeming to lag behind the movement on the screen …

this is the same as how an automatic transmission hesitates and lags - it needs to wait to determine the right action - it can’t shift every time your foot twitches …

in game mode you can’t delay the picture to wait and see which zones should be turned on - instead you pretty much have to turn them all on and leave them on ( more or less ) …

luckily games tend to be relatively bright compared to night scenes in movies so it’s not a huge issue IMO …
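
here is a toy sketch of that tradeoff ( my own illustration, not how any specific TV actually works ): with a frame of lookahead the backlight can ramp a zone up BEFORE the bright object arrives, at the cost of latency; in game mode there is no future frame to peek at, so the zone reacts late or just stays lit:

```python
# toy 1-zone local dimming: brightness of the content in that zone per frame
content = [0, 0, 0, 9, 9, 9, 0, 0]   # dark scene, bright object appears at frame 3

# movie mode: delay output by 1 frame so the backlight can ramp ahead of time
lookahead = [max(content[i:i + 2]) for i in range(len(content))]

# game mode: no delay allowed, so the zone can only react to the previous frame
reactive = [content[max(i - 1, 0)] for i in range(len(content))]

print("content  :", content)
print("lookahead:", lookahead)   # rises at frame 2, one frame EARLY ( costs latency )
print("reactive :", reactive)    # rises at frame 4, one frame LATE ( visible lag )
```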

Sony and Samsung both apply a sort of dither at 8K in order to improve shadow detail and viewing angles, but they don’t use the same processing.

it looks like a checkerboard pattern when you look very close in certain dark areas … i have seen it on both Sony and Samsung …

notice in the image above (Samsung QN700B 8K) how in the darker area alternating pixels are skipped in a checkerboard pattern …

as a result, at least in the past, Sony was unable to resolve some 8K test patterns in certain modes, while Samsung could resolve them in more modes. in other words the way Samsung applied the checkerboard pattern ( which both Sony and Samsung use ) was more intelligent, or at least better optimized for displaying fine detail.

basically the TV has to detect what is being displayed and apply this grid only where it won’t negatively affect image quality. it should only be applied in some dark areas that lack detail - not everywhere. Samsung used to handle this better.
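
conceptually it is something like this ( a crude sketch of the idea as i understand it - the real detection logic is obviously much smarter than a single threshold ):

```python
import numpy as np

DARK = 32  # made-up threshold below which the panel starts skipping pixels

def checkerboard_dither(img):
    # skip alternating pixels, but only in dark areas; a real implementation
    # would also check that the area lacks fine detail before applying it
    out = img.copy()
    y, x = np.indices(img.shape)
    checker = (y + x) % 2 == 1
    out[(img < DARK) & checker] = 0
    return out

patch = np.full((4, 8), 20, dtype=np.uint8)   # flat dark patch
print(checkerboard_dither(patch))             # every other pixel goes dark
```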

LG uses an IPS panel and doesn’t apply this type of dither at all. It’s always pure 8K. But it blooms like a motherfucker.

i would also add that Sony historically is not great with high resolution signals. even Sony’s 4K sets already had issues at 4K. Sony simply never prioritized resolution or dimming zone count - they focused on brightness, accurate color and so on.

Sony is more geared towards cinema whereas LG OLED towards gaming and Samsung QLED is somewhere in the middle. or at least that’s the impression i got at the time.

i think high end mini LED displays are superior to OLED but only the 8K screens get proper mini LED backlights for the most part. it’s how the industry decided to do it.

the 8K screens are the no-compromise screens. the OLED screens are great if you only need 4K and don’t mind replacing them every 5 years.

and regular 4K LED is just meant to be cheap.

they COULD put a 10,000 dimming zone backlight into a 4K LED screen - but why ? it would then just compete with the OLED, which is dumb.

Samsung had better 4K LED screens than others ( more zones ) because Samsung didn’t have OLED at the time, so it wasn’t competing with itself by offering 4K LED with good blacks.

anyway i think mini LED tech has a lot left in the tank. Sony is deliberately holding it back so that they can show incremental improvement every year. it’s better for them to sell you cheap trash today and then sell you improved cheap trash again tomorrow, making money twice … rather than going broke making a perfect TV today that you’d never need to upgrade …

both Sony and Samsung are cynical in their own ways. Sony because it deliberately holds its TVs back with lower dimming zone counts ( to force you to step up to higher-end models and / or upgrade later ), and Samsung because it focuses on slim cabinets over build quality, making TVs that literally bend while being installed.

in the end only cynical companies can survive in the long term. some brilliant innovation may give a company a start but you can’t count on being brilliant and innovative forever. both people and companies get old. once you’re old you have to rely on cynicism, that is, treating other people ( customers ) as the idiots they are.