The biggest point I have to make against 8K is that it's entirely a marketing gimmick: almost no 8K content exists for the casual market, and we cannot visually benefit from that resolution unless we're so close the screen basically eats us.
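To put rough numbers on "so close the screen basically eats us": with standard 20/20 acuity (about one arcminute per pixel), you can work out the distance beyond which extra pixels stop being visible. This is napkin math, not a definitive model, and the 65-inch size is just an example:

```python
import math

# Rough sketch: 20/20 vision resolves about 1 arcminute, so a pixel
# subtending less than 1 arcminute of your view is invisible. Beyond
# this distance, more resolution buys you nothing.
def max_useful_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

for px, label in [(3840, "4K"), (7680, "8K")]:
    print(f'65" {label}: pixels blend together past '
          f'~{max_useful_distance_ft(65, px):.1f} ft')
# 4K: ~4.2 ft, 8K: ~2.1 ft. Past about 4 feet from a 65" set,
# 8K is literally invisible over 4K.
```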
What people should be putting their money towards is 4K screens with either HDMI 2.1 support for gaming or high-fidelity color accuracy and reproduction. Bad color makes movies look lower-res, as it turns out.
When the Nvidia 900 series came out, the 980 Ti could almost manage playable 4K; the Titan X (Maxwell) could.
The 1080 could barely manage 4K, and the 1080 Ti (~$800) could barely manage 4K60.
Now we have the 2080 Ti doing 4K at 80 FPS, at the same $1,200 price point.
I say unless game devs put as much effort into their games as they did in 2014/15, 4K gaming stays a meme.
You'd be surprised. 4K is harder to prioritize while the enthusiast market is pulling in different directions and demanding support for each of them.
I'm not sure how Steam gathers data for its hardware survey, but they've got ultrawide resolutions at 0.74% and 1.11% (multi-monitor is a separate category); that's within margin of error of 4K at 1.97%. Steam has a gargantuan install base, so they'd have to be pretty awful at surveys for that to be very unrepresentative.
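For scale on the "margin of error" bit: Steam doesn't publish its monthly sample size, so the n below is purely my guess, but the point stands that with any large opt-in sample the random noise is tiny, and the real question is who opts in. A quick sketch:

```python
import math

# 95% margin of error for a share p out of n survey responses,
# assuming simple random sampling. n is an assumption; Steam
# doesn't publish its monthly sample size.
def margin_95(p, n):
    return 1.96 * math.sqrt(p * (1 - p) / n)

n = 100_000  # assumed; the real survey is probably larger
for label, p in [("4K", 0.0197), ("ultrawide total", 0.0074 + 0.0111)]:
    print(f"{label}: {p:.2%} +/- {margin_95(p, n):.2%}")
# 4K: 1.97% +/- 0.09%, ultrawide: 1.85% +/- 0.08%. The intervals
# overlap, so "within margin of error" holds at this n; any real
# skew would come from sampling bias, not random noise.
```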
All of the above are still "out there," really, but then Steam only started showing 1080p as the most popular resolution a few years ago; it was 1366 x 768 for a very long time (probably changed by the release of the Chinese version of Steam).
Lol, my friend bought a 2080 Ti and he says he never plays in 4K. The framerate just tanks to 10-20 FPS or less. It's 1080p all the time and sometimes 1440p; otherwise it's not worth it.
Also, considering you basically need a graphics card to run the Facebook website without lagging today, my guess is that's not the only place where nobody gives a crap about slow computers, and coders need to learn to do their jobs properly again.
I don't believe in Windows gaming anymore. Some trolls say consoles are "slowing down" the industry, but my guess is it's the opposite. They're just jealous because they bought RAM and a new graphics card for $2,000, when consoles actually force people to code properly and release games of similar quality on a $500-or-less box, instead of saying "oh, for this second DLC you need a second 2080 Ti in SLI" or whatever.
I'm curious... can PCs even be set to output at 8K now? A big 8K display or projector would make for a nifty gaming monitor if you've got some 2080 Ti Supers in SLI!
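For what it's worth, here's the napkin math on whether 8K even fits down one cable. The payload figures for DisplayPort 1.4 and HDMI 2.1 are the published ones; everything else is back-of-the-envelope (active pixels only, no blanking overhead):

```python
# Raw bandwidth for 8K at 60 Hz, 8-bit RGB (active pixels only;
# real signals add blanking overhead on top of this)
W, H, HZ, BPP = 7680, 4320, 60, 24
need = W * H * HZ * BPP / 1e9
print(f"8K60 RGB needs ~{need:.1f} Gbps uncompressed")  # ~47.8 Gbps

# Effective payload after line-code overhead (published figures)
links = {"DisplayPort 1.4 (HBR3)": 25.92, "HDMI 2.1 (FRL)": 42.67}
for name, gbps in links.items():
    verdict = "fits" if gbps >= need else "needs DSC or 4:2:0 subsampling"
    print(f"{name}: {gbps} Gbps -> {verdict}")
# Neither link carries 8K60 RGB uncompressed, so cards have to lean
# on Display Stream Compression or chroma subsampling to get there.
```

As I understand it, Turing cards do advertise 8K60 over a single DisplayPort 1.4 cable using DSC, so the output is possible; it's just compression doing the heavy lifting, not raw headroom.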
I worked in games for 25 years, and I'll tell you performance on PCs vs. consoles has little to do with "learning to code properly" and a lot more to do with knowing exactly what you are working with on consoles, versus having to support min-spec machines and random driver issues in the PC space.
There is only so much you can do to support a guy with 64 GB of memory, a 2080, and a 16-core CPU when what you produce also has to run on a 10-year-old processor with 4 GB of RAM and a 10-year-old graphics card, or worse, something integrated, because that's what the marketing/sales department tells you most of your users have.
Cool thread, successfully necroed. I feel that 8K will probably be the last step up in resolution that ever needs to happen, given the pixel density. In 10 or 15 years, when 8K gaming is possible and 8K content is more available, it will be a reasonable thing to get. For now, 4K is more than enough.