You’d be surprised. 4K is harder to prioritize while the enthusiast market is pulling in different directions and demanding support for all of them.
I’m not sure how Steam gathers data for its hardware survey, but they’ve got the ultrawide resolutions at 0.74% and 1.11% (multi-monitor is a separate category), which together puts them in the same ballpark as 4K at 1.97%. Steam has a gargantuan install base, so they’d have to be pretty awful at surveys for those numbers to be very unrepresentative.
All of the above are still “out there”, really, but then Steam only started showing 1080p as the most popular resolution a few years ago; it was 1366 x 768 for a very long time (probably changed by the release of the Chinese version of Steam).
Lol, my friend bought a 2080 Ti and he says he never plays in 4K. The framerate just tanks to 10 to 20 fps or less. It’s 1080p all the time and sometimes 1440p; otherwise it’s not worth it.
Apparently the next-gen stuff is being designed to handle 4K at 120 Hz, but really, at this point we’re just catering to esports gamers.
Also, considering you basically need a dedicated graphics card just to run the Facebook website without lag these days, my guess is that’s not the only place where nobody gives a crap about slow computers, and coders need to learn to do their jobs properly again.
I don’t believe in Windows gaming anymore. Some trolls say consoles are “slowing down” the industry, but my guess is it’s the opposite. They’re just jealous because they spent $2,000 on RAM and a new graphics card, while consoles actually force people to code properly and release games of similar quality on a box that costs $500 or less, instead of saying “oh, for this second DLC you need a second 2080 Ti in SLI” or whatever.
I’m curious… can PCs even be set to output at 8K now? A BIG 8K display or projector would make for a nifty gaming monitor if you’ve got some 2080 Ti Supers in SLI!
I worked in games for 25 years, and I’ll tell you that performance on PCs vs consoles has little to do with “learning to code properly”, and a lot more to do with knowing exactly what you are working with on consoles, versus having to support min-spec machines and random driver issues in the PC space.
There is only so much you can do to support a guy with 64 GB of memory, a 2080 and a 16-core CPU, when what you produce also has to run on a 10-year-old processor with 4 GB of RAM and a 10-year-old graphics card, or worse, something integrated, because that’s what the marketing/sales department tells you most of your users have.
Who all play at 720p to get 200 FPS…
Yes, resolution is just a number. Covering a skyscraper with an advertising board easily gets you to 8K. The FPS, though…
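For a rough sense of why the FPS falls off, here’s a quick back-of-envelope sketch in Python. It just counts pixels per frame; real performance scaling depends on the game and on whether you’re GPU-bound at all, so treat the ratios as a ceiling rather than a prediction:

```python
# Pixels per frame for common gaming resolutions, relative to 1080p.
# GPU load doesn't scale perfectly with pixel count, so these ratios
# are only a ballpark for heavily GPU-bound games.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K   (3840x2160)":  (3840, 2160),
    "8K   (7680x4320)":  (7680, 4320),
}

base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:5.1f} Mpx per frame, "
          f"{pixels / base:4.1f}x the pixels of 1080p")
```

4K works out to 4x the pixels of 1080p and 8K to 16x, which is roughly why a card that’s comfortable at 1080p can fall on its face at 4K.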
It’s cool that FHD projectors are getting low-latency tech included at low prices… makes for a giant gaming monitor that’s not all pixelated, lol!
Aye… but with such a high and wholly uncommon resolution, I wasn’t sure if PC hardware supported it.
Double hearts not allowed I’m afraid.
Yeah, ultra-high resolutions have been supported in Windows as far back as the CRT monitor days, because of the way CRT monitor resolutions scaled.
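To the earlier question of whether PCs can even output 8K: the limiting factor these days is usually the display link rather than Windows itself. Here’s a rough back-of-envelope in Python; it assumes uncompressed 8-bit RGB, ignores blanking overhead, and the usable-bandwidth figures for DisplayPort 1.4 and HDMI 2.1 are approximations:

```python
# Back-of-envelope: does an uncompressed 8K signal fit over common display links?
# Assumes 8-bit RGB (24 bits per pixel) and ignores blanking overhead.
def signal_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Approximate usable (post-encoding) link bandwidth in Gbit/s.
links = {
    "DisplayPort 1.4 (HBR3)": 25.9,
    "HDMI 2.1 (FRL 48G)": 42.7,
}

for refresh in (30, 60, 120):
    needed = signal_gbps(7680, 4320, refresh)
    fits = [name for name, cap in links.items() if needed <= cap]
    print(f"8K @ {refresh} Hz needs ~{needed:.1f} Gbit/s -> "
          f"{', '.join(fits) if fits else 'needs DSC or chroma subsampling'}")
```

So the modes exist, but driving 8K at 60 Hz or above over a single cable realistically means Display Stream Compression or chroma subsampling on current connectors.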
Cool thread successfully necroed. I feel that 8K will probably be the last step up in resolution that ever needs to happen, due to the pixel density. In 10 or 15 years, when 8K gaming is possible and 8K content is more available, it will be a reasonable thing to get. For now, 4K is more than enough.
Yeah, well, some developers still act like everyone has unlimited graphics power, bandwidth and storage space. I prefer consoles because, with a fixed box, there are no excuses.
Oh no! I inadvertently necroed the console vs pc flame war!
ABORT MISSION! I REPEAT, ABORT MISSION!!!
Not really, no.
Interesting thread, thanks people!
I’m buying an LG CX, so I’m reading up beforehand.