And then people will complain about using a metric that actually is useless. Running games at 1080p does have a use: it shows how good the CPU is at playing games.
damned if you do…damned if you don’t.
I’m with martward on this. I completely understand why we do it, and it’s really one of the few ways you can demonstrate IPC gains as far as gaming goes. It’s just an annoying metric to look at, and it feels misleading to those who are newer to PC building or to less observant enthusiasts.
My point was that it feels useless to most of the people who would actually buy this CPU; the small niche who want super high refresh rate monitors at 1080p are the only ones I feel this really benefits, and that feels like such a small niche. But I’m no authority on PC gaming, just a confused enthusiast wondering why a metric I don’t really care about exists.
Perhaps I can word it another way. It’s not a matter of performance but of bottlenecks. At 1080p the GPU is barely doing anything, so the results you get are purely measuring the difference between CPUs. At 1440p or higher the GPU starts to become the bottleneck, meaning the CPU has to start waiting around, which invalidates the results.
So it’s not just marketing; high resolutions would make the benchmarks worthless.
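To put rough numbers on that (a minimal sketch with made-up frame times, not measurements): each frame needs some CPU work and some GPU work, and whichever takes longer sets the frame rate, so a CPU gap that is obvious at 1080p can disappear entirely at 4K.

```python
# Toy model: the slower of the CPU and GPU per-frame cost gates the frame rate.
# All numbers below are hypothetical, purely for illustration.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the frame is gated by the slower component."""
    return 1000 / max(cpu_ms, gpu_ms)

cpu_fast_ms = 4.0      # faster CPU: 4 ms of game logic / draw calls per frame
cpu_slow_ms = 5.0      # slower CPU: 5 ms per frame
gpu_1080p_ms = 3.0     # the GPU finishes a 1080p frame quickly
gpu_4k_ms = 12.0       # the same GPU needs far longer per 4K frame

print(fps(cpu_fast_ms, gpu_1080p_ms))  # 250 fps -> CPU-bound, gap is visible
print(fps(cpu_slow_ms, gpu_1080p_ms))  # 200 fps
print(fps(cpu_fast_ms, gpu_4k_ms))     # ~83 fps -> GPU-bound, gap vanishes
print(fps(cpu_slow_ms, gpu_4k_ms))     # ~83 fps
```

Real games aren’t this clean, obviously, but that’s the basic reason reviewers drop the resolution when the point is to compare CPUs.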
Exactly! Running benchmarks at 1080p is the only way to see the difference between CPUs in gaming performance.
A metric exists to give an indication of real world performance. If the benchmark doesn’t do that, it’s worthless IMO.
Let’s just all agree to disagree.
Yes, I 100% understand all that, and it’s not at all the point I’m trying to make. I get that 1080p is the only way to show off CPU performance and that any popular higher resolution makes the results pretty much invalid because of how much the GPU takes over. I made my original post with all of that in mind, lol. I understood it all before; perhaps I just didn’t make my initial question clear enough when I asked it.
I brought up the fact that the people who will use these CPUs in gaming PCs will likely have higher end monitors, where the CPU won’t really need to perform all that much because the GPU will be doing most of the work. I said all of this with that knowledge in my head; I 100% understood why Intel was doing it right from the start. I was just stating that this chart isn’t really useful for most people. I’m not asking why these benchmarks aren’t done at higher resolutions at all. I’m more asking why people would even really care about this chart and why it’s up there, since I personally think that if all you need is a gaming CPU for 1080p, the 11900K is definitely not what you need; you’d be better off with something like an 11500K at that point. A chart comparing, say, the 11500K vs the 5600X would be more useful to look at.
But I mean, it’s Intel’s best gaming CPU vs AMD’s best gaming CPU here. Then again, if you’re buying these chips just for gaming before you have a higher resolution monitor and a card that can drive it, I think you have your priorities out of whack.
Oh My Word! How is the benchmark worthless when it’s the ONLY benchmark that shows the CPU performance for gaming?
@RiceGuru we’re not saying you’re going to play games at 1080p, but by running games at 1080p you will see which CPU has the better overall gaming performance. The Intel benchmark shows how their 11th gen CPUs have higher performance than AMD’s, which would not be as obvious at the higher resolutions the CPU is likely to be used at, because at higher resolutions the GPU is doing the work, not the CPU.
It’s kinda like testing a car on a dyno. The details it provides have no real world value, as you’re never going to be running your car at that level of performance, but it’s still nice to know the HP your engine can crank out.
@marzipan I wasn’t disagreeing with this point at all. I was just saying my original statement was meant as more of a sarcastic quip than an actual legitimate question. I completely understand the relationship between CPUs, GPUs, and resolutions, and I keep trying to explain my understanding of it, but it seems that explanation isn’t coming through.
A metric can exist to give an indication of real world performance, but that’s not its sole function. If real world performance were the only reason test metrics existed, then car reviews would never conduct most of the tests they routinely perform.
That’s a fair point, and I don’t mean to say that people who do look at such metrics are stupid or anything. Personally, though, a metric that doesn’t indicate real world performance is worth next to nothing to me.
It’s like having a car that can do 250 km/h when the maximum speed limit anywhere is 130 km/h: I don’t care, since I’ll never be able to take advantage of it, so to me it holds no value whatsoever.
Or take headphones: it’s lovely to know that a (theoretical) pair won’t distort even at 200 decibels, but my eardrums will have popped long before that, so if every other headphone holds out to 190 decibels, this is a useless metric to me and, IMO, nobody should care about those 10 extra decibels.
At the high end it actually becomes necessary to balance the performance of your gear. An issue with the RTX 3080/3090 affected reviews when it turned out the GPU was exposing a bottleneck in the CPU by chewing through frames too quickly. Older CPUs (Zen+/Kaby Lake) weren’t able to keep up and were kneecapping the GPU. So having a low resolution benchmark of the CPUs lets you see which one has the throughput to keep feeding the GPU.
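As a rough illustration of that kneecapping effect (same toy frame-time model as above, with made-up numbers rather than anything from those reviews): once the GPU is fast enough, an older CPU becomes the limiter even at a higher resolution.

```python
# Toy model again: frame rate is gated by whichever component takes longer per frame.
# The frame times are hypothetical, chosen only to illustrate the point.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

older_cpu_ms = 7.0        # e.g. an older chip needing more time per frame
newer_cpu_ms = 4.0
fast_gpu_1440p_ms = 5.0   # a 3080/3090-class card tearing through 1440p frames

print(fps(older_cpu_ms, fast_gpu_1440p_ms))  # ~143 fps, capped by the older CPU
print(fps(newer_cpu_ms, fast_gpu_1440p_ms))  # 200 fps, now the GPU is the limiter
```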
I understand this, but part of my point was that the people who do have a 3080/3090 will most likely have a 1440p or 4K monitor to run it on, which makes this metric less useful for the high end market, especially now that 1440p 144Hz with adaptive sync is really easy to come by at an affordable price. I get why Intel would do it though. I just kinda find the graph itself funny knowing the reality of things.
Just to chime in: I run a 1080p display on a 3090 and a 5800X. People like myself want pure framerate for a competitive edge.
There is a point of diminishing returns though. In computing everyone is obsessed with numbers, and the bigger the number, the better. But between a game running at 200fps at 1440p and 300fps at 1080p, there is no realistic advantage beyond waving around your e-peen and proving you’re ‘special’.
I’d much rather have the higher resolution, detail and field of view of 1440p @ 200fps than 1080p @ 300fps.
The above is a theoretical scenario, although I have seen manufacturers boasting about their adaptive sync 300Hz monitors.
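For context on the diminishing-returns point, the absolute frame-time gap between those figures is small; quick arithmetic (nothing here is a measurement or a claim about what anyone can feel):

```python
# Milliseconds per frame at a few frame rates, plus the 200 vs 300 fps gap.
for rate in (60, 144, 200, 300):
    print(f"{rate} fps -> {1000 / rate:.2f} ms per frame")
# 60 fps  -> 16.67 ms
# 144 fps ->  6.94 ms
# 200 fps ->  5.00 ms
# 300 fps ->  3.33 ms

print(f"gap: {1000 / 200 - 1000 / 300:.2f} ms")  # ~1.67 ms between 200 and 300 fps
```

Whether that ~1.7 ms is perceptible is exactly what the next few replies argue about.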
Man, you are a contrarian. There is a difference; if you don’t agree with it, that’s fine, but I physically experience it. It’s not an e-peen thing but rather a response time thing.
It’s like telling people they can’t hear the difference between audio equipment past a certain point. Think about that…
I think Eizo used to make a 240Hz gaming monitor that was widely popular with Starcraft players.
I brought that up as well, but I feel you’re in a very specific niche that this metric applies to. As a person who has experienced super high refresh rates myself, I am jealous of your ability to afford that. I’ll be in the average enthusiast tier at 144Hz 1440p for now. Admittedly my personal point of diminishing returns is at 144Hz, but I can perform a bit better at 240Hz. I don’t notice as much of a difference, of course, by nature of going from a high to a higher refresh rate, but the lower latency of the panel really helps click heads when I flick.
I’ve always performed better with snap shots than tracking by nature, so I find it very beneficial. Tbh I’ve also found myself getting fewer eye-strain headaches since switching to a 144Hz or faster display. Anywho, everyone is different and YMMV.