1080p 280hz vs 1440p 165hz

I need to decide on a monitor to upgrade. Currently using a very old BenQ 2455HM or whatever it's called. 1080p 60Hz, overclocked to 76Hz (the farthest it'll go). I've done a TON of research on the different types of panels and everything. Does anyone have any suggestions?

Thinking of getting the vg259qm (1080p 280hz IPS)

or: vg27aq (1440p 165hz IPS)

or: 27gl850 (1440p 165hz Nano-IPS)

I’m thinking about getting into Fortnite and overwatch competitively.
Thanks for reading.


Do you have a video card that can easily handle 1440p at 165Hz?

I'm gonna get a 2070 Super anyway. I've got a 960 right now.

I had a friend who was hyped about the bottom option, but he returned it soon after. He complained about backlight bleed.

I would go 1440p 165Hz IPS for gaming, and Nano IPS for productivity due to the improved color gamut. But if the GtG response time is the same, I would go Nano IPS for gaming as well. Past 120Hz it's really hard to discern a difference in refresh rate, especially 165 vs 280. Higher refresh rate 1080p monitors are slowly getting cheaper anyway.
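A quick bit of arithmetic shows why the gains shrink: each refresh-rate step saves fewer milliseconds per frame than the last. A minimal sketch comparing the rates discussed in this thread:

```python
# Frame times for the refresh rates discussed in the thread, to show why
# the 165 Hz -> 280 Hz jump is harder to perceive than 60 Hz -> 144 Hz.
rates = [60, 76, 120, 144, 165, 240, 280]
frame_times = {hz: 1000 / hz for hz in rates}  # milliseconds per frame

for hz, ms in frame_times.items():
    print(f"{hz:>3} Hz -> {ms:.2f} ms per frame")

# The absolute gain shrinks as refresh rate climbs:
# 60 -> 144 Hz saves ~9.7 ms per frame, while
# 165 -> 280 Hz saves only ~2.5 ms per frame.
```

So the 165-to-280 step buys roughly a quarter of the per-frame improvement that the 60-to-144 step does, which matches the "diminishing returns" feel people describe.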


Reaction times have been demonstrated to improve with higher refresh rates (just because you can’t see it that easily doesn’t mean it isn’t helping your brain), but 165Hz is already pretty up there. I would go with the 1440p as well (I did), it’s hard to go back to 1080p afterwards.

Haven't tried the GL850, but I own the GL83A-B and I think it's pretty awesome. I guess it depends: if all you care about is Fortnite and Overwatch competitively, a higher refresh rate is preferable, and you probably won't max out the settings anyway since you'll want all the frames you can get. However, if you care at all about other games or working on the PC, I'd never advise getting a 1080p high refresh monitor, since (IMO) it looks a bit like ass and is just a bad deal considering the high price.

I own a 240Hz 1080p monitor for gaming, and a 1440p 75Hz monitor I intended to keep using for productivity and content. Intended to, but in the end I do everything on the 1080p monitor and the 1440p one is relegated to secondary monitor duties. And I'm using a TN panel. Basically, panel technology has come such a long way that the colors and viewing angles on high refresh monitors are not actually a problem unless you're doing work where color accuracy is critical. And the difference between 1440p and 1080p AT THE SAME SIZE is not that big. Individual pixels are noticeable on the 1080p panel - if you're specifically looking for them. Personally I'm looking to switch out the 1440p panel for a 4K one around 40 inches - now that, I feel, would be awesome for productivity.

Now for competitive gaming, you're not comparing just 240/280Hz and 165Hz - the resolution also matters, because it will affect your in-game FPS. It will be easier to reach higher and more consistent FPS at 1080p. And it's not true that you just need to match your monitor's refresh rate - even if your monitor only displays 144 or 165Hz, having a frame rate in the high 200s will absolutely feel smoother.

That said, those are somewhat in the realm of diminishing returns - so in the end it comes down to what your priority is and how obsessed you want to be about squeezing out the last 10% of potential advantage. It's not that different from high-end audio in that regard.


Much agreed. It has been shown time and time again that higher refresh rates improve the experience, but past a certain point, noticeably 120 or 144Hz, the returns seem to diminish significantly.

Depends. If you have more time to track a target then yeah, refresh rates over 144Hz are within margin of error. The difference comes from reacting to objects that are only visible for a short time like enemies passing a window. Reaction times, accuracy and consistency are significantly improved over 200Hz.

Would highly recommend the MSI MAG271CQR: 2K curved monitor @ 144Hz. I use mine for everything from competitive FPS to watching movies and I have had zero issues in 6 months. I would try it out in a shop first, though - they are known to have backlight bleed issues in really dark rooms. My unit doesn't have that issue, though. I drive it with an MSI RX 5700 XT.

Definitely go for 1440p. The win in sharpness outweighs the 280Hz thing. 144/165Hz is liquid smooth. I switched from a BenQ 144Hz Full HD (TN) to an Acer Predator X34P, which is UWQHD (IPS), and I don't know how I put up with the crappy 1080p image for that long.

Did not know you could OC a monitor. Just bumped my Samsung S32D850T from 60 to 72Hz - an interesting and noticeable little bump in fluidity in the couple of games I play.

Tested it out with this online tool, works dandy: https://www.testufo.com/frameskipping


The real question is how serious you are, and what your reaction times are now. Don't tell us, but don't lie to yourself. Past 144Hz mainly matters to the 10% or less of the population that has the reflexes to use it.

Freesync and G-sync monitors will actually do this automatically when the modes are enabled.

FreeSync also works as G-Sync now. I use a 144Hz FreeSync monitor and it was waaaaaaay cheaper than anything G-Sync. Works flawlessly.

I would rather get a widescreen (34″ or 38″) than move away from 144Hz.

You notice the difference between 60Hz and 100Hz, and if you have a good eye you can even notice the difference between 100Hz and 144Hz, but at 165/175 or 2xxHz you really stop seeing the difference. It's more about saying "I have a screen with 2xxHz!!" than actually taking pleasure in the image. Plus… you need a huge machine to run games in Ultra at that rate!

Widescreen VA or IPS (please read a lot about them - there are IPS panels that are terrible, as well as VA panels that look amazing) and you get much more involved in a game than with a 25″ at 2xxHz…


100% agree. I picked up a 34″ widescreen 1440p 144Hz when all the work from home started (and for gaming, I'll be honest :P), and now that I'm back in the office with 3x 1080p 60Hz, I can't stand the lower resolution.

Kind of. FreeSync is actually proprietary, so Nvidia had to figure out how to work around AMD's work to take advantage of the adaptive sync features that are actually the open standard. Unfortunately, that means only some monitors will work properly. One major issue is backlight strobing that can cause the screen to flicker in high-contrast scenes, and blank frames. That's why they have their G-Sync Compatible list.

Even with those monitors, they're still limited to the range and consistency the manufacturers were able to achieve within FreeSync's standard, and most are limited to DisplayPort 1.2 (depending on whether it's OG FreeSync or FreeSync 2).

You got that backwards, buddy. G-Sync is Nvidia's proprietary adaptive refresh tech; FreeSync is AMD's royalty-free answer (because AMD does not have the market share to force anything).

Nvidia made a brilliant move by rebranding FreeSync as "G-Sync Compatible", which is their biggest dick move of the last 5 years.