Let's talk about computers

But HBM uses less power (and therefore produces less heat) than GDDR-type memory.

The problem with GDDR6X is signal integrity, so having to route some traces around holes would likely kill the memory clock, making the expense of GDDR6X pointless.

5700 XT, I marked out the memory connections (which should also be present on the other three layers in the top half of the PCB):

As opposed to the Vega 7, which just has its power plane visible (you can clearly tell the lack of traces under the solder resist):

Per bit. The dies are denser, so total power consumption ends up being higher, though obviously that's going to depend on the clock they end up going for.

Decisions, decisions: am I going for Team Green or Team Red? I'm leaning more towards the 3070 right now, since it might be a good budget card for those looking for a high-end entry point who couldn't get a 2080 Ti back then. I'm also curious how AMD's 6000-series cards will perform; I hope they can compete with Nvidia.

That is not how that works.
Ten lightbulbs use the same power whether they hang in a hangar or in your living room.

To expand on this further, HBM1 was less dense than GDDR5 (up to 1GB per die / 4GB per stack vs. up to 8Gb (1GB) per GDDR5 module).
HBM2 can be stacked higher (up to 12-Hi, according to JEDEC) and provides more capacity per stack (up to 8GB).
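Rough napkin math for what stacking buys you; the die densities below are illustrative assumptions matching the generations above, not a spec quote:

```python
# Stack capacity = stack height (number of dies) x per-die density.
# The densities here are assumptions for HBM1/HBM2-era parts.

def stack_capacity_gb(stack_height: int, gb_per_die: float) -> float:
    """Capacity of one HBM stack in GB."""
    return stack_height * gb_per_die

hbm1 = stack_capacity_gb(4, 1.0)   # 4-Hi of 1GB dies -> 4GB per stack (Fury-era)
hbm2 = stack_capacity_gb(8, 1.0)   # 8-Hi of 1GB dies -> 8GB per stack

print(hbm1, hbm2)  # 4.0 8.0
```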


Let's say GDDR uses 10W per gigabyte stored; then HBM would hold around 20 to 30GB for the same 10W.
At the same time, the HBM memory controller provides far more bus width (4096-bit vs. 512-bit at the extreme for GDDR) and uses about half the power of a GDDR controller.
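To put numbers on the wide-and-slow vs. narrow-and-fast trade-off, here is a quick sketch; the per-pin data rates are illustrative assumptions, not datasheet values:

```python
# Peak bandwidth = bus width (bits) x per-pin data rate (Gbit/s) / 8 bits-per-byte.
# Both configurations below are assumed examples, not measured parts.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# HBM2-style: very wide, slowly clocked (4 stacks x 1024-bit at ~2 Gbit/s per pin)
hbm = bandwidth_gbs(4096, 2.0)    # -> 1024.0 GB/s
# GDDR6X-style: narrow, fast (e.g. 384-bit at ~19 Gbit/s per pin)
gddr = bandwidth_gbs(384, 19.0)   # -> 912.0 GB/s

print(hbm, gddr)
```

Despite a per-pin rate nearly ten times lower, the wide HBM bus still comes out ahead, which is why it can afford to run at low, power-friendly clocks.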

Sources:

SK Hynix is a memory producer; AFAIK the only one that, together with AMD, developed HBM.

SK-Hynix slide excerpts:

https://cdn.wccftech.com/wp-content/uploads/2014/05/AMD-Volcanic-Islands-2.0-HBM-Memory.jpeg

AMD-Slides:
http://cdn.wccftech.com/wp-content/uploads/2015/07/AMD-Radeon-R9-Fury_Fiji-Pro_HBM.jpg


Interesting… weren't these embargoed an extra two days, due out tomorrow on the 17th?

yes, from the 14th to the 16th because some review samples got delayed due to HumanMalware


Yeah. Apparently there was difficulty shipping it out to Australian tech YouTubers.

Interesting: Linus had entirely different results running on a 3950X. Given the changes this generation, the fps differences shouldn't be as drastic as they are between the two reviews. I wonder if it's early drivers or human error.


That is a CPU bottleneck right there.

Looking through LTT's numbers now.


Edit:
Yup, CPU bottleneck (FlightSim2020 hits one core pretty hard):
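A CPU bottleneck is easy to model: every frame has to clear both the CPU and the GPU stage, so fps is capped by the slower one. The millisecond figures below are made up for illustration, not benchmark data:

```python
# Toy frame-time model: a frame passes through the CPU stage and the GPU
# stage, so the achievable fps is bounded by the slowest of the two.
# All timings here are illustrative assumptions.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second limited by the slower pipeline stage."""
    return 1000 / max(cpu_ms, gpu_ms)

# CPU-bound: the main thread takes 20 ms per frame regardless of the GPU.
print(fps(cpu_ms=20, gpu_ms=12))  # slower GPU -> 50.0 fps
print(fps(cpu_ms=20, gpu_ms=8))   # faster GPU -> still 50.0 fps
```

That is why a much faster GPU can show no fps gain at all in the reviews: the frame time is dominated by one heavily loaded core.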

Yeah I thought so too. I just didn’t personally think the bottlenecks would be so severe at 1440p

Damn, out of all the games I thought Flight Sim would use multiple cores effectively.

I gave up hope for good software a while ago. Most games still rely on 4 cores (which often results in 2 cores / 4 threads running the show).
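That is basically Amdahl's law in action; with a mostly serial frame loop, extra cores flatten out fast. The serial fraction below is an assumption for illustration:

```python
# Amdahl's law: if a fraction of the work is serial, adding cores only
# speeds up the parallel remainder. The 60% serial fraction is assumed.

def speedup(serial_fraction: float, cores: int) -> float:
    """Speedup = 1 / (serial + parallel / cores)."""
    parallel = 1 - serial_fraction
    return 1 / (serial_fraction + parallel / cores)

# 60% serial work (one "main" thread running the show):
for n in (4, 8, 16):
    print(n, round(speedup(0.6, n), 2))  # 1.43, 1.54, 1.6 -- caps near 1/0.6
```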

Well, 4K gaming is now a real possibility, one where you can actually enjoy the gameplay.

Do consoles count :stuck_out_tongue:

Also saw no optical port :frowning: but damn, that was a pretty damn good reveal.


I wonder if Xbox will release MS Flight on their X or S systems. That would be a scandal!

lol would it even run


Yes… just not as well as on a 3070 or better.

You would probably want to use one of these, then: https://www.amazon.ca/Proster-Extractor-Converter-Optical-Splitter/dp/B073TT8JGN


It doesn't support HDMI 2.1, though, for that 4K 120Hz, so I'm hoping my new TV (when I get it) has either optical 96kHz or USB audio out. I mean, 48kHz would be fine, but meh, more hertz is better hahah.