
AMD Radeon RX 7900 XT Flagship RDNA 3 Graphics Card Could Reach Almost 100 TFLOPs, Navi 31 GPU Frequency Hitting Over 3 GHz

Mister Wolf

Member
I am sure before Zen 1 there were plenty of people saying the same thing about AMD CPUs.

[screenshot]
 

FireFly

Member
So ray tracing performance is bad on AMD, therefore it will always be bad. AMD couldn't equal Nvidia in power efficiency for over six years, so what chance did they have of doing it in one generation with RDNA 2? Why should we have expected AMD to overtake Intel in gaming when the 1800X was comprehensively beaten by an i5 processor?
 

Bo_Hazem

Banned
These numbers sound insane! I'll halt my upgrade to the next Mac or next PC depending on what's happening now.
 

Ezquimacore

Banned
I always get excited for AMD, but then I buy their GPUs only to be reminded of their trash updates, stuttering, and poor OpenGL support for emulation, and I take that shit back to the store the next day to buy something from Nvidia that is $200 more expensive for the same performance but a smoother experience.
 
100 TFLOPs...
Jesus H Christ.
While I'm not a PC gamer, I love to see the GPU makers pushing performance as much as they can.
But nothing will take advantage of it.
 

Buggy Loop

Member
Because Intel is greedy and Raja is not trustworthy.

This is why we need a 🤡 emoji.

Fucking AMD fans, delusional. AMD's not greedy? If they had the chance? Just look at what happened as soon as they got their nose barely above water after three Ryzen generations. Let's not even mention the fake 6XXX series MSRPs, where not a single AIB was even close. They're not even trying to mitigate AIB prices.

Intel in the GPU space is badly needed, and it will be healthier for consumers. Just like nobody wants AMD gone.

For one, they're coming straight out of the gate with an AI-based DLSS competitor.
 

Buggy Loop

Member
I am sure before Zen 1 there were plenty of people saying the same thing about AMD CPUs.

Right... Nvidia is stuck on a node at their foundry and has been sleeping on tech for over a decade, not bringing anything new, just like Intel did...

This stupid comparison keeps popping up without any foundation for a valid comparison. Intel is already closing the gap with a few investments; it's not looking bright for Ryzen in the coming years, they poked a dragon. And this is coming from a gen 1 Ryzen buyer and now a 5600X owner.
 

Silver Wattle

Gold Member
I thought I wasn't supposed to care about TFLOPs anymore.
You can only compare against similar architectures, so comparing AMD vs Nvidia is pointless, but comparing RDNA 3 to RDNA 2 has merit. Until we get more information regarding RDNA 3, though, the raw numbers should be treated as a vague guide and not a direct percentage-increase comparison.
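For illustration, a quick back-of-the-envelope sketch (Python; reference shader counts and boost clocks that I believe are roughly right, so treat the output as ballpark) of why paper TFLOPs don't translate across vendors:

# Peak FP32 throughput = shader count * 2 ops (FMA) * boost clock (GHz) / 1000.
def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0

rx_6900_xt = tflops(5120, 2.25)    # RDNA 2: 80 CUs * 64 stream processors, ~2.25 GHz boost
rtx_3090 = tflops(10496, 1.695)    # Ampere: 82 SMs * 128 lanes, half of them shared FP32/INT32

print(f"6900 XT: {rx_6900_xt:.1f} TF | 3090: {rtx_3090:.1f} TF "
      f"({rtx_3090 / rx_6900_xt - 1:.0%} more on paper, far less of a gap in actual raster)")

Both cards trade blows in rasterization despite the big on-paper gap, which is the point about only comparing within the same architecture family.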
 

Dirk Benedict

Gold Member
The big question is whether RDNA 3 and Ada Lovelace will reanimate the GPU mining market.
If they do, the remaining consumers are f*****.

I hope not, but I'm being selfish and thinking of myself, and possibly anyone else that wants to enjoy a GPU to play... games.
 

Sho_Gunn

Banned
Right... Nvidia is stuck on a node at their foundry and has been sleeping on tech for over a decade, not bringing anything new, just like Intel did...

This stupid comparison keeps popping up without any foundation for a valid comparison. Intel is already closing the gap with a few investments; it's not looking bright for Ryzen in the coming years, they poked a dragon. And this is coming from a gen 1 Ryzen buyer and now a 5600X owner.
What are you talking about? The 5800X3D is possibly the best gaming CPU, and it literally came out last week.
 

YCoCg

Gold Member
I'd happily take a 40 TFLOP GPU for under 200W, hell I'd be interested in seeing what can be squeezed under 150W.
 

SlimySnake

Flashless at the Golden Globes
All this to still get wiped by Nvidia.

Never buy AMD GPUs.

No idea why this is still a thing after RDNA 2.0. AMD GPUs are on par with, if not slightly ahead of, Nvidia GPUs in all games except for games with ray tracing. Even then, you are looking at Metro Exodus and Cyberpunk for the biggest differences. UE5 has ray tracing and seems to favor the AMD cards. See the Matrix benchmarks below. Only by a little, but considering how virtually everyone and their mothers are moving to UE5 next gen, it's fair to say we won't be seeing Metro- and Cyberpunk-like differences going forward even if AMD fails to improve their RT performance, which is highly unlikely.

[Matrix demo benchmark chart]



Here is a comparison of a 3080 12 GB and a 6900 XT, both $999 cards at MSRP. The only difference is that the 3080 12 GB consumes 100-140 watts more than the 6900 XT for 1% better average performance, even when including the Metro results.
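To make the efficiency point concrete, a minimal sketch (Python; the wattages are my own illustrative placeholders consistent with the 100-140 W gap above, not measured figures):

# Toy perf-per-watt comparison. relative_perf = 1.00 is the 6900 XT baseline and
# 1.01 the "1% better" 3080 12 GB average; board powers are assumed placeholders.
cards = {
    "RX 6900 XT":    {"relative_perf": 1.00, "board_power_w": 300},
    "RTX 3080 12GB": {"relative_perf": 1.01, "board_power_w": 420},
}

for name, c in cards.items():
    print(f"{name}: {c['relative_perf'] / c['board_power_w'] * 1000:.2f} perf per kW")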

 

Mister Wolf

Member
No idea why this is still a thing after RDNA 2.0. AMD GPUs are on par with, if not slightly ahead of, Nvidia GPUs in all games except for games with ray tracing. Even then, you are looking at Metro Exodus and Cyberpunk for the biggest differences. UE5 has ray tracing and seems to favor the AMD cards. See the Matrix benchmarks below. Only by a little, but considering how virtually everyone and their mothers are moving to UE5 next gen, it's fair to say we won't be seeing Metro- and Cyberpunk-like differences going forward even if AMD fails to improve their RT performance, which is highly unlikely.

[Matrix demo benchmark chart]

Here is a comparison of a 3080 12 GB and a 6900 XT, both $999 cards at MSRP. The only difference is that the 3080 12 GB consumes 100-140 watts more than the 6900 XT for 1% better average performance, even when including the Metro results.



See the pic I posted in the thread.
 

Rickyiez

Member
Always wondered if 4K is worth it on PC right now, or with the RTX 4xxx series. Seems that 1440p 144 Hz would be better? I don't know if there's much difference between 1440p and 4K on a 27" monitor.
The adoption of big-screen gaming keeps growing, especially with the 42" or 48" OLED TVs that people like me use on their desk. So a 4K 60 FPS capable GPU is the absolute minimum.
 

Dream-Knife

Banned
The adoption of big-screen gaming keeps growing, especially with the 42" or 48" OLED TVs that people like me use on their desk. So a 4K 60 FPS capable GPU is the absolute minimum.
How far do you sit from that?

I am sure before Zen 1 there were plenty of people saying the same thing about AMD CPUs.
As someone who owned an RDNA2 card, I'm never doing that again. Probably not buying another AMD CPU either after the USB issues with Zen 3.
 

hlm666

Member
I don't know what game that is, but I'd take the average of 50 games over 1 or 2 games with really bad results.

Again, UE5 is going to be the standard next gen and the RDNA 2.0 cards seem to be on par with the 30 series cards.
If UE5 CPU usage doesn't get a big improvement, these new flagship cards are gonna be lucky to hit 50% load because they'll be CPU bound anyway. The 6900 XT already drops to under 90% load regularly in the Matrix demo (so does the 3080 Ti; this is not a shot at AMD).
 

Haint

Member
Because Intel is greedy and Raja is not trustworthy.

Intel have been ruthless towards competing businesses, but in terms of consumer pricing they've been far better than both AMD and Nvidia. Intel held an uncontested CPU performance advantage for nearly 15 years, but always sold successive parts at effectively the same price as the prior gen. They could have easily doubled their prices when they held something like 50% performance leads over AMD's Bulldozer dumpster fires, but they didn't. The very first time AMD gained an advantage (Zen 3), they jacked their mainstream performance tier up by $150 (3700X > 5800X, sitting on the 5700X for two years as a big fat "fuck you, pay me"), and everything else by $50. Nvidia have similarly jacked their mainstream performance tier from the $329 970 to a now "$500" 3070 (actually ~$600, as 99% of the inventory goes to AIBs). Meanwhile, Intel is still selling their mainstream performance i7-12700KF ($365) for nearly the same price they sold the decade-old i7-2700K at ($330).
 

SlimySnake

Flashless at the Golden Globes
If UE5 CPU usage doesn't get a big improvement, these new flagship cards are gonna be lucky to hit 50% load because they'll be CPU bound anyway. The 6900 XT already drops to under 90% load regularly in the Matrix demo (so does the 3080 Ti; this is not a shot at AMD).
Nah, I've seen several RDNA benchmarks that go over 97% and stay there. It's only when they are flying around that they drop to 70%. My 3080 remains at 98% almost all the time unless I start flying around.
 

V1LÆM

Gold Member
I am sure before Zen 1 there were plenty of people saying the same thing about AMD CPUs.
Yeah sure, because AMD hadn't released a good CPU in a while, but lots of people were eagerly awaiting the Zen chips. AMD was really competitive up to about 2012.

AMD GPUs aren't bad, but they definitely aren't as well made and supported as Nvidia's. I've had AMD GPUs and they were fine, but there were various issues that pushed me to Nvidia. It would take a lot for me to go back. I'm not a fanboy; I want AMD to keep making CPUs and GPUs. It keeps Intel and Nvidia on their toes. You can bet your ass we'd still all be using 4-core CPUs if it wasn't for Zen. Competition is good for everyone.
 

FireFly

Member
Right... Nvidia is stuck on a node at their foundry and has been sleeping on tech for over a decade, not bringing anything new, just like Intel did...

This stupid comparison keeps popping up without any foundation for a valid comparison. Intel is already closing the gap with a few investments; it's not looking bright for Ryzen in the coming years, they poked a dragon. And this is coming from a gen 1 Ryzen buyer and now a 5600X owner.
What happened with Intel is exactly my point. Based on their near perfect execution up until Skylake, and AMD's horrendous execution and likely path towards bankruptcy, it would have been crazy to suggest the roles would be reversed over the coming generations. Even with Intel struggling, AMD still needed an 80% increase in IPC to draw even in productivity applications and more in gaming, after Bulldozer went backwards.

Comparatively, I think that's a much worse position than their GPUs are in currently. Right now they are at rough parity in rasterization and just need improved ray tracing performance and for FSR 2.0 to be "good enough" and widely adopted. I don't know if RDNA 3 will deliver what they need, but stating it is somehow impossible to close the gap makes no sense to me.

As far as AMD CPUs go, based on the rumours/leaks, Zen 4 looks to be extremely competitive with Raptor Lake, with Zen 5 delivering a crazy increase in multicore performance thanks to Zen 4 "efficiency cores" stacked on top of Zen 5 performance cores.
Yeah sure, because AMD hadn't released a good CPU in a while, but lots of people were eagerly awaiting the Zen chips. AMD was really competitive up to about 2012.
At the low end, sure, but Phenom couldn't stand up to Nehalem/Sandy Bridge.
 

SmokSmog

Member
AD102, 144 SM (RTX 4090 Ti)
2.5 GHz = 92 TF
3 GHz = 110.5 TF

AD103, 84 SM (RTX 4080)
2.5 GHz = 53.7 TF
3 GHz = 64.5 TF

AD104, 60 SM (RTX 4070)
2.5 GHz = 38.4 TF
3 GHz = 46 TF

AD106, 36 SM (RTX 4060)
2.5 GHz = 23 TF
3 GHz = 27.6 TF

AD107, 24 SM (RTX 4050 Ti)
2.5 GHz = 15.4 TF
3 GHz = 18.4 TF

If Nvidia has fixed FP32 utilisation in the Ada GPUs, then the FP32 numbers will be fake no more and AMD will be done.
Ampere is FP32 + FP32/INT32 (half of the FP32 lanes are shared with INT32; this is where those fake teraflops come from).
If Ada Lovelace is FP32 + FP32 + INT32, then AMD is cooked.
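For reference, a minimal sketch (Python) of the arithmetic behind those figures. It assumes Ada keeps Ampere's 128 FP32 lanes per SM (64 dedicated FP32 + 64 shared FP32/INT32) and counts FMA as two ops per clock; the INT32 share is a made-up illustrative number, not a measurement.

# Peak FP32 TFLOPs = SMs * FP32 lanes per SM * 2 (FMA) * clock (GHz) / 1000.
def peak_fp32_tflops(sm_count: int, clock_ghz: float, lanes_per_sm: int = 128) -> float:
    return sm_count * lanes_per_sm * 2 * clock_ghz / 1000.0

# Hypothetical "real" throughput when a fraction of the shared 64-wide
# FP32/INT32 path is doing INT32 work instead of FP32 (assumed mix).
def effective_fp32_tflops(sm_count: int, clock_ghz: float, int32_share: float = 0.25) -> float:
    dedicated = sm_count * 64 * 2 * clock_ghz / 1000.0
    shared = sm_count * 64 * 2 * clock_ghz / 1000.0 * (1.0 - int32_share)
    return dedicated + shared

if __name__ == "__main__":
    for name, sms in [("AD102", 144), ("AD103", 84), ("AD104", 60), ("AD106", 36), ("AD107", 24)]:
        for clock in (2.5, 3.0):
            print(f"{name} @ {clock} GHz: {peak_fp32_tflops(sms, clock):5.1f} TF peak, "
                  f"~{effective_fp32_tflops(sms, clock):5.1f} TF with 25% INT32 on the shared path")

This reproduces the peak numbers above (e.g. 144 SMs at 2.5 GHz comes out to 92.2 TF); the "effective" column is only there to illustrate why the paper figure overstates real FP32 throughput on Ampere-style designs.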
 

SeraphJan

Member
RDNA 3 vs Lovelace looks as close as it could get in terms of performance. I can't remember the last time Radeon was on par with Nvidia, 15 years ago? I just hope the 7800 XT and RTX 4080 are affordable; I don't need the top SKU.
 

SeraphJan

Member
AD102, 144 SM (RTX 4090 Ti)
2.5 GHz = 92 TF
3 GHz = 110.5 TF

AD103, 84 SM (RTX 4080)
2.5 GHz = 53.7 TF
3 GHz = 64.5 TF

AD104, 60 SM (RTX 4070)
2.5 GHz = 38.4 TF
3 GHz = 46 TF

AD106, 36 SM (RTX 4060)
2.5 GHz = 23 TF
3 GHz = 27.6 TF

AD107, 24 SM (RTX 4050 Ti)
2.5 GHz = 15.4 TF
3 GHz = 18.4 TF

If Nvidia has fixed FP32 utilisation in the Ada GPUs, then the FP32 numbers will be fake no more and AMD will be done.
Ampere is FP32 + FP32/INT32 (half of the FP32 lanes are shared with INT32; this is where those fake teraflops come from).
If Ada Lovelace is FP32 + FP32 + INT32, then AMD is cooked.

Do we have any information on how much power the RTX 4080 consumes? If the 4070 is at 300 W, can't we expect the 4080 not to break 400 W?
 

Sanepar

Member
AD102, 144 SM (RTX 4090 Ti)
2.5 GHz = 92 TF
3 GHz = 110.5 TF

AD103, 84 SM (RTX 4080)
2.5 GHz = 53.7 TF
3 GHz = 64.5 TF

AD104, 60 SM (RTX 4070)
2.5 GHz = 38.4 TF
3 GHz = 46 TF

AD106, 36 SM (RTX 4060)
2.5 GHz = 23 TF
3 GHz = 27.6 TF

AD107, 24 SM (RTX 4050 Ti)
2.5 GHz = 15.4 TF
3 GHz = 18.4 TF

If Nvidia has fixed FP32 utilisation in the Ada GPUs, then the FP32 numbers will be fake no more and AMD will be done.
Ampere is FP32 + FP32/INT32 (half of the FP32 lanes are shared with INT32; this is where those fake teraflops come from).
If Ada Lovelace is FP32 + FP32 + INT32, then AMD is cooked.

Nah man, a 4070 at 300 W will be loud as fuck. If AMD can cook up a 7800 at 250 W, I will go AMD.
 

Celcius

°Temp. member
Glad to see that Nvidia seems to be going with more VRAM this time around. It's about time.
 
No idea why this is still a thing after RDNA 2.0. AMD GPUs are on par with, if not slightly ahead of, Nvidia GPUs in all games except for games with ray tracing. Even then, you are looking at Metro Exodus and Cyberpunk for the biggest differences. UE5 has ray tracing and seems to favor the AMD cards. See the Matrix benchmarks below. Only by a little, but considering how virtually everyone and their mothers are moving to UE5 next gen, it's fair to say we won't be seeing Metro- and Cyberpunk-like differences going forward even if AMD fails to improve their RT performance, which is highly unlikely.

[Matrix demo benchmark chart]

Here is a comparison of a 3080 12 GB and a 6900 XT, both $999 cards at MSRP. The only difference is that the 3080 12 GB consumes 100-140 watts more than the 6900 XT for 1% better average performance, even when including the Metro results.


I think AMD is doing pretty well already. RTX is over-emphasized in comparisons, just like any tech Nvidia is pushing (it has happened countless times at this point). Nvidia cards are better at RT, but there's barely any good implementation of RT out there, and in pretty much all of them an argument can be made that you'd be better off choosing more performance over RT. As soon as AMD catches up, be it with RT or FSR, we'll move on to the next thing and people will still insist that AMD has horrible drivers, etc.

It's a lot like what happens against Intel: AMD is only seen as doing well when they have an unquestionably better product that crushes the competition. If they are just close or slightly ahead, people will find plenty of excuses for why you should not buy their products.

The adoption of big-screen gaming keeps growing, especially with the 42" or 48" OLED TVs that people like me use on their desk. So a 4K 60 FPS capable GPU is the absolute minimum.
It's a laughable notion that 4K and 60 fps are the absolute minimum.
 

Buggy Loop

Member


So basically, just like the last round of GPU rumours before an unveiling: a lot of guesswork and stabs in the air when we don't even understand the architecture.
 

Bo_Hazem

Banned
Mac?! WTF bro?

#NoFucksGivenToPCgaming

I only use my PC for other general uses, but most importantly for video editing. The Mac Studio is compact as fuck, cool as fuck, powerful as fuck. I've had a Radeon VII PC since 2019 and never played on it; it cost me around $3400 to build.
 

Hezekiah

Banned
#NoFucksGivenToPCgaming

I only use my PC for other general uses, but most importantly for video editing. The Mac Studio is compact as fuck, cool as fuck, powerful as fuck. I've had a Radeon VII PC since 2019 and never played on it; it cost me around $3400 to build.
OK, fair enough. I thought there might be some MS games you have a slight interest in.

When it comes to power though, you can always get more power on PC compared to Mac at the same price point. Are video editing programs that much better on the Mac side of things? I always hear Macs are great for that type of work, but I've always wondered what the difference is.

I just feel like every Apple product is grossly overpriced!
 