
9800X3D review thread

winjer

Member
Not worth creating a new thread just for this, so I'll leave it here.


The desktop version of AMD's next-generation "Zen 6" microarchitecture will retain Socket AM5, according to Kepler_L2, a reliable source for hardware leaks. What's more interesting is the rumor that the current "Zen 5" will remain AMD's mainstay for the entirety of 2025, and possibly even most of 2026, at least on the desktop platform. AMD will be banking heavily on the recently announced Ryzen 7 9800X3D, and its high core-count siblings, the Ryzen 9 9950X3D and a possible 9900X3D, to see the company through 2025 against Intel. The 9800X3D posted significantly higher gaming performance than Intel, and the 9950X3D is expected to be at least somewhat faster than the 7950X3D at gaming, which means its gaming performance, coupled with the multithreaded application performance of its 16-core/32-thread design, should make it the face of AMD's desktop processor lineup for at least the next year.

It wouldn't be out of character for AMD to launch "Zen 6" on AM5 and not refresh the platform. The company launched four microarchitectures (Zen through Zen 3) on Socket AM4. With "Zen 6," AMD has the opportunity to not just increase IPC, but also raise core counts per CCD and cache sizes, move to a new foundry node such as 3 nm, and probably even introduce features such as a hybrid architecture and an NPU to the desktop platform, which means it could at least update the current 6 nm client I/O die (cIOD) while retaining AM5. A new cIOD could give AMD the much-needed opportunity to update the DDR5 memory controllers to support higher memory frequencies. The Kepler_L2 leak predicts a "late-2026 or early-2027" launch for desktop "Zen 6" processors. In the meantime, Intel is expected to ramp "Arrow Lake-S" on Socket LGA1851, and debut the "Panther Lake" microarchitecture on LGA1851 in 2025-26.
 

Von Hugh

Member
Not worth creating a new thread just for this, so I'll leave it here.


AMD (and AM5) keeps on giving.
 

SolidQ

Member
Handheld Strix Point - https://wccftech.com/worlds-first-amd-strix-point-handheld-onexfly-f1-pro-starts-1099-us-up-to-1899/

Zen 4 vs Zen 5 at matched 4.8 GHz clocks:
[benchmark chart]
 
I ordered a 5700X3D to replace my 5800X since I couldn't find any 5800X3D at reasonable prices. Although now I'm thinking I should have just waited for Black Friday.
 

marquimvfs

Member

Pretty good content by them. I hope it puts to bed the "I wanna see 4K results" demands in the future. I mean, most of us already knew that testing CPUs in a GPU-bound scenario isn't a good representation of a given CPU's performance, but some people still had their doubts.
We need to remember that a benchmark isn't about showing you exactly what you will see at home; it's a metric of how a component behaves in comparison with another, in a controlled environment. Unless your setup is exactly like the testbed, you won't see the same results. It's just useful for seeing which part is the absolute fastest, and for observing the margins.
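To put rough numbers on that (a toy model of my own, not from the video, with made-up FPS figures): a frame can't complete faster than the slower of the two components allows, so once the GPU is the limiter, CPU differences vanish from the chart.

```python
# Toy model: effective FPS is capped by whichever component is slower.
# All numbers are invented purely to illustrate the GPU-bound effect.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is capped by the slower of CPU and GPU."""
    return min(cpu_fps, gpu_fps)

cpus = {"fast CPU": 240.0, "slow CPU": 150.0}
gpu_by_res = {"1080p": 300.0, "4K": 70.0}  # hypothetical GPU throughput

for res, gpu_fps in gpu_by_res.items():
    for name, cpu_fps in cpus.items():
        print(f"{res}: {name} -> {effective_fps(cpu_fps, gpu_fps):.0f} FPS")
# 1080p: fast CPU -> 240 FPS   (the gap between the CPUs is visible)
# 1080p: slow CPU -> 150 FPS
# 4K:    fast CPU -> 70 FPS    (the gap disappears entirely)
# 4K:    slow CPU -> 70 FPS
```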
It's so good that it deserves its own thread, if you're willing to make it, winjer
 

MikeM

Member
Pretty good content by them. I hope it puts to bed the "I wanna see 4K results" demands in the future. I mean, most of us already knew that testing CPUs in a GPU-bound scenario isn't a good representation of a given CPU's performance, but some people still had their doubts.
We need to remember that a benchmark isn't about showing you exactly what you will see at home; it's a metric of how a component behaves in comparison with another, in a controlled environment. Unless your setup is exactly like the testbed, you won't see the same results. It's just useful for seeing which part is the absolute fastest, and for observing the margins.
It's so good that it deserves its own thread, if you're willing to make it, winjer
It's why I “only” use a 7600X for my 4K 120 fps TV PC build. I'm 99% of the time GPU-bound. Any more CPU wouldn't be worth the cost.
 

marquimvfs

Member
It's why I “only” use a 7600X for my 4K 120 fps TV PC build. I'm 99% of the time GPU-bound. Any more CPU wouldn't be worth the cost.
Yeah, if you're not playing competitively, it's more than fine. Spend more on the GPU side and upgrade in the future if needed.
 

winjer

Member
Pretty good content by them. I hope it puts to bed the "I wanna see 4K results" demands in the future. I mean, most of us already knew that testing CPUs in a GPU-bound scenario isn't a good representation of a given CPU's performance, but some people still had their doubts.
We need to remember that a benchmark isn't about showing you exactly what you will see at home; it's a metric of how a component behaves in comparison with another, in a controlled environment. Unless your setup is exactly like the testbed, you won't see the same results. It's just useful for seeing which part is the absolute fastest, and for observing the margins.
It's so good that it deserves its own thread, if you're willing to make it, winjer

There is one thing to consider. Most people don't run games at native resolution.
Currently, most modern games have DLSS, FSR, XeSS, TSR and PSSR.
So when we run a game at 4K with DLSS Quality, the real resolution is 1440p. And it's 1080p for the Performance preset.
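As a quick illustration (using the commonly published per-axis DLSS scale factors; exact values can vary per game), here's what each preset renders internally at a 4K output. The Balanced factor is what gives the "1253p" figure mentioned later in the thread.

```python
# Internal render resolution for the usual DLSS presets at a given output.
# Scale factors are the commonly published per-axis ratios.
DLSS_SCALE = {
    "Quality": 2 / 3,            # 4K -> 2560x1440
    "Balanced": 0.58,            # 4K -> ~2227x1253
    "Performance": 0.50,         # 4K -> 1920x1080
    "Ultra Performance": 1 / 3,  # 4K -> 1280x720
}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    """Approximate resolution the GPU actually renders before upscaling."""
    scale = DLSS_SCALE[preset]
    return round(width * scale), round(height * scale)

for preset in DLSS_SCALE:
    w, h = internal_res(3840, 2160, preset)
    print(f"{preset:>17}: {w}x{h}")
```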
 

marquimvfs

Member
There is one thing to consider. Most people don't run games at native resolution.
Currently, most modern games have DLSS, FSR, XeSS, TSR and PSSR.
So when we run a game at 4K with DLSS Quality, the real resolution is 1440p. And it's 1080p for the Performance preset.
Yeah, I think it's pretty much a given that almost everyone gaming on a 4K monitor or TV has a strong GPU and is aware of that. The test also touches on that subject.

What's most interesting is the test of "ancient" games and CPUs; it shows that testing at 4K (even with DLSS) completely defeats the purpose of the test when showing what those "ancient" CPUs can do with today's games and GPUs.
 

GHG

Member
It's why I “only” use a 7600X for my 4K 120 fps TV PC build. I'm 99% of the time GPU-bound. Any more CPU wouldn't be worth the cost.

Yep, same logic as to why I went for the 9700X for my living room build. Especially since it's a Philips 4K OLED that I leave in 60 fps mode, because that's the max framerate the surround sound amp will accept at that resolution.

It's efficient, cool, runs quiet, and also has incredible emulation performance. I doubt I'd see any noticeable uptick in performance on that system if I got a 9800X3D, considering it's a 4080 GPU and I'm capped at 60 fps.

Really nice to see someone test what is a more typical real world scenario for a lot of us.
 

hinch7

Member
There is one thing to consider. Most people don't run games at native resolution.
Currently, most modern games have DLSS, FSR, XeSS, TSR and PSSR.
So when we run a game at 4K with DLSS Quality, the real resolution is 1440p. And it's 1080p for the Performance preset.
My only qualm with his video is that 4090 users aren't going to use Balanced DLSS when they just spent $1600-2000 on a GPU. Going from Balanced (1253p) to Quality shows a big jump in image quality.

Or in the not-so-distant future, when people have enough power to drive their mid-range GPUs at 4K Quality in RT-demanding games, most systems are likely to hit GPU limits. That's until more powerful GPUs like the 5090 release; then this video would make more sense.
 

winjer

Member
My only qualm with his video is that 4090 users aren't going to use Balanced DLSS when they just spent $1600-2000 on a GPU. Going from Balanced (1253p) to Quality shows a big jump in image quality.

Or in the future, when people have enough power to drive their GPUs at 4K Quality, most systems are likely to hit GPU limits. That's until more powerful GPUs like the 5090 release; then this video would make more sense.

Usually, people change their GPU sooner than they change their CPU, mostly because the latter usually involves changing the motherboard and memory as well.
So your argument that a next-gen GPU will provide more room for the CPU is an accurate take.
 

marquimvfs

Member
My only qualm with his video is that 4090 users aren't going to use Balanced DLSS when they just spent $1600-2000 on a GPU. Going from Balanced (1253p) to Quality shows a big jump in image quality.
Yes, but what about the players without a 4090? Will those settings impact them? Again, the 4090 is there only to help prevent a GPU bottleneck, not to show 4090 users how their systems will behave; that's reserved for the 4090's own tests.
 

ap_puff

Member
Yep, same logic as to why I went for the 9700X for my living room build. Especially since it's a Philips 4K OLED that I leave in 60 fps mode, because that's the max framerate the surround sound amp will accept at that resolution.

It's efficient, cool, runs quiet, and also has incredible emulation performance. I doubt I'd see any noticeable uptick in performance on that system if I got a 9800X3D, considering it's a 4080 GPU and I'm capped at 60 fps.

Really nice to see someone test what is a more typical real world scenario for a lot of us.
Why didn't you just get a 7700 non-X and run it in Eco Mode? IIRC those are mad cheap.
 

hinch7

Member
Yes, but what about the players without a 4090? Will those settings impact them? Again, the 4090 is there only to help prevent a GPU bottleneck, not to show 4090 users how their systems will behave; that's reserved for the 4090's own tests.
Results will vary either way with different settings, setups, and what the user is going for. But sure, a more powerful CPU will most likely benefit a person's setup when not GPU-limited and when the game allows it. Though with how heavy game engines are these days, most games will run into GPU limits way before CPU ones.

But at the same time, it's disingenuous to show 'real world' tests at 4K, where you're likely going to be GPU-bound on higher settings, only to artificially cap resolution and settings just to prove a point. We can already see it clearly with the myriad of 1080p tests and reviews done with a 4090, and those data points.
 

LiquidMetal14

hide your water-based mammals
This is especially good because they are using DLSS Balanced at 4K (so an internal res of just over 1080p, if I'm not mistaken).

It goes to show that even though the internal resolution before any AI upscaling is that low, you are still very much GPU-bound in most games when outputting at that resolution.
Exactly why people like me see the benefit of upgrading: we're going to upgrade from the 4090s we have to a 5090. That's when we will actually see a more pronounced difference at 4K.
 

marquimvfs

Member
But at the same time, it's disingenuous to show 'real world' tests at 4K, where you're likely going to be GPU-bound on higher settings, only to artificially cap resolution and settings just to prove a point. We can already see it clearly with the myriad of 1080p tests and reviews done with a 4090, and those data points.
Now I get you better, but I think they thought about it the other way around. Instead of testing even further up the quality ladder, they did step-by-step benchmarks, and the first step up in quality was already enough to prove the point. I mean, if they had continued increasing settings, the results graph would have been even flatter, and there would be somebody pointing out that those are the real results. Just showing that the GPU bottleneck is real is more than enough to prove the point; there was no need to continue.

I still think it's stupid to increase the quality just to make the 4090 work harder like in the "real world", or because "4090 owners won't play that way", or even because "whoever buys this for sure doesn't play at just 1080p".

I'm all in favour of testing everything at 720p whenever possible, with the fastest (not necessarily the best) GPU available.
 

scraz

Member
If you are on the hunt for one, sign up for alerts here.


Just got mine off B&H. Fuck scalpers.
 

GHG

Member
Why didn't you just get a 7700 non X and run in eco mode? iirc those are mad cheap

This is how the two components in question are priced where I live (don't ask me why, but that's how it is, it's just a $20 difference):

[price listings for the 7700 and 9700X]

Plus the emulation performance:

[PS3 emulation performance chart]


Running the 9700X stock results in similar power consumption and lower temps than the 7700 in stock form, all while netting better performance.

Like you said, if the 7700 was significantly cheaper then it would have made sense at 4K, but it isn't here.

Exactly why people like me see the benefit of upgrading: we're going to upgrade from the 4090s we have to a 5090. That's when we will actually see a more pronounced difference at 4K.

Ironically, I have a 4090 paired with a 5800X3D in a different system, but again, I'm happy with the performance there at 4K (even at 120 Hz).

If it makes sense to upgrade to the 5090 and there is a significant bottleneck at 4K, then I'll look at doing something there, but I'm not in the business of buying parts in anticipation of bottlenecks that don't currently exist for my use case.
 

analog_future

Resident Crybaby
I… think I hit the lottery with my 9800X3D?

It's passing intense stress tests at a 10x PBO scalar, +200 MHz boost clock override, and a -50 curve optimizer offset.


I almost think I'm missing something; I'm so surprised it's allowing me to undervolt/overclock to this degree with no special effort.
 

analog_future

Resident Crybaby
I… think I hit the lottery with my 9800X3D?

It's passing intense stress tests at a 10x PBO scalar, +200 MHz boost clock override, and a -50 curve optimizer offset.


I almost think I'm missing something; I'm so surprised it's allowing me to undervolt/overclock to this degree with no special effort.

Ran a few more stress tests and it seems to be golden. 1448 Cinebench 2024 score. Super, super pleased.

Done messing around tonight but can't wait to actually play some games with this tomorrow.
 

Codeblew

Member
Ran a few more stress tests and it seems to be golden. 1448 Cinebench 2024 score. Super, super pleased.

Done messing around tonight but can't wait to actually play some games with this tomorrow.
That is crazy. That is a -50 undervolt? Did you try any gaming benchmarks or memory tests for stability?
 

LiquidMetal14

hide your water-based mammals
This is how the two components in question are priced where I live (don't ask me why, but that's how it is, it's just a $20 difference):

[price listings for the 7700 and 9700X]

Plus the emulation performance:

[PS3 emulation performance chart]


Running the 9700X stock results in similar power consumption and lower temps than the 7700 in stock form, all while netting better performance.

Like you said, if the 7700 was significantly cheaper then it would have made sense at 4K, but it isn't here.



Ironically, I have a 4090 paired with a 5800X3D in a different system, but again, I'm happy with the performance there at 4K (even at 120 Hz).

If it makes sense to upgrade to the 5090 and there is a significant bottleneck at 4K, then I'll look at doing something there, but I'm not in the business of buying parts in anticipation of bottlenecks that don't currently exist for my use case.
Well said. We're GPU-bottlenecked at 4K more than CPU-bottlenecked. People like me want the best they can get to drive multiple 4K high-refresh panels. Looking forward to the next few months.
 

analog_future

Resident Crybaby
That is crazy. That is a -50 undervolt? Did you try any gaming benchmarks or memory tests for stability?

Yep, a -50 undervolt. Completed a 20-minute OCCT memory/CPU/AVX2 stress test, several 3DMark benchmarks, and several Cinebench 2024 runs. No lockups/crashes yet, and temps are maxing out in the low/mid 70s.


Will attempt to complete a 60-minute stress test today and then do a bunch of gaming to confirm stability, but it all looks really promising so far.
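Not a substitute for OCCT or real gaming, but as a minimal sketch of the same idea (my own illustration, nothing official): run an identical deterministic workload on every core and flag any disagreement, since an over-aggressive curve optimizer offset can produce silent compute errors before it causes an outright crash.

```python
# Minimal all-core sanity check: every worker runs the same deterministic
# hash chain, so any differing result hints at unstable silicon.
import hashlib
from multiprocessing import Pool, cpu_count

def burn(seed: int) -> str:
    h = hashlib.sha256(str(seed).encode())
    for _ in range(2_000_000):  # long chain of dependent hashing (CPU-heavy)
        h = hashlib.sha256(h.digest())
    return h.hexdigest()

if __name__ == "__main__":
    workers = cpu_count()
    with Pool(workers) as pool:
        results = pool.map(burn, [42] * workers)  # same seed on every core
    if len(set(results)) != 1:
        print("MISMATCH: possible instability under this undervolt")
    else:
        print(f"All {workers} cores agree: {results[0][:16]}...")
```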
 

DenchDeckard

Moderated wildly
Yep, same logic as to why I went for the 9700X for my living room build. Especially since it's a Philips 4K OLED that I leave in 60 fps mode, because that's the max framerate the surround sound amp will accept at that resolution.

It's efficient, cool, runs quiet, and also has incredible emulation performance. I doubt I'd see any noticeable uptick in performance on that system if I got a 9800X3D, considering it's a 4080 GPU and I'm capped at 60 fps.

Really nice to see someone test what is a more typical real world scenario for a lot of us.

Perfect build for the environment you're running it in.

I am an idiot (you know this) and just throw money at things to have the best, and I don't always think logically about how I'm going to be using the system.

Did you go small form factor, or do you just have a desktop hooked up to the TV? How has it been?
 

LordBritish

Member
Yep, a -50 undervolt. Completed a 20-minute OCCT memory/CPU/AVX2 stress test, several 3DMark benchmarks, and several Cinebench 2024 runs. No lockups/crashes yet, and temps are maxing out in the low/mid 70s.


Will attempt to complete a 60-minute stress test today and then do a bunch of gaming to confirm stability, but it all looks really promising so far.
Did you post the motherboard and RAM you're using?
 