Trogdor1123
I think my uncle is looking at getting one to replace his 7800X3D. I told him to wait another generation, but who knows with him.
> I went with the ASRock X870E Nova - seems to be one of the better boards according to this guy

The problem with this dude is that he doesn't actually test the boards and only looks at the specs on paper (which is also valuable, of course).
> All those people gaming on 720p screens with the lowest graphics settings; otherwise I can't explain this crazy desire to upgrade a CPU.

I'm pretty much always playing with DLSS Performance or even Ultra Performance, so I guess that's me!
Not worth creating a new thread just for this, so I'll leave it here.

AMD "Zen 6" to Retain Socket AM5 for Desktops, 2026-27 Product Launches (www.techpowerup.com)

> The desktop version of AMD's next-generation "Zen 6" microarchitecture will retain Socket AM5, Kepler_L2, a reliable source of hardware leaks, revealed. What's more interesting is the rumor that the current "Zen 5" will remain AMD's mainstay for the entirety of 2025, and possibly even most of 2026, at least on the desktop platform. AMD will be banking heavily on the recently announced Ryzen 7 9800X3D, and its higher core-count siblings, the Ryzen 9 9950X3D and a possible 9900X3D, to see the company through 2025 against Intel. The 9800X3D posted significantly higher gaming performance than Intel's competition, and the 9950X3D is expected to be faster than the 7950X3D at gaming, which means its gaming performance, coupled with the multithreaded application performance of its 16-core/32-thread configuration, should make it the face of AMD's desktop processor lineup for at least the next year.
> It wouldn't be out of character for AMD to launch "Zen 6" on AM5 and not refresh the platform: the company launched three microarchitectures (Zen through Zen 3) on Socket AM4. With "Zen 6," AMD has the opportunity not just to increase IPC, but also to raise core counts per CCD and cache sizes, move to a newer foundry node such as 3 nm, and probably even introduce features such as a hybrid architecture and an NPU to the desktop platform. At minimum, it could update the current 6 nm client I/O die (cIOD) while retaining AM5; a new cIOD would give AMD the much-needed opportunity to update the DDR5 memory controllers to support higher memory frequencies. The Kepler_L2 leak predicts a "late-2026 or early-2027" launch for desktop "Zen 6" processors. In the meantime, Intel is expected to ramp "Arrow Lake-S" on Socket LGA1851, and debut the "Panther Lake" microarchitecture on LGA1851 in 2025-26.
> Handheld Strix Point - https://wccftech.com/worlds-first-amd-strix-point-handheld-onexfly-f1-pro-starts-1099-us-up-to-1899/

Expensive lol
4.8 vs 4.8 GHz
Zen 4 vs Zen 5

> There are tests at the end of the video?

I must be blind, I don't see any.
> Seems like a solid enough board for the price.

Seems like a great choice as long as you don't want to do anything crazy with the memory.
Big difference, even at 4K.
> Pretty good content by them. I hope it puts to bed the "I wanna see 4K results" requests in the future. I mean, most of us already knew that testing CPUs in a GPU-bound scenario isn't a good representation of a given CPU's performance, but some people still had their doubts.
> We need to remember that a benchmark isn't about showing you exactly what you will see at home; it's a metric of how one component behaves in comparison with another, in a controlled environment. Unless your setup is exactly like the testbed, you won't see the same results. It's just useful for seeing what's the absolute fastest and observing the margins.
> It's so good that it deserves its own thread, if you're willing to make it, winjer

It's why I "only" use a 7600X for my 4K 120 fps TV PC build. I'm 99% of the time GPU-bound. Any more CPU wouldn't be worth the cost.
> It's why I "only" use a 7600X for my 4K 120 fps TV PC build. [...]

Yeah, if you're not playing competitively, it's more than fine. Spend more on the GPU side and upgrade in the future if needed.
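To put toy numbers on the GPU-bound point: to a first approximation, a game's frame rate is the slower of what the CPU can prepare and what the GPU can render at a given resolution. A minimal sketch with made-up figures (not real benchmark data):

```python
# Toy bottleneck model: a frame is finished only when both the CPU (simulation,
# draw-call submission) and the GPU (rendering) are done, so roughly
# fps = min(cpu_limited_fps, gpu_limited_fps_at_resolution).
# All numbers below are invented for illustration.

CPU_FPS = {"slow_cpu": 120, "fast_cpu": 200}       # CPU-limited frame rates
GPU_FPS = {"1080p": 220, "1440p": 150, "4k": 75}   # GPU-limited frame rates

for cpu, cpu_fps in CPU_FPS.items():
    for res, gpu_fps in GPU_FPS.items():
        fps = min(cpu_fps, gpu_fps)
        limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
        print(f"{cpu} @ {res}: ~{fps} fps ({limiter}-bound)")

# At 4K both CPUs land on the same ~75 fps (GPU-bound), while at 1080p the
# gap between them is fully visible -- which is why reviewers test at low res.
```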
> There is one thing to consider. Most people don't run games at native resolution.
> Currently, most modern games have DLSS, FSR, XeSS, TSR and PSSR.
> So when we run a game at 4K with DLSS Quality, the real resolution is 1440p. And it's 1080p for the Performance preset.

Yeah, I think it's pretty much a given that almost everyone gaming on a 4K monitor or TV has a strong GPU and is aware of that. The test also touches on that subject.
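For reference, those internal resolutions fall straight out of the per-axis scale factor each preset uses. A quick sketch, assuming the commonly published DLSS ratios (Quality 66.7%, Balanced 58%, Performance 50%, Ultra Performance 33.3%):

```python
# Internal render resolution for the common upscaler presets.
# Scale factors are the standard per-axis DLSS ratios.

PRESETS = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.50,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Return the resolution the game actually renders at before upscaling."""
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

for preset in PRESETS:
    w, h = internal_resolution(3840, 2160, preset)
    print(f"4K + {preset}: renders at {w}x{h}")

# 4K + quality -> 2560x1440, performance -> 1920x1080,
# balanced -> ~2227x1253 (the "1253p" mentioned below).
```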
> There is one thing to consider. Most people don't run games at native resolution. [...]

My only qualm with his video is that 4090 users aren't going to use Balanced DLSS when they just spent $1600-2000 on a GPU, and going from Balanced (1253p) to Quality shows a big jump in image quality.
Or in the future, when people have enough power to drive 4K Quality, most systems will still be hitting GPU limits. That's where more powerful GPUs, like the 5090 when it releases, would make this video make more sense.
> My only qualm with his video is that 4090 users aren't going to use Balanced DLSS when they just spent $1600-2000 on a GPU. [...]

Yes, but what about the players without a 4090? Will those settings impact them? Again, the 4090 is there only to help prevent a GPU bottleneck, not to show 4090 users how their systems will behave; that's reserved for the 4090's own tests.
> Yep, same logic as to why I went for the 9700X for my living room build. Especially since it's a Philips 4K OLED that I leave in 60 fps mode, because that's the max frame rate the surround sound amp will accept at that resolution.
> It's efficient, cool, runs quiet and also has incredible emulation performance. I doubt I'd see any noticeable uptick in performance on that system if I got a 9800X3D, considering it's a 4080 GPU and I'm capped at 60 fps.
> Really nice to see someone test what is a more typical real-world scenario for a lot of us.

Why didn't you just get a 7700 non-X and run it in eco mode? IIRC those are mad cheap.
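For anyone wondering what Eco Mode actually changes: on AM4/AM5 the socket power limit (PPT) is conventionally about 1.35x the advertised TDP, and Eco Mode drops the chip one TDP tier. A rough sketch using the usual published tiers (exact limits vary by board and BIOS):

```python
# AMD's socket power limit (PPT) is conventionally ~1.35x the advertised TDP
# on AM4/AM5, so the Eco Mode tiers work out roughly like this. Figures are
# the usual published limits, rounded; actual behavior depends on the BIOS.

ECO_TIERS = {170: 105, 105: 65, 65: 45}  # stock TDP -> Eco Mode TDP, watts

def ppt(tdp_w: int) -> int:
    return round(tdp_w * 1.35)

for tdp, eco in ECO_TIERS.items():
    print(f"{tdp} W TDP (PPT ~{ppt(tdp)} W) -> Eco: {eco} W TDP (PPT ~{ppt(eco)} W)")

# e.g. a 65 W Ryzen 7 7700 runs at ~88 W PPT stock and ~61 W PPT in Eco Mode.
```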
> Mindfactory sales
> P.S. Poor Error Lake

In what timeline is that? One day?
> Yes, but what about the players without a 4090? Will those settings impact them? [...]

Results will vary either way with different settings, setups and what the user is going for. But sure, a more powerful CPU would most likely benefit a person's setup when not GPU-limited and the game allows it. Though with how heavy game engines are these days, most games will run into GPU limits way before CPU limits.
> In what timeline is that? One day?

One week.
> This is especially good because they are using DLSS Balanced at 4K (so an internal res of just over 1080p, if I'm not mistaken).
> It goes to show that even though the internal resolution before any AI upscaling is that low, you are still very much GPU-bound in most games when outputting at that resolution.

Exactly why people like me see the benefit of upgrading: we're going to upgrade from the 4090s we have to a 5090. That's when we will actually see the more pronounced difference at 4K.
> I ordered a 5700X3D to replace my 5800X since I couldn't find any 5800X3Ds at reasonable prices. Although now I'm thinking I should have just waited for Black Friday.

Should still get you to next gen. Pulled more value out of that AM4 board that way as well.
> But at the same time it's disingenuous to show 'real world' tests at 4K - where you're likely going to be GPU-bound on higher settings - only to artificially cap the resolution and settings just to prove a point. We can already see it clearly with the myriad of 1080p tests and reviews with a 4090 and those data points.

Now I get you better, but I think they approached it the other way around. Instead of testing even further up the quality ladder, they did step-by-step benchmarks, and the first step up in quality was already enough to prove the point. I mean, if they had continued increasing settings, the results graph would have been even flatter, and somebody would point to those as the real results. Just showing that the GPU bottleneck is real is more than enough; there was no need to continue.
I… think I hit the lottery with my 9800X3D?
It’s passing intense stress tests at PBO scalar 10x, +200 MHz, and a -50 curve optimizer offset.
I almost think I’m missing something, I’m so surprised it’s allowing me to undervolt/overclock to this degree with no special effort.
> Ran a few more stress tests and it seems to be golden. 1448 Cinebench 2024 score. Super, super pleased.
> Done messing around tonight, but can't wait to actually play some games with this tomorrow.

That is crazy. That is a -50 undervolt? Did you try any gaming benchmarks or memory tests for stability?
> This is how the two components in question are priced where I live (don't ask me why, but that's how it is; it's just a $20 difference):
> Plus the emulation performance:
> Running the 9700X stock results in similar power consumption and lower temps than the 7700 in stock form, all while netting better performance.
> Like you said, if the 7700 were significantly cheaper then it would have made sense at 4K, but it isn't here.
> Ironically I have a 4090 paired with a 5800X3D in a different system, but again, I'm happy with the performance there at 4K (even at 120 Hz).
> If it makes sense to upgrade to the 5090 and there is a significant bottleneck at 4K, then I'll look at doing something there, but I'm not in the business of buying parts in anticipation of bottlenecks that don't currently exist for my use case.

Well said. We're GPU-bottlenecked at 4K more than CPU-bottlenecked. People like me want the best they can get to drive multiple 4K high-refresh panels. Looking forward to the next few months.
Can anyone advise on the best and longest-lasting thermal paste?
For ease of application, lifespan, and thermal performance I would recommend Arctic MX-6. MX-4 is slightly cheaper and almost as good.
As others have said, Arctic MX-6 is good; it spreads well and I had no issues with it when I used it. Currently using Noctua NT-H1, and that gave me slightly lower temps than the Arctic paste.
> Yep, -50 undervolt. Completed an OCCT 20-minute memory/CPU/AVX2 stress test, completed several 3DMark benchmarks, and completed several Cinebench 2024 runs. No lockups/crashes yet, and temps are maxing out in the low-to-mid 70s.
> Will attempt to complete a 60-minute stress test today and then do a bunch of gaming to confirm stability, but all looks really promising so far.

Did you post the motherboard and RAM you are using?
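If anyone wants a quick sanity check between the proper stress runs: below is a minimal all-core soak script in the spirit of what Prime95/OCCT do, repeating a deterministic workload on every core and flagging any result that deviates from the reference (an unstable undervolt tends to show up as silent math errors before it crashes). Just a sketch, not a substitute for the real tools:

```python
# Minimal all-core consistency soak: every worker repeatedly computes the
# same deterministic FP workload; an unstable overclock/undervolt shows up
# as a result that differs from the reference hash.
import hashlib
import math
import multiprocessing as mp
import os
import time

def workload(_) -> str:
    # Deterministic mixed sin/sqrt accumulation, hashed for easy comparison.
    acc = 0.0
    for i in range(1, 200_000):
        acc += math.sin(i) * math.sqrt(i)
    return hashlib.sha256(repr(acc).encode()).hexdigest()

def main(minutes: float = 10.0) -> None:
    reference = workload(None)          # computed once, before the soak
    deadline = time.time() + minutes * 60
    errors = 0
    with mp.Pool(os.cpu_count()) as pool:
        while time.time() < deadline:
            for digest in pool.map(workload, range(os.cpu_count())):
                if digest != reference:
                    errors += 1
                    print("MISMATCH -- possible core instability!")
    print(f"Done: {errors} errors detected.")

if __name__ == "__main__":
    main()
```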