Rudius
Member
That moment when you realize this is the new sales pitch between hardware iterations.
Sadly, I don't think we're going to see one game built with the Pro in mind outside of Sony's own. Best bet would be some tacked-on "extras", which will be nice, but developers are notorious for not putting effort into niche configurations.
I'm still at a loss why they couldn't just put in a slightly larger CPU of the same variant to give a 30% boost in CPU, which would have allowed almost all games to reach 60fps. The 10% in the Pro isn't even going to be felt in most titles.
Since when do we need to take into account the absolute minimum framerate in order to get into 60fps territory? Show me what CPU is needed to get a locked 60fps on PC here, is there even such a CPU? Oddly, we only do that on consoles when talking about those "30fps" games. The game is already mostly 60fps in the other areas.
A locked 60fps can probably be done with a 7600 or 13400. It's mostly the Act 3 Baldur's Gate city area; the PC video was only there to show how heavy on the CPU that specific area is.
Here is BG3: max settings at 1080p and max settings at 720p in the Lower City in Act 3. It's 100% CPU limited there. RTX 4090 + i9-13900K + 32GB DDR5-6000.
Absolutely no difference in performance whatsoever. Resolution has no impact on the CPU unless there are specific aspects of the rendering tied to it, as @winjer mentioned. I still don't understand how PSSR will alleviate CPU bottlenecks, but I guess we'll have to wait and see if PaintTinJr is onto something or not.
Why use max settings?
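One way to see why those 1080p and 720p numbers come out identical is a toy frame-time model where whichever of the CPU or GPU finishes last sets the frame rate. This is my own sketch with made-up millisecond figures, not data from the benchmark:

```python
# Toy frame-time model: the slower of CPU and GPU dictates the frame rate.
# All numbers are invented purely for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """CPU and GPU work overlap, so the longer of the two sets the frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 22.0        # simulation + draw-call preparation; does not depend on resolution
gpu_1080p_ms = 8.0   # hypothetical 4090-class GPU time at 1080p
gpu_720p_ms = gpu_1080p_ms * (1280 * 720) / (1920 * 1080)  # rough pixel-count scaling

print(f"1080p: {fps(cpu_ms, gpu_1080p_ms):.1f} fps")  # ~45 fps, CPU-bound
print(f" 720p: {fps(cpu_ms, gpu_720p_ms):.1f} fps")   # still ~45 fps: GPU time shrank, CPU time didn't
```

Dropping resolution only shrinks the GPU term, so once the CPU is the longer side the frame rate doesn't move.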
but the ps5 pro will not be running the game at "max settings"
Why not? The base PS5 runs the game at max settings. The only scaling the game has is 1440p native for Quality and 1440p FSR 2 quality for Performance. All other settings are a match for the PC Ultra settings.
Is it confirmed the base PS5 is running the game at PC max settings?
Yeah, that's what DF found in the analysis, everything is a match for Ultra.
Are we going to get any decent Pro footage from TGS? GT7 and FFVII Rebirth are supposed to be playable there.
Is it the performance mode or the quality mode? I'm actually interested in whether the Pro can improve the framerate for the performance mode.
For both; the only thing that changes is that instead of native res it switches to FSR2 Quality for the performance mode.
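For reference, FSR 2's published Quality preset renders at a 1.5x scale factor per axis, so the performance mode's internal resolution works out as below (my arithmetic, assuming a 1440p output):

```python
# FSR 2 Quality mode: output resolution divided by a 1.5x per-axis scale factor.
output_w, output_h = 2560, 1440
scale = 1.5
print(f"internal: {round(output_w / scale)}x{round(output_h / scale)}")  # 1707x960
```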
Not GPU limited at 1080p using... a 4090. This is DF logic at its best here.
I'd love Cyberpunk 2077 to get a Pro patch. You're saying even if it doesn't, then the dynamic resolution (I always played in performance mode) will still be higher than on regular PS5?Review units will probably go out to publications either before the next pre-order phase on 10/10 or before launch on 11/7.
I'm still surprised that some specific games weren't highlighted as receiving Pro updates:
Cyberpunk, Wukong, Baldur's Gate, Elden Ring, FF16.
Elden Ring should probably get a locked 4K60 via Boost Mode; it'll be interesting to see if it can handle the RT mode at 60. FF16 had DRS, but I think some things are locked, feature-wise, to the fidelity mode.
All of these games should benefit from boost mode, but could have gone further.
I am very interested in the performance comparison between the base PS5 and the PS5 Pro now; this will be very interesting.
I thought PS5 was Ultra for everything? Did that change?
Alex was stress testing the game, so it makes sense that his FPS was lower. However, the point is that lowering the resolution even to 720p, while decreasing settings to their minimum and likely lowering the DLSS cost, increased performance by only 19%. What more evidence do you need to conclude that the game is largely CPU limited in the city in Act 3? Moreover, think about the hypothetical case where no performance patches had been released. What do you think the Pro's chances of hitting a locked 60 would have been then?
But in your 19% video they were only testing patches at that resolution, not comparing the 3600 CPU with two different GPUs.
I'd love Cyberpunk 2077 to get a Pro patch. You're saying even if it doesn't, then the dynamic resolution (I always played in performance mode) will still be higher than on regular PS5?
I thought PS5 was Ultra for everything? Did that change?
And that right there shows that issuing the extra GPU work costs 25% of the CPU performance: at 720p the GPU on that card isn't the limit for rendering all the FX, so we should have seen zero performance degradation if the game were CPU limited only by game logic or simulation, but instead the CPU incurs more draw-call preparation and loses 25% of the frame-rate, meaning draw-call preparation is at least one major bottleneck.
That is not just a comparison of graphical settings. The lowest setting disables dynamic crowds and lowers animation quality, both of which are CPU-related performance options, especially in the city area, which has hundreds of NPCs.
It's on max settings in quality mode, you are right:
Yeah, not to mention that if a game's code is borked on the CPU, hardly any setting will make a noticeable difference.
What I posted was to show the difference in fps between the highest and lowest settings in a CPU-limited scenario.
But some people just can't comprehend what a CPU limit is and what it looks like, so it's a waste of time...
The change with the biggest positive impact was neither lowering the resolution and settings, nor swapping out the GPU. Indeed the 3070 was seemingly "outperforming" the 4090 across the two benchmarks even at 1440p with DLAA!
From the multitude of videos, we saw that day-one code on a Ryzen 3600, while dropping resolution to 720p on a weaker RTX 3070 (with DLSS to 1440p), outperformed an RTX 4090 at just 1080p on the same CPU by 10fps in the same troubled area.
Meaning that the resolution reduction had more of a positive impact than the difference between a 3070 and a 4090, which clearly shows that the GPUs aren't able to parallelize their work well, and that the RTX 4090 appears handicapped by its caches, VRAM or PCIe bus: with the modest ~2x increase in resolution it doesn't even match the much weaker 3070, but actually falls behind by 10fps.
They are just supplying more GPU data - assuming it needs to be streamed - and more draw calls, because animation blending with quaternions has been accelerated since the PSP added that feature to its PS2-esque portable hardware. The number of collision tests isn't massively changing in that 10fps-drop screenshot, because the number of NPCs is the same or almost the same, going by manually counting what I think are and aren't NPC models.
The reason the CPU cost is so high in the city area is that every single object in the game is simulated in a very large area around the player (well outside of visual range). You can leave a trail of explosives through the entire city and blow up everything at once, and that brings even a 14900K to well under 20fps. Hence the dynamic crowds and animation setting having such a large performance impact, as it influences not just the NPCs you can actually see.
So that comparison only differs in the GPU data and the draw calls being prepared by the CPU for the GPU, and that is hardly two generational jumps from AC2 or later running on the single-core, 2-way (SMT/HT) PPU of a PS3 Cell BE CPU, to justify a Ryzen 3600 struggling to hit 60fps with any modern GPU at 720p.
At some point you just have to say that the developers are better game designers than game developers, and that technical incompetence is on show in the PC release.
Which brings me back to a question you asked me many pages ago that I didn't answer (what will happen with the Pro version?), and my answer is that they'll probably get help from PlayStation to ship it in a quality state running at 60fps on the Pro.
Not GPU limited at 1080p using... a 4090. This is DF logic at its best here.
No, it's PC logic 101 that every reputable reviewer like GN, HU, der8auer, and everyone else uses.
Show me the same spots with a 2700 or a 5700 XT, please. Then we'll really know if it's 100% limited by the CPU or if the GPU (or bandwidth) plays some part, even a small one, in the framerate.
It's no longer just about the CPU if you go down to a 2700. You go down from DDR5 to DDR4, you drop to an older PCIe generation, and everything gets a lot slower. Clearly, if it was GPU-limited at 1080p, the performance would increase substantially at 720p. It doesn't, because the GPU isn't the limiting factor.
Why use max settings?
Because that's what the consoles use and we want to maximize the load on the CPU. Max settings add more NPCs and other effects that also require more CPU horsepower.
After reading through the thread, it seems to me that the old CPU is not a big deal for the PS5 Pro. The number of CPU-limited games this gen (as far as hitting 60fps goes) can be counted on one hand. If the CPU were the be-all and end-all, then no game that shipped with a 60fps mode on XSX/PS5 should have lacked one on XSS. Heck, Tom Warren expected the XSS to outperform the PS5 (in fps) due to its higher-clocked CPU. However, that never panned out, and many third parties simply skip 60fps modes on XSS.
More often than not, whenever we see a title with inconsistent framerates there are engine/dev issues. Time and again we hear how a particular title is too ambitious for current hardware, only to see significant performance gains months down the line. We are also told the CPU is the overwhelming culprit for poor FPS, only to see the XSS often capped at 30FPS, indicating there is far more to it.
I would pay $2000 for a Pro that gets rid of SSR and cube maps in all games for RT reflections.
The differences are actually quite big.
But it proves nothing about the PC version, which is independent of the console and uses different hardware bindings. The discussion about the game on PC and lowering resolution - independent of the PSSR lowering-resolution discussion on the Pro - was that it is CPU bound, when max settings just distort that claim by becoming the bottleneck themselves, stopping the lower resolution from showing that the game can run faster with the same CPU.
Dude, you're not making any sense. Rendering the game at 1080p or 720p sees no difference in performance, and you're here arguing that max settings are the cause? What kind of nonsense is that? If the GPU had been the limiting factor, chopping down the resolution to 720p would have increased the performance. Resolution is often the single biggest thing hitting the GPU. Max settings in BG3 won't bog down a 4090 so much that it can't break past 120fps at 720p. This is ridiculous.
Whether or not this applies to PSSR, I don't know, but resolution has no impact on CPU performance unless under very specific circumstances. Every hardware reviewer when running CPU tests will pair the most powerful GPU on the market with the test system and usually run the games at 1080p. BG3 in my example is CPU-limited and there's no way to argue that.
I have no idea how PSSR, which is strictly tied to resolution, will help CPU performance, but you obviously know something that the rest of us don't. Those discussions are often annoying because people are presented with mountains of evidence, but all they do is stonewall. They never provide their own footage or proof to defend their claims. You've been saying a lot of stuff over the last few pages, but provided nothing tangible to back you up. You simply deny what's there on unfounded grounds. If you at least put in some effort to show us proof of your claims, this would be a lot better. As it stands, you simply stonewall.
The CPU-to-GPU workload is the bottleneck; we can clearly see that from all the videos of the PC with different configs. As you lower the CPU-to-GPU work by dialling back resolution and the settings that need additional CPU work instructions that block the CPU, you then see the CPU raise the frame-rate.
Resolution has no impact on CPU performance unless under specific circumstances. Otherwise you'd better tell the whole tech community that they've been conducting their tests incorrectly for years and that you know better, even without data to support yourself.
For BG3 those launch comparisons are pretty obsolete as the game got significant performance patches since launch. For the PC CPU performance they had an update here:
For consoles here:
3600 goes up to the low 40s, PS5 went from the low 20s to the low-mid 30s. Basically, Act 3 is still a killer on the CPU, albeit not as much as before. I'm not sure how the Pro can take 32fps all the way up to a locked 60, as the game is certainly not GPU limited. Even using the 10% faster CPU and lowering the internal resolution to 720p via PSSR, that's not going to increase your CPU performance by 90%. Maybe if another impactful performance patch rolls out.
The only PS5 games I'm aware of that have issues with CPU performance are BG3, DD2, and Space Marine 2. Talking about stuff people care about, not shit like Gotham Knights. For upcoming releases, maybe MH Wilds and KC Deliverance, but that's an unknown. So it's a tiny number of games and certainly not indicative of the PS5 library as a whole.
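A quick sanity check on the 32fps-to-locked-60 arithmetic above, assuming that Act 3 low is entirely CPU-bound (my own numbers, not from the thread):

```python
# Required CPU speed-up to turn a CPU-bound 32 fps low into a locked 60 fps.
current_fps = 32.0
target_fps = 60.0

current_frame_ms = 1000.0 / current_fps   # ~31.3 ms of CPU work per frame
target_frame_ms = 1000.0 / target_fps     # ~16.7 ms budget at 60 fps
required_speedup = current_frame_ms / target_frame_ms

print(f"Required CPU speed-up: {required_speedup:.2f}x "
      f"({(required_speedup - 1) * 100:.0f}% faster)")              # ~1.88x, i.e. ~88% faster
print(f"A 10% faster CPU gets you to ~{current_fps * 1.10:.0f} fps")  # ~35 fps
```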
I would pay $2000 for a Pro that gets rid of SSR and cube maps in all games for RT reflections.
Yeah, massive difference. Hitman shows how much it can change the look of the game:
The Ryzen 3600 is certainly not enough to match the PS5's CPU performance. It has fewer CPU cores and it also has to work a lot harder to make up for the lack of a decompression chip. If the game is built around the PS5's decompression hardware (like the TLOU1 remake), the 3600 cannot run such a port well.
Dips below 60fps, with almost 100% CPU usage due to decompression, resulting in stuttering.
I cannot understand why Digital Foundry keeps using the 3600 in their PS5 comparisons. They mislead people, because that CPU is not up to the job.
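To put the decompression point in perspective, here is a rough, self-contained sketch (mine, not from the post) of how much of a 60fps frame budget CPU-side decompression can eat. It uses zlib as a stand-in codec on synthetic data, so treat the numbers as purely illustrative; real games use formats like Kraken/Oodle, and the PS5 offloads this to dedicated hardware:

```python
import os
import time
import zlib

# Build ~8 MiB of synthetic, fairly compressible data (a repeated 4 KiB block).
block = os.urandom(4096)
raw = block * (8 * 256)            # 8 MiB
blob = zlib.compress(raw, 6)

start = time.perf_counter()
zlib.decompress(blob)
elapsed_ms = (time.perf_counter() - start) * 1000.0

frame_budget_ms = 1000.0 / 60.0    # ~16.7 ms per frame at 60 fps
print(f"Decompressing {len(raw) / 2**20:.0f} MiB took {elapsed_ms:.2f} ms "
      f"= {elapsed_ms / frame_budget_ms:.2f} frame budgets at 60 fps")
```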
In the new DF video, Oliver confirmed the Jedi Survivor footage was running at a 1080p internal resolution with PSSR upscaling. This mode brings back RTGI and RT reflections. Alex let out a surprised "oh wow!", then proceeded to doubt the PS5 Pro's ability to hold 60fps because of the heavy RT and the insignificant CPU upgrade, despite the devs saying otherwise. I guess we'll see soon, but at some point this guy has to start accepting the fact that the console setup isn't the same as a PC.
Like always, typical Bugaga nonsense.
Do you truly have the slightest hope for this man to humbly accept his ignorance and evolve? He is merely an amusing figure, a PCMR caricature with an overgrown ego; that is all there is to him.
FPS was dropping like a motherfucker and it was most likely caused by the CPU.
We don't know that though - especially given console RT is almost always GPU limited, and RT is only CPU-heavy on console if you choose to make it so (i.e. if you have CPU cycles to spare, or you just couldn't be bothered).
Resolution has no impact on CPU performance unless under specific circumstances.
On most consoles, CPU/GPU memory contention means that the GPU completing frames faster (with free time left in the frame) = better CPU performance. This could be anywhere from a single-digit percentage of CPU performance to stupid things like 80-90% (two particular consoles had a broken memory subsystem that could cripple the CPU if you didn't manually throttle the GPU work).
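For what it's worth, here is a deliberately crude model of that contention effect. The penalty factor is a pure assumption on my part (the post doesn't give one); the point is only the shape of the relationship, not the numbers:

```python
# Toy model: CPU frame time inflated by memory contention while the GPU is busy.
# `contention_penalty` is an invented illustrative constant, not a measured value.

def effective_cpu_ms(cpu_ms: float, gpu_busy_fraction: float,
                     contention_penalty: float = 0.3) -> float:
    """CPU work slows down in proportion to how long the GPU is hammering shared memory."""
    return cpu_ms * (1.0 + contention_penalty * gpu_busy_fraction)

print(effective_cpu_ms(10.0, 1.0))  # GPU busy the whole frame -> 13.0 ms of effective CPU time
print(effective_cpu_ms(10.0, 0.4))  # GPU finishes early        -> 11.2 ms
```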
The only way this can be true is if a game has an LOD system that is tied to screen resolution. So a higher resolution will call in higher-detail LODs, which will have more objects and more complexity.
I mean - strictly speaking - this is the only correct* way to handle LOD (it should be based on screen coverage). It's how texture LOD has worked for as long as GPUs have been able to mipmap textures, so the last 30+ years, give or take. It's also at the heart of virtualized geometry approaches like Nanite, which do exactly this.
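As a concrete illustration of screen-coverage-driven LOD (my own minimal sketch, with invented thresholds): project the object's bounding sphere to pixels and pick a mesh level from its on-screen size, the same idea mipmapping uses for textures. A resolution-aware system then keeps higher-detail meshes at higher output resolutions:

```python
import math

def projected_radius_px(world_radius: float, distance: float,
                        fov_y_deg: float, screen_height_px: int) -> float:
    """Approximate on-screen radius (in pixels) of a bounding sphere at `distance`."""
    focal = screen_height_px / (2.0 * math.tan(math.radians(fov_y_deg) / 2.0))
    return (world_radius / distance) * focal

def pick_lod(radius_px: float, thresholds=(200.0, 50.0, 25.0)) -> int:
    """LOD 0 = full detail; higher index = simpler mesh. Thresholds are illustrative."""
    for lod, cutoff in enumerate(thresholds):
        if radius_px >= cutoff:
            return lod
    return len(thresholds)

# Same object, same distance: it covers more pixels at 2160p than at 720p,
# so a resolution-tied LOD system selects a more detailed (more CPU/GPU work) mesh.
for height in (720, 1080, 2160):
    r = projected_radius_px(world_radius=1.0, distance=30.0,
                            fov_y_deg=60.0, screen_height_px=height)
    print(f"{height}p: ~{r:.0f} px on screen -> LOD {pick_lod(r)}")
```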
He doesn't just mention it without evidence; he shows that even on the latest patch the 7800X3D runs at under 60fps in one scene, due to the CPU overhead in this title with RT enabled. As he rightly points out, I wouldn't trust the Jedi Survivor devs with this, considering their track record. It's not a problem with the hardware in this example, it's a problem with the developers.