
Is the PS5 GPU Really Underpowered? What Can We Really Expect?

dEvAnGeL

Member
as i stated in many previous posts, i don't give an absolute fuck about loading times or these fancy features consoles have. give me 1-minute loading times for everything at 4K/60 ultra details and i'm happy with that.

i have my tablet on my side with some shit on yt or gaf topic ready for when i have to wait for loading times 😆
So you should be gaming on PC if your priority is ultra details at 4K/60.
 
The difference between the two consoles :
- Lower Resolution
- Less FPS
- Less Ray Tracing
- Less details
That's about it, most won't even care, just buy what you like.
I too had been saying that 24 GB was the minimum amount of RAM needed for next gen, and was surprised and bummed out to hear about the 16 GB. The fact that both Sony and Microsoft landed in the same place means that there must be a way, with the new system enhancements, to justify the 16 GB. The SSD and I/O improvements mean that RAM will not be used the same way it has been in the past. We shall see how it turns out, I guess.
Let's not forget pricing: more RAM, higher price. I never believed that 24 GB RAM speculation.
 
The difference between the two consoles :
- Lower Resolution
- Less FPS
- Less Ray Tracing
- Less details
That's about it, most won't even care, just buy what you like.

Let's not forget pricing: more RAM, higher price. I never believed that 24 GB RAM speculation.

Better controller, more AAA exclusives, and games may even look better on the PS5, although I expect the Series X to be superior on that front.
 

dEvAnGeL

Member
i already play 95% of games on my pc :lollipop_winking:
console is for exclusive games.
Then I guess the console you're buying is the PS5, since you can already play Xbox games on PC. If that's the case, Sony's first party studios will still produce the better-looking games of either console; just look at the current PS4 library.
 
Last edited:

GymWolf

Member
Then I guess the console you're buying is the PS5, since you can already play Xbox games on PC. If that's the case, Sony's first party studios will still produce the better-looking games of either console; just look at the current PS4 library.
Yeah, I'm going with PS5 only for this gen; I absolutely love their exclusive games.
 
Last edited:

ZywyPL

Banned
I think it's too early for such assumptions/discussions. Due to their variable nature, we don't know what the actual clocks/TF will be during games, or how effective RDNA2 is at high frequencies. Who knows, maybe it will reach 2.4-2.5GHz in PC GPUs, and the PS5's clocks will turn out not to be as high as we all think. With either a CPU or a GPU there are diminishing returns once you reach certain clock speeds: for CPUs it seems that anything above ~4.5GHz is almost irrelevant, with very small gains from going to 4.7, 4.8, 5.0, 5.2GHz; for GPUs the threshold is more or less 1900-1950MHz, beyond which you get single-digit FPS gains, be it on Pascal, Turing or Navi. So yeah, it will all come down to how well RDNA2 handles high frequencies. It might turn out that despite running at 2.2GHz there are just 2-4 more FPS compared to 2GHz (a.k.a. 9.2TF), or it might turn out the console really runs at an almost rock-solid 2.2GHz and actually performs really damn well. It's hard to speculate given how many questions and how much uncertainty there is around the PS5 GPU; we simply need the first batch of 3rd party titles for some actual comparison.
 
D

Deleted member 775630

Unconfirmed Member
Better controller, more AAA exclusives and games may even look better on the PS5, although I expect the Series X to be supeior on that front.
You've only seen a picture of it :pie_tears_joy:
We don't know anything about the launch lineup, let alone the lineup for the rest of the generation
Based on the specs 3rd party games will definitely look better on XSX, 1st party we'll have to see since Sony has amazing studios.
 
Last edited by a moderator:
PS5 exclusives may look better than series x multiplatform and exclusive games.
It's already posted

I think it's too early for such assumptions/discussions. Due to their variable nature, we don't know what the actual clocks/TF will be during games, or how effective RDNA2 is at high frequencies. Who knows, maybe it will reach 2.4-2.5GHz in PC GPUs, and the PS5's clocks will turn out not to be as high as we all think. With either a CPU or a GPU there are diminishing returns once you reach certain clock speeds: for CPUs it seems that anything above ~4.5GHz is almost irrelevant, with very small gains from going to 4.7, 4.8, 5.0, 5.2GHz; for GPUs the threshold is more or less 1900-1950MHz, beyond which you get single-digit FPS gains, be it on Pascal, Turing or Navi. So yeah, it will all come down to how well RDNA2 handles high frequencies. It might turn out that despite running at 2.2GHz there are just 2-4 more FPS compared to 2GHz (a.k.a. 9.2TF), or it might turn out the console really runs at an almost rock-solid 2.2GHz and actually performs really damn well. It's hard to speculate given how many questions and how much uncertainty there is around the PS5 GPU; we simply need the first batch of 3rd party titles for some actual comparison.
 

Rat Rage

Member
Underpowered with regard to what purpose? Creating great games? Of course not. It has enough power to create awesome games, just like the Xbox Series X has.
 
D

Deleted member 775630

Unconfirmed Member
It has next gen features, series x controller does not, although i do like it too.
What does that even mean "next gen features"?
Sony has been playing catch-up. A more ergonomic design to rival the Switch Pro Controller and Xbox One controller, haptic feedback to again bring it up to date, textured analog sticks to again bring it up to speed, etc. The only new thing here is the adaptive triggers, which will only be used in select first party games. Oh right, and the built-in mic. Yeah, I prefer my privacy; otherwise I would've used my Kinect more often.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
That's overcompensating, because before specs were released they all cared about power. That was very obvious in the next-gen topic.
And in every poll more people are still getting the PS5. People want new and more powerful consoles, but the difference between these consoles won't matter that much, is what I'm trying to say.
 

LED Guy?

Banned
How many people here believe that Sony underdelivered with the PS5's graphical capabilities based on the 10.3 TFLOP metric? Do you subscribe to the notion that the PS5's GPU is nothing more than an RX 5700/XT card, or that it's only on par with midrange GPUs today?

This post is about providing real data and educated estimations to dispel the notion that the PS5 GPU is only a "midrange" GPU that is not on par with today's top tier commercial GPUs. Looking at the TFLOP number in isolation is very misleading, and the truth about the actual performance of the GPU paints a very different picture. I know many of you don't know me, but I can say that I am not just pulling this info from nowhere. I have over 15 years of experience working in gaming and have spent nearly 5 years of my career doing critical analysis of GPU performance. Take it for what it is.

Before I begin, a full disclaimer: this post is not about a comparison to or commentary on the Xbox Series X. No console fanboyism here please. The fact is, the Xbox Series X has a bigger GPU with more theoretical horsepower. Period. Nobody is refuting that, so no Xbox defense force please.

Like many, I too was initially somewhat disappointed when I first heard the PS5 specs, mainly because there was so much information beforehand that pointed to more performance being a possibility. We've all been hearing about the 12-14 TFLOP monster that Sony was building, but honestly it's not the raw numbers that matter. I was more excited about the idea that both consoles would come out being really close in power, which benefits gamers by establishing a high baseline where neither machine will have subpar 3rd party releases. But after taking some time to process the specs and information Sony released, as well as doing some in-depth analysis, I am pretty happy with what Sony ended up with from a GPU standpoint.

Let me be clear: the goal of what I'm presenting here is not to define an absolute performance metric for PS5 with a given piece of content. In other words, I am not trying to predict that PS5 can run game X at Y FPS specifically. That is impossible since there are so many variables affecting overall performance that we do not know about: CPU, memory, driver, other console-specific optimizations, etc. Instead, what I am doing is establishing a realistic baseline expectation for the GPU specifically, by looking at known real-world performance data from comparable hardware.

How am I doing this? Let me break it down:
  1. Let's establish a comparison mapping to other known GPUs based on their GPU architectures and theoretical computation power based on what we know:
    • We know that AMD's RDNA architecture is a general 25% increase in performance per clock when compared to GCN -> 1TFLOP (RDNA) = 1.25 TFLOP (GCN)
    • We know that RDNA 2 will be even more efficient than RDNA (i.e. perf per clock and per watt will be better). Now we can guess how much more efficient based on some actual hints from Sony and AMD:
      • Mark Cerny himself during the PS5 tech dive revealed that the size of each CU in the PS5 GPU is roughly 62% larger than a PS4 CU. Thus, there is the equivalent of 58 PS4 CUs in the PS5: 36 CU (PS5) = 58 CU (PS4). Now, 58 CUs running at the PS5's 2.23 GHz frequency => ~16.55 TFLOP (GCN). So what is the conversion factor to get from 10.28 TFLOP (RDNA 2) to 16.55 TFLOP (GCN)? It turns out that the additional perf per clock needed to reach that ratio is roughly 17%. So by this data: 1 TFLOP (RDNA 2) = 1.17 TFLOP (RDNA 1)
      • AMD has already said that they are pushing to deliver a similar improvement with RDNA 2 over RDNA 1 as was seen from GCN to RDNA 1. They have also confirmed that RDNA 2 will see a 50% improvement in perf/watt over RDNA 1. GCN to RDNA 1 saw a 50% perf/watt and 25% perf/clock increase. A further 25% perf/clock increase in RDNA 2 sounds pretty ambitious, so I will be more conservative; but we can use this as an upper bound.
      • AMD has talked about mirroring their GPU progression to that of their CPUs. They have specifically talked about increasing CPU IPC by roughly 15% every 12-18 months, and the 10-15% range is typical of past GPU generational transitions.
    • Using the 25% ratio of RDNA to GCN and a 15% ratio of RDNA 2 to RDNA 1, we can calculate the equivalent amount of theoretical performance (i.e TFLOPs) for the PS5 GPU in terms of both RDNA performance and GCN performance:
PS5 TFLOP = 10.28
PS5 TFLOP (RDNA 1) = 12.09 (used to compare against RX 5700 and RX 5700 XT)
PS5 TFLOP (GCN) = 16.13 (used to compare against Radeon VII, PS4)
2. We can also note that it is actually easier to guesstimate the PS5 GPU performance since there is a GPU on the market very similar to it: the RX 5700. The GPU config in terms of CU count, number of shader cores, memory bus size, memory bandwidth, etc. is an exact match for the PS5. At a high level, the PS5 is simply an extremely overclocked RX 5700 in terms of hardware. Now, typically on PC, overclocking a GPU gives limited returns due to power issues and system design limitations that will not exist in a console. The PS5's typical GPU clock of 2.23 GHz is ~34% higher than the RX 5700's typical GPU clock of 1.670 GHz, so we can extrapolate PS5 performance as being roughly 34% higher than that of an RX 5700. However, that raw translation does not account for RDNA 2's additional efficiencies. If we add the 15% uplift in efficiency, we get a pretty good idea of the PS5 GPU performance. It turns out that this projected value is pretty much identical to the TFLOP conversion factors I computed above :messenger_winking:
3. Now that we have a quantitative comparison point, we can calculate a PS5 projected performance target based on theoretical performance from comparable GPUs. For example, RX 5700 XT = 9.7 TFLOPs (RDNA 1) and PS5 = 12.09 TFLOP (RDNA 1). That puts the PS5 projected performance at ~25% higher than an RX 5700 XT. Using these calculations for other GPUs as reference points, we get the following:

PS5 vs Xbox Series X = -15% (PS5 is 15% slower)
PS5 vs RX 5700 = 153% (PS5 is 53% faster)
PS5 vs RX 5700 XT = 125% (PS5 is 25% faster)
PS5 vs Radeon VII = 120% (PS5 is 20% faster)
PS5 vs PS4 = 8.76x (PS5 is nearly 9x faster)
4. Finally, now that we have a performance factor for some common GPUs across various AMD architectures, we can see where projected PS5 performance will rank compared to the fastest cards on the market, including Nvidia cards. I've looked at several industry aggregate sites such as Eurogamer, TechPowerUp, and GPUCheck (numerous games tested) as well as a couple of high-profile games such as DOOM Eternal, Call of Duty: Modern Warfare, and Red Dead Redemption 2 to see where PS5 performance will fall. I've done this analysis across numerous performance metrics, resolutions, and the different GPU references defined above to see if the data was consistent. The goal here was to identify which GPU currently on the market had the closest performance to projected PS5 performance. I've highlighted the 4K rows since 4K is the target resolution for the PS5. The summary table shows which GPUs came closest to the projected PS5 performance at different resolutions. The raw results are below:

[Image: benchmark comparison table (ecvwP0.jpg)]

**Note: Game performance was captured from TechpowerUp benchmark analysis using max settings at all resolutions
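The conversion arithmetic from steps 1-3 can be sketched in a few lines. This is a rough back-of-envelope reproduction of the post's own numbers; the ~17% RDNA2-over-RDNA1 uplift and the 9.7 TF reference figure for the RX 5700 XT are the post's estimates, not confirmed specs, and `peak_tflops` is a hypothetical helper name:

```python
def peak_tflops(cus: int, clock_ghz: float, shaders_per_cu: int = 64) -> float:
    """Theoretical FP32 peak: shaders x 2 FLOPs/clock x clock."""
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

ps5_native = peak_tflops(36, 2.23)        # ~10.28 TF (RDNA 2), matches the official spec
ps5_gcn_equiv = peak_tflops(58, 2.23)     # ~16.55 TF, via Cerny's "58 PS4-sized CUs"

# Post's assumed ~17% perf/clock uplift of RDNA 2 over RDNA 1
ps5_rdna1_equiv = ps5_native * 1.176      # ~12.09 TF

rx_5700_xt_tf = 9.7                       # RDNA 1 reference figure from the post

print(f"PS5 native:      {ps5_native:.2f} TF")
print(f"PS5 (GCN equiv): {ps5_gcn_equiv:.2f} TF")
print(f"vs RX 5700 XT:   {ps5_rdna1_equiv / rx_5700_xt_tf:.2f}x")
```

Plugging in the post's other reference figures (Radeon VII, PS4) the same way reproduces the ratio table above.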

Key Takeaways:
  1. The general takeaway is that in most cases at higher resolutions, the PS5 performance is actually slightly higher than that of the RTX 2080 Super.
  2. Note that the 1080p values are a bit misleading since some games are CPU bound at that resolution. Thus, most GPUs exhibit lower perf, which is why the RTX 2080 Ti was the closest at 1080p.
  3. These numbers do not take into account other factors that can improve PS5 GPU performance even further, such as: GPU-specific optimizations, console-specific optimizations, a lower-level driver compared to PC, I/O throughput improvements in PS5, the memory subsystem, etc.
  4. This analysis is just a rough estimate and again is not to be taken literally in terms of actual performance in games. There are still a ton of variables and unknown factors. But it does account for known information to give a good relative performance baseline to set expectations on how much performance the PS5 GPU may possess. The answer is that it is definitely not "just an RX 5700 XT" and will likely have more performance than a 2070 Super.
  5. My analysis went well beyond these websites, game titles, and reference GPUs. I presented the highlights, but the overall takeaway is the same from the additional data: performance is most in line with an RTX 2080 Super.
So is the PS5 GPU underpowered? The data shows that actual game performance is roughly around an RTX 2080 Super at a minimum in most cases, which is currently the 2nd fastest commercially available GPU on the market! Anyone who can call that underpowered or "midrange" is...not very knowledgeable on this topic. Yes, by this same analysis the Xbox Series X would be matching or exceeding an RTX 2080 Ti, which is amazing! The point here is that both consoles will have plenty of graphical horsepower, and the PS5 is still a significant step up from anything AMD has released to date and a generational leap over the PS4!

Everyone should be excited, but please stop spreading FUD about PS5 performance :messenger_winking:
It will be just like the Xbox Series X minus 5 or 6 frames, or 160p less resolution; that's it.
 

base

Banned
VRR makes sense at sustained framerates from 45 to 60, it doesn't help with 30 fps games at all.
And that's what I was hoping for. IMO only 1st party devs gonna stick with 30 fps. Some will have unlocked framerate (fingers-crossed). And here comes VRR.
 
D

Deleted member 775630

Unconfirmed Member
And in every poll more people are still getting the PS5. People want new and more powerful consoles, but the difference between these consoles won't matter that much, is what I'm trying to say.
Yeah I think it never would've mattered. Even if the PS5 was a 9.2TF console, and XSX 13TF, or if the XSX had the same exact SSD, it just wouldn't have mattered to the people on this website. PS5 would still win in every poll, because people just love the brand more.
 

base

Banned
Someone mentioned better RayTracing on XsX. Good luck with that.


P.S: No chance we would be getting any good RT this gen. We're not even sure if the upcoming NVIDIA Ampere gonna be enough for 4k with RayTracing. But keep dreamin' u guys.
 

Radical_3d

Member
24GB as a minimum???
How much ram do you think these machines actually need? They're game consoles not work stations.
RAM tech has slowed down a lot. Imagine the PS3 with 128 MB of RAM. Or the PS2 with 12. Or the PS4 with 1 GB. That's the same scenario we are seeing here. Let's see how quickly the developers hit that wall. And if they complain in public.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Yeah I think it never would've mattered. Even if the PS5 was a 9.2TF console, and XSX 13TF, or if the XSX had the same exact SSD, it just wouldn't have mattered to the people on this website. PS5 would still win in every poll, because people just love the brand more.
That's true. PS has been making great games consistently for so long; that's where the brand loyalty comes from.
 
Yes, all the effort put into the entire I/O subsystem design of the PS5 will only benefit a handful of use cases.

Argues against people downplaying the GPU difference.
Proceeds to downplay the I/O differences.

That's not what I said. I said that there are very real limitations to NAND memory technology, which have been the case since it first existed. And there is no amount of customization you can do to change those particular limitations. If there were, persistent memory technologies like MRAM, FRAM, and XPoint/Optane would not have come into existence.

And my comment on the SSDs was aimed at both systems, not just PS5. But no amount of I/O customization will work around the inherent limitations of NAND technology and how it ultimately functions, which will impact the range of use-cases where it can be truly useful in terms of certain types of game design. Literally all I was saying.
 
Someone mentioned better RayTracing on XsX. Good luck with that.


P.S: No chance we would be getting any good RT this gen. We're not even sure if the upcoming NVIDIA Ampere gonna be enough for 4k with RayTracing. But keep dreamin' u guys.
But now 540p is just as good as 4k!
 

rnlval

Member
How many people here believe that Sony underdelivered with the PS5's graphical capabilities based on the 10.3 TFLOP metric? Do you subscribe to the notion that the PS5's GPU is nothing more than an RX 5700/XT card, or that it's only on par with midrange GPUs today?

[...]

From

Figure 3 (bottom of page 5) shows 4 lines of shader instructions being executed in GCN, vs RDNA in Wave32 or “backwards compatible” Wave64.

Vega takes 12 cycles to complete the instruction on a GCN SIMD. Navi in Wave32 (optimized code) completes it in 7 cycles.

In backward-compatible (optimized for GCN Wave64) mode, Navi completes it in 8 cycles.

So even on code optimized for GCN, Navi is faster, but more performance can be extracted by optimizing for Navi: lower latency, and no wasted clock cycles.
----

The major RDNA perf/watt improvement comes from reduced instruction retirement latency, i.e. Vega GCN's 12 clock cycles vs. RDNA's 8 clock cycles in GCN mode, or 7 clock cycles in native RDNA mode.
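Those cycle counts translate directly into per-instruction latency speedups. A trivial check of the arithmetic, using the cycle counts quoted above from the whitepaper figure:

```python
# Instruction-retirement latencies as quoted above (RDNA whitepaper, Figure 3)
vega_gcn_cycles = 12     # Vega, GCN SIMD
navi_wave64_cycles = 8   # Navi running GCN-style Wave64 code
navi_wave32_cycles = 7   # Navi running native Wave32 code

# Per-instruction latency speedup of Navi over Vega
print(f"Wave64 (GCN-optimized code):  {vega_gcn_cycles / navi_wave64_cycles:.2f}x")
print(f"Wave32 (Navi-optimized code): {vega_gcn_cycles / navi_wave32_cycles:.2f}x")
```

So even unoptimized Wave64 code retires instructions 1.5x faster per cycle count, and native Wave32 code about 1.71x faster, which is the latency point being made above (actual game-level gains depend on much more than this one latency).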
 
Yes, the PS5 GPU is exactly that. Basically a 5700:

Same CUs
Same TMUs
Same ROPs
Same memory interface
Same bandwidth

The only differences are the RDNA architecture revision and the clock speeds.
 
Last edited:

Tqaulity

Member
I think it's too early for such assumptions/discussions - due to their variable nature we don't know what the actual clocks/TF will be during games, or how effective RDNA2 is at high frequencies, because who knows, maybe it will reach 2.4-2.5GHz in PC GPUs, and the PS5's clocks will turn out not to be as high as we all think. Because with either CPU or GPU there are diminishing returns once you reach certain clock speeds...
  1. I think it's a foregone conclusion that the RDNA 2 PC GPUs coming this year will indeed be clocked above the PS5's clocks (rumblings suggest well above 2.5GHz). AMD has suggested as much, and the PS5 just shows what's possible even in a closed box with limited power.
  2. People still talk about the PS5's "variable clocks" like it's something foreign. In the PC space, both the CPU and GPU operate with varying clock speeds during execution. This is just how it works, but it is a foreign concept in a console. However, based on what Cerny has said, the PS5's clocks will likely be less variable than their PC counterparts. The typical operating frequency will be at the caps of 3.5GHz and 2.23GHz for the CPU and GPU. This has the benefit of saving a ton of power when the workload does not demand full frequency, while keeping the consistent experience expected from a console.
  3. I pointed out that the diminishing returns on clock speed increases on PC are largely due to external factors like power gating and board design. Since a console is designed from the ground up, they can design around such limitations when building the box. There is of course still a curve when it comes to clock speed gains that will hit a wall at some point. But if Sony designs the PS5 around the 2.23GHz value, it can ensure that the box has everything around it to maximize performance at that frequency.
It's interesting that nobody talks about how the RTX 2080 Ti is not really a 13 TFLOP card, because it rarely ever runs at its max frequency :messenger_smirking:. In fact, most PC GPUs never come close to reaching their theoretical max performance when running workloads. That's one of the fundamental differences between the PC platform and a console. Developers can control every aspect of execution on a console, so you can maximize the efficiency of the hardware. It's all about efficiency, guys. That 13 TFLOP 2080 Ti may only be reaching a throughput of 6 TFLOPs even in the best case (i.e. the most demanding games). This is especially true when you realize that the games it is running are largely designed for MUCH lower hardware specs (the 1.8 TFLOP PS4, for example).

This is also why we consistently see console games that punch above their weight and do things thought impossible on such low-end hardware. We are implicitly comparing them to PC standards (i.e. what a 1.8 TFLOP GPU can do on PC), which is misleading, since performance depends entirely on the software, and software designed for a console is constructed differently than software on a PC in many ways. God of War, for example, maximizes those 8 Jaguar cores and that 1.8 TFLOP GPU to a degree that PC software generally doesn't come close to.
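For anyone wondering where these headline TFLOP numbers come from: they're just shader count × 2 FLOPs (one fused multiply-add) × clock, i.e. a theoretical peak, not measured throughput. A quick sketch using the public specs mentioned above:

```python
def theoretical_tflops(shader_units, clock_ghz, ops_per_clock=2):
    """Peak single-precision throughput: each shader unit can retire
    one fused multiply-add (2 FLOPs) per clock cycle."""
    return shader_units * ops_per_clock * clock_ghz / 1000.0

# RTX 2080 Ti: 4352 CUDA cores at ~1.545 GHz reference boost
print(theoretical_tflops(4352, 1.545))  # ≈ 13.4
# PS4: 1152 shaders (18 CUs x 64) at 0.8 GHz
print(theoretical_tflops(1152, 0.8))    # ≈ 1.84
```

Which is exactly why the "13 TFLOP" label is a ceiling: it assumes every shader unit issues an FMA every single cycle at the boost clock, something real game workloads never sustain.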
 

base

Banned
i already play 95% of games on my pc :lollipop_winking:
console is for exclusives games.
Well I had the same setup but decided to sell my gaming PC and keep the laptop just for web surfing/Netflix. After building my own gaming PC I noticed there aren't many games that interest me. I play CS:GO most of the time.
 

Tqaulity

Member
Yes, the PS5 GPU is exactly that.
Basically a 5700:

Same CUs,
Same TMUs
Same ROPs
Same memory Interface
Same bandwidth.

The only differences are the RDNA architecture revision and the clock speeds.
Yep, but the sum of those differences in architecture and clock speed could mean a >50% increase in actual performance over that PC card. And this is advantageous for Sony as well, because they can achieve top-tier performance out of a smaller GPU that is much cheaper to manufacture and takes up less die space. Again, efficiency :messenger_winking:
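Worth checking how much of that comes from clocks alone. Both parts have 36 CUs, so plugging the clocks into the standard RDNA peak formula (64 shader units per CU, 2 FLOPs per unit per clock) shows the clock bump buys about 29%; any ">50%" claim therefore leans on RDNA 2's architectural (per-clock) gains on top of that:

```python
def tflops(cus, clock_ghz):
    # RDNA: 64 shader units per CU, 2 FLOPs (one FMA) per unit per clock
    return cus * 64 * 2 * clock_ghz / 1000.0

rx5700 = tflops(36, 1.725)  # RX 5700 reference boost clock
ps5    = tflops(36, 2.23)   # PS5 GPU frequency cap
print(rx5700, ps5, ps5 / rx5700)  # ≈ 7.95, ≈ 10.28, ≈ 1.29x
```

With identical CU counts the TFLOP ratio is just the clock ratio (2.23 / 1.725), so the remaining gap between ~29% and ">50%" is an assumption about architectural efficiency, not something these peak numbers can prove.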
 

Tqaulity

Member
Ok sure. It's all theoretical numbers anyway, with no real way to measure. To put my point another way: you may only get around 50% of the max throughput on a typical game with a GPU that powerful running software designed for roughly 10% of that spec.

BTW, that's different from discussing GPU utilization. Sure, you can run FurMark at certain settings and get the RTX 2080 Ti to hit 100% utilization. But I'm talking about how the software is designed to maximize efficiency and achieve optimal throughput during its execution. That simply doesn't happen for the vast majority of PC game software.

Another great analogy is an amplifier. You may buy an amplifier rated for 200W/ch with a low amount of distortion, yet when listening to most music, even at "loud" volumes, your amp may not exceed 10W of output. Every so often it may spike to 100W or so for a very short burst, and in those cases that 200W amp will handle it with aplomb and not distort. But for the vast majority of its runtime, it never comes close to outputting 200W.
 

Neo Blaster

Member
I was expecting 9.4 TF so I'm a happy camper, but I think the RAM situation is a disgrace. By the OP's numbers the PS5 is 9 times faster than the PS4, yet the memory allocated for games is not even three times bigger (13 GB vs 5.5 GB). And yes, it is faster, and there are optimisations, but come on: it's always faster and there are always optimisations in every generational leap, and that hasn't stopped the memory pool from growing at least 8x.

That’s why I think the SSD is a cheap replacement of a good memory setup.
You have to consider that RAM will be used more efficiently thanks to the SSD's speed: no more filling memory with data you won't immediately use just because HDDs are slow.
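Some rough arithmetic shows why drive speed changes how RAM gets used. The 5.5 GB/s figure is Sony's stated raw SSD throughput and 13 GB is the game-available pool discussed above; the ~100 MB/s HDD figure is a typical assumption on my part:

```python
GAME_RAM_GB = 13.0  # approximate game-available pool per the thread
HDD_GBPS    = 0.1   # ~100 MB/s, typical launch-PS4-era HDD (assumption)
SSD_GBPS    = 5.5   # PS5 raw SSD throughput per Sony's spec

def seconds_to_refill(ram_gb, gbps):
    """Time to completely replace the game-visible RAM contents."""
    return ram_gb / gbps

print(seconds_to_refill(GAME_RAM_GB, HDD_GBPS))  # ≈ 130 s: must pre-cache
print(seconds_to_refill(GAME_RAM_GB, SSD_GBPS))  # ≈ 2.4 s: stream on demand
```

With an HDD it takes minutes to turn over the whole pool, so games keep huge safety caches of "maybe needed soon" data in RAM; at SSD speeds the pool can be refilled in a couple of seconds, so far more of those 13 GB can hold data that's actually on screen.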
 
Yep, but sum of those differences in architecture and clock speeds could mean a >50% increase in actual performance over that PC card. And this is advantageous to Sony as well because they can achieve top tier performance out of a smaller GPU that is much cheaper to manufacture and takes up less die space. Again efficiency :messenger_winking:
The biggest issue is bandwidth.
The PS5 GPU has the same bandwidth as the RX 5700 while being more powerful, and it has to share that bandwidth with the CPU.
That's not great.
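For reference, that shared figure falls straight out of the memory config both parts use (256-bit bus, 14 Gbps GDDR6); how much of it the CPU actually eats is workload-dependent and not something Sony has published:

```python
def bandwidth_gbps(bus_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth: bus width in bits times per-pin data
    rate, divided by 8 to convert bits to bytes."""
    return bus_bits * data_rate_gbps_per_pin / 8

# Both the RX 5700 and the PS5: 256-bit bus of 14 Gbps GDDR6
print(bandwidth_gbps(256, 14))  # 448.0 GB/s
```

So both top out at 448 GB/s, but on PS5 every byte the CPU reads or writes comes out of that same 448, leaving the (higher-clocked) GPU with less than a discrete RX 5700 gets to itself.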
 

Journey

Banned
Xbox One was considered underpowered because the PS4 existed, the next down the line is the Switch.

Xbox Series X exists 🤷‍♂️
 

Tqaulity

Member
The Radeon VII uses the GCN 5 architecture, while the PS4 uses GCN 1, so in reality the RDNA 2 estimate should look even more impressive compared to the PS4.
Yep, nail on the head :messenger_winking: . I was waiting for someone to point this out. I didn't want to go too crazy and over-complicate things even more by making that level of distinction, but you're absolutely right. It's probably closer to 10x the PS4 (which is Sony's goal with every generation).
 

EverydayBeast

ChatGPT 0.1
Good luck having the most powerful anything at any given time when new tech is released every second; as soon as you walk out the door of Best Buy with your new computer, a more powerful one is already being stocked.
 

CuNi

Member
Let loading the 4K texture take 10 minutes. Got killed? Another 5 min of loading the last checkpoint. But hey, who cares? :D

I played Bloodborne on release...
Something inside of me died back then and now I literally can't care about loading times anymore...
 

GymWolf

Member
Well I had the same but decided to sell my gaming PC and leave laptop just for websurfing/netflix. After making my own gaming PC I've noticed there aren't many games that interest me. I play CS:Go most of the time.
i mean, 95% of games that come out are third party that you can play on pc; that's not really a small quantity compared to exclusive games, quite the opposite.
 

GymWolf

Member
The difference between the two consoles :
- Lower Resolution
- Less FPS
- Less Ray Tracing
- Less details
That's about it, most won't even care, just buy what you like.

Let's not forget pricing, more ram, higher the price, never believed that 24gb ram speculation.
not so sure about less details (if you are talking about exclusives vs exclusives, of course)
tsushima or tlou2 don't have anything to envy in any xbox one x game in terms of detail, and the gap between those two consoles is bigger than the one between the ps5 and series x.
 

base

Banned
i mean, 95% of games that come out are third party that you can play on pc, it's not really a small quantity compared to exclusive games, quite the opposite.
You're right, but I just feel I prefer the couch to my chair. Joking. I don't really care that much about 4K 60 fps. I play mostly RPG games and they are more user-friendly on consoles. Yeah, I can plug an HDMI cable into my PC. No thanks. Too much work taking my PC from one room to another.

Honestly, I'm more of a console gamer now; I had a PC between 1995 and 2008. Then I moved to the PS3 and then the PS4. My work used to require a PC, but it doesn't anymore, so I'll stick with my PS4.
 

base

Banned
I forgot to mention: I used to play many titles years ago. Now I choose only a few titles that I want to play. I haven't played many of the AAA games other people praise. Maybe I've changed a little. Unfortunately, I'm not a teenager anymore.
 