
NVIDIA GeForce RTX 4080 Graphics Card Now Expected To Feature 23 Gbps GDDR6X Memory at 340W TBP

tusharngf

Member


According to the leaker, most of the specifications for the NVIDIA GeForce RTX 4080 graphics card remain the same; the only changes are the memory spec and the power design. It looks like NVIDIA will be leveraging Micron's faster GDDR6X memory modules, with the RTX 4080 reportedly getting 23 Gbps dies. This spec update has also resulted in a slightly higher TBP, moving up to 340W from the previous 320W figure.

The NVIDIA GeForce RTX 4080 is expected to utilize a cut-down AD103-300 GPU configuration with 9,728 cores, or 76 SMs enabled out of the total 84 units, whereas the previous configuration offered 80 SMs or 10,240 cores. While the full GPU packs 64 MB of L2 cache and up to 224 ROPs, the RTX 4080 might end up with 48 MB of L2 cache and fewer ROPs due to its cut-down design. The card is expected to be based on the PG136/139-SKU360 PCB.

As for memory specs, the GeForce RTX 4080 is expected to carry 16 GB of GDDR6X clocked at 23 Gbps across a 256-bit bus interface, providing up to 736 GB/s of bandwidth. That is still slightly lower than the 760 GB/s offered by the RTX 3080, which pairs a wider 320-bit interface with a smaller 10 GB capacity. To compensate for the narrower 256-bit bus, NVIDIA could be integrating a next-gen memory compression suite.
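The bandwidth figures in the article follow directly from the per-pin data rate and bus width; a quick sketch (the helper function is illustrative, not from the article):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak GDDR bandwidth in GB/s (decimal gigabytes)."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored RTX 4080: 23 Gbps GDDR6X on a 256-bit bus.
print(peak_bandwidth_gbs(23, 256))   # 736.0 GB/s
# RTX 3080 for comparison: 19 Gbps on a 320-bit bus.
print(peak_bandwidth_gbs(19, 320))   # 760.0 GB/s
```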

NVIDIA GeForce RTX 4080 Series Preliminary Specs:

Graphics Card      | RTX 4080 Ti             | RTX 4080                | RTX 3090 Ti      | RTX 3080
GPU Name           | Ada Lovelace AD102-250? | Ada Lovelace AD103-300? | Ampere GA102-225 | Ampere GA102-200
Process Node       | TSMC 4N                 | TSMC 4N                 | Samsung 8nm      | Samsung 8nm
Die Size           | ~450mm2                 | ~450mm2                 | 628.4mm2         | 628.4mm2
Transistors        | TBD                     | TBD                     | 28 Billion       | 28 Billion
CUDA Cores         | 14848                   | 9728?                   | 10240            | 8704
TMUs / ROPs        | TBD / 232?              | TBD / 214?              | 320 / 112        | 272 / 96
Tensor / RT Cores  | TBD / TBD               | TBD / TBD               | 320 / 80         | 272 / 68
Base Clock         | TBD                     | TBD                     | 1365 MHz         | 1440 MHz
Boost Clock        | ~2600 MHz               | ~2500 MHz               | 1665 MHz         | 1710 MHz
FP32 Compute       | ~55 TFLOPs              | ~50 TFLOPs              | 34 TFLOPs        | 30 TFLOPs
RT TFLOPs          | TBD                     | TBD                     | 67 TFLOPs        | 58 TFLOPs
Tensor TOPs        | TBD                     | TBD                     | 273 TOPs         | 238 TOPs
Memory Capacity    | 20 GB GDDR6X            | 16 GB GDDR6X            | 12 GB GDDR6X     | 10 GB GDDR6X
Memory Bus         | 320-bit                 | 256-bit                 | 384-bit          | 320-bit
Memory Speed       | 21.0 Gbps?              | 21.0 Gbps?              | 19 Gbps          | 19 Gbps
Bandwidth          | 840 GB/s                | 736 GB/s                | 912 GB/s         | 760 GB/s
TBP                | 450W                    | 340W                    | 350W             | 320W
Price (MSRP / FE)  | $1199 US?               | $699 US?                | $1199            | $699 US
Launch             | 2023?                   | July 2022?              | 3rd June 2021    | 17th September 2020

Source: https://wccftech.com/nvidia-geforce-rtx-4080-graphics-card-23-gbps-gddr6x-memory-340w-tbp-rumor/
 
i want one
 

SlimySnake

Flashless at the Golden Globes
50 tflops at only 340 watts? My 12GB 3080 was hitting 400 watts consistently.

Not bad at all. The real question is whether it's real tflops or fake 30-series tflops. The 30 tflops 3080 performed more like a 20 tflops 20-series card.
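For what it's worth, the rumored ~50 TFLOPs figure does fall out of the leaked core count and boost clock (FP32 TFLOPs = CUDA cores × 2 FMA ops per clock × clock in GHz); a quick sketch with an illustrative helper:

```python
# FP32 throughput = CUDA cores * 2 ops per clock (fused multiply-add) * boost clock (GHz).
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Theoretical peak FP32 compute in TFLOPs."""
    return cuda_cores * 2 * boost_ghz / 1000

print(fp32_tflops(9728, 2.5))   # ~48.6 TFLOPs for the rumored RTX 4080
print(fp32_tflops(8704, 1.71))  # ~29.8 TFLOPs for the RTX 3080
```

Whether those paper TFLOPs translate into proportional game performance is the open question, as with Ampere.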
 

winjer

Member
These leaks are all over the place.
Every week there is a different leak, sometimes from the same individual.
I guess one of them will be right. Maybe.
 
I wonder if I can get away with using a 750W gold PSU for this. I will undervolt.

I had an EVGA 3080 12GB in a box, but had to return it due to income issues... Didn't open it.

New job starts this weekend, so I'm in the market for either this or a 4070. I know my PSU can handle the 4070.

I saw some users running a 3090 Ti on a 750W PSU lol, so I think some people really overblow what PSU you need.
 

OZ9000

Banned
I'd be keen to undervolt. I hear it confers a significant power/temperature reduction while only incurring a small performance penalty. Is that true?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Apple laughs as they continue to gain on nVidia while using 1/3 or less of the watts
They are laughing at Nvidia?

GeForce chips are gaming chips; what does Apple have to laugh about when, on the gaming front, their chips don't even make up 1%?


Ohh, but maybe you mean productivity, where CUDA runs pretty much the entire render-farm industry.


Nah, obviously you mean deep learning, where Apple's deep learning server GPUs are deployed by................


hmmm.....
 