Tuesday, November 1, 2022

Nvidia GeForce RTX 3090 Specifications

It's hard to believe that Nvidia's GeForce RTX 3090 has been around for over two years now, though it has now been displaced by the new RTX 4090. You can see where the RTX 3090 ranks among other cards in our GPU performance hierarchy.

When it originally launched in September 2020 alongside the RTX 3080, people complained about the price — it was a Titan-class price without the Titan-class extras like improved professional application drivers and support. And then the cryptocurrency mining boom of 2020 through 2022 hit, and suddenly $1,500 for an RTX 3090 that could potentially earn (at its highest point) over $30 per day seemed like a steal.

Naturally, prices shot up to compensate, scalpers got super involved with graphics cards, and the rest is sort of history. Painful history, at least for gaming enthusiasts — miners loved the RTX 3090. This was the fastest consumer graphics card throughout 2021, packed with features to make 4K gaming truly viable. And yet most of these cards probably spent all of 2021 chipping away in the Ethereum mines. RIP, Ethereum mining! But let's look at the specifications.

Nvidia GeForce RTX 3090 Specifications
Process Technology: Samsung 8N
Transistors (Billion): 28.3
Die Size (mm^2): 628.4
Streaming Multiprocessors: 82
GPU Cores (Shaders): 10496
Tensor Cores: 328
RT Cores: 82
Boost Clock (MHz): 1695
VRAM Speed (Gbps): 19.5
VRAM Bus Width (bits): 384
L2 Cache (MB): 6
Render Outputs: 112
Texture Mapping Units: 328
FP32 TFLOPS (Single-Precision): 35.6
FP16 TFLOPS (Sparsity): 142 (285)
Bandwidth (GB/s): 936
Total Board Power (Watts): 350
Launch Date: September 24, 2020
Launch Price: $1,499
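
The bandwidth figure in the table follows directly from the VRAM speed and bus width listed above — it's just standard GDDR6X arithmetic, sketched here with the table's own numbers:

```python
# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte
vram_speed_gbps = 19.5  # GDDR6X per-pin data rate from the table
bus_width_bits = 384    # memory interface width from the table

bandwidth_gbs = vram_speed_gbps * bus_width_bits / 8
print(bandwidth_gbs)  # 936.0, matching the table's GB/s entry
```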

RTX 3090 uses the same GA102 chip as several other Nvidia GPUs — RTX 3080, RTX 3080 12GB, RTX 3080 Ti, and RTX 3090 Ti are all based on GA102. Back in 2020, there probably weren't that many nearly fully functional GA102 chips available, so supplies of the 3090 were quite limited, and Nvidia pushed the narrative that these were "professional cards, not just for gamers." That's despite having GeForce branding right in the card name, so you can guess how well received such claims were.

With 82 of the available 84 SMs enabled, and the full 384-bit memory interface matched up with 24 1GB GDDR6X chips, this was a beast of a card. Power consumption on the reference model was rated at 350W, but third-party AIC vendors pushed things well into the 400W and higher range on factory overclocked cards.

One of the big issues with the RTX 3090, particularly for cryptocurrency miners, is that the GDDR6X chips ran hot. Fire up Ethereum mining and many 3090 cards, including the Founders Edition, would quickly reach 110C on the memory before starting to throttle. Dismantling the cards to replace the original thermal pads with better variants was common, and even in gaming workloads you could easily hit more than 100C on the memory.

The root of the problem was that, because Micron was only making 1GB (8Gb) capacity GDDR6X chips at the time, the RTX 3090 required half of the 24 chips to reside on the back of the PCB, while the other half were on the same side as the GPU. Take a look at that chunky Founders Edition up top: It has a triple-slot cooler to help keep all the chips cool... except that only covers the "front" of the PCB; the back of the PCB just has a metal cover with thermal pads and no active cooling. I can state from experience that the back of the card — with the RTX 3090 logo seen above, which usually faces upward in most computer cases — can get extremely toasty under load!

Theoretical compute performance for the RTX 3090 tips the scales at 35.6 teraflops, and the Tensor cores can do up to 285 teraflops of FP16 with sparsity enabled. All of that number crunching prowess is put to good use in ray tracing games, and Nvidia's DLSS feature can leverage the Tensor cores for image upscaling that looks almost as good as native, at least in Quality mode. Fortunately, almost all of the most demanding ray tracing games support DLSS, which is what actually brings 4K ray-traced gaming into reach.
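
Those teraflops figures fall straight out of the spec table: each FP32 shader does two floating-point operations per clock (a fused multiply-add), and the table's FP16 Tensor numbers are 4x the FP32 rate, doubled again with sparsity. A quick sanity check using the table's values:

```python
shaders = 10496           # GPU cores from the spec table
boost_clock_hz = 1695e6   # boost clock from the spec table

# FP32: 2 ops per shader per clock (fused multiply-add counts as two)
fp32_tflops = shaders * 2 * boost_clock_hz / 1e12
print(round(fp32_tflops, 1))  # 35.6

# Tensor FP16 is 4x the FP32 rate here, doubled again with sparsity
fp16_tflops = fp32_tflops * 4
fp16_sparse_tflops = fp16_tflops * 2
print(round(fp16_tflops), round(fp16_sparse_tflops))  # 142 285
```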
