Our Verdict
The second Lovelace GPU is a 4K heavy hitter, but its price might hinder its popularity with some gaming PC enthusiasts.
The Nvidia RTX 4080 has finally rolled up to the graphics card scene, and its next-gen abilities are a GeForce to be reckoned with. Sadly, while it stands tall next to its RTX 4090 sibling, it’s probably not the 80-class successor you’ve been waiting for, as it flies a bit too close to the premium price point sun.
The best graphics card battlefield currently feels a bit topsy-turvy, as AMD’s top GPU, the Radeon RX 7900 XTX, is actually an RTX 4080 rival that costs $999 USD. I can’t say for sure whether the RDNA 3 card will use the Lovelace card’s lofty price against it, but it’s certainly worth keeping in mind while judging Nvidia’s contender.
The RTX 4080, however, comes in at $1,199 USD / £1,269 GBP – $500 more than the RTX 3080 at launch. In a way, I feel bad for the next-gen newcomer, as its debut is haunted by both the now canceled 12GB model and its out-of-place price. Yet, if you’re willing to ignore Nvidia’s questionable branding, the GPU is still a 4K heavy hitter that’ll boost fps further than an RTX 3090, and its remarkable DLSS 3 capabilities somewhat sweeten the deal.
Specs
Equipped with 16GB of GDDR6X VRAM, the RTX 4080 is designed with 4K gaming in mind. It packs 9,728 CUDA cores, 1,024 fewer than the RTX 3090 Ti, and features faster 2,505MHz boost clock speeds.
| | Nvidia RTX 4080 FE | Nvidia RTX 4090 FE | Zotac Gaming RTX 3090 Ti |
|---|---|---|---|
| GPU | AD103 | AD102 | GA102 |
| Cores | 9,728 | 16,384 | 10,752 |
| RT cores | 76 | 128 | 84 |
| Tensor cores | 304 | 512 | 336 |
| VRAM | 16GB GDDR6X | 24GB GDDR6X | 24GB GDDR6X |
| Memory bus | 256-bit | 384-bit | 384-bit |
| Memory bandwidth | 716.8GB/s | 1,018GB/s | 936.2GB/s |
| Base clock | 2,205MHz | 2,235MHz | 1,395MHz |
| Boost clock | 2,505MHz | 2,520MHz | 1,890MHz |
| TDP | 320W | 450W | 450W |
| MSRP | $1,199 USD (£1,269 GBP) | $1,599 USD (£1,679 GBP) | $999 USD (£1,090 GBP) |
Unlike the RTX 4090, Nvidia’s AD103 GPU isn’t as ravenous for power, thanks to its 320W TDP. That means you’ll be able to use the same 750W PSU that’s powering your RTX 3080, but you’ll still have to deal with an adapter.
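For the curious, the bandwidth figures in the spec table fall out of a simple formula: bus width (converted to bytes) multiplied by the memory’s effective data rate. Here’s a minimal sketch, assuming the RTX 4080’s GDDR6X runs at an effective 22.4Gbps (that data rate is my working assumption for illustration, not a figure quoted in this review):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate.
# The 22.4Gbps GDDR6X rate below is an assumed figure for illustration.

def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

# RTX 4080: 256-bit bus paired with 22.4Gbps GDDR6X
print(memory_bandwidth_gbps(256, 22.4))  # 716.8, matching the spec table
```

The same arithmetic explains the 4090’s lead: its wider 384-bit bus moves more bytes per transfer, so it hits a higher peak even at similar clock rates.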
Design
In our RTX 4090 review, I called the Asus TUF gaming model an aesthetic nightmare, but Nvidia’s Founders Edition RTX 4080 design isn’t quite as offensive. It even provides a splash of subtle lighting, complete with a customisable RGB strip and an illuminated GeForce logo. Will it stand out at a LAN event? Probably not, but it helps prevent the card from looking like a dark obelisk lurking within your case.
Don’t get me wrong, the RTX 4080 Founders Edition is still ridiculously chonky, and the fact it’s the same size as the RTX 4090 is perplexing. However, the reference GeForce shroud avoids committing too many visual sins while keeping itself quiet and cool, with temperatures in the region of 60-65 degrees Celsius under load.
Just like every other RTX 4000 model, the card’s PCIe 5 power connector lives near the center of the card. Personally, the 16-pin socket’s placement drives me around the bend, and it makes me want to give up on cable management completely. Thankfully, the RTX 4080 uses a less intrusive three-way 6+2-pin to 12VHPWR adapter, so I didn’t have to wrestle with a Medusa head of snakes this time around (or talk myself into buying the best power supply with PCIe 5 support).
Performance
The RTX 4080 doesn’t put on quite the same performance show as the RTX 4090, but it’s still a killer 4K GPU that’ll force the RTX 3090 into retirement. Add DLSS 3’s frame generation into the mix, and you’ve got a card that’ll potentially blow most other high-end options, including the RTX 3090 Ti, out of the water.
Just for a moment, let’s pretend the 4080’s magical DLSS abilities are non-existent and check over some unaided frame rate stats. In Hitman 3, the card pushes out 102fps with every setting cranked to ultra at 4K, matching the abilities of the MSI RTX 3090 Suprim X. With ray tracing turned on, the Lovelace GPU actually beats the souped-up custom card, achieving a 39fps average.
Total War: Warhammer 3 tests produce a similar result, with the RTX 4080 pushing out a respectable 74fps. It’s not quite the staggering result flaunted by the RTX 4090 (around 103fps), but it’s still impressive when you consider the strategy game’s strenuous battle map visuals.
On average, the RTX 4080 churns out 28% fewer frames than the 4090, and that figure increases to around 40% when it comes to ray tracing. Both cards struggle to reach 60fps with fancy lighting enabled at 4K, but just like before, that’s where DLSS 3 comes in to save the day.
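To show where a figure like that 28% comes from, here’s the calculation using the Warhammer 3 numbers above (74fps for the RTX 4080 against roughly 103fps for the RTX 4090). This is just a sketch of the arithmetic, not additional benchmark data:

```python
# Percentage deficit: how many fewer frames one card delivers relative
# to a baseline card, using the Warhammer 3 figures quoted above.

def percent_fewer(card_fps: float, baseline_fps: float) -> float:
    """Return how many percent fewer frames card_fps delivers vs baseline_fps."""
    return (baseline_fps - card_fps) / baseline_fps * 100

deficit = percent_fewer(74, 103)
print(f"{deficit:.0f}% fewer frames")  # 28% fewer frames
```

The single-game result happens to land right on the review’s cross-game average, though individual titles will swing either side of it.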
DLSS 3
Frame Generation is a GPU game-changer that’ll help the best gaming PC builds perform ridiculous frame rate tricks, and the RTX 4080 really embraces Nvidia DLSS 3. In Cyberpunk 2077, the tech punts performance into triple-digit territory, producing 110fps with ultra 4K ray tracing settings enabled.
The same leap applies to A Plague Tale: Requiem, with average frame rates increasing from 95fps to 129fps. Of course, even the former figure is spectacular, as switching off both Super Resolution (DLSS 2’s AI upscaler) and Frame Generation results in frame rates below 60fps. If you’re picky about potential DLSS visual artifacting, you could stick with just Frame Generation and still achieve a respectable 89fps.
DLSS 3 is incredible, and Nvidia’s tech is undeniably a selling point for the 4080. Sure, it still partly relies on Super Resolution features that are available to all RTX series cards, but Frame Generation can produce stunning results on its lonesome. I’d personally stick to using both as a package, especially if you want to avoid dialing down any settings for a good few years.
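Frame-rate gains like these are easier to appreciate as frame times. Converting the A Plague Tale: Requiem averages above (95fps and 129fps) shows DLSS 3 shaving nearly 3ms off every frame; a quick sketch of the conversion:

```python
# Frame time in milliseconds is simply 1000 / fps; converting the
# A Plague Tale: Requiem averages shows the per-frame DLSS 3 uplift.

def frame_time_ms(fps: float) -> float:
    """Return the average time budget per frame in milliseconds."""
    return 1000 / fps

print(round(frame_time_ms(95), 1))   # 10.5 (at 95fps)
print(round(frame_time_ms(129), 1))  # 7.8 (at 129fps)
```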
Can the RTX 4080 handle 8K?
Talking about 8K performance feels silly, as even the best gaming monitor options available right now are capped at 4K. Yet, here I am, about to delve into whether the RTX 4080 can actually handle future resolutions. I can’t help but blame AMD for this, as the recent Radeon RX 7900 XT reveal teased a DisplayPort 2.1 connector with future-proof capabilities.
To simulate an 8K monitor setup, I took advantage of Nvidia’s Dynamic Super Resolution (DSR) setting within Control Panel, which forces the GPU to run things at 8K. Naturally, I can’t speak for how the resolution actually looks, as current screens will still simply display 4K. Yet, the experiment reveals that the RTX 4080 can in fact pull off respectable performance, provided you’re willing to dial down settings and use DLSS.
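For context on why 8K is such a heavy lift, it renders exactly four times the pixels of 4K; a quick check of the raw pixel counts (standard 16:9 resolutions assumed):

```python
# 8K (7680x4320) pushes exactly four times the pixels of 4K (3840x2160),
# which is why rendering at 8K via DSR is such a demanding test.

uhd_4k = 3840 * 2160   # 8,294,400 pixels
uhd_8k = 7680 * 4320   # 33,177,600 pixels

print(uhd_8k / uhd_4k)  # 4.0
```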
Hitman 3 is a great example of a playable 8K game, as the 4080 can run it with ray tracing enabled and maintain just over 60fps. Again, bumping every setting to low and harnessing DLSS is a must, but I wouldn’t call that an unreasonable tradeoff.
I also tried out Warhammer 3 at 8K, but the experience wasn’t exactly stellar. I’m not saying it was a slideshow, as it actually ran at around 46fps, but dropping the strategy game’s settings that far offended my eyes a bit too much.
Cyberpunk 2077 actually runs fairly well at 8K, thanks to the power of DLSS 3. Sacrificing all settings above low to the performance gods helps the dystopian RPG run at just under 60fps – a remarkable achievement for a game that typically makes GPUs sweat.
Verdict
All things considered, the Nvidia RTX 4080 is a mighty graphics card with a menacing price tag. By rights, it should be a pixel-pushing hero that offers premium performance for less than a 4090. While the latter is true, Nvidia’s latest 80-series card costs a chunk more than the RTX 3080 and 2080, and straying away from traditional pricing may prompt some enthusiasts to wait for the RTX 4070.
So, should you buy an RTX 4080? Well, if you’re feeling impatient, it’ll provide your machine with marvelous DLSS 3 superpowers. However, I feel like that particular group of frame-rate fusspots has probably splashed out on an RTX 4090 already, and the rest are waiting to weigh up the entire Lovelace lineup.
Still, the RTX 4080 is an excellent next-gen option that throws a heavy 4K punch, so if you do give in to its questionable MSRP, you won’t feel cheated when it comes to raw performance, features, and AI-powered capabilities. That said, custom variants will likely cost even more, and your wallet will probably end up screaming either way.