At least the MSRP is three digits

RTX 4070 review: An ideal GPU for anyone who skipped the graphics card shortage

Skip the 30-series because of shortages and inflated prices? This one's for you.

Andrew Cunningham
Nvidia's GeForce RTX 4070. Credit: Andrew Cunningham

Nvidia's GeForce RTX 4070 is here. It's the company's first launch in over a year of a graphics card that could charitably be described as "mainstream," both in performance and in price. It costs $600.

It's not productive to keep going back to the also-$600 GTX 1080, at the time the fastest graphics card you could buy anywhere from anyone, and wondering how we got here from there (some of it is inflation, not all of it). But I keep doing it as a reminder that $600 is still more than many people pay for their entire PC, tablet, smartphone, or high-end game console. No other component in a gaming PC has seen its price shoot up like this over the same span of time; a Core i5 CPU cost around $200 in 2016 and costs around $200 now, and RAM and SSDs are both historically cheap at the moment.

To review the 4070 is to simultaneously be impressed by it as a product while also being frustrated with the conditions that led us to an "impressive" $600 midrange graphics card. It's pretty fast, very efficient, and much more reasonably sized than other recent Nvidia GPUs. In today's topsy-turvy graphics card market, I could even describe it as a good deal. But if you're still yearning for the days when you could spend $300 or less on a reasonably performant GPU with the latest architecture and modern features, keep waiting.

The RTX 4070, and a 40-series refresher

Most of these photos are just here to drive home how small the 4070 is compared to the 4080/4090 design.
The 4070 is shorter, narrower, and shallower.

The Founders Edition version of the RTX 4070 we reviewed is considerably smaller in every dimension than the Founders Edition RTX 4080 and RTX 4090, plus most of the 4070/4070 Ti/4080/4090 cards from Nvidia's partners. The two-slot GPU is just 240 mm long (compared to 310 mm for the 4080/4090) and 40 mm tall (compared to 61 mm). That's before you account for the extra space taken up by the 12VHPWR-to-8-pin power connector, which is still pretty bulky even though it only requires a pair of 8-pin connections rather than three or four. But it's still a small enough card to fit in just about any PC case, including prebuilt OEM mini-towers and tiny custom ITX builds.

The RTX 4070 has a lot in common with the 4070 Ti, the GPU originally announced as the "RTX 4080 12GB" before being "unlaunched" and relaunched. They share the same AD104 GPU die, a smaller chip than the AD102 flagship used in the RTX 4090 or the AD103 used in the 4080 series. They also both use 12GB of GDDR6X memory on a 192-bit interface. The 4070 has fewer CUDA cores (5,888, down from 7,680), but the 4070 and 4070 Ti are much more similar than the 16GB and 12GB RTX 4080 cards would have been.

The 4070 Founders Edition is small, but the included 12VHPWR adapter does add bulk. Some newer power supplies include a native 12VHPWR connector. Credit: Andrew Cunningham

That 192-bit memory interface is narrower than the 256-bit interface used by the 3070 and 3070 Ti, which Nvidia has compensated for by adding 32MB of extra L2 cache to the 4070 (for a total of 36MB). That cache and much faster base and boost clock speeds are the 4070's biggest improvements over the last-gen cards.

| | RTX 4090 | RTX 4080 | RTX 4070 Ti | RTX 4070 | RTX 3080 Ti | RTX 3080 10GB | RTX 3070 Ti | RTX 3070 | RTX 3060 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CUDA cores | 16,384 | 9,728 | 7,680 | 5,888 | 10,240 | 8,704 | 6,144 | 5,888 | 3,584 |
| Boost clock | 2,520 MHz | 2,505 MHz | 2,610 MHz | 2,475 MHz | 1,665 MHz | 1,710 MHz | 1,765 MHz | 1,725 MHz | 1,777 MHz |
| Memory bus width | 384-bit | 256-bit | 192-bit | 192-bit | 384-bit | 320-bit | 256-bit | 256-bit | 192-bit |
| Memory clock | 1,313 MHz | 1,400 MHz | 1,313 MHz | 1,313 MHz | 1,188 MHz | 1,188 MHz | 1,188 MHz | 1,750 MHz | 1,875 MHz |
| Memory size | 24GB GDDR6X | 16GB GDDR6X | 12GB GDDR6X | 12GB GDDR6X | 12GB GDDR6X | 10GB GDDR6X | 8GB GDDR6X | 8GB GDDR6 | 12GB GDDR6 |
| TGP | 450 W | 320 W | 285 W | 200 W | 350 W | 320 W | 290 W | 220 W | 170 W |

Because it shares the same Ada Lovelace architecture as the other RTX 4000 GPUs, the RTX 4070 also supports Nvidia's new DLSS 3 and the accompanying DLSS Frame Generation (DLSS FG) features. DLSS is AI-assisted upscaling that takes a lower-resolution image rendered by your GPU and upscales it, improving frame rates at high resolutions while losing relatively little detail.

When combining DLSS upscaling with DLSS frame generation, Nvidia can (under the right conditions) produce a more detailed, higher-frame-rate image without requiring the raw GPU performance it would normally take to render that image. Credit: Nvidia

DLSS FG, you might recall from our RTX 4090 review, "doubles" your frame rate by creating one AI-interpolated frame for every frame that the GPU renders. DLSS FG works in concert with typical DLSS upscaling—Nvidia says that "seven out of every eight pixels" can be generated by AI when DLSS and DLSS FG are both enabled. The result can be a clean-looking, high-resolution, high-frame-rate image rendered using just a fraction of the GPU performance that would be needed to render the same scene natively. Nvidia claims the RTX 4070 can be 1.4 times as fast as an RTX 3080 when DLSS FG is enabled.
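Nvidia's "seven out of every eight pixels" figure falls out of simple arithmetic. The sketch below checks it, assuming DLSS's Performance mode (which renders at half the output resolution in each dimension, a common configuration the article doesn't specify) combined with one AI-interpolated frame per rendered frame:

```python
# Back-of-the-envelope check of Nvidia's "seven out of every eight
# pixels" claim. Assumes DLSS Performance mode, which renders at
# half the target resolution in each dimension (an assumption, not
# something stated in this review).

target_pixels = 3840 * 2160  # one 4K output frame

# Performance-mode upscaling: the GPU renders 1/2 width x 1/2 height,
# i.e. one quarter of the output pixels per frame.
rendered_per_frame = target_pixels * (1 / 2) * (1 / 2)

# Frame generation: one AI-interpolated frame for every rendered frame,
# so across two output frames the GPU renders just one quarter-res frame.
rendered_per_two_frames = rendered_per_frame
output_per_two_frames = target_pixels * 2

ai_fraction = 1 - rendered_per_two_frames / output_per_two_frames
print(ai_fraction)  # 0.875 -- seven of every eight output pixels
```

At less aggressive DLSS quality settings, the rendered fraction rises and the AI-generated share drops below seven-eighths, which is why Nvidia frames the number as a best case.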

But as we found in our RTX 4090 review, it tends to work best when your GPU is already pumping out reasonable frame rates, making imperfections in the AI-generated frames harder to perceive. In other words, it's a lot better at making a 60 fps image look like a 120 fps image than it is at making a 15 fps image look like a 30 fps image. DLSS FG can also add latency, making it harder to recommend for twitchy competitive first-person shooters, and it's compatible with a relatively narrow selection of games (though Nvidia's list of "50-plus titles" includes big names like Microsoft Flight Simulator, Diablo IV, and Cyberpunk 2077).

Performance and power use

Gaming testbed
CPU AMD Ryzen 7 5800X3D (provided by AMD)
Motherboard Asus ROG Crosshair VIII Dark Hero (provided by ASUS)
RAM 64GB DDR4-3200 (provided by Crucial)
SSD Western Digital Black SN850 1TB (provided by Western Digital)
Power supply EVGA Supernova 850 P6 (provided by EVGA)
CPU cooler 280 mm Corsair iCUE H115i Elite Capellix AIO
Case Lian Li O11 Air Mini
OS Windows 11 22H2 with Core Isolation on, Memory Integrity off

Nvidia's reference numbers and other reviews have confirmed that the 4070 performs much like the last-generation 10GB version of the RTX 3080, a card that debuted at $699 in late 2020. It usually wins by a little and sometimes loses by a little, but the numbers are generally within five percent or so of each other. We didn't have a 3080 to test against, but we've provided benchmark data from several other cards to put the 4070's performance in context.

Across all of our rasterized gaming tests (that is, a mix of Direct3D 12, Direct3D 11, and Vulkan titles with no ray-tracing effects enabled), the 4070 was about two-thirds as fast as the RTX 4080 while only costing half as much, which is a pretty good value proposition. Across our test suite, the 4070 could usually turn in frame rates near or above 100 fps when playing at 1440p, and it even gets pretty close to the 60 fps mark running at 4K. Especially in games with DLSS support, the 4070 is more than usable as a 4K GPU, though its 12GB RAM limit may become a problem in newer games with fancier effects.

That said, in terms of raw rasterized game performance, it's not quite as big a slam dunk as the 3070 was when it was released. That GPU could soundly beat the more-expensive previous-generation RTX 2080 and 2080 Super for less money while trading blows with the RTX 2080 Ti. The RTX 4070 occasionally comes within a few percent of matching the 3080 Ti, but it consistently comes up short. That's not bad, especially if you can actually buy a 4070 for $599, but previous midrange GeForce cards have done a slightly better job of bringing last-generation flagship performance down to more reasonably priced PCs.

In games with heavier ray-tracing effects, it becomes clearer why Nvidia is marketing this first and foremost as a 1440p card. Games with lighter effects, like Shadow of the Tomb Raider or Forza Horizon 5, do fine. In both cases, the card maintains its "great at 1440p, solid at 4K" position. In Hitman 3, the GPU falls a bit short of the 60 fps line at 1440p and short of the 30 fps line at 4K. In Cyberpunk 2077, which has always been a bit of a torture test for graphics cards, it does worse, hitting 30 fps at 1440p but struggling at 4K.

Turning DLSS on in these games mostly helps across the board. In Cyberpunk, it's the difference between workable and unplayable at 4K, at least when you have all the ray-tracing effects enabled. For games that can already hit 60 fps or above, it simply provides more of a cushion, helping things run smoother on high-refresh-rate monitors. It's not a "turn all the settings up and forget about it" card like the RTX 4090 is, but games that don't run smoothly should only need a few small settings tweaks to run well.

As for DLSS FG: in our RTX 4090 review, we speculated that the AI-accelerated frame-generation feature might not scale down very well. In games that are already running at 50 or 60 fps, DLSS FG can add a hint of extra smoothness without unduly impacting image quality. But when there are fewer rendered frames to work with, generating the interpolated frames requires more guessing, and some of those guesses are going to be wrong.

We had to crank games up to 8K to start seeing signs of this with the 4090, but for the 4070, you can begin to see it at 4K with games like Cyberpunk 2077 that are running at around 30 fps with DLSS enabled. Lights and fine lines in the Cyberpunk benchmark are noticeably more shimmery on the 4070 than they are on a 4080 with the same settings (or our 3080 Ti with DLSS enabled but without DLSS FG support), and small, fast-moving particle effects become difficult to see clearly. I don't really notice any of this on the 4080 with DLSS FG enabled.

In Cyberpunk, I think the 4070 actually looks a bit better at 1440p with DLSS (but not DLSS FG) enabled than it does running at 4K with DLSS FG enabled. Regular DLSS upscaling (and, to a somewhat lesser extent, AMD's FSR and Intel's XeSS) is great because it can take a borderline-playable game and make it playable. If DLSS FG already requires a game to be running well to deliver on its promise, it's not as much of a selling point for lower-end cards. Expect this to be even more noticeable on any 4060- or 4050-series GPUs we see.

Power efficiency is one of the RTX 4070's best qualities. Credit: Andrew Cunningham

The 4070's performance is reasonably respectable on its own, but the results become particularly impressive when you take its power consumption into account. It consumed less than 200 W in our tests, while last-generation cards that can beat it usually consume 300 W or more. Its efficiency is pretty much in line with the RTX 4080's: two-thirds as fast while using two-thirds as much power.

For another intriguing comparison point, look at the RTX 3060, the most popular GPU on Steam right now; the 4070 runs nearly twice as fast as the 3060 while consuming just 16 percent more power. The 3060 isn't really a 4070 competitor today, but you could find 3060s for way over $599 if you were trying to buy one back in 2021. That's what we mean when we say the 4070 is a reward for those who sat out the RTX 3000-era GPU shortage—this level of performance at this price would have been unthinkable during the peak of the last crypto mining boom.

Power use is one reason why we haven't talked much about how AMD's cards stack up to the RTX 4070. The two to consider are the RX 6800 XT ($579) and the RX 6950 XT ($649), since the company hasn't yet released lower-end RDNA 3-based RX 7000-series cards to follow up the 7900 XTX and 7900 XT. And while the 6950 XT, at least, is capable of beating the 4070 in many benchmarks, it does so while consuming 86 percent more power. Once you factor in the RX 6000 series' poor ray tracing performance, it's difficult to recommend the 6950 XT, even if you do get a bit more raw performance for a similar price.

Not a bargain, but it’s worth what it costs

The RTX 4070 is a nice GPU, but it's not "budget." Credit: Andrew Cunningham

Nvidia GeForce RTX 4070

Reading other coverage of the RTX 4070, the adjective I take the most issue with is "budget." This is a $600 graphics card, and no matter the comparison point you pick—past xx70-series GPUs, the cost of a fully built high-end game console, the cost of the entire rest of the PC you're putting it in—it's not a "budget" card. When people say "budget" or "affordable" about the RTX 4070, they of course mean "affordable relative to the RTX 4090, 4080, and 4070 Ti," but that's not an impressive achievement by itself. The RTX 4000-series cards are all historical outliers when it comes to pricing, and the 4070 is, too.

What is impressive is the GPU’s price-to-performance ratio—roughly as fast as an RTX 3080 but priced $100 lower. And let’s be honest, for the vast majority of its life span, buying an RTX 3080 for anything close to $700 was a virtual impossibility. The RTX 4070 is the first GPU that properly rewards people who decided to wait out the RTX 3000 generation because of the shortages and price inflation that accompanied it.

The card's size and power efficiency, at least in the Founders Edition card we tested, are also impressive for its performance level. You get RTX 3080-ish performance at less than two-thirds the power consumption, and it's small enough that (for once) you don't need to break out the spec sheet for your case to ensure it will fit. Whether you're installing an upgrade in some prebuilt desktop or looking for an efficient card for your tiny mini-ITX custom build, this 4070 will fit fine (though using the 12VHPWR-to-8-pin adapter adds a lot of bulk, especially if you're trying not to bend or pull on the cable in ways that might cause problems).

As for the competition, purely in terms of rasterized gaming performance for the money, the RX 6800 XT ($579) and RX 6950 XT ($649) are competitive with the RTX 4070, but both cards consume way more power (about 50 percent more for the 6800 XT and nearly twice as much for the 6950 XT), and both cards are soundly beaten by the 4070 in any benchmark with ray-tracing effects enabled. AMD's FSR 2 looks fine and is supported by many games, but it doesn't help performance as much as DLSS does at comparable resolutions and quality settings, and AMD has yet to replicate anything like DLSS FG. AMD still offers good performance per dollar if you don't care about any of that other stuff, but it's a long list of things you have to decide not to care about.

In the end, the RTX 4070 is a great graphics card for people who have $600 to spend on a graphics card. It's frustrating that it isn't a little faster or a little cheaper, which keeps it from being as easy to recommend as cards like the 1070 or 3070 were. And it's annoying that people with $200 to $400 to spend have been so underserved by the major GPU makers for so many years. We're way past the days of Nvidia and AMD putting genuine effort into $200 GPUs rather than leaving old and/or compromised models to do the job.

But the 4070 is a solid value for the price, giving you all of the benefits of Nvidia's latest architecture, outstanding 1440p performance, usable-with-compromises 4K performance, and excellent power efficiency. For the people who can afford it in the first place, it's worth the money.

The good

  • A good all-around performer at a price that, in today's GPU market, is relatively reasonable and commensurate with performance
  • Excellent 1440p performance, even with ray tracing turned on
  • Competent at 4K, especially in games that support DLSS
  • Great power efficiency
  • DLSS remains a great performance-boosting feature
  • Actually in stock at MSRP a week after launch

The bad

  • 12VHPWR connector seems especially unnecessary in a GPU that only needs one or two 8-pin connectors (some partner cards do use 8-pin power instead of 12VHPWR)
  • All the partner cards that are still gigantic
  • Limited game compatibility for DLSS 3 and DLSS FG
  • Even putting aside latency issues, DLSS FG struggles to help games that aren't already running pretty well without it

The ugly

  • The conditions that have led us to a "mainstream" GPU that sells for six hundred dollars



Andrew Cunningham Senior Technology Reporter
Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.