Nvidia's GeForce GTX 1060, RTX 2060, and RTX 3060 graphics cards are some of the most widely used GPUs in all of PC gaming. Four of Steam's top five GPUs are 60-series cards, and the only one that isn't is the even lower-end GTX 1650.
All of this is to say that, despite all the fanfare for high-end products like the RTX 4090, the new GeForce RTX 4060 is Nvidia's most important Ada Lovelace-based GPU. History suggests that it will become a baseline for game developers to aim for and the go-to recommendation for most entry-level-to-mainstream PC gaming builds.
The RTX 4060, which launches this week starting at $299, is mostly up to the task. It's faster and considerably more power efficient than the 3060 it replaces, and it doesn't come with the same generation-over-generation price hike as the higher-end Lovelace GPUs. It's also a solid value compared to the 4060 Ti, typically delivering between 80 and 90 percent of the 4060 Ti's performance for 75 percent of the money.
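For a concrete sense of what that value claim means, here's a quick back-of-envelope sketch; the $399 figure is the 8GB 4060 Ti's launch MSRP, supplied here for illustration rather than taken from the numbers above:

```python
# Napkin math for the 4060-vs-4060 Ti value comparison. Assumes the 8GB
# 4060 Ti's $399 launch MSRP; the 80-90 percent performance range is the
# one described in this review.
msrp_4060, msrp_4060_ti = 299, 399
for rel_perf in (0.80, 0.90):
    value_ratio = rel_perf * msrp_4060_ti / msrp_4060
    print(f"{rel_perf:.0%} of Ti performance -> {value_ratio:.2f}x "
          "the performance per dollar")
# 80% -> 1.07x, 90% -> 1.20x: the 4060 comes out ahead either way.
```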
That doesn't mean there aren't small things to gripe about. Stepping down from the 3060's 12GB of memory to a stingier 8GB doesn't feel great, especially when poorly optimized PC ports seem to want as much video memory as they can get. It's only a modest speed upgrade over the RTX 3060. And DLSS Frame Generation, the big new 4000-series feature that Nvidia plays up in all of its announcements, is less impressive on midrange cards than it is on high-end ones.
The RTX 4060
| | RTX 4090 | RTX 4080 | RTX 4070 Ti | RTX 4070 | RTX 4060 Ti | RTX 4060 | RTX 3060 Ti | RTX 3060 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CUDA Cores | 16,384 | 9,728 | 7,680 | 5,888 | 4,352 | 3,072 | 4,864 | 3,584 |
| Boost Clock | 2,520 MHz | 2,505 MHz | 2,610 MHz | 2,475 MHz | 2,535 MHz | 2,460 MHz | 1,665 MHz | 1,777 MHz |
| Memory Bus Width | 384-bit | 256-bit | 192-bit | 192-bit | 128-bit | 128-bit | 256-bit | 192-bit |
| Memory Clock | 1,313 MHz | 1,400 MHz | 1,313 MHz | 1,313 MHz | 2,250 MHz | 2,125 MHz | 1,750 MHz | 1,875 MHz |
| Memory Size | 24GB GDDR6X | 16GB GDDR6X | 12GB GDDR6X | 12GB GDDR6X | 8GB or 16GB GDDR6 | 8GB GDDR6 | 8GB GDDR6 | 12GB GDDR6 |
| TGP | 450 W | 320 W | 285 W | 200 W | 160 W | 115 W | 200 W | 170 W |
The RTX 4060 Ti was an outlier compared to the 3060 Ti it replaced, shipping with fewer CUDA cores and half the memory bus width, and leaning on higher clock speeds, additional L2 cache, and other architectural upgrades to close the gap. The result was a card that didn't always feel like much of an upgrade, especially at higher resolutions.
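To make that concrete, here's a hedged back-of-envelope from the spec table above, treating FP32 throughput as 2 FLOPs per CUDA core per clock and GDDR6's effective data rate as 8x the listed memory clock (napkin math, not a benchmark):

```python
# Napkin math from the spec table: raw FP32 throughput (2 FLOPs per CUDA
# core per clock) vs. memory bandwidth (GDDR6 moves 8 bits per listed
# memory-clock cycle per bus pin). Illustrative only.
cards = [
    # name, CUDA cores, boost MHz, bus bits, memory MHz
    ("RTX 4060 Ti", 4352, 2535, 128, 2250),
    ("RTX 3060 Ti", 4864, 1665, 256, 1750),
]
for name, cores, boost_mhz, bus_bits, mem_mhz in cards:
    tflops = 2 * cores * boost_mhz * 1e6 / 1e12
    gb_per_s = (bus_bits / 8) * (mem_mhz * 8) / 1000
    print(f"{name}: ~{tflops:.1f} TFLOPS, ~{gb_per_s:.0f} GB/s")
# ~22.1 TFLOPS / 288 GB/s vs. ~16.2 TFLOPS / 448 GB/s: notably more
# compute, far less bandwidth, which is why higher resolutions suffer.
```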
It's only when you look at the name (which invites comparison to last-gen x60 cards instead of x50 cards) and the price inflation that's hit every segment of the GPU market that it becomes a bitter pill. Meaning: the card isn't bad; the PC gaming market in general is trash when an "entry-level GPU" is ~$300 and an "enthusiast GPU" is ~$1,000+.
The market being trash is kind of orthogonal to "I need something to play games". Nothing you can do about the former if the latter is true; you just need direction to the least-worst option. My personal advice would be one of "buy used", "check out a console", or "forget about gaming until the market is more sane". But if none of those is an option and you just have to get something right now, this is a very solid choice. Which makes me gag a bit for old-man-yells-at-cloud reasons, but current reality is current reality, and ultimately I can't really fault the author for operating within it.
Just caught up on this thread this evening, since I've been working on other things. It's clear this review isn't doing the job for some of you - I can at least try to give a bit more context about my thought process and the realities of recent GPU reviews. I'll split this up using pithy subheads so things don't get too lost in the sea of text. :)
I Suddenly Review GPUs Now: An Incredible True Story
I hear and sympathize with the complaints asking for more context, particularly re: last-generation cards, other competing cards, etc. This is mainly a problem of equipment. I started doing GPU reviews in November/December of last year, with the RTX 4080 and 7900 XTX/XT. I had very few GPUs to start with, and a totally different testbed from what Sam had been using - we're a remote workplace without a big centralized store of equipment to draw on, so what's in my closet at any given point is pretty much what I have.
Unlike at Gamers Nexus or LTT or some other big YouTube channels, there is not a "team" here reviewing GPUs or working closely and collaboratively on these things; there's not a person who takes the pictures and a person who runs the tests and a person who draws up the charts and a person who writes the script. It's all mostly just One Guy, and now the One Guy is me.
So I was starting from a position of not really wanting to use numbers from old reviews - I can't be sure how comparable they are, given that I have a totally different CPU/motherboard/RAM/etc. And I only had one or two other GPUs in my possession. I've tried to get more, and had some success, but companies generally don't have old GPU stock just sitting around to send out to people (I've definitely asked).
I have tried to do older-gen comparisons when I can; for the RX 7600 review, I ended up going to a friend's house and borrowing his own personal 5600 XT, something he generously allowed me to do. Not something that's repeatable for many GPUs, though!
This is not to be all woe-is-me; the situation just is what it is. I just want to key y'all in to the behind-the-scenes difficulties I've been dealing with while also juggling the work I had already been doing before. I think the CPU reviews have gotten better now that I've been doing them for a couple of generations and have a deeper well of hardware to draw from; I think the GPU reviews will get there too.
On Used Cards, And The Value Of Efficiency
When I do GPU reviews (or CPU reviews), I basically start by pretending that a friend who plays PC games but isn't particularly tech-savvy has come to me and told me they want to build, buy, or upgrade something.
I don't really consider the used market a ton in this context, because buying used tech from eBay, Amazon, or whatever third-party source you wanna name has a lot of potential problems and pitfalls. You don't know where components have been or what they've been doing; you might get scammed and sent some counterfeit part; you might not be covered by whatever warranty the card originally had.
It's clear I should at least be giving more high-level context about the secondhand market for Ars' readership, though, and I'll try to do that. But I don't think "just go buy a used 3070" is like, slam-dunk super-actionable buying advice for a lot of people.
I will also say that I personally place more value on power consumption and power efficiency than most GPU reviewers. Most people seem to treat it as incidental, and put performance above everything. That's just a matter of perspective, and if you want to buy a 300W GPU that's cheaper than but performs the same as a 200W GPU (just pulling numbers out of the air), that's fine! But assuming you're going to use a GPU for 2, 3, even more years, efficiency does eventually have a dollar value.
This also plays into the "recommend used cards" thing, incidentally, since this gen's new cards have been so much more efficient than last gen's.
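To put a rough dollar figure on that, here's a minimal sketch; every input (the 100 W gap, hours played, electricity rate) is an illustrative assumption of mine, not a measurement:

```python
# Illustrative electricity-cost gap between a hypothetical 300 W GPU and a
# 200 W GPU over a few years of ownership. All inputs are assumptions;
# substitute your own usage and local rates.
watt_gap = 300 - 200      # extra draw under load, in watts
hours_per_week = 20       # assumed gaming time
years = 3                 # assumed ownership period
usd_per_kwh = 0.15        # assumed electricity rate

extra_kwh = watt_gap / 1000 * hours_per_week * 52 * years
print(f"~{extra_kwh:.0f} kWh extra, roughly ${extra_kwh * usd_per_kwh:.0f} "
      f"over {years} years")  # ~312 kWh, roughly $47
```

Not a fortune, but enough to narrow (or erase) an up-front price gap between two otherwise similar cards.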
Things I Think I Am Right About, Actually!!
I do object to the idea that I'm not considering the wider market at all; consider the "no price hike" bullet point, for example. That's listed in "the good" because the RTX 3060 came out in February of 2021 for $329, which is about $380 in mid-2023 dollars. A super-soft PC market has helped CPU and SSD and RAM prices stay level or go down over the last year or two, but given the way costs of most consumer goods have spiked recently, it is actually meaningful when the price of a GPU doesn't go up compared to last gen. And lest we memory-hole the GPU shortage, it's not like you could actually buy a 3060 for $329 for most of its life. For a GPU to come out on the other end of that and cost less than its predecessor's MSRP is, honestly, pretty miraculous.
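For the curious, that inflation adjustment is straightforward CPI math; the index values below are my own rounded approximations, used purely for illustration:

```python
# Rough CPI-U adjustment behind "the RTX 3060's $329 is about $380 in
# mid-2023 dollars." Index values are rounded approximations
# (Feb 2021 CPI-U ~263, mid-2023 ~304), not exact BLS figures.
launch_price = 329
cpi_feb_2021, cpi_mid_2023 = 263, 304
adjusted = launch_price * cpi_mid_2023 / cpi_feb_2021
print(f"${adjusted:.0f} in mid-2023 dollars")  # ~$380
```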
On the subject of "1080p is SO five years ago, my $300 GPU should play 4K now!" - I think this line of argument is a little strange? Even ignoring the fact that most Steam games are played at 1080p, people who make this point sort of act like games haven't gotten more demanding as time has progressed. It's not as though "1080p" is some static target that a 1060 is just as good at hitting now as it was in 2016; a game released in 2023 needs a better GPU to run at 1080p than one released a few years ago.
I'll also say that I have a really hard time caring a ton about the meta-conversation going on around Nvidia's model numbers this gen - that so-and-so card should actually be a 70 series, or a 50 series, or whatever. I feel like performance relative to last-gen cards and current competitors is always the most important and most relevant thing. I get that everyone wants to go back to the days of the 1060. I do too! But we haven't been there in a while - it's not like the 3060 was such a huge upgrade over the 2060 Super it replaced, after all.
I dunno if any of this will satisfy anyone, but it's also important to me to try to earn and maintain trust, and there are some valid points being made. I wouldn't change my analysis or conclusions here; I think in the new GPU market as it currently exists, the 4060 is the easiest and most straightforward recommendation, the one with the fewest caveats. At the end of the day, most folks just want to plug something in and play games, not think about whether their game uses an API or rendering effect that the GPU doesn't like.
Action items
So the tl;dr - I will try to pay more attention to the used market, including highlighting the most trustworthy sources for buying used GPUs. I'll keep working on getting more alternate and older GPUs for comparisons, a situation that will also improve over time as I do this for longer. And on things like power efficiency, where I give more weight than others generally seem to, I'll try to be more upfront about that weighting.
I will also ask you all to keep criticism constructive (as many of you have!) and to remember that I am not personally in charge of setting GPU prices, or naming them, or dictating the rate of inflation. It's also not my job to root for underdogs. I really want AMD and Intel to give Nvidia a run for its money - competition from AMD (and, sort of, Apple/Arm) has shaken up the CPU space in a super exciting way over the last five years. It's just clear from the Steam survey that none of AMD or Intel's recent products are really doing that, and based on the data I've gathered, that's the general conclusion I've come to, too. There are lots of cards that are "good, except for..."
I chuckle whenever someone uses the RTX 4070 as an example of poor GPU sales. This started when someone pointed out how terrible it was doing in unit sales at the German retailer Mindfactory. Someone has been conveniently aggregating a weekly sales update for Mindfactory for some time; AMD fans often used those numbers to show how great AMD CPU sales were doing...
So how terrible was it doing? It was their best-selling card. Yeah, I don't get it either. But since then, I've followed the Mindfactory results when available.
The 4070 has been their best-selling card, every single week, since it was released.
The most recent results I could find are from week 24, and once again the RTX 4070 was the best-selling card.
You can also check the Steam survey. Only six weeks after launch, the RTX 4070 made the cutoff to appear. I don't think I've ever seen a card jump onto the survey this fast.
In the Steam survey, the RTX 4070 has already almost caught the RX 6800 XT, which has been out for over two years. I remember many people saying it was much better to buy an RX 6800 XT, with all the great deals on it.
I think this narrative of the new generation selling poorly is more a projection of wanting to see Nvidia punished than an actuality.