The Price is Wrong: This is What GPUs Should Have Cost

Based on the fact that the current generation of Mercedes sedans goes no faster than those from four generations ago, my 2024 S-Class should have cost only $5,000 new. And housing? Why, on a per-square-foot basis, new homes cost far *more* than those built 10 years ago! We poor consumers are getting exploited on all fronts. The government should pass a law mandating at minimum a 30% price-performance increase in any and all new products released.
 
Prices are high and that won't change anytime soon. TSMC is milking everyone, Taiwan's power bill is up 30%, there's inflation, GDDR7 is entering the stack at a much higher price, and shipping costs are much higher today as well, with the AI boom on top. We won't see cheap GPUs anytime soon.

Next gen is going to be weird. The 5090 will demolish everything and will probably cost $2,000-2,500.

Nvidia will be fully high-end focused and AMD low-to-mid-end focused, with no high-end SKUs at all.

I expect Nvidia to use Samsung or Intel fabs for low-to-mid-end chips fairly soon, starting with the 6000 series.

Nvidia doesn't need prime nodes for these chips. Prime nodes are for enterprise/AI and the highest-end 90 and 80 range, maybe the 70 series, depending on SKU.

Look forward to Intel making dGPUs at their own 20A/18A fabs. They are coming fast for the low-to-mid-end GPU segment, and DX9-10 gets less and less relevant; Intel does well in DX11 and newer. They are not giving up on this market. This is AMD's prime market they are coming for.

Intel and AMD can fight for scraps while Nvidia owns the high end and probably the mid range too. Nvidia doesn't really care about the low end unless it's highly profitable, which it will only be if they make the low-end stuff on cheap nodes.
 
Used market is king for me, 3070 at $300 that I got one year ago still does the job.

Anyway nice article, thanks!

That's strange! According to AMD fanboys you need at least 12GB, if not 16GB, in 2024! And most PC gamers use 4K native or even higher!

Yet the 3070 8GB beats the 6700 XT 12GB with ease at 4K/UHD native on ultra settings in terms of minimum fps.

[image: minimum-fps-3840-2160.png]


The 3070 has 15% higher minimum fps, and both cards launched at $480-500.

Not that either of these cards does 4K well, but VRAM is not the problem. :laughing:

Also, DLSS is far superior to FSR, which will help a lot with longevity down the road.

Developers embraced upscaling and it's here to stay. DLSS/DLAA reigns supreme, and most PC gamers will gladly use upscaling if performance goes up by 50-100% or more, with built-in anti-aliasing and sharpening. That's just reality.

Let's hope AMD can improve FSR a lot; after all, they changed approach recently and will go the AI route as well. Nvidia is just years ahead.
 
Nah, I don't think more analysis of this is necessary. The relative weight that the cost per frame/value angle carries in your overall approach to GPU coverage makes for some predictable and frankly rather dry reading/watching.

We could do with a bit less moralizing and instead perhaps some more passion about what excites you/attracted you to this industry in the first place, if you ask me.
 
Currently priced GPUs are still overpriced. Telling Nvidia and AMD that today's prices are what they should have been all along, just because they're lower than launch prices, is stupid.

The 4080 originally came out at $1,200. That's easily $500 too high, especially considering that the 4080 is only about 25% faster than the 3080 10GB, which had a $700 MSRP. I could have seen the 4080 priced at $749, maybe $799 at the very most, but $1,200 is just f'ing gross.
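As a quick sanity check on that, here's a back-of-envelope sketch in Python. Note the ~25% relative-performance figure is this post's rough estimate, not a measured benchmark:

```python
# Back-of-envelope perf-per-dollar comparison using the numbers above.
# rel_perf is normalized to the 3080 10GB; 1.25 is the post's rough
# "~25% faster" estimate, not a benchmark result.
cards = {
    "RTX 3080 10GB": {"msrp": 700, "rel_perf": 1.00},
    "RTX 4080": {"msrp": 1200, "rel_perf": 1.25},
}

for name, card in cards.items():
    # Dollars paid per unit of 3080-relative performance (lower is better).
    print(f"{name}: ${card['msrp'] / card['rel_perf']:.0f} per 3080-equivalent")

# Price at which the 4080 would merely match the 3080's value:
print(f"value-matching price: ${700 * 1.25:.0f}")  # $875
```

Even just to match the 3080's cost per frame, never mind improve on it, the 4080 would have needed to land at about $875, so the $749-799 suggested above would have been a genuine value gain.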

The 4080 Super version offers no performance gains and is priced at $1000. That's still $200-300 too much.

Hopefully prices keep coming down, but I guess we wait and see....though I wouldn't really hold my breath.

We asked you, when buying a new GPU, how much performance you are after at the same price, filtering this by how often you upgrade your GPU.

This is how I used to upgrade in the past.

I used to run SLI builds. Never top-end cards, but the second or third from the top. I'd run two cards in SLI, then my upgrade would be to wait a couple of generations and generally pick up a pair of lower high-end GPUs that were coming up on the end of production and had been reduced in price to move remaining inventory.

8800 GTS 512MB cards in SLI gave similar performance to a single GTX 280.
Around the time the GTX 4xx series was launching, I came across some NIB GTX 280 cards that ran me around $280 each (which surprised me, because the 280s were out of production, having been replaced by the GTX 275). I moved to them.

GTX 280s in SLI gave similar performance to a single GTX 570.
At the time the GTX 6xx series was coming out I picked up two GTX 570s for around $250 a pop.

GTX 570s in SLI gave about 15% less performance (if memory serves me right) than a single GTX 970.
I pondered going with the 970 since my 570s were getting long in the tooth, but this was the first time ever that I decided to go with a single top-end card, and I picked up a 980 Ti. Usually I buy a generation behind what is releasing, but this time I went with the current generation and opted for the top-end card.

That 980Ti lasted me 6 years until I got a chance to replace it (was starting to have issues) with a 3080 (took a while to find one that didn't cost nearly twice the MSRP due to the bullshit shortages caused by the manufacturers directly selling off most of their inventory to miners).

Then I got my hands on a 3080 Ti that I've been using for the past 16 months, and I recently sold the 3080 for $400. If things pan out like they did with the 980 Ti, I'm hoping to get another 4 years out of this 3080 Ti. The cost of GPUs is so stupid these days that thinking about upgrading is actually rather off-putting, and I'm hoping to avoid spending money on one for a good long while.
 
That's strange! According to AMD fanboys you need at least 12GB, if not 16GB, in 2024! [...]
It's ironic that you wrote all that slop, quoted TechPowerUp, then posted it on TechSpot, while ignoring the TechSpot testing for games like RE:V and TLOU2 that shows a significant performance loss on 8GB cards, along with a host of rendering issues.

But some people gotta meatshield their multi-billion-dollar corpos, I guess.
 
It's ironic that you wrote all that slop, quoted TechPowerUp, then posted it on TechSpot, while ignoring the TechSpot testing that shows a significant performance loss on 8GB cards. [...]
Funny that those two games are AMD sponsored and both were rushed console ports as well.

And they were fixed long ago with patches.


The irony that you pick two rushed console ports, sponsored by AMD, to prove your point.

Meanwhile the 3070 8GB beats the 6800 16GB in Avatar in minimum fps at 4K/UHD using the Ultra preset :joy:

It runs like **** on both, but the 3070 still wins, proving 8GB is plenty.
And DLSS easily beats FSR in this game, like in every other game.


[image: min-fps-3840-2160.png]
 
Nvidia is like Apple now, where half the selling point is the brand name, not the performance per dollar. My GTX 1080 cost a fraction of the current-gen equivalent. Less than fun. Too bad they have taken root here in purgatory; it will require WW3 to have them join 3dfx, so bring out the nukes, my mans.
 
Used market is king for me, 3070 at $300 that I got one year ago still does the job. [...]

I don't have a current need for anything more powerful than my RTX 2060, but if it were to die today I would be looking at the used market or Intel's Arc series.
 
That's strange! According to AMD fanboys you need at least 12GB, if not 16GB, in 2024! And most PC gamers use 4K native or even higher!

I see a single person here fanboying that crap and it's you.

More posts here drive engagement, which is great for the site, but beating on the same tired cherry-picked points in every post makes them easy to skip. Luckily Tim and Steve are great at assessing value in their articles.
 
What a biased comparison. The summary of the article is, every time:

"Hmm, considering inflation and the position of the stars as we see it, Nvidia demonstrates great value across the board, even the 3050 6GB. You should upgrade now!"

"No, AMD! You shouldn't consider inflation, don't consider adjusted MSRP. It's all bad, just buy that old GPU."
 
But Capitalism! If people are willing to purchase 50% over MSRP, then the product was clearly underpriced!

/sarcasm
Instead of ignorant sarcasm, why not learn at least the rudiments of economic theory? When demand outstrips supply, then either the price must rise or shortages occur, and no one can purchase.

It's worth reminding people that, during the Covid graphics card supply crunch, nearly all those excess profits went to scalpers. Had NVidia raised the MSRP to reflect economic realities, it would have been better for consumers overall.
 
Instead of ignorant sarcasm, why not learn at least the rudiments of economic theory? [...]

Nope
 
Nice in-depth article, which confirms everyone's gut feeling that even accounting for inflation, GPUs are still overpriced post-pandemic shortages. The thing with inflation is that until about 2020, the low to mid-range GPUs that I am interested in were largely immune to it.

Let's look at the Nvidia "600/60" series over the years. This was my target range: higher performance than the lowest end, but still affordable for a family guy on lower wages. Pricing was pretty consistent, from the $199 6600 GT released in 2004 through the GTX 1060 3GB 12 years later, also at $199, without adjusting for inflation. In 2019 the GTX 1660 was released at $219. However, the RTX 2060 the same year was over $300, as were the RTX 3060 and RTX 4060. The only $200-class GPU from Nvidia since then, the RTX 3050, is comparable in performance to the three-year-older GTX 1660. That is not the huge leap in performance for the same price that we had come to expect.
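For a rough sense of scale, here's a small sketch of the inflation math; the cumulative CPI factor is an assumed ballpark figure, not official BLS data:

```python
# Approximate inflation adjustment for the $199 6600 GT launch price.
# ASSUMPTION: ~1.66 cumulative US CPI factor from 2004 to 2024;
# this is a ballpark estimate, not an official figure.
CPI_2004_TO_2024 = 1.66

def to_2024_dollars(price_2004: float) -> float:
    """Scale a 2004 price into approximate 2024 dollars."""
    return round(price_2004 * CPI_2004_TO_2024, 2)

print(to_2024_dollars(199))  # ~$330
```

So inflation alone would take a $199 card from 2004 to roughly $330 today; the sharper complaint, as the post notes, is the shrinking performance gain at each price point, not just the sticker price.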

To be honest, as much as Nvidia likes to brag about their dominance and profits in the AI space, I am surprised they don't sell gaming cards at a loss just to pressure AMD. They have pretty much said they don't need the money from desktop GPUs.
 
I am one of those 'every other gen' upgraders. I usually buy nVidia GPUs but am not really brand conscious, just like to get the best bang for buck.

The 'market' has been good for all GPU makers of late, what with Covid, Bitcoin, and now AI driving up prices of any and all cards. I do not look forward to my next GPU purchase. I bought my last GPU during the Covid price bubble and paid FAR too much for a 3090 Ti. Now, with no end in sight for the AI 'bubble', my next GPU will also be FAR overpriced.

I may decide to wait a gen, but devs' expectation that everyone they develop for has the latest-and-greatest hardware tends to make three-gen upgraders a bit woeful during that third-gen wait... And with nVidia (again) having the best product for the latest 'bubble', well, that tends to force the price 'bubble' to also include AMD.

My usual use case is photoshop, blender, etc. and also the games I develop mods for. So, yeah - I am definitely in the workstation build category, but NOT into the 'I make my living at it' category.
 
Instead of ignorant sarcasm, why not learn at least the rudiments of economic theory? [...]
Ladies and gentlemen, may I present to you the cult of unregulated Capitalism.
 
Ladies and gentlemen, may I present to you the cult of unregulated Capitalism.
The "cult" of capitalism has raised several billion people from the crushing poverty of laboring 12-hour days in wheat and rice fields to the high standards of living we see today. Whereas the alternative of statist socialism has given us Stalin, Mao, Castro, Pol Pot, and the DPRK's Kim dynasty -- and the hundreds of millions of their own citizens they actively murdered.

To paraphrase Churchill, capitalism is the worst economic system -- except for everything else we've ever tried.
 
It doesn't matter what any product *should* have cost. Enough people bought them at the time to convince the manufacturers that their pricing decisions were correct. There's no incentive to change anything because any efforts by consumers to vote with their wallets weren't enough to get the job done. Media hand-wringing years later achieves little more than giving AMD and nVidia the opportunity to laugh at what suckers people were for paying those prices in the first place. If you want the pricing model to change, spend a generation or two of not buying anything that's not up to the job, and pray that the manufacturers don't simply bail on what they might consider to be a dead market.
 