4GB vs. 8GB: How Have VRAM Requirements Evolved?

All that text, and you miss the point. PC games will typically use as much VRAM as consoles have RAM. This has been true every generation. The only thing holding back this gen is the trash Series S with its 10GB of total memory.

The 8GB GPU era is over. It's time to upgrade.
Yeah, 12-16GB will probably be enough for the rest of this console gen as long as the game is decently optimized. In the PS4/Xbone era you could get away with a 4GB GPU for most of it, barring some higher texture/shadow settings in certain games.
 
They don't have "at least" 8GB for graphics; they use 50% of the 16GB of RAM at most. Many games use 2-4GB, some use 6-8GB. I've never heard any PS5/XSX developer talk about using more than 8GB, and I watch a lot of behind-the-scenes / coding and hardware talks.

Many PC games use just 1-2GB more going from 1080p to 2160p/4K UHD on identical settings, even though the pixel count increases 4x. Example:

https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html
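Back-of-envelope math shows why: only the screen-sized render targets scale with resolution, while textures, geometry and shader data stay the same size. A minimal illustrative sketch (the per-pixel figure and the asset budget are just assumed numbers for the example, not measurements):

```python
# Rough, illustrative estimate only: resolution-dependent buffers grow with
# pixel count, everything else (textures, meshes, shader data) does not.

def render_targets_mib(width, height, bytes_per_pixel_total):
    """Approximate size of all screen-sized buffers combined, in MiB."""
    return width * height * bytes_per_pixel_total / (1024 ** 2)

BYTES_PER_PIXEL = 40      # assumed total across G-buffer, depth, HDR, TAA history...
STATIC_ASSETS_MIB = 5500  # assumed resolution-independent asset budget

for w, h in [(1920, 1080), (3840, 2160)]:
    rt = render_targets_mib(w, h, BYTES_PER_PIXEL)
    print(f"{w}x{h}: render targets ~{rt:.0f} MiB, total ~{(rt + STATIC_ASSETS_MIB) / 1024:.1f} GiB")

# 1920x1080: render targets ~79 MiB,  total ~5.4 GiB
# 3840x2160: render targets ~316 MiB, total ~5.7 GiB
```

The rest of the real-world 1-2GB gap likely comes from engines streaming in higher texture mips at higher output resolutions, but the point stands: quadrupling the pixel count doesn't come close to quadrupling VRAM use.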

Some of the most beautiful games on PC use less than 8GB on max settings at 4K/UHD. It's called good optimization (texture compression etc.) and proper coding. Rushed console ports are often a mess in this regard, at least at release.

Many console ports are rushed big time; some use far more VRAM than they should, others just run like garbage or have tons of bugs. Waiting 6-12 months usually fixes this, and you pay 50% less for the game on top. Never preorder console ports.

None of the so-called "demanding" console ports actually look great. The most impressive one is Horizon Zero Dawn, and it uses less than 8GB at 4K.

Some engines just allocate all the VRAM they can. That has nothing to do with requirement. Avatar, for example - https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html
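A toy sketch of what that looks like (purely illustrative, not any real engine's code): the engine reserves a big chunk of VRAM up front and sub-allocates assets out of it, so overlays and monitoring tools report the size of the pool, not what the current scene actually needs:

```python
# Purely illustrative: an engine that reserves most of the card's VRAM up
# front and sub-allocates assets out of that pool. Monitoring tools see the
# whole pool as "used", regardless of how much of it the frame really needs.

class VramPool:
    def __init__(self, total_vram_mb, reserve_fraction=0.9):
        self.reserved_mb = total_vram_mb * reserve_fraction  # grabbed immediately
        self.resident_assets_mb = 0.0

    def load_asset(self, size_mb):
        if self.resident_assets_mb + size_mb <= self.reserved_mb:
            self.resident_assets_mb += size_mb

    @property
    def allocated_mb(self):   # what overlays/monitoring tools report
        return self.reserved_mb

    @property
    def required_mb(self):    # what the scene actually needs resident
        return self.resident_assets_mb


pool = VramPool(total_vram_mb=24 * 1024)   # e.g. a 24GB card
for _ in range(60):
    pool.load_asset(100)                   # stream in ~6GB of assets

print(f"reported usage: {pool.allocated_mb / 1024:.1f} GB")  # ~21.6 GB
print(f"actual need:    {pool.required_mb / 1024:.1f} GB")   # ~5.9 GB
```

Same game, same scene, but on a 12GB card the pool (and the reported "usage") would simply be smaller, which is why comparing usage numbers between cards with different VRAM sizes tells you very little about the real requirement.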

The 3070 still beats the 6700XT with ease in 4K/UHD at ultra settings, minimum fps included, even though the 4090 uses up to 15GB of VRAM. Allocation is the keyword.
In some cutscenes I suffered from missing/late loading high-res textures on my 3070 in FoP. Pushed me to upgrade to a 4070 Super.
 
I played HZD for a while on a 1050 Ti and an RX 6400 (even tried it on a GTX 745 4GB!) and I don't remember missing textures or LOD problems on the 6400, but I had to lower quality on the slower 1050 Ti enough that LOD may have ended up being a non-concern relative to other settings.

One thing HZD was unique for was VRAM corruption on the 1050 Ti when I'd OC the VRAM too much. I'd get some consistent major errors in foliage and other things making ridiculous shapes in-game. Clock VRAM down a notch and restart and all is good. No other games have done that with this GPU.
These are the problems I was experiencing, but it was around the time the game launched and it's possible it's much better now with all the subsequent patches the game got.
 
I've recently bought a Lenovo Legion gaming laptop with an RTX 4070 laptop GPU (8GB VRAM), a Ryzen 7 7840HS CPU and 32GB of RAM. It is the best laptop system I can afford, as notebooks with 4080 laptop GPUs are priced 50% higher in my country.

The 4070 laptop GPU is in fact roughly a 4060 Ti 8GB: it uses the same graphics chip (AD106), with a few more CUDA cores and tensor cores. That's a consequence of Nvidia's naming scheme for the Ada Lovelace generation. The card is configured at a 140W max TGP.

I've tested it in a few new titles like Hogwarts Legacy and Alan Wake 2, which in my opinion is the most demanding game at the moment. All games run well at 2560x1600, high details, RT and so on, except for Alan Wake 2, for which I had to drop the resolution to 1200p to keep max settings, RT high and DLSS on Quality with FG, and still only squeeze out a playable 50 FPS in the forest at the beginning of the game. CPU usage isn't a problem; it's about 15% on average.

Question:
How long will it be before I'm forced to lower RT quality and texture quality to medium to play the latest games on my 8GB GPU? Also, I'm excited for Stalker 2; I hope I'll be able to play it at the highest settings at 1600p with decent FPS, but I'm not going to pre-order.
 
6500XT is a steaming pile, true. 4GB VRAM is too low. 6GB can work for some, but 4, nah. Only for older games or indies.

I remember when AMD launched the Fury X with 4GB of HBM and called it futureproof because it was HBM, only for it to soon run into VRAM issues. Meanwhile they were praising how great 8GB was on the 390 refresh series 😂 The 980 Ti with its 6GB of VRAM aged much, much better (and overclocked like a champ, bumping performance by 30-40%). Meanwhile Lisa Su called the Fury X an overclocker's dream on stage, yet it barely gained 1% from an OC.


Most PC gamers today are fine with 8GB, since 99% use 1440p or lower. 12GB is more than plenty, even for 3440x1440. If you want to push settings hard, the GPU will buckle before VRAM anyway.

Big difference between allocation and actual requirement. It's crazy how many people don't understand how allocation works.

Allocation does NOT reflect actual requirement, especially not when you compare GPUs with different VRAM pools.

PS5 and XSX have 16GB of shared RAM in total. The OS and game logic use around 8-10GB, meaning graphics get 6-8GB, with a 4K/UHD target.
Series X does have the split RAM pools with the 10GB portion having the higher bandwidth and likely being used mostly for graphics. People on forums were using that to argue that 10GB on the RTX 3080 would be fine this whole console generation.
 
These are the problems I was experiencing, but it was around the time the game launched and it's possible it's much better now with all the subsequent patches the game got.

Wow that's straight up bad! Yeah HZD had a lot of problems at launch on PC but I got the game maybe 9 months later and it seems they ironed out those bugs by then. It was quite enjoyable to play on the 4GB RX 6400 (at PCIe 3.0 no less) with an i7-4790 and 16GB at a mix of Med and Hi settings.
 
Most PC gamers today are fine with 8GB, since 99% use 1440p or lower. 12GB is more than plenty, even for 3440x1440. If you want to push settings hard, the GPU will buckle before VRAM anyway.

I am currently using 3440x1440 and let me tell you, you want that GPU to have at least 16GB. It's slowed down, thank goodness, but it had me worried there for a bit.
 
I feel this title is not quite appropriate: "Why VRAM is so Important for Gaming"

A good game is not defined by the amount of VRAM, so I would dispute how important it really is. I think you are coming at this from the angle of graphics, and that, fortunately, is only part of the equation. A 2D game can still be fun even if it does not look graphically impressive, and hence doesn't require much VRAM to run.
 
As Watzupken posted above, I do agree: gameplay fun is No. 1.
I also love great graphics, but they're not necessary to have fun. Depends on the person (and budget!!)


So speaking graphics specifically.

The first time I thought I might need more memory was in 2016, DOOM. I was using a GTX 1080 (EVGA FTW). It had 8GB of VRAM.

About 9 months after the game's release, id added a special "Nightmare" texture setting which would only appear on cards with OVER 8GB of VRAM. Luckily I was just about to upgrade to the EVGA GTX 1080 Ti FTW3 OC edition, which I had pre-ordered. That card, back then, had 11GB of GDDR5X VRAM.

After installation I was delighted to see the new Nightmare option appear in the settings. Of course it made no difference, even though I tried to invoke the placebo effect. Still, it was a good marketing move, but that game runs great on 8GB or probably less VRAM. That was 2016/2017.

Right, back to the present. I never bought the RTX 3080 as it only had 10GB of VRAM. (Yes, I know it is much faster than the GTX 1080 Ti with its 11GB - but I am an emotional individual and just couldn't get the "downgrade" from 11GB to 10GB out of my head.)

Regardless, the prices were nuts, so I waited for arguably the best air-cooled version, the Asus Strix RTX 3080 Ti Gaming OC. Out of the box the power limit was 450W; most were the reference 400W, I believe. It has 12GB of GDDR6X VRAM.

I still use that card as it can OC amazingly, and at 1440p, so far, I can still run modern demanding games, sans ray tracing, at Ultra or an Ultra/Very High mix. (TBH Ultra eats up resources but is of dubious benefit.)

To the point: I always record with HWiNFO64 in the background, max and average figures for a few parameters, total VRAM usage being one of them.

It's not uncommon for the max VRAM usage to be over 8GB, but there are only two games where it exceeded 10GB (10.480GB out of a max of 12.288GB, to be precise).
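For anyone who'd rather log this with a script instead of HWiNFO64, a minimal sketch along these lines should work (assuming an NVIDIA card and the nvidia-ml-py package, imported as pynvml; like any overlay, it reports device-wide allocation, not what the game strictly needs):

```python
# Minimal VRAM logger: samples device-wide usage once a second and prints
# the max and average when you stop it with Ctrl+C.
import time
import pynvml  # from the nvidia-ml-py package

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

samples = []
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        samples.append(info.used / 1024 ** 3)  # GiB currently allocated
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

if samples:
    print(f"max VRAM used: {max(samples):.2f} GiB")
    print(f"avg VRAM used: {sum(samples) / len(samples):.2f} GiB")
```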

I have never noticed any of the symptoms of a lack of VRAM, but I run at 1440p with a max refresh rate of 170Hz. (Usually I set max fps to 120, or even 85, while leaving the refresh rate at the max 170Hz.) No ray tracing, just G-Sync and the NVCP FPS limiter. Works a charm. I am a graphics geek, so except for first-person shooters, I sacrifice fps before high graphics settings. Personal preference thing.

Anyhow, if I went 4K I would need a better card, and not just for VRAM. Even at 1440p I'd say 12GB of VRAM (an unusual number) is the safe minimum as of March 2024, if paired with a high-spec CPU and DDR4.

I posted because not many cards come with 12GB, so it's useful for comparison.

However, although I won't upgrade for a while, if I were to do so now I absolutely would not go below 16GB.

Note my opinion assumes a limitless budget (no, not mine). So with that in mind, any high or very high-end gaming rig needs at least 16GB of VRAM.
 
I know that's true, but I've been daily driving a Steam Deck for the last 3 weeks. The idea that I can play games at 20 watts that I played 15 years ago at ~800 watts is way more impressive to me than a $2000 GPU running at 200 watts instead of 450-500.

I've always been a gamer, but I've also always been a hardware nerd. The wattage specs on the Steam Deck seem way cooler to me than the overall specs of top-tier hardware these days. I ran top-tier hardware for over 15 years, but I just don't find it as interesting as the lower-end stuff anymore.

It's not even about price, I can buy a 4090 if I want, but the platform idea of the steam deck as battery tech improves is really what draws my interest these days.
Yeah, and I have a laptop pulling 70-90W tops, and I share your thoughts. Also an Android handheld.
My point was, stuff, especially in the PC space, is getting insane improvements in both performance and efficiency.
 
I am currently using 3440x1440 and let me tell you, you want that GPU to have at least 16GB. It's slowed down, thank goodness, but it had me worried there for a bit.
Link me one game that uses more than 12GB of VRAM at 3440x1440. Meaning huge fps dips because VRAM runs out, not just allocation, which many engines do. 3440x1440 has pretty much the same VRAM usage as regular 1440p.

Most games barely use 8GB at native 4K/UHD. Some do, with maximum settings and RT or path tracing enabled, which very few cards can handle because of lacking GPU power, and zero AMD cards will do it. RT/PT uses a lot more VRAM. Very few people use it, and if they do, they often run DLSS/FSR along with it.

Only the 4090 is an actual, true 4K/UHD card. That's the card I have, and I did not buy it for the 24GB of VRAM, which is pretty much pointless; the GPU will be the limiting factor. The 7900XTX and 4090 have 24GB because of their 384-bit bus, not because 24GB is required for any game.
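Quick math on why capacity tends to follow bus width (illustrative; assumes the usual 32-bit GDDR6/GDDR6X packages and ignores clamshell configurations, which double the chip count):

```python
# Each GDDR6/GDDR6X package has a 32-bit interface, so bus width fixes the
# number of memory chips, and chip density (1GB or 2GB) fixes total capacity.
CHIP_BUS_BITS = 32

def vram_config(bus_width_bits, chip_density_gb):
    chips = bus_width_bits // CHIP_BUS_BITS
    return chips, chips * chip_density_gb

for name, bus, density in [("4090 / 7900XTX", 384, 2),
                           ("4070 Ti", 192, 2),
                           ("3070", 256, 1)]:
    chips, total = vram_config(bus, density)
    print(f"{name}: {bus}-bit bus -> {chips} chips x {density}GB = {total}GB")
```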

Just look at the 3090 24GB today. It can't run maxed-out settings in 4K. Plenty of VRAM, but the GPU is lacking. It's mostly beaten by the 4070 Ti 12GB in new games, even at 4K.

VRAM is always good to have, but personally I would rather have a fast GPU. You can always tweak settings to use less VRAM and make things run great, with high fps, if the GPU is powerful. But a weak GPU with a lot of VRAM, think 6700XT, will limit you in the same way: the GPU is not powerful enough to run the settings that would actually make use of the VRAM, forcing you to lower settings...

And this is why VRAM will never futureproof any card. GPU always taps out eventually.

The 3070 beats the 6700XT in tons of games in 2024, including minimum fps in 4K, which isn't even a realistic scenario for actual gamers using these cards.
 
All that text, and you miss the point. PC games will typically use as much VRAM as consoles have RAM. This has been true every generation. The only thing holding back this gen is the trash Series S with its 10GB of total memory.

The 8GB GPU era is over. It's time to upgrade.
No they don't. Not a single game developer ever said that. Total nonsense.

I need to upgrade? I think my 4090 is doing alright still.

Meanwhile 3070 8GB still smashes 6700XT in brand new AAA games, 4K/UHD minimum fps included - https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

Maybe you should read up on RAM allocation vs actual requirement.


"No significant performance gains from 16 GB VRAM"

As expected. And 99% of PC gamers use 1440p or lower, not 4K/UHD.
 
Nope.

First, it's not just the 1%ers with high res monitors and TVs. This is just a fact.

Second, you completely missed the entire point of this article and the last one. Go read or watch it. The "GPU will buckle before VRAM" myth is busted with science and example after example (even at 1080p).

Yes it is. No one cares about 4GB in 2024, so why would I bother? The 6500XT is a pile of junk, with a 20/100 score on TechSpot.

I had more than 4GB VRAM 10 years ago.

What I am saying is 8GB is plenty for 99% of PC gamers, and it is.


One of the best-looking games right now, and the 3070 8GB beats the 6800 16GB in 4K/UHD, ultra preset, minimum fps included.

Game is even AMD sponsored 🤣

Meanwhile 99% of PC gamers use 1440p or lower and don't care about 4K at all.


Keep dreaming tho 🤣
 
Had no issues with the 3070 for the last year, but can't say the same about the AMD 5600X...
I enabled PBO for fun with CO -30 and some games began to stutter while others had audio issues with an external USB card (clipping). Back to stock on the CPU and the issues are now gone.
But the fact that an 8GB card can still play modern games remains true for now.
And yeah, raw GPU power beats the framebuffer, I agree.
 
Had no issues with the 3070 for the last year, but can't say the same about the AMD 5600X...
I enabled PBO for fun with CO -30 and some games began to stutter while others had audio issues with an external USB card (clipping). Back to stock on the CPU and the issues are now gone.
But the fact that an 8GB card can still play modern games remains true for now.
And yeah, raw GPU power beats the framebuffer, I agree.

The AMD users won't believe you. They think you need 16GB VRAM for 1080p gaming.

Raw GPU power is more important, yes. That's why the 3070 keeps beating the 6700XT even though they launched at nearly identical pricing back in 2020-2021. The 3070 also has the option of DLSS, where the 6700XT is stuck with inferior FSR.

Better GPU + Proper upscaling beats VRAM any day for longevity.

However, even in 4K/UHD native, 3070 beats 6700XT.

The 3070 has more than 14% higher minimum fps in 4K.

 
The AMD users won't believe you. They think you need 16GB VRAM for 1080p gaming.
1440p here, not 1080p.

Was looking at the Pandora charts on TPU; the 3070 is close to the 2080 Ti 11GB and the 6800 16GB, but all are below 60FPS.

[attached chart: performance at 2560x1440]
 
1440p here, not 1080p.

Was looking at the Pandora charts on TPU; the 3070 is close to the 2080 Ti 11GB and the 6800 16GB, but all are below 60FPS.

[attached chart: performance at 2560x1440]

That has nothing to do with VRAM, but rather with lacking GPU power.

Enabling DLSS/FSR or lowering settings will make 60+ fps possible on most of the lower-end cards.

Even at 4K/UHD on ultra settings, this game doesn't need more than 8GB of VRAM while looking great, better than 99.9% of games. Read the conclusion; it's all allocation, they say so themselves.

When devs actually code well and use good texture compression, PC gamers won't need a lot of VRAM.

Most of the games with massive VRAM usage were AMD-sponsored titles, which were rushed console ports: TLOU, RE4 etc. They were fixed later with patches.

AMD did stuff like this before -> https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/

They got the developer (AMD was the sponsor again) to push out a texture pack with a 6GB VRAM requirement that changed NOTHING for the end user.

All they did was remove compression pretty much.
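To put rough numbers on what block compression saves (illustrative only; assumes BC7 at 1 byte per pixel vs uncompressed RGBA8 at 4 bytes per pixel, plus roughly a third extra for the mip chain):

```python
# One 4096x4096 color texture, uncompressed RGBA8 vs BC7 block compression.
def texture_mib(width, height, bytes_per_pixel, with_mips=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if with_mips else base  # full mip chain adds ~33%
    return total / (1024 ** 2)

rgba8 = texture_mib(4096, 4096, 4.0)  # uncompressed
bc7   = texture_mib(4096, 4096, 1.0)  # BC7 / BPTC

print(f"RGBA8: {rgba8:.0f} MiB, BC7: {bc7:.0f} MiB per 4K texture")
# -> RGBA8: ~85 MiB, BC7: ~21 MiB
```

Ship a few hundred textures like that uncompressed instead of block-compressed and that alone is the difference between fitting in 8GB and blowing past it.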
 
And here I am, still using a middle-ground GPU with a VRAM capacity between 4GB and 8GB, a 6GB 2060...
but for the games that I play it still very much suffices...
I still don't need more than that.
 
One of the best-looking games right now, and the 3070 8GB beats the 6800 16GB in 4K/UHD, ultra preset, minimum fps included.

Game is even AMD sponsored 🤣

Meanwhile 99% of PC gamers use 1440p or lower and don't care about 4K at all.


Keep dreaming tho 🤣

LOL motivated reasoning. You use a single game as an example, even though the $400 GPUs you're referring to can't get 40 FPS in it. That convinces nobody.

Instead look at as many games as possible to see the balance. You'll see the 6800 "smashing" the 3070, to use your exact words:

[attached chart: average fps at 2560x1440]


Raw GPU power is more important, yes. That's why the 3070 keeps beating the 6700XT even though they launched at nearly identical pricing back in 2020-2021. The 3070 also has the option of DLSS, where the 6700XT is stuck with inferior FSR.

Better GPU + Proper upscaling beats VRAM any day for longevity.

However, even in 4K/UHD native, 3070 beats 6700XT.

Claiming something about MSRPs is equally hilarious, as if those were the prices you could actually buy these cards at. Look at actual prices that real people paid. The 6700 XT was under $400 for years, while the 3070 was $500+ and only briefly squeaked under $400 for a couple of months before recovering to its typical price of over $100 more than the 6700 XT. The 6700 XT was always the better value:

[attached price charts]


It's not about cherry-picking individual prices or benchmarks; it's about the whole value package. DLSS vs FSR is a good consideration, as is 11% more FPS (Oooo, an 11% SMASH!!), and it's worth more money, but is it worth 33% more, 50% more?

IMO no.
 
LOL motivated reasoning. You use a single game as an example, even though the $400 GPUs you're referring to can't get 40 FPS in it. That convinces nobody.

Instead look at as many games as possible to see the balance. You'll see the 6800 "smashing" the 3070, to use your exact words:

[attached chart: average fps at 2560x1440]




Claiming something about MSRPs is equally hilarious, as if those were the prices you could actually buy these cards at. Look at actual prices that real people paid. The 6700 XT was under $400 for years, while the 3070 was $500+ and only briefly squeaked under $400 for a couple of months before recovering to its typical price of over $100 more than the 6700 XT. The 6700 XT was always the better value:

[attached price charts]


It's not about cherry-picking individual prices or benchmarks; it's about the whole value package. DLSS vs FSR is a good consideration, as is 11% more FPS (Oooo, an 11% SMASH!!), and it's worth more money, but is it worth 33% more, 50% more?

IMO no.

The 6800 launched at $579, the 3070 at $499, so it's a pointless comparison to begin with; AMD is always cheaper, meaning the true competitor was the 6700XT, priced at $479 several months after the 3070 released, with inferior features (and they are still inferior in 2024).

You are listing prices after AMD's massive price drop due to low sales, 2 years after release; who cares. The 3070's price stayed high because demand was high. Demand was low for AMD in comparison.

And AMD's demand stays low - https://www.pcgamer.com/hardware/gr...cards-are-its-worst-selling-in-over-20-years/

Sad but true. AMD needs to get back to its roots and improve its feature set massively, or people won't even bother. I'm starting to miss ATi, which had full focus on GPUs.

It is very clear that GPUs are not a prime focus for AMD. It's CPUs and APUs.

Go have a look at the Steam HW Survey and you will see that AMD's dGPU market share is miserable. The 6000 and 7000 series barely appear in the top 50 most-used GPUs.

Most AMD users deny this fact, but that's just reality. I work in B2B sales and I know for a fact that Nvidia is shipping mad numbers and AMD isn't.

Nvidia absolutely wrecks AMD in the enterprise and AI markets. Gaming too, but Nvidia de-prioritizes gaming right now; the money in AI is too big. AMD wants in, but they don't have anything good there.
 
The 6800 launched at $579, the 3070 at $499, so it's a pointless comparison to begin with; AMD is always cheaper, meaning the true competitor was the 6700XT, priced at $479 several months after the 3070 released, with inferior features (and they are still inferior in 2024).

You are listing prices after AMD's massive price drop due to low sales, 2 years after release; who cares. The 3070's price stayed high because demand was high. Demand was low for AMD in comparison.

MSRP is irrelevant if you cannot buy at MSRP. What matters is: what am I getting for my money right now? I would have bought a 3080 at $700 to replace my 1080. I had planned for it and had the money; $700 was a good price for that GPU.

There were none available at that price. It's as simple as that.

I waited and eventually got a 6800 XT at $560 when its price competitor was the 3070. The 3080 was between $900-1000. Pretty easy decision there.

Did you forget that both GPUs were released during the cryptocrapfest? Nvidia's MSRPs were lower because they were set as crypto took off, and AMD's were higher because they were released and priced later, during the worst of crypto. Which is why these MSRPs are useless: neither reflects what buyers actually paid.

All the rest of the market share and enterprise blah blah is irrelevant and deviating from the point.
 
MSRP is irrelevant if you cannot buy at MSRP. What matters is: what am I getting for my money right now? I would have bought a 3080 at $700 to replace my 1080. I had planned for it and had the money; $700 was a good price for that GPU.

There were none available at that price. It's as simple as that.

I waited and eventually got a 6800 XT at $560 when its price competitor was the 3070. The 3080 was between $900-1000. Pretty easy decision there.

Did you forget that both GPUs were released during the cryptocrapfest? Nvidia's MSRPs were lower because they were set as crypto took off, and AMD's were higher because they were released and priced later, during the worst of crypto. Which is why these MSRPs are useless: neither reflects what buyers actually paid.

All the rest of your market share and enterprise blah blah is irrelevant and deviating from the point.

Exactly. I still think it's ridiculous it was allowed. You didn't see Amazon or anyone outside eBay jacking prices for the PS5 or Xbox above MSRP, yet it's apparently fine to watch it happen year after year with video cards. Either the manufacturers need to enforce MSRP the way Sony and MS do, or they might as well stop announcing one at all. At the very least, you should've been able to place an order directly with a manufacturer like EVGA and get it whenever it arrives, but at MSRP. They just keep making GPUs a glaring example of what's wrong with the market on so many levels.
 
Exactly. I still think it's ridiculous it was allowed. You didn't see Amazon or anyone outside eBay jacking prices for the PS5 or Xbox above MSRP, yet it's apparently fine to watch it happen year after year with video cards. Either the manufacturers need to enforce MSRP the way Sony and MS do, or they might as well stop announcing one at all. At the very least, you should've been able to place an order directly with a manufacturer like EVGA and get it whenever it arrives, but at MSRP. They just keep making GPUs a glaring example of what's wrong with the market on so many levels.

I tried forever to get one from EVGA; I was on the list for a base-model 3060 Ti and a 3080. I eventually got a notice for the 3060 Ti, but when I tried to see if it was worth it I could never get logged in to see an option to buy. I think it was because it was still a first-come, first-served thing and I don't live on my email.
 
I tried forever to get one from EVGA; I was on the list for a base-model 3060 Ti and a 3080. I eventually got a notice for the 3060 Ti, but when I tried to see if it was worth it I could never get logged in to see an option to buy. I think it was because it was still a first-come, first-served thing and I don't live on my email.

Yeah, that's more bullshit I don't understand from these companies. Just take the ****ing orders. You can pre-order random **** from Amazon months in advance. Sometimes a year in advance for games. They toss a listing up there asap and put an unknown release date on it. It's beyond stupid the way this stuff and the game consoles end up handled every time something new releases.
 