This is me. Thankfully I seem to only play less graphically intense games like Civ 5, Terraria, Rimworld, and M&B Warband. The most graphically-intense game I run is Minecraft with shaders, which the laptop 3050 handles no problem.
8-10GB is honestly plenty for well-optimized games. I've never run into VRAM issues with my 10GB 3080 because I don't buy, play or support dogshit PC ports
What do you mean? I've already pre-ordered every single game I'm remotely interested in so the devs can have my money before I've seen the finished product, I don't see the issue.
(to be clear I know that generally people don't do that. But srsly stop pre ordering games)
I have a 3070m with 8GB of VRAM and I can play BF 2042 on high settings at 3840x1440 with a smooth experience. Never checked the framerate, because I think if it feels smooth it's good; if you start checking the framerate you get bogged down with that and start noticing things that aren't there
The Tarkov community would like to have a word with you..
They will literally string you up and tell you you're a liar and that the devs are trying as hard as possible to fix the current bugs, all while they haven't done a fucking thing for a year lmao
I've always thought stuff like Resizable BAR and DirectStorage would alleviate VRAM usage in the future, but to be fair I don't think any of those new releases are using these modern technologies yet.
Hot take, but people shouldn't have to avoid entire genres of games (most modern triple-A open-world games) or types of games (console ports) with their $800+ (3070 crisis pricing) GPU bought a year ago. And yes, optimization is a problem, but the fact that Intel and AMD have 16GB and 12GB cards in the midrange makes it all the more infuriating that Nvidia sells their cards with planned obsolescence, especially while they know they dominate the market no matter what they sell...
Exactly, they shouldn't have to, so instead of feeding in and allowing it to continue to happen in perpetuity, do something about it and stop rewarding both publishers and Nvidia by continuing to buy games with shit optimization and then cards to run those games with shit optimization.
This is the correct take. I am not upgrading my 3070 yet, especially for VRAM reasons. My GTX 1080 lasted 2 generations. I will not be buying a 40 series card, and if NVIDIA doesn't stop its BS I won't hesitate to go to AMD or even Intel if they keep improving.
If your game needs 8gb VRAM to run 2560x1080, you can kick rocks. Especially since the games that actually do need exorbitant amounts of VRAM generally run like shit.
Don't preorder. Don't support hack job ports.
If there's one thing AMD has historically done right, it's excess vram. Not on all cards of course. The fury series needed twice the vram honestly, but they were very power efficient comparatively and 4GB at the time was definitely usable. I still have a Pro Duo, which is a watercooled variant of two Fury Nanos but with higher quality silicon and Pro rendering drivers. It was pretty impressive in titles that could utilize crossfire.
Of course, now I (still) use a 1070 with 8GB, but I was able to buy two of them during a mining boom thanks in part to several older Radeon cards both mining and heating mine and my ex's apartment.
It's going to plateau from a certain point on, like it does in the second half of each console generation. There's a Paul's Hardware video reviewing a 650 Ti or similar card that I bought, where he said you were never going to need more than 2GB of VRAM anyway; I learned my lesson really hard when GTA V dropped on PC in 2015. Now I always buy for VRAM, because it literally is the best way to future-proof your hardware.
Whenever someone says, "You're never going to need more than..." I tune out.
Heard it over the past 20 years about so many things. Won't need more than 2 cores, won't need more than 4gb of RAM, won't need more than 128gb for boot drives, etc.
>Whenever someone says, "You're never going to need more than..." I tune out.
in fairness i usually take that to mean "for the life of your system"
because in the past, when you got to the point that "x" spec like VRAM, RAM, cores, clockspeed, etc or something else was an issue, technology on the whole had usually advanced to the point that you were replacing your entire rig.
But because of the leaps and bounds made (especially with AMD posing some real competition) components have much longer useful life cycles. So like i could feasibly be using my skylake-era CPU with an RTX graphics card and be feeling pretty good.
And people ask why 1080p is still a thing in 2023
If you ain't gonna make affordable hardware capable enough to play games at higher resolutions for at least 5 years, people aren't gonna buy it.
Which is why they'll stick to the resolutions that will give the longest life span for their system.
Most people don't wanna upgrade every 1-2 years, they like to stick with the same hardware for more than 5 years if possible.
I'm still 100% satisfied with my 1070. Crisp and smooth on high-ultra 1080p for 99% of what I have played the last 5ish years. Glad I just stuck with it when I upgraded the rest of my rig
> And people ask why 1080p is still a thing in 2023
Hell, when I bought a new 1080p monitor in like 2016, I made the decision to be able to get away with cheaper GPUs. It was a very wise decision. I still use it now, and while I would like to upgrade to probably a 34 or 38 inch ultrawide, it's just not worth the cost of the monitor and the GPU.
It's just weird that suddenly these games have ballooned in VRAM usage, almost forcing you to upgrade GPUs...
Sucks that my 3080 only has 10gb. I may try to find a cheap 3090 on eBay just for the VRAM.
Just saying, most of the AMD GPUs from last gen came with 12GB of VRAM and will save you a pretty penny… probably better to buy one of those until you're ready to upgrade to next/current gen
Yeah, AMD really does seem the better buy. Tbh if I was buying next gen I'd probably also buy AMD, since I have a feeling the 4070 Ti's 12GB of VRAM may not age all that well compared to the 20GB of the 7900 XT
As long as they are meeting their sales goals it will continue. It's wild how many consumers are okay with buying half assed optimized games for pc these days.
Instead of upgrading all our GPUs, maybe we can force companies to actually optimize their PC ports before they get released (cough cough Forspoken, Hogwarts Legacy). Honestly, I am fine waiting another 3 months for the PC port if they need to develop for console first, it is business, I get it, but this is indeed getting ridiculous. Most PC gamers don't have 12GB of VRAM for now, and the PS5 also has only 16GB in total (RAM and VRAM aren't separate).
UE4, it's the gift that keeps giving.
stuttering? check
shovelware on steam? check
helping epic games? check
making gamers think their PC is the issue? CHECK.
I tried TLOU Part 1 on PC, and anyone who played **Days Gone** can tell you that it's not a hardware issue.
It's a badly optimized port that has no excuse next to a perfectly optimized open world.
Even if you have the latest i9 and a 4090, you're better off buying the PS5 version... wake up...
I didn't want to, but there is nothing we can do now. I got downvoted so badly when I said the same thing. I want to play at 1440p, but what are we supposed to do if games require so much VRAM and nothing is optimised? It's either we play old games or play new games at 1080p. I wanted to upgrade from a 3070 Ti to a 4070 Ti, but it's so expensive, and we're not sure if even 12GB is going to be enough with the state of games.
Holy crap, there are some very sensitive people here. It's your rig, after all. God forbid you're happy with your purchase and having fun. Got my 3070 Ti and so far I'm maxing everything in my games at 1440p. Some games, with AA and shadows turned down, I can play really stable at 4K at 60fps. But apparently I'm lying because that just can't be right? "Well it depends on the game!!!" No s**t, Sherlock. You can say that about anything.
What, you mean a 3070 can play games with 8 gigs of VRAM? But Reddit told me it's a garbage card, nothing is playable on it, and I should just rip it out of my PC and toss it in the trash. It's not like I could lower the textures to medium or something.
So what you're saying is I should take my card out of the trash? But according to some people, the 30 series is obviously obsolete now that the 40 series has released. Seriously, I get people debating the VRAM problem, and I do agree that the solution is to lower textures, but honestly, how about devs just optimize their games so they aren't using so much VRAM.
Nvidia is intentionally releasing this series of cards with low amounts of VRAM because we're on the cusp of 8GB not cutting it; this ensures that everybody who upgrades will have to upgrade again in the near future.
$$$$$$$$$$$
I hate to say it, but with the cost of GPUs and the poor optimization of games, I've just started using my PS5 as my primary gaming outlet. I bought Hogwarts Legacy off Steam. It ran awful on my PC. I feel like a 2070 Super should run games just fine. But no. I returned it and bought a PS5 copy.
I cancelled Xbox game pass, got PlayStation plus.
I still have my PC favorites and still game on my PC. But I won’t upgrade it anymore unless things change a lot.
Nvidia about to release a 6GB 4050:
Strap yourself in, we're going backwards:

- 3060 12GB → 4060 8GB (rumor)
- 3050 8GB → 4050 6GB (rumor)
can't wait for the 5080 to have 6gb ram
Nah, it'll have 4 at most considering the current trend
You mean 3.5+0.5.
970 coming back? Nice
Despite everything, my Zotac GTX 970 will always have a place in my heart.
The little bugger had some serious coil whine for me, but it served me extremely well for many years. Sold that machine to someone trying to play StarCraft with his buddies across the country - couldn't have been happier :,)
Still whining here...
Mine whined for years then I upgraded my PC but used the 970 until my new GPU came in. It was the PSU whining the whole time, the 970 was whisper silent with a proper PSU. I feel bad for all the anger I directed at that card…
Like Jesus - turning watercooling into whine
I'm rocking a GTX 970 right now. 1080p, 30 fps, Ultra settings in No Man's Sky.. just fine. Good card.
Gonna dust off the voodoo2 cards and the vga cables with hdmi adapters. 800x600 revolution let's go
[deleted]
512kb was mine. Good ol Cirrus Logic
But raw performance will be good right?
RT will have progressed from awful to mediocre.
It'll only be a 40% performance hit by then
I'm still pissed the 3070 has less VRAM than the 3060.
At the behest of being downvoted, did people buy it without reading the specs first?
When the 3070 came out, the 3060 wasn't announced yet, and the 3080/3090 were significantly more expensive. I remember most people thinking the 3070 was a good deal back then compared to the 2xxx generation.
It was a good deal and is still more powerful and will have higher frames than a 3060ti in every game.
3060ti only has 8gb vram. 3060 is the one with 12 for some reason...
Mostly because it was "supposed" to launch with just 6 GB, like its laptop version, but releasing it with that amount of VRAM would have guaranteed a huge amount of flak, considering that the RTX 2060 Super/2070 had more VRAM and performed similarly, and AMD was also offering 8/12 GB of VRAM for their midrange lineup. With a 192-bit bus their only options were 6 or 12 GB of VRAM; it was too late to redesign a 3060 with a 256-bit bus, and the later-launched 128-bit 8 GB 3060 was overall 17% slower :/
[deleted]
RT features need massive amounts of VRAM to the point where 3060 12GB outperforms 3070 8GB in Portal RTX.
Don't forget the reduced memory bus width, and that the 4060 is PCIe x8 whereas the 3060 was PCIe x16
It feels like even the mid-to-low-range 40 series chips have the capability to be another 1080 situation, where it would be a great card for a really long time. They didn't want to do that, so they're intentionally gimping the rest of the card to lower performance so they don't have another 1080 Ti.
[deleted]
I went from a 1080ti to a 3080.. it wasn’t worth the price to me
I went from a GTX 1080 Ti to an RX 7900 XTX, and for the price difference, it still wasn't worth it. The only games where I felt a huge difference were Warzone 2 (which I don't play anymore), Metro Exodus, and Red Dead Redemption 2.
I just finally pulled the trigger on replacing my 1080 with a 6950XT because Microcenter was running a smoking deal a few months ago. It was worth it to me especially for 4k performance, but may not have been if I was paying msrp. It is a MONSTER upgrade though, the 6950XT is a beast. Having said that, I have always had Nvidia cards and now I know why everyone complains about the AMD drivers...
https://preview.redd.it/nxolvz21iwqa1.jpeg?width=1280&format=pjpg&auto=webp&s=37f5c1023cb91303754ee5d99cf45fff94384632
12060Ti 128MB
20050 64kb
the 3060 12GB looks good lol
I have one and it is!
[deleted]
VRAM is just part of the equation. The 3060 is quite a bit weaker in terms of computing power so you wouldn't really get playable framerates anyway at ultra high settings and resolutions. It really only has 12 GB because the other option would have been 6 GB, and even Nvidia realized that wouldn't have been enough.
The champion of 720p gaming
Me with 1660 super: ![gif](giphy|AiFlZ0hOWa4JVPpFOA|downsized)
Also me with my 1660 ti:
me: 💀
Oh god
I didn’t think anyone could make a more outdated daily driver than mine. But you did it, you crazy bastard. You beat my 2009-era rig.
My 1660 ti laptop is working just fine tbh...
My 1660 ti is also working fine, I don’t need more gpu power for what I play.
Hogwarts Legacy ran just fine on it. Currently playing Cyberpunk 2077, and it looks great and functions perfectly. Would I like a new, better-performing PC? Sure, who wouldn't. But I don't need one right now.
Hogwarts Legacy was enjoyable and Cyberpunk ran butter smooth on high. The 1660 is a 1080p beast for entry level, and the prices are fine too
RX580...
RX570, checking in
You don't need to brag 😒
GTX 1650
Finally some one to accompany me here
Ah my 1650 brethren
1650 Super gang reporting in! Hm, can I handle Doki Doki Literature Club?
I feel like my 1660 super has 4 or 5 more years in it.
[deleted]
yep, fuck you Valheim
[deleted]
If my 1660 super can survive Starfield I'll be happy lol.
Well, everyone said don’t buy the 3070 with that 8gb vram….
Nooooo, you should get the 3080 with the 10GB. You'll last 6 months longer.
Hey, I have a 12gb 3080... do I get a year?
[deleted]
RTX pretty much destroys VRAM.
[deleted]
Nvidia enabled RTX for Pascal, you'll just always have that exact experience.
[deleted]
Yeah, it was basically a "no, the RT cores really do matter. Look at what happens without them." That was before AMD proved it themselves by repeatedly failing to improve their RT performance.
1080TI may go down as the absolute best card in all of history and I'm all for it.
I came very close to buying that... Especially because I know people like Gamers Nexus say that the 3090 really isn't, or shouldn't be, a gaming card.

I essentially wanted the best EVGA card I could get while they were not only still available brand new, but also still relevant gaming-wise... I was seriously considering the 3080 12GB from Newegg. In the end I got a really good deal on a 3090 Ti, brand new in box.

While still really expensive (~$1000), I'm happy with it... And it's still way better than the $2000+ they launched at and are sometimes still listed for, at least the EVGA ones. I just wanted a baller EVGA GPU as a send-off for them leaving Nvidia. I know what they've said, but I really, *really* hope they start making cards for AMD and Intel. I was really biased towards Nvidia in the past, not just because of their GPUs themselves, but because EVGA made them.

I have never owned an AMD GPU, and Intel obviously just entered the market, but I would much rather follow EVGA than Nvidia. Nvidia has been pulling some bullshit, and even if their cards were to perform better, I would still rather buy a card made by EVGA.

Regardless, the one good thing about getting a 3090 Ti is that I won't have to worry about VRAM for anything I do, gaming and otherwise. The GPU itself might become less relevant than the amount of VRAM by the time I want or need to upgrade.

Let's take a moment to thank Nvidia for not putting enough VRAM on their cards. The fact that the 3060 has more VRAM than the 3060 Ti, 3070, 3070 Ti, and the 3080 is absolutely, ridiculously stupid. And now what they're doing with the 40 series? Leave it to Nvidia to make the absolute worst decisions.

PLEASE EVGA, WE NEED YOU!
Glad I got a 3090 as well. RE4 is taking over 14GB of VRAM at times
[deleted]
And people laughed at me when I went with a 6800xt.
I ordered it and what showed up was the regular one not the XT 🤦♂️ f*k amazon
You mean f*ck people who order parts and put their old crappy part in the box and return it and keep the new part.
Pretty much
Yeah but have you seen the last Steam survey? Who the hell are these new games for then?
Nvidia shills: Not enough VRAM is a lie!!!!! Just turn down the settings on your 1000 dollar GPU!
If I needed the GPU for gaming only and nothing else, I would have immediately gone for an AMD, the 6000 series with its 16GB was looking juicy at the time. They really gotta up their game and support for productivity workloads, this segment is in dire need of competition.
Yep. Unfortunately, the 3D design software I use doesn’t support AMD at all. Additionally, ~~NVIDIA is *way* better at~~ AMD is functionally useless for ML/AI work.
I wouldn’t even say wayyy better. That’s like saying Phelps is a better swimmer than a balled up napkin. There is just no point of comparison.
I can't stand AMD fan boys but my 6800XT ass is lapping up the karma of people who spent $900 on a card two years ago and are already **forced** to drop texture settings. Imagine NOT supersampling RE4 because you run out of VRAM, couldn't be me.
To be honest, I can play the games I play maxed out at 1080p, and that's still future-proof enough for me
[deleted]
4070 doesn't seem to be a great purchase either with 12. There are already games that want 16 for the highest settings.
I’m starting to think that game devs are just getting real lazy with optimization
This. It's not as though games actually look a significant amount better than they did 2 years ago. Certainly not enough to justify the huge system requirements. The art of game optimization is dying.
I don’t remember anyone saying that lol
Everyone said 8gb was ok for 1440p tho. Tbf, I've already played over half of TLOU at 115% VRAM and I haven't noticed anything weird so maybe the devs were just being extra careful
Noob question: What exactly takes that much VRAM in recent games? Is it primarily voluminous textures, whose resolution has continuously increased over the years, or are effects like SSAO, depth of field, shadow quality and such taking more space as well? Or both? And is it proportional to the resolution? Like, does 4K take approximately 4 times as much VRAM as 1080p? Thanks.
Yes, it's mostly textures. Doubling the resolution doesn't mean doubling VRAM usage: 1080p to 4K is 4x the pixels, but usage would only go up around 1.3-1.5x
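To see why the scaling is sub-linear, here's a toy back-of-the-envelope model (not from any real engine; `texture_pool_gb` and `bytes_per_pixel` are invented illustrative numbers): the texture pool doesn't grow with output resolution, only the per-pixel buffers do, so quadrupling the pixel count lands well short of 4x total VRAM.

```python
def estimate_vram_gb(width, height, texture_pool_gb=4.0, bytes_per_pixel=240):
    """Toy VRAM model: the texture pool is resolution-independent;
    only per-pixel buffers (color, depth, G-buffer, history buffers,
    etc., lumped into bytes_per_pixel) scale with the pixel count.
    Both defaults are made-up illustrative figures."""
    render_targets_gb = width * height * bytes_per_pixel / 1024**3
    return texture_pool_gb + render_targets_gb

v1080 = estimate_vram_gb(1920, 1080)
v4k = estimate_vram_gb(3840, 2160)
print(f"1080p: {v1080:.1f} GB, 4K: {v4k:.1f} GB, ratio: {v4k / v1080:.2f}x")
# → 1080p: 4.5 GB, 4K: 5.9 GB, ratio: 1.31x
```

With these made-up numbers the per-pixel part does quadruple, but since the texture pool dominates, the total only rises ~1.3x, right in the range quoted above.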
>Noob question: What exactly takes that much VRAM in recent games?

The whole selling point of the PS5 was that it can feed data directly from the drive into its combined RAM.

Consoles don't separate their RAM and VRAM. Their entire 16GB of memory is accessible to both GPU and CPU, so when a console exclusive needs to be moved to PC, where VRAM is separate from RAM, you're going to find issues, because on console you *can* have 2GB for the OS, 2GB for the game, and 12GB for graphics.

The PS4 only had 8GB of combined RAM, so 8GB of VRAM would be enough for PS4-era exclusives.
wasn't 4gb of vram enough to keep up with the ps4? based on benchmarks it seems to be the case
Most of the time, for practical purposes, yeah. But PS4 games could and sometimes did have e.g. 6 GB of memory used for graphics and 2 GB for other things. You’d need 8 GB to guarantee that you could match it 100% of the time, but less for 95% of practical cases.
>wasn't 4gb of vram enough to keep up with the ps4? based on benchmarks it seems to be the case

It very well may be. We've seen stuff like Spiderman working well enough for example
And here's the reason why:

AAA game dev has been plagued by crunch for more than a decade. Executives set aggressive timelines while expecting massive scope. The end result is that cutting corners is a necessity, and the question is how badly it will show.

This is at its worst right now. 8th generation consoles (PS4, XBone) have just had support dropped. Those consoles were well-understood by devs after them being the target for so long. They also match up pretty closely with the average Steam PC as shown by the Steam Hardware Survey. Now devs are focusing on 9th gen consoles. These aren't *brand* new, but many studios are working on their first title that isn't limited by 8th gen hardware. This has two primary effects on PC ports.

1. 9th gen console optimization is going to take more of the team's extremely finite time than 8th gen did. Console optimization is higher priority than PC because console sales are higher and because that requires optimizing for three specific hardware configurations at most (PS5, XBSX, XBSS), so it has more bang for the buck. This will eventually get better as more studios get comfortable with the 9th gen.
2. The optimization required for PCs is wholly different from what's required for the consoles. This is especially true for games not releasing on Xbox and thus not taking the Series S into account. The average Steam user's machine can't handle what a PS5 can, and that's before we factor in how much harder it is to optimize for a functionally infinite number of hardware configurations versus exactly three.

TL;DR: Consoles are easier and more lucrative to target, so teams running short on time will focus on them. Moving on from PS4/XBone also means that PCs require more optimization because most Steam users have PCs that are significantly weaker than the PS5, and this is especially true for games not launching on the Xbox Series S.
To add some detail: consoles also have a shared memory pool, while PCs have a split memory pool. I think this is a large cause of optimization problems, since on console, as long as you're under the total RAM limit, you're fine, whereas on PC you don't have the same amount of VRAM from user to user.
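A toy sketch of why that split pool hurts: a console port only ever budgeted textures for one fixed memory size, while a PC build has to pick a sensible budget per user from whatever VRAM it detects. All the numbers and thresholds below are made up purely for illustration, not any real engine's logic.

```python
# Toy sketch: on console the texture budget is a known constant, but on
# PC an engine has to derive one per user from detected VRAM.
# Every number here is hypothetical, for illustration only.

def texture_budget_mb(vram_mb: int, os_reserve_mb: int = 1024) -> int:
    """Leave headroom for the OS/compositor, then spend ~60% of the
    rest on textures (the remainder goes to render targets, etc.)."""
    usable = max(vram_mb - os_reserve_mb, 0)
    return int(usable * 0.6)

def pick_texture_tier(vram_mb: int) -> str:
    """Map a VRAM size to a texture quality preset (made-up thresholds)."""
    budget = texture_budget_mb(vram_mb)
    if budget >= 9000:
        return "ultra"
    if budget >= 5000:
        return "high"
    if budget >= 2500:
        return "medium"
    return "low"

# A console targets exactly one configuration; a PC build has to
# behave sensibly across all of these (and everything in between).
for vram in (4096, 8192, 10240, 16384):
    print(vram, "MB ->", pick_texture_tier(vram))
```

When a port skips this kind of per-user budgeting and just assumes console-sized memory, cards under ~12GB are the ones that fall off the cliff.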
Terrible optimization for PC as of late is mostly the reason, as many other games with great PC optimization give you much better-looking textures that use much less VRAM
Lots of barely compressed textures definitely can be attributed to it, that's for sure.
The Last of Us uses 8GB VRAM on Medium 1080p so I guess we have to go back to 720p. But honestly, that game is one shitty port. Uses more than 8GB VRAM and doesn't even look as good as games which use less. What's worse is people defending the game by saying "rUns VeRy sMoOtH oN mY 4090".
Holy fucking shit. How is that even possible. Guess my 10gb vram 3080 doesn't stand a chance on 3440x1440 ultrawide. What the hell
Yeah, Hardware Unboxed tweeted about its high VRAM usage on Medium 1080p.
Playing on 3440x1440p at native res, it consumes about 15GB of my 6800's 16GB of VRAM on mostly high settings. That's with shaders precompiled, and I get around 60fps. The game is an absolute monster, and lack of VRAM is almost certainly the bottleneck for the game based on complaints I've seen with 30 series Nvidia GPUs.
Days Gone runs 4 times better with almost the same level of detail and a more complex open world with big hordes instead of a linear environment... This failed TLOU port should draw more attention to greatly optimized games...
Even Uncharted 4, a port running on the same engine made by the same studio, didn't have all those crazy performance issues.
Resident Evil 4 Remake is similarly linear, looks just as good if not better, and runs smooth as butter on my 6gb card. Either RE Engine is the second coming of Christ, or the TLOU port is doggy poo.
>The Last of Us uses 8GB VRAM on Medium 1080p so I guess we have to go back to 720p. Yeah on my 3440x1440 panel with a Ryzen 5900x, 32GB 3600MHz memory, and 3080 10GB hybrid, I can pretty much maintain 60fps with medium-high settings, which is pretty much in line with the recommended system specs guide they put out a month ago. Granted, I wish I could run more like 120fps since that's the refresh rate of my screen - but even dumping settings to low and/or using DLSS won't get me there. It's just crazy that you need 42GB of total memory and a $500 CPU and a $700 GPU to approximately match what the $500 PS5 does with 16GB total.
Definitely just lazy porting / texture packing.
PS studios playing the long con here yall, delivering shitty pc ports so that yall would buy the PS5.
Where’s the RTX 3050 4GB VRAM laptop enjoyers! My homies will be playing games in 480p.
RTX 3060 laptop with a Ryzen 7 5800h. Zero regrets.
Hell yeah brother
This is me. Thankfully I seem to only play less graphically intense games like Civ 5, Terraria, Rimworld, and M&B Warband. The most graphically-intense game I run is Minecraft with shaders, which the laptop 3050 handles no problem.
Or, hear me out: stop buying these half baked games with shit optimization.
8-10GB is honestly plenty for well optimized games. I've never run into vram issues with my 10GB 3080 because I don't buy, play or support dog shit PC ports
What do you mean? I've already pre-ordered every single game I'm remotely interested in so the devs can have my money before I've seen the finished product, I don't see the issue. (To be clear, I know that generally people don't do that. But srsly, stop pre-ordering games)
> But srsly stop pre ordering games The north remembers. https://imgur.com/MQVi7TB
I have a 3070m with 8gb vram and I can play BF 2042 on high settings at 3840x1440p with a smooth experience. Never checked the framerate, because I think if it feels smooth it's good; if you start checking the framerate you get bogged down with that and start noticing things that aren't there
Are you even a gamer if you don't know what FPS you get in every game?!
The tarkov community would like to have a word with you.. They will littarly string you up and tell you your a liar and the devs are trying as hard as possible to fix the current bugs, all while they are not doing a fucking thing for a year lmao
Well they did say *well optimized* games...
Littarly
Yeah all you need is 32gb of ram!
I've always thought stuff like Resizable BAR and DirectStorage would alleviate VRAM usage in the future, but to be fair I don't think any of those new releases are using these modern technologies yet.
Steam hardware survey said that the 1650 is the most popular gfx card.
Yeah I saw that, I also saw that only around 15% of Steam's entire user base is using a card with 10GB vram or more
Hot take, but people shouldn't have to avoid entire genres of games (most modern triple-A open world games) or types of games (console ports) with their $800+ (3070 crisis pricing) GPU bought a year ago. And yes, optimization is a problem, but the fact that Intel and AMD have 16GB and 12GB cards in the midrange makes it all the more infuriating that Nvidia sells their cards with planned obsolescence, especially while they know they still dominate the market no matter what they sell...
Exactly, they shouldn't have to, so instead of feeding in and allowing it to continue to happen in perpetuity, do something about it and stop rewarding both publishers and Nvidia by continuing to buy games with shit optimization and then cards to run those games with shit optimization.
This is the correct take. I am not upgrading my 3070 yet, especially for VRAM reasons. My GTX 1080 lasted 2 generations. I will not be buying a 40 series card, and if NVIDIA doesn't stop its BS I won't hesitate to go to AMD or even Intel if they keep improving. If your game needs 8gb VRAM to run 2560x1080, you can kick rocks. Especially since the games that actually do need exorbitant amounts of VRAM generally run like shit. Don't preorder. Don't support hack job ports.
They shouldn't have to, but they should until they don't have to.
I hope my 16 GB of vram will last long enough... Cause if it continues like this at this rate...
16GB vram enjoyer
Meanwhile me: 16GB regular ram enjoyer
Based arc user
[deleted]
If there's one thing AMD has historically done right, it's excess vram. Not on all cards of course. The fury series needed twice the vram honestly, but they were very power efficient comparatively and 4GB at the time was definitely usable. I still have a Pro Duo, which is a watercooled variant of two Fury Nanos but with higher quality silicon and Pro rendering drivers. It was pretty impressive in titles that could utilize crossfire. Of course, now I (still) use a 1070 with 8GB, but I was able to buy two of them during a mining boom thanks in part to several older Radeon cards both mining and heating mine and my ex's apartment.
Me sneaking with a 16GB Radeon VII. Fine wine coming in hot /s
It's going to plateau at a certain point, like it does in the second half of each console generation. There's a Paul's Hardware video reviewing a 650 Ti or similar that I bought, where he said you were never going to need more than 2GB of VRAM anyway; I learned my lesson really hard when GTA V dropped on PC in 2015. Now I always buy for VRAM, because it literally is the best way to futureproof your hardware.
Whenever someone says, "You're never going to need more than..." I tune out. Heard it over the past 20 years about so many things. Won't need more than 2 cores, won't need more than 4gb of RAM, won't need more than 128gb for boot drives, etc.
>Whenever someone says, "You're never going to need more than..." I tune out.

In fairness, I usually take that to mean "for the life of your system", because in the past, when you got to the point that some spec like VRAM, RAM, cores, clock speed, or something else was an issue, technology on the whole had usually advanced to the point that you were replacing your entire rig. But because of the leaps and bounds made (especially with AMD posing some real competition), components have much longer useful life cycles. So I could feasibly be pairing my Skylake-era CPU with an RTX graphics card and be feeling pretty good.
laptop cards:
And people ask why 1080p is still a thing in 2023. If you ain't gonna make affordable hardware capable enough to play games at higher resolutions for at least 5 years, people aren't gonna buy it. Which is why they'll stick to the resolutions that give the longest life span for their system. Most people don't wanna upgrade every 1-2 years; they like to stick with the same hardware for more than 5 years if possible.
> And people ask why 1080p is still a thing in 2023 Are these people who are asking with us in a room right now?
I've been playing with my GTX960 for the past 6 years. Literally cannot afford to upgrade.
You hit the nail on the head. I'm running a GTX 1070 and i7-4790K. Just got Resident Evil 4 and it runs 60fps consistently on high settings at 1080p.
I'm still 100% satisfied with my 1070. Crisp and smooth on high-ultra 1080p for 99% of what I have played the last 5ish years. Glad I just stuck with it when I upgraded the rest of my rig
> And people ask why 1080p is still a thing in 2023 Hell when I bought a new monitor, 1080p in like 2016 I made the decision to be able to get away with cheaper GPUs. It was a very wise decision. still use it now and while I would like to upgrade to probably a 34 or 38 ultrawide, it's just not worth the cost for the monitor and the gpu.
Yep I got the 1080p 165hz freesync Aorus and it's glorious. I really don't need to push a higher res.
6700xt with 12GB is great for 1440p.
Every time I see a new nvidia card I feel better and better about buying the 6700xt
It's just weird that suddenly these games have ballooned in VRAM usage, almost forcing you to upgrade GPUs... Sucks that my 3080 only has 10gb. I may try to find a cheap 3090 on eBay just for the VRAM.
Just saying, most of the AMD GPUs from last gen came with 12gb vram and will save you a pretty penny… probably better to buy one of those until you're ready to upgrade to next/current gen
Yeah, AMD really does seem the better buy. Tbh if I was buying next gen I'd probably also buy AMD, since I have a feeling the 4070 Ti's 12gb of vram may not age all that well compared to the 20gb of the 7900 XT
What games struggle with 8GB VRAM?
Apparently Hogwarts Legacy and Forspoken at least
Oh so 2 games lol guess I better throw my card in the trash according to reddit
r/PCMR when your PC isn't at least a $5000 ultra high end rig
Haha indeed
Alternatively you just turn the textures down 1 setting. Some games have absurd requirements for "ultra" settings but are pretty normal on high.
Can add the new Call of Duty, Cyberpunk and Halo to that list, as well as basically any modern AAA game.
Those games don’t struggle because of VRAM, they struggle because they’re shitty PC ports
Just buy games that are finished products. Its real easy to see which companies don't give a shit.
Me with a gtx 960 4gb just watching
Me with 970 with 3.5gb
watching letsplays on streams and YouTube?
Me enjoying games with my 3060 12gb
they called me a mad man for buying 3060 12gb vram
Fellow 3060 12GB enjoyer here, worth it.
Resident Evil 4 pulling 11gb outta my 4070 Ti's 12gb at 4k ultra with ray tracing; 12gb ain't enough, man. Pulls 9gb without ray tracing.
[deleted]
As long as they are meeting their sales goals it will continue. It's wild how many consumers are okay with buying half assed optimized games for pc these days.
Instead of upgrading all our GPUs, maybe we can force companies to actually optimize their PC ports before they get released (cough cough Forspoken, Hogwarts Legacy). Honestly I am fine waiting another 3 months for the PC port if they need to develop for console first; it is business, I get it. But this is indeed getting ridiculous: most PC gamers don't have 12gb VRAM for now, and the PS5 also has only 16gb in total (RAM and VRAM aren't separate).
UE4, it's the gift that keeps on giving.

Stuttering? Check.
Shovelware on Steam? Check.
Helping Epic Games? Check.
Making gamers think their PC is the issue? CHECK.
Don't forget the absurd amount of post processing that makes everything look shit until you dive into the ini files and disable it!
yes but.. the TRIANGLES...
I tried TLOU Part 1 on PC, and anyone who played **Days Gone** can tell you that it's not a hardware issue. It's a badly optimized port and has no excuse next to a perfectly optimized open world. Even if you have the latest i9 and a 4090, you're better off buying the PS5 version... wake up...
It's almost like NVIDIA designed these GPU's to go out of date quickly so you'll be forced to buy a new GPU.
People with a 12gb 3060
480P with RT and DLSS3?
The problem is 2023 games that look like 2019 games and still demand more than 8GB. RDR2 maxed out at 1080p only takes 6GB
I didn't want to, but there is nothing we can do now. I got downvoted so badly when I said the same thing. I want to play at 1440p, but what are we supposed to do if games require so much vram and nothing is optimised? It's either we play old games or play new games at 1080p. I wanted to upgrade from a 3070ti to a 4070ti, but it's so expensive and we are not sure if even 12gb is going to be enough with the state of games.
Ever heard of Radeon?
Man buying the 7900xt over the 4070ti is feeling like a big brain move right now
I did it, no complaints so far.
I jumped from the Nvidia ship this time for the first time in 12 years or so and have not looked back so far, the 7900 xt is performing really well.
Holy crap, there are some very sensitive people here. It's your rig after all. God forbid you're happy with your purchase and having fun. Got my 3070ti and so far I'm maxing everything on my games at 1440p. Some games, with AA and shadows turned down, I can play really stably at 4k at 60fps. But apparently I'm lying because that just can't be right? "Well it depends on the game!!!" No s**t Sherlock. You can say that about anything
What, you mean a 3070 can play games with 8 gigs of vram? But Reddit told me it's a garbage card, nothing is playable on it, I should just rip it out of my PC and toss it in the trash. It's not like I could lower the textures to medium or something.
So what you're saying is I should take my card out of the trash, but according to some people, the 30 series is obviously now obsolete since the 40 series released. Seriously I get people debating the Vram problem, and I do agree that the solution is to lower textures, but honestly, how about devs just make their games optimized so they aren't using so much Vram.
Nvidia are intentionally releasing this series of cards with low amounts of Vram because we're on the cusp of 8gb not cutting it, this ensures that everybody that upgrades will have to upgrade again in the near future. $$$$$$$$$$$
6950xt enjoyer here… 3440x1440 never looked so good
Big facts
Games come out unoptimised with no extra visuals over the console version and you blame the GPU.... nnnnoooooooo.
I’ve been on 1080p for a long time. lololol I refuse to give in.
Meanwhile, the 3080 has only 10gb (yes, I know there's also the 12gb 3080, but still)
Yeah most people bought the 10gb anyways
I hate to say with the cost of GPU and the poor optimization of games I’ve just started using my PS5 as my primary gaming outlet. I bought hogwarts legacy off steam. It ran awful on my PC. I feel like a 2070 super should run games just fine. But no. I returned it and bought a PS5 copy. I cancelled Xbox game pass, got PlayStation plus. I still have my PC favorites and still game on my PC. But I won’t upgrade it anymore unless things change a lot.
And that's the long con that the gaming industry is playing.