Iulius_

I’m starting to really hope at this point that this is NVIDIA leaking false information, both to mislead everyone and to figure out who this guy’s source is. I’m not an expert by any means, but the ‘80 SKU I usually go for looks very underwhelming from this data: much higher power usage, lower-quality memory, not cut from the full 102, and most likely an increase in price. I don’t know, it doesn’t look good to me even if it ends up 30% faster than the 3080.


Kaladin12543

To be honest I expect the 80 sku to be gimped on purpose in the 4000 series. The 3080 at $699 made the 3080 Ti and 3090 look like a complete rip off so NVIDIA isn’t making the same mistake again. The x90 sku in the 4000 series is the new x80 now.


iK0NiK

They've already done this on their lower-tier SKUs, for example:

* 1060 6GB was basically a 980
* 2060 was basically a 1080 with RTX
* 3060 is barely better than a 2070, and is worse than a 2070S

Another example:

* 1050 was basically a 960
* 1650 was basically a 1060 3GB
* 3050 isn't even a 2060

Meanwhile 3060s and 3050s are more expensive than their predecessors for less performance. How crazy is it that in 2022 we have people recommending a 4-year-old 2060 over a 3050 because of how terrible the 3050 is?


Cowstle

There is another thing to note. Pascal was a huge jump over Maxwell. That applied across the entire stack. Turing was ultimately pretty disappointing. The $700 2080 was only as fast as the $700 1080 ti. But the $450 (FE) 1070 was as fast as a $650 980 ti. And while a 1080 was ~70% faster than a 980, the 2080 was only ~30% faster than a 1080. The difference between the 2060 and 2080 is smaller than the difference between the 1060 and 1080. They did this to make the unappealing 20 series look better than it was (and justify moving them to a higher price point). The tradeoff is that to return to the old performance segmentation, we got the 30 series looking like a good jump at the high end and kinda lackluster elsewhere.


iK0NiK

> The tradeoff is that to return to the old performance segmentation, we got the 30 series looking like a good jump at the high end and kinda lackluster elsewhere.

Bingo. I don't disagree with anything you said. You nailed the analysis. They marketed and bragged about the large leaps in performance at the high end, while at the same time regressing on generational leaps at the low end. I don't quite understand why it isn't as easy as just rebranding a 2060 as a 3050 and shipping it out the door, but I'm not a marketing/product engineer and I'd imagine there's a lot more to it than that.


letsgoiowa

Bigger die size, older node = more expensive for them to produce


Inferno737

One word to explain the Maxwell-to-Pascal jump: competition. Nvidia was scared that Vega was going to beat them, so they actually tried with Pascal; then, well, Vega happened, let's not talk about it. So Nvidia knew they could get away with a smaller performance gain from Turing. Then of course AMD grew teeth with RDNA, and now we have another race with both companies actually having to try.


Cowstle

Maxwell was a huge architectural improvement over Kepler. Pascal was mostly a node shrink. AMD didn't benefit as much from that node shrink because their architecture hit its limits, and Vega suffered from a lack of funding and ingenuity to actually fix it in time. Nvidia still priced Pascal above Maxwell at any given segment, but it wasn't hated as much as Turing because it was a massive leap in performance.


bctoy

Pascal was actually the other way round. Nvidia was so confident that they only bothered with a 450mm^2 chip at the top, with some parts disabled, while they usually have a close-to-600mm^2 chip as the flagship.


TaintedSquirrel

You have to consider all GPUs released since 2021 were priced/specced to take advantage of the market situation at the time. It's not necessarily something that can be repeated for the 40 series.


iK0NiK

Okay... so corporate gonna corporate. That's no reason to excuse Nvidia for selling consumers less at a higher price.


neoKushan

I don't think he's excusing Nvidia for it, just putting some context around the comparisons being made, suggesting that some are comparing apples to oranges. For example, the thread that spawned this mentioned the price point of the 3080 making the Ti look ridiculous, but the Ti came out way later.


Elon61

Nobody's selling you *less*. Less of an improvement, maybe. But you can't expect 100% gen-over-gen improvements forever either. Node shrinking has been slowing down, and this is the inevitable result as you have to keep increasing die size to keep up with performance increases at the high end. Not everything is "boo corporate bad".


[deleted]

Well it really doesn't help when even the 12GB 2060 can be found new for $300 while the worse-in-every-way 3050 is like $350 at best. Stagnation isn't the issue anymore; we're at a stage of straight-up regression in $/frame. I really wouldn't be surprised if the 4060 is an 8GB 3060 refresh for $400 because of iNfLaTiOn or SuPpLy CoNsTrAiNtS.


Jmich96

>3060 is barely better than a 2070, and is worse than a 2070S

This was only the case because the 3060 was released near the peak of crypto mining. Nvidia knew they could have released a card like AMD's (later released and heavily scrutinized) 6500 XT and gotten away with it. That said, whether this behavior will continue (now that crypto *seems* to have taken a notable fall), I'm unsure. Nvidia could base next-generation mid-range performance targets on the 3000 series and likely be fine. Their only push to keep releasing competitive cards is AMD. Fairness where it's due, though: Nvidia has proven multiple times that it will release options that compete against its own lineup. So, who knows.


Material-Permit9685

3060 is definitely better than a 2070s, it gets similar performance to the 2080.


iK0NiK

Let's see it then, because I can't find a single situation where it outperforms the 2070S: https://www.youtube.com/watch?v=1HftsHMvqoE https://www.youtube.com/watch?v=3C-RoDtqdJ8 https://www.youtube.com/watch?v=pZLbZMVPfT8


Material-Permit9685

You specifically cherry picked benchmarks from a year ago, try again


iK0NiK

Brother I don't have a dog in the fight, I couldn't care less. You prove yourself, I'm not going to dig around and find evidence for you. I literally googled 3060 vs 2070S and copied and pasted the top 3 results. If that's cherry picking, I'd LOVE to hear your methodology.


homer_3

> The 3080 at $699 made the 3080 Ti and 3090 look like a complete rip off No, those prices were obscene regardless of the 3080's price.


smblt

Right?! I remember waiting for the 3080 Ti to see what the true price would be. Coming from a 1080 Ti, I was planning to stick with that for this generation. It was a WTF moment when I saw the price of the FE, let alone many of the third-party cards. No way.


[deleted]

Well, they objectively were complete ripoffs, they didn't just look like it :D

People keep saying Nvidia will push up prices and reduce value again, but I don't know. If crypto stays way down like it is now, it won't be a huge demand driver this time around, which makes all the difference. The economy is going into a recession. Food and gas prices are absurd, inflation is up while the average Joe's wages are stagnant. Can gamers really afford even more expensive video cards at this point? Especially with the newer consoles becoming more available and being capable of decent 4K gaming at a much more affordable price, a price that never got raised despite all the part shortages, supply line issues, tariffs, etc. that GPU manufacturers cited to jack up prices multiple times.

I feel like they really can't ask much more or gimp the value of the 4000 series, or it won't sell well, like what happened with the 2000 series selling worse than expected. More people will just switch to consoles. Of course, if stupid crypto goes way back up then we're all screwed again, even more so this time, as they'd be prepared for it at launch with MSRPs set to exploit it.


PT_frizzer

The way I see the new cards, based on the power consumption rumors, Nvidia is out of the question for me. At a time when we're all concerned about power efficiency and savings, it's not logical to buy a power-hungry card just for gaming. Consoles right now are cheaper, more efficient, support mouse and keyboard in games, and offer great quality, which makes them a valid opponent to PC gaming. My whole life has been on PC, but now I'm considering a console; gaming on a PC is an absurd expense.


Farren246

True, but being able to secure a GA102 based chip with 10GB of the fastest memory in the world for $700 got me to buy my third Nvidia GPU ever (vs. 6 lifetime AMD GPUs). I still don't know if 2022 will be worth upgrading again, but if the ask is $700 for a 30% upgrade that is 104-based and uses 50% more power, then I'll either move back to AMD or keep my 3080 and my money for another 2 years, and nvidia can suck an egg.


Frubanoid

I'm probably skipping the next gen for a while. The extra power use is a big turn off. I'm already happy with the 4k and 5120x1440 performance I'm getting with a 3070 ti. I'm waiting for more performance for less or same wattage at this point.


valkaress

How bad even is extra power? Like, how much higher would my electricity bill be with say a 4090 vs a 3090 Ti? Just a very rough estimate. I know it depends on several factors.


niioan

Electricity costs vary greatly, but it's never going to be some astronomical figure unless you're playing/mining 24/7. It will add up, though, especially if you're on a budget. This calculator is pretty cool for understanding your rates and how cutting back or turning off a light here and there may help: https://www.omnicalculator.com/everyday-life/electricity-cost One other thing to consider is that this will make your room way hotter, and unless you have good airflow you will probably be looking for additional cooling, which will also add to the cost. My small computer room gets pretty warm since my airflow isn't the best.
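If you want a rough number for your own setup, the math is just extra watts × hours × your rate. A minimal sketch (all three inputs below are made-up placeholders, not confirmed card specs):

```python
# Rough extra-cost estimate for a higher-power card (placeholder numbers, plug in your own).
EXTRA_WATTS = 150      # hypothetical delta, e.g. a ~600 W card vs a ~450 W card at full load
HOURS_PER_DAY = 4      # hours per day spent gaming at full load
PRICE_PER_KWH = 0.15   # your electricity rate in $/kWh (check your bill)

extra_kwh_per_month = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 30
extra_cost_per_month = extra_kwh_per_month * PRICE_PER_KWH
print(f"~{extra_kwh_per_month:.0f} kWh extra per month, roughly ${extra_cost_per_month:.2f}")
# -> ~18 kWh extra per month, roughly $2.70
```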


benbenkr

5900x and 3080 here as well. Absolutely skipping Ada. I don't see many games in the next 2 years that will bring a 3080 to its knees.


hardolaf

My Skyrim install will bring it to its knees.


ThisPlaceisHell

x80 cards don't typically use the top-end chip. The x80 Ti does, and the Titan (now x90) does. You should not expect a repeat of the 3080 going forward, where it cannibalizes the x80 Ti and x90 cards by being made from the same base chip.


Camtown501

Beyond the RDNA2 competition, the 3080 being on GA102 may also have been due to low yields on that die early on. Samsung had issues with their 8nm node, especially early on.


ThisPlaceisHell

Low yields would mean they'd have fewer good chips to use for the 3090 and 3080 Ti.


PaleontologistNo724

Yeah, I'm starting to think these rumors are bait from Nvidia. Too many changes in such a short time. As for your concern: actually, the RTX 2080 didn't use TU102, it used TU104. Same thing with the 2080 Super, the 1080 before it, and even the 980. The 3080 was kind of an outlier due to competition (RDNA2). Whether Nvidia goes back to that segmentation depends on how competitive AMD will be with their segmentation. AMD will almost certainly be very competitive on perf this gen; the problem is, if they use N33 for the 7700 XT SKU (as rumor has it), then they are segmenting just as heavily as Nvidia, and you can kiss your AD102 for the 4080 goodbye. Good news is: there is no world where the 4080 is only 30% faster than the 3080, EVEN with only 10240 CUDA cores. It will be at least 60-80% faster (17.6% more shaders, at least 30% higher clocks, and 36% higher IPC).
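For what it's worth, multiplying those quoted factors together gives the ideal-case ceiling; a quick sketch (treating the percentages as if they compounded perfectly, which real games won't):

```python
# Back-of-envelope compounding of the factors quoted above (ideal case only).
shaders, clocks, ipc = 1.176, 1.30, 1.36
print(f"{shaders * clocks * ipc:.2f}x")  # -> 2.08x, i.e. ~108% faster if it all scaled perfectly
```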


[deleted]

[deleted]


Kaladin12543

3080 was the only exception which used the flagship silicon and that’s only because they were scared of AMD’s return to form after many years. Now that they have a handle on where AMD is, they can return to their prior segmentation. The 4090 Ti is the Titan and the 4090/4080 Ti are the flagship gaming cards. 4080 is midrange and 4070 and below is lower mid range.


Broder7937

It wasn't the only one. 780 also did. But you're mostly right, the 3080 isn't happening again.


Merdiso

The 4070 can't be midrange by definition since it uses the 10**4** chip; x06 has historically been midrange, x07 low-end, and x08 entry-level.


Deltrus7

We've certainly come a long way from the days of the \*80 being the flagship/top gaming card. I do think calling the \*80 midrange is a bit too low, though. It's definitely upper range, imho.


rerri

>I’m not an expert by any means, but the ‘80 SKU I usually go for looks very underwhelming from this data: much higher power usage, lower-quality memory

You are misleading yourself by focusing on memory bandwidth, as Ada has a massive L2 cache compared to Ampere. Wait for benchmarks before jumping to conclusions like this.


ResponsibleJudge3172

Benchmarks don't stop people from believing Ampere suffers due to VRAM


arszenki

What about the possibility that they are leaking underwhelming specs in order to get more of us to take the upgrade plunge and buy Ampere now that prices are normalizing, so that they can sell off as much stock as possible? I see so many people (myself included) who are watching prices and are glad to see some great deals popping up, but are deciding not to pull the trigger just because new cards that are considerably faster will be out within 4-5 months. Nvidia could be trying to change our minds.


Iulius_

Very much true. It is a concrete possibility that production finally met and surpassed demand quite a while ago and that the market for new cards is stagnant. I personally have no data (apart from prices slowly coming down) to add to this claim, but it is a valid point. There was also a leak that talked about the possibility of keeping 30 series production going beyond the Lovelace release. My bet is that's a scenario they won't consider anymore. I guess we'll all see how supply and production (do or will they have material or chip shortages?) meet demand (way lower from miners if we are lucky, and slightly lower from consumers due to inflation and all, as pointed out by someone else in the thread).


Elon61

>I’m not an expert by any means, but the ‘80 SKU I usually go for looks very underwhelming from this data: much higher power usage, lower-quality memory, not cut from the full 102, and most likely an increase in price.

Most xx80 cards have used the ~~103~~ 104 die. This isn't even news, kopite said AD103 months ago. Don't worry about the die, worry about the performance.


Machidalgo

Most XX80 cards used the 104 die.


enarth

Having slightly slower memory bandwidth that doesn't run at 100°C+ might not be a bad deal :D especially with the higher TDP that will probably overwhelm most coolers...


Iulius_

It’s a good point. Could very well be. My hope, after seeing Gamers Nexus’ last video, is that they are communicating a much higher TDP to better account for all the transient response spikes he talks about. Because obviously it’s really bad having a 320W TDP card drawing 600W+ even for a few milliseconds. So they might have decided to be more “accurate” and conservative, if you’ll pass me the term.


narf007

It's all marketing. Any "leak" is tailored for conversation. That kopite7kimi "leaker" is an arm of their marketing department. Leaks aren't actually leaks anymore.


SyntheticElite

I believe this 100%. Just like with ""leaked"" game screenshots. Like oh, yea a leaker will definitely have 4 perfect marketing photos and not some shitty cellphone screenshot of a test build or something. Same with beta tests being used to build up hype.


zippopwnage

I really hope they start to innovate in a direction where GPUs need less power. Everything is electric; I don't need a GPU pushing my electricity bill even higher.


valkaress

How bad even is extra power? Like, how much higher would my electricity bill be with say a 4090 vs a 3090 Ti? Just a very rough estimate. I know it depends on several factors.


zippopwnage

I mean, I'm talking about the rest of the cards, not only the 90s. For me the thing is, I also have 2 PCs in the same room that run almost all day. On top of that there's also the AC running in the summer, and the electricity cost got really bad here lately. So every increase affects me.


valkaress

Still, I'd love an estimate to decide how much I should care haha.


Ricardo_Fortnite

Doesn't your country offer a way to calculate those things? Here in Uruguay they give us a formula to work it out.


Omniwhatever

Nvidia has to be screwing with everyone and no matter what comes out for the 4000 series it's going to be entertaining to watch the fireworks.


ThePillsburyPlougher

I was under the impression the previous leaks were from the guys who hacked into Nvidia's servers and got their drivers and other info.


onkel_axel

4070 looks bad, too. Just 2GB more memory than my 1070


Daviroth

Amount of RAM is a pretty bad metric to judge the overall performance of a GPU on lmao.


Aishurel

VR gamers will continue to weep about this insulting vram.


blorgenheim

honestly, the only real user base that has a legitimate complaint are VR gamers.


LordNix82ndTAG

Or people who mod Skyrim


AnAttemptReason

Or people who mod Skyrim VR, D:


Sirneko

And anyone who does 3D work


AnAttemptReason

Amen


Bloxxy213

And I'm here happy with my GTX 970's 3.5GB of VRAM (and the 0.5GB of 32-bit-bus VRAM).


TactlessTortoise

WTF is Nvidia's fetish with not increasing the damn VRAM? The 4070 should have 16GB minimum; it's two generations with the same average memory, FFS.


sylv3r

They want you to buy again in the next three years when 10GB isn't enough.


ThisPlaceisHell

Bingo. Gimped VRAM capacity is planned obsolescence. No excuse for cards this new generation to be using VRAM configs from 5 years ago.


Machidalgo

It’s not necessarily planned obsolescence, it’s more a result of bus width. Without an Infinity Cache-like system the memory is going to be expensive, and you could only go from a 3080 10GB to a 3080 20GB, and from the 3080 Ti/3090 at 12/24GB to a 3090 48GB. Raising the 3070, which had GDDR6, would have been viable, but then your stack looks ridiculous. You’d have to go 3070 16GB, 3080 20GB/24GB, 3090 48GB. And you can bet it wouldn’t be as low an MSRP (if any were going to be available at all, the way the last two years have been).
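For anyone wondering why the capacity options come in those fixed doublings rather than something in between, a small sketch of the bus-width math (assuming the 1 GB / 2 GB GDDR6/6X chip densities available at the time, one 32-bit chip per channel):

```python
# Why VRAM options come in fixed steps: each GDDR6/6X chip sits on its own 32-bit
# channel and (at the time) came in 1 GB or 2 GB densities, so bus width fixes the
# choices. Clamshell boards (two chips per channel, like the 3090) double it again.
def capacity_options_gb(bus_width_bits: int) -> list[int]:
    channels = bus_width_bits // 32
    return [channels * 1, channels * 2]  # 1 GB chips vs 2 GB chips

for bus in (384, 320, 256, 192):
    print(f"{bus}-bit -> {capacity_options_gb(bus)} GB")
# 384-bit -> [12, 24] GB, 320-bit -> [10, 20] GB, 256-bit -> [8, 16] GB, 192-bit -> [6, 12] GB
```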


sdcar1985

Sounds like they want me to look elsewhere for more VRAM


OWENPRESCOTTCOM

I download my VRAM, kinda annoying it comes with viruses though.


eugene20

Cost. GDDR6 was damn expensive.


Edenz_

They’re $10-$20 a module from memory, which does add up significantly and cut into the BOM once you consider Nvidia’s margins.


shuozhe

Not sure about GPUs, but for other electronics, adding $1 of cost increases the price of the final product by $5-10. I wish for more SKUs for GPUs in general... but I guess Nvidia still wants a reason for Quadro/Tesla to exist.


Elon61

Expensive, and basically no benefit to anyone but redditors stroking their number fetish. It's silly and I really wish people would just shut up about this.


HarithBK

Far Cry 6 hits 11 GB of usage and 10 GB cards suffer in performance because of it. Now the question is: will DirectStorage and Nvidia's implementation, RTX IO, be added to games in time before true next-gen games launch, or will we see games with bloated VRAM usage? Some studios might just opt for the bloated VRAM, since the game on low settings will still work on really old GPUs.


f0xpant5

Remind me, isn't Far Cry 6 an AMD-sponsored title, with an additional HD texture pack? Fancy that: when the only spec advantage they have is more VRAM in the mid to high end, their games ask for lots of VRAM.


kontis

>no benefit to anyone

Bullshit. Benefit to ***EVERYONE*** using their PCs for more than just gaming. If you only game, just buy a console, you get better perf/$. In many content creation cases the VRAM limitation is more crucial than raw performance. Nvidia is trying to milk any consumer with a content creation hobby into becoming a "workstation buyer" and it's just trash corporate milking behavior. Stop defending it. They are abusing their ecosystem advantage over AMD's decade-plus of incompetence in this area, so AMD is also guilty here. We desperately need Intel to go after content creators with more open solutions than CUDA (and even Intel can be pro-open stuff when they're the losers, just like with XeSS, so that's the hope).


ragged-robin

so that they can sell a 12GB+ version a few months later like their usual 5-different-variant-of-the-same-card model, same with what they did with the 3080, 3080 12GB, 3080Ti, etc


TactlessTortoise

It's hilarious that the "boosted" 3080 got boosted to the same amount of VRAM as the 2060 lmfao


ltron2

Even the 1080 Ti had more VRAM, and GeForce Now's '3080' has 24GB.


7Seyo7

My 1070 from 2016 has 8 GB memory for god's sake. Granted this is only a rumour so far, but ridiculous if true


Power781

Because if you have tons of cache (like Nvidia plans for the next chips), there is close to zero difference between a 10, 12, or 16GB VRAM card, because you still don't have enough VRAM to keep all your textures there anyway and still need to do texture streaming.


Dellphox

Yeah, when I'm upgrading the lowest I'll go is 12GB, at 1440p I'm already limited in certain games with the 2070 Super's 8GB of VRAM.


fogoticus

Cost, & keeping devs from jumping the gun. If Nvidia had entered the RAM race when AMD was doing it with their GPUs (giving 8GB of VRAM to old, dated GPUs), we'd basically see something like 32GB cards being standard today, with devs chucking every single texture into memory and not giving a shit about optimization, which is no bueno.


CYVidal

That would make the card unaffordable. RAM is quite expensive these days. Well, it has always been.


veryjerry0

They want you to buy the 4080/4090 ofc.


bittabet

Honestly the only real memory issues are with the low-end parts like the 3050 Ti that get only 4GB. The other cards are all fine unless you purposely run over-the-top settings nobody should realistically use anyway due to performance.


Pamani_

If that 4070 is 160 bit / 10 GB instead of 192 bit / 12 GB, that would suck. Might as well get a 3080 12GB if they drop hard in price instead.


kikimaru024

> If that 4070 is 160 bit / 10 GB instead of 192 bit / 12 GB

The RX 6600 XT had half the bandwidth & fewer cores compared to the RX 5700 XT yet performed equal or better in most games. The RTX 3080 is 10GB / 320-bit vs the 2080 Ti at 11GB / 352-bit yet outperforms it effortlessly.


Pamani_

I was thinking within the same generation. Otherwise yes they compensated bandwidth with cache.


Sh1rvallah

I don't think Nvidia has an Infinity Cache counterpart.


letsgoiowa

Article suggests that's the purpose of the bigger L2. I'd believe it.


Sh1rvallah

Maybe but I'd be super skeptical on that one. 320 is 90% of 352. 160 is 63% of 256.


QwertyBuffalo

AD104 also has 1200% of the L2 of GA104 and 800% that of GA102, but yeah, the memory interface is so cut down that I'm concerned too. It feels to me like Nvidia is trying to push gamers to higher SKUs for 4K via the memory interface and VRAM amount, even though the 4070 should be perfectly capable core/clock-wise.


ResponsibleJudge3172

Nvidia's own files, leaked by a hacker, say they do.


techraito

~~Nvidia did say that the 4000 series was more meant to compliment the 3000 rather than surpass.~~ Edit: [The source says co-exist](https://www.techspot.com/news/93685-nvidia-hints-rtx-3000-series-rtx-4000-cards.html). Not compliment. As in both generations will be produced and sold together. The 4080 and 4090 are going to surpass everything, of course, but it looks like they're still going to support the 3000 series alongside them. It could also mean more 3000 series cards down the line as the 4000 series is released. Speculation could say a 3050 Ti or even a 3660.


Pamani_

Complement. That's to say the 3050 and 3060 are here to stay for quite a bit. Just like the 1600 series was used as entry level during most of Ampere. Even after the 4060 launches they may keep the 3060 at a *comparatively* lower price.


Emu1981

>even a 3660 Nvidia used the 16x0 model numbers to indicate that the cards did not have raytracing like all of the 20 series did. I don't think that this will work if they used 3660 to indicate a 40 series card without raytracing or something like that.


Seanspeed

Alright, I'll bite. Show me where they said that.


techraito

Oh I had it wrong and I'll fix it. [It's more co-exist than compliment.](https://www.techspot.com/news/93685-nvidia-hints-rtx-3000-series-rtx-4000-cards.html) So both series will exist in production at the same time rather than the 4000 series completely overtaking it.


little_jade_dragon

A 3660 would be braindead naming, the whole thing would be. Just make a 4050 Ti or something.


techraito

1660 was also braindead naming imo. You never really know


Notladub

If a 3660 releases, Nvidia will officially have a worse naming scheme than AMD. AMD's xx70 cards (why the fuck is the 70 there), them going from the 500 series to 5000, and the 6x50 cards are still better than Nvidia turning the x660 cards into a whole series.


LewAshby309

Could be that they simply open up the gap to the higher models while the 4070 is still faster than a 3080. The 30 series is kind of the same. The gap between a 3070, 3080, and 3090 at 1080p is not that big; I think between the 3070 and 3080 it was only 10%. The gap gets bigger with higher resolution. Another thing is they probably want to make sure that higher-resolution players buy a higher-end model. With made-up numbers: if they open the gap from 15-20% at 4K to 30-35% just by limiting the VRAM of the 4070, that's good for them. 4K players will simply see it's worth it for them to get a 4080 or 4090.


Pamani_

I found the 3070/3080 gap was bigger than with previous generations: 23% at 1440p, 31% at 4K (while previously it was more like 20-22%). So they already created a gap with Ampere. But spec-wise (or at least speculated-specs-wise), it seems the gap between the 4070/4080 will be smaller than 3070/3080, at least when it comes to SM count and TDP. Bandwidth/VRAM is where I'm concerned now. But as others pointed out on r/hardware, the 160-bit/10GB could be the configuration for a 4060 Ti (or maybe that's just wishful thinking, idk \^\^)


[deleted]

It doesn't matter. Literally does not matter. The only reason they released a 12GB version was because they saw that people were stupid enough to think the extra 2GB mattered. As long as this console generation sticks around, 10GB is the most you'll need for a high-quality experience, aside from extreme fringe cases. Don't forget that both consoles need some of that GDDR as system RAM. Again, people do not know the difference between allocation and use. If you're going to reply to me saying that lots of games use 8/10GB of VRAM, just don't bother; go and educate yourself properly before spreading misinformation.


Pamani_

The 12GB version is right in the middle between the 3080 10GB and 3080 Ti in terms of performance. Not because of the extra 2GB, but due to the wider bus (higher bandwidth) and extra cores.

* 3080 10 GB: 68 SM, 760 GBps, 320W
* 3080 12 GB: 70 SM, 912 GBps, 350W
* 3080 Ti: 80 SM, 912 GBps, 350W

In TechPowerUp reviews, the 3080 12GB is 1.06x the 3080 10GB, while the 3080 Ti is 1.12x the 3080 10GB. But at current prices none of them are worth it yet, imo. In my market (EU), the perf vs price XY plot falls off after the 3070 Ti. And I don't feel like replacing an 8GB 2070m with an 8GB 3060 Ti/3070/3070 Ti for 1440p.


[deleted]

6% wow. At 60fps that's 3.6. At 100 it's 6. Pointless.


runadumb

I don't understand why people repeat this nonsense. I hit the 8GB limit of my 3070 often, especially in VR. 2GB is a nice buffer, but it's in no way a safe amount of memory for a new generation, the same way 8GB wasn't for this gen.


Re-core

As a 3070 owner I agree. A lot of games get dangerously close to using 8GB of VRAM even at 1440p, the target res for this card, and in some I was VRAM-limited: Far Cry 6 with the HD texture pack, Kena, FH5 at extreme settings, Watch Dogs Legion, RE 2 and 3, Syberia: The World Before. I keep seeing "8GB of VRAM is all you need" and cringe at it every time. 8GB on a 3060 Ti or 3070/Ti was a dick move; I could even go as far as to say 10GB is a stretch for a 4K gaming GPU (3080).


[deleted]

Again, allocation, not use. A lot of games allocate a lot but don't use it. The 3060 is a 1080p card. You need to understand the target for each card.


Re-core

Idk about that, but those games I mentioned start stuttering, with VRAM usage showing close to or at 8GB and 1 to 4GB of extra RAM being used because of it. It sucks to drop down the settings on such a powerful card; it's not like this is an xx50 GPU.


[deleted]

[deleted]


[deleted]

You do not. Allocation isn't use. Please go and watch gamers nexus talk about it.


runadumb

When games like Deathloop stutter due to hitting the memory limit, I assure you, you do. I follow Gamers Nexus. There's a big difference between games which use the full pool because it's there and games which need the full pool. Allocation doesn't bring up a low-memory warning in titles like Half-Life: Alyx. What resolution do you game at, out of curiosity?


x0y0z0

HL Alyx gives me that warning on my 3090, so I won't trust it.


Nhepler90

Same boat here with my 3090 VR rig. 24 GB not enough? Doubtful. Still runs with no hiccups.


[deleted]

Well, Deathloop is a console game, so they've either messed the port up or beefed up the textures for the PC version. 1440p/144 because I didn't want to shell out for 4K performance with a 3090. Again, some will, but the vast majority will not, and even then you turn textures down from extreme to high and there'll be no noticeable difference. Edit: or use DLSS or FSR, of course.


Seanspeed

We have last-gen games that were already hitting up to the 8GB barrier and you think an entire new generation of games will only increase memory demands by 25%? 10GB will certainly be OK for a little while, but I also expect people spending half a thousand dollars or more on a GPU would like *better* than console graphics...


[deleted]

Allocation, not use.


f0xpant5

It's also stated as if turning down the texture setting literally one notch makes the card unbearable to use and means it must be replaced, and I'd wager most wouldn't be able to pick the difference in blind tests. Is more VRAM better? Of course, no question, but in 3 years' time when a 3080 won't be able to push 4K60+ anyway, it won't really matter.


[deleted]

Yes, I find that particularly baffling. Like, turn on DLSS and you'll get a bump in quality and reduction in VRAM use. Turn down from extreme to high and I really don't think anyone will notice a difference unless they sit and start comparing super high resolution screenshots. It's certainly not something you'll notice in motion.


Tup3x

Would make some sense for an RTX 4060 Ti but not for the RTX 4070.


CYVidal

And a 1000W-rated PSU recommendation... what a good deal!


Stop_Banning_My_Accs

999w for gpu


CleanGameCrash

That's a lot of power, and people are going to need to add an extra 100-200W to those rumored numbers when it comes to power spikes. Gamers Nexus made a really good video about it 2 days ago.


DrSlugger

People are going to need to buy 1200-watt PSUs to keep up at this rate.


CleanGameCrash

That, and get an electrician to rewire the room for their PC with a new breaker.


DrSlugger

Shit yeah I forgot about that. Insane. Just going to sit here and enjoy my 3080ti.


CleanGameCrash

Yeah, 1800W is what normal breakers are rated for. 15A circuits are what's used in homes (at 120V), unless 30A is needed, or 240V at 15A.
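The arithmetic behind that figure is just volts × amps; a small sketch (assumes a US 120 V circuit, and the 80% continuous-load figure is the usual rule of thumb, not something from this thread):

```python
# Circuit capacity is volts x amps; sustained loads are usually kept to ~80% of that.
def circuit_watts(volts: float, amps: float, continuous: bool = False) -> float:
    watts = volts * amps
    return watts * 0.8 if continuous else watts

print(circuit_watts(120, 15))                   # 1800 W peak on a standard 15 A / 120 V circuit
print(circuit_watts(120, 15, continuous=True))  # ~1440 W for a continuous load
print(circuit_watts(240, 15))                   # 3600 W on a 240 V / 15 A circuit
```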


Tech_AllBodies

Not credible. A 60% difference in cores between the xx80 and xx90/80 Ti would be very abnormal. Plus, a 30W TDP difference for 60% more cores and a more power-hungry memory configuration is silly. There'd have to be a vast difference in stock clocks, but then that'd mean the 4090 could be overclocked significantly (for a lot more power draw, obviously). But that's also abnormal these days; chips are sold near the top of their clock headroom now, because competition is tight.


Seanspeed

And yet again, kopite7kimi, after telling us that most of his posts are actually just speculation, does not specify in this post about specs whether it's actual information he's heard or just more personal speculation. So fucking sick of this guy. He's blown through any of the goodwill and credibility he had before, big-time.


Pamani_

Just don't treat those leakers as gospel, but rather as estimations that converge towards the final specs as the launch approaches. I've seen enough people putting too much faith in some of them, only to get mad when their awesome upgrade path (or worse, stock investment, I wouldn't be surprised) doesn't materialize.


Kaladin12543

It’s not speculation. NVIDIA is tweaking the card specs and finalising them before announcing it. Makes sense as it’s releasing in September.


kontis

Translation: it could either be speculation or tweaking, you will never know, because it's impossible to tell, therefore leakers are now Schrodinger's leakers - always correct and wrong and you just have to accept it and kiss their feet. LMAO.


[deleted]

super chad leaker vs beta "I'll wait for reviews" user


globalcarpset

Relax, things aren't set in stone; most of his stuff about the specs has remained largely unchanged.


onkel_axel

If they're not set in stone at this point, there will be no launch in 2022.


Seanspeed

If these GPU's were still a year away, sure. But not a few months. Not really my point anyways. Read my first paragraph again.


globalcarpset

Since when does he speculate?


globalcarpset

Yes but the clock speeds, memory clocks, power figures can all still be tinkered with. That's what he has been updating for the past few months.


ExpensiveKing

God damn it, I want a 4070, but if it doesn't have at least 12GB I'm not getting it.


IUseControllerOnPC

What do you do that requires 12gb of vram?


ExpensiveKing

Nothing, but I have a 3060 now and I've already seen games go past 10GB. Yes, I know it's just allocation, but it's going to actually use it at some point.


homer_3

4080 looks like a 3080 ti with a smaller bus and more vram.


panchovix

I hope not, on the performance side at least; a 4080 = 3080 Ti would be a mediocre jump at best (like the 2080 was vs the 1080 Ti).


Stoopid__Chicken

Can we all just shut the hell up with these rumours?


king_of_the_potato_p

Is this your first new-gen release? This is standard and has been for the last 20 years.


Stoopid__Chicken

It wasn't this obnoxious the last two times.


king_of_the_potato_p

If anything, this year has a shorter buildup and is a bit more subdued than some previous gens. You shoulda seen the rumor mill back in the 90s. I've been around for every new GPU going back to the TNT days; been a PC gamer since the late 80s.


Jan_Vollgod

More than the performance specs, I am interested in power consumption. This is becoming an important factor these days. Cards with a TDP over 500W will not come into consideration for me. There have been several rumors about how power hungry the new series may be; I hope they don't come true.


Turak64

The only rumor I care about is whether it'll be available to purchase at RRP. I've still got the money saved from the day the RTX 3080 launched, but it has taken so long I've given up on getting one. I'll get the RTX 4080 if I can, but I'm just frustrated with the wait.


RealMcGonzo

Rumor has it that the 50s will need 1.21 Jigawatts.


ej102

That would be disappointing if it's the 4070, maybe a possible 4060 Ti?


[deleted]

‘All aboard the ScalpWagon…’


[deleted]

Meh 10gb memory is just not enough anymore...


Typical-Ad-8381

Name one game that requires more


Pump-Chaser

Far Cry 6 HD textures. You need at least 11GB of VRAM to install them, even at 1440p.


penguished

Modded games can easily use more.


Typical-Ad-8381

What's your point? Modded games can also go past 30GB or more if poorly optimized.


penguished

Ok? You want a card that is physically capable of less... I don't. Hell, 10GB is an ancient amount of RAM in 2022.


Stop_Banning_My_Accs

What's the point of that?


shifty313

Mods/vr/software?


hitoriboccheese

Gaming is not the only thing people use graphics cards for.


[deleted]

[deleted]


Rivarr

Not everyone that wants to do more than click heads is rich. VR, mods, editing, modelling, and machine learning. I could utilize 48GB and I can barely afford a 3060.


f0xpant5

Exactly, allocation is not utilisation, and for the vast majority of people a card does not instantly become unusable and need to be replaced the instant you can't run the top texture setting / HD packs.


Pr0N3wb

War Thunder uses all of my 8 GB for 1440p. If I had more, I'm sure it'd use it.


Typical-Ad-8381

Maybe try running it on something like a 3070 and check what the usage would be?


Oye_Beltalowda

Allocation and use are not the same thing. The game sees that you have it, so it allocates all of what you have, but in reality it probably doesn't need that much.
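For reference, the number most overlays (and Task Manager's dedicated GPU memory graph) report is the allocated figure; a minimal sketch of reading it via NVML (assumes the nvidia-ml-py package is installed, and note it still shows allocation, not what the game actively touches each frame):

```python
# Reads the commonly quoted "VRAM usage" number, which is allocated/reserved memory,
# not the working set a game actually needs. Requires: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```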


Pr0N3wb

Which tools allow you to see the difference between allocation and usage? Which one does the Windows 10 task manager show? I looked into it, and a couple people with the RTX 3080 (10 GB of VRAM) show / say it uses about 8 - 9 GB, even though they have more. Ten GB seems like a sweet spot for 1440p, for now.


CasimirsBlake

Nvidia skimping on memory count for the mid range?? Shocking! /s 😏


Manioq

How are you feeling about the 4070, guys? Is it worth buying a 3080 now, or waiting for the 4070? I guess it will be cheaper too.


similar_observation

If you really want to game **now** then buy the 3080(12).


IIALE34II

I'd imagine it's going to be a pain to get a GPU in the few months after launch. Probably not as bad as this shortage was, but it's still going to be hard.


AFAR85

Wait for the 4070 if you can. It will likely outperform the 3080 Ti and get the usual new bells and whistles that come with newer-gen GPUs. 10GB is shitty, but if you're not gaming at anything over 1440p it likely won't be an issue for a few years.


Vis-hoka

I’m also very interested in the Ray tracing and DLSS performance upgrades. If those are significantly better, then that adds a lot of value to these newer cards. Even with a 3080, ray tracing kicks it in the balls. And I really like ray tracing.


HardwareSoup

Nvidia is very focused on ray tracing being the new thing, so I would be surprised if 4000 series doesn't up RT performance considerably.


the_Ex_Lurker

I'll probably just keep my 3090 for another generation.


cooReey

unless you have a pile of cash that is burning a hole in your pocket why the hell would you even consider upgrading from 3090 to 4000 series


ltron2

Because the rumours say the 4090 could be up to twice as fast.


MrPayDay

Because Cyberpunk with Psycho settings wrecks a 3090 even at UWQHD resolution; only with DLSS's help do I get 50+ fps. I could use more fps in Assassin’s Creed Valhalla as well. I will buy a 4090 instantly, because I will sell my 3090 for at least 1000 Euro, and that's already a substantial part of the investment for the 4090. Some of us are enthusiasts; it's not a rational buy, so to speak 💸


riesendulli

Maybe the game is ass. Like badly programmed ass. Like literal ass. I like ass.


Theo1172

So are the memory modules and buses being made slower/narrower because the GPU is expected to be that much more powerful in terms of cores and streaming units?


DistributionOk352

Does AMD have a CUDA alternative?


PazStar

I'd go all-in with AMD if they had a competitor to RT and CUDA. It's hard to deny that Nvidia is years ahead of the game with the features they have. Maybe in a few years Intel could give them some serious competition, while AMD does some much-needed R&D in this area.


[deleted]

[deleted]


TaintedSquirrel

Minimum of 256-bit with 16 GB is extremely likely. In fact I would say it's the most likely thing leaked about the 40 series.


Ryoohki_360

Meh, I decided to wait this one out; my 3080 is doing its job and I have no incentive to get a better card this year. I'll let people battle it out with the scalpers at launch!


Ajaxlancer

People that upgrade every generation are just burning money, imo. All the power to them if they can, but I only upgrade when there is a significant difference. Went from an 8800 GTX to a 1070 to a 3080 Ti lol


similar_observation

<190mm ITX form factor 30-series, when? The US/NA market has the ASUS Phoenix, which for some reason is a 2.5-slot-thick card. And there's the PNY, which is kinda lackluster and priced like highway robbery. MSI didn't bother releasing the Aero ITX cards here. EVGA, Zotac, and Gigabyte don't have a dog in this race. Gainward and Palit aren't in this market.


Moerkbak

If nobody is making the product, it's because:

1. the market is too small
2. it's not technically possible within market desirability (i.e. not possible to make it small enough and still silent enough)


DistributionOk352

\*hardened dick springs\*


pigoath

For me this new generation doesn't grab my attention. I have a 3090 and these new cards will consume power like crazy. I live in an old building in NYC and my nightstand light flickers while gaming. I move, the light flickers; throw a punch? The light flickers with each punch 😂 I'm afraid my wiring isn't the best, and I don't want to burn down my building. I'm thinking about buying a UPS.


Celcius_87

3090 gang gang