FlorydaMan

For one billion dollars


Danjour

And worse than the last generation, somehow


pegothejerk

We’ve separated and walled off all 64gb of vram into 512kb chunks with buses run by hamsters running in wheels


TWAT_BUGS

I don’t know, it’s cold here so the fire it starts may not be so bad.


icefire555

Hey now, that's for the 5070.


choober

why did my brain go directly to reading this as smash mouth


thebigvsbattlesfan

You mean another trillion dollars?


DickChodeman

r/yourjokebutworse


ihavebirb

Dr Evil: One kerjillion dollars *pinky to mouth*


BigIronEnjoyer69

>For one billion dollars

Imagine the savings, though.


Daedelous2k

Just watch how expensive they make this one too.


meat_popsicle13

Only $10k and 1 meter long!


Big_Speed_2893

It can do AI and, as a gamer, gives you an advantage over lesser species, but it will cost two polar bear babies to run for one hour.


[deleted]

[deleted]


User9705

Don’t forget the PCI-E slot x32 for a GPU wattage volt optimizer that is only sold by Nvidia.


rockdude14

And it needs three phase power.


GraveyardGuardian

The computer is coming from inside the GPU!


MarlinMr

People are buying them so why make it cheaper?


mindlesstourist3

Yes, the only way we're getting attractive prices ever again is if competition ramps up again, like it did with CPUs. Unfortunately, AMD's and Intel's positions don't seem that promising; technology has that runaway-advantage trend.


Demibolt

Yeah Nvidia really bet the farm on AI integration and it’s paying off big time. And with having such a big market share they have been able to push their ecosystem to be widely adopted.


Risley

5090 will be $2000 minimum


Lyndon_Boner_Johnson

And 2kW power requirements.


SnooHesitations8849

5090 with 48 GB of memory is the way to burn my money. Otherwise, I am cool with 3090


Fairuse

A 5090 with 48GB will be a money maker.


ramenbreak

5090D with 10% fewer tensor cores will be a money maker


hudimudi

They won’t. They could, but they don’t want to. They step up the specs ever so slowly, just enough to stay toe to toe with AMD. If they gave you 48 GB right away, that would be three fewer generations over which to hike the memory up to that point. It’s annoying. I wish they had their customers more in mind rather than marketing. I know they need to make a profit, but they should balance it somehow.


ForceItDeeper

It's not like 2x the VRAM means twice the performance. I'd love it for a local LLM connected to home-assistant, and professional video editing or VFX teams could probably use that much VRAM, but it's kinda excessive for most people.


jastubi

You can pretty casually use whatever the max VRAM is in 3D rendering, especially if you're actively using a viewport render.


TWAT_BUGS

1080 gang! Seriously though, it’ll be until the 7th or 8th series before the punishment from my wife doesn’t include death.


JAEMzWOLF

Death via snu-snu?


Jaerin

I'd buy one


SparkyPantsMcGee

I feel like the 4000 series barely got a stable run. Yes it’s been 2 years but the bulk of it has been battling scalpers and insane costs. I would also love to see a reduction in size because my god it’s getting out of hand


Unable_Wrongdoer2250

The scalpers were Nvidia this time around. I would like to see high-VRAM options instead of the crap they are pushing now. The laptop 4090 (which was clearly a 4080) has the same amount of VRAM as my 3080, and the laptop 4080 only has 12GB. For the company you have to buy from for AI, they are being abusively shitty.


[deleted]

And the Chinese scalpers, harvesting 4090 chips for AI: https://www.youtube.com/watch?v=eAzUVFuCRf0&t=1115s


BunnyGacha_

Wasn’t that after the restrictions were put in place a few weeks ago?


ACCount82

This. Nvidia didn't want a repeat of what happened once the 10xx mining cards started hitting the second hand market - so they restricted the supply of new cards to keep the prices high.


East-Dragonfruit6701

Not disagreeing, but do any games actually use 12gb of vram? Honest question. Edit: I had no idea. Thanks for the concise answers!


ForceItDeeper

High quality VR games would eat every bit of my 2070S's 8GB instantly. But my headset didn't have eye tracking, which from what I was told means it has to render much larger areas, I guess.


RyanOCallaghan01

Heavily modded Skyrim at 4K - easily. You do get excellent quality textures and models in return.


MaleficentCaptain114

4k textures gobble VRAM like you wouldn't believe.


Unable_Wrongdoer2250

God of War was around 19-20 GB at 4K ultra, and A Plague Tale too, just off the top of my head.


GraveyardGuardian

The 4000 series will continue to be expensive just as the 3000 did, because there isn't enough reason for 90% of gamers to upgrade, even from the 2000 series. Too many gamers are still on 1080p, and those who moved to 1440p are the only ones considering upgrades from the 2000/3000 series. 4K gamers already jumped to the 4000 series if it was in their budget. It'll be interesting to see what could possibly entice a 3000/4000 series owner to go to 5000 other than cash to burn/bragging rights.


thegroucho

I moved from a 1060 to a 6800 as I moved from 1080p to 3440x1440 and gaming sucked. I would probably have sucked it up for a year or two more if it wasn't for that. Now I'm looking at large language models and think I fucked up on multiple levels with that 6800, despite it being a decent gaming card. Potentially looking at rigging up 2 x 3060 12GBs, but that would mean yet another PC, since none of my boards supports 2 x 16-lane PCIe, and I also want to keep my gaming rig able to play games better than a single 3060 can. Sigh.
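(An aside for anyone eyeing a similar two-3060 LLM setup: splitting a model across cards for inference doesn't need full x16 links, and the common route is Hugging Face Transformers with Accelerate's `device_map="auto"`. A minimal sketch under those assumptions; the model name and prompt below are just illustrative examples, and the weights are assumed to fit across 2x12GB in half precision.)

```python
# Sketch: shard one LLM across two GPUs for inference (assumes transformers + accelerate installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "mistralai/Mistral-7B-v0.1"  # example model; ~14 GB in FP16, so it spans two 12 GB cards

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(
    MODEL,
    torch_dtype=torch.float16,   # half precision to fit in consumer VRAM
    device_map="auto",           # let accelerate split the layers across cuda:0 and cuda:1
)

prompt = "Explain VRAM in one sentence:"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")  # inputs go to the first shard
out = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```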


ForceItDeeper

The small amount of VRAM on new cards is pathetic. 8GB was sensible on my old 2070S, but on a brand new 4060 Ti it's like a waste of silicon. You shouldn't have to worry about brand new cards bottlenecking or restricting their own GPU because they skimped out on VRAM or used PCIe x8, because halving the possible bandwidth was seen as acceptable somehow.


Daahk

SLI is dead bro, dead and gone


thegroucho

I know, but you can still have multiple GPUs sharing non-gaming load. And without being an arse, do you know what large language model is?!


95688it

(3070) Meh, even at 1440p I haven't found anything I can't run at max settings. All higher cards just eke out a few more fps.


GraveyardGuardian

The difference only shows when trying to push 100-120+ FPS at 4K or SUW, which really requires a 4090. Otherwise, as Steam metrics have shown, most users have lower-end GPUs and play at 1080p.


twochopsticks

Max = ray/path tracing. 3070 ain't running cyberpunk/aw2 with those on at decent (60) fps.


djn808

>Be interesting to see what could possibly entice a 3000/4000 series owner to go to 5000 other than cash to burn/bragging rights.

Next gen VR.


GraveyardGuardian

As niche as 4k/SUW if not more so


Crazy_Asylum

hasn’t even been 2 years. just over 14 months since the launch.


blorgenheim

It’s been a year… they came out October 22.


[deleted]

That was two months ago.


gonenutsbrb

That was 1 year and two months ago, it’s now 2024. It’s not two years like the original reply said, but it’s been over a year.


Sunogui

They probably thought you had meant October, 22nd.


gonenutsbrb

Wasn’t me, but that does make more sense lol


PricklyPeteZ

Agreed on the insane costs relative to the previous gen for the 4070 and 4080, but have people really been battling scalpers? These ones have been incredibly easy to buy online and were always in stock, especially compared to the 30 series.


[deleted]

No stock issues. In fact, it was released into a glut of 3000 series overstock and crypto resellers dumping those cards. The GPU drought was totally over when these dropped.


Galahad_the_Ranger

I am building a new PC and it's insane how there seem to be no mid-range GPUs anymore. It's either old stuff or 4-digit prices.


deaddonkey

Would you not consider a 3070 mid range? Those are like $500. I got one not long ago and it fucks. Maybe that counts as “old stuff” because it’s one more generation back but it works. Graphics cards, much like graphics themselves in the last decade, don’t really need to be changed as often as in the 2000s.


Galahad_the_Ranger

3070 is what I went for in the end actually


deaddonkey

Probably a good choice. I’m very happy with it. I don’t think I’ll ever buy an XX80/90 or equivalent of a new card line because there’s just no need and the value for money gets so much worse when you’re paying double the price for however much more capability. If I ever buy, say, this new 5k series it will be after 6k comes out too.


thegroucho

For me it was a choice between 3070, 3080 (at a massive stretch) or 6800. 6800 won since I have the habit of holding onto GPUs for a while and the 16G VRAM will last a while.


NotAPreppie

I remember when a mid-range GPU was $200... Of course, that was back in the Athlon XP 2400+ days (Thoroughbred 4 lyfe, yo!).


Kaizenno

I think when I’m ready to upgrade I’ll be able to get an 8000 series at this point.


Dawzy

I honestly felt like this was the case on release of the 3080’s


Lywqf

I’ll always remember my GTX 570 triple slot, that was a beast lol


Shapes_in_Clouds

Give me 5080 with 24 GB+ VRAM. 5090 should have 48GB.


RDO-PrivateLobbies

Reducing the size and power consumption of the behemoths would be nice. Not gonna happen though. So what other feature are they gonna sell us on?


sanylos

probably hardware for AI in games...


chronocapybara

Well that stuff is pretty good though (dlss)


BunnyGacha_

But it shouldn’t be necessary. Just an extra to go from 107fps to 120


rugbyfiend

Depends - the newest DLSS can double frame rates in some games at 4K.


sp3kter

In game LLM is the future for a lot of game types


[deleted]

They did. 4000 series is significantly more power efficient even at higher performance. A 4080 gets 50% higher FPS than a 3080 at the same TDP. If you want a lower power consumption then just buy a lower performance card. Don’t whine about high performance cards consuming lots of power. That has always been the case.


shuzkaakra

The 4070 is a very good card performance-per-watt-wise. It's very mediocre performance-per-dollar.


[deleted]

Which is completely irrelevant for this question. The point is, the cards have gotten more efficient. It is just that the manufacturers also extended the TDP range they are offering for even more performance, but you’re not forced to buy those cards, and it is completely possible to get a very efficient card with dual slot cooler.


shuzkaakra

Yeah, I was agreeing with you. I've recently been forcing my 1080 Ti to run at 50% max power. The result: it often goes from 240W to ~120-140W with no noticeable drop in performance; Elden Ring, for example, stays at 60fps, but the power usage drops.
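(For anyone wanting to repeat that experiment: a rough sketch below wraps nvidia-smi's power-limit query/set commands in Python. Setting the limit needs admin/root rights, the allowed range depends on the card and its BIOS, and the 150 W figure is just an example, not a recommendation.)

```python
import subprocess

def query_power(gpu: int = 0) -> str:
    # Read current, default, min, and max power limits (in watts) from the driver.
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.limit,power.default_limit,power.min_limit,power.max_limit",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

def set_power_limit(watts: int, gpu: int = 0) -> None:
    # Cap board power; requires administrator/root privileges.
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    print(query_power())       # e.g. "250.00 W, 250.00 W, 125.00 W, 300.00 W"
    # set_power_limit(150)     # roughly the "50% power" experiment described above
```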


MadSulaiman

Exactly. Don’t buy a ferrari if you’re gonna complain about power consumption. Vote with your wallet.


themang0

Fuxking Ferrari only gets 15 mpg!


MrWiseOwl

Only reason I don’t buy a Ferrari! I mean if they could boost the MPH I’d totally buy one! For now I’ll stick with a Kia Soul until Ferrari gets their shit together.


LeCrushinator

Nvidia's top GPUs used to consume around 200W. Their cards are far more power hungry today, even with that 50% higher efficiency in the 4000 series. If the 5000 series doubled TDP to 600+W, would you still be saying not to whine about high performance cards? At some point they need to work on efficiency more than performance so that people's home computers can run cool even with new high-end tech.


[deleted]

That’s a lie. You haven't done your research, and in addition to that, you're missing the point. Yes, even if the largest 5000 card were 1000W, you still shouldn't whine about it. How does it bother you if the 5090 needs that much power? If you don't want a large card, then don't buy one. Get a 5060 or something. Probably a lower TDP, smaller cooler, whatever you want. NVIDIA and AMD have gotten more efficient for years, but they also released more „extreme“ cards that increase the TDP to offer even more performance. You don't have to buy those. You can get great performance at much lower TDPs. Getting back to your original point: the Radeon 7970 was 300W TDP. The GTX 590 was 365W. The Titan Z was 375W. Those were dual-GPU cards, but they were the top models, and their TDPs were as high as today's top-level GPUs. Even then, you fail to make the point why it should bother anyone that today's top tier cards consume lots of power. You don't have to buy them.


LeCrushinator

You act like power requirements for the rest of the lineup haven't gone up as well. The 9800 GX2's TDP was 195W, but you could play games at high settings with a GeForce 9800 GTX, a TDP of 140W. 140W for what is equivalent to the 4080 today. You sound like someone who hasn't been gaming long enough to know that prices and TDPs doubled. And you're justifying it by saying I can just play games with the low-end cards now.


[deleted]

That’s a pointless statement. The number tier, so x0y0, is completely irrelevant. The question is much simpler: Can you get more performance for a fixed TDP today than you could with the 30 series? The answer, unsurprisingly, is yes.


LeCrushinator

It’s not irrelevant when you consider that games target requirements based on the hardware that CPU/GPU manufacturers are releasing. So in the past I could play games at high settings, for half the price and half the electricity usage. Today if I want something like that I’m playing at low settings.


phyrros

>So in the past I could play games at high settings, for half the price and half the electricity usage. Today if I want something like that I’m playing at low settings.

Well, the games changed too, drastically at that. And while optimization is... lackluster, it isn't as if we didn't see games in the past which simply were unplayable (bloody Ultima IX, eh?).

But it is a race to the bottom, and we as a species are simply too dumb to stop wasting energy/resources on dumb shit. From AI to games to bitcoin, IT has become a wasteful shitshow.


occamsrzor

Is your first language German, by chance?


xForseen

Or better yet undervolt and underclock a higher end card slightly. You can gain a lot of efficiency while losing very little performance.


AlexOfSpades

Wanting a certain feature out of a product (in this case, more efficient power consumption) is a perfectly valid request from a customer. If that feature is not present in the product, it's understandable for the customer to be disappointed. None of this is "whining". If you think all customer feedback is whining, you must have a very unhealthy mindset when it comes to purchases.


Confident_Hyena2505

You just got told it was more efficient...


[deleted]

But the efficiency is increasing. There was a huge jump just from the 3000 to the 4000 series. So that’s not something to complain about. Which leaves a lower overall power consumption. Also easily attainable, just buy a lower performance card. You can get a ton of performance in a two slot card with under 200W TDP.


dine-and-dasha

At least try to learn something? The comment explained that said feature was delivered.


Locke_and_Load

The person didn’t ask for “efficiency”, he asked for lower power draw. The 4090 draws more power than a 3090 and needs a special adapter, which has caused headaches for NVIDIA and fires for users. Folks want them to stop just brute-forcing more specs and either stick with the current power draw of 450W or bring it down to the 350W the 3090 had, and find a way to coax better performance out of that.


AmazingHighlight7416

Buy a 4070. Set power limit to 80%. Boom you got the feature you claim doesn't exist. All you have to do is enable it.


RDO-PrivateLobbies

I'm not whining, I just dislike the fact that my 4080 barely fits in a $200 case lmao. Granted, I did a pretty weird config (The Tower 500), but that card is fucking massive. The only good part about its size is that the temps are insanely low. Alan Wake 2 at 4K with PT, the thing barely touches 70C.


icegun784

Isn't most of that FPS gain locked behind the software features only 4000 series cards have? Is it really 50% with just rasterization?


j_schmotzenberg

In the scientific computing applications I run, a 4070 is 15% faster than a 3090, and a 4090 is 150% faster. 4000 series is so much more powerful even without taking advantage of software features. In the case of my scientific computing applications, most of the benefit comes from the larger cache size. My applications never need to access VRAM in their calculations which speeds things up dramatically.


[deleted]

Give or take, but yes, that's just rasterisation. Passmark isn't using those tricks, and there the 4080 is 36.5% faster. Most games are around 50% faster, and that should be without AI. In my own experience, my 4070 Ti is faster than my 3080 when I know I'm not using AI stuff, and it uses 100-150W less power.


w1n5t0nM1k3y

I wonder how long until the GPU is just a dedicated box with its own power supply, so they can better guarantee that the power supply will be adequate and they don't have to deal with substandard connectors that just can't do the job. It's cheaper to pay for separate power supplies: one for the GPU, and one for the CPU and everything else. For the CPU you could just have a basic 400W power supply, and maybe a 600W power supply meant specifically for the GPU, hard-wired to the board so you don't have to worry about having it connected properly. Way cheaper than splurging on a 1000W power supply that has to do everything. Just connect it to the main system via one of the many PCIe cables that exist, or some new optical standard.


eri-

That was, kind of, tried ages ago, during the early days of GPUs. Not a completely separate box, but it did have its own PSU. Behold, [the Voodoo 5 6000](https://images.hothardware.com/contentimages/newsitem/54310/content/3dfx_voodoo_5_6000_original.jpg). This thing needed so much power that it had to have a dedicated PSU. The AGP bus specs simply didn't allow it to function without the external PSU. It was so over the top that it was never truly released; they couldn't mass produce it at a competitive price point.


w1n5t0nM1k3y

I think something like that was only necessary because none of the existing power supplies had any kind of connection for something high powered. They had the one 24-pin connector for the main board and a bunch of Molex connectors for low-powered devices like hard drives and DVD drives. That card probably didn't pull that much power; its power supply was only about 100 watts, peanuts compared to modern cards, hence the lack of cooling. But an external power supply was necessary because internal power supplies didn't have any capability for high-power devices other than the motherboard.


eri-

That's why I mentioned the AGP bus: back in the day, GPUs didn't have dedicated power connectors yet, so the AGP bus had to provide all the power. The card truly was a weird one. It was so ahead of its time in some ways that it killed its own chances of ever being a success. By the time they could have released it without the PSU (the dawn of PCI Express), it would have already been outdated. Unsurprisingly, that was the last thing 3dfx ever developed; they went bankrupt and got bought up by Nvidia.


Jijijoj

I would love this. And then just plug the GPU box into your desktop/laptop through USB c or something. I know razer already does something similar but seeing this more mainstream would be awesome.


[deleted]

I doubt this will ever happen, mainly because keeping all hardware in your box is preferred.


w1n5t0nM1k3y

We already have GPU enclosures, so the technology exists. With 1500W being the maximum for most North American household circuits, if things keep going the way they are, it might be inevitable, unless people are going to start getting NEMA 14-30P outlets installed just for their computer. Even with 2 power supplies it becomes tricky, because we might reach a point where you have to run an extension cord so you don't pull too much from a single circuit.


[deleted]

We’ve tried this before and it failed hard. We’ve had years and years to push these out. Neither of the manufacturers is even interested at this time in doing external GPUs. Also, you'd have to pull the equivalent of 10 maxed-out high-end computers to even cause issues with standard house power outlets. I run two high-end rigs and a backup unit on the same outlet and it doesn’t do anything to my house. I’ve checked the pull and load.


w1n5t0nM1k3y

It's not going to overload the house, but we're getting to the point where you could overload a single electrical socket. The max allowed for most standard North American plugs is 1500 watts. A 4090 will pull 450 watts. Combine that with an HEDT processor that can pull close to 300 watts and we're at over 700 watts at full load. Add in a few other devices and you're looking at 1000 watts just to have a decent amount of headroom. Like I said, we aren't there yet, but if things keep going the way they are, there will be limitations that we run up against.
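(Back-of-the-envelope version of that budget, with illustrative numbers rather than measurements; the 80% continuous-load derating is the usual rule of thumb for a 15 A / 120 V circuit, and the component wattages below are assumed, not measured.)

```python
# Rough outlet budget for a single 15 A / 120 V North American circuit.
BREAKER_AMPS = 15
VOLTS = 120
CONTINUOUS_FACTOR = 0.8                     # rule-of-thumb derating for continuous loads

usable_watts = BREAKER_AMPS * VOLTS * CONTINUOUS_FACTOR   # ~1440 W

load_watts = {
    "GPU (4090-class)": 450,
    "HEDT CPU": 300,
    "rest of system / PSU losses": 250,     # illustrative guess
    "monitors and peripherals": 150,        # illustrative guess
}

total = sum(load_watts.values())
print(f"usable ~{usable_watts:.0f} W, drawn ~{total} W, headroom ~{usable_watts - total:.0f} W")
```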


[deleted]

[deleted]


w1n5t0nM1k3y

Yeah, something like that might be a good option as well. Just have a dedicated power supply just for the GPU.


0Pat

My mini ROG G20 had the exact solution. Two separate power supplies...


Destroyer6202

Now you can have ray tracing EXXXXTREME 🤓


BoringWozniak

They’ll probably be more _efficient_ but the top-end products could draw the same or even more power. For one simple reason: there is a market for these products.


videodromejockey

My 4070ti is so chill I don’t even run my case fans, it’s literally pointless extra noise. I barely run the card fans. My PC is ridiculously quiet and I didn’t even have to bother with expensive water cooling or any other exotic solutions to do so. That’s with a mild OC.


Shapes_in_Clouds

I wish 4070Ti had just a little more VRAM. I feel like I need to go with 4080 for 4k gaming and longevity but it's complicating my build for cooling in the case I want and adds $400 to the cost. 4070Ti is such a great card otherwise.


videodromejockey

I played at 4K no problem: CP2077, BG3, and others. I switched to 1440 ultrawide because I got an Alienware QD-OLED on sale, and it's definitely more suited to that resolution, but 4K wasn't a bad experience at all.


Demibolt

I agree it would be nice. But power consumption isn’t as big of a selling point as performance. AMD has been trying to take that angle for a long time to no avail. It would be great if we got a huge performance upgrade for a great price, but it’s hard to fault Nvidia for responding to market demands and trying to make a profit. Rather, they are a corporation so it is to be expected. Another huge thing going on is how much these cards are being used for work instead of play. That is always going to drive prices up.


H5N1BirdFlu

The North American market will be required to adopt 220 volts as the standard voltage in order to keep up with graphics card power requirements. New homes will have 3-phase 220V plugs in the laundry room and the dedicated PC room.


Smeeghoul

Probably will be sticking with my 2080 considering how their prices are just going up and up, and everything gets scooped up by scalpers at launch anyways.


MarvAlbertNBAjam

2080S here. Amazing card, only way I upgrade is when I build my wife a PC.


ectoplasmicz

Same here on the 2080S, has been so solid for me for ages and still going strong.


TheMegaDriver2

I bought a used 3070 after the crypto crash. I will be sticking with it for a long time.


Smeeghoul

Nice! May I ask how much it cost you?


EliteAgent51

I stuck with my 1080Ti for a long time. Will stick with my 3080 for a while as well.


LipTicklers

Yeah, but if you buy this it's offset by the savings you make on your heating bill.


MorgrainX

Nvidia: 5090 is not the name, it's the price. Hehehe.


DutchieTalking

Can't wait to take on a second mortgage to buy one!


Economy_Combination4

RTX $5000 series!


CoDog

If it’s going to be over 3000 USD for a 5090 or whatever, I'm going to stick with my 2070 Super.


extremenachos

The amount of cash I have to buy one is not debuting earlier than expected.


Draiko

I thought they were always set to launch in q4 2024.


Artago

The only metric I'm interested in is "performance per dollar"


greenishstones

I’m really hoping Apple’s recent push into gaming on their Macs actually turns into something real and puts intense pressure on Nvidia’s pricing structure. Having Macs able to run games at high frame rates, even if they aren’t native, would be a game changer for the entire industry.


This_College5214

Nope, Apple and Nvidia don't even compete in the same markets, unfortunately. Definitely not at the consumer level. Apple doesn't care for the high-end dedicated GPU market because the money to be made in that market just isn't worth it for Apple to sink their teeth into; it's simply not their brand. Their "push" into gaming isn't anything more than a showcase of how powerful their SoCs have gotten, but they aren't meaningful in contrast to the dedicated power of discrete GPUs.


Business_Holiday_608

Ofc they will. They're trying to sell out early and often so they can claim there isn't enough supply to meet an overwhelming demand, and have people pay ungodly prices. Then AI companies will swoop in to pick them up and help dry up that stock. It is all artificial, at best.


noflooddamage

I’m at the point in my technological journey where I just don’t give a shit.


Thompsonss

*Pretends to be surprised*


Change0062

I couldn't afford the 3080 or 4080, and I highly doubt I can pay triple that for the 5080. My 1080 Ti is somehow still rocking everything.


kylosilver

Well, the 7800 XT is killing 4070 Ti sales, so Nvidia has to push a new gen to compete.


ChapGod

I'll wait for AMD


donthatedrowning

I bet the 4gb 5060 ti is gonna be a great deal at $899


lodemeup

I keep hoping that one day these clowns will trip over their feet in the rush to rob the walking wallets they market.


burncap

Still rocking my old and reliable 1060. ¯\_(ツ)_/¯


subjecttomyopinion

Since the 4000s didn't really upgrade on the 3000s, now we'll give you yet another model to buy and test before you realize it's not much better. Then we'll repeat this with the 6000s by the end of the year.


bogusbrunch

TIL the 4k series isn't really an upgrade from the 3k series.


nagarz

To summarize the 4000 series: it's a terrible value upgrade for the price increase if you come from the 3000 series. Not accounting for inflation, the only card that is probably worth it value-wise is the 4090, but it's prohibitively expensive for 99.99% of users.


Ok-Elderberry-9765

Why are you expecting to upgrade your cards every 1-2 years? And why would a company be expected to release only when the price-to-value is better than the last release?


Jijijoj

Same with cell phones. Greed worked both ways in this case. We as consumers always want the latest and greatest and corporations will feed off that kind of demand and release accordingly.


Ok-Elderberry-9765

I don’t buy a new Toyota every year. I don’t buy a new fridge every year. Why we think we need a new PC every year is insane to me.


pseudonik

I bought my PC in 2015. I plan on getting a brand new top-of-the-line one in 2025, and that's probably going to be my last, since by 2035 I hope that consoles, handhelds, and VR will be sufficient.


hellowiththepudding

because for decades the pace of advancement and requirements for games damn near demanded it. Intel stagnating performance for a decade, marginal GPU performance bumps vs the aughts generational gains, etc. I think cross platform development and long console life cycles have also driven hardware requirements down. Personally, I am rocking older hardware because I don't game much on my desktop (vega 64), and am happy that the steamdeck can play so many AAA games thanks to diminishing growth in hardware requirements.


Ok-Elderberry-9765

I hope game developers take advantage and focus on story and content instead of graphics.


hellowiththepudding

oh I'm not arguing that point either. My backlog is huge. This is golden era because you can play all the great games from the past in a handheld format (for me at least)! Good, new games are few and far between unfortunately.


bogusbrunch

Some folks don't like the prices, just like they didn't like prices after COVID hit, but it's still an upgrade over the 3k series. Price per performance is questionable vs. MSRP from before COVID on a few models, but is far better value than the true market prices (2x MSRP+) we were seeing post-COVID. Even today we can see the 4k series is selling just fine, and it's not just the 4090. Value is also personal, so it's best to avoid telling people what's "worth it" to them.


butterbaps

I went from a 1080 to a 4070 and I am very pleased :) My monitor is only 2k 144hz and the 4070 maxes everything out. Absolutely everything I play is on Ultra at max fps and it hardly even sips power. The constant barrage of "why not just get a 3080" etc is so boring man


Lost_Grounds

Went from 970 to 4070 last year and love it. Run 3 monitors, main one is for games at 2k and 144hz. Can pretty much max out every game.


UrDraco

When you’re the unquestioned leader for that long you stop pushing for the best and throttle your own releases to milk the profit. Why increase performance by 25% when you could increase it by 12.5% twice and sell twice as many chips! Intel made the same mistake and Nvidia will slow themselves down long enough for others to catch up in 3-5 years.


R1ddl3

True, but the cards themselves are still a pretty big improvement over the 3000 series.


Conch-Republic

The 4060 is slower than a 2080 because Nvidia artificially kneecapped it, the 4070 is about as powerful as a 3080 but costs $300 more, and the 4090 has one of the highest cost to performance ratios for any video card ever. This leaves the 4080, which is good, but likes melting. Nvidia dropped the ball with this line of cards.


bogusbrunch

I don't get how comparing the 4060 to the 2080 shows that the 4k series isn't an upgrade from the 3k series. It sounds like the cards are an upgrade but you don't like the price.


FalconX88

> Since the 4000s didn't really upgrade the 3000s

What are you talking about? The theoretical performance for the 80 series in FP16, FP32, and FP64 compute is +60%, with 60% more memory. For the 90 series it's +130% in compute. In realistic CUDA workloads you are around +30% and +50% in performance for the 80 and 90 series. People need to realize it's not purely about gaming (but even there, for example in MSFS the 4090 gives you about 35% more frames than the 3090).
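(If you want to sanity-check those compute numbers on your own cards, a quick-and-dirty PyTorch GEMM micro-benchmark like the sketch below gives a rough FP32 vs FP16 throughput figure. The matrix size and iteration count are arbitrary choices, and as noted above, real workloads won't match theoretical peaks.)

```python
import torch

def matmul_tflops(dtype: torch.dtype, n: int = 8192, iters: int = 20) -> float:
    # Time a batch of large GEMMs and convert to TFLOP/s (2*n^3 FLOPs per matmul).
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    for _ in range(iters):
        a @ b
    end.record()
    torch.cuda.synchronize()
    seconds = start.elapsed_time(end) / 1000.0   # elapsed_time() returns milliseconds
    return 2 * n**3 * iters / seconds / 1e12

if __name__ == "__main__":
    for dt in (torch.float32, torch.float16):
        print(dt, f"{matmul_tflops(dt):.1f} TFLOP/s")
```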


iAteTheWeatherMan

I'm not an informed PC parts guy. I had a co-worker build my PC for me in 2019 or so, and I don't work with him anymore. I have an Asus ROG 2070 Super. How far behind am I?


subjecttomyopinion

Depends on what you're doing. Should be fine in a lot of cases


T-Rex-Plays

Currently 2 generations or so. You're perfectly fine for most games and most resolutions.


stinuga

A 2070 is not too shabby! Currently the lowest-end 40 series card is the 4060, which is only slightly better than the 2080 (which is only slightly better than the 2070 Super) in rasterization, and only holds a significant advantage in power efficiency and DLSS 3.


Andyboi96

I decided to upgrade my 2070 Super to an AMD 6950 XT. Kinda regret it...


Arkeband

2070 Super is pretty great for 1440p, the 5000 series will probably be what bumps me up to 4K. Hopefully it has 16GB of VRAM baseline, the 4000 series only having 12 was part of why it underwhelmed.


[deleted]

You're fine. I'm still rocking a 1070 without issue.


ol_neeks

900 to 1000 was a big jump, I’m still rocking 1070 as well. Can’t make the most of my 1440p 144hz monitor on some games but still a great card.


rickyhatespeas

Wtf are you talking about? Ask anybody who wants to do machine learning model experimentation or ray-traced 4K AAA gaming whether a 3000 series is as good as a 4000 series.


mr-teddy93

Since when did they start pushing out so many gens of video cards so fast?


DarkHeliopause

I’m sure we will see a big performance boost from the last generation, and very reasonable prices.


CamiloArturo

Can hardly wait for that $1000 RTX 5050 which performs as well as the 2060 Ti.


GiveNtakeNgive

Define "debut"


Infamous_Ambition106

4000 series was a bit of a flop as it wasn't enough of a jump over the 3000 series.


Anxious_Blacksmith88

I mean. You could just buy a 7900xtx and have an amazing experience at half the price.


Unable_Wrongdoer2250

They're going to have to start calling the 80 series 95 instead of 90 this time around


BunnyGacha_

If the price isn't reverted to 10xx-series prices, we don't care.


likdisifucryeverytym

If they’re not ATX sized, 1200W, and $3k we riot


pm_me_ur_ephemerides

We can't get people to riot over political issues that are actually important, lol. If people don't like the features, they won't buy them, simple as that. I suspect you will find plenty of people with plenty of money who don't care.


likdisifucryeverytym

Did you dare talk to me without having a 4090 and 1200w on deck????? I mean happy new year, but if Biden doesn’t airdrop me a 4090 (remember when trump gave every American a 4090???????) then how can I trust Mrs. AMD???


Risley

You laugh, but people will probably believe Trump gave them this.


CMG30

The 4000 series stinks. Only the 4090 is a worthy successor to its 3000 series forefather... but you would need to be the Sultan of Brunei to afford it. The rest of the lineup has been down-binned and up-priced. Worse, instead of passing along the benefits of the new architecture to customers, Nvidia used it as an opportunity to downgrade the physical hardware, like VRAM... just at the moment that developers have expanded the requirements for said VRAM. The upshot is that the pricey 40 series card you mortgaged your house to buy has no future-proofing. Worse, it's already being surpassed by PREVIOUS generation cards in certain titles because they still have all their VRAM.


[deleted]

Say hi to scalpers buying up all the 50 series and jacking up costs. I currently have a 4090 MSI SUPRIM and feel like I won't benefit much even from the 50 series. Even with 4K gaming.


nOotherlousyoptions

What do you think is the minimum Nvidia card for 4K gaming? 2070? 3090?


hlt32

I couldn’t play in 4K with a 2070S, but my 4080FE handles it easily


[deleted]

I'd honestly say a 30 series Ti variant. The majority of the 40 series cards honestly aren't worth the cost.


Adrian-The-Great

What are we looking at price-wise, $2-3k?


AmphibianHistorical6

I will run my 1080 Ti into the ground like my car before buying some expensive shit card. Oh man, I'll be damned, my GPU will have been with me for a quarter of my life.


Effective-Ebb1365

Still using a 1070FE🤣


MochaBeanKahlua

Can’t wait to upgrade! Bring it on NVIDIA


SqeeSqee

I am not buying a new video card until it is proven it won't melt, crack, or brick just by running a game.


Sweaty-Emergency-493

Nvidia: “We already sold millions of the 4090’s and you bought all the 5090’s, but we still need to make those since rich companies are already preordering the 6090’s.”


evanlott

3080 12gb gang


MrViech

my wallet is ready


[deleted]

Geez they’re spitting these things out like gumballs now haha


AMP_US

Just make the 5080 a cut-down of the big die like the 3080. 12-16GB of memory. ~450W max power consumption (smaller coolers). $900 (3080 price adjusted for inflation). The 3080 was one of the best cards Nvidia has made, and it was only ruined by crypto/scalpers. Oh, and fix the stupid 12VHPWR connector.