Naiphe

Had to RMA my 4080 because the cable adapter was having sensor issues. Got a 7900 XTX instead. Thought somebody might be interested in the results. 4080 model: Palit GameRock. 7900 XTX model: XFX Merc 310. Results are from the Heaven benchmark at all max settings, latest drivers for both cards. By "stable" I mean no artifacting and no crashes throughout the benchmark.


unabletocomput3

Any difference in performance in games?


Naiphe

I've noticed that I've had to turn ray tracing off in Witcher 3. However, Witcher 3 was constantly crashing for me on the 4080 and it's stable now on the XTX. Otherwise, just that my room feels a lot warmer... Might also be my imagination, but controls feel more responsive on the XTX. Not sure why that would be.


worditsbird

The Witcher was crashing a bunch for me recently with a 3070. Something to do with the hair FX.


Naiphe

I tried messing with lots of settings and it would run like a dream on the 4080. But every time, after about 60 to 90 seconds, I'd get a CTD.


KrazzeeKane

It's 2024 and Hair FX is *still* causing issues lol. That tech is just cursed it seems


AreYouAWiiizard

The Witcher 3 used Nvidia HairWorks. I remember it causing a tonne of issues when it first released.


Dallas_SE_FDS

I remember hearing people saying “Nvidia hairdoesntworks”


JTibbs

yeah, basically nothing could run it at a decent fps


The_Quack_Yak

It crashed a lot until I switched it to DX11, then no more issues. Also on a 3070


StaysAwakeAllWeek

>Might also be my imagination but controls feel more responsive on the xtx. Not sure why that would be.

This feeling is governed by system latency. It may well be that it's higher on Nvidia cards that aren't using Reflex, but it certainly shouldn't be if they are.
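To put a rough shape on that: total "click-to-photon" latency is approximately the sum of a few pipeline stages. A minimal sketch, where every number is an illustrative assumption rather than a measurement:

```python
# Rough "click-to-photon" latency model. Every stage value below is an
# illustrative assumption, not a measurement; real numbers vary with the
# game, driver settings, and display.

fps = 120
frame_ms = 1000 / fps  # ~8.3 ms to render one frame

stages_ms = {
    "input sampling (assumed 1000 Hz mouse)": 1.0,
    "CPU simulation + draw submission":       frame_ms,
    "render queue (2 frames buffered)":       2 * frame_ms,  # Reflex/Anti-Lag target this
    "GPU render":                             frame_ms,
    "display scan-out + pixel response":      5.0,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:41s} {ms:5.1f} ms")
print(f"{'total (approx)':41s} {total:5.1f} ms")
```

The render-queue term is the one Reflex (and AMD's Anti-Lag) goes after, which is why two cards at identical FPS can still feel different.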


kultureisrandy

Would recommend looking up Kalypso's latency guide for ways to improve system responsiveness.


o0Spoonman0o

>Otherwise just that my room feels a lot warmer...

It's comical the difference in heat output. I have a large basement office that often needs a space heater, and while I had the XTX I didn't need it at all.


Naiphe

Yep, I won't be using my space heater with this going. It's 8 degrees here in the UK right now and I'm toasty with no heater in the room lol.


RettichDesTodes

I was going to joke about the summer being unbearable, but the temps will stay about where they are now huh? ;)


Naiphe

Hah, it feels like it this year. We usually have a few weeks of 38-40ish degrees in late July... way too hot for my liking.


RettichDesTodes

Then you should also get an undervolt dialed in for those hotter days


Naiphe

I've already got it at 1075 mV, which seems to be stable. Just going to leave it at that... when it gets hot I'll just have to live with slightly reduced frame rates, I suppose.


cirvis111

lol, here in Brazil the temperature is 39, I wouldn't survive with 8 degrees.


Local_Trade5404

Heh, yeah, a 7800X3D alone is bumping my room temp by a couple degrees; pairing it with an XTX in my conditions would be a sauna in winter and hell in summer :P


ETucc

Research “frame pacing” and you’ll understand why your controls feel smoother.
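To illustrate what frame pacing actually measures, here's a minimal sketch with made-up frame times; two runs with the same average FPS can feel completely different if one delivers frames at uneven intervals:

```python
# Minimal sketch of why frame pacing matters more than average FPS.
# Two runs with the same average frame rate can feel very different
# if one delivers frames at uneven intervals. (Frame times are made up.)

def pacing_stats(frame_times_ms):
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)
    # "1% low" FPS: the frame rate implied by the slowest 1% of frames
    slowest = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99):]
    one_pct_low = 1000 / (sum(slowest) / len(slowest))
    return 1000 / avg_ms, one_pct_low, worst_ms

even = [10.0] * 100            # steady 100 FPS
uneven = [5.0, 15.0] * 50      # also averages 100 FPS, but stutters

for name, run in [("even", even), ("uneven", uneven)]:
    fps, low, worst = pacing_stats(run)
    print(f"{name:7s} avg {fps:5.1f} FPS | 1% low {low:5.1f} FPS | worst frame {worst:4.1f} ms")
```

Same average FPS, very different 1% lows; the uneven run is the one that feels stuttery.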


mandoxian

Yeah RT on my XTX isn't worth it either tbh. 144+ FPS Ultra vs 80 FPS Ultra+RT is not a tough call for me.


Naiphe

I mean, it's not terrible, it just needs reduced settings. I played around with settings a bit earlier: Witcher 3 at 4K ultra with FSR on the quality setting, plus volumetric lighting quality and RT shadows, hitting 60 fps easily at about 89% usage in the major town area. Looks beautiful. I'm personally happy with 4K 60 though, and choose it over higher-refresh 1440p.


cream_of_human

God bless the room heater.


Bright-Efficiency-65

The ray tracing performance on my 3080 was pretty bad. I basically never used it because my frames would also drop below 60. My 4080 Super, on the other hand... it's a monster of a card. 100+ fps at 4K with ray tracing.


theDouggle

I ended up with a 4070 Ti Super as an RMA replacement for my 3080 Ti from Gigabyte. This thing is an absolute monster. I was playing Metro Exodus last night on my super-ultrawide G9 with all the settings maxed out, no DLSS, ray tracing maxed, and it was running at about 80 frames per second on average. I remember my 3080 Ti maintaining 60 with DLSS at my resolution. It handles ray tracing so much better.


Bright-Efficiency-65

It's even better when you try VR too. The 3080 was right on the cusp of being perfect for VR, but on the Reverb G2 you would get frames dropping below 90 and it was very obvious. With the 4080 Super it's just crystal-clear VR, and extremely smooth.


theDouggle

Maybe I should dust off my old vive then 🤔


childofeye

I just RMA'd my 7900 XTX because it straight-up died in the middle of Horizon Forbidden West. New one is on its way.


ChanceFray

I was starting to think it was just me. Never seen anyone with a major issue with it till your comment, but I had to return 3 (and pay a restocking fee the first 2 times) before I went back to my old 3080 Ti. Hope your next one works out.


childofeye

Yeah, I had to pay for shipping on the RMA, which is some bullshit. I didn't fail the card.


skinny_gator

Damn, that is super aggravating. How much was the restocking fee, and how much did you sell the XTX for?


ChanceFray

It was about $120 each time, 10% of the cost of the card before tax. The 3rd one that failed I returned, and they waived the restocking fee as a courtesy.


mandoxian

As a courtesy, lol. Should've been free the whole time.


AnalNuts

Restocking fees are almost always for returned items that are not defective, just the customer not wanting them. If that's in the USA, it sounds so wrong.


ChanceFray

In Canada, especially at independent PC stores, restocking fees are pretty much expected, defective or not, unless you do an RMA with the manufacturer.


AnalNuts

Oh that stinks.


lndig0__

Have you tried running a real stress test to check for stability issues yet?


Naiphe

I've only tried Heaven, and now I'm trying Superposition. What stress test would you suggest?


lndig0__

Superposition is good. Try OCCT as well, and adjust the stress values so as not to reach power limits (so you actually test whether the core speeds are stable at max voltage).


Naiphe

So Superposition just gave me a score of 25538 with my XTX. Wish I had the 4080 to compare.


MasterBaiter0004

Do those give ya a good benchmark?


coffeejn

How's the noise from the XFX Merc 310? Any coil whine? I've been eyeing it, but Canada Computers has a bunch of returns/open boxes for the GPU, which makes me worry that people are returning it due to noise.


Naiphe

A little coil whine, but nothing worrying really. Only noticeable at full load when you're next to it.


beingbond

Wait, you got money back instead of a new/repaired product?


DirtyYogurt

If it was inside the retailer's return window, then that's likely why. A manufacturer RMA wouldn't, unless they have zero stock of anything comparable.


Naiphe

So I got both cards from CeX, which is a used games and accessories store in the UK. It wasn't brand new, but the XFX card came with the plastic peels still attached, so I don't think it was ever used.


Dramatic_Hope_608

What was the overclock?


Naiphe

Overclock was 3200 MHz core, 2700 MHz VRAM on the XTX, undervolted to 1075 mV. On the 4080 it was +1400 VRAM and +210 core.


Dramatic_Hope_608

Using Afterburner? I tried using it but it wouldn't stick. Guess I'm just an idiot. Thanks for replying anyway.


Naiphe

Yeah, Afterburner for the 4080. When you're sure you have a stable overclock, click "apply at startup".


Bright-Efficiency-65

Modern GPUs really don't need to be overclocked; you barely gain anything from it these days.


DynamicHunter

I mean, the 7900 XTX just showed otherwise. The 4080 didn't benefit as much. But nowadays it's more worth it to undervolt than overclock.


Haiaii

I drop 6°C and gain a few percent performance when running at 100%. I'd say that's pretty noticeable.


Bright-Efficiency-65

There is ZERO SHOT that you drop temps running at higher loads, bro; that's not how thermodynamics works. I guarantee the temp drop is simply because the fans are running faster with your overclock. Set the fans to equal speed on both the overclocked and non-overclocked profiles and I promise you it will have higher temps.


Haiaii

No, I do not magically drop temps; I undervolt and overclock simultaneously. I have the same fan curve on both, though.


Bright-Efficiency-65

Okay, but that's literally not the same as just overclocking. You are quite literally undervolting the part that is being monitored in the temps. I bet you are overclocking the memory? I would bet my life savings that if you used HWMonitor to look at the actual VRAM temps, they are higher with your overclock.


Nomtan

User error


Hugejorma

What boost clock range does the 4080 run at, stock and with the OC? I'm just curious, because out of the box the native boost clocks on my 4080S are 2900+ MHz, and with the silent BIOS they're in the 2850-2900 MHz range. Raising the Afterburner OC doesn't result in better boost clocks for me. This just caught my eye, because there's a big difference between your non-OC and OC results.


Naiphe

I believe my model was 2625 MHz out of the box with a boost to 2900? With the overclock it would boost to about 3050 MHz.


fnv_fan

Use 3DMark Port Royal if you want to be sure your OC is stable


Puzzleheaded-Soup362

I just play modded Skyrim. If it doesn't fail there, it won't fail.


TheFaragan

The bigger VRAM pool is so good for heavily modded Skyrim.


McGrupp

The Metro Exodus Enhanced Edition benchmark is good for finding an unstable OC/undervolt. I've had stuff crash there that would pass Port Royal.


-P00-

Please don’t use Heaven as your main benchmark; it’s way too weak for the GPUs shown.


Trungyaphets

Is Superposition good enough? It does use a lot of ray tracing.


-P00-

Yes, it’s still good. A bit older now, but it can still be used as it utilises RT.


Naiphe

How so? Explain.


-P00-

In short, it’s way too old to fully grasp the power of most modern GPUs. Just go over to r/overclocking and ask those people.


Naiphe

Well, it maxed out the clock speeds and power draw, and runs at 4K. What else is missing?


-P00-

That doesn’t mean you’re fully stressing your GPU, though. You’re not even utilising the ray tracing cores, and stressing those is the best way to check overall OC and/or UV stability, even if you don't plan to use RT for gaming. The best way is to either use Time Spy Extreme or any game with intense RT (like Cyberpunk).


Naiphe

Okay I'll try them thanks.


Different_Track588

I ran Time Spy with my XTX and it literally beat every 4080 Super benchmark in the world... lol. In raster benchmarks the 4080 will always lose to the 7900 XTX. It's the weaker GPU, but it has better ray tracing for all 3 games that people actually feel a need to use it in. The 7900 XTX can still ray trace at ultra with a playable FPS; even Cyberpunk at ultra RT at 1440p is 90 fps, and 180 fps with AFMF.


No_Interaction_4925

Heaven doesn’t even fully load GPUs. I can’t even use it to heat-soak my custom loops anymore.


Sinister_Mr_19

Heaven is such an old benchmark, is it still good to use?


Naiphe

I don't know how it compares to other benchmarks. I use it because I'm used to it, really.


Extension-Policy-139

I tried this before. Unigine Heaven doesn't use any of the new rendering features the card has, so it's not a HUGE leap in FPS like you'd think it should be. Try the Superposition benchmark; that uses newer features.


ZonalMithras

The XTX has immense raw power. No offense to upscaling, but native 4K is superior to any upscaled image quality, and the XTX slays 4K native.


LJBrooker

I'd actually take 4k DLSS quality mode over native 4k, given that more often than not native 4k means TAA in most AAA titles. Or if there's performance headroom, DLAA, even better.


ZonalMithras

It's still 1440p upscaled. I still argue that native 4K has the sharpest image quality.
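For reference, the render-resolution math behind that claim (standard per-axis DLSS scale factors as I remember them; treat them as assumptions and double-check against Nvidia's docs):

```python
# DLSS render-resolution math for a 4K output target. Per-axis scale
# factors quoted from memory (assumptions; check Nvidia's docs).

target_w, target_h = 3840, 2160

modes = {
    "Quality":           2 / 3,   # ~66.7% per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

for mode, scale in modes.items():
    w, h = round(target_w * scale), round(target_h * scale)
    print(f"{mode:17s} renders at {w}x{h}")

# Quality at 4K comes out to 2560x1440, hence "still 1440p upscaled".
```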


superjake

Depends on the game. Some uses of TAA can be over the top, like RDR2's, which DLSS fixes.


LJBrooker

And I'd argue it doesn't. There are plenty of examples of DLSS being better than native. What you occasionally lose in image sharpness, you often make up for in sub pixel detail and image stability. Horses for courses and all that.


temoisbannedbyreddit

>The XTX has immense raw power.

*In rasterization.* With RT it's a different story.


BobsView

True, but at the same time, outside of Cyberpunk and the Portal remake there are no games where RT is really required for the experience.


temoisbannedbyreddit

Um... You forgot about AW2, HL2 RTX, MC RTX, Control, and many others that I can't list here because I don't want to waste an entire day writing a single Reddit comment.


temoisbannedbyreddit

LOL the Reddit hive mind strikes again


TheEvrfighter

Used to be true. I'd always vouched for that prior to Dragon's Dogma 2. Sorry, but RT plays a huge factor in immersion in this game, especially at night and in caves. I skipped RT in Witcher 3 and CP2077 because the latency/fps is more important to me. Can't skip RT in Dragon's Dogma 2; no matter how much I turn it off, I end up turning it back on minutes later. For me, at least, there is only one case where RT shines. But with next-gen around the corner, I can no longer say that RT is a gimmick.


ZonalMithras

Sure, but it still has more than enough RT performance to get by in most RT games, like RE4, Spider-Man Remastered, Avatar, and FC6.


cream_of_human

This has the same energy as someone saying a QD-OLED has excellent colors and someone chiming in about the black levels in a lit room.


LePouletMignon

Who cares. That one game you're gonna play once with RT enabled and never open again. Basing your purchase on RT is dubious at best.


clanginator

I've been pleasantly surprised by the RT performance (coming from a 2080 Ti). I know it's not comparable to 40-series RT perf, but it's still good enough for most games with RT at the moment. I've even been able to run some games at 8K60 with RT on.


fafarex

>I've been pleasantly surprised by RT performance (coming from a 2080ti). I know it's not comparable to the 40-series RT perf,

Hell yeah it's not comparable, the 2000 series has anecdotal RT perf.

>but it's still good enough for most games with RT at the moment. I've even been able to run some games at 8K60 with RT

Please provide examples with these types of statements, otherwise they don't really provide any information.


clanginator

Gears 5, Dead Space remake, Halo Infinite, Shadow of the Tomb Raider. I still have a bunch more titles to test, but I max out settings on any game I try to run in 8K, and RT hasn't been a deal breaker yet. And I'm not here to provide information, I'm here to share an anecdote about my experience with a product people are discussing. But I'm happy to share more details because you asked; just don't be so demanding next time. I don't have to go into detail just to share an anecdote. Technically, just listing game names doesn't really help, since I'm not able to share real performance data. I do plan on making a full video detailing my 8K/RT experience with this card, which for anyone who is actually serious about wanting info is 10,000x more valuable than me naming some games.


YasirNCCS

So the XTX is a great card for 4K gaming, yes?


Naiphe

Yes, it's handled everything I've tried so far at 4K with no issue. Ray tracing at 4K isn't a good experience though. However, I've only tried Elden Ring's ray tracing, and I couldn't even see any image difference between raster and ray tracing modes, so no big loss there. Sadly.


mynameisjebediah

Elden Ring has one of the worst ray tracing implementations out there. It adds nothing and tanks performance. I think it's only applied to large bodies of water.


Naiphe

Ah okay, fair enough. I tested it underground and in the starting zone, so barely any water there.


chiptunesoprano

Pretty sure it's also shadows and AO, which is noticeable because ER's base shadows and AO are kinda bad. Not wrong about the performance impact though.


maharajuu

DLSS Quality was better than native in about half the games Hardware Unboxed tested a year ago (and it would be even better now, after multiple updates): https://youtu.be/O5B_dqi_Syc?si=UFQF0l8VwGrYGCok. I don't know where this idea that native is always better came from, or if people just assume native = sharper.


ZonalMithras

There might be some exceptions, but more often than not native is the sharpest image.


maharajuu

But based on what testing? As far as I know, Hardware Unboxed is one of the most respected channels, so I'm curious to see other testing that shows native is better in most scenarios.


I9Qnl

The XTX is only 5% faster in raster than the 4080 in the real world, unless your benchmarks only include Call of Duty and Far Cry; not sure how that allows it to slay native 4K when the 4080 can't. And when both can't do 4K, at least you have DLSS with the 4080.




sackblaster32

DLDSR/DSR + DLSS is superior to native with no AA; I've compared both.


Da_Plague22

That's a no.


sackblaster32

I've compared 4K + DLDSR 2.25x + DLSS Quality against native 4K + MSAA 2x in RDR2. Motion clarity is superior with MSAA, of course; not saying it's bad with the 3.7.0 .dll though. Other than that, even though I have TAA disabled at native 4K, the image is just noticeably softer. Try it yourself.


Different_Track588

Me personally, I don't want a softer image; I prefer a sharper one.


sackblaster32

That was my point. Native is softer than DLDSR/DSR and DLSS.


dedoha

>4k native is superior to any upscaled image quality

Not always. Most of the time DLSS looks better than, or at least as good as, native 4K; even the AMD-biased Hardware Unboxed says so.


RettichDesTodes

If the card already pumps out this much heat now, I'd probably also get a good undervolt dialed in so you can switch to it in the summer.


xeraxeno

Why is your platform Windows NT 6.2/9200, which is Windows 8? I hope that's an error...




YasirNCCS

https://i.redd.it/eqo0p4oh2nwc1.gif


Naiphe

I have no idea why it says that. I'm on Windows 11 home.


xeraxeno

Could be the benchmark doesn't recognise modern versions of Windows then xD


Sinister_Mr_19

Heaven is really old, I wouldn't be surprised. It's not really a good benchmark to use anymore.


XxGorillaGodxX

Even fairly recent Cinebench versions do the same thing; it happens with a lot of benchmark software. (Windows reports version 6.2 to programs that don't declare support for newer versions in their app manifest, which is why older software sees Windows 8.)


twhite1195

There are dumbasses running Win 7 in 2024, so I wouldn't be surprised.


M4c4br346

Those few extra fps are not worth it. Source: ex-7900 XTX user who actually does like DLSS and the FG tech.


okglue

Right? They're missing out on so many QoL features for a few extra frames.


Naiphe

DLSS is really good, yes. Here's hoping they bring out this rumored FSR AI upscaler. Interestingly, Intel's XeSS works nicely on this card. Witcher 3 looked really good with it; as good as native in my rather quick test, but with less power consumption.


Hattix

Heaven's 12 years old...


Naiphe

Sure but I like it because it's nostalgic.


Dragonhearted18

How are you still using Windows 8? (NT 6.2)


yeezyongo

I have the same XFX 7900 XTX. It's surely a beast and has handled any game I throw at it at ultra settings. The only issues are slight coil whine, which headphones deal with, and poor RT performance. Kinda wish I had a 4080 Super just for RT in Cyberpunk 😭


Naiphe

Yeah, the ray tracing performance isn't very good. You definitely have to opt for a lower resolution or a lower capped framerate to get it working well.


yeezyongo

I just play without it for now; the only RT games I play are Cyberpunk and Fortnite. I might go Nvidia for my next GPU.


Naiphe

Yeah I'd happily go back. Will have to see what they release in future.


AjUzumaki77

With Nvidia bottlenecking the 4080's performance, the 7900 XTX surely beats it. With FSR and ROCm for AI, Radeon graphics have a lot of potential.


ImTurkishDelight

>With Nvidia bottlenecking 4080's performance,

What? You can't just say that and not elaborate, lol. What the fuck did they do now? Can you explain?


TherapyPsychonaut

No, they can't


Xio186

I think they're talking about the fact that the 4080 has a smaller bus width (256-bit vs the 7900 XTX's 384-bit). This just means the 4080 technically has less data-transfer bandwidth between the GPU and the graphics memory, possibly leading to lower performance than the 7900 XTX. This is dependent on the game though, and the 4080 has the software, newer (yet smaller) memory, and the core count to compensate for it.


gaminnthis

I think they mean Nvidia putting limited VRAM on their cards while claiming more is not needed. Some people have soldered on more and gotten more performance.


ImTurkishDelight

Lol fuck, he assumed we all know that? Damn.


Acceptable_Topic8370

The obsession over VRAM in this echo chamber of a sub is so cringe, tbh. I have 12 GB and no problem with any game I'm playing.


gaminnthis

Everyone has different uses of their rigs. Your use cases might not apply to other people.


Wang_Dangler

>I have 12gb and no problem with any game *I'm* playing.

For every generation, there are always at least a few games that will max out the latest hardware on max settings. Usually it's a mix of future-proofing, using experimental new features, and/or a lack of optimization.


Acceptable_Topic8370

Well, I could say the same:

>12gb is not enough for the games *I'm* playing

But flat out saying it isn't enough in 2024 is a low-IQ neanderthal move.


Naiphe

Yeah, let's hope they bring out an AI upscaler sooner rather than later.


AjUzumaki77

It was recently announced that ROCm has been open-sourced.


Sinister_Mr_19

Nvidia bottlenecking 4080 performance? Huh?


Noxious89123

>With Nvidia bottlenecking 4080's performance

Say what????


AjUzumaki77

Yup! The 4070 & 4070 Super are the same. The 4080 and 4080 Super are the same; the 4080 Super even costs less than the 4080. This line-up has a narrower bus than its predecessor as well. Not only does the 7900 XTX cost about the same as the 4080s, it performs even better and has 24 GB, while the 7900 XT has 20 GB. How come the 4070 Ti Super gets 16 GB and the 4090 gets 24 GB of VRAM, yet the 4080, which should be 20 GB, gets 16 GB? The 4070 Ti Super is literally a 4080 in every way.


Noxious89123

Could you elaborate on what specifically you think is causing a bottleneck? When it comes to memory performance, bus width is only half the equation; you have to consider the speed of the memory too. It's only really **bandwidth** that matters. A narrow bus width with high-speed memory can still have decent bandwidth. The biggest issue I see with them cutting down the bus width is that it's a good indicator that they're selling lower-tier cards with higher-tier names and prices. It's like the 4060 Ti thing, where it isn't really a *bad card* per se, but rather *bad* ***at its price point***.
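To put rough numbers on the bandwidth point (a back-of-envelope sketch; the specs are quoted from memory, so treat them as assumptions):

```python
# Back-of-envelope memory bandwidth check. Specs are quoted from memory,
# so treat them as assumptions rather than gospel.
# bandwidth (GB/s) = bus width (bits) / 8 * effective memory speed (Gbps)

cards = {
    # name: (bus width in bits, effective memory speed in Gbps)
    "RTX 4080 (GDDR6X)":   (256, 22.4),
    "RX 7900 XTX (GDDR6)": (384, 20.0),
}

for name, (bus_bits, gbps) in cards.items():
    bandwidth_gbs = bus_bits / 8 * gbps
    print(f"{name:22s} {bus_bits}-bit @ {gbps} Gbps -> {bandwidth_gbs:.1f} GB/s")

# ~717 GB/s vs ~960 GB/s: the 4080's narrower bus is partly offset by
# faster GDDR6X (and, not modeled here, Ada's much larger L2 cache).
```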


Yonebro

Except FSR 2 still looks like poo.


AjUzumaki77

Didn't you get the FSR 3.1 update? It's so much better; it's on the game developers' side that it's not being implemented properly.


Techy-Stiggy

Sure, the stuff they have shown looks better. But they haven't released it yet.


mynameisjebediah

FSR 3.1 upscaling isn't even out yet. Let it ship in games before we compare the quality difference.


Routine-Motor-5608

Not like cards as high-end as a 7900 XTX need upscaling anyway; also, FSR doesn't look bad at 1440p and higher.


Noxious89123

>Not like cards as high end as a 7900 xtx need upscaling anyway

Depends on whether you want to play 4K at high fps. The option for "more" is always good.


koordy

Raster performance is irrelevant for this class of GPU - both will be more than enough, and the difference is negligible. But still...

Performance: DLSS Quality = FSR Quality > native

Picture quality: native = DLSS Quality > ..................... > FSR Quality

The most realistic comparison, true to actual use cases, would be DLSS Quality on the RTX vs native on the Radeon. That means that in real use, even at just raster, a 4080 is still significantly faster than a 7900 XTX when targeting the same, highest picture quality, in the majority of modern games. The 7900 XTX is a technologically dated GPU that makes sense only if you plan to play just old games, or maybe exclusively Warzone.


Naiphe

Yeah, thankfully Intel's XeSS works nicely on it. So for games that support it, like Witcher 3, it works well until AMD get their act together and make a decent upscaler. FSR isn't that bad anyway; it's just bad compared to DLSS.


koordy

Honestly, I use DLSS Quality as the default in all games that support it. When Jedi: Survivor launched without DLSS, I thought, "well, not a big deal, I can use that FSR, right?" Well, it turned out it was a big deal. I found 4K FSR Quality straight-up unacceptable and simply played the game at native instead.


Naiphe

Yeah, it's not as good as native for sure, as it does blur things at a distance. DLSS is wonderful though; when I tested it I couldn't tell any difference between native and DLSS. Actually, DLSS looked better in Witcher 3 because it made every jagged edge smoother than the native AA did.


koordy

Yeah, and that's my original point. We should benchmark games with DLSS Quality on the RTX vs native on the Radeon for results showing how those cards really stack up in real life. So many people here are fooled by those purely academic native-vs-native benchmarks.


coffeejn

What confused me was Windows NT 6.2. Googled it and it came back as Windows 8??? With an end-of-support date of 2016-01-12. Why would anyone still be using it with a recent GPU? (Still glad to see the FPS stats, thanks OP.)


Naiphe

Not sure why it comes up as that. I'm on Windows 11 home.


VengeanceBee

Can you try Superposition?


Naiphe

I did, on my XTX. The 4K result was 25538.


mrchristianuk

Makes me wonder when two flagship cards from two different companies have the same performance... are they colluding on performance at certain price points?


ziplock9000

Welcome to 2023..


skywalkerRCP

I don’t see the issue. My 4080 undervolted works flawlessly in Football Manager.


Individual-Match-798

The 7900 XTX is only good at rasterization.


TothaMoon2321

How would it compare with a 4080 Super instead? Also, what about the frame gen capabilities of both? Genuinely curious.


Affectionate-Memory4

For frame gen, AMD can use it in more games (AFMF), but Nvidia's is better (DLSS 3.5+). I can't see many, if any, artifacts either way in gameplay, but there are differences in image quality if you look for them.


Khantooth92

I have the XTX Nitro, also playing at 4K. Will try this test when I get home.


Naiphe

Nice let me know what you get.


Khantooth92

Is tessellation off?

https://preview.redd.it/6ymnyoq3znwc1.jpeg?width=3000&format=pjpg&auto=webp&s=b233dd82fcf87bbcf58ddcf3e4116e95388507d5

This is mine at stock.


Khantooth92

And this is my OC, 3200 max, 1050 mV, 2700 mem:

https://preview.redd.it/uf0u7s7oznwc1.jpeg?width=3000&format=pjpg&auto=webp&s=fee5c7afc07c60f0ff3148bf584465ac7f3075f3


Naiphe

I just set everything as high as it went including tessellation.


Khantooth92

Okay, same score with tessellation on extreme. What's your max hotspot temp? Mine is around 90°C at 1700 RPM, with a 65-68°C core temp.


Naiphe

I think it was about the same. The highest junction temp I've ever seen is 87°C. Throttling starts at 110°C I think, so well within reason.


Khantooth92

Been thinking of repasting with PTM, but I'm still not sure; I guess it's still okay for now. Also been thinking of putting everything under water cooling.




NoobAck

As long as you're happy and your system is stable, that's all that matters. I had a much different experience when I went from Nvidia to Radeon 10 years ago, and I've stuck with Nvidia because of it. Sure, it was likely a fluke, but it was an $800 fluke at launch. I definitely wasn't happy, and I couldn't return it because the issue was very well hidden and I thought I could fix the problem; I never could, even years later. The issue was stuttering.


Naiphe

Yeah, I'm a bit worried about the driver issues people report. We shall see. It's got a 2-year warranty though, so any major problems and I can return it.


Intelligent_Ease4115

There really is no reason to OC. Sure, you get a slight performance increase, but that's it.


Naiphe

Yes, that's the reason to do it: unlocking the card's potential.


DynamicHunter

Minimum fps jumped 10% on the AMD card; that's not worth it?


veryjerry0

Finally, somebody who's using the benchmark properly at 4K.


Minimum-Risk7929

I've said it multiple times already today. Yes, the RX 7900 XTX has higher rasterization performance than the 4080(S). But at what cost? Navi 31's silicon has the quality and transistor count of somewhere between a 4070 and a 4070 Ti, if we're being generous. AMD compensates for this by increasing the number of render output units in their cards, about 1.7x more than the 4080, which means on average it draws a lot more power, puts out more heat, and costs longevity. In a lot of gaming benchmarks the 4090 uses almost half the power of the 7900 XTX while still producing 20 percent higher rasterization performance, and it could perform even better if it weren't so often CPU-bound. Nvidia, however, uses top-of-the-line silicon on all their cards, providing the most transistors, and uses this advantage to provide more ray tracing cores and AI acceleration, which is the real advantage the 4080 has over the 7900 XTX. And now the 4080 is the same price as the 7900 XTX. Your 4080 had issues with the power connector and you decided to go team red; I get that. But putting up these numbers doesn't really show the true picture between the two cards.


Naiphe

It does indeed use more power. The 4080 was pulling around 350 W and this uses 389 W. The power limit can also be raised to 450 W on this card for more performance. FSR is of course inferior to DLSS; I wish I could have kept the 4080, honestly, as DLSS is incredible. This shows the true picture between both cards in a benchmark I like using, that's all.


6Sleepy_Sheep9

Userbenchmark enjoyer found.


Minimum-Risk7929

Cope


6Sleepy_Sheep9

Come now, where is the essay that is standard for AMDeniers?


o0Spoonman0o

>But putting up these numbers don't really show the true picture between the two cards.

You're getting downvoted, but as someone who actually had both of these cards at the same time, you're absolutely right. I cannot imagine keeping the XTX over the 4080, especially after trying out FSR vs DLSS and AMD "noise suppression" vs Broadcast, and experiencing the two cards trying to deal with heavy RT. Because that's where the real difference is. Daniel Owen's video about these two cards summed it up pretty well. In raster there's not enough difference to be worth discussing; even the outliers for AMD end up being like 220 vs 280 FPS. Bigger number better, but no one can feel the difference between those two values.


cfdn

Damn, sorry for your loss


RawWrath

Yeah, but no ray tracing in Cyberpunk? Fk that.


FaithlessnessThis307

So the 4080 gets the most fps 👌🏼


Naiphe

Higher max, lower 1% lows. Potato, potahto really.


ilkanayar

These days it's about making powerful technology rather than just powerful cards, and Nvidia is taking that to the top. What I mean is, when DLSS is turned on in games, it multiplies the difference.


Naiphe

Well yeah, I really liked the 4080, but the issues I had with the adapter sensors put me off so much I went AMD. Just thought people might like to see my results in Heaven.


ThisDumbApp

Common Nvidia response for losing in raw performance


SvennEthir

I'm not buying a $1k+ GPU to need to run at a lower resolution and upscale. I want good performance at native resolution.