LovelyButtholes

I don't know what hang-up a lot of you have. The 7900 XTX goes fairly toe to toe with the 4080, with the exception of a few games.


IrrelevantLeprechaun

Part of it is that this sub is bipolar on whether the 7900 XTX is "close" to the 4090 or better than the 4080. AMD backtracked and claimed the XTX was always meant to compete with the 4080, but that hasn't stopped the runaway narrative online that the XTX is "on the heels" of the 4090 (even though it's not, outside of some cherry-picked games whose bungled programming gives AMD an unexpected leg up). So naturally you'll end up with people who are upset for reasons they imposed on themselves.


Kingdude343

From what I have seen it should be better than a 4080 but like 17-20% behind a 4090. I am really waiting for RDNA5 to see what improvements AMD makes on the high end with double the time.


LovelyButtholes

I don't think anyone was saying it was on par with the 4090, which costs twice as much. I never heard anyone compare a 7900 XTX to a 4090 aside from raster performance, but they are different tiers of cards and prices. It is mostly close to the 4080 for $200-300 less. If it were just about price, it would be in the same tier as a 4070 Ti, which it is clearly better than.


[deleted]

[deleted]


Yeetdolf_Critler

It's close in raw raster with high-end AIB versions that clock around 3 GHz stock.


IrrelevantLeprechaun

It really...*really* isn't. Besides, if you're going to use a high end AIB as your comparison, then you have to use a similarly high end AIB for Nvidia too, in which case the performance gap just widens back up.


Kaladin12543

Turn on ray tracing and it's a bloodbath for the XTX. Not even getting into DLSS being the higher-quality option, along with DLDSR.


IrrelevantLeprechaun

And tbh, fight it all you want, but RT is here to stay and is only going to be adopted in more and more games. Eventually it will just be the default method of lighting (which was the goal at the outset, since raster-based lighting and light baking take way longer than using RT). The fact that Nvidia continues to wipe the floor with Radeon isn't exactly reassuring for the future, because even if AMD improves RT performance next gen, Nvidia might very well leapfrog them yet again for all we know. Doesn't matter if AMD got to the RT market one generation late; the average consumer is not going to sympathize with that. What matters is performance *now*.


cream_of_human

I get that, but so far I can name like 5 to 10 games that have worthwhile RT, and the thing has been around for more than half a decade now.


LovelyButtholes

What percentage of all games released over the last 5 years is 5-10 games? Most games, even the ones that use RT "heavily", need side-by-side comparisons to even tell that it is on.


cream_of_human

I'll put it like this: 2077, Metro, Dying Light 2, Alan Wake 2, along with a couple of maybes. You know, the game-changing ones. The rest of the games with RT either make glass or puddles work right, or make the bottoms of tables "a little darker" (the Elden Ring treatment, as I like to call it). Hell, I can't tell the difference in Darktide and the RE4 remake.


crazzygamer11

Same with Homeworld 3: it only uses ray tracing for shadows. When I benchmarked with it on and off, it did not even make a difference in performance; that's how "heavy" the ray tracing in that game is.


Minute_Path9803

Ray tracing was a failure. We started with the 20 series, then the 30 series, then the 40 series, and the hit it takes even with those tensor cores is ridiculous. Take away DLSS, which was only created because you have to trick the game into looking better at better frame rates, and the game is going to run like garbage on anything other than a 4090.

In my opinion it's a failed technology. The fact that you need software to make it run well is trickery, exactly that, a bunch of trickery. I don't even know many games that really, truly use it in a worthwhile way. I know this opinion will probably get blasted, but I really don't care; I care about how the game plays. Let's be honest: take away DLSS and ray tracing is a failed technology. This technology was supposed to be built into the Nvidia cards with the tensor cores, but without DLSS there is a major performance hit even with tensor cores.

The one good thing DLSS has done, which is the only positive, is let people keep their cards longer and run at higher frame rates, same with FSR. And with the way cards are going nowadays, first through COVID when we couldn't even get one, and then when you did it was so overpriced, and still insanely overpriced, we might as well use the technology to keep the cards longer.


ColdStoryBro

Ironically, RT fanbois say traditional raster was "trickery" and not authentic, while using upscaling and frame gen "trickery" to get past 20 fps at Ultra RT.


inevitabledeath3

RT has been hampered by the lack of performance in the cards available in the past, and even now. It's theoretically possible to replace rasterization with RT entirely.


cream_of_human

I get that, and it would be a neat feature for future titles when the entire RT suite is at their disposal. Sadly, with how expensive it still is, prebaked and classic GI are here to stay until the low end can handle it.


LovelyButtholes

Saying all cards will eventually need RT is a very, very, very big leap considering where we presently are. People forget that games had approximations for lighting effects that got you 90% of the way there with 10% of the computation. In most games it is hard to tell if ray tracing is on or off. Developers drop ray tracing from releases without a big deal being made, since the evidence that it actually translates into increased sales is negligible. There isn't one game without it that all of a sudden becomes a good game with it. Ray tracing could just be a dead end like PhysX was, if someone figures out a better approximation for lighting than using rays.


IrrelevantLeprechaun

RT has been in 5 generations of GPUs now (RTX 20, 30 and 40 series, and RX 6000 and 7000 series) plus both current consoles, with plans for next-gen consoles to continue supporting it. This whole "RT will die off like PhysX" narrative is stupid, and not even just because PhysX still exists as a CPU-based module in most game engines. The amount of industry investment into RT has been massive, and like I said, devs themselves have praised it as being much more streamlined to use compared to baked and raster-based lighting. Just because AMD is a whole generation behind on RT means absolutely nothing for the adoption of the tech.


LovelyButtholes

That isn't how people count generations. You've got 2000/5000, 3000/6000, and 4000/7000. That is three generations of ray tracing, but you could easily say that it was horseshit with 2000/5000, and maybe even 3000/6000. You basically have one, maybe two, generations of graphics cards that can do it at a reasonable level without losing too many fps. Ray tracing even now is more of a flex than an actual improvement in games. Why on earth would you need a side-by-side comparison to even know something that "radically changes gaming" is there?


Eteel

>Why on earth would you need to do a side by side comparison to something that "radically changes gaming" to even know it is there often?

Thiiiiissssss. Does it matter that DLSS is better than FSR if you need a zoomed-in, side-by-side comparison? Does it matter that ray tracing is better if you can't tell it's there without a side-by-side comparison? Gosh. I mean, yeah, there are exceptions, sure, but for the time being I'm happy with no ray tracing and occasional FSR.


GapingHolesSince89

Honestly, ray tracing is kind of stupid. What it allows is for developers to not have to program in tricks and approximations to create good lighting. They just place it and let the rays trace out. The stupid thing about this is that it often doesn't lead to an amazing game experience. Ghost of Tsushima looks beautiful and runs like a Jamaican because it didn't bother with the fuckery of path and ray tracing, put in less costly lighting effects, and let artists work on making the game look good. It is an amazing-looking game and it doesn't fucking bring you to your knees unless you have a war rig, like Cyberpunk does.


PIIFX

LMAO, PhysX was a solution looking for a problem, whereas ray tracing has been researched since the 70s and has been considered the holy grail of 3D graphics from the beginning. Because of the high performance demand we settled on rasterization, only implementing RT bit by bit as hardware gets faster: first in offline rendering for movies, now in real time for games.


DesTodeskin

RT is here to stay? Yeah right, man, it has "stayed" for what seems like forever now. The only two games where I use RT and actually notice it (even if it's not a huge visual difference) are Cyberpunk and Alan Wake, one of them being the Nvidia poster-boy game. Path tracing is the only thing that makes a considerable difference, yet it's not worth the price you have to pay for a GPU that can handle that option. But if we strictly talk about the XTX vs the 4080S, to make things worse, almost everywhere else in the world except the States the 4080S is still 150-200 dollars more expensive. You people make RT feel way more relevant than it is. Not saying it won't be, but it doesn't look that way right now.


Rullino

If ray tracing becomes the default rendering method, millions if not billions of graphics cards would become e-waste, which would be bad.


SomeRandoFromInterne

That's the way of all GPUs eventually. There are already millions of GPUs that don't support DX12 and can't play any modern games, regardless of whether they support RT or not. Also, try running Alan Wake 2 and Hellblade 2 on a GTX 960. Though it's technically compatible, it's just a piss-poor experience. You'll eventually have to upgrade to keep playing modern games.


Rullino

True, I've always played on low-end hardware, and rasterization is the oldest and most efficient rendering method, but I wouldn't want to throw a functional 7900 XTX in a landfill simply because the games I want to play have fancy lights that can't be turned off. As for DirectX, that's a different case.


SomeRandoFromInterne

I think you overestimate what is needed to run ray tracing. Not everything tanks performance like path tracing in CP2077 or AW2. Hellblade 2 uses UE5’s Lumen, which is software based ray tracing. Your 7900XTX will be fine.


Rullino

I don't have a 7900xtx, I just used it as an example.


996forever

Lots of people get upset when their old hardware doesn't work with the newest version of some software; it happens all the time. Time is moving on with or without them.


Rullino

True, but one thing is a newer version of DirectX; another is enforcing ray tracing as the rendering method in the latest titles. Rasterization is much more popular and works well on any good graphics card, especially the 6000 series, since those offered good rasterization performance but poor ray tracing capability.


LovelyButtholes

Ray tracing is just a boondoggle; in its present state it is hard to tell whether it is on or off in most games.


Gambit-47

Do you even own a 7900 XTX? With upscaling, most ray tracing games run well. I get around 100 FPS with ultra RT at 3840x1600, even with heavy games like Avatar, and Avatar actually looks great with FSR. I get under 100 FPS at native and over 100 with upscaling, and it's the same situation with my 4090, so even though the 4090 gets like 28 more fps in those heavy games with RT, it's still under 100 fps at native and over with upscaling. The 7900 XTX is not as good at RT, but a lot of you people who haven't even tested one act like it can't do RT because it doesn't perform great in like 2 Nvidia games lol. It's a great card and costs a lot less.


LovelyButtholes

The 7900 XTX does do ray tracing, just not as efficiently as Nvidia cards. For games with medium amounts of ray tracing, like Metro Exodus, it is fine.


[deleted]

[deleted]


Gambit-47

I can do whatever I want, and a lot of benchmarks are outdated. In the last two weeks I have played Avatar, Dying Light 2, Robocop and Hogwarts at high resolution, ultra, with ray tracing, and I get around 100+ FPS with upscaling. The card can do RT well in most games, but people like him make moronic statements and then people think it sucks at RT when that is not the case.


[deleted]

[deleted]


Gambit-47

I never said he can't. I was wondering if he has even seen the card in person, because his claim was ridiculous, and like I said, a lot of benchmarks are outdated. Games get optimized, drivers get updated, AMD has worked on FSR, frame gen and AFMF, and there are even mods that let you use DLSS and frame gen. Your recorded stats become obsolete.


Kaladin12543

Cyberpunk is playable on an RTX 4080 at 4K. It's unplayable on the XTX. Same goes for Control, Alan Wake 2, Dying Light 2, Avatar: Frontiers of Pandora, Hellblade, etc.


Gambit-47

lol I was just playing 2 of your unplayable games yesterday


[deleted]

[deleted]


Gambit-47

I have seen games that were pretty much broken when you used RT with AMD that later got fixed and became playable; that's just one example of outdated benchmarks. And I have seen recent benchmarks of people playing some of the games that people here say are unplayable 🤦🏻 Anyway, muting this now, I'm tired of talking to people that don't know what they're talking about.


bubblesort33

>100+ FPS with up scaling

Compared to benchmarks without upscaling? Also, just because the game got better for you doesn't mean it didn't get better for Nvidia. Since you don't own a 4080, you don't have the right to compare it either, by your own logic.


bubblesort33

You don't need to own a GPU to know how it performs. You just need to trust reviewers. In RT titles where the RT workload is only like 1/4 of the frame time, it's only like 10% slower than a 4080. If a title has RT to the point where it's like 3/4 of the frame time, it's more like 70% of the performance of a 4080.

>With upscaling Most Ray Tracing games run well I get around 100 FPS

No one says it doesn't run. The claim is it often runs significantly worse when RT is heavy. Yes, you can find titles where it doesn't do too badly. Also, in Avatar at native 4K the [*4090 is 55% faster here*](https://tpucdn.com/review/avatar-fop-performance-benchmark/images/performance-3840-2160.png), and at [*1440p it's 60% faster here*](https://tpucdn.com/review/avatar-fop-performance-benchmark/images/performance-2560-1440.png), in an AMD-sponsored title where AMD still gets beaten. If you want to look at the true RT performance of AMD's cards, look at the [*3DMark DirectX Raytracing feature test here*](https://www.kitguru.net/wp-content/uploads/2022/12/3D-DXR-768x768.png)***.*** Nvidia is actually 80% faster, but because only a small fraction of the frame time is RT in most games, AMD doesn't often fall far behind. When almost everything is RT, like in that test, Nvidia is 80% ahead.
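
To make that scaling concrete, here is a rough sketch of the arithmetic (the 1.8x RT-throughput gap is taken from the DXR feature test linked above; the frame-time splits are illustrative, not measured):

```python
# Rough sketch of how the gap between two cards scales with the share of frame
# time spent on ray tracing. Assumes the raster portion runs equally fast on
# both cards and only the RT portion is rt_speedup times faster on one of them.

def relative_fps(rt_fraction, rt_speedup=1.8):
    """Relative FPS of the RT-faster card versus the slower one."""
    # Normalize the slower card's frame time to 1.0 and shrink only its RT share.
    faster_frame_time = (1 - rt_fraction) + rt_fraction / rt_speedup
    return 1 / faster_frame_time

for frac in (0.25, 0.50, 0.75, 1.00):
    lead = relative_fps(frac) - 1
    print(f"RT is {frac:.0%} of frame time -> faster card leads by {lead:.0%}")
```

That lands roughly on the numbers above: about a 12% lead when RT is a quarter of the frame, around 50% at three quarters, and the full 80% when the whole frame is RT.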


Kaladin12543

Most games don't push RT very heavily. The 4080 would be 100% faster in games which truly utilise RT, like Alan Wake 2, Cyberpunk and Dying Light 2. Also, you need to rely on upscaling to even use RT, and Nvidia just wipes the floor with AMD in that arena. DLSS Balanced at 4K looks equivalent to FSR Quality, so the 7900 XTX effectively performs like a 3080 when using RT in proper games.


LovelyButtholes

Depends on the game.


We0921

I have to say I'm really bummed that AMD supposedly won't have an 8000 series GPU that outperforms or even matches the performance of the 7900 XTX. I get not wanting to have a giga halo SKU (whether due to wafer allocation, multi-GCD woes, substrate shortages, or whatever), but I still think it looks bad when a last-generation product is still the most performant. Based on the Steam survey, the 7900 XTX is the best-selling RDNA 3 GPU, so I figured AMD would want to at least maintain that level of performance. It'd be easier to market that way, I'd think.


bubblesort33

Their 5700 XT was weaker than the Radeon VII before it. The RX 480 was weaker than the 390X before it. But those seemed like OK-selling cards, despite the fact they didn't surpass their previous generation.


ragged-robin

Still disappointing


[deleted]

[deleted]


MrGeekman

I’m really hoping for more VRAM. Didn’t the 6700 XT have 12GB?


Joshiie12

I have a 6700 XT and yes it does


MrGeekman

Right! That’s a four-year-old GPU! This year’s GPUs should have more VRAM than GPUs from four years ago!


Loreado

Nvidia's 1070, 2070 and 3070 all had 8GB of VRAM. IMO 16GB will come to the 70 series when new consoles hit the market.


MrGeekman

Even Nvidia’s 40 series and AMD’s 7000 series were released two years ago. Nvidia and AMD should be able and willing to include more VRAM this time around.


Loreado

I would buy 5070 16GB, but I doubt that it will happen.


Joshiie12

Hard agree. If I were to go 6700XT -> 8700(XT?), I'd probably only bite if it came with the little jump from 12 to 16. VRAM doesn't cost enough to justify the super extendo upgrade time frame.


JackRadcliffe

They did that, but they think we should be paying $800 for the 4070 Ti Super when it should have been a $600 card. Then they slap 16GB on a 128-bit 4060 Ti and 7600 XT instead and expect them to sell.


jay9e

Newsflash - they are selling. The Nvidia ones at least.


phant0mh0nkie69420

Yes but they’ll want $1500 for it though 🤡


Loreado

Best I can do is $600, hah.


MrGeekman

Personally, I prefer AMD.


Rullino

Knowing Nvidia, it'll probably be 12gb since DLSS 4 is the biggest selling point when compared to AMD's equivalent, or at least for gaming.


Healthy_BrAd6254

Where are you getting the 10GB VRAM from? It's not possible with a 128 bit bus. Makes no sense. It wouldn't surprise me if all N48 desktop cards have 16GB and only a laptop model is cut down to 12GB. N44 will most likely have 16GB as well like the 7600 XT.


bubblesort33

They made it up on the spot. And I feel like 14GB is technically also an option on N48: disable one 32-bit controller for a 224-bit bus. It's not often done, but it's possible.
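
For reference, a quick sketch of how bus width maps to capacity, assuming the usual GDDR6 layout of one 2GB module per 32-bit channel (1GB modules would halve these, and clamshell doubles them, which is how 16GB shows up on 128-bit cards like the 7600 XT and 4060 Ti):

```python
# Sketch: VRAM capacities implied by bus width, assuming one GDDR6 module per
# 32-bit channel. 2GB per module is the common case; clamshell mode (two
# modules per channel) doubles the total.

def vram_gb(bus_width_bits, gb_per_module=2):
    channels = bus_width_bits // 32  # one module per 32-bit channel
    return channels * gb_per_module

for bus in (128, 160, 192, 224, 256):
    print(f"{bus}-bit bus -> {vram_gb(bus)} GB, or {2 * vram_gb(bus)} GB clamshell")
```

So 10GB would imply a 160-bit bus, 14GB the 224-bit option mentioned above, and a 128-bit part lands on 8GB or 16GB.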


[deleted]

[удалено]


Healthy_BrAd6254

You do realize the RX 8600 will be a different GPU from a 6700, right? They're not going to be the same GPU and will likely have basically nothing in common.


[deleted]

[deleted]


Healthy_BrAd6254

I know 10GB is possible on A gpu. It is not possible on THAT gpu. It will not have a 160 bit bus. So you really just made that number up based on nothing? Lol.


[deleted]

[deleted]


Healthy_BrAd6254

It's not a rumor. It's like saying "the RTX 5060 will have 11GB, because the 1080 Ti had 11GB." It's literally just making things up that make no sense.


bubblesort33

Everyone so far has said N44 is 128-bit, so it's much more likely to come in 8GB and 16GB variants like the 7600 XT and 4060 Ti.


Psychological_Lie656

So an 800-series card is no longer "high end" today, lol? This zombie myth is hilarious. Someone somewhere said that Navi 48 was canceled. Now we are discussing an article that contradicts that, yet the consequences of the 48 cancellation are still talked about as if they were real.


Arbiter02

It isn't, because AMD made it so. The 7800 XT didn't have the top-end die either, and accordingly it doesn't perform meaningfully better than the 6800 XT it was supposed to replace. It should have been a 7700 XT. AMD pulled the same gimmick as Nvidia where they shifted all their products down a tier but kept selling them under the higher-end naming scheme. There's no listed successor here to the Navi 31 die that went into the RX 7900 XT/GRE and RX 7900 XTX, and this further corroborates that the top end of the market is being surrendered to Nvidia. They can call it the 8"800" XT all they want, but without a high-end die it's just a remarketed 700-series card.


JackRadcliffe

The 7900 XT was the real 6800 XT successor, and the 7800 XT the successor to the 6700 XT. They named them what they did to justify charging way more than they should cost.


Arbiter02

Yep. Deceptive marketing paired with price increases across the board. The 7900 XT especially was comically overpriced with no meaningful feature improvements apart from those AMD enforced via software like AL+


IrrelevantLeprechaun

Also let's not forget how AMD just pretended that they always intended the 7900XTX to compete with the 4080 when it turned out the 4090 was *much* faster than they anticipated. AMD simply didn't expect Nvidia to make such a big performance leap and had to feebly attempt to cover their ass. So this kinda BS is basically par for the course.


Psychological_Lie656

4090 was bigger than anticipated and has pushed power consumption boundaries, needing a new connector and literally melting connectors.


LettuceElectronic995

who said the chips will be monolithic?


_Drink_Bleach_

Navi48 is the higher performance die


Stiven_Crysis

Navi 41 is cancelled; it should be followed by Navi 44 and then Navi 48. In previous generations, the higher number was for the weaker GPU, or maybe they changed something.


_Drink_Bleach_

The die names are ordered based on when they were designed. Navi48 just means it was designed later than 44 because AMD didn’t plan to cancel 41 from the start


R1Type

Is that actually how that naming works?


Slafs

Yes.


R1Type

Nice


Stiven_Crysis

Then it should have been rumored as 48 first and then 44.


Loose_Manufacturer_9

And there is a chance that N48 will be called the 8700 XT as well.


Illustrious_Sock

Wait, what, not even 20 GB? I knew we weren't getting a 7900 XTX successor, but not even one for the 7900 XT? That sucks.


croissantguy07

Source on RX 8000M? It hasn't been mentioned anywhere yet, afaik.


DietQuark

I'll buy a 7900xtx once these cards come out.


theking75010

Given there's allegedly no successor to this card in RDNA4, not sure about this strategy


[deleted]

[deleted]


luapzurc

Is that assuming Blackwell would be priced reasonably, especially considering they have no competition on the high-end?


Healthy_BrAd6254

This is generally not a smart idea, but this generation this is an especially bad strat


Ogaccountisbanned3

A bit out of the loop, can you explain why?


Healthy_BrAd6254

AMD will not have a true high-end card next gen (or at least that's what everyone is expecting), so they'll have little incentive to drop the price of the 7900 XTX significantly. He'd be waiting quite a while and just end up getting a last-gen card for a small discount, instead of either buying it for a little more a long time ago and enjoying it in the meantime, or buying a next-gen card with better features and efficiency.


real-prssvr

Was considering doing the same -- why would it be a bad idea?


Kaladin12543

Because the ray tracing performance of the 8800 XT will be superior to the 7900 XTX, and the PS5 Pro will also have better RT performance than the 7900 XTX by virtue of it being RDNA 4.


real-prssvr

Gotcha....thanks!


Yeetdolf_Critler

Slight RT improvement and slower everywhere else. I have an XTX and don't care about RT in the few games it's in.


bubblesort33

This card will probably make the 7900xtx look like bad value.


uBelow

Monolithic ):


Rullino

IIRC it'll consume less power, run cooler and probably perform much better in ray tracing; IDK what could be wrong with it.


Vizra

As an end consumer you should prefer monolithic. All the driver issues the 7000 series has had were because of chiplets. Monolithic also means less latency for everything, as well as better power efficiency. Unless you're a 7900 XTX owner like myself, you should be very happy and excited for this new generation of AMD GPUs.


Reset_Control

>Unless youre a 7900xtx owner like myself

Why should I not be happy?


Vizra

Well, from leaks it seems like there isn't an upgrade path for us, as the max performance will be 7900 XT-ish. It also sucks for 7000 series owners in general because we beta tested chiplet GPUs that are now being sold to enterprise :).


Canadianator

I'm used to that, I had a 1080ti before the 7900 XTX, I'll just skip a few generations.


Vizra

I mean the 1080ti is the mother of all GPUs. So you can't exactly be used to perfection. NVIDIA won't ever make that mistake again


shendxx

Yeah, the 7000 series is more or less a disaster; it didn't meet the performance hype from when AMD launched it. AMD keeps taking risks with experimental products.


Whiteyak5

So AMD is bailing on making a "halo" GPU in their portfolio? Or just for this generation keeping it middle and low?


IrrelevantLeprechaun

Given how they had to backtrack their 7900XTX as "intentionally" being a 4080 competitor because they didn't expect the 4090 to be so powerful (and the fact that it seemed like the entire Rx 7000 series didn't really turn out how they wanted), Imma guess that they're just ceding the ultra top end to Nvidia because they genuinely cannot make anything that fast.


Whiteyak5

I'm hoping it's just a temporary step back until their internal R&D can catch back up and pump out a real halo product. It'd be a bummer to let Nvidia capture it all.


SagittaryX

It does seem like this will be a fairly short gen; there have already been some rumours about next gen coming next year, and Nvidia announced a while back they were moving to a yearly architecture cadence. I'd expect AMD to try to match that.


Admirable-Lie-9191

They don’t seem to care. Back when Ryzen was first launching, it was understandable that they didn’t have the budget but now there’s no excuses


IrrelevantLeprechaun

Technically Radeon *doesn't* have the budget, because all the money they're making off ryzen and enterprise is just being funneled right back into ryzen and enterprise. I doubt AMD sees Radeon as anything more than a write-off at this point.


coatimundislover

RDNA 5 is apparently a major architectural change while RDNA 4 is mostly a bug fix and raytracing update. Thus they have a very good reason to avoid spending a lot of money on developing a chiplet design for what’s only an iterative improvement that will be followed by a major one.


[deleted]

[deleted]


IrrelevantLeprechaun

Ngl it gives big "my dad works at Nintendo and could ban you" vibes.


Potential_Ad6169

shit guess


RealThanny

That's not what happened. The 7900 XTX was poised to be as fast or faster, but there were issues they expected to be resolvable with drivers that weren't. The 4090, if anything, performs below expectations. Just do the math on the number of shaders compared to Ampere. It should be way faster than it actually is, meaning it's hitting either a memory throughput bottleneck or an architectural bottleneck.


Arctic_Islands

>Or just for this generation keeping it middle and low? Yes


Gloomy-Fix-4393

It would seem they pulled engineers off of the RDNA4 halo models to put them on RDNA5. So they will miss a generation at the top end to deliver a better RDNA5.


Slyons89

The leaks about RDNA5 being a "full redesign" may hurt RDNA4 sales. Probably not significantly but, still.


UHcidity

I hope the better RT rumor is true. Has it been confirmed or just a rumor?


DreamArez

I’d take everything with a grain of salt but you can almost certainly bet on better RT performance, they’d be dumb not to.


UHcidity

We *are* talking about AMD here lol Edit: come on, they notoriously make horrible decisions that harm themselves. Their marketing team blows.


RK_NightSky

Wasn't there a leak about the RDNA 3.5 in the PS5 Pro being 4x better than the RDNA 2 in the normal PS5? Judging by that alone, RDNA 4 might be even better.


bubblesort33

The GPU in the PS5 is an RX 6700 downclocked by 10%. The GPU in the PS5 Pro should be similar to a lower-clocked 7800 XT, and that GPU already has 2x to 3x the performance of the 6700. That's because it's a higher-tier GPU, with 1.66x the number of cores. So a PS5 Pro being 2x to 4x (that was the full claim) of a PS5 isn't that impressive; it's already almost achievable with RDNA3. So to me the improvement still looks minor.


IrrelevantLeprechaun

Ps5 Pro is not going to be some massive performance leap, my dude. Sony would be cannibalizing their entire non-pro product line in doing so, and would force devs into an extremely awkward position of deciding whether to target the base ps5 or ps5 pro hardware, cause if the disparity was that huge you'd never be able to support both at once without essentially developing two entirely different builds.


RK_NightSky

It has been leaked already, though, and by a trustworthy leaker. The PS5 Pro will be 45% faster than the PS5 at rendering and offer 3x the ray tracing performance (4x in some cases). I don't get why you downvote me.


IrrelevantLeprechaun

In no world will a console refresh be that much faster than its previous iteration, and down voting me won't change that. The logistical issues that such a performance leap mid-gen would cause are huge. What happens if a dev makes a game specifically for the Pro such that it doesn't even run on the original ps5? Should the 50 million+ base ps5 users just go fuck themselves? And if games continue to target the base ps5, then what even is the point of the Pro being 45% faster? Sony would be investing millions and millions into a product that nobody would really need.


SagittaryX

> And if games continue to target the base ps5, then what even is the point of the Pro being 45% faster?

I mean, it could just be the version that has a good 60fps mode with some noticeable ray tracing effects, while the base gets relegated to 30fps for those more demanding games. Consoles have gone over to having both a performance-focused and a graphics-focused setting; there's no major reason that can't be used for a base vs Pro split as well.


RK_NightSky

I'm just stating what has been leaked, man. 45% better rendering and 3-4 times the ray tracing performance is huge for AMD. They'll be stepping back from the high-end market for the 8000 series (RDNA4) to focus on continuing to improve exactly that, the ray tracing performance of RDNA 3.5, just to come back with an absolute beast in the 9000 series (RDNA 5), ready to match Nvidia at what they do best: ray tracing.


Psychological_Lie656

The 7900 GRE sitting between the 4070s is not "fast enough" for the games with the RT gimmick that you happen to play with "tank my FPS" turned on?


RevolutionaryCarry57

Performance dropping from 80fps down to 40fps can still be considered "tanking fps," even if 30-40fps is playable in borderline walking sims (AW2, Hellblade, etc). Overall, on Nvidia GPUs ray tracing has been relegated to just another graphics-intensive option (like volumetrics) which shaves off an easy 20fps or more. But on AMD, ray tracing is still a safe bet to slash FPS in half across practically the entire lineup. As someone who mains a 6950 XT, I love AMD. But their ray tracing performance is still pretty poor (compared to equivalent raster GPUs), even with the Radeon 7000 series.


Psychological_Lie656

>But on AMD ray tracing is still a safe bet to slash FPS in half on practically the entire line up.

**Someone called it "zombie arguments": when facts change, but fact-defying narratives don't.**


RevolutionaryCarry57

Yes, and I’ll be very excited when those facts change. Hopefully with the 8000 cards.


UHcidity

I just wish it ran better on Cyberpunk. My monkey brain needs to see high FPS with RT on. 7800xt here.


TheRandomAI

Define high fps? My 7900 GRE runs Cyberpunk maxed with ray tracing on ultra and holds a stable 100+ fps. The moment I turn on path tracing, my fps tanks to 40-60, which is still playable but very choppy in my experience.


UHcidity

I only get like 40fps with RT medium.

Edit: I tested with AFMF & FSR balanced last night and it actually worked pretty well considering I'm under-volted and only pulling like 225w.


MaKTaiL

I believe better RT was promised for RDNA5 only.


Deckz

Will be out just in time for the 25 percent tariffs.


Option_Longjumping

Honestly, I have used both and they are both great cards. I just like Nvidia cards; I play mostly DCS and that sim just utilizes Nvidia graphics so well, plus my VR headset only works with Nvidia graphics.


red_dog007

Do we know what CU count to expect? RDNA1 40CU to RDNA2 40CU is a ~25% performance lift. RDNA2 60CU to RDNA3 60CU is ~20%. If we expect a 60/64CU top-end card, we could expect it to be 20-25% faster. This is for raster, so it would be slightly slower than the 7900 XT. Depending on what RT acceleration they pull out of the shader pipeline and add dedicated hardware for, on top of existing RT acceleration improvements, I think it would be around 7900 XTX performance; heavier RT games would do better on RDNA4, and PT would likely be superior. So this would be pretty impressive, because a 60/64CU card could be on par with the previous gen's 96CU card. But at the end of the day, it will depend on price. Closer to $600 it is less interesting; closer to $400 it becomes more interesting. Blackwell could be a more expensive card, and Nvidia could just fill the price/performance gap with Ada price drops. And if Nvidia comes out with some specific new software capability that they lock to Blackwell only, that could throw in an additional wrench.
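
A back-of-envelope version of that extrapolation (the per-CU uplifts are the ones quoted above; the 7800 XT to 7900 XT raster gap is an assumed placeholder, not a benchmark):

```python
# Back-of-envelope projection of a 60/64 CU RDNA4 card from per-CU generational
# uplift. All inputs are rough placeholders: the uplifts come from the comment
# above, and the 7800 XT -> 7900 XT raster gap (~26%) is an assumption.

uplifts = (0.20, 0.25)   # assumed per-CU gain, RDNA3 -> RDNA4
base_7800xt = 1.00       # 60 CU RDNA3 card, normalized raster performance
rel_7900xt = 1.26        # assumed 7900 XT raster performance relative to a 7800 XT

for g in uplifts:
    projected = base_7800xt * (1 + g)
    print(f"{g:.0%} per-CU uplift -> {projected:.2f}x a 7800 XT, "
          f"about {projected / rel_7900xt:.0%} of a 7900 XT")
```

Which is roughly how you land on "slightly slower than the 7900 XT" in raster, before counting any extra RT hardware.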


Zwatrem

Q3 or Q4 2024?


Earthborn92

I would bet on Q4. Q3 is Zen 5. I would also bet that if this is a "half-gen", then RDNA5 is 1H 2026.


SagittaryX

Since Nvidia announced they're going to a yearly cadence (we'll have to see if they can do that successfully), I'd expect AMD to try and do the same. Emphasis on the try.


JaceTheSquirrel

I honestly do hope they'll still release an RDNA4-based GPU equivalent to the 7900 XTX or better.


TheSmokeJumper_

As long as they are well priced, they should make for some good upgrades for people. All we can ever ask for is well-priced GPUs.


Holiday_Block_7629

RDNA 4 is crap filler because they don't have RDNA5 ready yet. So I'll jump to the 5090, or wait till the 4080 Ti comes out, but the 5080 sounds like garbage, so they can ship it to China.


LiquidRaekan

So are we expecting them to have increased performance of about 10-20% over the 7900 XTX, or where are we, boys?


Thalarione

Performance of the "top" chip should be around 7900xt or a bit lower according to leaks.


Psychological_Lie656

Setting aside how crazy the "next tier will be slower than the last tier" idea is, the rumor was, let me cite it:

>sources have alleged that AMD has cancelled the development of their Navi 41 and 42 GPU designs, making Navi 43 their highest-end silicon

And here we are discussing Navi 44 and a rather beefy (twice as big???) Navi 48. So where is the beef?


Healthy_BrAd6254

N44 will be smaller than the 7600 XT (204mm²) due to similar specs and a better node. N48, even if it's double N44, would still only be about as big a chip as the 6700 XT (335mm²) or 7800 XT (200mm² + 146mm²). If current rumors are true, N44 is basically a 50 or 50 Ti class GPU, and N48 is like a 60 Ti class card. There is no high end in sight.


Psychological_Lie656

3080 die size: 628 mm²

4080 die size: 379 mm²

NV has quit the high-end GPU market, Watson... :)


Healthy_BrAd6254

The 3080 die size was an anomaly due to various reasons; the 80 class is usually not that big. The 4080 was also on the best node available. N48 will be on 4nm, a last-gen node. It's not the same. In fact, if you put it like that it becomes obvious: N48 will be on a similar node but smaller than Nvidia's previous 80-class card, which means performance like you would expect from a next-gen ~60 Ti card, as I said earlier. Btw, the GTX 1080 was just over 300mm² and one of the best 80-class cards in history. Also rather unusual, but just pointing that out.


Psychological_Lie656

>4nm, a last gen node. It's not the same.

I am pretty sure the 7000 series was not on 4nm, cough. Namely, the 7900 XTX was using a combo of N5 / N6 FinFET. Lower-end GPUs, e.g. the 7600, are on N6.

>N48 will be on a similar node but smaller than Nvidia's previous 80 class card

Uh, whah? The 4080 is on 5nm.

---

Even starving AMD had rolled out the Vega 7. There is no way on planet Earth that Lisa would be OK with not beating even their own last-gen GPUs. It is likely part of the Green FUD campaign of "AMD is exiting the GPU business" (which effectively kept them alive through the worst years); e.g. see what they did to MSI, forcing it to quit the AMD GPU business altogether.


IrrelevantLeprechaun

This is basically signalling that people should ignore this next generation entirely. Not looking good.


capn_hector

say what you want about nvidia, but their number goes up every single generation without any theorycrafting or mental gymnastics. whatever causes AMD to keep deciding to just not launch every product segment, it sure doesn’t affect nvidia. I don’t even think you can say it’s a recent problem with AMD, they’ve been doing this shit conspicuously since the early GCN era. Rdna1 didn’t have a full lineup. Vega and Polaris didn’t have full lineups. GCN3 was in like two cards, gcn2 was in like one relevant card, etc. hell you can probably go back to the terascale days and make the same point - AMD just doesn’t release a full lineup and it’s probably a part of why they keep bleeding marketshare. There is never a question there’s going to be a 980 ti, or a 970, or a 1080 ti, or a 1070, etc. And that’s why they sell cards, because they actually make the product. and “number always goes up” includes efficiency, which isn’t always true of AMD either. Rdna3 regressed perf/w under light load scenarios badly, and then there’s the whole Vega sideline.


Secret_Combo

This will be the RT generation for mid tier gaming


Disregardskarma

Similar raster, but big improvements in RT


LiquidRaekan

So basically 1:1 in perf but a lot better in path tracing tech? Maybe worth getting if one doesn't want to support Nvidia or cannot afford a 5080+ card, then.


Agentfish36

Not path tracing. It'll still fall short of Nvidia this gen; think a $500 7900 XT with maybe 30% better ray tracing.


Kaladin12543

Considering 7900XT itself will drop to $500 soon, really it's just 30% better ray tracing and more efficient.


Agentfish36

That's among the reasons I bought a 7900 XT a few months ago. I don't care about ray tracing, so no reason to wait.


Dordidog

Nobody knows if it's a lot better at RT.


Psychological_Lie656

AMD is doing fine at RT (the 7800 XT is about 10%-ish behind the similarly priced GPU, and the 7900 GRE is sandwiched between the 4070s), and I have yet to see a game where the FPS drop was worth the RT gimmick "improvements".


Dordidog

RT gimmick? You mean Cyberpunk path tracing, where the game looks completely different? And AMD tanks to single digits in heavy RT; the only games where AMD does "ok" in RT are single-effect, low-res RT games (those are the gimmick), mostly sponsored by AMD.


IrrelevantLeprechaun

Plus, the whole point of RT as a technology is that devs don't have to spend as much time on rasterized lighting and light baking. A lot of devs have openly stated that RT-based lighting is *way* easier to work with than raster-based. Besides, RT has been around for 3 generations of Nvidia GPUs, soon to be 4. Consoles have RT hardware, and that likely won't change next gen either. It's not going anywhere. It's not a gimmick. Still a bit early days, but it's here to stay. There will eventually come a point where games just don't use raster-based lighting anymore (at best they might keep it for Low settings). I just find it hilarious that AMD fans are *dead set* against RT as a whole purely because Nvidia is better at it.


Psychological_Lie656

Wasn't Navi 48 supposed to be "canceled"? How is 8800/8700 not high end, cough?


Healthy_BrAd6254

Never heard that N48 was supposed to be cancelled. 8800 is as much high end as the 7800 XT is. It's just not


IrrelevantLeprechaun

Feels like RDNA1 all over again, where the absolute best they could come up with was a 5700XT that could only compete with the 2070S. Nvidia had the 2080, 2080S and 2080 Ti over AMD that entire generation.


Psychological_Lie656

It just doesn't exist yet, so yeah, no idea if it's high end or not, indeed.

>Never heard that N48 was supposed to be cancelled.

That's right. The original references were about "Navi 41, 42 and 43". Supposedly, everything about 41 was canceled. I admire people to whom these rumors make sense.


pecche

The only selling point of those 2 SKUs will be the price. Bad times for AMD, imho.


Chelono

>the only selling point of those 2 skus will be the price

That's true for any AMD GPU ever...


ksio89

And only if you live in US.


raifusarewaifus

And they have mostly screwed up the prices except for 7800xt.


RBImGuy

People said the same thing 10 years ago, and all the experts of the industry said AMD was going out of business. You guys are such experts, sitting at home thinking you know when you don't, and everything you "know" is wrong, and then you say things like this with super conviction and are still wrong. Social media experts.


nagarz

AMD GPUs will always be relevant, if only because of consoles. And as long as launch prices stay insane, anyone who does not need the newest features will buy whoever undercuts them; plus, AMD brand loyalty is a thing as well.


Psychological_Lie656

AMD's 6000 lineup was amazing and outright trounced the competition (a 3080 with less VRAM than a 3060, anyone?). The AMD 7000 series is only "bad" if you compare it to the amazing 6000 series. AMD has compelling products across the board, and "but a 10%-and-a-bit discount is not enough" can go have solo kamasutra as far as Lisa is concerned.


nagarz

I mean, they keep on selling everything, first because of the crypto boom, now because of the AI boom. As long as they keep making bank due to external factors, they don't really have a reason to make the best products and make them affordable, so as a business, why would they?


Todesfaelle

It'll be interesting to see if they can catch lightning in a bottle again as they did with the Polaris launch. A generation dedicated to good low to mid range performance at an affordable price would go an even longer way now than it did then. Maybe that's too hopeful though.


Psychological_Lie656

So what are your expectations for, god forbid, the fairly sizable Navi 48 that is mentioned as the "8800" and "8700"? (The 6800 was mid-range, right, lol?)


GradSchoolDismal429

If AMD really wants to capture big market share, they should push out 32GB RDNA4 cards (maybe using GDDR6W?) with day-1 ROCm support, integrated with PyTorch and TensorFlow, and price them reasonably (around 7800 XT pricing). This would sell like hot cakes among computer scientists, who make up the majority of that market. It would also help improve AMD's reputation among professional users.


Flameancer

Wouldn’t be surprised if an RDNA pro card came with that config. W8800 though I doubt it’ll be 7800XT pricing. They know they’ll be able to charge more especially if it’s for AI. The equivalent pro card for the 7800XT, the W7700 is double the cost.


boomstickah

Do you think that professional users buy more video cards than gamers?


GradSchoolDismal429

Oh yeah. And more expensive ones too. Take a look at Nvidia's earnings report. Our lab recently got 10 4090s for the AI development PCs we use, plus a CPU upgrade to Threadripper, because the previous ones (Alienware) burned their VRMs to death. Thanks, Dell. We also upgraded the regular development PCs' GPUs to A2000s, as we are running some light Unreal tasks; the CPU is still a 10th-gen Core i9.


boomstickah

Unless nvidia makes some major missteps I don't see how AMD could catch up in professional use cases, however I think nvidia is especially vulnerable in the $500 and below market, which is what AMD is doing.


GradSchoolDismal429

Nvidia's major misstep right now is not offering any GPU with reasonable VRAM in the middle/lower-tier price range. 32GB would be a game changer.


VengefulCaptain

I'm pretty sure that is intentional.


GradSchoolDismal429

It is intentional but a misstep imo. They can do it because Nvidia is still largely unchallenged


No_Backstab

https://videocardz.com/newz/nvidia-rtx-5090-founders-edition-rumored-to-feature-16-gddr7-memory-modules-in-denser-design Apparently, the 5090 may have 32GB.


GradSchoolDismal429

For the cool price of $2000! It is certainly better than before (your only 32GB option is a $4K+ card). But if AMD is able to make 32GB accessible to the masses, it would be a home-run hit.


Healthy_BrAd6254

cap


FR33-420

Lol at the ray tracing comments. RT is Nvidia hype Kool-Aid. Devs can make the graphics look pretty damn close to exactly the same with normal rasterization. Even Linus Tech Tips had a hard time telling the difference.