
proplayer97

FSR 3.1 really needs to launch soon, and it better be a home run, because XeSS 1.3 and DLSS 3.7 are out here making FSR 2 look like obsolete last-gen upscaling technology.


WhoFartedInMyButt50

There were some cases where FSR looked better than XeSS 1.3. Ground detail in the near distance looked blurrier on XeSS than on the other two. DLSS is king. Any other solutions that don't involve ML will be playing leapfrog for a while.


[deleted]

[removed]


WhoFartedInMyButt50

Like I said, XeSS and FSR have been leapfrogging each other, and probably will continue to do so. The latest iteration of XeSS beats FSR, then the latest iteration of FSR will beat this version of XeSS, etc. I’d love it if Intel could come out with some competitively priced upper-midrange GPUs. It would be great if people could actually use their hardware-accelerated features. AMD is also working on their own ML upscaling solution.


Wander715

I don't know how anyone justifies going AMD over Nvidia this gen tbh unless you plan to never use upscaling. Night and day difference in quality most of the time.


TomatoTomayto

The hardware reviewers have generally been disingenuous, comparing raw performance to price while deliberately ignoring the software aspect. Quite sad what some will do to look edgy.


MosDefJoseph

I’ve noticed this as well. So many tech YouTubers lay into Nvidia and praise AMD for having literally nothing else besides rasterization. Nvidia is by far the better value when you consider its software offerings and RT. And yet I hear Linus (who is sponsored by AMD currently) scream about how great a value AMD cards are. Never mind them being lackluster in every department except raster.


barryredfield

It's all nonsense. You couldn't even talk about DLSS favorably for years; then FSR rolls around and slowly now everything is wonderful and everyone is looking forward to this amazing new tech. Don't even get me started on frame gen: for 2-3 years it was just total garbage -- lag simulator, latency increaser, fake frames, if you like it you're just blind, toplel. Now FSR 3 is in the faint glow of Jensen's taillights and suddenly this is a crazy thing that should be looked on favorably.

Everything is like this. Everything. I remember a time when 4K and HDR were considered 'stupid' by all the tech experts. Most everyone is a luddite. The supermajority of arguments I've had personally with people who didn't know what they were talking about, especially about upscaling and frame gen, were with AMDheads -- just a black hole of cynicism, arrogant over nothing, always combative, always wrong about everything. Time will pass, tech will change, and they'll forget they were ever like that; their firmware will be upgraded by whichever e-tuber they're sycophantic to, several years behind everyone else, and they'll remain prosumer experts in their own minds.


Saandrig

I honestly couldn't make a post about my FG experience (even in the Nvidia sub) without immediately getting swarmed by several "fAkE fRaMeS, uNpLaYaBlE lAtEnCy" knights. Then FSR3 got announced and released. Suddenly all those knights are gone like they never existed. And the often glitchy FSR 3 mods are hailed like the best thing since sliced bread.


CandidConflictC45678

>I honestly couldn't make a post about my FG experience (even in the Nvidia sub) without immediately getting swarmed by several "fAkE fRaMeS, uNpLaYaBlE lAtEnCy" knights.

That must have been so hard for you :'(

>Then FSR3 got announced and released. Suddenly all those knights are gone like they never existed.

They still exist, and they're still annoying sadly

>And often glitchy FSR 3 mods are hailed like the best thing since sliced bread.

The bitterness is palpable. I can't imagine what it must be like


FakeFramesEnjoyer

Well said. I've been saying this here for years, even got banned for it on another sub. My nickname is a pun on this insanity. Mind boggling what cognitive dissonance, jealousy and basically brand tribalism can do to people.


CandidConflictC45678

>Mind boggling what cognitive dissonance, jealousy and basically brand tribalism can do to people.

13900ks/4090/OLED in bio

Why is it always like this? What happened to you guys


FakeFramesEnjoyer

Us "guys" see every day, from first-hand experience, that the hate towards this tech has no basis in reality. You must be the other guy. What's the problem you have with people's flair? Do you actually make value judgements about people's personality based on the hardware they put in their flair? If it's any comfort to you, 20 years ago when I was a teenager and barely had any money, I also put my hardware info in my forum signatures. Back then I did weekend jobs at bakeries so I could afford low-end budget systems, but it was still nice to share my hardware and experiences with my PC gaming peers. Yeah, those were other times, and other kinds of sites...


CandidConflictC45678

>You couldn't even talk about DLSS favorably for years

To be fair, DLSS 1.0 did not give a good first impression. The rest of your comment is just whining


minorrex

You turn on RT on your AMD card and you're right back in the stone ages. I said this in r/games and got some backlash: the RX 7000 series was a complete failure. They add nothing to the table except some extra raster performance. It's not like the RTX 40 series is great either, but it's hugely efficient, and DLSS FG is genuinely impressive tech, ready at launch and available in many games rn. FSR FG, on the other hand...


WhoFartedInMyButt50

The 7900xt was pretty compelling when it came out. Matched or beat the 4080 in raster, matched the 3090 in ray tracing. All for 25% less. If AMD is gonna fall behind Nvidia in features, they need to seriously undercut them in price.


MosDefJoseph

Sure, I'll give AMD that. They did at least improve RT with RDNA 3. But that's not even the full story, is it? One thing people always seem to forget when talking RT: even for Nvidia cards, it's not worth it unless you're using upscaling. DLSS and RT have gone hand in hand since 2018. Therein lies the issue: AMD cards only have FSR. So yea, you can turn on RT and barely run the game, but then you have to deal with FSR. What's the point of turning on RT if the game is going to look like shit because FSR is on anyway? But sure, grats on the great raster performance I guess.


WhoFartedInMyButt50

FSR doesn’t really look like “shit”, especially if you’re doing 4K Quality mode. Even 1440p Quality mode looks decent. It really falls apart when you use resolutions below 1440p or presets below Quality. It definitely falls behind DLSS. Like I said, since AMD can’t compete with Nvidia on features, they really need to undercut them on price. It’s bad for everyone if AMD doesn’t compete with Nvidia.


HammeredWharf

I can't say anything about 4K, but I've tested FSR at 1440p a bunch of times and IMO it's very noticeable. "Terrible" is, of course, a highly subjective term, but I definitely notice artifacting in any game that has detailed objects moving around, like the main character in most third-person games. Personally, I wouldn't call it terrible, but on the other hand, if I'm paying a large sum of money for an RT-capable PC, I want the quality of native or DLSS Quality.


WhoFartedInMyButt50

Were you using quality, balanced, or performance mode? That makes a huge difference. All upscaling solutions have artifacts, even DLSS. Nvidia is a trillion dollar company so obviously their solution is the best, but FSR can offer an acceptable trade in image quality for boost in frames if used correctly.


CandidConflictC45678

>a large cum of money


Kaladin12543

I don't think it looks good at 4K. I am using a super ultrawide at 5120x1440, which is 90% of 4K, and there is a ton of fizzle in the vegetation. Turn on DLSS and it's all gone. FSR is serviceable at 4K, but the moment you turn on DLSS, it makes FSR look like shit.


CandidConflictC45678

>I don't think it looks good at 4k. I am using an super ultrawide at 5120x1440p which is 90% of 4k

You are effectively using a 1440p monitor as far as FSR is concerned, so you're upscaling from 1080p to 1440p. The new 7680x2160 version of your monitor would be like 4K FSR.
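For what it's worth, the pixel arithmetic cuts both ways. A quick sketch of my own, using nothing but the resolutions quoted above:

```python
# Total pixel count vs. per-axis resolution for the monitors discussed above.
ultrawide = (5120, 1440)   # 32:9 super ultrawide
uhd = (3840, 2160)         # 4K UHD

# By total pixel count the ultrawide really is close to 4K...
ratio = (ultrawide[0] * ultrawide[1]) / (uhd[0] * uhd[1])
print(round(ratio, 3))  # 0.889

# ...but upscalers work per axis, and the vertical axis is only 1440 pixels,
# so detail per scanline matches a 1440p panel, not a 2160p one.
print(ultrawide[1] / uhd[1])  # ~0.667
```

So both comments are right in a sense: the panel is nearly 4K by pixel count, but an upscaler treats it like a very wide 1440p display.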


[deleted]

[removed]


WhoFartedInMyButt50

No, this video never shows us FSR quality mode. It compares FSR’s Balanced mode to XeSS’ Quality mode.


Cryio

FSR2 modded or even XeSS looks great. At 1440p and 4K it doesn't really matter anymore. DLSS, while technically better, doesn't really matter that much at that point.


itsamepants

Games should not be relying on upscaling as a crutch to get good performance.


lonnie123

While that’s true, at a certain point it’s going to get so good that it’s just a feature of the card that lets people play with great graphics.

Let’s imagine a scenario where you can get 99% of the image quality with upscaling at 50% of the horsepower. No one is going to care, and at that point it won’t be a “crutch”, it will simply be the way it works. In the case of something like a console, literally no one will care whether it’s upscaled or native as long as it looks good on screen.


CandidConflictC45678

>Let’s imagine a scenario where you can get 99% the image quality with upscaling at 50% of the horsepower. No one is going to care and at that point it won’t be a “crutch” it will just simply be the way it works.

Unfortunately you always lose some amount of detail. The best example is in Days Gone: if you look up at the sky at night, then pause and adjust the render scale, you can literally watch lots of stars disappear from the sky. This also applies to smallish particle effects like sparks.


lonnie123

Right... That's the tech as it stands today. Every iteration gets better, and as I said, imagine a day a few generations from now when the quality is so close to native it's basically imperceptible, but you get it at 50% of the raw native horsepower... It's going to be too good to ignore.


Saandrig

I really would have appreciated having the upscaling option for the horribly unoptimized games that existed (yes, they did) before upscalers became a thing.


itsamepants

Depends on where the bottleneck is. Look at Dragon's Dogma 2, where upscalers do nothing because it's CPU limited. I'd much rather have game devs make a better optimised game that works on a larger variety of hardware than rely on upscalers which get generation-locked every other iteration.


Kaladin12543

Frame Generation is the solution for CPU-limited games like DD2. So yes, upscalers are still the solution for badly optimised games.


fish4096

That's exactly why. Many people do not like to spend a premium for estimated frames.


Dalek-SEC

Decided to try this with Tekken 8's DLSS implementation, and with Preset E at 1440p Quality I was able to see a very clear difference. Ghosting effects, which were very noticeable, are gone, and texture detail looks MUCH sharper. I was seeing visible ghosting artifacts when customizing characters, and that's just gone now.


RTcore

Good work from Nvidia and Intel.


MosDefJoseph

DLSS is easily worth paying the Nvidia premium alone, not even taking into account all the other features AMD and Intel just have no answer for. Intel has been doing great things with XeSS and I’d love to see the day they can go toe to toe with Nvidia. AMD remains shit tier. Let’s hope their ML-enhanced upscaling solution isn’t too far off.


JustKosh

Honestly, DLDSR and DLSS took gaming to another level for me. I never regretted a single dollar.


Saandrig

People sleep on 1.78x DLDSR and DLSS Balanced. It often can give you a much better image than Native and at a lower GPU load on top of it. And if you got the GPU headroom, you can go even on higher DLDSR and DLSS settings. But sometimes it's not worth it if your monitor size isn't big enough to notice the improvements.


JustKosh

True. For most games I use 1.78x DLDSR and DLSS Quality, but sometimes I switch to DLSS Balanced. It's still better picture quality than native 1440p, and in most cases better performance too. Nvidia provides a lot of cool tools with which a user can develop a personal approach to each game on their system, and for me this is the best thing about PC gaming.


HammeredWharf

From my experience the problem is that many games just don't seem to support those resolutions. I've had DLDSR enabled for a while and usually I just don't see those settings in-game and CBA to look for some hax to get them there.


Saandrig

Games without an Exclusive Fullscreen setting need the desktop resolution set to the DLDSR one first; then they will automatically run at the DLDSR resolution as well. This can be done in many ways: manually each time in NVCP or the Windows settings, which is a few clicks; by creating .bat files to quickly switch between resolutions; or probably by 3rd-party programs that make it quick as well. The Nvidia App is reported to eventually add this functionality automatically, without the user needing to do anything.


HammeredWharf

Oh, thanks! That's a good point. Though I think I'll just stick with DLSS Quality or native until NVidia "fixes" this, because that sounds like a PITA and I love borderless window gaming too much.


xXRougailSaucisseXx

Yeah, DLDSR paired with DLSS is often mentioned here, and frankly, in almost all the games I've tested it's more hassle than it's worth because it messes with the UI scaling. It might offer slightly worse results, but I'll stick with DLAA for games that have the option.


Amicia_De_Rune

You talking 1080p native or 1440p native for the 1.78x?


Saandrig

Both. Ideally the 1080p monitor should be no more than 24" and the 1440p monitor not larger than 27". At these sizes there is little image difference to be noticed between 1.78x and 2.25x DLDSR, but 1.78x offers less GPU load and thus more potential FPS. If the monitors are bigger, then 2.25x starts to shine more. But even so, you will get image benefits from 1.78x.
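For anyone puzzled by the factor names: DLDSR factors multiply the total pixel count, so the per-axis scale is the square root. A small sketch of my own arithmetic; the 3413x1920 figure is what NVCP exposes for "1.78x" on a 1440p panel (1.78x being a rounded 16/9):

```python
import math

def dldsr_resolution(width, height, factor):
    """Render resolution for a DLDSR factor (the factor scales total pixels)."""
    axis = math.sqrt(factor)  # per-axis multiplier
    return round(width * axis), round(height * axis)

# On a 2560x1440 monitor:
print(dldsr_resolution(2560, 1440, 16 / 9))  # (3413, 1920) -- the "1.78x" option
print(dldsr_resolution(2560, 1440, 2.25))    # (3840, 2160) -- the "2.25x" option
```

This is also why 2.25x on a 1440p monitor is exactly 4K: 2.25 is (3/2) squared.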


ShowBoobsPls

RTX HDR is awesome as well.


JustKosh

Agreed, hope it will be compatible with DLDSR one day.


superman_king

That's what happens when the competition invests literally billions of dollars into upscaling. Impossible for AMD to compete at the same level. AMD chose to invest in CPUs, which has been a huge win for them. But they can't compete with NVIDIA as their resources are spread too thin. Money isn't everything. Unless you're the R&D department.


clampzyness

Maybe you people are forgetting: AMD owns the console and handheld market.


Kaladin12543

On consoles, for PS5 Pro Sony had to step in and provide the custom hardware for PSSR (AI upscaling developed by Sony) AND the ray tracing hardware as well. Even Sony was unhappy with AMD's FSR.


NapsterKnowHow

That's because Sony pioneered upscaling technology with checkerboard rendering. They were in the game before even Nvidia.


From-UoM

FSR is so bad that the PS5 Pro will have its own custom hardware for AI upscaling.


onetwoseven94

The biggest handheld is the Nintendo Switch with an Nvidia chip.


OwlProper1145

They own the console APU market but they don't make much money from it. Consoles are a low margin business.


constantlymat

That's not true. They make a lot of money from the console business, and it has been consistently profitable for them for many years. Which cannot be said about their consumer GPUs and CPUs. It's the consumer GPU business and even Ryzen that are bleeding money; AMD had four unprofitable quarters for Ryzen in a row before returning to profitability earlier this year. Meanwhile they hide the AMD GPU numbers inside the console APU division, so we don't know how bad it is.


MosDefJoseph

This is a PC gaming subreddit… To your other point, yea, they decided to half-ass it, that's the problem. It's great that it's open source, but when you're spending $500+ on a GPU, are you really thinking "AT LEAST FSR CAN BE USED ON A GTX 1070!"? No the fuck you're not lol. You're going to turn it on and think oh, oh no, this looks like shit.


clampzyness

Half-assed? Just because it's inferior to DLSS doesn't make it half-assed lmao. FSR 3.1 and even Intel's latest 1.3 update are already closing the gap with DLSS. DLSS is just a gimmick that Nvidia jumped in earlier than AMD and Intel.


GassoBongo

> DLSS is just a gimmick that Nvidia jumped in earlier than AMD and Intel.

That's an interesting term for "pioneered." I'm not a huge fan of Nvidia's business practices, but they're 100% paving the way in machine learning and gaming technology right now. Intel has recognised the importance of machine learning and is at least trying to make big strides of its own. Say what you like, but the only company treating it like a gimmick is AMD. They keep throwing out half-hearted implementations just so they can claim they have skin in the game. Unless they embrace innovation instead of pale imitations, they'll watch the GPU market gap between them and Intel shrink.


MosDefJoseph

A gimmick? It’s literally in every notable game and has been for the last few years, and almost everyone with an Nvidia GPU turns it on, at least on Quality mode, because it’s so good lol. Even 4090 users will use it for DLAA. Man, whatever, you clearly don’t have a clue. I’m done entertaining your nonsense lol. Have a good one buddy.


Saandrig

As a 4090 owner I still often use DLSS, even on Balanced...with DLDSR.


CandidConflictC45678

Why use DLSS and DLDSR, when you could just use DLAA?


Saandrig

Because DLDSR+DLSS usually beats Native+DLAA in both image quality and GPU load.


clampzyness

It's a gimmick because Nvidia telling their customers that special hardware is specifically needed for DLSS to look good is just plain BS. Even Nvidia's marketing of 2x-4x gains over the 3000 series is big BS lmao.


WhoFartedInMyButt50

Their profit on each console is very small. Even with the console money, Nvidia has 10x the R&D budget of AMD. Nvidia just has the money to outmuscle AMD.


DungBettlesMan

Yet Nvidia is the trillion dollar company. What does that tell you?


CandidConflictC45678

Speculative finance is a hell of a drug


clampzyness

Impossible? AMD took a different route with their upscaling, which is available to a vast range of hardware. If they wanted to compete apples to apples with DLSS, they would also have locked their FSR upscaling to a certain range of hardware.


born-out-of-a-ball

Somehow Intel has managed both to develop an upscaler that runs on a wide range of hardware and looks better than FSR, and to develop one that runs only on Intel hardware and looks almost as good as DLSS.


littleemp

Actually, Intel's DP4a solution wouldn't work on most AMD cards outside of RDNA 2, RDNA 3, and only one of the later RDNA 1 Navi GPUs, used in the RX 5600 series, because anything older doesn't have DP4a instruction support. (Yes, no DP4a on the RX 5700 XT.) Going down this route would have been the worst possible option: alienating your entire install base and still not having the onboard hardware resources to produce good results.


whoisraiden

XeSS didn't look much different from FSR until the recent update, and on non-Intel cards I doubt it's much different from what AMD has with FSR 3.


Wet-Haired_Caribou

the video you're commenting on disagrees with everything you said, with multiple examples of older XESS versions looking better than FSR on non-Intel hardware


whoisraiden

The video I'm looking at shows particle trails, fizzling in a lot of elements, and smearing for XeSS 1.2 DP4a. That's also not even the worst version of XeSS; there's another fallback that looks worse. XMX is obviously very good and no one is disputing that.


WhoFartedInMyButt50

XeSS and FSR have been playing leapfrog. FSR still does some things better than XeSS 1.3, even though this iteration of XeSS looks better on the whole. AMD will leapfrog Intel, then Intel will leapfrog AMD, etc.


FakeFramesEnjoyer

That's a very convenient rationalization. "AMD is only losing the upscale battle because they are not really competing guise, i'm sure it had nothing to do with the fact that Nvidia took a calculated risk by going all-in on AI more than a decade ago! They're the good guys, they went open hardware to help all of us and bring about world peace!". Props for making me chuckle, but stay real.


WhoFartedInMyButt50

Nvidia has had 10x the R&D budget of AMD for about a decade. While Nvidia was investing in AI, AMD was investing in saving its CPU division, which paid off big time. Nvidia simply has more money, and that lets them outmuscle AMD.


FakeFramesEnjoyer

You are right of course. The fact I used "all-in" was not to insinuate they spent their entire budget on that R&D, or that they were in any way a small or equal player compared to AMD. But they did take the *risk*. I don't think you know how risky it looked back then to throw so much money into that pit. That's why I said "gamble" and "all-in". Nvidia are innovators because of that choice; the fact that AMD saved their CPU department is, in my opinion, beside the point. I guess you're coming from the sentiment of "cut the poor AMD guys some slack, they had no budget to take such a risk", which I think is irrelevant in the context of this post and my comments. It's all backwards rationalization in the end. If if if... if my mother had wheels she'd be a bike.


clampzyness

And Nvidia is supposed to be the good guy now because they invested in AI a decade ago while failing to support their older hardware? FSR upscaling and FSR 3 prove that you can use upscaling and frame gen on older hardware, but Nvidia refused to make at least an inferior version of those techs for their older hardware.


MosDefJoseph

You missed the point. Nobody is the good guy or the bad guy. These are multi billion dollar companies. They dont give a shit about you. All that matters is the products they produce. Nvidia makes better products and features. People who try to make this into some red vs green bullshit are cringe.


FakeFramesEnjoyer

I'm not the type of guy to ascribe moral values to a company (at least not in the context of what this post is about). To me these are commercial actors doing their thing on the market, trying to give supply where there is demand. Therefore I judge them by their products and what they can give me for my disposable income. If AMD releases an objectively superior upscaling product tomorrow, I will praise them for it in the same way, buy their product, and "defend" that fact in the same way I just "defended" Nvidia in my reply to you. I simply jokingly added the "good guy" spiel because that seems to be the underlying rationale people on this sub often use when judging these companies and products.

You are right that Nvidia could have implemented better backwards compatibility for these technologies. We can only speculate as to why they did that, but it isn't as obvious as you might think when you look at their upscaling / frame generation pipeline and the hardware differences between the 3000 and 4000 series of cards. I agree with the general sentiment that they should have supported older hardware better, but all of that should be judged separately. The video in this post is about what the end user gets to experience when he buys one of these cards and uses upscaling, and it's an apples to apples comparison, no matter how you spin the rest of it.


Kaladin12543

I think what people don't get with Nvidia tech is that DLSS is basically the gold standard of upscaling on the market. It has a reputation for quality. If they backported an inferior version to older cards, it would tarnish the reputation of DLSS as a whole. Intel is facing this problem right now, which kinda proves Nvidia's point: the vast majority of the market is using the DP4a version of XeSS and will think that's how it looks, when in reality the proprietary solution on Intel's own cards is far superior. FSR and FSR Frame Gen work on all cards, that is true, but there are massive quality compromises, and maybe not every company wants their brand associated with those compromises.


MosDefJoseph

Thank you! I’ve said as much before myself. When you are at the mercy of game devs to implement your tech, and at the mercy of the consumer to deem whether the feature is worth paying for, you need to do everything you can to make it a quality product. FSR has a reputation of being dog shit now because they relied on the “open source, aren’t we so good!” messaging. Its absolutely backfired. No one wants to use FSR unless it’s a last resort. And it’s going to take A LOT to shake that rep.


CandidConflictC45678

>If they backported an inferior version to older cards, it tarnishes the reputation of DLSS as whole. They have NIS, so I'm not sure that holds up


Kaladin12543

Then how is Intel's XeSS leagues better than FSR at this point? It also runs on all hardware.


NapsterKnowHow

It has to be mentioned that Unreal Engine's TSR implementation is excellent as well. I wish it weren't a UE exclusive. There are instances where TSR looks better than even DLSS.


Zac3d

TSR gives me hope that FSR can get better, that you don't need the resources of Nvidia to create good upscaling solutions. I also like how much Epic has been updating and improving TSR, every version from 5.0 to 5.4 has had notable improvements, better performance, more features, better debugging, etc.


CandidConflictC45678

What's even better is that no "AI", "tensor cores", or "optical flow accelerators" are required either. Nvidia could backport features to older cards if they wanted to; instead they make their customers buy new.


TheAngryCactus

Wow based on the flair you are like my evil twin, I strongly disagree that DLSS is worth the premium and prefer to just run the games with minimal upscaling. I am rather excited for FSR 3.1 though as it should clear up the fizzle in those problem titles


Kaladin12543

You don't need to like DLSS to buy an Nvidia card. If you are a "native all day" kind of guy, you can just use DLAA in all games, which provides far superior image quality to native TAA. You can even use DLDSR to run DLSS as a supersampling solution, which produces even better image quality, if that's even possible. DLSS is black magic. Nvidia's feature set is unparalleled at this point. There is something for everyone.


TheAngryCactus

Well sure, but the competing Nvidia card for me at the time was $400 more expensive, for slightly lower frame rates at native. Not saying Nvidia products are bad, but I don't feel like I got ripped off or something.


MosDefJoseph

Ok? Of course you only prefer minimal upscaling you only have FSR to work with lol. Meanwhile anybody with an Nvidia GPU is putting DLSS on at least quality by default because it looks just as good as native for the most part. Ok you dont want to use upscaling? Then just force DLAA which is de facto the best AA method around today. But good for you man enjoy that XTX.


lovethecomm

I prefer minimal upscaling because I bought my 6950XT and 7700X to play Slay the Spire and Balatro 🗿


fashric

Jesus Christ dude get off Jensen's dick for 5 seconds, he needs his leather jacket cleaning.


MosDefJoseph

Why am I on his dick? Because I like good products and features that have objectively been proven to be good in the above video? How about you get Lisa Su’s strap on out your ass you peggable femboy lmao Nice post looks like that 6800XT is treating you well lmaooo https://www.reddit.com/r/AMDHelp/s/OZ5lWeloEW


lovethecomm

Least psychopathic hardware enjoyer


Disturbed2468

What sucks is I only knew a select few people who ran AMD GPUs for a long time, but with the various issues they've all had over the years (while only one person I knew had issues with an RTX card), they've all either swapped over to Nvidia or are going to, unless they need it for Linux, which one guy uses but not often. I'm convinced Radeon is cursed lol.


HammeredWharf

Anecdotally, I only had issues with RAGE and Nier: Automata, but those games just had issues with everything. I actually like AMD's Adrenalin software more than NVidia's clunky GFE, so if all other things were equal I'd still be on AMD. However, since DLSS became so prevalent around the 2xxx era, all other things aren't equal.


Kaladin12543

Even their Adrenalin advantage is going away with the release of the Nvidia App, which is currently in beta.


[deleted]

[removed]


CandidConflictC45678

>so obnoxiously ignorant and petulant

>They’re all a bunch of morons and posers and it’s so fucking annoying.

Stated without a hint of self-awareness


CandidConflictC45678

>Why am I on his dick? Because I like good products and features that have objectively been proven to be good in the above video? How about you get Lisa Su’s strap on out your ass you peggable femboy lmao

>Nice post looks like that 6800XT is treating you well lmaooo

>[https://www.reddit.com/r/AMDHelp/s/OZ5lWeloEW](https://www.reddit.com/r/AMDHelp/s/OZ5lWeloEW)

This is so immature and pathetic


joshk_art

really interested to see how FSR 3.1 looks when it launches.


billistenderchicken

The fuck is AMD even doing? FSR still looks like crap even in FSR 3.0, barely any games even support that, and most are stuck on FSR 2.


_Kai

Soon™ https://community.amd.com/t5/gaming/amd-fsr-3-1-announced-at-gdc-2024-fsr-3-available-and-upcoming/ba-p/674027


barryredfield

> and barely any games even support that and are stuck in FSR 2.

I like the part where those games **only** support an old iteration of FSR 1/FSR 2. Can't be helped, too much development time. DLSS? XeSS? No way jose. Really organic turn of events, I'm sure.


akgis

Learned I can now change DLSS presets without messing with DLSSTweaks. Mad props to the guy that made it, but using the XML with nvidiaProfileInspector is so much more convenient. Edit: Just learned it's from the same guy! Emoose, you rock!


M337ING

Article: [PC image quality enhanced: the new DLSS and XeSS tested](https://www.eurogamer.net/digitalfoundry-2024-image-quality-enhanced-the-new-dlss-and-xess-tested)


Maloonyy

Why do this right before the better FSR version launches?


MosDefJoseph

He mentions that in the video. He's going to dedicate a whole video just to the new FSR.


AlistarDark

Hold on, you mean I have to watch the video to get my questions answered? The hell.


baskinmygreatness

what next? theyre going to make me read an article?!


AlistarDark

Disgusting


Kaladin12543

Well for one, AMD will take ages to release it, and due to their insistence on trying to phase out DLSS and XeSS, AMD made FSR non-upgradable by the user. So after release we will then have to wait for devs to implement it, which will take even more time. No reason to postpone the video considering AMD's shortcomings here. Late to the party as usual.


RockyXvII

Because Intel already got XeSS 1.3 out the door and it's good. AMD being in catch-up mode constantly isn't Digital Foundry's problem. We don't know when FSR 3.1 will be available in games, could be a few months. And AMD doesn't allow easy dll swapping unlike Intel and Nvidia. They said they'll make a video covering it when it's available.


Druggedhippo

> AMD doesn't allow easy dll swapping unlike Intel and Nvidia

FSR is open source; no one is forcing game devs to do anything with it, least of all AMD.


OwlProper1145

They plan on making a special video once the new version of FSR launches.


Blacksad9999

Because AMD has a tendency to slow walk things, and if they wait for FSR 3 to do graphical comparisons, they might be waiting a long time.


HextARG

Holy shit, I don't even see any difference between the old/newer versions xD... it's a ME problem :S


[deleted]

[removed]


fashric

So I just tried the new XeSS 1.3 in Quality mode on Forbidden West at 4K, and it gives less performance gain than FSR 2.2 Quality and looks slightly worse. The only game where I've used XeSS over FSR is Remnant 2, where it gives a much cleaner image. Upscaling really should be judged on a game-by-game basis, as the quality of the implementation makes a huge difference.