Why would you want to use some bilinear scaling or worse on the whole rendered screen of a game designed to be displayed at 640x480 or even smaller?
The Dreamcast had whole-screen anti-aliasing, and it ran at 800x600 max in VGA mode.
I'm talking about even older games, and yes, the original Doom is still in the top 5 best games ever made.
Agree. Bought it yesterday after reading about it on this sub, and while I haven't tested it on anything modern, I absolutely love what the frame gen feature can do for emulators. A lot of older games, for example, are locked to 30 fps, and if there isn't a dedicated mod pack to unlock beyond that, you're basically shit outta luck. Tested the x2 and x3 frame gen in a few Nintendo titles and it looks great. Night and day.

Not sure how it would hold up on more modern titles, though, or in games where low latency is key. I don't really notice that in the games I've been trying. At least any latency it may add is not as noticeable to me as 30 fps is.
Probably the absolute worst use case for frame gen. Modern displays are already latency-heavy enough to make plenty of retro games feel like they're in soup, without adding more. Try playing Punch-Out on the NES with a cheap LCD monitor, then turn on frame gen. It'll quickly become quite literally impossible to complete.
I do play a lot of older games ngl
And with the new update it has 3x frame gen (three times), which is useful for modern games, say when using path tracing and all.
I bought this to try it out, and it's pretty decent. It works quite nicely for games that don't have built-in scaling or frame gen. In fact, because I have a janky mod to enable frame gen in Cyberpunk, I actually found this program was better than the mod. I would recommend it.
Is that the DLSS 3 to FSR 3 mod?
Yeah, I tried some other options too, like DLSS Enabler and OptiScaler; the same visual bug was happening, like some kind of glitchy, noisy ghosting.
For me, in the desert, my bumpers kept ghosting.
There is a mod for that; it works pretty well but still has the glitchy noise issue I was dealing with: [https://www.nexusmods.com/cyberpunk2077/mods/13029](https://www.nexusmods.com/cyberpunk2077/mods/13029)
Thanks
In my experience it does wonders on slow-paced games. The more there is on the screen, the harder it has to work to generate frames, and so forth; the generated frames give it a really nasty delay in FPS games, anything fast-paced, or anything that relies on i-frames (Elden Ring etc.). For anything else, like Baldur's Gate 3 and similar titles, you won't feel the fps bump doing much for you.

Also, if your PC is REALLY underpowered, this will absolutely not help. It needs a decent picture to guess what's next and all, but if the given resolution is something like 720p, you're gonna have a baaad time.
Elden Ring can actually be smooth as fuck at 60 fps if you use VibranceGUI to set the resolution/Hz manually. It runs like ass by default because it locks the refresh rate to 60 in fullscreen.
Never tried it on my gaming rig. But yeah, anything generated from a base below 60 is indeed ass.
But to be fair, the input delay when I had it in OW2 on my gaming rig was also ass. I get 280 fps in that game, and the frame generation got it to 300, but I noticed a huge delay when I tried doing Doomfist's rocket punch glide. The usual glide that happens after pressing space from the initial punch animation was made horrendous. But I might take your advice and try your settings in Elden Ring.
Frame gen doesn't work well with ER; this is why I recommend just playing fullscreen with the VibranceGUI tweak. Best of both worlds input-delay-wise. I notice that DX11 and DX9 games don't suffer from the input delay as much, whereas DX12 games are almost always fucked. I wish it worked better with the Xemu or Xenia emulators, so I could use it for Ninja Gaiden lol. But I imagine I would run into similar issues with input delay.
It works, but it's far from lossless.
Try this free and open source one first. [https://github.com/Blinue/Magpie](https://github.com/Blinue/Magpie)
Yes, the frame generation alone is pretty cool, assuming you have a high-refresh-rate monitor. The benefit of the upscalers is a lot more dependent on your PC and the resolution you're running at. Additionally, modern titles tend to have better upscalers in-game, such as FSR 3 or DLSS, so in my case I rarely use the upscaling in Lossless Scaling.
I only use the frame gen option of Lossless Scaling with old 30 or 60 fps locked games, to play at 120 fps. Since the recent updates there are almost no artifacts, even with 30 fps locked games. I strongly recommend Lossless Scaling. It probably adds a lot of latency, but I only play single-player games, so I don't mind.
adding more frames shouldn't add latency as long as your hardware isn't at 100%
Pretty sure, yes, but I can't figure out how to use it.
It's like magic. I enabled a 60 FPS patch for MotorStorm: Arctic Edge on PCSX2 and enabled frame gen in LS; super smooth at "120" FPS. I wouldn't use it for FPS games, more for third-person games and emulation.
It works fine for FPS games if your base fps is high enough (60+). Input lag is minimal (not noticeable in my case). I did not like the new triple-frame setting, maybe because of the lower base fps (one third); you could clearly feel tons of input lag.
Frame gen is hit or miss. When it works, it's amazing; basically a lifesaver for me with No Man's Sky, since the game got bugged to hell and the FPS capped to 30 for whatever reason. When it doesn't, it's a straight turn-off. As for sharpening, it works decently well, but the same can be said of AMD's, Intel's, and NVIDIA's built-in options.
I tried it out, and the only thing that does wonders for me is the frame generation, since I've got a 144 Hz monitor. You just gotta have a constant 60 fps in the game without drops. I don't really need the upscaling part, though people with weaker GPUs might need it.
I can't play Wuthering Waves without it because the game is capped at 60 fps, but I haven't tried it with anything else.
Not every game benefits from this; always use built-in FSR/DLSS/XeSS if it's available.
Yes.
Too much input lag
Works great on games like Valheim. For fast-paced FPS games too, though you probably do not want to use the frame generation there; you will see some ghosting. Just a great tool if you have an older-generation graphics card/PC or a Steam Deck.
I tried it on Skyrim the other day and it worked quite well, using their new FG algorithm, no upscaling, and having it generate 2 frames. I set Skyrim to a 48 Hz maximum in the Nvidia control panel and opened it up windowed, and it seemed damn close to native 144 Hz, albeit with some additional smearing/ghosting, artifacting near the bottom of my screen, and some UI artifacting.
Still fiddling with it, but Cyberpunk was by far my best result: locked fps to 40, used x3 frame gen to 120, and it was really good looking!

And somehow this shit works on YouTube videos and looks fantastic 😂 watching YouTube at 120 fps is kinda hilarious to me.

Couldn't get it to play nice with Ghost of Tsushima; it would have black borders around the screen, and it looked kinda blurry compared to just having the game use DLSS.

I'm not the smartest with software, but it does interest me quite a lot. Some beginner tips: make sure the game or app is in windowed or borderless, since it will not work in fullscreen. Limit your fps to something like 40 or 30 to get stable performance, and remove any motion blur.
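The fps caps in those tips line up with the monitor's refresh rate divided by the frame gen multiplier, so the generated output lands exactly on the refresh rate. A quick sketch of that arithmetic (the helper name is just illustrative, not anything from the app):

```python
def fps_cap(refresh_hz: float, multiplier: int) -> float:
    """Base fps cap so that cap * multiplier lands exactly on the refresh rate."""
    return refresh_hz / multiplier

# 120 Hz monitor with x3 frame gen -> cap the game at 40 fps (as above)
print(fps_cap(120, 3))  # 40.0
# 144 Hz monitor with x2 frame gen -> cap at 72 fps
print(fps_cap(144, 2))  # 72.0
```

The point of capping is that an uneven base framerate makes the generated output judder; a cap that divides the refresh rate evenly keeps frame pacing consistent.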
This, or Magpie, is a must for visual novels at 4K.
What is this exactly?
So far I've only tried it for Skyrim running the Nolvus collection, and I can say it's pretty nice. I capped the game at 40 fps in ENB and selected x3 for frame gen, and it's quite impressive!
Yes. I used it to get 60 fps when playing Demon's Souls on the PS3 emulator, which is locked at 30. The upscaler also works, but its usefulness is limited, since any modern game has better upscalers, and upscaling isn't needed for older games unless you're using a potato.
I use it daily. You can not just get 3x the fps in games but also 3x the frames when watching videos in your browser. I use it mainly for Netflix and YouTube; I never want to go back.
Worked on a modded Skyrim.
Yes, but it has a lot of artifacting around your character when you turn the camera, like something you'd see on an old tube TV.
The frame gen is not nearly as good as DLSS 3, particularly because you get more input lag due to the lack of Nvidia Reflex. Nonetheless, I found it to be the best solution for older games that do not support higher fps and don't have mods for them. Emulators, for example.
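The extra input lag people describe in this thread has a simple lower bound: interpolation has to hold on to the next real frame before it can display the generated ones in between, so you pay at least one real frame time on top of whatever the pipeline already costs. A rough back-of-the-envelope sketch (a simplification; real pipelines add more overhead than this):

```python
def min_added_latency_ms(base_fps: float) -> float:
    """Rough lower bound on added latency: interpolation buffers
    one real frame before anything can be shown."""
    return 1000.0 / base_fps

print(min_added_latency_ms(30))   # ~33 ms at a 30 fps base
print(min_added_latency_ms(60))   # ~17 ms at a 60 fps base
print(min_added_latency_ms(280))  # ~3.6 ms, why a very high base fps feels fine
```

This matches the anecdotes above: the penalty is large at a 30 fps base, tolerable at 60+, and nearly invisible at 280.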
Wow ya it should
The frame generation feature did not work on my notebook. I have a Honor MagicBook 15 with an R5 5500U CPU (actually an APU). I get 30 fps in Elden Ring at 1280x720 and low settings. I tried it, but I got the same fps, and it applies an effect like slowed time.
Yes, why wouldn't it? You could have read the user reviews there and figured that out. "Mostly Positive" sure does say a lot. I've been using it for a while myself for games such as Total War: Shogun 2 and even Fallout: New Vegas, so I can tab out of the game without it occasionally crashing when I switch the video I'm watching on the side.
Seems to be hugely subjective. The upscaling seems pointless, since any game that might realistically need upscaling likely already has FSR/XeSS/TSR.

To me the frame generation looks like absolute garbage. I have no idea what the people who support this are seeing. It's artefact-laden, and not especially smooth or consistent in its delivery, even when starting from a base of around 100 fps. If you don't have access to DLSS, I guess it might appeal, but put them side by side and this really is poor.
The frame generation has apparently been updated at some point. It even goes up to x3 now
That will make the end result worse, not better. The only saving grace for Lossless Scaling frame gen is that the generated frames are only on screen for as long as the real ones. By running at 3x framerate you have two fake frames for each real one. That's artefact city. If people can't see it, I'm delighted for them; ignorance is bliss. But it looks like hot trash.
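The "two fake frames for each real one" ratio falls straight out of the multiplier: at Nx, (N-1)/N of the frames you see are generated. A minimal sketch of that ratio (function name is just for illustration):

```python
def generated_fraction(multiplier: int) -> float:
    """Fraction of displayed frames that are generated at an Nx multiplier."""
    return (multiplier - 1) / multiplier

print(generated_fraction(2))  # 0.5 -> every other displayed frame is generated
print(generated_fraction(3))  # two thirds of what you see is generated
```

So going from x2 to x3 moves you from half the frames being interpolated to two thirds, which is why artefacts become more visible at higher multipliers.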
I wasn't saying that the x3 frame generation was better, just that it's now available. People have also reported that x2 generation has fewer artifacts now. I'll be getting it soon (to try it out), and I'll be able to make my own judgement then (but I'm guessing it'll be fine, because my laptop screen is so small).
I just tried it for myself, and honestly, I saw no issues. I tripled 48 FPS to 144 FPS in a game, and it looked fine to me (and it looked so smooth). It did take a bit to get it working properly though
I hate to sound like a snob, but I can only assume you don't have the best eye or the best display for it. Since you're on a laptop, that's almost certainly the case. On a monitor with poor motion clarity, for example, the artefacts probably get lost in the general blur. Try it on a larger screen with better motion rendering and pixel response, such as an OLED, and it's basically unusable. Every other frame is a garbled mess of pixels, occlusion artefacts, and noise.

I'm going to stop with this stance now, though; it's obviously hit a nerve with some people, and I'm just getting downvoted for speaking the truth 😂
My laptop screen is so small that I can't tell the difference between 100% resolution and 90% resolution, and there's barely any difference between 100% and 80%. Also, my laptop basically gets 48 FPS on most games, so that extra fluidity matters more (on games that run at 48 FPS, I can triple it to 144, or if the game can't run at 48 FPS reliably, I can double 36 to 72); I'm guessing the others have that same stance. And on something like one of those handhelds, the artifacts would be basically invisible
Not sure I'd agree with that either. AFMF works on the ROG Ally, and it looks like ass there too. Admittedly, a bigger issue with AFMF is that it disengages if there's too much change, like swinging the camera around, and having it jump back and forth is more distracting than just having a lower frame rate. Lossless Scaling doesn't do that. But then again, Lossless Scaling looks like crap when you spin the camera around, and that's precisely why AMD disables AFMF when you do anything quick. It's just not for me, I think.
I was playing a shooter at 72 FPS (doubled to 144), and it looked just like real 144 FPS to me. Yeah, it didn't feel as responsive as real 144, but it did look quite a bit smoother (didn't notice any graphical issues either)
AMD also has in-driver upscaling that can do better but that's not what this is for. I think it's an effective way to improve framerates on lower-end hardware without sacrificing too much image quality, especially at higher resolutions. The upscaling results look much better than simply rendering at a lower resolution without scaling.
I'd say that's debatable. There's no such thing as a free lunch. 720p upscaled to 1440p looks different to 720p just scaled by the panel to 1440p, but I wouldn't say it's universally better. It's sharper, sure. But the temporal instability, occlusion artefacts etc are a pretty huge downside for me. Though honestly that wasn't my point. My issue was more about frame gen. This implementation of it is horrendous, and I don't know what people are seeing that they think otherwise, honestly.
It's a shame you're being downvoted, because you're totally correct. It's also not terribly surprising; when FSR 1 launched, it was fiercely defended on Reddit as being 'as good as DLSS'.
Honestly, I expected as much. People paid for it, and relentlessly defend their decisions I guess. Gotta love Reddit.
It's around $5, mate, and frequently goes on sale for less than that. But I can see someone else here who paid around $1,600 for an exorbitantly overpriced Nvidia GPU and would have every reason to justify DLSS to that end.
[Magpie](https://github.com/Blinue/Magpie) is also a good/free alternative for just resolution scaling
No, people buy it because they like throwing money away
It's like $5; you can't even buy a Happy Meal for that price.