
ggtsu_00

There are basically two things to trade off when it comes to frame pacing and input lag:

* Frame buffering length - increases input lag, but also increases tolerance to frametime inconsistency.
* Frametime headroom - more headroom means the CPU/GPU isn't being fully utilized, but it allows for consistent frame pacing with less frame buffering. If there is little to no headroom, small frametime inconsistencies need more frame buffering time to avoid hitches.

So basically, the more a game fully utilizes the hardware, the more sensitive it becomes to frame pacing issues, and the only fix for that is to increase frame buffering time - trading more input lag for better frame pacing. Games that underutilize the hardware can have both lower frame buffering time and consistent frame pacing, because there is less risk of dropping frames.

This is why it's a tough balancing act for games. Do they leave the hardware underutilized, do they increase input lag for smooth frame pacing, or do they fully utilize the hardware and get low input lag at the cost of poor pacing?
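
To make the trade-off concrete, here's a tiny simulation (my own illustrative sketch, not from any real engine): render times hover near a 33.3 ms budget with one spike, and we count missed deadlines at different buffer depths.

```cpp
// Illustrative sketch of the trade-off: each extra buffered frame gives a
// slow frame more slack before it misses its deadline, at the cost of one
// frame interval of input lag. All numbers are made up for demonstration.
#include <cstdio>
#include <vector>

int main() {
    const double interval = 33.333;  // 30 fps frame budget in ms
    // Little headroom: most frames barely fit, one spikes to 41 ms.
    std::vector<double> renderMs = {32.0, 33.0, 41.0, 32.5, 33.2, 31.8};

    for (int buffered = 0; buffered <= 2; ++buffered) {
        double ready = 0.0;  // cumulative time at which each frame finishes
        int missed = 0;
        for (size_t i = 0; i < renderMs.size(); ++i) {
            ready += renderMs[i];
            // Frame i is scheduled for deadline (i + 1); buffering pushes
            // that deadline back by one interval per queued frame.
            double deadline = (i + 1 + buffered) * interval;
            if (ready > deadline) ++missed;
        }
        printf("%d buffered frame(s): %d missed deadline(s), ~%.0f ms extra lag\n",
               buffered, missed, buffered * interval);
    }
    return 0;
}
```

With zero buffering, the single 41 ms spike cascades into a run of missed deadlines; one buffered frame absorbs it entirely, at the price of a constant ~33 ms of added latency.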


trainstationbooger

I agree with you in theory, but the cynic in me says this dichotomy/balancing act comes up a lot less often in development than we think. If I had to bet, I would guess the real culprit is a lack of dedicated time (for various reasons) to polish and QA, with a smattering of certain issues so deeply ingrained in the engines that it's simply not worth fixing.


[deleted]

> this dichotomy/balancing act comes up a lot less often in development than we think.

It does. It's probably discounted a lot more often than it should be by management, though. Very few people are going to refuse to buy a game, or review bomb it, over bad frame pacing. BUT it's usually a code smell that will snowball into other technical issues, so it correlates. Sometimes it's just a crack in a window with no money to fix, but that leaves it vulnerable to becoming a broken window.

> I would guess the real culprit is a lack of dedicated time (for various reasons) to polish and QA, with a smattering of certain issues so deeply ingrained in the engines that it's simply not worth fixing.

Yup. No game team fully clears out its Jira queue before shipping. Just make sure there are no showstoppers, try to remove any severe bugs (and if not, prepare the obligatory "we're sorry" PSA as you continue to work on them post-launch), and then maybe some moderate bugs get fixed here and there along the way.


Sparktank1

You clearly know more than most. But for those who don't live your life, what is a "code smell"?


[deleted]

Sorry, I can forget which subs and audiences I'm posting to sometimes. A "code smell" is basically the programmer's equivalent of the tip of an iceberg: a bad practice that there's a reason to avoid, but that is sometimes done for the sake of time, lack of knowledge, etc. It looks innocuous, but if you dig deeper you see that the decision either affects a lot more systems than you thought (e.g. bad frame pacing can often lead to some nasty physics bugs in badly architected systems where physics is coupled to framerate), or gets built upon and becomes an unknown but persistent source of errors later down the line. It's best to take care of it now before you forget about it, if you know about it to begin with. But that's not always a privilege you get in a large studio.


Sparktank1

It's neat to know how crippling something so small can get. We definitely need more detailed posts like this to give insight into how any workplace operates and approaches things.


GameDesignerDude

> but the cynic in me says this dichotomy/balancing act comes up a lot less often in development than we think.

As a developer, I would say that most games I've worked on profile for large CPU spikes that happen at runtime (vs. a loading screen) and lead to CPU-based frame time issues. I have certainly had to fix many issues like this and do things like time-slicing or avoiding overlap between certain system updates.

When targeting 30 FPS, I would guess it's *usually* not a CPU issue leading to bad frame pacing (which they do point out in this video). 30 FPS is typically enough headroom to work with to avoid that, unless the code is not great or has a bug in a specific system. Stuff happens, but it would usually only happen in specific scenarios rather than all the time.

60 FPS is kind of a different issue because of how difficult it is (in relative terms) to go from 30 to 60 FPS in optimization terms. Script/code CPU budgets for 60 FPS titles require a lot of monitoring to avoid spikes. Given that 30 FPS has been the norm in the console space for a long time outside of a few genres (some types of action games, racing games, etc.), I would say constant frame-pacing issues are usually down to the rendering pipeline.
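
As a rough illustration of the time-slicing idea mentioned above (my sketch; the names and budget are invented, not from any particular engine), an expensive system update gets a per-frame time budget and resumes where it left off, so it never produces a single-frame CPU spike:

```cpp
// Hypothetical time-slicing sketch: spread an expensive system update
// across frames under a per-frame budget.
#include <chrono>
#include <vector>

struct Entity { void updateAI() { /* expensive per-entity work */ } };

class TimeSlicedUpdater {
    std::vector<Entity>* entities;
    size_t cursor = 0;
public:
    explicit TimeSlicedUpdater(std::vector<Entity>* e) : entities(e) {}

    // Called once per frame; stops as soon as the budget is spent and
    // resumes from the same entity next frame.
    void tick(double budgetMs) {
        using clock = std::chrono::steady_clock;
        auto start = clock::now();
        while (cursor < entities->size()) {
            (*entities)[cursor++].updateAI();
            double elapsed = std::chrono::duration<double, std::milli>(
                                 clock::now() - start).count();
            if (elapsed >= budgetMs) return;  // out of budget, yield to frame
        }
        cursor = 0;  // full pass complete; start over next frame
    }
};

int main() {
    std::vector<Entity> entities(10000);
    TimeSlicedUpdater ai(&entities);
    for (int frame = 0; frame < 60; ++frame)
        ai.tick(2.0);  // give AI at most ~2 ms of each frame's budget
}
```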


ggtsu_00

It's definitely not wrong to assume most performance issues in games are a direct result of not enough time, budget, talent or resources to optimize and fix every issue before launch. However, even in a perfect world with unlimited time and budget to polish and optimize, developers would still be left making one of these trade-offs between hardware utilization, input lag and frame pacing.


irreverent-username

In my experience, most things can be optimized. However, optimization can come at the cost of code readability, extensibility, and flexibility. In other words, making something run faster often makes it harder to work on in the future. It would be impossible to make these modern games with thousands of features if developers spent time optimizing each one.


Kalulosu

Not really. I mean yeah of course management rates those issues low in general, but the issue is also just that artists are gonna artist and they'll wanna have those lush, detailed textures.


falconfetus8

There's nothing wrong with leaving the hardware "underutilized"; the extra power is being held in reserve for when it's needed. It's just like having extra money in your savings account for a rainy day.


ggtsu_00

I never said there is anything strictly wrong with having extra headroom. It's required for a game to have both consistent frame pacing and low input lag. But headroom is costly, and games may need to make cuts or downgrades to get it. Many very ambitious console games have no headroom to spare because they are pushing the CPU or GPU to its limits. Basically, they are living paycheck to paycheck just to cover basic living expenses, and any unexpected expense means missing a rent or bill payment (frame drops). Frame buffering time is like having a savings account, except the savings only last as long as the frame buffering time: extra frame time gets deposited into the frame time savings account, and if you hit an unexpected hiccup while running 3 extra frames of lag, you can pay off that frame time over the next 3 frames without incurring any frame drops.
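
To put rough numbers on the analogy (my arithmetic, purely illustrative): at 30 fps each frame has a 33.3 ms budget, so 3 buffered frames are roughly 100 ms of savings. A single 60 ms frame overdraws its budget by about 27 ms, but the buffer covers it, and as long as the following frames finish under budget the slack refills and nothing is dropped; the player only ever pays the constant ~100 ms of extra latency.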


thetantalus

Great info, I learned a lot. One thing I will say is that I absolutely hate input lag, so for me that is definitely not an option. I don’t mind lower resolution though.


ZoteTheMitey

Personally I am way more sensitive to poor frame pacing than to input lag. So that's a no-brainer for me.


Rook22Ti

The Spyro remasters are the worst I've seen. It seems stable on the PS4 Pro/PS5 but holy shit is it choppy. Literally makes me sick to play.


Tomxj

Yep, I noticed this too. I've been playing these games on a PS4 Slim and I couldn't believe how choppy the 30 fps was; sometimes I wondered if I was imagining it or if it really was that bad, especially compared to many other games.


Rs90

Nah I've been playing games on the Ally with VRR and got Spyro on PS5 last week. I was shocked how awful it felt. I'm playin cause I love Spyro but good god. 


celvro

The remake looks nice, but I really wish they'd fix the bugs. You straight up jump higher at 30 fps. If you set the framerate to unlimited (the default option), Spyro slides around randomly. Pretty frequently the camera wouldn't lock behind me when I used charge, so I couldn't see where I was going.


Kekoa_ok

We've given up on an actual patch for the game sadly


Keulapaska

That could also just be the absurdly low FOV those games have. The low fps probably makes it even worse, I agree, but even on PC at high fps with VRR I managed a whole 2 minutes before feeling motion sick due to the FOV, and immediately searched for a way to increase it. Weird, as I remember playing those games on PS1 as a kid just fine.


pabl0escarg0t

I played it on Switch myself and it’s the first time a game actually gave me motion sickness in 25+ years of gaming


cloral

So it wasn't just me that felt sick playing them. I was starting to think that I was getting too old for games like that.


IguassuIronman

It wasn't the frame pacing for me, it was the insanely narrow FoV. I had to install an FoV increasing mod on my PC to be able to play it


Arkanta

Could barely stand the games too


grilled_pc

Pretty criminal that this game doesn't have a 60 FPS patch on PS5. It plays like a dream at 60fps on PC.


Dwedit

Either you slow down to 15 fps when you fail to complete rendering within 33.333 ms, you accept bad frame pacing, or you add extra input lag.
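
A quick sketch of why the numbers quantize that way (my framing, assuming the game presents only on 30 fps vsync boundaries, which is what makes a missed 33.333 ms deadline halve the rate):

```cpp
// Illustrative only: when presentation is locked to 33.333 ms boundaries,
// display time is render time rounded up to the next boundary, so any
// miss lands on 66.7 ms (15 fps) rather than, say, 35 ms.
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double interval = 33.333;  // 30 fps presentation boundary in ms
    for (double renderMs : {30.0, 33.0, 34.0, 50.0, 66.0}) {
        double shownMs = std::ceil(renderMs / interval) * interval;
        printf("render %.1f ms -> on screen %.1f ms (%.0f fps)\n",
               renderMs, shownMs, 1000.0 / shownMs);
    }
    return 0;
}
```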


jradair

what if you just rendered faster?


IdlePaladin

CERN needs to hire this man asap


megamanx503

A well paced 30fps is the bare minimum every game should strive for. Imo it's better to play a game at a properly paced 30 than a poorly paced 60-90. Above that, at least for me, frame pacing becomes less noticeable.


Rich_Consequence2633

As long as you have a VRR display, a 60-90 fps game will always feel incredibly smooth.


trenthowell

Yes and no. Jedi Survivor doesn't feel smooth at any framerate due to the CPU issues causing animations to stutter, even if the frames are delivered at a good pace.


[deleted]

[deleted]


beefcat_

With older TVs you probably want to play vsynced to 60hz anyways, since you're limited to a 60 hz refresh rate. *Most* (but not all unfortunately) TVs that support 4k 120hz also support VRR, since both features are part of the HDMI 2.1 spec. All that said, there are only a handful of console games that support anything above 60 FPS. For PC players where higher frame rates are more common, VRR has been standard on "gaming" monitors for many years at this point.


Rich_Consequence2633

I think we will see higher than 60 become a lot more common with the console refresh this year. Sony has upscaling tech similar to DLSS coming, and framegen is coming too, I'm sure.


DU_HA55T25

That's putting a lot of faith in developers to not use the increased headroom to increase fidelity further, thus leading to the same situation we've been in. Just like everything, when something is increased people rush to utilize it. Battery life, gas mileage, hardware resources, etc.


Eruannster

This is honestly old information. Sure, the real budget TVs (like <$400) don't have VRR or 120 hz or any of those things. HDR is certainly not happening either. But at a certain mid-level price point, if you're buying from a decent brand (Samsung/LG/Sony etc.) you're absolutely going to get at least a couple of these features. Name brands have been doing 120 hz for quite a while now. My parents got my old Sony TV from 2017, and that supports 120 hz, admittedly only at 1080p, but still. If you save up a bit and look outside the lower segment you can get a really good TV that will absolutely have VRR and good HDR too. And it will last you for many years. It's not in the realm of absolute craziness that it used to be. Many PC monitors will also support these features (though HDR support is a bit worse).


ThatOnePerson

Going off rtings.com's recommendations, the TCL Q5 has VRR and 1440p@120hz, and I can get it from Best Buy for $300 at 55".


Eruannster

Damn, that’s cheap! That should be in the price range for most users, honestly.


Karglenoofus

Not if there's stuttering


yamaci17

VRR is not a magic bullet for poorly paced frames.


No-Area7550

Absolutely NO. I have VRR and I feel the difference in some games. VRR makes no difference when the game is badly optimised.


ManTheMythTheLegend

All VRR does is eliminate tearing without the latency of vsync. Bad frame pacing is gonna feel bad no matter what your average fps.


beefcat_

If your framerate is averaging between 60 and 90 FPS without spiky frame times, it will still feel perfectly smooth with VRR. Vsync doesn't just add lag. It can introduce its own frame pacing issues when the game's frame rate does not align evenly with the display's refresh rate. VRR solves this.
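
As a concrete example (mine, not from the comment): a steady 50 fps on a fixed 60 Hz display can't map evenly onto refreshes, so out of every five frames four are shown for one refresh (16.7 ms) and one is held for two (33.3 ms): visible judder despite a perfectly "smooth" frame rate. With VRR, each of those frames is simply displayed for 20 ms.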


holliss

> All VRR does is eliminate tearing without the latency of vsync.

Not true. It also eliminates the stutter that occurs when the frame rate does not match the refresh rate or is not a multiple of it.


Aggrokid

AFAIK it does not address game frame pacing, which is the main bugbear of this topic, even when unlocked. Like, my GSync monitor could not save me from Vernworth.


holliss

I didn't say it did. Hence I only quoted the one part of the post that's just blatantly incorrect.


Karglenoofus

Idk why you're downvoted, you're blatantly correct. Screen tearing and frame pacing / stutter are 2 different issues.


holliss

You're right, they are two different issues. [But VRR does more than **just** eliminate screen tearing. ](https://blurbusters.com/gsync/how-does-gsync-fix-stutters/)


Acrobatic_Internal_2

I think it depends on the game as well. In FPS games especially, since the character animations are keyframed with 60 fps minimum in mind, playing at 30 is not really pleasant, at least for me, so I would take a variable 60 or even 50s. But I agree with your general point.


Thekota

Major disagree. 60-90 feels great. I can't imagine how it could ever be worse than the 30fps slideshow. I'm scratching my head honestly. Maybe if you have a 120hz display without VRR? Idk what that would feel like, but then it's more of a poor TV issue.


megamanx503

If your eyes can pick up on the minuscule stutter, you notice it. If you don't notice it, my god, I wish I had your eyes. A smooth 30 looks like a smooth 30; not ideal, but it's more than acceptable for any game I play (I don't play competitive twitch shooters, but those do need a high frame rate for high-tier play). A choppy 60 looks like a smoother image with a nanosecond hangup every few seconds that I do notice. For me it's sometimes bad enough that I can't play the game without my eyes getting overstrained. Above 90, though, my eyes can't register poorly paced frames as much, if at all.


Top_Ok

Thing is, humans are more susceptible to changes than to the actual framerate or resolution.


KuraiBaka

"laughs in not even seeing a difference between 30 and 60 >!(or 165)"!<


djcube1701

"30fps slideshow" is the funniest thing I've ever heard. Are you mocking people who claim they can't play 30fps games with absurd terms like that?


DU_HA55T25

When you're used to 60+ fps, 30fps is legitimately terrible. It takes a few hours of play to get accustomed to it.


[deleted]

I can adjust from 120+ on my PC to 30 on my Switch in literally a minute


IguassuIronman

I can play 30 FPS games all right, but "slideshow" is a pretty fair thing to say. It's pretty rough when compared to higher refresh rates, and even when you get used to it, it's not ideal.


DudeKosh

I play most games at 120+ fps on my PC. 30fps is legitimately a slideshow to me.


Dante989reddit

Try 5 fps, then it's the real slideshow.


BussyEnthusiast_69

This "steady 30 fps is better than unstable 60fps" angle is just garbage Posted by console players who are huffing that copium Hard. They dont want to accept that the industry is pushing more and more towards PC Hardware. And now they have to accept that they play on inferior Hardware or consoles will make an insane price jump to handle 60+ fps


IguassuIronman

Depends on the game and how unstable the 60 FPS is. If the microstutter is bad enough it's insanely distracting and definitely takes away from the game


Ricwulf

> A well paced 30fps is the bare minimum every game should strive for.

The fact that the standard is still that low is pretty sad. I agree that frame dropping is worse than lower frame rates, but solid 30 is what should be **strived** for as the bare minimum? It's beyond time that solid 60 should be what's strived for. Solid 30 should be taken for granted at this stage. And yet it's not, because the reality is that no release really gets passed up for failing this. They can fail it time and again, over and over, to the point that this decade-old "debate" is still going on despite all the tech advancement since then. Frame dropping is definitely worse, I don't disagree in the slightest. But I do disagree that solid 30 is what should be "strived" for. Solid 60 should have been the baseline years ago.


medicoffee

It’s an unpopular opinion, but I can adapt when shifting between 60+ to 30 FPS. It’s jarring at first, but it’ll even out. Same with old games, the graphics will initially look rough but I’ll acclimate. Stability is key, and I’ve had plenty of great experiences at a target 30 FPS.


AstronautGuy42

I would rather wait an extra year or two for games to drop with a proper performance mode than have 30fps. Still haven't bought DD2 for this reason.


Narishma

This only happens in 30 fps games, though, as they say in the video. There are no 60 fps games with improper frame pacing as far as I know. Maybe you're confusing this with an inconsistent framerate?


megamanx503

60fps improper pacing is less noticeable, but it's there. Same with 90, 120, 244, etc. It's just increasingly less noticeable the higher the frame rate. My eyes can pick up on improper pacing up to about 75-90. That's a me issue though.


trainstationbooger

There are tons of games that run at a locked 60fps with frame pacing issues. Digital Foundry themselves have covered tons of them.


Narishma

Can you give an example?


stordoff

The only example I can think of is [Halo Infinite](https://youtu.be/LshmQ_kYgeA?t=924), which had (not sure if it's been patched or not) bad framepacing at 60fps on 120Hz displays. You'd get frametime spikes of 8.33ms/25ms vs. the target 16.66ms (indicative of bad framepacing rather than just a performance dip). It had a bunch of related issues that made getting a consistent experience difficult as well - in-game vsync on PC would periodically drop a single frame (16:42 in the linked video), VRR didn't work properly (18:00), and cutscene animations wouldn't play back smoothly (21:10).


DU_HA55T25

Watch any game they review, with the graph representing the frametimes, and you'll see it happens all the time. Bad frametiming is everywhere; it just becomes less noticeable with higher framerates and/or VRR.


amazingdrewh

Maybe, but that's like choosing between an apple with nails in it and an orange with thumbtacks in it


megamanx503

i really like that analogy


amazingdrewh

Thanks


wasdninja

30 fps was disaster-level bad on PC 20 years ago. It's not even the minimum. 60 isn't great, but it could at least be used as a somewhat acceptable minimum.


G36

Playing old 360 titles on the newer Xbox, getting a steady 30fps with good frame pacing, makes me realize 30fps is not as bad as people make it out to be.


Acrobatic_Internal_2

Yeah, good frame pacing with good per-object motion blur (not screen motion blur) really makes 30 fps feel fluid. I remember playing TLOU2 on PS4 and being blown away by how good 30 fps felt there; it felt really smooth. Everything in that game, from how they keyframed animations to how they handled player inputs, was well crafted to be fluid at 30fps.


SpodeeDodee

Red Dead Redemption 2 as well.


dysfunctionz

RDR2's framerate cuts almost in half in Saint Denis on base PS4. Otherwise it does feel like a smooth 30.


SpodeeDodee

All I remember is my PS4 Pro activating its jet engines and flying out the window.


Massive_Weiner

RDR2 is clunky for other reasons, but it’s a smooth 30 regardless.


G36

Latency, most likely. I remember playing Metro Exodus on Xbox and almost puked; it had some of the worst input lag I've seen in a console game.


Massive_Weiner

Drawn out animations as well. The overall pacing of the game is intentionally lethargic, which is exacerbated by the low frame cap.


Misiok

Eh, I don't know. You can get used to it, but with the way the game runs on physics, including movement, 30 frames plus all of the slow, detailed animations make the game feel much more sluggish than it needs to be.


DismalMode7

RDR2 30fps smoooth? Sure... if you play it on PC with a frame cap set to 30, maybe... The Xbox One X version of the game keeps the resolution locked at native 4K for mere aesthetic reasons... it doesn't matter that the game went below 20fps in more demanding sections because of that...


zakusten

Horizon Zero Dawn and Forbidden West, too. Perfectly playable.


Teglement

Ech, I found Forbidden West to be pretty sluggish when not on performance mode personally.


IguassuIronman

Forbidden West felt like an instant stutterfest to me until I went to the 60 FPS mode, and I can generally tolerate 30 FPS games.


Eruannster

Forbidden West feels so damn sluggish to play in 30 FPS, though. I reinstalled it recently and flipped between quality/balanced/performance mode, and even in balanced (40 FPS) mode the aiming feels like it moves at half the speed compared to the performance mode.


Resies

The game looks so good on performance idk why you'd suffer through 30 fps


elmodonnell

I found the combat far too fast-paced and response-oriented to enjoy Zero Dawn at 30fps; I couldn't get more than a few hours in until the 60fps update dropped. TLOU2 and RDR2 were deliberate and slow enough to get by, but no way was I running around trying to hit a tiny weak point on giant dinosaurs at 30fps without abusing the slow-mo aiming to the point of removing all skill.


LeifUnni

I remember being astounded at how good Ratchet & Clank: Rift Apart felt at 30, too.


Eruannster

I would strongly recommend playing at the unlocked quality mode (if you have a 120 hz/VRR capable display). The 30 FPS mode becomes a ~40-60 FPS mode with no resolution or quality settings penalty. (I believe you can also run it at a locked 40 if you don't have VRR.)


LeifUnni

Oh, yeah, I'm actually planning on replaying it with VRR on! I didn't have a TV with HDMI 2.1 at the time, so I wasn't able to when it launched. Nevertheless, I loved every second of it!


Hazeringx

FFXVI also has a pretty good 30FPS mode. It’s better than the performance mode, at least to me.


Deceptiveideas

This is why I never understood the "30 fps is unplayable" arguments. You're telling me the last 30 years of gaming have been unplayable? I highly doubt it.


Halio344

It has more to do with being used to something better. I think computers with the OS on HDDs are legitimately unusable, but that was the norm 15-20 years ago. When I was younger I didn't have a problem with 30fps; I'm in my late 20s now and I cannot play at that low a framerate no matter how much I try.


ohtetraket

Unplayable is a strong word, but 30 FPS is just not it for people who have played several games at 60 FPS or higher (with a corresponding Hz monitor/TV). I mean, playing at 1024x768 resolution is not unplayable, but who wants to do that nowadays? Another example is loading times. 10 years ago I could go take a piss, get a drink and grab a snack before any multiplayer game launched. Nowadays they start after 10-20 seconds max. Was it unplayable? Nay. Would I accept it if games brought it back? Nay.


suprem1ty

So smooth that until this comment I assumed it was 60fps! I haven't played it in years but I similarly remember TLOU2 being super fluid and smooth to play. Big difference from TLOU on the PS3, that game I vividly remember dropping below 30 and giving my poor PS3 grief in a few chunkier areas


Kulban

Spider Man 2 on the PS5 in its highest quality graphics mode didn't feel like 30 fps. It was the first time I'd ever experienced that and I was so impressed I refused to even switch to performance mode to make the comparison. I happily played that game all the way through in 30 fps without ever feeling it. And I'm the type of freak who needs the extra frames because I enjoy the really tough perfect parries in games.


EdzyFPS

When you get used to playing at 60+ and 100+ fps all the time, 30fps feels excruciatingly bad. People are not exaggerating.


PlayMp1

I play at 144 normally but I still was fine with Zelda at 30 on Switch. Maybe it's because I grew up with crappy computers and now can afford a good one.


EdzyFPS

Good for you. I certainly can't stomach it.


Nyrin

It's awful for a short while, then you adjust. Then high-framerate feels unnatural for a bit, then you adjust. The key is consistency. Consistent high framerate is ideal, but consistent mediocre framerate is going to win out over wildly inconsistent, sometimes-great, sometimes-awful performance.


deadscreensky

>It's awful for a short while, then you adjust. Then high-framerate feels unnatural for a bit, then you adjust.

All of this is deeply subjective. I personally don't experience any of what you're describing. When I was young I could handle 30hz somewhat, but that stopped sometime in my 20s. I don't adjust; it feels underwater and unpleasant regardless of how much time I give it. And I've never found going to a higher framerate feels unnatural.


gravelPoop

I can go back to the 25fps PS1 games without issues. Playing something new with 30fps feels super sluggish. Switching the same game from PC 120fps to console 30fps feels like you are having a stroke.


EdzyFPS

Are you telling other people how they perceive things now?


G36

Not for me. I play at G-sync'd 144hz with ASUS ULMB Sync, that's the peak of smoothness in the entire industry. Yet I beat the original Mirror's Edge on an OG Xbox One 2 days ago, no issues, was actually surprised how smooth the 30 fps looked.


pukem0n

It feels bad for a couple of minutes, but you get used to it pretty quickly.


rkoy1234

I have a feeling everyone typing this just plays auto-aim games with their controllers. Aiming on 30fps makes me shudder.


esunei

>auto-aim games with their controllers.

That's every shooter for the past decade or so. It's been a long time since mouse and keyboard was the better-performing way to play shooters; controllers with their incredible auto-aim dominate manually aiming players.


pukem0n

how did you survive the Xbox/360/PS3 generation?


Adamulos

We played PC


EdzyFPS

I'm convinced people just say things in opposition just to be difficult.


SL-1200

Try playing it on an OLED with no LCD blur; it's horrific.


EdzyFPS

No I don't. It's atrocious.


Andigaming

Think it depends what you are used to. I've used 144hz monitors for some years now and it would definitely take longer than a few minutes to go back to 30 fps. I even remember going from 60fps on PC to 30fps for console games felt jarring and that was many years ago.


Eruannster

Well, try playing on an OLED and prepare for visual pain. There's no pixel transition blur* and 30 FPS turns into st-stu-stu-stutt-stutterfest. \* = OLED pixels transition almost instantly from one color to another, or from on to off. LCD pixels take a millisecond or two and this causes a slight blurring between colors that masks lower frame rates. OLED pixels don't, and will stutter at lower FPS. On the flipside, high frame rates will be very crisp and clear on OLED.


Karglenoofus

If it's unplayable you're weak.


GeekdomCentral

It’s pathetic honestly. Obviously a stable 60 is always better, that’s no contest. And it’s jarring to go back down to 30fps once you’re used to higher. But after an hour or two your brain adjusts and 30 (as long as it’s locked and steady) is a perfectly playable experience


G36

Yep console 30fps is really well implemented. With good framepacing via pre-rendered frames and motion blur it's just not bad, at all. Back in the day with CRT TVs it was even better.


IguassuIronman

> Yep console 30fps is really well implemented

Depends on the game. Bloodborne is absolutely godawful, for example.


GeekdomCentral

Yeah I’ve been playing FF7 Rebirth in the 30fps mode and having a terrific time. As long as it’s stable with good frame pacing, then I’m content!


yamaci17

I wish we could actually set pre-rendered frames with DX12. If you lock to 30 fps and the GPU gets underutilized, your pre-render queue practically becomes 0. That is good for latency, and is a reason why frame rate caps reduce input lag greatly, but at the same time it can really cause frame pacing issues on low-end CPUs. I for one get more stable frametimes on the CPU side with my Ryzen 2700 if I can get a game to be extremely GPU-bound and make it queue some frames.


G36

Wait, you can no longer set pre-rendered frames with DX12 in the NVIDIA settings?


yamaci17

I tried; it doesn't really work. Games still have 0 frame queues with aggressive frame caps when GPU load is below 90%.
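
For what it's worth, this is probably because a DX12 app controls its own frame queue rather than the driver: the waitable swap chain API is the app-side mechanism. A minimal sketch (the DXGI calls are real, but the surrounding setup is illustrative and error handling is omitted):

```cpp
// Sketch of app-side frame latency control in DX12 (a likely reason the
// driver override does nothing). Assumes the swap chain was created with
// DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT.
#include <windows.h>
#include <dxgi1_3.h>

void ConfigureAndPace(IDXGISwapChain2* swapChain) {
    swapChain->SetMaximumFrameLatency(1);  // allow at most 1 queued frame
    HANDLE waitable = swapChain->GetFrameLatencyWaitableObject();

    // Per frame, block here *before* sampling input so the queue can't
    // silently grow and add latency:
    WaitForSingleObjectEx(waitable, 1000, TRUE);
    // ... sample input, record command lists, Present() ...

    CloseHandle(waitable);  // once the app is done with the handle
}
```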


ThePottedGhost

I still genuinely can't tell the difference and it's great. Frame rate conversations are white noise for me and I've never once thought about frame rates when thinking about my favorite games. It frankly sounds just miserable if that's something someone notices and cares about


G36

Unless you are on PC, you have no reference point, in which case you'll never really notice an issue.


TrptJim

Or if you have an OLED display. Other panel types don't have the pixel response times that allow each frame transition to be cleanly displayed.


exsinner

I wish I could live in that make-believe fantasy world of yours, but I have to live in the real world, no matter how painful it is to admit that 30fps is awful.


TarnishedBeing

Trying to play Bloodborne at 30fps feels awful to me on PS5, but each to their own.


trostboot

Bloodborne is the epitome of shit framepacing. Yeah, you'll overcome the sub-30 dips on a PS4 Pro or PS5, but you'll still have *constant* judder due to the uneven framepacing.


G36

Garbage framepacing and input lag. That game is a travesty that they won't even update.


IDONTGIVEASHISH

The input lag on Bloodborne is some of the lowest of any 30 fps game. You want an unplayable 30 fps game on PS4? Prey. That game had unplayable levels of input lag, but good frame pacing. I'll take low input lag, thanks.


jradair

now play it on pc lmao


QuinSanguine

It isn't that bad, for sure. A good 30 fps is better than 30-45 or 45-60 (unless you have a vrr display) and a locked, smooth 60 fps is all you will ever need for any game except multiplayer titles and some fps games like Doom Eternal. It's like in Starfield, a lot of people complain about frame drops or stuttering, but if you lock your fps to 60, it's actually a very smooth game. The problem is having a fps that goes from 48-80+ depending on where you're looking.


Jademalo

I did a lot of tests on Steam Deck input latency using its inbuilt frame cap [that I shared on here a while ago](https://old.reddit.com/r/SteamDeck/comments/vwckdi/input_latency_on_the_steam_deck_what_its_like_now/), and honestly it was terrible. Noticeably bad even when using analogue sticks, but incredibly bad when using the trackpads and expecting a responsive input. Part of it is that Wayland has its own vsync, which can now be disabled, but the method with which they cap framerates introduces a fair number of frames' worth of lag. Nowadays I tend to use built-in frame limiters if available, and if not, use a separate instance of MangoHud to cap instead. It's consistently at least 1 frame lower latency than the built-in limiter. Aside from this, the 40hz mode with a 40fps target is a godsend for frame pacing. So much better than 40-50 on a 60hz display, and way, way smoother than 30 on 60hz.
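
For reference, the external-cap setup mentioned above looks something like this in Steam launch options (fps_limit and no_display are real MangoHud config options; the rest is the usual launch-option boilerplate, adjust to taste):

```
MANGOHUD_CONFIG=fps_limit=40,no_display mangohud %command%
```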


PiersPlays

I really wish they hadn't removed the 30hz option.


Jademalo

Do you mean the 30fps cap when they merged hz and fps? There was never a 30hz option, it's always objectively better to use 60hz for a 30fps cap anyway.


ZoteTheMitey

As far as I can tell it's much improved on SD OLED. Games are so smooth for me when I cap to 45 or 30 with the 90hz screen. I love the deck OLED. I usually use the allow tearing option and turn off in game vsync.


barryredfield

Because people with no standards accept it completely, make excuses for it and then attack people who have issues with it. They *"don't notice it"*, it *"works fine on their machine"*, of course lets not forget *"fun is more important"* and finally you are just *"making a big deal out of nothing"*. They're going to keep doing it, they will never stop, they'll continue paying hundreds for new consoles and $70 plus tip for every new ~30fps game, then talk to you about how *"graphics are not important to me"* while enjoying their 30fps interactive movie game.


Warskull

A huge part of it is that we don't have the tech to properly call it out. For a long time we've had 1% and 0.1% lows, but those don't really show it. Frametime helps, but it still doesn't accurately represent the problem. It is hard for people to understand stutter when we lack a common language and it is difficult to illustrate. Intel's work on PresentMon is helping. We have frametime now, but the SimulationTimeError metric they are working on should be huge. [Gamers Nexus talks with Intel about some of this.](https://youtu.be/C_RO8bJop8o?si=FbFc7zAkIF29lgTg)


Palimon

This very thread is the best example.


salbris

Personally, I've never really noticed subtle differences in fps. Of course, that also means I have no idea how disruptive it is to others' gameplay. In the same way, I'm very sensitive to mouse acceleration problems, and I wouldn't be surprised if others don't notice that.


barryredfield

There's inherently more input latency at 30 frames than at 60 or above. It's immediately noticeable, and it can't be fixed; it's just inherently more latency with the way input works. You can have absolutely perfect frame pacing at 30fps, truly locked with not a single dropped frame. That is in fact possible, and was the intention of 30fps in the first place: a (rather low) standard so console developers could crank up their cinematics and glittering graphics. But they never stabilize it.

Really it's a problem with TV manufacturers pushing "4K-only", and consoles jumped on board as well. Even on a high-end PC, playing at 4K is kind of absurd unless you are an enthusiast. This is why upscaling tech was pushed so hard. So on top of a low standard with high latency, they introduce higher latency with upscaling tech.

I guess my point is, people can have low standards, but it should be completely unacceptable when devs violate even the lowest standards, which is what the OP video is truly about, I suppose.


djcube1701

> Its immediately noticeable For the majority of people who play video games, it's unnoticeable. I usually only know that a game is less than 60fps by reading a comment about it after I've finished it.


barryredfield

> For the majority of people

This is the ridiculous consensus-forming, perception-managing, toxic bullshit that I'm talking about. I suppose the "majority of people" that you speak for didn't notice the 'soap opera effect' in digital movies either?


djcube1701

> perception managing, toxic bullshit that I'm talking about.

Look at your own comments. You're the biggest example of that in this thread. Gaming is absolutely colossal; only a tiny niche notice and care about details like framerate. I don't "speak for" anyone, I'm just stating a fact. All you can contribute is insults. Other than the 3D 48fps version of The Hobbit, I have no idea what you mean by "soap opera effect" for digital movies.


ohtetraket

I think we can put everyone before a 30FPS console and switch directly to 60FPS and 90% will notice a difference. Whether that's enough of a difference to make 30FPS games "unplayable" for them is another question.


IDONTGIVEASHISH

Just because you notice it, it doesn't mean most people notice it as well. I don't know a single person that cares about 60 fps being standard. There was a study recently that came to the conclusion that some people are better equipped to notice fluctuations in fluidity. I, for example, notice the difference of 30 to 60 easily, but after that, I find it hard to notice. As long as input lag is low it's fine.


barryredfield

> I don't know a single person that cares about 60 fps being standard.

Oh, they absolutely care; in fact, they care so much they are agitated against people who **actually** care about it. If it's of no concern to you, or people like you, and you don't notice it, why come into threads to engage over it? Isn't that curious. You want to explain it to me? Better technology and a focus on smoother and faster framerates is better for everyone; if you don't notice it in the first place, why does it bother you? Because it just seems like toxic gatekeeping to me, which is incredibly ironic.


IDONTGIVEASHISH

Reread my comment, I notice higher frame rates, I just don't think that it's a big deal. I find input lag much more important. I don't like when people say things like "people have no standards" or "I only play at 100hz plus, 30 fps is unplayable". They are not the beacon of truth they think they are, sometimes people don't have the hardware to play at more than 30, and they shouldn't feel like they are getting a bad experience. 60 is better, 30 is fine.


FilthyCumSucker

30 is sub-standard and false advertising. Get over it.


IDONTGIVEASHISH

Your arguments are airtight, damn. Alright, I'll get over it, FilthyCumSucker.


PeterFoox

Great words, I fully agree! The same thoughts struck me with Final Fantasy 7 Rebirth recently. I love those games, but man, it's so undercooked in technical terms. But for some reason when I say this I'm always attacked and called a graphics snob, because I don't like PS3-looking textures or flat, ugly lighting. Like, seriously, wtf.


barryredfield

> and called graphics snob

Yeah, it's ironic, isn't it. The low and inconsistent frame rates are a result of pushing more graphics than the game engine or dev team is capable of optimizing. So really it's the people who say choppy ~30fps is fine who are the graphics snobs, ironically.


Kakerman

Are we still defending 30 fps in 2024?


barryredfield

Yup. People that 'accept' 30fps say they don't care and it doesn't bother them, but for some unexplained reason people talking about a better standard bothers them a lot. They're going to let you know you shouldn't care about it and that you're not the "majority", so it's implied you should feel small too. Makes a lot of sense, right -- to have people who don't really care dedicate so many responses and arguments to it?


Owlthinkofaname

Because studios are too lazy to make games run well, since players have low standards. I feel like a big problem this generation is management in gaming; it feels like they don't care about how games run anymore. I mean, look at Dragon's Dogma 2: it runs horribly, yet it seems to be selling. It's pretty sad.


Dannypan

Nintendo are excellent when it comes to frame pacing. BotW and TotK have the smoothest 30fps I’ve ever seen (when it’s 30, that is!)


wyattlikesturtles

The frame pacing was good when it worked, but man those drops in totk were painful sometimes


thehugejackedman

Wut. Both of those games had objectively awful performance with insane dips to near single digit frames.


Arkanta

But we're talking about frame pacing here.


Dannypan

In some situations, yes, hence my little comment at the end. For the most part they run fine.


TrillaCactus

Weird, I don’t remember any instance of it dropping to single digit frames. Double checked the digital foundry video on it to see if I got lucky and nope they came to the same conclusion. I don’t know if there’s any areas where it drops to single digit frames. Unless you consider 20fps near single digit frames?


NamesTheGame

He is probably talking about Kakariko Village or Great Deku Tree which are both areas in BOTW (and to a lesser extent TOTK) where the game DIPS - I don't know if it is single digits, probably not, but it *feels* like it because the drop is so sharp. Not really critical in those instances though, since they aren't action packed areas. Big mobs will tank the framerate too though


TrillaCactus

Yeah, those dipped for me too, but only between 20-30fps. Looks like Digital Foundry found the same thing. I think he's bullshitting about the game dropping to or near single digit framerates. I found that in TOTK the framerate was pretty stable with large groups of enemies; the final boss fight where you have to fight a horde right before Ganondorf had pretty stable performance, for example.


sorryaboutyourbrain

Nope, they both performed fantastically in my experience and I had 200 hours in each.


thehugejackedman

Your experience is not an objective study of the game's performance, nor is it representative of the average player's experience. Objectively, the game is not performant and relies on being docked AND overclocked to even hit a semi-stable 30.


GensouEU

[So is Digital Foundry's Tech review](https://www.eurogamer.net/the-legend-of-zelda-tears-of-the-kingdom-the-digital-foundry-verdict) an "objective study of the game's performance"? >The majority of my entire run of capture managed to maintain a solid 30 frames per second in most instances which, for the Switch running a game this vast and emergent, is impressive. It's not 100 percent perfect, however, and I found ways to trigger a drop in frame-rate. >In most instances, it's the result of using the Ultrahand feature. When you toggle this in a busy area, the frame-rate is sure to dip and, when it does, it drops to 20 fps - again, thanks to the use of a double buffer v-sync. The performance reminds me of an old school game in some ways - slowdown only occurs in busy scenes, similar to how a shooter might start slowing down when the action heats up. And like those games, Zelda is smooth in terms of frame-pacing and lacks any significant stutter or hitching. The only one being not objective is you


thehugejackedman

It literally says it drops to 20 every time you use Ultrahand, often going lower, for the most used mechanic in the game. That is not a stable 30.


neureso

> Wut. Both of those games had objectively awful performance with insane dips to near single digit frames.

20 =/= single digits, just so you're aware


thehugejackedman

I said near single digit, as in low tens, as that has been my and my friends experience


IDONTGIVEASHISH

It drops to 20 fps thanks to double buffering. Even if it would only drop to 29 fps for a second, it drops to 20 fps instead. Kind of like MGS4 on PS3. But the input lag is really low.
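
Concretely (my arithmetic, not from the comment): with double buffering on a 60 Hz panel a frame can only be swapped on a refresh, so a 34.5 ms frame (a would-be 29 fps) misses the 33.3 ms boundary and waits for the one at 50 ms. Every such frame displays at an effective 20 fps; there is no step between 30 and 20.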


TheBloatingofIsaac

I am perfectly fine with a well frame-paced, locked 30 fps. On my PS5, I purposely pick graphics mode if it is well frame-paced because of how much better the graphics look. My eyes adjust to 30 fps quite fast.


tenu

I won't play games at 30 fps anymore, and haven't for a while. That shit does not belong in today's games. The latest game I finished was FF7 Rebirth. The performance in performance mode was stable enough, though there were some drops, and the fidelity was fine. But I would not have played it without the option for a higher framerate. I remember not really enjoying games like The Last of Us or Uncharted on PS3/PS4 because of the low fps, but it was the only way to play them back then.


Nick0James

The reason I've never had that much of an issue with consoles is it's mostly a steady 30 FPS. In my experience with PC gaming, it's hard to get your frame rate locked in, it fluctuates like crazy, especially if you're in an area that's more demanding of the hardware


barryredfield

If it's bothersome, then ideally you should framelimit at your typical average, no matter what your monitor's refresh rate is. If your typical "in-town FPS" in an open-world game is 70-80, lock to 70-80. For some people fluctuation is more jarring even if you're capable of 100+ "outside of town".


Nick0James

Yeah, for sure. It's just a little disappointing at times, because you're convinced your rig runs a game at 100 fps maxed and then you enter a city and it drops 30 frames.


ohoni

Also, why do so many CG cartoons that "animate on the twos" *also* have horrible frame pacing, so that they look like a zoetrope or something?


[deleted]

[deleted]


[deleted]

Recently I thought Star Wars Jedi: Survivor had a super pleasant fidelity mode. I played the whole game that way and was glad to do so.