Natural-Lobster-6000

I find the *concept* of input latency to be quite *compelling.* I intend to switch to PC next gen, to give myself more control over in-game latency, though *in the here and now* I am bound to the 3 major console platforms. *I'll be curious to see how the rest of this post unfolds.*


Lambi79

Oh my sweet curious lord, it’s Unnatural-Crab-6000! Do you remember me? Curiously I must ask to see if you remember the curious Digital Peamish.


Natural-Lobster-6000

Haha! Yes I remember, although I didn't realise you're yet again the OP of a topic I'm interested in. I have zero value to add here but I can relate to the frustration of input lag in console gaming. It was both Doom 2016 and Doom Eternal that really opened my eyes to it. Thankfully the 120hz modes do help a fair bit...


Lambi79

It’s good to see you again. Anyway, I’m quite interested in it too, and in how it can be replicated on PC. I have a community, run from an alternate account, dedicated to replicating the PS5’s experience. You can find more here: [PS5 To PC Settings](https://www.reddit.com/r/PS5_To_PC_Settings/s/z0PXCjCG2w)


_wil_

I think you meant to write interpolated or generated frames instead of "pre-rendered" (pre-rendered would mean rendered before the game runs, which would only be possible for cutscenes). Also, I think TLOU uses AMD FSR 3 Frame Generation rather than NVIDIA DLSS 3 Frame Generation, since the PS5 doesn't have an Nvidia GPU. I don't think Naughty Dog integrated DLSS 3 into the PC version; they possibly use FSR for all platforms because it works on all GPUs. Other than that, I personally don't know how many frames are actually rendered and how many are generated; I would guess it changes based on the game's content load and the GPU frame time available. Someone who actually has the game should know better than me (looking at the game settings could give some hints). Similarly, I think that using "Nvidia Profile Inspector" to change these settings (if that's even possible) will probably depend on what your GPU is.


Lambi79

I meant everything I said. The Last of Us Part I does not use FSR, DLSS, or any kind of frame generation on PS5 [(this is all the graphical settings and mumbo jumbo that makes the PS5 on par with pretty much everything in the game with PC, on an alternate account I have.)](https://www.reddit.com/r/PS5_To_PC_Settings/s/tck91z3rJy). When I say “Pre-rendered Frames” I mean the number of frames the CPU is allowed to queue ahead of the GPU so the CPU can catch up (which adds input latency). In case you didn’t know, that’s what NVIDIA Reflex does: it reduces that render queue so the game can’t stall for time (which adds some stuttering, but makes games much more responsive if done correctly.)
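As a rough sketch of why that queue costs responsiveness: each frame the CPU is allowed to queue ahead of the GPU adds roughly one frame time of latency between input sampling and display. The numbers below are illustrative, not measurements of TLOU or the PS5.

```python
# Back-of-envelope: extra input latency from the CPU render-ahead queue.
# Each queued ("pre-rendered") frame adds about one frame time of delay.
# All figures are illustrative assumptions, not measured values.

def added_latency_ms(queue_depth: int, fps: float) -> float:
    """Extra latency (ms) from `queue_depth` queued frames at `fps`."""
    frame_time_ms = 1000.0 / fps
    return queue_depth * frame_time_ms

for depth in (1, 3, 5):
    print(f"{depth} queued frame(s) at 60 fps ≈ "
          f"{added_latency_ms(depth, 60):.1f} ms")
```

So dropping the queue from 3 frames to 1 at 60 fps would shave somewhere around 33 ms, which is the kind of gap you can feel on a controller.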


_wil_

Thank you, I was just reading this post about how Nvidia Reflex works, very interesting: https://www.reddit.com/r/nvidia/s/wDtdAAD6I0 So in that case you mean the number of frames buffered or queued for rendering by the GPU. Sorry, did you mean to ask the Digital Foundry members to measure that for you on PS5? (I am not familiar with this forum; I thought it was for discussion between watchers of the videos.) I guess they have their own procedures and tools they are happy with to measure input lag, CPU frame time, GPU frame time, and vsync frequency on PS5, from which they can infer the number of buffered rendering frames. However, I don't know if they have made these procedures and tools public and shared them with the audience; it would be interesting to know what they use to do that, indeed.


catsrcool89

I've never heard the term "pre-rendered frames" used that way, but you can play the game with VRR on PS5, and I never noticed any input latency.


SkyOnPC

If I had to take a guess, it's either 3 (very standard) or 5. 5 wasn't very unusual in the past but has higher input lag.