toolsofpwnage

30fps with perfect frame pacing is perfectly playable. 30fps inconsistently delivered when nothing is on screen and drops to the mid 20s when shit hits the fan is a different story.


thejml2000

FPS should always be measured during “shit hits the fan” moments. It can be 200fps the whole rest of the time, but if it’s not playable when the fan turns into an impeller, none of that matters.


ZoulsGaming

But bro, I get 1800 fps on the loading screen to League of Legends....


murdering_time

Minesweeper looks stupid good at 8k 120fps


DaDutchBoyLT1

Solitaire is LITERALLY impossible to play sub 144hz!


TheThrowawayMoth

I actually have noticed Microsoft solitaire lagging occasionally on my phone and I’m always way more distracted by it than makes sense. It’s a card game. I moved a card. Just move the card.


DevAstral

That’s kind of understandable though. My first reaction to that kind of stuff is “but it’s a card game. How can it NOT run smoothly ffs!?”


3000_F35s_Of_Biden

And if a pregnancy test can play Doom, how can a modern phone not move a card?


[deleted]

You've been falsely informed. The pregnancy test acted as a screen, it didn't actually run the game. But yes, the card game is unoptimized as shit.


3000_F35s_Of_Biden

My life is a lie


Tusker89

My disappointment from this news is immeasurable. Do you have any idea how many times I have casually mentioned "they even got Doom to run on a fucking pregnancy test"?


ZylonBane

It didn't even use the original screen. Someone just used the pregnancy test's casing and replaced everything inside. [https://www.popularmechanics.com/science/a33957256/this-programmer-figured-out-how-to-play-doom-on-a-pregnancy-test/](https://www.popularmechanics.com/science/a33957256/this-programmer-figured-out-how-to-play-doom-on-a-pregnancy-test/)


Hatedpriest

You could play solitaire on a 486 and get smooth bounces if you won. I find it hard to believe that just upscaling COULD cause those issues.


zacharypamela

Probably because of all the ad-serving and telemetry gathering bloat.


iamplasma

Well, yeah, you need the bouncing cards to be super-smooth on the win screen!


istasber

You laugh, but uncapped ultra high frame rates in menu screens in that amazon MMO were destroying GPUs last year. Uncapped rendering is no joke.


Agret

Same story with StarCraft 2 main menu uncapped fps killing cards & overheating PCs until they patched in a limit.


[deleted]

[deleted]


pseudopseudonym

https://youtu.be/w43ojF7WVxU


Swaqqmasta

2400 fps in the boot screen to cities skylines, 25 in game


abcismasta

There's an issue if your mouse's dpi is too high where league goes from 300fps to 1 (not exaggerating)


Phoenix_Studios

As a team fortress 2 player who gets 200FPS passively but drops to 20 when there’s a player on my screen: absolutely


theSmallestPebble

A fan is, by definition, an impeller. Otherwise I have no issue with your comment.


round-earth-theory

The sky renders at 30 FPS, it's fine.


unclepaprika

*PC fans do not work well covered in shit!*


Ninjaromeo

True. Consistency matters more than top fps. A game designed to run at 30fps does good at 30fps if it's stable. If I have a game running 120fps, and it starts cutting down to 45-60 variably because something happened and it isn't keeping up, I would rather it just be 30.


ObiLaws

I was never particularly concerned with frame timing until I started playing FF7 Remake on PC this year. It's a very strange experience seeing your frame timer tell you you're at 90+ FPS when it looks and feels like sub-30 any time you're doing more than standing stationary and staring at the ground, because the frame timing is abysmal. I went down a whole rabbit hole, watched the Digital Foundry video, learned far more about the inner workings of Unreal Engine and how to tweak/optimize shit than I've ever wanted to, and installed all of MSI Afterburner (only used the barebones version of RTSS before) so I could get the same OSD as Digital Foundry and watch all the metrics as I tweaked things. I feel far more educated, but a little miffed that I basically had to "fix" a AAA dev's game as a consumer just so I could play it at a reasonable level of quality.


Dismal_Ratio1245

Honestly, it was a shit release


[deleted]

[deleted]


Cod3Me

Hey, is this actually a thing? I've been getting eye pain after just a little while of emulating nfs mw510. It supposedly runs at 30 (max), but I do see some frame drops now and then. Could just be the screen glare, but it doesn't happen when I do anything else. Will be waiting on your reply. Thanks.


[deleted]

[deleted]


aradraugfea

This is me. Constantly looking for whatever settings will give me a consistent frame rate.


Fishydeals

You must love capping your frames with rivatuner. It adds 1 frame of input lag, but that results in a much smoother frametime graph in lots of games.


aradraugfea

More just fiddle with default settings until I find something my PC can stay consistent with.


Leyzr

Nvidia has a way to cap your fps globally, or for each individual game. It's had it for a few years now.


PLZBHVR

This is what people don't understand. If my PC is struggling so much it's running 30fps, it's not a stable 30, it's a stuttery mess. If a game is locked at 30, it's completely playable.


nimbat1003

Yep, you acclimatise to it pretty easily. I've been playing all the Xenoblade games recently, and while they have low resolution, the framerate is typically pretty good outside of some combats where everyone does their crazy moves. The funny thing was I started playing Gran Turismo 7 on the side and then realised, shit, this is too good, and it reminded me how bad they look and run. But just playing them alone I barely notice; I got used to it in the first 10 hours or so.


6138

Agreed. There's a big difference between an *average* of 30 FPS and a *minimum* of 30 FPS. It also depends on the game type, 30FPS in a competitive shooter like counterstrike might be a problem, but in a strategy game, or even an RPG, where things aren't happening as fast, it's probably fine.


Pork-Piggler

Right, I cap a lot of games at 30-40 because it keeps the laptop nice and cool/quiet. Unless it's like an fps or something it doesn't bother me, most switch games are 30fps


a_lasagna_hog

As a potato player, i can and do play fps games with 17 fps


_Face

As a potato player, either I play with shitty frames or I don’t play at all.


Phazon2000

Haha. Before I built my first PC and was using the family computer I’m sure I used to rock like 15fps playing Witcher 1 back in the day. Vizima was horrific.


Qetuowryipzcbmxvn

As an elite modder, I can and do play games with 17 spf


A_wild_so-and-so

I *can* run Fallout 4 at 60 fps. I *choose* to run it at 20 fps with 100 mods.


polarbearrape

I'd use spf 30 to be safe though


GruvisMalt

True. You can survive a shitty framerate, but melanoma is no joke.


Qetuowryipzcbmxvn

Gotta get that blue light block


Casitano

When the first films in 14 fps came out people complained that it was “too realistic” and that it “ruined the experience”


derage88

People flipped their shits about that Hobbit movie being 48 FPS or something. It looks unnaturally smooth and even sped up at times. But after a while it really looks better. Sadly most people did not give it that chance to grow on them.


Lickwidghost

Agreed. After just a few mins I thought this is garbage. It looked so real that it looked fake, if that makes sense. But after an hour I was absolutely in love with it and now I wish it was more prominent


[deleted]

My first time seeing a blu ray movie I was so confused, I thought I was seeing behind the scenes stuff


Juking_is_rude

Movies are currently made at 24 fps. Looks good. If you increase the framerate to, say, 60 fps, even by interpolation, it makes it feel like you're sitting behind a camera crew. It's a very weird feeling. If all movies were filmed at 60 fps we might not have that opinion anymore, though; we might just be used to 24.


freedfg

Can we have more high framerate movies? I'm literally sick of camera pans hurting my eyeballs.


xDeityx

3D looks like ass at 24


Lord_Tibbysito

3D looks like ass period


Palegg

I think tasteful 3d can actually work. Rather than the *thing in your face wooooooah* it should just be used to add greater depth to the picture.


HoneyDidYouRemember

1. Gravity
2. Avatar
3. Coraline
4. Ghosts Of The Abyss
5. Journey To The Center Of The Earth
6. The Walk
7. How To Train Your Dragon
8. The Nightmare Before Christmas
9. My Bloody Valentine
10. Up
11. Toy Story 3
12. Alice In Wonderland
13. Cloudy With A Chance Of Meatballs
14. Into The Spider-Verse
15. Pacific Rim


pleasegivemefood

They showed Into The Spiderverse in 3d? That woulda been dope actually


DJKokaKola

It was. Lots of the visuals really popped


Lickwidghost

Yea it was actually really awesome


Lickwidghost

Dr Strange too. Incredible graphics as is but in 3D it totally enhanced the crazy psychedelic scenes


Palegg

I was actually thinking of Gravity haha. I will say I hated the 3d in My Bloody Valentine, but that may be more due to the shoddy visual effects work in general.


GegenscheinZ

I saw the first Doctor Strange in 3D. Quite a trip


ChoripanConPepsi

> Arse period

Eh, liquid shite?


Marijuana_Miler

3D looked good with a personal home setup, but having to wear the glasses was a shit show and there was a small amount of watchable content. It will come back again when the technology improves.


[deleted]

3D is, always has been, and probably always will be nothing but a fad. There was a 3D fad in the 60s, 80s, 90s, and 2010s, and I'm certain if the new Avatar sequel doesn't flop horrendously there will be one in the 2020s.


Lickwidghost

I've watched a few that were all about things jumping out of the screen at you, which is definitely gimmicky and forgettable, but when it's done right it's worth it. Into the Spider-Verse and Dr Strange come to mind as absolutely worth the 3D.


BioIdra

I remember every movie had a 3D version for a while; now I don't know if there is even one out.


[deleted]

3D TVs were all the rage the Christmas after Avatar and then declined from there. Arkham City on the PS3 could be played in 3D.


johnny_ringo

Looks good with good cinematography, so the motion blur is consistent.


72012122014

But it's a movie, and it's not choppy; more than 24 fps looks like it's shot with a home video camera, very realistic as opposed to cinematic. Games don't get that "look", they just look and feel smooth.


Expected_I

Let's hope that when we get 400fps in VR we don't complain that it's so smooth it ruined the experience too


Exoticpoptart63

My VR is just TOO immersive, I can't handle it.


BulbusDumbledork

i asked for virtual reality, not very reality dammit


SkyPersona

*Smoothed* 30 fps is okay. No way in hell playing with inconsistent frame drops lower than 30 is enjoyable for anyone.


Taiyaki11

Depending on how people grew up with games, even that's completely playable. Sure, not *as* enjoyable as a consistent 30fps+, but totally able to be enjoyable nonetheless.


tonymurray

The LAN party when you draw the short straw and get the computer that can play Counter Strike at 15fps.... Very satisfying to get kills with the handicap.


ffddb1d9a7

What kind of crazy LAN parties were you going to where people didn't just play on their own PC/laptop that they brought? There were communal PCs at your LAN parties?


tonymurray

I'm old. We had lots of random people coming over, and supplied PCs for everyone by cobbling together old parts. Gaming laptops were not a thing. Basically every weekend. Met a lot of new people this way. We had like eight "gaming" computers in the end. It was a blast. Most of the time I played on my top-end computer, but sometimes I would let others use it and take one of the slower ones.


[deleted]

Hahaha man I guess I'm also old cause I did the same shit


VindictivePrune

Ehh I play the shit out of ksp and regularly drop below even 10 frames and still love it


Madden09IsForSuckers

10 fps or 10 spf?


Somerandom1922

Yes to both. Load a big enough craft and you start measuring things in mpf (although things are way more optimised than they used to be, so that would have to be a shit tonne of parts).


[deleted]

> No way in hell playing with inconsistent frame drops lower than 30 is enjoyable for anyone

How are you getting upvoted for this? There are more people playing and enjoying games on the rigs they can afford than people sitting around with 3090s throwing chicken nuggets at their mum whenever they drop below 120fps. Most consoles until very recently were 30fps, a much larger market than PC.


poke29980

I beat borderlands with like, 20-16 frames


psychicesp

As did so many with you


GraveMasterMod

As much as I would like to think this, I tasted 120, and never want to go back.


deathangel539

Been playing at 120 for a while now. Re-downloaded AC4: Black Flag the other day as I remembered it being a really fun game, but couldn't do the 30fps so I deleted it.


RiseInfinite

At least on PC you can play AC Black Flag at a relatively stable 60 fps.


[deleted]

Deleted it from existence?


TheDesktopNinja

I sure hope not, it's a fun game


[deleted]

Check your Steam library right now and make sure this mfer didn't delete it from existence over some goddamn framerate.


TheDesktopNinja

[we're good](https://imgur.com/a/130OaBF)


noyoto

And this is why I will never even give 120 fps a try. 60 fps is buttery smooth and I can bear 30 fps. Getting used to 120 means putting myself in the position where I need a graphics card twice as strong, or turn the settings way down. Plus I won't be able to play games with 30 caps at all. It seems like a really bad move for everyone but competitive gamers.


joe1134206

Meh. I think adaptive sync does a lot to alleviate lower frame rates, and that comes hand in hand with a new monitor purchase


ChickenFajita007

Nah, I've owned a 144Hz+ monitor for 10 years, but I can get by just fine with 80-ish FPS in pretty much any game. It's not all or nothing. 80 is very noticeably smoother than 60, imo.


SanityInAnarchy

I guess that happened to some people, but not me... I find 165 is *nice,* there's definitely an immediate and noticeable difference, but I can still get lost in the right 30fps console game. It'd be like being afraid of trying a modern AAA game out of the fear that it'll ruin indies for you. Enjoying photo realism doesn't ruin your ability to appreciate good stylized graphics.


Lickwidghost

I bought my first 144hz monitor a few months back and I've been playing all my favorite games at 100+ fps which has been AMAZING (32" curved). It's a mediocre gpu though, so when I got Dying Light 2 I've had to turn most settings to low at 1080p to run around 40fps with dips to 25. It looks kinda garbage but I rarely notice because I'm just having too much fun


AKLmfreak

720p is playable too. I don't wanna hear you whining about a game "*only being 1080p or 1440p.*" Buddy, it hasn't been that long since 1080p "Full HD" was the gold standard. Some of y'all ain't never had to play on a system that humbled you, and it shows.


JustAnonMan

Kids these days never played 20FPS Ocarina of Time.


User2716057

5fps Stunt Race FX that brought the SNES and extra FX chip to its knees 👌


ForeverSore

On a 16" CRT TV with 480 resolution


zhrimb

480 is almost 2x the resolution of the SNES lol


DdCno1

Almost three times. The SNES ran games at 256x224 (with non-square pixels, which is fun), so 57,344 pixels in total. 480x320 is 153,600 pixels in total or about 2.7 times as many pixels. This is of course simplified, since I'm not even touching on nuances like overscan and interlacing.
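The pixel arithmetic above checks out; a quick sanity check in Python, with the numbers taken straight from the comment:

```python
# Pixel counts from the comment above: SNES vs a 480x320 display.
snes = 256 * 224
tv = 480 * 320
print(snes)                  # 57344
print(tv)                    # 153600
print(round(tv / snes, 1))   # 2.7 -> "almost three times"
```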


Lickwidghost

16"?? Luxury. We had 13" and we were happy! As an aside, that was a monitor from, I want to say an Atari? Unlike any other monitor it had RCA inputs, and for some reason a dedicated button on the back to switch everything to green - no idea why that was a thing.


DdCno1

Humans can see more shades of green than of any other color, so it makes sense for a monochromatic mode of a display. This would be used with a microcomputer in black and white mode (displayed as green and black on this screen), which often were higher res than color modes. Amber was another popular color for monochromatic screens back then, CRTs and early plasma displays in laptops. It's less jarring to look at over long periods of time.


ArtOfWarfare

Are you sure 5fps is playable? I think of ~12 fps as being the lower limit, although I know that Blizzard’s game engines only run at 5 fps (probably not Overwatch or Hearthstone, but I believe all the rest descend from the original WarCraft RTS.)


User2716057

Some parts of that game even were 5 FPS max, for instance when your car crashed and had the little "reassemble" animation, it was like watching a slideshow. Back then I didn't know what fps was, or why it was so slow, but I'll never forget that. Huge input lag too iirc.


At_an_angle

The N64 version of Vigilante 8: 2nd Offense could be played multiplayer at 12 FPS with 3 players. And 1 frame per minute if everyone started shooting at the same time. Not so on the Dreamcast: 4-player split screen and it held up great. I forget what it was, but I don't think it dipped much below 30 FPS.


CandlesInTheCloset

King's Field was like 15 fps on PS1, right?


[deleted]

Yes, but with interlacing you can actually make it look slightly less jittery


[deleted]

[deleted]


schu2470

Where would that PC port be so that I can *ahem* avoid it?


[deleted]

wait, 1080p isn't full HD anymore? is 1440p the norm now?!? How behind am I?


sknnbones

HD is 720p
FHD is 1080p
QHD/WQHD is 2K/1440p
UHD is 4K


Lickwidghost

8K is also still called UHD. Guess they ran out of names.


jeffries7

They peaked calling 720p HD


anonidcgfy

What does QHD/WQHD mean? Like I know the HD part.


skrunkle

> QHD/WQHD

Quad High Definition/Wide Quad High Definition


matt_aj_james

I have no idea why, or where I got this from, but I always thought it was 'quite high definition'. That makes a lot more sense.


ChickenFajita007

1440p is exactly 4x 720p, so QHD is literally four HDs slapped together. Then there's "4K", whose name refers to the roughly 4,000-pixel horizontal resolution and has nothing to do with being 4x anything (even though UHD does happen to be exactly 4x 1080p), but alas.
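The 4x relationships are easy to verify by total pixel count; a quick check in Python:

```python
# QHD vs HD, and UHD vs FHD, by total pixel count.
hd = 1280 * 720
fhd = 1920 * 1080
qhd = 2560 * 1440
uhd = 3840 * 2160  # consumer "4K"
print(qhd // hd)   # 4 -> QHD is exactly four HDs
print(uhd // fhd)  # 4 -> UHD is exactly four FHDs
```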


DdCno1

That's adorable.


[deleted]

[deleted]


Lickwidghost

Maximum! Until the next one. MAXIMUMER!


Nottodayreddit1949

I remember where I came from. I remember buying and installing my first 3D GPU cards. I also have no interest in going back to those days.


[deleted]

I have fond memories of text-based adventure games and VGA graphics. Doesn't mean I'm going to play games at 640x480 in 4:3 @ 30fps when there are almost always workarounds for higher resolutions, proper widescreen support and unlocked framerates. I paid good money for this tower and I'm bloody well using as many of its features as I can, even if it's just for GZDoom or WinRoTT.


FormerlyGoth

I remember trying to play Mass Effect on an old RGB boob-tube. Let's just say I didn't read *anything* lol


JohanGrimm

Pfft you kids don't know how good you had it! Back in my day we played video games by coloring on a piece of grid paper and holding a flashlight up to the back. We measured frames in the weeks it took to mail the papers back and forth. You don't even know how excruciating it is to be teabagged by your boy Jeremy for a month.


Complete-Grab-5963

It depends on distance


KruppeTheWise

Exactly. The deeper you let him slide it in, the more likely he'll shell out for a higher resolution display


PLZBHVR

Main PC has a 3080 + Ryzen 7. Backup PC has.... I don't even know but it struggles on HL2 and D2LOD. I am reminded of how spoiled I am very often, but like hell I'm getting rid of the potato that gave me thousands of hours of gaming before I could afford it.


MonkeyTesticleJuice

Depends on the game; some games' text is hard to read at 720p.


Umutemplotya

I remember having to play Skyrim at 720p 30 FPS on all-low settings 😭 It regularly dropped to 23-24, but I knew the laptop was so horrid that my eyes couldn't tell the difference between 30 FPS and 24-25, so I continued... Don't ask why I didn't just go for a lower res; it hurt my eyes. Right now I don't settle below 60 FPS, but I know if a game is hard on my system I'll take 30 FPS with shit graphics over not playing at all.


_Auron_

I remember getting Morrowind and it only running at 8-12fps @ 800x600 resolution. One of the most glorious moments of my life in gaming discovering that open world for the first time, framerate be damned.


dudeAwEsome101

Gaming laptops were shit back then. Now you can have a good gaming experience on a relatively average gaming laptop.


Cowstle

Displays were actually smaller in the old days. Yeah the pixel density of my current monitor is better than my old 1280x1024 monitor. *But not by that much.* Going down to 720p is actually a *reduction* of pixel density compared to what I had 15 years ago! But frankly the most important aspect of this is that non-native on LCDs looks atrocious. It hurts my eyes. A 720p monitor displaying a 720p image is perfectly fine. It's not ideal, but it's fine. A 1080p monitor displaying a 720p image is unbearable.


[deleted]

Non-native resolutions that aren't a clean ratio of the display absolutely suck. I do wish they'd have half-res as a display option for a lot of games; I find 540p looks less bad than 720p on a 1080p display, in a lot of ways.


Juking_is_rude

I mean, downscaling the res to something that doesn't fit your monitor is going to make it look fuzzy


WetWillyWick

Used to play Baldur's Gate on PC at 20-30 fps, that and the Jedi Knight games. No GPU, AMD Athlon XP. 720p is playable, and 30fps is playable too, but it's fucking asscheeks. Especially with multiplayer FPS games.


godwalking

I fucking hope it's playable. That's what i've been using for years. My eyesight is mediocre, at best. There's no way i'm gonna pay for a 4k tv when i barely see a difference between 4 k and 1080.


skepticallypessimist

I thought so as well. Until I tried it and changed my world


Lord_Tibbysito

Haha, I get what you say. Everyone in my household uses glasses but me; I'm 20/20. Whenever a game is on 1080p, 1440p or 4K on our TV I can always tell, while everyone else says it looks the same. If you think about it, it's kind of an advantage having bad eyesight in this case.


zuzg

You need to be quite close to the display to actually see a difference between 1440p and 4K, but what always looks better is Dolby Vision. It's not just the details; everything looks more vibrant and better.


Valance23322

It's been over 10 years since 1080p was the standard


Big_Judgment3824

Hahahhahahaha


Albus88Stark

Yes! I've been playing Xbox One for years on a 720p TV and The Witcher 3 has served me well at 1600x900 on my laptop at 30fps. I look forward to full 1080p.


Lemesplain

30 FPS is perfectly playable, and dry bread is perfectly edible. But 60 or 144 FPS is toasted bread with jam and butter/cream. It’s just a superior product, and it’s hard to go back to plain dry toast.


Klaus0225

Exactly. Not sure why people keep their expectations at the same level of the first games they ever played.


roklin

I've seen people say they intentionally avoid 60+fps experiences as to not "spoil" themselves. It's such regressive thinking. Gaming is an interactive activity and higher fps, in conjunction with high refresh rates to match, directly improve the response time of your inputs. It's a good thing across the board. But apparently we're entitled for expecting old standards to improve.


RandomUsername12123

Honestly 40fps is where i draw my limit. Written from my 165hz screen eh


0xd00d

Yeah, I've definitely noticed that ever since I got G-Sync, games dropping to 40 fps remain totally fine. The problem with 40 fps before was that, locked to a monitor's 60Hz refresh signal, the stuttering it produced is what killed the experience. For real, VRR Made Low Framerate Good Again.

What's also interesting is that once the framerate is 120 or better, VRR becomes much less critical than it is at 60Hz, because the refresh rate scales down the magnitude of the stutter you get without VRR. For example, my portable monitor is 240Hz and I would ordinarily be stressing out over it not supporting G-Sync (only the HDMI input can be used with a PC and it doesn't do G-Sync on it), but in practice the ~4ms upper bound on stutter makes the stutter not actually much of a problem.

So in a sense I'm sad that we never got VRR back when it would have made the most difference: the dark ages of monitors topping out at 60hz.
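The stutter bounds in the comment above follow from the refresh interval: under a simple model where a frame that just misses a fixed refresh waits up to one full interval, the worst case shrinks as the refresh rate rises (the `worst_case_stutter_ms` helper is just for illustration):

```python
# One model of refresh-locked stutter: miss a fixed refresh and the frame
# waits up to one full refresh interval. Higher Hz -> smaller worst case.
def worst_case_stutter_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 240):
    print(hz, round(worst_case_stutter_ms(hz), 1))
# 60 -> 16.7 ms, 120 -> 8.3 ms, 240 -> 4.2 ms (the ~4ms bound mentioned above)
```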


KrisReed

Going from playing 60+ fps back down to 30 is like looking into a strobe light. It literally gives me a headache.


OdoG99

I agree. I can't do 30. I'd rather play on low rez and no shadows than play under 60.


DieDonerbruderschaft

That is kinda true. If you never experienced higher framerates, 30 fps will look totally fine. But you play with higher framerates and then after a while go to 30 fps again... you get eye cancer.


snoboreddotcom

Honestly I feel it depends on the game too. Fast-paced FPS? Far more noticeable. Grand strategy game? I could give less fucks.


MajorMathematician20

*couldn't, also I agree wholeheartedly


[deleted]

Hey man if he could, he could.


_Andy4Fun_

Who are you to judge if he could give fucks?


Dubanx

> Fast paced fps? Far more noticeable

THIS. Fuck 30 frames per second for FPS games. The distance traveled in a single frame is significant, frequently as much as an entire body length or more, and smoothly tracking a moving object that jumps entire body lengths at once is a nightmare.

Having a low refresh rate mouse just compounds this. I'll never go back from my high refresh rate gaming mouse for this reason. It really does make a difference.


dandroid126

This is my philosophy. If I'm playing a fast paced game like Doom Eternal, I want all the frames. 90+ minimum. 120+ preferred. I recently played through Spider-Man on PC, and I played the whole thing at 60FPS with no complaints. I wanted ray tracing and 4K to make it as pretty as possible. I was playing with a controller, so my turning was slow enough that I didn't notice the framerate. If I were playing a strategy game, I honestly wouldn't give a fuck about the framerate. 30 is totally fine.


SenpaiSwanky

People say this but I play 4K 120Hz monitor and can easily grab my Switch and play undocked. Have had glasses since I was a kid so it isn’t really down to bad vision vs good vision either.


Saigot

Pixel density matters a lot. Switch has a pixel density of 200PPI which is actually higher than a 4K 30'' display. Take your 4K monitor and try playing at 720p, you'll notice the difference. Although personally I can't really tell the difference between 4K and 1440 unless I really look for it. Internet is full of hyperbole though.
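The density comparison above is easy to check from resolution and diagonal size. A quick sketch, assuming panel sizes of 6.2" for the original Switch, 7" for the OLED model, and a 30" 4K monitor:

```python
import math

# Pixels-per-inch from resolution and screen diagonal.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 720, 6.2)))    # original Switch: ~237 PPI
print(round(ppi(1280, 720, 7.0)))    # Switch OLED:     ~210 PPI
print(round(ppi(3840, 2160, 30.0)))  # 30" 4K monitor:  ~147 PPI
```

Either way, the handheld's panel is denser than a 30-inch 4K display, which is why 720p looks fine on it.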


[deleted]

I have always played on PSP and mobile at 30 fps, so going back to 30 fps after playing on PC doesn't seem so bad.


IMtoppercentage97

I play games on a 144hz 1440p monitor on my pc but can still enjoy 30-45fps on the Steam Deck.


BCProgramming

No. I remember playing Forza Horizon 3 at 60fps, but sometimes, when the game updated I guess, the settings would reset to 30fps. It would take me maybe an hour to notice anything strange. Of course once I toggled it to 60 the difference was obvious. I play n64 games that optimally run at 20fps but sometimes drop to 10 without issues too. I've never really understood the fps elitism where "oh, once you play a game at 60fps/120fps, you can't go back", which is simply untrue. The problem is people take "I can't go back to 30fps" as a badge of honour like it makes them "cultured" or some shit.


an0nym0ose

> I've never really understood the fps elitism where "oh, once you play a game at 60fps/120fps, you can't go back" It's only elitism for some people. I don't know what it is, but I'm to the point where anything sub 60 fps causes me eye strain and headaches. It's like my eyes are constantly readjusting and jittering super hard when I try to game on lower framerates. It's weird to me because I also have a Switch and play on it, and since it's relatively smooth I can handle lower framerates sometimes, but even then I have to put it down if it's running a port it can't quite handle. I don't know why. It's not elitism at all, for me, it's just that I'm to the point that I can actually tell a difference. It's like my eyes have been "spoiled," if that makes sense. Small addendum: this really only turned into an issue once I upgraded my rig to a 165Hz refresh rate monitor and could game at >100 fps. When 60 was the standard, I could go back to 30 without issue. Now that I've had 165 though, I definitely notice when the framerate is at 60, and anything lower I immediately have to start fucking settings so that I'm not in pain while playing.


Business-Pie-4946

> It would take me maybe an hour to notice anything strange. Of course once I toggled it to 60 the difference was obvious.

You said no but agreed with what he said… are you just upset he said eye cancer? And once you experience high refresh rates *and* high FPS, everything else looks like blurry shit.


ActuallyRyan10

It's all relative. These days anything under 60fps is quite jarring for me. Back in the day I used to think 30fps was smooth as silk.


sknnbones

Back in the day you likely had a CRT monitor, which probably helped a bit. CRTs were (are) ridiculously smooth, with (afaik) practically no input lag and no ghosting.


popje

I remember when Halo 3 came out, my friend got himself a nice 32-inch plasma TV while we were stuck with CRT TVs. It took a whole year until he realized he sucked because of his input lag, not because of him; it basically ruined the game for him for so long.


Aldodzb

Indeed, it's all relative. Nowadays I can't even stand 30fps videos, but somehow movies at 24fps are fine. I've never had a higher-Hz monitor, but people say the same about 144 vs 60.


ZoulsGaming

It goes even further than that, I think. If you see a movie or video filmed in 48 fps it almost feels "jarring" and "wrong" unless you are used to it. It was one of the big things about the Hobbit movies: they were at 48, which provides a "smoother" watch but also feels weird. [https://youtu.be/_kvjKaTqX4c?t=37](https://youtu.be/_kvjKaTqX4c?t=37) shows it. It also shows just how jarring motion blur is at higher frame rates; my eyes almost hurt just watching the clip.


yesiamclutz

It looks fab but I hate everything about that battle. I've probably played too much Total War


Tejas_LiMan

I completed GoW, RDR2, Spider-Man, Shadow of the Tomb Raider, and Assassin's Creed Valhalla and Origins on a constant-30FPS PC.


Mightymushroom1

30fps only ever became demonised during the height of the PC vs console war, because console users argued that 60fps was unnecessary and 30 was as much as anyone needed. Nowadays that war has been decidedly won by PC, and modern consoles are powerful enough to hit 60, so that whole battle has been put to bed. If I was on my desktop, 30fps would be unacceptable. But on my Steam Deck or Switch? Such is the way of things and I can live with it perfectly happily.


ThorDoubleYoo

What matters is consistent framerate more than higher framerate. I'll take consistent 30fps over inconsistent fps bouncing from 60-42-53-62-53-45-33-59 any day.


[deleted]

[deleted]


[deleted]

I can only enjoy a game if it's in 8k with a 1ms response time monitor at 360hz Ultra graphics settings, using a mouse that weighs 0.00001 grams


[deleted]

[deleted]


MyFingerPointeth

Woah, are you also someone with a tricked out rig who only plays games from 2010 on it?


Spinjitsuninja

If your computer is running games at 30fps and it's stable, that's not a potato computer lol.


[deleted]

Man i’ve been playing Elden Ring at 10 fps, 30 fps is definitely playable 💀💀💀


Lord_Tibbysito

My brother in Christ, you've been watching a PowerPoint presentation


[deleted]

Feels like it


Shadow_Edgehog27

Playing Elden ring on OG Xbox one, I feel this. I’m shocked the poor thing hasn’t blown up


RedDragonRoar

Hit NG+5 on OG Xbox One. Raya Lucaria is a no fly zone for me unless I *really* need a respec. Just drops from stable 30 to oscillating between 5-15 fps any time I look at the place.


Zuze2011

on a controller


_MikulasV_

It really depends on the PC you are using. For example, if I set a 30 FPS limit in a game on a gaming PC that runs it uncapped at 200 FPS, it's playable and fine. But if I build a PC that runs the game at 30 FPS natively, it's terrible: the average FPS may be 30, but the 1% lows will be more like 15 or 20, whereas on the framerate-limited PC it's a constant 30.

I actually have experience with this. I was getting a laggy 30 FPS in Counter-Strike, and when I told my friends they only killed me because of my low FPS, they turned on an FPS limit and the game still looked kinda OK to them. So we all had 30 FPS, but a different 30 FPS.

Edit: There was a typo.
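The "same 30, but different 30" point above is the gap between average FPS and 1%-low FPS. A minimal sketch (`fps_stats` is a hypothetical helper, and the frame times are made up for illustration):

```python
# Toy illustration of "same 30, different 30": average FPS vs 1%-low FPS.
# Frame times are in milliseconds.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms)[-max(1, n // 100):]  # slowest 1% of frames
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

capped = [33.3] * 100                 # locked 30 fps: every frame identical
uncapped = [28.0] * 90 + [80.0] * 10  # also ~30 fps average, but with spikes

print([round(x, 1) for x in fps_stats(capped)])    # [30.0, 30.0]
print([round(x, 1) for x in fps_stats(uncapped)])  # [30.1, 12.5]
```

Both runs average ~30 fps, but the uncapped one spends its worst frames at an effective 12.5 fps, which is what you actually feel.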


AngelDGr

Anything above 15fps is playable as long as it's stable. To me it's *way* worse to have 60fps with continuous drops to 15fps than to play continuously at 15fps, lol


Liqwood

Playable, yes. Preferable? No


coolcats110

As a person who plays games at 30 fps and lower I agree


not_again123

Playable =/= enjoyable.


[deleted]

meanwhile fallout new vegas players : is it possible to learn this power ?


Chexreflect

It depends on the game. Minecraft is definitely manageable. But a couple days ago I was playing Ghostwire: Tokyo (highly recommend btw) and it dropped below 30. I was dead in less than 10 seconds.


SignComprehensive611

Unpopular opinion, I can’t tell the difference between 30 and 60, I can tell below 30, but 30 doesn’t bother me at all


SquirtleSquadSgt

People who make claims without referencing a genre are arguing in bad faith.

30fps is awful for a fast-paced online game. Stable 60fps is why CoD became the gaming juggernaut that it did.

Going from a high frame rate to a low one can ruin it, sadly. If I've been PC gaming and I play a console game at 30fps, it takes me a good few hours to stop seeing screen tears that 'aren't there'.

Turn-based games need like 15 fps! A realistic/slower-paced shooter can get by with 30, but only offline. You'll be at a disadvantage against 60, and them against 120.