30 FPS, 48 FPS, or 60 FPS? Which one would you prefer for playing games?
Now this comment deserves a rap, here we go
I was asking, 30 or 60,
Everyone is commentin,
And then someone ask me what you smoking,
I don't smoking,
I love fucking,
yeah over the top, over the terrace,
I love to race, i love it bigger,
I don't game, yeah, baby, i just shoot,
Not a bullet, a clip, a 1000 rounds :D
Everyone here, is playing games
I love to make memes, and rhymes,
So let's get up, get up and run
Yeah baby you cringe
But i FUCK :D
Comment has been collapsed.
Almost all movies run at around 24 frames per second. Some are lower than people realize; most of Fury Road runs slightly below 24.
I don't even know of any Blu-ray close to 60 frames, unless it's... heh, another kind of artsy movie. You can still get the Hobbit movies at 48fps, and that's about it.
And the difference between 48 and 24 is night and day. But it's very difficult to immerse yourself when you start noticing little details you wouldn't otherwise, since most movies have fake props or some choreography.
Comment has been collapsed.
If you want a game to feel movie-like, shouldn't you play at 24 frames?
Comment has been collapsed.
24 fps on a 60 Hz monitor produces judder (some people are more sensitive to this than others) because the frametime is not constant in that situation. Watching a movie at a refresh rate that is a multiple of 24 (like 120 or 240) produces a constant frametime and looks much better.
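A minimal sketch of that pacing math (my own illustration, assuming a simple "repeat each frame until the next one is due" model, not anything from the comment above):

```python
import math

# How long each 24 fps movie frame stays on screen for a given display refresh rate,
# assuming each frame is simply repeated until the next one is due (no interpolation).
def hold_times_ms(content_fps: int, refresh_hz: int, frames: int = 8) -> list:
    refresh_ms = 1000.0 / refresh_hz
    holds = []
    for i in range(frames):
        start = math.floor(i * refresh_hz / content_fps)      # refresh where frame i appears
        end = math.floor((i + 1) * refresh_hz / content_fps)  # refresh where frame i+1 appears
        holds.append(round((end - start) * refresh_ms, 1))
    return holds

print(hold_times_ms(24, 60))   # alternating ~33.3 ms / 50.0 ms holds -> the 3:2 judder
print(hold_times_ms(24, 120))  # constant ~41.7 ms holds -> even pacing
```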
Comment has been collapsed.
He's trying to argue that 30-48fps gameplay is better because he claims it produces a "movie effect" (basically the mindset that movies look good because they are shot at 24fps instead of 60fps, so he thinks you should play games at 30fps instead of 60).
I play at 144fps, and would never go below 60fps.
Comment has been collapsed.
I don't know how 30 fps can feel more natural, as reality certainly doesn't move that slowly.
Anyway, I prefer 60 fps, of course, but it's not mandatory to enjoy a game. When my GPU is no longer enough to render at 60 fps, I'm not going to replace it.
Comment has been collapsed.
I don't see a reason not to have it as high as possible. Even if your monitor is 60Hz like mine, having more frames drawn means your monitor is more likely to pull a fresh, non-delayed image to show. My answer is 1000, so one is drawn about every millisecond.
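A rough sketch of why that helps (my own back-of-the-envelope model, assuming evenly paced rendering with vsync off, so the display grabs whatever frame finished most recently):

```python
# On average the newest finished frame is about half a render interval old when the
# display scans it out, so rendering far above the refresh rate still cuts latency.
def avg_frame_age_ms(render_fps: float) -> float:
    return (1000.0 / render_fps) / 2

for fps in (30, 60, 144, 1000):
    print(f"{fps} fps -> frame shown is ~{avg_frame_age_ms(fps):.2f} ms old on average")
```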
Comment has been collapsed.
I believe that it's the developer's job to make the game look realistic. Real life is not based on frames, so higher framerates would then make it more realistic. I personally cannot play games at anything below 40-ish, as it gives me motion sickness.
Comment has been collapsed.
That is not how our eyes work. We constantly have light going into our eyes, which our brain then processes. If that were the case, high fps would not even be noticeable and would be perceived the same as lower fps. If you had an fps cap on your eyes at, let's say, 100, then 100 fps would look the same as 240 fps.
Comment has been collapsed.
We see a continuous flow of events and not single discrete frames of data, that's correct, but the number of frames our eyes can actually notice is finite and quite limited. If I showed you a video at 10k fps and at 100k fps (assuming I had a monitor capable of displaying that), even if you tried your hardest, you wouldn't be able to tell a difference, assuming it was a continuous video. So in this case, yes, you wouldn't see any difference and would claim it's the same video in the same setting, exactly the same way a 1k fps video and a 300 fps video would look.
Tests with Air Force pilots have shown that they could identify a plane in a picture that was flashed for only 1/220th of a second.
But in this case the pilots expected the object and were able to identify it based on a continuous lack of data versus a direct signal, so this could probably find its use only when you're e.g. camping in CS:GO waiting for your enemy to appear, not in a "normal" playthrough where you're just playing and not hyper-focused on some particular area of your screen. If you limit your recognition to simpler stimuli (e.g. light vs no light), you can go up to 300 FPS in theory, but your skill will be limited to saying whether something happened or not (i.e. easier recognition of data versus a continuous flow of no-data, exactly as with the pilots, as opposed to the constant flow of data in a game).
The actual human limit varies from person to person, but it's most of the time between 90 and 105 fps, with 120 really being the upper limit of what our brain can process before it starts to drop frames as impossible to distinguish (in a continuous video). You'd have to be superhuman to notice a difference in smoothness between 120 and 144 FPS, simply because even though we can physically see it, we're unable to identify and distinguish it that fast. There might be humans able to do so, because everybody is different, but for the vast majority, including myself, 105-120 FPS is really the upper limit of what can be considered to "make sense", and this is also the framerate I aim for in games, simply because more doesn't make sense, even if I have a 144 Hz monitor. While playing The Witcher 3 I couldn't see drops from 120 to 90 without being really focused, but the difference between 60 and 90 is quite big, and 60 vs 30 isn't even debatable.
Data for backup: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2826883/figure/F2/
Comment has been collapsed.
"The participant's task on each trial was to select the comparison interval that contained the gap or that differed from the standard (which never contained a gap)." This study is about finding gaps. Video games don't have a constant picture and 1 frame of nothing to see if you noticed or nor, it's constant motion. You can see that the motion is smoother at high framerates than at lower ones far beyond if there was a blank frame. Many people, including me, can still quite easily distinguish the weather something is 144fps or 240fps.
Comment has been collapsed.
If you can't find a gap between lack of data and data in a single frame, then you can't realistically distinguish 219 fps from 220 fps by missing a frame.
Yes, you will be able to distinguish higher framerates, but not in the way you think you do. The game is smoother because of less input lag, not because of more frames rendered on your screen. Playing at 240 fps on a 144 Hz screen will look and feel exactly the same as playing at 240 fps on a 240 Hz monitor. Likewise playing at 600 FPS on a 144Hz, a 240Hz, and a 600Hz one. See my comment below.
Input lag, however, isn't crucial at those values, and unless you're playing competitive esports where your reaction time matters, achieving an input lag of 10-20ms is enough for comfortable gaming. Our brain can adjust to that easily, and I can't see any situation where you'd want to go below that, except nit-picking or "just because I can". This is shown by the lack of headaches when using VR at those values.
Comment has been collapsed.
The problem is that you aren't dealing with perfect output devices. LCD color overshoot, varying transition times and backlighting (PWM) cause image degradation. While a 144Hz panel might be sufficient in terms of input lag and perceived motion fluidity, a 240Hz panel might offer better transitions and therefore more precise information delivered to the eyes.
Comment has been collapsed.
You can always "improve" the end result by offering a less discrete and more continuous signal to our eyes; the more samples per second, the better the chance that our eyes won't see the image as discrete but as motion. However, at some point going further simply isn't going to work, because our eyes can't distinguish any more frames than you're already displaying, and you're not going to get rid of the aspects you mentioned above by doing that. The only thing left to improve is the varying transition times, and that part is "fixed" by G-Sync/FreeSync to the point of not being a huge concern anymore. The rest, a higher framerate won't fix either. A 240 Hz panel won't fix any of that, and making the signal more continuous beyond 144 FPS isn't going to make us see more frames and motion than we can, nor improve quality. It can only reduce input lag, and for that you don't need a 240 Hz panel. Our eyes aren't all the same and aren't fixed: one person will see less (90), another more (105), some might have exceptional eyes and see a difference even at 120, but at some point you're just fooling yourself that there is a difference, while the difference doesn't come from the number of frames being displayed but from the game being rendered at more frames, which the panel doesn't contribute anything to.
Of course you might believe there is some difference, because marketing has to sell its products one way or another, but scientifically, even if you took into account all the non-perfect display quirks, a higher framerate won't help with any of them. If anything will, it's technologies that might decide to render more frames to achieve other results (e.g. reduction of input lag), but purely displaying more frames per second isn't going to do anything, and for the rendering part you don't need a faster display panel. Like I said above, going even further might only help you see something happening one frame earlier, when you expect it to happen, but that does not change the smoothness of a continuous video being played.
Exactly the same thing happens with all other discrete signals that are perceived as continuous by our brain, for example music. Yes, you could go to a 10000 kbps signal and make it even less discrete, but it won't differ in any way from a signal of "just" 1000 kbps, as we lose any kind of perception at around the 96 kbps mark (for Opus, not crappy MP3, which sounds worse than Opus even at 320 kbps). I could re-encode all my lossless FLAC music into 300+ kbps Opus files if I wanted to, but I'm 100% sure that 128 kbps is 32 kbps more than I need to reach the point of a fully continuous and flawless signal. 96 kbps is already imperceptible, but a safety buffer for eventual future technology and unusual ears won't hurt, exactly the same way 144 fps isn't going to hurt already crystal-smooth 120. See the research for more details.
Comment has been collapsed.
Actually you might want to update your science: https://www.nature.com/articles/srep07861
When using images/displays with edges (as opposed to a uniform light source like in your linked study), flicker detection at 500 Hz was achieved.
Comment has been collapsed.
There is nothing in your article that goes against what I said above. That research was done using a binary data vs no-data test with LED displays, and we already know from previous research that the human eye can easily go up to 220 Hz if it's focused and expects something to happen. This is entirely different from interpreting and interacting with particular objects that we recognize and distinguish from the rest, where the smoothness and our perception are heavily degraded compared to a binary 0-1 signal of something existing or not. You will not be able to notice any difference in a displayed video or rendered game at 500 Hz compared to e.g. 450 Hz, but you might notice a difference if you were shown constant lights-on and lights-off events. This is because our eyes do not work discretely like the signal, and will notice a difference even if it's impossible to distinguish the exact number of frames.
Comment has been collapsed.
LED displays? LMFAO, it was not a display, it was an LED light box. It's like you haven't even read the study you're parroting.
So your claim is that we notice the difference with static, contrasted images but not in games or movies? You've never played a game with contrasting images? You don't understand that twitch FPSes are nothing like movies and frequently have completely different images from frame to frame when a player flicks?
Guess I won't convince you of your ignorance, oh well. I'll take research on display technology and human perception (using 4000 Hz DLP tech) any day over a study about gap detection in old people using a light box cobbled together out of spare parts.
Comment has been collapsed.
Don't care about the fps per se, as long as it's stable. However, I don't understand people who won't play games unless they're at a perfect 60. They wouldn't even touch 30, even though they wouldn't notice a difference, and they miss out on a lot of potential fun.
Oh well, I'm just glad I don't have that issue :)
Comment has been collapsed.
Do you have any suggestions for smooth games at 30 fps? I'm interested in trying a few of them.
Comment has been collapsed.
Hey there, give me your PC configuration and I'll be able to suggest some games you can play at 30-48 fps and enjoy.
I've already mentioned some of them above; here are some more.
If you want to see some samples, they're already posted above.
Comment has been collapsed.
Thanks for the list. Unfortunately those games aren't my cup of tea, as they seem like bland and generic open-world games, but maybe I can try The Evil Within 2, after I try the first one of course.
Comment has been collapsed.
Once someone made an example of the different framerates and I checked them, and to be honest I didn't see a difference between the 30 and the 60 at all; even when I read what people said was different, I didn't see it.
I'm sure there are games where it can matter a lot, like shooters and such, but for me, so far, it has never mattered. That is only about locked games, though; games that can go higher reach 60fps anyway for me, even with my old graphics card, so maybe that matters as well.
Comment has been collapsed.
They wouldn't even touch 30, even though they wouldn't notice a difference
But that's the point. I will immediately notice the difference. 30fps never feels smooth to me. And yes, I have skipped games because they were 30fps. Not on principle, but because it just wasn't fun that way. I can play at 30 if I really, really want to play the game. Good example: PS Now. Most PS3 games and even many PS4 games are at 30fps. So I played The Last of Us and a few others at 30, because there was no other way to do it (aside from buying an actual PS4 in the case of TLOU, but that was no option for me). But generally I avoid 30fps experiences, as it's really a sub-par experience, to me at least. I really like when games give me a stable 100fps, if my PC can handle it. 60 to 100 is also very noticeable.
Comment has been collapsed.
30 FPS IS more cinematic; 60 feels closer to lifelike. I always have the feeling that 60 fps (especially on television, like a movie) feels fast and too smooth for what we are used to on TV and screens in general. Games at 60 fps feel faster (which is weird for me), but without a doubt they are more responsive. So for something like a (RIP) Telltale game I wouldn't mind 30 fps at all, and the same goes for some simpler, easier games, but for roguelikes and other genres where you need timing, care, and accuracy, 60 fps is simply better.
Comment has been collapsed.
I would agree that playing a Telltale game at 30fps is fine, but I wouldn't say it's better than 60fps. Slow-paced games sometimes look perfectly fine at lower fps, without any noticeable difference at all.
Comment has been collapsed.
♫ Zawwmbeez, Zawwmbeez, yeehhhh yeeehhh yeeehhhhh ♫
Comment has been collapsed.
It depends on the game. For certain games (adventure games, cinematic games maybe, some retro games) 60 or fewer is OK. Having a monitor with a higher refresh rate, I've become accustomed to higher FPS in shooters and action games (especially multiplayer), so I prefer to keep it above 100.
Comment has been collapsed.
He's fucking with you - this question is ridiculous.
Of course higher is better in gaming. I go for 60FPS because I have a shitty monitor with a low refresh rate, and my computer isn't a beast, so some games can drop to 40-50 (which is extremely annoying; anything close to 40 starts being hard to endure). Otherwise, I'd probably try for an even higher framerate, since people I know have told me it's worth it.
Comment has been collapsed.
Ah my bad, I misread the first sentence in the post. Um. I guess you're... kidding yourself, then. :P
Well, I guess it's a matter of taste. Personally I dread going back to anything below the high 40s to 50, and I'm speaking as someone who had a shitty PC for years and had ample experience gaming at 25-40 FPS (typically ~35).
Comment has been collapsed.
Next time you think about a new monitor, I can highly recommend going for GSync/FreeSync. It's a dream. I just played Shadow of the Tomb Raider, and at 3440x1440 I had a hard time staying above 60 all the time. It would go down to 50 at some points. With my old monitor even drops to 58 would have annoyed me to death. With GSync it's actually quite alright. It looks smooth all the time, even if at 50 it has no right to be. ;)
Comment has been collapsed.
+1 to GSYNC. It was more expensive but worth it for something that I'm looking at for the greater part of the day. A good monitor should last a long time.
Comment has been collapsed.
Corsair AX1200i
64 GB RAM
Interesting pieces, why so high? You'll never fully use both with that setup.
I mean, an AX1200i is unnecessary even if you did a 1080 SLI.
Comment has been collapsed.
I get the argument when it comes to movies (kind of), but for games this is very different. Movies look more "natural" at 24/30fps because we don't know anything else; we got used to it and have a hard time getting away from that. Games are very different, for various reasons. Game graphics are not the same as video footage. A movie frame is captured with a camera that has an exposure time: the sensor has to be exposed to the light for a certain amount of time. So when stuff moves, a bit of that movement is captured in the frame, so to speak. You could call it natural motion blur (I guess that's exactly what it is).
That does not happen with computer graphics. When a frame is rendered in a game, it is a still image, a capture of only that particular moment, as if the exposure time were infinitely small. So when stuff moves in a computer game, the effect is different. It doesn't look as good; it looks more stuttery. You can compensate for that in two different ways. Either you add synthetic motion blur, which can work quite well but can also blur the whole image quite a bit, or you increase the frame rate, which keeps the image clear and produces more seamless animation. I very much prefer the latter, maybe in combination with slight motion blur for certain objects (not the whole image). You mentioned AC: Origins. I played it at 60fps, but also saw it at 30 on a friend's console, and it looked so much better on the PC. I personally cannot understand why anyone would ever prefer the lower frame rate. But everyone is entitled to their opinion, of course. :)
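A tiny sketch of that exposure argument (my own numbers, assuming a 180-degree shutter, i.e. the camera is exposed for half of each 24 fps frame):

```python
# How far an object smears across the screen during one exposure. A filmed frame bakes
# this blur in; a rendered game frame samples a single instant, so the smear is zero.
def blur_extent_px(speed_px_per_s: float, exposure_s: float) -> float:
    return speed_px_per_s * exposure_s

print(blur_extent_px(600, 1 / 48))  # filmed 24 fps frame, 180-degree shutter: ~12.5 px of smear
print(blur_extent_px(600, 0.0))     # rendered game frame: 0 px -> crisp but stroboscopic at low fps
```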
On top of that, maybe the most important aspect of high frame rates in games is that games are interactive. You control your character. The higher the frame rate, the lower the input lag. Going from 30 to 60 is very, very noticeable in terms of controls. I remember buying that Turtles game from Platinum (can't remember the exact name right now). I started it up, and it took me less than 5 seconds to realize it was running at 30fps; the camera controls just felt sluggish. You can certainly argue that you like the look of 30fps better, but controls are objectively better at 60 or above. I have a 100Hz monitor now (so happy with it ^^), and when I got it I immediately felt how much smoother the mouse cursor in Windows controlled. To me this feeling, getting a quick response to my input on the screen, is really important. Especially in games, but also with the little things, like the mouse cursor. ;)
Comment has been collapsed.
Yeah, moreover the "blur" effect is persistent and gives a graphical bump to the whole gaming experience. If you played Origins, try it at 60 and at 30 (the game can lock to 30 fps, I checked it myself) and you'll clearly see a visual overhaul. You'll notice clearly that at 60fps, though the game looks very smooth, it doesn't look realistic, while at 30-48 it looks real, it feels real, and the additional blur, along with v-sync on, makes the gaming experience as amazing as it could be.
Comment has been collapsed.
30 fps is acceptable in movies mainly due to the lack of interaction. You'd see a difference between 30 fps and 60 fps, but it doesn't matter to you as a viewer. Our brain is capable of "fixing" itself when watching something that moves very slowly, like a snail or a turtle, just as it's able to see a super fast tiger running even though it can't see every single step it takes.
Gaming, however, is an entirely different matter because of interaction. Playing at 30 fps means at least 33.3-66.6 ms of delay before the next frame is rendered, while at 60 fps the delay is only 16.6-33.3 ms, exactly half the expected delay at 30 fps. This is called input lag, and very often games go even higher than that, due to pre-buffering and pre-rendering frames in advance as an optimization trick.
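A quick worked example of where those numbers come from (my own arithmetic, looking only at this one stage of the pipeline and ignoring engine, driver and display latency):

```python
# An input lands somewhere inside the current frame and then has to wait for the next
# one to be rendered, so this stage alone contributes between one and two frame times.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 144):
    ft = frame_time_ms(fps)
    print(f"{fps} fps: frame time {ft:.1f} ms, so roughly {ft:.1f}-{2 * ft:.1f} ms from this stage")
```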
Based on my tests above, with 100 fps as the upper limit, I'd say that a reaction time of 10-20 ms is acceptable for us, and anything higher means a smaller or bigger discomfort. This is even more important in VR games, where framerates lower than 90 cause headaches, not because of a lack of fluidity or smoothness, but because of the reaction time, which makes our brain perceive direct signals instead of a continuous flow of events. This is unnatural, it causes a huge amount of conflicting signals from our inner ear, and it's problematic for us humans because we're not used to it. With normal gaming this is not a problem, as we only see a direct signal in front of us, while for example our room or our hands on the keyboard keep moving in a continuous flow.
Because the issue was so huge, many VR devs implemented something like asynchronous spacewarp to achieve a constant and high framerate, even if the game doesn't have enough power to do so, simply to trick our brain by giving it a continuous flow of information. You probably know well enough that we HATE stuttering, and if you ever wondered why we hate it so much, this is your answer: it's what would happen if our "eyes" started to stutter.
Achieving less input lag is better, and this is also why rendering more frames (even if we can't display them all) can make things feel even smoother for us, although we can't see it anymore. This is how we praised 100 fps games on a 60 Hz display, and this is how we praise 200 fps games on a 144 Hz display today, except the difference is much smaller, and it will only get smaller as we reach higher numbers, so it's not as huge as saving more than 30 ms of delay going from 30 to 60. I'm not sure what input lag we might still feel, but 10-20ms is acceptable for our brain, as shown by VR.
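To put rough numbers on that diminishing return (again my own arithmetic, not from the comment):

```python
# How much per-frame delay each framerate jump actually saves.
for low, high in ((30, 60), (60, 100), (100, 144), (144, 200)):
    saved = 1000 / low - 1000 / high
    print(f"{low} -> {high} fps saves about {saved:.1f} ms of frame time")
```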
Comment has been collapsed.
The reason people play at 30 FPS is that their setup can't handle 60 FPS. The 'movie effect' excuse is just to make it less sad ;)
Comment has been collapsed.
20 "Fluid" fps https://www.youtube.com/watch?v=A78Y63qW65Y
Comment has been collapsed.
Stability is more important than raw numbers.
I'll take a game running at 30fps over one running at 60fps with micro stutter and frame timing issues any day.
Comment has been collapsed.
True, and you could also say that running a game at max settings at 30fps may be a better experience than 60fps on low settings.
However, the OP has a 1080 Ti, and stutter or frame drops don't seem to be the issue.
Comment has been collapsed.
The thing is that in this situation, personal experience is an essential requirement in order to put things in perspective, and when I played the said game, I found it difficult to actually enjoy it at that framerate. I could have put both videos here, but I'm not quite sure that's allowed.
But sometimes people should do other things besides spitting shit both from the mouth and also from the back hole. Guess some people don't understand that quite yet, even though they made up their name like a ....
Comment has been collapsed.
You claim to have decades' worth of gaming experience, yet only now has it occurred to you not to limit games to horrible-looking console FPS? Maybe you should have actually tried 60FPS or even more before making threads like this. But I guess thinking happens months after you say something, not before, and the most important thing is to spam as many silly threads as possible to get any users to your whatever social media crap channels.
Comment has been collapsed.
The thing is that some games look amazing at 30, and some at 60. Games like BPR need fast frame rates to actually capture that amazing sense of speed.
But you, it seems, don't have any other work to do than shitcrit. Besides, it's the internet, I cannot blame you, and you clearly take pleasure in crapping shit. That's understandable.
I guess you really don't have that many friends, and that's the reason you start whining at me, and in most of my posts. If you need any help, lemme know.
Comment has been collapsed.
The thing is that each and every game looks better at 60, as every sane person already knows.
I don't have any other work? What do you do then, except spout your shit online and cry when people don't like it? I have never seen you do anything else. I'm quite happily playing games and watching streams at the same time as I'm making fun of you, since, you know, some people can do more than one thing at a time.
I don't have any friends? Yeah, that must be why I'm making a shitload of stupid posts in desperate attempts to get people to like me and get the first user for my useless groups. Oh wait, it was someone else doing that, not me. You're the one who should seek help, since you obviously have far worse problems.
Comment has been collapsed.
You do realize that everyone knows you're only talking about your own problems? Get help for them before you try to project them onto others. You're the one so desperate to find even one friend here that you need to keep on making silly threads begging for it. Why on earth would I get sad or annoyed when I can just be happy while making fun of you? You goofed up your recruitment here and only ended up with blacklists, so just accept the facts already instead of trying to blame others for it.
Comment has been collapsed.
Well thanks, your crap just went from weird to crazy, so yes, it helped me a lot to be even happier at your expense. :D
Maybe I have misunderstood your purpose in life: it hasn't been posting huge, senseless walls of text to recruit your first friend; it has all been a grand plan to make me happy by letting me laugh at you. So if this is the case like you claim, why are you so sad and angry when I do? Don't you have all the work you claim to have, instead of entertaining me here?
Comment has been collapsed.
Higher FPS is generally better, but consistency is much more important.
Oh, and keep the FPS in step with a refresh rate that your monitor natively supports. That's the best way to avoid screen tearing.
Comment has been collapsed.
Just wait, the new RTX cards get a whopping 40 FPS at 1080p with ray tracing ^^
Comment has been collapsed.
If you are used to 30fps, games will look fine with it, but as an example, 20 years ago games like Half-Life and Banjo-Kazooie were seen as having top-notch graphics. In slow games you might not notice the choppiness as much, but anything fast-paced will be noticeably worse right off the bat. I locked my monitor to 30fps recently and played a game of Rocket League, forgetting I had done that; it made me really want to upgrade to 144Hz lol
24fps works well in movies given what they are, but with games you need more frames than that to smoothly show everything flying in and out of the screen.
Comment has been collapsed.
HELLO COMMUNITY
It's been a long time; I was busy with some work. But recently I had a discussion with one of my friends (who's a YouTuber, and a good person).
I said that playing games at 30fps does produce a MOVIE effect that feels more pleasant and is absent when I play at 60fps. I've posted some 30fps game videos, and some 60fps videos of the same games. He's saying that the actual difference will not be easy to demonstrate in a VIDEO, and that 30 fps and 60 fps in a video look just the same :D.
Well, in my opinion, gaming is all about the way you get attached to games, how games actually look, and why you love playing them over time. I do know that 60fps is something like the current "trend". But I do feel that story-type games like Assassin's Creed Origins, Far Cry 5, The Evil Within 2, Life Is Strange 2, and The Division are some of the games that show this best.
At 30-48fps I feel that the game looks more natural, I mean to say more REAL.
NOTE - The settings I usually play games at are 1920x1080, at 30fps to 48fps, and everything I'm saying is about this resolution. Some games don't offer a 30fps cap, so I have to settle for 48. HERE I'm talking about the gameplay itself, not the gameplay videos.
Here are some of the game videos. Check them before voting so that you can clearly understand what I'm talking about.
I'll put some more videos regarding the same.
Regards,
Abrix
Comment has been collapsed.