by sofar » Wed Dec 09, 2015 23:20
anecdotes are evidence... ;^)
Many of the games I played in the last 10 years were unable to hit 60 frames on my 30" short-response-time Dell monitor, but once they started getting better and my gfx cards got upgraded, I was unable to accept anything less than max framerate. My biggest problem was that playing Oblivion/Skyrim or even WoW (yikes) at 30fps was eye-tearingly ugly during the long walks through scenery, where it felt like clicking through a slide deck at your grandparents' house - my eyes just never felt like 30fps was smooth, and I can easily perceive individual frame drops at 60Hz in most games. And that's disruptive to gameplay.
So, anything under 60Hz is a problem to my eyes.
Now, my panels are limited to 60Hz, so I've never been able to compare my eye response at higher refresh rates. That's a good thing, since those 30" 2560x1600 panels were muy expensive back when I got them in 2006...
The bottom line is that developers should avoid "corn-holing" all users into "this much FPS is plenty" and should just allow users to set the sliders to any value they want.
If someone wants 15fps, I couldn't care less - have at it. I sometimes play on a laptop and tune the max FPS down to those levels, since the panels on many laptops are low quality and have a long response time.
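In case anyone wants to do the same on a laptop, here's a rough sketch of how I cap it in minetest.conf - fps_max is the key setting; check your version's minetest.conf.example for the exact names of the others, since they've changed over time:

    # cap the renderer at 30 FPS for the weak laptop panel
    fps_max = 30
    # optional: drop even lower while the game is paused
    # (setting name varies by version, e.g. pause_fps_max)
    pause_fps_max = 10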
But when I buy a 4k monitor in the next few years, I expect to be able to play minetest at 120Hz if I so desire.
It's not even important what scientific evidence there may or may not be. If people want to shell out $5000 for a gaming rig that can do a bazillion fps, then why are we writing software that makes it hard for them to get a bazillion fps? Sooner or later, those exact users are the ones who will help us fix performance bottlenecks and drive innovation and performance optimizations. The same goes for the current Android users - they're doing the same thing at the *other* end, forcing GPU optimizations that lower CPU cycles and save battery time. All good stuff.