Like I’m not one of THOSE. I know higher = better with framerates.
BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.
The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!
… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.
Yet like.
I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the game feel. But the FPS counter said it was at… 45.
And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?
FPS and alternating current frequency are not at all the same thing
I looked it up, and games like Super Mario World are allegedly running at 60 FPS, according to some random things on the internet.
Because CRTs (and maybe other displays) are slaved to the incoming signal and not fussy about exact timing, and because console chipset designers used whatever line counts and clock frequencies were convenient, consoles often run at frame rates slightly different from standard NTSC (which is 60000/1001, or ~59.94 fields per second).
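(If you want to sanity-check where that awkward 59.94 comes from, here’s a quick back-of-the-envelope sketch in Python. It just leans on the standard broadcast relationships: everything is derived from the 315/88 MHz colour subcarrier, with 227.5 subcarrier cycles per line and 262.5 lines per field.)

```python
# Back-of-the-envelope NTSC timing, using the standard broadcast constants.
# Everything hangs off the colour subcarrier frequency of 315/88 MHz.
subcarrier_hz = 315 / 88 * 1_000_000   # ~3,579,545.45 Hz
line_rate_hz = subcarrier_hz / 227.5   # subcarrier is 227.5x the line rate -> ~15,734.27 Hz
field_rate_hz = line_rate_hz / 262.5   # 525 interlaced lines = 262.5 lines per field

print(f"line rate:  {line_rate_hz:.4f} Hz")
print(f"field rate: {field_rate_hz:.6f} Hz")   # ~59.940060
print(f"60000/1001: {60000 / 1001:.6f} Hz")    # same number, written the usual way
```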
The 262 AND A HALF lines per field that NTSC uses to coax the dumb oscillator in a CRT into interlacing are not convenient. “240p” moves the VSYNC pulse so every frame is a whole number of lines instead (typically 262), which makes each frame slightly shorter.
So NESes run at ~60.1 FPS.
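And to show where that ~60.1 comes from, here’s a similar sketch for the NTSC NES, using the commonly documented PPU numbers (341 dots per scanline, 262 scanlines per frame, PPU clocked at a quarter of the ~21.48 MHz master clock). It ignores the one-dot-shorter odd frames, which only nudges the fourth decimal place.

```python
# Rough NES (NTSC) frame-rate math. The PPU draws 341 dots per scanline and
# 262 whole scanlines per frame, clocked at a quarter of the master clock.
master_clock_hz = 236_250_000 / 11          # ~21,477,272.7 Hz (6x the NTSC colour subcarrier)
ppu_clock_hz = master_clock_hz / 4          # ~5,369,318.2 dots per second

dots_per_line = 341
lines_per_frame = 262                       # whole lines: no half line, no interlace
dots_per_frame = dots_per_line * lines_per_frame   # 89,342

frame_rate_hz = ppu_clock_hz / dots_per_frame
print(f"NES frame rate:  {frame_rate_hz:.4f} Hz")   # ~60.1
print(f"NTSC field rate: {60000 / 1001:.4f} Hz")    # ~59.94
```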