Like I’m not one of THOSE. I know higher = better with framerates.

BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.

The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!

… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.

Yet like.

I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.

And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?

  • otp@sh.itjust.works · 1 day ago

    > Bro when Majora’s mask came out nothing was 60fps lol

    Huh? 60fps was the standard, at least in Japan and North America, because TVs were at 60Hz/fps.

    Actually, 60.0988fps according to speed runners.

    • bleistift2@sopuli.xyz · 1 day ago

      The TV might refresh the screen 60 times per second (or actually refresh half the screen 60 times per second, or actually 50 times per second in Europe), but that’s irrelevant if the game only throws 20 new frames per second at the TV. The effective refresh rate will still be 20Hz.

      That’s just a possible explanation. I don’t know what the refresh rate of Majora’s Mask was.

        • Count Regal Inkwell@pawb.social (OP) · 1 day ago · edited

          Framerates weren’t really a

          Thing.

          Before consoles had framebuffers, that is – because framebuffers are what allow the machine to build a frame of animation over several VBlank intervals before presenting it to the viewer.

          The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.

          Before that, you were in beam-racing town.

          If your processing couldn’t keep up with the TV’s refresh rate (60i/30p in NTSC territories, 50i/25p in PAL), things didn’t get stuttery or drop frames the way modern games do. They’d either literally run in slow motion, or stop drawing stuff (often both, as anyone who’s ever played a shmup on the NES can tell you).

          You had only the brief window of the TV’s HBlank and VBlank intervals to do your calculations and get the next frame ready.

          Buuuut, as of the PSX/N64/Saturn, most games were running anywhere between 15 and 60 FPS, with most sitting in the 20s.

          PC is a whole different beast, as usual.

          • otp@sh.itjust.works · 3 hours ago

            > The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.

            I cared about the 3DO…

            Thanks for the info though!

          • _NetNomad@fedia.io · 1 day ago

            i think you’re mixing up a few different things here. beam-racing was really only a thing with the 2600, and stopped once consoles had VRAM, which is essentially a frame-buffer.

            but even then, many games would build the frame in a buffer in regular RAM and then copy everything into VRAM at the vblank. in other cases you had two frames in VRAM and would just swap between them with a pointer every other frame.

            if it took longer than one frame to build the image, you could write your interrupt handler to just skip vblank interrupts – updating only on every second or every fourth one – which is how a game like Super Hang-On on the Megadrive runs at 15 FPS even though the VDP is chucking out 60 frames a second. you could also disable interrupts while the buffer was still being filled, which is how you end up with slowdown in certain games when too many objects were on the screen.

            too many objects could also push you over the limit on how many sprites you can have on a scanline, which is why things would vanish – but that is its own separate issue. if you don’t touch VRAM between interrupts, then the image shown last frame will simply be shown this frame as well.
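            that vblank-skipping trick is easy to model. here’s a toy sketch of it (my own, in Python – not actual console code, and the buffer names are made up): a display fires a vblank 60 times a second, but the game only finishes a new frame every third vblank, so the effective frame rate is 60 / 3 = 20 fps.

```python
# toy model, not real console code: the "TV" fires a vblank 60 times per
# second, but the game needs 3 vblank periods to build each new frame.
VBLANKS_PER_SECOND = 60
VBLANKS_PER_FRAME = 3  # frame takes three refresh periods to render

front_buffer, back_buffer = "frame A", "frame B"
new_frames_shown = 0

for vblank in range(VBLANKS_PER_SECOND):  # one simulated second
    if vblank % VBLANKS_PER_FRAME == 0:   # back buffer ready: swap pointers
        front_buffer, back_buffer = back_buffer, front_buffer
        new_frames_shown += 1
    # on the other vblanks the TV just re-displays front_buffer unchanged

print(new_frames_shown)  # 20 new frames per 60 refreshes -> 20 fps
```

            the display never misses a refresh – it just shows the same buffer again – which is part of why a locked 20 fps can feel smoother than an uneven 45.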

      • otp@sh.itjust.works · 1 day ago

        I was looking it up, and games like Super Mario World are allegedly at 60fps according to some random things on the internet

        • aubeynarf@lemmynsfw.com · 1 day ago

          Because CRTs (and maybe other displays) are slaved to the input signal and insensitive to exact timing, and console chipset designers used convenient line counts and clock frequencies, consoles often have frame rates slightly different from standard NTSC (which is 60000/1001, or ~59.94 fields per second).

          The 262 AND A HALF lines per field that NTSC uses – to get the dumb oscillator in a CRT to produce interlacing – are not convenient. “240p” moves the VSYNC pulse, shortening the frame duration.

          So NESes run at ~60.1 FPS.
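          That odd number falls straight out of the pixel-clock arithmetic. A quick back-of-the-envelope check (the timing figures here are from commonly published NES hardware documentation, not from this thread):

```python
# NTSC NES, per common hardware docs: the PPU pixel clock is the
# 21.477272 MHz master clock divided by 4, and a frame is 341 dots x
# 262 scanlines, with one dot skipped every other frame while rendering
# (so 89341.5 dots per frame on average).
PPU_HZ = 21_477_272 / 4            # ~5.369318 MHz pixel clock
DOTS_PER_FRAME = 341 * 262 - 0.5   # average, accounting for the skipped dot

fps = PPU_HZ / DOTS_PER_FRAME
ntsc = 60000 / 1001                # standard NTSC field rate

print(round(fps, 4))   # 60.0988 – the speedrunners' number from upthread
print(round(ntsc, 2))  # 59.94 – slightly slower than the console
```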