• MonkderDritte@feddit.de · 6 months ago

    Joke’s on you, because I looked into this once. I don’t remember the exact number of milliseconds the light-sensitive rods in the human eye need to regenerate their photopigment, but it worked out to roughly 70 fps, so about 14 ms per frame (the color-sensitive cones are far slower). Psycho-optical effects can push that up to around 100 fps on LCD displays, though. It also looks like certain computer tasks can train you to follow movements with your eyes, which makes you far more sensitive to flicker.
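    The rate-to-frame-time conversion behind those numbers is just 1000 ms divided by the rate. A minimal illustrative sketch in Python (the listed rates are only examples):

    ```python
    # Convert a refresh/perception rate in frames per second
    # to the duration of a single frame in milliseconds.
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    for fps in (70, 100, 280, 480):  # example rates mentioned in this thread
        print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
    ```

    At 70 fps each frame lasts about 14.3 ms; at 480 fps it is down to roughly 2.1 ms.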

    • SorryQuick@lemmy.ca · 6 months ago

      According to this study, the eye can detect a difference as high as 500 fps. It’s a specific scenario, but one that could plausibly come up in a video game, so I guess we can go to around 500 Hz monitors before higher refresh rates become unnecessary.

    • iopq@lemmy.world · 6 months ago

      It’s not about training; eye tracking is just that much more sensitive to pixels jumping.

      You can immediately see choppy movement when you look around in a first-person game. Or if it’s an RTS, you can see the trail behind your mouse cursor.

      I can see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images in certain parts of the screen.

      Just give me a 480 Hz OLED with black frame insertion already, FFS.
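      To put the “pixels jumping” point in numbers: when your eye tracks something moving across the screen, the object advances a fixed step per refresh, and on a sample-and-hold display that step is what reads as choppiness or smear. A rough illustrative sketch (the 2000 px/s motion speed is an arbitrary assumption, not a measurement):

      ```python
      # Rough sketch: how far a tracked object moves between refreshes.
      def pixels_per_frame(speed_px_per_s: float, refresh_hz: float) -> float:
          return speed_px_per_s / refresh_hz

      SPEED = 2000.0  # px/s, e.g. a fast camera pan (assumed value)
      for hz in (60, 144, 280, 480):
          print(f"{hz:>3} Hz -> {pixels_per_frame(SPEED, hz):.1f} px jump per frame")
      ```

      Strobing and black frame insertion don’t shrink that per-frame jump; they shorten how long each frame is held on screen, which is why they cut perceived motion blur but can introduce the double images mentioned above.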

      • MonkderDritte@feddit.de · 6 months ago

        Well, I don’t follow movements with my eyes (I jump straight to the target), I see no difference between 30 and 60 FPS, and I run Ark Survival comfortably on my iGPU at 20 FPS. And I’m still pretty good in shooters.

        Yeah, it’s a shame that our current tech stack doesn’t let us update the image only where something actually changes.
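        What’s being wished for here is essentially per-region updates: only pushing the parts of the frame that actually changed. As a purely illustrative sketch (not how any particular display pipeline works), finding the changed region between two frames could look like this:

        ```python
        import numpy as np

        # Illustrative only: bounding box of the pixels that differ between
        # two frames, i.e. the only region that would need to be re-sent.
        def dirty_rect(prev: np.ndarray, curr: np.ndarray):
            changed = np.any(prev != curr, axis=-1)   # per-pixel "did it change"
            ys, xs = np.nonzero(changed)
            if ys.size == 0:
                return None                           # nothing changed this frame
            return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

        prev = np.zeros((1080, 1920, 3), dtype=np.uint8)
        curr = prev.copy()
        curr[100:120, 200:260] = 255                  # a small region changes
        print(dirty_rect(prev, curr))                 # -> (200, 100, 259, 119)
        ```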