• LouNeko@lemmy.world
      1 year ago

      I wonder why exactly somebody decided that the search for a perfect AA method has to stop at TAA. We went from jaggy edges to edge detection and oversampling (MSAA) being the standard in 2000-2012, but people were unsatisfied with the performance tank, so we needed a lighter method. So we got post-processing AA like SMAA, which is a scam and does absolutely nothing, or FXAA, which simply applies a blur filter to edges. Not the most elegant solutions, but they will do if you can’t afford to use MSAA. Then TAA came around the corner, and I don’t even know how it manages to look so bad, because it sounds fine on paper. Using multiple frames to detect differences in contrast and then smoothing out those differences seems like an OK alternative, but it should’ve never become the main AA method.
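      Roughly what that “use multiple frames and smooth out the differences” idea boils down to per pixel, as a toy sketch (single-channel pixels and a fixed 0.1 blend weight are placeholder assumptions; real TAA also jitters the camera, reprojects the history with motion vectors and rejects stale samples):

      ```cpp
      #include <cstddef>
      #include <cstdio>
      #include <vector>

      // Minimal temporal accumulation: blend each new frame into a running
      // history buffer. This is only the "smoothing" core; real TAA also
      // jitters the camera, reprojects with motion vectors, etc.
      void taa_accumulate(std::vector<float>& history,
                          const std::vector<float>& current,
                          float blend = 0.1f) // typical weights are around 0.05-0.1
      {
          for (std::size_t i = 0; i < history.size(); ++i)
              history[i] = (1.0f - blend) * history[i] + blend * current[i];
      }

      int main()
      {
          // One "pixel" flickering between 0 and 1, e.g. a sub-pixel edge.
          std::vector<float> history{0.0f};
          for (int frame = 0; frame < 8; ++frame) {
              std::vector<float> current{frame % 2 ? 1.0f : 0.0f};
              taa_accumulate(history, current);
              std::printf("frame %d -> %.3f\n", frame, history[0]);
          }
          // The flicker converges toward a stable in-between value, which is
          // why edges look smooth but fine motion can smear.
      }
      ```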
      I’d honestly expected the AA journey to end with 4K resolution becoming the standard. AA is mostly a matter of pixel density over viewing distance. Mobile games mostly have no AA because their pixel density is ridiculous, and console games also rarely have AA because you sit 10 feet away from the screen. PC is the only outlier, but it certainly has the spare power to run at higher resolutions than consoles. But somewhere along the way, Nvidia decided to go all in on ray tracing and dynamic resolution instead of raw 4K performance. And Nvidia basically dictates where the gaming industry goes.
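      A quick back-of-the-envelope check of the pixel-density-over-viewing-distance point; the specific screen sizes and distances below are just assumed, typical values:

      ```cpp
      #include <cmath>
      #include <cstdio>

      // Angular resolution in pixels per degree for a screen of a given
      // physical width (inches), horizontal pixel count and viewing
      // distance (inches). Device sizes/distances below are assumptions.
      double pixels_per_degree(double width_in, double h_pixels, double dist_in)
      {
          const double pi = 3.14159265358979323846;
          double fov_deg = 2.0 * std::atan(width_in / (2.0 * dist_in)) * 180.0 / pi;
          return h_pixels / fov_deg;
      }

      int main()
      {
          std::printf("phone   (5.7\" wide, 2400 px, 1 ft):  %.0f px/deg\n",
                      pixels_per_degree(5.7, 2400, 12));
          std::printf("4K TV   (48\" wide, 3840 px, 10 ft):  %.0f px/deg\n",
                      pixels_per_degree(48.0, 3840, 120));
          std::printf("monitor (21\" wide, 1920 px, 2 ft):   %.0f px/deg\n",
                      pixels_per_degree(21.0, 1920, 24));
          // The desktop monitor ends up with by far the lowest angular density,
          // which is why jaggies are most visible there.
      }
      ```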
      So I honestly blame Nvidia for this whole mess, and most people can agree that Nvidia has dropped the ball over the last couple of years. Their flagship cards cost more than all the consoles from Sony, Microsoft and Nintendo combined. They cost more than mid-to-high-range gaming laptops. And the raw power gain has been something like 80% over the last 10 years, because they put all their R&D into gimmicks.

      • Fushuan [he/him]@lemm.ee
        1 year ago

        I got quite good AA by rendering the scene at 4K and letting the graphics card downscale it to the screen’s 1080p resolution. No AA needed, looks fine.
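        Roughly what that downscale amounts to per output pixel, as a toy sketch (a plain 2x2 box filter over single-channel floats; the driver/GPU may well use a fancier filter):

        ```cpp
        #include <cstddef>
        #include <cstdio>
        #include <vector>

        // Average each 2x2 block of the high-res image into one output pixel
        // (a plain box filter; single-channel floats stand in for real RGB).
        std::vector<float> downscale_2x(const std::vector<float>& src,
                                        std::size_t src_w, std::size_t src_h)
        {
            std::size_t dst_w = src_w / 2, dst_h = src_h / 2;
            std::vector<float> dst(dst_w * dst_h);
            for (std::size_t y = 0; y < dst_h; ++y)
                for (std::size_t x = 0; x < dst_w; ++x) {
                    float sum = src[(2 * y)     * src_w + 2 * x]
                              + src[(2 * y)     * src_w + 2 * x + 1]
                              + src[(2 * y + 1) * src_w + 2 * x]
                              + src[(2 * y + 1) * src_w + 2 * x + 1];
                    dst[y * dst_w + x] = sum * 0.25f; // 4 scene samples per output pixel
                }
            return dst;
        }

        int main()
        {
            // A 4x4 "4K" patch of hard black/white jaggies...
            std::vector<float> hi{1.0f, 0.0f, 1.0f, 0.0f,
                                  0.0f, 1.0f, 0.0f, 1.0f,
                                  1.0f, 0.0f, 1.0f, 0.0f,
                                  0.0f, 1.0f, 0.0f, 1.0f};
            // ...becomes a smooth 2x2 "1080p" patch of greys.
            for (float p : downscale_2x(hi, 4, 4)) std::printf("%.2f ", p);
            std::printf("\n");
        }
        ```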

        • beefcat@lemmy.world
          1 year ago

          That is an insanely expensive solution to this problem. You are cutting performance by 75% or more to make that possible, meaning your 30 FPS game could be doing 120 if you stuck to native 1080p.
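          The “75% or more” is just the pixel-count ratio, for what it’s worth:

          ```cpp
          #include <cstdio>

          // If the frame is mostly shading/fill-rate bound, 4x the pixels
          // means roughly 1/4 the frame rate.
          int main()
          {
              long p1080 = 1920L * 1080; // 2,073,600 pixels
              long p2160 = 3840L * 2160; // 8,294,400 pixels
              std::printf("4K renders %.0fx the pixels of 1080p\n",
                          double(p2160) / double(p1080)); // 4x, hence ~30 vs ~120 FPS
          }
          ```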

          • Fushuan [he/him]@lemm.ee
            1 year ago

            That’s the thing, my game is running at 60+, and I don’t need more.

            In any case, new graphics cards are built to run games at 4K, so having a 1080p screen, which I’m content with, is a godsend performance-wise; it lets me do stuff like this without practical performance losses.

        • LouNeko@lemmy.world
          1 year ago

          That is basically MSAA without the edge detection. Rendering at 4K and downscaling is the dirtiest but most effective AA method. But downscaling the whole screen also applies to UI elements, which often results in tiny, blurry fonts if the UI isn’t scaled appropriately. More and more games have started to add a render resolution scale option that goes beyond 100% without affecting the UI. Downscaling also causes latency issues. I can run Metal Gear Solid 5 at a stable 60 FPS at 4K, but the display latency is very noticeable compared to 1440p at 60.
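          A rough sketch of why that render-scale slider avoids the blurry-UI problem: only the scene target is scaled, and the UI is drawn at native resolution after the resolve (the function names below are print-only stand-ins, not any real engine API):

          ```cpp
          #include <cstdio>

          // Only the 3D scene target gets scaled by the slider; the UI is drawn
          // at native resolution after the resolve, so text stays sharp.
          struct Extent { int w, h; };

          void render_scene(Extent e) { std::printf("scene   at %dx%d\n", e.w, e.h); }
          void resolve_to(Extent e)   { std::printf("resolve to %dx%d\n", e.w, e.h); }
          void draw_ui(Extent e)      { std::printf("UI      at %dx%d\n", e.w, e.h); }

          int main()
          {
              Extent native{1920, 1080};
              float render_scale = 2.0f; // 200% = 3840x2160, i.e. supersampling

              Extent scene{int(native.w * render_scale), int(native.h * render_scale)};
              render_scene(scene); // the expensive, scaled part
              resolve_to(native);  // downsample the scene only
              draw_ui(native);     // HUD/text drawn on top at native resolution
          }
          ```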
          I miss the time when you could just disable a game’s native AA and force MSAA through the Nvidia Control Panel. But most newer titles don’t accept Nvidia’s override, especially Unreal games.

          • beefcat@lemmy.world
            1 year ago

            MSAA only samples the geometry multiple times, not the whole scene. It doesn’t work very well in games with a lot of shaders and other post-process work, which is basically every game made in the last decade.
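            A toy illustration of that difference for a single edge pixel (the coverage numbers are made up; the real resolve happens in hardware):

            ```cpp
            #include <cstdio>

            // With 4x MSAA the rasterizer tests triangle coverage at 4 sub-pixel
            // positions, but the pixel shader still runs once per pixel; the
            // resolve just weights that one shaded colour by coverage.
            int main()
            {
                const int   samples         = 4;
                const int   covered_samples = 3;    // triangle covers 3 of 4 positions
                const float shaded_colour   = 1.0f; // ONE shader invocation per pixel
                const float background      = 0.0f;

                float resolved = (covered_samples * shaded_colour
                                  + (samples - covered_samples) * background) / samples;
                std::printf("resolved edge pixel = %.2f\n", resolved); // 0.75, smoothed

                // SSAA would instead shade all 4 positions, which is why it also
                // anti-aliases shader/texture detail, at 4x the shading cost.
            }
            ```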

            What GP is describing is SSAA (supersampled anti-aliasing).

            • LouNeko@lemmy.world
              1 year ago

              That’s what I meant by edge detection. I think part of the downfall of MSAA in modern gaming is foliage. Nowadays every field in video games is filled with lush grass, and the same goes for trees and bushes. They aren’t flat textures on low-poly models anymore. Most engines use completely different rendering methods for foliage to get thousands of swaying leaves and blades of grass on screen with minimal performance impact. But having to detect all the edges of every single piece of grass and apply oversampling to it would make any game run at single-digit frame rates. There are certainly a few other things that GPUs have to render in bulk to justify novel rendering methods, but foliage is by far the best example. So I can understand why post-processing AA is easier to implement. But is TAA really the best we can do? Especially because things like swaying grass become a green blob through TAA. Slow, fine movement like swaying is really the bane of temporal sampling.
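              For what it’s worth, the usual mitigation is to clamp the history to the current frame’s local neighbourhood before blending, which trades the green blob for some shimmer. A rough single-pixel sketch of that, with made-up values and an assumed precomputed 3x3 min/max:

              ```cpp
              #include <algorithm>
              #include <cstdio>

              // One pixel where a bright grass blade has just moved away. Plain
              // history blending keeps the old blade around; clamping the history
              // to the current frame's neighbourhood rejects most of it.
              float taa_pixel(float history, float current,
                              float nbr_min, float nbr_max, float blend = 0.1f)
              {
                  float clamped = std::clamp(history, nbr_min, nbr_max);
                  return (1.0f - blend) * clamped + blend * current;
              }

              int main()
              {
                  float history = 0.9f, current = 0.1f;            // blade gone this frame
                  float smeared = 0.9f * history + 0.1f * current; // no clamp: 0.82
                  float clamped = taa_pixel(history, current, 0.0f, 0.2f); // 0.19

                  std::printf("no clamp:   %.2f (old blade lingers)\n", smeared);
                  std::printf("with clamp: %.2f (ghost mostly gone)\n", clamped);
              }
              ```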