Yo

  • ashinadash [she/her]@hexbear.net · 3 months ago

    Right, and all of these are probably inferior to the 780M found in the 8700G? I was looking at 780M/8700G numbers and watching it slog its way through Alan Wake 2 at 1080p/FSR/lowest and not even getting 30fps consistently desolate

    AMD’s new mobile naming scheme is… slightly unpleasant, not as bad as Intel’s but lol. At least it tells you which ones are old architectures…

    • LalSalaamComrade@lemmy.ml · 3 months ago

      Right, and all of these are probably inferior to the 780M found in the 8700G?

      No, some of them have the 780M, so do look for them. The 680M is kinda good, but I’d ignore that.

      I was looking at 780M/8700G numbers and watching it slog its way through Alan Wake 2 at 1080p/FSR/lowest and not even getting 30fps consistently desolate

      The Strix Point may be what you’re looking for. However, the game you’ve mentioned here - Alan Wake 2 - struggles even on RTX 3000 and 4000 series GPUs; just wanted to let you know that this game is terribly optimized.

      AMD’s new mobile naming scheme is… slightly unpleasant, not as bad as Intel’s but lol. At least it tells you which ones are old architectures…

      Sorrow, pain, suffering |

      • ashinadash [she/her]@hexbear.net · 3 months ago

        Yeah, it’s also a dogshit game - this much is true of Callisto Protocol as well - but I use their numbers as a metric for future-proofing, kinda. If a GPU can’t run this now, how bad is it gonna be in a year?

        It’s kind of a contradiction, where everyone is yelling about how you need 16GB VRAM and a fast GPU, and you do, to run stuff… but almost none of the big games are worth it madeline-sadeline pointless

        • LalSalaamComrade@lemmy.ml · 3 months ago

          Yeah, it’s also a dogshit game - this much is true of Callisto Protocol as well - but I use their numbers as a metric for future-proofing, kinda. If a GPU can’t run this now, how bad is it gonna be in a year?

          You need to give some love to games like Hollow Knight, Noita and Triangle Strategy. The triple-A game scene is only going to get worse from here.

          It’s kind of a contradiction, where everyone is yelling about how you need 16GB VRAM and a fast GPU, and you do, to run stuff… but almost none of the big games are worth it

          This is what happens when investors are your real customers. I sure as hell hope TES6 gets cancelled - it’ll suck anyway, now that we’ve seen how their space game turned out: absolute garbage.

          Anyways, if you remember the shared-memory tech used in the Xbox and older PlayStations, which were built on AMD processors - they used shared RAM for both the CPU and GPU. That was primitive tech, limited to at most 2 or 4GB. I’ve heard that AMD is bringing that to PCs, desktop and laptop alike, so the performance gains may be really phenomenal. This one is called the Strix Halo, aka the Medusa Point, and right now it’s being tested for cache and RAM sharing (I think, I may be wrong tho) - it’s called the Infinity Fabric or something.

          But you’ll have to wait till 2025.

          • ashinadash [she/her]@hexbear.net · 3 months ago

            Metroidvania is cringe, but I’m literally playing Celeste right now, having just finished Tactics Ogre, and then I’m gonna play Fata Morgana. I agree, but if that’s your bag then you don’t need to consider any of this - a GTX 1050 Ti will suffice for that; the point of stressing about big games running is future-proofing. The last AAA game I bought was Doom Eternal, so I’m not that fussed; I was only curious about APUs as a console alternative.

            Yeah, Bethesda’s last good game was before the 2008 recession; I would laugh if TES6 got canned. Skyrim was fuuuckin shiiit!

            thinking-about-it Infinity Fabric is the name for the interconnect between CCXs in all Ryzen CPUs… I’ve never been clear on how memory sharing works in consoles ’cause Idek; I tend to view it the same as all iGPUs do it: you just allocate however much memory (2GB, 4GB) to the GPU and the rest goes to the CPU.
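
            If you want to see that carve-out directly, a rough OpenCL sketch like this would print whatever the driver reports for the iGPU (happy path only, no error handling - treat it as illustrative, not gospel):

            ```c
            #include <stdio.h>
            #include <CL/cl.h>

            int main(void) {
                cl_platform_id platform;
                cl_device_id device;
                cl_ulong mem_size = 0;

                // Rough sketch: grab the first platform/GPU, skip error checks.
                clGetPlatformIDs(1, &platform, NULL);
                clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

                // On an iGPU, CL_DEVICE_GLOBAL_MEM_SIZE is the chunk of system
                // RAM set aside for the GPU (the carve-out), not separate VRAM.
                clGetDeviceInfo(device, CL_DEVICE_GLOBAL_MEM_SIZE,
                                sizeof(mem_size), &mem_size, NULL);

                printf("GPU-visible memory: %llu MiB\n",
                       (unsigned long long)(mem_size / (1024 * 1024)));
                return 0;
            }
            ```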

            I would be hesitant to get hyped for any memory-sharing stuff AMD talks about - back when they were still doing Modules with shared elements in the Bulldozer and Piledriver architectures, their APU line featured a bit of this functionality with the Mantle API - Heterogeneous System Architecture, or HSA for short. AMD talked a lot of shit about how cool it was for the processor and graphics parts of the APU to access the same memory, and how cool it was that they would be treated “equally” computationally. They dropped this idea hardcore when Ryzen launched, because it was basically a GPU-shaped crutch for their garbage CPUs, and the marketing talk was all cope, lol
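
            For reference, the zero-copy thing that era was selling looks roughly like this in OpenCL terms - a minimal sketch assuming an APU where CPU and GPU share physical RAM, with the kernel and error handling omitted:

            ```c
            #include <stdio.h>
            #include <string.h>
            #include <CL/cl.h>

            int main(void) {
                cl_platform_id platform;
                cl_device_id device;
                cl_int err;

                // Bare-minimum setup; error handling elided for brevity.
                clGetPlatformIDs(1, &platform, NULL);
                clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
                cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
                cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

                // CL_MEM_ALLOC_HOST_PTR asks the driver for memory both CPU and
                // GPU can reach; on an APU with shared RAM, mapping it should be
                // a pointer handoff rather than a copy over a bus.
                cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_ALLOC_HOST_PTR,
                                            4096, NULL, &err);

                // The CPU writes through the mapped pointer; no explicit
                // clEnqueueWriteBuffer copy is needed before a kernel reads it.
                void *ptr = clEnqueueMapBuffer(queue, buf, CL_TRUE, CL_MAP_WRITE,
                                               0, 4096, 0, NULL, NULL, &err);
                memset(ptr, 42, 4096);
                clEnqueueUnmapMemObject(queue, buf, ptr, 0, NULL, NULL);

                printf("buffer visible to CPU and GPU, no copy issued\n");

                clReleaseMemObject(buf);
                clReleaseCommandQueue(queue);
                clReleaseContext(ctx);
                return 0;
            }
            ```

            Whether that map is actually zero-copy depends entirely on the driver, which was kind of the whole HSA pitch and the whole HSA problem.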