• NuXCOM_90Percent@lemmy.zip
    20 hours ago

    There is a massive push right now for energy-efficient alternatives to Nvidia GPUs for AI/ML. PLENTY of companies are dumping massive amounts of money on Macs and rapidly relearning the power-versus-performance lesson the rest of us learned decades ago.

    The reality is that this is going to be marketed for AI because it has an APU, which, keeping it simple, is a CPU and GPU on one chip. Plenty of companies are going to rush to buy them for that, and only the very limited subset without time-sensitive workloads will have a good experience.

    But yeah, this is very much geared toward light-to-moderate gaming, video rendering, and HTPCs. That is what APUs are actually good for. They make amazing workstations. I could also see this potentially being very useful for a small business/household local LLM for stuff like code generation and the like but… those small-scale models don’t need anywhere near these resources.
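
    For a sense of scale: a 7B coding model quantized to 4 bits fits in roughly 4-6 GB of memory, nowhere near what this box offers. If you do want to tinker with that kind of local code-generation setup, here is a minimal sketch, assuming an Ollama or llama.cpp server exposing its OpenAI-compatible endpoint on localhost:11434; the model name and prompt are illustrative, not anything specific to this hardware.

        # Minimal sketch: ask a small local model for code via an
        # OpenAI-compatible HTTP API (e.g. Ollama or llama.cpp's server).
        # Port, model name, and prompt are illustrative assumptions.
        import requests

        resp = requests.post(
            "http://localhost:11434/v1/chat/completions",  # Ollama's default port (assumption)
            json={
                "model": "qwen2.5-coder:7b",  # hypothetical ~7B coding model, ~4-6 GB at 4-bit
                "messages": [
                    {"role": "user", "content": "Write a Python function that parses a CSV file."},
                ],
            },
            timeout=120,
        )
        # Print the generated code from the first (and only) completion choice.
        print(resp.json()["choices"][0]["message"]["content"])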

    As for Framework being involved: someone has kindly explained to me that even though you have to replace the entire mobo to increase the amount of memory, you can still customize your side panels at any moment, so I guess that fits the mission statement.

    • ilinamorato@lemmy.world
      14 hours ago

      For modularity: There’s also modular front I/O using the existing USB-C cards, and everything they installed uses standard connectors.