Thanks in advance.

  • hendrik@palaver.p3x.de · 4 months ago

    Buy the cheapest graphics card with 16 or 24 GB of VRAM. In the past, people bought used NVIDIA RTX 3090 cards. You can also buy a GPU from AMD; they’re cheaper, but ROCm is a bit more difficult to work with. Or, if you own a MacBook or any other Apple device with an M2 or M3, use that. And hopefully you paid for enough RAM in it.
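As a rough sanity check on those VRAM figures, a common back-of-the-envelope estimate is parameter count times bytes per weight, plus some fixed overhead for the KV cache and activations. The function name and the ~2 GB overhead below are illustrative assumptions, not measurements:

```python
# Rough VRAM estimate for running a quantized LLM locally.
# The 2 GB overhead for KV cache and activations is an assumed
# ballpark figure, not something measured.

def vram_needed_gb(params_billion: float, bytes_per_weight: float,
                   overhead_gb: float = 2.0) -> float:
    """Approximate VRAM in GB: weights plus a fixed overhead."""
    weights_gb = params_billion * bytes_per_weight  # 1B params at 1 byte ~= 1 GB
    return weights_gb + overhead_gb

# An 8B model at 4-bit quantization (~0.5 bytes per weight):
print(vram_needed_gb(8, 0.5))   # ~6 GB, fits comfortably in 16 GB
# A 70B model at 4-bit is a different story:
print(vram_needed_gb(70, 0.5))  # ~37 GB, which is why 24 GB+ cards are sought after
```

This is why 16 GB comfortably covers the popular 7B–13B quantized models, while 24 GB (or unified Apple memory) opens up larger ones.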

    • thirdBreakfast@lemmy.world · 4 months ago

      An M1 MacBook with 16GB cheerfully runs llama3:8b, outputting about 5 words a second. A secondhand MacBook like that probably costs half to a third of a secondhand RTX 3090.

      It must suck to be a bargain-hunting gamer. First Bitcoin, and now AI.

      edit: a letter
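For comparison with benchmarks that report tokens per second (the unit tools like llama.cpp and Ollama print), the ~5 words a second quoted above can be converted using the usual rough ratio of about 0.75 English words per token. The ratio is a rule of thumb, not something measured here:

```python
# Convert a casual "words per second" figure into tokens per second,
# the unit local-LLM tools usually report.
WORDS_PER_TOKEN = 0.75  # assumed average for English text; varies by tokenizer

def tokens_per_second(words_per_second: float) -> float:
    return words_per_second / WORDS_PER_TOKEN

print(round(tokens_per_second(5), 1))  # roughly 6.7 tokens/s
```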

      • Damage · 4 months ago

        Patient gamers at least have the Steam Deck option now.

        • nfsm@discuss.tchncs.de · 4 months ago

          Ok, I get it now. I’ve been trying to build something cheap as a Linux gaming setup, and I’ve come to the conclusion that I’m better off buying the Steam Deck.

          • Damage · 4 months ago

            I think an older Ryzen and an RX 590 can be had for decent prices, no?

            • nfsm@discuss.tchncs.de · 4 months ago

              Yeah, but the form factor of the Steam Deck makes it more appealing if I want to set it up in the living room.

    • Fisch@discuss.tchncs.de · 4 months ago

      I actually use an AMD card for running image generation and LLMs on my PC on Linux. It’s not hard to set up.

        • Russ@bitforged.space · 4 months ago

          I’m not the original person you replied to, but I have a similar setup. I’m using a 6700 XT, with both InvokeAI and stable-diffusion-webui-forge set up to run without any issues. While I’m running Arch Linux, I have it set up in Distrobox so it’s agnostic to the distro I’m running (since I’ve hopped between quite a few distros); the container is actually an Ubuntu-based container.

          The only hiccup I ran into is that while ROCm does support this card, you need to set an environment variable for it to be picked up correctly. At the start of both sd-webui’s and InvokeAI’s launch scripts, I just use:

          # Treat the RX 6700 XT (gfx1031) as gfx1030, which ROCm officially supports
          export HSA_OVERRIDE_GFX_VERSION=10.3.0
          

          That’s all it takes, and it works perfectly. This is the link to the distrobox container file I use to get that up and running.

          • s38b35M5@lemmy.world · 4 months ago

            Thanks. I’m dabbling right now with a 2015 Intel i5 SFF and a low-profile 6400 GPU, but it looks like I’ll be getting back to all my gear soon, and I was curious to see what others are having success running.

            I think I’m looking at upgrading to a 7600 or greater GPU in a Ryzen 7 build, but I’m still on the sidelines watching the Ryzen 9000 rollout.

            I still haven’t tried any image generation; I’ve only used llamafile and LM Studio, but I’d like to dig a little deeper, while accounting for my dreaded ADHD that makes it miserable to learn new skills…

        • Fisch@discuss.tchncs.de · 4 months ago

          I have Fedora installed on my system (I don’t know how the situation is on other distros regarding ROCm) and my GPU is an RX 6700 XT. For image generation I use Stable Diffusion WebUI, and for LLMs I use text-generation-webui. Both installed everything they needed by themselves and work perfectly fine on my AMD GPU. I can also give you more info if there’s anything else you wanna know.