My server currently has a 9th-gen Intel i7 CPU with integrated Intel graphics.

I don’t use or need AI or LLM stuff, but our family uses Jellyfin extensively.

So far Jellyfin has always worked perfectly fine, but I could add (for free) an NVIDIA 2060 or a 1060. Would it be worth it?

And as for power consumption, would the increase be noticeable? Should I do it or pass?

  • variants@possumpat.io · 19 minutes ago

    Host steam-headless and use the GPU for that, so you can have remote gaming on your phone anywhere you have 5G.

    • model_tar_gz@lemmy.world · 6 minutes ago

      Holy shit that’s awesome.

      My work gives me unlimited uptime on a dedicated A10; I’m an AI engineer working for NVIDIA. I doubt they’d care if I set up something like this, or that they’d even know.

      But I have a 4090 at home anyway, so, like, do I even need this, or is it just another way to explore the infinite hobby of tinkering with computing?

  • kevincox@lemmy.ml · 1 hour ago

    Most Intel GPUs are great at transcoding: reliable, widely supported, and they offer quite a bit of transcoding power for very little electrical power.

    I think the main thing I would check is which formats are supported. If the other GPU supports newer formats like AV1, it may be worth it (if you want to store your videos in these more efficient formats, or you have clients that can consume them and will appreciate the reduced bandwidth); a quick way to check is sketched at the end of this comment.

    But overall I would say that if you aren’t having any problems, there’s no need to bother. The onboard graphics are simple and efficient.
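
    For what it’s worth, a quick way to see what your hardware transcoding options look like is to ask ffmpeg itself. Below is a minimal Python sketch, assuming Jellyfin’s bundled ffmpeg lives at the usual Debian/Ubuntu path; the encoder names are the standard QSV/NVENC ones, but whether one actually works still depends on the GPU generation (for example, AV1 encoding needs an Intel Arc or RTX 40-series class card).

    ```python
    #!/usr/bin/env python3
    """Rough check of which hardware encoders an ffmpeg build exposes."""
    import subprocess

    # Assumption: default path of Jellyfin's bundled ffmpeg on Debian/Ubuntu.
    FFMPEG = "/usr/lib/jellyfin-ffmpeg/ffmpeg"

    # Common Intel QuickSync (qsv) and NVIDIA (nvenc) hardware encoder names.
    INTERESTING = [
        "h264_qsv", "hevc_qsv", "av1_qsv",
        "h264_nvenc", "hevc_nvenc", "av1_nvenc",
    ]

    def list_encoders(ffmpeg: str) -> str:
        """Return ffmpeg's '-encoders' listing as plain text."""
        result = subprocess.run(
            [ffmpeg, "-hide_banner", "-encoders"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    if __name__ == "__main__":
        encoders = list_encoders(FFMPEG)
        for name in INTERESTING:
            status = "listed by this build" if name in encoders else "not in this build"
            print(f"{name:12s} {status}")
    ```

    Note that an encoder being listed only means the build was compiled with it; the driver and GPU still have to support the codec at runtime.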

  • exu@feditown.com · 3 hours ago

    QuickSync is usually plenty to transcode. You will get more performance with a dedicated GPU, but the power consumption will increase massively.

    Nvidia also has a limit on how many streams can be transcoded at the same time. There are driver hacks to circumvent that.

  • precarious_primes@lemmy.ml · 1 hour ago

    I ran a 1650 Super for a while. At idle it added about 10 W, and it would draw 30-40 W while transcoding. I ended up taking it out because the increased power draw wasn’t worth the slight performance gain for me.
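
    In case anyone wants to reproduce numbers like these on their own box, here’s a rough Python sketch that samples the NVIDIA card’s reported board power alongside the Intel CPU package power. It assumes nvidia-smi is on the PATH and a typical Intel RAPL sysfs node (the exact path can vary, and reading it may require root); run it once at idle and once during a transcode and compare.

    ```python
    #!/usr/bin/env python3
    """Sample GPU board power and CPU package power to compare idle vs. transcoding."""
    import subprocess
    import time

    # Assumption: typical RAPL node for the CPU package; adjust if your system differs.
    RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def nvidia_power_w() -> float:
        """Instantaneous board power reported by the NVIDIA driver, in watts."""
        result = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return float(result.stdout.strip().splitlines()[0])

    def cpu_package_power_w(interval_s: float = 1.0) -> float:
        """Average CPU package power over interval_s, from the RAPL energy counter."""
        def read_uj() -> int:
            with open(RAPL_ENERGY) as f:
                return int(f.read())
        start = read_uj()
        time.sleep(interval_s)
        end = read_uj()
        # Energy is reported in microjoules; ignores the (rare) counter wrap-around.
        return (end - start) / 1e6 / interval_s

    if __name__ == "__main__":
        samples = 5
        gpu = sum(nvidia_power_w() for _ in range(samples)) / samples
        cpu = sum(cpu_package_power_w() for _ in range(samples)) / samples
        print(f"NVIDIA board power: ~{gpu:.1f} W")
        print(f"CPU package power:  ~{cpu:.1f} W")
    ```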

    • Shimitar (OP) · 1 hour ago

      Yeah, that looks like a lot… Probably not worth it.