My server currently has a 9th-gen Intel i7 CPU with integrated Intel graphics.
I don’t use or need AI or LLM stuff, but we use Jellyfin extensively in the family.
So far Jellyfin has always worked perfectly fine, but I could add (for free) an NVIDIA 2060 or a 1060. Would it be worth it?
And as for power consumption, will the increase be noticeable? Should I do it or pass?
Host steam-headless and use the GPU for that so you can have remote gaming on your phone anywhere you have 5G
Holy shit that’s awesome.
My work gives me unlimited uptime on a dedicated A10 (I’m an AI engineer working for NVIDIA). I doubt they’d care if I set something up like this, or that they’d even know.
But I have a 4090 at home anyway, so, like, do I even need it, or is this just another way to explore the infinite hobby of tinkering with computers?
Most Intel GPUs are great at transcoding. Reliable, widely supported and quite a bit of transcoding power for very little electrical power.
I think the main thing I would check is what formats are supported. If the other GPU supports newer formats like AV1, it may be worth it (if you want to store your videos in those more efficient formats, or you have clients that can consume them and will appreciate the reduced bandwidth). A quick way to check what your current setup can encode is sketched below.
But overall I would say if you aren’t having any problems, there’s no need to bother. The onboard graphics are simple and efficient.
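Here’s a rough Python sketch for that check (assuming ffmpeg is on your PATH; the keyword list is just an example). It only shows what your ffmpeg build was compiled with, not what the GPU hardware itself accelerates, but it’s a quick first look:

```python
# List hardware-related encoders exposed by the local ffmpeg build.
import subprocess

def hw_encoders() -> list[str]:
    """Return encoder names that look like QSV/NVENC/VAAPI/AV1 encoders."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    keywords = ("qsv", "nvenc", "vaapi", "av1")   # adjust to taste
    names = []
    for line in out.splitlines():
        parts = line.split()
        # Encoder rows look like: "V....D h264_qsv   H.264 (Intel Quick Sync Video)"
        if len(parts) > 1 and any(k in parts[1] for k in keywords):
            names.append(parts[1])
    return names

if __name__ == "__main__":
    for name in hw_encoders():
        print(name)
```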
If it is working for you as is, there’s no need to make a change.
QuickSync is usually plenty for transcoding. You will get more performance with a dedicated GPU, but the power consumption will increase massively.
Nvidia also limits how many streams its consumer cards can transcode at the same time. There are driver patches to circumvent that.
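If you do add the card, here’s a small sketch to keep an eye on that limit while Jellyfin is transcoding (assuming the NVIDIA driver and nvidia-smi are installed, and that your driver version exposes the encoder.stats.sessionCount query field):

```python
# Poll the number of active NVENC encoder sessions on GPU 0 via nvidia-smi.
import subprocess

def nvenc_session_count() -> int:
    """Return the current count of active NVENC sessions reported by the driver."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=encoder.stats.sessionCount",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    print(f"Active NVENC sessions: {nvenc_session_count()}")
```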
I ran a 1650 Super for a while. At idle it added about 10 W and would draw 30-40 W while transcoding. I ended up taking it out because the increased power wasn’t worth the slight performance increase for me.
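For a rough sense of what that extra draw costs over a year, here’s a back-of-the-envelope sketch using those numbers; the electricity price and daily transcoding hours are assumptions, not measurements:

```python
# Back-of-the-envelope yearly cost of the extra GPU power draw.
IDLE_W, TRANSCODE_W = 10, 35          # extra watts reported above (35 ~ middle of 30-40)
PRICE_PER_KWH = 0.30                  # assumed electricity price, EUR/kWh
TRANSCODE_HOURS_PER_DAY = 4           # assumed daily transcoding time

idle_hours = 24 - TRANSCODE_HOURS_PER_DAY
daily_kwh = (IDLE_W * idle_hours + TRANSCODE_W * TRANSCODE_HOURS_PER_DAY) / 1000
yearly_cost = daily_kwh * 365 * PRICE_PER_KWH
print(f"{daily_kwh:.2f} kWh/day, roughly {yearly_cost:.0f} EUR/year")
```

With those assumptions it works out to about 0.34 kWh a day, somewhere in the ballpark of 35-40 EUR a year just for having the card in the box.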
Yeah, that looks like a lot… Probably not worth it.