So I'm familiar with some of the pros and cons of a dedicated GPU. I know it will be idle 95% of the time, and that if I allow enough time (hours/days) the CPU would eventually handle the workload. However, for my use-case I think it would be a good fit and help with future-proofing, so I wanted to get some suggestions.
I am currently using the GPU in my remote gaming PC for my Immich jobs, and for the most part I have it working pretty well. There were some issues, though: setting up ROCm was complex, some of the models would not download correctly within the container, I have to keep the container running at all times, and apparently not ALL Immich jobs can even use the remote ML GPU (?).
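For reference, moving the ML container local would look something like the snippet below. This is only a hedged sketch based on Immich's hardware-acceleration docs: the `release-cuda` image tag, the `/cache` mount path, and the compose GPU reservation syntax are assumptions here, so check the current Immich docs before using it.

```yaml
# Sketch of a local CUDA-backed Immich machine-learning service.
# Tag names, ports, and paths are assumptions -- verify against the docs.
services:
  immich-machine-learning:
    image: ghcr.io/immich-app/immich-machine-learning:release-cuda
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]  # hand the NVIDIA GPU to the container
    volumes:
      - model-cache:/cache  # persist downloaded models between restarts
    restart: always

volumes:
  model-cache:
```

With the service running on the same host, the Immich server can point at it via its machine-learning URL setting instead of a remote box, which sidesteps the always-on remote container problem.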
So this is why I would like to get ALL possible jobs to use the GPU, and having it local seems like the only way to do that? Plus, AI models are increasing in size and will require more and more resources as time goes on. I could also use larger models to further improve results in any Immich job.
So I am thinking of an NVIDIA RTX 5060, dual-fan model (2.5-slot design), which should fit in the case fine.
Suggestions here?