eGPU passthrough for LLM hosting

Hi All:

I’m new to this forum, so apologies upfront if the topic has already been addressed.

Currently running Dragonfish-24.04.2 (upgrading soon to ElectricEel and beyond). I have modest 9-year-old hardware that has served me well thanks to iXsystems’ development prowess and vision. Kudos!

I’m curious to hear about any experiences using an OCuLink-connected eGPU for passthrough to apps/VMs. My use case would be hosting a local LLM.

I realize I’ll likely need to upgrade the hardware. But I would be remiss if I went down that rabbit hole without asking for insight from anyone who’s already jumped into the deep end :grinning:

Feedback would be appreciated.
Thanks!
