Anyone use Electric Eel, Ollama and Nvidia?

Has anyone run Ollama in a container with nvidia-smi support in Electric Eel?
I have a Dell PowerEdge T640 with an NVIDIA RTX 2080 Ti and I'm unable to run anything with --gpus all or see nvidia-smi in my shell. There also seems to be an AppArmor policy (or something similar) preventing me from installing the drivers manually as root.
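A couple of quick sanity checks might narrow this down, assuming Docker is the container runtime (Electric Eel moved apps from k3s to Docker). The CUDA image tag below is just an example, not a required version:

```shell
# 1) Is the NVIDIA driver loaded on the host at all?
command -v nvidia-smi >/dev/null && nvidia-smi || echo "nvidia-smi not on PATH"

# 2) Can a container see the GPU via the NVIDIA runtime?
#    (image tag is an example; any CUDA base image with nvidia-smi works)
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi \
  || echo "container could not access the GPU"
```

If step 1 fails, the host driver isn't active and no container flag will help; if step 1 works but step 2 fails, it points at the container runtime / NVIDIA toolkit integration rather than the driver.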

I’m in a similar boat…

I'm running EE on a T620 with a 1660 Super and the NVIDIA 550 drivers, and the Ollama log shows:

library=cuda variant=v12 compute=7.5 driver=12.4 name="NVIDIA GeForce GTX 1660 SUPER" total="5.8 GiB" available="5.7 GiB"

However, when loading a small LLM it doesn't seem to use the GPU at all (it worked fine before I had to re-install under EE) and I'm struggling to find out why.

The server has a P4 and the 1660 Super in it, and both have been usable by the system across apps previously. Neither GPU is isolated for passthrough, and from the logs at least the 1660 Super is detected during Ollama startup.

Trying to diagnose what's happening here…
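One way to see whether a loaded model actually lands on the GPU, assuming the ollama client and nvidia-smi are reachable from a shell (both guarded here so the commands degrade gracefully if they aren't):

```shell
# `ollama ps` lists loaded models with a PROCESSOR column (e.g. "100% GPU"
# vs "100% CPU") — run this while the model is loaded.
command -v ollama >/dev/null && ollama ps || echo "ollama client not on PATH"

# Per-card utilization and memory; with two cards the indexes should map
# to the P4 and the 1660 Super.
command -v nvidia-smi >/dev/null \
  && nvidia-smi --query-gpu=index,name,utilization.gpu,memory.used --format=csv \
  || echo "nvidia-smi not on PATH"
```

If `ollama ps` reports CPU while the log claims the card was detected, that points at runtime GPU access from inside the app rather than detection.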

I have Open WebUI running (I use it to access the Ollama instance) and I've tried with the GPU both enabled and disabled in the web interface, while leaving it always enabled in Ollama.

Previously I could specify a device ID to use; maybe that's the issue here?
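On the device-ID point: Ollama's CUDA backend honours CUDA_VISIBLE_DEVICES, so if the app lets you set environment variables you can pin it to one card. The ordinal 0 below is an example, not a known mapping; check `nvidia-smi -L` on the host for the real IDs:

```shell
# Pin CUDA (and hence Ollama) to a single GPU by ordinal.
# 0 is an example — confirm which index is the 1660 Super via `nvidia-smi -L`.
export CUDA_VISIBLE_DEVICES=0

# With both the P4 and the 1660 Super installed, this hides the other card
# from CUDA so Ollama can't pick the wrong device.
echo "CUDA will see device(s): $CUDA_VISIBLE_DEVICES"
```

That would rule out Ollama silently selecting the P4 (or falling back to CPU) when both cards are visible.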

Keeping at it.

And then I bounced everything and it kicked into life. Who knows… :)

The install process was just: scrub both Open WebUI and Ollama, upgrade to EE, re-install Open WebUI and Ollama, ensure the driver option was selected during setup and both GPUs were available to the system, then bounce everything, and off it trotted…