I’m running TrueNAS 25.04.0 and installed Ollama and Open WebUI. Through Open WebUI I downloaded two models, but when I try to use the chat I get the following error (with both models):
500: Ollama: 500, message=‘Internal Server Error’, url=‘httpxxx//192.168.1.100:30068/api/chat’
Got it running: the problem was that the models I used were too large for my CPU.
I asked Gemini about suitable models for an Intel N355 CPU and have now installed the following:
gemma3:1b
qwen2:1.5b
tinyllama:latest
They are quite slow; the main problem is that, according to the logs, Ollama can’t use the Intel GPU of Alder Lake systems.
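For anyone who wants to test a model outside of Open WebUI, here is a minimal sketch of a request to the same `/api/chat` endpoint from the error message above. The IP and port are from my setup, and the model name is one of the small ones I listed, so adjust both to yours:

```python
import json
import urllib.request

# Endpoint from the error message above (IP/port of my Ollama install; adjust to yours).
OLLAMA_URL = "http://192.168.1.100:30068/api/chat"

def chat(model: str, prompt: str) -> str:
    """Send one non-streaming chat request to Ollama and return the reply text."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single JSON response instead of a stream
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the reply under message.content
        return json.loads(resp.read())["message"]["content"]

# Example call (only works when the Ollama server is reachable):
# print(chat("gemma3:1b", "Hello, are you working?"))
```

If this returns a 500 as well, the problem is on the Ollama side (e.g. the model not fitting in memory) rather than in Open WebUI.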