Open Web UI won't chat with Ollama

I’m running TrueNAS 25.04.0 and installed Ollama and Open Web UI. Through Open Web UI I downloaded two models, but when I try to use the chat I get the following error (with both models):

500: Ollama: 500, message='Internal Server Error', url='http://192.168.1.100:30068/api/chat'
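One way to narrow this down is to bypass Open Web UI and call Ollama's `/api/chat` endpoint directly; if that also returns a 500, the problem is in Ollama itself (e.g. the model failing to load), not in Open Web UI. A minimal sketch, assuming the host/port from the error message above and a model name you have actually pulled:

```python
import json
import urllib.request

# Host/port taken from the error message; adjust to your setup.
OLLAMA_URL = "http://192.168.1.100:30068/api/chat"

# Minimal request body for Ollama's /api/chat endpoint.
payload = {
    "model": "gemma3:1b",  # any model you have pulled
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,       # return one JSON object instead of a stream
}

def build_request(url: str = OLLAMA_URL) -> urllib.request.Request:
    """Build the POST request; actually sending it is left to the caller."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (requires a reachable Ollama instance):
# with urllib.request.urlopen(build_request(), timeout=120) as resp:
#     print(json.load(resp)["message"]["content"])
```

If the direct call fails too, the Ollama container logs usually show the underlying cause (in this case, the model being too large to load).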

Frankly, I don’t have any idea what could be wrong.

I used this guide to get me up and running.


Got it running. The problem was that the models I used were too large for my CPU.
I asked Gemini about suitable models for an Intel N355 CPU and have now installed the following:

  • gemma3:1b
  • qwen2:1.5b
  • tinyllama:latest
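As a rough sanity check before pulling a model, you can estimate whether its weights will fit in RAM. A common rule of thumb (an assumption, not a spec) is that a 4-bit-quantized model needs on the order of 0.5–0.7 bytes per parameter for the weights alone, ignoring KV cache and runtime overhead:

```python
def approx_model_ram_gb(params_billions: float, bytes_per_param: float = 0.6) -> float:
    """Rough RAM estimate for a ~4-bit-quantized model (weights only,
    ignoring KV cache and runtime overhead) -- a rule of thumb, not a spec."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# Approximate parameter counts for the models listed above.
for name, size_b in [("gemma3:1b", 1.0), ("qwen2:1.5b", 1.5), ("tinyllama:latest", 1.1)]:
    print(f"{name}: ~{approx_model_ram_gb(size_b):.1f} GB")
```

By this estimate all three models stay well under 1 GB of weights, which is why they run on a small N-series CPU where a 7B+ model would not.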

They are quite slow; the main problem is that Ollama can’t use the Intel GPU of Alder Lake systems, according to the logs.

I’m trying to figure out how to get my Arc A750 working with Ollama and LLMs on TrueNAS.