Here on FT, and click-click: Ollama installed.
Ok… hmm, “Ollama is running”.
Ok, now I need “Open WebUI”. Click-click. Done.
It needs a model. Ok, got it.
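For anyone wondering what “got it” means, this is roughly what I ran from a terminal (a sketch, assuming the Ollama CLI is on the PATH; smollm is just the small model I picked):

# pull a small model into Ollama's local store
ollama pull smollm
# confirm it actually landed
ollama list
# quick sanity check straight from the terminal, bypassing Open WebUI
ollama run smollm "say hi"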
I could go on forever, to no avail.
Question: is there a simple guide for clueless click-click-ers (like me)?
Because I thought that installing (click-click, accept the defaults) would be enough to get to the “UI / Site” and have fun.
500: Ollama: 500, message='Internal Server Error', url='http://localhost:11434/api/chat'
That is the error I get when asking a small model (smollm:latest) anything.
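One way to rule out Open WebUI itself would be to hit the same endpoint directly (a sketch, assuming Ollama really is on the default port 11434 that the error URL mentions):

# does the server answer at all? this should list the local models
curl http://localhost:11434/api/tags
# try the exact endpoint from the 500 error
curl http://localhost:11434/api/chat -d '{
  "model": "smollm:latest",
  "messages": [{"role": "user", "content": "hello"}],
  "stream": false
}'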
Edit:
time=2025-05-10T02:56:38.903Z level=INFO source=images.go:463 msg="total blobs: 0"
time=2025-05-10T02:56:38.903Z level=INFO source=images.go:470 msg="total unused blobs removed: 0"
time=2025-05-10T02:56:38.905Z level=INFO source=routes.go:1300 msg="Listening on [::]:30068 (version 0.6.8)"
time=2025-05-10T02:56:38.907Z level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-05-10T02:56:38.916Z level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
time=2025-05-10T02:56:38.916Z level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver=0.0 name="" total="7.8 GiB" available="2.4 GiB"
I think the GPU is the problem: the log says no compatible GPUs were discovered, so everything is running on CPU.
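Though comparing the log with the error, two other things stand out: the log says Ollama is listening on port 30068 while the error URL points at localhost:11434, and “total blobs: 0” would mean this instance holds no models at all. A rough checklist I still want to try (OLLAMA_BASE_URL is the Open WebUI setting for the Ollama address, per its docs; the port is just the one from my log):

# does Ollama answer on the port the log reports?
curl http://localhost:30068/api/tags
# if it does, point Open WebUI at that address instead of the default,
# via its OLLAMA_BASE_URL environment variable (set it however your
# platform exposes env vars, before starting Open WebUI)
export OLLAMA_BASE_URL=http://localhost:30068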