My question is: should I start with completely new server hardware to run Ollama LLMs,
or
does it make sense / is it even possible to keep using my current hardware (AOOSTAR WTR PRO N150, 16 GB RAM) and attach an NVIDIA card via eGPU (Razer Core X) to speed up Ollama LLMs?
Are you planning to run the models locally? If so, your experience is not going to be a good one even with an external GPU. I would suggest looking into using OpenRouter with the hardware you listed.
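For reference, OpenRouter exposes an OpenAI-compatible chat completions endpoint, so calling it from that N150 box is just an HTTP POST with an API key; no GPU needed. A minimal sketch with the Python standard library (the model slug shown in the comment is an assumption, check OpenRouter's model list for current names):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for OpenRouter."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        OPENROUTER_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Actually sending it needs a real key and network access, e.g.:
# req = build_request("sk-or-...", "meta-llama/llama-3.1-8b-instruct", "Hello")
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The inference then runs on OpenRouter's side, so local RAM and GPU stop being the bottleneck; the trade-off is per-token cost and sending your prompts off-box.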
It depends entirely on what your goals are. I like to play around with free stuff since I'm very new to this. Ultimately you'll have to decide what works best for you and what you want to accomplish, then choose the model that best fits those goals.