Skibo Posted August 11
Hello, I installed Ollama by joly0. I have installed the Nvidia drivers for my rig and everything is working, BUT I can't seem to download the models. I keep getting "Open WebUI: Server Connection Error", and looking at the logs I see: INFO: <myinternalIP>:55648 - "POST /ollama/api/pull/0 HTTP/1.1" 500 Internal Server Error. Can someone tell me what I am doing wrong, or what I'm not seeing?
JorgeB Posted August 11
Look for the appropriate support thread:
Skibo Posted August 11 (Author)
Sorry JorgeB, the support page talks about the actual Open WebUI & Ollama configuration when installing it on your system. I have installed Ollama and Open WebUI on my laptop and it works great; however, it's something to do with Unraid affecting the outcome.
Skibo Posted August 22 (Author)
Well, it was a good run, but Unraid has been a big epic failure.
CasaP Posted September 8
You need to pull the models from the "console" of the docker container using a command like this:
ollama pull hermes3:latest
After it downloads, you need to refresh the browser to see it.
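If you'd rather run it from the Unraid terminal instead of opening the container's console, something along these lines should also work. This is only a sketch: "ollama" is an assumed container name, so substitute whatever your Ollama docker is actually called, and whichever model tag you want to pull.
docker exec -it ollama ollama pull hermes3:latest
Once the pull finishes, refresh Open WebUI and the model should appear in the model list.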