{"id":2533,"date":"2025-12-06T21:55:02","date_gmt":"2025-12-06T20:55:02","guid":{"rendered":"https:\/\/yellotab.se\/x056\/?p=2533"},"modified":"2025-12-29T19:37:40","modified_gmt":"2025-12-29T18:37:40","slug":"webb-gui-ollama","status":"publish","type":"post","link":"https:\/\/yellotab.se\/x056\/2025\/12\/06\/webb-gui-ollama\/","title":{"rendered":"Webb GUI Ollama"},"content":{"rendered":"\n<h1 class=\"wp-block-heading\">\u2705&nbsp;<strong>Easiest: Install a Ready-Made Web Interface<\/strong><\/h1>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>1. Open WebUI (formerly Ollama WebUI)<\/strong>&nbsp;\u2013&nbsp;<em>most popular, simple setup<\/em><\/h2>\n\n\n\n<p>Works with any Ollama model, including ministral-3:3b.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Install<\/strong><\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code class=\"\" data-line=\"\">docker run -d -p 3000:8080 \\\n  --add-host=host.docker.internal:host-gateway \\\n  -v open-webui:\/app\/backend\/data \\\n  --name open-webui \\\n  --restart always \\\n  ghcr.io\/open-webui\/open-webui:main\n<\/code><\/pre>\n\n\n\n<p>(The container listens on port 8080 internally, mapped here to 3000 on the host;&nbsp;<code class=\"\" data-line=\"\">--add-host<\/code> lets the container reach Ollama running on the host.)<\/p>\n\n\n\n<p>Then open in browser:<\/p>\n\n\n\n<p>\u27a1\ufe0f&nbsp;<strong><a href=\"http:\/\/localhost:3000\">http:\/\/localhost:3000<\/a><\/strong><\/p>\n\n\n\n<p>Ollama must be running:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code class=\"\" data-line=\"\">ollama serve &amp;\n<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Pull a model:<\/strong><\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code class=\"\" data-line=\"\">ollama pull ministral:3b\n<\/code><\/pre>\n\n\n\n<p>(or the exact tag you want, e.g.&nbsp;<code class=\"\" data-line=\"\">ministral-3:3b<\/code>)<\/p>\n\n\n\n<p>Open WebUI will detect it automatically.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>2. 
Open WebUI via pip (no Docker)<\/strong><\/h2>\n\n\n\n<p>If you don\u2019t want Docker, Open WebUI can also be installed with pip (it requires Python 3.11):<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Install<\/strong><\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code class=\"\" data-line=\"\">pip install open-webui\nopen-webui serve\n<\/code><\/pre>\n\n\n\n<p>Browse to:<\/p>\n\n\n\n<p>\u27a1\ufe0f&nbsp;<strong><a href=\"http:\/\/127.0.0.1:8080\">http:\/\/127.0.0.1:8080<\/a><\/strong><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>3. Continue.dev (IDE coding assistant)<\/strong><\/h2>\n\n\n\n<p>Continue.dev is an open-source coding-assistant extension for VS Code and JetBrains, installed from the editor\u2019s extension marketplace rather than run as a standalone web app. It supports Ollama as a backend for coding workflows.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h1 class=\"wp-block-heading\">\ud83c\udfa8&nbsp;<strong>More advanced UIs (specialized)<\/strong><\/h1>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>4. AnythingLLM<\/strong><\/h2>\n\n\n\n<p>A team-oriented interface with memory &amp; agents.<\/p>\n\n\n\n<p>Install:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code class=\"\" data-line=\"\">docker run -d -p 3001:3001 \\\n  --name anythingllm \\\n  mintplexlabs\/anythingllm\n<\/code><\/pre>\n\n\n\n<p>In the settings, choose Ollama as the LLM provider \u2192 pick your ministral model.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>5. 
LM Studio (desktop app UI for local models)<\/strong><\/h2>\n\n\n\n<p>Desktop GUI for macOS, Windows, and Linux.<br>It manages its own models (GGUF files) and can run alongside Ollama.<\/p>\n\n\n\n<p>Download:&nbsp;<a href=\"https:\/\/lmstudio.ai\/\">https:\/\/lmstudio.ai<\/a>&nbsp;(does not require Docker)<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h1 class=\"wp-block-heading\">\ud83d\udee0\ufe0f&nbsp;<strong>Create Your Own Web Interface<\/strong><\/h1>\n\n\n\n<p>If you want a custom UI:<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Minimal HTML\/JS Chat UI<\/strong><\/h2>\n\n\n\n<p>Uses the Ollama HTTP API at&nbsp;<code class=\"\" data-line=\"\">http:\/\/localhost:11434\/api\/chat<\/code>, which streams its response as newline-delimited JSON objects.<\/p>\n\n\n\n<p>Create a file&nbsp;<code class=\"\" data-line=\"\">index.html<\/code>:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code class=\"\" data-line=\"\">&lt;!DOCTYPE html&gt;\n&lt;html&gt;\n&lt;body&gt;\n&lt;textarea id=&quot;input&quot;&gt;&lt;\/textarea&gt;\n&lt;button onclick=&quot;send()&quot;&gt;Send&lt;\/button&gt;\n&lt;pre id=&quot;output&quot;&gt;&lt;\/pre&gt;\n\n&lt;script&gt;\nasync function send() {\n  const msg = document.getElementById(&#039;input&#039;).value;\n  const out = document.getElementById(&#039;output&#039;);\n  out.textContent = &#039;&#039;;\n\n  const response = await fetch(&#039;http:\/\/localhost:11434\/api\/chat&#039;, {\n    method: &quot;POST&quot;,\n    headers: {&quot;Content-Type&quot;:&quot;application\/json&quot;},\n    body: JSON.stringify({\n      model: &quot;ministral:3b&quot;,\n      messages: &#091;{role:&quot;user&quot;, content: msg}]\n    })\n  });\n\n  \/\/ Ollama streams newline-delimited JSON objects\n  const reader = response.body.getReader();\n  const decoder = new TextDecoder();\n  let buffer = &quot;&quot;;\n\n  while (true) {\n    const { value, done } = await reader.read();\n    if (done) break;\n    buffer += decoder.decode(value, { stream: true });\n    const lines = buffer.split(&quot;\\n&quot;);\n    buffer = lines.pop(); \/\/ keep any incomplete line for the next chunk\n    for (const line of lines) {\n      if (!line.trim()) continue;\n      const data = JSON.parse(line);\n      if (data.message) out.textContent += data.message.content;\n    }\n  }\n}\n&lt;\/script&gt;\n&lt;\/body&gt;\n&lt;\/html&gt;\n<\/code><\/pre>\n\n\n\n<p>Open the file in your browser (Ollama must be running). The script parses each streamed JSON line and appends&nbsp;<code class=\"\" data-line=\"\">message.content<\/code> to the output.<\/p>\n\n\n\n<hr class=\"wp-block-separator 
has-alpha-channel-opacity\"\/>\n\n\n\n<h1 class=\"wp-block-heading\">\ud83d\udca1 Which option should you pick?<\/h1>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"table has-fixed-layout\"><thead><tr><th>Goal<\/th><th>Best Option<\/th><\/tr><\/thead><tbody><tr><td><strong>Simple chat UI<\/strong><\/td><td><strong>Open WebUI<\/strong><\/td><\/tr><tr><td><strong>Full-featured RAG, embeddings, agents<\/strong><\/td><td><strong>AnythingLLM<\/strong><\/td><\/tr><tr><td><strong>No Docker<\/strong><\/td><td><strong>Open WebUI (pip)<\/strong><\/td><\/tr><tr><td><strong>Want to code your own<\/strong><\/td><td>HTML\/JS example above<\/td><\/tr><tr><td><strong>Coding assistant<\/strong><\/td><td>Continue.dev or LM Studio<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n","protected":false},"excerpt":{"rendered":"<p>\u2705&nbsp;Easiest: Install a Ready-Made Web Interface 1. Open WebUI (formerly Ollama WebUI)&nbsp;\u2013&nbsp;most popular, simple setup Works with any Ollama model including ministral-3:3b. Install Then open in browser: \u27a1\ufe0f&nbsp;http:\/\/localhost:3000 Ollama must be running: Use model: (or the exact tag you want, e.g.&nbsp;ministral-3:3b) Open WebUI will automatically detect it. 2. 
Ollama WebUI Lite (no Docker) If you [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[236],"tags":[],"class_list":["post-2533","post","type-post","status-publish","format-standard","hentry","category-llm"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/posts\/2533","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/comments?post=2533"}],"version-history":[{"count":1,"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/posts\/2533\/revisions"}],"predecessor-version":[{"id":2534,"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/posts\/2533\/revisions\/2534"}],"wp:attachment":[{"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/media?parent=2533"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/categories?post=2533"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/yellotab.se\/x056\/wp-json\/wp\/v2\/tags?post=2533"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}