Expose Open WebUI
Share your local Open WebUI (ChatGPT-like interface for Ollama) with anyone.
What is Open WebUI?
Open WebUI is a self-hosted ChatGPT-style frontend for Ollama and other LLM backends. The container listens on port 8080 internally and is commonly mapped to port 3000 on the host.
Setup
1. Start Open WebUI, typically via Docker: docker run -p 3000:8080 ghcr.io/open-webui/open-webui:main
2. Run the Skytunnel command with the host port (3000).
3. Share the public URL — anyone can chat with your local LLMs through a clean web interface.
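The steps above can be sketched as a short command sequence. The container name, volume name, and the exact Skytunnel invocation are assumptions for illustration; check your Skytunnel documentation for the real syntax.

```shell
# Start Open WebUI in the background, mapping container port 8080 to host port 3000.
# The --name and -v values are illustrative; the volume persists chat data across restarts.
docker run -d --name open-webui \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main

# Point Skytunnel at the host port you mapped (3000 here).
# Hypothetical invocation — adjust to your installed CLI's actual syntax.
skytunnel 3000
```

Skytunnel then prints a public URL; anyone with that URL reaches your local Open WebUI instance.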
FAQ
Do I also need to expose Ollama?
No — as long as Open WebUI reaches Ollama over localhost or Docker's internal network, only the Open WebUI port needs to be exposed. Ollama itself stays private.
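One way to keep Ollama private is to run both services on a shared Docker Compose network and publish only the Open WebUI port. A minimal sketch, assuming the standard `ollama/ollama` image and Open WebUI's `OLLAMA_BASE_URL` environment variable; service and volume names are illustrative:

```yaml
# docker-compose.yml — only Open WebUI's port is published to the host.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    # No "ports:" entry: Ollama is reachable only inside the compose network.

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Open WebUI talks to Ollama via the internal service name, not the host.
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"   # the only port exposed to the host (and to Skytunnel)
    depends_on:
      - ollama

volumes:
  ollama:
```

With this layout, tunneling host port 3000 shares the chat interface without ever exposing the Ollama API.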