openwebui

Open WebUI AI chat interface management via Podman Quadlet. Provides a web UI for interacting with Ollama models. Use when users need to configure, start, or manage the Open WebUI service.

Updated 1/26/2026

SKILL.md

Open WebUI - AI Chat Interface

Overview

The openwebui command manages the Open WebUI service using Podman Quadlet containers. It provides a web-based chat interface for interacting with Ollama LLM models.

Key Concept: Open WebUI connects to Ollama via the bazzite-ai network using DNS (http://ollama:11434). Ensure Ollama is running before using Open WebUI.
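
Before starting Open WebUI, you can check from the host that Ollama is answering. This sketch assumes Ollama publishes its default port 11434 on localhost; adjust the address if your setup differs:

```shell
# Quick host-side check that Ollama is up before starting Open WebUI.
# Assumes Ollama's default port 11434 is published on localhost.
check_ollama() {
  if curl -sf http://127.0.0.1:11434/api/tags >/dev/null; then
    echo "ollama reachable"
  else
    echo "ollama not reachable"
  fi
}
check_ollama
```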

Quick Reference

| Action  | Command                          | Description                               |
|---------|----------------------------------|-------------------------------------------|
| Config  | ujust openwebui config           | Configure Open WebUI                      |
| Delete  | ujust openwebui delete           | Remove instance config and container      |
| Logs    | ujust openwebui logs [--lines=N] | View container logs                       |
| Restart | ujust openwebui restart          | Restart server                            |
| Shell   | ujust openwebui shell [-- CMD]   | Open shell or execute command in container |
| Start   | ujust openwebui start            | Start Open WebUI server                   |
| Status  | ujust openwebui status           | Show instance status                      |
| Stop    | ujust openwebui stop             | Stop Open WebUI server                    |
| URL     | ujust openwebui url              | Show web UI access URL                    |

Parameters

| Parameter  | Long Flag       | Short | Default                             | Description              |
|------------|-----------------|-------|-------------------------------------|--------------------------|
| Port       | --port          | -p    | 3000                                | Host port for web UI     |
| Image      | --image         | -i    | ghcr.io/open-webui/open-webui:main  | Container image          |
| Tag        | --tag           | -t    | main                                | Image tag                |
| Bind       | --bind          | -b    | 127.0.0.1                           | Bind address             |
| Config Dir | --config-dir    | -c    | ~/.config/openwebui/1               | Config/data directory    |
| Workspace  | --workspace-dir | -w    | (empty)                             | Workspace mount          |
| GPU Type   | --gpu-type      | -g    | auto                                | GPU type                 |
| Instance   | --instance      | -n    | 1                                   | Instance number or all   |
| Lines      | --lines         | -l    | 50                                  | Log lines to show        |

Configuration

# Default configuration (port 3000, localhost only)
ujust openwebui config

# Custom port (long form)
ujust openwebui config --port=3001

# Custom port (short form)
ujust openwebui config -p 3001

# Network-wide access
ujust openwebui config --bind=0.0.0.0

# Combine parameters (long form)
ujust openwebui config --port=3001 --bind=0.0.0.0

# Combine parameters (short form)
ujust openwebui config -p 3001 -b 0.0.0.0

# GPU-optimized image
ujust openwebui config --image=ghcr.io/open-webui/open-webui:cuda

Update Existing Configuration

Running config when already configured updates the existing settings:

# Change only the bind address
ujust openwebui config --bind=0.0.0.0

# Update port without affecting other settings
ujust openwebui config --port=3002

Container Images

| Image                                 | Description                           |
|---------------------------------------|---------------------------------------|
| ghcr.io/open-webui/open-webui:main    | Standard image (default)              |
| ghcr.io/open-webui/open-webui:cuda    | NVIDIA CUDA optimized                 |
| ghcr.io/open-webui/open-webui:ollama  | Bundled with Ollama (not recommended) |

Note: GPU is auto-detected and attached regardless of image choice.

Lifecycle Management

# Start Open WebUI
ujust openwebui start

# Stop service
ujust openwebui stop

# Restart (apply config changes)
ujust openwebui restart

# View logs (default 50 lines)
ujust openwebui logs

# View more logs (long form)
ujust openwebui logs --lines=200

# View more logs (short form)
ujust openwebui logs -l 200

# Check status
ujust openwebui status

# Show access URL
ujust openwebui url

Multi-Instance Support

# Start all instances (long form)
ujust openwebui start --instance=all

# Start all instances (short form)
ujust openwebui start -n all

# Stop specific instance
ujust openwebui stop --instance=2

# Delete all instances
ujust openwebui delete --instance=all

Shell Access

# Interactive shell
ujust openwebui shell

# Run specific command (use -- separator)
ujust openwebui shell -- ls -la /app/backend/data
ujust openwebui shell -- cat /app/backend/data/config.json

Network Architecture

Open WebUI uses the bazzite-ai bridge network for cross-container DNS:

+-------------------+     DNS      +-------------------+
|   Open WebUI      | -----------> |      Ollama       |
|   (openwebui)     |              |    (ollama)       |
|   Port 3000       |              |   Port 11434      |
+-------------------+              +-------------------+
         |                                  |
         +------ bazzite-ai network --------+
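
To confirm the shared network exists and see which containers are attached, you can query podman directly (the network name is taken from this document; the inspect template assumes podman 4.x output):

```shell
# Confirm the bazzite-ai bridge network exists (exit status 0 if present)
podman network exists bazzite-ai && echo "network present"

# List the names of containers attached to it
podman network inspect bazzite-ai --format '{{range .Containers}}{{.Name}} {{end}}'
```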

Environment Variables (injected automatically):

OLLAMA_BASE_URL=http://ollama:11434
OLLAMA_HOST=http://ollama:11434
JUPYTER_HOST=http://jupyter:8888
COMFYUI_HOST=http://comfyui:8188
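
To verify that these variables actually reached the container, you can print them from inside it using the shell subcommand described above:

```shell
# Show the injected service URLs from inside the running container
ujust openwebui shell -- env | grep -E 'OLLAMA|JUPYTER|COMFYUI'
```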

Network Binding

| Bind Address | Access         | Use Case                  |
|--------------|----------------|---------------------------|
| 127.0.0.1    | Localhost only | Default, secure           |
| 0.0.0.0      | All interfaces | Network access, Tailscale |

Security Note: Using --bind=0.0.0.0 exposes the service to your network. Consider using Tailscale for secure remote access:

# Expose via Tailscale (secure)
ujust tailscale serve --service=openwebui

Data Persistence

| Path                                | Description            |
|-------------------------------------|------------------------|
| ~/.config/openwebui/<INSTANCE>/data | Users, chats, settings |

Data persists across container restarts. Each instance has isolated data.
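
Before risky changes (image upgrades, deleting instances) it can be worth archiving an instance's data directory. This is a sketch, not part of the tool; the path layout follows the table above, so adjust it if you used --config-dir:

```shell
# Sketch: archive one instance's data directory before risky changes.
backup_instance() {
  local dir="$1" out="$2"
  # Refuse to run if the expected data subdirectory is missing
  [ -d "$dir/data" ] || { echo "no data dir under $dir" >&2; return 1; }
  tar czf "$out" -C "$dir" data
}

# Example: backup_instance ~/.config/openwebui/1 openwebui-1-$(date +%Y%m%d).tar.gz
```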

Common Workflows

Initial Setup

# 1. Ensure Ollama is running
ujust ollama start

# 2. Configure Open WebUI
ujust openwebui config

# 3. Start the service
ujust openwebui start

# 4. Access the web UI
ujust openwebui url
# Output: http://127.0.0.1:3000

Remote Access Setup

# Configure for network access
ujust openwebui config --bind=0.0.0.0

# Start the service
ujust openwebui start

# Or use Tailscale for secure access
ujust tailscale serve --service=openwebui

Upgrade Container Image

# Stop service
ujust openwebui stop

# Update to new image
ujust openwebui config --image=ghcr.io/open-webui/open-webui:main

# Restart
ujust openwebui start
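
Re-running config with the same tag may reuse the locally cached image. If the upgrade does not seem to take effect, pulling the image explicitly before restarting may help (this assumes the Quadlet unit otherwise uses the cached copy; verify against your setup):

```shell
# Force-fetch the latest build of the tag before restarting the service
podman pull ghcr.io/open-webui/open-webui:main
```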

GPU Support

GPU is automatically detected and attached:

| GPU Type | Detection  | Quadlet Config               |
|----------|------------|------------------------------|
| NVIDIA   | nvidia-smi | AddDevice=nvidia.com/gpu=all |
| AMD      | lspci      | AddDevice=/dev/dri           |
| Intel    | lspci      | AddDevice=/dev/dri           |

Check GPU status:

ujust openwebui shell -- nvidia-smi
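
The detection column in the table above could be approximated by a sketch like the following; this illustrates the idea only and is not the tool's actual implementation:

```shell
# Hypothetical sketch of GPU auto-detection; not the actual ujust implementation.
detect_gpu() {
  if command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi >/dev/null 2>&1; then
    echo nvidia   # would map to AddDevice=nvidia.com/gpu=all
  elif lspci 2>/dev/null | grep -qiE 'vga.*(amd|ati)'; then
    echo amd      # would map to AddDevice=/dev/dri
  elif lspci 2>/dev/null | grep -qi 'vga.*intel'; then
    echo intel    # would map to AddDevice=/dev/dri
  else
    echo none
  fi
}
detect_gpu
```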

Troubleshooting

Service Won't Start

# Check status
ujust openwebui status

# View logs
ujust openwebui logs --lines=100

# Check if Ollama is running
ujust ollama status

Common causes:

  • Port 3000 already in use
  • Ollama not running
  • Container image not pulled
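
To rule out the first cause, check whether another process already holds the web UI port (3000 by default; substitute your --port value):

```shell
# Report whether anything is listening on the Open WebUI port
ss -tln | grep -q ':3000 ' && echo "port 3000 in use" || echo "port 3000 free"
```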

Can't Connect to Ollama

Symptom: "No models available" in web UI

Check:

# Verify Ollama is running
ujust ollama status

# Test Ollama connection from Open WebUI container
ujust openwebui shell -- curl http://ollama:11434/api/tags

Fix:

# Start Ollama first
ujust ollama start

# Restart Open WebUI
ujust openwebui restart

Web UI Not Accessible

Symptom: Browser can't connect to http://localhost:3000

Check:

ujust openwebui status
ujust openwebui url

Fix:

# If using wrong bind address
ujust openwebui config --bind=127.0.0.1
ujust openwebui restart

Clear Data and Start Fresh

# Delete everything
ujust openwebui delete --instance=all

# Reconfigure
ujust openwebui config
ujust openwebui start

Cross-References

  • Required: ollama (Ollama must be running for models)
  • Related: jupyter (ML development), comfyui (image generation)
  • Network: Uses bazzite-ai network (shared with ollama, jupyter, comfyui)
  • Docs: Open WebUI GitHub

When to Use This Skill

Use when the user asks about:

  • "install open webui", "setup chat interface", "web ui for ollama"
  • "configure openwebui", "change port", "network access"
  • "open webui not working", "can't see models", "connection error"
  • "open webui logs", "debug open webui"
  • "delete open webui", "uninstall"

Install

Requires askill CLI v1.0+

AI Quality Score

88/100 (analyzed 2/19/2026)

High-quality skill document for managing Open WebUI via Podman Quadlet. Comprehensive coverage of configuration, lifecycle management, networking, GPU support, and troubleshooting. Well organized, with tables, diagrams, and clear command examples. Includes a "when to use" trigger section. Specific to the bazzite-ai project but structurally excellent. The score reflects strong technical accuracy, actionability, and clarity, with minor deductions for project-specific coupling.


Metadata

License: unknown
Version: -
Updated: 1/26/2026
Publisher: atrawog

Tags

api, github, llm, security, testing