Merged

sync #1171

4 changes: 3 additions & 1 deletion docs/features/extensibility/plugin/functions/action.mdx
@@ -3,12 +3,14 @@ sidebar_position: 2
title: "Action Function"
---

# 🎬 Action Function: Custom Interactive Buttons

:::danger ⚠️ Critical Security Warning
**Action Functions execute arbitrary Python code on your server.** Function creation is restricted to administrators only. Only install from trusted sources and review code before importing. A malicious Function could access your file system, exfiltrate data, or compromise your entire system. For full details, see the [Plugin Security Warning](/features/extensibility/plugin/).
:::

Action functions allow you to write custom buttons that appear in the message toolbar for end users to interact with. This feature enables more interactive messaging, allowing users to grant permission before a task is performed, generate visualizations of structured data, download an audio snippet of chats, and many other use cases.

:::warning Use Async Functions for Future Compatibility
Action functions should always be defined as `async`. The backend is progressively moving toward fully async execution, and synchronous functions may block execution or cause issues in future releases.
:::
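
To make the shape concrete, here is a minimal, hypothetical sketch of an Action. The `action` method name follows the docs' description of the plugin system, but the structure of `body` and the emitter event format shown here are illustrative assumptions, not the exact Open WebUI API:

```python
# Hypothetical Action sketch. The shape of `body` and the emitter event
# format are assumptions for illustration; consult the full Functions docs
# for the exact plugin API.
class Action:
    async def action(self, body: dict, __event_emitter__=None) -> dict:
        # Grab the content of the last chat message in the conversation.
        content = body.get("messages", [{}])[-1].get("content", "")
        if __event_emitter__:
            # Report progress back to the UI (async, per the warning above).
            await __event_emitter__(
                {"type": "status", "data": {"description": "Done", "done": True}}
            )
        return {"preview": content[:80]}
```

The button in the message toolbar would trigger `action` with the current chat as `body`; because the method is `async`, it stays compatible with the backend's move toward fully async execution.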
4 changes: 2 additions & 2 deletions docs/features/extensibility/plugin/functions/filter.mdx
@@ -3,12 +3,12 @@ sidebar_position: 3
title: "Filter Function"
---

# 🪄 Filter Function: Modify Inputs and Outputs

:::danger ⚠️ Critical Security Warning
**Filter Functions execute arbitrary Python code on your server.** Function creation is restricted to administrators only. Only install from trusted sources and review code before importing. A malicious Function could access your file system, exfiltrate data, or compromise your entire system. For full details, see the [Plugin Security Warning](/features/extensibility/plugin/).
:::

Welcome to the comprehensive guide on Filter Functions in Open WebUI! Filters are a flexible and powerful **plugin system** for modifying data *before it's sent to the Large Language Model (LLM)* (input) or *after it’s returned from the LLM* (output). Whether you’re transforming inputs for better context or cleaning up outputs for improved readability, **Filter Functions** let you do it all.

This guide will break down **what Filters are**, how they work, their structure, and everything you need to know to build powerful and user-friendly filters of your own. Let’s dig in, and don’t worry—I’ll use metaphors, examples, and tips to make everything crystal clear! 🌟
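
As a rough preview of where this guide is headed, a Filter is a class with an input hook and an output hook. The sketch below uses the `inlet`/`outlet` naming the guide describes; the exact signatures in Open WebUI may carry extra parameters, so treat this as a shape, not the definitive API:

```python
# Illustrative Filter sketch. `inlet` runs before the request reaches the
# LLM (input); `outlet` runs on the response (output).
class Filter:
    def __init__(self):
        self.redacted_words = {"password", "secret"}  # example config

    async def inlet(self, body: dict) -> dict:
        # Scrub sensitive words from every message before it is sent.
        for message in body.get("messages", []):
            for word in self.redacted_words:
                message["content"] = message["content"].replace(word, "[redacted]")
        return body

    async def outlet(self, body: dict) -> dict:
        # Tidy the model's reply on its way back to the user.
        for message in body.get("messages", []):
            message["content"] = message["content"].strip()
        return body
```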
244 changes: 165 additions & 79 deletions docs/features/extensibility/plugin/functions/index.mdx

Large diffs are not rendered by default.

3 changes: 2 additions & 1 deletion docs/features/extensibility/plugin/functions/pipe.mdx
@@ -3,11 +3,12 @@ sidebar_position: 4
title: "Pipe Function"
---

# 🚰 Pipe Function: Create Custom "Agents/Models"

:::danger ⚠️ Critical Security Warning
**Pipe Functions execute arbitrary Python code on your server.** Function creation is restricted to administrators only. Only install from trusted sources and review code before importing. A malicious Function could access your file system, exfiltrate data, or compromise your entire system. For full details, see the [Plugin Security Warning](/features/extensibility/plugin/).
:::

Welcome to this guide on creating **Pipes** in Open WebUI! Think of Pipes as a way of **adding** a new model to Open WebUI. In this document, we'll break down what a Pipe is, how it works, and how you can create your own to add custom logic and processing to your Open WebUI models. We'll use clear metaphors and go through every detail to ensure you have a comprehensive understanding.
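
In the smallest possible terms, a Pipe is a class that behaves like a model: it has a name for the model selector and a method that produces the reply. The names and method shape below follow that description but are simplified assumptions; a real Pipe would typically call out to an external API instead of echoing:

```python
# Hypothetical Pipe sketch: the class acts like a model in Open WebUI.
class Pipe:
    def __init__(self):
        self.name = "echo-pipe"  # what would appear in the model selector

    async def pipe(self, body: dict) -> str:
        # A real Pipe would run custom logic or call an external API here.
        last = body.get("messages", [{}])[-1].get("content", "")
        return f"{self.name} received: {last}"
```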

:::warning Use Async Functions for Future Compatibility
24 changes: 9 additions & 15 deletions docs/features/extensibility/plugin/index.mdx
@@ -5,20 +5,6 @@ title: "Tools & Functions (Plugins)"

# 🛠️ Tools & Functions

Imagine you've just stumbled upon Open WebUI, or maybe you're already using it, but you're a bit lost with all the talk about "Tools", "Functions", and "Pipelines". Everything sounds like some mysterious tech jargon, right? No worries! Let's break it down piece by piece, super clearly, step by step. By the end of this, you'll have a solid understanding of what these terms mean, how they work, and why it's not as complicated as it seems.


:::danger ⚠️ Critical Security Warning

**Tools, Functions, Pipes, Filters, and Pipelines execute arbitrary Python code on your server.** This is by design—it's what makes them powerful. However, this also means:
@@ -37,7 +23,15 @@ Getting started with Tools and Functions is easy because everything’s already

:::

---
## TL;DR

- **Tools** extend the abilities of LLMs, allowing them to collect real-world, real-time data like weather, stock prices, etc.
- **Functions** extend the capabilities of the Open WebUI itself, enabling you to add new AI model support (like Anthropic or Vertex AI) or improve usability (like creating custom buttons or filters).
- **Pipelines** are more for advanced users who want to transform Open WebUI features into API-compatible workflows—mainly for offloading heavy processing.

Getting started with Tools and Functions is easy because everything’s already built into the core system! You just **click a button** and **import these features directly from the community**, so there’s no coding or deep technical work required.

## What are "Tools" and "Functions"?

Let's start by thinking of **Open WebUI** as a "base" software that can do many tasks related to using Large Language Models (LLMs). But sometimes, you need extra features or abilities that don't come *out of the box*—this is where **tools** and **functions** come into play.

2 changes: 2 additions & 0 deletions docs/features/extensibility/plugin/tools/development.mdx
@@ -3,6 +3,8 @@ sidebar_position: 2
title: "Development"
---

# 🔧 Tool Development

:::danger ⚠️ Critical Security Warning
**Workspace Tools execute arbitrary Python code on your server.** Only install from trusted sources, review code before importing, and restrict Workspace access to trusted administrators only. Granting a user the ability to create or import Tools is equivalent to giving them shell access to the server. For full details, see the [Plugin Security Warning](/features/extensibility/plugin/).
:::
4 changes: 2 additions & 2 deletions docs/features/extensibility/plugin/tools/index.mdx
@@ -5,12 +5,12 @@ title: "Tools"

# What are Tools?

⚙️ Tools are the various ways you can extend an LLM's capabilities beyond simple text generation. When enabled, they allow your chatbot to do amazing things — like search the web, scrape data, generate images, talk back using AI voices, and more.

:::danger ⚠️ Critical Security Warning
**Workspace Tools and Functions execute arbitrary Python code on your server.** Only install from trusted sources, review code before importing, and restrict Workspace access to trusted administrators only. Granting a user the ability to create or import Tools is equivalent to giving them shell access to the server. For full details, see the [Plugin Security Warning](/features/extensibility/plugin/).
:::

Because there are several ways to integrate "Tools" in Open WebUI, it's important to understand which type you are using.
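
As a taste of one of those types, a Workspace Tool is (roughly) a Python class of plain typed methods whose docstrings tell the LLM when each one is worth calling. The sketch below shows only the general shape; registration details and metadata are omitted, and the class layout is an illustrative assumption:

```python
from datetime import datetime, timezone

# Sketch of a Workspace Tool: a `Tools` class of typed methods. The LLM
# decides when to call each method based on its docstring.
class Tools:
    def get_current_utc_time(self) -> str:
        """Return the current UTC time as an ISO 8601 string."""
        return datetime.now(timezone.utc).isoformat()

    def word_count(self, text: str) -> int:
        """Count the number of whitespace-separated words in `text`."""
        return len(text.split())
```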

---
15 changes: 15 additions & 0 deletions docs/getting-started/index.md
@@ -30,6 +30,21 @@ Everything you need for a working setup. Choose Docker for the fastest path, Pyt

---

## 🤖 Connect an Agent

**Go beyond simple model providers. Connect an autonomous AI agent.**

AI agents like Hermes Agent and OpenClaw bring their own tools (terminal, file ops, web search, memory) and use Open WebUI as a rich chat frontend. The agent decides when to use tools, executes them, and streams results back to you.

| Agent | Description |
| :--- | :--- |
| 🧠 **Hermes Agent** | Nous Research's agent with terminal, file ops, web search, and skills |
| 🐾 **OpenClaw** | Self-hosted agent framework with shell access, web browsing, and channel bots |

[**Connect an agent →**](/getting-started/quick-start/connect-an-agent)

---

## Sharing Open WebUI

**Bring AI to your entire organization with a single deployment.**
74 changes: 74 additions & 0 deletions docs/getting-started/quick-start/connect-a-provider/index.md
@@ -0,0 +1,74 @@
---
sidebar_position: 0
title: "Connect a Provider"
---

# 🔌 Connect a Provider

**Connect Open WebUI to any model provider and start chatting in minutes.**

Open WebUI supports multiple connection protocols, including **Ollama**, **OpenAI-compatible APIs**, and **Open Responses**. Any cloud API or local server that speaks one of these protocols works out of the box. Just add a URL and API key, and your models appear in the dropdown.

---

## How It Works

```
┌──────────────┐ ┌──────────────────┐ ┌──────────────┐
│ │ HTTP │ │ Inference│ │
│ Open WebUI │────────▶│ Provider API │────────▶ │ Model │
│ (frontend) │◀────────│ (cloud/local) │◀──────── │ (LLM/VLM) │
│ │ Stream │ │ Tokens │ │
└──────────────┘ └──────────────────┘ └──────────────┘
```

1. **You type a message** in Open WebUI
2. Open WebUI sends it to your provider's API endpoint
3. The provider runs inference on the selected model
4. Tokens **stream back** to Open WebUI in real time
5. You see the response in the chat interface

:::tip
Adding a provider is as simple as entering a URL and API key in **Admin Settings → Connections**. Open WebUI auto-detects available models from most providers.
:::
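
Under the hood, step 2 of the flow above is an OpenAI-compatible HTTP call. The sketch below builds such a request; the base URL, model name, and API key are placeholders, and a real client would POST the body and read the streamed tokens back (step 4):

```python
import json

def build_chat_request(base_url: str, model: str, user_message: str) -> dict:
    """Assemble an OpenAI-compatible /chat/completions request (not sent here)."""
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "headers": {
            "Content-Type": "application/json",
            "Authorization": "Bearer YOUR_API_KEY",  # placeholder key
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
            "stream": True,  # tokens stream back to Open WebUI in real time
        }),
    }
```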

---

## Cloud Providers

Hosted APIs that require an account and API key, with no hardware needed. (Ollama is the exception in this table: it runs models locally on your own machine.)

| Provider | Models | Guide |
|----------|--------|-------|
| **Ollama** | Llama, Mistral, Gemma, Phi, and thousands more (local) | [Starting with Ollama →](./starting-with-ollama) |
| **OpenAI** | GPT-4o, GPT-4.1, o3, o4-mini | [Starting with OpenAI →](./starting-with-openai) |
| **Anthropic** | Claude Opus, Sonnet, Haiku | [Starting with Anthropic →](./starting-with-anthropic) |
| **OpenAI-Compatible** | Google Gemini, DeepSeek, Mistral, Groq, OpenRouter, Amazon Bedrock, Azure, and more | [OpenAI-Compatible Providers →](./starting-with-openai-compatible) |

---

## Local Servers

Run models on your own hardware. No API keys, no cloud dependency.

| Server | Description | Guide |
|--------|-------------|-------|
| **llama.cpp** | Efficient GGUF model inference with OpenAI-compatible API | [Starting with llama.cpp →](./starting-with-llama-cpp) |
| **vLLM** | High-throughput inference engine for production workloads | [Starting with vLLM →](./starting-with-vllm) |

More local servers (LM Studio, LocalAI, Docker Model Runner, Lemonade) are covered in the [OpenAI-Compatible Providers](./starting-with-openai-compatible#local-servers) guide.

---

## Other Connection Methods

| Feature | Description | Guide |
|---------|-------------|-------|
| **Open Responses** | Connect providers using the Open Responses specification | [Starting with Open Responses →](./starting-with-open-responses) |
| **Functions** | Extend Open WebUI with custom pipe functions for any backend | [Starting with Functions →](./starting-with-functions) |

---

## Looking for Agents?

If you want to connect an autonomous AI agent (with terminal access, file operations, web search, and more) instead of a plain model provider, see [**Connect an Agent**](/getting-started/quick-start/connect-an-agent).