This repository was archived by the owner on Mar 6, 2026. It is now read-only.

zaherg/ai-provider-for-ollama


AI Provider for Ollama

Caution

This plugin was built using AI, so make sure to read our DISCLAIMER file before using it. Also, this is an unofficial plugin and is not affiliated with the WP AI Core team.

An AI Provider for Ollama for the PHP AI Client SDK. Works as both a Composer package and a WordPress plugin.

This package is based on the WordPress package WordPress/ai-provider-for-openai and adapts that provider implementation for Ollama's OpenAI-compatible /v1 API (for example /v1/models and /v1/chat/completions).

Requirements

  • PHP 7.4 or higher
  • A reachable OpenAI-compatible /v1 server endpoint for your deployment
  • Exact server URL/path (default: http://localhost:11434/v1)
  • When used as a WordPress plugin, WordPress 7.0 or higher
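
Before wiring up the SDK, you can confirm the server endpoint is reachable by querying the OpenAI-compatible models endpoint directly. A quick sketch, assuming the default local endpoint; adjust the URL for your deployment:

```shell
# List the models exposed by the OpenAI-compatible API.
curl http://localhost:11434/v1/models

# If your server requires bearer auth, pass the token:
# curl -H "Authorization: Bearer your-api-key" http://localhost:11434/v1/models
```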

Installation

As a Composer Package

First, add the GitHub repository to your composer.json:

{
    "repositories": [
        {
            "type": "vcs",
            "url": "https://github.com/zaherg/ai-provider-for-ollama"
        }
    ]
}

Then install the package:

composer require zaherg/ai-provider-for-ollama:^0.1.3

As a WordPress Plugin

  1. Upload the plugin files to /wp-content/plugins/ai-provider-for-ollama/
  2. Ensure the PHP AI Client plugin is installed and activated
  3. Activate the plugin through the WordPress admin
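
As an alternative to the admin screen, the plugin can be activated with WP-CLI. This sketch assumes the plugin slug matches the directory name above:

```shell
# Activate the plugin from the command line (slug assumed from the directory name).
wp plugin activate ai-provider-for-ollama
```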

Usage

With WordPress

The provider automatically registers itself with the PHP AI Client on the init hook.

// Required: set the exact OpenAI-compatible /v1 base URL for your server.
define('OLLAMA_BASE_URL', 'http://localhost:11434/v1');
// Optional: if your Ollama server requires bearer auth.
define('OLLAMA_API_KEY', 'your-api-key');

$result = AiClient::prompt('Hello, world!')
    ->usingProvider('ollama')
    ->generateTextResult();

As a Standalone Package

use WordPress\AiClient\AiClient;
use Zaherg\OllamaAiProvider\Provider\OllamaProvider;

$registry = AiClient::defaultRegistry();
$registry->registerProvider(OllamaProvider::class);

define('OLLAMA_BASE_URL', 'http://localhost:11434/v1');
// Optional: if your Ollama server requires bearer auth.
define('OLLAMA_API_KEY', 'your-api-key');

$result = AiClient::prompt('Explain quantum computing')
    ->usingProvider('ollama')
    ->generateTextResult();

echo $result->toText();

Configuration

  • OLLAMA_BASE_URL (required): Exact server URL/path using the OpenAI-compatible /v1 base URL (default: http://localhost:11434/v1). Set via a constant in wp-config.php (define('OLLAMA_BASE_URL', '...');).
  • OLLAMA_API_KEY (optional): Bearer token for secured/proxied Ollama servers. Set via define('OLLAMA_API_KEY', '...'); if your server requires auth.

Supported Features (Current)

  • Text generation / chat completion
  • Function calling (tool declarations)
  • JSON output (response_format / schema)
  • Model discovery via /v1/models

Image generation and other non-text capabilities are not implemented in this package yet.
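
Under the hood, these features map onto Ollama's OpenAI-compatible endpoints. As an illustration (not the provider's internal code), a chat completion request against a local server looks like the following; the model name is a placeholder for whatever /v1/models reports on your deployment:

```shell
# Send a minimal chat completion request to the OpenAI-compatible endpoint.
# "llama3.2" is an assumed model name; substitute one listed by /v1/models.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3.2",
        "messages": [
          {"role": "user", "content": "Hello, world!"}
        ]
      }'
```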

License

GPL-2.0-or-later
