3 changes: 3 additions & 0 deletions apps/web/client/.env.example
@@ -65,6 +65,9 @@ GITHUB_APP_PRIVATE_KEY="<Your app's private key hash>"

# ------------- Optional: Alternative LLM providers -------------

# MiniMax
MINIMAX_API_KEY="<Your api key from https://platform.minimaxi.com/user-center/basic-information/interface-key>"

# Anthropic
ANTHROPIC_API_KEY="<Your api key from https://console.anthropic.com/settings/keys>"

2 changes: 2 additions & 0 deletions apps/web/client/src/env.ts
@@ -37,6 +37,7 @@ export const env = createEnv({
ANTHROPIC_API_KEY: z.string().optional(),
GOOGLE_AI_STUDIO_API_KEY: z.string().optional(),
OPENAI_API_KEY: z.string().optional(),
MINIMAX_API_KEY: z.string().optional(),

// n8n
N8N_WEBHOOK_URL: z.string().optional(),
@@ -130,6 +131,7 @@ export const env = createEnv({
GOOGLE_AI_STUDIO_API_KEY: process.env.GOOGLE_AI_STUDIO_API_KEY,
OPENAI_API_KEY: process.env.OPENAI_API_KEY,
OPENROUTER_API_KEY: process.env.OPENROUTER_API_KEY,
MINIMAX_API_KEY: process.env.MINIMAX_API_KEY,

// n8n
N8N_WEBHOOK_URL: process.env.N8N_WEBHOOK_URL,
6 changes: 5 additions & 1 deletion docs/content/docs/self-hosting/external-services.mdx
@@ -21,8 +21,12 @@ For assistance with self-hosting these services, please contact us at [founders@

### 3. AI Providers
To configure custom AI providers:
1. Update the providers in [`packages/ai/src/chat/providers.ts`](https://github.com/onlook-dev/onlook/blob/main/packages/ai/src/chat/providers.ts). We already support Anthropic and OpenRouter as examples. You can follow the same format to add a new provider.
1. Update the providers in [`packages/ai/src/chat/providers.ts`](https://github.com/onlook-dev/onlook/blob/main/packages/ai/src/chat/providers.ts). We already support OpenRouter, MiniMax, and Anthropic as examples. You can follow the same format to add a new provider.
2. Update the usages by searching for [`initModel`](https://github.com/search?q=repo%3Aonlook-dev%2Fonlook+%22await+initModel%22&type=code)
3. Add the provider's expected API keys to the `apps/web/client/.env` file.

Built-in providers:
- **OpenRouter** (default) — access to multiple models via a single API key
- **[MiniMax](https://platform.minimaxi.com)** — MiniMax-M2.5 models with 204K context window

Note: We support any provider from the [AI SDK providers](https://ai-sdk.dev/providers/ai-sdk-providers). You can add a custom provider by following these AI SDK provider guides: [OpenAI compatible](https://ai-sdk.dev/providers/openai-compatible-providers/custom-providers) and [Community](https://ai-sdk.dev/providers/community-providers/custom-providers).
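The "same format" mentioned above comes down to three pieces: a provider enum case, a factory guarded by an env-var check, and an exhaustive switch. A minimal self-contained sketch of that pattern (types are simplified to strings here; the real `initModel` returns an AI SDK `LanguageModel`, and the helper names mirror the repo's code rather than importing it):

```typescript
// Sketch of the provider-selection pattern from providers.ts.
// LanguageModel is reduced to a string id for illustration.
type LanguageModel = string;

enum LLMProvider {
    OPENROUTER = 'openrouter',
    MINIMAX = 'minimax',
}

// Exhaustiveness guard: if a new enum case is added without a matching
// switch branch, this call no longer typechecks.
function assertNever(x: never): never {
    throw new Error(`Unhandled provider: ${x}`);
}

function getOpenRouterModel(model: string): LanguageModel {
    return `openrouter:${model}`;
}

function getMinimaxModel(model: string): LanguageModel {
    // Fail fast when the key is missing, as the real provider does.
    if (!process.env.MINIMAX_API_KEY) {
        throw new Error('MINIMAX_API_KEY must be set');
    }
    return `minimax:${model}`;
}

function initModel(provider: LLMProvider, model: string): LanguageModel {
    switch (provider) {
        case LLMProvider.OPENROUTER:
            return getOpenRouterModel(model);
        case LLMProvider.MINIMAX:
            return getMinimaxModel(model);
        default:
            return assertNever(provider);
    }
}

process.env.MINIMAX_API_KEY = 'test-key';
console.log(initModel(LLMProvider.MINIMAX, 'MiniMax-M2.5'));
```

Adding a new provider then means one enum case, one factory, and one switch branch; `assertNever` turns a forgotten branch into a compile error.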
1 change: 1 addition & 0 deletions packages/ai/package.json
@@ -33,6 +33,7 @@
"typescript": "^5.5.4"
},
"dependencies": {
"@ai-sdk/openai-compatible": "^1.0.34",

⚠️ Potential issue | 🔴 Critical

🏁 Script executed:

#!/bin/bash
# Check available versions of @ai-sdk/openai-compatible
curl -s https://registry.npmjs.org/@ai-sdk/openai-compatible | jq '.versions | keys | .[-5:]'

# Check peer dependencies
curl -s https://registry.npmjs.org/@ai-sdk/openai-compatible/1.0.34 | jq '.peerDependencies'

Repository: onlook-dev/onlook


Update @ai-sdk/openai-compatible to a valid version.

Version 1.0.34 does not exist in the npm registry. The latest stable versions are 2.0.8 and 2.0.9. Update the dependency to one of these versions or the latest available version that meets the project requirements.
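Applied to the dependency block in this diff, the suggested fix would read as follows (a sketch using the `^2.0.9` version named in the finding above; pick whichever released version satisfies the project's `ai` peer range):

```json
"dependencies": {
    "@ai-sdk/openai-compatible": "^2.0.9",
    "@mendable/firecrawl-js": "^1.29.1",
    "@openrouter/ai-sdk-provider": "^1.2.0",
    "ai": "5.0.60"
}
```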

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `packages/ai/package.json` at line 36, the dependency "@ai-sdk/openai-compatible" uses a nonexistent version ("^1.0.34"). Replace the version string with a valid released version such as "^2.0.9" (or "^2.0.8" if preferred) so the dependency resolves correctly during installs.

"@mendable/firecrawl-js": "^1.29.1",
"@openrouter/ai-sdk-provider": "^1.2.0",
"ai": "5.0.60",
17 changes: 17 additions & 0 deletions packages/ai/src/chat/providers.ts
@@ -1,11 +1,13 @@
import {
LLMProvider,
MINIMAX_MODELS,
MODEL_MAX_TOKENS,
OPENROUTER_MODELS,
type InitialModelPayload,
type ModelConfig
} from '@onlook/models';
import { assertNever } from '@onlook/utility';
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import type { LanguageModel } from 'ai';

@@ -33,6 +35,9 @@ export function initModel({
? { ...providerOptions, anthropic: { cacheControl: { type: 'ephemeral' } } }
: providerOptions;
break;
case LLMProvider.MINIMAX:
model = getMinimaxProvider(requestedModel);
break;
default:
assertNever(requestedProvider);
}
@@ -52,3 +57,15 @@ function getOpenRouterProvider(model: OPENROUTER_MODELS): LanguageModel {
const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });
return openrouter(model);
}

function getMinimaxProvider(model: MINIMAX_MODELS): LanguageModel {
if (!process.env.MINIMAX_API_KEY) {
throw new Error('MINIMAX_API_KEY must be set');
}
const minimax = createOpenAICompatible({
name: 'minimax',
baseURL: 'https://api.minimax.io/v1',
apiKey: process.env.MINIMAX_API_KEY,
});
return minimax(model);
}
9 changes: 9 additions & 0 deletions packages/models/src/llm/index.ts
@@ -2,6 +2,7 @@ import type { LanguageModel } from 'ai';

export enum LLMProvider {
OPENROUTER = 'openrouter',
MINIMAX = 'minimax',
}

export enum OPENROUTER_MODELS {
@@ -13,8 +14,14 @@ export enum OPENROUTER_MODELS {
OPEN_AI_GPT_5_NANO = 'openai/gpt-5-nano',
}

export enum MINIMAX_MODELS {
MINIMAX_M2_5 = 'MiniMax-M2.5',
MINIMAX_M2_5_HIGHSPEED = 'MiniMax-M2.5-highspeed',
}

interface ModelMapping {
[LLMProvider.OPENROUTER]: OPENROUTER_MODELS;
[LLMProvider.MINIMAX]: MINIMAX_MODELS;
}

export type InitialModelPayload = {
@@ -37,4 +44,6 @@ export const MODEL_MAX_TOKENS = {
[OPENROUTER_MODELS.OPEN_AI_GPT_5_NANO]: 400000,
[OPENROUTER_MODELS.OPEN_AI_GPT_5_MINI]: 400000,
[OPENROUTER_MODELS.OPEN_AI_GPT_5]: 400000,
[MINIMAX_MODELS.MINIMAX_M2_5]: 204000,
[MINIMAX_MODELS.MINIMAX_M2_5_HIGHSPEED]: 204000,
} as const;
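One use of the `MODEL_MAX_TOKENS` map extended above is clamping a requested completion budget to what the model supports. A minimal self-contained sketch (the `clampMaxTokens` helper is illustrative, not part of the repo):

```typescript
enum MINIMAX_MODELS {
    MINIMAX_M2_5 = 'MiniMax-M2.5',
    MINIMAX_M2_5_HIGHSPEED = 'MiniMax-M2.5-highspeed',
}

// Subset of the MODEL_MAX_TOKENS map from the diff above.
const MODEL_MAX_TOKENS: Record<string, number> = {
    [MINIMAX_MODELS.MINIMAX_M2_5]: 204000,
    [MINIMAX_MODELS.MINIMAX_M2_5_HIGHSPEED]: 204000,
};

// Illustrative helper: never request more tokens than the model allows,
// and fail loudly on a model the map does not know about.
function clampMaxTokens(model: string, requested: number): number {
    const limit = MODEL_MAX_TOKENS[model];
    if (limit === undefined) throw new Error(`Unknown model: ${model}`);
    return Math.min(requested, limit);
}

console.log(clampMaxTokens(MINIMAX_MODELS.MINIMAX_M2_5, 500000)); // → 204000
```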