feat: add FuturMix AI Gateway as chat model provider #6286

Open

FuturMix wants to merge 2 commits into FlowiseAI:main from FuturMix:feat/add-futurmix-provider

Conversation

@FuturMix

Summary

Add FuturMix.ai as a new chat model provider, following the same pattern as ChatOpenRouter.

Files added:

  • packages/components/credentials/FuturMixApi.credential.ts — API key credential (see the sketch after this list)
  • packages/components/nodes/chatmodels/ChatFuturMix/ChatFuturMix.ts — Chat model component
  • packages/components/nodes/chatmodels/ChatFuturMix/FlowiseChatFuturMix.ts — Multi-modal wrapper
  • packages/components/nodes/chatmodels/ChatFuturMix/futurmix.svg — Provider icon
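For orientation, Flowise credential files share a small common shape. A minimal sketch of what FuturMixApi.credential.ts likely contains, using the ChatOpenRouter credential as a template — the futurmixApiKey input name is taken from the diff below, while the label strings and credential id are assumptions:

    import { INodeParams, INodeCredential } from '../src/Interface'

    class FuturMixApi implements INodeCredential {
        label: string
        name: string
        version: number
        inputs: INodeParams[]

        constructor() {
            this.label = 'FuturMix API'     // display name (assumed)
            this.name = 'futurMixApi'       // credential id (assumed)
            this.version = 1.0
            this.inputs = [
                {
                    label: 'FuturMix API Key',
                    name: 'futurmixApiKey', // matches the field read in ChatFuturMix.ts
                    type: 'password'
                }
            ]
        }
    }

    module.exports = { credClass: FuturMixApi }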

What is FuturMix?

FuturMix.ai is a unified AI gateway that provides access to 22+ models (Claude, GPT, Gemini) through a single OpenAI-compatible API with a 99.99% SLA. Base URL: https://futurmix.ai/v1
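Because the gateway is OpenAI-compatible, any OpenAI SDK client can target it by overriding the base URL. A hedged sketch using the openai npm package — the model id and environment variable are illustrative assumptions, not part of this PR:

    import OpenAI from 'openai'

    const client = new OpenAI({
        apiKey: process.env.FUTURMIX_API_KEY, // assumed env var
        baseURL: 'https://futurmix.ai/v1'     // base URL from the summary above
    })

    const completion = await client.chat.completions.create({
        model: 'claude-sonnet-4',             // any model id the gateway exposes (assumed)
        messages: [{ role: 'user', content: 'Hello!' }]
    })
    console.log(completion.choices[0].message.content)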

Implementation:

  • Extends LangChain ChatOpenAI with a custom baseURL (same pattern as ChatOpenRouter; see the sketch after this list)
  • Implements IVisionChatModal for multi-modal image upload support
  • Supports streaming, temperature, max tokens, top-p, frequency/presence penalty
  • Default base path: https://futurmix.ai/v1 (configurable)
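A rough sketch of the ChatOpenRouter-style wiring described above: LangChain's ChatOpenAI is pointed at the gateway via configuration.baseURL, and the node's inputs map onto the constructor fields. Values shown are illustrative assumptions:

    import { ChatOpenAI } from '@langchain/openai'

    const model = new ChatOpenAI({
        modelName: 'gpt-4o',                  // chosen in the node UI (assumed value)
        apiKey: process.env.FUTURMIX_API_KEY, // from the FuturMix API credential
        temperature: 0.7,
        streaming: true,                      // the node defaults streaming to true
        configuration: {
            baseURL: 'https://futurmix.ai/v1' // the configurable default base path
        }
    })

    const response = await model.invoke('Summarize this PR in one sentence.')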


@gemini-code-assist (bot) left a comment


Code Review

This pull request introduces the FuturMix AI Gateway integration, adding a new credential type and a chat model node. The implementation leverages a wrapper around LangChain's OpenAI class to provide access to various models through the FuturMix API. Feedback was provided regarding the handling of the temperature parameter to prevent potential NaN values when the input is undefined or empty.

Comment on lines +150 to +156
    const obj: ChatOpenAIFields = {
        temperature: parseFloat(temperature),
        modelName,
        openAIApiKey: futurmixApiKey,
        apiKey: futurmixApiKey,
        streaming: streaming ?? true
    }


Severity: medium

The temperature parameter is currently assigned directly using parseFloat(temperature). If temperature is undefined or an empty string, this will result in NaN, which can cause issues in the underlying model request. It should be handled conditionally, consistent with how maxTokens, topP, and other optional parameters are processed later in the function.

Suggested change

Before:

    const obj: ChatOpenAIFields = {
        temperature: parseFloat(temperature),
        modelName,
        openAIApiKey: futurmixApiKey,
        apiKey: futurmixApiKey,
        streaming: streaming ?? true
    }

After:

    const obj: ChatOpenAIFields = {
        modelName,
        openAIApiKey: futurmixApiKey,
        apiKey: futurmixApiKey,
        streaming: streaming ?? true
    }
    if (temperature) obj.temperature = parseFloat(temperature)

Move temperature out of the ChatOpenAIFields initializer and apply it
conditionally, matching the pattern used by maxTokens, topP,
frequencyPenalty and presencePenalty. This prevents parseFloat from
returning NaN when the temperature input is undefined or empty.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
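For reference, the surrounding pattern the reviewer points to would look roughly like this — a sketch in which the variable names follow the diff, but the exact body of the node's init function is an assumption:

    // Optional numeric inputs are applied only when the user supplied a value,
    // so parseFloat/parseInt never receive undefined or '' and produce NaN.
    if (temperature) obj.temperature = parseFloat(temperature)
    if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10)
    if (topP) obj.topP = parseFloat(topP)
    if (frequencyPenalty) obj.frequencyPenalty = parseFloat(frequencyPenalty)
    if (presencePenalty) obj.presencePenalty = parseFloat(presencePenalty)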
@FuturMix force-pushed the feat/add-futurmix-provider branch from 8ce4ca1 to 132d978 on April 25, 2026 at 12:31