feat: add per-context AI model configuration (chat vs automated tasks) #1513
andronedev wants to merge 3 commits into we-promise:main
Conversation
📝 Walkthrough

Adds separate OpenAI chat/background model settings (UI, controller validation, Setting fields), provider helpers for effective chat/background model selection, updates consumers to use the new background/chat model helpers, clears AI caches when settings change, and adds two traderepublic DB tables plus a securities constraint tweak.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Admin as Admin (UI)
    participant Controller as Settings::HostingsController
    participant Setting as Setting (model)
    participant Provider as Provider::Openai
    participant Logger as Rails.logger
    participant Job as ClearAiCacheJob
    participant Family as Family records

    Admin->>Controller: Submit OpenAI settings (openai_chat_model, openai_background_model, openai_uri_base?)
    Controller->>Provider: derive effective_uri_base (param or Setting.openai_uri_base)
    Controller->>Provider: validate_openai_config!(model: openai_chat_model, uri_base) (if present)
    Controller->>Provider: validate_openai_config!(model: openai_background_model, uri_base) (if present)
    Controller->>Setting: Setting.openai_chat_model = value (if present)
    alt value changed and previous present
        Setting->>Logger: log change
        Setting->>Job: enqueue ClearAiCacheJob for each Family
        Job->>Family: clear AI cache (async)
    end
    Controller->>Setting: Setting.openai_background_model = value (if present)
    alt value changed and previous present
        Setting->>Logger: log change
        Setting->>Job: enqueue ClearAiCacheJob for each Family
    end
    Controller->>Admin: render success
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 6314fdaec3
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
app/controllers/settings/hostings_controller.rb (1)
149-171: ⚠️ Potential issue | 🟠 Major — Validate the new model overrides before saving them.

`openai_chat_model` and `openai_background_model` bypass the existing OpenAI config validation, so an unsupported override can be persisted and break only at runtime for chat/background jobs.

Proposed validation wiring:

```diff
 if hosting_params.key?(:openai_uri_base) || hosting_params.key?(:openai_model)
   Setting.validate_openai_config!(
     uri_base: hosting_params[:openai_uri_base],
     model: hosting_params[:openai_model]
   )
 end
+
+if hosting_params.key?(:openai_uri_base) || hosting_params.key?(:openai_chat_model)
+  chat_model = hosting_params.key?(:openai_chat_model) ? hosting_params[:openai_chat_model].presence : Setting.openai_chat_model
+
+  if chat_model.present?
+    Setting.validate_openai_config!(
+      uri_base: hosting_params[:openai_uri_base],
+      model: chat_model
+    )
+  end
+end
+
+if hosting_params.key?(:openai_uri_base) || hosting_params.key?(:openai_background_model)
+  background_model = hosting_params.key?(:openai_background_model) ? hosting_params[:openai_background_model].presence : Setting.openai_background_model
+
+  if background_model.present?
+    Setting.validate_openai_config!(
+      uri_base: hosting_params[:openai_uri_base],
+      model: background_model
+    )
+  end
+end

 if hosting_params.key?(:openai_uri_base)
   Setting.openai_uri_base = hosting_params[:openai_uri_base]
 end
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@app/controllers/settings/hostings_controller.rb` around lines 149-171: The new override fields openai_chat_model and openai_background_model are not being validated before persistence; update the validation block that calls Setting.validate_openai_config! (currently passing uri_base and model) to also include chat and background model values when hosting_params includes :openai_chat_model or :openai_background_model (use hosting_params[:openai_chat_model].presence and hosting_params[:openai_background_model].presence so blank strings are treated as nil), then only persist Setting.openai_chat_model and Setting.openai_background_model after validation succeeds.

db/schema.rb (1)
1279-1287: ⚠️ Potential issue | 🟠 Major — Remove the unused columns from the migration; `rails-settings-cached` stores field data in `var`/`value` rows, not columns.

`Setting < RailsSettings::Base` persists each `field :openai_chat_model` / `field :openai_background_model` as a row in the `settings` table keyed by `var`, with the value serialized into the `value` text column. The physical string columns added by the migration will never be read or written; all access goes through the row-based storage via custom setters (lines 210–222 of `app/models/setting.rb`) that delegate to `rails-settings-cached`. The `openai_model` field works identically without a physical column, confirming the pattern.

The two columns will remain NULL indefinitely. Either drop the migration and regenerate `db/schema.rb`, or, if column-backed storage is intentional, implement custom readers/writers and perform a data backfill from existing `var`/`value` rows.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@db/schema.rb` around lines 1279 - 1287, The schema adds physical columns openai_chat_model and openai_background_model to the settings table but Setting < RailsSettings::Base persists fields into var/value rows via rails-settings-cached (see Setting class and its custom setters), so remove those unused columns from the migration/schema: delete the openai_chat_model and openai_background_model column definitions (or create a new migration to drop them) so storage remains consistent with rails-settings-cached; if column-backed storage is actually desired instead, implement matching readers/writers on the Setting model and perform a one-time backfill from existing var/value rows into those columns.
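The row-based storage model described above can be illustrated with a toy, plain-Ruby simulation. `RowStore` and its API are invented for illustration (the real gem also caches and type-casts values); the point is that each `field` becomes a `var`/`value` row, never a dedicated column:

```ruby
# Toy stand-in for rails-settings-cached storage: every `field` lives as one
# var/value row in the settings table, not as a dedicated column.
class RowStore
  def initialize
    @rows = {} # var (String) => serialized value, mimicking the `value` text column
  end

  def write(var, value)
    @rows[var] = Marshal.dump(value)
  end

  def read(var)
    @rows.key?(var) ? Marshal.load(@rows[var]) : nil
  end

  def vars
    @rows.keys
  end
end

store = RowStore.new
store.write("openai_chat_model", "gpt-4.1-mini")
store.write("openai_background_model", "gpt-4o-mini")

# No openai_chat_model *column* exists anywhere; both settings are just rows:
p store.vars                      # => ["openai_chat_model", "openai_background_model"]
p store.read("openai_chat_model") # => "gpt-4.1-mini"
```

This is why the migration's physical columns would stay NULL forever: nothing in the gem's read/write path ever touches them.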
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@app/models/setting.rb`:
- Around line 210-222: The setter openai_background_model= currently just
assigns raw_openai_background_model but does not clear AI caches like
openai_chat_model= does; either add the same invalidation logic (log the change
and enqueue ClearAiCacheJob for each Family, or better: enqueue a single fan-out
job) so Transaction.clear_ai_cache / Entry.clear_ai_cache results are
invalidated when the background model changes, or add a short comment explaining
why background-model changes should not trigger cache clearing; locate the
method openai_background_model= and mirror the behavior from openai_chat_model=
(or replace Family.find_each { ClearAiCacheJob.perform_later(family) } with a
single scalable fan-out job).
---
Outside diff comments:
In `@app/controllers/settings/hostings_controller.rb`:
- Around line 149-171: The new override fields openai_chat_model and
openai_background_model are not being validated before persistence; update the
validation block that calls Setting.validate_openai_config! (currently passing
uri_base and model) to also include chat and background model values when
hosting_params includes :openai_chat_model or :openai_background_model (use
hosting_params[:openai_chat_model].presence and
hosting_params[:openai_background_model].presence so blank strings are treated
as nil), then only persist the Setting.openai_chat_model and
Setting.openai_background_model after validation succeeds.
In `@db/schema.rb`:
- Around line 1279-1287: The schema adds physical columns openai_chat_model and
openai_background_model to the settings table but Setting < RailsSettings::Base
persists fields into var/value rows via rails-settings-cached (see Setting class
and its custom setters), so remove those unused columns from the
migration/schema: delete the openai_chat_model and openai_background_model
column definitions (or create a new migration to drop them) so storage remains
consistent with rails-settings-cached; if column-backed storage is actually
desired instead, implement matching readers/writers on the Setting model and
perform a one-time backfill from existing var/value rows into those columns.
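The controller-side rule from the first outside-diff comment (validate an override before persisting, treating blank strings as "unset") can be sketched in plain Ruby. `validate_model!`, `SUPPORTED_MODELS`, and the param values are hypothetical stand-ins for `Setting.validate_openai_config!` and the real request params:

```ruby
# Hypothetical allow-list standing in for the provider-side model check.
SUPPORTED_MODELS = %w[gpt-4o gpt-4.1-mini]

def validate_model!(model)
  raise ArgumentError, "unsupported model: #{model}" unless SUPPORTED_MODELS.include?(model)
end

# If the param is submitted, a blank value means "clear the override" (nil);
# if it is absent, fall back to the currently saved value.
def effective_override(params, key, saved)
  params.key?(key) ? (v = params[key]; v.nil? || v.empty? ? nil : v) : saved
end

params = { openai_chat_model: "gpt-4.1-mini", openai_background_model: "" }

chat = effective_override(params, :openai_chat_model, nil)
validate_model!(chat) if chat # validate BEFORE persisting the setting

background = effective_override(params, :openai_background_model, "gpt-4o")
validate_model!(background) if background # blank param means "clear", so skip validation

puts chat    # => gpt-4.1-mini
p background # => nil
```

An unsupported value (e.g. `validate_model!("bogus")`) raises before anything is saved, which is exactly the runtime breakage the review comment wants to prevent.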
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: bb1344b1-4784-401a-adcf-70241fd9578b
📒 Files selected for processing (12)
- app/controllers/settings/hostings_controller.rb
- app/models/chat.rb
- app/models/family/auto_categorizer.rb
- app/models/family/auto_merchant_detector.rb
- app/models/provider/openai.rb
- app/models/provider_merchant/enhancer.rb
- app/models/setting.rb
- app/views/settings/hostings/_openai_settings.html.erb
- config/locales/views/settings/hostings/en.yml
- config/locales/views/settings/hostings/fr.yml
- db/migrate/20260419103006_add_model_overrides_to_settings.rb
- db/schema.rb
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@db/schema.rb`:
- Around line 1486-1500: Add a partial unique index on traderepublic_accounts to
prevent duplicate (item, account) rows: add an index on [:traderepublic_item_id,
:account_id] with unique: true and the WHERE clause "(account_id IS NOT NULL)"
using a descriptive name like
index_traderepublic_accounts_on_item_and_account_id (implement this via an
ActiveRecord migration using add_index with the where option and run the
migration).
- Around line 1486-1520: The schema includes unrelated Trade Republic tables
that must be removed from this PR: delete the create_table blocks for
traderepublic_accounts and traderepublic_items (and their associated foreign
key/index changes) so the migration set only contains the OpenAI model config
changes; if Trade Republic is intended in a separate PR, add a dedicated
migration and model files there and (a) ensure sensitive columns in
traderepublic_items (pin, session_token, refresh_token, session_cookies) are
stored encrypted at the DB or app layer, and (b) add a unique composite index on
traderepublic_accounts for (traderepublic_item_id, account_id) to match other
integrations and prevent duplicates.
- Around line 1502-1520: Create a new ActiveRecord model TraderepublicItem
mirroring other *_items models and add conditional encryption by declaring
encrypts :pin, :session_token, :refresh_token, :session_cookies wrapped with the
same if encryption_ready? pattern used in PlaidItem; update parameter filtering
by adding :pin, :session_token, :refresh_token (and :session_cookies if desired)
to the filter list in config/initializers/filter_parameter_logging.rb; ensure
any code that builds or persists raw_payload never populates it with the
plaintext pin (remove or strip pin before assigning raw_payload and from any
logs) and review the phone_number column to decide if it should be treated as
PII (hash or encrypt similarly) and align raw_payload nullability with other
*_items tables if that constraint is unintentional.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: bbad0e06-4f9c-416d-954d-c8d0a28a15e1
📒 Files selected for processing (3)
- app/controllers/settings/hostings_controller.rb
- app/models/setting.rb
- db/schema.rb
🚧 Files skipped from review as they are similar to previous changes (2)
- app/controllers/settings/hostings_controller.rb
- app/models/setting.rb
- Introduced `openai_chat_model` and `openai_background_model` settings.
- Updated controllers, models, and views to handle new settings.
- Added migration to include new columns in the `settings` table.
- Adjusted logic to prioritize specific models for chat and background tasks.

- Added validation for OpenAI chat and background models.
- Removed unused migration for OpenAI model overrides.
- Updated schema to reflect migration rollback.
- Clear AI cache when OpenAI background model changes.
Force-pushed from 4db9714 to e6b56df
The traderepublic_* tables in db/schema.rb were an accidental artifact from merging origin/main into my branch. Fixed with a rebase on upstream/main; the schema is now clean.
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@app/models/provider/openai.rb`:
- Around line 17-26: The current methods effective_chat_model and
effective_background_model use ENV.fetch which treats an empty ENV value as
present and thus bypasses the stored Setting; change them to check
ENV["OPENAI_CHAT_MODEL"] / ENV["OPENAI_BACKGROUND_MODEL"] with .presence first,
then fall back to Setting.openai_chat_model / Setting.openai_background_model
.presence, and finally to effective_model so blank env vars do not override
saved settings.
In `@app/models/setting.rb`:
- Around line 210-228: The setters openai_chat_model= and
openai_background_model= currently guard cache invalidation with
old_value.present?, which prevents clearing AI cache when an explicit model is
set for the first time; remove that present? check so the invalidation runs
whenever the value actually changes. Update both methods (openai_chat_model= and
openai_background_model=) to use if old_value != value then log and call
Family.find_each { |family| ClearAiCacheJob.perform_later(family) } (i.e., drop
the old_value.present? condition).
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: fb258677-799e-4535-9cac-e4c29dbda0d9
📒 Files selected for processing (11)
app/controllers/settings/hostings_controller.rbapp/models/chat.rbapp/models/family/auto_categorizer.rbapp/models/family/auto_merchant_detector.rbapp/models/provider/openai.rbapp/models/provider_merchant/enhancer.rbapp/models/setting.rbapp/views/settings/hostings/_openai_settings.html.erbconfig/locales/views/settings/hostings/en.ymlconfig/locales/views/settings/hostings/fr.ymldb/schema.rb
✅ Files skipped from review due to trivial changes (1)
- config/locales/views/settings/hostings/fr.yml
🚧 Files skipped from review as they are similar to previous changes (8)
- app/models/provider_merchant/enhancer.rb
- app/models/family/auto_categorizer.rb
- app/models/chat.rb
- config/locales/views/settings/hostings/en.yml
- app/models/family/auto_merchant_detector.rb
- app/views/settings/hostings/_openai_settings.html.erb
- app/controllers/settings/hostings_controller.rb
- db/schema.rb
```ruby
# Returns the model to use for interactive chat. Falls back to effective_model.
def self.effective_chat_model
  ENV.fetch("OPENAI_CHAT_MODEL") { Setting.openai_chat_model }.presence || effective_model
end

# Returns the model to use for background tasks (auto-categorization, merchant detection).
# Falls back to effective_model.
def self.effective_background_model
  ENV.fetch("OPENAI_BACKGROUND_MODEL") { Setting.openai_background_model }.presence || effective_model
end
```
Don’t let blank ENV variables bypass saved model settings.
With `ENV.fetch`, an empty `OPENAI_CHAT_MODEL` / `OPENAI_BACKGROUND_MODEL` prevents reading the corresponding Setting, so a saved UI override is silently ignored.
Proposed fix
```diff
 def self.effective_chat_model
-  ENV.fetch("OPENAI_CHAT_MODEL") { Setting.openai_chat_model }.presence || effective_model
+  ENV["OPENAI_CHAT_MODEL"].presence || Setting.openai_chat_model.presence || effective_model
 end

 # Returns the model to use for background tasks (auto-categorization, merchant detection).
 # Falls back to effective_model.
 def self.effective_background_model
-  ENV.fetch("OPENAI_BACKGROUND_MODEL") { Setting.openai_background_model }.presence || effective_model
+  ENV["OPENAI_BACKGROUND_MODEL"].presence || Setting.openai_background_model.presence || effective_model
 end
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@app/models/provider/openai.rb` around lines 17 - 26, The current methods
effective_chat_model and effective_background_model use ENV.fetch which treats
an empty ENV value as present and thus bypasses the stored Setting; change them
to check ENV["OPENAI_CHAT_MODEL"] / ENV["OPENAI_BACKGROUND_MODEL"] with
.presence first, then fall back to Setting.openai_chat_model /
Setting.openai_background_model .presence, and finally to effective_model so
blank env vars do not override saved settings.
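The failure mode this comment describes is easy to reproduce in plain Ruby. The tiny `presence` patches below are a minimal stand-in for ActiveSupport's blank-string semantics, and the model names are made up for the example:

```ruby
# Minimal stand-in for ActiveSupport's #presence so the sketch runs outside Rails.
class String
  def presence
    strip.empty? ? nil : self
  end
end

class NilClass
  def presence
    nil
  end
end

SAVED_SETTING = "gpt-4.1-mini" # hypothetical value saved via the settings UI
DEFAULT_MODEL = "gpt-4o"       # hypothetical global default (effective_model)

ENV["OPENAI_CHAT_MODEL"] = ""  # blank env var, e.g. an empty docker-compose entry

# Buggy version: ENV.fetch finds the (blank) key, so the Setting block never runs.
buggy = ENV.fetch("OPENAI_CHAT_MODEL") { SAVED_SETTING }.presence || DEFAULT_MODEL

# Fixed version: .presence on the env value lets the saved setting win.
fixed = ENV["OPENAI_CHAT_MODEL"].presence || SAVED_SETTING.presence || DEFAULT_MODEL

puts buggy # => "gpt-4o"       (saved setting silently ignored)
puts fixed # => "gpt-4.1-mini" (saved setting respected)
```

The difference is only visible when the env var exists but is blank, which is precisely the case a defaulted-but-empty container environment produces.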
```ruby
def openai_chat_model=(value)
  old_value = raw_openai_chat_model
  self.raw_openai_chat_model = value

  if old_value != value && old_value.present?
    Rails.logger.info("OpenAI chat model changed from #{old_value} to #{value}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end

def openai_background_model=(value)
  old_value = raw_openai_background_model
  self.raw_openai_background_model = value

  if old_value != value && old_value.present?
    Rails.logger.info("OpenAI background model changed from #{old_value} to #{value}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end
```
Clear AI cache when a context override is set for the first time.
`old_value.present?` skips invalidation when changing from the inherited/global model to a new explicit chat/background model. That leaves existing AI-enriched fields locked even though future background enrichment will use a different model.
Proposed fix
```diff
 def openai_chat_model=(value)
-  old_value = raw_openai_chat_model
+  old_value = raw_openai_chat_model.presence
+  new_value = value.presence
   self.raw_openai_chat_model = value

-  if old_value != value && old_value.present?
-    Rails.logger.info("OpenAI chat model changed from #{old_value} to #{value}, clearing AI cache for all families")
+  if old_value != new_value
+    Rails.logger.info("OpenAI chat model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
     Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
   end
 end

 def openai_background_model=(value)
-  old_value = raw_openai_background_model
+  old_value = raw_openai_background_model.presence
+  new_value = value.presence
   self.raw_openai_background_model = value

-  if old_value != value && old_value.present?
-    Rails.logger.info("OpenAI background model changed from #{old_value} to #{value}, clearing AI cache for all families")
+  if old_value != new_value
+    Rails.logger.info("OpenAI background model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
     Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
   end
 end
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ruby
def openai_chat_model=(value)
  old_value = raw_openai_chat_model.presence
  new_value = value.presence
  self.raw_openai_chat_model = value

  if old_value != new_value
    Rails.logger.info("OpenAI chat model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end

def openai_background_model=(value)
  old_value = raw_openai_background_model.presence
  new_value = value.presence
  self.raw_openai_background_model = value

  if old_value != new_value
    Rails.logger.info("OpenAI background model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@app/models/setting.rb` around lines 210 - 228, The setters openai_chat_model=
and openai_background_model= currently guard cache invalidation with
old_value.present?, which prevents clearing AI cache when an explicit model is
set for the first time; remove that present? check so the invalidation runs
whenever the value actually changes. Update both methods (openai_chat_model= and
openai_background_model=) to use if old_value != value then log and call
Family.find_each { |family| ClearAiCacheJob.perform_later(family) } (i.e., drop
the old_value.present? condition).
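The suggested semantics — clear caches on any effective change, including the first explicit override — can be modelled without Rails. `ModelSetting` is an invented stand-in; `cleared_count` counts where the real code would log and enqueue `ClearAiCacheJob` per family:

```ruby
# Plain-Ruby sketch of the suggested invalidation semantics: treat blank values
# as "inherit the default" and invalidate on any effective change, including the
# first time an explicit override replaces the inherited default.
class ModelSetting
  attr_reader :cleared_count

  def initialize
    @raw = nil
    @cleared_count = 0
  end

  def chat_model=(value)
    old_value = blank?(@raw) ? nil : @raw
    new_value = blank?(value) ? nil : value
    @raw = value
    # Real code would log and enqueue ClearAiCacheJob for each Family here.
    @cleared_count += 1 if old_value != new_value
  end

  private

  def blank?(v)
    v.nil? || v.strip.empty?
  end
end

s = ModelSetting.new
s.chat_model = "gpt-4o" # first explicit override: clears (the old guard skipped this)
s.chat_model = "gpt-4o" # unchanged: no clear
s.chat_model = ""       # back to inherited default: clears again
puts s.cleared_count    # => 2
```

Comparing normalized (`presence`-style) values on both sides is what makes "default → override" and "override → default" symmetric.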
This pull request introduces support for overriding the default OpenAI model used for different types of tasks. Specifically, it allows administrators to set separate models for interactive chat and for automated background tasks (such as auto-categorization and merchant detection). These overrides can be configured via environment variables or through the settings UI, and the changes propagate through the backend logic to ensure the correct model is used for each context.
Screenshots: (two UI screenshots attached in the original PR; not reproduced here)
The most important changes are:

Model Override Support
- New settings (`openai_chat_model`, `openai_background_model`) added to the `Setting` model, stored via `rails-settings-cached` key/value rows (no new database columns).

Backend Logic and Usage
- `Provider::Openai.effective_chat_model` and `Provider::Openai.effective_background_model` methods select the appropriate model for chat and background tasks, falling back to the global default if not set.

Settings Management
- … `uri_base` is set).

User Interface and Localization

These changes make it possible to fine-tune which OpenAI model is used for different types of AI-powered features, improving flexibility and control for administrators.
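The fallback chain the changes above describe (ENV override → saved Setting → global default) can be read as one generic helper. `effective_model_for`, the hashes, and the model names here are illustrative stand-ins, not the PR's actual code:

```ruby
# Generic sketch of the per-context resolution order used by both helpers:
# ENV["OPENAI_<CONTEXT>_MODEL"] → Setting.openai_<context>_model → global default.
def effective_model_for(context, env:, settings:, default:)
  env_key = "OPENAI_#{context.to_s.upcase}_MODEL"
  set_key = :"openai_#{context}_model"
  first_present(env[env_key], settings[set_key], default)
end

# Returns the first value that is neither nil nor blank.
def first_present(*values)
  values.find { |v| v && !v.strip.empty? }
end

settings = { openai_chat_model: "gpt-4.1", openai_background_model: nil } # hypothetical saved values
env = {} # no env overrides in this scenario

puts effective_model_for(:chat, env: env, settings: settings, default: "gpt-4o")       # => gpt-4.1
puts effective_model_for(:background, env: env, settings: settings, default: "gpt-4o") # => gpt-4o
```

Keeping both contexts on one resolution path makes the precedence rules easy to test and hard to drift apart.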
Summary by CodeRabbit
New Features
New Integrations