
feat: add per-context AI model configuration (chat vs automated tasks)#1513

Open
andronedev wants to merge 3 commits into we-promise:main from andronedev:feat/per-context-ai-model

Conversation


@andronedev andronedev commented Apr 19, 2026

This pull request introduces support for overriding the default OpenAI model used for different types of tasks. Specifically, it allows administrators to set separate models for interactive chat and for automated background tasks (such as auto-categorization and merchant detection). These overrides can be configured via environment variables or through the settings UI, and the changes propagate through the backend logic to ensure the correct model is used for each context.

Screenshots: two attached images of the updated settings UI (not reproduced here).

The most important changes are:

Model Override Support

  • Added new fields (openai_chat_model, openai_background_model) to the Setting model, stored via rails-settings-cached key/value rows (no new database columns).
  • Updated the settings controller and permitted parameters to handle the new model fields, and exposed them in the settings UI.

Backend Logic and Usage

  • Introduced Provider::Openai.effective_chat_model and Provider::Openai.effective_background_model methods to select the appropriate model for chat and background tasks, falling back to the global default if not set.
  • Updated all relevant background processing and chat code to use the new model selection logic, ensuring each task type uses the correct model as configured.

Settings Management

  • When the chat or background model is changed, the system now clears the AI cache for all families to ensure consistency.
  • New fields validate against the existing OpenAI config (e.g. model is required when a custom uri_base is set).

User Interface and Localization

  • Updated the settings UI to allow configuring the new model overrides, with clarified labels ("Default Model", "Chat Model", "Automated Tasks Model") and help texts in English and French.

These changes make it possible to fine-tune which OpenAI model is used for different types of AI-powered features, improving flexibility and control for administrators.
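The fallback chain described above (environment variable override, then stored setting, then global default) can be sketched in plain Ruby. `ModelResolver`, its default model name, and the hash-based `env`/`settings` arguments are illustrative stand-ins, not the PR's actual `Provider::Openai` code:

```ruby
# Illustrative sketch (not the PR's actual Provider::Openai implementation):
# resolve a per-context model with fallback ENV -> saved setting -> default.
module ModelResolver
  DEFAULT_MODEL = "gpt-4.1" # stand-in for the global effective_model

  # Returns the first non-blank candidate among env var, setting, default.
  def self.resolve(env_key, setting_key, env:, settings:)
    [env[env_key], settings[setting_key], DEFAULT_MODEL]
      .find { |v| !v.to_s.strip.empty? }
  end

  def self.effective_chat_model(env: {}, settings: {})
    resolve("OPENAI_CHAT_MODEL", :openai_chat_model, env: env, settings: settings)
  end

  def self.effective_background_model(env: {}, settings: {})
    resolve("OPENAI_BACKGROUND_MODEL", :openai_background_model, env: env, settings: settings)
  end
end
```

Treating a blank value as absent at each step means an exported-but-empty environment variable cannot shadow a saved setting.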

Summary by CodeRabbit

  • New Features

    • Two new OpenAI model settings: separate defaults for interactive chat and background tasks; configurable in Settings or via env vars, with fallback to the global default.
    • Settings form updated with localized labels/placeholders for both fields.
    • Background tasks (auto-categorization, merchant detection, provider enhancements) now respect the background-model setting.
    • Changing a model triggers AI cache clearing.
  • New Integrations

    • Database additions to support TradeRepublic accounts/items.

@brin-security-scanner brin-security-scanner Bot added contributor:verified Contributor passed trust analysis. pr:verified PR passed security analysis. labels Apr 19, 2026

coderabbitai Bot commented Apr 19, 2026

📝 Walkthrough

Walkthrough

Adds separate OpenAI chat/background model settings (UI, controller validation, Setting fields), provider helpers for effective chat/background model selection, updates consumers to use the new background/chat model helpers, clears AI caches when settings change, and adds two traderepublic DB tables plus a securities constraint tweak.

Changes

Cohort / File(s) Summary
Controller & View
app/controllers/settings/hostings_controller.rb, app/views/settings/hostings/_openai_settings.html.erb
Permit, validate (using effective URI base), and persist openai_chat_model and openai_background_model; add corresponding form inputs (disabled when ENV overrides exist).
Settings model & Side Effects
app/models/setting.rb
Add openai_chat_model and openai_background_model fields, raw accessor aliases, and setters that log changes and enqueue ClearAiCacheJob for all Family records when a prior value existed and the value changed.
Provider helpers
app/models/provider/openai.rb
Add effective_chat_model and effective_background_model methods (ENV → Setting → effective_model fallback).
Model consumers
app/models/chat.rb, app/models/family/auto_categorizer.rb, app/models/family/auto_merchant_detector.rb, app/models/provider_merchant/enhancer.rb
Use Provider::Openai.effective_chat_model for interactive Chat; pass Provider::Openai.effective_background_model to background LLM calls (auto-categorizer, merchant detector, merchant enhancer).
Localization
config/locales/views/settings/hostings/en.yml, config/locales/views/settings/hostings/fr.yml
Rename base label to “Default Model” and add labels/placeholders/help texts for chat_model_* and background_model_*.
Database schema
db/schema.rb
Add traderepublic_items and traderepublic_accounts tables (UUIDs, indexes, FKs); adjust securities.kind check constraint typing.

Sequence Diagram(s)

sequenceDiagram
  participant Admin as Admin (UI)
  participant Controller as Settings::HostingsController
  participant Setting as Setting (model)
  participant Provider as Provider::Openai
  participant Logger as Rails.logger
  participant Job as ClearAiCacheJob
  participant Family as Family records

  Admin->>Controller: Submit OpenAI settings (openai_chat_model, openai_background_model, openai_uri_base?)
  Controller->>Provider: derive effective_uri_base (param or Setting.openai_uri_base)
  Controller->>Provider: validate_openai_config!(model: openai_chat_model, uri_base) (if present)
  Controller->>Provider: validate_openai_config!(model: openai_background_model, uri_base) (if present)
  Controller->>Setting: Setting.openai_chat_model = value (if present)
  alt value changed and previous present
    Setting->>Logger: log change
    Setting->>Job: enqueue ClearAiCacheJob for each Family
    Job->>Family: clear AI cache (async)
  end
  Controller->>Setting: Setting.openai_background_model = value (if present)
  alt value changed and previous present
    Setting->>Logger: log change
    Setting->>Job: enqueue ClearAiCacheJob for each Family
  end
  Controller->>Admin: render success

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

Suggested labels

codex

Suggested reviewers

  • jjmata
  • sokie

Poem

🐰 Two models split to serve each task,
Chat for talk and background for the mask.
When names are changed I clear the stash,
Jobs hop off and caches crash—whoosh! dash!
The rabbit cheers: new fields, new ask.

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 6.67%, below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them.

✅ Passed checks (2 passed)

  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title accurately describes the main purpose of the PR: adding per-context AI model configuration, with separate models for chat and automated tasks.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6314fdaec3

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment thread app/models/setting.rb
Comment thread app/models/setting.rb
Comment thread db/schema.rb

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
app/controllers/settings/hostings_controller.rb (1)

149-171: ⚠️ Potential issue | 🟠 Major

Validate the new model overrides before saving them.

openai_chat_model and openai_background_model bypass the existing OpenAI config validation, so an unsupported override can be persisted and break only at runtime for chat/background jobs.

Proposed validation wiring
     if hosting_params.key?(:openai_uri_base) || hosting_params.key?(:openai_model)
       Setting.validate_openai_config!(
         uri_base: hosting_params[:openai_uri_base],
         model: hosting_params[:openai_model]
       )
     end
+
+    if hosting_params.key?(:openai_uri_base) || hosting_params.key?(:openai_chat_model)
+      chat_model = hosting_params.key?(:openai_chat_model) ? hosting_params[:openai_chat_model].presence : Setting.openai_chat_model
+
+      if chat_model.present?
+        Setting.validate_openai_config!(
+          uri_base: hosting_params[:openai_uri_base],
+          model: chat_model
+        )
+      end
+    end
+
+    if hosting_params.key?(:openai_uri_base) || hosting_params.key?(:openai_background_model)
+      background_model = hosting_params.key?(:openai_background_model) ? hosting_params[:openai_background_model].presence : Setting.openai_background_model
+
+      if background_model.present?
+        Setting.validate_openai_config!(
+          uri_base: hosting_params[:openai_uri_base],
+          model: background_model
+        )
+      end
+    end
 
     if hosting_params.key?(:openai_uri_base)
       Setting.openai_uri_base = hosting_params[:openai_uri_base]
     end
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/controllers/settings/hostings_controller.rb` around lines 149 - 171, The
new override fields openai_chat_model and openai_background_model are not being
validated before persistence; update the validation block that calls
Setting.validate_openai_config! (currently passing uri_base and model) to also
include chat and background model values when hosting_params includes
:openai_chat_model or :openai_background_model (use
hosting_params[:openai_chat_model].presence and
hosting_params[:openai_background_model].presence so blank strings are treated
as nil), then only persist the Setting.openai_chat_model and
Setting.openai_background_model after validation succeeds.
db/schema.rb (1)

1279-1287: ⚠️ Potential issue | 🟠 Major

Remove the unused columns from the migration — rails-settings-cached stores field data in var/value rows, not columns.

Setting < RailsSettings::Base persists each field :openai_chat_model / field :openai_background_model as a row in the settings table keyed by var, with the value serialized into the value text column. The physical string columns added by the migration will never be read or written; all access goes through the row-based storage via custom setters (lines 210–222 of app/models/setting.rb) that delegate to rails-settings-cached. The openai_model field works identically without a physical column, confirming the pattern.

The two columns will remain NULL indefinitely. Either drop the migration and regenerate db/schema.rb, or if column-backed storage is intentional, implement custom readers/writers and perform a data backfill from existing var/value rows.
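The row-based storage pattern this comment describes can be illustrated with a toy class: each `field` lives as a var/value pair in one table, so adding a field needs no schema change. `KeyValueSettings` is a hypothetical stand-in, not the real rails-settings-cached API:

```ruby
# Toy version of key/value settings storage: every field is a (var, value)
# row in one table rather than a dedicated column. This mimics the
# rails-settings-cached pattern; it is not the gem's implementation.
class KeyValueSettings
  @rows = {} # simulates the settings table: var => value

  class << self
    attr_reader :rows

    def field(name, default: nil)
      define_singleton_method(name) do
        rows.key?(name.to_s) ? rows[name.to_s] : default
      end
      define_singleton_method("#{name}=") do |value|
        rows[name.to_s] = value # one row written, no column involved
      end
    end
  end

  field :openai_chat_model
  field :openai_background_model
end
```

Because reads and writes only ever touch var/value rows, any physical `openai_chat_model` column added by a migration would stay NULL forever, which is the comment's point.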

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@db/schema.rb` around lines 1279 - 1287, The schema adds physical columns
openai_chat_model and openai_background_model to the settings table but Setting
< RailsSettings::Base persists fields into var/value rows via
rails-settings-cached (see Setting class and its custom setters), so remove
those unused columns from the migration/schema: delete the openai_chat_model and
openai_background_model column definitions (or create a new migration to drop
them) so storage remains consistent with rails-settings-cached; if column-backed
storage is actually desired instead, implement matching readers/writers on the
Setting model and perform a one-time backfill from existing var/value rows into
those columns.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@app/models/setting.rb`:
- Around line 210-222: The setter openai_background_model= currently just
assigns raw_openai_background_model but does not clear AI caches like
openai_chat_model= does; either add the same invalidation logic (log the change
and enqueue ClearAiCacheJob for each Family, or better: enqueue a single fan-out
job) so Transaction.clear_ai_cache / Entry.clear_ai_cache results are
invalidated when the background model changes, or add a short comment explaining
why background-model changes should not trigger cache clearing; locate the
method openai_background_model= and mirror the behavior from openai_chat_model=
(or replace Family.find_each { ClearAiCacheJob.perform_later(family) } with a
single scalable fan-out job).

---

Outside diff comments:
In `@app/controllers/settings/hostings_controller.rb`:
- Around line 149-171: The new override fields openai_chat_model and
openai_background_model are not being validated before persistence; update the
validation block that calls Setting.validate_openai_config! (currently passing
uri_base and model) to also include chat and background model values when
hosting_params includes :openai_chat_model or :openai_background_model (use
hosting_params[:openai_chat_model].presence and
hosting_params[:openai_background_model].presence so blank strings are treated
as nil), then only persist the Setting.openai_chat_model and
Setting.openai_background_model after validation succeeds.

In `@db/schema.rb`:
- Around line 1279-1287: The schema adds physical columns openai_chat_model and
openai_background_model to the settings table but Setting < RailsSettings::Base
persists fields into var/value rows via rails-settings-cached (see Setting class
and its custom setters), so remove those unused columns from the
migration/schema: delete the openai_chat_model and openai_background_model
column definitions (or create a new migration to drop them) so storage remains
consistent with rails-settings-cached; if column-backed storage is actually
desired instead, implement matching readers/writers on the Setting model and
perform a one-time backfill from existing var/value rows into those columns.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: bb1344b1-4784-401a-adcf-70241fd9578b

📥 Commits

Reviewing files that changed from the base of the PR and between b32e9db and 6314fda.

📒 Files selected for processing (12)
  • app/controllers/settings/hostings_controller.rb
  • app/models/chat.rb
  • app/models/family/auto_categorizer.rb
  • app/models/family/auto_merchant_detector.rb
  • app/models/provider/openai.rb
  • app/models/provider_merchant/enhancer.rb
  • app/models/setting.rb
  • app/views/settings/hostings/_openai_settings.html.erb
  • config/locales/views/settings/hostings/en.yml
  • config/locales/views/settings/hostings/fr.yml
  • db/migrate/20260419103006_add_model_overrides_to_settings.rb
  • db/schema.rb


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@db/schema.rb`:
- Around line 1486-1500: Add a partial unique index on traderepublic_accounts to
prevent duplicate (item, account) rows: add an index on [:traderepublic_item_id,
:account_id] with unique: true and the WHERE clause "(account_id IS NOT NULL)"
using a descriptive name like
index_traderepublic_accounts_on_item_and_account_id (implement this via an
ActiveRecord migration using add_index with the where option and run the
migration).
- Around line 1486-1520: The schema includes unrelated Trade Republic tables
that must be removed from this PR: delete the create_table blocks for
traderepublic_accounts and traderepublic_items (and their associated foreign
key/index changes) so the migration set only contains the OpenAI model config
changes; if Trade Republic is intended in a separate PR, add a dedicated
migration and model files there and (a) ensure sensitive columns in
traderepublic_items (pin, session_token, refresh_token, session_cookies) are
stored encrypted at the DB or app layer, and (b) add a unique composite index on
traderepublic_accounts for (traderepublic_item_id, account_id) to match other
integrations and prevent duplicates.
- Around line 1502-1520: Create a new ActiveRecord model TraderepublicItem
mirroring other *_items models and add conditional encryption by declaring
encrypts :pin, :session_token, :refresh_token, :session_cookies wrapped with the
same if encryption_ready? pattern used in PlaidItem; update parameter filtering
by adding :pin, :session_token, :refresh_token (and :session_cookies if desired)
to the filter list in config/initializers/filter_parameter_logging.rb; ensure
any code that builds or persists raw_payload never populates it with the
plaintext pin (remove or strip pin before assigning raw_payload and from any
logs) and review the phone_number column to decide if it should be treated as
PII (hash or encrypt similarly) and align raw_payload nullability with other
*_items tables if that constraint is unintentional.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: bbad0e06-4f9c-416d-954d-c8d0a28a15e1

📥 Commits

Reviewing files that changed from the base of the PR and between 6314fda and 8b27b5a.

📒 Files selected for processing (3)
  • app/controllers/settings/hostings_controller.rb
  • app/models/setting.rb
  • db/schema.rb
🚧 Files skipped from review as they are similar to previous changes (2)
  • app/controllers/settings/hostings_controller.rb
  • app/models/setting.rb

Comment thread db/schema.rb
Comment thread db/schema.rb
Comment thread db/schema.rb
- Introduced `openai_chat_model` and `openai_background_model` settings.
- Updated controllers, models, and views to handle new settings.
- Added migration to include new columns in the `settings` table.
- Adjusted logic to prioritize specific models for chat and background
  tasks.
- Added validation for OpenAI chat and background models.
- Removed unused migration for OpenAI model overrides.
- Updated schema to reflect migration rollback.
- Clear AI cache when OpenAI background model changes.
@andronedev andronedev force-pushed the feat/per-context-ai-model branch from 4db9714 to e6b56df on April 19, 2026 at 11:20
@andronedev
Author

The traderepublic_* tables in db/schema.rb were an accidental artifact of merging origin/main into my branch. Fixed with a rebase on upstream/main; the schema is now clean.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@app/models/provider/openai.rb`:
- Around line 17-26: The current methods effective_chat_model and
effective_background_model use ENV.fetch which treats an empty ENV value as
present and thus bypasses the stored Setting; change them to check
ENV["OPENAI_CHAT_MODEL"] / ENV["OPENAI_BACKGROUND_MODEL"] with .presence first,
then fall back to Setting.openai_chat_model / Setting.openai_background_model
.presence, and finally to effective_model so blank env vars do not override
saved settings.

In `@app/models/setting.rb`:
- Around line 210-228: The setters openai_chat_model= and
openai_background_model= currently guard cache invalidation with
old_value.present?, which prevents clearing AI cache when an explicit model is
set for the first time; remove that present? check so the invalidation runs
whenever the value actually changes. Update both methods (openai_chat_model= and
openai_background_model=) to use if old_value != value then log and call
Family.find_each { |family| ClearAiCacheJob.perform_later(family) } (i.e., drop
the old_value.present? condition).

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: fb258677-799e-4535-9cac-e4c29dbda0d9

📥 Commits

Reviewing files that changed from the base of the PR and between 4db9714 and e6b56df.

📒 Files selected for processing (11)
  • app/controllers/settings/hostings_controller.rb
  • app/models/chat.rb
  • app/models/family/auto_categorizer.rb
  • app/models/family/auto_merchant_detector.rb
  • app/models/provider/openai.rb
  • app/models/provider_merchant/enhancer.rb
  • app/models/setting.rb
  • app/views/settings/hostings/_openai_settings.html.erb
  • config/locales/views/settings/hostings/en.yml
  • config/locales/views/settings/hostings/fr.yml
  • db/schema.rb
✅ Files skipped from review due to trivial changes (1)
  • config/locales/views/settings/hostings/fr.yml
🚧 Files skipped from review as they are similar to previous changes (8)
  • app/models/provider_merchant/enhancer.rb
  • app/models/family/auto_categorizer.rb
  • app/models/chat.rb
  • config/locales/views/settings/hostings/en.yml
  • app/models/family/auto_merchant_detector.rb
  • app/views/settings/hostings/_openai_settings.html.erb
  • app/controllers/settings/hostings_controller.rb
  • db/schema.rb

Comment on lines +17 to +26
# Returns the model to use for interactive chat. Falls back to effective_model.
def self.effective_chat_model
  ENV.fetch("OPENAI_CHAT_MODEL") { Setting.openai_chat_model }.presence || effective_model
end

# Returns the model to use for background tasks (auto-categorization, merchant detection).
# Falls back to effective_model.
def self.effective_background_model
  ENV.fetch("OPENAI_BACKGROUND_MODEL") { Setting.openai_background_model }.presence || effective_model
end


⚠️ Potential issue | 🟡 Minor

Don’t let blank ENV variables bypass saved model settings.

With ENV.fetch, an empty OPENAI_CHAT_MODEL / OPENAI_BACKGROUND_MODEL prevents reading the corresponding Setting, so a saved UI override is silently ignored.

Proposed fix
 def self.effective_chat_model
-  ENV.fetch("OPENAI_CHAT_MODEL") { Setting.openai_chat_model }.presence || effective_model
+  ENV["OPENAI_CHAT_MODEL"].presence || Setting.openai_chat_model.presence || effective_model
 end
 
 # Returns the model to use for background tasks (auto-categorization, merchant detection).
 # Falls back to effective_model.
 def self.effective_background_model
-  ENV.fetch("OPENAI_BACKGROUND_MODEL") { Setting.openai_background_model }.presence || effective_model
+  ENV["OPENAI_BACKGROUND_MODEL"].presence || Setting.openai_background_model.presence || effective_model
 end
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/models/provider/openai.rb` around lines 17 - 26, The current methods
effective_chat_model and effective_background_model use ENV.fetch which treats
an empty ENV value as present and thus bypasses the stored Setting; change them
to check ENV["OPENAI_CHAT_MODEL"] / ENV["OPENAI_BACKGROUND_MODEL"] with
.presence first, then fall back to Setting.openai_chat_model /
Setting.openai_background_model .presence, and finally to effective_model so
blank env vars do not override saved settings.

Comment thread app/models/setting.rb
Comment on lines +210 to +228
def openai_chat_model=(value)
  old_value = raw_openai_chat_model
  self.raw_openai_chat_model = value

  if old_value != value && old_value.present?
    Rails.logger.info("OpenAI chat model changed from #{old_value} to #{value}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end

def openai_background_model=(value)
  old_value = raw_openai_background_model
  self.raw_openai_background_model = value

  if old_value != value && old_value.present?
    Rails.logger.info("OpenAI background model changed from #{old_value} to #{value}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end


⚠️ Potential issue | 🟡 Minor

Clear AI cache when a context override is set for the first time.

old_value.present? skips invalidation when changing from the inherited/global model to a new explicit chat/background model. That leaves existing AI-enriched fields locked even though future background enrichment will use a different model.

Proposed fix
 def openai_chat_model=(value)
-  old_value = raw_openai_chat_model
+  old_value = raw_openai_chat_model.presence
+  new_value = value.presence
   self.raw_openai_chat_model = value
 
-  if old_value != value && old_value.present?
-    Rails.logger.info("OpenAI chat model changed from #{old_value} to #{value}, clearing AI cache for all families")
+  if old_value != new_value
+    Rails.logger.info("OpenAI chat model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
     Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
   end
 end
 
 def openai_background_model=(value)
-  old_value = raw_openai_background_model
+  old_value = raw_openai_background_model.presence
+  new_value = value.presence
   self.raw_openai_background_model = value
 
-  if old_value != value && old_value.present?
-    Rails.logger.info("OpenAI background model changed from #{old_value} to #{value}, clearing AI cache for all families")
+  if old_value != new_value
+    Rails.logger.info("OpenAI background model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
     Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
   end
 end
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
Original:

def openai_chat_model=(value)
  old_value = raw_openai_chat_model
  self.raw_openai_chat_model = value

  if old_value != value && old_value.present?
    Rails.logger.info("OpenAI chat model changed from #{old_value} to #{value}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end

def openai_background_model=(value)
  old_value = raw_openai_background_model
  self.raw_openai_background_model = value

  if old_value != value && old_value.present?
    Rails.logger.info("OpenAI background model changed from #{old_value} to #{value}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end

Suggested:

def openai_chat_model=(value)
  old_value = raw_openai_chat_model.presence
  new_value = value.presence
  self.raw_openai_chat_model = value

  if old_value != new_value
    Rails.logger.info("OpenAI chat model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end

def openai_background_model=(value)
  old_value = raw_openai_background_model.presence
  new_value = value.presence
  self.raw_openai_background_model = value

  if old_value != new_value
    Rails.logger.info("OpenAI background model changed from #{old_value || "(default)"} to #{new_value || "(default)"}, clearing AI cache for all families")
    Family.find_each { |family| ClearAiCacheJob.perform_later(family) }
  end
end
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@app/models/setting.rb` around lines 210 - 228, The setters openai_chat_model=
and openai_background_model= currently guard cache invalidation with
old_value.present?, which prevents clearing AI cache when an explicit model is
set for the first time; remove that present? check so the invalidation runs
whenever the value actually changes. Update both methods (openai_chat_model= and
openai_background_model=) to use if old_value != value then log and call
Family.find_each { |family| ClearAiCacheJob.perform_later(family) } (i.e., drop
the old_value.present? condition).
