Closed
Commits (39)
ca9dcbe
Add LLM prompt for API endpoint consistency (fixes #944)
Feb 9, 2026
801afb0
Add tests for API endpoint consistency implementation
Feb 9, 2026
952cae7
Add editable LLM prompts and per-family preferred model (fixes #938)
Feb 9, 2026
d33c85c
Add sidekiq-throttled for test boot; fix Chat and AI prompts controll…
Feb 9, 2026
8d6f38f
Per-Family AI endpoint: uri_base + model, fix Registry visibility and…
Feb 9, 2026
e4c2201
Require admin for AI prompts (Codex P1): ensure_admin on show/update
Feb 9, 2026
de9a740
Merge branch 'main' into feature/editable-llm-prompts-938
jjmata Feb 10, 2026
fac59c9
Merge branch 'main' into feature/editable-llm-prompts-938
jjmata Feb 11, 2026
9066b1e
Merge we-promise/main: resolve family.rb, schema.rb, verify_api_endpo…
Feb 14, 2026
4b2395f
Merge origin/feature/editable-llm-prompts-938: resolve db/schema.rb (…
Feb 14, 2026
fdbf0da
Add editable system prompt UI and show OpenAI prompts toggle
Feb 15, 2026
db7a015
add db/schema.rb
Feb 18, 2026
b6750d7
Merge remote-tracking branch 'origin/main' into feature/editable-llm-…
Feb 23, 2026
06a374f
Use safe navigation for chat.user.family in provider lookup
Feb 23, 2026
141ca7b
Merge we-promise/main: resolve assistant.rb and schema.rb conflicts
Feb 23, 2026
70430d5
Move AI prompts and OpenAI endpoint config off Family to BuiltinAssis…
Feb 24, 2026
278327e
Fix property edit modal: wrap in turbo_frame_tag so Edit loads in modal
Feb 24, 2026
fddec83
Merge origin/main: resolve db/schema.rb version conflict (use main sc…
MkDev11 Mar 11, 2026
16787e5
Add Langfuse prompt support for assistant system instructions
MkDev11 Mar 11, 2026
c375fb0
Fix RuboCop Layout in configurable: case/end alignment
MkDev11 Mar 11, 2026
709fd63
Prompts in Langfuse only: no schema or UI for local prompts (per @jjm…
MkDev11 Mar 13, 2026
c33edd1
Merge upstream/main: resolve db/schema.rb conflict (use upstream sche…
MkDev11 Mar 16, 2026
be3f4ae
Merge upstream/main into feature/editable-llm-prompts-938
MkDev11 Mar 20, 2026
2fd8d0d
fix(db): add snaptrade_accounts.account_id for schema load
MkDev11 Mar 20, 2026
d2cc458
fix(db): valid migration timestamp for snaptrade account_id
MkDev11 Mar 20, 2026
0744eea
fix(db): reset schema.rb to upstream/main, add only intentional changes
MkDev11 Mar 25, 2026
e1a9447
fix: revert unintended Gemfile, CSS, and view changes
MkDev11 Mar 25, 2026
3bebc98
fix: fall back to default prompt when Langfuse is absent
MkDev11 Mar 25, 2026
08468ee
Merge branch 'main' into feature/editable-llm-prompts-938
jjmata Mar 26, 2026
7567265
fix: clean up review issues in AI prompts PR
MkDev11 Mar 26, 2026
a145ccc
fix: remove out-of-scope snaptrade migration
MkDev11 Mar 26, 2026
23592b0
refactor: collapse builtin_assistant_configs into families table
MkDev11 Mar 26, 2026
24668ff
Merge branch 'main' into feature/editable-llm-prompts-938
jjmata Mar 31, 2026
a9a2d91
Rename instructions_prompt to prompt_metadata for Langfuse linkage
MkDev11 Apr 1, 2026
7c11588
Merge upstream/main; resolve db/schema.rb conflict
MkDev11 Apr 8, 2026
adac5ca
chore: empty commit to rerun CI
MkDev11 Apr 10, 2026
da94736
Merge upstream/main; resolve Family and schema conflicts
MkDev11 Apr 14, 2026
0fbceaf
Merge upstream/main; resolve Chat and OpenAI provider conflicts
MkDev11 Apr 16, 2026
5988aa0
test: expect get_model_provider with family in Assistant builtin tests
MkDev11 Apr 16, 2026
2 changes: 1 addition & 1 deletion app/controllers/api/v1/chats_controller.rb
@@ -24,7 +24,7 @@ def create
     @message = @chat.messages.build(
       content: chat_params[:message],
       type: "UserMessage",
-      ai_model: chat_params[:model].presence || Chat.default_model
+      ai_model: chat_params[:model].presence || Chat.default_model(Current.user.family)
     )

     if @message.save
2 changes: 1 addition & 1 deletion app/controllers/api/v1/messages_controller.rb
@@ -9,7 +9,7 @@ def create
     @message = @chat.messages.build(
       content: message_params[:content],
       type: "UserMessage",
-      ai_model: message_params[:model].presence || Chat.default_model
+      ai_model: message_params[:model].presence || Chat.default_model(@chat.user.family)
     )

     if @message.save
3 changes: 2 additions & 1 deletion app/controllers/chats_controller.rb
@@ -17,7 +17,8 @@ def new
   end

   def create
-    @chat = Current.user.chats.start!(chat_params[:content], model: chat_params[:ai_model])
+    default_model = Chat.default_model(Current.user.family)
+    @chat = Current.user.chats.start!(chat_params[:content], model: chat_params[:ai_model].presence || default_model)
     set_last_viewed_chat(@chat)
     redirect_to chat_path(@chat, thinking: true)
   end
2 changes: 1 addition & 1 deletion app/controllers/messages_controller.rb
@@ -7,7 +7,7 @@ def create
     @message = UserMessage.create!(
       chat: @chat,
       content: message_params[:content],
-      ai_model: message_params[:ai_model].presence || Chat.default_model
+      ai_model: message_params[:ai_model].presence || Chat.default_model(@chat.user.family)
     )

     redirect_to chat_path(@chat, thinking: true)
39 changes: 34 additions & 5 deletions app/controllers/settings/ai_prompts_controller.rb
@@ -1,12 +1,41 @@
 class Settings::AiPromptsController < ApplicationController
   layout "settings"
+  before_action :ensure_admin, only: [ :show, :update ]
+  before_action :set_family

   def show
-    @breadcrumbs = [
-      [ "Home", root_path ],
-      [ "AI Prompts", nil ]
-    ]
-    @family = Current.family
+    @breadcrumbs = [ [ "Home", root_path ], [ "AI Prompts", nil ] ]
+    @assistant_config = Assistant.config_for(OpenStruct.new(user: Current.user))
+    @effective_model = Chat.default_model(@family)
+    @show_openai_prompts = show_openai_prompts?
   end

+  def update
+    if @family.update(family_ai_params)
+      redirect_to settings_ai_prompts_path, notice: t(".success")
+    else
+      @assistant_config = Assistant.config_for(OpenStruct.new(user: Current.user))
+      @effective_model = Chat.default_model(@family)
+      @show_openai_prompts = show_openai_prompts?
+      render :show, status: :unprocessable_entity
+    end
+  end
+
+  private
+
+    def ensure_admin
+      redirect_to root_path, alert: t("settings.ai_prompts.not_authorized") unless Current.user&.admin?
+    end
+
+    def set_family
+      @family = Current.family
+    end
+
+    def family_ai_params
+      params.require(:family).permit(:preferred_ai_model, :openai_uri_base)
+    end
+
+    def show_openai_prompts?
+      @effective_model.blank? || @effective_model.match?(/\A(gpt-|o1-|gpt4)/i)
+    end
 end
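The `show_openai_prompts?` toggle in the controller above is a plain regex heuristic. A standalone sketch (plain Ruby, substituting `to_s.strip.empty?` for Rails' `blank?`):

```ruby
# Show OpenAI-specific prompt settings when the effective model is unset
# or its name looks OpenAI-style (gpt-*, o1-*, gpt4*), case-insensitively.
OPENAI_MODEL_PATTERN = /\A(gpt-|o1-|gpt4)/i

def show_openai_prompts?(effective_model)
  effective_model.to_s.strip.empty? || effective_model.match?(OPENAI_MODEL_PATTERN)
end

show_openai_prompts?("gpt-4.1")     # => true
show_openai_prompts?("o1-preview")  # => true
show_openai_prompts?("llama-3-70b") # => false
show_openai_prompts?(nil)           # => true  (blank defaults to showing)
```

Defaulting to `true` on a blank model matches the controller: with no per-family override, the global OpenAI model is in effect.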
5 changes: 2 additions & 3 deletions app/helpers/application_helper.rb
@@ -125,9 +125,8 @@ def assistant_icon
   end

   def default_ai_model
-    # Always return a valid model, never nil or empty
-    # Delegates to Chat.default_model for consistency
-    Chat.default_model
+    family = Current.user&.family
+    Chat.default_model(family)
   end

   # Renders Markdown text using Redcarpet
15 changes: 11 additions & 4 deletions app/models/assistant/builtin.rb
@@ -2,18 +2,24 @@ class Assistant::Builtin < Assistant::Base
   include Assistant::Provided
   include Assistant::Configurable

-  attr_reader :instructions
+  attr_reader :instructions, :prompt_metadata

   class << self
     def for_chat(chat)
       config = config_for(chat)
-      new(chat, instructions: config[:instructions], functions: config[:functions])
+      new(
+        chat,
+        instructions: config[:instructions],
+        prompt_metadata: config[:prompt_metadata],
+        functions: config[:functions]
+      )
     end
   end

-  def initialize(chat, instructions: nil, functions: [])
+  def initialize(chat, instructions: nil, prompt_metadata: nil, functions: [])
     super(chat)
     @instructions = instructions
+    @prompt_metadata = prompt_metadata
     @functions = functions
   end
@@ -24,14 +30,15 @@ def respond_to(message)
       ai_model: message.ai_model
     )

-    llm_provider = get_model_provider(message.ai_model)
+    llm_provider = get_model_provider(message.ai_model, family: chat.user&.family)
     unless llm_provider
       raise StandardError, build_no_provider_error_message(message.ai_model)
     end

     responder = Assistant::Responder.new(
       message: message,
       instructions: instructions,
+      prompt_metadata: prompt_metadata,
       function_tool_caller: function_tool_caller,
       llm: llm_provider
     )
95 changes: 84 additions & 11 deletions app/models/assistant/configurable.rb
@@ -3,24 +3,40 @@ module Assistant::Configurable

   class_methods do
     def config_for(chat)
-      preferred_currency = Money::Currency.new(chat.user.family.currency)
-      preferred_date_format = chat.user.family.date_format
+      family = chat.user.family
+      preferred_currency = Money::Currency.new(family.currency)
+      preferred_date_format = family.date_format

       if chat.user.ui_layout_intro?
-        {
-          instructions: intro_instructions(preferred_currency, preferred_date_format),
-          functions: []
-        }
+        instructions_config = intro_instructions_config(preferred_currency, preferred_date_format)
+        { instructions: instructions_config[:content], prompt_metadata: instructions_config[:prompt], functions: [] }
       else
-        {
-          instructions: default_instructions(preferred_currency, preferred_date_format),
-          functions: default_functions
-        }
+        instructions_config = default_instructions(preferred_currency, preferred_date_format)
+        { instructions: instructions_config[:content], prompt_metadata: instructions_config[:prompt], functions: default_functions }
       end
     end

     private
-      def intro_instructions(preferred_currency, preferred_date_format)
+      def intro_instructions_config(preferred_currency, preferred_date_format)
+        langfuse_intro = langfuse_intro_instructions(preferred_currency, preferred_date_format)
+        if langfuse_intro.present?
+          { content: langfuse_intro[:content], prompt: langfuse_intro }
+        else
+          content = fallback_intro_instructions(preferred_currency, preferred_date_format)
+          { content: content, prompt: { name: "intro_instructions", version: 0, template: nil, content: content } }
+        end
+      end
+
+      def langfuse_intro_instructions(preferred_currency, preferred_date_format)
+        fetch_langfuse_prompt("intro_instructions",
+          preferred_currency_symbol: preferred_currency.symbol,
+          preferred_currency_iso_code: preferred_currency.iso_code,
+          preferred_date_format: preferred_date_format,
+          current_date: Date.current
+        )
+      end
+
+      def fallback_intro_instructions(preferred_currency, preferred_date_format)
         <<~PROMPT
           ## Your identity
@@ -56,6 +72,30 @@ def default_functions
       end

       def default_instructions(preferred_currency, preferred_date_format)
+        langfuse_instructions = langfuse_default_instructions(preferred_currency, preferred_date_format)
+
+        if langfuse_instructions.present?
+          { content: langfuse_instructions[:content], prompt: langfuse_instructions }
+        else
+          content = fallback_default_instructions(preferred_currency, preferred_date_format)
+          { content: content, prompt: { name: "default_instructions", version: 0, template: nil, content: content } }
+        end
+      end
+
+      def langfuse_default_instructions(preferred_currency, preferred_date_format)
+        fetch_langfuse_prompt("default_instructions",
+          preferred_currency_symbol: preferred_currency.symbol,
+          preferred_currency_iso_code: preferred_currency.iso_code,
+          preferred_currency_default_precision: preferred_currency.default_precision,
+          preferred_currency_default_format: preferred_currency.default_format,
+          preferred_currency_separator: preferred_currency.separator,
+          preferred_currency_delimiter: preferred_currency.delimiter,
+          preferred_date_format: preferred_date_format,
+          current_date: Date.current
+        )
+      end
+
+      def fallback_default_instructions(preferred_currency, preferred_date_format)
         <<~PROMPT
           ## Your identity
@@ -111,5 +151,38 @@ def default_instructions(preferred_currency, preferred_date_format)
           the data you're presenting represents and what context it is in (i.e. date range, account, etc.)
         PROMPT
       end
+
+      def fetch_langfuse_prompt(name, compile_params)
+        return unless langfuse_client
+
+        prompt = langfuse_client.get_prompt(name)
+        compiled_prompt = prompt.compile(**compile_params)
+
+        content = case compiled_prompt
+                  when String
+                    compiled_prompt
+                  when Array
+                    compiled_prompt.filter_map { |message| message[:content] }.join("\n\n")
+                  else
+                    nil
+                  end
+
+        return if content.blank?
+
+        template = prompt.respond_to?(:prompt) ? prompt.prompt : (prompt.respond_to?(:template) ? prompt.template : nil)
+        { name: prompt.name, version: prompt.version, template: template, content: content }
+      rescue => e
+        Rails.logger.warn("Langfuse prompt retrieval failed (#{name}): #{e.message}")
+        nil
+      end
+
+      def langfuse_client
+        return unless ENV["LANGFUSE_PUBLIC_KEY"].present? && ENV["LANGFUSE_SECRET_KEY"].present?
+
+        @langfuse_client ||= Langfuse.new
+      rescue => e
+        Rails.logger.warn("Langfuse client initialization failed: #{e.message}")
+        nil
+      end
   end
 end
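The new `config_for` path always returns both the prompt text and its metadata, whether the prompt came from Langfuse or from the local fallback. A standalone sketch of that fetch-with-fallback shape (plain Ruby, no Rails; `fetch_remote_prompt` is a stand-in for the real Langfuse client call and is stubbed to fail here):

```ruby
# Stand-in for the Langfuse client call; simulates an outage so the
# fallback branch is exercised.
def fetch_remote_prompt(_name)
  raise IOError, "Langfuse unreachable"
end

def instructions_config(name, fallback_content)
  remote = begin
    fetch_remote_prompt(name) # expected to return { name:, version:, content: } or nil
  rescue StandardError
    nil # any retrieval failure degrades to the local prompt
  end

  if remote
    { content: remote[:content], prompt: remote }
  else
    # version 0 marks a local fallback, matching the diff's convention
    { content: fallback_content,
      prompt: { name: name, version: 0, template: nil, content: fallback_content } }
  end
end

cfg = instructions_config("default_instructions", "## Your identity ...")
cfg[:prompt][:version] # => 0 (fallback was used)
```

The point of the shape is that callers like `Assistant::Builtin` never see the failure: they always receive a `content`/`prompt` pair.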
7 changes: 6 additions & 1 deletion app/models/assistant/provided.rb
@@ -1,7 +1,12 @@
 module Assistant::Provided
   extend ActiveSupport::Concern

-  def get_model_provider(ai_model)
+  def get_model_provider(ai_model, family: nil)
+    family_provider = family.present? ? Provider::Registry.openai_for_family(family) : nil
+    if family_provider&.supports_model?(ai_model)
+      return family_provider
+    end
+
     registry.providers.find { |provider| provider.supports_model?(ai_model) }
   end
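The lookup order above — family-scoped provider first, then the global registry — can be sketched in isolation (`FakeProvider` and the model names are illustrative stand-ins; only `supports_model?` matters):

```ruby
# Minimal stand-in for a Provider object.
FakeProvider = Struct.new(:name, :models) do
  def supports_model?(model)
    models.include?(model)
  end
end

def get_model_provider(ai_model, family_provider: nil, registry_providers: [])
  # Family-scoped provider wins when it supports the requested model...
  return family_provider if family_provider&.supports_model?(ai_model)

  # ...otherwise fall back to the first matching registry provider.
  registry_providers.find { |provider| provider.supports_model?(ai_model) }
end

family_openai = FakeProvider.new(:family_openai, ["custom-model"])
global_openai = FakeProvider.new(:openai, ["gpt-4.1"])

get_model_provider("custom-model",
                   family_provider: family_openai,
                   registry_providers: [global_openai]).name # => :family_openai
get_model_provider("gpt-4.1",
                   family_provider: family_openai,
                   registry_providers: [global_openai]).name # => :openai
```

Note the guard: a family provider that does not support the requested model is skipped rather than returned, so a per-family endpoint never shadows models it cannot serve.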
6 changes: 4 additions & 2 deletions app/models/assistant/responder.rb
@@ -1,9 +1,10 @@
 class Assistant::Responder
-  def initialize(message:, instructions:, function_tool_caller:, llm:)
+  def initialize(message:, instructions:, function_tool_caller:, llm:, prompt_metadata: nil)
     @message = message
     @instructions = instructions
     @function_tool_caller = function_tool_caller
     @llm = llm
+    @prompt_metadata = prompt_metadata
   end

   def on(event_name, &block)
@@ -44,7 +45,7 @@ def respond(previous_response_id: nil)
     end

     private
-      attr_reader :message, :instructions, :function_tool_caller, :llm
+      attr_reader :message, :instructions, :function_tool_caller, :llm, :prompt_metadata

       def handle_follow_up_response(response)
         streamer = proc do |chunk|
@@ -77,6 +78,7 @@ def get_llm_response(streamer:, function_results: [], previous_response_id: nil)
           message.content,
           model: message.ai_model,
           instructions: instructions,
+          prompt_metadata: prompt_metadata,
           functions: function_tool_caller.function_definitions,
           function_results: function_results,
           messages: conversation_history,
11 changes: 7 additions & 4 deletions app/models/chat.rb
@@ -25,10 +25,13 @@ def generate_title(prompt)
       prompt.first(80)
     end

-    # Returns the default AI model to use for chats
-    # Priority: AI Config > Setting
-    def default_model
-      Provider::Openai.effective_model.presence || Setting.openai_model
+    # Returns the default AI model to use for chats.
+    # Priority: family preferred > Provider::Openai.effective_model (ENV > Setting > default).
+    # @param family [Family, nil] optional family for per-family preferred model
+    # @return [String]
+    def default_model(family = nil)
+      family&.preferred_ai_model.presence ||
+        Provider::Openai.effective_model
     end
   end
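The resolution order in `Chat.default_model` can be sketched without Rails (`FamilyStub` and the hard-coded `"gpt-4.1"` are illustrative stand-ins; the real method uses ActiveSupport's `presence` and `Provider::Openai.effective_model`):

```ruby
FamilyStub = Struct.new(:preferred_ai_model)

GLOBAL_MODEL = "gpt-4.1" # stand-in for Provider::Openai.effective_model

def default_model(family = nil)
  # A family's preferred model wins; nil or blank falls through to the
  # global model, mirroring `.presence ||`.
  preferred = family&.preferred_ai_model
  preferred.nil? || preferred.empty? ? GLOBAL_MODEL : preferred
end

default_model(nil)                            # => "gpt-4.1"
default_model(FamilyStub.new("llama-3-70b"))  # => "llama-3-70b"
default_model(FamilyStub.new(""))             # => "gpt-4.1" (blank falls through)
```

Because blank strings fall through, an admin can clear the per-family field to return to the global default without breaking chat creation.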
18 changes: 17 additions & 1 deletion app/models/family.rb
@@ -45,15 +45,25 @@ class Family < ApplicationRecord
   has_many :llm_usages, dependent: :destroy
   has_many :recurring_transactions, dependent: :destroy

+  PREFERRED_AI_MODEL_MAX_LENGTH = 128
+  OPENAI_URI_BASE_MAX_LENGTH = 512
+
   validates :locale, inclusion: { in: I18n.available_locales.map(&:to_s) }
   validates :date_format, inclusion: { in: DATE_FORMATS.map(&:last) }
   validates :month_start_day, inclusion: { in: 1..28 }
   validates :moniker, inclusion: { in: MONIKERS }
   validates :assistant_type, inclusion: { in: ASSISTANT_TYPES }
   validates :default_account_sharing, inclusion: { in: SHARING_DEFAULTS }
+  validates :preferred_ai_model, length: { maximum: PREFERRED_AI_MODEL_MAX_LENGTH }, allow_blank: true
+  validates :openai_uri_base, length: { maximum: OPENAI_URI_BASE_MAX_LENGTH }, allow_blank: true
+  validate :preferred_ai_model_required_when_custom_endpoint

   before_validation :normalize_enabled_currencies!

+  def custom_openai_endpoint?
+    openai_uri_base.present?
+  end
+
   def primary_currency_code
     normalize_currency_code(currency) || "USD"
   end
@@ -80,7 +90,6 @@ def secondary_enabled_currency_objects(extra: [])
     enabled_currency_objects(extra:).reject { |currency| currency.iso_code == primary_currency_code }
   end

-
   def moniker_label
     moniker.presence || "Family"
   end
@@ -329,6 +338,13 @@ def self_hoster?
   end

   private
+
+    def preferred_ai_model_required_when_custom_endpoint
+      return unless openai_uri_base.present? && preferred_ai_model.blank?
+
+      errors.add(:preferred_ai_model, :blank)
+    end
+
     def normalize_enabled_currencies!
       if enabled_currencies.blank?
         self.enabled_currencies = nil
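The cross-field rule added above — a custom OpenAI endpoint requires an explicit model, since an arbitrary endpoint's model list is unknowable — can be sketched as plain Ruby (the helper name and return shape are illustrative; the real code is an ActiveModel `validate` callback using `errors.add`):

```ruby
# Returns the list of validation errors the rule would add, as
# [attribute, error_type] pairs, mirroring errors.add(:preferred_ai_model, :blank).
def endpoint_config_errors(openai_uri_base, preferred_ai_model)
  errors = []
  if !openai_uri_base.to_s.empty? && preferred_ai_model.to_s.empty?
    errors << [:preferred_ai_model, :blank]
  end
  errors
end

endpoint_config_errors(nil, nil)                             # => []
endpoint_config_errors("https://llm.internal/v1", nil)       # => [[:preferred_ai_model, :blank]]
endpoint_config_errors("https://llm.internal/v1", "llama-3") # => []
```

(The endpoint URL shown is a made-up example, not a value from the PR.)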
1 change: 1 addition & 0 deletions app/models/provider/llm_concept.rb
@@ -38,6 +38,7 @@ def chat_response(
     prompt,
     model:,
     instructions: nil,
+    prompt_metadata: nil,
     functions: [],
     function_results: [],
     messages: nil,