@nativestranger nativestranger commented Nov 28, 2025

What this does

Adds two complementary features to RubyLLM:

1. prompt() method - Non-persisting chat interactions

A new method that works like ask() but doesn't persist anything to the database. Perfect for:

  • A/B testing: Generate multiple responses with different temperatures/models, let users pick one
  • Injecting prompts without polluting chat history (tool-call side effects are intentionally allowed)
  • Speculative generation: try different prompts/settings without commitment, producing variations without altering history

A/B testing example

msg = chat.messages.create!(role: :user, content: "Explain 'optionality'")
response_a = chat.with_temperature(0.3).prompt(msg)
response_b = chat.with_temperature(0.9).prompt(msg)

Then create an assistant message containing both responses and a 'choose' action. When the user picks the best answer, manually persist the winner (backfilling tool calls if desired), removing the A/B selection content and replacing it with the selected response.
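As a hedged sketch of that persistence step (the `Response` struct, its fields, and the model ids below are illustrative stand-ins, not RubyLLM's actual response class):

```ruby
# Illustrative stand-in for the ephemeral responses returned by prompt;
# RubyLLM's real response objects are richer than this Struct.
Response = Struct.new(:content, :model_id, keyword_init: true)

response_a = Response.new(content: "Optionality means keeping choices open.", model_id: "model-a")
response_b = Response.new(content: "Optionality is the value of flexibility.", model_id: "model-b")

# In the app, the winner would come from the user's 'choose' action.
winner = response_b

# Attributes we would hand to chat.messages.create! to persist only the winner.
winning_attrs = { role: :assistant, content: winner.content, model_id: winner.model_id }
# chat.messages.create!(**winning_attrs)  # the losing variant is simply discarded
```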

2. include_attachments: parameter - Performance optimization

Adds an optional parameter to Message#to_llm() that skips attachment downloads:

Window attachments: apply arbitrary logic to decide which messages should have their attachments downloaded on each chat run

class Message
  # Application-level override: only download attachments for recent messages
  def to_llm(include_attachments: false)
    super(include_attachments: include_attachments || created_at > 5.minutes.ago)
  end
end

Why this matters:

  • Long chats with many attachments incur repeated download costs on every run
  • Enables smart attachment windowing strategies
  • Defaults to true (backward compatible)
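A minimal, runnable sketch of one such windowing strategy (the `Msg` struct, the predicate name, and the five-minute window are assumptions for illustration, not part of RubyLLM's API):

```ruby
# Stand-in for persisted messages; real code would use the Message model.
Msg = Struct.new(:created_at, keyword_init: true)

# Only messages created inside the window keep their attachments.
def include_attachments_for?(message, window_seconds: 300)
  message.created_at > Time.now - window_seconds
end

recent = Msg.new(created_at: Time.now - 60)     # 1 minute old: attachments kept
stale  = Msg.new(created_at: Time.now - 3_600)  # 1 hour old: attachments skipped
```

The same predicate could be folded into a `to_llm` override, as in the example above.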

Type of change

  • New feature
  • Bug fix
  • Breaking change
  • Documentation
  • Performance improvement (attachment windowing)

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Why this belongs in RubyLLM:

  • prompt() is a fundamental LLM interaction pattern (like ask() but non-persisting)
  • Attachment windowing is a general performance concern for multimodal chats
  • Both features enable common patterns: A/B testing, attachment windowing

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name] - Requires maintainer with API credentials
    • All tests pass
  • I updated documentation if needed - Would appreciate guidance on where docs should go
  • I didn't modify auto-generated files manually (models.json, aliases.json)

Test Coverage:

  • Added tests for prompt() method
  • Added tests for include_attachments: parameter (all passing)

API changes

  • New public methods/classes
    • Chat#prompt(message, with: nil, &block) (both new and legacy APIs)
    • Message#prompt_messages (singleton method defined on the response) so the ephemeral prompt messages remain accessible
  • Changed method signatures
    • Message#to_llm(include_attachments: true) (optional parameter, defaults to true)
  • Breaking change
  • No API changes
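To make the ask/prompt contrast concrete, here is a toy model of the two methods (`ToyChat` and its canned replies are invented for illustration; the real Chat#prompt calls the provider and supports `with:` attachments and a streaming block):

```ruby
class ToyChat
  attr_reader :messages

  def initialize
    @messages = []
  end

  # ask persists both the user message and the assistant reply.
  def ask(content)
    @messages << { role: :user, content: content }
    reply = "reply to: #{content}"
    @messages << { role: :assistant, content: reply }
    reply
  end

  # prompt returns a reply without touching stored history.
  def prompt(content, with: nil)
    "reply to: #{content}"
  end
end

chat = ToyChat.new
chat.ask("hello")        # history now has 2 messages
chat.prompt("variant A") # history still has 2 messages
```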

Backward Compatibility:

  • include_attachments: defaults to true (existing behavior unchanged)
  • prompt() is additive
  • ✅ Feature parity: Both features added to new AND legacy APIs (respects use_new_acts_as config)

Related issues

Not sure.
