
feat: improved global 429 management with nginx and improved SQLite usage with mmap #363

Merged

TeKrop merged 2 commits into main from feature/429-managed-in-nginx on Feb 16, 2026

Conversation

@TeKrop TeKrop (Owner) commented Feb 16, 2026

Summary by Sourcery

Improve rate-limit handling via Nginx integration and add configurable SQLite mmap-based performance tuning.

New Features:

  • Introduce configurable SQLite memory-mapped I/O size via a new sqlite_mmap_size setting.
  • Add configurable environment variables for Blizzard rate limiting keys and retry headers used by Nginx/Valkey Lua handling.

Enhancements:

  • Clarify and document the in-app Blizzard rate limit check to complement Nginx-based enforcement.
  • Extend Nginx entrypoint templating to pass Blizzard rate limit configuration through to the Valkey Lua handler.

Build:

  • Update Nginx/Valkey Lua templates and entrypoint to support new rate-limiting configuration parameters.

@TeKrop TeKrop self-assigned this Feb 16, 2026
@TeKrop TeKrop added the enhancement (New feature or request) label Feb 16, 2026
@sourcery-ai sourcery-ai bot (Contributor) commented Feb 16, 2026

Reviewer's Guide

Implements improved global 429 rate-limit handling by wiring Nginx/Valkey-based checks into configuration and clarifying the Python client’s defensive rate-limit check. It also adds optional SQLite performance tuning via memory-mapped I/O, controlled by configuration.

Sequence diagram for improved global 429 rate limit handling

sequenceDiagram
    actor Client
    participant Nginx as Nginx_OpenResty
    participant Valkey as Valkey
    participant App as Python_App
    participant BClient as BlizzardClient
    participant Cache as CacheManager

    Client->>Nginx: HTTP request
    Nginx->>Valkey: Check global rate limit state
    alt Rate limit active in Valkey
        Nginx-->>Client: 429 Too Many Requests
    else Not rate limited (or cache miss)
        Nginx->>App: Forward request
        App->>BClient: Execute API-related operation
        BClient->>Cache: is_being_rate_limited()
        Cache-->>BClient: Rate limit state
        alt Rate limited at application level
            BClient-->>App: Raise 429 with Retry-After
            App-->>Client: 429 Too Many Requests
        else Not rate limited
            BClient-->>App: Proceed with Blizzard API call
            App-->>Client: Success response
        end
    end
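The application-level branch of this flow might look roughly like the sketch below. Only get(), aclose(), _check_rate_limit() and is_being_rate_limited() appear in the diagrams; the exception class, constructor and default retry delay are illustrative assumptions, not the project’s actual code.

```python
import httpx


class BlizzardRateLimitError(Exception):
    """Raised when the application-level check says Blizzard is rate limiting us."""

    def __init__(self, retry_after: int):
        super().__init__(f"Blizzard rate limit hit, retry after {retry_after}s")
        self.retry_after = retry_after  # surfaced to the caller as a Retry-After header


class BlizzardClient:
    def __init__(self, cache_manager, retry_after: int = 10) -> None:
        self.cache = cache_manager
        self.retry_after = retry_after  # illustrative default, not the project's value
        self.client = httpx.AsyncClient()

    def _check_rate_limit(self) -> None:
        # Nginx already answers 429 while the global flag is set in Valkey, but a
        # request can slip through on a cache miss or race; keeping this check in
        # the client is cheap defense in depth.
        if self.cache.is_being_rate_limited():
            raise BlizzardRateLimitError(retry_after=self.retry_after)

    async def get(self, url: str, **kwargs) -> httpx.Response:
        self._check_rate_limit()
        return await self.client.get(url, **kwargs)

    async def aclose(self) -> None:
        await self.client.aclose()
```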

Class diagram for Settings, SQLite storage, and Blizzard client changes

classDiagram
    class Settings {
        +str storage_path
        +int sqlite_mmap_size
    }

    class SQLiteStorage {
        +_get_connection() aiosqlite_Connection
    }

    class BlizzardClient {
        +get(url, params, headers, timeout) httpx_Response
        +aclose() None
        -_check_rate_limit() None
    }

    class CacheManager {
        +is_being_rate_limited() bool
    }

    Settings <.. SQLiteStorage : config
    SQLiteStorage o-- Settings : uses

    BlizzardClient o-- CacheManager : uses

    class aiosqlite_Connection {
        +execute(sql)
    }

    SQLiteStorage --> aiosqlite_Connection : manages

    class RateLimitConfig {
        +str BLIZZARD_RATE_LIMIT_KEY
        +int BLIZZARD_RATE_LIMIT_RETRY_AFTER
        +str RETRY_AFTER_HEADER
    }

    RateLimitConfig <.. BlizzardClient
    RateLimitConfig <.. CacheManager

File-Level Changes

Clarified and documented the Blizzard client’s internal rate-limit check so that it coexists with the new Nginx-based global 429 handling.
  • Updated the docstring of the internal rate-limit check method to explain its behavior and its relationship to the Nginx cache-miss checks.
  • Expanded the inline comments before the rate-limit check in the HTTP GET method to justify retaining the application-level check for race conditions and defense in depth.
  Files: app/adapters/blizzard/client.py

Extended the Nginx/OpenResty entrypoint configuration to support Blizzard-specific rate-limit keys and headers for the Valkey Lua handler templating.
  • Introduced default environment variables for the Blizzard rate-limit key, the default retry-after value, and the header name used for retry information.
  • Updated the envsubst invocation to pass the new rate-limit variables to the Valkey Lua handler template instead of Prometheus-only variables.
  Files: build/nginx/entrypoint.sh

Added optional SQLite mmap-based I/O tuning, controlled by configuration, to improve read performance (a sketch follows below).
  • Added a new sqlite_mmap_size setting to the Settings configuration with a default of 0 (disabled).
  • Configured SQLite connections to set PRAGMA mmap_size when the configured mmap size is greater than zero, with comments explaining its performance implications.
  Files: app/adapters/storage/sqlite_storage.py, app/config.py
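Taken together, the configuration and storage changes might look roughly like the following sketch. Only the sqlite_mmap_size name, its default of 0 and the PRAGMA line come from the PR; pydantic-settings, the storage path and the exact connection handling are assumptions for illustration.

```python
import aiosqlite
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    storage_path: str = "/data/overfast.db"  # illustrative default, not the project's value
    # Size (in bytes) of the SQLite memory map; 0 keeps mmap disabled
    sqlite_mmap_size: int = 0


settings = Settings()


class SQLiteStorage:
    async def _get_connection(self) -> aiosqlite.Connection:
        db = await aiosqlite.connect(settings.storage_path)
        # Memory-mapped I/O lets SQLite read database pages straight from the OS
        # page cache, which can significantly improve read performance
        if settings.sqlite_mmap_size > 0:
            await db.execute(f"PRAGMA mmap_size={settings.sqlite_mmap_size}")
        return db
```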


@sourcery-ai sourcery-ai bot left a comment


Hey - I've found 1 security issue and left some high-level feedback:

Security issues:

  • Avoiding SQL string concatenation: untrusted input concatenated into a raw SQL query can result in SQL injection. To execute a raw query safely, use a prepared statement. SQLAlchemy provides TextualSQL to easily use prepared statements with named parameters. For complex SQL composition, use the SQL Expression Language or the Schema Definition Language. In most cases, the SQLAlchemy ORM will be a better option. (link)

General comments:

  • For the SQLite mmap_size PRAGMA, consider validating or normalizing settings.sqlite_mmap_size (e.g., enforcing non-negative values and alignment to page size) before interpolating it into the SQL to avoid unexpected SQLite behavior or errors.
  • Now that rate limiting and Retry-After behavior is partially configured via Nginx env vars and partially in the Python client, it may be worth centralizing the configuration (or at least sharing constants) so that changes in retry delay or header naming can’t drift between layers.
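As a rough illustration of that second suggestion (names and defaults here are hypothetical, not the project’s actual configuration), the shared values could live in a single Settings object whose fields mirror the environment variables consumed by build/nginx/entrypoint.sh:

```python
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # Single source of truth for values used by both the Nginx/Valkey Lua handler
    # and the Python client. Exporting these as environment variables
    # (BLIZZARD_RATE_LIMIT_KEY, BLIZZARD_RATE_LIMIT_RETRY_AFTER, RETRY_AFTER_HEADER)
    # lets envsubst template the Lua handler from the same values, so the retry
    # delay and header name cannot drift between layers.
    blizzard_rate_limit_key: str = "blizzard_rate_limit"   # hypothetical key name
    blizzard_rate_limit_retry_after: int = 5                # hypothetical default delay
    retry_after_header: str = "Retry-After"
```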
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- For the SQLite `mmap_size` PRAGMA, consider validating or normalizing `settings.sqlite_mmap_size` (e.g., enforcing non-negative values and alignment to page size) before interpolating it into the SQL to avoid unexpected SQLite behavior or errors.
- Now that rate limiting and `Retry-After` behavior is partially configured via Nginx env vars and partially in the Python client, it may be worth centralizing the configuration (or at least sharing constants) so that changes in retry delay or header naming can’t drift between layers.

## Individual Comments

### Comment 1
<location> `app/adapters/storage/sqlite_storage.py:100` </location>
<code_context>
                    await db.execute(f"PRAGMA mmap_size={settings.sqlite_mmap_size}")
</code_context>

<issue_to_address>
**security (python.sqlalchemy.security.sqlalchemy-execute-raw-query):** Avoiding SQL string concatenation: untrusted input concatenated into a raw SQL query can result in SQL injection. To execute a raw query safely, use a prepared statement. SQLAlchemy provides TextualSQL to easily use prepared statements with named parameters. For complex SQL composition, use the SQL Expression Language or the Schema Definition Language. In most cases, the SQLAlchemy ORM will be a better option.

*Source: opengrep*
</issue_to_address>


The flagged snippet in app/adapters/storage/sqlite_storage.py:

```python
# This can significantly improve read performance by mapping database
# pages directly into memory
if settings.sqlite_mmap_size > 0:
    await db.execute(f"PRAGMA mmap_size={settings.sqlite_mmap_size}")
```
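The validation the reviewer suggests could happen once at the settings layer, so that only a sanitized integer ever reaches the PRAGMA f-string. A minimal sketch, assuming pydantic-settings and a hypothetical page-size constant:

```python
from pydantic import field_validator
from pydantic_settings import BaseSettings

SQLITE_PAGE_SIZE = 4096  # assumed default page size; the real value comes from PRAGMA page_size


class Settings(BaseSettings):
    sqlite_mmap_size: int = 0

    @field_validator("sqlite_mmap_size")
    @classmethod
    def _normalize_mmap_size(cls, value: int) -> int:
        # Clamp negatives to 0 (mmap disabled) and align to the page size, so the
        # integer later interpolated into "PRAGMA mmap_size=..." is always well formed.
        if value <= 0:
            return 0
        return value - (value % SQLITE_PAGE_SIZE)
```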

@TeKrop TeKrop merged commit 7a81f67 into main Feb 16, 2026
4 of 5 checks passed
@TeKrop TeKrop deleted the feature/429-managed-in-nginx branch February 16, 2026 22:39