Conversation
Improves performance by caching environment data and reducing database calls.

Co-authored-by: dima <dima@novu.co>
✅ Deploy Preview for dashboard-v2-novu-staging canceled.
Hey there and thank you for opening this pull request! 👋 We require pull request titles to follow specific formatting rules, and it looks like your proposed title needs to be adjusted.

Requirements: the PR title must end with 'fixes TICKET-ID' (e.g., 'fixes NOV-123'), or the ticket ID must be included in the branch name.
Review failed: the pull request is closed.

Walkthrough

This change introduces a feature flag-driven caching mechanism for environment data retrieval in the ExecuteBridgeJob usecase. An LRU cache and in-flight request tracking were added to store and reuse environment lookup results, preventing duplicate fetches. A new private getEnvironment() method was created to conditionally apply caching when the corresponding feature flag is enabled. The method integrates with FeatureFlagsService and EnvironmentRepository, and environment retrieval in the execute paths was updated to use this new caching layer.
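The walkthrough above can be sketched as follows. This is a hypothetical illustration of the described pattern, not the actual Novu implementation: the names `ExecuteBridgeJobSketch`, the repository interface, and the `isCacheEnabled` callback are assumptions standing in for the real `FeatureFlagsService` and `EnvironmentRepository` wiring.

```typescript
type Environment = { _id: string; name: string };

interface EnvironmentRepositoryLike {
  findOne(query: { _id: string }): Promise<Environment | null>;
}

class ExecuteBridgeJobSketch {
  // Completed lookups, reused on subsequent calls while cached.
  private cache = new Map<string, Environment | null>();
  // Lookups already underway, so concurrent jobs for the same
  // environment share one database call instead of duplicating it.
  private inFlight = new Map<string, Promise<Environment | null>>();

  constructor(
    private repo: EnvironmentRepositoryLike,
    private isCacheEnabled: () => Promise<boolean>,
  ) {}

  private async getEnvironment(id: string): Promise<Environment | null> {
    // Feature flag off: behave exactly as before, one fetch per call.
    if (!(await this.isCacheEnabled())) {
      return this.repo.findOne({ _id: id });
    }

    const cached = this.cache.get(id);
    if (cached !== undefined) return cached;

    // Reuse an in-flight promise if another caller is already fetching.
    const pending = this.inFlight.get(id);
    if (pending) return pending;

    const promise = this.repo
      .findOne({ _id: id })
      .then((env) => {
        this.cache.set(id, env);
        return env;
      })
      .finally(() => this.inFlight.delete(id));

    this.inFlight.set(id, promise);
    return promise;
  }
}
```

With the flag disabled every call hits the repository directly, so rollout carries no behavior change until the flag is turned on.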
What changed? Why was the change needed?

Introduced an LRU cache for fetching environment data from MongoDB within the execute-bridge-job use case in the worker. This change reduces database load and improves performance by caching frequently accessed environments, mirroring existing caching patterns for workflows.

Screenshots
Related enterprise PR
Special notes for your reviewer
The cache has a 60-second TTL and a maximum size of 500 items. It is enabled via the IS_LRU_CACHE_ENABLED feature flag, specifically for the worker-environment component. The implementation pattern, including handling in-flight requests, is consistent with existing LRU cache usage in the codebase.

Slack Thread
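The stated cache parameters (60-second TTL, 500-item cap) can be sketched with a minimal Map-based LRU. This is illustrative only, assuming these parameters; the real codebase presumably uses a shared LRU utility rather than a hand-rolled class like this. It relies on Map preserving insertion order, so the first key is always the least recently used.

```typescript
class TtlLruCache<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>();

  constructor(
    private maxSize = 500,          // cap described in the PR notes
    private ttlMs = 60_000,         // 60-second TTL described in the PR notes
    private now: () => number = Date.now, // injectable clock for testing
  ) {}

  get(key: string): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (entry.expiresAt <= this.now()) {
      this.entries.delete(key); // expired: treat as a miss
      return undefined;
    }
    // Re-insert to mark the key as most recently used.
    this.entries.delete(key);
    this.entries.set(key, entry);
    return entry.value;
  }

  set(key: string, value: V): void {
    this.entries.delete(key);
    if (this.entries.size >= this.maxSize) {
      // Evict the least recently used key (first in insertion order).
      const oldest = this.entries.keys().next().value;
      if (oldest !== undefined) this.entries.delete(oldest);
    }
    this.entries.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }
}
```

A short TTL like 60 seconds bounds how stale a cached environment can get after an update, while the size cap keeps worker memory predictable.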