Fix langfuse #386
Conversation
Walkthrough
This pull request introduces monitoring capabilities using Langfuse across several components of the Cognee project. The changes include adding optional Langfuse configuration parameters to the base configuration and conditionally importing the Langfuse `observe` decorator when Langfuse monitoring is enabled.

Changes
Actionable comments posted: 3
🧹 Nitpick comments (3)
cognee/base_config.py (1)
13-15: LGTM! Remember to document the new environment variables.
The Langfuse configuration is properly implemented using environment variables for sensitive credentials.
Consider adding these new environment variables to the project's documentation or README to help other developers set up monitoring.
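As a sketch of what those optional settings likely look like — the attribute and environment-variable names below are assumptions based on this review, not the actual contents of cognee's `base_config.py`:

```python
# Hypothetical sketch of optional Langfuse settings read from the
# environment; attribute and variable names are assumptions based on
# this review, not the actual cognee base_config.py.
import os
from typing import Optional

class BaseConfigSketch:
    def __init__(self) -> None:
        # All three default to None so monitoring stays strictly opt-in
        self.langfuse_public_key: Optional[str] = os.getenv("LANGFUSE_PUBLIC_KEY")
        self.langfuse_secret_key: Optional[str] = os.getenv("LANGFUSE_SECRET_KEY")
        self.langfuse_host: Optional[str] = os.getenv("LANGFUSE_HOST")

config = BaseConfigSketch()
# With none of the variables set, every credential stays None and
# Langfuse-backed monitoring would remain disabled.
```

Documenting exactly these variable names in the README would let other developers enable monitoring without reading the config source.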
cognee/infrastructure/llm/openai/adapter.py (1)
46-47: Standardize `@observe` decorator usage.
The decorator usage is inconsistent:
- `acreate_structured_output`: `@observe()`
- `create_structured_output`: `@observe`
Choose one style and apply it consistently.
Also applies to: 69-70
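Both spellings can work when a decorator is written to accept either form, which is likely why the inconsistency slipped through. The following is a generic stand-in (not the langfuse implementation) showing the mechanics of a dual-form decorator; the function names mirror the ones flagged above purely for illustration:

```python
# Generic sketch of a decorator usable both bare (@observe) and called
# (@observe()); this is NOT the langfuse source, just the common pattern.
import functools

def observe(func=None, *, name=None):
    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            # A real implementation would open/close a trace span here;
            # we just count invocations to make the behavior observable.
            wrapper.calls += 1
            return f(*args, **kwargs)
        wrapper.calls = 0
        return wrapper
    if func is not None:          # used bare: @observe
        return decorator(func)
    return decorator              # used called: @observe() / @observe(name=...)

@observe
def create_structured_output(text):
    return text.upper()

@observe()
def acreate_structured_output(text):
    return text.lower()

print(create_structured_output("hi"))   # → HI
print(acreate_structured_output("HI"))  # → hi
```

Because both forms behave identically, the choice is stylistic — but picking one (the called form is the more common convention for factory-style decorators) keeps the codebase grep-able.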
cognee/api/v1/cognify/code_graph_pipeline.py (1)
56-57: Consider consolidating monitoring approaches.
The function currently has two monitoring mechanisms:
- Langfuse monitoring via `@observe`
- Explicit `send_telemetry` calls
Consider consolidating these approaches to avoid duplicate monitoring overhead.
Would you like assistance in refactoring this to use a single monitoring approach?
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`poetry.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (4)
- `cognee/api/v1/cognify/code_graph_pipeline.py` (2 hunks)
- `cognee/base_config.py` (1 hunks)
- `cognee/infrastructure/llm/openai/adapter.py` (6 hunks)
- `pyproject.toml` (2 hunks)
🧰 Additional context used
🪛 Ruff (0.8.2)
cognee/infrastructure/llm/openai/adapter.py
43-43: Local variable base_config is assigned to but never used
Remove assignment to unused variable base_config
(F841)
🔇 Additional comments (1)
pyproject.toml (1)
49-49: Verify breaking changes in weaviate-client upgrade
The version upgrade from 4.6.7 to 4.9.6 is significant. While it's an optional dependency, we should verify the changelog for any breaking changes.
✅ Verification successful
Version upgrade appears safe to proceed
After reviewing the changelog between versions 4.6.7 and 4.9.6, there are no breaking changes identified. The updates primarily include:
- New features (multi-vector search, new embedding models support)
- Bug fixes and improvements (connection handling, timeout configurations)
- Documentation updates
The changes are backward compatible and the upgrade should be safe, especially since it's marked as an optional dependency.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check weaviate-client changelog and release notes
gh api repos/weaviate/weaviate-python-client/releases \
--jq '.[] | select(.tag_name | contains("v4.")) | {version: .tag_name, body: .body}' \
| jq 'select(.version | contains("4.7") or contains("4.8") or contains("4.9"))'
Length of output: 15064
base_config = get_base_config()
Remove unused variable.
The `base_config` variable is assigned but never used.
- base_config = get_base_config()
-
🧰 Tools
🪛 Ruff (0.8.2)
43-43: Local variable base_config is assigned to but never used
Remove assignment to unused variable base_config
(F841)
if MonitoringTool.LANGFUSE:
    from langfuse.decorators import observe
💡 Codebase verification
Incorrect usage of MonitoringTool enum for conditional import
The current implementation checks the enum value MonitoringTool.LANGFUSE directly for conditional import, which will always evaluate to True since it's a valid enum member. Instead, this should compare against the configured monitoring tool from base_config.monitoring_tool, similar to how it's correctly used in generic_llm_api/adapter.py.
- `cognee/api/v1/cognify/code_graph_pipeline.py`: Replace `if MonitoringTool.LANGFUSE:` with `if base_config.monitoring_tool == MonitoringTool.LANGFUSE:`
- `cognee/infrastructure/llm/openai/adapter.py`: Replace `if MonitoringTool.LANGFUSE:` with `if base_config.monitoring_tool == MonitoringTool.LANGFUSE:`
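The truthiness problem is easy to reproduce in isolation. In this sketch the enum member names and values are assumptions for illustration and need not match the exact cognee definition:

```python
# Why `if MonitoringTool.LANGFUSE:` always triggers the import: enum
# members are ordinary (truthy) objects, so the check never depends on
# configuration. Member names/values are assumed for illustration.
from enum import Enum

class MonitoringTool(Enum):
    NONE = "none"
    LANGFUSE = "langfuse"

# Buggy check: always True, regardless of what the user configured
assert bool(MonitoringTool.LANGFUSE) is True

# Correct check: compare the *configured* tool against the member
configured_tool = MonitoringTool.NONE  # e.g. base_config.monitoring_tool
langfuse_enabled = configured_tool == MonitoringTool.LANGFUSE
print(langfuse_enabled)  # → False
```

With the equality comparison, the `langfuse` import only happens when the configured monitoring tool is actually Langfuse, matching the pattern the review says `generic_llm_api/adapter.py` already uses.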
🔗 Analysis chain
Verify Langfuse import condition.
The condition `if MonitoringTool.LANGFUSE:` checks an enum value directly, which might not be the intended way to check if Langfuse monitoring is enabled.
Let me check how MonitoringTool is defined to better understand the enum implementation.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check how MonitoringTool enum is used across the codebase
rg -A 2 "MonitoringTool\." --type py
Length of output: 1288
Script:
#!/bin/bash
# Find MonitoringTool enum definition
rg -A 5 "class MonitoringTool" --type py
Length of output: 357
Added langfuse as mandatory dependency, added poetry.lock file Fix
Actionable comments posted: 0
🧹 Nitpick comments (2)
cognee/api/v1/cognify/code_graph_pipeline.py (2)
40-40: Move import statement to the top of the file
The `summarize_code` import should be moved to the top of the file with other imports to maintain consistency and follow Python's style guidelines.

 from cognee.tasks.storage import add_data_points
+from cognee.tasks.summarization import summarize_code
 from cognee.base_config import get_base_config
 from cognee.shared.data_models import MonitoringTool
 monitoring = get_base_config().monitoring_tool
 if monitoring == MonitoringTool.LANGFUSE:
     from langfuse.decorators import observe
-from cognee.tasks.summarization import summarize_code

🧰 Tools
🪛 Ruff (0.8.2)
40-40: Module level import not at top of file
(E402)
74-74: Consider telemetry strategy in deprecated function
The function is marked as deprecated but new monitoring capabilities are being added. Additionally, there might be redundancy between Langfuse monitoring and existing `send_telemetry` calls.
Consider:
- Adding monitoring to the new `run_code_graph_pipeline` function instead
- Documenting the transition plan from `send_telemetry` to Langfuse
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- `cognee/api/v1/cognify/code_graph_pipeline.py` (2 hunks)
🧰 Additional context used
🪛 Ruff (0.8.2)
cognee/api/v1/cognify/code_graph_pipeline.py
40-40: Module level import not at top of file
(E402)
🔇 Additional comments (2)
cognee/api/v1/cognify/code_graph_pipeline.py (2)
33-38: LGTM: Correct implementation of monitoring configuration
The monitoring tool configuration is properly implemented using the base configuration and correct enum comparison.
36-38: Verify Langfuse configuration availability
Let's ensure that the necessary Langfuse configuration is properly defined in the base configuration.
✅ Verification successful
Let me gather more information about the monitoring configuration to ensure it's properly set up.
Let me check one more thing to see if there's any Langfuse initialization or setup code elsewhere.
Langfuse configuration is properly set up
The verification shows that Langfuse is well-integrated in the codebase:
- Configuration keys (public, secret, host) are properly defined in `base_config.py`
- Langfuse is set as the default monitoring tool
- The import is used consistently across multiple components (code_graph_pipeline, openai adapter, generic_llm_api)
- Proper environment variable configuration is in place
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check if Langfuse configuration is properly defined
rg -A 5 "langfuse_(public|secret|host)" --type py
Length of output: 621
Script:
#!/bin/bash
# Check MonitoringTool enum definition and its usage
rg -A 5 "class MonitoringTool" --type py
# Check if there's any initialization of Langfuse client
rg "Langfuse\(" --type py
Length of output: 385
Script:
#!/bin/bash
# Check for any Langfuse-related initialization or setup
rg -B 2 -A 5 "from langfuse" --type py
# Check if there's any other Langfuse configuration or setup
rg "langfuse" -i --type py
Length of output: 2966
Summary by CodeRabbit
New Features
Bug Fixes
Chores
Dependency updates, including an upgrade of `weaviate-client` and the addition of `httpx`.