Feat/cognee mcp add option to install extras #1696
Conversation
Walkthrough

This PR introduces telemetry version/tenant tracking across API routes and task pipelines, adds a feedback enrichment feature with extraction/generation/linking workflows, enhances the retrieval system with structured completions, updates Docker deployment with runtime extras installation, refactors infrastructure components (ontology resolver, vector engine, graph adapter, logging), significantly revamps network visualization, and updates prompt templates.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant API as API Endpoint
    participant Telemetry as Telemetry System
    participant LLMGateway
    participant GraphEngine
    User->>API: Request with context (e.g., cognify)
    API->>API: Import cognee_version
    API->>Telemetry: Send telemetry with cognee_version + tenant_id
    Telemetry->>Telemetry: Log enriched metadata
    API->>GraphEngine: Execute core logic
    GraphEngine-->>API: Result
    API-->>User: Response
    Note over Telemetry: Version & tenant now tracked across all endpoints
```
```mermaid
sequenceDiagram
    participant GraphData as Graph Data
    participant ExtractTask as extract_feedback_interactions
    participant GenerateTask as generate_improved_answers
    participant CreateTask as create_enrichments
    participant LinkTask as link_enrichments_to_feedback
    participant GraphEngine
    GraphData->>ExtractTask: Fetch feedback & interactions
    ExtractTask->>ExtractTask: Match pairs, filter negative
    ExtractTask->>ExtractTask: Generate human-readable context via LLM
    ExtractTask-->>GenerateTask: FeedbackEnrichment records
    GenerateTask->>GenerateTask: Initialize GraphCompletionCotRetriever
    GenerateTask->>LLMGateway: Render reaction prompt & get structured completion
    GenerateTask->>GenerateTask: Update improved_answer & explanation
    GenerateTask-->>CreateTask: Enriched records
    CreateTask->>LLMGateway: Generate report via feedback_report_prompt
    CreateTask->>CreateTask: Assign to NodeSet
    CreateTask-->>LinkTask: Enrichments with text & set
    LinkTask->>LinkTask: Create enriches_feedback & improves_interaction edges
    LinkTask->>GraphEngine: Add & index edges
    GraphEngine-->>LinkTask: Success
    LinkTask-->>User: FeedbackEnrichment nodes linked to graph
    Note over ExtractTask,LinkTask: Complete feedback enrichment pipeline
```
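The four-stage pipeline above chains async tasks, each consuming the previous stage's output. The sketch below shows only the chaining shape; the task bodies are placeholders (the real tasks call the LLM gateway and graph engine), and all record fields here are invented for illustration.

```python
# Minimal sketch of the four-stage feedback enrichment pipeline.
# Task bodies are stand-ins; only the chaining structure mirrors the diagram.
import asyncio


async def extract_feedback_interactions(graph_data):
    # Pair feedback with interactions; keep only negative feedback.
    return [d for d in graph_data if d.get("sentiment") == "negative"]


async def generate_improved_answers(records):
    # The real task renders a prompt and gets a structured LLM completion.
    for r in records:
        r["improved_answer"] = f"improved: {r['answer']}"
    return records


async def create_enrichments(records):
    # The real task generates a report and assigns records to a NodeSet.
    for r in records:
        r["report"] = f"report for {r['id']}"
    return records


async def link_enrichments_to_feedback(records):
    # The real task creates and indexes edges in the graph engine.
    return [("enriches_feedback", r["id"]) for r in records]


async def run_pipeline(graph_data):
    records = await extract_feedback_interactions(graph_data)
    records = await generate_improved_answers(records)
    records = await create_enrichments(records)
    return await link_enrichments_to_feedback(records)


edges = asyncio.run(run_pipeline([
    {"id": 1, "answer": "a", "sentiment": "negative"},
    {"id": 2, "answer": "b", "sentiment": "positive"},
]))
```

Filtering happens at the first stage, so later stages (and the LLM calls they make) only run for the negative-feedback pairs that need enrichment.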
```mermaid
sequenceDiagram
    participant Startup as Entrypoint
    participant EnvVars as Environment
    participant Installer as pip install
    participant Docker as Container
    Docker->>Startup: Container start with EXTRAS env var
    Startup->>EnvVars: Check EXTRAS
    alt EXTRAS provided (e.g., "aws,postgres")
        Startup->>Startup: Parse & deduplicate
        Startup->>Startup: Get installed cognee version
        Startup->>Installer: Install cognee[aws,postgres]==version
        Installer-->>Startup: Install complete
        Startup->>Docker: Log success + extras installed
    else EXTRAS not set
        Startup->>Docker: Log no optional dependencies specified
    end
    Docker->>Docker: Continue normal startup
```
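The parse-deduplicate-pin logic above can be sketched as follows. Note this is an illustrative Python sketch of the behavior, not the actual entrypoint script; the helper names are invented, and only the `uv pip install 'cognee[...]==<version>'` command line comes from the run logs below.

```python
# Illustrative sketch of the runtime-extras logic the entrypoint performs.
# Helper names are hypothetical; the real entrypoint builds the same
# pinned spec and runs it via `uv pip install`.
import os
from importlib.metadata import version


def parse_extras(raw: str) -> list[str]:
    """Split, trim, and deduplicate the comma-separated EXTRAS value."""
    seen: dict[str, None] = {}
    for extra in raw.split(","):
        extra = extra.strip()
        if extra:
            seen.setdefault(extra, None)  # dict preserves insertion order
    return list(seen)


def install_spec(extras: list[str], pkg_version: str) -> str:
    """Build the pinned requirement, e.g. cognee[aws,postgres]==0.3.7."""
    return f"cognee[{','.join(extras)}]=={pkg_version}"


raw = os.environ.get("EXTRAS", "")
if raw:
    spec = install_spec(parse_extras(raw), version("cognee"))
    # The entrypoint then runs: uv pip install '<spec>'
```

Pinning the extras install to the already-installed cognee version keeps the runtime install from silently upgrading or downgrading the base package.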
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Description
Fixes #1681
We do not support AWS out of the box in MCP right now. This PR enables users to select which extras they want to start their cognee-mcp with, so users can extend MCP if need be.

Testing

- Added AWS_REGION, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY to .env
- Ran cognify with an s3 path to the generated story

Logs
```
daulet@Mac cognee-claude % docker run \
  -e TRANSPORT_MODE=sse \
  -e EXTRAS=aws,postgres,neo4j \
  --env-file ./.env \
  -p 8121:8000 \
  --rm -it cognee/cognee-mcp:custom-deps
Debug mode:
Environment:
Installing optional dependencies: aws,postgres,neo4j
Current cognee version: 0.3.7
Installing cognee with extras: aws,postgres,neo4j
Running: uv pip install 'cognee[aws,postgres,neo4j]==0.3.7'
Resolved 137 packages in 679ms
Prepared 7 packages in 484ms
Installed 7 packages in 86ms
 + aiobotocore==2.25.1
 + aioitertools==0.12.0
 + boto3==1.40.61
 + botocore==1.40.61
 + jmespath==1.0.1
 + s3fs==2025.3.2
 + s3transfer==0.14.0
✓ Optional dependencies installation completed
```
Transport mode: sse
Debug port: 5678
HTTP port: 8000
Direct mode: Using local cognee instance
Running database migrations...
2025-10-29T16:56:01.562650 [info ] Logging initialized [cognee.shared.logging_utils] cognee_version=0.3.7 database_path=/app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases graph_database_name= os_info='Linux 6.12.5-linuxkit (#1 SMP Tue Jan 21 10:23:32 UTC 2025)' python_version=3.12.12 relational_config=cognee_db structlog_version=25.4.0 vector_config=lancedb
2025-10-29T16:56:01.562816 [info ] Database storage: /app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases [cognee.shared.logging_utils]
2025-10-29T16:56:01.782204 [warning ] Failed to import protego, make sure to install using pip install protego>=0.1 [cognee.shared.logging_utils]
2025-10-29T16:56:01.782403 [warning ] Failed to import playwright, make sure to install using pip install playwright>=1.9.0 [cognee.shared.logging_utils]
Database migrations done.
Starting Cognee MCP Server with transport mode: sse
2025-10-29T16:56:06.184893 [info ] Logging initialized [cognee.shared.logging_utils] cognee_version=0.3.7 database_path=/app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases graph_database_name= os_info='Linux 6.12.5-linuxkit (#1 SMP Tue Jan 21 10:23:32 UTC 2025)' python_version=3.12.12 relational_config=cognee_db structlog_version=25.4.0 vector_config=lancedb
2025-10-29T16:56:06.185069 [info ] Database storage: /app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases [cognee.shared.logging_utils]
2025-10-29T16:56:06.245181 [warning ] Failed to import protego, make sure to install using pip install protego>=0.1 [cognee.shared.logging_utils]
2025-10-29T16:56:06.245327 [warning ] Failed to import playwright, make sure to install using pip install playwright>=1.9.0 [cognee.shared.logging_utils]
2025-10-29T16:56:06.582115 [info ] Cognee client initialized in direct mode [cognee.shared.logging_utils]
2025-10-29T16:56:06.582268 [info ] Starting MCP server with transport: sse [cognee.shared.logging_utils]
2025-10-29T16:56:06.582618 [info ] Running MCP server with SSE transport on 0.0.0.0:8000 [cognee.shared.logging_utils]
INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO: 192.168.65.1:54572 - "GET /sse HTTP/1.1" 200 OK
INFO: 192.168.65.1:57034 - "GET /sse HTTP/1.1" 200 OK
INFO: 192.168.65.1:57104 - "GET /sse HTTP/1.1" 200 OK
INFO: 192.168.65.1:55411 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted
INFO: 192.168.65.1:55411 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted
INFO: 192.168.65.1:22986 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted
Processing request of type CallToolRequest
2025-10-29T16:57:52.864665 [info ] Cognify process starting. [cognee.shared.logging_utils]
2025-10-29T16:57:58.159329 [info ] Pipeline run started: 469d4729-328a-542d-aac1-3ea4167d9b83 [run_tasks_with_telemetry()]
2025-10-29T16:57:58.332811 [info ] Coroutine task started: resolve_data_directories [run_tasks_base]
2025-10-29T16:57:58.604008 [info ] Coroutine task started: ingest_data [run_tasks_base]
2025-10-29T16:57:58.899181 [info ] Registered loader: pypdf_loader [cognee.infrastructure.loaders.LoaderEngine]
2025-10-29T16:57:58.899637 [info ] Registered loader: text_loader [cognee.infrastructure.loaders.LoaderEngine]
2025-10-29T16:57:58.899817 [info ] Registered loader: image_loader [cognee.infrastructure.loaders.LoaderEngine]
2025-10-29T16:57:58.899964 [info ] Registered loader: audio_loader [cognee.infrastructure.loaders.LoaderEngine]
2025-10-29T16:57:58.900103 [info ] Registered loader: unstructured_loader [cognee.infrastructure.loaders.LoaderEngine]
2025-10-29T16:57:58.900197 [info ] Registered loader: advanced_pdf_loader [cognee.infrastructure.loaders.LoaderEngine]
2025-10-29T16:57:58.900321 [info ] Registered loader: beautiful_soup_loader [cognee.infrastructure.loaders.LoaderEngine]
2025-10-29T16:57:59.022385 [info ] Coroutine task completed: ingest_data [run_tasks_base]
2025-10-29T16:57:59.183862 [info ] Coroutine task completed: resolve_data_directories [run_tasks_base]
2025-10-29T16:57:59.347932 [info ] Pipeline run completed: 469d4729-328a-542d-aac1-3ea4167d9b83 [run_tasks_with_telemetry()]
2025-10-29T16:57:59.600721 [info ] Loaded JSON extension [cognee.shared.logging_utils]
2025-10-29T16:57:59.616370 [info ] Ontology file 'None' not found. No owl ontology will be attached to the graph. [OntologyAdapter]
2025-10-29T16:57:59.633495 [info ] Pipeline run started: 21372a56-1d44-5e19-a024-209d03a99218 [run_tasks_with_telemetry()]
2025-10-29T16:57:59.784198 [info ] Coroutine task started: classify_documents [run_tasks_base]
2025-10-29T16:57:59.933817 [info ] Coroutine task started: check_permissions_on_dataset [run_tasks_base]
2025-10-29T16:58:00.147315 [info ] Async Generator task started: extract_chunks_from_documents [run_tasks_base]
2025-10-29T16:58:00.366572 [info ] Coroutine task started: extract_graph_from_data [run_tasks_base]
2025-10-29T16:58:51.639973 [info ] No close match found for 'person' in category 'classes' [OntologyAdapter]
2025-10-29T16:58:51.642293 [info ] No close match found for 'khaélith orun' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.642456 [info ] No close match found for 'location' in category 'classes' [OntologyAdapter]
2025-10-29T16:58:51.642551 [info ] No close match found for 'vorrxundra' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.642632 [info ] No close match found for 'creature' in category 'classes' [OntologyAdapter]
2025-10-29T16:58:51.642739 [info ] No close match found for 'thirvalque' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.642823 [info ] No close match found for 'ossaryn' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.642928 [info ] No close match found for 'fyrneloch' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.643126 [info ] No close match found for 'mirror-river' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.643237 [info ] No close match found for 'zyrrhalin' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.643325 [info ] No close match found for 'artifact' in category 'classes' [OntologyAdapter]
2025-10-29T16:58:51.643410 [info ] No close match found for 'crystal plates' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:51.643468 [info ] No close match found for 'concept' in category 'classes' [OntologyAdapter]
2025-10-29T16:58:51.643551 [info ] No close match found for 'forgotten futures' in category 'individuals' [OntologyAdapter]
2025-10-29T16:58:52.968522 [info ] Coroutine task started: summarize_text [run_tasks_base]
2025-10-29T16:59:03.982055 [info ] Coroutine task started: add_data_points [run_tasks_base]
2025-10-29T16:59:04.884215 [info ] Coroutine task completed: add_data_points [run_tasks_base]
2025-10-29T16:59:05.038833 [info ] Coroutine task completed: summarize_text [run_tasks_base]
2025-10-29T16:59:05.200412 [info ] Coroutine task completed: extract_graph_from_data [run_tasks_base]
2025-10-29T16:59:05.361403 [info ] Async Generator task completed: extract_chunks_from_documents [run_tasks_base]
2025-10-29T16:59:05.529879 [info ] Coroutine task completed: check_permissions_on_dataset [run_tasks_base]
2025-10-29T16:59:05.694801 [info ] Coroutine task completed: classify_documents [run_tasks_base]
2025-10-29T16:59:05.852353 [info ] Pipeline run completed: 21372a56-1d44-5e19-a024-209d03a99218 [run_tasks_with_telemetry()]
2025-10-29T16:59:06.042754 [info ] Cognify process finished. [cognee.shared.logging_utils]
INFO: 192.168.65.1:25778 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted
Processing request of type CallToolRequest
2025-10-29T17:01:03.372413 [info ] Graph projection completed: 17 nodes, 33 edges in 0.01s [CogneeGraph]
2025-10-29T17:01:03.790022 [info ] Vector collection retrieval completed: Retrieved distances from 6 collections in 0.07s [cognee.shared.logging_utils]