@daukadolt (Contributor) commented Oct 29, 2025

## Description

Fixes #1681

We do not currently support AWS out of the box in cognee-mcp.

This PR lets users select which extras to start cognee-mcp with, so they can extend MCP when needed.
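
For illustration, a run with extras selected looks like the command captured in the logs below (the image tag and port mapping are the ones used in this test and may differ in other setups):

```bash
# Pick optional dependency groups at container start; the entrypoint installs
# them before launching the MCP server (see the install output in the logs).
docker run \
  -e TRANSPORT_MODE=sse \
  -e EXTRAS=aws,postgres,neo4j \
  --env-file ./.env \
  -p 8121:8000 \
  --rm -it cognee/cognee-mcp:custom-deps
```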

## Testing

  1. Added AWS_REGION, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY to .env
  2. Uploaded a fully generated (synthetic) story to S3
  3. Started the MCP server and, using MCP Inspector, called cognify with the S3 path to the generated story (screenshot: "Screenshot 2025-10-29 at 17 08 10")
  4. Called the "search" tool with a story-specific question (screenshot: "Screenshot 2025-10-29 at 17 08 34")
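
For reference, the AWS-related part of the `.env` from step 1 looks roughly like this (values are placeholders; any other settings your deployment needs live in the same file):

```bash
# AWS credentials picked up by s3fs/boto3 inside the container (placeholder values)
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=AKIA................
AWS_SECRET_ACCESS_KEY=........................................
```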

## Logs

<details>
<summary>Details</summary>

```
daulet@Mac cognee-claude % docker run \
  -e TRANSPORT_MODE=sse \
  -e EXTRAS=aws,postgres,neo4j \
  --env-file ./.env \
  -p 8121:8000 \
  --rm -it cognee/cognee-mcp:custom-deps
Debug mode:
Environment:
Installing optional dependencies: aws,postgres,neo4j
Current cognee version: 0.3.7
Installing cognee with extras: aws,postgres,neo4j
Running: uv pip install 'cognee[aws,postgres,neo4j]==0.3.7'
Resolved 137 packages in 679ms
Prepared 7 packages in 484ms
Installed 7 packages in 86ms
 + aiobotocore==2.25.1
 + aioitertools==0.12.0
 + boto3==1.40.61
 + botocore==1.40.61
 + jmespath==1.0.1
 + s3fs==2025.3.2
 + s3transfer==0.14.0

✓ Optional dependencies installation completed
Transport mode: sse
Debug port: 5678
HTTP port: 8000
Direct mode: Using local cognee instance
Running database migrations...

2025-10-29T16:56:01.562650 [info ] Logging initialized [cognee.shared.logging_utils] cognee_version=0.3.7 database_path=/app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases graph_database_name= os_info='Linux 6.12.5-linuxkit (#1 SMP Tue Jan 21 10:23:32 UTC 2025)' python_version=3.12.12 relational_config=cognee_db structlog_version=25.4.0 vector_config=lancedb

2025-10-29T16:56:01.562816 [info ] Database storage: /app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases [cognee.shared.logging_utils]

2025-10-29T16:56:01.782204 [warning ] Failed to import protego, make sure to install using pip install protego>=0.1 [cognee.shared.logging_utils]

2025-10-29T16:56:01.782403 [warning ] Failed to import playwright, make sure to install using pip install playwright>=1.9.0 [cognee.shared.logging_utils]
Database migrations done.
Starting Cognee MCP Server with transport mode: sse

2025-10-29T16:56:06.184893 [info ] Logging initialized [cognee.shared.logging_utils] cognee_version=0.3.7 database_path=/app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases graph_database_name= os_info='Linux 6.12.5-linuxkit (#1 SMP Tue Jan 21 10:23:32 UTC 2025)' python_version=3.12.12 relational_config=cognee_db structlog_version=25.4.0 vector_config=lancedb

2025-10-29T16:56:06.185069 [info ] Database storage: /app/.venv/lib/python3.12/site-packages/cognee/.cognee_system/databases [cognee.shared.logging_utils]

2025-10-29T16:56:06.245181 [warning ] Failed to import protego, make sure to install using pip install protego>=0.1 [cognee.shared.logging_utils]

2025-10-29T16:56:06.245327 [warning ] Failed to import playwright, make sure to install using pip install playwright>=1.9.0 [cognee.shared.logging_utils]

2025-10-29T16:56:06.582115 [info ] Cognee client initialized in direct mode [cognee.shared.logging_utils]

2025-10-29T16:56:06.582268 [info ] Starting MCP server with transport: sse [cognee.shared.logging_utils]

2025-10-29T16:56:06.582618 [info ] Running MCP server with SSE transport on 0.0.0.0:8000 [cognee.shared.logging_utils]
INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO: 192.168.65.1:54572 - "GET /sse HTTP/1.1" 200 OK
INFO: 192.168.65.1:57034 - "GET /sse HTTP/1.1" 200 OK
INFO: 192.168.65.1:57104 - "GET /sse HTTP/1.1" 200 OK
INFO: 192.168.65.1:55411 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted
INFO: 192.168.65.1:55411 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted
INFO: 192.168.65.1:22986 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted

Processing request of type CallToolRequest

2025-10-29T16:57:52.864665 [info ] Cognify process starting. [cognee.shared.logging_utils]

2025-10-29T16:57:58.159329 [info ] Pipeline run started: 469d4729-328a-542d-aac1-3ea4167d9b83 [run_tasks_with_telemetry()]

2025-10-29T16:57:58.332811 [info ] Coroutine task started: resolve_data_directories [run_tasks_base]

2025-10-29T16:57:58.604008 [info ] Coroutine task started: ingest_data [run_tasks_base]

2025-10-29T16:57:58.899181 [info ] Registered loader: pypdf_loader [cognee.infrastructure.loaders.LoaderEngine]

2025-10-29T16:57:58.899637 [info ] Registered loader: text_loader [cognee.infrastructure.loaders.LoaderEngine]

2025-10-29T16:57:58.899817 [info ] Registered loader: image_loader [cognee.infrastructure.loaders.LoaderEngine]

2025-10-29T16:57:58.899964 [info ] Registered loader: audio_loader [cognee.infrastructure.loaders.LoaderEngine]

2025-10-29T16:57:58.900103 [info ] Registered loader: unstructured_loader [cognee.infrastructure.loaders.LoaderEngine]

2025-10-29T16:57:58.900197 [info ] Registered loader: advanced_pdf_loader [cognee.infrastructure.loaders.LoaderEngine]

2025-10-29T16:57:58.900321 [info ] Registered loader: beautiful_soup_loader [cognee.infrastructure.loaders.LoaderEngine]

2025-10-29T16:57:59.022385 [info ] Coroutine task completed: ingest_data [run_tasks_base]

2025-10-29T16:57:59.183862 [info ] Coroutine task completed: resolve_data_directories [run_tasks_base]

2025-10-29T16:57:59.347932 [info ] Pipeline run completed: 469d4729-328a-542d-aac1-3ea4167d9b83 [run_tasks_with_telemetry()]

2025-10-29T16:57:59.600721 [info ] Loaded JSON extension [cognee.shared.logging_utils]

2025-10-29T16:57:59.616370 [info ] Ontology file 'None' not found. No owl ontology will be attached to the graph. [OntologyAdapter]

2025-10-29T16:57:59.633495 [info ] Pipeline run started: 21372a56-1d44-5e19-a024-209d03a99218 [run_tasks_with_telemetry()]

2025-10-29T16:57:59.784198 [info ] Coroutine task started: classify_documents [run_tasks_base]

2025-10-29T16:57:59.933817 [info ] Coroutine task started: check_permissions_on_dataset [run_tasks_base]

2025-10-29T16:58:00.147315 [info ] Async Generator task started: extract_chunks_from_documents [run_tasks_base]

2025-10-29T16:58:00.366572 [info ] Coroutine task started: extract_graph_from_data [run_tasks_base]

2025-10-29T16:58:51.639973 [info ] No close match found for 'person' in category 'classes' [OntologyAdapter]

2025-10-29T16:58:51.642293 [info ] No close match found for 'khaélith orun' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.642456 [info ] No close match found for 'location' in category 'classes' [OntologyAdapter]

2025-10-29T16:58:51.642551 [info ] No close match found for 'vorrxundra' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.642632 [info ] No close match found for 'creature' in category 'classes' [OntologyAdapter]

2025-10-29T16:58:51.642739 [info ] No close match found for 'thirvalque' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.642823 [info ] No close match found for 'ossaryn' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.642928 [info ] No close match found for 'fyrneloch' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.643126 [info ] No close match found for 'mirror-river' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.643237 [info ] No close match found for 'zyrrhalin' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.643325 [info ] No close match found for 'artifact' in category 'classes' [OntologyAdapter]

2025-10-29T16:58:51.643410 [info ] No close match found for 'crystal plates' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:51.643468 [info ] No close match found for 'concept' in category 'classes' [OntologyAdapter]

2025-10-29T16:58:51.643551 [info ] No close match found for 'forgotten futures' in category 'individuals' [OntologyAdapter]

2025-10-29T16:58:52.968522 [info ] Coroutine task started: summarize_text [run_tasks_base]

2025-10-29T16:59:03.982055 [info ] Coroutine task started: add_data_points [run_tasks_base]

2025-10-29T16:59:04.884215 [info ] Coroutine task completed: add_data_points [run_tasks_base]

2025-10-29T16:59:05.038833 [info ] Coroutine task completed: summarize_text [run_tasks_base]

2025-10-29T16:59:05.200412 [info ] Coroutine task completed: extract_graph_from_data [run_tasks_base]

2025-10-29T16:59:05.361403 [info ] Async Generator task completed: extract_chunks_from_documents [run_tasks_base]

2025-10-29T16:59:05.529879 [info ] Coroutine task completed: check_permissions_on_dataset [run_tasks_base]

2025-10-29T16:59:05.694801 [info ] Coroutine task completed: classify_documents [run_tasks_base]

2025-10-29T16:59:05.852353 [info ] Pipeline run completed: 21372a56-1d44-5e19-a024-209d03a99218 [run_tasks_with_telemetry()]

2025-10-29T16:59:06.042754 [info ] Cognify process finished. [cognee.shared.logging_utils]
INFO: 192.168.65.1:25778 - "POST /messages/?session_id=33fc2c041d184f3ab62597c69259d036 HTTP/1.1" 202 Accepted

Processing request of type CallToolRequest

2025-10-29T17:01:03.372413 [info ] Graph projection completed: 17 nodes, 33 edges in 0.01s [CogneeGraph]

2025-10-29T17:01:03.790022 [info ] Vector collection retrieval completed: Retrieved distances from 6 collections in 0.07s [cognee.shared.logging_utils]

```

</details>
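
As an extra sanity check (not part of the recorded test run), the installed extras can be listed from inside the running container; the container name here is hypothetical, and this assumes uv is on the PATH in the image, as the install log suggests:

```bash
# Confirm the AWS extras landed in the container's environment
docker exec -it <mcp-container> uv pip list | grep -Ei 's3fs|boto3|aiobotocore'
```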

## Type of Change
<!-- Please check the relevant option -->
- [ ] Bug fix (non-breaking change that fixes an issue)
- [ ] New feature (non-breaking change that adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)
- [ ] Documentation update
- [ ] Code refactoring
- [ ] Performance improvement
- [ ] Other (please specify):

## Screenshots/Videos (if applicable)
<!-- Add screenshots or videos to help explain your changes -->

## Pre-submission Checklist
<!-- Please check all boxes that apply before submitting your PR -->
- [ ] **I have tested my changes thoroughly before submitting this PR**
- [ ] **This PR contains minimal changes necessary to address the issue/feature**
- [ ] My code follows the project's coding standards and style guidelines
- [ ] I have added tests that prove my fix is effective or that my feature works
- [ ] I have added necessary documentation (if applicable)
- [ ] All new and existing tests pass
- [ ] I have searched existing PRs to ensure this change hasn't been submitted already
- [ ] I have linked any relevant issues in the description
- [ ] My commits have clear and descriptive messages

## DCO Affirmation
I affirm that all code in every commit of this pull request conforms to the terms of the Topoteretes Developer Certificate of Origin.

@pull-checklist

Please make sure all the checkboxes are checked:

  • I have tested these changes locally.
  • I have reviewed the code changes.
  • I have added end-to-end and unit tests (if applicable).
  • I have updated the documentation and README.md file (if necessary).
  • I have removed unnecessary code and debug statements.
  • PR title is clear and follows the convention.
  • I have tagged reviewers or team members for feedback.


coderabbitai bot commented Oct 29, 2025

Important

Review skipped

Auto reviews are disabled on base/target branches other than the default branch.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting `reviews.review_status` to `false` in the CodeRabbit configuration file.

Walkthrough

This PR introduces telemetry version/tenant tracking across API routes and task pipelines, adds a feedback enrichment feature with extraction/generation/linking workflows, enhances the retrieval system with structured completions, updates Docker deployment with runtime extras installation, refactors infrastructure components (ontology resolver, vector engine, graph adapter, logging), significantly revamps network visualization, and updates prompt templates.

Changes

- **Telemetry enrichment (version & tenant tracking)**
  - Files: cognee/api/v1/add/routers/get_add_router.py, cognee/api/v1/cognify/routers/get_cognify_router.py, cognee/api/v1/datasets/routers/get_datasets_router.py, cognee/api/v1/delete/routers/get_delete_router.py, cognee/api/v1/memify/routers/get_memify_router.py, cognee/api/v1/permissions/routers/get_permissions_router.py, cognee/api/v1/search/routers/get_search_router.py, cognee/api/v1/sync/routers/get_sync_router.py, cognee/api/v1/update/routers/get_update_router.py, cognee/api/v1/users/routers/get_visualize_router.py
  - Summary: Added cognee_version import and propagated it to telemetry payloads across all API endpoint routers
- **Pipeline task telemetry**
  - Files: cognee/modules/pipelines/operations/run_tasks_base.py, cognee/modules/pipelines/operations/run_tasks_with_telemetry.py, cognee/modules/search/methods/search.py
  - Summary: Enhanced telemetry with cognee_version and tenant_id (derived from user context or defaulting to "Single User Tenant")
- **Docker deployment**
  - Files: cognee-mcp/README.md, cognee-mcp/entrypoint.sh, cognee/api/v1/ui/ui.py
  - Summary: Added runtime optional-dependency installation via the EXTRAS environment variable; updated the MCP Docker image tag from feature-standalone-mcp to main; documented the available extras groups
- **Feedback enrichment feature**
  - Files: cognee/tasks/feedback/__init__.py, cognee/tasks/feedback/models.py, cognee/tasks/feedback/extract_feedback_interactions.py, cognee/tasks/feedback/generate_improved_answers.py, cognee/tasks/feedback/create_enrichments.py, cognee/tasks/feedback/link_enrichments_to_feedback.py
  - Summary: New comprehensive feedback enrichment workflow: data model, extraction from graph, LLM-powered answer improvement, enrichment generation, and graph linking
- **Feedback prompts**
  - Files: cognee/infrastructure/llm/prompts/feedback_reaction_prompt.txt, cognee/infrastructure/llm/prompts/feedback_report_prompt.txt, cognee/infrastructure/llm/prompts/feedback_user_context_prompt.txt, cognee/infrastructure/llm/prompts/extract_query_time.txt
  - Summary: New prompt templates for feedback reaction generation, report creation, and context summarization; rewrote the temporal query extraction prompt
- **Retrieval system enhancements**
  - Files: cognee/modules/retrieval/graph_completion_cot_retriever.py, cognee/modules/retrieval/utils/completion.py, cognee/modules/retrieval/temporal_retriever.py
  - Summary: Introduced structured completion support with response_model handling; added a get_structured_completion method; added datetime context for temporal queries
- **Infrastructure improvements**
  - Files: cognee/infrastructure/databases/vector/create_vector_engine.py
  - Summary: Normalized provider comparisons; added explicit LanceDB branching; consolidated error messaging with supported provider enumeration
- **Ontology resolver**
  - Files: cognee/modules/ontology/get_default_ontology_resolver.py, cognee/modules/ontology/rdf_xml/RDFLibOntologyResolver.py
  - Summary: Extended to support multiple ontology files via comma-separated paths or list input; multi-file loading with per-file validation and aggregation
- **Graph adapter**
  - Files: cognee/infrastructure/databases/graph/kuzu/adapter.py
  - Summary: Enhanced node query projection in get_filtered_graph_data to return a richer structure with name, type, and properties fields instead of the minimal structure
- **File type detection**
  - Files: cognee/infrastructure/files/utils/guess_file_type.py
  - Summary: Removed PDF file type support (CustomPdfMatcher class and related registration)
- **Logging utilities**
  - Files: cognee/shared/logging_utils.py, cognee/shared/utils.py
  - Summary: Refactored root logger configuration with graceful file-handler fallback; added UUID-based property sanitization for telemetry; made get_anonymous_id resilient to filesystem errors
- **Network visualization**
  - Files: cognee/modules/visualization/cognee_network_visualization.py
  - Summary: Major overhaul: new color mappings, D3 density visualization, info panel, improved tooltip system, hover-based highlighting, ID normalization, safe JSON embedding, responsive label sizing, and robust error handling
- **Tests & examples**
  - Files: cognee/tests/test_feedback_enrichment.py, cognee/tests/unit/modules/ontology/test_ontology_adapter.py, cognee/tests/unit/modules/retrieval/graph_completion_retriever_cot_test.py, examples/python/feedback_enrichment_minimal_example.py, examples/python/temporal_example.py
  - Summary: Added an end-to-end feedback enrichment integration test, multi-file ontology unit tests, structured completion validation tests, a minimal feedback enrichment example, and a temporal query example
- **Minor updates**
  - Files: cognee/tasks/ingestion/migrate_relational_database.py
  - Summary: Removed parent row ID reference from ColumnValue node description

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant API as API Endpoint
    participant Telemetry as Telemetry System
    participant LLMGateway
    participant GraphEngine

    User->>API: Request with context (e.g., cognify)
    API->>API: Import cognee_version
    API->>Telemetry: Send telemetry with cognee_version + tenant_id
    Telemetry->>Telemetry: Log enriched metadata
    API->>GraphEngine: Execute core logic
    GraphEngine-->>API: Result
    API-->>User: Response

    Note over Telemetry: Version & tenant now tracked across all endpoints
sequenceDiagram
    participant GraphData as Graph Data
    participant ExtractTask as extract_feedback_interactions
    participant GenerateTask as generate_improved_answers
    participant CreateTask as create_enrichments
    participant LinkTask as link_enrichments_to_feedback
    participant GraphEngine

    GraphData->>ExtractTask: Fetch feedback & interactions
    ExtractTask->>ExtractTask: Match pairs, filter negative
    ExtractTask->>ExtractTask: Generate human-readable context via LLM
    ExtractTask-->>GenerateTask: FeedbackEnrichment records

    GenerateTask->>GenerateTask: Initialize GraphCompletionCotRetriever
    GenerateTask->>LLMGateway: Render reaction prompt & get structured completion
    GenerateTask->>GenerateTask: Update improved_answer & explanation
    GenerateTask-->>CreateTask: Enriched records

    CreateTask->>LLMGateway: Generate report via feedback_report_prompt
    CreateTask->>CreateTask: Assign to NodeSet
    CreateTask-->>LinkTask: Enrichments with text & set

    LinkTask->>LinkTask: Create enriches_feedback & improves_interaction edges
    LinkTask->>GraphEngine: Add & index edges
    GraphEngine-->>LinkTask: Success
    LinkTask-->>User: FeedbackEnrichment nodes linked to graph

    Note over ExtractTask,LinkTask: Complete feedback enrichment pipeline
sequenceDiagram
    participant Startup as Entrypoint
    participant EnvVars as Environment
    participant Installer as pip install
    participant Docker as Container

    Docker->>Startup: Container start with EXTRAS env var
    Startup->>EnvVars: Check EXTRAS
    alt EXTRAS provided (e.g., "aws,postgres")
        Startup->>Startup: Parse & deduplicate
        Startup->>Startup: Get installed cognee version
        Startup->>Installer: Install cognee[aws,postgres]==version
        Installer-->>Startup: Install complete
        Startup->>Docker: Log success + extras installed
    else EXTRAS not set
        Startup->>Docker: Log no optional dependencies specified
    end
    Docker->>Docker: Continue normal startup
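
A minimal shell sketch of the EXTRAS flow shown above, assuming the version pinning and `uv pip install` invocation visible in the test logs (illustrative only, not the actual contents of cognee-mcp/entrypoint.sh):

```bash
#!/bin/sh
# Sketch: install optional cognee extras at container start, then continue startup.
if [ -n "$EXTRAS" ]; then
  # Pin to the cognee version already installed in the image, mirroring the
  # "uv pip install 'cognee[aws,postgres,neo4j]==0.3.7'" line from the logs.
  COGNEE_VERSION="$(uv pip show cognee | awk '/^Version:/ {print $2}')"
  echo "Installing cognee with extras: $EXTRAS"
  uv pip install "cognee[${EXTRAS}]==${COGNEE_VERSION}"
else
  echo "No optional dependencies specified"
fi
# ...run database migrations and start the MCP server as usual
```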

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Key areas requiring extra attention:

  • Feedback enrichment feature (cognee/tasks/feedback/*) — substantial new feature with multi-step orchestration, LLM integration, graph linking, and validation logic across 6 new modules
  • Network visualization (cognee/modules/visualization/cognee_network_visualization.py) — intricate D3 density visualization, color mappings, interaction handlers, and numerous styling enhancements; requires visual validation
  • Retrieval system changes (cognee/modules/retrieval/graph_completion_cot_retriever.py, cognee/modules/retrieval/utils/completion.py) — structural refactoring introducing structured completions with response_model handling; affects multiple downstream callers
  • Ontology resolver multi-file support (cognee/modules/ontology/rdf_xml/RDFLibOntologyResolver.py) — public API signature change; multi-file loading with edge cases (partial failures, all missing files)
  • Telemetry change density — while mostly repetitive, verify consistent tenant_id derivation logic across ~10 router files
  • Logging refactor (cognee/shared/logging_utils.py, cognee/shared/utils.py) — error handling divergence when file handler creation fails; verify fallback behavior preserves functionality

Possibly related PRs

Suggested labels

review-required, feature-feedback-enrichment, telemetry, visualization, refactor-retrieval

Suggested reviewers

  • borisarzentar
  • dexters1
  • Vasilije1990

Poem

🐰 Version tracking hops through every frame,
Feedback whispers guide the way,
Density clouds dance in D3's flame,
Ontologies bloom in many a way,
One PR to weave them all the same!

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings)
- **Out of Scope Changes Check** (⚠️ Warning)
  - Explanation: The changeset contains extensive modifications beyond the scope of issue #1681. While the EXTRAS installation feature (entrypoint.sh, README.md, ui.py) directly addresses the linked issue, the PR also includes a comprehensive feedback enrichment system (extract_feedback_interactions, generate_improved_answers, create_enrichments, link_enrichments_to_feedback, the FeedbackEnrichment model, and associated tests), structured completion enhancements in graph retrievers, temporal retriever modifications, a visualization overhaul with density visualization and interaction enhancements, logging infrastructure improvements, ontology multi-file support, vector engine provider normalization, Kuzu adapter graph data structure changes, telemetry version tracking additions across all API routers, and prompt template additions. These changes are not mentioned in issue #1681 and appear to represent additional features beyond the stated scope of enabling optional dependencies for AWS S3 support in MCP.
  - Resolution: Consider whether the feedback enrichment system, retrieval enhancements, visualization improvements, telemetry additions, and other infrastructure changes should be split into separate pull requests linked to their respective issues. If these changes are intentionally included as supporting infrastructure for the EXTRAS feature, document this relationship in the PR description and link any relevant issues that justify their inclusion.
- **Docstring Coverage** (⚠️ Warning)
  - Explanation: Docstring coverage is 72.53%, below the required threshold of 80.00%.
  - Resolution: Run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (3 passed)
- **Title Check** (✅ Passed): The title "Feat/cognee mcp add option to install extras" accurately reflects the primary feature added by the pull request. The entrypoint.sh modification enables runtime installation of optional dependencies through an EXTRAS environment variable, the README documents this feature with usage examples, and ui.py updates the Docker image tag. The title is concise, specific, and clearly communicates the main change: enabling users to select and install optional extras when running Cognee MCP.
- **Linked Issues Check** (✅ Passed): The linked issue #1681 requires adding s3fs as a dependency to the Cognee MCP Docker image to support AWS S3 configuration. The PR addresses the core objective through entrypoint.sh, which implements runtime installation of optional extras via the EXTRAS environment variable, and the testing logs confirm that s3fs and related AWS libraries (aiobotocore, boto3, botocore, s3transfer) are successfully installed and functional. The approach enables dynamic dependency installation rather than pre-baking dependencies into the image, which achieves the functional goal of making S3 available when configured, as demonstrated by the successful cognify pipeline execution with S3 data ingestion shown in the testing logs.
- **Description Check** (✅ Passed): The pull request description is comprehensive and meets the core requirements of the template. It includes a clear human-generated explanation linking to issue #1681, a detailed testing section with screenshots and console logs demonstrating AWS S3 integration working end-to-end, and evidence that optional dependencies (aws, postgres, neo4j) are successfully installed and functional. While the Type of Change checkboxes and pre-submission checklist items remain unchecked, the essential content about what changed, why, and how it was tested is fully present.


Comment @coderabbitai help to get the list of available commands and usage tips.

@daukadolt daukadolt changed the base branch from main to dev October 29, 2025 17:13
@Vasilije1990 Vasilije1990 merged commit a2dbc0c into dev Oct 29, 2025
134 of 139 checks passed
@Vasilije1990 Vasilije1990 deleted the feat/cognee-mcp-add-option-to-install-extras branch October 29, 2025 18:18


Development

Successfully merging this pull request may close these issues.

[Bug]: s3fs module is missing from Cognee MCP Docker image
