Closed

Changes from 1 commit · 26 commits
6773121
fix: datasets status without datasets parameter
borisarzentar Jul 28, 2025
9793cd5
version: 0.2.2.dev0
borisarzentar Jul 28, 2025
961fa5e
chore: update uv.lock file
borisarzentar Jul 28, 2025
4ea4b10
fix: datasets status (#1166)
borisarzentar Jul 28, 2025
f78af0c
feature: solve edge embedding duplicates in edge collection + retriev…
hajdul88 Jul 29, 2025
14ba3e8
feat: Enable async execution of data items for incremental loading (#…
dexters1 Jul 29, 2025
cd930ed
Merge remote-tracking branch 'origin/main' into dev
borisarzentar Jul 30, 2025
5b6e946
fix: Add async lock for dynamic vector table creation (#1175)
dexters1 Aug 1, 2025
9faa47f
feat: add default tokenizer in case hugging face is not available (#1…
dexters1 Aug 1, 2025
fc7a91d
feature: implement FEELING_LUCKY search type (#1178)
EricXiao95 Aug 2, 2025
df17ae7
added fix to weighted edges
Vasilije1990 Aug 2, 2025
c686e39
fix: Add better docstrings to mcp (#1189)
Vasilije1990 Aug 4, 2025
f4a37ed
Cognee mcp docker fix (#1195)
dexters1 Aug 4, 2025
ab425e4
Merge branch 'main' into merge-main-vol-4
dexters1 Aug 5, 2025
343d990
Merge main vol 4 (#1200)
dexters1 Aug 5, 2025
d237b80
Merge branch 'dev' into merge-main-vol-4
dexters1 Aug 5, 2025
ba62466
Merge main vol 4 (#1201)
dexters1 Aug 5, 2025
8d4ed35
Fix low level pipeline (#1203)
dexters1 Aug 5, 2025
dabd091
feat: Cog 2082 add BAML to cognee (#1054)
Vasilije1990 Aug 6, 2025
b54e843
Add neo4j multi db support (#1207)
dexters1 Aug 6, 2025
a9e74da
Update LiteLLMEmbeddingEngine.py (#1205)
Aug 6, 2025
4e816ad
fix: changing deletion logic to use document id instead of content ha…
hajdul88 Aug 6, 2025
0ea5894
added distributed fixes
Vasilije1990 Aug 5, 2025
c8202c5
format fix
Vasilije1990 Aug 6, 2025
6dbd8e8
feat: dynamic multiple edges in datapoints (#1212)
lxobr Aug 7, 2025
913a639
Fix Graph Visualization Access for Users with Read Permissions
EricXiao95 Aug 7, 2025
Merge main vol 4 (#1200)
<!-- .github/pull_request_template.md -->

## Description
<!-- Provide a clear description of the changes in this PR -->

## DCO Affirmation
I affirm that all code in every commit of this pull request conforms to
the terms of the Topoteretes Developer Certificate of Origin.

---------

Signed-off-by: Andrew Carbonetto <andrew.carbonetto@improving.com>
Signed-off-by: Andy Kwok <andy.kwok@improving.com>
Co-authored-by: Vasilije <8619304+Vasilije1990@users.noreply.github.com>
Co-authored-by: vasilije <vas.markovic@gmail.com>
Co-authored-by: Andrew Carbonetto <andrew.carbonetto@improving.com>
Co-authored-by: Andy Kwok <andy.kwok@improving.com>
5 people authored Aug 5, 2025
commit 343d990fcc17e4a22adc58aaf7d60a594fd7517d
6 changes: 4 additions & 2 deletions CONTRIBUTING.md
@@ -99,15 +99,17 @@ python cognee/cognee/tests/test_library.py

## 4. 📤 Submitting Changes

1. Push your changes:
1. Install ruff on your system
2. Run `ruff format .` and `ruff check`, then fix any reported issues
3. Push your changes:
```shell
git add .
git commit -s -m "Description of your changes"
git push origin feature/your-feature-name
```

2. Create a Pull Request:
- Go to the [**cognee** repository](https://github.com/topoteretes/cognee)
- Go to the [**cognee** repository](https://github.com/topoteretes/cognee) or [cognee community repository](https://github.com/topoteretes/cognee-community)
- Click "Compare & Pull Request" and open a PR against dev branch
- Fill in the PR template with details about your changes

1 change: 1 addition & 0 deletions Dockerfile
@@ -40,6 +40,7 @@ COPY alembic/ /app/alembic
# Then, add the rest of the project source code and install it
# Installing separately from its dependencies allows optimal layer caching
COPY ./cognee /app/cognee
COPY ./distributed /app/distributed
RUN --mount=type=cache,target=/root/.cache/uv \
uv sync --extra debug --extra api --extra postgres --extra qdrant --extra neo4j --extra llama-index --extra gemini --extra ollama --extra mistral --extra groq --extra anthropic --frozen --no-dev --no-editable

1 change: 1 addition & 0 deletions cognee-mcp/Dockerfile
@@ -51,6 +51,7 @@ RUN apt-get update && apt-get install -y \

WORKDIR /app

# Copy the virtual environment from the uv stage
COPY --from=uv /usr/local /usr/local
COPY --from=uv /app /app

6 changes: 3 additions & 3 deletions cognee-mcp/entrypoint.sh
@@ -48,15 +48,15 @@ if [ "$ENVIRONMENT" = "dev" ] || [ "$ENVIRONMENT" = "local" ]; then
if [ "$DEBUG" = "true" ]; then
echo "Waiting for the debugger to attach..."
if [ "$TRANSPORT_MODE" = "sse" ]; then
exec python -m debugpy --wait-for-client --listen 0.0.0.0:$DEBUG_PORT -m cognee --transport sse --no-migration
exec python -m debugpy --wait-for-client --listen 0.0.0.0:$DEBUG_PORT -m cognee --transport sse --host 0.0.0.0 --port $HTTP_PORT --no-migration
elif [ "$TRANSPORT_MODE" = "http" ]; then
exec python -m debugpy --wait-for-client --listen 0.0.0.0:$DEBUG_PORT -m cognee --transport http --host 0.0.0.0 --port $HTTP_PORT --no-migration
else
exec python -m debugpy --wait-for-client --listen 0.0.0.0:$DEBUG_PORT -m cognee --transport stdio --no-migration
fi
else
if [ "$TRANSPORT_MODE" = "sse" ]; then
exec cognee --transport sse --no-migration
exec cognee --transport sse --host 0.0.0.0 --port $HTTP_PORT --no-migration
elif [ "$TRANSPORT_MODE" = "http" ]; then
exec cognee --transport http --host 0.0.0.0 --port $HTTP_PORT --no-migration
else
@@ -65,7 +65,7 @@ if [ "$ENVIRONMENT" = "dev" ] || [ "$ENVIRONMENT" = "local" ]; then
fi
else
if [ "$TRANSPORT_MODE" = "sse" ]; then
exec cognee --transport sse --no-migration
exec cognee --transport sse --host 0.0.0.0 --port $HTTP_PORT --no-migration
elif [ "$TRANSPORT_MODE" = "http" ]; then
exec cognee --transport http --host 0.0.0.0 --port $HTTP_PORT --no-migration
else
3 changes: 3 additions & 0 deletions cognee-mcp/src/server.py
@@ -947,6 +947,9 @@ async def main():

args = parser.parse_args()

mcp.settings.host = args.host
mcp.settings.port = args.port

if not args.no_migration:
# Run Alembic migrations from the main cognee directory where alembic.ini is located
logger.info("Running database migrations...")
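The server.py change above copies the parsed `--host`/`--port` values onto `mcp.settings` before startup, which is what lets the entrypoint's new `--host 0.0.0.0 --port $HTTP_PORT` flags take effect in SSE mode. A minimal, hypothetical sketch of that wiring — the default values and the stand-in settings object are assumptions for illustration, not the actual cognee-mcp code:

```python
import argparse
from types import SimpleNamespace

# Stand-in for mcp.settings; in cognee-mcp this object comes from the MCP server.
mcp = SimpleNamespace(settings=SimpleNamespace(host="127.0.0.1", port=8000))


def apply_cli_settings(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument("--host", default="127.0.0.1")  # default is an assumption
    parser.add_argument("--port", type=int, default=8000)  # default is an assumption
    args = parser.parse_args(argv)
    # The essence of the server.py change: copy CLI values onto the server settings
    # so the transport binds to the requested interface and port.
    mcp.settings.host = args.host
    mcp.settings.port = args.port
    return args


apply_cli_settings(["--host", "0.0.0.0", "--port", "9000"])
print(mcp.settings.host, mcp.settings.port)  # → 0.0.0.0 9000
```

Without this assignment, flags passed by the entrypoint would be parsed but silently ignored, leaving the server on its built-in defaults.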
2,276 changes: 1,273 additions & 1,003 deletions cognee-mcp/uv.lock

Large diffs are not rendered by default.

57 changes: 57 additions & 0 deletions cognee/infrastructure/databases/graph/get_graph_engine.py
@@ -135,6 +135,63 @@ def create_graph_engine(
        graph_database_password=graph_database_password or None,
    )

    elif graph_database_provider == "neptune":
        # Import only to verify the optional langchain_aws dependency is installed.
        try:
            from langchain_aws import NeptuneAnalyticsGraph
        except ImportError:
            raise ImportError(
                "langchain_aws is not installed. Please install it with 'pip install langchain_aws'"
            )

        if not graph_database_url:
            raise EnvironmentError("Missing Neptune endpoint.")

        from .neptune_driver.adapter import NeptuneGraphDB, NEPTUNE_ENDPOINT_URL

        if not graph_database_url.startswith(NEPTUNE_ENDPOINT_URL):
            raise ValueError(
                f"Neptune endpoint must have the format {NEPTUNE_ENDPOINT_URL}<GRAPH_ID>"
            )

        graph_identifier = graph_database_url.replace(NEPTUNE_ENDPOINT_URL, "")

        return NeptuneGraphDB(
            graph_id=graph_identifier,
        )

    elif graph_database_provider == "neptune_analytics":
        # Creates a hybrid (graph & vector) DB from config. This should be updated
        # to build a single instance of the hybrid configuration (with embedder)
        # instead of creating the hybrid object twice.
        try:
            from langchain_aws import NeptuneAnalyticsGraph
        except ImportError:
            raise ImportError(
                "langchain_aws is not installed. Please install it with 'pip install langchain_aws'"
            )

        if not graph_database_url:
            raise EnvironmentError("Missing Neptune endpoint.")

        from ..hybrid.neptune_analytics.NeptuneAnalyticsAdapter import (
            NeptuneAnalyticsAdapter,
            NEPTUNE_ANALYTICS_ENDPOINT_URL,
        )

        if not graph_database_url.startswith(NEPTUNE_ANALYTICS_ENDPOINT_URL):
            raise ValueError(
                f"Neptune endpoint must have the format '{NEPTUNE_ANALYTICS_ENDPOINT_URL}<GRAPH_ID>'"
            )

        graph_identifier = graph_database_url.replace(NEPTUNE_ANALYTICS_ENDPOINT_URL, "")

        return NeptuneAnalyticsAdapter(
            graph_id=graph_identifier,
        )

    from .networkx.adapter import NetworkXAdapter

    graph_client = NetworkXAdapter(filename=graph_file_path)
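Both Neptune branches above share the same endpoint-handling pattern: require a URL, check it against a known endpoint prefix, and strip the prefix to obtain the graph identifier. A standalone sketch of that pattern — the prefix value here is a placeholder assumption, not the real `NEPTUNE_ENDPOINT_URL` constant from the adapter:

```python
# Placeholder prefix for illustration; the real constant lives in
# cognee/infrastructure/databases/graph/neptune_driver/adapter.py.
NEPTUNE_ENDPOINT_URL = "neptune-graph://"


def extract_graph_id(graph_database_url: str) -> str:
    """Validate a Neptune endpoint URL and return its graph identifier."""
    if not graph_database_url:
        raise EnvironmentError("Missing Neptune endpoint.")
    if not graph_database_url.startswith(NEPTUNE_ENDPOINT_URL):
        raise ValueError(
            f"Neptune endpoint must have the format '{NEPTUNE_ENDPOINT_URL}<GRAPH_ID>'"
        )
    # Mirrors the adapter code: strip the endpoint prefix, leaving the graph id.
    return graph_database_url.replace(NEPTUNE_ENDPOINT_URL, "")


print(extract_graph_id("neptune-graph://g-abc123"))  # → g-abc123
```

Because `startswith` is checked first, the `replace` call effectively acts as a prefix strip for well-formed URLs.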
15 changes: 15 additions & 0 deletions cognee/infrastructure/databases/graph/neptune_driver/__init__.py
@@ -0,0 +1,15 @@
"""Neptune Analytics Driver Module

This module provides the Neptune Analytics adapter and utilities for interacting
with Amazon Neptune Analytics graph databases.
"""

from .adapter import NeptuneGraphDB
from . import neptune_utils
from . import exceptions

__all__ = [
"NeptuneGraphDB",
"neptune_utils",
"exceptions",
]