Merged
2 changes: 1 addition & 1 deletion README.md
@@ -50,7 +50,7 @@ os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"
or
```
import cognee
-cognee.config.llm_api_key = "YOUR_OPENAI_API_KEY"
+cognee.config.set_llm_api_key("YOUR_OPENAI_API_KEY")
```
You can also set the variables by creating a .env file; here is our <a href="https://github.com/topoteretes/cognee/blob/main/.env.template">template</a>.
To use different LLM providers, check out our <a href="https://topoteretes.github.io/cognee">documentation</a> for more info.
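For the `.env` route, a minimal file might look like this (variable names are taken from snippets elsewhere in this PR; values are placeholders — see the linked `.env.template` for the authoritative list):

```shell
LLM_API_KEY="YOUR_OPENAI_API_KEY"
# Optional, e.g. when using a local model:
LLM_PROVIDER='ollama'
```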
40 changes: 29 additions & 11 deletions cognee/api/v1/config/config.py
@@ -5,6 +5,7 @@
from cognee.infrastructure.data.chunking.config import get_chunk_config
from cognee.infrastructure.databases.vector import get_vectordb_config
from cognee.infrastructure.databases.graph.config import get_graph_config
from cognee.infrastructure.llm.config import get_llm_config
from cognee.infrastructure.databases.relational import get_relational_config
from cognee.infrastructure.files.storage import LocalStorage

@@ -55,19 +56,36 @@ def set_graph_database_provider(graph_database_provider: str):
graph_config.graph_database_provider = graph_database_provider

     @staticmethod
-    def llm_provider(llm_provider: str):
-        graph_config = get_graph_config()
-        graph_config.llm_provider = llm_provider
+    def set_llm_provider(llm_provider: str):
+        llm_config = get_llm_config()
+        llm_config.llm_provider = llm_provider

     @staticmethod
-    def llm_endpoint(llm_endpoint: str):
-        graph_config = get_graph_config()
-        graph_config.llm_endpoint = llm_endpoint
+    def set_llm_endpoint(llm_endpoint: str):
+        llm_config = get_llm_config()
+        llm_config.llm_endpoint = llm_endpoint

     @staticmethod
-    def llm_model(llm_model: str):
-        graph_config = get_graph_config()
-        graph_config.llm_model = llm_model
+    def set_llm_model(llm_model: str):
+        llm_config = get_llm_config()
+        llm_config.llm_model = llm_model
Comment on lines +59 to +71
Contributor
💡 Codebase verification

Breaking change not properly documented, needs attention

The method renaming from llm_provider, llm_endpoint, and llm_model to their set_ prefixed versions is a breaking change that needs documentation. While the configuration is well documented in docs/configuration.md and docs/local_models.md, these docs still show the old method names in their examples.

Please update:

  • docs/configuration.md: Update configuration examples to use new set_ prefixed methods
  • docs/local_models.md: Update configuration examples to use new set_ prefixed methods
🔗 Analysis chain

LGTM: Method renaming improves clarity.

The renaming of methods to include 'set_' prefix better indicates their mutating nature and aligns with the class's naming conventions.

Let's verify if this breaking change is documented:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

```bash
#!/bin/bash
# Search for documentation of the breaking changes
rg -i "breaking.?chang|upgrade.?guide|migration.?guide" README.md CHANGELOG.md docs/
```

Length of output: 139


Script:

```bash
#!/bin/bash
# Check if there are any documentation files in the repository
fd -e md -e rst -e txt

# Also check for references to these methods in any file
rg -i "llm_(provider|endpoint|model)"
```

Length of output: 5406


+    @staticmethod
+    def set_llm_api_key(llm_api_key: str):
+        llm_config = get_llm_config()
+        llm_config.llm_api_key = llm_api_key
+
+    @staticmethod
+    def set_llm_config(config_dict: dict):
+        """
+        Updates the llm config with values from config_dict.
+        """
+        llm_config = get_llm_config()
+        for key, value in config_dict.items():
+            if hasattr(llm_config, key):
+                object.__setattr__(llm_config, key, value)
+            else:
+                raise AttributeError(f"'{key}' is not a valid attribute of the config.")
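The added `set_llm_config` validates each key before assigning it, so typos fail loudly instead of silently creating new attributes. The pattern can be sketched in isolation (the `LLMConfig` class below is a minimal stand-in for the object returned by `get_llm_config()`; its field names mirror this PR, and the defaults are placeholders):

```python
class LLMConfig:
    """Minimal stand-in for the object returned by get_llm_config()."""
    def __init__(self):
        self.llm_provider = "openai"
        self.llm_endpoint = None
        self.llm_model = None
        self.llm_api_key = None

def set_llm_config(llm_config, config_dict: dict):
    """Bulk-update the config, rejecting unknown keys up front."""
    for key, value in config_dict.items():
        if hasattr(llm_config, key):
            # Bypass any frozen/validated __setattr__ on the config object.
            object.__setattr__(llm_config, key, value)
        else:
            raise AttributeError(f"'{key}' is not a valid attribute of the config.")

cfg = LLMConfig()
set_llm_config(cfg, {"llm_provider": "ollama", "llm_model": "mistral:instruct"})
print(cfg.llm_provider, cfg.llm_model)  # → ollama mistral:instruct
```

`object.__setattr__` is used rather than plain assignment so the update works even if the real config object overrides `__setattr__` (e.g. a frozen pydantic model).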
Comment on lines +79 to +88
Contributor
🛠️ Refactor suggestion

Consider reusing set_llm_config for individual setters.

The bulk update logic in set_llm_config could be reused by individual setters to maintain consistency in validation and error handling.

Example refactor for set_llm_api_key:

```diff
 @staticmethod
 def set_llm_api_key(llm_api_key: str):
-    llm_config = get_llm_config()
-    llm_config.llm_api_key = llm_api_key
+    config.set_llm_config({"llm_api_key": llm_api_key})
```

Committable suggestion was skipped due to low confidence.


@staticmethod
def set_chunk_strategy(chunk_strategy: object):
@@ -137,5 +155,5 @@ def set_graphistry_config(graphistry_config: dict[str, str]):
if "username" not in graphistry_config or "password" not in graphistry_config:
raise ValueError("graphistry_config dictionary must contain 'username' and 'password' keys.")

-        base_config.graphistry_username = graphistry_config.username
-        base_config.graphistry_password = graphistry_config.password
+        base_config.graphistry_username = graphistry_config.get("username")
+        base_config.graphistry_password = graphistry_config.get("password")
14 changes: 7 additions & 7 deletions docs/configuration.md
@@ -14,9 +14,11 @@ Check available configuration options:
from cognee.infrastructure.databases.vector import get_vectordb_config
from cognee.infrastructure.databases.graph.config import get_graph_config
from cognee.infrastructure.databases.relational import get_relational_config
from cognee.infrastructure.llm.config import get_llm_config
print(get_vectordb_config().to_dict())
print(get_graph_config().to_dict())
print(get_relational_config().to_dict())
print(get_llm_config().to_dict())

```

@@ -29,8 +31,7 @@ GRAPH_DATABASE_PROVIDER = 'lancedb'
Otherwise, you can set the configuration yourself:

```python

-cognee.config.llm_provider = 'ollama'
+cognee.config.set_llm_provider('ollama')
```

## 🚀 Getting Started with Local Models
@@ -52,15 +53,14 @@ LLM_PROVIDER = 'ollama'
Otherwise, you can set the configuration for the model:

```bash
-cognee.config.llm_provider = 'ollama'
+cognee.config.set_llm_provider('ollama')

```
You can also set the HOST and model name:

```bash

-cognee.config.llm_endpoint = "http://localhost:11434/v1"
-cognee.config.llm_model = "mistral:instruct"
+cognee.config.set_llm_endpoint("http://localhost:11434/v1")
+cognee.config.set_llm_model("mistral:instruct")
```


@@ -73,7 +73,7 @@ LLM_PROVIDER = 'custom'
Otherwise, you can set the configuration for the model:

```bash
-cognee.config.llm_provider = 'custom'
+cognee.config.set_llm_provider('custom')
Contributor
⚠️ Potential issue

Complete the custom provider configuration examples.

The documentation shows environment variables for endpoint, model, and API key, but lacks the corresponding code examples using the new method calls. Consider adding complete programmatic configuration examples:

```diff
 cognee.config.set_llm_provider('custom')
+# Configure custom provider settings
+cognee.config.set_llm_model("mistralai/Mixtral-8x7B-Instruct-v0.1")
+cognee.config.set_llm_endpoint("https://api.endpoints.anyscale.com/v1")
+cognee.config.set_llm_api_key("your_api_key")
```

Committable suggestion was skipped due to low confidence.


```
You can also set the HOST and model name: