fix: Fix cognee graphistry and llm configuration through code #174
```diff
@@ -5,6 +5,7 @@
 from cognee.infrastructure.data.chunking.config import get_chunk_config
 from cognee.infrastructure.databases.vector import get_vectordb_config
 from cognee.infrastructure.databases.graph.config import get_graph_config
+from cognee.infrastructure.llm.config import get_llm_config
 from cognee.infrastructure.databases.relational import get_relational_config
 from cognee.infrastructure.files.storage import LocalStorage
```
```diff
@@ -55,19 +56,36 @@ def set_graph_database_provider(graph_database_provider: str):
         graph_config.graph_database_provider = graph_database_provider
 
     @staticmethod
-    def llm_provider(llm_provider: str):
-        graph_config = get_graph_config()
-        graph_config.llm_provider = llm_provider
+    def set_llm_provider(llm_provider: str):
+        llm_config = get_llm_config()
+        llm_config.llm_provider = llm_provider
 
     @staticmethod
-    def llm_endpoint(llm_endpoint: str):
-        graph_config = get_graph_config()
-        graph_config.llm_endpoint = llm_endpoint
+    def set_llm_endpoint(llm_endpoint: str):
+        llm_config = get_llm_config()
+        llm_config.llm_endpoint = llm_endpoint
 
     @staticmethod
-    def llm_model(llm_model: str):
-        graph_config = get_graph_config()
-        graph_config.llm_model = llm_model
+    def set_llm_model(llm_model: str):
+        llm_config = get_llm_config()
+        llm_config.llm_model = llm_model
 
+    @staticmethod
+    def set_llm_api_key(llm_api_key: str):
+        llm_config = get_llm_config()
+        llm_config.llm_api_key = llm_api_key
+
+    @staticmethod
+    def set_llm_config(config_dict: dict):
+        """
+        Updates the llm config with values from config_dict.
+        """
+        llm_config = get_llm_config()
+        for key, value in config_dict.items():
+            if hasattr(llm_config, key):
+                object.__setattr__(llm_config, key, value)
+            else:
+                raise AttributeError(f"'{key}' is not a valid attribute of the config.")
```
> **Reviewer comment (Contributor, on lines +79 to +88)** — 🛠️ Refactor suggestion
>
> Consider reusing `set_llm_config` for the individual setters, so the attribute-validation logic of the bulk update is not duplicated. Example refactor for `set_llm_api_key`:
>
> ```diff
>  @staticmethod
>  def set_llm_api_key(llm_api_key: str):
> -    llm_config = get_llm_config()
> -    llm_config.llm_api_key = llm_api_key
> +    config.set_llm_config({"llm_api_key": llm_api_key})
> ```

```diff
     @staticmethod
     def set_chunk_strategy(chunk_strategy: object):
```
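The validation loop in the new `set_llm_config` can be sketched in isolation. In this sketch, `_LLMConfig` is a stand-in dataclass for illustration, not cognee's actual config class; only the update loop mirrors the PR:

```python
from dataclasses import dataclass


@dataclass
class _LLMConfig:
    """Stand-in for cognee's LLM config object (illustrative only)."""
    llm_provider: str = "openai"
    llm_endpoint: str = ""
    llm_model: str = ""
    llm_api_key: str = ""


def set_llm_config(llm_config: _LLMConfig, config_dict: dict) -> None:
    # Same pattern as the PR: accept only keys the config already defines,
    # and reject anything else with an AttributeError.
    for key, value in config_dict.items():
        if hasattr(llm_config, key):
            object.__setattr__(llm_config, key, value)
        else:
            raise AttributeError(f"'{key}' is not a valid attribute of the config.")


config = _LLMConfig()
set_llm_config(config, {"llm_provider": "ollama", "llm_model": "mistral:instruct"})
print(config.llm_provider)  # ollama

try:
    set_llm_config(config, {"llm_providr": "x"})  # typo in the key
except AttributeError as err:
    print(err)  # 'llm_providr' is not a valid attribute of the config.
```

Note that `object.__setattr__` bypasses any custom `__setattr__` the config class may define (e.g. a frozen or validated model), which is presumably why the PR uses it instead of plain attribute assignment.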
```diff
@@ -137,5 +155,5 @@ def set_graphistry_config(graphistry_config: dict[str, str]):
         if "username" not in graphistry_config or "password" not in graphistry_config:
             raise ValueError("graphistry_config dictionary must contain 'username' and 'password' keys.")
 
-        base_config.graphistry_username = graphistry_config.username
-        base_config.graphistry_password = graphistry_config.password
+        base_config.graphistry_username = graphistry_config.get("username")
+        base_config.graphistry_password = graphistry_config.get("password")
```
---

````diff
@@ -14,9 +14,11 @@ Check available configuration options:
 from cognee.infrastructure.databases.vector import get_vectordb_config
 from cognee.infrastructure.databases.graph.config import get_graph_config
 from cognee.infrastructure.databases.relational import get_relational_config
+from cognee.infrastructure.llm.config import get_llm_config
 print(get_vectordb_config().to_dict())
 print(get_graph_config().to_dict())
 print(get_relational_config().to_dict())
+print(get_llm_config().to_dict())
 ```
````
````diff
@@ -29,8 +31,7 @@ GRAPH_DATABASE_PROVIDER = 'lancedb'
 Otherwise, you can set the configuration yourself:
 
 ```python
-cognee.config.llm_provider = 'ollama'
+cognee.config.set_llm_provider('ollama')
 ```
 
 ## 🚀 Getting Started with Local Models
````
````diff
@@ -52,15 +53,14 @@ LLM_PROVIDER = 'ollama'
 Otherwise, you can set the configuration for the model:
 
 ```bash
-cognee.config.llm_provider = 'ollama'
+cognee.config.set_llm_provider('ollama')
 ```
 You can also set the HOST and model name:
 
 ```bash
-cognee.config.llm_endpoint = "http://localhost:11434/v1"
-cognee.config.llm_model = "mistral:instruct"
+cognee.config.set_llm_endpoint("http://localhost:11434/v1")
+cognee.config.set_llm_model("mistral:instruct")
 ```
````
````diff
@@ -73,7 +73,7 @@ LLM_PROVIDER = 'custom'
 Otherwise, you can set the configuration for the model:
 
 ```bash
-cognee.config.llm_provider = 'custom'
+cognee.config.set_llm_provider('custom')
````
> **Reviewer comment (Contributor)** — Complete the custom provider configuration examples.
>
> The documentation shows environment variables for the endpoint, model, and API key, but lacks the corresponding code examples using the new method calls. Consider adding complete programmatic configuration examples:
>
> ```diff
>  cognee.config.set_llm_provider('custom')
> +# Configure custom provider settings
> +cognee.config.set_llm_model("mistralai/Mixtral-8x7B-Instruct-v0.1")
> +cognee.config.set_llm_endpoint("https://api.endpoints.anyscale.com/v1")
> +cognee.config.set_llm_api_key("your_api_key")
> ```

````diff
 ```
 You can also set the HOST and model name:
````
> **Reviewer comment** — 💡 Codebase verification: breaking change not properly documented, needs attention.
>
> The renaming of `llm_provider`, `llm_endpoint`, and `llm_model` to their `set_`-prefixed versions is a breaking change that needs documentation. While the configuration is well documented in `docs/configuration.md` and `docs/local_models.md`, these docs still show the old method names in their examples. Please update:
>
> - `docs/configuration.md`: update configuration examples to use the new `set_`-prefixed methods
> - `docs/local_models.md`: update configuration examples to use the new `set_`-prefixed methods
>
> 🔗 Analysis chain — LGTM: the method renaming improves clarity. The `set_` prefix better indicates each method's mutating nature and aligns with the class's naming conventions.
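The reviewer's point about explicit setters is worth illustrating generically. The old setters quietly wrote `llm_*` values onto the graph config, a bug in the same family as a typo'd attribute assignment: in Python, assigning to a misspelled (or wrong-object) attribute silently creates a new attribute, while calling a misspelled method fails loudly. A sketch with a hypothetical `Config` class, unrelated to cognee's actual classes:

```python
class Config:
    def __init__(self):
        self.llm_provider = "openai"

    def set_llm_provider(self, provider: str) -> None:
        self.llm_provider = provider


cfg = Config()

# A typo'd attribute assignment silently creates a new, unused attribute:
cfg.llm_providr = "ollama"
print(cfg.llm_provider)  # still "openai" -- the typo went unnoticed

# A typo'd method call fails immediately:
try:
    cfg.set_llm_providr("ollama")
except AttributeError as err:
    print(type(err).__name__)  # AttributeError
```

This is why the `set_`-prefixed methods (and the key validation inside `set_llm_config`) surface mistakes at the call site instead of producing a config that looks updated but is not.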