Added Mistral support as LLM provider using litellm #1449
Conversation
Please make sure all the checkboxes are checked:
Walkthrough: Consolidates prompt utilities by removing the LLMGateway prompt helpers and switching call sites to cognee.infrastructure.llm.prompts. Overhauls the BAML structured-output stack to a single AcreateStructuredOutput path with a generic ResponseModel. Adds a Mistral provider across settings and client selection. Introduces lexical (Jaccard) chunk retrieval and wires a new CHUNKS_LEXICAL search type. Minor workflow/script and doc updates.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor Caller
    participant LLMGateway
    participant Config as LLM Config
    participant BAML as BAML Client
    participant Lite as LiteLLM Instructor
    Caller->>LLMGateway: acreate_structured_output(text, system_prompt, model)
    LLMGateway->>Config: get_llm_config()
    alt structured_output_framework == BAML
        LLMGateway->>BAML: AcreateStructuredOutput(text, system_prompt, options)
        BAML-->>LLMGateway: ResponseModel
    else instructor (LiteLLM)
        LLMGateway->>Lite: acreate_structured_output(text, system_prompt, model)
        Lite-->>LLMGateway: response
    end
    LLMGateway-->>Caller: response
```
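As a rough illustration of the routing in the diagram above, here is a minimal sketch. Function and config names follow the diagram, not the actual cognee source; the stub implementations are placeholders.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class LLMConfig:
    structured_output_framework: str  # "BAML" or "instructor"


async def baml_acreate_structured_output(text: str, system_prompt: str) -> str:
    # Stand-in for the BAML client's AcreateStructuredOutput
    return f"baml:{text}"


async def litellm_acreate_structured_output(text: str, system_prompt: str, model: str) -> str:
    # Stand-in for the LiteLLM instructor path
    return f"litellm:{model}:{text}"


async def acreate_structured_output(
    text: str, system_prompt: str, model: str, config: LLMConfig
) -> str:
    # Dispatch on the configured structured-output framework
    if config.structured_output_framework == "BAML":
        return await baml_acreate_structured_output(text, system_prompt)
    return await litellm_acreate_structured_output(text, system_prompt, model)
```

The point of consolidating on a single entry point is that callers never need to know which backend is active; only the config decides.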
```mermaid
sequenceDiagram
    autonumber
    actor User
    participant Search as get_search_type_tools
    participant Retriever as JaccardChunksRetriever
    participant Graph as GraphEngine
    User->>Search: request CHUNKS_LEXICAL tools(top_k)
    Search-->>User: { get_context, get_completion }
    User->>Retriever: initialize()
    Retriever->>Graph: load DocumentChunks
    Graph-->>Retriever: chunks
    Retriever-->>User: ready
    User->>Retriever: get_context(query)
    Retriever->>Retriever: tokenize & score (Jaccard)
    Retriever-->>User: top_k chunks (±scores)
```
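The tokenize-and-score step can be sketched with plain set operations. This is a minimal illustration of Jaccard scoring, not the actual JaccardChunksRetriever code; the whitespace tokenizer is an assumption.

```python
def jaccard_score(query_tokens: set, chunk_tokens: set) -> float:
    """Jaccard similarity: |A ∩ B| / |A ∪ B|."""
    if not query_tokens and not chunk_tokens:
        return 0.0
    return len(query_tokens & chunk_tokens) / len(query_tokens | chunk_tokens)


def get_context(query: str, chunks: list, top_k: int = 3) -> list:
    # Tokenize naively on whitespace, score every chunk, return the best top_k
    q = set(query.lower().split())
    scored = [(jaccard_score(q, set(c.lower().split())), c) for c in chunks]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[:top_k]
```

Because scoring is pure set arithmetic over already-loaded chunks, no embedding model or vector index is needed, which is the appeal of a lexical retriever.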
```mermaid
sequenceDiagram
    autonumber
    actor Caller
    participant Factory as get_llm_client
    participant Providers as LLMProvider
    note over Factory,Providers: raise_api_key_error flag added
    Caller->>Factory: get_llm_client(raise_api_key_error=?)
    alt provider == OPENAI/OLLAMA/CUSTOM/GEMINI
        Factory->>Providers: validate api key (conditional on flag)
        Providers-->>Factory: adapter
    else provider == MISTRAL
        Factory->>Providers: validate api key
        Providers-->>Factory: MistralAdapter
    end
    Factory-->>Caller: adapter
```
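A minimal sketch of the provider selection with the raise_api_key_error flag described in the diagram. Class and function names here are illustrative assumptions, not the actual cognee factory code.

```python
class MissingApiKeyError(Exception):
    pass


class Adapter:
    def __init__(self, provider: str, api_key):
        self.provider = provider
        self.api_key = api_key


def get_llm_client(provider: str, api_key=None, raise_api_key_error: bool = True) -> Adapter:
    # Per the diagram: most providers validate the key conditionally on the
    # flag, while the Mistral branch validates it unconditionally.
    if provider == "MISTRAL":
        if not api_key:
            raise MissingApiKeyError("Mistral API key is not set")
        return Adapter("MISTRAL", api_key)
    if provider in {"OPENAI", "OLLAMA", "CUSTOM", "GEMINI"}:
        if raise_api_key_error and not api_key:
            raise MissingApiKeyError(f"{provider} API key is not set")
        return Adapter(provider, api_key)
    raise ValueError(f"Unknown provider: {provider}")
```

Passing raise_api_key_error=False lets callers construct an adapter lazily (for example, to render settings pages) without a key present.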
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
Pre-merge checks and finishing touches:
❌ Failed checks (2 warnings)
✅ Passed checks (1 passed)
Hello @AniLeo-01, thank you for submitting a PR! We will respond as soon as possible.
Hey @AniLeo-01, thank you very much for this contribution re: issue #1426 :) We'll review it shortly!
Review thread on cognee/infrastructure/llm/structured_output_framework/litellm_instructor/llm/mistral/adapter.py (outdated, resolved)
siillee left a comment:
Hey @AniLeo-01, thank you for the PR, it looks great!
I added some very small comments that should make this PR perfect. The changes shouldn't be big.
Thanks a lot @siillee ❤️
AniLeo-01 left a comment:
Hi @siillee, I've added the changes accordingly. Please check, thanks!!
Review thread on cognee/infrastructure/llm/structured_output_framework/litellm_instructor/llm/mistral/adapter.py (outdated, resolved)
… completion method
siillee left a comment:
Looks good @AniLeo-01!
@AniLeo-01 one small thing, can you resolve the uv.lock conflict and we will merge it tomorrow?
@Vasilije1990 sure, will do it!
Hi @Vasilije1990, I've resolved the uv.lock conflict and updated the branch against dev. Could you please take a look?
@AniLeo-01 I still see it as an issue.
I did merge the changes yesterday tho, lemme check if I can redo it, thanks!
Thanks for checking, but I still see it. Do you need help with this one?
@Vasilije1990 Yeah, so what I'm doing is deleting the lockfile and then pulling the changes from dev. Shouldn't that solve the issue? Or should I remove the lockfile and create it again using
- Added "mistralai==1.9.10" to the dependencies in pyproject.toml.
- Updated sdist entries in uv.lock to remove unnecessary upload-time fields for various packages.
- Ensured consistency in package specifications across the project files.

- Deleted the uv.lock file to streamline dependency management.
- This change may require regeneration of the lock file in future dependency updates.

- Changed "mistralai==1.9.10" to "mistralai>=1.9.10" for more flexible versioning.
- Removed "mistralai" from the optional dependencies under "mistral".
- Expanded the "docs" dependency to include "pdf" support.
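The version-pin relaxation described above corresponds to a pyproject.toml entry roughly like this (a hypothetical excerpt, not the actual file):

```toml
[project]
dependencies = [
    # Lower bound instead of an exact pin, so the resolver may pick newer patch releases
    "mistralai>=1.9.10",
]
```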
@AniLeo-01 I handled it in a new PR and will merge it after tests are done. This way we guarantee everything was checked properly. We appreciate the contribution!
Description
Added Mistral API support to the LLM Providers
Type of Change
Changes Made
Testing
By running `python cognee/cognee/tests/test_library.py`

Screenshots/Videos (if applicable)
None
Pre-submission Checklist
Related Issues
#1426
Additional Notes
None
DCO Affirmation
I affirm that all code in every commit of this pull request conforms to the terms of the Topoteretes Developer Certificate of Origin.