Feature/vllm support #2981
Conversation
- Add VllmLLM provider with OpenAI-compatible API
- Support for environment variables (VLLM_BASE_URL, VLLM_API_KEY)
- Complete tool calling support
- Comprehensive test coverage
- Documentation and usage examples
- Production-ready implementation following mem0 patterns
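Based on the feature list above, a minimal configuration sketch might look like the following. The provider name `"vllm"`, the config keys, and the model name are assumptions inferred from the PR description (only the environment variable names `VLLM_BASE_URL` and `VLLM_API_KEY` are stated explicitly), so treat this as illustrative rather than the final API:

```python
import os

# Environment variables named in the PR description; the values here are
# placeholders for a locally running vLLM server.
os.environ.setdefault("VLLM_BASE_URL", "http://localhost:8000/v1")
os.environ.setdefault("VLLM_API_KEY", "vllm-api-key")

# Hypothetical mem0 LLM config following the usual mem0 provider pattern.
# The "vllm" provider key and the nested config shape are assumptions.
config = {
    "llm": {
        "provider": "vllm",
        "config": {
            "model": "Qwen/Qwen2.5-7B-Instruct",  # illustrative model name
            "temperature": 0.1,
        },
    }
}
```

This dict would then be passed to mem0's usual entry point (e.g. `Memory.from_config(config)`), mirroring how the other LLM providers are configured.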
Hi @NILAY1556, thanks for adding support for this. Can you please sign the CLA and resolve the conflicts so that we can review the PR?
Yes, sure. I resolved the conflicts and signed the CLA.
Can you please add vLLM here as well: https://github.com/mem0ai/mem0/blob/main/docs/docs.json#L120?
docs/components/llms/models/vllm.mdx
Outdated
title: vLLM
---

vLLM is a high-performance inference engine for large language models that provides significant performance improvements for local inference.
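Since vLLM exposes an OpenAI-compatible HTTP API, a chat request against a locally served model is just a standard chat-completions payload. The sketch below builds such a payload; the server URL and model name are assumptions for a server started with something like `vllm serve <model>`:

```python
import json

# Illustrative chat-completions payload for vLLM's OpenAI-compatible endpoint.
# The model name is a placeholder; use whichever model the server was started with.
payload = {
    "model": "Qwen/Qwen2.5-7B-Instruct",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.1,
}

# Serialized request body; this would be POSTed to
# http://localhost:8000/v1/chat/completions on an assumed local server.
body = json.dumps(payload)
```

Because the wire format matches OpenAI's, the existing `openai` Python client can also be pointed at the server by setting its `base_url`, which is what makes an OpenAI-compatible provider implementation straightforward.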
Can you please add a link to vLLM docs here?
mem0/llms/vllm.py
Outdated
try:
    from openai import OpenAI
except ImportError:
We won't need this as openai gets installed with default installation.
docs/components/llms/config.mdx
Outdated
@@ -90,45 +92,39 @@ Config is essential for:

Here's a comprehensive list of all parameters that can be used across different LLMs:

<Tabs>
  <Tab title="Python">
Why are we changing this?
Hey @NILAY1556, thanks for the contribution!
Description
Adds vLLM LLM provider support to mem0.
Fixes #2918
Type of change
How Has This Been Tested?
Checklist: