
Conversation

NILAY1556
Contributor

Description

mem0 vLLM support

Fixes #2918

Type of change

  • New feature (non-breaking change which adds functionality)
  • Documentation update

How Has This Been Tested?

  • Test Script

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • Any dependent changes have been merged and published in downstream modules
  • I have checked my code and corrected any misspellings

- Add VllmLLM provider with OpenAI-compatible API
- Support for environment variables (VLLM_BASE_URL, VLLM_API_KEY)
- Complete tool calling support
- Comprehensive test coverage
- Documentation and usage examples
- Production-ready implementation following mem0 patterns
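The environment-variable fallback described above (VLLM_BASE_URL, VLLM_API_KEY) can be sketched as follows. The function name, the localhost default URL, and the placeholder key are illustrative assumptions for this sketch, not the merged mem0 implementation:

```python
import os

def resolve_vllm_settings(base_url=None, api_key=None):
    """Hypothetical sketch: prefer explicit arguments, fall back to the
    VLLM_BASE_URL / VLLM_API_KEY environment variables named in this PR.
    The localhost default mirrors vLLM's usual OpenAI-compatible server
    port, but is an assumption here."""
    return {
        "base_url": base_url or os.getenv("VLLM_BASE_URL", "http://localhost:8000/v1"),
        "api_key": api_key or os.getenv("VLLM_API_KEY", "vllm-placeholder-key"),
    }

# Explicit arguments take precedence over the environment.
settings = resolve_vllm_settings(base_url="http://my-vllm-host:8000/v1")
print(settings["base_url"])  # http://my-vllm-host:8000/v1
```

Because the provider is OpenAI-compatible, the resolved `base_url` and `api_key` can be passed straight to an OpenAI-style client.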
@CLAassistant

CLAassistant commented Jun 19, 2025

CLA assistant check
All committers have signed the CLA.

@deshraj deshraj requested a review from Dev-Khant June 19, 2025 02:49
@deshraj
Collaborator

deshraj commented Jun 19, 2025

Hi @NILAY1556, thanks for adding support for this. Can you please sign the CLA and resolve conflicts so that we can review the PR?

@NILAY1556
Contributor Author

Yes, sure. I have resolved the conflicts and signed the CLA.
Could you please review it once, and let me know if there are any other problems with my integration?


@Dev-Khant Dev-Khant left a comment


title: vLLM
---

vLLM is a high-performance inference engine for large language models that provides significant performance improvements for local inference.

Can you please add a link to vLLM docs here?


try:
    from openai import OpenAI
except ImportError:

We won't need this as openai gets installed with default installation.

@NILAY1556 NILAY1556 requested a review from Dev-Khant June 19, 2025 15:30
@@ -90,45 +92,39 @@ Config is essential for:
Here's a comprehensive list of all parameters that can be used across different LLMs:

<Tabs>
<Tab title="Python">

Why are we changing this?

@NILAY1556 NILAY1556 requested a review from Dev-Khant June 20, 2025 17:15
@Dev-Khant
Contributor

Hey @NILAY1556, Thanks for the contribution!

@Dev-Khant Dev-Khant merged commit 89499ae into mem0ai:main Jun 23, 2025
1 of 2 checks passed
merlinfrombelgium pushed a commit to merlinfrombelgium/mem0 that referenced this pull request Jul 4, 2025
Gradonhf pushed a commit to Gradonhf/mem0 that referenced this pull request Jul 11, 2025
brockshanson pushed a commit to brockshanson/mem0 that referenced this pull request Jul 30, 2025
thestumonkey pushed a commit to thestumonkey/mem0 that referenced this pull request Sep 7, 2025
Development

Successfully merging this pull request may close these issues.

support vllm