
Conversation

@e7217 (Contributor) commented Nov 6, 2024

Description:
This PR updates the documentation on configuring VLLM with a LoRA adapter. The update gives users clear instructions for enabling the LoRA adapter when using VLLM.

- before

  ```python
  VLLM(..., enable_lora=True)
  ```

- after

  ```python
  VLLM(
      ...,
      vllm_kwargs={
          "enable_lora": True,
      },
  )
  ```

This change clarifies that users should pass enable_lora through vllm_kwargs, which forwards the option to the underlying vllm engine.
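
For reference, a minimal sketch of the corrected usage, assuming langchain_community and vllm are installed; the model name is illustrative and not part of this PR:

```python
from langchain_community.llms import VLLM

# enable_lora is not a direct VLLM constructor argument; it must be
# forwarded to the underlying vllm engine via vllm_kwargs.
llm = VLLM(
    model="meta-llama/Llama-2-7b-hf",  # hypothetical base model
    vllm_kwargs={
        "enable_lora": True,
    },
)

print(llm.invoke("What does LoRA stand for?"))
```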


@e7217 (Contributor, Author) commented Nov 10, 2024

@ccurme Could you please check this PR? Thank you.

dosubot added the lgtm label Nov 11, 2024
ccurme merged commit 9484cc0 into langchain-ai:master Nov 11, 2024
12 checks passed