This repository demonstrates how to use MCP (Model Context Protocol) with LangChain via the langchain_mcp_adapters.client library, which provides a streamlined way to integrate MCP servers into LangChain applications. It contains examples and configurations for integrating Linear MCP servers with LangChain, so you can manage Linear projects through natural-language interactions.
- Linear MCP Integration: Pre-configured examples for Linear project management
- Multiple Example Types: Basic, advanced, and dedicated Linear examples
- Ready-to-Use Configuration: Your Linear MCP server credentials are already configured
- Interactive Mode: Command shortcuts and natural language queries
- 23 Linear Tools: Full access to Linear's project management capabilities
The langchain_mcp_adapters.client library provides:
- MultiServerMCPClient: Connect to multiple MCP servers simultaneously
- Automatic Tool Loading: Seamlessly load tools from MCP servers into LangChain
- Configuration Management: JSON-based server configuration
- Error Handling: Built-in retry logic and error management
- Async Support: Full async/await support for better performance
Install the dependencies:

```bash
pip install -r requirements.txt
```

Set your OpenAI API key:

```bash
export OPENAI_API_KEY="your-api-key-here"
```

The repository contains the following files:

- `mcp_client_example.py` - Basic example using MultiServerMCPClient with Linear
- `advanced_mcp_client.py` - Advanced example with configuration management
- `linear_mcp_example.py` - Dedicated Linear MCP integration example
- `mcp_config.json` - Linear MCP server configuration
- `MCP_CONFIGURATION_GUIDE.md` - Comprehensive guide for configuring your MCP servers
- `requirements.txt` - Required dependencies
The configuration is already set up for your Linear MCP server. You can run:
```bash
python linear_mcp_example.py
```

This will:
- Connect to your Linear MCP server
- Load Linear tools into LangChain
- Create a ReAct agent for Linear project management
- Run Linear-specific test queries
- Enter interactive Linear management mode
Run the basic MCP client example:
```bash
python mcp_client_example.py
```

This will:
- Connect to your Linear MCP server
- Load Linear tools into LangChain
- Create a ReAct agent
- Run test queries and enter interactive mode
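Interactive mode boils down to a read-eval loop over the agent. A minimal sketch of the idea (the `interactive_loop` helper and its parameters are illustrative, not the script's actual code; `agent` is any object exposing `ainvoke`, such as one returned by `create_react_agent`):

```python
import asyncio


async def interactive_loop(agent, prompt_fn=input, print_fn=print):
    """Read natural-language queries and forward them to the agent
    until the user types 'quit' or 'exit'."""
    while True:
        query = prompt_fn("mcp> ").strip()
        if query.lower() in {"quit", "exit"}:
            break
        # LangGraph agents return a dict whose "messages" list ends with the reply.
        response = await agent.ainvoke(
            {"messages": [{"role": "user", "content": query}]}
        )
        print_fn(response["messages"][-1])
```

Injecting `prompt_fn` and `print_fn` keeps the loop testable without a real terminal or a live MCP server.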
Run the advanced example with configuration management:
```bash
python advanced_mcp_client.py
```

Features:
- JSON-based server configuration
- Error handling and retry logic
- Tool information display
- Complex Linear workflow examples
- Enhanced interactive mode
Your Linear MCP server provides access to Linear's project management features:
- Issue Management: Create, update, and query Linear issues
- Project Tracking: Monitor project progress and status
- Team Collaboration: Access team information and assignments
- Workflow Automation: Automate Linear workflows with AI
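These capabilities are exercised through plain-English queries sent to the agent. A sketch of the payload shape LangGraph ReAct agents accept (`build_agent_request` is a hypothetical helper; plain role/content dicts work in place of `HumanMessage` objects):

```python
def build_agent_request(query: str) -> dict:
    """Wrap a natural-language Linear query in the message payload
    that LangGraph ReAct agents expect."""
    return {"messages": [{"role": "user", "content": query}]}


# A typical Linear management query:
request = build_agent_request(
    "Create an issue titled 'Fix login bug' and assign it to the Backend team"
)
# The agent is then invoked with: await agent.ainvoke(request)
```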
The MultiServerMCPClient allows you to connect to multiple MCP servers:
```python
from langchain_mcp_adapters.client import MultiServerMCPClient

server_configs = {
    "linear": {
        "command": "npx",
        "args": ["-y", "@natomalabs/natoma-mcp-gateway@latest", "--enterprise"],
        "transport": "stdio",
        "env": {
            "NATOMA_MCP_API_KEY": "your-api-key",
            "NATOMA_MCP_SERVER_INSTALLATION_ID": "your-installation-id"
        }
    }
}

async with MultiServerMCPClient(server_configs) as client:
    tools = await client.get_tools()
    # Use Linear tools with a LangChain agent
```

Server configurations can be stored in JSON files:
```json
{
  "math_server": {
    "command": "python",
    "args": ["math_server.py"],
    "transport": "stdio",
    "timeout": 30,
    "retry_attempts": 3
  }
}
```

The client includes built-in error handling:
- Connection retries
- Timeout management
- Graceful error recovery
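The retry behavior can be approximated with a small wrapper if you want explicit control over it. A sketch, assuming only that the client exposes `get_tools()` (the `get_tools_with_retry` helper and its backoff policy are illustrative, not the library's actual API):

```python
import asyncio


async def get_tools_with_retry(client, attempts: int = 3, delay: float = 1.0):
    """Call client.get_tools(), retrying with linear backoff on failure."""
    for attempt in range(1, attempts + 1):
        try:
            return await client.get_tools()
        except Exception:
            if attempt == attempts:
                raise  # out of attempts: propagate the last error
            await asyncio.sleep(delay * attempt)  # back off before retrying
```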
Your MCP servers must meet these requirements to work with LangChain:
- MCP Protocol Compliance: Must implement the MCP specification
- Tool Definition: Each tool must have:
- Type hints for all parameters and return types
- Clear docstrings describing functionality
- Proper error handling
- JSON-serializable return values
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Your Server Name")

@mcp.tool()
def your_tool(param1: str, param2: int) -> str:
    """Description of what your tool does."""
    # Your implementation
    return f"Result: {param1} and {param2}"

if __name__ == "__main__":
    mcp.run(transport="stdio")
```

See MCP_CONFIGURATION_GUIDE.md for detailed requirements and examples.
A reusable client-manager pattern:

```python
import json

from langchain_core.messages import HumanMessage
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


class MCPClientManager:
    def __init__(self):
        self.client = None
        self.agent = None

    async def connect_to_servers(self, configs, model):
        # model: a LangChain chat model, e.g. ChatOpenAI()
        self.client = MultiServerMCPClient(configs)
        await self.client.__aenter__()
        tools = await self.client.get_tools()
        self.agent = create_react_agent(model, tools)

    async def run_query(self, query):
        return await self.agent.ainvoke({"messages": [HumanMessage(content=query)]})
```

Loading server configuration from JSON:

```python
async def load_server_config(config_path: str):
    with open(config_path, "r") as f:
        return json.load(f)
```

Listing the available tools and their schemas:

```python
async def get_available_tools(client):
    tools = await client.get_tools()
    return [
        {
            "name": tool.name,
            "description": tool.description,
            "schema": str(tool.args_schema),
        }
        for tool in tools
    ]
```

Example multi-step queries:

```python
query = """
Create a file called 'data.txt' with the content 'Sample data: 42, 17, 89, 156'.
Then read the file, extract the numbers, calculate their sum, and create a JSON file with the results.
"""
```

```python
query = """
Calculate the factorial of 6, then find the square root of that result,
and finally raise it to the power of 3. Format the final result as a title case string.
"""
```

```python
query = """
Get the current time, create a file with today's date as the filename,
write the current time to that file, then read it back and format the date in a nice readable format.
"""
```

Common issues and fixes:

- Server Connection Failed
  - Check that server scripts exist and are executable
  - Verify the Python path in the configuration
  - Check for syntax errors in server scripts
- Tool Loading Errors
  - Ensure proper type hints in tool functions
  - Check that docstrings are present
  - Verify tool function signatures
- Configuration Issues
  - Validate JSON syntax in config files
  - Check that file paths are correct
  - Ensure all required fields are present
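Many of the configuration issues above can be caught before launching by validating the JSON up front. A sketch (the `validate_mcp_config` helper is hypothetical, and the required-field set is an assumption based on the stdio examples in this README, not a specification):

```python
import json
from pathlib import Path

# Minimal fields every stdio server entry in this README uses (assumed, not a spec).
REQUIRED_FIELDS = {"command", "args", "transport"}


def validate_mcp_config(path: str) -> list:
    """Return a list of human-readable problems found in a server config file."""
    try:
        servers = json.loads(Path(path).read_text())
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    for name, cfg in servers.items():
        missing = REQUIRED_FIELDS - set(cfg)
        if missing:
            problems.append(f"{name}: missing fields {sorted(missing)}")
    return problems
```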
Enable debug logging:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
```

Benefits of this approach:

- Simplified Integration: No need to manage separate server processes
- Configuration Management: JSON-based server configuration
- Error Handling: Built-in retry logic and error recovery
- Multi-Server Support: Connect to multiple MCP servers simultaneously
- Async Support: Full async/await support for better performance
- Tool Discovery: Automatic tool loading and information display
Feel free to extend these examples with your own MCP servers and configurations. The modular nature of MCP makes it easy to add new capabilities to your LangChain applications.