
nateoma_langchain_examples

This repository demonstrates how to use MCP (Model Context Protocol) with LangChain using the langchain_mcp_adapters.client library. This approach provides a streamlined way to integrate MCP servers with LangChain applications.

Repository Overview

This repository contains examples and configurations for integrating Linear MCP servers with LangChain, providing a powerful way to manage Linear projects through natural language interactions.

What's Included

  • Linear MCP Integration: Pre-configured examples for Linear project management
  • Multiple Example Types: Basic, advanced, and dedicated Linear examples
  • Ready-to-Use Configuration: Your Linear MCP server credentials are already configured
  • Interactive Mode: Command shortcuts and natural language queries
  • 23 Linear Tools: Full access to Linear's project management capabilities

What is langchain_mcp_adapters.client?

The langchain_mcp_adapters.client library provides:

  • MultiServerMCPClient: Connect to multiple MCP servers simultaneously
  • Automatic Tool Loading: Seamlessly load tools from MCP servers into LangChain
  • Configuration Management: JSON-based server configuration
  • Error Handling: Built-in retry logic and error management
  • Async Support: Full async/await support for better performance

Installation

pip install -r requirements.txt

Set your OpenAI API key:

export OPENAI_API_KEY="your-api-key-here"
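If you want to fail fast with a clear message instead of hitting an authentication error mid-run, a small stdlib check like this (illustrative, not part of the examples) can run at the top of any script:

```python
import os

def require_api_key(name: str = "OPENAI_API_KEY") -> str:
    """Return the named API key, or fail with a clear message if unset."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"{name} is not set; export it before running the examples")
    return key
```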

Files Overview

  • mcp_client_example.py - Basic example using MultiServerMCPClient with Linear
  • advanced_mcp_client.py - Advanced example with configuration management
  • linear_mcp_example.py - Dedicated Linear MCP integration example
  • mcp_config.json - Linear MCP server configuration
  • MCP_CONFIGURATION_GUIDE.md - Comprehensive guide for configuring your MCP servers
  • requirements.txt - Required dependencies

Quick Start

1. Linear MCP Integration (Ready to Use!)

The configuration is already set up for your Linear MCP server. You can run:

python linear_mcp_example.py

This will:

  • Connect to your Linear MCP server
  • Load Linear tools into LangChain
  • Create a ReAct agent for Linear project management
  • Run Linear-specific test queries
  • Enter interactive Linear management mode

2. Basic Example

Run the basic MCP client example:

python mcp_client_example.py

This will:

  • Connect to your Linear MCP server
  • Load Linear tools into LangChain
  • Create a ReAct agent
  • Run test queries and enter interactive mode

3. Advanced Example

Run the advanced example with configuration management:

python advanced_mcp_client.py

Features:

  • JSON-based server configuration
  • Error handling and retry logic
  • Tool information display
  • Complex Linear workflow examples
  • Enhanced interactive mode

Key Features

Linear MCP Integration

Your Linear MCP server provides access to Linear's project management features:

  • Issue Management: Create, update, and query Linear issues
  • Project Tracking: Monitor project progress and status
  • Team Collaboration: Access team information and assignments
  • Workflow Automation: Automate Linear workflows with AI
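To make the capabilities above concrete, here are a few illustrative natural-language queries of the kind the agent can route to Linear tools (the phrasing and issue identifiers are hypothetical):

```python
# Hypothetical example queries; the agent maps each to Linear MCP tools.
linear_queries = [
    "Create an issue titled 'Fix login bug' and assign it to me",
    "Show all in-progress issues in the current project",
    "Summarize the status of our active Linear projects",
]
```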

MultiServerMCPClient

The MultiServerMCPClient allows you to connect to multiple MCP servers:

from langchain_mcp_adapters.client import MultiServerMCPClient

server_configs = {
    "linear": {
        "command": "npx",
        "args": ["-y", "@natomalabs/natoma-mcp-gateway@latest", "--enterprise"],
        "transport": "stdio",
        "env": {
            "NATOMA_MCP_API_KEY": "your-api-key",
            "NATOMA_MCP_SERVER_INSTALLATION_ID": "your-installation-id"
        }
    }
}

async with MultiServerMCPClient(server_configs) as client:
    tools = await client.get_tools()
    # Use Linear tools with LangChain agent

Configuration Management

Server configurations can be stored in JSON files:

{
  "math_server": {
    "command": "python",
    "args": ["math_server.py"],
    "transport": "stdio",
    "timeout": 30,
    "retry_attempts": 3
  }
}
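A loader can validate such a file before handing it to the client. This is a sketch: the required-field list below is an assumption based on the configurations shown in this README, not a schema enforced by langchain_mcp_adapters.

```python
import json

# Fields every server entry in this README's examples carries; treated
# here as required. This is an assumption, not the library's own schema.
REQUIRED_FIELDS = {"command", "args", "transport"}

def load_server_configs(path: str) -> dict:
    """Load server configs from JSON and fail fast on missing fields."""
    with open(path) as f:
        configs = json.load(f)
    for name, cfg in configs.items():
        missing = REQUIRED_FIELDS - cfg.keys()
        if missing:
            raise ValueError(f"server '{name}' is missing fields: {sorted(missing)}")
    return configs
```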

Error Handling

The client includes built-in error handling:

  • Connection retries
  • Timeout management
  • Graceful error recovery
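If you need retry behavior around your own connection or query code, a generic async wrapper like the following mirrors the `retry_attempts` idea from the JSON configuration above. It is an illustrative helper, not part of langchain_mcp_adapters:

```python
import asyncio

async def with_retries(coro_factory, attempts: int = 3, delay: float = 0.5):
    """Run an async operation, retrying on failure with a fixed delay.

    coro_factory is a zero-argument callable returning a fresh coroutine,
    so each attempt gets its own awaitable.
    """
    last_exc = None
    for attempt in range(attempts):
        try:
            return await coro_factory()
        except Exception as exc:
            last_exc = exc
            if attempt < attempts - 1:
                await asyncio.sleep(delay)
    raise last_exc
```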

MCP Server Requirements

Your MCP servers must meet these requirements to work with LangChain:

Required Elements

  1. MCP Protocol Compliance: Must implement the MCP specification
  2. Tool Definition: Each tool must have:
    • Type hints for all parameters and return types
    • Clear docstrings describing functionality
    • Proper error handling
    • JSON-serializable return values

Basic Server Template

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Your Server Name")

@mcp.tool()
def your_tool(param1: str, param2: int) -> str:
    """Description of what your tool does."""
    # Your implementation
    return f"Result: {param1} and {param2}"

if __name__ == "__main__":
    mcp.run(transport="stdio")

Configuration Requirements

See MCP_CONFIGURATION_GUIDE.md for detailed requirements and examples.

Advanced Usage Patterns

1. Dynamic Server Management

from langchain_core.messages import HumanMessage
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

class MCPClientManager:
    def __init__(self, model):
        self.model = model
        self.client = None
        self.agent = None

    async def connect_to_servers(self, configs):
        self.client = MultiServerMCPClient(configs)
        await self.client.__aenter__()
        tools = await self.client.get_tools()
        self.agent = create_react_agent(self.model, tools)

    async def disconnect(self):
        if self.client:
            await self.client.__aexit__(None, None, None)
            self.client = None

    async def run_query(self, query):
        return await self.agent.ainvoke({"messages": [HumanMessage(content=query)]})

2. Configuration Loading

import json

def load_server_config(config_path: str) -> dict:
    """Load a JSON server configuration file (no async work needed here)."""
    with open(config_path, "r") as f:
        return json.load(f)

3. Tool Information

async def get_available_tools(client):
    tools = await client.get_tools()
    return [
        {
            "name": tool.name,
            "description": tool.description,
            "schema": str(tool.args_schema)
        }
        for tool in tools
    ]

Example Workflows

Data Processing Pipeline

query = """
Create a file called 'data.txt' with the content 'Sample data: 42, 17, 89, 156'. 
Then read the file, extract the numbers, calculate their sum, and create a JSON file with the results.
"""

Mathematical Analysis

query = """
Calculate the factorial of 6, then find the square root of that result, 
and finally raise it to the power of 3. Format the final result as a title case string.
"""

File and Date Operations

query = """
Get the current time, create a file with today's date as the filename, 
write the current time to that file, then read it back and format the date in a nice readable format.
"""

Troubleshooting

Common Issues

  1. Server Connection Failed

    • Check that server scripts exist and are executable
    • Verify Python path in configuration
    • Check for syntax errors in server scripts
  2. Tool Loading Errors

    • Ensure proper type hints in tool functions
    • Check that docstrings are present
    • Verify tool function signatures
  3. Configuration Issues

    • Validate JSON syntax in config files
    • Check file paths are correct
    • Ensure all required fields are present
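For the configuration issues above, a quick stdlib syntax check surfaces the exact line and column of a JSON error (illustrative helper, not part of the examples):

```python
import json

def check_config_syntax(path: str = "mcp_config.json") -> bool:
    """Return True if the file parses as JSON; json.JSONDecodeError
    (with line/column details) is raised otherwise."""
    with open(path) as f:
        json.load(f)
    return True
```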

Debug Mode

Enable debug logging:

import logging
logging.basicConfig(level=logging.DEBUG)

Benefits of This Approach

  1. Simplified Integration: No need to manage separate server processes
  2. Configuration Management: JSON-based server configuration
  3. Error Handling: Built-in retry logic and error recovery
  4. Multi-Server Support: Connect to multiple MCP servers simultaneously
  5. Async Support: Full async/await support for better performance
  6. Tool Discovery: Automatic tool loading and information display

Contributing

Feel free to extend these examples with your own MCP servers and configurations. The modular nature of MCP makes it easy to add new capabilities to your LangChain applications.
