MCP Context Provider

MCP Context Provider Architecture

The stable, glowing orb at the center represents the persistent context that survives across chat sessions. The flowing data streams show how ongoing conversations connect to and draw from this stable core of information, preventing context loss.

A static MCP (Model Context Protocol) server that provides AI models with persistent tool context, preventing context loss between chat sessions. This server automatically loads and injects tool-specific rules, syntax preferences, and best practices at Claude Desktop startup.

Overview

The Context Provider acts as a persistent neural core for your AI interactions, eliminating the need to re-establish context in each new chat session by:

  • 🔄 Persistent Context: Like the stable orb in the visualization, rules and preferences survive across Claude Desktop restarts
  • ⚡ Automatic Injection: Context flows seamlessly into every conversation, just as the data streams connect to the central core
  • 🎯 Tool-Specific: Each tool gets its own context rules and syntax preferences, creating specialized knowledge pathways
  • 🔧 Auto-Corrections: Automatic syntax transformations (e.g., Markdown → DokuWiki) ensure consistency across all interactions
  • 📈 Scalable: Easy to add new tools and context rules, expanding the knowledge network
  • 🏢 Enterprise-Ready: Version-controlled context management provides organizational stability

The Neural Network Metaphor

Just as the image depicts, your MCP Context Provider functions as:

  • Central Orb: The stable, persistent context core that maintains consistency
  • Neural Pathways: Tool-specific context rules that create specialized knowledge channels
  • Data Streams: Individual chat sessions that flow through and benefit from the persistent context
  • Network Stability: Prevents the ephemeral nature of conversations from losing important contextual information

Quick Start

1. Installation

Option 1: Automated Installation (Recommended)

The easiest way to install MCP Context Provider is using the provided installation scripts:

Unix/Linux/macOS:

# Clone the repository (contains latest source files)
git clone https://github.com/doobidoo/MCP-Context-Provider.git
cd MCP-Context-Provider

# Run the automated installer (builds fresh package)
./scripts/install.sh

Windows:

# Clone the repository first
git clone https://github.com/doobidoo/MCP-Context-Provider.git
cd MCP-Context-Provider

# Run the Windows installer
.\scripts\install.bat

The installation script automatically:

  • Builds the latest DXT package from source
  • Creates a Python virtual environment
  • Installs all required dependencies
  • Configures Claude Desktop settings

Option 2: Manual Installation from DXT

# Install DXT CLI (if not already installed)
npm install -g @anthropic-ai/dxt

# Download the DXT package
wget https://github.com/doobidoo/MCP-Context-Provider/raw/main/mcp-context-provider-1.2.1.dxt

# Unpack the extension to your desired location
dxt unpack mcp-context-provider-1.2.1.dxt ~/mcp-context-provider

# Navigate to the installation directory
cd ~/mcp-context-provider

# Create and activate a Python virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install "mcp>=1.9.4"

Option 3: Installation from Source

# Clone the repository
git clone https://github.com/doobidoo/MCP-Context-Provider.git
cd MCP-Context-Provider

# Create and activate a Python virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

2. Configuration

Update your Claude Desktop configuration file:

Configuration File Location:

  • Linux: ~/.config/claude/claude_desktop_config.json
  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

For Virtual Environment Installation (Recommended):

{
  "mcpServers": {
    "context-provider": {
      "command": "/path/to/mcp-context-provider/venv/bin/python",
      "args": ["/path/to/mcp-context-provider/context_provider_server.py"],
      "env": {
        "CONTEXT_CONFIG_DIR": "/path/to/mcp-context-provider/contexts",
        "AUTO_LOAD_CONTEXTS": "true"
      }
    }
  }
}

For System Python Installation:

{
  "mcpServers": {
    "context-provider": {
      "command": "python",
      "args": ["context_provider_server.py"],
      "cwd": "/path/to/MCP-Context-Provider",
      "env": {
        "CONTEXT_CONFIG_DIR": "./contexts",
        "AUTO_LOAD_CONTEXTS": "true"
      }
    }
  }
}

Important: Replace /path/to/mcp-context-provider with the actual installation path.
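
If you prefer to patch the configuration file programmatically instead of editing it by hand, a small helper along these lines works. This is only an illustrative sketch (it is not shipped with the repository); it assumes the Linux config location and a home-directory install, so adjust both paths for your system:

import json
from pathlib import Path

# Adjust these two paths for your system (Linux config path shown above).
config_path = Path.home() / ".config/claude/claude_desktop_config.json"
install_dir = Path.home() / "mcp-context-provider"

config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["context-provider"] = {
    "command": str(install_dir / "venv/bin/python"),
    "args": [str(install_dir / "context_provider_server.py")],
    "env": {
        "CONTEXT_CONFIG_DIR": str(install_dir / "contexts"),
        "AUTO_LOAD_CONTEXTS": "true",
    },
}
config_path.parent.mkdir(parents=True, exist_ok=True)
config_path.write_text(json.dumps(config, indent=2))
print(f"Updated {config_path}")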

3. Verify Installation

Run the verification script to ensure everything is configured correctly:

python scripts/verify_install.py
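
If you want to double-check things manually, the same ground can be covered with a few lines of Python. This is a rough sketch of the kinds of checks involved, not what verify_install.py actually does, and the paths are assumptions to adjust:

import json
from pathlib import Path

install_dir = Path.home() / "mcp-context-provider"   # adjust to your install path

# 1. Is the MCP SDK importable from the environment you configured?
import mcp

# 2. Do all context files parse as JSON?
for path in (install_dir / "contexts").glob("*.json"):
    json.loads(path.read_text())

# 3. Does the Claude Desktop config register the server? (Linux path shown.)
config_file = Path.home() / ".config/claude/claude_desktop_config.json"
config = json.loads(config_file.read_text())
assert "context-provider" in config.get("mcpServers", {})

print("Basic checks passed.")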

4. Restart Claude Desktop

After updating the configuration, restart Claude Desktop to load the MCP server.

How It Works

Architecture

  1. Context Provider Server: Python MCP server that loads JSON context files
  2. Context Files: Tool-specific rules stored in /contexts directory
  3. Claude Desktop Integration: MCP server registered in configuration
  4. Automatic Loading: Context is injected at startup and persists across chats

Context Flow

Startup → Load Context Files → Register MCP Tools → Context Available in All Chats
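
Conceptually, the loading step scans the contexts directory, parses each JSON file, and keeps the result in memory for the MCP tools to serve. A minimal sketch of that idea (not the actual server code):

import json
import os
from pathlib import Path

def load_contexts(config_dir=None):
    """Scan the contexts directory and return {tool_category: rules}."""
    contexts_dir = Path(config_dir or os.environ.get("CONTEXT_CONFIG_DIR", "./contexts"))
    contexts = {}
    for path in sorted(contexts_dir.glob("*_context.json")):
        data = json.loads(path.read_text())
        contexts[data.get("tool_category", path.stem)] = data
    return contexts

if __name__ == "__main__":
    loaded = load_contexts()
    print(f"Loaded {len(loaded)} context categories: {', '.join(loaded)}")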

Available Tools

Once loaded, the following tools are available in all chat sessions:

Core Context Tools:

  • get_tool_context: Get context rules for a specific tool
  • get_syntax_rules: Get syntax conversion rules
  • list_available_contexts: List all loaded context categories
  • apply_auto_corrections: Apply automatic syntax corrections

Phase 1 - Session Management:

  • execute_session_initialization: Initialize session with memory service integration
  • get_session_status: Retrieve detailed session initialization status

Phase 2 - Dynamic Context Management:

  • create_context_file: Create new context files dynamically with validation
  • update_context_rules: Update existing context rules with backup and validation
  • add_context_pattern: Add patterns to auto-trigger sections for memory integration

Phase 3 - Intelligent Learning (v1.6.0+):

  • analyze_context_effectiveness: Analyze context effectiveness with memory-driven insights
  • suggest_context_optimizations: Generate global optimization suggestions based on usage patterns
  • get_proactive_suggestions: Provide proactive context suggestions for workflow improvement
  • auto_optimize_context: Automatically optimize contexts based on learning engine recommendations

MCP Context Provider Tools in Action

Screenshot showing the MCP Context Provider in action within Claude Desktop. The tool automatically detects and lists all available context categories (dokuwiki, terraform, azure, git, general_preferences) and provides interactive access to tool-specific rules and guidelines.

Context Files

The server loads context files from the /contexts directory:

  • dokuwiki_context.json: DokuWiki syntax rules and preferences
  • terraform_context.json: Terraform naming conventions and best practices
  • azure_context.json: Azure resource naming and compliance rules
  • git_context.json: Git commit conventions and workflow patterns
  • general_preferences.json: Cross-tool preferences and standards

Context File Structure

Each context file follows this pattern:

{
  "tool_category": "toolname",
  "description": "Tool-specific context rules",
  "auto_convert": true,
  "syntax_rules": {
    "format_rules": "conversion patterns"
  },
  "preferences": {
    "user_preferences": "settings"
  },
  "auto_corrections": {
    "regex_patterns": "automatic fixes"
  },
  "metadata": {
    "version": "1.0.0",
    "applies_to_tools": ["tool:*"]
  }
}

Examples

DokuWiki Syntax Conversion

Input (Markdown):

# My Header
This is `inline code` and here's a [link](http://example.com).

Auto-converted to DokuWiki:

====== My Header ======
This is ''inline code'' and here's a [[http://example.com|link]].
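
Conversions like this are driven by the regex patterns stored in the auto_corrections section of dokuwiki_context.json. The patterns below are illustrative re-implementations of the idea, not the ones shipped in the context file:

import re

# Illustrative Markdown-to-DokuWiki patterns (the real ones live in dokuwiki_context.json).
MARKDOWN_TO_DOKUWIKI = [
    (r"^# (.+)$", r"====== \1 ======"),               # H1 heading
    (r"`([^`]+)`", r"''\1''"),                        # inline code
    (r"\[([^\]]+)\]\(([^)]+)\)", r"[[\2|\1]]"),       # [text](url) -> [[url|text]]
]

def apply_auto_corrections(text):
    for pattern, replacement in MARKDOWN_TO_DOKUWIKI:
        text = re.sub(pattern, replacement, text, flags=re.MULTILINE)
    return text

print(apply_auto_corrections(
    "# My Header\nThis is `inline code` and here's a [link](http://example.com)."
))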

Azure Resource Naming

Input: storage_account_logs_prod

Auto-corrected to: stlogsprod (following Azure naming conventions)

Git Commit Messages

Input: Fixed the login bug

Auto-corrected to: fix: resolve login authentication issue

Adding New Context

To add support for a new tool:

  1. Create a new JSON file: contexts/{toolname}_context.json
  2. Follow the standard context structure
  3. Restart Claude Desktop to load the new context

The server automatically detects and loads any *_context.json files in the contexts directory.
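
For example, a context file for a hypothetical "ansible" tool could be bootstrapped like this (all field values are placeholders; follow the structure shown above):

import json
from pathlib import Path

new_context = {
    "tool_category": "ansible",                       # hypothetical example tool
    "description": "Ansible playbook conventions",
    "auto_convert": True,
    "syntax_rules": {},
    "preferences": {"indentation": "2 spaces"},
    "auto_corrections": {},
    "metadata": {"version": "1.0.0", "applies_to_tools": ["ansible:*"]},
}

Path("contexts/ansible_context.json").write_text(json.dumps(new_context, indent=2))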

Benefits

For Developers

  • No need to re-establish context in new chats
  • Automatic syntax corrections save time
  • Consistent formatting across all work
  • Best practices automatically applied

For Teams

  • Shared context rules across team members
  • Version-controlled standards
  • Consistent code and documentation formatting
  • Enterprise compliance automatically enforced

For Organizations

  • Centralized context management
  • Scalable across multiple tools
  • Audit trail of context changes
  • Easy deployment and updates

🧠 Phase 3: Intelligent Learning System (v1.6.0+)

Revolutionary Learning Capabilities

Version 1.6.0 introduces the Synergistic Integration with Intelligent Learning system, transforming the MCP Context Provider from a static configuration tool into an intelligent, self-improving context evolution platform.

🎯 Key Learning Features

Intelligent Context Evolution

  • Automatic Effectiveness Analysis: Contexts self-analyze based on usage patterns and memory data
  • Smart Optimization Suggestions: AI-driven recommendations for context improvements
  • Auto-Optimization: Contexts automatically improve through pattern learning and preference tuning
  • Proactive Intelligence: Suggests missing tool contexts and workflow enhancements

Real Memory Service Integration

  • Persistent Learning: Full integration with mcp-memory-service for persistent learning data
  • Usage Pattern Tracking: Comprehensive tracking of context modifications and effectiveness
  • Memory-Driven Insights: Historical data analysis for continuous improvement
  • Team Knowledge Propagation: Shared learning insights across team members

Advanced MCP Tools

4 new intelligent tools for context management:

  • analyze_context_effectiveness: Memory-driven effectiveness analysis
  • suggest_context_optimizations: Global optimization recommendations
  • get_proactive_suggestions: Workflow improvement suggestions
  • auto_optimize_context: Automatic context optimization based on learning

🔄 Learning Workflow

Context Usage → Memory Storage → Pattern Analysis → Optimization Suggestions → Auto-Improvement
     ↓              ↓                 ↓                    ↓                     ↓
Session Data → Learning Engine → Effectiveness Score → Proactive Recommendations → Enhanced Contexts

📊 Learning Metrics

The system tracks and analyzes:

  • Context Effectiveness Scores (0.0-1.0 scale)
  • Usage Pattern Recognition (frequency, modifications, interactions)
  • Session Performance Optimization (sub-second initialization targets)
  • Memory-Driven Trend Analysis (historical usage and improvement data)
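
The scoring model itself lives in the learning engine. Purely as an illustration of how a 0.0-1.0 score could be derived from tracked usage data, a toy heuristic might look like this (a made-up formula, not the project's):

def effectiveness_score(times_used, corrections_accepted, corrections_offered):
    """Toy heuristic: blend usage frequency with correction acceptance rate."""
    if corrections_offered == 0:
        acceptance = 0.5                     # neutral when there is no correction data
    else:
        acceptance = corrections_accepted / corrections_offered
    usage = min(times_used / 100, 1.0)       # saturate after 100 uses
    return round(0.4 * usage + 0.6 * acceptance, 2)

print(effectiveness_score(times_used=42, corrections_accepted=30, corrections_offered=35))  # 0.68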

🚀 Phase 3 Setup

Prerequisites: Requires mcp-memory-service integration

  1. Configure Memory Service (.mcp.json):
{
  "mcpServers": {
    "memory": {
      "command": "/path/to/uv",
      "args": ["--directory", "/path/to/mcp-memory-service", "run", "memory"],
      "env": {
        "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec",
        "MCP_MEMORY_SQLITE_PATH": "/path/to/memory.db"
      }
    },
    "context-provider": {
      "command": "python",
      "args": ["context_provider_server.py"],
      "env": {
        "CONTEXT_CONFIG_DIR": "./contexts",
        "AUTO_LOAD_CONTEXTS": "true"
      }
    }
  }
}
  2. Test Learning Features:
# Run comprehensive Phase 3 tests
python tests/test_phase3_learning.py

# Check learning engine health
python -c "
from context_provider_server import ContextProvider
import asyncio
async def test():
    provider = ContextProvider()
    stats = await provider.memory_service.get_memory_stats()
    print(f'Learning system status: {stats}')
asyncio.run(test())
"

📚 Phase 3 Documentation

🎉 Implementation Roadmap Complete

  • ✅ Phase 1: Session initialization with memory service integration
  • ✅ Phase 2: Dynamic context file creation and management
  • ✅ Phase 3: Synergistic integration with intelligent learning

The MCP Context Provider now offers enterprise-ready intelligent context evolution with self-improving contexts that learn from usage patterns and automatically optimize through real memory service integration.

Advanced Usage

Custom Context Rules

Create your own context files by following the established pattern. The server supports:

  • Regex-based auto-corrections
  • Tool-specific preferences
  • Conditional formatting rules
  • Multi-tool context inheritance

Environment-Specific Context

Use environment variables to load different context sets:

{
  "env": {
    "CONTEXT_CONFIG_DIR": "./contexts/production",
    "ENVIRONMENT": "prod"
  }
}
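
On the server side this only requires reading the environment at startup and pointing the loader at the right directory; a sketch of the pattern (details may differ from the real server):

import os
from pathlib import Path

# Pick a context set based on the environment, e.g. ./contexts/production.
environment = os.environ.get("ENVIRONMENT", "dev")
contexts_dir = Path(os.environ.get("CONTEXT_CONFIG_DIR", f"./contexts/{environment}"))

context_files = sorted(contexts_dir.glob("*_context.json"))
print(f"{environment}: {len(context_files)} context files in {contexts_dir}")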

Troubleshooting

Common Issues

  1. Context not loading: Check file path in Claude Desktop config
  2. Server not starting: Verify Python dependencies installed
  3. Rules not applying: Check JSON syntax in context files
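
For the third issue, a quick pass that parses every context file and reports the first syntax error narrows things down fast (an illustrative helper, not part of the repository):

import json
from pathlib import Path

for path in sorted(Path("contexts").glob("*.json")):
    try:
        json.loads(path.read_text())
        print(f"OK      {path.name}")
    except json.JSONDecodeError as err:
        print(f"INVALID {path.name}: line {err.lineno}, column {err.colno}: {err.msg}")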

See TROUBLESHOOTING.md for detailed solutions.

Documentation

📚 Wiki & Use Cases

Explore advanced integrations and real-world use cases in our Community Wiki:

  • Wiki Homepage: Comprehensive guide to what the Context Provider is good for
  • AppleScript with Memory Integration: Advanced workflow showcasing intelligent script management with persistent memory
  • Integration Examples: Community-driven examples of Context Provider workflows
  • Best Practices: Tips and patterns for maximizing Context Provider effectiveness

The wiki demonstrates how the Context Provider transforms from simple rule storage into intelligent, self-improving workflow automation.

DXT Package Distribution

The MCP Context Provider is available as a Desktop Extension (DXT) package for easy distribution and installation:

  • Package: mcp-context-provider-1.0.0.dxt (18.6 MB)
  • Contents: Complete server with all dependencies bundled
  • Platform: Windows, macOS, Linux with Python 3.8+
  • Dependencies: Self-contained (no external pip requirements)

Building DXT Package

To build your own DXT package from source:

# Install DXT CLI
npm install -g @anthropic-ai/dxt

# Build the package
cd dxt
dxt pack

# The package will be created as mcp-context-provider-1.0.0.dxt

Distribution Notes

  • The DXT package includes all Python dependencies (MCP SDK, Pydantic, etc.)
  • Total unpacked size: ~45 MB including all dependencies
  • Optimized for offline installation and deployment
  • Compatible with corporate environments and air-gapped systems

Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/new-context
  3. Add your context file to /contexts
  4. Test with your Claude Desktop setup
  5. Submit a pull request

License

MIT License - see LICENSE file for details.
