A powerful AI agent implementation using Pydantic and Chainlit, capable of web browsing and interaction through MCP (Model Context Protocol).

## Features
- Web browsing capabilities with automated interactions
- Integration with Ollama for local LLM support
- Chainlit-based interactive chat interface
- Pydantic models for type-safe data handling
- Configurable MCP server integration
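As a sketch of what type-safe data handling looks like here (the model below is illustrative, not taken from the repository), Pydantic validates and coerces fields at construction time:

```python
from pydantic import BaseModel

# Hypothetical example of a type-safe request model; the repository's
# actual models live in pydantic_mcp_agent.py.
class BrowseRequest(BaseModel):
    url: str
    max_results: int = 5

# Pydantic coerces the numeric string "3" to an int and rejects
# values it cannot validate.
req = BrowseRequest(url="https://example.com", max_results="3")
print(req.max_results)
```

Invalid input (for example, `max_results="many"`) raises a `ValidationError` instead of propagating bad data into the agent.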
## Prerequisites

- Python 3.8+
- Node.js and npm (for MCP server)
- Ollama installed locally
- MCP server access
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/RyanNg1403/pydantic-ai-mcp-agent-with-chainlit.git
   cd pydantic-ai-mcp-agent-with-chainlit
   ```

2. Install Python dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Install Node.js dependencies:

   ```bash
   npm install
   ```

4. Copy the template configuration file:

   ```bash
   cp mcp_config.template.json mcp_config.json
   ```

5. Edit `mcp_config.json` with your configuration settings. The file is ignored by git for security.
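This README does not show the expected shape of `mcp_config.json`. A typical MCP server configuration looks like the sketch below; the server name, package, and keys are illustrative assumptions, so defer to `mcp_config.template.json` for the real structure:

```json
{
  "mcpServers": {
    "exa": {
      "command": "npx",
      "args": ["-y", "exa-mcp-server"],
      "env": {
        "EXA_API_KEY": "your-api-key-here"
      }
    }
  }
}
```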
## Usage

Run the Chainlit chat interface:

```bash
chainlit run pydantic_mcp_chainlit.py
```

Or run the agent directly:

```bash
python pydantic_mcp_agent.py
```

## Project Structure

- `pydantic_mcp_agent.py`: Core agent implementation
- `pydantic_mcp_chainlit.py`: Chainlit interface implementation
- `mcp_client.py`: MCP client implementation
- `requirements.txt`: Python dependencies
- `mcp_config.template.json`: Template for configuration
- `.gitignore`: Specifies which files git should ignore
## Environment Variables

The following environment variables can be set in your `.env` file:

- `EXA_API_KEY`: Your MCP API key
- `OLLAMA_HOST`: Ollama host address (default: `http://localhost:11434`)
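A minimal sketch of how these variables might be resolved at startup (the default value mirrors this README; the loading code itself is an assumption, not the repository's):

```python
import os

# Sketch only: read configuration from the environment.
# OLLAMA_HOST falls back to the documented default; EXA_API_KEY
# has no safe default, so an empty string signals "not configured".
exa_api_key = os.getenv("EXA_API_KEY", "")
ollama_host = os.getenv("OLLAMA_HOST", "http://localhost:11434")

print(ollama_host)
```

Note that `.env` files are not read automatically by Python; a package such as `python-dotenv` is typically used to load them into the environment first.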
## Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request