A collection of AI agent tutorials demonstrating integration with FluxLoop for simulation, evaluation, and testing.
FluxLoop is an open-source toolkit for running reproducible, offline-first simulations of AI agents against dynamic scenarios. It empowers developers to rigorously test agent behavior, evaluate performance against custom criteria, and build confidence before shipping to production.
- 🎯 Simple Decorator-Based Instrumentation: Add `@fluxloop.agent()` to trace agent execution
- 🧪 Offline-First Simulation: Run experiments locally with full control and reproducibility
- 📊 Evaluation-First Testing: Define custom evaluators and success criteria
- 🔌 Framework-Agnostic: Works with LangGraph, LangChain, and custom agent frameworks
Visit the FluxLoop repository for installation and documentation.
This repository provides ready-to-run AI agent examples that can be used for:
- Learning: Understand how to build agents with popular frameworks
- FluxLoop Integration: See practical examples of instrumenting agents with FluxLoop decorators
- Testing & Evaluation: Use as baseline implementations for your own simulations
- Benchmarking: Compare different agent architectures and approaches
Each tutorial is a self-contained project with:
- Complete source code
- Setup instructions
- CLI interface for easy testing
- FluxLoop instrumentation examples (where applicable)
A console-friendly port of the official LangGraph customer support tutorial. Demonstrates a multi-stage agentic system with tool calling, memory, and human-in-the-loop workflows.
What You'll Learn:
- Building stateful agents with LangGraph
- Implementing tool calling for database queries and bookings
- Managing conversation state and checkpointing
- Progressive complexity across 4 tutorial stages (a minimal sketch of the core pattern follows this list)
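All four stages build on the same LangGraph core: a chat node that calls the model, a tool node, and a checkpointer that gives the conversation memory. The following is a minimal, illustrative sketch of that pattern; it is not the tutorial's actual code, and the model and tool shown are stand-ins:

```python
# Minimal sketch of the chat-node + tool-node + checkpointer pattern.
# The tool and model below are illustrative stand-ins, not the tutorial's code.
from typing import Annotated
from typing_extensions import TypedDict

from langchain_anthropic import ChatAnthropic
from langchain_core.tools import tool
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition


@tool
def lookup_flight(confirmation_number: str) -> str:
    """Look up a flight booking (illustrative stub)."""
    return f"Flight {confirmation_number}: SEA -> JFK, departs 08:00"


class State(TypedDict):
    # add_messages appends new messages instead of overwriting the history
    messages: Annotated[list, add_messages]


tools = [lookup_flight]
llm = ChatAnthropic(model="claude-3-5-sonnet-20241022").bind_tools(tools)


def chatbot(state: State) -> dict:
    """Call the model with the running message history."""
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_node("tools", ToolNode(tools))
builder.add_edge(START, "chatbot")
builder.add_conditional_edges("chatbot", tools_condition)  # route to tools when the model requests one
builder.add_edge("tools", "chatbot")

# The checkpointer persists state per thread_id, giving the agent memory across turns.
graph = builder.compile(checkpointer=MemorySaver())

config = {"configurable": {"thread_id": "demo-1"}}
result = graph.invoke({"messages": [("user", "Where does flight ABC123 go?")]}, config)
print(result["messages"][-1].content)
```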
Features:
- 🛠️ Booking tools for flights, hotels, car rentals, and excursions
- 💾 SQLite database integration with travel data
- 🔄 Four progressive graph implementations (Part 1-4)
- 🎨 Rich console UI with streaming responses
- ⚙️ Configurable LLM provider (Anthropic/OpenAI)
Quick Start:
```bash
cd langgraph/customer-support
uv sync
uv run python -m customer_support.main --demo
```
See the customer-support README for detailed setup and usage.
All tutorials in this repository are designed to work seamlessly with FluxLoop. Here's how to instrument and evaluate these agents:
```bash
pip install fluxloop fluxloop-cli
```
Add the `@fluxloop.agent()` decorator to trace agent execution:
```python
import fluxloop

@fluxloop.agent()
def run_customer_support_agent(query: str):
    # Your agent code here
    return graph.invoke({"messages": [query]})
```
Create diverse test scenarios for your agent:
```bash
fluxloop generate inputs --limit 50
```
Execute batch experiments with different configurations:
```bash
fluxloop run experiment
```
Parse and analyze agent performance:
```bash
fluxloop parse experiment experiments/<experiment_dir>
```
For detailed FluxLoop integration guides, see the FluxLoop documentation.
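As a concrete (but hypothetical) example of wiring the customer-support tutorial into this workflow, the sketch below wraps a graph invocation in a FluxLoop-decorated entry point. The import path and builder function are assumptions for illustration only; check the tutorial's `src/customer_support/graphs/` package for the actual entry points.

```python
import fluxloop

# Hypothetical import: the tutorial's real builder function may be named
# differently; see src/customer_support/graphs/ for the actual entry points.
from customer_support.graphs import build_part_1_graph

graph = build_part_1_graph()


@fluxloop.agent()
def run_support_agent(query: str) -> str:
    """Run one user query through the tutorial graph so FluxLoop can trace it."""
    config = {"configurable": {"thread_id": "fluxloop-demo"}}
    result = graph.invoke({"messages": [("user", query)]}, config)
    return result["messages"][-1].content


if __name__ == "__main__":
    print(run_support_agent("What time does my flight leave tomorrow?"))
```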
```
fluxloop-tutorials/
├── README.md                        # This file
├── langgraph/                       # LangGraph framework tutorials
│   └── customer-support/            # Customer support agent example
│       ├── README.md
│       ├── pyproject.toml
│       └── src/
│           └── customer_support/
│               ├── main.py          # CLI entry point
│               ├── graphs/          # Part 1-4 implementations
│               ├── tools/           # Booking and policy tools
│               ├── data/            # Database utilities
│               └── utils/           # Shared helpers
└── [more frameworks coming soon]    # LlamaIndex, CrewAI, etc.
```
We're continuously expanding this collection with more frameworks and use cases:
- LangGraph: Customer Support Agent
- LangGraph: Research Assistant
- LangGraph: Code Review Agent
- LlamaIndex: RAG-based Q&A Agent
- CrewAI: Multi-Agent Collaboration
- Custom Framework: Simple Reasoning Agent
We welcome contributions! If you have an interesting agent implementation you'd like to share:
- Fork this repository
- Add your tutorial in a new directory with:
  - Complete source code
  - README with setup instructions
  - Example usage and expected outputs
- Submit a pull request
Please ensure your tutorial:
- Is self-contained and reproducible
- Includes clear documentation
- Demonstrates FluxLoop integration (or can be easily integrated)
- Follows the existing structure and style
- Python 3.11+ (for most tutorials)
- API keys for LLM providers (OpenAI, Anthropic, etc.)
- Optional: FluxLoop SDK for instrumentation and evaluation
Specific requirements are listed in each tutorial's README.
This repository is open-source and available under the MIT License. Individual tutorials may use different licenses—check each tutorial's directory for details.
- FluxLoop: GitHub | Documentation
- LangGraph: Docs | Tutorials
- LangChain: Docs
- Community: FluxLoop Issues
- Open an issue in this repository
- Check the FluxLoop community for questions about instrumentation
- Each tutorial has its own README with specific troubleshooting tips
Happy building! 🚀