xpander.ai offers Backend-as-a-Service infrastructure for autonomous agents: memory, tools, multi-user state, various agent triggering options (MCP, A2A, API, Web interfaces), storage, and agent-to-agent messaging, designed to support any agent framework and SDK.
Demo video: xpander.ai-demo.mov
| Feature | Description |
|---|---|
| 🛠️ Framework Flexibility | Choose from popular frameworks like OpenAI ADK, Agno, CrewAI, LangChain, or work directly with native LLM APIs |
| 🧰 Tool Integration | Access our comprehensive MCP-compatible tools library and pre-built integrations |
| 🚀 Scalable Hosting | Deploy and scale your agents effortlessly on our managed infrastructure |
| 💾 State Management | Opt for framework-specific local state or leverage our distributed state management system |
| ⚡ Real-time Events | Harness our event streaming capabilities for Slackbots, ChatUIs, Agent2Agent communication, and Webhook integrations |
| 🛡️ API Guardrails | Implement robust guardrails with our Agent-Graph-System, which defines and manages dependencies between API actions during tool use |
By abstracting away infrastructure complexity, xpander.ai empowers you to focus on what matters most: building intelligent, effective, production-ready AI agents.
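For intuition, the guardrail idea behind the Agent-Graph-System can be pictured as a dependency graph over tool actions: an action is only allowed once its prerequisites have run. The sketch below is a generic, hand-rolled illustration of that idea in plain Python, not the xpander.ai API; the tool names are made up.

```python
# Illustrative only: a generic dependency check between tool actions,
# not the xpander.ai Agent-Graph-System API. Tool names are made up.
ALLOWED_AFTER = {
    "clone_repo": set(),
    "commit_changes": {"clone_repo"},
    "create_pr": {"clone_repo", "commit_changes"},
}

def is_allowed(tool_name: str, completed: set) -> bool:
    """Allow a tool call only once all of its prerequisites have completed."""
    return ALLOWED_AFTER.get(tool_name, set()).issubset(completed)

completed_calls = set()
for call in ["clone_repo", "commit_changes", "create_pr"]:
    if not is_allowed(call, completed_calls):
        raise RuntimeError(f"Guardrail: {call} blocked, prerequisites missing")
    # ... execute the tool here ...
    completed_calls.add(call)
```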
- Log in to https://app.xpander.ai and go to the Templates section
- Deploy the Coding agent
- Send tasks to the agent, for example:
  - Clone the `<my-repo-name>` repo and add the following feature to the codebase ..., then create a PR with the new code.
  - Find all open PRs that have been waiting on review for more than 3 days.
- Continue customizing: add tools, configure triggering (MCP, A2A, Webhooks), set up multi-agent collaboration, and more.
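Tasks are plain natural-language prompts sent from the platform UI. If you prefer to queue one programmatically, a minimal sketch using the SDK calls shown later in this README (with placeholder API key and agent ID) looks like this:

```python
from xpander_sdk import XpanderClient

# Placeholders: use your own API key and the ID of the deployed Coding agent
xpander_client = XpanderClient(api_key="YOUR_XPANDER_API_KEY")
coding_agent = xpander_client.agents.get(agent_id="YOUR_AGENT_ID")

# Queue a task for the agent, exactly like typing it in the UI
coding_agent.add_task("Find all open PRs that have been waiting on review for more than 3 days.")
```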
```bash
# Python
pip install xpander-sdk

# Node.js
npm install @xpander-ai/sdk

# CLI (for agent creation)
npm install -g xpander-cli
xpander login
xpander agent new
```
```bash
python xpander_handler.py  # <-- Entry point for incoming events to your agent
```

Add one line of code to `xpander_handler.py` and your agent will be accessible via Agent2Agent, Slackbots, MCP servers, or a WebUI.
```python
# In xpander_handler.py; AgentExecution and AgentExecutionResult are xpander SDK types
def on_execution_request(execution_task: AgentExecution) -> AgentExecutionResult:
    agent_result = your_agent.invoke(execution_task.input.text)
    return AgentExecutionResult(
        result=agent_result,
        is_success=True,
    )
```

```python
from xpander_sdk import XpanderClient, Agent
from openai import OpenAI

# Init the clients
xpander_client = XpanderClient(api_key="YOUR_XPANDER_API_KEY")
agent_backend: Agent = xpander_client.agents.get(agent_id="YOUR_AGENT_ID")
openai_client = OpenAI()  # Assumes OPENAI_API_KEY is set in the environment

# Initializing a new task creates a new conversation thread with empty agent state
agent_backend.add_task("What can you do?")

response = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=agent_backend.messages,        # <-- Automatically loads the current state in the LLM format
    tools=agent_backend.get_tools(),        # <-- Automatically loads all the tool schemas from the cloud
    tool_choice=agent_backend.tool_choice,
    temperature=0.0
)

# Save the LLM response into the agent's state
agent_backend.add_messages(response.model_dump())

# Extract the tool calls requested by the AI model
tool_calls = XpanderClient.extract_tool_calls(llm_response=response.model_dump())

# Execute tools automatically and securely in the cloud, after validating the schema
# and loading user overrides and authentication
agent_backend.run_tools(tool_calls=tool_calls)
```

```bash
xpander deploy  # Deploys the Docker container to the cloud and runs it via the xpander_handler.py file
xpander logs    # Streams logs locally from the agent configured locally
```

| Project | Description | License | Tech Stack | Link |
|---|---|---|---|---|
| 💻 Coding Agent | Framework-agnostic agent that reads, writes, and commits code to Git repositories | MIT | Python, OpenAI, Anthropic, Gemini, Llama 3 | Repo |
| 🎥 NVIDIA Meeting Recorder | AI assistant that records, transcribes, and extracts insights from meetings | Apache 2.0 | Python, NVIDIA SDKs, Speech Recognition | Repo |
| 🌍 Hello World Example | Simple starter template for building agents with xpander.ai | Apache 2.0 | Python, OpenAI | Repo |
The Getting-Started/hello-world directory contains a simple agent implementation that demonstrates the core concepts of running asynchronous AI Agents with local tools, cloud tools, and fully managed state in the xpander.ai backend:
```
hello-world/
├── app.py                        # CLI entry point for the agent with local thread
├── my_agent.py                   # Agent implementation (Your agent code goes here)
├── xpander_handler.py            # Event handler for incoming events from the platform
├── Dockerfile                    # For containerized deployment
├── providers/
│   ├── ai_frameworks/            # Framework integrations
│   └── llms/                     # LLM provider implementations
│       ├── openai/               # OpenAI specific implementation
│       └── ...
└── tools/
    ├── local_tools.py            # Custom tools implementation
    └── async_function_caller.py  # Async function caller utility
```
See `Hello-world.md` for more details.
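As a rough illustration of what a local tool looks like (hypothetical names, not the actual contents of `tools/local_tools.py`), a local tool is just a Python function plus an OpenAI-style function schema that the model can call:

```python
# Hypothetical local tool; the real implementations live in
# Getting-Started/hello-world/tools/local_tools.py.
def get_weather(city: str) -> str:
    """Toy local tool that runs on your machine rather than in the cloud."""
    return f"The weather in {city} is sunny."

# OpenAI-style schema so the model can request this tool by name
local_tools_declaration = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]
```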
- Open-source runtime: Apache License 2.0
- Hosted platform: Commercial (free tier available)