Ollamarama is an AI chatbot for IRC that uses local LLMs with Ollama. It can roleplay as almost anything you can think of with customizable personalities, individual user chat histories, and collaborative features.
- 🎭 Roleplay as any character or personality - Set a default personality and change it at any time
- 👥 Individual chat histories - Each user maintains their own separate conversation with their chosen personality
- 🤝 Collaborative mode - Users can interact with each other's chat histories
- 🔄 Real-time personality switching - Change personalities on the fly during conversations
Also available for other platforms:
- Matrix: ollamarama-matrix
- Terminal: ollamarama
Install and familiarize yourself with Ollama. Make sure you can run local LLMs successfully.
Linux installation:

```bash
curl https://ollama.com/install.sh | sh
```

Windows installation:

Download the app from the Ollama website.
Or install the bot as a package (optional):

```bash
pip install .
```

- Browse and download models that work best for your use case
- Add your chosen models to the `config.json` file (see the example configuration below)
- Install models using:

```bash
ollama pull modelname
```
Edit config.json with nested keys (see docs/configuration.md):
IRC:

- `irc.server`: IRC server (e.g., `irc.libera.chat`)
- `irc.nickname`: bot nickname
- `irc.password`: NickServ password (optional)
- `irc.channels`: list of channels (or legacy `irc.channel`)
- `irc.admins`: list of admin nicknames (for admin commands)
Ollama:

- `ollama.api_url`: e.g., `http://localhost:11434/api/chat`
- `ollama.models`: mapping of names to model IDs
- `ollama.default_model`: selected model (name or ID)
- `ollama.options`: generation options (temperature, top_p, etc.)
- `ollama.timeout`: request timeout in seconds
- `ollama.mcp_servers`: optional MCP servers
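Putting these keys together, a minimal `config.json` might look like the sketch below. It assumes the dotted names above map onto nested `irc` and `ollama` objects; the server, nickname, channel, admin, model, option, and timeout values are illustrative placeholders, so check docs/configuration.md for the authoritative schema and defaults.

```json
{
  "irc": {
    "server": "irc.libera.chat",
    "nickname": "ollamarama",
    "channels": ["#example-channel"],
    "admins": ["your_nick"]
  },
  "ollama": {
    "api_url": "http://localhost:11434/api/chat",
    "models": {
      "llama3": "llama3.1:8b",
      "qwen": "qwen2.5:7b"
    },
    "default_model": "llama3",
    "options": {
      "temperature": 0.8,
      "top_p": 0.9
    },
    "timeout": 120
  }
}
```

Any model ID listed under `ollama.models` should already be pulled with `ollama pull` so the Ollama API can serve it.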
Create a virtual environment and install dependencies:
Linux/macOS:
```bash
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Windows:

```bash
python3 -m venv venv
venv\Scripts\activate
pip install -r requirements.txt
```

Run the bot:

```bash
python -m ollamarama_irc -c config.json
```

Installed script (after `pip install .`):

```bash
ollamarama-irc -c config.json
```

In Docker or CI, prefer plain logs:
```bash
ollamarama-irc -c config.json --log-format plain
```

| Command | Description |
|---|---|
| `.ai message` or `botname: message` | Basic chat with the AI |
| `.x user message` | Talk to another user's chat history |
| `.persona personality` | Change personality (character, type, object, idea) |
| `.custom prompt` | Set a custom system prompt instead of roleplay |
| `.reset` | Reset to preset personality |
| `.stock` | Remove personality and use standard model settings |
| `.help` | Display the help menu |
| `.model` | Show current model and available models (admin only) |
| `.model name \| reset` | Change the model or reset to the default (admin only) |
| `.clear` | Clear all histories and reset bot (admin only) |
| `.verbose [on \| off]` | Enable or disable verbose mode |
- Full docs index: docs/index.md
- Ollama setup: docs/ollama.md
- Quickstart: docs/getting-started.md
- Configuration: docs/configuration.md
- Commands: docs/commands.md
- CLI: docs/cli.md
- Architecture: docs/architecture.md
- Tools & MCP: docs/tools-and-mcp.md
- Operations: docs/operations.md
- Development: docs/development.md
- Migration: docs/migration.md
- Legacy map: docs/legacy-map.md
- Refactor plan: docs/refactor-plan.md
- Security: docs/security.md
- AI output disclaimer: docs/ai-output-disclaimer.md
- Not a companion: docs/not-a-companion.md
Build and run:
```bash
docker build -t ollamarama-irc .
docker run --rm -v $PWD/config.json:/app/config.json:ro ollamarama-irc
```

Compose:

```yaml
services:
  bot:
    build: .
    volumes:
      - ./config.json:/app/config.json:ro
    environment:
      - OLLAMARAMA_OLLAMA_URL=http://host.docker.internal:11434/api/chat
```

Example usage:

```
.ai Hello, how are you today?
botname: What's the weather like?
.persona Sherlock Holmes
.x alice What do you think about this mystery?
.custom You are a helpful programming assistant
```