A scalable real-time chat application built with FastAPI, Socket.IO, and Redis for message broadcasting across multiple instances.
- Real-time messaging using WebSockets via Socket.IO
- Scalable architecture with Redis as a message broker
- Global chat room for all connected users
- User join/leave notifications
- Docker containerization for easy deployment
- Nginx as a reverse proxy for load balancing
- Backend: FastAPI (Python)
- Real-time Communication: Socket.IO
- Message Broker: Redis
- Containerization: Docker & Docker Compose
- Reverse Proxy: Nginx
- Docker and Docker Compose
- Git
Clone the repository:

```bash
git clone <repository-url>
cd chat_app
```

The application uses environment variables for configuration. These are set in the docker-compose.yml file:
- `PORT`: The port on which the FastAPI application runs (default: 8000)
- `REDIS_HOST`: Hostname for Redis (default: redis)
- `REDIS_PORT`: Port for Redis (default: 6379)
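The backend presumably reads these values at startup; a minimal sketch of that, assuming standard `os.getenv` lookups (the exact handling in `main.py` may differ):

```python
import os

# Defaults mirror the values listed above; docker-compose.yml overrides them per service.
PORT = int(os.getenv("PORT", "8000"))
REDIS_HOST = os.getenv("REDIS_HOST", "redis")
REDIS_PORT = int(os.getenv("REDIS_PORT", "6379"))

# Connection URL handed to the Redis-backed Socket.IO manager.
REDIS_URL = f"redis://{REDIS_HOST}:{REDIS_PORT}"
```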
Build and start the containers:
```bash
docker compose up --build
```

The application will be available at:
- Frontend: http://localhost
- Backend API: http://localhost/api
For local development without Docker:
- Install Python 3.8+ and Redis
- Create a virtual environment:
```bash
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
```
- Install dependencies:
```bash
pip install -r backend/requirements.txt
```
- Run Redis locally
- Start the application:
```bash
cd backend
uvicorn main:socket_app --reload --host 0.0.0.0 --port 8000
```
- `GET /`: Root endpoint, returns a welcome message
- `GET /health`: Health check endpoint, verifies the Redis connection
- `POST /send_message`: Send a message to all connected clients
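A quick smoke test of these endpoints from Python might look like the following sketch. The base URL assumes the Docker/Nginx setup described above, and the JSON body for `/send_message` is an assumption; check `main.py` for the exact schema:

```python
import requests

# Through Nginx; use http://localhost:8000 when running uvicorn directly.
BASE_URL = "http://localhost/api"

# Verify the service is up and can reach Redis.
health = requests.get(f"{BASE_URL}/health")
print(health.status_code, health.text)

# Broadcast a message to all connected clients.
# The payload field name ("message") is an assumption.
resp = requests.post(f"{BASE_URL}/send_message", json={"message": "Hello from the REST API"})
print(resp.status_code, resp.text)
```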
- `chat_message` (client → server): Send a chat message to all users in the global chat room
- `connected` (server → client): Sent when a client connects, includes the session ID
- `message` (server → client): Received chat messages or system notifications
- `broadcast` (server → client): System messages about user activity (join/leave)
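For manual testing of these events, a small client using the `python-socketio` package (`pip install "python-socketio[client]"`) can be sketched as follows; the payload shapes are assumptions and should be matched against `main.py`:

```python
import socketio

sio = socketio.Client()

@sio.on("connected")
def on_connected(data):
    print("connected:", data)   # expected to include the session ID

@sio.on("message")
def on_message(data):
    print("message:", data)     # chat messages and system notifications

@sio.on("broadcast")
def on_broadcast(data):
    print("broadcast:", data)   # join/leave activity

# Connect directly to one backend instance, or to http://localhost via Nginx.
sio.connect("http://localhost:8000")

# Emit a chat message to the global room; the payload shape is an assumption.
sio.emit("chat_message", {"message": "Hello, world!"})
sio.wait()
```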
chat_app/
├── backend/
│ ├── main.py # FastAPI and Socket.IO application
│ ├── Dockerfile # Docker configuration for backend
│ └── requirements.txt # Python dependencies
├── docker-compose.yml # Docker Compose configuration
├── nginx.conf # Nginx configuration
└── index.html # Simple frontend for testing
The application is designed to scale horizontally. The Redis message broker allows multiple instances of the application to communicate with each other, ensuring that messages are broadcast to all connected clients regardless of which instance they are connected to.
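The broker wiring in `main.py` most likely follows the standard `python-socketio` pattern; the sketch below shows that pattern under the assumption that `AsyncServer` and `AsyncRedisManager` are used, and is not necessarily the exact code in the repository:

```python
import os

import socketio
from fastapi import FastAPI

app = FastAPI()

# The Redis-backed client manager relays events between instances, so a message
# emitted on one container reaches clients connected to any other container.
redis_url = f"redis://{os.getenv('REDIS_HOST', 'redis')}:{os.getenv('REDIS_PORT', '6379')}"
mgr = socketio.AsyncRedisManager(redis_url)

sio = socketio.AsyncServer(async_mode="asgi", cors_allowed_origins="*", client_manager=mgr)

# socket_app is the ASGI entry point referenced by "uvicorn main:socket_app".
socket_app = socketio.ASGIApp(sio, other_asgi_app=app)
```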
To scale the application, you can use Docker Compose:
```bash
docker compose up --scale chat=3
```

If the application cannot connect to Redis, check the following (a quick connectivity probe is sketched after this list):
- Redis service is running
- Environment variables are correctly set
- Network connectivity between services
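To rule out Redis itself, run a direct PING against the same host and port the application uses. This sketch relies on the `redis` Python package and the environment variables described above:

```python
import os

import redis

r = redis.Redis(
    host=os.getenv("REDIS_HOST", "redis"),
    port=int(os.getenv("REDIS_PORT", "6379")),
    socket_connect_timeout=2,
)

try:
    # PING succeeds only if the host, port, and network path are all correct.
    print("Redis PING:", r.ping())
except redis.exceptions.ConnectionError as exc:
    print("Could not reach Redis:", exc)
```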
If clients cannot connect via WebSocket:
- Check browser console for errors
- Verify Nginx is properly configured for WebSocket proxying
- Ensure CORS settings are appropriate for your environment
Contributions are welcome! Please feel free to submit a Pull Request.