# FloodRisk

A comprehensive flood risk prediction system using machine learning and hydrological modeling. This project combines physics-informed neural networks with multi-scale CNN architectures to predict flood depths and assess flood risk.

## Features
- Multi-Scale CNN Architecture: U-Net based encoder-decoder with multi-scale input processing
- Physics-Informed Constraints: Mass conservation and hydrological physics integration
- Geospatial Processing: DEM processing, terrain feature extraction, and hydrological conditioning
- Real-time Prediction API: FastAPI-based REST API for flood predictions
- Scalable Architecture: Docker containerization with Redis caching and PostgreSQL storage
- Monitoring & Observability: Integrated Prometheus metrics and Grafana dashboards
## Prerequisites

- Docker and Docker Compose
- Python 3.11+
- Git
## Quick Start

1. Clone the repository

   ```bash
   git clone <repository-url>
   cd FloodRisk
   ```

2. Set up environment variables

   ```bash
   cp .env.example .env
   # Edit .env with your configuration
   ```

3. Start the development environment

   ```bash
   make dev-up
   ```

4. Run database migrations

   ```bash
   make db-migrate
   ```

5. Access the application

   - API: http://localhost:8000
   - API Documentation: http://localhost:8000/docs
   - Grafana Dashboard: http://localhost:3000 (admin/admin123)
   - Jupyter Notebook: http://localhost:8888
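Once the stack is up, the prediction endpoint can also be called from Python. The payload below is a hypothetical sketch of the request shape (field names follow the curl example in the API section; the authoritative schema is the OpenAPI spec at http://localhost:8000/docs):

```python
import json
import urllib.request

# Hypothetical payload shape -- check /docs for the real request schema.
payload = {
    "elevation_data": [[101.2, 100.8], [100.5, 99.9]],
    "rainfall_data": [12.5, 30.0, 8.2],
    "terrain_features": {"slope": 0.04, "hand": 1.8},
}


def predict(base_url: str = "http://localhost:8000") -> dict:
    """POST the payload to /api/v1/predict and return the parsed JSON reply."""
    request = urllib.request.Request(
        f"{base_url}/api/v1/predict",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```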
## Local Development Setup

1. Create a virtual environment

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

2. Install dependencies

   ```bash
   pip install -r requirements.txt
   pip install -e .
   ```

3. Set up pre-commit hooks

   ```bash
   pre-commit install
   ```

4. Run tests

   ```bash
   make test
   ```
## Development Commands

```bash
# Start all services
make dev-up

# View logs
make logs

# Stop services
make dev-down

# Rebuild containers
make build
```

## Project Structure

```
FloodRisk/
├── src/                     # Source code
│   ├── models/              # ML models and architectures
│   │   ├── flood_cnn.py     # Multi-scale CNN implementation
│   │   └── __init__.py
│   ├── preprocessing/       # Data preprocessing modules
│   │   ├── dem/             # DEM processing
│   │   └── terrain/         # Terrain feature extraction
│   ├── api/                 # FastAPI application
│   ├── tasks/               # Background tasks (Celery)
│   └── utils/               # Utility functions
├── tests/                   # Test suite
├── docs/                    # Documentation
├── data/                    # Data directory
├── models/                  # Trained model artifacts
├── logs/                    # Application logs
├── docker-compose.yml       # Docker services configuration
├── Dockerfile               # Docker image definition
├── requirements.txt         # Python dependencies
├── Makefile                 # Development commands
└── pytest.ini               # Test configuration
```
## API Usage

Health check:

```bash
curl http://localhost:8000/health
```

Request a flood prediction:

```bash
curl -X POST "http://localhost:8000/api/v1/predict" \
  -H "Content-Type: application/json" \
  -d '{
    "elevation_data": [...],
    "rainfall_data": [...],
    "terrain_features": {...}
  }'
```

Upload a DEM:

```bash
curl -X POST "http://localhost:8000/api/v1/dem/upload" \
  -F "file=@path/to/dem.tif"
```

## Model Architecture

The system uses a multi-scale CNN architecture with:
- Input Scales: 256m high-resolution + 512m/1024m context
- Physics Constraints: Mass conservation loss functions
- Attention Mechanisms: Multi-scale feature fusion
- Dimensionless Features: Normalized terrain characteristics
### Key Components

- `PhysicsInformedLoss`: Incorporates hydrological constraints
- `MultiScaleEncoder`: Processes inputs at multiple resolutions
- `AttentionDecoder`: Fuses multi-scale features
- `RainfallScaling`: Handles dynamic rainfall inputs
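To make the physics term concrete, here is a minimal sketch of how a loss like `PhysicsInformedLoss` could combine a data-fit term with a soft mass-conservation penalty. It is illustrative only; the actual implementation lives in `src/models/flood_cnn.py`, and the `rainfall_volume` interface is an assumption:

```python
import torch
import torch.nn as nn


class PhysicsInformedLoss(nn.Module):
    """Data loss plus a soft mass-conservation penalty (illustrative sketch).

    The penalty compares total predicted water volume per sample against the
    volume implied by the rainfall input; `alpha` weights the physics term.
    """

    def __init__(self, alpha: float = 0.1):
        super().__init__()
        self.alpha = alpha
        self.mse = nn.MSELoss()

    def forward(
        self,
        pred_depth: torch.Tensor,   # (batch, H, W) predicted flood depths
        true_depth: torch.Tensor,   # (batch, H, W) reference depths
        rainfall_volume: torch.Tensor,  # (batch,) expected water volume
    ) -> torch.Tensor:
        data_loss = self.mse(pred_depth, true_depth)
        # Sum depths over the spatial grid -> predicted volume per sample.
        pred_volume = pred_depth.sum(dim=(-2, -1))
        mass_loss = ((pred_volume - rainfall_volume) ** 2).mean()
        return data_loss + self.alpha * mass_loss
```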
## Data Pipeline

1. DEM Preprocessing
   - Hydrological conditioning
   - Sink removal and filling
   - Flow direction calculation

2. Feature Extraction
   - Slope and curvature calculation
   - Flow accumulation analysis
   - Height Above Nearest Drainage (HAND)

3. Multi-scale Preparation
   - Resampling to multiple resolutions
   - Normalization and standardization
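As a concrete example of the feature-extraction step, slope can be derived from a DEM with central differences. This is a simplified sketch; the project's `preprocessing` modules work on hydrologically conditioned DEMs and add curvature, flow accumulation, and HAND on top:

```python
import numpy as np


def slope_magnitude(dem: np.ndarray, cell_size: float = 30.0) -> np.ndarray:
    """Per-cell slope (rise over run) from a DEM via central differences.

    `cell_size` is the grid spacing in metres; the result is dimensionless,
    matching the normalized terrain features the model consumes.
    """
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.sqrt(dz_dx**2 + dz_dy**2)
```

A plane rising one metre per metre of horizontal distance yields a uniform slope of 1.0, which is a handy sanity check for the chosen `cell_size`.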
## Testing

```bash
# Run all tests
make test

# Run with coverage
make test-cov

# Run a specific test module
pytest tests/test_models.py

# Run integration tests
pytest tests/integration/
```

## Monitoring

### Metrics

The application exposes Prometheus metrics at `/metrics`:
- Request latency and throughput
- Model prediction accuracy
- Resource utilization
- Background task status
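With the official `prometheus_client` package, custom metrics of this kind take only a few lines. The metric names below are hypothetical, not necessarily those the service exports:

```python
from prometheus_client import Counter, Histogram, generate_latest

# Hypothetical metric names; see /metrics for what the service actually exports.
PREDICTIONS = Counter(
    "floodrisk_predictions_total", "Flood predictions served", ["endpoint"]
)
LATENCY = Histogram(
    "floodrisk_request_latency_seconds", "Request latency in seconds", ["endpoint"]
)


def record_request(endpoint: str, seconds: float) -> None:
    """Record one request so it shows up in the next /metrics scrape."""
    PREDICTIONS.labels(endpoint=endpoint).inc()
    LATENCY.labels(endpoint=endpoint).observe(seconds)
```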
### Logging

Structured logging with configurable levels:

- Application logs: `./logs/floodrisk.log`
- Error tracking with Sentry (optional)

### Health Checks

- Application health: `/health`
- Database connectivity: `/health/db`
- Redis connectivity: `/health/redis`
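A deployment probe or smoke test can poll these endpoints; here is a minimal stdlib sketch (endpoint paths taken from the list above):

```python
import urllib.request


def check_health(base_url: str = "http://localhost:8000",
                 timeout: float = 2.0) -> dict:
    """Return {endpoint: True/False} for each of the service's health endpoints."""
    results = {}
    for path in ("/health", "/health/db", "/health/redis"):
        try:
            with urllib.request.urlopen(f"{base_url}{path}", timeout=timeout) as resp:
                results[path] = resp.status == 200
        except OSError:  # connection refused, DNS failure, timeout, HTTP error
            results[path] = False
    return results
```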
## Production Deployment

1. Set production environment variables

   ```bash
   export ENVIRONMENT=production
   export DEBUG=false
   export DATABASE_URL=postgresql://...
   ```

2. Build the production Docker image

   ```bash
   docker build --target production -t floodrisk:prod .
   ```

3. Configure SSL and security headers

   - Use strong secret keys
   - Enable SSL/TLS termination
   - Configure CORS appropriately
   - Implement rate limiting
   - Apply security updates regularly
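For the rate-limiting item, a token bucket is the usual building block. This stdlib sketch is illustrative only; in production you would typically enforce limits at the gateway, or share bucket state via the Redis instance the stack already runs:

```python
import time


class TokenBucket:
    """Allow `rate` requests per second with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; refill based on elapsed time."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```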
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Run the test suite
6. Submit a pull request

### Code Style

- Follow PEP 8
- Use Black for formatting
- Sort imports with isort
- Type hints required
- Docstrings for all functions
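For example, a contribution following these conventions (Black formatting, full type hints, a docstring) might look like this; `normalize_depths` is a hypothetical helper, not part of the codebase:

```python
def normalize_depths(depths: list[float], max_depth: float) -> list[float]:
    """Scale flood depths into [0, 1] relative to ``max_depth``.

    Args:
        depths: Raw flood depths in metres.
        max_depth: Depth treated as full scale; must be positive.

    Returns:
        Depths clipped to ``max_depth`` and divided by it.
    """
    if max_depth <= 0:
        raise ValueError("max_depth must be positive")
    return [min(d, max_depth) / max_depth for d in depths]
```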
## License

[License information]
## Support

- Documentation: [Link to docs]
- Issues: [GitHub Issues]
- Discussions: [GitHub Discussions]
## Acknowledgments

- LISFLOOD-FP modeling framework
- PyTorch community
- Geospatial Python ecosystem