$ git clone https://github.com/ixdlabs/django-api-template

You should have Python 3.11 or above installed on your system. Use venv to install the dependencies.
Use the following commands to create and activate the environment.
$ python -m venv .venv
$ source .venv/bin/activate
# Or on Windows
$ .venv\Scripts\activate

After the environment is created and activated, install the necessary dependencies using the requirements.txt file. (Make sure you are in the correct virtual environment.)
$ pip install -r requirements.txt

Check whether the following command prints the available commands. If the installation was successful, this should not raise an error.
$ python manage.py

Some operations in the project rely on environment variables. These are listed in the .env.example file. To configure them, copy this file and rename it to .env. The Django project automatically loads variables from this file, allowing you to set environment-specific values conveniently.
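For reference, a minimal .env for local development might look like the following. All of these variables appear later in this guide; treat .env.example as the authoritative list.

DATABASE_URL=postgres://db_user:password@localhost:5432/django_api_template_db
USE_CELERY=True
CELERY_BROKER_URL=amqp://guest:guest@localhost:5672
USE_MAILCAPTURE=True
OTEL_EXPORTER_OTLP_ENDPOINT=http://127.0.0.1:4317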
This project uses pre-commit hooks to check the code before committing. Install the hooks by running the following command.
$ pre-commit install

For static type checking, this project uses mypy. You can run it using the following command.
$ mypy .

This project includes a testing and coverage setup using Django's built-in test framework and coverage.py.
To run all the tests:
$ python manage.py test

To run tests with coverage:
$ coverage run --source='.' manage.py test

To generate an HTML coverage report:
$ coverage html

The coverage report will be available at htmlcov/index.html.
This project uses sqlite3 as the default database for development. It will automatically create a sqlite.db file in the project root.
This was chosen for simplicity and ease of use during development. You can start the project without any additional database setup.
If you want to emulate the production environment, you can use PostgreSQL as the database. Follow the instructions below to set it up.
Install PostgreSQL and set it up according to your OS instructions. Use the following command to log in to the psql shell.
$ psql -U postgres

Then enter the commands below.
CREATE ROLE db_user WITH LOGIN PASSWORD 'password';
CREATE DATABASE django_api_template_db;
GRANT ALL PRIVILEGES ON DATABASE django_api_template_db TO db_user;
\q

Then log in to psql as db_user and check that the setup works. The password should be password.
$ psql -U db_user django_api_template_db

Remember to set the DATABASE_URL environment variable.
DATABASE_URL=postgres://db_user:password@localhost:5432/django_api_template_db
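A DATABASE_URL of this form is typically parsed into Django's DATABASES setting. Below is a minimal sketch of that pattern, assuming the dj-database-url helper; the template's actual settings code may differ.

# config/settings.py (illustrative sketch, assuming dj-database-url)
import dj_database_url

# Falls back to the local SQLite file when DATABASE_URL is unset.
DATABASES = {
    "default": dj_database_url.config(default="sqlite:///sqlite.db"),
}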
First, run the database migrations to create the necessary tables. Make sure you are in the correct virtual environment. Whenever there is a database model change, you should re-run this command.
$ python manage.py migrate

Then create the static files required for the project. You should run this again when you pull from upstream.
$ python manage.py collectstatic

Then compile the translation message files. Run this whenever you add or update translations or pull new locale files.
$ python manage.py compilemessages

Note: Install gettext if it is missing.
- Ubuntu/Debian: sudo apt-get install gettext
- macOS: brew install gettext && brew link --force gettext
- Windows (Chocolatey): choco install gettext
Finally, create the user account. This will be the default admin user for the system. Provide your preferred username and password.
# This will create the first super admin (nothing will happen if there is one already)
$ python manage.py superuser --username superadmin --email [email protected] --password userpassword
# Or to create interactively
$ python manage.py createsuperuser
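The superuser command is idempotent, so it is safe to run on every deployment. Below is a minimal sketch of how such a management command can be implemented; it is illustrative only, not necessarily the template's exact code.

# management/commands/superuser.py (illustrative sketch)
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Create the first super admin if none exists."

    def add_arguments(self, parser):
        parser.add_argument("--username", required=True)
        parser.add_argument("--email", required=True)
        parser.add_argument("--password", required=True)

    def handle(self, *args, **options):
        user_model = get_user_model()
        # Nothing happens if a superuser already exists.
        if user_model.objects.filter(is_superuser=True).exists():
            self.stdout.write("A superuser already exists; skipping.")
            return
        user_model.objects.create_superuser(
            username=options["username"],
            email=options["email"],
            password=options["password"],
        )
        self.stdout.write("Superuser created.")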
Afterward, try running the project. The default URL for the dashboard is http://127.0.0.1:8000/.

$ python manage.py runserver

This project uses Django as the web framework. Django is a high-level Python web framework that encourages rapid development and clean, pragmatic design.
Slight modifications are made to the default Django project structure to make it more modular and easier to maintain. For example, the apps are put inside the apps directory, and settings.py and other configuration files are moved to the config directory.
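At a glance, the resulting layout looks roughly like this (abridged):

.
├── apps/            # Django apps live here
├── config/          # settings.py, wsgi entry point, and other configuration
├── locale/          # translation files
├── tools/           # Docker setups (rabbitmq, redis, mailpit) and infra configs
├── manage.py
└── requirements.txt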
This project uses RabbitMQ as the message broker for Celery and other asynchronous tasks. You can run RabbitMQ using Docker with the following commands, alongside the Django server.
$ cd tools/rabbitmq
$ docker build -t rabbitmq .
$ docker run -d --name rabbitmq -p 5672:5672 -p 15672:15672 rabbitmq

You can access the RabbitMQ management interface at http://localhost:15672/ with the default username and password, both guest.
This project uses Redis for caching and background task queuing.
To run Redis locally with authentication and optional ACL (username + password) support:
$ cd tools/redis
$ docker build -t redis .
$ docker run -d --name redis -p 6379:6379 -e REDIS_PASSWORD=password redis

If Redis is not set up, the cache will fall back to django.core.cache.backends.locmem.LocMemCache.
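In either case, application code uses Django's standard cache API, so it behaves the same against Redis or LocMemCache:

from django.core.cache import cache

# Store a value for five minutes, then read it back.
cache.set("greeting", "hello", timeout=300)
value = cache.get("greeting")  # "hello", or None after expiry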
Mailpit is a tool to capture outgoing emails. You can use it to test the email-sending functionality. Install Mailpit using Docker with the following commands, alongside the Django server.
$ cd tools/mailpit
$ docker build -t mailpit .
$ docker run -d --name mailpit -p 8025:8025 -p 1025:1025 mailpit

You can access the Mailpit interface at http://localhost:8025/ to view the captured emails.
For the project to use Mailpit, you need to set the USE_MAILCAPTURE environment variable to True in the .env file. This will enable the email capture functionality in the project.
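With USE_MAILCAPTURE enabled, any email sent through Django's standard mail API shows up in the Mailpit UI. A quick way to test this is from the Django shell (python manage.py shell); the addresses below are placeholders.

from django.core.mail import send_mail

# With capture enabled, this message appears at http://localhost:8025/.
send_mail(
    subject="Test email",
    message="Hello from the Django API template.",
    from_email="[email protected]",
    recipient_list=["[email protected]"],
)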
This project uses Celery for handling asynchronous tasks. Celery is a distributed task queue that allows you to run tasks in the background, making your application more responsive and efficient. As the broker, it uses RabbitMQ, which is set up in the previous section.
For the Celery worker to run, you also need to set the CELERY_BROKER_URL environment variable in the .env file. The default value is set to amqp://guest:guest@localhost:5672, which is the RabbitMQ setup we created earlier.
Celery worker can be run with the following command.
$ celery -A config worker -l info # Celery worker - handles the tasks
$ celery -A config beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler # Celery beat - schedules the periodic tasks

If you prefer not to use Celery, such as during development, you can disable it by setting the following environment variable:
USE_CELERY=False
When this is set, you do not need to run the Celery workers or set up RabbitMQ. However, note that scheduled tasks and asynchronous background jobs will not function in this mode. This configuration is intended only for development or testing, not for production use.
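For reference, Celery tasks are ordinary Python functions decorated with @shared_task. A sketch of what the apps.example.tasks.task path referenced in the scheduled-tasks table later in this guide might look like (illustrative; the template's actual task may differ):

# apps/example/tasks.py (illustrative sketch)
from celery import shared_task


@shared_task
def task():
    # Executed by a Celery worker, either when queued with task.delay()
    # or on the schedule configured in the Django Admin via Celery beat.
    print("Example task executed")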
You can start the server in a production-ready configuration using Docker Compose. Ensure Docker and Docker Compose are installed on your system beforehand.
From the project root directory, run:
$ docker-compose up --build

This command will build the images and start all necessary services (e.g., web, worker, beat, RabbitMQ, PostgreSQL).
Using the launch.json included with this project, you can either run each component separately or run everything at once via the Start All option under Run and Debug.
The following settings are recommended for your settings.json.
{
"editor.formatOnSave": true,
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter"
},
"black-formatter.args": ["--line-length=120"],
"files.exclude": {
"**/__pycache__": true,
"**/.mypy_cache": true,
"**/.venv": true
},
"python.analysis.autoImportCompletions": true,
"python.analysis.indexing": true,
"python.analysis.packageIndexDepths": [
{
"name": "django",
"depth": 10,
"includeAllSymbols": true
},
{
"name": "rest_framework",
"depth": 10,
"includeAllSymbols": true
},
{
"name": "drf_spectacular",
"depth": 5,
"includeAllSymbols": true
},
{
"name": "unfold",
"depth": 10,
"includeAllSymbols": true
}
]
}

Additionally, to make sure unit tests are discovered properly by VS Code, the following configuration is needed in the project-level settings.json file. This file is committed in the repo.
{
"python.testing.unittestArgs": ["-p", "test_*.py"],
"python.testing.pytestEnabled": false,
"python.testing.unittestEnabled": true
}

Furthermore, in the .env file, add the following:
MANAGE_PY_PATH=manage.py

To add localization support, refer to the official Django localization guide.
To add Sinhala translations (as an example), run the following command:
$ python manage.py makemessages -l si

This will create .po files under locale/si/LC_MESSAGES/.
Edit these files to provide translations for your strings.
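Note that makemessages only extracts strings that are marked for translation in the source, for example:

from django.utils.translation import gettext_lazy as _

# Marked strings are extracted into the .po files by makemessages.
error_message = _("Something went wrong.")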
If you need to override translations for external packages (e.g., rest_framework):

- Create a new app under apps/i18n/ named after the package, e.g., apps/i18n/rest_framework_locale/.
- Inside this app, add a locale/ directory and copy the .po file from the original package, e.g., apps/i18n/rest_framework_locale/locale/si/LC_MESSAGES/django.po.
- Modify the translations as needed.
- Add this app to I18N_OVERRIDE_APPS to ensure your overrides take precedence (see the sketch after this list).
- Don't forget to compile the translations after editing.
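A sketch of the settings entry; the exact value format depends on the template's settings code.

# config/settings.py (illustrative sketch)
I18N_OVERRIDE_APPS = [
    "apps.i18n.rest_framework_locale",
]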
To enable full functionality of the system, the following scheduled tasks must be configured via the Django Admin panel (under Scheduled Tasks). Each task includes a description and its recommended schedule.
| Task | Details |
|---|---|
| Example Task | Path: apps.example.tasks.task. Description: Example task. Schedule: Monthly on the 1st (Crontab: 0 0 1 * *). Args: None |
This project uses OpenTelemetry for observability. It is recommended to use SigNoz as the OpenTelemetry backend, though any compatible backend will work.
To enable telemetry, set the environment variable OTEL_EXPORTER_OTLP_ENDPOINT to point to your OpenTelemetry backend (e.g., http://127.0.0.1:4317 for gRPC, or http://127.0.0.1:4318 for HTTP). This endpoint should be reachable from your application container or host.
Note: When using Docker Compose, the default configuration sets OTEL_EXPORTER_OTLP_ENDPOINT to the host network's port 4317.
Ensure that all relevant environment variables (e.g., OTEL_EXPORTER_OTLP_ENDPOINT) are set before starting the application.
# Start the development server with OpenTelemetry enabled
python manage.py runserver --noreload
# Start the production server using Gunicorn
gunicorn --config tools/infra/gunicorn.conf.py config.wsgi:application
# Start using Docker Compose (assumes backend is running at localhost:4317)
docker-compose up --build
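In addition to whatever instrumentation the project configures, you can create custom spans through the OpenTelemetry API. A minimal sketch, assuming the opentelemetry-api package is installed (which an OpenTelemetry-enabled setup implies):

from opentelemetry import trace

tracer = trace.get_tracer(__name__)


def expensive_operation():
    # The span is exported to the configured OTLP endpoint (e.g., SigNoz).
    with tracer.start_as_current_span("expensive_operation"):
        ...  # the work being traced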