
Conversation


@millsks millsks commented Nov 30, 2025

Description

This PR adds django-rq as an optional background task backend. Projects can now opt into django-rq as a simpler alternative to Celery when generating a new project. Celery remains fully supported and unchanged.

What's included:

  • New use_django_rq cookiecutter option
  • django-rq dependency added to requirements when enabled
  • RQ configuration in settings, reusing existing Redis infrastructure
  • Docker Compose rqworker service and Procfile worker definition
  • Documentation covering worker setup, admin dashboard, and metrics endpoints
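
For reference, django-rq ships an `rqworker` management command, so the Procfile worker definition could be as small as the line below. The exact entry in this PR's template isn't shown here, so treat this as a sketch:

```
worker: python manage.py rqworker default
```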

Key features:

  • Reuses existing Redis patterns from Celery integration
  • Includes admin dashboard, Prometheus metrics, and JSON stats API
  • Optional synchronous mode for easier debugging in local/test environments
  • Production-ready with systemd and Heroku deployment examples
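
The synchronous mode mentioned above corresponds to django-rq's per-queue `ASYNC` flag. A minimal, hedged sketch of how a settings module could wire it up (the `DJANGO_DEBUG` convention and the Redis URL default are illustrative, not taken from this PR):

```python
import os

# Illustrative convention: treat DJANGO_DEBUG=True as a local/test environment.
DEBUG = os.environ.get("DJANGO_DEBUG", "False") == "True"

RQ_QUEUES = {
    "default": {
        "URL": os.environ.get("REDIS_URL", "redis://localhost:6379/0"),
        "DEFAULT_TIMEOUT": 360,
        # With ASYNC=False, enqueued jobs execute immediately in-process,
        # which makes local debugging and test runs much simpler.
        "ASYNC": not DEBUG,
    },
}
```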

Checklist:

  • I've made sure that tests are updated accordingly (especially if adding or updating a template option)
  • I've updated the documentation or confirm that my change doesn't require any updates

Rationale

Many Django projects need a simple Redis-backed job queue without the complexity of Celery. django-rq is production-ready and provides exactly that. Since cookiecutter-django already integrates Redis for Celery, adding django-rq as an option has minimal overhead while giving users a lighter-weight choice for background tasks.

Open questions for maintainers:

  • Should use_celery and use_django_rq be mutually exclusive, or allow both for advanced setups?
  • Any preference for the option naming or default queue configuration?
  • Are there specific dependency management or Docker conventions I should follow more closely?

Resolves #6209

millsks and others added 8 commits November 30, 2025 10:30
- Added configuration options for Django-RQ in cookiecutter.json and project generation options documentation.
- Updated README.md to include Django-RQ features and setup instructions.
- Created a comprehensive guide for using Django-RQ, detailing task creation, enqueuing, and monitoring.
- Modified Docker and Docker Compose files to include services for Valkey and RQ components.
- Updated environment files to include VALKEY_URL for local and production setups.
- Adjusted Procfile to support RQ worker and scheduler commands.
- Enhanced base.py settings to configure RQ queues and added necessary imports in tasks and tests.
- Implemented cleanup hooks in post_gen_project.py to remove RQ files if not used.
- Added new start scripts for RQ worker, scheduler, and dashboard in both local and production setups.

@browniebroke browniebroke left a comment


I didn't look into too much detail, but noted a few things

Comment on lines 37 to +38
"use_celery": "n",
"use_django_rq": "n",

Should use_celery and use_django_rq be mutually exclusive, or allow both for advanced setups?

AFAIU, Celery and RQ solve the same problem, so we should design the questions to avoid having both installed. I would change this question to something like "background_queue": ["None", "Celery", "Django-RQ"]. Basically, turn two Y/N questions into one drop-down question.
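
The proposed change would look roughly like this in `cookiecutter.json`, where the first list entry becomes the default choice (final naming is still open):

```json
{
  "background_queue": ["None", "Celery", "Django-RQ"]
}
```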

Comment on lines +108 to +112
valkey:
image: docker.io/valkey/valkey:8.0
container_name: {{ cookiecutter.project_slug }}_local_valkey
volumes:
- {{ cookiecutter.project_slug }}_local_valkey_data:/data

I know we're discussing migrating our Redis setup to Valkey, but we should perhaps stick to Redis for now. I would like to avoid a situation where we have different datastores depending on which background queue framework was used.


I'm happy for this file to describe what's specific to our setup, but please avoid general explanations of how to use django-rq. The package has docs for that (I hope), and when things change on their end, I would like to avoid this content going stale.

You can give a high level intro about django-rq, and explain how it's configured in the generated project, but then link to their docs please.

Comment on lines +346 to +358
RQ_QUEUES = {
"default": {
"URL": VALKEY_URL,
"DEFAULT_TIMEOUT": 360,
},
"high": {
"URL": VALKEY_URL,
"DEFAULT_TIMEOUT": 500,
},
"low": {
"URL": VALKEY_URL,
},
}

Not sure we need 3 queues by default, I think the Celery setup has a single queue... Maybe keep it simple for now?

(I haven't worked with django-rq, so I can't assess the implications/caveats/footguns that come with multiple queues consumed by a single worker.)
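
A single-queue variant along the lines suggested, mirroring the Celery template's single default queue, could be a sketch like this (the `REDIS_URL` handling is illustrative):

```python
import os

# Single default queue, matching the Celery template's one-queue approach.
REDIS_URL = os.environ.get("REDIS_URL", "redis://localhost:6379/0")

RQ_QUEUES = {
    "default": {
        "URL": REDIS_URL,
        "DEFAULT_TIMEOUT": 360,
    },
}
```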


Development

Successfully merging this pull request may close these issues.

Feature: Add django-rq as an alternative task queue
