Django integration with RQ, a Redis-based Python queuing library. Django-RQ is a simple app that allows you to configure your queues in Django's `settings.py` and easily use them in your project.
- Install `django-rq` (or download from PyPI):

  ```
  pip install django-rq
  ```
- Add `django_rq` to `INSTALLED_APPS` in `settings.py`:

  ```python
  INSTALLED_APPS = (
      # other apps
      "django_rq",
  )
  ```
- Configure your queues in Django's `settings.py` (the syntax is modeled on Django's database configuration):

  ```python
  import os  # needed for os.getenv() below

  RQ_QUEUES = {
      'default': {
          'HOST': 'localhost',
          'PORT': 6379,
          'DB': 0,
          'PASSWORD': 'some-password',
      },
      'high': {
          'URL': os.getenv('REDISTOGO_URL', 'redis://localhost:6379'),  # If you're on Heroku
          'DB': 0,
      },
      'low': {
          'HOST': 'localhost',
          'PORT': 6379,
          'DB': 0,
      },
  }
  ```
- Include `django_rq.urls` in your `urls.py` (see the note after this list if your Django version no longer provides `patterns`):

  ```python
  urlpatterns += patterns('',
      (r'^admin/django_rq/', include('django_rq.urls')),
  )
  ```
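The `patterns()` helper shown above was deprecated in Django 1.8 and later removed. On such versions, a roughly equivalent hook-up is sketched below; the import path and `url()` usage are assumptions about your Django version, not part of django-rq itself:

```python
# A minimal sketch for Django versions without patterns();
# assumes django.conf.urls.url is still available (Django 1.8-3.x).
from django.conf.urls import include, url

urlpatterns += [
    url(r'^admin/django_rq/', include('django_rq.urls')),
]
```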
Django-RQ allows you to easily put jobs into any of the queues defined in `settings.py`. It comes with a few utility functions:
`enqueue` - push a job to the `default` queue:

```python
import django_rq
django_rq.enqueue(func, foo, bar=baz)
```
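To make the placeholders concrete, here is a minimal sketch; `send_welcome_email` and its argument are illustrative and not part of django-rq:

```python
import django_rq

def send_welcome_email(user_id):
    # Hypothetical task; any importable function can be enqueued.
    print("Sending welcome email to user %s" % user_id)

# Pushes the call onto the "default" queue; an rqworker process
# picks it up and executes it outside the request/response cycle.
django_rq.enqueue(send_welcome_email, 42)
```

Note that workers import the enqueued function by its dotted path, so the callable has to live in an importable module rather than, say, an interactive shell.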
`get_queue` - accepts a single queue name argument (defaults to "default") and returns an RQ `Queue` instance for you to queue jobs into:

```python
import django_rq
queue = django_rq.get_queue('high')
queue.enqueue(func, foo, bar=baz)
```
`get_connection` - accepts a single queue name argument (defaults to "default") and returns a connection to the queue's Redis server:

```python
import django_rq
redis_conn = django_rq.get_connection('high')
```
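The object you get back is a regular redis-py connection, so you can use it directly or pass it to RQ's own classes. A brief sketch (constructing a `Queue` by hand here only mirrors what `get_queue` already does for you):

```python
import django_rq
from rq import Queue

redis_conn = django_rq.get_connection('high')
redis_conn.ping()  # plain redis-py call; confirms the server is reachable

# Hand the same connection to RQ directly; equivalent to get_queue('high').
queue = Queue('high', connection=redis_conn)
```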
`get_worker` - accepts optional queue names and returns a new RQ `Worker` instance for the specified queues (or the `default` queue):

```python
import django_rq
worker = django_rq.get_worker()  # Returns a worker for "default" queue
worker.work()
worker = django_rq.get_worker('low', 'high')  # Returns a worker for "low" and "high" queues
```
To easily turn a callable into an RQ task, you can also use the `@job` decorator that comes with `django_rq`:

```python
from django_rq import job

@job
def long_running_func():
    pass
long_running_func.delay()  # Enqueue function in "default" queue

@job('high')
def long_running_func():
    pass
long_running_func.delay()  # Enqueue function in "high" queue
```
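`delay()` hands back an RQ `Job` instance that you can hold on to. A short sketch (the names are illustrative, and `result` stays `None` until a worker has actually processed the job):

```python
from django_rq import job

@job('high')
def add(x, y):
    return x + y

queued = add.delay(3, 4)  # enqueued on the "high" queue
print(queued.id)          # the RQ job id
print(queued.result)      # None until a worker runs it, then 7
```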
`django_rq` provides a management command that starts a worker for every queue specified as arguments:

```
python manage.py rqworker high default low
```
If you want to run `rqworker` in burst mode, you can pass in the `--burst` flag:

```
python manage.py rqworker high default low --burst
```
If you have RQ Scheduler installed, you can also use the `get_scheduler` function to return a `Scheduler` instance for queues defined in `settings.py`'s `RQ_QUEUES`. For example:

```python
from datetime import datetime

import django_rq

scheduler = django_rq.get_scheduler('default')
job = scheduler.enqueue_at(datetime(2020, 10, 10), func)
```
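rq-scheduler also supports relative scheduling via `enqueue_in`. A brief sketch (this relies on rq-scheduler's own API and on its `rqscheduler` daemon running, neither of which django-rq provides):

```python
from datetime import timedelta

import django_rq

scheduler = django_rq.get_scheduler('default')

# Enqueue func roughly ten minutes from now; the rqscheduler
# process must be running for the job to be moved onto the queue.
scheduler.enqueue_in(timedelta(minutes=10), func)
```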
`django_rq` also provides a very simple dashboard to monitor the status of your queues at `/admin/django_rq/`.

If you need a more sophisticated monitoring tool for RQ, you could also try rq-dashboard.
Starting from version 0.3.3, RQ uses Python's `logging`, which means you can easily configure `rqworker`'s logging mechanism in Django's `settings.py`. For example:
```python
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "rq_console": {
            "format": "%(asctime)s %(message)s",
            "datefmt": "%H:%M:%S",
        },
    },
    "handlers": {
        "rq_console": {
            "level": "DEBUG",
            "class": "rq.utils.ColorizingStreamHandler",
            "formatter": "rq_console",
            "exclude": ["%(asctime)s"],
        },
    },
    "loggers": {
        "rq.worker": {
            "handlers": ["rq_console"],
            "level": "DEBUG",
        },
    },
}
```
For an easier testing process, you can run a worker synchronously this way:

```python
from django.test import TestCase
from django_rq import get_worker


class MyTest(TestCase):
    def test_something_that_creates_jobs(self):
        ...  # Stuff that creates jobs.
        get_worker().work(burst=True)  # Processes all queued jobs, then stops.
        ...  # Assert that the jobs did what they should.
```
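You can also assert on the queue state itself by combining this with `get_queue`. A sketch, assuming plain RQ `Queue` methods such as `is_empty()` and a throwaway `add` task defined just for the test:

```python
from django.test import TestCase
from django_rq import get_queue, get_worker


def add(x, y):
    # Hypothetical task used only for this sketch.
    return x + y


class QueueStateTest(TestCase):
    def test_burst_worker_drains_queue(self):
        queue = get_queue('default')
        queue.enqueue(add, 1, 2)
        self.assertFalse(queue.is_empty())  # the job is waiting

        get_worker().work(burst=True)       # process everything, then stop
        self.assertTrue(queue.is_empty())   # queue drained
```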
To run `django_rq`'s test suite:

```
django-admin.py test django_rq --settings=django_rq.tests.settings --pythonpath=.
```
- Added `--burst` option to the `rqworker` management command
- Added support for Python's `logging`, introduced in RQ 0.3.3
- Fixed a bug that causes jobs using RQ's new `get_current_job` to fail when executed through the `rqworker` management command
- Fixed a minor bug in accessing the `rq_job_detail` view
- More improvements to `/admin/django_rq/`:
  - Views now require staff permission
  - Jobs can now be deleted from queues
  - Failed jobs' tracebacks are better formatted
- Greatly improved `/admin/django_rq/`; you can now:
  - See jobs in each queue, including the failed queue
  - See each job's detailed information
- Simplified `@job` decorator syntax for enqueuing to the "default" queue
- Queues can now be configured using the `URL` parameter in `settings.py`
- Added support for RQ's `@job` decorator
- Added the `get_worker` command
- The `PASSWORD` key in `RQ_QUEUES` will now be used when connecting to Redis