- Multiprocessing worker pool
- Encrypted and compressed task packages
- Scheduled tasks
- Result hooks
- Failure and result database
- PaaS compatible with multiple instances
- Django Admin integration
- Multi cluster monitor
- Redis-py
- Django >= 1.7
- Django-picklefield
- Arrow
- Blessed
Tested with: Python 2.7 and 3.4; Django 1.7.8 and 1.8.2*
*Django Q is currently in Alpha and, as such, not yet safe for production.
- Install the latest version with pip:
pip install django-q
Add django_q to INSTALLED_APPS in your settings.py:
INSTALLED_APPS = (
    # other apps
    'django_q',
)
- Run
python manage.py migrate
to create the database tables
- Make sure you have a Redis server running somewhere
All configuration settings are optional, e.g.:
# settings.py
Q_CLUSTER = {
    'name': 'myproject',
    'workers': 8,
    'recycle': 500,
    'compress': True,
    'save_limit': 250,
    'label': 'Django Q',
    'redis': {
        'host': '127.0.0.1',
        'port': 6379,
        'db': 0,
    }
}
- name Used to differentiate between projects using the same Redis server* ['default']
- workers The number of workers to use in the cluster [CPU count]
- recycle The number of tasks a worker will process before respawning. Used to release resources. [500]
- compress Compress task packages to Redis. Useful for large payloads. [False]
- save_limit Limits the number of successful tasks saved to Django. Set to 0 for unlimited. Set to -1 for no success storage at all. Failures are always saved. [250]
- label The label used for the Django Admin page ['Django Q']
- redis Connection settings for Redis. Follows standard Redis-Py syntax. [localhost]
*Django Q uses your SECRET_KEY to encrypt task packages and prevent task crossover.
Start a cluster with: python manage.py qcluster
Monitor your clusters with: python manage.py qmonitor
Use async from your code to quickly offload tasks:
from django_q import async, result
# create the task
async('math.copysign', 2, -2)
# or with an import and storing the id
from math import copysign
task_id = async(copysign, 2, -2)
# get the result
task_result = result(task_id)
# result returns None if the task has not been executed yet
# so in most cases you will want to use a hook:
async('math.modf', 2.5, hook='hooks.print_result')
# hooks.py
def print_result(task):
print(task.result)
async(func, *args, **kwargs)
- func: Function to execute. Dotted string or reference.
- args: Optional arguments for the function.
- hook: Optional function to call after execution. Dotted string or reference.
- kwargs: Optional keyword arguments for the function.
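Under the hood, a dotted string such as 'math.copysign' has to be resolved to a callable before a worker can run it. A minimal sketch of that resolution using only the standard library (an illustration, not Django Q's actual code):

```python
from importlib import import_module


def resolve(dotted):
    """Resolve a dotted string like 'math.copysign' to a callable."""
    module_path, _, attr = dotted.rpartition('.')
    return getattr(import_module(module_path), attr)


print(resolve('math.copysign')(2, -2))  # -2.0
```

Passing a direct function reference skips this lookup, which is why async accepts both forms.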
Schedules are regular Django models. You can manage them through the Admin page or directly from your code:
from django_q import Schedule, schedule
# Use the schedule wrapper
schedule('math.copysign',
2, -2,
hook='hooks.print_result',
schedule_type=Schedule.DAILY)
# Or create the object directly
Schedule.objects.create(func='math.copysign',
hook='hooks.print_result',
args='2,-2',
schedule_type=Schedule.DAILY
)
schedule(func, *args, **kwargs)
- func: the function to schedule. Dotted strings only.
- args: arguments for the scheduled function.
- hook: optional result hook function. Dotted strings only.
- schedule_type: (O)nce, (H)ourly, (D)aily, (W)eekly, (M)onthly, (Q)uarterly, (Y)early
- repeats: Number of times to repeat schedule. -1=Always, 0=Never, n=n.
- next_run: Next or first scheduled execution datetime.
- kwargs: optional keyword arguments for the scheduled function.
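For example, to run a task once at a specific time you can build next_run with the standard library. The schedule call itself (commented out below) assumes a configured Django project with django_q installed:

```python
from datetime import datetime, timedelta

# Tomorrow at 06:00 - an arbitrary example time.
tomorrow = datetime.now() + timedelta(days=1)
next_run = tomorrow.replace(hour=6, minute=0, second=0, microsecond=0)

# Inside a configured Django project you would then call:
# from django_q import Schedule, schedule
# schedule('math.copysign', 2, -2,
#          schedule_type=Schedule.ONCE,
#          next_run=next_run)
```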
- Task and Schedule are Django Models and can therefore be managed by your own code.
- Task objects are only created after an async package has been executed.
- A Schedule creates a new async package for every execution, and thus a unique Task.
- Success and Failure are convenient proxy models of Task.
To run the tests you will need py.test and pytest-django
- Write sphinx documentation
- Better tests and coverage
- Get out of Alpha
- Fewer dependencies?