Django Q

A multiprocessing task queue for Django

Features

  • Multiprocessing worker pool
  • Encrypted and compressed task packages
  • Scheduled tasks
  • Result hooks
  • Failure and result database
  • PaaS compatible with multiple instances
  • Django Admin integration
  • Multi cluster monitor

Requirements

Tested with: Python 2.7 and 3.4, Django 1.7.8 and 1.8.2*

*Django Q is currently in Alpha and as such not yet safe for production.

Installation

  • Install the latest version with pip: pip install django-q
  • Add django_q to INSTALLED_APPS in your settings.py:

    INSTALLED_APPS = (
        # other apps
        'django_q',
    )
  • Run python manage.py migrate to create the database tables
  • Make sure you have a Redis server running somewhere
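
To confirm that the Redis server from the last step is reachable, you can ping it from a Python shell. A minimal sketch, assuming the redis-py client is installed and Redis is running locally on the default port:

# quick connectivity check; assumes redis-py and a local Redis on the default port
import redis

r = redis.StrictRedis(host='127.0.0.1', port=6379, db=0)
print(r.ping())  # True if the server is reachable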

Configuration

All configuration settings are optional. For example:

# settings.py
Q_CLUSTER = {
    'name': 'myproject',
    'workers': 8,
    'recycle': 500,
    'compress': True,
    'save_limit': 250,
    'label': 'Django Q',
    'redis': {
        'host': '127.0.0.1',
        'port': 6379,
        'db': 0,
    }
}
  • name Used to differentiate between projects using the same Redis server* ['default']
  • workers The number of workers to use in the cluster [CPU count]
  • recycle The number of tasks a worker will process before respawning. Used to release resources. [500]
  • compress Compress task packages to Redis. Useful for large payloads. [False]
  • save_limit Limits the number of successful tasks saved to Django. Set to 0 for unlimited. Set to -1 for no success storage at all. Failures are always saved. [250]
  • label The label used for the Django Admin page ['Django Q']
  • redis Connection settings for Redis. Follows standard Redis-Py syntax. [localhost]

*Django Q uses your SECRET_KEY to encrypt task packages and prevent task crossover

Management Commands

qcluster

Start a cluster with: python manage.py qcluster

qmonitor

Monitor your clusters with: python manage.py qmonitor

Creating Tasks

Async

Use async from your code to quickly offload tasks:

from django_q import async, result

# create the task
async('math.copysign', 2, -2)

# or import the function and store the task id
from math import copysign

task_id = async(copysign, 2, -2)

# get the result
task_result = result(task_id)

# result returns None if the task has not been executed yet
# so in most cases you will want to use a hook:

async('math.modf', 2.5, hook='hooks.print_result')

# hooks.py
def print_result(task):
    print(task.result)

async(func, *args, **kwargs)
  • func: Function to execute. Dotted string or reference.
  • args: Optional arguments for the function.
  • hook: Optional function to call after execution. Dotted string or reference.
  • kwargs: Optional keyword arguments for the function.
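
Keyword arguments other than hook are passed straight through to the function. A minimal sketch; 'tasks.send_email' and its arguments are hypothetical, and the hook follows the earlier example:

from django_q import async

task_id = async('tasks.send_email',         # func as a dotted string
                'user@example.com',         # positional argument for the function
                subject='Welcome',          # keyword argument for the function
                hook='hooks.print_result')  # reserved keyword: result hook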

Schedule

Schedules are regular Django models. You can manage them through the Admin page or directly from your code:

from django_q import Schedule, schedule

# Use the schedule wrapper

schedule('math.copysign',
         2, -2,
         hook='hooks.print_result',
         schedule_type=Schedule.DAILY)

# Or create the object directly

Schedule.objects.create(func='math.copysign',
                        hook='hooks.print_result',
                        args='2,-2',
                        schedule_type=Schedule.DAILY
                        )

schedule(func, *args, **kwargs)
  • func: the function to schedule. Dotted strings only.
  • args: arguments for the scheduled function.
  • hook: optional result hook function. Dotted strings only.
  • schedule_type: (O)nce, (H)ourly, (D)aily, (W)eekly, (M)onthly, (Q)uarterly, (Y)early
  • repeats: Number of times to repeat the schedule. -1 = always, 0 = never, n = n times.
  • next_run: Next or first scheduled execution datetime.
  • kwargs: optional keyword arguments for the scheduled function.
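
To illustrate repeats and next_run, a minimal sketch; 'tasks.cleanup' is a hypothetical function and Schedule.HOURLY is assumed to mirror the Schedule.DAILY constant used above:

from datetime import timedelta

from django.utils import timezone
from django_q import Schedule, schedule

# run a hypothetical cleanup task every hour, starting tomorrow, for 24 runs in total
schedule('tasks.cleanup',
         schedule_type=Schedule.HOURLY,
         repeats=24,      # -1 = always, 0 = never
         next_run=timezone.now() + timedelta(days=1))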

Models

  • Task and Schedule are Django models and can therefore be managed by your own code.
  • Task objects are only created after an async package has been executed.
  • A Schedule creates a new async package for every execution, and thus a unique Task.
  • Success and Failure are convenient proxy models of Task.
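
Because they are ordinary models, you can query them with the regular ORM. A minimal sketch, assuming the models are importable from django_q.models:

from django_q.models import Failure, Success

failed = Failure.objects.all()     # failed tasks are always saved
kept = Success.objects.count()     # successful tasks retained under save_limit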

Testing

To run the tests you will need py.test and pytest-django.

Todo

  • Write Sphinx documentation
  • Better tests and coverage
  • Get out of Alpha
  • Fewer dependencies?

Acknowledgements
