
Scrapyd


Overview

Scrapyd is a service daemon to run [Scrapy](http://scrapy.org) spiders.

It allows you to deploy your Scrapy projects by building Python eggs of them and uploading them to the service through a JSON API, which you can also use to schedule spider runs. Multiple projects are supported.
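As a minimal sketch of what talking to that JSON API looks like, the snippet below builds a POST request for Scrapyd's schedule.json endpoint. The host, project, and spider names are placeholders for illustration (adjust them to your own deployment), and Python 3's urllib is assumed here for brevity:

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

SCRAPYD = "http://localhost:6800"  # Scrapyd's default port

def schedule(project, spider, **kwargs):
    """Build a POST request for Scrapyd's schedule.json endpoint.

    Extra keyword arguments are passed through as spider arguments.
    """
    params = dict(project=project, spider=spider, **kwargs)
    data = urlencode(params).encode("ascii")
    return Request(SCRAPYD + "/schedule.json", data=data)

# "myproject" and "myspider" are hypothetical names for this example.
req = schedule("myproject", "myspider")
print(req.full_url)  # http://localhost:6800/schedule.json

# Actually sending the request requires a running Scrapyd instance:
#   response = urlopen(req)  # Scrapyd replies with JSON including a job id
```

The same pattern works for the other endpoints (e.g. listprojects.json, addversion.json); only the URL path and parameters change.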

Requirements

  • Python 2.6 or later
  • Works on Linux, Windows, Mac OS X, BSD

Install

The quick way:

pip install scrapyd

Releases

You can download the latest stable and development releases from: http://pypi.python.org/pypi/Scrapyd

Community (blog, Twitter, mailing list, IRC)

See http://scrapy.org/community/

Contributing

See http://doc.scrapy.org/en/latest/contributing.html

Companies using Scrapy

See http://scrapy.org/companies/

Commercial Support

See http://scrapy.org/support/
