arthurnn/podcast - web crawler to grab iTunes podcast info

Dependencies

1. Python
2. python-MySQLdb
	http://sourceforge.net/projects/mysql-python/
3. Python Scrapy
	http://scrapy.org/download/
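
A quick way to confirm the dependencies above are installed before setting anything up is a small sanity-check script like this one (a minimal sketch, not part of the repo):

	# check_deps.py - verify the Python dependencies above are importable.
	# Not part of this repo; just a quick sanity check before installation.
	import MySQLdb
	import scrapy

	print("MySQLdb and Scrapy imported successfully")
	print("Scrapy version: %s" % scrapy.__version__)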

Installation

1. create db
	$ mysql -uroot -proot
	mysql> create database podcastdb;
2. create table
	$ mysql -uroot -proot podcastdb < db/mysql.sql

3. start the crawl server
	$ scrapy server
	
4. schedule the itunes spider on the server (see the Python sketch after these steps)
	$ curl http://localhost:6800/schedule.json -d project=default -d spider=itunes
	
5. Keep watching the spiders doing their work at:
	http://localhost:6800/
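
The curl call in step 4 posts to the crawl server's schedule.json endpoint. If you would rather schedule the itunes spider from Python instead of curl, a minimal sketch using only the standard library (assuming Python 2, which matches the python-MySQLdb dependency, and the server running on localhost:6800 as above) looks like this:

	# schedule_itunes.py - schedule the itunes spider via the crawl server's
	# schedule.json endpoint (equivalent to the curl command in step 4).
	import urllib
	import urllib2

	data = urllib.urlencode({"project": "default", "spider": "itunes"})
	response = urllib2.urlopen("http://localhost:6800/schedule.json", data)
	print(response.read())

The response should be a short JSON blob reporting the scheduled job, and the job then shows up in the web console from step 5.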

