Use Scrapy (Python) to scrape deal info from LivingSocial's Chicago page and store the crawled data in a PostgreSQL database

Data scraping with Scrapy (Python)

5/14/15

Use Scrapy to scrape deal info from LivingSocial's Chicago page

Steps:

  1. Configure a Scrapy spider to crawl the page and parse the HTML
  2. Set up a PostgreSQL database to store the parsed data
  3. Create a pipeline that connects the parsed data to the database via the SQLAlchemy ORM
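Steps 2 and 3 can be sketched as a SQLAlchemy model plus a Scrapy-style item pipeline. The names here (`Deal`, `DealsPipeline`, the column set) are illustrative rather than taken from this repo, and the sketch uses an in-memory SQLite URL so it runs without a database server; swapping the URL for a `postgresql://user:pass@localhost/dbname` connection string would match the repo's PostgreSQL setup.

```python
# Sketch of the database model and pipeline (steps 2-3).
# Assumed names: Deal, DealsPipeline, and the column set are hypothetical.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Deal(Base):
    """ORM model for one scraped deal row."""
    __tablename__ = "deals"
    id = Column(Integer, primary_key=True)
    title = Column(String)
    link = Column(String)
    location = Column(String)
    original_price = Column(String)
    price = Column(String)

class DealsPipeline:
    """Scrapy calls process_item() once per scraped item; we persist each one."""

    # SQLite in-memory stands in for "postgresql://user:pass@localhost/dbname"
    def __init__(self, db_url="sqlite:///:memory:"):
        engine = create_engine(db_url)
        Base.metadata.create_all(engine)          # create the deals table
        self.Session = sessionmaker(bind=engine)

    def process_item(self, item, spider):
        session = self.Session()
        try:
            session.add(Deal(**dict(item)))       # item fields -> table columns
            session.commit()
        except Exception:
            session.rollback()
            raise
        finally:
            session.close()
        return item                               # pass the item down the chain
```

Registering such a class under `ITEM_PIPELINES` in the Scrapy project's settings is what wires the spider's parsed items into the database.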

Result:

PostgreSQL database:

postgres sql screenshot

JSON output:

Json output
