councilor-voter-guide

議員投票指南 (Councilor Voter Guide)

文件 (Documentation)

原則 Principle

  • 預設皆開源
    Open source by default.

  • 不做家長式的媒體,不做議題上的價值高下判斷、排名
    No paternalism: we are not a paternalistic medium, and we do not rank issues or judge which ones matter more.

  • 如要引入非政府公開資訊,必需是全民可參與編輯協作的或是候選人、民代、政黨自行編輯的
    Non-governmental information may be included only if it meets one of these requirements:

    • Any citizen can collaboratively edit the data.
    • The data is published by the candidates, councilors, or parties themselves.

Project Layout

On Ubuntu 14.04 LTS

Website (Python/Django)

0.1 Install basic tools

sudo apt-get update
sudo apt-get upgrade
sudo reboot
sudo apt-get install git python-pip python-dev python-setuptools postgresql libpq-dev
sudo easy_install virtualenv

0.2 Set a password for your database user (if you already have one, just skip this step)

sudo -u <username> psql -c "ALTER USER <username> with encrypted PASSWORD 'put_your_password_here';"

e.g.

sudo -u postgres psql -c "ALTER USER postgres with encrypted PASSWORD 'my_password';"
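
As an optional sanity check, you can connect over TCP (which, with a typical pg_hba.conf, prompts for the password you just set) and run a trivial query:

psql -h localhost -U postgres -d postgres -c "SELECT version();"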

Clone source code from GitHub to local

The repository is quite large now, so please be patient, and please do not use a shallow clone (e.g. git clone --depth).

git clone https://github.com/g0v/councilor-voter-guide.git       
cd councilor-voter-guide/voter_guide/

Start virtualenv and install packages

(If you don't mind packages being installed into your local environment, just run pip install -r requirements.txt.)

virtualenv --no-site-packages venv      
. venv/bin/activate        
pip install -r requirements.txt     

Create db & restore data

We use Postgres 9.5; please set your database config in voter_guide/local_settings.py.
Please create a database (e.g. voter_guide); the examples below use voter_guide.

createdb -h localhost -U <username> voter_guide
pg_restore --verbose --clean --no-acl --no-owner -h localhost -U <username> -d voter_guide local_db.dump
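
For reference, the database block of voter_guide/local_settings.py might look like the sketch below, assuming the standard Django DATABASES setting; the user and password are placeholders for the ones you configured above, and the exact keys this project expects may differ slightly:

# A minimal sketch of voter_guide/local_settings.py, not the project's exact file.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',  # PostgreSQL backend via psycopg2
        'NAME': 'voter_guide',       # the database created with createdb above
        'USER': 'postgres',          # placeholder: your PostgreSQL user
        'PASSWORD': 'my_password',   # placeholder: the password set in step 0.2
        'HOST': 'localhost',
        'PORT': '5432',
    }
}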

runserver

python manage.py runserver

Now you should be able to see the web page at http://localhost:8000

Dumpdata(optional)

python manage.py dumpdata --exclude auth.permission --exclude contenttypes > db.json
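
If you later need to load the dump back into a fresh database (after running the project's migrations), Django's standard loaddata command reads the same file; db.json here is the file produced by the dumpdata command above:

python manage.py loaddata db.json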

Mac-Related Instructions

Prepare Compiler

Some Python packages, such as lxml, are written in C or C++, so a compiler is required. You can install a compiler via the following command:

xcode-select --install

Prepare PostgreSQL

You can install the packaged app here. Put the app in your Applications folder and click it to start.

And please add the following line to your ~/.bash_profile

export PATH=/Applications/Postgres.app/Contents/Versions/9.3/bin/:$PATH

Please change the version number 9.3 if you downloaded a different version of PostgreSQL.

After you add the PATH environment variable, source it:

source ~/.bash_profile

If you don't add the PATH variable, the installation of psycopg2 will not succeed.
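
As a quick sanity check, you can confirm that pg_config (which the psycopg2 build uses to locate PostgreSQL) is now found on your PATH; the version it prints should match the Postgres.app you installed:

pg_config --version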

Web Docker c3h3 / g0v-cvg-web

How to use these images

First Step: Download and Extract pgdata

git clone https://github.com/c3h3/g0v-cvg-pgdata.git && cd g0v-cvg-pgdata && tar xfzv 47821274c242ce68f2d8d18d4bb0d050d6481311.tar.gz
  • After that, you will get a pgdata directory.
  • Assume pgdata's absolute path is "your_pgdata".

Second Step: RUN postgres with pgdata

docker run --name pgdb -v your_pgdata:/var/lib/postgresql/data postgres:9.3

If you want to use pgAdmin to connect to your database, you can also forward the port out with the following command:

docker run --name pgdb -p 5432:5432 -v your_pgdata:/var/lib/postgresql/data postgres:9.3
  • "your_pgdata" is pgdata's absolute path in previous step.

Third Step: RUN web linked with pgdb

docker run --name g0v-cvg-web --link pgdb:postgres -p port_on_host:8000 -d c3h3/g0v-cvg-web

Crawler Docker c3h3 / g0v-cvg-crawler

How to use these images

Run Scrapy Server:

docker run --name g0v -p forward_port:6800 -v outside_items:/items -v outside_logs:/logs -d c3h3/g0v-cvg-crawler
  • "forward_port" is the port you want to forward into docker image (EXPOSE 6800)
  • "outside_items" is the directory you want to mount into docker image as /items
  • "outside_logs" is the directory you want to mount into docker image as /logs

Link to the Scrapy Server to Deploy and Submit Jobs:

docker run --link g0v:g0v -it c3h3/g0v-cvg-crawler /bin/bash

Example of Deploying tcc:

In a running Docker container linked with g0v (the Scrapy server), you can use the following command to deploy the tcc crawler to the server:

cd /tmp/g0v-cvg/crawler/tcc && python deploy.py

Example of Crawling tcc.bills:

In a running Docker container linked with g0v (the Scrapy server), you can use the following command to crawl tcc.bills:

cd /tmp/g0v-cvg/crawler/bin && python crawl_tcc_bills.py

CC0 1.0 Universal

This work is published from Taiwan.
