Whatever the setup target (development or production, local or remote, with or without Docker), add a `.env` file with the following variables (no quotes needed). The URL given for each API key points to the page where the key can be obtained.
```
GOOGLE_ELEVATION_API_KEY=https://developers.google.com/maps/documentation/elevation
GOOGLE_DISTANCE_MATRIX_API_KEY=https://developers.google.com/maps/documentation/distance-matrix
OPEN_WEATHER_MAP_API_KEY=http://openweathermap.org/api
JCDECAUX_API_KEY=https://developer.jcdecaux.com/#/opendata/vls?page=getstarted
KEOLIS_API_KEY=https://data.keolis-rennes.com/fr/accueil.html
LACUB_API_KEY=http://data.bordeaux-metropole.fr/apicub
REGRESSORS_FOLDER=training/regressors/
MONGO_HOST=mongo
MONGO_PORT=27017
APP_SECRET=secret
POSTGRES_USER=postgres
POSTGRES_PASS=postgres
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
POSTGRES_DBNAME=openbikes
```
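Since a missing variable usually surfaces only as an obscure failure deep in the application, it can help to verify the configuration at startup. A minimal sketch (the variable names come from the `.env` above; the check itself is illustrative, not part of the project):

```python
import os

# A few of the variables the .env file is expected to provide.
REQUIRED_VARS = [
    "GOOGLE_ELEVATION_API_KEY",
    "OPEN_WEATHER_MAP_API_KEY",
    "MONGO_HOST",
    "POSTGRES_DBNAME",
]


def check_config():
    """Return the names of required variables that are not set."""
    return [name for name in REQUIRED_VARS if os.environ.get(name) is None]


missing = check_config()
if missing:
    # Fail fast with a clear message instead of crashing later.
    print("Missing configuration:", ", ".join(missing))
```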
Follow the installation steps below.

Because we use scientific libraries for machine learning, we recommend Anaconda's Python 3 distribution. You can download it here. It's then a good idea to create a virtual environment:
```sh
cd ~/path/to/api.openbikes.co/
conda create -n venv python=3.4 anaconda
source activate venv
make install
```
Refer to the official documentation.
Stick with PostgreSQL 9.3/9.4 and PostGIS 2.1/2.2.
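For reference, the `POSTGRES_*` variables above combine into a standard connection URL (the usual libpq/SQLAlchemy format). A sketch, assuming the defaults from the `.env` example; the helper is illustrative, not project code:

```python
import os


def postgres_url():
    """Assemble a connection URL from the POSTGRES_* environment
    variables, falling back to the values shown in the .env example."""
    return "postgresql://{user}:{password}@{host}:{port}/{dbname}".format(
        user=os.environ.get("POSTGRES_USER", "postgres"),
        password=os.environ.get("POSTGRES_PASS", "postgres"),
        host=os.environ.get("POSTGRES_HOST", "postgres"),
        port=os.environ.get("POSTGRES_PORT", "5432"),
        dbname=os.environ.get("POSTGRES_DBNAME", "openbikes"),
    )
```

The resulting string can be handed to SQLAlchemy's `create_engine` or to `psql`.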
```sh
cd ~/path/to/api.openbikes.co/
make dev
python3 manage.py initdb
./scripts/add-cities.sh
python3 manage.py runserver
```
Run `python3 manage.py -h` to get a list of available commands.

- `python3 collect-bikes.py` to get current biking data.
- `python3 collect-weather.py` to get current weather data.
- `python3 train-regressors.py` to train regressors.
- `python3 scripts/import-dump.py <city>` to fetch and load a dump from the production server for the given city.
- `python3 scripts/create-dataset.py <city>` to create a dataset containing positions, weather and biking data for the given city.
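The collection scripts above are meant to be run regularly. One simple way to drive them from Python is via `subprocess`; a minimal sketch (the script names come from this README, the wrapper itself is an assumption, not project code):

```python
import subprocess
import sys


def run_script(script, *args):
    """Run one of the project's scripts with the current interpreter
    and return its exit code (0 means success)."""
    result = subprocess.run([sys.executable, script, *args])
    return result.returncode


# e.g. run_script("collect-bikes.py") from a cron job or scheduler loop
```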