## Dependencies

- Flask: `pip install flask`
- Flask-SQLAlchemy: `pip install flask-sqlalchemy`
- PostgreSQL server (installation is OS dependent)
- postgresql-server-dev-x.x (installation is OS dependent)
  - for Ubuntu/Mint: `sudo apt-get install postgresql-server-dev-9.1`
- pyowm: `pip install pyowm`
- configparser (probably installed by default): `pip install configparser`
- psycopg2: `pip install psycopg2`
- scikit-learn: `pip install -U scikit-learn`
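The Python packages above can also be collected into a requirements file and installed in one step with `pip install -r requirements.txt` (a sketch; the names match the pip commands above, and the repo may not ship such a file):

```
flask
flask-sqlalchemy
pyowm
configparser
psycopg2
scikit-learn
```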
## Setup

- Clone the repo onto your computer: `git clone https://github.com/cruncher19/Ubicomp-GPP`
- Create a user and database in Postgres for the prediction system to use
  - in my case I used user: `greenhouse`, database: `greenhousePower`
  - don't forget to grant your user permissions on your new database
- Change the `db_uri` setting in the config file to point to your Postgres instance
  - it should be of the form: `db_uri: postgresql://<username>@localhost/<database>`
  - in my case: `db_uri: postgresql://greenhouse@localhost/greenhousePower`
- Run `python initdb.py` to initialize the Postgres database
- Call `python routes.py` to start the server
- You're good to go!
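Since configparser is among the dependencies, the `db_uri` entry in the config file presumably looks something like the following (the section header and exact file name are assumptions; check the config file shipped with the repo):

```
[settings]
db_uri: postgresql://greenhouse@localhost/greenhousePower
```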
## REST API usage

You can store power information in the database by POSTing to the server like this: `http://localhost:5000/storePowerProduction?powerLevel=50`

You can turn off all devices connected to the power relay Arduino like this: `http://localhost:5000/turnOffAllDevices`

You can turn on all devices connected to the power relay Arduino like this: `http://localhost:5000/turnOnAllDevices`
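The endpoints above can be called from Python with only the standard library. This sketch builds the POST request for `storePowerProduction` (the endpoint and parameter come from the URLs above); pass the result to `urllib.request.urlopen(req)` against a running server to actually send it:

```python
from urllib.parse import urlencode
import urllib.request

BASE = "http://localhost:5000"

def store_power_production(power_level):
    """Build a POST request for the storePowerProduction endpoint."""
    query = urlencode({"powerLevel": power_level})
    return urllib.request.Request(f"{BASE}/storePowerProduction?{query}",
                                  data=b"", method="POST")

req = store_power_production(50)
print(req.full_url)  # http://localhost:5000/storePowerProduction?powerLevel=50
```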
## Power prediction usage

Run `python predictor.py` in the repository's directory.
## Solar power website scraper usage

Run `python PowerScraper.py`.
The above command will print the time the site was last updated and the current daily power production, and will add the power information to the database via the REST API.
The predictor currently uses a fake dataset to build a regression model, then feeds the current weather conditions into that model to estimate power production.
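The regression approach just described can be sketched with scikit-learn. The features and fake training data here are illustrative assumptions, not the repo's actual dataset or model:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Fake training data: each row is [temperature (C), cloud cover fraction];
# the target is solar power production (illustrative units).
X = np.array([[25.0, 0.1],
              [20.0, 0.5],
              [15.0, 0.9],
              [30.0, 0.0]])
y = np.array([80.0, 50.0, 20.0, 95.0])

# Fit a simple regression model on the fake dataset
model = LinearRegression().fit(X, y)

# Current weather conditions (in the real system these would come from pyowm)
current = np.array([[22.0, 0.3]])
estimate = model.predict(current)[0]
print(round(estimate, 1))
```

In the real system the current conditions would be fetched via pyowm rather than hard-coded, and the model would be retrained as real readings accumulate in the database.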