Atlas

The Atlas API is an internal project-management service for my personal and 8BitGhost projects. Most endpoints are private, but it exposes a small public API that is consumed by my portfolio website. The API uses standard HTTP methods with a RESTful endpoint structure, and authorization is handled via OAuth. The current API version is beta.

Table of Contents

API Documentation

Visit the API docs.

Get Key

TODO: add instructions for obtaining a key to use this service, with a couple of examples.

Development

This project uses git flow and runs inside a Docker container.

Getting Started

The first step is to clone the repository onto your computer:

$ git clone https://gitlab.com/alexolivas/atlas

Set up git flow; follow the instructions and accept the defaults:

$ git flow init

Setup Environment

This service runs inside a Docker container to resemble a production environment. To get started, navigate to the repo (the directory where you cloned the project):

$ cd <repo-directory>/atlas

Create an environment variables file at the root for the Postgres container:

$ vi .postgres-env

Populate it with the following data, generating a new 15-character password specific to your environment. Note that these values will be required to build the DATABASE_URL in the next step:

POSTGRES_USER=atlasuser
POSTGRES_PASSWORD=XXXXXXXXXXXXXXX
POSTGRES_DB=atlas_db
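These three values plug directly into the DATABASE_URL used in the next step. A minimal sketch of the composition (the password is a placeholder, not a real credential):

```python
# Sketch: how the Postgres values above combine into DATABASE_URL.
# The password is a placeholder; substitute the one you generated.
user = "atlasuser"
password = "XXXXXXXXXXXXXXX"
db = "atlas_db"

database_url = f"postgres://{user}:{password}@localhost:5432/{db}"
print(database_url)
```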

Create a second environment variables file at the root for the Django application:

$ vi .env

Populate it with the following environment variables, generating the SECRET_KEY (50 characters) with a tool like 1Password. These variables must exist for the application to run. NOTE: ALLOWED_HOSTS must be a comma-separated list if there are multiple hosts, e.g. localhost,127.0.0.1

ENVIRONMENT=development
SECRET_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
DEBUG=True
DATABASE_URL='postgres://<POSTGRES_USER>:<POSTGRES_PASSWORD>@localhost:5432/<POSTGRES_DB>'
ALLOWED_HOSTS=localhost, 127.0.0.1
CORS_ORIGIN_WHITELIST='localhost:3000', '127.0.0.1:3000'
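How the application parses these comma-separated values depends on its settings code, which is not shown here. As a sketch, a helper like the hypothetical `get_list` below is a common way to turn such variables into the lists Django expects:

```python
import os

def get_list(name: str, default: str = "") -> list[str]:
    """Hypothetical helper: split a comma-separated env var into a clean list,
    tolerating spaces and stray quotes around each item."""
    raw = os.environ.get(name, default)
    return [item.strip().strip("'\"") for item in raw.split(",") if item.strip()]

# Example with the value from the .env file above:
os.environ["ALLOWED_HOSTS"] = "localhost, 127.0.0.1"
print(get_list("ALLOWED_HOSTS"))  # ['localhost', '127.0.0.1']
```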

Optionally, if you want to set up the photo-upload feature, you will have to create an S3 bucket and then add the following environment variables to your .env file:

AWS_S3_ACCESS_KEY_ID=XXXXX
AWS_S3_SECRET_ACCESS_KEY=XXXXX
AWS_STORAGE_BUCKET_NAME=XXXXX

Also optionally, if you want to set up the contact-me feature, add the following environment variable with the email address at which you want to receive messages:

DEFAULT_CONTACT_EMAIL_ADDRESS=<email-address>

This project is set up to run inside Docker. It consists of two images: the Django project and the Postgres database. Run the following command to build the project for the first time (or any time you want to start fresh):

$ docker-compose up --build
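The actual docker-compose.yml lives in the repo; purely as an illustration of the two-service layout described above (service names, ports, and build context here are assumptions):

```yaml
# Sketch of a typical two-service compose file for this kind of setup.
version: "3"
services:
  db:
    image: postgres
    env_file: .postgres-env      # the Postgres values created earlier
    ports:
      - "5432:5432"
  web:
    build: .
    env_file: .env               # the Django values created earlier
    ports:
      - "8000:8000"
    depends_on:
      - db
```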

Install Data

Once your local development environment is running, load the database with demo data so that you can interact with the system. Run the following from the project's root directory:

$ python restore_db.py

Once the demo data is installed, you can log in to the admin portal at localhost:8000/admin with the credentials admin/admin123.
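The contents of restore_db.py are not shown here; as an assumption-laden sketch, a script like it would typically walk resources/demo-data and run Django's loaddata command on each fixture file:

```python
import subprocess
from pathlib import Path

# Sketch (assumption): load every JSON fixture in resources/demo-data
# via Django's built-in `manage.py loaddata` command.
FIXTURE_DIR = Path("resources/demo-data")

def build_loaddata_commands(fixture_dir: Path) -> list[list[str]]:
    """Build one `manage.py loaddata` invocation per fixture file."""
    return [
        ["python", "manage.py", "loaddata", str(fixture)]
        for fixture in sorted(fixture_dir.glob("*.json"))
    ]

if __name__ == "__main__":
    for cmd in build_loaddata_commands(FIXTURE_DIR):
        subprocess.run(cmd, check=True)
```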

Start Feature

$ git flow feature start $FEATURE_NAME

Publishing Images

When a new version of the Atlas API is ready to be released, run the following command to publish the latest version to Docker Hub:

$ docker push alexolivas/atlas_api:latest

Running Tests

$ python manage.py test

Demo Data

From time to time I may want to add data for a particular feature and make it available in the project's demo data. This section provides a step-by-step guide for how to do this.

To create a backup of the current state of your database (for a specific app), run the following command:

$ python manage.py dumpdata $DJANGO_APP > $BACKUP_FILE.json

Then move the resulting backup to the resources directory:

$ mv $BACKUP_FILE.json resources/demo-data/$BACKUP_FILE.json

Starting a container

To rebuild the container entirely, run the following command:

$ docker-compose up --build

To start a container:

$ docker-compose up

Examples

The following are examples for the primary apps whose data is periodically updated to a state worth backing up.

The web app contains all the information used by the portfolio website

$ python manage.py dumpdata web > resources/demo-data/web.json

The accounts app contains all the client accounts

$ python manage.py dumpdata accounts > resources/demo-data/accounts.json

The projects app contains all the projects in my portfolio

$ python manage.py dumpdata projects > resources/demo-data/projects.json

The auth.user app is the out-of-the-box Django app containing users that can log in to the admin portal. This should already have one admin account, but it's possible that a future feature could require a different user:

$ python manage.py dumpdata auth.user > resources/demo-data/users.json
$ python manage.py dumpdata auth.group > resources/demo-data/user-groups.json

The users' tokens also come from an out-of-the-box app:

$ python manage.py dumpdata authtoken > resources/demo-data/tokens.json

Helpful Commands

Run this command to display the project's dependencies as a tree (pipdeptree comes pre-configured as a dependency of this project):

$ pipdeptree

Run this command to update any outdated pip dependencies. See this blog for additional information

$ pur -r requirements.txt

Release Process

To set up Bitbucket Pipelines, you have to add the following pipeline environment variables:

Environment Variable               Value
ENVIRONMENT                        <development/stage/production>
DATABASE_URL                       sqlite:////atlas-db.sqlite
CORS_ORIGIN_WHITELIST              ''
DEBUG                              True/False
ALLOWED_HOSTS
SECRET_KEY                         XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
DEFAULT_CONTACT_EMAIL_ADDRESS
DISABLE_COLLECTSTATIC              1

Merging a feature branch into develop automatically triggers the build pipeline to run unit tests to verify that the feature doesn't break anything on develop and deploys a new development environment to Heroku.

To schedule a release, use git flow to create a release branch. Pushing changes to a release branch automatically triggers the build pipeline to run unit tests and deploy a new staging RC environment to Heroku. NOTE: when starting a new release, I have to manually (for now) update the version in atlas.version. To determine the next version, use the following commands:

next patch: 1.4.2 -> 1.4.3

$ VERSION=`git semver --next-patch`

next minor: 1.4.2 -> 1.5.0

$ VERSION=`git semver --next-minor`

next major: 1.4.2 -> 2.0.0

$ VERSION=`git semver --next-major`

The above was taken from: https://romain.dorgueil.net/blog/en/tips/2016/08/20/releases-with-git-semver.html
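For reference, the bump rules shown above are plain semver arithmetic; a small sketch of the computation (not the git semver implementation itself):

```python
def bump(version: str, part: str) -> str:
    """Compute the next semantic version; part is 'patch', 'minor', or 'major'."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "major":
        return f"{major + 1}.0.0"
    raise ValueError(f"unknown part: {part}")

print(bump("1.4.2", "patch"))  # 1.4.3
print(bump("1.4.2", "minor"))  # 1.5.0
print(bump("1.4.2", "major"))  # 2.0.0
```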

Merging into master triggers the build pipeline, which:

  • runs the unit tests
  • if they pass, tags the release with the next version (maybe)
  • deploys to Heroku (the stage environment, to keep environments in sync)
  • promotes stage to production

Heroku Setup

Production and staging environments are hosted on Heroku. To get this REST API running correctly, we must set up the same environment variables we use for development, but with different values for each environment.
