
pytorch_microservice_demo

Demonstrates a Dockerized microservice that uses PyTorch with CUDA acceleration.
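
The core idea is small enough to sketch: a web endpoint that runs a PyTorch operation on the GPU when CUDA is available and falls back to the CPU otherwise. The snippet below is only an illustrative sketch, not the code in pytorch_server/pytorch_server.py; the Flask framework, route names, and tensor workload are assumptions.

import torch
from flask import Flask, jsonify

app = Flask(__name__)

# Use the GPU when the CUDA runtime is visible (e.g. via nvidia-docker2),
# otherwise fall back to the CPU.
DEVICE = torch.device("cuda" if torch.cuda.is_available() else "cpu")

@app.route("/health")
def health():
    # Report whether CUDA acceleration is actually in use.
    return jsonify({"device": str(DEVICE), "cuda": torch.cuda.is_available()})

@app.route("/matmul")
def matmul():
    # Trivial PyTorch workload that exercises the selected device.
    x = torch.rand(256, 256, device=DEVICE)
    return jsonify({"result": (x @ x).sum().item()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)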


Requirements

System

docker-ce v18.06.0-ce + nvidia-docker2

Conda packages

pytorch

Dev notes

pip install docker-compose (not available in the conda repositories)

TODO:

Create performance tests, and work out how to measure performance differences between versions

Deploy to GCP, and verify GPU acceleration

Installation

install in developer mode from source

python setup.py develop

Run Functional tests

Manually

start the web server manually

python pytorch_server/pytorch_server.py

run tests against the web server

python -m unittest functional_tests/service_test.py
pytest functional_tests/service_test.py
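
A functional test of this kind is typically just an HTTP round trip against the locally running server. The sketch below shows the general shape; the base URL, port, endpoint, and use of the requests library are assumptions, not the contents of functional_tests/service_test.py.

import unittest
import requests

BASE_URL = "http://localhost:5000"  # assumed address of the locally started server

class ServiceTest(unittest.TestCase):

    def test_server_responds(self):
        # The manually started server should answer HTTP requests.
        resp = requests.get(f"{BASE_URL}/health", timeout=5)
        self.assertEqual(resp.status_code, 200)

Because pytest also collects unittest.TestCase classes, either runner above works on the same test file.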

Run the service and tests via CLI

Install the package (either a standard or a developer-mode install)

python setup.py install
python setup.py develop

start the web server with the CLI

pytorch_server start

Run functional tests

pytorch_cli test service
pytest functional_tests/service_test.py
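
The pytorch_server and pytorch_cli commands are console-script entry points installed with the package. The setup.py fragment below is a sketch of how such entry points are commonly declared; the module paths and function names are assumptions about this package's layout.

from setuptools import setup, find_packages

setup(
    name="pytorch_server",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # `pytorch_server start` -> a main() that dispatches the "start" subcommand
            "pytorch_server=pytorch_server.cli:server_main",
            # `pytorch_cli test service` -> a main() that dispatches test runners
            "pytorch_cli=pytorch_server.cli:cli_main",
        ],
    },
)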

Run Container tests

These tests validate that the Docker container works properly and that docker-compose works.

Manual Test

Run the container with docker-compose and test it with service_test

docker-compose build
docker-compose up
pytest integration_tests/service_test.py

Automatic test

container_test will run docker-compose before each test case and execute the tests against the running container (see the sketch below)

python -m unittest container_tests/container_test.py
pytest container_tests/container_test.py
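
The sketch below shows how such a harness can be structured: bring the stack up in setUp, tear it down in tearDown, and hit the published port in between. The exact commands, port, and endpoint are assumptions, not the contents of container_tests/container_test.py.

import subprocess
import time
import unittest
import requests

BASE_URL = "http://localhost:5000"  # assumed port published by docker-compose

class ContainerTest(unittest.TestCase):

    def setUp(self):
        # Build and start the containers in the background before each test case.
        subprocess.run(["docker-compose", "up", "-d", "--build"], check=True)
        # Wait briefly for the service to start accepting connections.
        for _ in range(30):
            try:
                requests.get(f"{BASE_URL}/health", timeout=1)
                break
            except requests.exceptions.ConnectionError:
                time.sleep(1)

    def tearDown(self):
        # Stop the containers so every test case starts from a clean slate.
        subprocess.run(["docker-compose", "down"], check=True)

    def test_containerized_service_responds(self):
        resp = requests.get(f"{BASE_URL}/health", timeout=10)
        self.assertEqual(resp.status_code, 200)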

rebuild local containers

sometimes necessary if container tests are failing

docker-compose down --rmi local
docker-compose build
