Developing and Deploying with Docker Cloud

In my last post, I showed off a photo scavenger hunt that I built using TensorFlow. In this post, I'll dive deeper into how it was built and deployed with TensorFlow, Docker, and Docker Cloud.

Docker Cloud, formerly known as Tutum, is a platform that manages Docker services. I was drawn to it over Amazon's EC2 Container Service and Google's Container Engine because its deployment configuration looks almost identical to a docker-compose.yml file. Using Docker Compose in development allowed me to easily build independent services and then connect them together to create a more complicated stack.

Photo Hunt Flow

Our Twitter Bot service listens for incoming tweets containing a specified hashtag. If a tweet also contains a photo, the Twitter Bot service posts the image to the TensorFlow service which responds with the probability that the image is one of the items in our scavenger hunt.
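
In code, that hand-off is just an HTTP POST from the bot to the classification service. The sketch below is only an illustration, assuming Python with the requests library; the multipart field name and the shape of the JSON response are assumptions, since only the http://imagenet/upload endpoint is described here.

import requests

def classify_photo(image_bytes, filename="photo.jpg"):
    """Send a photo pulled from a tweet to the TensorFlow service
    and return its guesses. The response format shown is assumed."""
    response = requests.post(
        "http://imagenet/upload",                # hostname resolves via the Compose link
        files={"file": (filename, image_bytes)}  # multipart upload of the tweet's photo
    )
    response.raise_for_status()
    # Assumed shape: {"predictions": [{"label": "banana", "probability": 0.93}, ...]}
    return response.json()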

By using links in Docker Compose, it's possible to loosely couple the Twitter Bot service with the TensorFlow service. The http://imagenet/upload endpoint is reachable from the Twitter Bot because Docker Compose adds an imagenet entry to the container's /etc/hosts file. This means our service doesn't need to know exactly which service it's communicating with, as long as that service fulfills its contract.

Initially, our Twitter Bot communicated directly with the TensorFlow service, but I later placed a load-balancing proxy between them to allow for scaling. Because the load balancer's link is aliased to imagenet, the Twitter Bot service has no idea it is communicating with a new type of service.

twitter-bot:
  build: ./twitter-bot
  volumes:
    - /var/somewhere2
    - ./twitter-bot:/twitter-bot
  env_file: ./twitter-bot/keys.env
  links:
    - imagenet-proxy:imagenet

Complete configuration
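
For context, the other side of that imagenet-proxy:imagenet link could look something like the sketch below. The haproxy image and exposed port are assumptions rather than the project's actual configuration; the point is just that the proxy, not the bot, is what links directly to the TensorFlow containers.

imagenet:
  build: ./imagenet            # the TensorFlow classification service
imagenet-proxy:
  image: tutum/haproxy         # assumed load balancer; any proxy that balances across linked containers works
  links:
    - imagenet                 # the proxy discovers imagenet containers through this link
  ports:
    - "80"

With a setup like this, scaling in development is a single command, e.g. docker-compose scale imagenet=3.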

Docker Cloud connects all the pieces of a Docker ecosystem together and allows you to deploy a setup very similar to what you run in development. Build hooks can be used to generate Docker images whenever new code is pushed to GitHub. Those images can then be used to spin up Docker services on AWS instances provisioned by Docker Cloud.
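
To give a feel for how close the two formats are, a Docker Cloud stack file for the bot might look like the following sketch. The image name is a placeholder, and the extra keys shown are examples of the Docker Cloud-specific options the stack format layers on top of the familiar Compose keys.

twitter-bot:
  image: myuser/twitter-bot:latest   # image produced by the GitHub-triggered build hook (name assumed)
  links:
    - imagenet-proxy:imagenet        # same aliased link as in development
  autorestart: always                # Docker Cloud option: restart containers that exit
  target_num_containers: 1           # Docker Cloud option: how many containers to run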

As a developer, I've always found deployment to be a pain point. Services like Heroku limit the types of libraries I can use, and the custom AWS deployments I've done are not scalable, not identical to my development environment, and generally just a pain to manage.

There are still limitations with Docker Cloud, which only recently came out of beta. It doesn't yet support autoscaling, which means you'll need to write your own scripts to monitor load and scale services accordingly (a rough sketch of one such script follows below). The pricing structure might also deter developers looking for a cheap place to host experiments: at $0.02 per node hour, it costs a little more than an AWS micro instance. Nonetheless, my experience with the service has been very positive.
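
Those scaling scripts don't have to be elaborate. Here is a rough sketch of the kind of thing I mean, assuming the docker-cloud CLI is installed and authenticated; the service name, load threshold, and container counts are all placeholders.

import os
import subprocess

SERVICE = "imagenet"   # placeholder service name
HIGH_LOAD = 2.0        # placeholder 1-minute load-average threshold
MAX_CONTAINERS = 4

def scale_if_busy(current_containers):
    """Naive check, meant to run from cron on the node: if the 1-minute
    load average is high, add a container via the docker-cloud CLI."""
    load_1min, _, _ = os.getloadavg()
    if load_1min > HIGH_LOAD and current_containers < MAX_CONTAINERS:
        subprocess.check_call(
            ["docker-cloud", "service", "scale", SERVICE, str(current_containers + 1)]
        )

if __name__ == "__main__":
    scale_if_busy(current_containers=1)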