IoT and Docker


IoT, or the Internet of Things for the uninitiated, is becoming a trendy topic amongst developers like myself. As for Docker, the hype is still present, but the audience is not as large. Put the two in a blender (not literally), and the end result will surely satisfy anyone, from the l33t hacker to the virgin n00b.

The DIY project I refer to is Pineapple-Crust, a mobile music-streaming device built with a RaspberryPi, Docker, and an external hard drive. At a very high level, imagine a hard drive filled with music, attached to a credit-card-sized computer running a web server that you can access on a local network to stream your favorite tunes via a beautiful web UI.

Sound familiar? Yes, a cheesy and scaled down version of Spotify. But hey, isn’t that what engineering is all about? Creating something small and simple to understand the bigger picture, which in this case, is building/shipping/running applications (╯▅╰).

Other technologies include:

  • Elasticsearch for indexing
  • Flask, a micro-framework, for application development
  • uWSGI for serving the Flask app
  • Nginx as the web server

Pre-Docker, my workflow for this project consisted of:

  • Building out and testing the app locally on a laptop
  • Testing app on the Pi
  • Writing configurations for both uWSGI and Nginx
  • Installing necessary packages and dependencies on Pi

All this is fun and dandy until you find yourself repeating the process and forgetting critical commands at each step. Let's say, for instance, the SD card fails, or a friend gets word of this cool new music device and wants a copy for himself. Remembering which Linux packages to install, which files go where, and how to configure each service quickly becomes a daunting task. Docker, more specifically docker-compose, makes maintaining a project with this many components hassle-free.

Before I get into the technical stuff, your RaspberryPi will need to be configured and have Docker installed. I could probably write a separate blog post on how to do that, or I can just forward you to this one that will get you up and running. Ideally, you would want to use Ansible for deployment, but again, separate blog post. Now, the technical stuff.

From the Docker documentation: "Compose is a tool for defining and running multi-container Docker applications." What does that mean, exactly? Well, if you recall from earlier, I mentioned having multiple services (Flask/Nginx/uWSGI/Elasticsearch). Since each service can run independently of the next, it makes sense that each one runs in its own container.

What not to do: attempt to pack this whole application into a single Dockerfile and run docker build ., for two reasons:

  1. The whole idea of containerization is to isolate your builds.
  2. Pis only have 1GB of RAM (that build would take forever).

Each application contains its own Dockerfile that, when built, results in a multi-layered image (each layer being a line in the Dockerfile). String these together with docker-compose, run docker-compose up, and you have yourself a delicious layered container sandwich, as in a multi-container Docker application.
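As a sketch of what one of those per-service Dockerfiles might look like, here is a hypothetical one for the Flask/uWSGI container. The base image, file names, and package list are illustrative assumptions, not the project's actual config:

```dockerfile
# Hypothetical Dockerfile for the Flask app (ARM base image for the Pi)
FROM resin/rpi-raspbian:jessie

# Each instruction below becomes one layer in the built image
RUN apt-get update && \
    apt-get install -y python python-pip && \
    rm -rf /var/lib/apt/lists/*

WORKDIR /app

# Install Python dependencies first so this layer caches between code changes
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the application code itself
COPY . .

# uWSGI serves the Flask app; nginx proxies to it from its own container
CMD ["uwsgi", "--ini", "uwsgi.ini"]
```

Copying requirements.txt before the rest of the code means Docker can reuse the cached pip-install layer when only application code changes, which matters on a 1GB Pi.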

Notable configuration options:

  1. build: configuration options applied at build time, such as the context and Dockerfile location. The context directory also holds all necessary static/config files.
  2. depends_on: expresses dependencies between services (some services you want started before others).
  3. ports: maps ports as host:container (80:8080, 5000:5050).
  4. volumes: mounts volumes as host:container (in our case, the external hard drive mounted on the Pi, into the container).
  5. version: the compose file format (version 1 does not support named volumes).
  6. services: the different containers that will ‘link’ together.
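Putting those options together, a docker-compose.yml for this stack might look roughly like the following. The service names, ports, and mount paths are assumptions for illustration, not the project's real file:

```yaml
version: '2'

services:
  elasticsearch:
    build:
      context: ./elasticsearch

  flask:
    build:
      context: ./flask          # context holds the Dockerfile plus static/config files
    depends_on:
      - elasticsearch           # start the index before the app
    volumes:
      - /mnt/music:/app/music   # the external hard drive mounted on the Pi

  nginx:
    build:
      context: ./nginx
    depends_on:
      - flask
    ports:
      - "80:8080"               # host:container
```

With a file like this in place, docker-compose up builds and starts all three containers in dependency order.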

Why is this important? All the boring stuff, like configuration settings and package dependencies, is defined in each Dockerfile. I think the Ronco Rotisserie ads said it best: “Set it, and forget it!” The user can now focus on just building the application. Say your mate wants to install this on their Pi as well; all they have to do is run git clone && cd pineapple-crust && docker-compose up.

Want to upgrade the front-end to Polymer? Easy, just scrap the flask/nginx containers and replace them with a Polymer one. Using some iron-ajax in the Polymer app, you can easily query the elasticsearch container and be on your way. Again, reinforcing the idea of reusability and isolating your builds.
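In compose terms, that swap could be as small as deleting two service entries and adding one (names here are illustrative):

```yaml
services:
  # flask: and nginx: entries removed
  polymer:
    build:
      context: ./polymer
    depends_on:
      - elasticsearch     # iron-ajax queries this container directly
    ports:
      - "80:8080"
```

The elasticsearch service, and the data it indexes, carries over untouched.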

Note: any base image used on a RaspberryPi must be ARM-based; all others just won’t work. The CPU architecture on a Pi differs from that of a typical laptop/desktop (usually x86/x64). Image names are usually prefixed with rpi when searching on Docker Hub.

I have only scratched the surface of the potential that a RaspberryPi, or any other embedded device, holds. Automation systems for dimming lights or opening shutters when entering a space, installations using light strips with controlled light patterns, or advanced security systems can all be powered by such small devices.

The rabbit hole goes a little deeper, but this post is already too long for my liking. This is an open source project so you can view all the code/comments here. Star it, create an issue, brag to your friends, etc. IoT is the future!
