Let's talk about Containers

Published Jan 12, 2018

Before we dive into the world of Docker, let's pause for a second and take the word container literally. Or more specifically, let's talk about shipping containers:

Few people realize that the shipping container is arguably one of the greatest inventions of the modern world. Without it, globalization would not have been possible. To explain why, let's look at the time before shipping containers (everything before the 1950s): back then, in order to ship something, you would bring all the goods and a lot of strong people to the port and have them stow the individual pieces on the ship one after another, a process called [break bulk cargo](https://en.wikipedia.org/wiki/Break_bulk_cargo). It's easy to imagine that shifting to containers was an insane advantage (if you're interested in this particular topic, there is a great video that sums this up).

OK, now back to the digital age. At this point, you're probably expecting me to say something like "Docker is the digital equivalent of shipping containers and will take deployment to the next level". I will not say that. The reason is simple:

The cost savings of using Docker over any other deployment method are tiny compared to the savings of shipping containers vs. break bulk cargo (in both absolute and relative numbers). So whenever you hear a comparison between the cargo world and Docker, keep this difference in mind.

Don't get me wrong: I love Docker and use it on a daily basis. I just like it for different reasons.

Notice how, so far, I have only talked about Docker in a deployment context? The truth is: I mostly use Docker in other scenarios. Let's have a look:

Shell Scripts ++

I try to automate as much of my workflows as I can. But scripting can sometimes be painful. What if you need a special library or, even worse, a system library to be installed? What happens when your script needs an older version of the programming language, while your side projects need to run on the latest version?

I understand there are solutions to these problems outside the container universe (package managers, version managers, etc.), but let's have a look at a possible Docker approach to this problem.

Imagine this Ruby script, which fetches data from one API, transforms it, and pushes it to another API:

# script.rb
require "external_api1"
require "external_api2"
require "transformer"

data = ExternalApi1.fetch(ENV["API1"])
transformed = Transformer.transform(data)
ExternalApi2.push(ENV["API2"], transformed)
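
Since the Dockerfile below installs the dependencies with Bundler, the project also needs a Gemfile next to the script. A minimal sketch, assuming the three (hypothetical) gems from above:

# Gemfile
source "https://rubygems.org"

# hypothetical gems backing the requires in script.rb
gem "external_api1"
gem "external_api2"
gem "transformer"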

Let's put this into a Docker container:

FROM ruby:2.5

RUN apt-get update && apt-get install -y build-essential

RUN mkdir /app
COPY . /app
WORKDIR /app

RUN bundle install

This will set up an image with all the requirements you need to run your script.

Now you can run the script via:

docker build -t leifg/myscript . # only needed for code changes
docker run --rm -e API1=https://api1.com -e API2=https://api2.com leifg/myscript bundle exec ruby script.rb

That's it. The only thing you need now is Docker; you can even run this on computers that don't have Ruby installed at all.
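
If you run the script regularly, you can hide both commands behind a small wrapper (a sketch; run.sh is a made-up name):

#!/bin/bash
# run.sh - rebuild the image (a no-op when nothing changed) and run the script
set -e

docker build -t leifg/myscript .
docker run --rm \
  -e API1=https://api1.com \
  -e API2=https://api2.com \
  leifg/myscript bundle exec ruby script.rb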

But what can you do when you are working with a compiled language?

Building on the right system

Whenever you have a build process that compiles your sources into a binary, it greatly helps to do it on a system that is similar to your production system. It's not only about processor architecture, but also about dependent libraries.

Finding a Docker base image that resembles your production system is rather trivial (as long as you stay in Linux land). To give you an example, let's say you are running your production application on a CentOS 6.8 system. A build image for this system could look like this:

FROM centos:6.8

RUN yum groupinstall -y 'Development Tools'
RUN yum install -y someotherdependencies
RUN mkdir /app

COPY build.sh /usr/local/bin/
ENTRYPOINT ["/usr/local/bin/build.sh"]

Your build.sh script contains all the compile instructions you need to build your application. There are several ways to trigger the build and get the binary. I'll give you an example using volumes:

docker build -t leifg/build_image .
docker run --rm -v $(pwd):/app leifg/build_image compile

Assuming build.sh compile is the command to compile your sources into a binary, the container will take all the sources in your current directory and output the binary to the folder /app inside the container. Because this folder is mounted from the host system, you'll still have access to the binary after the container has finished running.
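
For completeness, here is what a build.sh along those lines could look like. This is only a sketch: it assumes a Make-based project, and the actual compile commands depend on your application:

#!/bin/bash
# build.sh - entrypoint of the build image (sketch, assuming `make` produces the binary)
set -e

cd /app          # the mounted host directory containing the sources

case "$1" in
  compile)
    make         # writes the binary into /app, i.e. onto the host
    ;;
  *)
    echo "usage: build.sh compile" >&2
    exit 1
    ;;
esac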

Note: In this scenario, Docker is only used to build a binary. How you deploy it later is up to you; the deployment can also use a Docker container, but it doesn't have to.

With this simple setup, anyone can compile the sources regardless of their local environment.

Of course, it's best practice to run this process on a CI system, which brings me to the next scenario.

Continuous Integration

In your CI process, you sooner or later need all kinds of dependencies to execute your tests or deployment scripts. External libraries are usually easy to provide, but often your tests also rely on a database or maybe even on another service.

I don't remember how often I have been in contact with the customer support of Travis CI, CircleCI, or other providers to find out whether they supported a particular database. Usually there was some way to get an obscure database to run, but it always felt like a hack.

With Docker, a lot of these problems disappear. If the database you want to test against is available as a Docker image (which it usually is), you'll be able to configure it on your CI system, start it up, and run your tests.
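
Even without special CI support, starting the database is just another Docker command in your build script. A minimal sketch, assuming a Postgres-backed test suite that reads its connection string from DATABASE_URL:

# start a throwaway Postgres container for this test run
docker run -d --name testdb -p 5432:5432 -e POSTGRES_PASSWORD=secret postgres:9.6

# give the database a moment to accept connections (a real script would poll instead)
sleep 5

DATABASE_URL="postgres://postgres:secret@localhost:5432/postgres" bundle exec rake test

# clean up
docker rm -f testdb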

More and more CI systems are starting to integrate this process so you don't actually have to care about starting the containers yourself. IMHO, CircleCI has the best integration with Docker (see the database example in their docs).

But let's take running Docker containers even one step further:

Serverless

IMHO, serverless (or Function as a Service) is the area where Docker has the most potential.

I often stumble across articles on how to get your favorite language to run on AWS Lambda, the biggest serverless provider. These approaches usually revolve around using a JVM implementation of Ruby or Python, and my confidence in them actually working is pretty slim.

Some providers, like IBM Cloud Functions, support Docker directly. This is a neat way to run your favorite language as a deployed function.
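
IBM Cloud Functions is based on Apache OpenWhisk, where an action can be backed by a Docker image. The flow looks roughly like this (a sketch; the image has to implement OpenWhisk's action interface, and all names are placeholders):

# package the function as an image and publish it
docker build -t leifg/myfunction .
docker push leifg/myfunction

# register the image as an action and invoke it
wsk action create myfunction --docker leifg/myfunction
wsk action invoke myfunction --result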

In Conclusion

I hope I could sketch out some use cases for Docker outside of deployment.

I currently have a couple of projects where I use Docker as a deployment strategy, and the learning curve was pretty steep. It was a ton of work, but in the end it succeeded. I genuinely believe that the fact that I use Docker so intensively on my dev machine and on CI helped a lot.

So if you want to step into Docker deployment, I suggest you first use Docker for any of the outlined use cases.
