Scaling applications to match changing workloads has always been tricky. Having enough always-on VMs to handle peak load is expensive, while having too few risks unwanted headlines when your application falls over. Kubernetes promises to solve this problem, but how do you decide whether it’s right for you? GitLab offers an easy way to get started with Kubernetes, and I used it to scale my application builds easily and cheaply.

GitLab provides many really useful integrations. I’m particularly interested in the ones that remove barriers to scaling the development cycle. Too often I find development teams sticking with practices they created years earlier, seemingly unable to improve something that’s working or to find out how things have moved on. Waiting for “the build” is probably the easiest problem to identify, and now, with GitLab’s GKE integration, it’s one of the easiest to remedy.

GitLab’s continuous integration (CI) features were released in version 7.12, in mid-2015. Even the first version boasted continuous delivery (check out the very first CI configuration file in GitLab CE; it has a deploy_jobs field), and these days it can leverage many hosted and cloud solutions. The most flexible way of enabling building on and deploying to a selection of local and cloud services is to use Kubernetes, and the easiest way to get started with that is to use GitLab’s Google Kubernetes Engine (GKE) integration. So that’s what I did.
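That early deploy_jobs idea survives in today’s .gitlab-ci.yml as ordinary pipeline stages. Here’s a minimal sketch of a build-then-deploy pipeline; the job names, build command, and deploy script are placeholders, not anything from a real project:

```yaml
stages:
  - build
  - deploy

build_job:
  stage: build
  script:
    - make build        # placeholder: your build command

deploy_job:
  stage: deploy
  script:
    - ./deploy.sh       # placeholder: your deploy script
  only:
    - master            # deploy only from the default branch
```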

The Problem I’m Solving

My repositories are hosted by GitLab, where I use the free build runners. My software is built for free in the cloud (thanks, GitLab!). GitLab provides just 2000 minutes of free build time per month, but I have a few servers lying around my office. It would be nice to move some of my builds there, so that I can use all those free CPU cycles and avoid upgrading my GitLab account to a paid subscription. I also have US$500 (a bit over NZ$734) of credit on Google Cloud Platform, thanks to Google and GitLab promotions.

My goal is to use the new GKE integration to offload some of the build work currently handled by GitLab’s infrastructure. If I use Kubernetes, any work I do on one environment (e.g. the Google Cloud Platform) can be applied to other environments (e.g. my home network), and I don’t have to think about how many machines I have or how to distribute my builds: that’s exactly what Kubernetes is for. I’ve read lots about Kubernetes but I haven’t yet tried it for myself, so I’m keen to go that route. I’ve heard that it’s not the easiest suite of tools to get working the way I want, so I’d love it if GitLab just got it working for me.

How I Solved It

I’m using GitLab’s hosted service, which is based on GitLab EE. A quick glance at GitLab CE shows that the features described here are available there too, though you do need to enable Google authentication first.

Enable GKE integration in GitLab

The first step was to get a cluster of containers ready to run my builds in. In the videos I watched and blogs I read, this bit just happened. It involves setting up what Google terms a “project” in the Google Cloud console, then using a single GitLab page to create the cluster for that project. Unfortunately things didn’t go smoothly and I had to jump through a few hoops: I talk about that in another post.

Get GitLab to Build on my Cluster

At this point, I had a GKE project and its associated Kubernetes cluster. Now I needed to tell GitLab to use that cluster to run my builds. The GitLab videos I watched were very helpful, showing how to install the necessary applications directly from the GitLab web interface. For your convenience, here’s a link to the shortest one with all the important details (under 6 minutes). I first installed Helm Tiller to enable GitLab to install the other necessary applications, then Ingress, which exposes a public IP address. Finally I installed GitLab Runner. Each of these steps took a few seconds to a minute, and GitLab keeps you informed while it happens.

There was another “gotcha” in this process, and the post I linked to earlier also explains how I got GitLab to use the runner it installed for me.

The Problem, Solved

I’m now able to configure any project to use my GKE cluster. It scales nicely, so builds will expand to fill as many nodes as they need. I’m still using just three small nodes, but I have a good amount of control over the number and size of the VMs comprising my cluster. I can use the 2000 minutes GitLab gives me every month, then overflow onto GKE. This combination of build services seems like a very cheap way to build on other people’s computers. And if I’m lucky enough to find myself in need of more free build time, I’ve already got Kubernetes integration with GitLab working, so I’ll just learn how to build a cluster at home on any old Raspberry Pis I’ve got lying around!
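One way to steer the overflow is with runner tags: a job carrying a tag only runs on a runner registered with that tag. A sketch, assuming I’ve tagged the cluster’s runner `gke` in its settings page (the job name, tag, and build command below are all placeholders):

```yaml
# Hypothetical job pinned to the GKE runner via a tag.
heavy-build:
  stage: build
  tags:
    - gke            # assumed tag on the cluster runner
  script:
    - make release   # placeholder: an expensive build step
```

Jobs without the tag can keep running on GitLab’s shared runners, so the free minutes still get used first.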

Activate GitLab’s Hidden Superpower

Now that the cluster was associated with the project, the CI/CD page in the Settings area allowed me to enable Auto DevOps. With this feature, GitLab works out what extra steps it can run against your build and application, such as dependency and security scanning. It can also automatically deploy what GitLab calls “review apps”: containerized versions of your application, built from branches or merge requests. I enabled it and configured the Domain field. The videos make use of a free service that works around GitLab’s requirement for a qualified domain name rather than a bare IP address; the new GitLab user interface suggests a similar service, so that’s what I used.
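If you prefer keeping settings in version control, the domain can also be supplied as a CI variable instead of through the UI. A sketch, assuming a GitLab version that reads the AUTO_DEVOPS_DOMAIN variable (check your version’s Auto DevOps documentation; the domain below is a placeholder):

```yaml
variables:
  # Placeholder: use a name that resolves to your Ingress IP address.
  AUTO_DEVOPS_DOMAIN: apps.example.com
```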

I’m now setting myself the stretch goal of deploying my application to the same Kubernetes cluster I’m using to build. Auto DevOps is amazing when enabled for a project with no .gitlab-ci.yml, but I already have one, and I want to merge the logic I’ve created with GitLab’s. I’m currently following the documentation, and it doesn’t look too hard. I’ll report back once I’ve got everything working the way I want. Wish me luck!
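One approach I’m considering for that merge: include GitLab’s Auto DevOps template and declare my custom jobs alongside it. A sketch, assuming a GitLab version whose include: keyword supports templates (the custom job name and command are placeholders):

```yaml
include:
  - template: Auto-DevOps.gitlab-ci.yml

# Custom job added alongside the Auto DevOps stages.
lint:
  stage: test
  script:
    - make lint   # placeholder: an existing job from my pipeline
```

The template supplies the build, test, and deploy stages, and my jobs slot into those same stage names rather than replacing the whole pipeline.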