There are some fantastic tools available for continuous integration, particularly for open source projects. Jenkins is probably the most popular; written in Java, it's compatible with most operating systems and works with many languages. It also has a wide range of plugins, so you're able to configure the system to suit you.
Jenkins is, however, quite an old project. By default, each project has to operate within the state of the server, which might not accurately represent the server your application is eventually going to be deployed on.
Travis CI is a fantastic example of the next generation of continuous integration tools. As each project runs in its own virtual machine, you're able to define the version of the language you're running, the queue software, and the database, to accurately mimic your production server.
This is fantastic for running your unit, functional, and integration tests, and it's completely free for open source projects. By integrating this software with your Git hosting, you can ensure that a branch you've created is working correctly before merging it into master.
Unfortunately, Travis CI is quite an expensive solution if you have small projects you want to keep private, or a large number of projects which each receive few commits.
Enter Gitlab CI... and Docker
Gitlab CI is a fantastic piece of software which leverages Docker images to give you complete control over your environment. This means you can choose the type of server, the software, and the memory you need to allocate to a project.
As runners are attached to Gitlab CI, you can increase the number of runners on each project, and scale them as a project grows.
For this example, we're going to configure Gitlab CI with Gitlab Enterprise.
Gitlab Enterprise is available as a free service with unlimited private repositories. This is a great solution for agencies who need to manage multiple projects across several developers.
It can also be self hosted, so if security is a concern, that is always an option.
- Sign up for an account over at Gitlab
- Create a repository and push some code to it
Now that you're registered with a Gitlab account, you can access Gitlab CI at ci.gitlab.com.
Add your newly created project to the tool by clicking "Add project to CI".
Click on runners in the left navigation menu, where you can copy your registration token (you'll need this in the next step).
Create a Runner
Gitlab CI needs a runner to be able to carry out operations. Gitlab (the company) provides a runner service written in Go which uses Docker to run each build inside a container. What we're aiming to contain within this example is:
- An Apache server
- PHP 5.6
Server Setup
Digital Ocean provide low cost servers you can configure for multiple services. For small companies, a server with a single core (meaning you can only run one build at a time) should work well for most purposes. So in this instance, we're going to set up a $10 a month server. This is the specification recommended by Gitlab themselves, though the bare minimum is a 512MB server.
The server needs to be the x64 version, as Docker isn't compatible with the i386 architecture. Here, I've chosen Ubuntu 14.04.
You can add your public key here, or if you prefer, create the droplet without one and your password will be emailed to you.
After you press create droplet, it'll take about 30 seconds to set up your VPS (Virtual Private Server).
SSH into your new server, update apt, and install the runner.
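At the time of writing, the runner was distributed as the gitlab-ci-multi-runner package; the repository script and package name below are assumptions based on Gitlab's Debian/Ubuntu install instructions, so check their docs if they've since changed:

```shell
# Install Docker -- the runner uses it to launch build containers
curl -sSL https://get.docker.com/ | sh

# Add Gitlab's package repository, then install the runner
curl -L https://packages.gitlab.com/install/repositories/runner/gitlab-ci-multi-runner/script.deb.sh | sudo bash
sudo apt-get update
sudo apt-get install gitlab-ci-multi-runner
```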
Now you'll need to register the runner with Gitlab CI. When prompted, supply the registration token you copied from the Gitlab CI website in the previous step.
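Registration is interactive; a typical session looks something like the following (the coordinator URL, the runner description, and the php56 image name are assumptions from this setup):

```shell
sudo gitlab-ci-multi-runner register
# Please enter the gitlab-ci coordinator URL: https://ci.gitlab.com/
# Please enter the gitlab-ci token for this runner: <your registration token>
# Please enter the gitlab-ci description for this runner: php56-runner
# Please enter the executor (shell, docker, ...): docker
# Please enter the Docker image (eg. ruby:2.1): php56
```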
Now we've set up the runner, we can work on the image. As previously stated, the image is a Docker container which we're going to configure to lose its state after the build has completed.
As we've already installed Docker in the last step, we can work on our Dockerfile. This is a file which defines which packages we want to install for our container.
As we're mainly concerned with running tests and then losing state, we're going to configure Docker to install all these dependencies within a single configuration file. However, if you begin to look further into using Docker, you'd likely separate these out.
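A minimal sketch of such a Dockerfile might look like this; the PPA for PHP 5.6 and the exact package names are assumptions, since Ubuntu 14.04 ships an older PHP by default:

```dockerfile
FROM ubuntu:14.04

# PHP 5.6 isn't in the stock 14.04 repositories, so we assume a PPA for it
RUN apt-get update && apt-get install -y software-properties-common
RUN add-apt-repository -y ppa:ondrej/php5-5.6 && apt-get update

# Apache, PHP 5.6, MySQL, and Ruby (needed to build mailcatcher)
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y \
    apache2 php5 php5-mysql mysql-server \
    ruby ruby-dev build-essential libsqlite3-dev

RUN gem install mailcatcher
```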
Here we're installing Apache, MySQL, Ruby, and MailCatcher. These tools are adequate for our needs, but we can update this whenever we need to. Save this as "Dockerfile".
Now we need to build the file. You can do this by changing to the directory containing the Dockerfile and running:
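From the directory containing the Dockerfile, the build looks like this ("php56" is just the image name we've chosen for this example):

```shell
docker build -t php56 .
```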
"php56" is the name of the image you're creating; this can be whatever you want. You can create multiple builds with several Dockerfiles if you have different environments you'd like to test.
Finally, we need to tell the CI tool which images it's allowed to run. Although Docker Inc. have updated Docker to prevent container breakout, we don't really want to allow arbitrary images to run on the server. By default this is disabled, and the only image available is the one defined when the runner was registered.
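This is done in the runner's configuration file; a sketch assuming a config at /etc/gitlab-runner/config.toml (the path, runner name, and token below are placeholders from the registration step):

```toml
concurrent = 1

[[runners]]
  name = "php56-runner"
  url = "https://ci.gitlab.com/"
  token = "YOUR-RUNNER-TOKEN"
  executor = "docker"
  [runners.docker]
    image = "php56"
    # only allow the image we built earlier
    allowed_images = ["php56:*"]
```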
Here we've defined that the image we built earlier can be used with Gitlab CI under allowed_images. This can be replaced with "*" if you'd prefer to allow any image.
Enable The Runner
Now we've done all that admin work, we can enable the runner on the Gitlab CI website.
Setup a .gitlab-ci.yml file
We've finished setting up our runner, so now we can begin running tests! Our image already contains the majority of services we need, so we just need to create a file which defines how to run our tests. This file, .gitlab-ci.yml, lives in the root of the project's repository and is committed into version control.
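A minimal sketch of such a file; the job name, the MySQL start command, and the phpunit invocation are assumptions about your project's test setup:

```yaml
# .gitlab-ci.yml
before_script:
  - service mysql start
  - composer install

phpunit:
  script:
    - vendor/bin/phpunit
```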
We've already configured the service to run with our runner, so the next time you commit to the project, its tasks will run and you'll be able to view the status of each build on the GitLab CI website.
Merge requests will now also have the additional merge status field which will tell you whether you should merge your branch into master.
Build badges are also generated, which are a nice extra to quickly view the status of a project.
As you can see, the Gitlab CI tool is very powerful: you can define multiple environments with different versions of software, all running on the same server and quickly provisioned thanks to Docker containers.
If you want to have a play around with Digital Ocean, following my affiliate link will give you $10 free.