Title: Piping Software for Less
Author: Paul Grenyer
Date: Sun, 06 September 2020 18:23:37 +01:00
Summary: Paul Grenyer continues his mission to build a DevOps pipeline on a budget.
Developing software is hard and all good developers are lazy. This is one of the reasons we have tools which automate practices like continuous integration [1], static analysis [2] and measuring test coverage [3]. These practices help us to measure quality and find problems with code early. When you measure something, you can make it better. Automation makes the practices easy to perform, which means that lazy developers are likely to perform them more often, especially if they’re run automatically every time the developer checks code in.
This is old news. These practices have been around for more than twenty years. They have become industry standards and not using them is, quite rightly, frowned upon. What is relatively new is the introduction of cloud based services such as BitBucket Pipelines, CircleCI and SonarCloud, which allow you to set up these practices in minutes – however, this flexibility and efficiency comes with a cost.
Why
While BitBucket Pipelines, CircleCI and SonarCloud have free tiers, there are limits.
With BitBucket Pipelines, you only get 50 build minutes a month on the free tier. The next step up is $15/month and then you get 2500 build minutes.
On the free CircleCI tier, you get 2500 free credits per week [4] but you can only use public repositories, which means anyone and everyone can see your code. The use of private repositories starts at $15 per month.
With SonarCloud, you can analyse as many lines of code as you like, but again you have to have your code in a public repository or pay $10 per month for the first 100,000 lines of code.
If you want continuous integration and a static analysis repository that includes test coverage and you need to keep your source code private, you’re looking at a minimum of $15 per month for these cloud based solutions and that’s if you can manage with only 50 build minutes per month. If you can’t, it’s more likely to be $30 per month… that’s $360 per year.
That’s not a lot of money for a large software company or even for a well funded startup or SME, although as the number of users goes up so does that price. For a personal project, it’s a lot of money.
Cost isn’t the only drawback: with these approaches you can lose some flexibility as well.
The alternative is to build your own development pipelines.
I bet you’re thinking that setting up these tools from scratch is a royal pain and will take hours, when the cloud solutions can be set up in minutes. Not to mention running and managing your own pipeline on your personal machine… and don’t they suck resources when they’re running in the background all the time? And shouldn’t they be set up on isolated machines? What if I told you that you could set all of this up in about an hour and turn it all on and off as necessary with a single command? And that if you wanted to, you could run it all on a DigitalOcean Droplet for around $20 per month. Interested? Read on.
What
When you know how, setting up a continuous integration server such as Jenkins and a static analysis repository such as SonarQube in a Docker container is relatively straightforward. As is starting and stopping them altogether using Docker Compose. As I said, the key is knowing how, and what I explain in the rest of this article is the product of around twenty development hours, a lot of which was banging my head against a number of individual issues which turned out to have really simple solutions.
Docker
Docker is a way of encapsulating software in a container: anything from an entire operating system such as Ubuntu to a simple tool such as the scanner for SonarQube. The configuration of a container is described in a Dockerfile, and Docker uses Dockerfiles to build, start and stop containers. Jenkins and SonarQube each have publicly available Docker images, which we’ll use, with a few relatively minor modifications, to build a development pipeline.
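If you haven’t used Docker before, a minimal example may help; this isn’t part of the pipeline we’re about to build, and the image name hello-pipeline is just a placeholder. Save the following as Dockerfile in an empty directory:

FROM ubuntu:20.04
CMD ["echo", "Hello from inside a container"]

Then build and run it:

docker build -t hello-pipeline .
docker run --rm hello-pipeline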
Docker Compose
Docker Compose is a tool which orchestrates Docker containers. Via a simple YML file, it is possible to start and stop multiple Docker containers with a single command. This means that – once configured – we can start and stop the entire development pipeline so that it is only running when we need it or, via a tool such as Terraform, construct and provision a DigitalOcean droplet (or AWS service, etc.) with a few simple commands, and tear it down again just as easily so that it only incurs cost when we’re actually developing. Terraform and DigitalOcean are beyond the scope of this article, but I plan to cover them in the near future.
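To give a flavour of that single command (we’ll write the docker-compose.yml itself shortly), starting and stopping a Compose-managed pipeline looks like this; -d runs the containers detached in the background, whereas later in this article I run docker-compose up in the foreground so the logs stay visible:

docker-compose up -d
docker-compose down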
See the Docker and Docker Compose websites for instructions on how to install them for your operating system.
What is Terraform? See [5] for an introduction.
How
In order to focus on the development pipeline configuration, I’ll describe how to create an extremely simple Dotnet Core class library with a very basic test, then describe in more detail how to configure and run the Jenkins and SonarQube Docker containers and set up simple projects in both to demonstrate the pipeline. I’ll also describe how to orchestrate the containers with Docker Compose.
I’m using Dotnet Core because that’s what I’m working with on a daily basis. The development pipeline can also be used with Java, Node, TypeScript or any of the other supported languages. Dotnet Core is also free to install and use on Windows, Linux and Mac, which means that anyone can follow along.
A simple Dotnet Core class library project
I’ve chosen to use a class library project as an example for two reasons. It means that I can easily use a separate project for the tests, which allows me to describe the development pipeline more iteratively. It also means that I can use it as the groundwork for a future article which introduces the NuGet server Liget to the development pipeline.
Open a command prompt and start off by creating an empty directory and moving into it.
mkdir messagelib
cd messagelib
Then open the directory in your favourite IDE; I like VSCode for this sort of project. Add a Dotnet Core appropriate .gitignore file and then create a solution and a class library project and add the project to the solution:
dotnet new sln
dotnet new classlib --name Messagelib
dotnet sln add Messagelib/Messagelib.csproj
Delete Messagelib/Class1.cs and create a new class file and class called Message:
using System;

namespace Messagelib
{
    public class Message
    {
        public string Deliver()
        {
            return "Hello, World!";
        }
    }
}
Make sure it builds with:
dotnet build
Commit the solution to a public git repository, or you can use the existing one in my BitBucket account here: https://bitbucket.org/findmytea/messagelib
A public repository keeps this example simple and, although I won’t cover it here, it’s quite straightforward to add a key to a BitBucket or GitHub private repository and to Jenkins so that it can access them.
Remember that one of the main driving forces for setting up the development pipeline is to allow the use of private repositories without having to incur unnecessary cost.
Run Jenkins in Docker with Docker Compose
Why use Jenkins, I hear you ask. Well, for me the answers are simple: familiarity and the availability of an existing tested and officially supported Docker image. I have been using Jenkins for as long as I can remember.
The official image is here: https://hub.docker.com/r/jenkins/jenkins
After getting Jenkins up and running in the container, we’ll look at creating a ‘Pipeline’ with the Docker Pipeline plugin. Jenkins supports lots of different ‘Items’ (which used to be called ‘Jobs’), and Docker can be used to encapsulate build and test environments as well. In fact, this is what BitBucket Pipelines and CircleCI do too.
To run a Jenkins Pipeline, we need a Jenkins installation with Docker installed. The easiest way to get one is to use the existing Jenkins Docker image from Docker Hub. Open a new command prompt, create a new directory for the development pipeline configuration and a sub-directory called jenkins, and put the Dockerfile in Listing 1 in it.
FROM jenkins/jenkins:lts

USER root

RUN apt-get update
RUN apt-get -y install \
    apt-transport-https \
    ca-certificates \
    curl \
    gnupg-agent \
    software-properties-common
RUN curl -fsSL https://download.docker.com/linux/debian/gpg | apt-key add -
RUN add-apt-repository \
    "deb [arch=amd64] https://download.docker.com/linux/debian \
    $(lsb_release -cs) \
    stable"
RUN apt-get update
RUN apt-get install -y docker-ce docker-ce-cli containerd.io
RUN service docker start

# drop back to the regular jenkins user - good practice
USER jenkins

Listing 1
You can see that our Dockerfile imports the existing Jenkins Docker image and then installs Docker for Linux. The Jenkins image, like most Docker images, is based on a Linux base image.
To get Docker Compose to build and run the image, we need a simple docker-compose.yml file in the root of the development pipeline directory with the details of the Jenkins service (see Listing 2).
version: '3'
services:
  jenkins:
    container_name: jenkins
    build: ./jenkins/
    ports:
      - "8080:8080"
      - "5000:5000"
    volumes:
      - ~/.jenkins:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock

Listing 2
Note the build parameter, which references the sub-directory where the Jenkins Dockerfile is located. Also note the volumes: we want the builds to persist even if the container does not, so create a .jenkins directory in your home directory:
mkdir ~/.jenkins
Specifying it as a volume in docker-compose.yml tells the Docker image to write anything which Jenkins writes to /var/jenkins_home in the container to ~/.jenkins on the host – your local machine. If the development pipeline is running on a DigitalOcean droplet, Spaces can be used to persist the volumes even after the droplet is torn down.
As well as running Jenkins in a Docker container, we’ll also be doing our build and running our tests in a Docker container. Docker doesn’t generally like being run inside another Docker container, so by specifying

/var/run/docker.sock

as a volume, the Jenkins container and the test container can both be run on the same Docker instance.
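If you want to convince yourself the socket trick works, you can mount the host’s socket into the official docker CLI image and list the host’s containers from inside a container – a quick sanity check, not a required step:

docker run --rm -v /var/run/docker.sock:/var/run/docker.sock docker:latest docker ps

Containers started through the socket this way are siblings of the Jenkins container rather than children nested inside it.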
To run Jenkins, simply bring it up with Docker Compose:
docker-compose up
(To stop it again just use Ctrl+C.)
The first time you start it, make sure you note the default admin password. It will appear in the log like this:
Jenkins initial setup is required. An admin user has been created and a password generated.
Please use the following password to proceed to installation:

<password>

This may also be found at: /var/jenkins_home/secrets/initialAdminPassword
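If the password has already scrolled out of view, you can also read it straight from the running container – the container name jenkins comes from our docker-compose.yml:

docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword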
To configure Jenkins for the first time open a browser and navigate to:
http://localhost:8080
Then:
- Paste in the default password and click continue.
- Install the recommended plugins. This will take a few minutes. There is another plugin we need too which can be installed afterwards.
- Create the first admin user and click Save & Continue.
- Confirm the Jenkins URL and click Save & Finish.
- Click Start Jenkins to start Jenkins.
You now have Jenkins up and running locally in a Docker container!
To use Docker pipelines in Jenkins, we need to install the plugin. To do this:
- Select Manage Jenkins from the left hand menu, followed by Manage Plugins.
- Select the Available tab, search for Docker Pipeline and select it.
- Click Download now and install after restart.
- On the next page, put a tick in the restart after download check box and wait for the installation and for Jenkins to restart. Then log in again.
Next we need to create the Docker Pipeline for the Messagelib solution.
- Select New Item from the left hand menu, enter Messagelib as the name, select Pipeline and click OK.
- Scroll to the Pipeline section and select Pipeline script from SCM from the Definition dropdown. This is because we’re going to define our pipeline in a file in the Messagelib solution.
- From the SCM dropdown, select Git and enter the repository URL of the Messagelib solution (see Figure 1).
- Then click Save.
Figure 1
Jenkins is now configured to run the Messagelib pipeline, but we need to tell it what to do by adding a text file called Jenkinsfile to the root of the Messagelib solution. (See Listing 3.)
/* groovylint-disable CompileStatic, GStringExpressionWithinString, LineLength */
pipeline {
    agent {
        docker { image 'pjgrenyer/dotnet-build-sonarscanner:latest' }
    }
    stages {
        stage('Build & Test') {
            steps {
                sh 'dotnet clean'
                sh 'dotnet restore'
                sh 'dotnet build'
            }
        }
    }
}

Listing 3
This very simple Groovy script tells the Jenkins pipeline to get the latest ‘dotnet-build-sonarscanner’ Docker image and then use it to clean, restore and build the dotnet project. ‘dotnet-build-sonarscanner’ is a Docker image I built and pushed to Docker Hub using the Dockerfile in Listing 4.
FROM mcr.microsoft.com/dotnet/core/sdk:latest AS build-env
WORKDIR /

RUN apt update
RUN apt install -y default-jre

ARG dotnet_cli_home_arg=/tmp
ENV DOTNET_CLI_HOME=$dotnet_cli_home_arg
ENV DOTNET_CLI_TELEMETRY_OPTOUT=1
ENV PATH="${DOTNET_CLI_HOME}/.dotnet/tools:${PATH}"
ENV HOME=${DOTNET_CLI_HOME}

RUN dotnet tool install --global dotnet-sonarscanner
RUN chmod 777 -R ${dotnet_cli_home_arg}

Listing 4
This creates and configures a development environment for Dotnet Core and Sonar Scanner, which requires Java. (There is a way to use the Dockerfile directly, rather than getting it from Docker Hub [6].)
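If you’d rather build the agent image from a Dockerfile checked into the repository than pull mine from Docker Hub, the Docker Pipeline plugin supports that directly [6]. A sketch, assuming Listing 4’s Dockerfile is committed to the root of the solution:

pipeline {
    agent {
        // Build the agent image from the Dockerfile in the repository root
        // instead of pulling a pre-built image from Docker Hub.
        dockerfile true
    }
    stages {
        stage('Build') {
            steps {
                sh 'dotnet build'
            }
        }
    }
}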
Once the Jenkinsfile is added to the project and committed, set the build off by clicking Build now from the left hand menu of the Messagelib item. The first run will take a little while as the Docker image is pulled (or built). Future runs won’t have to do that and will be quicker. You should find that once the image is downloaded, the project is built quickly and Jenkins shows success.
Run SonarQube in Docker with Docker Compose
When I first started using Jenkins, there were plugins for lots of different static analysis tools which would generate reports as part of the build. Then SonarQube came along as both a central repository for static analysis results and as a tool for grouping multiple static analysis tools together. [7]
SonarQube ships with an H2 in-memory database for evaluation purposes, but this isn’t ideal for long-term use. Fortunately, it’s very easy to run PostgreSQL in a Docker container and use that instead.
- Create a postgresql directory at the root of your development pipeline project:
mkdir postgresql
- Move into the directory and create a new file called init-sonar.sql with the following content:
CREATE DATABASE sonar;
CREATE USER sonar WITH PASSWORD 'sonar';
GRANT ALL PRIVILEGES ON DATABASE sonar TO sonar;
This file is used by the PostgreSQL Docker container to create an empty Sonar database. It is created the first time the container starts. You may wish to use a more secure password, but as we’ll see shortly, the database will only be accessible to other containers on the same Docker host.
- Create a Dockerfile in the postgresql directory:
FROM postgres:latest
COPY init-sonar.sql /docker-entrypoint-initdb.d/init-sonar.sql
The Dockerfile uses the official postgres image [8] and copies in the init-sonar.sql file.
- Add the following to the docker-compose.yml file:
db:
  build: ./postgresql/
  environment:
    POSTGRES_PASSWORD: 3PdrAz6xWe5yErnQ
  volumes:
    - ~/.postgresql/:/var/lib/postgresql/data
To be able to build the postgresql image, you need to specify a root user password and configure a volume to write the data to your local machine. Note that there is no port (-p) argument. This is because, as we’ll see, Docker containers running on the same host and started by Docker Compose are all added to the same network. So while the postgresql container is not accessible from the host or any remote machine, it is accessible to other Docker containers.
- Create a directory on your local machine for the data:
mkdir ~/.postgresql
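Incidentally, you can inspect the shared network Docker Compose creates once the containers are up. The network name is derived from the directory containing docker-compose.yml, so devpipeline_default below is an assumption – substitute your own directory name:

docker network ls
docker network inspect devpipeline_default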
The easiest way to add SonarQube is to configure it directly in the docker-compose.yml file, as its Docker image does not need to be modified.
- Add the information in Listing 5 to the docker-compose.yml file.
The SonarQube Docker image is used, and the port, the database credentials and the volumes are specified. Note that the host section of the connection string is db, the name of the docker-compose.yml section which starts the postgresql database. This is how Docker images created on the same host by Docker Compose refer to each other.
- Create directories on your local machine for the data and logs:
mkdir ~/.sonar
mkdir ~/.sonar/data
mkdir ~/.sonar/logs
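Once the containers are up, you can check that the init script did its job by running psql inside the db container via Compose – db is the service name from our docker-compose.yml, and this assumes the container is already running:

docker-compose exec db psql -U sonar -d sonar -c '\l'

You should see the sonar database in the list.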
- SonarQube uses an embedded Elasticsearch, so if you’re using Linux you need to configure your local machine with the Elasticsearch production mode requirements and File Descriptors configuration. To do this execute the following at the command line:
sysctl -w vm.max_map_count=262144
sysctl -w fs.file-max=65536
ulimit -n 65536
ulimit -u 4096
If you’re running on Windows, you don’t need to do this.
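These settings don’t survive a reboot. If you want the kernel setting to persist on Linux – an optional extra, not part of the original setup – you can append it to /etc/sysctl.conf and reload:

echo 'vm.max_map_count=262144' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p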
sonar:
  container_name: sonar
  image: sonarqube:latest
  ports:
    - "9000:9000"
  environment:
    - SONARQUBE_JDBC_USERNAME=sonar
    - SONARQUBE_JDBC_PASSWORD=sonar
    - SONARQUBE_JDBC_URL=jdbc:postgresql://db:5432/sonar
  volumes:
    - ~/.sonar/data:/opt/sonarqube/data
    - ~/.sonar/logs:/opt/sonarqube/logs

Listing 5
SonarQube is now ready to run, so run Docker Compose:
docker-compose up
The first time it runs, SonarQube will configure itself and create the database it needs. To see SonarQube running, open a browser and navigate to http://localhost:9000.
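Startup can take a minute or two while the database is created. If you’d rather poll from the command line than keep refreshing the browser, SonarQube exposes a status endpoint:

curl http://localhost:9000/api/system/status

It returns a small JSON document whose status field reads UP once SonarQube is ready.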
Static analysis with SonarQube
Now that SonarQube is up and running, we can create a project to hold the analysis results from Messagelib.
The default SonarQube username and password are both ‘admin’. This can be easily changed. Let’s create a project so we can analyse Messagelib:
- Login to SonarQube (user: admin, password: admin).
- Click Create new project.
- Enter messagelib as the Project Key and Display Name – this is how SonarQube knows which project it is being asked to analyse.
- Click setup.
- Then ignore the rest of the process as we don’t need it.
Next we need to return to the Messagelib solution and update the Jenkinsfile so that it scans the project build output (see Listing 6). The new sections of the file are the extra args line and the ‘Sonar setup’ and ‘Sonar End’ stages.
pipeline {
    agent {
        docker {
            image 'pjgrenyer/dotnet-build-sonarscanner:latest'
            args '--network host'
        }
    }
    stages {
        stage('Sonar setup') {
            steps {
                sh 'dotnet sonarscanner begin /k:messagelib'
            }
        }
        stage('Build & Test') {
            steps {
                sh 'dotnet clean'
                sh 'dotnet restore'
                sh 'dotnet build'
                sh 'dotnet test'
            }
        }
        stage('Sonar End') {
            steps {
                sh 'dotnet sonarscanner end'
            }
        }
    }
}

Listing 6
The Docker container which is used to run the build and tests isn’t created by Docker Compose so we have to manually configure it to be on the host network so that it can talk to the SonarQube container by passing an extra argument:
args '--network host'
Before the build starts, the SonarQube scanner needs to be started and configured with the project key for the SonarQube project:
stage('Sonar setup') {
    steps {
        sh 'dotnet sonarscanner begin /k:messagelib'
    }
}
Recall that the SonarQube scanner is part of the ‘dotnet-build-sonarscanner’ image and therefore available to all builds which use it in a Docker pipeline. After the build, the SonarQube scanner needs to be stopped and told to send the results to SonarQube:
stage('Sonar End') {
    steps {
        sh 'dotnet sonarscanner end'
    }
}
Check the Jenkinsfile changes into your repository and kick off the Jenkins item again. Once it’s finished, you should see something like Figure 2 in SonarQube after you log in.
Figure 2
Then if you drill down (click on the project name) you get more detail (see Figure 3).
Figure 3
If you then click on the number of ‘code smells’ and then on one of the issues you’ll get the details for the individual issues and their position in the code (see Figure 4).
Figure 4
I’ll leave fixing the issues as an exercise for the reader.
Measuring test coverage with SonarQube
As well as static analysis, SonarQube also collates test code coverage. While SonarQube supports a handful of different code coverage tools, the important thing is to get the format of the coverage output file to match one of the supported tools.
SonarQube is currently showing 0% code coverage. While this figure is accurate, it isn’t the result of any actual measurement. In order to measure test coverage, we need to write and run some tests, and tell the SonarQube scanner where to find the output and what format it is in. Let’s start with a test.
Go to your Messagelib solution top level directory, create an XUnit test project, add it to the solution and add the Messagelib class library project as a dependency:
dotnet new xunit --name MessagelibTest
dotnet sln add MessagelibTest/MessagelibTest.csproj
dotnet add MessagelibTest/MessagelibTest.csproj reference Messagelib/Messagelib.csproj
Dotnet Core XUnit projects come with OpenCover [9] by default, but to generate a coverage report an extra package is needed. Change to the MessagelibTest directory and add the package:
dotnet add package coverlet.msbuild --version 2.9.0
Then switch back to the root directory of the solution.
Delete MessagelibTest/UnitTest1.cs and create a new class file and class called MessageTest (see Listing 7).
using Messagelib;
using Xunit;

namespace MessagelibTest
{
    public class MessageTest
    {
        [Fact(DisplayName = "Check that the correct delivery message is returned.")]
        public void DeliverMessage()
        {
            Assert.Equal("Hello, World!", new Message().Deliver());
        }
    }
}

Listing 7
Check that the solution builds and the test passes by running:
dotnet test
at the command line. To generate the necessary output files and to tell SonarQube scanner where to find them we need to update the Jenkinsfile again (see Listing 8).
stage('Sonar setup') {
    steps {
        sh 'dotnet sonarscanner begin /k:messagelib /d:sonar.cs.opencover.reportsPaths=MessagelibTest/coverage.opencover.xml /d:sonar.coverage.exclusions="**Test*.cs"'
    }
}

Listing 8
Adding the new arguments tells the SonarQube scanner to look for a file called coverage.opencover.xml in the MessagelibTest directory and not to measure test coverage on the test classes themselves (see Listing 9).
stage('Build & Test') {
    steps {
        sh 'dotnet clean'
        sh 'dotnet restore'
        sh 'dotnet build'
        sh 'dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=opencover'
    }
}

Listing 9
Adding the new arguments tells dotnet to generate test coverage and output the results in OpenCover format when it runs the tests.
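You don’t have to wait for Jenkins to check that the report is produced. Running the same command locally from the solution root writes the file the scanner looks for:

dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=opencover

Afterwards you should find coverage.opencover.xml in the MessagelibTest directory.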
Check the solution in, run the Jenkins item again and see the coverage reach 100% (Figure 5).
Figure 5
Finally
This brings us to the end. I’ve shown you:
- How to build, configure and run a development pipeline with Docker, Docker Compose, Jenkins, SonarQube and Postgres.
- How to set up a Docker pipeline on Jenkins for continuous integration.
- How to perform static analysis and measure code coverage on a simple Dotnet Core project.
In many cases this will be all small project developers need. However, there’s plenty more which can be done, including:
- Building and running the development pipeline on DigitalOcean.
- Adding a NuGet server to publish private packages to.
I hope this article has persuaded you that you can save money on your existing cloud based development pipelines and whetted your appetite for what can be achieved with Docker and Docker Compose.
Until next time...
Acknowledgements
Thank you to Jez Higgins, Neil Carroll, Philip Watson, Alex Scotton and Lauren Gwynn for inspiration, ideas and reviews!
References
[1] Continuous Integration: https://codeship.com/continuous-integration-essentials
[2] Static Analysis: https://www.perforce.com/blog/sca/what-static-analysis
[3] Code Coverage: https://www.atlassian.com/continuous-delivery/software-testing/code-coverage
[4] CircleCI Credits: Credits are used to pay for your team’s usage based on machine type and size, and premium features like Docker layer caching. See https://circleci.com/pricing/
[5] Terraform: https://www.terraform.io/intro/
[6] Using Dockerfile directly: https://www.jenkins.io/doc/book/pipeline/docker/
[7] SonarQube official image: https://hub.docker.com/_/sonarqube/
[8] Postgres official image: https://hub.docker.com/_/postgres
[9] OpenCover is a code coverage tool for .NET 2 and above, with support for 32- and 64-bit processes, covering both branch and sequence points: https://github.com/OpenCover/opencover
Paul Grenyer is a husband, father, software consultant, author, testing and agile evangelist.