25 May 2021
Docker is useful for so many things, like setting up a testing environment or quickly fiddling around with an unfamiliar software application without having to install it. The other day I needed to go through some old data stored in an MS SQL database backup (a .bak file), and since I work on a MacBook Pro, I couldn't install MS SQL Server directly. Conveniently enough, Docker images exist for MS SQL Server, so I was able to run it (temporarily) on my laptop. The following docker run command starts a simple MS SQL Server 2019 (Express Edition) instance:
docker run --rm -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=some_password' -e 'MSSQL_PID=Express' mcr.microsoft.com/mssql/server:2019-latest
That was one less problem to think about. The next (and real) problem was restoring the backup file with an SQL query. I found many questions and answers on this topic, but none of them used a Docker container for their MS SQL Server instance. The catch is that the container needs a volume mapped from the host, so the server can actually reach the .bak file. The following docker run command starts an MS SQL Server instance with such a volume mapping, and with a restriction so the container is only reachable from my own machine (in this case, on port 12345):
docker run --rm -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=some_password' -e 'MSSQL_PID=Express' -p 127.0.0.1:12345:1433 -v /path/to/database/backup/:/tmp:rw mcr.microsoft.com/mssql/server:2019-latest
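As an aside: if you do not have a database management tool at hand, you can run the queries below just as well with sqlcmd from inside the container. A minimal sketch, assuming you add --name mssql to the docker run command above (the command as shown does not name the container); the sqlcmd path may differ between image versions:
# Open an interactive sqlcmd session inside the running container
docker exec -it mssql /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'some_password'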
Now open up your favourite database management tool and connect to the SQL Server instance we just started (host 127.0.0.1, port 12345, user sa and the password from the run command). Enter the following query and your database is ready to use:
USE [master]
RESTORE DATABASE NAME_OF_DATABASE_TO_RESTORE
FROM DISK = '/tmp/backup.bak'
WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 5,
MOVE 'NAME_OF_DATABASE_TO_RESTORE' TO '/var/opt/mssql/data/NAME_OF_DATABASE_TO_RESTORE.mdf',
MOVE 'NAME_OF_DATABASE_TO_RESTORE_log' TO '/var/opt/mssql/data/NAME_OF_DATABASE_TO_RESTORE_log.ldf'
GO
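One thing to watch out for: the names after MOVE must be the logical file names stored inside the backup, and they do not always match the database name. If the restore complains about them, you can list the logical names first with a standard RESTORE FILELISTONLY query (plain T-SQL, nothing Docker-specific):
RESTORE FILELISTONLY
FROM DISK = '/tmp/backup.bak'
GO
Once the restore has finished, a quick query confirms the data is there: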
USE [NAME_OF_DATABASE_TO_RESTORE]
SELECT *
FROM some_table
Voila!
18 Oct 2017
Many people believe the concept of DevOps can be captured in a simple definition. I tend to disagree. I’m not saying the concept of DevOps is hard to grasp or anything, I just think there hasn’t been any definition which is absolutely on point and understandable for anyone not directly working in IT. In this blogpost I’ll describe how I look at DevOps as a concept and how I try to help customers move forward.
Okay, so let’s face it: there are already so many established authorities who came up with their own definition of DevOps that we don’t need yet another. It reminds me of that one XKCD we all know, right? Ok, cool.
Everyone knows (maybe after some Googling) DevOps has something to do with bringing people working in Development closer to people working in Operations, to overcome obvious issues like who is responsible when an application stops responding properly. In my opinion DevOps simply has to do with three things: People, Processes, and Technology. If you treat these three areas as individual axes, we get a simple XYZ-space with an origin (let’s forget about negatives here). I’ll return to that analogy in a second. First, let’s concentrate on the areas alone. If you look at each of these areas as an individual maturity level, with zero maturity at the origin and extreme maturity at... well, infinity, we can map a real-life situation onto those axes. If we isolate the technology axis, for instance, the origin would mean we make no use of technology at all. Moving along the axis, we come across all kinds of technology or tools we could use: IDEs (Visual Studio, Netbeans, Eclipse, JDev, ...), source code management (SVN, Git, Mercurial, TFS, ...), continuous integration tools (Jenkins, Travis CI, Gitlab-CI, ...), and so on. The same can be done with the people and processes axes. Moving along each axis should bring you into a situation in which Development teams work more closely with Operations teams.
Coming back to the combination of the three areas: each combination of positions is a real-life situation in the DevOps operating domain. Every business that builds its own software is situated somewhere in that domain. Some have historically focused on technology (or tools), others more on processes or people; it really doesn’t matter.
Now think of the origin in our XYZ-space, a place where a lot can be done to improve one’s software building capabilities. Next, think of some large company (say Amazon, for instance) that does multiple deployments per day. That is clearly a situation far from the origin. People are well trained in following properly designed processes and they use some kind of technology or tools. Is this the ultimate DevOps way of doing things? Not necessarily. I like to think of this situation as a bicycle chain: there are no loose chain links (at least in plain sight), and the company is able to get a change deployed on their servers in reasonable time. Is this the end goal then? Again, not necessarily. The company might want to make things more efficient, or add features (more logging, more or better monitoring). It all depends.
Having said that, I think most of us have had customers asking questions like
I would like to do DevOps, because that’s trending everywhere in technology-land, can you help me with that?
Of course we first have to explain what DevOps is, but none of the known definitions will help our customer understand what it really means, since it’s not necessarily their area of expertise. I usually explain the analogy described above, and then suggest we establish the status quo: where in the XYZ-space are they situated now, or which chain links do they already have (whether or not some are already linked to one another), and which chain links do they feel are missing to complete the bicycle chain? And where would they like to be, by when, and with how much of their resources spent? All these questions should be answered in light of one underlying question: will this help us build and maintain our software in a more efficient way than we do now?
05 Dec 2016
For those who would like to fiddle around with a few services that could make up a development street, look no further! This post is meant as a starting point to set up a complete development street with only a small amount of work. Afterwards you can tailor the installations to better fit your needs.
This development street consists of the following services:
- Build server: Jenkins 2.x (at the time of writing 2.19.4)
- Version control system: GitLab 8.x (at the time of writing 8.14.2-ce.0)
- Software quality profiler: SonarQube 6.x (at the time of writing 6.1)
At the end of this manual you’ll have a set of services configured to be usable as a development street. As a ‘bonus’ I’ve added Portainer (from portainer.io) into the mix. If you’re new to Docker, the interface Portainer offers around the Docker daemon will help you with a few basic Docker activities.
Installation
There really is no ‘installing’ the development street; it’s just a matter of starting the right Docker containers. The hardest part, finding and combining the right Docker images, has already been done: the result is the Docker Compose file below.
To start the development street for the first time, just run the following command:
docker-compose -f devstreet.yml up -d
This will start all services defined in the devstreet.yml file. Once the images have been downloaded and the containers have started, you can start using the services. Well, almost.
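You can check whether all containers are up, or follow the startup logs of a single service (GitLab in particular can take a few minutes to come up), with the standard docker-compose subcommands:
docker-compose -f devstreet.yml ps
docker-compose -f devstreet.yml logs -f gitlab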
Configuration
Jenkins
When you first start the services, you can visit them at the following addresses:
- Jenkins: http://localhost:8080
- GitLab: http://localhost
- SonarQube: http://localhost:9000
- Portainer: http://localhost:9999
But before you do, let’s just finish the one-time ‘installations’.
Open up a web browser and point it to the Jenkins instance: http://localhost:8080
You’ll notice it asks you for an initial password, which you can get from the container by issuing the following command:
docker exec -it jenkins cat /var/jenkins_home/secrets/initialAdminPassword
A 32-character string will appear: the temporary administrator password. Enter this password in the input box in the Jenkins interface. After that, you can either let Jenkins continue with the suggested default plugins, or select exactly which plugins you would like to use. I’ll leave the choice of (extra) plugins to the reader; for this manual, I’ll just continue with the default plugins.
When everything is installed, Jenkins lets you create a new administrator user. Once you’re done with this, let’s continue to GitLab.
Gitlab
Open up http://localhost
GitLab starts by asking you to enter a new administrator password. After this, it logs you back out, and you need to sign in with username ‘root’ and the newly entered password.
After this, you’ll arrive in the main GitLab interface.
SonarQube
Go to the web interface of SonarQube at http://localhost:9000
The default username / password combination of the SonarQube instance is: admin / admin.
If you want, you can add new software quality profiles in the configuration area.
Portainer
The web interface of Portainer can be found at http://localhost:9999; with it you can inspect which Docker images are available on the system, which Docker volumes and networks exist, and so on.
Development Street Overview

The idea of this setup is as follows. There is one public repository on GitLab with all kinds of pipeline tools defined as Groovy functions. This repo can be seen at the top left of the image above and is called ‘cd’. I’ll cover a sample of its contents in the next blogpost.
Every project loads these tools in its own pipeline definition file, which has to be called ‘Jenkinsfile’. In the (Multibranch Pipeline) Jenkins job configuration, just point to the GitLab repository; Jenkins will recognize the Jenkinsfile automatically and generate a pipeline tailored to that specific build configuration.
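To give an idea of the mechanism (the actual pipeline tools are for the next blogpost), a Jenkinsfile for such a Multibranch job can be as small as the sketch below; the stage names and the shell step are placeholders of my own, not the real ‘cd’ functions:
node {
    stage('Checkout') {
        // Check out the branch this Multibranch pipeline run was triggered for
        checkout scm
    }
    stage('Build') {
        // Placeholder build step; replace with calls to the shared 'cd' tools
        sh './build.sh'
    }
}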
Stuff to consider
Of course there are things to consider. If Docker is not used properly, for instance, you might lose all the data generated while the services were running.
Stopping / (Re)starting
When you stop the development street, do not use
docker-compose -f devstreet.yml down -v
because that would remove the volumes as well. We want to be able to start and stop the development street at any time without losing any data. Instead, just use
docker-compose -f devstreet.yml down
Persistent storage
In light of the point above, it would be a good idea to keep the persistent storage in a location on the host file system that you control.
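For example, the named volumes in devstreet.yml could be replaced by bind mounts to a directory next to the compose file. A sketch for the Jenkins service only; the ./data directory is my own assumption, and depending on your host you may need to make it writable for the container’s jenkins user (typically UID 1000 in the official image):
  jenkins:
    container_name: jenkins
    image: jenkins:2.19.4-alpine
    ports:
      - "8080:8080"
      - "50000:50000"
    volumes:
      # Bind mount to a host directory instead of a named volume, so the
      # Jenkins home survives even when the Docker volumes are removed.
      - ./data/jenkins_home:/var/jenkins_home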
devstreet.yml
So, without further ado, here are the contents of the Docker Compose file, which I called devstreet.yml:
version: '2'
services:
  portainer:
    container_name: portainer
    image: portainer/portainer
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    ports:
      - "9999:9000"
  gitlab:
    container_name: gitlab
    image: gitlab/gitlab-ce:8.14.2-ce.0
    ports:
      - "22:22"
      - "80:80"
      - "443:443"
    volumes:
      - gitlab_etc:/etc/gitlab
      - gitlab_log:/var/log/gitlab
      - gitlab_data:/var/opt/gitlab
  # Default administrator password can be obtained via: docker exec -it jenkins cat /var/jenkins_home/secrets/initialAdminPassword
  jenkins:
    container_name: jenkins
    image: jenkins:2.19.4-alpine
    ports:
      - "8080:8080"
      - "50000:50000"
    volumes:
      - jenkins_home:/var/jenkins_home
  # Default username / password: admin / admin
  sonar:
    container_name: sonar
    image: sonarqube:6.1-alpine
    environment:
      - SONARQUBE_JDBC_USERNAME=sonar
      - SONARQUBE_JDBC_PASSWORD=sonar
      - SONARQUBE_JDBC_URL=jdbc:postgresql://sonar_db:5432/sonar
    links:
      - sonar_db:sonar_db
    ports:
      - "9000:9000"
      - "9092:9092"
    volumes_from:
      - sonar_plugins
    restart: always
  sonar_plugins:
    container_name: sonar_plugins
    image: sonarqube:5.6-alpine
    volumes:
      - sonar_plugins_extensions:/opt/sonarqube/extensions
      - sonar_plugins_bundled:/opt/sonarqube/bundled-plugins
    command: /bin/true
  sonar_db:
    container_name: sonar_db
    image: postgres:9.4
    environment:
      - POSTGRES_USER=sonar
      - POSTGRES_PASSWORD=sonar
    ports:
      - "5432:5432"
volumes:
  gitlab_etc: {}
  gitlab_log: {}
  gitlab_data: {}
  jenkins_home: {}
  sonar_plugins_extensions: {}
  sonar_plugins_bundled: {}