Jenkinsfile Docker pipeline multi-stage

Using a Jenkinsfile to configure the Jenkins build job for your source code is great. Jenkins has a very nice Docker Pipeline plugin that makes it possible to execute Docker commands during the build.

Note: Don’t forget to read the update of 16 August 2018 further down this page.

However, a lot of the examples at https://jenkins.io/doc/book/pipeline/docker/ keep it very simple. They start and stop in one pipeline stage, with methods like docker.inside or docker.withRun: building a container, running it, executing commands in it and destroying it, all within one stage. For several use cases this is fine, but for building an application Docker container it is much nicer to spread the work over multiple stages.
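For reference, such a single-stage example looks roughly like the following sketch (scripted-pipeline style; the image name and the test command are placeholders):

node {
    checkout scm
    // Build the image and run the commands inside the container;
    // the plugin stops and removes the container when the closure ends
    def image = docker.build('my-app:latest')
    image.inside {
        sh 'make test'
    }
}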

In the more advanced example on this page, I also use a pipelineContext global variable of type LinkedHashMap. The Jenkinsfile programming language is Groovy, and in Groovy this is roughly the equivalent of a JavaScript object. This variable makes it possible to share data or objects between stages. Let’s take a look at the “Run” stage.

stage('Run') {
    steps {
        echo "Run docker image"
        script {
            pipelineContext.dockerContainer = pipelineContext.dockerImage.run()
        }
    }
}
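If the container needs to publish ports or set environment variables, run() also accepts the usual docker run arguments; the port mapping below is purely an illustration:

pipelineContext.dockerContainer = pipelineContext.dockerImage.run('-p 8080:8080')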

Running a Docker container based on an image returns a container instance. You need this container object to be able to stop and remove the container when the build fails or is finished.

post {
    always {
        echo "Stop Docker image"
        script {
            if (pipelineContext && pipelineContext.dockerContainer) {
                pipelineContext.dockerContainer.stop()
            }
        }
    }
}
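As far as I can tell, the stop() method of the Docker Pipeline plugin not only stops the container but also removes it, so this single call is enough to clean up.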

Below is a complete example of a Declarative Pipeline that spreads several Docker commands across multiple stages.

// Initialize a LinkedHashMap / object to share between stages
def pipelineContext = [:]

pipeline {
    agent any

    environment {
        DOCKER_IMAGE_TAG = "my-app:build-${env.BUILD_ID}"
    }

    stages {
        stage('Configure') {
            steps {
                echo 'Create parameters file'
            }
        }
        stage('Build') {
            steps {
                echo "Build docker image"
                script {
                    pipelineContext.dockerImage = docker.build("${env.DOCKER_IMAGE_TAG}", '-f ./Dockerfile .')
                }
            }
        }
        stage('Run') {
            steps {
                echo "Run docker image"
                script {
                    pipelineContext.dockerContainer = pipelineContext.dockerImage.run()
                }
            }
        }
        stage('Test') {
            steps {
                echo "Testing the app"
            }
        }
        stage('Push') {
            steps {
                echo "Pushing the Docker image to the registry"
            }
        }
        stage('Deploy') {
            steps {
                echo "Deploying the Docker image"
            }
        }
        stage('Verify') {
            parallel {
                stage('Verify home') {
                    agent any
                    steps {
                        echo "HTTP request to verify home"
                    }
                }
                stage('Verify health check') {
                    agent any
                    steps {
                        echo "HTTP request to verify application health check"
                    }
                }
                stage('Verify regression tests') {
                    agent any
                    steps {
                        echo "Running regression test suite"
                    }
                }
            }
        }
    }
    post {
        always {
            echo "Stop Docker image"
            script {
                if (pipelineContext && pipelineContext.dockerContainer) {
                    pipelineContext.dockerContainer.stop()
                }
            }
        }
    }
}
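The Push and Deploy stages above only echo what they would do. To give an idea, a real Push stage could look something like the sketch below; the registry URL and the credentials ID 'docker-registry' are assumptions that depend on your own setup:

stage('Push') {
    steps {
        echo "Pushing the Docker image to the registry"
        script {
            // Registry URL and credentials ID are placeholders for your own setup
            docker.withRegistry('https://registry.example.com', 'docker-registry') {
                pipelineContext.dockerImage.push("build-${env.BUILD_ID}")
                pipelineContext.dockerImage.push('latest')
            }
        }
    }
}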

UPDATE 16 August 2018: pipelineContext is not required (anymore?)

I’m not sure why I experienced problems at the time of writing this post, but the use of a global pipelineContext variable is not required (anymore?). Variables assigned in a script block can simply be referenced across stages.

pipeline {
    agent any

    environment {
        DOCKER_IMAGE_TAG = "my-app:build-${env.BUILD_ID}"
    }

    stages {
        stage('Build') {
            steps {
                script {
                    docker_image = docker.build("${env.DOCKER_IMAGE_TAG}", '-f ./Dockerfile .')
                }
            }
        }
        stage('Test') {
            parallel {
                stage('Unit tests') {
                    agent any
                    steps {
                        script {
                            docker_image.inside("--entrypoint='/start.sh'") {
                                sh 'cd /var/www/app && vendor/bin/phpunit --testsuite=Unittest'
                            }
                        }
                    }
                }
                stage('Health check') {
                    agent any
                    steps {
                        script {
                            docker_image.inside("--entrypoint='/start.sh'") {
                                timeout(time: 1, unit: 'MINUTES') {
                                    retry(5) {
                                        sleep 5
                                        sh "curl -sS http://localhost/info | grep 'My API'"
                                    }
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
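Note that with inside() the Docker Pipeline plugin starts the container, runs the sh steps inside it and stops and removes it again as soon as the closure ends, so the explicit post { always { ... stop() } } cleanup from the first example is not needed here.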
