Jenkins Pipeline for serverless builds

Aug 31, 2019 | Big Data

Jenkins supports Docker as a container engine. As a result, Jenkins pipelines are moving towards serverless builds. Using the built-in Docker support in pipelines is pretty simple: the pipeline performs native calls to Docker, and Docker can run any image from a Docker registry.

Jenkins Pipeline executes a Docker container

 

Jenkins Pipeline 1

pipeline {
    agent {
        docker { image 'maven:latest' }
    }
    stages {
        stage('Test') {
            steps {
                sh 'mvn -version'
            }
        }
    }
}
Usually the ‘agent’ is a shell environment local to the Jenkins host: when the pipeline calls a command, it is executed in that shell. With a Docker agent the shell moves into a container environment. Jenkins starts the container and executes the command in its context. The example pulls the latest Maven image from the registry; the pipeline then prints the Maven version to the console. In consequence, Jenkins can execute any Docker image.
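
For comparison, the same step can also be written in scripted pipeline syntax using the Docker Pipeline plugin’s docker global variable. The following is only a minimal sketch of that variant:

node {
    // Pull the image if necessary and start a maven:latest container.
    // The workspace is mounted into the container automatically.
    docker.image('maven:latest').inside {
        sh 'mvn -version' // runs inside the container
    }
}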

Jenkins Pipeline in Kubernetes context

Jenkins supports cloud environments like Kubernetes. The Jenkins Kubernetes plugin is a key to the serverless world. Only minor changes to the pipeline code are needed: now the cloud instantiates the Docker container.

Jenkins Pipeline 2

pipeline {
  agent {
    kubernetes {
      defaultContainer 'maven'
      yamlFile 'KubernetesPod.yaml'
    }
  }
  stages {
    stage('Run maven') {
      steps {
        sh 'mvn -version'
      }
    }
    stage('Run shell') {
      steps {
        container('mycontainer') {
          sh 'echo hello world'
        }
      }
    }
  }
}
In this example the agent is a Kubernetes cloud. Kubernetes groups containers into pods, so the YAML definition (KubernetesPod.yaml) declares one or more containers. Each container references a Docker image and has a name; the pipeline uses this name as a reference. The default container is ‘maven’: the first stage runs in it and calls Maven, while the second stage switches the context to a different container.
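
The content of KubernetesPod.yaml is not shown above. As a rough sketch, the pod definition can also be embedded directly in the agent block via the plugin’s yaml parameter; the container names must match the ones used in the pipeline, and the busybox image for ‘mycontainer’ is only an assumption:

agent {
  kubernetes {
    defaultContainer 'maven'
    // Inline pod definition instead of yamlFile 'KubernetesPod.yaml'
    yaml '''
      apiVersion: v1
      kind: Pod
      spec:
        containers:
        - name: maven
          image: maven:latest
          command:
          - cat
          tty: true
        - name: mycontainer
          image: busybox:latest
          command:
          - cat
          tty: true
    '''
  }
}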

Jenkins Pipeline stash and unstash

 

Jenkins Pipeline 3

pipeline {
  ...
  stages {
    stage('step 1') {
      steps {
        ...
        stash includes: 'dist/**/*', name: 'FOOBAR'
      }
    }
    stage('step 2') {
      steps {
        unstash 'FOOBAR'
      }
    }
  }
}
But how can containers share data? A pipeline is similar to an assembly line: the output of the first container becomes the input of the second one. Is there some magic in the YAML definition, for example a shared volume? That may be a solution, but it is not the recommended way. Instead, Jenkins provides the stash and unstash steps. The stash step collects files from the local workspace; an optional Ant-style pattern filters them, and the mandatory name associates the files with a label. Later, the unstash step restores these files by that label. In this example the pipeline copies files from stage 1 to stage 2.
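
Putting it together, a minimal self-contained sketch could look as follows; the stage names and the dist/app.txt file are made up for illustration. Stash and unstash become really useful when the stages run on different agents or pods, because the workspace is not shared between them:

pipeline {
  agent any
  stages {
    stage('build') {
      steps {
        // Produce some output in the workspace.
        sh 'mkdir -p dist && echo artifact > dist/app.txt'
        // Save everything under dist/ under the label FOOBAR.
        stash includes: 'dist/**/*', name: 'FOOBAR'
      }
    }
    stage('deploy') {
      steps {
        // Restore the stashed files into the current workspace.
        unstash 'FOOBAR'
        sh 'cat dist/app.txt'
      }
    }
  }
}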
