Martin Heller
Contributor

What is Jenkins? The CI server explained

feature
Mar 15, 2023 | 10 mins
CI/CD, DevOps, Software Development

Jenkins offers a simple way to set up a continuous integration and continuous delivery environment for almost any combination of languages and source code repositories.

Jenkins offers a simple way to set up a continuous integration or continuous delivery (CI/CD) environment for almost any combination of languages and source code repositories using pipelines, as well as automating other routine development tasks. While Jenkins doesn’t eliminate the need to create scripts for individual steps, it does give you a faster and more robust way to integrate your entire chain of build, test, and deployment tools than you can easily build yourself.

Before Jenkins, the best a developer could do to avoid breaking the nightly build was to write and test code carefully on a local machine before committing it. But that meant testing changes in isolation, without everyone else's daily commits, so there was no firm guarantee that the nightly build would survive the latest commit.

Jenkins, originally Hudson, was a direct response to this limitation.

Hudson and Jenkins

In 2004, Kohsuke Kawaguchi was a Java developer at Sun Microsystems. Kawaguchi was tired of breaking builds in his development work and wanted to find a way to know, before committing code to the repository, whether the code was going to work. So Kawaguchi built an automation server in and for Java, called Hudson, to make that possible. Hudson became popular at Sun and spread to other companies as open source.

Fast-forward to 2011, and a dispute between Oracle (which had acquired Sun) and the independent Hudson open source community led to a fork with a name change, Jenkins. In 2014 Kawaguchi became CTO of CloudBees, which offers Jenkins-based continuous delivery products.

Both forks continued to exist for a time, although Jenkins was much more active. Today the Jenkins project remains active, while the Hudson website was shut down on January 31, 2020.

In March 2019 the Linux Foundation, along with CloudBees, Google, and a number of other companies, launched a new open source software foundation called the Continuous Delivery Foundation (CDF). Jenkins contributors decided that their project should join this new foundation. Kawaguchi wrote at the time that nothing of significance would change for users.

In January 2020 Kawaguchi announced he was moving to his new startup, Launchable. He also said that he would be officially stepping back from Jenkins, although staying on the CDF technical oversight committee. His role at CloudBees changed to advisor.

Jenkins automation

Today, Jenkins is the leading open-source automation server with some 1,600 to 1,800 plugins to support the automation of all kinds of development tasks. The problem Kawaguchi was originally trying to solve, continuous integration and continuous delivery of Java code (i.e., building projects, running tests, doing static code analysis, and deploying) is only one of many processes that people automate with Jenkins. The available plugins span five areas: platforms, UI, administration, source code management, and, most frequently, build management.

How Jenkins works

Jenkins is distributed as a WAR archive, as installer packages for the major operating systems, as a Homebrew package, as a Docker image, and as source code. Jenkins also supports installation and scaling on Kubernetes. The source code is mostly Java, with a few Groovy, Ruby, and ANTLR files.

You can run the Jenkins WAR standalone or as a servlet in a Java application server such as Tomcat. In either case, it produces a web user interface and accepts calls to its REST API.

When you run Jenkins for the first time, it creates an administrative user with a long random password, which it writes to a file and to the log; you paste that password into the initial web page to unlock the installation.

Jenkins plugins

Once installed, Jenkins allows you to either accept the default plugin list or choose your own plugins.

[Screenshot: Jenkins plugin installer]

Once you have picked your initial set of plugins, click the Install button and Jenkins will add them.

[Screenshot: Jenkins getting started]

The Jenkins main screen displays the current build queue and Executor status, and offers links to create new items (jobs), manage users, view build histories, manage Jenkins, look at your custom views, and manage your credentials.

[Screenshot: Jenkins main screen]

A new Jenkins item can be any of six types of job plus a folder for organizing items.

[Screenshot: Jenkins new item]

The Manage Jenkins page gives you access to about 18 administrative functions, including system and security configuration, plugin management, node management, and a command-line interface. At this point, however, we should look at pipelines, which are enhanced workflows that are typically defined by scripts.

[Screenshot: Jenkins manage screen]

Jenkins pipelines

Once you have Jenkins configured, it’s time to create some projects that Jenkins can build for you. While you can use the web UI to create scripts, the current best practice is to create a pipeline script, named Jenkinsfile, and check it into your repository. The screenshot below shows the configuration web form for a multibranch pipeline.

[Screenshot: Jenkins multibranch pipeline configuration]

As you can see, branch sources for this kind of pipeline in my basic Jenkins installation can be Git or Subversion repositories, including GitHub. If you need other kinds of repositories or different online repository services, it’s just a matter of adding the appropriate plugins and rebooting Jenkins. I tried, but couldn’t think of a source code management system that doesn’t already have a Jenkins plugin listed.

Jenkins pipelines can be declarative or scripted. A declarative pipeline, the simpler of the two, uses Groovy-compatible syntax—and if you want, you can start the file with #!groovy to point your code editor in the right direction. A declarative pipeline starts with a pipeline block, defines an agent, and defines stages that include executable steps, as in the three-stage example below.


pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}

pipeline is the mandatory outer block to invoke the Jenkins pipeline plugin. agent defines where you want to run the pipeline. any says to use any available agent to run the pipeline or stage. A more specific agent might declare a container to use, for example:


agent {
    docker {
        image 'maven:3-alpine'
        label 'my-defined-label'
        args  '-v /tmp:/tmp'
    }
}

stages contain a sequence of one or more stage directives. In the example above, the three stages are Build, Test, and Deploy.

steps do the actual work. In the example above the steps just printed messages. A more useful build step might look like the following:


pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                sh 'make'
                archiveArtifacts artifacts: '**/target/*.jar', fingerprint: true
            }
        }
    }
}

Here we are invoking make from a shell, and then archiving any produced JAR files to the Jenkins archive.

The post section defines actions that will be run at the end of the pipeline run or stage. You can use a number of post-condition blocks within the post section: always, changed, failure, success, unstable, and aborted.

For example, the Jenkinsfile below always runs JUnit after the Test stage, but only sends an email if the pipeline fails.


pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'make check'
            }
        }
    }
    post {
        always {
            junit '**/target/*.xml'
        }
        failure {
            mail to: 'team@example.com', subject: 'The Pipeline failed :('
        }
    }
}

The declarative pipeline can express most of what you need to define pipelines, and is much easier to learn than the scripted pipeline syntax, which is a Groovy-based DSL. The scripted pipeline is in fact a full-blown programming environment.

For comparison, the following two Jenkinsfiles are completely equivalent.

Declarative pipeline


pipeline {
    agent { docker 'node:6.3' }
    stages {
        stage('build') {
            steps {
                sh 'npm --version'
            }
        }
    }
}

Scripted pipeline


node('docker') {
    checkout scm
    stage('Build') {
        docker.image('node:6.3').inside {
            sh 'npm --version'
        }
    }
}

Blue Ocean, the Jenkins GUI

If you’d like the latest and greatest Jenkins UI, you can use the Blue Ocean plugin, which provides a graphical user experience. You can add the Blue Ocean plugin to your existing Jenkins installation or run a Jenkins/Blue Ocean Docker container. With Blue Ocean installed, your Jenkins main menu will have an extra icon:

[Screenshot: Jenkins menu with Blue Ocean]

You can open Blue Ocean directly if you wish. It’s in the /blue folder on the Jenkins server. Pipeline creation in Blue Ocean is a bit more graphical than in plain Jenkins:

[Screenshot: Blue Ocean pipeline creation]

Jenkins Docker

As I mentioned earlier, Jenkins is also distributed as a Docker image. Creating a pipeline in Blue Ocean doesn't involve much more than the screen above suggests: once you've picked the type of source code manager (SCM), you provide a URL and credentials, then create a pipeline from a single repository or scan all the repositories in the organization. Every branch with a Jenkinsfile gets its own pipeline.

Here I’m running a Blue Ocean Docker image, which came with a few more Git service plugins installed than the default list of SCM providers:

[Screenshot: Blue Ocean pipeline status]

Once you have run some pipelines, the Blue Ocean plugin will display their status, as shown above. You can zoom in on an individual pipeline to see the stages and steps:

[Screenshot: Blue Ocean pipeline detail]

You can also zoom in on branches (top) and activities (bottom):

[Screenshot: Blue Ocean branches]

[Screenshot: Blue Ocean activity]

Why use Jenkins?

The Jenkins Pipeline plugin we’ve been using supports a general continuous integration/continuous delivery (CI/CD) use case, which is probably the most common use for Jenkins. There are specialized considerations for some other use cases.

Java projects were the original raison d’être for Jenkins. We’ve already seen that Jenkins supports building with Maven; it also works with Ant, Gradle, JUnit, Nexus, and Artifactory.
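
For example, a simple Maven-based Java project might be built and tested with a Jenkinsfile along these lines. This is a minimal sketch, assuming Maven is available on the agent and that tests write standard Surefire XML reports:

pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                // Compile, run unit tests, and package with Maven
                sh 'mvn -B clean verify'
            }
        }
    }

    post {
        always {
            // Publish JUnit test results and keep the built JAR
            junit '**/target/surefire-reports/*.xml'
            archiveArtifacts artifacts: '**/target/*.jar', fingerprint: true
        }
    }
}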

Android runs a kind of Java, but introduces the issue of how to test on the wide range of Android devices. The Android emulator plugin allows you to build and test on as many emulated devices as you care to define. The Google Play Publisher plugin lets you send builds to an alpha channel in Google Play for release or further testing on actual devices.
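
A rough sketch of the build side, leaving out the emulator and Play publishing steps (which depend on those plugins' own configuration), might simply drive the Gradle wrapper:

pipeline {
    agent any

    stages {
        stage('Build and unit test') {
            steps {
                // Assemble a debug APK and run local unit tests via the Gradle wrapper
                sh './gradlew assembleDebug testDebugUnitTest'
                // Keep the resulting APKs as build artifacts
                archiveArtifacts artifacts: '**/build/outputs/apk/**/*.apk', fingerprint: true
            }
        }
    }
}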

I’ve shown examples where we specified a Docker container as the agent for a pipeline and where we ran Jenkins and Blue Ocean in a Docker container. Docker containers are very useful in a Jenkins environment for improving speed, scalability, and consistency.
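
For instance, running every stage inside a container gives each build the same toolchain regardless of what is installed on the agent. A minimal sketch, assuming the Docker Pipeline plugin is installed and the agent can run Docker:

pipeline {
    // Run the whole pipeline inside a Maven container pulled from Docker Hub
    agent {
        docker { image 'maven:3-alpine' }
    }

    stages {
        stage('Build') {
            steps {
                // The workspace is mounted into the container, so the build works as usual
                sh 'mvn -B clean package'
            }
        }
    }
}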

There are two major use cases for Jenkins and GitHub. One is build integration, which can include a service hook to trigger Jenkins on every commit to your GitHub repository. The second is the use of GitHub authentication to control access to Jenkins via OAuth.
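
On the build-integration side, the trigger can be declared in the Jenkinsfile itself. A minimal sketch, assuming the GitHub plugin is installed to provide the githubPush() webhook trigger, with SCM polling shown as a plugin-free fallback:

pipeline {
    agent any

    triggers {
        // Build when GitHub delivers a push webhook (GitHub plugin)
        githubPush()
        // Fallback: poll the repository roughly every five minutes
        pollSCM('H/5 * * * *')
    }

    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
    }
}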

Jenkins supports many other languages besides Java. For C/C++, there are plugins to capture errors and warnings from the console, generate build scripts with CMake, run unit tests, and perform static code analysis. Jenkins has a number of integrations with PHP tools.
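
A minimal C/C++ sketch using plain shell steps (the warning-capture and static-analysis steps come from their respective plugins and are omitted here) might configure, build, and test with CMake and CTest:

pipeline {
    agent any

    stages {
        stage('Configure and build') {
            steps {
                // Generate the build system and compile out of source
                sh 'cmake -S . -B build -DCMAKE_BUILD_TYPE=Release'
                sh 'cmake --build build'
            }
        }
        stage('Test') {
            steps {
                // Run the CTest suite from the build directory
                sh 'cd build && ctest --output-on-failure'
            }
        }
    }
}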

While Python code doesn’t need to be built (unless you’re using Cython, for instance, or creating a Python wheel for installation), it’s useful that Jenkins integrates with Python testing and reporting tools, such as Nose2 and Pytest, and code quality tools such as Pylint. Similarly, Jenkins integrates with Ruby tools such as Rake, Cucumber, Brakeman, and CI::Reporter.
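
For example, pytest can emit JUnit-style XML that Jenkins' junit step understands. A minimal sketch, assuming pytest is installed on the agent; the report path is arbitrary:

pipeline {
    agent any

    stages {
        stage('Test') {
            steps {
                // Run the test suite and write a JUnit-compatible report
                sh 'pytest --junitxml=reports/results.xml'
            }
        }
    }

    post {
        always {
            // Publish the results so Jenkins can track test trends over time
            junit 'reports/results.xml'
        }
    }
}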

Jenkins for CI/CD

On the whole, Jenkins offers a simple way to set up a CI/CD environment for pretty much any combination of languages and source code repositories using pipelines, as well as automating a number of other routine development tasks. While Jenkins doesn’t eliminate the need to create scripts for individual steps, it does give you a quicker and more robust way to integrate your entire chain of build, test, and deployment tools than you could easily build yourself.

Martin Heller
Contributor

Martin Heller is a contributing editor and reviewer for InfoWorld. Formerly a web and Windows programming consultant, he developed databases, software, and websites from his office in Andover, Massachusetts, from 1986 to 2010. More recently, he has served as VP of technology and education at Alpha Software and chairman and CEO at Tubifi.
