
Practical Guide to Building and Running Jenkins Pipelines for Software Delivery


Creating a Pipeline Job

  1. Log in to the Jenkins dashboard, click "New Item" in the left navigation bar, select the "Pipeline" job type, enter a unique job name, and confirm to open the configuration page.
  2. Navigate to the Pipeline configuration section and, under the "Definition" dropdown, select "Pipeline script" to edit code directly in the Jenkins editor, or "Pipeline script from SCM" to pull pipeline scripts from a version control system such as Git or SVN.
  3. (Optional) Add build parameters via the "Add Parameter" button; supported parameter types include string, boolean, choice list, file upload, etc. These parameters can be referenced directly in pipeline scripts to implement dynamic execution logic.
  4. Define execution agents: you can specify a global execution environment for the entire pipeline, or assign independent agents to individual stages. Supported environments include any available Jenkins node, OS-specific labeled nodes, Docker containers, cloud ephemeral instances, etc.
  5. Save the configuration. You can trigger an immediate run via the "Build Now" button, or configure triggers such as code webhooks, scheduled cron tasks, or upstream job linkage for subsequent runs.
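
If "Pipeline script" is selected in step 2, a minimal declarative script combining a build parameter (step 3) with a default agent (step 4) might look like the following sketch; the stage and parameter names are illustrative:

```groovy
pipeline {
  agent any                                      // run on any available Jenkins node
  parameters {
    string(name: 'DEPLOY_ENV', defaultValue: 'staging',
           description: 'Target environment')    // illustrative parameter
  }
  stages {
    stage('Build') {
      steps {
        echo "Building for ${params.DEPLOY_ENV}" // parameters are read via params.<NAME>
      }
    }
  }
}
```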

Writing Pipeline Scripts

Jenkins Pipeline uses Groovy-based DSL syntax, with all configuration wrapped in a top-level pipeline block:

  1. Agent block: Defines the default execution environment for the entire pipeline. For example, to use a preconfigured Maven build environment via Docker:
pipeline {
  agent {
    docker {
      image 'maven:3.9.6-eclipse-temurin-17'
      args '-v /local/maven/cache:/root/.m2 -v /workspace:/app'
    }
  }
  // remaining configuration
}
  2. Stages block: Contains all execution phases of the software delivery workflow; each phase is defined in a separate stage block, corresponding to logical segments such as code checkout, compilation, testing, and deployment.
  3. Steps block: Nested inside each stage block; defines the specific tasks to run in the current phase, supporting shell command execution, Jenkins plugin calls, custom script execution, etc.
  4. Advanced control logic can be added, including environment variable definitions, conditional branches, loop logic, and error handling, to implement complex custom workflows; for example, running the production deployment stage only when the current code branch is main.
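
The control logic described above can be sketched as follows; the version number and deployment script path are illustrative:

```groovy
pipeline {
  agent any
  environment {
    APP_VERSION = '1.4.2'               // available in all stages as env.APP_VERSION
  }
  stages {
    stage('Build') {
      steps {
        sh 'echo "Building version ${APP_VERSION}"'
      }
    }
    stage('Production Deployment') {
      when { branch 'main' }            // runs only when the current branch is main
      steps {
        sh './scripts/deploy-prod.sh'   // illustrative deployment script
      }
    }
  }
}
```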

Defining Execution Stages

Each stage block requires a unique display name, which will be rendered in the Jenkins UI for execution status tracking. The basic syntax is as follows:

stage('<StageDisplayName>') {
  agent <optional_stage_specific_agent>
  steps {
    // core task execution logic
  }
  post {
    // optional post-execution actions, e.g. send alert on failure, upload reports
  }
}

Sample stages configuration for a Java backend project:

stages {
  stage('Checkout Source Code') {
    steps {
      git url: 'https://github.com/your-org/backend-service.git', branch: env.BRANCH_NAME
    }
  }
  stage('Compile & Package') {
    steps {
      sh 'mvn -U clean package -DskipTests'
    }
  }
  stage('Automated Testing') {
    steps {
      sh 'mvn test jacoco:report'
    }
    post {
      always {
        junit 'target/surefire-reports/*.xml'
        publishHTML(target: [
          allowMissing: false,
          alwaysLinkToLastBuild: true,
          keepAll: true,
          reportDir: 'target/site/jacoco',
          reportFiles: 'index.html',
          reportName: 'Unit Test Coverage Report'
        ])
      }
    }
  }
  stage('Production Deployment') {
    when {
      branch 'main'
    }
    steps {
      sh './scripts/deploy-prod.sh'
    }
  }
}

Stages run sequentially by default. You can add when conditions to skip specific stages under predefined circumstances, or configure parallel execution for independent stages to reduce total run time.
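
For instance, independent quality checks can be grouped under a parallel block so they run concurrently; a minimal sketch:

```groovy
stage('Quality Gates') {
  parallel {                                // both nested stages start at the same time
    stage('Unit Tests') {
      steps { sh 'mvn test' }
    }
    stage('Static Analysis') {
      steps { sh 'mvn checkstyle:check' }   // assumes Checkstyle is configured in the POM
    }
  }
}
```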

Configuring Execution Agents

The agent block can be defined at the global pipeline level to apply to all stages, or overridden inside individual stages to use a dedicated execution environment. Common agent configuration scenarios:

  1. Run on any available Jenkins node: agent any
  2. Run on a node matching specific labels: agent { label 'linux-amd64 && high-mem-8g' }
  3. Run in a Docker container: Specify base image, volume mounts, network parameters, etc. Sample configuration for a frontend build stage using a Node.js image:
stage('Frontend Resource Build') {
  agent {
    docker {
      image 'node:20-alpine'
      args '-v /local/npm/cache:/root/.npm --network host'
    }
  }
  steps {
    sh 'npm install && npm run build:prod'
  }
}
  4. Run on a Kubernetes cluster: For Jenkins instances deployed on K8s, you can define custom pod templates directly in the agent block to dynamically create execution pods.
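
A pod-template agent might be sketched as follows, assuming the Kubernetes plugin is installed and a cloud is configured; the image and container names are illustrative:

```groovy
stage('Build in Pod') {
  agent {
    kubernetes {
      yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: maven
    image: maven:3.9.6-eclipse-temurin-17
    command: ['sleep']
    args: ['infinity']
'''
    }
  }
  steps {
    container('maven') {          // run the step inside the maven container
      sh 'mvn -B clean package'
    }
  }
}
```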

Adding Execution Steps

Steps are the minimum execution unit in a pipeline, each step corresponds to a specific operational task. Common step types include:

  • Shell command execution: Use sh for Linux/macOS nodes, bat for Windows nodes
  • Plugin integration steps: e.g. git for code checkout, junit for test report parsing, docker.build from the Docker Pipeline plugin for container image building
  • Custom script calls: Run pre-written shell, Python, or other script files stored in the code repository
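
A steps block mixing these types might look like this sketch (the notify.py file is a hypothetical script stored in the repository):

```groovy
steps {
  sh 'mvn -B clean verify'                 // shell command on a Linux/macOS node
  junit 'target/surefire-reports/*.xml'    // plugin step: parse JUnit test reports
  sh 'python3 scripts/notify.py'           // call a custom script kept in the repo (hypothetical)
}
```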

Sample step for building a Docker image and pushing it to a private registry. The docker.* calls come from the Docker Pipeline plugin and use scripted syntax, so inside a declarative steps block they must be wrapped in a script block:

steps {
  script {
    docker.withRegistry('https://registry.your-domain.com', 'docker-registry-credential-id') {
      def serviceImage = docker.build("backend-service:${env.BUILD_ID}")
      serviceImage.push()
      serviceImage.push('latest')
    }
  }
}

Triggering Pipeline Execution

After saving the pipeline configuration, you can trigger runs via multiple channels:

  1. Manual trigger: Click "Build Now" on the job details page; if parameters are configured, you will be prompted to enter their values before execution starts
  2. Scheduled trigger: Configure cron expressions in the Build Triggers section to run periodically, suitable for daily scheduled test tasks
  3. Webhook trigger: Configure webhooks in code repositories such as GitHub or GitLab to automatically trigger a run when new code is pushed, a PR is merged, or a tag is created
  4. Upstream job trigger: Run automatically after another specified job completes successfully, suitable for multi-job linked workflows
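
Scheduled and upstream triggers can also be declared directly in the pipeline script; a sketch, where 'shared-libs-build' is a hypothetical upstream job (webhook triggers are configured on the repository side instead):

```groovy
pipeline {
  agent any
  triggers {
    cron('H 2 * * *')                                  // nightly run around 2 AM
    upstream(upstreamProjects: 'shared-libs-build',
             threshold: hudson.model.Result.SUCCESS)   // run after the upstream job succeeds
  }
  stages {
    stage('Nightly Tests') {
      steps { sh 'mvn test' }
    }
  }
}
```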

During execution, you can view real-time logs via the Jenkins console output page; each stage and step displays its execution status, duration, and output separately. If execution fails, you can locate the root cause via the error log, modify the pipeline script or configuration, and re-run the job.

Accessing Pipeline Execution Results

You can view pipeline execution data via multiple channels:

  1. Jenkins default UI: The job details page lists all historical builds, each build's detail page displays stage execution status, full console logs, generated artifacts, test reports, etc.
  2. Blue Ocean UI: The Blue Ocean plugin provides a visual pipeline dashboard that displays the stage execution flow graphically, supports one-click restart of failed stages, and aggregates test reports and other structured data
  3. Jenkins CLI: Use the official Jenkins CLI to query build data from the command line:
  • List all jobs on the controller: java -jar jenkins-cli.jar -s <jenkins-url> list-jobs
  • Fetch the full console log of a specific build: java -jar jenkins-cli.jar -s <jenkins-url> console <job-name> <build-number>
  4. REST API: Send a GET request to /job/<job-name>/<build-number>/api/json to get structured build data, including stage status, duration, triggering user, and artifact download links, which can be integrated into custom DevOps platforms.
