Practical Guide to Building and Running Jenkins Pipelines for Software Delivery
Creating a Pipeline Job
- Log into the Jenkins dashboard, click "New Item" in the left navigation bar, select the "Pipeline" job type, enter a unique job name, and confirm to open the configuration page.
- In the Pipeline configuration section, under the "Definition" dropdown, select "Pipeline script" to edit code directly in the Jenkins editor, or "Pipeline script from SCM" to pull pipeline scripts from a version control system such as Git or SVN.
- (Optional) Add build parameters via the "Add Parameter" button; supported parameter types include string, boolean, choice list, file upload, and more. These parameters can be referenced directly in pipeline scripts to implement dynamic execution logic.
- Define execution agents: You can specify a global execution environment for the entire pipeline, or assign independent agents for individual stages. Supported environments include arbitrary available Jenkins nodes, OS-specific labeled nodes, Docker containers, cloud ephemeral instances, etc.
- Save the configuration. You can trigger an immediate run via the "Build Now" button, or configure triggers such as code webhooks, scheduled cron tasks, or upstream job linkage for subsequent runs.
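As a minimal sketch of the build parameters described above (the job, parameter names, and default values here are illustrative, not part of the original guide), a declarative pipeline declares them in a parameters block and reads them back through the params map:

```groovy
pipeline {
    agent any
    parameters {
        // Values are entered at trigger time; defaults apply to the first run
        string(name: 'TARGET_ENV', defaultValue: 'staging', description: 'Deployment target')
        booleanParam(name: 'SKIP_TESTS', defaultValue: false, description: 'Skip the test stage')
        choice(name: 'LOG_LEVEL', choices: ['INFO', 'DEBUG', 'WARN'], description: 'Build log verbosity')
    }
    stages {
        stage('Show Parameters') {
            steps {
                // Parameters are exposed to pipeline code via params.<NAME>
                echo "Deploying to ${params.TARGET_ENV} with log level ${params.LOG_LEVEL}"
            }
        }
    }
}
```

After the first run registers the parameters, "Build Now" becomes "Build with Parameters" and prompts for these values.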
Writing Pipeline Scripts
Jenkins Pipeline uses Groovy-based DSL syntax, with all configuration wrapped in a top-level pipeline block:
- Agent block: Defines the default execution environment for the entire pipeline. For example, to use a preconfigured Maven build environment via Docker:
pipeline {
    agent {
        docker {
            image 'maven:3.9.6-eclipse-temurin-17'
            args '-v /local/maven/cache:/root/.m2 -v /workspace:/app'
        }
    }
    // remaining configuration
}
- Stages block: Contains all execution phases of the software delivery workflow. Each phase is defined via a separate stage block, corresponding to logical segments such as code checkout, compilation, testing, and deployment.
- Steps block: Nested inside each stage block, defines the specific tasks to run in the current phase; supports shell command execution, Jenkins plugin calls, custom script execution, etc.
- Advanced control logic can be added, including environment variable definition, conditional branches, loop logic, and error handling, to implement complex custom workflows. For example, only run the production deployment stage when the current code branch is main.
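One way to combine these controls in a single pipeline is sketched below (the APP_VERSION value and the make/deploy commands are placeholders, not from the original guide): an environment block defines shared variables, timeout and retry add error handling, and a when condition gates deployment to the main branch.

```groovy
pipeline {
    agent any
    environment {
        // Visible to all stages as env.APP_VERSION
        APP_VERSION = '1.4.2'
    }
    stages {
        stage('Resilient Build') {
            steps {
                // Abort the stage after 10 minutes; retry flaky steps up to 2 times
                timeout(time: 10, unit: 'MINUTES') {
                    retry(2) {
                        sh "make build VERSION=${env.APP_VERSION}"
                    }
                }
            }
        }
        stage('Deploy') {
            when {
                // Skip this stage entirely on any branch other than main
                branch 'main'
            }
            steps {
                sh './deploy.sh'
            }
        }
    }
}
```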
Defining Execution Stages
Each stage block requires a unique display name, which will be rendered in the Jenkins UI for execution status tracking. The basic syntax is as follows:
stage('<StageDisplayName>') {
    agent <optional_stage_specific_agent>
    steps {
        // core task execution logic
    }
    post {
        // optional post-execution actions, e.g. send alert on failure, upload reports
    }
}
Sample stages configuration for a Java backend project:
stages {
    stage('Checkout Source Code') {
        steps {
            git url: 'https://github.com/your-org/backend-service.git', branch: env.BRANCH_NAME
        }
    }
    stage('Compile & Package') {
        steps {
            sh 'mvn -U clean package -DskipTests'
        }
    }
    stage('Automated Testing') {
        steps {
            sh 'mvn test jacoco:report'
        }
        post {
            always {
                junit 'target/surefire-reports/*.xml'
                publishHTML(target: [
                    allowMissing: false,
                    alwaysLinkToLastBuild: true,
                    keepAll: true,
                    reportDir: 'target/site/jacoco',
                    reportFiles: 'index.html',
                    reportName: 'Unit Test Coverage Report'
                ])
            }
        }
    }
    stage('Production Deployment') {
        when {
            branch 'main'
        }
        steps {
            sh './scripts/deploy-prod.sh'
        }
    }
}
Stages run sequentially by default; you can add when conditions to skip specific stages under predefined circumstances, or configure parallel execution for independent stages to reduce total run time.
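The parallel execution mentioned above can be sketched as follows (the stage names and Maven goals are illustrative): independent quality checks are nested under a parallel block inside a parent stage, so they run concurrently on available executors.

```groovy
stages {
    stage('Quality Gates') {
        parallel {
            // These two stages run at the same time; the parent stage
            // finishes when both branches complete
            stage('Unit Tests') {
                steps {
                    sh 'mvn test'
                }
            }
            stage('Static Analysis') {
                steps {
                    sh 'mvn checkstyle:check'
                }
            }
        }
    }
}
```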
Configuring Execution Agents
The agent block can be defined at the global pipeline level to apply to all stages, or overridden inside individual stages to use a dedicated execution environment. Common agent configuration scenarios:
- Run on any available Jenkins node:
agent any
- Run on a node matching specific labels:
agent { label 'linux-amd64 && high-mem-8g' }
- Run in a Docker container: Specify base image, volume mounts, network parameters, etc. Sample configuration for a frontend build stage using a Node.js image:
stage('Frontend Resource Build') {
    agent {
        docker {
            image 'node:20-alpine'
            args '-v /local/npm/cache:/root/.npm --network host'
        }
    }
    steps {
        sh 'npm install && npm run build:prod'
    }
}
- Run on Kubernetes cluster: For Jenkins clusters deployed on K8s, you can define custom pod templates directly in the agent block to dynamically create execution pods.
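A pod-template agent might be sketched as below. This assumes the Jenkins Kubernetes plugin is installed and configured; the image and container name are illustrative. The pod is created for the stage and torn down when it finishes.

```groovy
stage('Build in Ephemeral Pod') {
    agent {
        kubernetes {
            // Inline pod template; one pod is created per run of this stage
            yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: maven
    image: maven:3.9.6-eclipse-temurin-17
    command: ['sleep']
    args: ['infinity']
'''
        }
    }
    steps {
        // Run the build inside the named container of the pod
        container('maven') {
            sh 'mvn -B clean verify'
        }
    }
}
```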
Adding Execution Steps
Steps are the minimum execution unit in a pipeline, each step corresponds to a specific operational task. Common step types include:
- Shell command execution: Use sh for Linux/macOS nodes, bat for Windows nodes
- Plugin integration steps: e.g. git for code checkout, junit for test report parsing, docker.build for container image building
- Custom script calls: Run pre-written shell, Python, or other script files stored in the code repository
Sample step for building a Docker image and pushing to a private registry. Note that the docker.* calls come from the Docker Pipeline plugin and use scripted syntax, so inside a declarative pipeline they must be wrapped in a script block:
steps {
    script {
        docker.withRegistry('https://registry.your-domain.com', 'docker-registry-credential-id') {
            def serviceImage = docker.build("backend-service:${env.BUILD_ID}")
            serviceImage.push()
            serviceImage.push('latest')
        }
    }
}
Triggering Pipeline Execution
After saving the pipeline configuration, you can trigger runs via multiple channels:
- Manual trigger: Click "Build Now" on the job details page; if parameters are configured, you will be prompted to enter parameter values before execution starts
- Scheduled trigger: Configure cron expressions in the Build Triggers section to run periodically, suitable for daily scheduled test tasks
- Webhook trigger: Configure webhooks in code repositories such as GitHub or GitLab to automatically trigger a run when new code is pushed, a PR is merged, or a tag is created
- Upstream job trigger: Run automatically after another specified job completes successfully, suitable for multi-job linked workflows
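The scheduled and upstream triggers above can also be declared directly in the pipeline script rather than in the job UI. A minimal sketch (the upstream job name is a placeholder):

```groovy
pipeline {
    agent any
    triggers {
        // Jenkins cron syntax; H spreads start times to avoid load spikes.
        // This runs once between 02:00 and 02:59 on weekdays.
        cron('H 2 * * 1-5')
        // Re-run this pipeline whenever the named job completes successfully
        upstream(upstreamProjects: 'backend-service-build', threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Nightly Tests') {
            steps {
                sh 'mvn verify'
            }
        }
    }
}
```

Triggers declared this way take effect after the pipeline has run once, since Jenkins reads the triggers block during execution.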
During execution, you can view real-time logs via the Jenkins console output page; each stage and step displays its execution status, duration, and output separately. If execution fails, you can locate the root cause via the error log, modify the pipeline script or configuration, and re-run the job.
Accessing Pipeline Execution Results
You can view pipeline execution data via multiple channels:
- Jenkins default UI: The job details page lists all historical builds, each build's detail page displays stage execution status, full console logs, generated artifacts, test reports, etc.
- Blue Ocean UI: The Blue Ocean plugin provides a visual pipeline dashboard that displays the stage execution flow graphically, supports one-click restart of failed stages, and aggregates test reports and other structured data
- Jenkins CLI: Use the official Jenkins CLI tool to query build data programmatically:
- List all builds of the target job:
jenkins-cli job list-builds <job-name>
- Fetch full console log of a specific build:
jenkins-cli job console <job-name> <build-number>
- REST API: Send a GET request to /job/<job-name>/<build-number>/api/json to get structured build data, including stage status, duration, trigger user, artifact download links, etc., which can be used for secondary integration into custom DevOps platforms.