Automating Software Delivery with Jenkins, Maven, and Pipeline Orchestration
- Environment Preparation and Service Initialization
Establishing a reliable continuous integration environment begins with installing the Java runtime and the Jenkins core package on your target host. For RPM-based distributions, package managers streamline the process:
# Install dependencies and core services
dnf localinstall jdk-11-openjdk-devel-11.0.12.x86_64.rpm -y
dnf localinstall jenkins-2.346-1.1.noarch.rpm -y
Once installed, the service daemon must be configured to run under an account with sufficient filesystem permissions. The primary configuration file governs this behavior:
# Update the execution context to bypass restrictive defaults
sudo sed -i 's/^JENKINS_USER=.*/JENKINS_USER="svc-jenkins"/' /etc/sysconfig/jenkins
sudo systemctl enable --now jenkins
ss -tulpn | grep 8080
After the initial setup, administrative credentials are generated in the log stream. Plugins can be bulk-installed by extracting a prepared archive into the data directory:
# Extract pre-bundled extensions
tar xf jenkins-extensions-bundle.tar.gz
cp -a jenkins-plugins/* /var/lib/jenkins/plugins/
systemctl restart jenkins
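The one-time administrator password noted earlier can also be read directly from disk. A minimal sketch, assuming the stock JENKINS_HOME of /var/lib/jenkins:

```shell
# Default location of the generated admin password (assumption: stock JENKINS_HOME)
PASS_FILE="/var/lib/jenkins/secrets/initialAdminPassword"
if [ -r "$PASS_FILE" ]; then
  cat "$PASS_FILE"
else
  echo "password file not readable -- check the Jenkins log instead" >&2
fi
```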
Understanding the filesystem layout is essential for troubleshooting:
- /usr/lib/jenkins/: Contains the core WAR executable
- /etc/sysconfig/jenkins: Holds service-level parameters
- /var/lib/jenkins/: Default workspace for jobs, configs, and builds
- /var/log/jenkins/: Stores service output and error traces
- Freestyle Jobs and Git Repository Synchronization
To automate static asset delivery, configure a target web server and establish a Git workflow that bridges external sources with your internal repository.
Target Server Preparation
# Deploy Nginx and initialize service
dnf install -y nginx
systemctl enable --now nginx
Clone an external repository, strip version control metadata, and stage files for internal tracking:
git clone https://gitee.com/example-team/frontend-static.git /tmp/staging
rm -rf /tmp/staging/.git
cp -a /tmp/staging/* /srv/http/nginx_root/
Internal Repository Initialization
Re-initialize the workspace and push to your internal GitLab instance:
cd /srv/http/nginx_root
git init
git remote add internal_origin http://gitlab.internal:8080/devops/web-assets.git
git add .
git commit -m "Initial frontend baseline"
git push -u internal_origin main
Within Jenkins, create a new freestyle job, input the internal repository URL, and configure read-only credentials.
Automated Deployment Script
The build step executes a shell script that archives the workspace, transfers it to the target node, and performs an atomic symlink rotation:
#!/bin/bash
WORKSPACE_ROOT="/var/lib/jenkins/build-area/web-assets"
DOCROOT_TARGET="/srv/http/nginx_root"
TARGET_NODE="10.0.0.12"
TIMESTAMP=$(date +"%Y%m%d-%H%M%S")
ARCHIVE_NAME="site_bundle_${TIMESTAMP}.tar.gz"
cd "${WORKSPACE_ROOT}" && tar czf "/tmp/${ARCHIVE_NAME}" --exclude='.git' .
scp "/tmp/${ARCHIVE_NAME}" deploy_user@${TARGET_NODE}:"${DOCROOT_TARGET}/"
ssh deploy_user@${TARGET_NODE} "mkdir -p ${DOCROOT_TARGET}/release_${TIMESTAMP} && \
tar xzf ${DOCROOT_TARGET}/${ARCHIVE_NAME} -C ${DOCROOT_TARGET}/release_${TIMESTAMP} && \
rm -f ${DOCROOT_TARGET}/${ARCHIVE_NAME}"
ssh deploy_user@${TARGET_NODE} "ln -sfn release_${TIMESTAMP} ${DOCROOT_TARGET}/current"
Prerequisites: Establish passwordless SSH authentication beforehand, and ensure the Jenkins service user has execution rights for the deployment script.
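Each run leaves a new release_<timestamp> directory on the target node, so a retention step keeps disk usage bounded. A minimal sketch (the docroot and retention count passed in are illustrative, not part of the script above):

```shell
#!/bin/bash
# Keep only the newest N release_* directories under a docroot (sketch).
prune_releases() {
  local docroot="$1" keep="$2"
  # Timestamps use fixed-width %Y%m%d-%H%M%S, so lexical order equals chronological order;
  # sort newest-first and delete everything past the first $keep entries.
  ls -d "${docroot}"/release_* 2>/dev/null | sort -r | tail -n +$((keep + 1)) | while read -r old; do
    rm -rf "$old"
  done
}
```

Run it after the symlink rotation, e.g. `prune_releases /srv/http/nginx_root 5`, to retain the five most recent releases for quick rollback.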
Event-Driven Triggers and Status Callbacks
Configure a GitLab webhook pointing to the Jenkins generic webhook trigger endpoint. Append the generated authentication token to secure the endpoint. On the Jenkins side, configure the "Build triggers" section to listen for push events. To return build results to GitLab, enable the GitLab API token in system settings and add the "GitLab plugin" post-build action. Pushing a commit will now initiate a build and report success or failure back to the merge request interface.
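Before wiring up GitLab, the trigger endpoint can be smoke-tested by hand. A sketch assuming the Generic Webhook Trigger plugin, with a hypothetical host and token:

```shell
# Hypothetical Jenkins host and per-job token -- substitute your own values
JENKINS_URL="http://jenkins.internal:8080"
TOKEN="web-assets-push"
HOOK_URL="${JENKINS_URL}/generic-webhook-trigger/invoke?token=${TOKEN}"
# Fire a test build once Jenkins is reachable:
# curl -X POST "$HOOK_URL"
echo "$HOOK_URL"
```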
- Maven Integration and Artifact Repository Management
Binary Installation and Environment Configuration
Maven requires a properly configured Java environment. Download the binary archive, extract it, and update the system path:
# Download and place binaries
curl -O https://mirrors.tuna.tsinghua.edu.cn/apache/maven/maven-3/3.9.4/binaries/apache-maven-3.9.4-bin.tar.gz
tar xf apache-maven-3.9.4-bin.tar.gz
mv apache-maven-3.9.4 /opt/build-tools/
ln -s /opt/build-tools/apache-maven-3.9.4 /opt/build-tools/maven
echo 'export PATH=/opt/build-tools/maven/bin:$PATH' | tee -a /etc/profile.d/maven.sh
source /etc/profile.d/maven.sh
mvn -version
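Since Maven resolves its JDK from JAVA_HOME when set, a quick sanity check can save debugging time. A minimal sketch:

```shell
# Sanity check (sketch): confirm JAVA_HOME points at a full JDK (javac present)
if [ -n "${JAVA_HOME:-}" ] && [ -x "${JAVA_HOME}/bin/javac" ]; then
  JDK_STATUS="jdk-found"
else
  JDK_STATUS="check-java-home"   # mvn will fall back to whichever java is on PATH
fi
echo "$JDK_STATUS"
```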
Key directories within the Maven distribution:
- bin/: CLI entry points
- boot/: Classloader components
- conf/: Global configuration files
- lib/: Core runtime dependencies
Build Lifecycle Commands
Phases in the default lifecycle run in sequence: invoking a later phase first executes every phase before it (e.g., mvn package also runs validate, compile, and test).
- mvn validate: Verifies project structure and configuration
- mvn compile: Transforms source code into bytecode
- mvn test: Executes unit test suites
- mvn package: Archives compiled output (e.g., JAR, WAR)
- mvn integration-test: Runs integration tests against a deployed test environment
- mvn verify: Validates package quality and integrity
- mvn install: Publishes artifacts to the local cache
- mvn deploy: Pushes final packages to remote registries
- mvn clean: Purges previous build artifacts (a separate lifecycle)
Repository Architecture
Maven utilizes a hierarchical caching strategy. It first checks the local cache (~/.m2/repository), then queries remote registries. Remote sources fall into three categories:
- Central: Default upstream registry containing open-source artifacts
- Private Nexus: Local proxy and hosted registry for corporate control and bandwidth optimization
- Third-party Mirrors: Public regional caches (e.g., Aliyun, JBoss)
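The lookup order can be observed directly on disk. A sketch that checks whether a given artifact is already in the local cache before Maven would consult a remote (the artifact coordinates are illustrative):

```shell
# Default local repository location; override via <localRepository> in settings.xml
LOCAL_REPO="${HOME}/.m2/repository"
# Illustrative coordinates: org.apache.commons:commons-lang3:3.12.0
GROUP_PATH="org/apache/commons"
ARTIFACT="commons-lang3"
VERSION="3.12.0"
JAR="${LOCAL_REPO}/${GROUP_PATH}/${ARTIFACT}/${VERSION}/${ARTIFACT}-${VERSION}.jar"
if [ -f "$JAR" ]; then
  echo "cache hit: ${JAR}"
else
  echo "cache miss: Maven would query the configured remotes"
fi
```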
Configuring Remote Caching and Mirrors
To accelerate dependency resolution, configure an external mirror globally in settings.xml:
<mirrors>
<mirror>
<id>regional-cache</id>
<name>Aliyun Maven Cache</name>
<url>https://maven.aliyun.com/repository/public</url>
<mirrorOf>central</mirrorOf>
</mirror>
</mirrors>
Alternatively, define repository sources directly in a project's pom.xml to scope the configuration to a single build context.
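A hedged sketch of the corresponding pom.xml fragment for that per-project scoping (the repository id and URL mirror the Aliyun example above and are illustrative):

```xml
<!-- Illustrative fragment: scopes an additional dependency source to this project only -->
<repositories>
  <repository>
    <id>regional-cache</id>
    <url>https://maven.aliyun.com/repository/public</url>
    <releases><enabled>true</enabled></releases>
    <snapshots><enabled>false</enabled></snapshots>
  </repository>
</repositories>
```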
Compiling and Testing a Sample Project
# Unpack source archive
tar xf sample-app-1.0.tar.gz
cd sample-app
# Execute test suites and create distribution package
mvn test
mvn clean package -DskipTests
Deploying a Local Nexus Registry
Host a private artifact server to manage internal releases and cache external dependencies:
# Extract and link
tar xf nexus-3.42.0-01-unix.tar.gz
mv nexus-3.42.0-01 /opt/registry/
ln -s /opt/registry/nexus-3.42.0-01 /opt/registry/nexus
# Configure execution user (root keeps a lab simple; use a dedicated account in production)
echo 'run_as_user="root"' >> /opt/registry/nexus/bin/nexus.rc
/opt/registry/nexus/bin/nexus start
Access the web interface at port 8081. Recent Nexus 3 releases generate a one-time admin password in admin.password under the data directory (older releases shipped with admin / admin123). Nexus exposes four primary repository types:
- maven-central: Proxy for the upstream central registry (release policy)
- maven-releases: Hosted storage for stable internal builds
- maven-snapshots: Hosted storage for developmental iterations
- maven-public: Repository group aggregating all three
Integrating Maven with Nexus
Modify the global settings.xml to route traffic through the private registry:
<servers>
<server>
<id>nexus-releases</id>
<username>admin</username>
<password>admin123</password>
</server>
<server>
<id>nexus-snapshots</id>
<username>admin</username>
<password>admin123</password>
</server>
</servers>
<mirrors>
<mirror>
<id>nexus-mirror</id>
<url>http://10.0.0.20:8081/repository/maven-public/</url>
<mirrorOf>*</mirrorOf>
</mirror>
</mirrors>
<profiles>
<profile>
<id>private-registry</id>
<repositories>
<repository>
<id>central-proxy</id>
<url>http://10.0.0.20:8081/repository/maven-public/</url>
<releases><enabled>true</enabled></releases>
<snapshots><enabled>true</enabled></snapshots>
</repository>
</repositories>
</profile>
</profiles>
<activeProfiles>
<activeProfile>private-registry</activeProfile>
</activeProfiles>
With the mirror and profile in place, verify resolution via mvn clean install; the build log should show dependencies downloading from the private registry URL.
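To publish internal artifacts with mvn deploy, the project's pom.xml also needs a distributionManagement block whose ids match the <server> entries in settings.xml. A sketch assuming the same Nexus host:

```xml
<!-- Illustrative fragment: ids must match the <server> credentials in settings.xml -->
<distributionManagement>
  <repository>
    <id>nexus-releases</id>
    <url>http://10.0.0.20:8081/repository/maven-releases/</url>
  </repository>
  <snapshotRepository>
    <id>nexus-snapshots</id>
    <url>http://10.0.0.20:8081/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>
```

Maven then routes release builds to maven-releases and -SNAPSHOT versions to maven-snapshots automatically.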
Full-Stack Java Application Deployment
Push source code to GitLab, then configure Jenkins to fetch and compile using Maven. On the deployment target, provision the application server and database:
# Tomcat Setup
mkdir -p /opt/app-server
tar xf apache-tomcat-9.0.65.tar.gz -C /opt/app-server
ln -s /opt/app-server/apache-tomcat-9.0.65 /opt/app-server/tomcat
# Optimize entropy for faster startup; the java.security path varies by JDK build,
# commonly $JAVA_HOME/conf/security/java.security on JDK 11
sed -i 's|securerandom.source=file:/dev/random|securerandom.source=file:/dev/urandom|' "${JAVA_HOME}/conf/security/java.security"
/opt/app-server/tomcat/bin/startup.sh
# MariaDB Initialization
dnf install -y mariadb-server
systemctl enable --now mariadb
mysql_secure_installation
mysql -u root -p -e "CREATE DATABASE app_schema;"
mysql -u root -p app_schema < /path/to/init.sql
Finally, transfer the compiled WAR file from Jenkins to the Tomcat deployment directory:
ssh deploy_user@10.0.0.20 "rm -rf /opt/app-server/tomcat/webapps/*"
scp /var/lib/jenkins/workspace/maven-build/target/app-core.war deploy_user@10.0.0.20:/opt/app-server/tomcat/webapps/ROOT.war
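Tomcat unpacks the WAR asynchronously, so a post-deploy health check avoids declaring success before the application is actually serving. A generic retry helper (sketch; the polled URL is illustrative):

```shell
# Retry a command up to a fixed number of attempts with a 1-second pause (sketch)
retry() {
  local attempts="$1"; shift
  local i=1
  while [ "$i" -le "$attempts" ]; do
    "$@" && return 0
    i=$((i + 1))
    sleep 1
  done
  return 1
}
# Example: poll the freshly deployed application (hypothetical endpoint)
# retry 30 curl -fsS -o /dev/null http://10.0.0.20:8080/
```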
- Pipeline as Code and Advanced CI/CD Workflows
Pipelines turn traditional point-and-click job configuration into version-controlled, executable scripts. This approach bridges Continuous Integration (compilation, testing) and Continuous Delivery (deployment orchestration).
Jenkinsfile Architecture
A pipeline consists of three primary components:
- Agent: Specifies the execution node or container
- Stage: Logical grouping of tasks (e.g., Fetch, Validate, Deploy)
- Step: Individual commands or plugin invocations within a stage
Two syntax models exist: Declarative (structured, strict schema) and Scripted (Groovy-based, flexible). The declarative model is recommended for standard workflows:
pipeline {
agent any
stages {
stage('Compile') {
steps {
echo 'Processing source tree...'
}
}
stage('Verify') {
steps {
echo 'Running validation suite...'
}
}
stage('Release') {
steps {
echo 'Publishing artifacts...'
}
}
}
}
Executing Pipelines from Source Control
Store the Jenkinsfile at the repository root. When creating a Jenkins pipeline job, select "Pipeline script from SCM" and link to the Git repository. This ensures the build definition evolves alongside the codebase.
Advanced Pipeline Example with Remote Deployment
The following pipeline automates artifact packaging and zero-downtime deployment via SSH:
pipeline {
agent any
environment {
REMOTE_SERVER = '10.0.0.20'
DEPLOY_PATH = '/srv/http/nginx_root'
}
stages {
stage('Fetch Source') {
steps {
echo 'Pulling latest commits from SCM...'
}
}
stage('Execute Tests') {
steps {
echo 'Running unit and integration checks...'
}
}
stage('Archive Output') {
steps {
sh "tar czf /tmp/build_${BUILD_ID}.tar.gz --exclude='.git' --exclude='Jenkinsfile' ."
}
}
stage('Deploy to Target') {
steps {
sh "ssh deploy_user@${REMOTE_SERVER} 'mkdir -p ${DEPLOY_PATH}/v${BUILD_ID}'"
sh "scp /tmp/build_${BUILD_ID}.tar.gz deploy_user@${REMOTE_SERVER}:${DEPLOY_PATH}/v${BUILD_ID}/"
sh "ssh deploy_user@${REMOTE_SERVER} 'cd ${DEPLOY_PATH}/v${BUILD_ID} && tar xzf build_${BUILD_ID}.tar.gz && rm -f build_${BUILD_ID}.tar.gz'"
sh "ssh deploy_user@${REMOTE_SERVER} 'ln -sfn v${BUILD_ID} ${DEPLOY_PATH}/live'"
}
}
}
}
The BUILD_ID environment variable guarantees a unique directory name for each execution. For users unfamiliar with Groovy syntax or plugin directives, the built-in Pipeline Syntax generator (Snippet Generator) provides a form-based interface for constructing valid steps, which can then be pasted directly into the Jenkinsfile.