Introduction
In today’s fast-paced digital landscape, the ability to deliver software quickly, reliably, and securely has become a cornerstone of successful businesses. DevOps has emerged as a transformative approach to streamline software development and operations, breaking down traditional silos between teams to foster collaboration, efficiency, and continuous delivery. By integrating development and IT operations, DevOps not only accelerates the delivery of high-quality software but also enhances flexibility in adapting to changing requirements. This methodology ensures organizations can meet the increasing demand for faster releases without compromising on reliability or performance.
At the heart of DevOps lies a strong emphasis on automation, scalability, and security. Automation reduces manual effort, minimizes errors, and enables consistent deployment processes, allowing teams to focus on innovation rather than repetitive tasks. Scalability ensures that systems can adapt seamlessly to growing user demands, maintaining optimal performance regardless of scale. Security, integrated into every stage of the pipeline, ensures that applications remain resilient to threats and vulnerabilities, safeguarding sensitive data and maintaining trust with users.
To implement and manage these principles effectively, a robust set of tools forms the backbone of a DevOps pipeline. Jenkins serves as a powerful automation server, orchestrating various stages of the CI/CD process. Maven simplifies the management of project dependencies and builds, ensuring seamless integration with development workflows. SonarQube enforces code quality and security standards, identifying vulnerabilities and promoting best practices. Nexus acts as a repository manager, streamlining artifact storage and distribution.
Containerization with Docker ensures applications run consistently across environments, while AWS Elastic Kubernetes Service (EKS) offers a scalable and managed Kubernetes platform to orchestrate containers. Finally, ArgoCD facilitates declarative GitOps workflows for continuous delivery, providing a secure and efficient way to manage application deployments in Kubernetes environments.
Together, these tools and practices create a comprehensive DevOps pipeline designed to enhance efficiency, maintain scalability, and ensure the security of software delivery. This introduction provides an overview of how these components contribute to modern software delivery, setting the stage for a deeper exploration of each tool and its role in building a robust DevOps workflow.
Pipeline Overview
A DevOps pipeline is a structured and automated workflow designed to streamline software delivery by integrating various stages from code development to deployment. Each stage serves a critical function, ensuring the delivery of high-quality, secure, and scalable software. Below is a detailed breakdown of the pipeline stages:
1. Code Integration
The pipeline begins with developers committing their code changes to a shared version control system, such as Git. Jenkins or a similar CI tool detects these changes and triggers the pipeline. This stage ensures that new code is integrated with the existing codebase smoothly and consistently, minimizing integration issues.
2. Build and Test
In this stage, the pipeline compiles the code and packages it into a deployable artifact using tools like Maven. Automated tests, including unit, integration, and functional tests, are executed to verify the functionality and stability of the code. Successful execution of tests ensures that the build is stable and ready for the next stages.
3. Code Quality Analysis
Code quality and security checks are performed using tools like SonarQube. This stage identifies vulnerabilities, ensures adherence to coding standards, and evaluates code maintainability. It provides actionable feedback to developers, promoting the delivery of high-quality code.
4. Artifact Storage
Once the code passes the build and quality analysis stages, the resulting artifact is stored in a repository manager like Nexus. This central repository securely stores versioned artifacts, enabling teams to track and manage dependencies efficiently.
5. Containerization
The stored artifact is then containerized using Docker. Containerization ensures that the application runs consistently across different environments by encapsulating the code, runtime, dependencies, and configurations into lightweight, portable containers.
6. Deployment
Containers are deployed to a managed Kubernetes platform like AWS EKS. This stage involves orchestrating the deployment of containers, ensuring scalability, high availability, and fault tolerance.
7. Continuous Delivery
The pipeline concludes with continuous delivery, managed by a GitOps tool like ArgoCD. ArgoCD monitors the desired state of the application defined in a Git repository and ensures that deployments align with this state. This declarative approach facilitates automated, reliable, and version-controlled updates to production environments.
1. Continuous Integration with Jenkins
Jenkins is an automation server at the core of the continuous integration (CI) process: it automates the integration of code changes, the execution of builds, and the running of tests. It ensures that every commit is validated and that issues are detected early in the development process.
Key Features
Triggering Workflows: Jenkins can trigger workflows automatically based on code changes using Git hooks or polling.
Job Monitoring: Jenkins monitors build jobs and provides real-time feedback on their status.
Plugin Support: Jenkins has a large library of plugins to integrate with other tools, enhancing its functionality.
Handling Code Changes
Jenkins detects code changes using Git hooks (to trigger builds immediately) or polling (to check for changes at intervals).
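As a minimal sketch, the polling approach can be declared directly in a declarative pipeline with a triggers block (the five-minute cron-style schedule below is an illustrative choice, not a value from this project):
triggers {
    // Poll the SCM roughly every five minutes; 'H' lets Jenkins spread the exact minute
    pollSCM('H/5 * * * *')
}
With a Git webhook configured instead, builds fire immediately on each push and no polling schedule is needed.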
Importance of Immediate Feedback
Jenkins provides immediate feedback on code changes, allowing developers to identify and fix issues quickly, ensuring a stable and reliable codebase.
In short, Jenkins automates CI processes, providing fast feedback and ensuring the stability of code integration.
pipeline {
agent any
parameters {
string(name: 'DOCKER_TAG', defaultValue: 'latest', description: 'Docker tag')
}
tools {
maven 'maven3'
}
environment {
SCANNER_HOME = tool 'sonar-scanner'
}
stages {
stage('Git Checkout') {
steps {
git branch: 'main', url: '<repo-url>'
}
}
stage('Compile') {
steps {
sh 'mvn compile'
}
}
stage('Test') {
steps {
// NOTE: tests are skipped here for faster demo runs; remove -DskipTests=true to execute the suite
sh 'mvn test -DskipTests=true'
}
}
stage('SonarQube Analysis') {
steps {
withSonarQubeEnv('sonarqube-server') {
sh '''$SCANNER_HOME/bin/sonar-scanner -Dsonar.projectName=bankapp -Dsonar.projectKey=bankapp -Dsonar.java.binaries=target'''
}
}
}
stage('Build') {
steps {
sh 'mvn package -DskipTests=true'
}
}
stage('Publish Artifacts') {
steps {
withMaven(globalMavenSettingsConfig: 'maven-globalsettings', jdk: '', maven: 'maven3', mavenSettingsConfig: '', traceability: true) {
sh 'mvn deploy -DskipTests=true'
}
}
}
stage('Docker Build and Tag Image') {
steps {
script {
withDockerRegistry(credentialsId: 'docker-credentials', toolName: 'docker') {
sh "docker build -t <dockerhub-userid/image-name>:${params.DOCKER_TAG} ."
}
}
}
}
stage('Push Image') {
steps {
script {
withDockerRegistry(credentialsId: 'docker-credentials', toolName: 'docker') {
sh "docker push <dockerhub-userid/image-name>:${params.DOCKER_TAG}"
}
}
}
}
stage('Update YAML File') {
steps {
script {
withCredentials([gitUsernamePassword(credentialsId: 'git-credentials', gitToolName: 'Default')]) {
sh '''
# Remove existing directory if it exists
if [ -d "Multi-Tier-BankApp-CD" ]; then
rm -rf Multi-Tier-BankApp-CD
fi
# Clone repository
git clone <repo-url-of-yaml-files>
cd Multi-Tier-BankApp-CD
# List the contents of the bankapp directory
ls -l bankapp
# Update the image version in the YAML file
sed -i 's|image: <dockerhub-userid/image-name>:.*|image: <dockerhub-userid/image-name>:'${DOCKER_TAG}'|' bankapp/bankapp-ds.yml
# Display the updated YAML file
cat bankapp/bankapp-ds.yml
'''
// Set Git user details
sh '''
cd Multi-Tier-BankApp-CD
git config user.email "mail"
git config user.name "username"
'''
// Add, commit, and push the changes to Git
sh '''
cd Multi-Tier-BankApp-CD
git add bankapp/bankapp-ds.yml
git commit -m "Updated image ${DOCKER_TAG}"
git push origin main
'''
}
}
}
}
}
}
This pipeline automates the build, test, quality analysis, artifact management, containerization, and deployment update processes for a Java-based banking application. Each stage is tailored to ensure an efficient DevOps workflow, reducing errors and enabling quick application delivery.
Pipeline Parameters and Tools
Agent Declaration:
Specifies the pipeline to run on any available Jenkins agent.
Flexibility to execute on different environments.
Parameterization:
Introduces DOCKER_TAG, allowing customizable tagging for Docker images.
Helps manage versioning of images effectively.
Tools Configuration:
Includes Maven for build and dependency management.
Configures SonarQube Scanner for code analysis.
Environment Variables:
Sets SCANNER_HOME to the SonarQube scanner path for seamless integration.
Stages:
1. Git Checkout
Purpose: Retrieve the latest codebase for the project.
Actions: Pulls the main branch from the provided repository URL.
Outcome: Ensures the pipeline starts with the most recent code changes.
2. Compile
Purpose: Validate the code by compiling it.
Actions: Executes mvn compile to check for errors.
Outcome: Verifies that the codebase is error-free and ready for the next stage.
3. Test
Purpose: Run test cases to validate the application.
Actions: Runs the Maven test phase with -DskipTests=true, skipping test execution for faster runs in this demo context; in a production pipeline the flag would be removed so the suite actually executes.
Outcome: Ensures a basic level of testing before proceeding.
4. SonarQube Analysis
Purpose: Analyze the code for quality, security vulnerabilities, and maintainability.
Actions:
Configures SonarQube with project-specific parameters like project name and key.
Runs the scanner tool to assess the Java binaries.
Outcome: Provides a detailed report on code quality.
5. Build
Purpose: Create a deployable artifact.
Actions: Uses the mvn package command to bundle the application into a .jar file.
Outcome: Produces a package that can be distributed or deployed.
6. Publish Artifacts
Purpose: Store the build artifact in a central repository for future use.
Actions:
Executes Maven's deploy phase to upload the .jar file to Nexus or another repository.
Configures global Maven settings to ensure compatibility.
Outcome: Stores artifacts in a central location for versioning and sharing.
7. Docker Build and Tag Image
Purpose: Containerize the application for deployment.
Actions:
Builds a Docker image from the application’s codebase.
Tags the image with the version specified by the DOCKER_TAG parameter.
Outcome: Generates a portable Docker image ready for deployment.
8. Push Image
Purpose: Upload the Docker image to a registry (e.g., DockerHub).
Actions:
Authenticates using Docker credentials.
Pushes the tagged image to the specified registry.
Outcome: Makes the image available for deployment.
9. Update YAML File
This stage ensures the Kubernetes deployment uses the latest version of the Docker image created earlier in the pipeline.
Kubernetes deployments are defined and managed through configuration files, typically in YAML format. These files specify key details like the container image to deploy. When a new Docker image version is created, the deployment configuration must be updated to reference the new image. Automating this step eliminates manual intervention and ensures consistency across environments.
1. Clone the Repository
The YAML files for the Kubernetes deployment are stored in a Git repository. Cloning this repository ensures access to the latest configuration files.
The pipeline clones the repository (Multi-Tier-BankApp-CD) where the YAML files are maintained.
git clone <repo-url>
Before cloning, the pipeline removes any previously existing local copies of this directory to avoid conflicts.
After cloning, it navigates into the repository directory to access the necessary files.
2. Update the Deployment File
The deployment file (bankapp-ds.yml) contains the image information for the Kubernetes Deployment. Updating the image field ensures Kubernetes pulls the correct image version during deployment.
The sed command is used to find and replace the existing image reference with the new Docker image and tag:
sed -i 's|image: <dockerhub-userid/image-name>:.*|image: <dockerhub-userid/image-name>:'${DOCKER_TAG}'|' bankapp/bankapp-ds.yml
The s|...|...| syntax in sed is used for search-and-replace operations.
The .* wildcard matches the existing image tag, which is replaced by the new DOCKER_TAG.
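For illustration, if the file currently references tag v1 and the pipeline runs with DOCKER_TAG=v2 (both tags hypothetical), the relevant line changes as follows:
Before: image: <dockerhub-userid/image-name>:v1
After:  image: <dockerhub-userid/image-name>:v2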
Verification:
After updating the file, the pipeline displays the updated YAML to confirm the change:
cat bankapp/bankapp-ds.yml
3. Configure Git User Details
Git operations (e.g., commits) require user details for attribution.
The pipeline sets the Git user name and email to identify the changes:
git config user.email "<email>"
git config user.name "<username>"
4. Commit and Push Changes
The updated YAML file must be saved and pushed back to the repository so the deployment can use it.
Steps
Stages the modified YAML file:
git add bankapp/bankapp-ds.yml
Commits the changes with a descriptive message:
git commit -m "Updated image ${DOCKER_TAG}"
Pushes the changes to the repository:
git push origin main
The Kubernetes YAML file (bankapp-ds.yml) now references the updated Docker image with the correct tag.
Once deployed, Kubernetes will pull the new image automatically, ensuring the latest application version runs in the cluster.
This automation reduces errors and enforces version consistency across environments.
2. Build Management with Maven
Maven is a build automation tool that streamlines the process of managing dependencies, compiling code, running tests, and packaging applications. It plays a vital role in ensuring a consistent and reliable build process across different environments in the DevOps pipeline.
Purpose of Maven in the Build Stage:
Maven automates the process of building, testing, and packaging applications. It reduces complexity by standardizing the build process, ensuring that every build is reproducible and consistent.
Dependency Management:
Maven simplifies dependency management by automatically downloading and managing project libraries from a central repository. The pom.xml file specifies the dependencies, ensuring that the correct versions are used across all environments, which eliminates compatibility issues.
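For example, a single test-scoped dependency declared in pom.xml looks like this (the artifact and version are a generic illustration, not necessarily ones used by this project):
<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.13.2</version>
    <scope>test</scope>
  </dependency>
</dependencies>
Maven resolves this coordinate from the configured repository and pins the exact version for every build, on every machine.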
Automated Testing:
Maven integrates with testing frameworks like JUnit and TestNG to automate unit and integration testing during the build process. This ensures that code is tested continuously, catching issues early and preventing defects from progressing further.
Artifact Creation:
Once the build and tests are successful, Maven packages the application into deployable artifacts (e.g., JAR, WAR files). These artifacts are stored in repositories like Nexus, where they can be retrieved for deployment.
Ensuring Consistency Across Environments:
Maven ensures that builds are consistent across environments by using the same pom.xml configuration for dependency management and build processes. This prevents the "it works on my machine" problem, making sure that the application behaves the same way in all environments.
In short, Maven automates the build, testing, and artifact creation process, ensuring a consistent and reliable pipeline across different environments.
3. Code Quality Analysis with SonarQube
In critical applications like banking, code quality is essential for ensuring security, performance, and reliability. Poor code quality can lead to vulnerabilities, errors, and system failures, which in sensitive environments can result in severe consequences.
SonarQube’s Features
Static Code Analysis: SonarQube scans the code to identify issues like bugs, vulnerabilities, and code smells without executing the program.
Security Checks: SonarQube provides security analysis to detect potential risks, ensuring that applications are secure against attacks.
Feedback Integration with Jenkins: SonarQube integrates with Jenkins to provide real-time feedback during the CI process, highlighting issues as soon as they are introduced.
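A common follow-up to the analysis stage (a sketch, assuming a SonarQube webhook back to Jenkins is configured) is a quality-gate stage that pauses the pipeline until SonarQube reports its verdict and aborts the build if the gate fails:
stage('Quality Gate') {
    steps {
        // Block until SonarQube posts the quality gate result; fail fast if the gate fails
        timeout(time: 5, unit: 'MINUTES') {
            waitForQualityGate abortPipeline: true
        }
    }
}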
Improving Code Maintainability and Reliability
By identifying and fixing issues early, SonarQube enhances code maintainability and reliability. Continuous feedback encourages best practices, making the codebase more efficient, secure, and easier to maintain.
In summary, SonarQube plays a crucial role in maintaining high code quality, especially in critical applications, by offering static analysis, security checks, and seamless integration with Jenkins.
4. Artifact Storage in Nexus
Nexus acts as a repository manager that stores and manages artifacts (e.g., JAR, WAR files) securely in the DevOps pipeline. It ensures that all build artifacts are versioned, easily accessible, and stored in a central location, facilitating smooth integration and deployment processes.
How Artifacts Are Uploaded and Managed: Artifacts are uploaded to Nexus after the build process (often triggered by Jenkins) using tools like Maven or Gradle. Nexus manages these artifacts by organizing them into repositories and tagging them with unique versions. It ensures that developers always access the correct version of an artifact for further testing or deployment.
Importance of Version Control for Reliable Deployments:
Version control in Nexus ensures that only specific versions of artifacts are deployed, reducing the risk of deploying incorrect or incompatible versions. This helps maintain consistency across environments and ensures reliable, repeatable deployments.
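Concretely, Maven decides which repository receives an artifact from the version declared in pom.xml: versions ending in -SNAPSHOT are deployed to the snapshot repository, while all others go to the release repository.
<version>1.0.0-SNAPSHOT</version>  <!-- deployed to maven-snapshots -->
<version>1.0.0</version>           <!-- deployed to maven-releases -->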
In summary, Nexus plays a critical role in managing artifacts securely, organizing them by version, and ensuring consistent and reliable deployments across different stages of the pipeline.
To configure Nexus with Jenkins for artifact storage, the following steps were carried out:
Installed the Config File Provider Plugin:
This Jenkins plugin allows managing configuration files, such as a Maven settings.xml, directly within Jenkins. It was installed from the Manage Jenkins > Plugins section.
Created Global Maven Settings:
Navigated to Manage Jenkins > Managed Files.
Created a new Global Maven Settings file.
In this settings file, the repository configurations for both maven-releases and maven-snapshots were added under the <servers> section.
For each repository, details such as the repository URL and credentials (user ID and password) were provided. This ensures Jenkins has the necessary permissions to upload artifacts to these repositories.
<servers> <server> <id>maven-releases</id> <username>your-username</username> <password>your-password</password> </server> <server> <id>maven-snapshots</id> <username>your-username</username> <password>your-password</password> </server> </servers>
Updated the pom.xml File:
The pom.xml of the project was updated to include the Nexus repository URLs for both releases and snapshots.
The <distributionManagement> section was configured with the IP address or hostname of the Nexus server.
Example:
<distributionManagement> <repository> <id>maven-releases</id> <url>http://<Nexus-IP>:8081/repository/maven-releases/</url> </repository> <snapshotRepository> <id>maven-snapshots</id> <url>http://<Nexus-IP>:8081/repository/maven-snapshots/</url> </snapshotRepository> </distributionManagement>
Integration in Jenkins Pipeline:
The pipeline was configured to utilize the provided Maven settings by specifying the global settings file during the Maven build step.
With the settings.xml file in place, Jenkins could seamlessly authenticate and deploy build artifacts to the Nexus repositories as part of the pipeline.
5. Containerization with Docker
Docker is crucial for standardizing application environments because it allows developers to package applications and their dependencies into isolated containers. This ensures that the application behaves the same way across different environments, from development to production, eliminating the classic "it works on my machine" problem.
Process of Creating Docker Images Using Dockerfile: To create Docker images, developers write a Dockerfile, which defines the environment and steps to set up the application (e.g., base image, dependencies, configuration). Docker then builds an image from this file, which can be run as a container on any system that supports Docker.
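As a minimal sketch of such a Dockerfile for a Maven-built Java application like this one (the base image, JAR name, and port are illustrative assumptions, not taken from the project):
# Lightweight Java runtime as the base image
FROM eclipse-temurin:17-jre
WORKDIR /app
# Copy the JAR produced by 'mvn package' into the image
COPY target/bankapp.jar app.jar
# Port the application listens on (illustrative)
EXPOSE 8080
# Launch the application
ENTRYPOINT ["java", "-jar", "app.jar"]
Building it with docker build, as in the pipeline's "Docker Build and Tag Image" stage, yields an image that runs identically on any Docker host.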
Benefits of Containerization for Portability and Consistency: Containerization provides portability by enabling applications to run consistently across various platforms, whether on a developer's laptop, a test server, or a cloud environment. It also ensures consistency by maintaining the same dependencies, configurations, and environment settings inside each container, streamlining the deployment process. In summary, Docker is essential for standardizing environments, creating consistent and portable containers, and ensuring that applications can run seamlessly across different stages of the pipeline.
6. Deployment to AWS EKS
AWS EKS (Elastic Kubernetes Service) simplifies the management of Kubernetes clusters, providing a fully managed service that handles cluster setup, maintenance, and scaling. It eliminates the operational overhead of managing Kubernetes manually and integrates seamlessly with other AWS services, offering high availability and security. Kubernetes plays a crucial role in managing containerized applications by automating the deployment, scaling, and operation of workloads. It ensures efficient resource utilization, self-healing capabilities (auto-restarts, replacements), and seamless scaling of applications based on demand, making it ideal for handling dynamic workloads.
Deployment Process: Setting Up Clusters, Managing Workloads, and Autoscaling
Setting Up Clusters: With AWS EKS, clusters are easily created and managed through the AWS Management Console or CLI. EKS takes care of the underlying infrastructure and Kubernetes components.
Managing Workloads: Kubernetes enables the management of application workloads through Pods, Deployments, and Services. You can easily deploy, update, or scale applications across your EKS cluster.
Autoscaling: AWS EKS supports both Horizontal Pod Autoscaling (scaling the number of Pods based on CPU/memory usage) and Cluster Autoscaling (scaling the entire cluster based on resource demands), ensuring efficient and cost-effective resource utilization; both cluster creation and autoscaling are sketched below.
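To make these steps concrete, here are two hedged sketches; the cluster name, region, node count, Deployment name, and resource targets are illustrative assumptions, not values from this project. A cluster can be created with the eksctl CLI:
eksctl create cluster --name bankapp-cluster --region us-east-1 --nodes 3
And a HorizontalPodAutoscaler manifest scales the application's Pods based on CPU utilization:
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: bankapp-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: bankapp
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70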
In summary, AWS EKS simplifies Kubernetes deployment and management, automates scaling, and integrates with AWS services, ensuring high availability, flexibility, and ease of management for containerized applications.
7. Continuous Deployment with ArgoCD
GitOps is an operational model in which Git repositories serve as the single source of truth for defining and managing both infrastructure and application deployments. It brings a declarative approach to managing Kubernetes clusters, making it easier to automate deployments and manage configurations.
ArgoCD is a GitOps tool built specifically for Kubernetes. It automates continuous delivery by synchronizing the state of applications in the cluster with their corresponding configurations in Git repositories, which simplifies the deployment process and improves the consistency, security, and traceability of changes. By enabling a Git-centric workflow, ArgoCD ensures that Kubernetes deployments are continuously updated from the latest changes in the Git repository, keeping the desired and actual states of applications aligned.
Features of ArgoCD
Real-time Synchronization: ArgoCD automatically monitors Git repositories for changes and syncs those changes to the Kubernetes cluster in real time. This eliminates the need for manual intervention, enabling continuous delivery and ensuring that the deployed application always reflects the latest committed configuration in Git.
Rollback Capabilities: One of the key benefits of ArgoCD is its ability to perform easy rollbacks. If an issue arises during deployment, ArgoCD allows users to revert to previous versions of the application or configuration. This feature helps ensure that failures can be quickly mitigated, and stability is maintained, minimizing downtime and operational risk.
Visualization: ArgoCD provides a comprehensive web UI that visualizes the state of the applications and their environments. It displays a clear view of the deployment process, allowing users to track the current state, see which version of an application is deployed, and quickly spot any discrepancies or issues. The visualization feature also aids in troubleshooting, as users can easily see changes made and how they were applied.
Role of ArgoCD in Automating Deployment Management: ArgoCD continuously monitors the desired state of applications stored in Git and keeps the Kubernetes clusters in sync with that configuration. Automating this process removes manual steps from deployment workflows, reducing human error and speeding up the release cycle. Developers can push updates with confidence, knowing that the deployment will be applied automatically and consistently. ArgoCD also facilitates declarative infrastructure management: users define their infrastructure as code in Git, and ArgoCD takes care of applying those configurations to the cluster.
In summary, ArgoCD is a key component in modern DevOps practices, empowering teams with a GitOps approach to automated continuous delivery. Features like real-time synchronization, rollback capabilities, and visualization make it an indispensable tool for managing Kubernetes deployments efficiently, ensuring that applications are delivered consistently and securely with minimal effort.
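To illustrate how the CD repository from the pipeline above could be registered with ArgoCD, here is a sketch of an Application manifest; the application name, namespaces, and path are illustrative assumptions, and the repository URL placeholder matches the one used earlier:
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: bankapp
  namespace: argocd
spec:
  project: default
  source:
    # Git repository holding the Kubernetes YAML files (the CD repo updated by Jenkins)
    repoURL: <repo-url-of-yaml-files>
    targetRevision: main
    path: bankapp
  destination:
    server: https://kubernetes.default.svc
    namespace: bankapp
  syncPolicy:
    automated:
      prune: true
      selfHeal: true
With automated sync, prune, and selfHeal enabled, ArgoCD both applies new commits and reverts manual drift in the cluster.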
To showcase the successful implementation of the pipeline, here is the final output of the deployed Banking Application. These screenshots capture the live application running on the Kubernetes cluster, reflecting how the entire CI/CD process seamlessly integrates development, testing, and deployment.
8. Conclusion
The integration of a comprehensive CI/CD pipeline in this project has provided significant benefits, driving the automation, quality, and scalability needed for successful modern software development. By leveraging powerful tools like Jenkins, Maven, SonarQube, Nexus, Docker, AWS EKS, and ArgoCD, the pipeline ensures that every aspect of the development, testing, and deployment process is streamlined, efficient, and reliable.
The core benefit of this pipeline is the automation of the software delivery process. Jenkins automates code integration, build, and deployment steps, reducing manual intervention and accelerating the development cycle. Automated testing, triggered by Jenkins pipelines, ensures that only code that meets predefined quality standards is deployed. This continuous feedback loop provides developers with real-time insights into code quality, enabling rapid bug identification and resolution. As a result, the team can focus on creating new features and improving the application while the pipeline handles the routine tasks.
Quality is a cornerstone of this pipeline. With tools like SonarQube, the code is continuously analyzed for potential bugs, security vulnerabilities, and adherence to best practices. This allows for proactive code quality management, preventing issues from reaching production. Maven contributes by managing dependencies and ensuring that builds are consistent across environments, while Nexus serves as a secure artifact repository, enabling reliable and versioned artifact storage. These tools, working together, create a robust environment where code quality is maintained at every stage, leading to more reliable, maintainable, and secure software.
As applications scale, so too must the infrastructure that supports them. AWS EKS plays a critical role in ensuring the application can scale effortlessly with the demands of modern cloud-native workloads. Kubernetes, managed by AWS EKS, allows the application to run in a highly available and scalable environment with features like autoscaling, load balancing, and efficient resource management. Additionally, ArgoCD ensures that the deployment process is automated, offering continuous delivery that keeps the system up to date with minimal downtime. The ability to automatically deploy new versions and manage infrastructure scaling means that the system can handle increasing workloads without manual intervention, ensuring performance and availability as the application grows.
This pipeline is a prime example of the significance of modern DevOps practices, especially for cloud-native applications. In today’s fast-paced development landscape, where speed and reliability are crucial, DevOps bridges the gap between development and operations, enabling continuous collaboration, automation, and delivery. By embracing these practices, organizations can reduce manual errors, improve deployment speed, and ensure high-quality software with minimal risk. The use of cloud platforms like AWS and the adoption of containerization technologies like Docker, paired with Kubernetes, also offers enhanced flexibility and resilience, key requirements for today’s software environments.
A special thanks to Aditya Jaiswal's DevOps Shack YouTube channel, whose video offered valuable support in configuring and running the pipeline. Its detailed, step-by-step breakdowns and insights were essential for gaining a clear understanding of the tools and methods applied throughout the process.
In conclusion, this CI/CD pipeline not only streamlines the development and deployment process but also fosters a culture of continuous improvement and innovation. The benefits of automation, quality assurance, and scalability are undeniable, and their integration within modern DevOps practices ensures that cloud-native applications can meet the dynamic demands of the market. As organizations continue to embrace DevOps, the ability to deliver faster, more reliable software while maintaining high standards will be a key factor in staying competitive and meeting customer expectations.