MPL – Modular Pipeline Library

https://jenkins.io/blog/2019/01/08/mpl-modular-pipeline-library/


For example, let’s say you have a common Java Maven project. You are creating a
Jenkinsfile in the repo and want to use the default pipeline prepared by your
DevOps team. The MPL already provides one: the core MPLPipeline. It is a really
simple pipeline, but it’s a good starting point for anyone who wants to try the
MPL. Let’s look at a minimal Jenkinsfile:

@Library('mpl') _
MPLPipeline {}

This Jenkinsfile contains a single line to load the MPL and another line to run
the pipeline. Most shared libraries implement an interface like this: you call
one step and provide some parameters. MPLPipeline itself is just a custom
pipeline step (it lives in the vars directory), and its structure is simple,
following these steps:

  1. Initialize the MPL
    The MPL uses the MPLManager singleton object to control the pipeline

  2. Merge configuration with default and store it
    A default configuration is needed to specify the stages and predefine some useful options

  3. Define a declarative pipeline with 4 stages and poststeps:

    1. Checkout – Getting the project sources

    2. Build – Compiling, static validation, unit tests

    3. Deploy – Uploading artifacts to the dynamic environment and running the app

    4. Test – Checking integration with other components

    5. Poststeps – Cleaning dynamic environment, sending notifications, etc.

  4. Run the defined pipeline
    This is where the MPL starts to work its magic and the pipeline actually runs (a simplified sketch of this structure follows the list)
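
To make that structure concrete, here is a simplified, illustrative sketch of a
vars-style step with the shape described above. It is not the real MPLPipeline
source: the initialization and configuration merging are reduced to a comment,
and the poststep execution is only hinted at with an echo.

def call(body) {
  // Steps 1-2: initialize the MPLManager singleton and merge the user-provided
  // block with the default configuration (omitted in this sketch)
  pipeline {
    agent any
    stages {
      stage('Checkout') { steps { MPLModule() } }  // module name defaults to the stage name
      stage('Build')    { steps { MPLModule() } }
      stage('Deploy')   { steps { MPLModule() } }
      stage('Test')     { steps { MPLModule() } }
    }
    post {
      always {
        echo 'Poststeps stored via MPLPostStep are executed here'
      }
    }
  }
}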

Stages in the main MPL pipelines usually contain just one step: MPLModule.
This step holds the core functionality of the MPL: executing the modules
which contain the pipeline logic. You can find the default modules in the MPL
repository under resources/com/griddynamics/devops/mpl/modules. The folders
include Checkout, Build, Deploy, and Test, and in each of them we find Groovy
files with the actual logic for the stages. The following infographic shows a
simplified MPL repository structure:

Fig 3. A simplified MPL repository structure

When the Checkout stage starts, MPLModule loads the module by name (by default,
the stage name) and runs the Checkout/Checkout.groovy
logic:

if( CFG.'git.url' )
  MPLModule('Git Checkout', CFG)
else
  MPLModule('Default Checkout', CFG)

If the configuration contains the git.url option, MPLModule loads the Git
Checkout module; otherwise it runs the Default Checkout module. All called
modules use the same configuration as the parent module, which is why CFG was
passed to the MPLModule call. In this case we have no specific configuration,
so it runs the
Checkout/DefaultCheckout.groovy
logic. The space in the module name acts as a separator that places the module
into a specific folder.
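
To illustrate that naming convention, here is a tiny standalone sketch (not MPL
code) of how a module name with a space could map to a file path, assuming the
last word selects the folder, as in the Default Checkout example above:

// Illustrative only: 'Git Checkout' -> 'Checkout/GitCheckout.groovy',
// 'Default Checkout' -> 'Checkout/DefaultCheckout.groovy'
String moduleNameToPath(String name) {
  def words = name.tokenize(' ')
  return "${words.last()}/${words.join('')}.groovy"
}

assert moduleNameToPath('Default Checkout') == 'Checkout/DefaultCheckout.groovy'
assert moduleNameToPath('Git Checkout') == 'Checkout/GitCheckout.groovy'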

In the Default Checkout module there is just one line, a checkout scm call,
which clones the repository specified in the Jenkins job. That’s all the
Checkout stage does; the MPL machinery is overkill for such a small stage, and
we only walk through it here to show how the MPL works in modules.
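
Based on that description, the entire Checkout/DefaultCheckout.groovy module
boils down to a single step:

// Clones the repository configured in the Jenkins job
checkout scm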

The same process applies to the Build stage, as the pipeline runs the
Maven Build
module:

withEnv(["PATH+MAVEN=${tool(CFG.'maven.tool_version' ?: 'Maven 3')}/bin"]) {
  def settings = CFG.'maven.settings_path' ? "-s '${CFG.'maven.settings_path'}'" : ''
  sh """mvn -B ${settings} -DargLine='-Xmx1024m -XX:MaxPermSize=1024m' clean install"""
}

This stage is a little more complicated, but the action is simple: we take the
tool with the default name Maven 3 and use it to run mvn clean install.
The modules are scripted pipelines, so you can use the same steps that are
usually available in a Jenkins Pipeline. The files don’t need any special or
complicated syntax: each one is just a plain file with steps and CFG as a
predefined variable holding the stage configuration. MPL modules inherit the
sandbox from the parent, so your scripts will be safe and survive a Jenkins
restart, just like a plain Jenkins pipeline.
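
As a rough illustration of that point, a module is just such a plain scripted
file. The sketch below is not part of the MPL: it reuses the maven.tool_version
key from the Build module above and introduces a hypothetical maven.goals key
purely for demonstration.

// Hypothetical module sketch: plain scripted-pipeline steps plus the
// predefined CFG variable. 'maven.goals' is an illustrative key, not a
// documented MPL option.
def goals = CFG.'maven.goals' ?: 'clean verify'
withEnv(["PATH+MAVEN=${tool(CFG.'maven.tool_version' ?: 'Maven 3')}/bin"]) {
  sh "mvn -B ${goals}"
}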

In the Deploy folder, we find the sample structure of the OpenShift Deploy
module. Its main purpose here is to show how to use poststep definitions in
modules:

MPLPostStep('always') {
  echo "OpenShift Deploy Decommission poststep"
}
echo 'Executing Openshift Deploy process'

First, we define the always poststep. It is stored in the MPLManager and
invoked when the poststeps are executed. We can call MPLPostStep with always as
many times as we want: all the poststeps will be stored and executed in FILO
(first in, last out) order. This lets us keep the logic for actions that need
to be done, and later undone, in the same module, such as the decommissioning
of the dynamic environment, and it ensures those actions will be executed when
the pipeline completes.
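
As a small illustration of the FILO ordering, if a module registered two always
poststeps, the one registered last would run first when the stored poststeps
are executed:

MPLPostStep('always') {
  echo 'Registered first, executed last'
}
MPLPostStep('always') {
  echo 'Registered last, executed first'
}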

After the Deploy stage, the pipeline executes the Test stage, where nothing
particularly interesting happens. However, there is an aspect of testing that
is very important: the testing framework of the MPL itself.
