Jenkins Pipeline: Maven Build + Deploy to Nexus/Artifactory in Docker

Overview

If you want to use a Jenkins pipeline to build your Maven project in a clean Docker container, you can do so as shown below.

You can run any Maven commands you like.  In this particular case I am:

  • Setting the version in the POM to the one provided by the person running the Jenkins job.
  • Building and deploying the Maven project to Artifactory.

This is the “build once” part of “build once, deploy everywhere”.

To test this, though, you can just swap my commands out for mvn --version or something simple like that.

Configuration

Create a new pipeline job and:

  1. Add a string parameter called VERSION_TO_TAG.
  2. Set pipeline definition to “Pipeline script from SCM”.
  3. Give your repo checkout URL – e.g. ssh://git@url.mycompany.com:7999/PROJECT/repo-name.git.
  4. Specify your git credentials to use.
  5. Specify your Jenkinsfile path (if the Jenkinsfile is in the root of your repo, this is probably literally just “Jenkinsfile”, which is the default).

Make sure your project has a “Jenkinsfile” at its root with a definition like this:

pipeline {
    agent {
        docker { image 'maven:3-alpine' }
    }

    stages {
        stage('Set version, build, and deploy to Artifactory') {
            steps {
                // Single quotes: Groovy does not interpolate here, so the
                // shell expands VERSION_TO_TAG, which Jenkins exposes as an
                // environment variable for the string parameter.
                sh 'mvn versions:set -DnewVersion=$VERSION_TO_TAG'
                sh 'mvn deploy'
            }
        }
    }
}

Now, when you build your project, you should see Jenkins docker-pull the maven:3-alpine image, start the container, run the Maven commands, and upload the artifacts to the Artifactory repository you have set up in your Maven POM (in your distributionManagement section, in case you haven’t done that yet).
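If you haven’t set that section up yet, it looks roughly like this; the repository ids and URLs below are placeholders, and each id needs a matching <server> entry with credentials in your Maven settings.xml:

```xml
<!-- Sketch only: ids and URLs are placeholders, not real endpoints. -->
<distributionManagement>
  <repository>
    <id>artifactory-releases</id>
    <url>https://artifactory.mycompany.com/artifactory/libs-release-local</url>
  </repository>
  <snapshotRepository>
    <id>artifactory-snapshots</id>
    <url>https://artifactory.mycompany.com/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
</distributionManagement>
```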

Python PIP Install Local Module While Developing

I’m definitely still in the early stages of learning module/package building and deployment for Python.  So, take this with a grain of salt…

But I ran into a case where I wanted to develop/manipulate a package locally in PyCharm while I was actively using it in another project I was developing (actually, in a Jupyter notebook).  It turns out there’s a pretty cool way to do this.

Module Preparation

The first thing I had to do was prepare the package so that it was deployable using the standard Python distribution style.

In my case, I just made a directory for my package (lower-case name, underscore separators).  Inside the directory, I created the standard packaging files (most importantly, a setup.py alongside the package code itself).

Here’s an example.  Ignore everything that I didn’t mention; all of that is auto-generated by PyCharm and not relevant.  In fact, it probably would have been better to create a sub-directory in this project for the package; but I just considered the top level directory the package directory for now.

[Screenshot: package layout in PyCharm]

Module Installation

Once you have your module set up like this, you can jump into your command line (assuming you have PIP installed) and run this command, tailored for your package directory location:

λ pip install -e C:\dev\python\jupyter_audit_query_tools
Obtaining file:///C:/dev/python/jupyter_audit_query_tools
Installing collected packages: PostgresQueryRunner
Running setup.py develop for PostgresQueryRunner
Successfully installed PostgresQueryRunner

You’ll also be able to see the package mapped to that directory when you list the packages in PIP:

λ pip list | grep postgres
postgres-query-runner 1.1.0 c:\dev\python\jupyter_audit_query_tools

Module Usage

After this, you should be able to import and use the package/modules in your interpreter or notebook.  You can change the code in the package and it will update in the places you’re using it, assuming you re-import the package.  So, in Jupyter, this would mean clicking the restart-kernel/re-run button.
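One note on that last step: restarting the kernel always works, but for simple cases you can sometimes avoid it with importlib.reload.  This is my own addition rather than part of the workflow above, and json below is just a stand-in for your editable-installed package:

```python
# Reload an already-imported module in place instead of restarting the
# Jupyter kernel. Deeply nested imports may still need a full restart.
import importlib
import json  # stand-in for your own editable-installed module

json = importlib.reload(json)  # reload() returns the refreshed module
print(json.dumps({"reloaded": True}))
```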