In the previous post in this series, I covered my favorite development-time helper: running job scripts from the command line. In this post, I’ll cover the differences between job-dsl and Pipelines, and how I currently see the two living together in the Jenkins ecosystem.
If you’re coming into this post directly, without reading the preceding articles in the series, I strongly encourage you to start at the start and then come back. For the rest of you, a quick refresher:
job-dsl is a way of creating Jenkins jobs with code instead of the GUI. Here’s a very simple example:
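Something along these lines (the job name, repo URL, and build step are illustrative, not from a real project):

```groovy
// A minimal job-dsl script, typically run by a seed job.
// Defines a Freestyle job: where the code lives, when to poll, what to run.
job('example-build') {
    scm {
        git('https://github.com/example/example-repo.git')
    }
    triggers {
        scm('H/15 * * * *')   // poll SCM roughly every 15 minutes
    }
    steps {
        shell('./gradlew clean build')
    }
}
```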
When processed via a Jenkins seed job, this code will turn into the familiar Jenkins jobs you know and love.
What are Pipelines?
Jenkins Pipelines are a huge topic, way more than I am going to cover in a single blog post. So I'm going to give the 30,000-foot view, leave a whole bunch of stuff out, and, I hope, whet your appetite for learning more. For the impatient, skip these next few paragraphs and head straight for the Jenkins Pipeline tutorial.
At its simplest, a Pipeline is very job-dsl-ish: it lets you describe operations in code. But you have to shift your mindset quite a bit from the Freestyle jobs you know well. When configuring a Freestyle job, you have the vast array of Jenkins plugins at your fingertips in the GUI (and job-dsl) — SCM management, Build Triggers, Build Steps, Post-build actions.
But with Pipelines, it's different. You get Build Triggers and your Pipeline definition. But what about the other stuff, you ask? This is where the mindset shift comes in. Those things are no longer configured at the job level, but at the Pipeline level. And plugins are not automatically supported in Pipelines, so currently you get a subset of all available Jenkins functionality in Pipelines.
Thus in practice, that means things like git repos, build steps, email, test recording/reporting, publishers, etc. are all done, in text, in the Pipeline definition.
Here’s an example that ships with Jenkins:
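A minimal scripted Pipeline in that spirit (the repo URL, build command, and report path below are placeholders, not the exact snippet bundled with Jenkins):

```groovy
// A scripted Pipeline: checkout, build, and record test results,
// all in one job, with Stages marking the discrete steps.
node {
    stage('Checkout') {
        git 'https://github.com/example/example-repo.git'
    }
    stage('Build') {
        sh './gradlew clean build'
    }
    stage('Results') {
        junit '**/build/test-results/**/*.xml'
    }
}
```

Notice that the SCM checkout, the build step, and the test publisher are all steps inside the script, not sections of a job configuration page.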
This is kinda sorta like…
Probably confusing, I know. Let’s try to think of it this way: If you’ve read Jez Humble and David Farley’s Continuous Delivery, or have otherwise implemented build/deploy pipelines in Jenkins for years, you already have a solid conceptual sense of pipelines. It’s just that in Jenkins world until rather recently, you probably did this in one of two ways:
- Upstream / downstream jobs (possibly in combination with the Delivery Pipeline plugin); or
- Via the Build Flow plugin, with independent jobs orchestrated through a simple text DSL
Either way, you probably had independent Freestyle jobs tied together somehow to make a pipeline.
Well, Jenkins Pipelines still certainly enable you to do that — and I’ll talk specifically about option #2 momentarily — but the big change here is that Pipelines enable you to do all that orchestration in a single job.
Whereas before you might have separate BuildJob, TestJob, and DeployJob tied together in one of the manners above, with Pipelines, you can do all that in a single job, using the concept of Stages to separate the discrete steps of the pipeline.
Cool! What else do I get with this?
Even with the simplest of Pipelines, you get:
- Durability, to survive Jenkins restarts
- Pausing for user input
- Parallelism built in
- Pipeline snippet generator to help you build pipelines
- Nice visualization of your Pipelines right in the UI
But wait, there’s more
You also get Travis CI-style CI via a Jenkinsfile, and Multibranch Pipelines, which enable pipeline configuration for different branches of a repo. From the Jenkins Pipeline tutorial: “In a multibranch pipeline configuration, Jenkins automatically discovers, manages, and executes jobs for multiple source repositories and branches.”
In addition, Jenkins Blue Ocean is shaping up to offer beautiful visualizations for Pipeline jobs.
Using Pipelines right now, today
Let’s say you still really like loosely coupled jobs that can run independently or together as part of a Pipeline (caveat: I have encouraged this approach for years, and it’s why I’ve long used Build Flow plugin over Build/Delivery Pipeline plugin). Right now, today, you can replace your Build Flow jobs with Pipelines.
In fact, you should do this: the Build Flow wiki page itself notes that the plugin is deprecated and points users to Pipeline as its replacement.
A simple Pipeline script to build other jobs looks like this:
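Using the `build` step, it might look like this (job names are placeholders):

```groovy
// Orchestrate existing, independent jobs from a Pipeline,
// much as a Build Flow DSL script used to.
node {
    stage('Build') {
        build job: 'BuildJob'
    }
    stage('Test') {
        // run the two test jobs concurrently
        parallel(
            unit:        { build job: 'UnitTestJob' },
            integration: { build job: 'IntegrationTestJob' }
        )
    }
    stage('Deploy') {
        build job: 'DeployJob'
    }
}
```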
Overall, pretty similar to Build Flow. And don't worry, you get all the parallelism, retry, etc. that you're used to.
I am tremendously grateful for the people who’ve built and maintained the Build Flow plugin over the years. For me, it’s been a cornerstone of continuous delivery, enabling independent, reusable, loosely coupled jobs. Build Flow developers: Thank you!
But it's time to move on: Pipeline will replace Build Flow.
Do Pipelines replace job-dsl?
Now, to the final question: should Pipelines replace job-dsl?
I believe that’s the wrong question.
job-dsl will be complementary to Pipelines. Even if I were to stop using Freestyle jobs entirely, and build nothing but Pipelines, I’d still use job-dsl to create those jobs. In fact, if you go back to my initial post where I described the problems we were trying to solve when we adopted job-dsl, none of them are solved by Pipelines. In that respect, Pipeline is just another type of job.
A friggin’ awesome type of job, no doubt. I am incredibly excited about Pipelines and look forward to using them more. And here’s how I’ll be building those jobs, as job-dsl has full support for Pipelines:
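That is, a job-dsl script along these lines (job name, repo, and branch are placeholders):

```groovy
// job-dsl defining a Pipeline job whose definition lives
// in a Jenkinsfile at the root of the repo.
pipelineJob('example-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://github.com/example/example-repo.git') }
                    branch('master')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}
```

The job-dsl script stays tiny: it just says "this Pipeline job exists and its definition lives over there," while the Jenkinsfile carries the actual pipeline logic.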
So what is the right question?
If asking whether Pipelines replace job-dsl is the wrong question, what’s the right question?
I believe it’s:
- When should Pipeline replace Freestyle jobs?
- When should Pipeline — via Jenkinsfile — replace creating jobs directly in Jenkins (via GUI or job-dsl)?
I’m going to mostly cop out of answering those questions right now, as, for me, the answers are still evolving as I work more with Pipelines.
My initial gut reactions are:
- replace Build Flows, as mentioned above
- replace Freestyle jobs when there’s no value in running that set of jobs independently
- replace Freestyle jobs when you’d benefit from what Multibranch provides
- replace Jenkins-built jobs with a Jenkinsfile when you have a Travis CI-style workflow that you want to use in Jenkins instead, and you've seriously considered the safety and security implications for your organization (my thoughts here are at the very earliest stages)
Next up: encouraging adoption of Jenkins-as-code among teams
In the final planned post in this Jenkins-as-code series, I will address how we encouraged adoption of this approach amongst our development teams. I’ll cover where we succeeded, where we stumbled, and the work I think we still have to do.
5 thoughts on “Jenkins-as-code: comparing job-dsl and Pipelines”
Good write-up. I’ve landed at the same place. We are using Jenkinsfiles (and other similar files in repos when one repo may need multiple build definitions), but defining the jobs themselves in the DSL. The beauty is that the DSL definition is minimal but still gives us solid management over jobs.
Waiting for your next post
I agree that job-dsl is complementary to Pipelines.
I use job-dsl to create pipeline jobs.
Nice one, cleared my confusion on dsl vs pipeline.
Very good set of posts! I – and I believe everyone else that went through the entire series – are longing to read the final post 😀 … Keep them coming!
Thanks a lot,