Simon Bisson
Contributor

How to work with Azure Pipelines for devops

feature
Sep 18, 2018 | 6 mins
Development Tools, Devops, Microsoft Azure

Microsoft’s devops build tools get a makeover, both in the cloud and on your servers

Microsoft’s recent renaming of Visual Studio Team Services as Azure DevOps came as a surprise, rebranding a familiar service and adding significant new features. One of those new features, Azure Pipelines, builds on Microsoft’s previous cloud-hosted build service to deliver a more powerful tool for building and delivering on-premises and cloud-hosted applications for Windows, MacOS, and Linux.

Azure Pipelines is a continuous delivery tool, competing with tools like the open source Jenkins. It’s designed to build code in popular languages, test it, and then deliver it to your choice of endpoint. Like other CI/CD systems, it’s also extensible, with a library of tasks and extensions that add support for test tools and integrate with your devops tool chain.

Azure Pipelines follows a devops workflow

Pipelines can be configured using YAML or through a visual designer. It’s clear that Microsoft expects you to use the YAML option, which makes sense, because your pipeline configuration becomes another file that lives alongside your code, where it can be managed by your choice of source-control system. You’ll also need a source-control repository to use Azure Pipelines, because that’s how it automates the test and build process. Triggering builds from commits can speed up the development cycle significantly.

Building a pipeline in YAML is easy enough, using an azure-pipelines.yml file. Start by connecting your repository to Azure Pipelines, using OAuth to give it access to your code. It then scans your code and builds a basic template that’s ready for use. That template is saved in the master branch of your code. Once you commit code to the branch, the default Azure Pipelines trigger runs your build.
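The generated template is minimal. As a sketch (exact hosted-image names and build steps vary by project and have changed over time), an azure-pipelines.yml might look like this:

```yaml
# azure-pipelines.yml, committed to the master branch alongside your code
trigger:
- master                    # the default trigger: build on each commit to master

pool:
  vmImage: 'ubuntu-16.04'   # a Microsoft-hosted build agent

steps:
- script: echo "building..."
  displayName: 'Run the build script'
```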

The default Azure Pipelines configuration file only handles basic tasks, and you’ll need to fine-tune it for your application and your target environments. If you’re using one of the directly supported environments, like .Net Core, you can define an agent pool that targets your code at a specific version and a specific OS. While not all environments are supported out of the box, you can install specific versions using pipeline tasks, though this adds cost to any build because it runs as a separate job.
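For example, with a .Net Core project you might pin the SDK version with an installer task before building. The task name and version numbers below are illustrative, and the tooling has changed across releases of the service:

```yaml
pool:
  vmImage: 'ubuntu-16.04'

steps:
# Install a specific SDK as a pipeline step when the hosted image
# doesn't carry the version you need (this adds time to every build).
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '2.1.x'
- script: dotnet build --configuration Release
  displayName: 'Build with the pinned SDK'
```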

Working with hosted agents

Microsoft provides a set of hosted agents for most common builds, including Ubuntu (for Linux apps, for Android, and for Linux containers), two versions of Visual Studio, Xcode for MacOS and iOS, and a recent release of Windows Server for containers. You can also add your own agents for specific target environments, though these require self-hosting, either on cloud VMs or on local infrastructure.
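Each job selects a hosted agent by image name in its pool definition. The image names below are examples of what the service has offered; Microsoft retires and renames hosted images over time, so check the current list:

```yaml
jobs:
- job: linux
  pool:
    vmImage: 'ubuntu-16.04'     # Linux apps, Android, Linux containers
  steps:
  - script: make all
- job: windows
  pool:
    vmImage: 'vs2017-win2016'   # Visual Studio 2017 on Windows Server 2016
  steps:
  - script: echo "Windows build steps go here"
```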

One issue facing anyone using cloud-hosted builds is the ephemeral nature of build hosts. Microsoft will tear down and reset virtual machines between builds, so you get a new VM for each build you run. Thus, any dependencies your code has will need to be loaded each time a build runs, which can take a significant amount of time. While there are options, like self-hosting build agents, in practice it’s a matter of adding a task to your build YAML to load the files from Azure storage or from an external repository like NPM or NuGet.
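In the simplest case that’s just a restore step near the top of the YAML; for a Node.js project (package feed and project layout assumed), it might be no more than:

```yaml
steps:
# The hosted VM is reset between builds, so dependencies
# must be restored on every run.
- script: npm install
  displayName: 'Restore npm dependencies'
```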

Once a build has run, your Azure pipeline can then run your tests, sending results to loggers and failing a build if a test fails. The ability to export results in common log file formats lets you import them into analysis tools, ready to diagnose your code and fix any errors.
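A sketch of that step, using the built-in PublishTestResults task with JUnit-format output (the results path depends on your test runner):

```yaml
steps:
- script: npm test
  displayName: 'Run tests'
- task: PublishTestResults@2
  condition: succeededOrFailed()   # publish results even when tests fail
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-results.xml'
```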

Compiled code artifacts are delivered to a predefined build directory, and then published using your choice of publishing tasks. For example, .Net Core code can be pushed straight to NuGet or wrapped as a ZIP file if you’re delivering a web app to Azure.
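For a .Net Core app, that compile-and-publish pair might look like this (the configuration and artifact names are illustrative):

```yaml
steps:
# Compile and stage the output in the predefined artifact directory...
- task: DotNetCoreCLI@2
  inputs:
    command: 'publish'
    arguments: '--configuration Release --output $(Build.ArtifactStagingDirectory)'
# ...then publish it as a named build artifact.
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```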

Using the Azure Pipelines visual designer

While the YAML-based approach to configuring Azure Pipelines lets you construct and share repeatable build scripts and environments, the visual designer is an attractive alternative. That’s especially so if you’re new to devops concepts and to automating builds. In the Azure Pipelines web UI, start by creating a new pipeline. You need to specify a repository and the project you’re going to build.

Creating jobs is easy enough: Choose to create an empty job and then attach it to your pipeline. You can then pick an appropriate agent for your build and start attaching tasks to it, running scripts and compiling code before publishing the results. Once a build has been tested, use the Save and Queue option, then attach a continuous integration trigger to automate the build process. The visual designer is flexible: You can use it to pass variables to jobs and tasks using the same syntax as in a YAML configuration.
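That variable syntax is the same in both places. A minimal YAML sketch:

```yaml
variables:
  buildConfiguration: 'Release'   # referenced below as $(buildConfiguration)

steps:
- script: dotnet build --configuration $(buildConfiguration)
  displayName: 'Build ($(buildConfiguration))'
```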

Complex tasks can be run using more than one pipeline; for example, using one to build your code, one to run tests, and one to handle deployments. Triggers pass state from pipeline to pipeline, so you can trigger a deployment automatically once all tests have been passed.
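In later YAML syntax, that kind of chaining is declared as a pipeline resource in the downstream pipeline; the pipeline names here are hypothetical:

```yaml
# In the deployment pipeline: run automatically when the
# test pipeline completes successfully.
resources:
  pipelines:
  - pipeline: tests              # local alias for the upstream pipeline
    source: 'my-test-pipeline'   # name of the upstream pipeline
    trigger: true
```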

Using Azure Pipelines with GitHub

Perhaps the smartest thing Microsoft has done with Azure Pipelines is decouple it from the rest of its devops tools. Yes, you can use it with what was Visual Studio Team Services, but you can also use it with source code you have on GitHub. Open source projects get another bonus: as many as ten parallel jobs for free, with unlimited build minutes. While that might require some work in sequencing jobs, it gives open source developers access to a state-of-the-art build tool without having to stand up their own Jenkins or Travis instances.

You’re not limited to using Microsoft infrastructure to run Azure Pipelines, because there’s a self-host option. This gives you one free parallel job with unlimited minutes. More than that comes in at $15 per parallel job per month. You need to install Visual Studio’s Team Foundation Server to get access to the local Azure Pipelines service, and once you’re up and running you can move jobs to and from the cloud platform if you prefer to use Microsoft-managed tools.


Author of InfoWorld's Enterprise Microsoft blog, Simon Bisson prefers to think of “career” as a verb rather than a noun, having worked in academic and telecoms research, as well as having been the CTO of a startup, running the technical side of UK Online (the first national ISP with content as well as connections), before moving into consultancy and technology strategy. He’s built plenty of large-scale web applications, designed architectures for multi-terabyte online image stores, implemented B2B information hubs, and come up with next generation mobile network architectures and knowledge management solutions. In between doing all that, he’s been a freelance journalist since the early days of the web and writes about everything from enterprise architecture down to gadgets.
