Build your own Hosted VSTS Agent Cloud: Part 3 – Automate

In the previous posts (part 1, part 2) I introduced Packer and showed the build and release scripts that I use to create my own VSTS Agent pool in Azure. In this post, I’ll tie everything together and show you the Git repositories and Build and Release definitions I use to automate my Agent Pool.

Setting up my Git repos

The VSTS image configuration is open sourced in Microsoft’s vsts-image-generation repository on GitHub. If all you want to do is reuse the configuration and make no changes, you can use that repository as is. In my scenario, I want to be able to make my own changes and still sync with the Microsoft repository to get their latest changes.

In my VSTS project, I’ve used the Import Repository option to pull the GitHub repo into my VSTS account. After the import, I just add the original GitHub repository as a remote so I can easily merge Microsoft’s updates with my own work.

VSTS allows you to import external repositories into your project
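With the import done, wiring up that remote takes a couple of git commands. A sketch, run inside the local clone of the imported repository (the remote name upstream is my choice, not required):

```shell
# Add Microsoft's public repository as a second remote named "upstream".
git remote add upstream https://github.com/Microsoft/vsts-image-generation.git

# Pull in and merge their latest changes, then push the result back to VSTS.
git fetch upstream
git merge upstream/master
git push
```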

My default repository, where I store my scripts and configuration files, is in the VSTSHostedAgentPool repository. In this repo, I’ve added the vsts-image-generation repository as a submodule. This makes it easy to pull both repositories into my build definition. I can also use relative paths from my scripts pointing to the VSTS image generation files.

You can add vsts-image-generation as a submodule by running the following command from within your VSTSHostedAgentPool repository:

git submodule add -b master https://&lt;youraccount&gt;.visualstudio.com/&lt;yourproject&gt;/_git/vsts-image-generation
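One thing to keep in mind: a fresh clone of VSTSHostedAgentPool leaves the submodule directory empty until you initialize it. A typical sequence (the URL is a placeholder):

```shell
# Clone the main repository, then pull in the submodule's contents.
git clone https://<youraccount>.visualstudio.com/<yourproject>/_git/VSTSHostedAgentPool
cd VSTSHostedAgentPool
git submodule update --init
```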

For all my PowerShell work, VS Code is my favorite editor. As of January 2018, VS Code has support for Git submodules, which makes working with them a lot easier.

Creating the build definitions

I’ve chosen to use Builds as Code for the build definitions. I really like being able to version control my builds, and it also makes it easier to share them through GitHub. I’ve created two builds: Build Image and Clean. Both builds and the release definitions share one Variable Group. This makes it easy to define all values in one place and reuse them across all the definitions.

A shared Variable Group configures all values needed for building and releasing the Agent Pool

Clean is easy. The build takes three Boolean parameters that define whether you want to delete your agent pool, the current images, and any leftover Packer resource groups. I’ve added three variables that default to $true and are settable at queue time. This allows me to easily queue a new build and set variables to $false if I don’t want to remove those artifacts. The other variables come from the linked Variable Group.


The Clean definition has three parameters that you set at queue time to clean up Azure resources

The YAML file is short and easy:


   queue:
     name: Hosted VS2017
     demands: azureps

   steps:
   - task: AzurePowerShell@2
     inputs:
       azureConnectionType: 'ConnectedServiceNameARM'
       azureSubscription: 'Azure Connection'
       ScriptPath: 'Clean.ps1'
       ScriptArguments: '-RemovePackerResourceGroups:$(RemovePackerResourceGroups) -RemoveManagedImages:$(RemoveManagedImages) -RemoveAgentPoolResourceGroup:$(RemoveAgentPoolResourceGroup) -ManagedImageName "$(ManagedImageName)" -ManagedImageResourceGroupName "$(ManagedImageResourceGroupName)" -AgentPoolResourceGroup "$(AgentPoolResourceGroup)"'
       azurePowerShellVersion: 'LatestVersion'

The YAML file defines one step that runs an Azure PowerShell script. ScriptArguments is the messiest part: I pass in the required resource names and set the switches to true or false. One PowerShell trick I learned while working on this is that you can explicitly set a switch parameter to true or false with the syntax -SwitchName:$(SwitchParameter). The SwitchParameter variable can be $true or $false, which enables or disables the switch.
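As a tiny illustration of the colon syntax, here is a hypothetical Demo.ps1 (not part of the pipeline):

```powershell
# Demo.ps1 - shows the -SwitchName:$Value syntax for switch parameters
param (
    [switch]$RemoveManagedImages
)
"RemoveManagedImages is set to: $RemoveManagedImages"
```

Calling .\Demo.ps1 -RemoveManagedImages:$false reports False, while -RemoveManagedImages:$true (or just -RemoveManagedImages) reports True. The build definition feeds the queue-time variable $(RemoveManagedImages) into exactly this syntax.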

I’ve named the other build definition Build Image. All variables are brought in by linking the shared Variable Group that I mentioned above. The other important configuration option is enabling checkout of Git submodules.

Enable checkout of submodules to make sure that the vsts-image-generation repository is checked out as a submodule

And finally, this is the YAML file that calls the Build script. I run this build on a private agent in the Default agent pool, because the timeout of the Hosted Agent Pool is 30 minutes, which is too short for creating the full VSTS image.

   queue:
     name: Default
     timeoutInMinutes: 600

   steps:
   - task: AzurePowerShell@2
     inputs:
       azureConnectionType: 'ConnectedServiceNameARM'
       azureSubscription: 'Azure Connection'
       ScriptPath: 'Build.ps1'
       ScriptArguments: '-Location "$(Location)" -PackerFile "$(PackerFile)" -ClientId "$(ClientId)" -ClientSecret "$(ClientSecret)" -TenantId "$(TenantId)" -SubscriptionId "$(SubscriptionId)" -ObjectId "$(ObjectId)" -ManagedImageResourceGroupName "$(ManagedImageResourceGroupName)" -ManagedImageName "$(ManagedImageName)"'
       azurePowerShellVersion: 'LatestVersion'

Creating the Release definitions

I’ve created a number of release definitions: Deploy Agent Pool, Scale Agent Pool, Stop Agent Pool and Start Agent Pool. All definitions are linked to the Variable Group that I share with the build definitions.

The following image shows the pipeline that I use for both Deploy Agent pool and Scale Agent Pool. The only difference is that I’ve disabled the automatic trigger for the Scale Agent Pool pipeline.

The pipeline for Deploy Agent Pool and Scale Agent Pool

The single task in the environment runs the Release.ps1 or Scale.ps1 script with an Azure PowerShell task.

The Scale script takes an extra parameter named Capacity, which maps to the number of VMs you want in your VM Scale Set. Capacity is a variable in the release definition. If you want to scale the number of VMs, you create a draft release, set the Capacity variable and then execute the release. This gives me a semi-automated way of scaling the number of Agents in my pool.
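The real Scale.ps1 lives in my repository; as a rough, hypothetical sketch of the idea (the parameter names and defaults here are assumptions):

```powershell
param (
    [string]$ResourceGroup = $env:AgentPoolResourceGroup,
    [string]$ScaleSet = "ScaleSet",
    [int]$Capacity = 2   # desired number of VMs in the scale set
)

# Read the scale set, change its SKU capacity and apply the update in Azure.
$vmss = Get-AzureRmVmss -ResourceGroupName $ResourceGroup -VMScaleSetName $ScaleSet
$vmss.Sku.Capacity = $Capacity
Update-AzureRmVmss -ResourceGroupName $ResourceGroup -VMScaleSetName $ScaleSet -VirtualMachineScaleSet $vmss
```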

I leave it as an exercise to the reader to add autoscaling based on the number of builds and releases in the queue. If you do work on this, I would love a pull request!

Managing Costs

Running a VM 24 hours a day, 7 days a week isn’t cheap. The Release script uses Standard DS4v2 VMs for the Agents, which cost around €330 a month (make sure to apply dev/test pricing!). However, I don’t need the VMs to be on all the time. If they only run from 09:00 to 19:00, I’ve already saved roughly half the costs. Skipping the weekends as well cuts the price down to about €117. To help with this, I created a simple script that starts or stops the VMs:



   param (
      [string]$ResourceGroup = $env:AgentPoolResourceGroup,
      [string]$ScaleSet = "ScaleSet",
      [string]$Action   # "Start" or "Stop"
   )

   Set-StrictMode -Version Latest

   # Bail out early if the resource group or the scale set doesn't exist.
   Get-AzureRmResourceGroup -Name $ResourceGroup -ErrorVariable notPresent -ErrorAction SilentlyContinue | Out-Null
   if ($notPresent) {
      "Resource group $ResourceGroup does not exist. Exiting script"
      exit 1
   }

   try {
      Get-AzureRmVmss -ResourceGroupName $ResourceGroup -VMScaleSetName $ScaleSet -ErrorAction Stop | Out-Null
   }
   catch {
      "Scale set $ScaleSet does not exist. Exiting script"
      exit 1
   }

   if ($Action -eq "Start") {
      Start-AzureRmVmss -ResourceGroupName $ResourceGroup -VMScaleSetName $ScaleSet
   }
   elseif ($Action -eq "Stop") {
      Stop-AzureRmVmss -ResourceGroupName $ResourceGroup -VMScaleSetName $ScaleSet -Force
   }
   else {
      Write-Error "Unrecognized action $Action"
   }
I’ve created two release definitions that call the Manage.ps1 script, passing Start or Stop as the action.
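Both definitions boil down to an invocation like this (the resource group name is a placeholder):

```powershell
.\Manage.ps1 -ResourceGroup "agentpool-rg" -ScaleSet "ScaleSet" -Action Start   # Start Agent Pool
.\Manage.ps1 -ResourceGroup "agentpool-rg" -ScaleSet "ScaleSet" -Action Stop    # Stop Agent Pool
```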


The schedule for the Start Agent Pool release definition starts the VMs at 07:00 on weekdays

There is one other cost aspect: private pipelines cost money. For each VSTS account, you get one free private pipeline, plus one for each Visual Studio Enterprise subscriber in your account. If you need additional pipelines, you have to buy them; as you can see in the Marketplace, they cost $15 per pipeline per month.

Running private Agents isn’t cheap. However, as discussed in the first post, sometimes you have to because of specific requirements and extra performance.

What’s next

You now have a fully functional pipeline for running your own agents. You’ve seen how to use Packer to build images and use Visual Studio Team Services to store your scripts and configuration and run a series of builds and releases to manage everything for you.

There is one part left. In the final part I’ll discuss options for modifying the existing Microsoft Packer configuration to add extra software or cache resources you need such as Docker images or NPM packages.

About the Author:

My name is Wouter de Kort. I live in Groningen, the Netherlands with my wife and our rabbits Donald and Katrien. I became interested in computers when my dad came home with an old 286 monochrome laptop when I was 6. After finding my way around MS-DOS, Windows and playing my first game of Solitaire I became interested in software development. My first programming language was Quick Basic and I managed to write a program that helped you practice the multiplication tables. All with ASCII art of course!

Now this is all a couple of years behind me. In the meantime, I’ve learned other things like Visual Basic, C++, JavaScript and C#.  I work for Sogeti here in the Netherlands as a Principal Consultant. I focus on Application Lifecycle Management, Agile and DevOps using products such as Team Foundation Server and Visual Studio Team Services.

I think I’m a fairly good developer. I love to learn new things and share that with others. My focus is on Agile, DevOps and Application Lifecycle Management techniques. I’m one of the Microsoft ALM Rangers, a Microsoft MVP in Visual Studio and Development Technologies, and the author of DevOps on the Microsoft Stack. I also wrote some other books and I try to speak regularly at all kinds of conferences. If you want to get in touch, just for a chat or because I can help you or your company with something, you can contact me through Twitter.


de Kort, W. (2018). Build your own Hosted VSTS Agent Cloud: Part 3 – Automate. The Art Of Coding. [Accessed 13 Feb. 2019].
