Queue a Team Build from another and pass parameters

I have previously blogged about queuing a new Team Build at the successful completion of another Team Build for Team Foundation Server 2010. Since then I’ve had a few people ask how to queue a new Team Build and pass information into it via the build process parameters. Recently I’ve needed to implement this exact behaviour for a client, this time with TFS 2013, which has quite different default build process templates, so I thought I’d share it here.

In my situation I’m building on top of the default TfvcTemplate.12.xaml process but the same approach can be easily applied to the Git build templates too. To begin, I have added two build process parameters to the template:

  1. Chained Build Definition Names – this is an optional array of strings which refer to the list of Build Definitions that should be queued upon successful completion of the current build. All the builds will be queued immediately and will execute as the controller and agents are available. The current build does not wait for the completion of the builds it queues. My simple implementation only supports queuing builds within the same Team Project.
  2. Source BuildUri – this is a single, optional, string which will accept the unique Team Build identifier of the previous build that queued it – this is not intended to be specified by a human but could be. When empty, it is ignored. However, when provided by a preceding build, this URI will be used to retrieve the Build Number and Drop Location of that preceding build and these values, plus the URI, will be made available to the projects and scripts executed within the new build. Following the new Team Build 2013 convention, these values are passed as environment variables named:
  • TF_BUILD_SOURCEBUILDURI
  • TF_BUILD_SOURCEBUILDNUMBER
  • TF_BUILD_SOURCEDROPLOCATION

The assumption is that a build definition based on my “chaining” template will only queue other builds based on the same template, or another template which also accepts a SourceBuildUri parameter. This also means that builds can be chained to any depth, each passing the BuildUri of itself to the next build in the chain.

The projects and scripts can use the TF_BUILD_SOURCEDROPLOCATION variable to access the output of the previous build – naturally, UNC file share drops are easier to consume than drops into TFS itself. TF_BUILD_SOURCEBUILDURI also means that the TFS API can be used to query every aspect of the preceding build, notably including its Information Nodes.

Prior to TFS 2012, queuing a new build from the workflow and passing parameters would have required a custom activity. However, in Team Build 2012 and 2013, Windows Workflow 4.0 is used which includes a new InvokeMethod activity making it possible to add items to the Process Parameters dictionary directly from the XAML.
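As a rough illustration of the idea (the exact XAML is in the Gist linked below; the variable name chainedProcessParameters and the scg namespace prefix for System.Collections.Generic are assumptions for this sketch), adding the SourceBuildUri entry looks something like this:

```xml
<!-- Sketch only: add the current build's URI to the dictionary of process
     parameters that will be passed to the queued build. The variable name
     'chainedProcessParameters' is illustrative. -->
<InvokeMethod MethodName="Add">
  <InvokeMethod.TargetObject>
    <InArgument x:TypeArguments="scg:IDictionary(x:String, x:Object)">[chainedProcessParameters]</InArgument>
  </InvokeMethod.TargetObject>
  <InArgument x:TypeArguments="x:String">SourceBuildUri</InArgument>
  <InArgument x:TypeArguments="x:Object">[BuildDetail.Uri.ToString()]</InArgument>
</InvokeMethod>
```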

The final XAML for the Build Process Template with support for queuing and passing parameters is available as a Gist. If you’d like to be able to integrate the same functionality with your own Team Build 2013 template you can see the four discrete edits I made to the default TfvcTemplate.12.xaml file from TFS 2013 in the Gist revisions.

When a build using this chaining template queues another build it explicitly sets the RequestedFor property to the same value as the current build so that the chain of builds will show in the My Builds view of the user who triggered the first build.

In my current implementation, the SourceBuildUri passed to each queued build is the URI of the immediately preceding build, but in some cases it may be more appropriate to propagate the BuildUri of the original build that triggered the entire chain. This would be a fairly trivial change to the workflow for whoever needs this behaviour instead.

Effectively comparing Team Build Process Templates

I always prefer implementing .NET build customizations through MSBuild and I avoid modifying the Windows Workflow XAML files used by Team Build. However, some customizations are best implemented in the Team Build process, like chaining builds to execute in succession and pass information between them. As a consultant specializing in automated build and deployment I also spend a lot of time understanding Workflow customizations implemented by others.

For me the easiest way to understand the customizations implemented in a particular Team Build XAML file is to use a file differencing tool to compare the current workflow to a previous version of the workflow, or even to compare it to the default Team Build template it was based on. Unfortunately, the Windows Workflow designer in Visual Studio litters the XAML file with a lot of view state, obscuring the intended changes to the build process amongst irrelevant designer-implementation concerns.

To address this problem, I wrote a PowerShell script (available as a GitHub Gist) which removes all the elements and attributes from the XAML file which are known to be unimportant to the process it describes. Conveniently, the XAML file itself lists the set of XML namespace prefixes that can be safely removed in an mc:Ignorable attribute on the root document element.
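The core of the script can be sketched as follows (a simplified sketch only – the real Gist handles more edge cases, and the parameter handling here is illustrative):

```powershell
param([string]$Path)

# Load the document, preserving whitespace so unchanged lines diff cleanly.
$document = (Select-Xml -Path $Path -XPath /).Node
$root = $document.DocumentElement

# The mc:Ignorable attribute lists the namespace prefixes safe to strip.
$mcNamespace = 'http://schemas.openxmlformats.org/markup-compatibility/2006'
$ignorable = $root.GetAttribute('Ignorable', $mcNamespace)

foreach ($prefix in ($ignorable -split '\s+' | Where-Object { $_ })) {
    $namespaceUri = $root.GetNamespaceOfPrefix($prefix)

    # Snapshot the node list before removing, then drop every element and
    # attribute that lives in an ignorable namespace.
    $nodes = @($document.SelectNodes(
        "//*[namespace-uri()='$namespaceUri'] | //@*[namespace-uri()='$namespaceUri']"))
    foreach ($node in $nodes) {
        if ($node.NodeType -eq 'Attribute') {
            $node.OwnerElement.RemoveAttributeNode($node) | Out-Null
        } else {
            $node.ParentNode.RemoveChild($node) | Out-Null
        }
    }
}

$document.Save((Resolve-Path $Path).Path)
```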

Typically I use my XAML cleaning PowerShell script before each check-in to ensure the source control history stays clean but I have also used it on existing XAML files created by others to canonicalize them before opening them in a diff tool.

Using the script is as simple as:

.\Remove-IgnoreableXaml.ps1 -Path YourBuildTemplate.xaml


Or, if you don’t want to overwrite the file in place, specify an alternate destination:

.\Remove-IgnoreableXaml.ps1 -Path YourBuildTemplate.xaml -Destination YourCleanBuildTemplate.xaml


PowerShell Select-Xml versus Get-Content

In PowerShell, one of the most common examples you will see for parsing an XML file into a variable uses the Get-Content cmdlet and the cast operator, like this:

$Document = [xml](Get-Content -Path myfile.xml)

The resulting type of the $Document variable is an instance of System.Xml.XmlDocument. However, there is another approach to get the same, or better, result using the Select-Xml cmdlet:

$Document = ( Select-Xml -Path myfile.xml -XPath / ).Node

Sure, the second variant is slightly longer, but it has an important benefit over the first, and that benefit is not performance related.

In the first example, the file is first read into an array of strings and then cast. The casting operation (implemented by System.Management.Automation.LanguagePrimitives.ConvertToXml) is using an XmlReaderSettings instance with the IgnoreWhitespace property set to true and an XmlDocument instance with the PreserveWhitespace property set to false.

In the second example, the file is read directly into an XmlDocument (implemented by System.Management.Automation.InternalDeserializer.LoadUnsafeXmlDocument) using an XmlReaderSettings instance with the IgnoreWhitespace property set to false and an XmlDocument instance with the PreserveWhitespace property set to true – the opposite values of the first example.
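The difference is easy to observe (the file name here is illustrative – any XML file with meaningful whitespace will do):

```powershell
# Load the same file both ways and compare. The PreserveWhitespace values
# reflect the XmlReaderSettings/XmlDocument combinations described above.
$viaCast   = [xml](Get-Content -Path myfile.xml)
$viaSelect = (Select-Xml -Path myfile.xml -XPath /).Node

$viaCast.PreserveWhitespace    # False - whitespace was discarded on load
$viaSelect.PreserveWhitespace  # True - whitespace was kept on load

# Saving each document back out makes the difference visible in a diff:
$viaCast.Save("$PWD\myfile-cast.xml")
$viaSelect.Save("$PWD\myfile-select.xml")
```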

The Select-Xml approach won’t completely preserve all the original formatting from the source file, but it preserves much more than the Get-Content approach will. I’ve found this extremely useful when bulk-updating version-controlled XML files with a PowerShell script, where I want the resulting file diff to show the intended change rather than be obscured by formatting changes.

You could construct the XmlDocument and XmlReaderSettings directly in PowerShell but not in so few characters. You can also load the System.Xml.Linq assembly and use the XDocument class which appears to give slightly better formatting consistency again but it’s still not perfect and PowerShell doesn’t provide the same quick access to elements and attributes as properties on the object.

Override the TFS Team Build OutDir property in TFS 2013

I’ve blogged twice before about the OutDir MSBuild property set by Team Build and I’ve recently discovered that with the default build process templates included with Team Foundation Server 2013, the passing of the OutDir can be disabled via a simple Team Build process parameter.

The parameter I am referring to is the “Output location”:


This parameter’s default value, “SingleFolder”, gives the traditional Team Build behaviour – the OutDir property will be specified on the MSBuild command-line and, unless you’ve made other changes, all build outputs will be dropped into this single folder.

Another value this parameter accepts is “PerProject” but this name can be slightly misleading. The OutDir property will still be specified on the MSBuild command-line but Team Build will append a subfolder for each project that has been specified in the Build Definition. That is, you may choose to build SolutionA.sln and SolutionB.sln from a single Build Definition and the “PerProject” option will split these into “SolutionA” and “SolutionB” subfolders. It will not output to different subfolders for the projects contained within each solution – for this behaviour you should specify the GenerateProjectSpecificOutputFolder property as an MSBuild argument as I’ve blogged previously.

The value of the “Output location” that you’ve probably been looking for is “AsConfigured”. With this setting, Team Build will not pass the OutDir property to MSBuild at all and your projects will all build to their usual locations, just like they do in Visual Studio – presumably to a \bin\ folder under each project. With this setting, it is then your responsibility to configure a post-build target or script to copy the required files from their default build locations to the Team Build binaries share. For this purpose, Team Build provides a “TF_BUILD_BINARIESDIRECTORY” environment variable specifying the destination path to use. There are also some other environment variables populated by Team Build 2013 documented here.
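For example, a minimal post-build PowerShell script for the “AsConfigured” case might look like the following. This is a sketch only – real projects usually need to filter which files are copied, and everything other than the two TF_BUILD_* environment variables is illustrative:

```powershell
# Sketch of a post-build script for the "AsConfigured" output location.
# TF_BUILD_BINARIESDIRECTORY and TF_BUILD_SOURCESDIRECTORY are populated
# by Team Build 2013; the copy strategy here is deliberately naive.
$destination = $Env:TF_BUILD_BINARIESDIRECTORY
if ($destination) {
    Get-ChildItem -Path $Env:TF_BUILD_SOURCESDIRECTORY -Recurse -Directory -Filter 'bin' |
        ForEach-Object {
            # Copy each project's \bin\ output to a per-project subfolder of the drop
            Copy-Item -Path $_.FullName `
                      -Destination (Join-Path $destination $_.Parent.Name) `
                      -Recurse -Force
        }
}
```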

At the end of the build process, Team Build will then copy the contents of the TF_BUILD_BINARIESDIRECTORY to either the UNC path drop folder, or to storage within the TFS Collection database itself as you’ve chosen via the Staging Location setting on the Build Defaults page.

However, before you rush away to use this new capability, consider that MSBuild (or more accurately, the set of Microsoft.*.targets files used by almost all projects) already contains a great quantity of logic for handling which files to copy to the build drop. For example, Web Application projects will copy the contents of the \bin\ folder and all the other content files (e.g. CSS, JavaScript, and images) whilst excluding C# code files and the project file. Instead of re-implementing this behaviour yourself, leverage what MSBuild already provides and use the existing hook points to adjust the behaviour when you need to alter it for your situation.

If you’re interested, you’ll find that this new “Output location” behaviour is now implemented in the new RunMSBuild workflow activity, specifically within its RunMSBuildInternal private nested activity.

Update NuGet.exe version used by Team Build 2013 Package Restore

Since NuGet 2.7, there is a new approach to Package Restore. In short, it involves executing “nuget.exe restore” before building the solution or project, instead of having each project import the “nuget.targets” file. This new restore workflow solves a number of issues, especially with packages containing MSBuild customizations, but also with parallel builds conflicting when performing the restore in parallel.

Additionally, Team Foundation Server 2013’s Team Build implements this new Package Restore workflow in its default build process templates for both TFVC and Git repositories without any extra effort required. This functionality is implemented courtesy of the new RunMSBuild workflow activity (not to be confused with the original MSBuild workflow activity).

The RunMSBuild activity internally uses another new activity named “NuGetRestore”, which is also conveniently a public type you can use directly in customized build process templates. The NuGetRestore activity simply runs “nuget.exe” via the InvokeProcess activity to perform the real work, so there is no special TFS-only behaviour.

However, by default, the copy of “nuget.exe” that is used for the restore is located in the same folder as the assembly declaring the NuGetRestore activity (Microsoft.TeamFoundation.Build.Activities.dll) typically located in “C:\Program Files\Microsoft Team Foundation Server 12.0\Tools”. The version of this “nuget.exe” that ships with TFS 2013 RTM is version 2.7 but there is a good chance there will regularly be a newer NuGet available than the version shipped with Team Build, and with features you need or want. For example, version 2.8 was recently released and the new Fallback to Local Cache feature would be one handy way to improve build resiliency when the build agent can’t always connect to a NuGet repository.

I’ve done some research and I have found there are basically two options available for using a newer version of NuGet in your Team Builds now:

  1. Remote to each Team Build Agent with local Administrator privileges, and execute “nuget.exe update -self” on the file located in the TFS Tools folder mentioned above, or …
  2. Customize your build process XAML file in two places:
    1. Set the “RestoreNuGetPackages” argument to “false” on the RunMSBuild activity to avoid using the default “nuget.exe”.
    2. Insert the NuGetRestore activity immediately before RunMSBuild, setting the “ToolPath” argument to the location of the desired version of “nuget.exe” to use.
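The second option might look roughly like this in the XAML. This is a sketch only – “ToolPath” is the argument described above, but the other argument name and the paths shown are assumptions you should verify against the NuGetRestore activity in your TFS version:

```xml
<!-- Sketch only: insert before the RunMSBuild activity. The 'File' argument
     name and both paths are illustrative; ToolPath should point at the folder
     containing your preferred nuget.exe. -->
<mtbwa:NuGetRestore File="[Path.Combine(SourcesDirectory, &quot;MySolution.sln&quot;)]"
                    ToolPath="C:\Tools\NuGet" />
```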

With any luck, each future TFS update will ship with the most recent version of NuGet for those builds that can wait.

Visual Studio Solutions, Projects, and shared code

I have been having numerous discussions with a variety of people about shared code in .NET code bases and I decided to blog my thoughts on the topic here – partly to reduce repetition, partly to help me distill the concepts in my own mind.

To clarify, these are my guidelines or rules of thumb. It is where I start when investigating options to improve handling shared code but I will bend these rules when required and I reserve the right to change my mind based on my future experiences.

To begin, there seem to be two basic perspectives on the purpose of a Visual Studio Solution.

  1. A Solution is a container, a boundary. It includes everything required for a software system to be built and tested. The only dependencies external to the Solution are third party dependencies or internal dependencies from a package management system like NuGet.
  2. A Solution is a view, a window. It includes only the necessary items to work with a particular aspect of a software system. Projects within a Solution will routinely be dependent on other Projects not in the Solution. There will often be multiple Solutions that overlap with, or completely encompass, other Solutions.

I subscribe to the first group. I believe this is the model that Visual Studio guides developers toward through its default behaviours and through the challenges that arise when veering away from this model. I believe that a new team member should be able to clone a clean working set of source from version control and build the Solution and have all they need within the IDE. I like that a successful build of the open Solution (mostly) indicates that I haven’t accidentally changed or removed code used elsewhere.

To follow, given a common scenario of two mostly discrete Solutions that currently share a common Project between them, I start asking:

  • Can the Project be moved into a new, third Solution and packaged as a NuGet package? The original Solutions then reference this shared Project by its Package from (private) NuGet Repository. This can lengthen the feedback cycle when debugging, so if this leads to a poor experience because the shared Project is a common source of issues, a better suite of Integration Tests in the third Solution may help. If the shared Project changes often to implement features rather than fix bugs this may not be a good option.
  • Can the two Solutions be combined into one all-inclusive Solution? Would the new Solution then have too many Projects, making the build and/or test experience too slow or resource intensive? If the Project count is too high and code has been separated into Projects simply to enforce layer separation, perhaps some Projects can be consolidated and a tool like NDepend used to enforce separation.
  • Do the two Solutions together represent too large a system? Is the coupling to the shared Project an indication of a design that would benefit from significant refactoring – for example, favouring composition over inheritance.

Finally, what is the value of sharing the common Project? In my experience, increased code reuse is associated with higher coupling. Duplicating the shared code instead may prove beneficial in other stages of the delivery cycle and reduce each Solution’s influence/impact on the other.

I am also reminded of Paul Stovell’s short series of useful articles about Integration. The Shared Database solution is an example where a Data Access Layer Project might be shared between two Solutions but the Messaging approach is an example where the two Solutions could be much more independent.

NuGet Reference Paths for Projects in Multiple Solutions

This year I have been working with a code base that exhibits Visual Studio projects with three characteristics:

  1. The project references a NuGet package.
  2. The project is included in more than one Visual Studio solution.
  3. The solution files are located in different folders.

I’m not sure how common this scenario is. A few different threads on the NuGet CodePlex site suggest at least some other people are wrestling with it. Personally, I endeavour to structure a code base to avoid sharing projects between solutions, but for old, highly coupled code this can be difficult to achieve.

The problem with this scenario is with the relative paths used to resolve the assemblies within the referenced NuGet package when building the project in clean or constrained working folders – such as on a build agent or when someone first clones a repository. When a NuGet package is installed or updated in a project, the paths to the package assemblies are specified relative to the /packages/ folder of the currently open solution. However, when another solution that includes the same project is built, the assembly won’t be resolved because the first solution’s /packages/ folder is not present in the workspace and the NuGet Package Restore workflow has put the assemblies in the second solution’s /packages/ folder.

The existing attempts to solve this issue, and the same way I approached the problem originally, tend to be focused on writing reference paths relative to an MSBuild property like $(SolutionDir) or $(PackageDir) which then allows the path to be resolved correctly at build time. If I understand correctly, this approach has been rejected from becoming part of the official NuGet application because it doesn’t handle the scenario where a project is being built directly, not being built as part of a solution – something I also avoid generally.

Last week I had an idea to tackle the problem dynamically at build time instead of when the reference path is written to the project file. My solution is to introduce (yet another) NuGet package to the affected projects as a development-only dependency. I call it the NuGetReferenceHintPathRewrite package. This package adds an MSBuild targets file to the project which executes just before the standard ResolveAssemblyReferences MSBuild target. When it executes, it looks for references that specify a /packages/ folder as part of their path and replaces the portion of the path up to and including the /packages/ folder with the path to the currently building solution’s packages folder. This rewrite is applied to the MSBuild Items in-process and does not modify the project file on disk.
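The essence of the rewrite can be sketched as a targets file like the one below. This is a drastically simplified sketch of the idea – the published package is more robust, and the target name and regular expressions here are illustrative only:

```xml
<!-- Simplified sketch: runs just before ResolveAssemblyReferences and
     re-roots any HintPath containing a \packages\ folder at the currently
     building solution's packages folder. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="NuGetReferenceHintPathRewrite"
          BeforeTargets="ResolveAssemblyReferences">
    <ItemGroup>
      <!-- Update the HintPath metadata in-process; the project file on disk
           is never touched. -->
      <Reference Condition="$([System.Text.RegularExpressions.Regex]::IsMatch('%(Reference.HintPath)', '[\\/]packages[\\/]'))">
        <HintPath>$([System.Text.RegularExpressions.Regex]::Replace('%(Reference.HintPath)', '^.*[\\/]packages[\\/]', '$(SolutionDir)packages\'))</HintPath>
      </Reference>
    </ItemGroup>
  </Target>
</Project>
```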

The main benefit of this dynamic build-time approach is that I don’t have to worry about new packages being installed or packages being updated (ie re-installed) and the paths in the project file being set to the “wrong” path because someone else forgot to fix it before committing.

You can find the NuGetReferenceHintPathRewrite package on NuGet.org.