Category: PowerShell

Import a Gist into your PowerShell ISE profile

A few hours after posting my last entry about a PowerShell ISE profile script, I realised I had failed to describe, for people new to the ISE environment, how to include the script as part of their ISE profile so it is always available.

Simply explaining how to determine the path of your ISE profile by checking the value of the $Profile variable from within the ISE would be boring, so instead I wrote another script to automatically download the first script from GitHub and add it to your profile.

This new script can simply be copied and pasted into the ISE’s Command Pane (Ctrl+D), and your profile will be opened and updated, ready to be saved. Once saved, you can either restart the ISE or run ". $profile" (without quotes) from the Command Pane.

You will hopefully notice that in this new script my usual attention to neatness and verbosity has been replaced with terseness, because this script is a use-once, throwaway item. I had considered writing it as a single line but quickly regained my senses.
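
For the curious, the essence of the approach is only a few lines. This is just a sketch of the idea with a placeholder Gist URL, not the actual script:

# Sketch only: download a script from a raw Gist URL (placeholder below)
# and append it to the ISE profile, then open the profile for review.
$gistUrl = 'https://gist.github.com/raw/0000000/ISE-profile.ps1' # hypothetical URL
if (-not (Test-Path -Path $profile)) {
    New-Item -Path $profile -ItemType File -Force | Out-Null
}
(New-Object System.Net.WebClient).DownloadString($gistUrl) | Add-Content -Path $profile
$psISE.CurrentPowerShellTab.Files.Add($profile) | Out-Null # open it in the ISE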

Automatic TFS Check Out for PowerShell ISE

I work with a lot of PowerShell scripts; it’s the fun part of my job. I do most of my editing in the PowerShell ISE, primarily because it is the default, but also because it has great syntax highlighting, tab completion, and debug support. Additionally, all of my scripts are in source control: sometimes Mercurial, occasionally Git, but mostly Team Foundation Server (TFS).

The way TFS works, though, is that all files in your local workspace are marked as read-only until you check them out for editing. When working on TFS-managed files in Visual Studio, the check-out happens automatically as soon as you begin typing in the editor. The PowerShell ISE, however, has no idea about TFS and will let you edit to your heart’s content until you eventually try to save your changes and it fails due to the read-only flag.

But I’ve developed a solution…

I’ve written a PowerShell ISE-specific profile script that performs a few simple things:

  1. Checks if you have the TFS client installed (e.g. Team Explorer).
  2. Registers for ISE events on each open file and any files you open later.
  3. When a file is edited, checks it out if it is TFS-managed.

The end result is the same TFS workflow experience from within the PowerShell ISE as Visual Studio provides.
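
For anyone curious about the mechanics, here is a rough sketch of the event-registration half. It assumes the v2 ISE object model raises PropertyChanged on each file when its IsSaved state changes, and that tf.exe lives in Team Explorer 2010’s usual location; the full profile script handles more edge cases:

# Rough sketch, not the full profile script. Assumes the ISE raises
# PropertyChanged on each ISEFile when IsSaved flips, and that tf.exe
# is installed under the VS100COMNTOOLS path with Team Explorer 2010.
function global:Register-IseFileCheckOut ($File) {
    Register-ObjectEvent -InputObject $File -EventName PropertyChanged -Action {
        # IsSaved flips to $false on the first keystroke in the editor.
        if ($EventArgs.PropertyName -eq 'IsSaved' -and -not $Sender.IsSaved) {
            $path = $Sender.FullPath
            if ($path -and (Get-Item -Path $path).IsReadOnly) {
                & (Join-Path $env:VS100COMNTOOLS '..\IDE\tf.exe') checkout $path
            }
        }
    } | Out-Null
}

# Hook the files already open, plus any opened later.
$files = $psISE.CurrentPowerShellTab.Files
$files | ForEach-Object { Register-IseFileCheckOut $_ }
Register-ObjectEvent -InputObject $files -EventName CollectionChanged -Action {
    if ($EventArgs.NewItems) {
        $EventArgs.NewItems | ForEach-Object { Register-IseFileCheckOut $_ }
    }
} | Out-Null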

Create SCVMM Templates with PowerShell

Now that I get to work with a TFS 2010 Lab Management environment most days, I find myself building various virtual machines to replicate the production environments of our clients for testing. With many different clients and projects, the range of virtual machine operating systems expands quickly as a matrix of core OS version, processor architecture, service pack, IE version, and other minor variations. For any particular configuration I’ll also want multiple copies, so naturally I want to make use of System Center Virtual Machine Manager’s VM Template Library.

However, creating a template from a VM using the SCVMM Administrator Console, without destroying the original VM, is death by a thousand clicks:

  1. Shutdown the VM.
  2. Dismount any media in the VM’s virtual DVD drives.
  3. Clone the VM via an eight-page wizard.
  4. Wait for the cloning to complete.
  5. Convert the cloned VM into a template with a six-page wizard.
  6. Wait for the sysprepping to complete.
  7. Restart the original VM.

I tire of such tedium very quickly, so I’ve scripted the above process with PowerShell and the SCVMM Snapin. You can access the Export-VMTemplate script I’ve written on GitHub Gist. If you open a PowerShell window from the toolbar in the SCVMM console, you can execute the script like this:

.\Export-VMTemplate.ps1 -VM 'MyOriginalVm' -TemplateName 'NewTemplateName' -LibraryServer 'MyLibSvr'
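
Under the covers, the script automates the same steps listed above with the SCVMM cmdlets. As a rough illustration only (the cmdlet and parameter names below are my recollection of the SCVMM 2008 R2 snapin and may differ in your environment), the core flow looks something like this:

# Illustrative sketch only; see the full Export-VMTemplate script for the
# real parameter validation, error handling, and job waiting.
Add-PSSnapin Microsoft.SystemCenter.VirtualMachineManager
$vm = Get-VM -Name 'MyOriginalVm'
Shutdown-VM -VM $vm | Out-Null
# Dismount any media so the clone does not hold a reference to an ISO.
Get-VirtualDVDDrive -VM $vm | Set-VirtualDVDDrive -NoMedia | Out-Null
# Clone the VM, then convert the clone (not the original) into a template.
$clone = New-VM -VM $vm -Name ($vm.Name + '-Clone') -VMHost $vm.VMHost -Path $vm.Location
New-Template -VM $clone -Name 'NewTemplateName' -LibraryServer (Get-LibraryServer -ComputerName 'MyLibSvr')
Start-VM -VM $vm | Out-Null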

Hopefully the script itself, or at least some of the concepts within, will be useful to someone else.

Export all report definitions for a Team Project Collection

Team Foundation Server 2010 introduced Team Project Collections for organising Team Projects into groups. Collections also provide a self-contained unit for moving Team Projects between servers and this is well documented and supported.

However, if you’ve ever tried moving a Team Project Collection you’ll find the documentation is a long list of manual steps, and one of the more tedious steps is Saving Reports. This step basically tells you to use the Report Manager web interface to manually save every report for every Team Project in the collection as an .RDL file. A single project based on the MSF for Agile template contains 15 reports across 5 folders, so you can easily spend a while clicking away in your browser.

To alleviate the pain, I’ve written a PowerShell script which accepts two parameters. The first is the URL for the Team Project Collection, and the second is the destination path to save the .RDL files to. The script will query the Team Project Collection for its Report Server URL and list of Team Projects via the TFS API, then it will use the Report Server web services to download the report definitions to the destination, maintaining the folder hierarchy and time stamps. You can access this script, called Export-TfsCollectionReports, on Gist.
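
To give a flavour of the approach, here is a bare-bones sketch of the download half, talking directly to the SSRS ReportingService2005 endpoint. The endpoint URL, folder, and destination below are placeholders; the real script discovers the first two via the TFS API and also preserves the time stamps:

# Bare-bones sketch; the endpoint URL, folder, and destination are placeholders.
$rs = New-WebServiceProxy -Uri 'http://reports/ReportServer/ReportService2005.asmx' -UseDefaultCredential
$reports = $rs.ListChildren('/MyTeamProject', $true) | Where-Object { $_.Type -eq 'Report' }
foreach ($report in $reports) {
    # Mirror the Report Server folder hierarchy under the destination path.
    $file = Join-Path 'C:\Rdl' ($report.Path.TrimStart('/').Replace('/', '\') + '.rdl')
    New-Item -Path (Split-Path -Path $file) -ItemType Directory -Force | Out-Null
    [System.IO.File]::WriteAllBytes($file, $rs.GetReportDefinition($report.Path))
}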

Obviously, when you reach the step to import the report definitions on the new server, you’ll want a similar script to help. Unfortunately, I haven’t written that one yet, but I will post it to my blog when I do. In the meantime you could follow the same concepts used in the export script to write one yourself.

No Web Browser, Need PowerShell

I recently found myself on a Windows Server 2003 machine without a functioning web browser and PowerShell wasn’t installed either.

No problem: I just opened Notepad and started typing:

rem Emit a tiny C# downloader next to this script (%~dpn0 is this script's
rem path minus extension), compile it with the .NET 3.5 C# compiler, run it,
rem then run the downloaded PowerShell v2 installer.
echo class Program { public static void Main() { >"%~dpn0.cs"
echo using (var wc = new System.Net.WebClient()) { >>"%~dpn0.cs"
echo wc.UseDefaultCredentials = true; >>"%~dpn0.cs"
echo wc.DownloadFile(@"http://download.microsoft.com/download/1/1/7/117FB25C-BB2D-41E1-B01E-0FEB0BC72C30/WindowsServer2003-KB968930-x86-ENG.exe", @"%~dpn0.installer.exe");}}} >>"%~dpn0.cs"
"%systemroot%\microsoft.net\framework\v3.5\csc.exe" /out:"%~dpn0.exe" "%~dpn0.cs"
"%~dpn0.exe"
"%~dpn0.installer.exe"

Saved it as a command script and double-clicked it. The PowerShell v2 installer downloaded and ran. Bam!

This would require tweaking for other processor architectures, other operating system versions, or older .NET installations.

Thanks to @tathamoddie for the proxy-friendly fix.

Generic batch file wrapper for PowerShell scripts

Integrating PowerShell scripts into older-style processes that are only designed to call out to executables or batch files (or “command scripts” as they have been known since NT4) can be slightly messy, primarily due to the argument-parsing semantics around double quotes and file paths with spaces. I often end up writing a simple command script to wrap each PowerShell script to simplify this, but it is tedious and repetitive, so I’ve finally decided to create a generic wrapper that works for any PowerShell script.

Simply save the following code as a text file with the same name as your PowerShell script, but with the .ps1 extension replaced by a .cmd extension.

@echo off
setlocal
rem Generate a uniquely named temporary script that stops on the first error
rem and invokes the real .ps1 (same base name as this wrapper) via the call
rem operator, with all arguments passed through.
set tempscript=%temp%\%~n0.%random%.ps1
echo $ErrorActionPreference="Stop" >"%tempscript%"
echo ^& "%~dpn0.ps1" %* >>"%tempscript%"
powershell.exe -command "& \"%tempscript%\""
rem Preserve PowerShell's exit code across the temp-file clean-up.
set errlvl=%ERRORLEVEL%
del "%tempscript%"
exit /b %errlvl%

Now if your script is called “Get-Something.ps1”, you can simply run it like this:

Get-Something "c:\some path\some.file" -SecondArg 3.14

PowerShell for Developers

Several weeks ago Richard Banks (@rbanks54) coerced me into finding time, between moving house and deploying a new website to production for my current client, to present to the Australian Virtual Alt.NET User Group.

I agreed and with an absurdly short amount of preparation presented to a small group online using Microsoft Live Meeting and a shared PowerShell console window. You can watch the presentation on the Oz Alt.NET Community Blog.

The “slides” and example scripts can be downloaded here.

Code Coverage Delta with Team System and PowerShell

My last post documented my first venture into working with Visual Studio’s code coverage output with PowerShell to find classes that need more testing. Since then I’ve taken the idea further to analyse how coverage has changed over a series of builds from TFS.

What resulted is a PowerShell script that takes, as a minimum, the name of a TFS Project and the name of a Build Definition within that project. By default, the script will locate the two most recent successful builds, grab the code coverage data from the test runs of each of those builds, and output a list of classes whose coverage has changed between those builds, citing the change in the number of blocks not covered.
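
The comparison at the heart of the script is simple. Here is a simplified sketch of just that step, reusing the Get-CodeCoverageDataSet script from my last post against two locally saved coverage files (the column names come from the coverage DataSet schema):

# Simplified sketch of the class-level delta only; the real script first
# locates the builds and fetches their coverage files from TFS.
$old = ./Get-CodeCoverageDataSet.ps1 'old\data.coverage'
$new = ./Get-CodeCoverageDataSet.ps1 'new\data.coverage'
$oldBlocks = @{}
$old.Class | ForEach-Object { $oldBlocks[$_.ClassKeyName] = $_.BlocksNotCovered }
$new.Class | Where-Object { $oldBlocks.ContainsKey($_.ClassKeyName) } | ForEach-Object {
    $delta = $_.BlocksNotCovered - $oldBlocks[$_.ClassKeyName]
    if ($delta -ne 0) {
        New-Object PSObject |
            Add-Member NoteProperty ClassName $_.ClassName -PassThru |
            Add-Member NoteProperty DeltaBlocksNotCovered $delta -PassThru
    }
}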

Additional parameters to the script allow partially successful or failed builds to be considered, and also allow coverage change to be analysed over a span of several builds rather than just two consecutive builds.

The primary motivator behind developing this script was to be able to identify more accurately where coverage was lost when a new build has an overall coverage percentage lower than the last. This then helps to locate, among other things, where new code has been added without testing or where existing tests have been deleted or disabled.

A code base with a strong commitment to the Single Responsibility Principle should find class-level granularity sufficient but extending the script to support method-level reporting should be trivial given that the Coverage Analysis DataSet already includes all the required information.
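
For example, given a coverage DataSet like the one returned by the script from my last post, and assuming its Method table exposes MethodName alongside the same block counters, a method-level query is a one-liner:

# Hypothetical method-level equivalent of the class-level query.
$CoverageDS.Method | Sort-Object -Property BlocksNotCovered -Descending | Select-Object -First 25 -Property MethodName, BlocksNotCovered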

The script requires the Team Foundation Server PowerShell Snapin from the TFS October 2008 Power Tools and the Visual Studio Coverage Analysis assemblies presumably available in Visual Studio 2008 Professional or higher. These dependencies only support 32-bit PowerShell so my script unfortunately suffers the same constraint.

Download the script here, and use it something like this:

PS C:\TfsWorkspace\> C:\Scripts\Compare-TfsBuildCoverage.ps1 -project Foo -build FooBuild | sort DeltaBlocksNotCovered

Analyse Code Coverage with PowerShell

Visual Studio 2008 Team System’s built-in Code Coverage is nice, but the standard results window only allows you to drill down through each assembly, then namespace, class, and finally method. You can’t easily find the classes with the most uncovered blocks, something I needed to do the other day.

I found John Cunningham’s blog about “off-road” code coverage and was pleased to see that Microsoft had provided an assembly in Visual Studio that can be used to parse the *.coverage file output by a test run. I followed his example to write a PowerShell script to provide basic access to the data.

You can download my script here.

Then you can use it like this:

$CoverageDS = ./Get-CodeCoverageDataSet.ps1 "data.coverage"
$CoverageDS.Class `
  | Sort-Object -Property BlocksNotCovered -Descending `
  | Select-Object `
    -First 25 `
    -Property `
      BlocksNotCovered, `
      @{
        Name = "Namespace";
        Expression = {
          $CoverageDS.NamespaceTable.FindByNamespaceKeyName($_.NamespaceKeyName).NamespaceName
        }
      }, `
      ClassName

The coverage file is typically found in the TestResults\[TestRunName]\In\[ComputerName]\ folder. You can easily perform queries over methods or lines rather than classes by using the other tables in the returned dataset. You can also use the ConvertTo-Html cmdlet to easily create a report for your team.
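
For instance, a rough one-liner along these lines (using the same $CoverageDS from above; the file name is arbitrary) produces a shareable HTML summary:

# Rough example: dump the least-covered classes to an HTML report.
$CoverageDS.Class |
    Sort-Object -Property BlocksNotCovered -Descending |
    Select-Object -Property ClassName, BlocksCovered, BlocksNotCovered |
    ConvertTo-Html -Title 'Coverage Report' |
    Set-Content -Path 'CoverageReport.html'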

TFPT TreeClean tamed with PowerShell

Update: Philip Kelley from Microsoft, creator of TFPT, has kindly informed me that the July 2008 release of the TFS Power Tools is now available for download. This new version includes enhancements to TFPT TreeClean that allow you to specify which files to include or exclude and as such solves the main problem my TreeClean PowerShell script was created for. The output format of the new TreeClean also renders this script incompatible but the general concepts used by the script may still be useful.


I like the Team Foundation Server 2008 Power Tools; there are some great additions in there. One particular utility, TreeClean, has a great concept but is a little overzealous for my tastes.

The purpose of TreeClean is to find all local files in your workspace folders that do not exist in source control and then allow you to delete all of them. The problem is that it includes *.user files in its find results and the delete option is all or nothing. The list of files can also be rather overwhelming.

Thankfully we can get some more control by piping the results through PowerShell, starting with a simple script like this:

$ProgFiles = $Env:ProgramFiles ;
$ProgFiles32 = (Get-Item "Env:ProgramFiles(x86)" -ErrorAction SilentlyContinue).Value ;
if (-not [String]::IsNullOrEmpty($ProgFiles32)) { $ProgFiles = $ProgFiles32 ; }

$TFPTEXE = Join-Path -Path $ProgFiles `
  -ChildPath "Microsoft Team Foundation Server 2008 Power Tools\TFPT.exe" ;
if (-not (Test-Path -Path $TFPTEXE)) { throw "TFPT.EXE not found." ; }

[string]$Root = Resolve-Path -Path (Get-Location) ;

# Keep only the output lines that are paths under the current directory.
& $TFPTEXE treeclean `
  | Where-Object { $_ -like ($Root + "*") } `
  | Get-Item -Force ;

Once we have this script saved, we can get more information from the results. For example, we can count and group the rogue files by extension:

.\TreeClean.ps1 | group Extension

We can exclude directories:

.\TreeClean.ps1 | ? { -not $_.PSIsContainer }

And finally we can delete everything but *.user files:

.\TreeClean.ps1 | ? { $_.Extension -ne ".user" } | Remove-Item

Now I can clean all the junk from my workspace but keep all my user-level project settings. However, while sorting through the extension-grouped report, looking for files to check in before cleaning, there was a lot of noise from the build outputs. My quick solution:

gci -inc *.sln -rec | % { MSBuild /t:Clean "$($_.FullName)" }

It also has the nice side effect of significantly reducing your workspace folder size if you want to zip it up and send it somewhere.