A common problem when managing application deployments, whether manual or automated, is where to securely store the passwords for the service accounts used by Windows Services, IIS Application Pools, and Scheduled Tasks in each of the environments the applications are deployed to.
With Windows Server 2008 R2, the first step toward solving this problem was introduced in the form of Managed Service Accounts. Unfortunately, they were restricted to a single computer, so you couldn’t use them for load-balanced web applications, for example. It was also a challenge to get them to work for anything other than Windows Services in Server 2008.
Now, with Windows Server 2012, these accounts have matured and become Group Managed Service Accounts or gMSAs. They can be used to run processes on multiple machines and work well with IIS Application Pools and Scheduled Tasks too.
I had an early opportunity to experiment with gMSAs when Server 2012 was still a Release Candidate, and more recently I’ve been fortunate to use them extensively for a multitude of applications on my current client engagement, so I thought I’d share my experience with the benefits and some of the gotchas.
How secure are Group Managed Service Accounts?
At the very least, they are more secure than using a domain user account for a service because they remove the human element, but there is more:
- They are based on a very similar model to Domain-joined Computer Accounts (they in fact inherit from the Computer Account class in the Active Directory schema).
- The passwords are automatically changed every 30 days (by default) and the infrastructure ensures all the necessary computers are updated.
- The passwords are 120 characters long and generated by the Domain Controllers based on a root key, the account SID, and the current time.
- Like a computer account, only the Windows authentication infrastructure can access the password, and it is quite difficult (if not impossible) for a human to access the password.
- They are explicitly denied the Interactive Logon right, so a human couldn’t log on with one even if they could acquire the password.
- The Domain Administrator controls which computers are allowed to run processes using the gMSA’s credentials.
- They are least privilege accounts by default, just like a standard domain user, until they are explicitly granted additional group memberships or privileges.
What do you need to use them?
- You’ll need at least one Windows Server 2012 Domain Controller in the domain where applications will run as gMSAs.
- You’ll need to upgrade the AD schema for Server 2012 too.
- You’ll need to create the root KDS key used to generate the gMSA passwords and wait for it to replicate.
- The computers where the applications will run will need to be Windows Server 2012 or later too. However, you can grant gMSAs access to resources on older OSes because a gMSA appears just like a Computer account in AD.
- Familiarity with PowerShell, because gMSAs are mostly managed via cmdlets and have very little GUI support.
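The KDS root key step above can be sketched like this (a rough sketch only; it needs the AD PowerShell module and Domain Admin rights, and the lab-only shortcut in the comment should never be used in production):

```powershell
# Run on (or against) a Windows Server 2012 Domain Controller.
Import-Module ActiveDirectory

# Create the root KDS key used to derive gMSA passwords.
# In production, allow up to 10 hours for it to replicate to all DCs.
Add-KdsRootKey -EffectiveImmediately

# Single-DC test lab only: backdate the key to skip the replication wait.
# Add-KdsRootKey -EffectiveTime ((Get-Date).AddHours(-10))

# Verify the key exists before creating any gMSAs.
Get-KdsRootKey
```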
How do you use them for a Windows Service?
- Ensure the “PrincipalsAllowedToRetrieveManagedPassword” attribute for the gMSA includes the Computer Account where the service will run. (Be careful with tab completion and the “PrincipalsAllowedToDelegateToAccount” attribute.)
- Use the Install-ADServiceAccount cmdlet to prepare the account on the computer where the service will run.
- Use “sc.exe” or the Services management console to specify the gMSA to use to run the service. Leave the password blank. If you use the ServiceProcessInstaller class in a .NET assembly (typically combined with installutil.exe), it is not possible to specify a gMSA account, but it is possible to hack the private “haveLoginInfo” field to true via reflection to make it work. PowerShell’s New-Service cmdlet also won’t handle gMSAs.
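The three steps above can be sketched in PowerShell as follows (the account name “MySvc”, computer “SERVERA”, and service “MyAppService” are hypothetical; the first command runs with Domain Admin rights, the rest on the target server as a local administrator):

```powershell
Import-Module ActiveDirectory

# Step 1: allow the target computer to retrieve the managed password.
Set-ADServiceAccount -Identity MySvc `
    -PrincipalsAllowedToRetrieveManagedPassword (Get-ADComputer SERVERA)

# Step 2: prepare the account on the computer where the service will run,
# then confirm the computer can actually use it.
Install-ADServiceAccount -Identity MySvc
Test-ADServiceAccount -Identity MySvc

# Step 3: point the service at the gMSA.
# Note the trailing $ on the account name and the blank password.
sc.exe config MyAppService obj= "DOMAIN\MySvc$" password= ""
```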
How do you use them for an IIS Application Pool?
- The same as steps 1 and 2 above for a Windows Service.
- Use the SAM Account Name format (e.g. DOMAIN\Name$) to specify the gMSA using the IIS Manager, appcmd.exe, or the Set-WebConfigurationProperty cmdlet, and leave the password blank. IIS doesn’t appear to accept a gMSA specified via the User Principal Name format (e.g. firstname.lastname@example.org).
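With appcmd.exe, for example, that looks roughly like this (the pool name “MyAppPool” and account are hypothetical; run it on the web server after Install-ADServiceAccount has succeeded there):

```powershell
# SpecificUser tells IIS to run the pool as the supplied account.
# Note the trailing $ on the gMSA name and the blank password.
& "$env:SystemRoot\System32\inetsrv\appcmd.exe" set apppool "MyAppPool" `
    /processModel.identityType:SpecificUser `
    /processModel.userName:"DOMAIN\MySvc$" `
    /processModel.password:""
```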
How do you use them for a Scheduled Task?
- Again, the same as steps 1 and 2 above for a Windows Service.
- Use the New-ScheduledTaskPrincipal cmdlet to specify the gMSA account to use. The Task Scheduler management console and schtasks.exe won’t accept a gMSA account.
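Registering a task with a gMSA principal can be sketched as (task name, executable path, and account are all hypothetical; run on the target server after Install-ADServiceAccount):

```powershell
# Build the task parts: what to run, when to run it, and who runs it.
$action    = New-ScheduledTaskAction -Execute 'C:\Apps\MyJob.exe'
$trigger   = New-ScheduledTaskTrigger -Daily -At 2am

# -LogonType Password tells the scheduler to fetch the managed password
# itself; no password is supplied anywhere.
$principal = New-ScheduledTaskPrincipal -UserId 'DOMAIN\MySvc$' -LogonType Password

Register-ScheduledTask -TaskName 'MyNightlyJob' `
    -Action $action -Trigger $trigger -Principal $principal
```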
What are the gotchas?
- By default, the New-ADServiceAccount cmdlet used to create a gMSA will limit the account name to a maximum of 15 characters and won’t set the User Principal Name (UPN). If you want to use a longer name, use the -SamAccountName parameter to specify the abbreviated name and use the -OtherAttributes parameter to specify the UPN.
- Because gMSAs are explicitly denied the Interactive Logon right, they can’t be used to access other systems which impersonate users interactively. SQL Server Reporting Services is one example where an application running as a gMSA will need to provide alternate credentials to connect.
- Avoiding the need to manage passwords ourselves is a major convenience of Group Managed Service Accounts, but it involves a security trade-off. Once a Domain Administrator adds a computer, for example “SERVERA”, to a gMSA’s PrincipalsAllowedToRetrieveManagedPassword list, all local Administrators of SERVERA can configure any service, app pool, or task to run as that gMSA. If you don’t trust the sysadmins of SERVERA or the other applications on SERVERA, this could provide a mechanism to escalate network privileges.
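The long-name workaround from the first gotcha can be sketched as (every name, the domain, and the computer are hypothetical; the SAM name stays within the 15-character limit while the AD name and UPN are longer):

```powershell
Import-Module ActiveDirectory

# Create a gMSA whose AD name exceeds 15 characters by supplying an
# explicit short SAM account name and setting the UPN separately.
New-ADServiceAccount -Name 'MyLongServiceAccount' `
    -SamAccountName 'MyLongSvcAcct' `
    -DNSHostName 'MyLongServiceAccount.example.org' `
    -OtherAttributes @{ userPrincipalName = 'MyLongServiceAccount@example.org' } `
    -PrincipalsAllowedToRetrieveManagedPassword (Get-ADComputer 'SERVERA')
```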
Given the requirement for new server OSes in each environment, the account name length limitations, and the need to use the command line instead of a GUI to configure everything, are gMSAs worth it? For me, the answer has been a resounding “Yes!” but I have a strong personal distaste for password management, especially in heavily automated Continuous Delivery scenarios. I also prefer to run each application in its own virtual machine wherever feasible.
As I’m a firm believer in Continuous Delivery as a means to deliver better, more relevant software, more often, I am always interested in new options entering the fast-growing market of tooling to help implement automated deployments. So, this Sunday morning I sat down and took a look at Doug’s solution.
OnCheckin.com currently offers several pricing plans, starting from free and increasing with additional feature sets and the number of sites you will deploy. As a hosted solution, OnCheckin is currently focused on automating the deployment of websites. Specifically, it currently works with .NET web applications version controlled in a Git, Subversion, or TFS source repository hosted with a multitude of providers like GitHub or CodePlex, or even with your own on-premises repositories if they are reachable by the OnCheckin servers. As a deployment target, OnCheckin currently supports FTP (secure and otherwise) and Microsoft Web Deploy, technologies that any .NET developer should be familiar with. The OnCheckin model makes it obviously suitable for public-facing websites, but I can’t see any reason why it couldn’t also be used to deploy private sites if the deployment endpoints are secured appropriately.
Curious to see just how easy OnCheckin could be to use, I created a new ASP.NET MVC4 web project, committed it to a new Git repository, and pushed it to my existing GitHub account. I then went to WindowsAzure.com, signed up for a free trial, and added an Azure Website to my subscription. With my source and target ready, I signed up for the free OnCheckin plan, specified the relevant GitHub, Azure, and web project details, and I had an automated deployment solution ready to run less than 30 minutes after I started.
I could have then clicked the Run link to deploy immediately but the service is called “OnCheckin”, so I decided instead to make a small change to my web app code, commit it, and push it to GitHub. By the time I had switched back to the browser tab with my OnCheckin deployment management page open, the service had detected my check-in and started to fetch the source and build the solution.
Unfortunately, the first build failed because I hadn’t correctly specified the folder that the solution was in or the name of my web project to deploy. So I went back to the OnCheckin project settings page, fixed the values, and this time clicked Run to trigger the deployment. While waiting for the deployment to complete, I checked my new website and found that the Azure placeholder had disappeared and been replaced by a friendly OnCheckin offline page, which was a nice touch. About 7 minutes after clicking Run, my website was running and browsable from its URL on Azure. Very slick.
I did notice a few areas for improving the user experience of OnCheckin.com and I’ve provided a list back to Doug who has been very open to accepting feedback. I have an upcoming personal website project in the next few weeks and I’ll definitely be looking at OnCheckin.com to handle the deployments.