Introduction

PowerShell DSC is arguably one of the most powerful configuration managers for Windows operating systems. Even with the rise in popularity of Ansible and SaltStack, PowerShell DSC remains the tool of choice for enforcing desired state on a Windows VM. Ansible itself includes the win_dsc module, which allows Ansible to invoke PowerShell DSC. In this blog post, we will dive deeper into one of PowerShell DSC's most powerful aspects, 3rd Party Resources, and how they interact with Azure Automation.

3rd Party Resources are PowerShell DSC modules created by the community. Any member of the PowerShell community can publish modules, and there are plenty to choose from. Modules are kept in repositories; the best-known (and default) PowerShell repository is the PowerShell Gallery, run by Microsoft. Modules published to the Gallery by the community can be downloaded and installed with the PowerShellGet module.
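For example, finding and installing a module from the PSGallery with PowerShellGet takes just two commands (ActiveDirectoryDsc is used here purely as an illustration):

# Search the PowerShell Gallery for a DSC resource module
Find-Module -Name ActiveDirectoryDsc -Repository PSGallery

# Install it for the current user
Install-Module -Name ActiveDirectoryDsc -Scope CurrentUser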

As developers and infrastructure engineers, we have many reasons to script the services we build. Often, instead of developing behavior from scratch, it is much easier to leverage the work others have done to expedite a task's completion. 3rd Party Modules allow for easily repeatable code that can become production-ready through collaboration.

Often, DSC configurations can become complicated. Engineers may be asked to do many things, from creating an Azure AD domain to configuring OMS solutions associated with a VM, or even integrating with non-native Azure products such as Splunk.

These may all seem very daunting, but don’t fret! Members of the PowerShell community have dealt with these problems and many others, and often you will find third party modules to help do the work for you.

Here is an example of a 3rd Party Resource named ActiveDirectoryDsc, which helps with the promotion, configuration, and management of Active Directory.
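A minimal sketch of what using it looks like, assuming the module is installed and using a placeholder domain name and credential parameter:

configuration NewForest
{
    param
    (
        [Parameter(Mandatory)]
        [pscredential] $DomainCredential   # placeholder credential
    )

    Import-DscResource -ModuleName ActiveDirectoryDsc

    Node localhost
    {
        # The AD DS role must be present before promotion
        WindowsFeature ADDS
        {
            Name   = 'AD-Domain-Services'
            Ensure = 'Present'
        }

        # Promote the node to a domain controller for a new forest
        ADDomain 'contoso.com'
        {
            DomainName                    = 'contoso.com'
            Credential                    = $DomainCredential
            SafemodeAdministratorPassword = $DomainCredential
            DependsOn                     = '[WindowsFeature]ADDS'
        }
    }
}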

Azure Automation is a robust PaaS offering from Azure that provides a cloud-based DSC pull server. Within Azure Automation, it is possible to add both custom modules that you develop and third-party modules available from any hosted source.
⚠ Note that organizations in locked-down environments can maintain their own repository of PowerShell modules that have been vetted by their InfoSec team. You can deploy your own artifact repository using Azure DevOps Artifacts, which allows an internal team to publish its own versions of packages, and you can use those as your URI references.
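If you go that route, the contentLink URI in the ARM example below points at your private feed rather than the public Gallery. For an Azure Artifacts feed exposing the NuGet v2 protocol, it would look roughly like this (the organization and feed names are placeholders, and the exact package path depends on your feed's protocol version):

"uri": "[concat('https://pkgs.dev.azure.com/my-org/_packaging/my-feed/nuget/v2/package/', parameters('name'), '/', parameters('version'))]"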
⚠ There are a few ways to upload modules to Azure Automation natively. You can upload manually through the portal, as shown in this picture:

Uploading modules to the Azure Portal

However, being DevOps Engineers, we want to automate this process as much as possible. One way to do this is via ARM Templates, like the ones we used in the previous module.
Below is an example of how to add a 3rd party module to your Azure Automation Account via ARM Templates:

{
  "name": "[concat(parameters('automationAccountName'), '/', parameters('name'))]",
  "type": "Microsoft.Automation/automationAccounts/modules",
  "apiVersion": "2015-10-31",
  "location": "[resourceGroup().location]",
  "properties": {
    "isGlobal": false,
    "sizeInBytes": 0,
    "contentLink": {
      "uri": "uri to package"
    }
  }
}

If you are deploying from the PowerShell Gallery, your URI would look something like this:

"uri": "[concat('https://www.powershellgallery.com/api/v2/package/', parameters('name'), '/', parameters('version'))]"

Alternatively, you can script the import of modules using the New-AzAutomationModule cmdlet in a PowerShell script.
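A minimal sketch, assuming the Az.Automation module is available and using placeholder resource names and version:

# Import ActiveDirectoryDsc from the PowerShell Gallery into an Automation Account
$moduleName    = 'ActiveDirectoryDsc'
$moduleVersion = '6.2.0'   # placeholder version

New-AzAutomationModule `
    -ResourceGroupName 'my-rg' `
    -AutomationAccountName 'my-automation-account' `
    -Name $moduleName `
    -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$moduleName/$moduleVersion"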

Oddly enough, there is sometimes difficulty determining the correct ContentUri to use in both the ARM template and PowerShell cases. You can find the correct value by navigating to the module's page in the PowerShell Gallery, adding /api/v2 to the URL, and replacing packages (plural) with package (singular).

Add the /api/v2 to a URL
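For example, using ActiveDirectoryDsc as a stand-in:

Gallery page: https://www.powershellgallery.com/packages/ActiveDirectoryDsc/6.2.0
ContentUri:   https://www.powershellgallery.com/api/v2/package/ActiveDirectoryDsc/6.2.0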

Conclusion

3rd Party Modules are a great way for developers to speed up development and productivity. If you are inclined to help in the development of these modules, head over to GitHub and contribute!

Azure Automation provides credential assets for securely passing credentials between the automation account and a Desired State Configuration (DSC). Credentials can be created directly through the portal or through PowerShell (a sketch follows the list below) and easily accessed in the DSC configuration. However, there are a few disadvantages to storing credentials in an automation account vs. storing credentials in a KeyVault:

  • More fine-grained permissions can be set on a KeyVault – for example, custom access policies can be set for different principals.
  • KeyVault secret values can be viewed in the portal (assuming you have access). Passwords in an automation account credential cannot.
  • In most scenarios, a KeyVault is the “single source of truth”, where all secure assets in a tenant are stored. If you need to access credentials in an ARM template and a DSC configuration, they must be in a KeyVault for use in the ARM template.
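For reference, creating a credential asset with PowerShell looks something like this (the resource, account, and credential names are placeholders):

# Build a PSCredential and store it as an automation credential asset
$securePassword = ConvertTo-SecureString 'P@ssw0rd!' -AsPlainText -Force   # demo value only
$credential = New-Object System.Management.Automation.PSCredential ('localadmin', $securePassword)

New-AzAutomationCredential `
    -ResourceGroupName 'my-rg' `
    -AutomationAccountName 'my-automation-account' `
    -Name 'WorkstationUser' `
    -Value $credential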

In this example, we will walk through a very simple workstation configuration that pulls a username and password from a KeyVault, then passes those parameters into a DSC resource to create a new local user on a target VM.

Prerequisites

  • A Key Vault already provisioned
  • An Automation Account already provisioned
  • The Az.Accounts and Az.KeyVault modules imported into the Automation Account

Permissions

The Automation Connection, or more specifically, the service principal of the automation connection, needs to have at least “Get” selected under Secret permissions in the KeyVault access policies.
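Granting that permission with PowerShell might look like this (the vault name and application ID are placeholders; you can also do this through the portal):

# Grant the Run As connection's service principal "Get" on secrets
Set-AzKeyVaultAccessPolicy `
    -VaultName 'my-key-vault' `
    -ServicePrincipalName '<run-as-application-id>' `
    -PermissionsToSecrets Get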

Creating the DSC file

These are the minimum set of parameters necessary to extract a username and password from a KeyVault:

param
(
    [Parameter(Mandatory)]
    [string] $keyVaultName,

    [Parameter(Mandatory)]
    [string] $usernameSecretName,

    [Parameter(Mandatory)]
    [string] $passwordSecretName,

    [Parameter(Mandatory)]
    [string] $automationConnectionName
)

The purpose of the first three parameters should be self-explanatory. The final parameter, automationConnectionName, is used to establish a connection to Azure. Even though this code is executing in the context of an Automation Account, it is not connected to Azure in the same way as if we had connected using Login-AzAccount or Connect-AzAccount. There are special cmdlets available when running in an automation account that we can use to establish a “full” connection:

$automationConnection = Get-AutomationConnection -Name $automationConnectionName

Note that we are calling Get-AutomationConnection, NOT Get-AzAutomationConnection. The latter cmdlet only works after you have already established a connection to Azure. Get-AutomationConnection is one of those special cmdlets available when running in an Automation Account; conversely, it will not work if the DSC is executing outside the context of an Automation Account. For more information on connections in Azure Automation, refer to https://docs.microsoft.com/en-us/azure/automation/automation-connections

Get-AutomationConnection returns an object containing all the necessary properties for us to establish a “full” connection to Azure using the Connect-AzAccount cmdlet:

Connect-AzAccount -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint 

Note that if you aren't running this in the Azure public cloud (for example, in Azure Government or Azure Germany), you'll also need to add an environment switch to point to the correct cloud environment (such as -Environment AzureUSGovernment).
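For example, in Azure Government the connection call becomes:

Connect-AzAccount -Environment AzureUSGovernment -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint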

At this point, we can run the Az cmdlets to extract the secrets from the KeyVault:

$username = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName).SecretValueText

$password = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $passwordSecretName).SecretValue
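One caveat: newer versions of the Az.KeyVault module removed the SecretValueText property. If you are on a newer version, retrieving the plain-text username looks like this instead:

$username = Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName -AsPlainText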

Full Example Configuration

configuration Workstation
{
	param
	(
		[Parameter(Mandatory)]
		[string] $keyVaultName,

		[Parameter(Mandatory)]
		[string] $usernameSecretName,

		[Parameter(Mandatory)]
		[string] $passwordSecretName,

		[Parameter(Mandatory)]
		[string] $automationConnectionName
	)

	Import-DscResource -ModuleName PSDesiredStateConfiguration

	# Establish a "full" Azure connection using the automation connection's
	# service principal (this runs at compile time, not on the target node)
	$automationConnection = Get-AutomationConnection -Name $automationConnectionName
	Connect-AzAccount -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint

	# Pull the username (plain text) and password (SecureString) from the KeyVault
	$username = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName).SecretValueText

	$password = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $passwordSecretName).SecretValue

	# Combine them into the PSCredential that the User resource expects
	$credentials = New-Object System.Management.Automation.PSCredential ($username, $password)

	Node SampleWorkstation
	{
		User NonAdminUser
		{
			UserName = $username
			Password = $credentials
		}
	}
}
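To put the configuration to work, import and compile it in the automation account, supplying the parameters at compile time. A sketch, with placeholder resource and secret names:

# Import the configuration source into the automation account
Import-AzAutomationDscConfiguration `
    -ResourceGroupName 'my-rg' `
    -AutomationAccountName 'my-automation-account' `
    -SourcePath '.\Workstation.ps1' `
    -Published -Force

# Compile it, passing the KeyVault and connection parameters
Start-AzAutomationDscCompilationJob `
    -ResourceGroupName 'my-rg' `
    -AutomationAccountName 'my-automation-account' `
    -ConfigurationName 'Workstation' `
    -Parameters @{
        keyVaultName             = 'my-key-vault'
        usernameSecretName       = 'workstation-username'
        passwordSecretName       = 'workstation-password'
        automationConnectionName = 'AzureRunAsConnection'
    }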

Final Thoughts

Remember the nature of DSC compilation – all variables are resolved at compile time and baked into the resulting MOF file stored in the automation account. The compiled MOF is what is actually downloaded and executed on the target Node/VM. This means that if you change one of the secret values in the KeyVault, the MOF will still contain the old values until you recompile the DSC.

