Introduction

Unfortunately, Azure DevOps does not have a SaaS offering running in Azure Government. The only options are to spin up Azure DevOps Server in your Azure Government tenant or to connect the commercial Azure DevOps PaaS offering (specifically Azure Pipelines) to Azure Government. Your customer may object to the latter approach; the purpose of this post is to give you additional ammunition for making the case that you can securely use commercial Azure DevOps with Azure Government.

Throughout this blog post, the biggest question you should always keep in mind is: where is my code running?

Scenario

Take a simple example: a pipeline that calls a PowerShell script to create a key vault and randomly generate a secret to be used later (such as a password during the creation of a VM):

# Deploy.ps1 - create a resource group and key vault, then store a generated password as a secret
Add-Type -AssemblyName System.Web
$rgName = "AisDwlBlog-rg"
$kvName = "AisDwlBlog-kv"
# Generate a random 16-character password (wrapped in literal quotes for the CLI call)
$pw = '"' + [System.Web.Security.Membership]::GeneratePassword(16, 4) + '"'
az group create --name $rgName --location eastus --output none
az keyvault create --name $kvName --resource-group $rgName --output none
# Grant list/get secret permissions to a specific principal (object ID)
az keyvault set-policy --name $kvName --secret-permissions list get --object-id 56155951-2544-4008-9c9a-e53c8e8a1ab2 --output none
az keyvault secret set --vault-name $kvName --name "VMPassword" --value $pw

The easiest way we can execute this script is to create a pipeline using a Microsoft-hosted agent with an Azure CLI task that calls our deployment script:

pool:
  name: 'AisDwl'

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'DwlAzure'
    scriptType: 'ps'
    scriptLocation: 'scriptPath'
    scriptPath: '.\Deploy.ps1'

When we execute this, note that output from that PowerShell script flows back into the Pipeline:

Deployment Output

To circle back to that question that should still be in your mind… where is my code running? In this case, it is running in a virtual machine or container provided by Microsoft. If you have a customer that requires all interactions with potentially sensitive data to be executed in a more secure environment (such as IL-4 in Azure Government), you are out of luck, as the VM/container for the hosted build agent is not certified at any DoD Impact Level. Thus, we have to look at other options, where our deployment scripts can run in a more secure environment.

I’ll throw a second wrench into things…did you see the bug in my Deploy.ps1 script above? I forgot to add --output none to the last command (setting the password in the key vault). When I run the pipeline, this is shown in the output:

Secret Visible

Not good! In an ideal world, everyone writing these scripts would be properly handling output, but we need to code defensively to handle unintended situations. Also, think about any error messages that might bubble back to the output of the pipeline.
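As a small illustration of that defensive mindset, here are a few standard ways (reusing the variables from the script above) to keep a command's output out of the log entirely; none of these are specific to Azure DevOps:

# Ask the Azure CLI not to emit any output for this command
az keyvault secret set --vault-name $kvName --name "VMPassword" --value $pw --output none

# Or discard whatever the command writes to the success stream
az keyvault secret set --vault-name $kvName --name "VMPassword" --value $pw | Out-Null

# Or redirect every PowerShell output stream (success, error, warning, verbose) away from the log
az keyvault secret set --vault-name $kvName --name "VMPassword" --value $pw *> $null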

Option 1

Azure Pipelines provides the capability to run pipelines on self-hosted agents, which could be a VM or container managed by you or your organization. If you set up this VM in a USGov or DoD region of Azure, your code is running in an IL-4 or IL-5 compliant environment. However, we can't simply spin up a build agent and call it a day. As with the Microsoft-hosted build agent, the default behavior of the pipeline still returns output to Azure DevOps. If there is ever an issue like I just demonstrated, an inadvertent Write-Output or Write-Error, or an unhandled exception containing sensitive information, it will be displayed in the output of the pipeline. We need to prevent that information from flowing back to Azure Pipelines. Fortunately, there is a relatively simple fix: instead of having a task execute your PowerShell scripts directly, create a wrapper/bootstrapper PowerShell script.

The key feature of the bootstrapper is that it executes the actual deployment script as a child process and captures the output from that child process, preventing any output or errors from flowing back into your Pipeline. In this case, I am simply writing output to a file on the build agent, but a more real-world scenario would be to upload that file to a storage account.

# Bootstrapper: run the actual deployment script as a child process and keep its output
# (all streams, including errors) out of the pipeline log by writing it to a local file
try
{
    & "$PSScriptRoot\Deploy.ps1" *>&1 | Out-File "$PSScriptRoot\log.txt" -Append
    Write-Output "Deployment complete"
}
catch
{
    # Signal failure without echoing any potentially sensitive details
    Write-Error "there was an error"
}
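As mentioned above, a more realistic variant uploads that log file to a storage account rather than leaving it on the agent. Here is a minimal sketch using the Az.Storage module; the storage account name, container name, and Azure AD-based authentication are assumptions for illustration, not part of the original pipeline:

# Sketch: copy the captured log to blob storage (placeholder account/container names)
$storageContext = New-AzStorageContext -StorageAccountName "aisdwlblogsa" -UseConnectedAccount
Set-AzStorageBlobContent -File "$PSScriptRoot\log.txt" `
    -Container "deployment-logs" `
    -Blob ("log-{0:yyyyMMdd-HHmmss}.txt" -f (Get-Date)) `
    -Context $storageContext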

The biggest disadvantage of this approach is the additional administrative burden of setting up and maintaining one (or more) VMs/containers to use as self-hosted build agents.

Option 2

If you would prefer to avoid managing infrastructure, another option is to run your deployment scripts in an Azure Automation Account. Your Pipeline (back to running in a Microsoft-hosted agent) starts an Azure Automation Runbook to kick off the deployment. The disadvantage of this approach is that all of your deployment scripts must either be staged to the Automation Account as modules or converted into “child” runbooks to be executed by the “bootstrapper” runbook. Also, keep in mind that the bootstrapper runbook must take the same preventative action of capturing output from any child scripts or runbooks to prevent potentially sensitive information from flowing back to the Pipeline.

Sample code for calling a runbook from the pipeline:

$resourceGroupName = "automation"
$automationAccountName = "dwl-aaa"
$runbookName = "Deployment-Bootstrapper"

# Kick off the bootstrapper runbook
$job = Start-AzAutomationRunbook -AutomationAccountName $automationAccountName -ResourceGroupName $resourceGroupName -Name $runbookName -MaxWaitSeconds 120 -ErrorAction Stop

# Poll the job every 5 seconds until it reaches a terminal state
$doLoop = $true
While ($doLoop) {
    Start-Sleep -s 5
    $job = Get-AzAutomationJob -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName -Id $job.JobId
    $status = $job.Status
    $doLoop = (($status -ne "Completed") -and ($status -ne "Failed") -and ($status -ne "Suspended") -and ($status -ne "Stopped"))
}

# Fail the pipeline task if the runbook job did not complete successfully
if ($status -eq "Failed")
{
    Write-Error "Job Failed"
}

The deployment script code running as an Azure Automation runbook (note that this has been converted to Azure PowerShell, as the Azure CLI isn't supported in an Automation Account runbook):

# Authenticate to Azure using the Automation Account's Run As connection
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Connect-AzAccount -ServicePrincipal -Tenant $Conn.TenantID -ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint

Add-Type -AssemblyName System.Web
$rgName = "AisDwlBlog-rg"
$kvName = "AisDwlBlog-kv"
# Generate a random 16-character password
$pw = [System.Web.Security.Membership]::GeneratePassword(16, 4)

# Create the resource group if it doesn't already exist
$rg = Get-AzResourceGroup -Name $rgName -ErrorAction SilentlyContinue
if ($null -eq $rg)
{
    $rg = New-AzResourceGroup -Name $rgName -Location EastUs
}

# Create the key vault if it doesn't already exist
$kv = Get-AzKeyVault -VaultName $kvName -ResourceGroupName $rgName
if ($null -eq $kv)
{
    $kv = New-AzKeyVault -Name $kvName -ResourceGroupName $rgName -Location EastUs
}

# Allow the Run As service principal to read and write secrets
Set-AzKeyVaultAccessPolicy -VaultName $kvName -PermissionsToSecrets list,get,set -ServicePrincipalName $Conn.ApplicationID

# Store the generated password as a secret
$securePw = ConvertTo-SecureString -String $pw -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName $kvName -Name "VMPassword" -SecretValue $securePw

We are in awe of our team, which has reached another impressive milestone in support of our nation’s mission to modernize its IT footprint. Within the last week, AIS successfully established the first Microsoft Azure Impact Level 6 (IL6) instances and achieved a conditional Authority to Operate (ATO) at IL6.

At the end of 2018, AIS was the first to establish a cloud connection at Impact Level 5, meaning our customers could connect and store any unclassified data on their Azure subscription.

By achieving an ATO for Microsoft Azure at IL6, AIS has enabled classified missions to use the best that Azure’s public cloud has to offer – unparalleled connectivity, high availability, and resiliency. This new capability will bring commercial innovations that meet the security and integrity requirements of classified workloads to DoD’s secured networks. IaaS, SaaS, and PaaS Azure services (e.g., artificial intelligence (AI), machine learning (ML), identity, analytics, security, and high-performance computing) will now be available for many more DoD systems and applications.

Dedicated to Enabling a More Secure Cloud for Government

Microsoft’s dedication to the public sector has been reflected in the continued investments made in Azure Government to enable the full spectrum of State, Local, and Federal agencies as well as all Military branches.

In addition to the recently achieved Provisional Authorization (PA) at the Department of Defense (DoD) IL6, Microsoft announced a third region to enable even higher availability for national security missions, released hundreds of new features and launched over 40 new services.

We Stand Alongside You to Back Your Mission

AIS is a proud Microsoft partner and stands alongside the products, technology investments, and values of the organization. We are honored to help pave the way for our government to increase performance and connectivity, reduce costs, realize actionable data insights, and enable faster mission innovations on the first cloud natively connected to classified networks – Azure Government.


Late last Friday, the news of the Joint Enterprise Defense Infrastructure (JEDI) contract award to Microsoft Azure sent seismic waves through the software industry, government, and commercial IT circles alike.

Even as the dust settles on this contract award, including the inevitable requests for reconsideration and protest, DoD’s objectives from the solicitation are apparent.

DoD’s JEDI Objectives

Public Cloud is the Future DoD IT Backbone

A quick look at the JEDI statement of objectives illustrates the government’s comprehensive enterprise expectations with this procurement:

  • Fix fragmented, largely on-premises computing and storage solutions – This fragmentation makes it impossible to make data-driven decisions at “mission-speed” and negatively impacts outcomes. Moreover, the rising level of cyber-attacks requires a comprehensive, repeatable, verifiable, and measurable security posture.
  • Commercial parity with cloud offerings for all classification levels – A cordoned-off, dedicated government cloud that lags in features is no longer acceptable. At the same time, it is acceptable for unclassified data center locations not to be dedicated exclusively to the government.
  • Globally accessible and highly available, resilient infrastructure – The need for infrastructure that is reliable, durable, and can continue to operate despite catastrophic failure of pieces of infrastructure is crucial. The infrastructure must be capable of supporting geographically dispersed users at all classification levels, including in closed-loop networks.
  • Centralized management and distributed control – Apply security policies; monitor security compliance and service usage across the network; and accredit standardized service configurations.
  • Fortified Security that enables enhanced cyber defenses from the root level – These cyber defenses are enabled through the application layer and down to the data layer with improved capabilities including continuous monitoring, auditing, and automated threat identification.
  • Edge computing and storage capabilities – These capabilities must be able to function totally disconnected, including provisioning IaaS and PaaS services and running containerized applications, data analytics, and processing data locally. These capabilities must also provide for automated bidirectional synchronization of data storage with the cloud environment when a connection is re-established.
  • Advanced data analytics – An environment that securely enables timely, data-driven decision making and supports advanced data analytics capabilities such as machine learning and artificial intelligence.

Key Considerations: Agility and Faster Time to Market

From its inception, with the September 2017 memo announcing the formation of the Cloud Executive Steering Group, through the release of the RFP in July 2018, DoD has been clear: it wanted a single cloud contract. It deemed a multi-cloud approach too slow and costly. The Pentagon’s Chief Management Officer defended the single-cloud approach by suggesting that a multi-cloud contract “could prevent DoD from rapidly delivering new capabilities and improved effectiveness to the warfighter that enterprise-level cloud computing can enable”, resulting in “additional costs and technical complexity on the Department in adopting enterprise-scale cloud technologies under a multiple-award contract. Requiring multiple vendors to provide cloud capabilities to the global tactical edge would require investment from each vendor to scale up their capabilities, adding expense without commensurate increase in capabilities.”

A Single, Unified Cloud Platform Was Required

The JEDI solicitation expected a unified cloud platform that supports a broad set of workloads, with detailed requirements for scale and long-term price projections.

  1. Unclassified webserver with a peak load of 400,000 requests per minute
  2. High volume ERP system – ~30,000 active users
  3. IoT + Tactical Edge – A set of sensors that captures 12 GB of High Definition Audio and Video data per hour
  4. Large data set analysis – 200 GB of storage per day, 4.5 TB of online result data, 4.5 TB of nearline result data, and 72 TB of offline result data
  5. Small form-factor data center – 100 PB of storage with 2000 cores that is deliverable within 30 days of request and able to fit inside a U.S. military cargo aircraft

Massive Validation for the Azure Platform

The fact that the Azure platform is the “last cloud standing” at the end of the long and arduous selection process is massive validation from our perspective.

As other bidders have discovered, much to their chagrin, the capabilities described above are not developed overnight. It’s a testament to Microsoft’s sustained commitment to meeting the wide-ranging requirements of the JEDI solicitation.

Lately, almost every major cloud provider has invested in bringing the latest innovations in compute (GPUs, FPGAs, ASICs), storage (very high IOPS, HPC), and network (VMs with 25 Gbps bandwidth) to their respective platforms. In the end, what I believe differentiates Azure is a long-standing focus on understanding and investing in enterprise IT needs. Here are a few examples:

  • Investments in Azure Stack started in 2010 with the announcement of the Azure Appliance. It took over seven years of learning to finally run Azure completely in an isolated mode. Since then, investments in Data Box Edge and Azure Sphere, along with a commitment to hybrid solutions, have been a key differentiator for Azure.
  • With 54 Azure regions worldwide (available in 140 countries), including dedicated Azure Government regions – US DoD Central, US DoD East, US Gov Arizona, US Gov Iowa, US Gov Texas, US Gov Virginia, US Sec East, US Sec West – the Azure team has placed the highest priority on establishing a global footprint. Additionally, having a common team that builds, manages, and secures Azure’s cloud infrastructure has meant that even the public Azure services carry DoD CC SRG IL 2, FedRAMP Moderate, and FedRAMP High designations.
  • Whether it is embracing Linux or Docker, providing the highest number of contributions to GitHub projects, or open-sourcing the majority of Azure SDKs and services, Microsoft has demonstrated a leading commitment to open source solutions.
  • Decades of investment in Microsoft Research, including the core Microsoft Research Labs and Microsoft Research AI, have meant that Microsoft has the most well-rounded story for advanced data analytics and AI.
  • Documentation and ease of use have been given the highest engineering priority. Case in point: Azure docs were rebuilt entirely on GitHub, which allows for an open feedback mechanism powered by GitHub issues.

As developers, we spend a lot of time developing APIs. Sometimes it’s to expose data that we’ve transformed or to ingest data from other sources. Not coincidentally, more and more companies are jumping into the realm of API Management—Microsoft, Google, MuleSoft, and Kong all have products now that provide this functionality. With this much investment from the big players in the tech industry, API management is obviously a priority. Now, why would anyone want to use an API Management tool?

The answer is simple: It allows you to create an API Gateway that you can load all your APIs into, providing a single source to query and curate. API Management makes life as an admin, a developer, and a consumer easier by providing everything for you in one package.

Azure API Management

What does Azure API Management provide? Azure API Management (APIM) is a cloud-based PaaS offering available in both commercial Azure and Azure Government. APIM provides a one-stop shop for API authority, with the ability to create products, enforce policies, and utilize a robust developer portal.

Not only can API Management integrate seamlessly with your existing Azure infrastructure, but it can also manage APIs that exist on-prem and in other clouds. APIM is also available in both the IL4 and IL5 environments in Azure Government, which allows for extensibility and management for those working in the public sector.

APIM leverages a few key concepts to provide its functionality to you as a developer, including:

  • Products
  • Policies
  • Developer Portal

From providing security to leveraging rate-limiting and abstraction, Azure API Management does it all for API consolidation and governance in Azure. Any API can be ingested, and it gets even easier when APIs follow the OpenAPI format.
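To make that concrete, here is a minimal sketch of ingesting an OpenAPI definition with the Az.ApiManagement PowerShell module; the resource group, service name, API ID, and specification URL below are placeholders for illustration, not values from this post:

# Sketch: import an API from an OpenAPI definition into an existing APIM instance
$apimContext = New-AzApiManagementContext -ResourceGroupName "my-apim-rg" -ServiceName "my-apim-instance"

Import-AzApiManagementApi -Context $apimContext `
    -ApiId "petstore" `
    -SpecificationFormat "OpenApi" `
    -SpecificationUrl "https://example.com/petstore/openapi.json" `
    -Path "petstore"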

What Are Products?

Products are a layer of abstraction provided inside APIM. Products allow you to create subsets of APIs that are already ingested into the solution—allowing you to overlap the use of APIs while restricting the use of individual collections of APIs. This level of compartmentalization allows you to not only separate your APIs into logical buckets but also enforce rules on these products separately, providing one more layer of control.

Product access is very similar to Azure RBAC, with different groups created inside of the APIM instance. These groups are yet another way for APIM admins to encapsulate and protect their APIs, allowing them to add users already associated with the APIM instance into separate subsets. Users can also be members of multiple groups, so admins can make sure the right people have access to the right APIs stored in their APIM instance.
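As a rough sketch of how those pieces fit together (reusing the hypothetical $apimContext from the earlier sketch, with placeholder IDs and titles), a product and a group can be created and linked with the same module:

# Sketch: create a product, attach an API to it, then scope the product to a group
New-AzApiManagementProduct -Context $apimContext -ProductId "internal-apis" `
    -Title "Internal APIs" -SubscriptionRequired $true -State "Published"

Add-AzApiManagementApiToProduct -Context $apimContext -ProductId "internal-apis" -ApiId "petstore"

New-AzApiManagementGroup -Context $apimContext -GroupId "backend-devs" -Name "Back-end Devs"
Add-AzApiManagementProductToGroup -Context $apimContext -GroupId "backend-devs" -ProductId "internal-apis"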

What Are Policies?

Policies are APIM’s way of enforcing certain restrictions and providing a more granular level of control. There is an entire breadth of policies available in APIM, ranging from simply disallowing usage of an API after it has been called five times, to authentication, logging, caching, and transformation of requests or responses from JSON to XML and vice versa. Policies are perhaps the most powerful feature of APIM and provide the control that everyone wants and needs. Policies are written in XML and can be easily edited within the APIM XML editor. Policies can also leverage C# 7 syntax, which brings the power of the .NET Framework to your APIM governance.
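For illustration, here is a minimal sketch of applying a simple inbound rate-limit policy to an API, with the policy XML embedded in a PowerShell here-string; the limits and IDs are placeholders, not a recommended production configuration:

# Sketch: allow 5 calls per subscription every 60 seconds on a single API
$policy = @"
<policies>
    <inbound>
        <base />
        <rate-limit calls="5" renewal-period="60" />
    </inbound>
    <backend><base /></backend>
    <outbound><base /></outbound>
    <on-error><base /></on-error>
</policies>
"@

Set-AzApiManagementPolicy -Context $apimContext -ApiId "petstore" -Policy $policy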

What Is the Developer Portal?

The Azure API Management Developer Portal is an improved version of the Swagger documentation that’s generated when you use the OpenAPI spec. The Developer Portal provides an area for developers to readily see APIs, products, and associated applications. The Portal also provides sample request bodies (no more guessing API request structures!) and responses, along with code samples in many different languages.

Finally, the portal also allows you to try API calls with customized request bodies and headers, so you can see exactly what kind of call you want to make. Along with all that functionality, you can also download your own copy of the OpenAPI spec for your API after it’s been ingested into your instance.
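That download step can also be scripted rather than done through the portal; as a hedged sketch (placeholder API ID and output path, reusing the hypothetical $apimContext), the same module can export an ingested API’s definition:

# Sketch: export an API's OpenAPI definition from APIM to a local file
Export-AzApiManagementApi -Context $apimContext -ApiId "petstore" `
    -SpecificationFormat "OpenApi" -SaveAs "C:\temp\petstore-openapi.yaml"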

Why Should I Use APIM?

Every business should be using some form of API Management. It gives you a level of control that was previously unavailable: deploying an API gateway adds an extra layer of abstraction that allows for much tighter control of your APIs. Once an API has been ingested, APIM provides many additional functionalities.

First, you can match APIs to products, providing a greater level of compartmentalization. Second, you can add different groups to each product, with groups being subsets of users (e.g., back-end devs, billing devs). Third, you automatically generate a robust developer portal, which provides all of the functionality of the Swagger portal but with added features, such as code snippets. Finally, APIM also has complete integration with Application Insights in commercial Azure, providing access to a world-class logging and visualization tool.

Azure API Management brings power to the user, and no API should be left out.

One of the biggest roadblocks to government digital transformation is the lack of effective IT governance. Unresolved concerns including privacy, security and organizational silos that limit data sharing and analysis continue to pose hurdles for agencies.

Last night’s Azure Government Meetup in Washington, D.C. featured a stellar lineup of industry-leading experts who shared insights and strategies on achieving effective IT governance in areas including identity, portfolio and records management.

If you missed it, you can catch the replay here.

The Microsoft Government Tech Summit – a free, technical learning event for IT professionals and developers – is coming to Washington D.C., March 5-6, 2018! This two-day event will be packed with technical content, and this year Microsoft is showcasing Azure Government and Microsoft 365 for US Government.

Our Cloud Application Development Director, Brent Wodicka, is presenting this year on “A PaaS-First Approach to DoD Mission Apps” on March 5th at 1 p.m. He will be co-presenting with Microsoft’s Derek Strausbaugh and showcasing how Azure simplifies and re-imagines legacy mission applications. Registration is now open, and we’re hoping you can join us!

As the expectations of citizens increase, the need for technology innovation in government intensifies. Learn how cloud innovation can help meet the needs of the nation. Whether you’re interested in learning about security approaches or attracting and retaining talent with a more flexible and modern workstyle, Microsoft Government Tech Summit can help you evolve your skills and deepen your expertise to lead your agency through digital transformation.

What to expect:

  • Connect with experts from Microsoft and the community, and learn how to get the most from the cloud. Ask your toughest questions, learn best practices, and share strategies.
  • Choose from a variety of learning opportunities to deepen your cloud expertise, from keynotes and breakout sessions, to hands-on labs and a hackathon.
  • Customize your learning – whether you’re already cloud-savvy or just getting started – Microsoft Government Tech Summit has something for everyone.
  • Discover the latest trends, tools, and product roadmaps at more than 60 sessions covering a range of topics, including over 40 sessions focused on the needs of government agencies.

The cloud is changing expectations – and transforming the way we live and work. Join us at the Microsoft Government Tech Summit and learn how Microsoft’s cloud platform can help you lead your agency through digital transformation – and make the cloud part of your mission success.


Last night’s AzureGov Meetup was a fantastic one! The team challenged the Azure Government DC user community to create and share 12-minute demos to showcase cool tech that can help accelerate your cloud implementation.

We received a terrific response to the challenge and it resulted in a rock-star lineup of speakers, demos and an inside scoop on what’s moving the needle in government technology today. The presenters included:

  • Patrick Curran, Director, Federal Group, Planet Technologies
  • Mark Joscelyne, Head of Technical Operations, Public Sector, Frame
  • Deepak Mallya, Chief Cloud Architect, Cloudwave, Inc.
  • John Osborne, Principal OpenShift Solutions Architect, Red Hat
  • Steve Michelotti, Lead Dev Evangelist, Microsoft Azure Government

You can watch ALL the demos here at the archived livestream. (It’s available for a limited time, so check it out today!)

About this time last month, Microsoft announced that its Azure Government cloud platform received Authority to Operate (ATO) designations from both the U.S. Air Force and the U.S. Immigration and Customs Enforcement (ICE). The Air Force gave Azure Government the Defense Department’s Impact Level 4 ATO, while ICE issued a FedRAMP High ATO.

In other words, that DoD Impact Level 4 ATO confirms that Azure Government complies with the security standards required to host “controlled unclassified data for development, test and production environments within CCE.” Under this ATO, the Air Force has already started to build a cloud infrastructure, along with a shared application platform and hosting environment.

The FedRAMP High ATO authorizes Azure Government to handle ICE’s most sensitive unclassified data, including data that supports the agency’s core functions and protects against loss of life. The agency is currently implementing transformative technologies for homeland security and public safety, and the High ATO designation for Azure will allow them to innovate even faster.

This is great news for both the agencies and for Azure Government. We’ve helped large federal agencies make the move to the cloud using Azure tools, and while these migrations are always quite complex, we’ve streamlined the process down to five crucial steps: compliance, envisioning, onboarding, deployment, and sustainment.

AIS’ five-step DoD Cloud Adoption Framework is built on lessons we’ve learned from countless successful commercial and DoD secure cloud migrations and results in an expedited yet fully compliant process. We’re looking forward to helping many more agencies head to the cloud as a trusted Microsoft (and government) partner.


While cloud is fast becoming the “new normal” for government, agencies are still challenged with the daunting task of IT modernization and developing a cohesive cloud migration strategy. Oftentimes, what’s holding back progress is that there simply isn’t a one-size-fits-all cloud playbook. That, combined with agency culture, hinders many agencies from making the move to cloud.

The November #AzureGov Meetup this week brought in both a packed house and a great lineup of government and industry experts who shared their best practices on critical components for cloud success, including stakeholder engagement, evaluation, planning, implementation, and outcomes, as well as the cultural changes you need to ensure a smooth transition.

We also celebrated the two year anniversary of the #AzureGov Meetup!



We took Rimma Nehme’s excellent demo from BUILD 2017 and recreated it for AzureGov.

In a nutshell, we took the Marvel Universe Social Database and loaded it into Azure Cosmos DB as a graph database. Then we built a simple web page that invoked Gremlin queries against Cosmos DB.

The key theme of this demo is the ease with which you can create a globally distributed database that can support low latency queries against the Marvel Universe graph database. In the context of AzureGov (as shown below), we can seamlessly replicate the data across the three AzureGov regions by clicking on these regions within the Azure portal.
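The same region fan-out can also be scripted rather than clicked. Here is a hedged sketch using the Az.CosmosDB module; the resource group and account names are placeholders, and the region list assumes current Azure Government region names, which may differ from the regions available when this demo was built:

# Sketch: set the ordered region list for the Cosmos DB account (first entry is the write region)
Update-AzCosmosDBAccountRegion -ResourceGroupName "marvel-demo-rg" -Name "marvel-graph" `
    -Location @("USGov Virginia", "USGov Texas", "USGov Arizona")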

Here’s a quick look at the demo: