Azure API Management service has been around since 2014. AIS has been implementing solutions using Azure API Management for almost as long. I am working on a project that recently introduced new requirements that seemed like a good fit for API Management, so we looked into it. As a result, we decided to use Azure API Management and recently deployed it into production.
I want to share some of the considerations that went into that decision and some of the implementation details. This won’t be a comprehensive case for all the reasons you might choose Azure API Management, but it will at the very least describe one use case and provide information to help you make your own judgment.

Architectural Diagram

The application in question is a React-based Single Page Application (SPA) hosted in an Azure App Service with an ASP.NET Core 3.1 Web API backend. The application uses Azure Active Directory (AAD) authentication.

The application previously exposed data to an external system via the same API that is built into the application and called by the SPA from the client browser. The external system uses an AAD service principal with a client ID and secret for authentication. This external system polls the API approximately once per minute. It only reads data and receives JSON results.

We separated this external API into its own Azure App Service and now host it behind Azure API Management.

Considerations for API Management

Several months ago, two other external systems that needed to receive data were proposed. They require the same information that the first external system is receiving. At least one of the new systems wants to receive XML instead of JSON. We are also sharing data with another system, via an Azure Storage account, which we want to convert to calling the same API as the others. This adds up to potentially four external systems accessing this API.

The existing API could have been used, but that would have required all of the external systems to use AAD authentication. Each additional system would be provided an application identity and granted read-only access to the primary application. The API shared with the UI would also have to be modified to support XML results. The UI changes regularly, and the original API is often altered to support it. These changes could be disruptive to the external data feed and would require comprehensive testing of two different use cases in the same test stream.

Decoupling the API provided to the external systems from the API supporting the interactive UI makes sense. This doesn’t, by itself, necessitate the introduction of an API Management service. The primary reason we considered API Management is the uncertainty in what the external systems are capable of. They may support only certain kinds of authentication methods or require unique changes to the data stream (like XML responses). Updating our API code to support this wide variety of use cases adds lines of code, complexity, and more testing requirements, all of which can affect reliability and deployment velocity.

By funneling traffic through an Azure API Management layer, we can support a wider variety of scenarios without touching the backend API code. If we encounter a new requirement, we might define a new API Management API based on the same backend application but apply different policies based on need. The backend remains unchanged and is tested by itself. The external facing API Management API is still tested, but that testing is focused on the specifics of the interface (authentication, response formatting, etc.), not the data retrieval portion.

Once we embarked upon the Azure API Management journey, we also realized we had a couple of additional use cases built into our application. We had geocoding and mapping implemented through client-side calls directly to third-party services. We needed to make some of these calls through a paid subscription that requires an API key that we need to protect. We were considering proxying these through our existing backend API, which would have been fine, but this would have meant more lines of code, more complexity, and more testing of the core application. These calls instead go into the API Management Service and straight out to the third-party service. We get the added benefit that this traffic is now visible to us and logged in Log Analytics, where we can monitor it. We probably wouldn’t have introduced API Management just for this, but since we had decided to implement it, why not use it?

Another benefit of moving to the API Management approach involves the authentication choices offered to external systems. By moving away from AAD authentication, we eliminated the need to account for these users in our internal application’s Role-Based Access Controls (RBAC). An AAD identity that is granted access to the application could allow the external system (or a compromised set of credentials) to access application functionality not intended for use by that external system. Eliminating this class of users from the Roles and Permissions matrix that the primary application must implement removes one more thing to implement and test. We must still be aware of authorization for these external systems, but that is implemented and tested entirely separately from the main application.

Moving to API Management doesn’t wholly eliminate code; it simply moves it out of the backend application code and into API Policy, another form of code. Because of the power of the available API Policies, the number of lines of code to be written was reduced. Another significant benefit of API Management is separating functionality into smaller, simpler, easier-to-test components, which sounds a little like a microservices architecture.

Costs

The additional Basic Tier API Management Service cost is roughly equivalent to the cost of our Standard App Service Plan (generally with two instances in production), roughly doubling our compute costs. This constitutes less than 20% of our overall environment cost after factoring in the other services we are using (App Gateway, Database, Bandwidth, etc.), equivalent to only a couple of hours of labor per month.

Our application did not have requirements that justified the higher-priced API Management performance tiers. For larger workloads, economies of scale and ease of implementation could translate into higher performance and lower implementation costs that justify the additional firepower. The bottom line is, as always, to evaluate all the potential costs and weigh them against the cost of doing it another way.

Architecture

The resulting changes to our architecture involved:

  • Creating a new App Service to host the external API.
  • Adding in the API Management service.
  • Integrating our Application Gateway with the new API.

Network

Our initial network architecture consisted of a Virtual Network with two subnets, a DMZ for the App Gateway, and an SVC subnet for our App Services. A single public IP is used for the App Gateway. CNAME records pointing to the DNS entry for this Public IP are registered with our DNS Provider.

Original Design
Original Architecture

Microsoft Network Security Guidance

Microsoft recommends integrating the API Management service with a Virtual Network (VNET) (See Microsoft’s Azure security baseline for API Management) to keep the backend traffic isolated and to allow control through Network Security Groups (NSGs).

VNET integration requires the Premium tier of the Azure API Management service. The Premium tier starts at $2,795/month (East US Region as of 3/29/2021). The Basic tier is $147/month (East US Region). The Basic Tier has a small cache and throughput starting at 1,000 requests/second at that price compared to 4,000 requests/second with Premium.

Microsoft also recommends placing a Web Application Firewall (WAF) in front of your API Management service. This can be done with or without a VNET integrated API Management service.

App Gateway

We already had an App Gateway with WAF enabled, so we placed our API Management service behind this gateway and forwarded the API traffic to the API Management service. This provides us the full power of the firewall without introducing anything new into the architecture or doubling our App Gateway costs. The throughput-based cost of the App Gateway may go up slightly, depending on traffic, but this is a small percentage of our overall costs.

New Network Architecture

Our needs didn’t require the massive throughput or the high cost of a Premium API Management service, so we chose to go with the Basic tier and mitigate the Network security risks by locking down traffic between our components by other means.

The modified architecture now looks like this:

New Architectural Diagram
New Architecture

IP Restrictions

To lock down our network traffic, we utilize IP restrictions so that the only traffic accepted into our App Services comes from our API Management service. The API Management Service only accepts inbound traffic from our App Gateway, and the external API only accepts requests from known external systems.

App Gateway to API Management

We implemented an ip-filter API policy for the API Management service, supplying the Public IP (PIP) of the App Gateway as the only allowed inbound IP address. The PIP is stored in an API Management Named Value, “gateway-pip”, which is referenced in the policy shown below.

Gateway API
<ip-filter> Policy
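The screenshot above shows the actual policy. As a rough sketch, an ip-filter policy that references the gateway-pip Named Value (Named Values use the double-curly-brace syntax) might look something like this:

<policies>
  <inbound>
    <base />
    <!-- Allow inbound traffic only from the App Gateway's public IP.
         {{gateway-pip}} is the Named Value holding that address. -->
    <ip-filter action="allow">
      <address>{{gateway-pip}}</address>
    </ip-filter>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>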

API Management to Backend API

Our Azure App Services are integrated with the VNET, allowing inbound traffic from the App Gateway (the DMZ subnet) and denying all inbound traffic from the Internet. The API Management Service must also be allowed in. We do this by adding the Public IP of the API Management Service to the allowed IPs for each App Service that API Management needs to connect to.

App Service IP Restriction
App Service IP Restriction

External System Inbound Traffic

We don’t want to open our API to just anyone, so we want to limit inbound traffic to the API used by the external systems to the known inbound IP addresses for those systems. Since we’re sharing a single App Gateway with a single Public IP for our main application, we have to allow incoming HTTPS traffic on port 443 from the entire Internet. Our users are coming in from the Internet, and we don’t know their IP addresses. This prevents us from using Network Security Groups (NSGs), since the same port is used for both kinds of traffic. The only way to restrict traffic with an NSG would be to create a second App Gateway (with firewall) on a different Public IP. Given our aversion to increased costs, we instead implemented an inbound IP restriction for the true client at the API Management layer.

This is implemented with a set-header policy and some code as shown below. The actual client IP is passed from the App Gateway in the X-Forwarded-For header, which we then compare in C# code and combine with other API Policy primitives to decline traffic from non-approved addresses.

See the Microsoft article for how this is done.
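Our exact policy isn’t reproduced here, but a minimal sketch of the general idea might look like the following. The addresses in the allow list are hypothetical placeholders, and a production policy would more likely pull them from a Named Value:

<policies>
  <inbound>
    <base />
    <choose>
      <!-- X-Forwarded-For can contain a comma-separated chain, and the App Gateway
           may append a port; take the first (true client) entry, strip the port,
           and reject anything not in the allow list. Addresses are placeholders. -->
      <when condition="@{
          string forwarded = context.Request.Headers.GetValueOrDefault("X-Forwarded-For", "");
          string client = forwarded.Split(',')[0].Split(':')[0].Trim();
          string[] allowed = new [] { "203.0.113.10", "203.0.113.11" };
          return !allowed.Contains(client);
        }">
        <return-response>
          <set-status code="403" reason="Forbidden" />
        </return-response>
      </when>
    </choose>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>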

Automated Deployment

In the spirit of modern DevSecOps practices, we use Azure DevOps Pipelines for our infrastructure and deployment of our application. These pipelines, templates, and scripts automatically provision and configure all aspects of the API Management feature, including:

  • API Management Service
    – ARM Templates to deploy Service and APIs
  • API Policy Publishing
    – API Policies are kept in separate XML files
    – Policies published via a PowerShell script
  • App Gateway Listener
    – Add listener and routing rules
  • Secret Generation and Storage
    – Generate or retrieve required secrets and store them in a KeyVault
  • Network Security Groups
  • IP Restrictions on App Services

API Management

Our API Management service implementation involves a single API Management instance with several APIs. Each API answers a different path on the primary hostname for the API (/API/service1api, /API/service2api, etc.). One API is for our primary data feed to the external systems, another acts as a proxy for our geocoding service, and another for our mapping provider. Since the mapping and geocoding APIs are called from the client SPA, they must be authenticated with Azure AD, using the same token already acquired to talk to the primary application API. Named Values are configured to assist in our API policy implementations; some of these are secrets that are bound directly to an Azure KeyVault secret.

Frontend Authentication

Authentication of the external system is handled in the API Management layer using built-in API Management Subscription Keys. We can create subscriptions, generate and revoke subscription keys and even provide external system engineers the ability to perform some self-service through the API Management Developer Portal. For now, we are not enabling the Developer Portal and are handling Subscription Key generation and Key dissemination ourselves.

The mapping and geocoding APIs use Azure AD authentication because their traffic is end-user traffic coming from our SPA. Because we’re authenticating against the same Azure AD App Registration as the application API, the same tokens issued for the standard backend API can be used with calls to the API Management API. This made changing the existing calls relatively easy because we didn’t also have to introduce a new token retrieval component.

Subscription Key Authentication

To utilize Subscription Key authentication, you need to create a Subscription, generate and share a Subscription Key, associate the Subscription with either a Product or an API, and then set “Requires Subscription” to true on the API.

Creating a Subscription
New Subscription

Requiring a Subscription

Requiring a Subscription API
Subscription Required

Azure AD Authentication

Authenticating the front-end with Azure AD requires creating an OpenID Connect Provider and then configuring the API to use it. There is also an OAuth option here, but we found the OpenID Connect option simpler and got it working first.

Create OpenID Connect Provider

The critical aspects of the OpenID Connect Provider are the Metadata endpoint URL and Client credentials. Generate a new Client Secret for the chosen Azure AD Application and enter the details here.

OpenID Connect Configuration
OpenID Connect Configuration

Configure API to use OpenID Connect Provider

In the API Settings tab, scroll down to the Security section, select OpenID connect, and then choose the OpenID Connect provider configured in the previous step.
Unselect the “Subscription required” checkbox. It is unlikely you will be using both OpenID Connect and a Subscription Key simultaneously, though you could. In our scenario, the client application is a browser, where we can’t provide or protect a subscription key.

API Open ID
API OpenID Connect Setting

Backend Authentication

One approach sometimes proposed for microservices architectures is not to complicate internal services with authentication built into the services themselves. The idea is that the environment they’re hosted in provides sufficient guard rails to ensure only trusted callers can reach the service. In Kubernetes and other containerized approaches, the default implementation is not Internet-connected. In these scenarios, you must explicitly expose services, and you can apply policies or other restrictions to prevent unwanted inbound traffic. In our case, an Azure App Service is Internet-visible by default (we do not use an App Service Environment, by the way). Unless you modify the configuration, your application is accessible from the Internet, so you have to explicitly integrate it with a VNET and/or deny inbound traffic.

Authentication with API Key

Our goal is to be secure by default. For this reason, we added a simple API Key mechanism to the backend API application such that an accidental misconfiguration (or no configuration at all) results in “Access Denied,” regardless of allowed network traffic. We placed the API key in an Azure KeyVault Secret. The Application and the API Management system identities are granted access to this KeyVault, allowing them to share the same secret. The key is automatically generated and never needs to be seen by humans. Only privileged users can access it, and it doesn’t have to be exposed to developers in Production. The API Management service retrieves and caches the key and passes it to the API.

The same approach to authenticating our backend application applies to passing a subscription key to a third-party service, like the geocoding service we use.

Here’s an example of a policy that retrieves a backend API Key from an API Management Named Value (which is bound to a KeyVault Secret) named “API-key” and passes it to the backend in the API-Key HTTP Request Header:

Key Header API
A policy with Api-Key Header
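The screenshot shows the actual policy; the heart of it is a set-header policy that references the Named Value, roughly like this sketch:

<policies>
  <inbound>
    <base />
    <!-- Pass the shared secret to the backend. {{API-key}} is the Named Value
         bound to the Key Vault secret; API Management resolves and caches it. -->
    <set-header name="API-Key" exists-action="override">
      <value>{{API-key}}</value>
    </set-header>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>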

Authentication with Azure AD

JAVA API
A policy with Managed Identity

It is also possible to authenticate to the backend with Azure AD using an API Management System Assigned Identity. In this example, the backend Java App Service is configured for Azure AD authentication using the Azure AD App Registration shown here.

Application ID URI
Application ID URI
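For reference, a minimal sketch of such a policy is shown below. The resource value is a hypothetical Application ID URI placeholder; substitute the URI from your own App Registration:

<policies>
  <inbound>
    <base />
    <!-- Acquire a token for the backend using the API Management
         system-assigned identity. The resource is the backend App
         Registration's Application ID URI (placeholder shown here). -->
    <authentication-managed-identity resource="api://contoso-backend-api" />
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>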

This Application ID URI is passed into the API Management policy above as the “resource.” The App Service is configured to authenticate with this App Registration by configuring authentication for Azure AD:

Application Service Authentication
Application Service Authentication

The Client ID is set to the Application ID for the AAD Application, and the Issuer URL is configured for the appropriate login URL with the tenant injected.

Azure Active Directory Settings
Azure Active Directory Settings

Response Formatting

Another benefit of the wide range of available API Policy features is the rapid implementation of response formatting. Our new backend API implementation didn’t support XML serialization out of the box, so we couldn’t take advantage of some of the standard ASP.NET Core response formatting without changing code, searching out a third-party XML formatter, or building a custom XML formatter.

The JSON to XML formatting policy built into API Management is one line of code:

<policies>
  <inbound>
    <base />
  </inbound>
  <backend>
      <base />
  </backend>
  <outbound>
      <base />
      <json-to-xml apply="content-type-json" consider-accept-header="true" />
  </outbound>
  <on-error>
      <base />
  </on-error>
</policies>

For JSON that is simply an array of unnamed objects, this produces output like this:

<Document>
    <Array>
        <property1>value</property1>
        <property2>value</property2>
    </Array>
    <Array>
        <property1>value</property1>
        <property2>value</property2>
    </Array>
</Document>

If this output does not meet the requirements, you have the opportunity to modify the result further before it goes out the door. For example, we could supply proper element names by doing a search and replace on the output:

<policies>
  <inbound>
    <base />
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
    <json-to-xml apply="content-type-json" consider-accept-header="true" />
    <choose>
      <when condition="@(context.Request.Headers.GetValueOrDefault("Accept","") == "application/xml")">
          <find-and-replace from="&lt;Document&gt;" to="&lt;contacts&gt;" />
          <find-and-replace from="&lt;/Document&gt;" to="&lt;/contacts&gt;" />
          <find-and-replace from="&lt;Array&gt;" to="&lt;contact&gt;" />
          <find-and-replace from="&lt;/Array&gt;" to="&lt;/contact&gt;" />
      </when>
      <otherwise />
    </choose>
  </outbound>
  <on-error>
      <base />
  </on-error>
</policies>
This produces the following:
<contacts>
    <contact>
        <property1>value</property1>
        <property2>value</property2>
    </contact>
    <contact>
        <property1>value</property1>
        <property2>value</property2>
    </contact>
</contacts>

The policy is applied from the top down. You can add more modifications to the block, including C# code, to continue to modify the results.

If you need more complex transformations, you might need to modify the backend API, but you should consider what API Management policies can do before taking that step.

Conclusion

I hope you found this post informative and helpful in evaluating Azure API Management for your own solutions. I found it easy to use and cost-effective for the solution we implemented. I believe we’ve just scratched the surface of its potential and expect to leverage it more as we build and enhance our system.

From C# Developer to DevOps Engineer

Over the last couple of years, I’ve become a DevOps Engineer after having been primarily a C# developer. Instead of primarily C# and SQL, I was now working almost exclusively with JSON, YAML, and PowerShell. While I was very familiar with Visual Studio 2013/2015/2017 and its excellent support for the .NET work I did over the years, I found the experience for building DevOps solutions to be underwhelming. At the time, the IntelliSense for Azure Resource Manager (ARM) or Terraform templates, GitLab or Azure DevOps pipelines, and PowerShell was either non-existent or incomplete. In addition, Visual Studio was quite the resource hog when I didn’t need all the extras it provides.

Enter Visual Studio (VS) Code

Now, I had downloaded VS Code soon after it was released with the intent to use it at some point, if only to say I had. However, after seeing Visual Studio Code used in some ARM template videos where snippets were used, I decided to try it out. Like most Integrated Development Environments (IDE), VS Code isn’t truly ready to go right after installation. It’s taken me some time to build up my configuration to where I am today, and I’m still learning about new features and extensions that can improve my productivity. I want to share some of my preferences.

I want to point out a couple of things. First, I’ve been working primarily with GitLab Enterprise, Azure DevOps Services, and the Azure US Government Cloud. Some of these extensions are purely focused on those platforms. Second, I use the Visual Studio Code – Insiders release rather than the regular Visual Studio Code version. I have both installed, but I like having the newest stuff as soon as I can. For this post, that shouldn’t be an issue.

Theming

As long as there’s a decent dark color theme, I’m content. The bright/light themes give me headaches over time. VS Code’s default dark theme, Dark+, fits the bill for me.

One of the features I didn’t know I needed before I stumbled across it was icon themes. I used to have the standard, generic folder and file icons, the Minimal theme in VS Code. That made it difficult to differentiate between PowerShell scripts, ARM templates, and other file types at a glance. There are a few included icon themes, but I’m using the VSCode Icons Theme. It’s one of the better options, but I’m contemplating making a custom one, as this one doesn’t have an icon for Terraform variables files (.tfvars), and I’d like a different icon for YAML files. If the included themes aren’t suitable for you, there are several options for both file icon themes and product icon themes in the marketplace.

Figure 1 – VS Code’s Minimal icon theme

Workspaces

Workspaces, as the VS Code documentation describes them, are “the collection of one or more folders that are opened in a VS Code window.” A workspace file is created that contains a list of the folders and any settings for VS Code and extensions. I’ve only recently started using workspaces because I wanted to have settings configured for different projects.

Extensions in Visual Studio Code provide enhancements to improve productivity. Extensions include code snippets, new language support, debuggers, formatters, and more. I have nearly 60 installed (this includes several Microsoft pre-installs). We will focus on a handful that I rely on regularly.

Workspace Code Configuration
Figure 2 – VS Code Workspace configuration. Also shows the choice of Azure Cloud referenced in the Azure Account extension section below.

Azure Account

The Azure Account extension provides login support for other Azure extensions. By itself, it’s not flashy, but there are a few dozen other Azure extensions that can use the logged-on account it provides to reference the Azure resources they target. This extension has a setting, Azure Cloud, that was the main reason I started adopting Workspaces. The default is the commercial version, AzureCloud. I’ve changed it at the user level to AzureUSGovernment, but some of my recent projects use AzureCloud, so I’ve set the workspace setting for those.

Azure Resource Manager (ARM) Tools

This extension will make your ARM template tasks much more manageable! It provides an extensive collection of code snippets for scaffolding out many different Azure resources. Schema support provides template autocompletion and linting-like validation. A template navigation pane makes finding resources in a larger template easy. There is also support for parameter files, linked templates, and more.

HashiCorp Terraform

Terraform is a HashiCorp offering. They’ve provided an extension that supports Terraform files (.tf and .tfvars), including syntax highlighting. While there are only a few snippets included, the autocompletion when defining new blocks (e.g., resource, data) is quite extensive.

Terraform
Figure 3 – Terraform autocompletion

GitLens – Git Supercharged

GitLens is full of features that make tracking changes in code easily accessible. I installed this extension for the “Current Line Blame” feature that shows who changed the current line last, when they changed it, and more. In addition, there are sidebar views for branches, remotes, commits, and file history that I use regularly. There are several other features that I either don’t use or wasn’t even aware of until writing this post. All in all, this is an excellent tool for Git repo users.
GitLens Line Blame

MSBuild Project Tools

I had a recent project that contained a relatively large MSBuild deployment package that needed to be updated to work with the changes made to migrate the application to Azure. I haven’t worked with MSBuild in several years. When I did, I didn’t have all the syntax and keywords committed to memory. This extension provides some essential support, including element completion and syntax highlighting. It did make the project a little easier to modify.

PowerShell Preview

I’ve become a bit of a PowerShell fan. I had been introduced to it when I was working with SharePoint, but since I’ve been doing DevOps work in conjunction with Azure, I’ve started enjoying writing scripts. The less-than-ideal support for PowerShell (at the time, at least) in Visual Studio 20xx was the main reason I gave VS Code a shot. This extension (or the stable PowerShell extension) provides the excellent IntelliSense, code snippets, and syntax highlighting you’d expect. It also has the “Go to Definition” and “Find References” features that I relied on when writing C#. In addition, it incorporates linting/code analysis with PowerShell Script Analyzer, which helps you develop clean code that follows best practices.

PowerShell Preview

PowerShell (stable)

Wrapping Up

I have far more than these extensions installed, but these are the ones I use the most when doing DevOps work. Some of the others either haven’t been used enough yet, aren’t helpful for a DevOps Engineer, or weren’t interesting enough to list for the sake of brevity.

However, I’ve created a Gist on my GitHub that contains the complete list of extensions I have installed if that’s of interest. Visual Studio Code is an amazing tool that, along with the proper configuration and extensions, has increased my productivity as a DevOps Engineer.

Recently I was working on a project for a DoD client looking to move multiple, siloed, on-premises data workloads to the cloud as part of an Azure capabilities proof of concept. The client needed to upload large amounts of data, create Parquet files, perform ad-hoc analysis using Jupyter notebooks, and make data available to query using SQL for a reporting dashboard. A single dataset was approximately a terabyte, with all the data together measured in petabytes.

After completing the analysis, we ended up with many files staged in a data lake. Due to the considerable amount of data we were expecting in the future, we didn’t want to pay to store this data twice, spread over multiple databases. We opted to take advantage of Azure Synapse and PolyBase to directly query Parquet files in the data lake using external tables[i]. We ended up with the following data processing flow:

Azure Synapse

When setting up the Parquet files to be queried as external tables, we found that some of them had many fields (200+), which led to numerous errors and made manual setup very tedious. In addition, due to the nature of the project, numerous tables were often created. To avoid creating the tables manually, we looked for a way to create the external tables automatically but could not find an existing solution. So it was up to me to create a simple solution that would work in our current environment without adding additional resources.

The solution was to leverage Data Factory’s Get Metadata activity[ii] while moving the data into the staging directories. The activity gives us the schema of the Parquet files as a JSON string. I then take this JSON schema and pass it to a stored procedure on the Synapse pool that parses the JSON and inserts the fields into a table I can use later:

WITH Fields (fieldOrder, fieldName, fieldType) AS (
  SELECT
    [key] AS fieldOrder,
    JSON_VALUE([value], 'strict $.name') AS fieldName,
    JSON_VALUE([value], 'strict $.type') AS fieldType
  FROM
    OPENJSON(@schema)
)
INSERT INTO
  tables_to_create(
    tableName,
    fieldOrder,
    fieldName,
    fieldType,
    translatedType,
    executeTime
  )
SELECT
  @tableName,
  fieldOrder,
  fieldName,
  fieldType,
  translatedType = CASE
    WHEN fieldType = 'Single' THEN 'real'
    WHEN fieldType = 'Boolean' THEN 'bit'
    WHEN fieldType = 'Double' THEN 'float'
    WHEN fieldType = 'Int64' THEN 'bigint'
    ELSE NULL
  END,
  @ExecTime
FROM
  Fields

Then we build a SQL command that checks for the existence of the table and creates it if it doesn’t exist:

SET
  @sqlCommand = 'IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N''[dbo].[' + @tableName + ']'') AND type in (N''U''))
  CREATE EXTERNAL TABLE [dbo].[' + @tableName + '] (' 

WHILE((SELECT COUNT(*) FROM tables_to_create WHERE executeTime = @ExecTime) > 0)
BEGIN
  DECLARE @key int
  SELECT
    @key = MIN(fieldOrder)
  FROM
    tables_to_create
  WHERE
    executeTime = @ExecTime
    
  DECLARE @fieldName VARCHAR(50)  
  DECLARE @translatedType VARCHAR(50)

  SELECT
    @fieldName = fieldName,
    @translatedType = translatedType
  FROM
    tables_to_create
  WHERE
    fieldOrder = @key
    AND executeTime = @ExecTime

  SET
    @sqlCommand = @sqlCommand + '
          [' + @fieldName + '] [' + @translatedType + '] NULL'

  DELETE FROM
    tables_to_create
  WHERE
    fieldOrder = @key
    AND executeTime = @ExecTime

  IF((SELECT COUNT(*) FROM tables_to_create WHERE executeTime = @ExecTime) > 0)
    SET
      @sqlCommand = @sqlCommand + ', '
END

SET
  @sqlCommand = @sqlCommand + '
  )
  WITH
  (
    LOCATION = ''/' + @folderPath + ''',
    DATA_SOURCE = DataLakeStaged,
    FILE_FORMAT = StagedParquet
  )'
  
EXEC(@sqlCommand)

This frees the analysts from needing to manually create the external tables and from needing to know how the mapping in the data factory points to the correct location in the data lake. The analysts still need to make sure the name and path conventions we set up for syncing don’t land different schemas in the same folder.

Resources

[i] https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables?tabs=sql-pool#external-tables-in-dedicated-sql-pool-and-serverless-sql-pool

[ii] https://docs.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity#:~:text=Supported%20connectors%20%20%20%20Connector%2FMetadata%20%20,%20%20x%20%205%20more%20rows%20

AIS recently held an internal hackathon for Microsoft Power Platform to expose our team to the platform, its concepts, and its approaches through hands-on experience, and to demonstrate the role Power Platform plays in modernizing legacy applications in the cloud.

As a premier Microsoft partner, we have quickly made Power Platform a core part of our holistic enterprise cloud transformation strategy. Over the last three years, our teams have helped enterprises across Financial Services, Insurance, Health and Life Sciences, and the Government leverage Power Platform for enterprise-grade application modernization.

We’re integrating Power Platform with Azure and Microsoft 365 for powerful legacy modernization capabilities, and throughout the project, we uncovered new lessons learned to share across project teams.

The Power Platform Market

There’s been a surge in demand for Low Code Modernization solutions in the last few years. Many organizations are looking to enable Line of Business (LOB) owners and teams to build productivity and collaboration solutions, offsetting the continued strain on IT teams.

Microsoft answered the call with Power Platform. Since its release, we’ve been working with product teams and industry leaders to develop adoption frameworks, accelerators, and solutions to help organizations get the most out of their Power Platform investments. The demand has grown such that we’re continuing to introduce new ways to build our Power Platform skills. Enter our hackathon team.

“Power platform is really a ‘work less do more,’ as it would even get the integrations done with legacy systems without writing a single line of code. This I have never experienced in my 13 years of experience, just amazing. The perspective of low-code-no-code is completely changed with this experience, it is an enterprise development tool that can connect your teams’ apps to outlook app to data verse to SQL. You name the technology; it has a connector to fit in.”
– Technology Lead at AIS, Kranthi Kiran Gullapalli

Meet the Hackathon Team

There was a lot of initial interest in learning the platform from our India-based team out of the AIS Hyderabad HQ. Over a dozen people committed weeks of their time inside and outside of regular working hours in the pursuit of the Power Platform experience.

Some of the participants included employees from our HUB team. The HUB is a group of employees fully dedicated to helping AIS project teams deliver successful cloud modernization projects. This team consolidates knowledge and experience to provide rapid research and guidance for cloud migration and modernization projects. As app modernization work using Power Platform grows, we wanted to allow individuals to skill up on the platform.

But the HUB team members weren’t the only participants. These groups comprised learners and teachers across account teams. Teams were broken down into several pods with a leader to guide that team’s focus area for the project, including Dataverse data migration and integration, authentication, Power Apps Portals development, and more.

“MPP Hackathon” is where the idea of converting a monolithic application to a Power Apps Portal became a reality. Initially, some ideas flowed on reimagining a typical legacy app to Power Apps Portal, but the hackathon allowed us to experiment & try it. Hackathon helped me & my team to have hands-on exposure to the MPP environment. We worked with data modeling and got to know better on Dataverse (In-house storage in MPP). We took our first of many steps towards App Modernization with this hackathon as we are new to MPP. Microsoft Power Platform is a suite of business services that unifies our data. It provides a powerful way to analyze data, act on newly generated insights, deliver personalization, and automate business processes.”
– Jagrati Modi, Software Developer

A Mission to Innovate, Passion for Learning

AIS strives to provide ongoing learning opportunities for our employees. And we’re lucky to have a team that is so passionate about learning, innovation, and the continual pursuit of new skills.

We’ve developed programs like aisU to nurture a growing community of continued education, knowledge sharing, and technical excellence. We offer various opportunities to gain exposure to new technology and concepts, such as training boot camps, technology deep dives, and level-up sessions presented by consultants based on project experience.

Access to the methodologies, approaches, and lessons learned from these project teams helps employees gain the know-how and resources to consult on and deliver similar projects within their accounts successfully.

These opportunities aim to help AIS improve overall project delivery, presales support, and envisioning capabilities for better customer outcomes.

Up Next: Our Approach

Stay tuned for part two of this series as we dive deeper into the approach and execution of an end-to-end application modernization project leveraging Power Platform, API Integration, Dataverse, Azure DevOps, and more.

Until then, check out some of our open job opportunities. We’re always looking to add passionate technologists to our growing team. Introduce yourself and let’s talk about a future career at AIS.

The landscape for Power BI licensing is complex and evolving. To begin to grasp the structure and detail, let’s break it down. Today I am asking: what are the capabilities of using a free license?

Really, It’s Free

More than one path exists to obtain a free account. For our exploration in this article, I will use two:

  • Individual Free Account
  • Office 365 (Dev Tenant) Member Free Account

Let’s frame this in a story with two fictional characters. Cary Small started a small one-person business during the pandemic and wants to send a graph of her sales data to her small business Meetup group friends. Max Rockatansky is her friend. As an employee for a small company that subscribes to Microsoft 365, he was assigned a free account by the M365 tenant admin.

Fictional Characters

| Free Account User | Source of Free Account | Different Experience in Power BI |
| --- | --- | --- |
| Cary Small | Individual Free Account | Cannot publish to public |
| Max Rockatansky | Assigned account by M365 tenant admin* | Publish to public if admin has allowed it |

*admin name Imperator Furiosa 😀

Let’s See This in Action

Cary created her Individual Free account from a personal domain name email address to try out Power BI, let’s say carys@mysweetstartup.biz. A link to create this type of account is in the Links section at the end of this article. Email addresses from free email companies such as Gmail, Hotmail, or Outlook.com are not accepted. This account type is interesting because it is the free-est of the free; no Office 365 membership is needed.

Create Free Individual Account form

Power BI Desktop

With her new account, the starting point for Cary is to install the Power BI Desktop application. This application is free to download, install and use, even without ever logging in to any account. A link to the download is included at the end of this article. Power BI Desktop is the starting point to generate and shape reports and visualizations.

Install Power BI Application

Here Cary imports her spreadsheet of sales data. She can transform it and create a visualization or report all before logging in. Within an hour, Cary has completed her graph of sales data:

Cary's Sales Chart

Now she wants to share it with her Small Business Meetup friends. The report must first be published to the Power BI Service via Home -> Publish in the menu. Cary will then see a prompt to log in to her Free Account.

To Publish and Share a Login is required

A dialog asks for a destination. The only option for the free account is My Workspace, which is accessible only to the account owner. It does not allow for collaboration by publishing to Workspaces or App Workspaces, as those features require paid subscriptions. Cary selects My Workspace and clicks Next.

Publish workspace with Power BI

Success! Her report was published to My Workspace in the Power BI Service. A dialog appeared with the link to open the Power BI website.
Publishing to PowerBI

Profile and Account Status

Once on the Power BI site, click on your profile in the upper right corner. The dropdown detail will show your status: Free account, with a prompt to purchase a Pro upgrade.

Profile Info shows free account

Sharing the Report

Now that Cary’s report is published, she can explore her options for sharing the chart. Under the Export menu item are options to export to PowerPoint or PDF, or to Analyze in Excel; these can be shared as email file attachments.

Export to file options

For a free account, this functionality is nice, but there is more on offer.

Functionality features

Under the Share menu, she finds three options (Figure 9). The first option will email a link; when she clicks it, a Pro license requirement pops up. The last option is a QR code that requires the device to be logged into Power BI. Neither of those is helpful. But the middle one, Embed report, offers a selection to Publish to the web (public), so she tries it.

Publish to the Public Web

Hitting a Roadblock

We hit another impediment:

Roadblock

An admin setting recently added to Power BI requires an admin to explicitly permit public sharing. As an Individual Free Account holder, Cary has no admin to approach; her account is not in a tenant that she can access. At this point, Cary can share only by exporting a file. She thinks this is pretty good for free! But she would also like to embed the report in her free GitHub Pages website.

Max comes to help

Cary calls her friend Max for help. Fortunately, Max does have the capability to publish a public Power BI report. To do this, he must gain access to Cary’s computer, so they start a Zoom meeting and follow the steps below:

  1. Close the browser window to apps.powerbi.com.
  2. Return to Power BI Desktop – Cary signs out.
  3. Max will sign in.
  4. Publish again, choose My Workspace.
  5. On the Power BI Service website, sign out as Cary and sign in as Max.
  6. Choose Embed -> Publish to Web.
  7. Now they see a better dialog (Figure 12):

Publish sharing is now possible

Click Create Embed Code. A dialog will caution you to be careful! “Public” means completely public, so any sensitive data would be exposed. But in our case, this is all for demonstration, and the data is not real! Let’s go! Click Publish.

Embed in a public website

Success! You now have both an iframe tag and a link that you can email. Cary tested it by emailing the link to her phone and clicking the link. The report opened directly to the chart with no impediment and no login. She then embeds the link on her startup website, opens Max’s LinkedIn page, and writes him a glowing compliment.

Getting started with Power BI

A startup business might not be ready to invest in Premium capacity or Pro licenses. With an Office 365 account, the option to use the free license still offers report sharing. The sharing is limited, and there is no collaboration option, but the business can still convey data visualizations. This is a great place to get started with Power BI.

Links