In a previous article, I wrote about the Key Vault FlexVolume driver for Kubernetes. I demonstrated how to use it to mount an HTTPS certificate from Azure Key Vault onto Kubernetes pods. Since then, the FlexVolume driver has been deprecated in favor of the Container Storage Interface (CSI) secrets store driver and Azure Key Vault provider.

The CSI Standard

The Container Storage Interface (CSI) is the latest evolution of storage plugins for Kubernetes. It is a standard designed to overcome the shortcomings of the FlexVolume plugin. CSI drivers are “out of tree,” meaning they are decoupled from Kubernetes itself, so they can be developed and versioned independently of Kubernetes releases.

The Secrets Store CSI Driver

This driver follows a “driver + provider” model: the Secrets Store CSI driver implements mounting volumes and delivering secrets to pods, while providers implement access to a particular secrets store. Currently, supported providers include:

  • Azure Key Vault
  • HashiCorp Vault
  • Google Secret Manager

Multiple providers can run in the same cluster simultaneously.

Besides mounting secrets into a pod volume, this driver can also optionally sync secrets from the secret store into Kubernetes secrets. This is useful when, instead of terminating TLS at the pod level, you use an ingress controller such as NGINX that requires the HTTPS certificate to be a Kubernetes secret.
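As a sketch of that sync feature, a SecretProviderClass can include a secretObjects section that materializes the mounted content as a Kubernetes TLS secret (the secret and object names below are placeholders; check the driver documentation for the exact fields in your version):

```yaml
# Sketch: sync mounted Key Vault content into a Kubernetes TLS secret
# that an ingress controller such as NGINX can reference.
secretObjects:
  - secretName: ingress-tls        # Kubernetes secret to create (placeholder name)
    type: kubernetes.io/tls
    data:
      - objectName: tls-cert       # must match an object mounted by this class
        key: tls.crt
      - objectName: tls-key
        key: tls.key
```

The Kubernetes secret is created when a pod mounts the volume, so at least one pod must reference the SecretProviderClass for the sync to occur.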

The SecretProviderClass Resource

With the FlexVolume driver for Key Vault, all the Key Vault and secret settings were declared in the YAML defining the volume mount in a deployment.

The Secrets Store CSI Driver uses a custom Kubernetes resource called a SecretProviderClass to define the secret store and secret mount settings. The volume mount definition then refers to the SecretProviderClass by name. This results in much cleaner deployment YAML and decouples the secrets provider configuration from any particular volume mount.

Installing with Helm

Installing the Secrets Store CSI Driver and Azure provider is straightforward with the Helm package manager and the provided Helm charts. This installs the driver as a Kubernetes DaemonSet, making it available on every node so that any pod in the cluster can use it.
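For reference, the install looks roughly like this (the repo URL and chart name below are the ones documented by the Azure provider project at the time of writing; check its README for the current values):

```shell
# Add the Azure provider's chart repo and install the chart, which
# deploys both the Secrets Store CSI driver and the Azure provider.
helm repo add csi-secrets-store-provider-azure \
  https://azure.github.io/secrets-store-csi-driver-provider-azure/charts
helm install csi-secrets-store-provider-azure \
  csi-secrets-store-provider-azure/csi-secrets-store-provider-azure
```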

Mounting a Certificate for HTTPS

In addition to secrets such as passwords and API keys, Azure Key Vault can securely store and deliver private key certificates such as those used for HTTPS. As we demonstrated with the FlexVolume driver for Key Vault, we can mount a certificate into our pods and use it to bootstrap Kestrel in ASP.NET Core for HTTPS.
Let’s look at how we would do the same thing with the Secrets Store CSI Driver and Azure provider. You can see the full working example in the aks-csi-keyvault-certs GitHub repo and find more detailed instructions in the README.

First, we create our SecretProviderClass resource definition:

apiVersion: secrets-store.csi.x-k8s.io/v1alpha1
kind: SecretProviderClass
metadata:
  name: azure-kvname
spec:
  provider: azure
  parameters:
    tenantId: "[*** YOUR KEY VAULT TENANT ID ***]"
    keyvaultName: "[*** YOUR KEY VAULT NAME ***]"
    objects: |
      array:
        - |
          objectName: aks-https
          objectAlias: https.pfx.base64
          objectType: secret        # object types: secret, key or cert
          objectFormat: pfx         # for .NET Core 3.1 we want the PFX format
          objectVersion: ""         # [OPTIONAL] object version, defaults to latest if empty

We name our class azure-kvname, which we will use in our volume definition. In the objects property, we can define one or more secrets to be mounted as files on the volume. In this case, our secret has these properties:

  • objectName – the name of the certificate in Key Vault.
  • objectAlias – used as the file name on the volume.
  • objectType – we use “secret”, which gets us the full private key certificate.
  • objectFormat – even if we have stored the certificate in Key Vault as a PFX, the Azure provider will convert it to PEM format by default. .NET Core 3.1 does not support PEM out of the box, so setting this to “pfx” ensures we get a PFX.

In our Kubernetes Deployment YAML, we then define our volume like so:

      volumes:
      - name: aks-keyvault-aspnetcore-httpscert
        csi:
          driver: secrets-store.csi.k8s.io
          readOnly: true
          volumeAttributes:
            secretProviderClass: "azure-kvname"
          nodePublishSecretRef:
            name: kvcreds

In this demo, we authenticate to Key Vault with an Azure Active Directory service principal whose credentials are stored as a Kubernetes secret. The nodePublishSecretRef option provides the name of the Kubernetes secret containing these credentials.
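Creating that credentials secret can be sketched with kubectl like so (the client ID and secret values are placeholders for your service principal; the secret name kvcreds matches the nodePublishSecretRef above):

```shell
# Store the service principal credentials as a Kubernetes secret
# so the Azure provider can authenticate to Key Vault.
kubectl create secret generic kvcreds \
  --from-literal clientid=<YOUR_SP_CLIENT_ID> \
  --from-literal clientsecret=<YOUR_SP_CLIENT_SECRET>
```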

Then in our deployment YAML, we define the volume mount for our pods:

        volumeMounts:
        - name: aks-keyvault-aspnetcore-httpscert
          mountPath: /certs
          readOnly: true

Given our volume mountPath and the objectAlias in our SecretProviderClass, the certificate will be available in our pods using the path /certs/https.pfx.base64.

Keeping Secrets

The Secrets Store CSI Driver and Azure Key Vault provider for Kubernetes are a great way to deliver secrets to your containerized applications. If you are currently using the FlexVolume driver for Azure Key Vault, you should strongly consider updating to the CSI driver to take advantage of the latest innovations and features it provides.

Kubernetes has become the go-to orchestrator for running containers. At AIS, we are working with clients using Azure Kubernetes Service (AKS), applying our expertise in DevOps. A common concern is delivering HTTPS certificates to containerized web applications; let’s use Azure Key Vault to achieve this.

Azure Key Vault is a cloud service that provides hardware security modules (HSMs) for securely storing and delivering secrets, encryption keys, and certificates. The Key Vault FlexVolume driver for Kubernetes offers a means for mounting certificates as files in our containers; this allows us to automate and manage HTTPS certificates for applications without requiring them to access Azure Key Vault directly.

Let’s look at how we can apply this to Kubernetes services that front containers running ASP.NET Core 3.1 applications. We’ll start with a sample application from Microsoft’s .NET Docker repo. You can find the full solution and step-by-step instructions in this aks-keyvault-certs GitHub repo.

We’ll use PFX encoded certificates in our Azure Key Vault for this demo, as they are readily loadable in .NET Core 3.1 for use in Kestrel hosting.

There are a few important details to note:

  1. You can retrieve a certificate from Azure Key Vault using the certificate, key or secret object types. To get the full private key certificate, you need to use the “secret” object type.
  2. If you import a PFX encoded certificate into Azure Key Vault, getting its secret will return the full PFX file; however, since the API return value is a string and a PFX is a binary file format, the result must be base64 decoded.
  3. If you import a certificate using the text encoded PEM format, it is returned as-is and base64 decoding is unnecessary; however, .NET Core does not currently support loading private-key PEM certificates.
  4. Since we’ve already authenticated to Key Vault, the resulting PFX file mounted in our container no longer requires the PFX password to load.
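The base64 round trip in points 2 and 3 can be sketched as follows (Python for illustration; the byte string stands in for a real PFX, and the file name matches the objectAlias used later):

```python
import base64

# Hypothetical stand-in for the binary PFX bytes held in Key Vault.
pfx_bytes = b"\x30\x82\x01\x00example-binary-pfx"

# The Key Vault secret API returns a string, so the driver writes the
# PFX to the volume as base64-encoded text.
mounted_text = base64.b64encode(pfx_bytes).decode("ascii")
with open("https.pfx.base64", "w") as f:
    f.write(mounted_text)

# At startup, the application reads the text file and base64-decodes it
# to recover the original binary PFX.
with open("https.pfx.base64") as f:
    recovered = base64.b64decode(f.read())

assert recovered == pfx_bytes
```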

Knowing this, if we import a PFX certificate into an Azure Key Vault, we can configure our Kubernetes Deployment YAML to create a volume with a file containing the certificate secret’s contents. For example:

      volumes:
      - name: aks-keyvault-aspnetcore-httpscert
        flexVolume:
          driver: "azure/kv"
          secretRef:
            name: kvcreds
          options:
            # update with your key vault name
            keyvaultname: "YOUR KEY VAULT NAME"
            # update with your AAD tenant
            tenantid: "YOUR KEY VAULT TENANT"
            # The name of the object in Key Vault
            keyvaultobjectnames: "aks-https"
            # Use "secret" to get the full cert
            keyvaultobjecttypes: secret
            # This becomes the file name on the mount.
            keyvaultobjectaliases: "https.pfx.base64"

Then, in our pod specification we can mount the volume and set an environment variable for its path:

        env:
        # Set an environment var to the cert path for
        # the application to use.
        - name: HTTPS_CERTIFICATE_PATH
          value: "/certs/https.pfx.base64"
        volumeMounts:
        # Mount the key vault volume to /certs
        - name: aks-keyvault-aspnetcore-httpscert
          mountPath: /certs
          readOnly: true

Finally, in our ASP.NET Core application, we configure Kestrel to use the mounted certificate file for HTTPS endpoints:

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder
                .UseStartup<Startup>()
                .UseKestrel(options =>
                {
                    // configure Kestrel with our HTTPS certificate
                    options.ConfigureHttpsDefaults(ConfigureHttps);
                });
        });

private static void ConfigureHttps(HttpsConnectionAdapterOptions options)
{
    try
    {
        // When we get the certificate from Key Vault as a secret,
        // it provides the entire PFX file but without the password.
        // Since PFX is a binary format and a secret is a string,
        // it is base64 encoded. So we read in the text file and convert
        // it to the bytes to initialize the X509Certificate2.
        var certPath = Environment.GetEnvironmentVariable("HTTPS_CERTIFICATE_PATH");
        if (!string.IsNullOrEmpty(certPath))
        {
            var certString = System.IO.File.ReadAllText(certPath);
            var certBytes = Convert.FromBase64String(certString);
            var httpsCert = new X509Certificate2(certBytes);

            Console.WriteLine($"HTTPS cert Subject:    {httpsCert.Subject}");
            Console.WriteLine($"HTTPS cert Thumbprint: {httpsCert.Thumbprint}");

            // set the Kestrel HTTPS certificate
            options.ServerCertificate = httpsCert;
        }
    }
    catch (Exception ex)
    {
        Console.Error.WriteLine($"unable to load https cert: {ex}");
    }
}

See the full working solution in the aks-keyvault-certs repo.

With this approach, we can automate managing HTTPS certificates with Azure Key Vault and delivering them to Azure Kubernetes Service pods running ASP.NET Core applications.
