There are times when you must correlate different log sources within a centralized Azure Log Analytics workspace so that you have a single management point for the robust suite of Azure tooling: visualizations (dashboards and workbooks) and action and mitigation capabilities (alerts and automation). If your resources span multiple tenants, Azure Lighthouse is typically used to delegate access to those resources so the logs can be collected.

Azure Lighthouse

However, Azure Lighthouse has its limitations. One that we recently encountered with a customer was the inability to delegate across the Azure Commercial and Azure Government clouds:

 “Delegation of subscriptions across a national cloud and the Azure public cloud, or across two separate national clouds, is not supported.”

Cross-tenant management experiences – Azure Lighthouse | Microsoft Docs

To facilitate log collection, we had to implement a Logic Apps solution to move data from one cloud to another.

Use Case

Application Insights data is hosted in an Azure Commercial tenant and must be transported to a centralized Log Analytics workspace in an Azure Government tenant.

Solution Overview

The solution is to export the logs to an event hub and use a Logic App to pull the data from the event hub into the workspace in the new tenant.

Tenant 1 & 2

Below is a tutorial on how we accomplished this design. Note: in our scenario we are porting Application Insights data to a Log Analytics workspace, but a workspace-to-workspace transfer also works: use the "Data Export" feature available in the Log Analytics workspace blade to send the logs of your choosing to an event hub, then follow the same subsequent steps outlined in this article.

Step One: Create an Event Hub

To pull Application Insights data into a Logic App in another tenant, you will first send the data to an event hub. The event hub must be created in the same region as your log source. In this example, I used the Basic pricing tier.

Event Hub Creation

Once the Azure deployment is finished, you must create an event hub with partition and retention settings. Navigate to your newly created Event Hubs namespace in Azure, and under the Entities section of the blade, create a new event hub. Here I am using the default partition and retention settings.

Basic Event Hub

Once this hub is created, we are ready to send our Application Insights data.
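
If you want to confirm the new hub accepts events before wiring up the rest, a quick test from any machine works. Below is a minimal sketch using the azure-eventhub Python package; the connection string, hub name, and payload are placeholders.

    # Minimal smoke test: send one event to the new hub (pip install azure-eventhub).
    from azure.eventhub import EventData, EventHubProducerClient

    CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>"

    producer = EventHubProducerClient.from_connection_string(CONN_STR, eventhub_name="<hub-name>")
    with producer:
        batch = producer.create_batch()
        batch.add(EventData('{"records": [{"message": "hello from tenant 1"}]}'))
        producer.send_batch(batch)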

Step Two: Send Application Insights Data to the Event Hub

Once the event hub has been deployed, you can start sending Application Insights data to it.

Navigate to your Application Insights resource, and in the blade, go to "Diagnostic settings." Here I will create a diagnostic setting pointing to the event hub. Give the setting a logical name and select the application data you want to send on the left pane. On the right pane, select "Stream to an event hub" and use the Event Hubs namespace we created in the previous step.

App Diagnostics
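
The same diagnostic setting can also be scripted. Here is a hedged sketch using the azure-mgmt-monitor package; the subscription ID, resource IDs, authorization rule, and log category are placeholders you would swap for your own.

    # Sketch: create the diagnostic setting programmatically.
    # pip install azure-identity azure-mgmt-monitor
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.monitor import MonitorManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"
    APP_INSIGHTS_ID = (
        "/subscriptions/<subscription-id>/resourceGroups/<rg>"
        "/providers/microsoft.insights/components/<app-insights-name>"
    )
    EVENT_HUB_RULE_ID = (
        "/subscriptions/<subscription-id>/resourceGroups/<rg>"
        "/providers/Microsoft.EventHub/namespaces/<namespace>"
        "/authorizationRules/RootManageSharedAccessKey"
    )

    client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    client.diagnostic_settings.create_or_update(
        resource_uri=APP_INSIGHTS_ID,
        name="stream-to-eventhub",
        parameters={
            "event_hub_authorization_rule_id": EVENT_HUB_RULE_ID,
            "event_hub_name": "<hub-name>",
            # Placeholder category; pick the categories you enabled in the portal.
            "logs": [{"category": "AppRequests", "enabled": True}],
        },
    )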

Step Three: Create a Logic App in the New Tenant

Now that app data is being exported to Event Hubs, a Logic App can be created in the new tenant to pull in the data and send it to the central Log Analytics workspace. In this example, I used the Consumption tier.

Logic App Creation


Step Four: Program the Logic App to Retrieve the Event Hub Data

Within our Logic App, we want to build out three components in the designer:

  1. An execution trigger; in our example, a recurring timer.
  2. A condition and action: when events are available in the event hub, parse the data in each message.
  3. For each event message, send the data to the Log Analytics workspace.

For the first step, we will add a simple timer with a one-minute recurrence frequency.

Set Timer Recurrence

For the second step, I will add the event hub trigger "When events are available in the event hub." This step requires a connection string, which can be found in the Event Hubs namespace blade under "Shared Access Policies." Selecting a policy object reveals its authentication keys and connection strings; you can use either the primary or secondary key connection string.

Available Events

RootManageSharedAccessKey

Create Event Hub
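
Under the hood, the trigger is simply reading from the hub with that connection string. If you want to inspect exactly what the Logic App will see, a small stand-alone reader helps; this sketch again uses the azure-eventhub package, with placeholder connection details.

    # Sketch: print every message the Logic App trigger would receive.
    from azure.eventhub import EventHubConsumerClient

    CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>"

    consumer = EventHubConsumerClient.from_connection_string(
        CONN_STR, consumer_group="$Default", eventhub_name="<hub-name>"
    )

    def on_event(partition_context, event):
        print(event.body_as_str())                 # the "body" the Logic App parses
        partition_context.update_checkpoint(event)

    with consumer:
        consumer.receive(on_event=on_event, starting_position="-1")  # read from the start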

Next, I will need to parse the event hub data. We will use the Parse JSON action under "Data Operations" for this. The content will be the body of the event hub message.

Event Hub Data Parse JSON

The JSON data schema will depend on what telemetry your application is sending. You can upload a sample payload to generate the schema.
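
For orientation, diagnostic settings wrap each batch of log entries in a top-level records array. A hypothetical message body (the field names here are illustrative, not exhaustive) looks something like this, and the Parse JSON step is what pulls the records array out:

    import json

    # Hypothetical example of one Event Hubs message produced by diagnostic settings.
    sample_body = """
    {
      "records": [
        {
          "time": "2021-01-01T00:00:00Z",
          "resourceId": "/SUBSCRIPTIONS/<sub>/RESOURCEGROUPS/<rg>/PROVIDERS/MICROSOFT.INSIGHTS/COMPONENTS/<app>",
          "category": "AppRequests",
          "properties": {"Name": "GET /", "DurationMs": 12.3}
        }
      ]
    }
    """
    records = json.loads(sample_body)["records"]  # what the Parse JSON step extracts
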
Lastly, I want to send the logs to the Log Analytics workspace in the new tenant. For this, I will set up another data operations step in which we loop through each event hub message and, within those messages, loop through each of the records we previously parsed out, sending them to the workspace with the "Send Data" operation. This last step requires a connection to the workspace; the workspace ID and keys can be found under the "Agents Management" section of the Log Analytics workspace blade. Enter the workspace ID and one of the two keys (either is fine).

Send logs to log analytics
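
The "Send Data" operation is a wrapper around the Log Analytics HTTP Data Collector API, which is worth knowing about if you ever need to push records without the connector. Here is a sketch; the workspace ID, key, and custom log type are placeholders, and note that an Azure Government workspace is reached via the .azure.us endpoint rather than .azure.com.

    # Sketch: post records straight to the Log Analytics Data Collector API.
    import base64, datetime, hashlib, hmac, json
    import requests

    WORKSPACE_ID = "<workspace-id>"
    SHARED_KEY = "<primary-or-secondary-key>"
    LOG_TYPE = "AppInsightsBridge"  # appears in the workspace as AppInsightsBridge_CL

    def post_records(records):
        body = json.dumps(records)
        rfc1123 = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
        string_to_sign = f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123}\n/api/logs"
        signature = base64.b64encode(
            hmac.new(base64.b64decode(SHARED_KEY), string_to_sign.encode("utf-8"), hashlib.sha256).digest()
        ).decode()
        # Azure Government endpoint; commercial workspaces use ods.opinsights.azure.com.
        uri = f"https://{WORKSPACE_ID}.ods.opinsights.azure.us/api/logs?api-version=2016-04-01"
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
            "Log-Type": LOG_TYPE,
            "x-ms-date": rfc1123,
        }
        return requests.post(uri, data=body, headers=headers).status_code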

Once setup is complete, we're ready to save our Logic App and start sending data to the workspace. After a few minutes, you should be able to query the custom log table you configured in the Logic App.

Log Analytics workspace logs query

Conclusion

To conclude, there are ways to get data from Azure Commercial to a Log Analytics workspace within Azure Government. With the help of Event Hubs and Logic Apps, we were able to send data from one tenant to the other and work around the Azure Lighthouse limitation. Hopefully, you will find this helpful when implementing your own solution.

Summary

To break this series into more manageable chunks, this installment focuses on setting up the API in Azure API Management (APIM) and the custom connector. If you missed the first installment, where we set up our On-Premises Data Gateway and Azure Logic App, please check it out here.

Scenario

The company has an on-premises SQL Server database that contains customer data that needs to be available to various apps built on the Power Platform and/or Azure.

Assumptions

  1. You have a sample of the data your Logic App returns. For this write-up, we'll use the data returned from the Logic App we created in the previous lab.

High-Level Steps

  1. Create an API in Azure API Management (APIM) that provides access to the Logic App and can be called from various Power Apps and Power Automate flows
  2. Create a Custom Connector

Azure API Management

In this section, we’ll walk through setting up an API to the Logic App we created in our prior installment.

Why use Azure API Management (APIM)?

Azure APIM provides a single place for managing your company’s APIs. It can be used to selectively expose data and services to employees and partners by applying authentication and even usage limits.

Create the API

  1. Create an Azure API Management service instance in Azure. You can follow the steps to do that here. In the Azure search bar, type API, and select API Management services. Provide the necessary information and click the Create button. Once complete, your API Management service should show as 'on-line'.
    APIM 1
    Azure API Image 2
    Azure API Image 3
  2. Click to open your new APIM service. Select APIs from the blade that opens, and either select the Logic App tile or search the APIs list for Logic App and select it.
    Azure API Image 4
  3. Assuming you created the Logic App in the first installment of this series, select that Logic App from the “Select Logic App to import” blade that opens to the right.
    Azure API Image 5
  4. When the import completes, you should notice an operation that was created automatically, called manual-invoke. Select that operation and click the pencil (edit) button in the Frontend section.
    Azure API 6 Image
  5. Provide a meaningful display name if you'd like. Change the URL to a GET with "/onpremcustomers" as the resource path.
  6. On the Query tab at the bottom, add the CustomerId (integer) query parameter.
    Azure API Image 7
  7. On the Responses tab, select the 200 OK response that should already be there. Add the details for an application/json representation using the sample data output from the Logic App created in the previous exercise. I also provided a definition named "customerresponse".
    Azure API Image 8
    Azure API 11 Image
  8. On the Settings tab, add an API URL suffix, select a Product, and click Save. Make a note of the Product you select; I am using "Unlimited" in my example. Later, when providing an API key for the custom connector, you'll need to remember which product your API is using.
    Azure API 12 Image
  9. Now we're ready to test this out. On the Test tab, enter a value for the CustomerId parameter and click Send. If all goes well, you should get a 200 response back with, hopefully, some data. (A quick way to make the same call from outside the portal is sketched below.)
    Azure API Image 13
    Azure API Image 14
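
For reference, here is a minimal sketch of that same test made from a script with Python's requests package. The host name, URL suffix, and subscription key are placeholders; Ocp-Apim-Subscription-Key is the header APIM checks by default.

    # Sketch: call the new APIM-fronted API directly.
    import requests

    resp = requests.get(
        "https://<apim-name>.azure-api.net/<api-url-suffix>/onpremcustomers",
        params={"CustomerId": 2},
        headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},
    )
    print(resp.status_code, resp.json())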

Power Apps Custom Connector

  1. From the Azure API Management Overview blade, open the Developer portal (legacy). At the time of this writing, the new Developer portal was too buggy to use.
    Connector Image 1
  2. Click on the API we just created.
    Connector 2 Image
  3. Using the API definition button, download (or copy) and save the OpenAPI 2 (JSON) file.
    Connector Image 3
  4. In Power Apps, go to Data > Custom Connectors > New Custom Connector > Import an OpenAPI file. Import the downloaded JSON file from the previous step, provide a name for your connector, and click Continue.
    Connector Image 5
    Connector Image 6
  5. Click through the General and Security tabs and make sure they match what you’re expecting (see screenshots).
    Connector Image 7
    Connector 9 Image
  6. On the Definition tab, if you see a parameter in the Body area, you can delete it. Then click Create connector at the top.
    Connector 11 Image
  7. On the Test tab, you'll first need to create a new Connection. You will need the API key for the APIM subscription associated with the product you chose earlier; in my case, "Unlimited". (Click the ellipses to the right of the subscription and select Show/Hide Keys to see the key and copy it.)
    Connector 11.5 Image
    Connector 12 Image
    Connector 15 Image
    Connector 13 Image
  8. Navigate back to the Custom Connector – Test tab and test out your Custom Connector.
    Connector 14 Image
    Connector 16 Image

Conclusion

I hope this was helpful in demonstrating just how quickly you can better secure your Azure Logic Apps with Azure API Management, and how quickly you can spin up a custom connector that can then be distributed and used by other apps and flows across your organization to connect to the Logic App through the new API.

This blog is part of a series of hands-on labs on leveraging the Power Platform and Microsoft Azure. To break this series into more manageable chunks, this installment focuses on setting up the On-Premises Data Gateway that will connect to our local SQL Server, and the Azure Logic App that will use the new gateway to connect to our local SQL Server 2017 and return customer data using an HTTP request and response.

We’ll wrap it up by using Postman to test our Logic App. If you don’t have Postman installed, you can download it here.

Stay tuned for the next lab in this series, Azure API Management, and Custom Connectors.

Scenario

The company has an on-premises SQL Server database that contains customer data that needs to be available to various apps built on the Common Data Service.

High-Level Steps

  1. Install and configure an On-Premises Data Gateway to connect to the local SQL Server
  2. Create an Azure Logic App to connect to the SQL Server through the gateway

On-Premises Data Gateway

As you can imagine, there are already plenty of good write-ups on installing and configuring an On-Premises Data Gateway, including an official version from Microsoft. It is very straightforward, but I'm here to offer you a couple of pro tips.

Pro Tip #1: Pay attention to the account you use for installing the gateway. This account needs to belong to the same Azure tenant as the account you’ll use for building your Azure Logic Apps that will use this gateway.

Pro Tip #2: Pay attention to the region that is automatically selected when installing the gateway. For this example, the gateway needs to be installed in the same region as your Azure Logic App.

Azure Logic App

Now I'm going to step you through building an Azure Logic App that can be executed using a simple HTTP request. The Logic App will accept a single parameter (CustomerId) or no parameter at all. If no id is provided, the Logic App will return all customer rows; otherwise, it will filter on the passed-in CustomerId. The sketch below summarizes the behavior we're about to build.
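
Here is that intended behavior expressed as a small Python sketch; the sample rows and names are hypothetical, while powerappscontactid is the column used later in the Filter Query step.

    # Sketch of the branching the Logic App implements.
    CUSTOMERS = [  # hypothetical sample rows standing in for the SQL table
        {"powerappscontactid": 1, "name": "Contoso"},
        {"powerappscontactid": 2, "name": "Fabrikam"},
    ]

    def get_customers(customer_id=None):
        if customer_id is None or customer_id == 0:  # the "Or" condition in step 5
            return CUSTOMERS                         # true branch: return every row
        # false branch: filter, returning [] when nothing matches (steps 8-11)
        return [c for c in CUSTOMERS if c["powerappscontactid"] == customer_id]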

  1. Create a new Azure Logic App.
    Logic App
  2. When the deployment of the Logic App is complete, start with the common trigger "When a HTTP request is received".
    Logic Apps Designer
  3. Add the following JSON schema for the Request Body and click Save. Once you save the Logic App, you should see the HTTP POST URL field populated automatically.
    {
      "properties": {
        "CustomerId": {
          "type": "integer"
        }
      },
      "type": "object"
    }

  4. Add an Initialize variable step named CustomerIdVariable, type String, and set the value to the passed-in CustomerId parameter.
  5. Add a new Condition step to the Logic App (Controls -> Condition) and change the operator to "Or". Add a second row to the condition step and configure the two rows as follows, where null is entered as a function and CustomerIdVariable is set from the Dynamic content. Save the Logic App.
      1. CustomerIdVariable is equal to null
      2. CustomerIdVariable is equal to 0
  6. In the "true" block, add a SQL Server "Get rows (V2)" action. We will need to set up a new connection here, so let's do that. Since we set up the On-Premises Data Gateway in the same Azure subscription, you should see the gateway automatically available in the Connection Gateway dropdown. We are using Windows Authentication to authenticate to the SQL Server.
  7. Once the connection is created, set the server name, database name, and table name on the SQL Server Get rows (V2) action. Save the Logic App.
  8. Repeat this process on the False side of the conditional, using the same connection created in the previous step. When creating this action, click the Add new parameter button and add a Filter Query parameter.
  9. Set the Filter Query parameter to powerappscontactid eq CustomerIdVariable, where CustomerIdVariable is filled in using the Dynamic content (at run time this becomes, for example, powerappscontactid eq 2). Save the Logic App.
  10. After the Get rows (V2) action on the False side of the conditional, add another conditional. For its left side, use the following expression (entered as an expression), and check that it is equal to 0:
    length(body('Get_rows_(V2)_2')?['value'])
  11. For this new conditional, we'll leave the False side empty, but we want to add a Response action (from the Request connector) to the True side. Configure the Response to use status code 200 and [] as the Body. Save the Logic App.
  12. Finally, outside of the conditionals, we'll add another Response action with a status code of 200. For this response, we'll use Dynamic content and send back the 'value' from both SQL Server Get rows (V2) actions.
  13. The Logic App should look like this.
  14. Time to save and test your Logic App. We're going to use Postman to do that since we haven't set up our API yet in Azure API Management. Open Postman and start a new request. Use HTTP POST and the URL from the "When a HTTP request is received" trigger on your Logic App. I also tweaked that URL to include our CustomerId query string parameter.

Once the URL is put into Postman, the other parameters should be auto-populated in your Postman environment. Here is a snippet of the URL that shows how I included the CustomerId, followed by a screenshot of the Logic App run (after completion), the Postman results, and the contents of the table displayed in SQL Server Management Studio.

…triggers/manual/paths/invoke?CustomerId=2&api-version=2016-10-01…
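
If you prefer a script to Postman, the same call is a few lines with Python's requests package. The URL below is a placeholder for the full trigger URL (including its sp, sv, and sig query parameters) copied from your Logic App, with the CustomerId appended as shown above.

    # Sketch: invoke the Logic App trigger with a CustomerId.
    import requests

    url = "<full HTTP POST URL copied from the trigger>&CustomerId=2"
    resp = requests.post(url)
    print(resp.status_code, resp.json())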

Conclusion

I hope this helped demonstrate how quickly you can start querying data in your on-premises data stores from a cloud resource like an Azure Logic App. Please keep in mind that you'll most likely want to secure this Logic App, since right now anyone can send a request to it. We'll cover that in more detail in another write-up.