Assumptions

This write-up is meant to be followed in sequence; if you jump ahead to a section, it assumes you have completed the parts before it. It assumes the following:

  • You have a Power Apps tenant with administrative privileges.
  • For the source control and CI/CD sections:
    • An Azure DevOps project configured with the Power Apps Build Tools extension.
    • Three Power Apps environments: development, staging/build, and production.

Executive Summary

This write-up is for anyone looking to learn how to automate the deployment of their Power Apps or Dynamics solution customizations. It starts by walking the reader through setting up a new environment since, for this exercise, you’ll need three of them. It also takes some time to demonstrate setting up a proper Power Apps solution, making sure to include a unique publisher prefix and only the components and subcomponents we need to satisfy our requirements. In other words, our solution stays clean and manageable. If you have already done these things and have three environments ready, feel free to skip ahead to the section “Getting our Solutions into Source Control.”

Create the New Environment

    1. Create a new environment.
      New Environment in PowerApps Image 1
    2. Create a new database in the process. Notice I checked the “Deploy sample apps and data” option for this new environment:
      New Environment in PowerApps – Add Database Image
    3. Let’s change the Environment URL now to something that looks a little nicer than the organization’s internal name that was set up automatically by the system. Click on the new environment from the list, and you should be taken to a page where you can edit the URL:
      Change Environment URL
    4. Click on the link for the environment, and you will be taken to the Power Apps landing page for that environment. It should look something like this, except you may or may not have the sample apps installed:
      New Environment Home Page with Sample Apps Image
    5. Head over to make.powerapps.com, and let’s look at what we have there. You can get here by using the menu in the top-left corner next to the word Power Apps.
      Make PowerApps.com Image

      Create a New Publisher

    6. The first thing we want to do is build our own solution, and as many of you know, you’ll want your own custom prefix on your solution items. As of this writing (11/7/19), Microsoft has not added a new user interface for managing publishers, which is where we customize that prefix. That said, use the cog in the top right to open the Advanced settings window. You should see a window like this:
      Classic Advanced Settings Page Image
    7. Using the Settings dropdown, select the Customizations menu item.
      Classic Advanced Settings Page with Flyout
    8. Select Publishers on the Customizations page.
      Classic Customization Image
    9. Click New on the Publisher page and fill out the Display Name and Prefix on the new Publisher dialog. The Name field should auto-populate based on what you put in the Display Name. Click Save and Close to take you back to the Publisher page.
      Classic Add Publisher Image

      Create a New Solution

    10. Back to our make.powerapps.com page, let’s select the Solutions menu item on the left.
    11. On the Solutions page, click the New solution button and provide a Display Name and Version for the new solution. Select the new Publisher we just set up in previous steps. Once again, the Name should auto-populate. Click Create.
      Create New Solution with New Publisher Image
    12. Open the solution by clicking on its Name in the solution list.
      Empty Solution Image

      Create a New Custom Entity

    13. Since you’ll undoubtedly see an empty solution, let’s start simple and add a custom entity. Click New so that you see the dropdown menu. Then select Entity. Fill out the Display Name and take note of the Publisher prefix.
      New Custom Entity

      Add an Out-of-Box Entity to the New Solution

    14. Once that Entity is created, click on it to open it, and you should see a number of out-of-box fields that were automatically created with your new Entity. Now go back to the solution. Let’s take it a step further and see what happens when we add an out-of-box entity that was deployed when we created our database. Click Add existing from the solution home page and select Entity. Scroll down to find Contact, select it, and click Next.
      Adding Contact Image
    15. This next page may be a little confusing. It asks which components you want to include, and whether you wish to include the entity metadata. For now, let’s not select any components and just add the Contact entity itself. You can always go back and add components later using the Add subcomponents button at the top of the solution inventory page when you have the entity selected. For now, though, I just want to create a new form for the Contact entity, so Cancel out of this.
      Add Subcomponents Image
    16. Open up your Contact entity in the solution by clicking on it. Then, select the Forms tab. From there, click the Add form menu and select Main Form from the dropdown.
      New Form of Contact Image
    17. You should have a window open up with the Contact form designer ready to go.
      Contact Form Designer
    18. Make some changes to this form and consider changing the Display Name of the form. Save it.
      Contact Form Designer with Changes
    19. Go back to the solution view and click Publish all customizations. If all was successful, you will see a success message.
      Publishing Customizations Image
      Publish Successful Image

      Getting our Solutions into Source Control

      This section requires three different service connections, one for each of the environments we’ll be using for our build and release pipelines. I’ll show you how to create these in this section, but if you’re already familiar, go ahead and set up your three service connections now. For this example, we’ll use the following connections:

      – ogs-powerapps-connection – this is our dev environment
      – cds-powerapps-connection-ogsllcbox – this is our staging/build environment
      – cds-powerapps-connection-ogsllc – this is our production environment
      Connections Image

    20. Navigate to the Azure DevOps project you should already have created with Power Apps Build Tools installed (see Assumptions). Under Pipelines, select Builds and click New, then New Build Pipeline. We will use the classic editor for this write-up.
      DevOps Step 1 Image
    21. Click Continue, and then click the link to start with an empty job.
    22. Give your Pipeline a meaningful name, then select the Agent job 1 node. Save the pipeline.
      DevOps Step 2
    23. On the Agent job 1 node, make sure to check the Allow scripts to access the OAuth token checkbox.
      Allow Scripts to Access OAuth Token Image
    24. Click the plus (Add task) on Agent job 1. Type Power Apps Tool into the search box and add the Power Apps Tool Installer to your pipeline.
      DevOps Step 3 Image
    25. Add another task to the agent job; this time, search on Power Apps Export and select the Power Apps Export Solution task. Select this new task to make the necessary configuration changes. Use $(SolutionName) for the Solution Name field; we’ll add that variable later. Use $(Build.ArtifactStagingDirectory)\$(SolutionName).zip for the Solution Output File. We need to configure a new service connection, so click Manage next to the Power Apps Environment URL label.
      DevOps Step 4 Image
    26. Configure the new generic service connection.
      DevOps Step 5 Image
    27. Add another task to the agent job and search on Power Apps Unpack to locate then select the Power Apps Unpack Solution. Use $(Build.ArtifactStagingDirectory)\$(SolutionName).zip for the Solution Input File. Use $(Build.SourcesDirectory)\$(SolutionName) for the Target Folder to Unpack Solution. Save the pipeline.
      DevOps Step 7 Image Saved Unpack Task
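      It may help to see how those pipeline variables resolve at run time. The values below are illustrative only, assuming a hosted Windows agent and a SolutionName variable of OGSPowerAppsCore (the solution name used later in this write-up):

      $(SolutionName) -> OGSPowerAppsCore
      $(Build.ArtifactStagingDirectory)\$(SolutionName).zip -> C:\agent\_work\1\a\OGSPowerAppsCore.zip
      $(Build.SourcesDirectory)\$(SolutionName) -> C:\agent\_work\1\s\OGSPowerAppsCore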
    28. Add another task to the agent job; this time, use the Command line task. Give it a meaningful name and use the following script to commit the code to the appropriate branch of the repository; in this case, we are using master. Note that $(System.AccessToken) in the push command is only available because we checked Allow scripts to access the OAuth token in step 23.
      echo commit all changes
      git config user.email "<>"
      git config user.name "Automatic Build"
      git checkout master
      git add --all
      git commit -m "solution init"
      echo push code to new repo
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master

      DevOps Step 6
    29. Click on the Variables tab above the pipeline designer area and add a variable named SolutionName. Save the pipeline. At this point, you can use Save & queue, or just queue the pipeline and cross your fingers.
      DevOps Step 9 Image
      Hopefully, all is successful.
      DevOps Final Successful
    30. Go over to your repo now and you should see something like this:
      Repo after Source is Loaded

      Deploying Solution as a Managed Solution (to Test, QA, Production)

      Creating a Managed Solution

    31. Now that we have our customizations in source control, we want to create a managed solution for deployment to our production environment. Create a new build pipeline using the classic editor and an empty job, and give it a meaningful name. Once again, on the Agent job, make sure to check the box to allow scripts to access the OAuth token. Also add the Power Apps Tool Installer as we did in our first build pipeline. Save your pipeline.
      Create Managed Solution Image
    32. Now we want to add a Power Apps Pack Solution task. Use $(Build.SourcesDirectory)\$(SolutionName) for the source folder, and $(Build.ArtifactStagingDirectory)\$(SolutionName).zip for the solution output file.  Leave the Type of Solution as Unmanaged.  Save the pipeline.
      DevOps Create Managed Solution
    33. Add a Power Apps Import Solution task. For the Solution Input file use $(Build.ArtifactStagingDirectory)\$(SolutionName).zip.  This is where we want to use our staging/build environment so either select that service connection or create a new service connection and point it to said environment.  Save your pipeline.
      DevOps Create Managed Solution Image
    34. Add a Power Apps Export Solution task. Use $(SolutionName) for the Solution Name, and $(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip for the Solution Output file.  Make sure to check the Export as a Managed Solution checkbox.  Save the pipeline.
      DevOps Create Managed Solution Image
    35. Add a Publish Pipeline Artifact task and use $(Build.ArtifactStagingDirectory)\$(SolutionName)_managed.zip for the File or directory path. Use drop for the Artifact name.  Save the pipeline.
      DevOps Create Managed Solution Image 5
    36. Click on the Variables tab up top and add your SolutionName variable.
      DevOps Create Managed Solution Image 7
    37. Run the pipeline. If it was successful you should see a drop artifact in the Artifacts dropdown menu.
      DevOps Create Managed Solution Image 6

      Deploying to Production – Creating the Release Pipeline

    38. Now it’s time to create a Release pipeline. Click on Releases in the left nav, then click New, and then New release pipeline from the dropdown menu.  Start with an empty job on the next screen.
      Release Pipeline Image 1
    39. Select the Add an Artifact block and select the build pipeline that we created to build the managed solution.
      Release Pipeline Image 2
    40. Click on the 1 job, 0 tasks link in the Stage 1 block. Add the Power Apps Tool Installer task and a Power Apps Import Solution task.
    41. Click on the Agent job and scroll to find the Artifact download and select the build pipeline that was built for creating the managed solution.
      Release Pipeline 3 Image
    42. Click on the Power Apps Import Solution task. You will use something similar to the following for the Solution Input File: $(System.DefaultWorkingDirectory)/ogs-powerapps-create-managed-solution/drop/OGSPowerAppsCore_managed.zip
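      Generalized, that path follows the pattern below, where the middle segment is the source alias of the build artifact you linked in the previous step and drop is the artifact name we chose in the build pipeline:

      $(System.DefaultWorkingDirectory)/<artifact source alias>/drop/<SolutionName>_managed.zip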
    43. Select the production environment service connection from the Power Apps Environment URL. Save the pipeline.
      Release Pipeline Image 4
    44. Use the Create Release button to test the release pipeline out.
      Release Pipeline Image 5
    45. Now you should be able to go to your production environment and see the solution, plus the changes we made to the contact form.
      Productions with New Solution Image
      Production Deployed Contact form with Updates Image

Conclusion

I hope this helped demonstrate how to use the Power Apps Build Tools to get your solutions into source control and automate the deployment of those solutions across your environments. I hope you were also able to pick up something new about your Power Apps solution’s composition.

Summary

To break this series apart into more manageable chunks, in this installment, we’ll focus on setting up the API in Azure API Management (APIM) and the custom connector. If you missed the first installment where we set up our On-Premises Data Gateway and Azure Logic App, please check it out here.

Scenario

The company has an on-premises SQL Server database that contains customer data that needs to be available to various apps built on the Power Platform and/or Azure.

Assumptions

  1. That you have a sample of the data your Logic App returns. For this write-up, we’ll use the data returned from the Logic App we created in our previous lab.

High-Level Steps

  1. Create an Azure API in Azure API Management (APIM) that provides access to the Logic App, and can be called from various Power Apps and Power Automate Flows
  2. Create a Custom Connector

Azure API Management

In this section, we’ll walk through setting up an API to the Logic App we created in our prior installment.

Why use Azure API Management (APIM)?

Azure APIM provides a single place for managing your company’s APIs. It can be used to selectively expose data and services to employees and partners by applying authentication and even usage limits.

Create the API

  1. Create an Azure API Management service instance in Azure. You can follow the steps to do that here. In the Azure search bar, type API, and select API Management services. Provide the necessary information and click the Create button. Once complete, your API Management service should show as ‘on-line’.
    APIM 1
    Azure API Image 2
    Azure API Image 3
  2. Click to open your new APIM service. Select APIs from the blade that opens, and either select the Logic App tile or search APIs for Logic App and select it.
    Azure API Image 4
  3. Assuming you created the Logic App in the first installment of this series, select that Logic App from the “Select Logic App to import” blade that opens to the right.
    Azure API Image 5
  4. When the import finishes, you should notice an operation that was created automatically called manual-invoke. Select that operation and click the pencil (edit) button in the Frontend section.
    Azure API 6 Image
  5. Provide a meaningful display name if you’d like. Change the operation to a GET and set “/onpremcustomers” as the resource path.
  6. On the Query tab at the bottom, add the CustomerId (integer) query parameter.
    Azure API Image 7
  7. On the Responses tab, select the 200 OK response that should already be there. Add the details for an application/json representation using the sample data output from the Logic App created in the previous exercise. I also provided a definition named “customerresponse”. (See the sketch at the end of this section for roughly how this lands in the API definition.)
    Azure API Image 8
    Azure API 11 Image
  8. On the Settings tab, add an API URL suffix, select a Product, and click save. You will want to make note of the Product you select. I am using “Unlimited” in my example. Later, when providing an API key for a custom connector, you’ll need to remember which product your API is using.
    Azure API 12 Image
  9. Now we’re ready to test this out. On the Test tab, enter a value for the CustomerId parameter and click Send. If all goes well, you should get a 200 response back with, hopefully, some data.
    Azure API Image 13
    Azure API Image 14
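    For reference, here is a rough sketch of how the operation we just configured lands in the exported Open API 2 (JSON) definition we’ll download in the next section. This is illustrative only; the fields inside the customerresponse definition depend on your Logic App’s sample output:

    {
      "paths": {
        "/onpremcustomers": {
          "get": {
            "operationId": "manual-invoke",
            "parameters": [
              { "name": "CustomerId", "in": "query", "type": "integer" }
            ],
            "responses": {
              "200": {
                "description": "OK",
                "schema": { "$ref": "#/definitions/customerresponse" }
              }
            }
          }
        }
      }
    }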

Power Apps Custom Connector

  1. From the Azure API Management Overview blade, open the Developer portal (legacy). At the time of this writing, the new Developer portal was too buggy to use.
    Connector Image 1
  2. Click on the API we just created.
    Connector 2 Image
  3. Using the API definition button, download/copy and save the Open API 2 (JSON) file.
    Connector Image 3
  4. In Power Apps, go to Data > Custom Connectors > New Custom Connector > Import an OpenAPI file. Import the downloaded JSON file from the previous step, provide a name for your connector, and click Continue.
    Connector Image 5
    Connector Image 6
  5. Click through the General and Security tabs and make sure they match what you’re expecting (see screenshots).
    Connector Image 7Connector 9 Image
  6. On the Definition tab, if you see a parameter in the Body area, you can delete it. Then click Create connector at the top.
    Connector 11 Image
  7. On the Test tab, you’ll first need to create a new Connection. You will need the API key for the subscription you used in the APIM. In my case, I used “Unlimited”. (Click the ellipsis to the right of the subscription and select Hide/Show Keys to see the key and copy it.)
    Connector 11.5 Image
    Connector 12 Image
    Connector 15 Image
    Connector 13 Image
  8. Navigate back to the Custom Connector – Test tab and test out your Custom Connector.
    Connector 14 Image
    Connector 16 Image

Conclusion

I hope this was helpful in demonstrating just how quickly you can better secure your Azure Logic Apps with Azure API Management, as well as how quickly you can spin up a custom connector that can then be distributed and used by other apps and Flows across your organization to connect to the Logic App through the new API.

This blog is part of a series of hands-on labs on leveraging the Power Platform and Microsoft Azure. To break this series up into more manageable chunks, in this installment, we’ll focus on setting up the On-Premises Gateway that will connect to our local SQL Server and the Azure Logic App that will use the new gateway to connect to our local SQL Server 2017 and return customer data using an HTTP request and response.

We’ll wrap it up by using Postman to test our Logic App. If you don’t have Postman installed, you can download it here.

Stay tuned for the next lab in this series, Azure API Management, and Custom Connectors.

Scenario

The company has an on-premises SQL Server database that contains customer data that needs to be available to various apps built on the Common Data Service.

High-Level Steps

  1. Install and configure an On-Premises Data Gateway to connect to the local SQL Server
  2. Create an Azure Logic App to connect to the SQL Server through the gateway

On-Premises Data Gateway

As you can imagine, there are already plenty of good write-ups on installing and configuring an On-Premises Data Gateway, including an official version from Microsoft. It is very straightforward, but I’m here to offer you a couple of pro tips.

Pro Tip #1: Pay attention to the account you use for installing the gateway. This account needs to belong to the same Azure tenant as the account you’ll use for building your Azure Logic Apps that will use this gateway.

Pro Tip #2: Pay attention to the region that is automatically selected when installing the gateway. For this example, the gateway needs to be installed in the same region as your Azure Logic App.

Azure Logic App

Now I’m going to step you through building an Azure Logic App that can be executed using a simple HTTP Request. The Logic App will accept a single parameter (CustomerId), or no parameter at all. If no id is presented, the Logic App will return all customer rows. Otherwise, it will filter on the passed in CustomerId.

  1. Create a new Azure Logic App.
    Logic App
  2. When the deployment of the Logic App is complete, start with the common trigger “When HTTP request is received”.
    Logic Apps Designer
  3. Add the following JSON schema for the Request Body and click Save. Once you save the Logic App you should see the HTTP POST URL field populated automatically.
    {
      "properties": {
        "CustomerId": {
          "type": "integer"
        }
      },
      "type": "object"
    }
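    With this schema in place, CustomerId becomes available as Dynamic content. Under the hood, the designer reads it with a trigger-body expression roughly like the following (a sketch; you won’t need to type this yourself):

    triggerBody()?['CustomerId']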

  4. Add an Initialize variable step named CustomerIdVariable, of type String, and set its value to the CustomerId parameter passed in by the trigger.
  5. Add a new Condition step to the Logic App (Controls -> Condition) and change the operator to “Or”. Add a second row to the condition step and configure the rows as follows, where null is entered as an expression (it’s a function) and CustomerIdVariable is set from the Dynamic content. Save the Logic App.
      1. CustomerIdVariable is equal to null
      2. CustomerIdVariable is equal to 0
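    For reference, the equivalent single expression for this Or condition would look something like the sketch below. Because CustomerIdVariable was initialized as a string in step 4, the 0 is compared as text:

    or(equals(variables('CustomerIdVariable'), null), equals(variables('CustomerIdVariable'), '0'))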
  6. In the “true” block, add a SQL Server, Get rows (V2) action. We will need to set up a new connection here, so let’s do that. Since we were careful to install the On-Premises Data Gateway under the same tenant and region (see the Pro Tips above), you should see the gateway automatically available in the Connection Gateway dropdown. We are using Windows Authentication to authenticate to the SQL Server.
  7. Once the connection is created, set the server name, database name, and table name on the SQL Server Get rows (V2) action. Save the Logic App.
  8. Repeat this process for the False side of the conditional, using the same connection that was created in a previous step. When creating this action, click the Add new parameter button and add a Filter Query parameter.
  9. Set the Filter Query parameter to powerappscontactid eq CustomerIdVariable, where CustomerIdVariable is filled in using the Dynamic content; at run time this renders as an OData filter such as powerappscontactid eq 2. Save the Logic App.
  10. After the Get rows (V2) action in the False side of the conditional, add another conditional. Check whether the following is equal to 0, where the left side is entered as an expression:
    length(body('Get_rows_(V2)_2')?['value'])
  11. For this new conditional, we’ll leave the False side empty, but we want to add a Response action (from the Request connector) to the True side. Configure the Response to use Status Code 200 and [] as the Body. Save the Logic App.
  12. Finally, outside of the conditionals, we’ll add another Response action with a Status Code of 200. For this response, we’ll use Dynamic content and send back the ‘value’ from both SQL Server Get rows (V2) actions.
  13. The Logic App should look like this.
  14. Time to save and test your Logic App. We’re going to use Postman to do that since we haven’t yet set up our API in Azure API Management. Open up Postman and start a new request. Use HTTP POST and the URL from the “When HTTP request is received” trigger on your Logic App. I also tweaked that URL to include our CustomerId query string parameter.

Once the URL is put into Postman, the other parameters should be auto-populated in your Postman environment. Here is a snippet of the URL that shows how I included the CustomerId, followed by a screenshot of the Logic App run (after completion), the Postman results, and the contents of the table displayed in SQL Server Management Studio.

…triggers/manual/paths/invoke?CustomerId=2&api-version=2016-10-01…

Conclusion

I hope this helped demonstrate how quickly you can start querying data in your on-premises data stores from a cloud resource like an Azure Logic App. Please keep in mind that you’ll most likely want to secure this Logic App since right now, anyone can send a request to it. We’ll cover that in more detail in another write-up.

Leveraging the Power Platform and Microsoft Azure to connect housing agencies and intermediaries with the Department of Housing and Urban Development (HUD)

Intro

The US Department of Housing and Urban Development runs a program known as the Housing Counseling Program that aids housing agencies around the nation through things like grants and training. These agencies provide tremendous benefit to Americans in many forms, including credit counseling, foreclosure prevention, predatory lending awareness, homelessness assistance, and more. One of the requirements for approval into the program is that the agency uses a software system that is also approved by HUD and interfaces with HUD’s Agency Reporting Module SOAP web service.

Original System

The original system is built on Dynamics 365 Online using custom entities, web resources, and plug-ins. Complex batching functionality was implemented to deal with the two-minute timeout period that plug-ins have, web resource files (JavaScript, HTML, and CSS) were stuck in a solution with no real structure, and application lifecycle management (ALM) was extremely painful. The original system design also doesn’t account for the intermediary-to-agency relationship, specifically the roll-up reporting and auditing that intermediaries desperately need for providing their compiled agency data to HUD, as well as their own internal reporting. Finally, because the solution was built using Dynamics 365 Online Customer Engagement, licensing costs were more than double what we knew they could be with Microsoft’s new Power Apps licenses and Azure pay-as-you-go subscriptions.

The Original Diagram

Figure 1 – current CMS system built on Dynamics 365 Online

Definition

Intermediaries – organizations that operate a network of housing counseling agencies, providing critical support services to said agencies including training, pass-through funding, and technical assistance.

Modern System

Whereas the data schema for the solution remains mostly unchanged, the architecture of the system has changed profoundly. Let’s check out some of the bigger changes.

Power Apps Component Framework (PCF)

The Power Apps Component Framework enables developers to create code components that can run on model-driven and canvas apps and can be used on forms, dashboards, and views. Unlike traditional web resources, PCF controls are rendered in the same context and at the same time as any other component on a form. A major draw for developers is that PCF controls are created using TypeScript and tools like Visual Studio, and Visual Studio Code. In the modern solution, web resources are replaced with PCF controls that make calls directly to the Power Apps Web API.
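As a rough illustration of that pattern, the sketch below shows the skeleton of a PCF control in TypeScript, including a call to the Web API through the control’s context. This is a minimal, hypothetical example, not code from the HUD solution; the entity and column names are made up, and the IInputs/IOutputs types are the ones generated by the PCF tooling:

// Minimal PCF control sketch
export class SampleControl implements ComponentFramework.StandardControl<IInputs, IOutputs> {
  private container: HTMLDivElement;

  public init(context: ComponentFramework.Context<IInputs>, notifyOutputChanged: () => void,
              state: ComponentFramework.Dictionary, container: HTMLDivElement): void {
    this.container = container;
    // Query the Power Apps Web API directly from the control (hypothetical entity/column)
    context.webAPI.retrieveMultipleRecords("account", "?$select=name&$top=5").then(result => {
      this.container.innerText = result.entities.map(e => e["name"]).join(", ");
    });
  }

  // Called whenever the framework pushes new data to the control
  public updateView(context: ComponentFramework.Context<IInputs>): void { }

  public getOutputs(): IOutputs { return {}; }

  public destroy(): void { }
}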

Azure Resources (API Management, Function Apps, Logic Apps, and a Custom Connector)

A Custom Connector and Azure API Management are used to manage and secure various APIs that are used in the system. The connector can be used in any Azure Logic App to make connecting to an API much easier than having to configure HTTP actions. All of the business logic that once lived in plug-ins has been replaced with a combination of Azure Logic Apps and Azure Function Apps. This has provided incredible gains where development and testing are concerned, but it also provides a more secure and scalable model to support these agencies as they grow. Lastly, it removes the burden experienced with the two-minute time-out limitation that plug-ins have.

ALM with Azure DevOps Services and Power Apps Build Tools

Azure DevOps Services, coupled with the Power Apps Build Tools, are being used to ease the pain we experienced with ALM prior to these tools being available. Now we can easily move our Power App solutions across environments (e.g. dev, test, production) and ensure our latest solution code and configurations are always captured in source control.

ALM with Azure

Figure 2 – modern solution using Power Apps and Azure resources for extreme scalability and maintainability

Next Steps

I hope by demonstrating this use case you’re inspired to contact your Microsoft partner, or better yet contact us and let us work with you to identify your organization’s workloads and technology modernization opportunities.

Ready to dig a little deeper into the technologies used in this blog post? Below we have some hands-on labs for you. Stay tuned for more updates!

  1. Power Apps Component Framework Controls
    As a developer, PCF controls have been one of the most exciting things to grace the Power Platform landscape. In this video, I’ll show you how to go from zero to a simple PCF control that calls the Power Platform Web API. I also provide a Pro Tip or two on how I organize my PCF controls in my development environment.

  2. Using an Azure Logic App to Query Data in an On-Premises SQL Server Database
    Companies are embracing modernization efforts across the organization, quickly replacing legacy apps with Power Apps and Azure resources. This can’t all happen at once, though, and oftentimes companies either can’t or won’t move certain data to the cloud. In this hands-on lab, I’ll walk you through using an Azure Logic App and an On-Premises Data Gateway to connect to a local SQL Server 2017 database to query customer data using an HTTP request and response.
  3. Azure API Management and Custom Connectors
    APIs give you the ability to better secure and manage your application interfaces. Couple an API with a Custom Connector and you not only better secure and manage it, but also make it super easy for other app developers across the organization to connect to your back-end resources through the API, without having to worry about all of the API configurations.
  4. Power Apps DevOps Using Best Practices
    There’s more to DevOps than just backing up your Power App solutions in source control and automating your build and release pipeline. Good solution management and best practices around publishers, entity metadata, components, and subcomponents are also key to manageability and scalability with your ALM.


If you didn’t catch the first two parts of this series, you can do that here and here.  In this part, we’ll get a little more technical and use Microsoft Flow to do some pretty cool things. 

Remember when we talked about the size and quality of the images we take with our Power App and store as the entity image? When saved as the Entity Image for a CDS/D365 item, the image loses quality and is no longer good for an advertisement photo.  This happens automatically and, as far as I can tell, the high-res image is gone once this conversion takes place (someone please correct me if I’m wrong on that!).  On the flip side, it doesn’t make a whole lot of sense to put all this tech together only to have my end users be required to take two pictures of an item, one for high-res and one for low-res.  And we don’t want to store a high-res image in a relational database for 10,000-plus items, because the database could bloat immensely.

Microsoft Flow and SharePoint to the rescue!  

PRO TIP:  Dynamics 365 will crop and resize the image before saving it as the entity image.  All entity images are displayed in a 144 x 144 pixel square.  You can read more about this here.  Make sure to save/retain your original image files.  We’re going to stick ours in a SharePoint Picture Gallery App.

Objective 

Create a Microsoft Flow that handles… 

  • Pulling the original image off the Dynamics record and storing it in SharePoint. 
  • Setting the patch image as the Entity Image for the Dynamics record. 
  • Creating an advertisement list item for the patch. 
  • Saving the URLs for the ad and image back to the patch record. 

Create the Flow 

We’re going to write this Flow so that it’s triggered by a Note record being created. 

 Flow screenshot with Create from blank highlighted

  • On the next page, click “Search hundreds of connectors and triggers” at the bottom of the page. 
  • Select Dynamics 365 on the All tab for connectors and triggers. 
  • Select the “When a record is created” trigger. 

 Dynamics 365 is highlighted

  • Set the properties for Organization Name and Entity Name.  Entity Name should be “Notes”. 
  • Save the Flow and give it a name. 

Verifying a Few Things 

  • Add a new step and select the Condition item. 
  • The Condition should check to see if the Note has an attachment. We do this using the “Is Document” field.  

 Condition Control is highlighted 

  • In the “Yes” side of the conditional we want to check if the Object Type is a Patch (ogs_patch in this case).  

At this point, if the Flow has made it through both conditionals with a “Yes”, we know we are dealing with a new Note record that has an Attachment and belongs to a Patch record.   

Update the Patch Record 

Now we want to update the patch record’s Entity Image field with the attachment.  First we need to get a handle on the Patch record.  We’ll do that by adding an Action to the Yes branch of our new Conditional. 

  • Add a Dynamics 365 Update a Record Action.
  • Set the Organization Name, Entity Name, and Record identifier accordingly.  For our Patch Record identifier, we’ll use the Regarding field in the Dynamic content window. 

 

  • Click on Show advanced options and find the Picture of Patch field. 
  • For the Picture of Patch field we need to get the document body of the attachment and convert it from Base-64 encoding to binary.  We do this using the “Expression” area again.  Use the “base64ToBinary” function to convert the document body like so. 
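  That expression, which we’ll reuse in a Compose step shortly, looks like this:

  base64ToBinary(triggerBody()?['documentbody'])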

 

  • Save your work!  I can’t tell you how many times I had to retype that function. 

Create Our SharePoint Items & Clean-up 

Now that we’ve updated our entity image with the uploaded patch picture we want to do a couple of things, but not necessarily in sequence.  This is where we’ll use a parallel branch in our Flow.   

Dealing with a Parallel Branch 

  • Under the last Update a Record action, add a Conditional.  After adding this Conditional hover over the line between the Update action and the new conditional.  You should see a plus sign that you can hover over and select “Add a parallel branch.” 



  • Select this and add a Compose action.  You may need to search for the Compose action. 

 

PRO TIP:  With Modern Sites in SharePoint, we now have three solid options for displaying images in SharePoint.  The Modern Document Library allows viewing as tiles and thumbnails within a document library, the Picture Library which has often been the place to store images prior to the Modern Document Library, and then we can simply just display an image, or images, on a page directly.

Saving the Attachment as an Image in SharePoint

  • Let’s deal with the Compose branch first.  Our Compose will have the same function for its Input field as our Picture of Patch did above: base64ToBinary(triggerBody()?['documentbody']) 
  • After the Compose, we’ll add a Create File Action for SharePoint and use the name from our Patch record as the name for our image in SharePoint.  I’m using a Picture Gallery App in SharePoint and for now, only using the .JPG file type.  The File Content should use the Output from our Compose Action. 

 

Delete the Note

  • Finally, we want to delete that Note from Dynamics (and the Common Data Service) so that the image attachment is no longer taking up space in our Common Data Service.  Add a Dynamics Delete a Record Action after the SharePoint Create file action.  Set the Organization Name and Entity Name, and use the Dynamic content for Note as the Item identifier.

 

Creating Our Advertisement

Let’s jump back to the new Conditional we added after the Update a record Action where we set the entity image. 

  • Set the conditional to check for the Generate Advertisement field being set to true. 
  • If this is true, add a SharePoint Create Item Action and let’s set some values.  What we’re doing here is creating a new SharePoint List Item that will contain some starter HTML for a Patch advertisement. 
  • Save our work! 

 

 

Updating Our Patch Record With Our URLs From SharePoint

  • Under the SharePoint Create Item Action for creating the Ad, AND after the SharePoint Create file action for creating the picture in the Picture Gallery, we’re going to add Dynamics Update record Actions that will be identical with one difference. 
  • The Organization Name, Entity Name, Record Identifier (set to Dynamic Content “Regarding”) should be the same. 
  • On the Ad side, the Update record should set the SharePoint Ad for Patch field to “Link to Item”. 

 

  • On the image side, the Update record should set the SharePoint Image for Patch to the “Path” 

 

Seeing It In Action 

Of course, I’ve been saving my work so let’s go ahead and give this a whirl. 

  • At the top right of your Flow you’ll see a Test button.  We’re going to click that and select “I’ll perform the trigger action.” 
  • To make this more interesting, I’m going to run this from SharePoint! I’ll update a patch and kick off my Flow from the embedded Power Apps Canvas App on my SharePoint home page. 

 

  • I select the patch, then I click the edit button (pencil icon at the top right). 
  • Notice the Attach file link and the Generate Advertisement switch.  We’ll use the first for our image and the second for generating our ad item in SharePoint. 

 

  • Finally, I click the checkmark at the top right to save my changes.  This kicks off our Flow in less than a minute, and when we navigate back over to the Flow we can see that it completed successfully. 

 Verifying the flow

  • I’ll hop back over to SharePoint to make sure that my ad was created and my entity image was set.  I’ll also make sure the high-quality image made it to the SharePoint Picture Library and the Note was deleted from the Patch record in Dynamics.  I also want to make sure the URLs for the ad and image in SharePoint were set back to the Patch record. 

verifying in SharePoint Verifying in SharePoint image

One last thing: When we store the image in a SharePoint Picture Gallery App we can retain the dimensions, size, and quality of the original image, unlike when storing the image as a Dynamics 365 entity image.  Check out the properties in the next screen shot and compare that to the properties on the SharePoint page in the same screen shot.   


Comparing image file sizes

Conclusion 

I hope you are enjoying this series and continue to tune in as the solution for our dad’s beloved patch collection grows.  I constantly see updates and upgrades to the Power Platform so I know Microsoft is working hard on making it even better. 

In this part, things should get a little more interesting as we set up a Power App and our SharePoint site.

Objectives

  1. Create SharePoint Online Team Site.
  2. Create a Power App for team members to collect data with.
  3. Embed this Power App into a SharePoint page so we can do more work in a single environment.

Creating Our SharePoint Team Site

Using our existing SharePoint Online instance, I’m going to create a new Team site off of the root site collection.

  1. From the home page of the root site collection, click the “Create site” button.

Create site screenshot

2. Select Team site on the panel that opens.

Create a Site screen with Team Site highlighted
3. Fill in the information on the next page and click Next.

Team Site information input page

4. Add any additional users and click Finish.

Add additional users screenshot

If all went well, you should be redirected to the new site.

Screenshot of new Team Site

Creating the Power App

To be clear, we are going to create a “canvas” Power App as opposed to a model-driven Power App.  To get this going I’ll navigate to my Power Apps environment and create a new app.

Screenshot of PowerApps environment

    1. From your Power Apps home page, click Create.
    2. On this page you can see various templates for making both model-driven and canvas apps. We’re going to use the “Start from data” Canvas app.
      Screenshot with Start From Data option highlighted
    3. On the next page we see multiple choices for apps that start with our data. I’m going to use the Common Data Service here.
      Screenshot with Common Data Service highlighted
    4. With my Common Data Service account instance selected under “Connections,” I scroll to find and select the “Patches” table, then click “Connect.”

Screenshot of Connections

PRO TIP:  If you get to this point and you still don’t see your data, make sure to check the “Environment” at the top right of the window.  You may not have the correct environment selected where your data is stored.  You can read more about Environments in Power Apps here.
Screenshot with Environments highlighted

        1. Once Power Apps is done creating the new app, the app designer will appear.

      Screenshot of App Designer

Let’s pause and look at this screen for a minute.  On the right, we have our properties and some other items, in the middle we have our design canvas, and on the left we have our Screens Explorer.  In our Screens Explorer we see three screens already created for us: the browse, detail, and edit screens.

If you look under each of those screens you’ll see a primary user interface object that is collapsed (meaning it has content/child nodes underneath it), along with some other user interface elements that usually provide some other functionality or a label.  When you select an element in the Screens Explorer it will also be selected on the design canvas.  In the previous screen shot, the Search Icon is selected under the Browse Screen 1 item.  Consequently, the search hourglass on the design canvas is also selected.

We don’t want to change this up much, but I think we can all agree that it would be a lot more helpful to have something other than the created-on date and item id as our main fields for each row.

PRO TIP:  Power Apps provides us with a super fast way of spinning up apps by automatically building things into our apps like navigation and search controls.  Be careful about changing or removing these unless you plan to replace them with a control of your own that provides the same functionality.  Oftentimes if you remove one of these you render a portion of the app unusable or difficult to access.  To see what a given user interface item does, select it in the designer and check out the “OnSelect” Action in the Advanced tab on the right.
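For example, on an app generated this way, the arrow on each browse-screen row typically carries an OnSelect formula along these lines (the screen name varies by app):

Navigate(DetailScreen1, ScreenTransition.None)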

Screenshot with OnSelect option highlighted

      1. Select the bolded date field on the row item on the design canvas and make sure the Advanced tab is selected on the right.
      2. In the Data section under the Advanced tab, we can use the Text field to change what we want displayed in each of these user interface elements. I’m going to change the first to name, the second to city, and the last one to state.
      Screenshot of text field

Once we’re done tweaking the browser display we’ll want to work on the detail and edit forms.  These work a bit differently.

      1. Select the Detail Form under the Detail Screen node. In the right pane under Properties, select the Fields link that indicates the number of fields currently selected.
      2. Check any fields you would like that aren’t currently on the form. Similarly, uncheck any fields the system put on the form that you want to remove.  You can also reorder the fields to your liking.

After a little bit of tweaking to both our detail and edit forms, we’re ready to publish this Power App.  You can preview the app with the “Play” arrow icon near the top right, or just switch over to your File menu to wrap things up.

      1. Click the File menu and under App Settings give your app a name and feel free to play around with the icon and background color. There are other settings you can explore too but for now we’ll just cover the basics.
      2. Click “Save” to save your app to your gallery.

Screenshot of gallery

In this next screen shot I’ve captured our three screens from left to right: display, detail, and edit. Screenshot of app screens.
You might be wondering what the Generate Advertisement switch is for on the edit screen.  We’ll use that to toggle that specific functionality later in our Flow.

Finally, if you click on the ellipsis for the new app in our Power Apps app gallery, you can click “Details” and get additional helpful information for the app such as the app URL.  This will come in handy for what we do next.

Here are some screen shots from my mobile phone as I helped gather inventory for this massive side project!

Screenshots of app data

Embedding the Power App in SharePoint

As mentioned, we don’t want to find ourselves jumping back and forth between application instances for managing inventory.  Thankfully we’re able to embed our Power App into the SharePoint page we want so we can do a lot of our common work from that one screen.

      1. Open the SharePoint site we created earlier and edit the home page. I’ve tweaked mine so we only have the Documents Library web part and some empty columns.
      2. Click the plus button in the right column to add a new web part. Find the Microsoft Power Apps web part and select it.
      Screenshot
      3. Paste the link from your Power App into the “App web link or ID” field in the right pane that opens. The app should come up in the new web part.
      Screenshot
      4. Publish the page.

We can now use the same Power App that everyone else will be using on their mobile devices, right here in SharePoint.

Screenshot of PowerApp in SharePoint

Tune in next time, when we’ll use Microsoft Flow to move some data around, automatically generate advertisements, and notify team members of important events.

Part One: Identify, Define, Build, Migrate

An assortment of fire department patches
My dad passed away in 2015, leaving behind an extensive collection of fire trucks, patches, and other fire department (FD) memorabilia.  Before he passed, he gave us instructions to sell them and some direction on what to do with the money. After a few years of not really wanting to deal with it, my family decided to make a project out of it.  My mom, sister, wife, two daughters, and I are working our way through thousands of patches, hundreds of fire trucks, and who knows how many pendants and other trinket-like items, all while working full-time jobs (school for the kiddos) and from different locations.

Dad was great about logging his patches into a Microsoft Access database, but not so good about taking pictures of them, and even worse at logging his fire trucks and other items.  The objective and high-level steps for this project were quickly identified.

The Objective

  1. Help my mom liquidate my dad’s enormous fire department memorabilia collection.

The High-Level Steps

  1. Identify the technologies to be used. Easy!
    1. Microsoft Dynamics 365 & Common Data Service – our foundation.
    2. Microsoft Power Apps – mobile app for inventory capture.
    3. Microsoft Flow – move data and attachments around, auto-create ads.
    4. Microsoft SharePoint – store ads, images. Keep large files out of CDS.
  2. Complete a first-cut of the data schema and migrate the patches data from the Microsoft Access database.
  3. Configure a software solution for the family to use so we can all capture data to a single database. Solution must be user friendly!
  4. Configure processes that streamline the creation of advertisements and other data processing.
  5. Start capturing data and creating ads!

The Players

Not everyone in an organization has the same skill level and this will certainly lead to some challenges.  With that in mind, let’s look at the players involved in our project.

  1. Mom – Low technical skill – Capable of using anything “Excel-like” to capture data.
  2. Sister – Low-to-Medium – Arguably more advanced than mom, works on a Mac. Enough said.
  3. Wife – Medium – Works around Excel with ease, understands what I do from a high level.
  4. Kids – Low-to-Medium – two daughters, ages 12 and 10. Both are geniuses on any touch device but have no clue how to work around Excel.
  5. Me – High – developer and technology enthusiast!

I’ve spent the better part of my career as a .Net developer working in SharePoint and Dynamics, among other things, so it was easy for me to decide on a path forward.  Let’s get rolling!

Configure Data Schema and Migrate Microsoft Access Data

Just so no one thinks I’m lying here for the sake of this blog, let’s see what my dad was working with back in the day.  Yes, he was an ND alum.

Screenshot of patch entry form in Microsoft Access
Patch data in Microsoft Access

Side note: You see that column named “Patch Locator” highlighted in that last screen shot?  My dad kept his patches in old-school photo albums that he then stored in boxes.  This ‘locator’ field was his way of finding the patch once a box was full and stored away.  Genius dad!

As you can see, defining the schema for patches was pretty much done.  If we run into anything along the way, we can certainly add it.

  1. In Dynamics, I created an unmanaged solution named “Fire Department Items Solution” and added two custom entities, “Patch” and “Fire Truck.”
  2. I added all the fields my dad had in his Access database, and then I made sure that the out of box field “EntityImage” was available for displaying an image of the patch.

PRO TIP:  Dynamics 365 only allows you to have one image field on an entity and it is not configured out of the box.  To use this field, create a new field on your entity and use the data type “Image”.  This will automatically set the name of your field to “EntityImage” and the image you set there will be used as your entity image at the top of the entity form.

Screenshot of PowerApps
PowerApps details

  1. Before we save and publish, we need to enable Notes functionality for our entities. To do this select the entity from the left pane in the solution explorer, then make sure the “Notes (includes attachments)” checkbox is selected.

PRO TIP:  When you save an image to the EntityImage field it loses a lot of its quality.  Because we are using this data for inventory, including creating ads, we don’t want to lose the quality of our images.  For this reason, we will use the attachments collection for our entity to capture the actual high-quality image.  We will then use Microsoft Flow to take that image and store it as the EntityImage (which will lose quality), but also store the high-quality version in a SharePoint library.

PowerApps note functionality

  1. Finally, be sure to publish your customizations.

Migrating the Data

Now it’s time to migrate the data.  Since this was such a simple schema, I opted to use the out-of-box data import functionality that Dynamics 365 provides.  With that said, however, there are a few different ways to accomplish this migration. For me it was easy to simply export the Microsoft Access database to Excel, then use that file to import into Dynamics 365.

    1. Export your data into an Excel file from Microsoft Access.
    2. In Excel you’ll want to Save a Copy and save it as a CSV file.
      Save a copy as a CSV file
    3. Open the Patch View in Dynamics and use the out-of-box Import from Excel functionality to load our data.


    1. Choose the CSV file we just created when we saved the copy in Excel.

Choose your CSV file

    1. On this next screen, let’s click the button to Review our Field Mappings.

Review Field Mappings

    1. Here you’ll see some of my fields are mapped and some aren’t. Let’s get those shored up before we proceed.

Resolve mapped items

    1. Now that I’ve resolved all the field mappings, you’ll see we have green check marks across the board and we’re ready to import. Click the Finish Import button and you’re off.

Finish Import button

    1. You can check out the progress of the import by navigating to Settings > Data Management > Imports.

View Import progress

Summary & Next Steps

Let’s look at what we’ve done here.  On the surface it would appear we’ve simply gone into Dynamics 365 and configured a couple of entities.  But as we know, Dynamics 365 v9 was built on the Common Data Service (CDS) and that means our Dynamics data is now available to any other application that can connect to the CDS.  Why is this important for this project you might ask?  That answer will become clear in the next part of this blog.  For now, here are some screen shots on how things look now that we have our patch data migrated.

A look at the imported data

Keep in mind, come end of January 2019 everyone will need to switch over to Microsoft’s Unified Interface and that’s what we’re using here for our patches.  This is an example of a model-driven Power App which we’ll discuss in our next entry to this blog.

If you log in to your Power Apps environment using the same credentials as your Dynamics 365 environment, you should see your entities and the data migrated in this environment too.  Remember, once it’s in Dynamics, it’s available through the CDS.

A view of the migrated data

One thing to note, if you have 10,000-plus records like I do for patches, CDS in the browser may freeze trying to display them all.  I would hope MS resolves this at some point so that it handles paging and displaying of data as gracefully as the D365 web client does.

Stay tuned for my next entry where we’ll set up our SharePoint Online site, create a simple canvas Power App for inventory management on our mobile devices, and then set up a Flow to help move some things around and automate the creation of our online advertisements.

Thanks for reading!

While it isn’t quite as good as having complete control of your CSS, Dynamics CRM (2015 Online Update 1, and On-Prem 2016) now offers a feature called Themes. Themes enable the organization to customize their CRM Web interface to some degree, although we still don’t have complete control of the styling.

There are plenty of good blogs on Dynamics CRM themes, but I’ve yet to find one that includes good tips on determining the hex values for the colors you need. This blog will help you determine these values, including using a color picker to pull a color’s hex value from an image.

Creating SharePoint Sites From a Dynamics CRM Plugin

With more and more companies adopting both Dynamics CRM and SharePoint into their corporate technology stacks, I’ve found myself integrating the two technologies more often than ever before. It’s quite cool. These are two extremely powerful systems and they both do what they are designed to do very well. I’m not going to go in-depth about what the differences are between these products…they are in totally different software “genres.” Apples and oranges. However, I want to point out one thing SharePoint gives you that many clients ask for again and again: Subsites.

Subsites are SharePoint sites that belong to a root SharePoint Site Collection. An example of this would be a corporate intranet SharePoint Site Collection (https://intranet) with SharePoint Subsites for each division in the corporation, e.g. Marketing (https://intranet/sites/marketing), Accounting (https://intranet/sites/accounting), etc. There are pros and cons to breaking these Subsites out into their own Site Collections (so they can have their own content database), but quite often you see this hierarchical approach.

Another very common use of SharePoint Subsites is for item-level management, for instance a project. You may have a SharePoint Site Collection named Projects (https://projects), and you may require a new SharePoint Team Site for any new project that comes online so as to provide an environment where a team can collaborate on said project and maintain any data or documents about the project in a central location. This can best be described with URLs: https://projects/sites/Project1, https://projects/sites/Project2, and so on.

So how does Dynamics CRM play into my scenario?  What if you’d like to leverage this same functionality for things like Accounts, Contacts or Leads?  Pick an Entity from CRM — for the purposes of this blog the Entity is somewhat arbitrary, but we’ll use the Account Entity so we have something to focus on. Every new Account I create may or may not require an environment where teams can collaborate and store data and documents for this new Account. Dynamics CRM gives us SharePoint Document Management out of the box. But what if I need more? What if my company maintains a site ‘template’ that can be used to create a new, standardized website (SharePoint Subsite) for any new Accounts that come online? A place where a team can collaborate and have the freedom to deal with unstructured data and content in a central location?

In this blog I’ll show you how to leverage the SharePoint Client Object Model in a Dynamics CRM Plugin to create a SharePoint Team Site for new Accounts that come online.
