SharePoint in Microsoft 365 is a fantastic resource for building company intranets, and a lot of functionality is available without significant customizations.

Even various templates are available through Microsoft’s SharePoint lookbook, showcasing commonly used examples.

One feature missing from the default templates is an announcement banner, a common request for intranets. For example, companies may want to announce important information to their employees prominently on the intranet home page, such as an office closure due to weather or a reminder that timesheets are due.

There are examples of how to accomplish this in the SharePoint Patterns and Practices (PnP) GitHub repository (such as react-app-announcements), but these require developing a SharePoint Framework (SPFx) application customizer.

However, there is an easier way to display an announcement banner with less effort and without knowledge of SPFx.

A quick summary of how to accomplish this:

  • Set up a list in the intranet site to keep track of the announcements
  • Create a custom list view for showing current announcements
  • Use JSON formatting to modify that custom view
  • Place a list view web part on the intranet home page and point it to the custom view

Announcements List Setup

First, set up the announcements list.

From the home page of the intranet, choose “New” > “List” in the menu.

SharePoint Communication site

Then choose “Blank List.” In the dialog that appears, enter the list’s name (“Announcements”), add a description, and un-check the box to show in site navigation. Then press the “Create” button.

Company wide announcements

The new list is created, and the screen is redirected to the default list view.
Columns will need to be added to the list to track when to show the announcements.
Press “+ Add column,” then choose “Date and time.”

Create a Column for Values

In the slide-in panel for “Create a column,” enter and/or choose the following values for the column:

  • Name: Start Date
  • Description: The date that the announcement should begin displaying
  • Type: Date and time
  • Include Time: No
  • Friendly format: No
  • Default value: Today’s date

Apply details in created column

Then press the “Save” button to create the new column.

Next, press “+ Add column,” then choose “Date and time” again to add another column.

This time, enter and/or choose the following values for the column, then press the “Save” button to create the new column:

  • Name: End Date
  • Description: The date that the announcement should stop displaying
  • Type: Date and time
  • Include Time: Yes
  • Friendly format: No
  • Default value: None

Create an End Date Column

Another column should be created to define the type of announcement; this will also determine the icon and color displayed for the announcement.

Press “+ Add column,” then choose “Choice.”

Enter and/or choose the following values for the column:

  • Name: Announcement Type
  • Description: The type of announcement to display: Notice = Blue, Warning = Yellow, Alert = Pink
  • Type: Choice

For the Choices section, edit the existing three choices to be:

  • Notice
  • Warning
  • Alert

Change the color of the “Warning” choice by pressing the palette icon and choosing the “Gold” color.

Change the color of the “Alert” choice by pressing the palette icon and choosing the “Peach” color.

Customize Column Color

Select “Notice” to be the Default value, then press “More options” to make further adjustments to the column.

Choose Options for the Column

Choose the following options for the column, then press “Save” to create it.

  • Display choices using: Radio Buttons
  • Allow multiple selections: No
  • Require that this column contains information: No
  • Enforce unique values: No
  • Add to all content types: Yes

Create a section for a link

The last column to add is a link, for announcements that have more information the user needs to reference.

Press “+ Add column,” then choose “Single line of text.” Enter and/or choose the following values for the column, then press the “Save” button to create the new column:

  • Name: Announcement Link
  • Description: Link for more information
  • Type: Single line of text
  • Default value: #

Creating an Announcement Column

The Announcements list should now have all the columns needed:

  • Title
  • Start Date
  • End Date
  • Announcement Type
  • Announcement Link
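
If you prefer to script the list setup rather than click through the UI, the same columns can be provisioned with PnP PowerShell. This is a minimal sketch, assuming the PnP.PowerShell module and an example site URL; choice colors and default values would still be set in the UI:

Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/intranet" -Interactive

# Create the list (omitting -OnQuickLaunch keeps it out of site navigation)
New-PnPList -Title "Announcements" -Template GenericList

# Add the columns; the internal names are what the JSON formatting will reference
Add-PnPField -List "Announcements" -DisplayName "Start Date" -InternalName "StartDate" -Type DateTime
Add-PnPField -List "Announcements" -DisplayName "End Date" -InternalName "EndDate" -Type DateTime
Add-PnPField -List "Announcements" -DisplayName "Announcement Type" -InternalName "AnnouncementType" -Type Choice -Choices "Notice","Warning","Alert"
Add-PnPField -List "Announcements" -DisplayName "Announcement Link" -InternalName "AnnouncementLink" -Type Text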

Custom View for Announcements List

Next, a custom view of the Announcements list should be created to display current announcements.

On the right side of the list menu bar, press the “All Items” button, then choose “Create new view”.

New View Creation

In the dialog, enter “Current Announcements” for the View name, keep “Make this a public view” checked, then press “Create.”

Current Announcements

It should redirect to the Current Announcements list view. On the right side of the list menu bar, press the “Current Announcements” button, then choose “Edit current view.”

Edit Current Column View

In the “Columns” section, make sure the following columns are checked, to include them in the view:

  • Title
  • Start Date
  • End Date
  • Announcement Type
  • Announcement Link

Make sure columns are checked

In the “Sort” section, set the first sort to “End Date” and “Show items in descending order,” then set the second sort to “Start Date” and “Show items in descending order.”

Sort the Column by End Date

In the “Filter” section, set the filter to show items when “Start Date” is less than or equal to “[Today]” AND when “End Date” is greater than or equal to “[Today].”

Customize the Filter Section

Scroll down and press the “OK” button to finish editing the view.
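
For reference, the same view can be created with PnP PowerShell and a CAML query; a sketch, assuming the connection and internal column names from the list setup above:

$caml = @"
<Where>
  <And>
    <Leq><FieldRef Name='StartDate'/><Value Type='DateTime'><Today/></Value></Leq>
    <Geq><FieldRef Name='EndDate'/><Value Type='DateTime'><Today/></Value></Geq>
  </And>
</Where>
<OrderBy>
  <FieldRef Name='EndDate' Ascending='FALSE'/>
  <FieldRef Name='StartDate' Ascending='FALSE'/>
</OrderBy>
"@

# Public view filtered to announcements that are active today
Add-PnPView -List "Announcements" -Title "Current Announcements" -Fields "Title","StartDate","EndDate","AnnouncementType","AnnouncementLink" -Query $caml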

Custom View Formatting

Before applying some custom formatting to the view, enter a few items into the list for testing purposes.

To see the items in the view, make sure that for each item, the “Start Date” is less than or equal to today’s date, and the “End Date” is greater than or equal to today’s date. Then, choose at least one of each “Announcement Type” to see how the formatting changes for each type.

An example is below:

  • Title: Timesheets Due by COB 10/31/2021
  • Start Date: 10/26/2021
  • End Date: 11/1/2021 12:00 AM
  • Announcement Type: Warning
  • Announcement Link: #

Timesheets Due and Announcement Link

If the announcement banner should link to somewhere else, enter a full URL into the “Announcement Link” field.

After entering a few items into the list, it should look something like this:

Populated List

On the right side of the list menu bar, press the “Current Announcements” button, then choose “Format current view.”

Current View Format

In the sidebar, scroll down and choose “Advanced mode”.

Advanced Mode

The sidebar will now show an input field that contains JSON code. Copy the code from that field and paste it into a preferred code editor to make it easier to work with.

Format View

Microsoft has some great examples of list view formatting. Let’s walk through the formatting for this example.

Since this view will only show a banner, the list headers and the selection options should be hidden, so “hideSelection” and “hideColumnHeader” should be set to true.

Instead of the traditional table elements, we will use the “rowFormatter” to render custom elements per row/item.

In the row formatter, we should first determine what type the element should be, set by the “elmType” property. In this case, we will set it to a “div”.

Next, we will apply CSS class attributes to that div element based on certain conditions. There are many predefined CSS class names that SharePoint uses, which we can apply to the div. A few of the common ones are listed in their style guidelines.

The condition we want to check is the value of the “Announcement Type” column. When referencing a list column in JSON, enclose it in “[]” brackets, and prepend it with “$”. (Depending on the name of the column, sometimes spaces may also need to be substituted with “_x0020_”.)
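
If you are unsure of a column’s internal name, a quick check with PnP PowerShell (a sketch, assuming the same connection as before):

# InternalName is what the [$...] references in the JSON must match
Get-PnPField -List "Announcements" | Select-Object Title, InternalName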

In the code snippet shown, we check the following conditions and apply the appropriate CSS classes to the div based on that condition.

Announcement Type

{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/row-formatting.schema.json",
  "hideSelection": true,
  "hideColumnHeader": true,
  "rowFormatter": {
    "elmType": "div",
    "attributes": {
      "class": "=if([$AnnouncementType] == 'Notice', 'sp-row-card sp-field-severity--low', if([$AnnouncementType] == 'Warning', 'sp-row-card sp-field-severity--warning', if([$AnnouncementType] == 'Alert', 'sp-row-card sp-field-severity--severeWarning', '')))"
    },
    "children": [
    ]
  }
}

Next, we will specify how the child elements of the div are rendered. Under the “children” property, we will set up two span elements inside the div: one for the icon and one for the text.

For the span element with the icon, first we set up an “elmType” of “span”. Then we check the “Announcement Type” column again to reference the name of the icon from the Fluent UI library and apply it to the “iconName” attribute under “attributes.”

Attributes

We can also apply specific CSS styles to that span via the “style” property. Here we use custom padding, display, and font-size style attributes.

{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/row-formatting.schema.json",
  "hideSelection": true,
  "hideColumnHeader": true,
  "rowFormatter": {
    "elmType": "div",
    "attributes": {
      "class": "=if([$AnnouncementType] == 'Notice', 'sp-row-card sp-field-severity--low', if([$AnnouncementType] == 'Warning', 'sp-row-card sp-field-severity--warning', if([$AnnouncementType] == 'Alert', 'sp-row-card sp-field-severity--severeWarning', '')))"
    },
    "children": [
      {
        "elmType": "span",
        "attributes": {
          "iconName": "=if([$AnnouncementType] == 'Notice', 'Info', if([$AnnouncementType] == 'Warning', 'Error', if([$AnnouncementType] == 'Alert', 'Warning', '')))"
        },
        "style": {
          "padding-right": "12px",
          "display": "inline-block",
          "font-size": "1.5em"
        }
      }
    ]
  }
}

Next to the span with the icon, we will render another span with the “Title” text for that announcement item. It should be set as an “elmType” of “span.” Custom display and vertical-align styles can be applied with the “style” property. To render the title text, set the “txtContent” property to “[$Title]”.

{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/row-formatting.schema.json",
  "hideSelection": true,
  "hideColumnHeader": true,
  "rowFormatter": {
    "elmType": "div",
    "attributes": {
      "class": "=if([$AnnouncementType] == 'Notice', 'sp-row-card sp-field-severity--low', if([$AnnouncementType] == 'Warning', 'sp-row-card sp-field-severity--warning', if([$AnnouncementType] == 'Alert', 'sp-row-card sp-field-severity--severeWarning', '')))"
    },
    "children": [
      {
        "elmType": "span",
        "attributes": {
          "iconName": "=if([$AnnouncementType] == 'Notice', 'Info', if([$AnnouncementType] == 'Warning', 'Error', if([$AnnouncementType] == 'Alert', 'Warning', '')))"
        },
        "style": {
          "padding-right": "12px",
          "display": "inline-block",
          "font-size": "1.5em"
        }
      },
      {
        "elmType": "span",
        "style": {
          "display": "inline-block",
          "vertical-align": "top"
        },
        "txtContent": "[$Title]"
      }
    ]
  }
}

Select the code above and copy it. Go back to the editor for the custom view formatting (“Current Announcements” > “Format current view” > “Advanced mode”) and paste the code into the JSON code editor.

Format View List Layout

Press the “Save” button, and the “Current Announcements” list view should now look similar to this:

List View for Announcements

We may want to change the background color for notices to the same color chosen when creating the choices for the column. To do that, change “sp-field-severity--low” in the div’s class attribute to “sp-css-backgroundColor-BgCornflowerBlue.” Once that change is made to the code in the custom view formatting and “Save” is pressed again, any “Notice” announcements should now have a light blue background.
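
If you would rather apply the JSON from a script than through the Advanced mode editor, the view’s CustomFormatter property can be set with PnP PowerShell. A sketch, assuming the JSON is saved to a local file (banner-format.json is a placeholder name):

$json = Get-Content -Path ".\banner-format.json" -Raw
Set-PnPView -List "Announcements" -Identity "Current Announcements" -Values @{ CustomFormatter = $json }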

This looks great for basic announcements, but what if users need to link to an article or website for more information? To do that, we will need to take the current span children of each row and move them inside an anchor element used for the link.

Create a new “elmType” of “a” as the first object inside the main div’s “children” property. Set the anchor’s “href” attribute to reference the “Announcement Link” column, and apply a condition to set the anchor’s “target” attribute. We want the link to open in a new window if it is not the default “#” value. Next, add custom “style” properties to set the display, text-align, font-size, text-decoration, and color, then add an empty “children” property, where we will move the current icon and text spans.

{
        "elmType": "a",
        "attributes": {
          "target": "=if([$AnnouncementLink]!='#','_blank','_self')",
          "href": "[$AnnouncementLink]"
        },
        "style": {
          "text-align": "left",
          "font-size": "1.3em",
          "text-decoration": "none",
          "color": "#212121",
          "display": "block"
        },
        "children": []
      }, 

Now move the original child “span” elements into the children of the “a” element. The final code should look like this:

{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/v2/row-formatting.schema.json",
  "hideSelection": true,
  "hideColumnHeader": true,
  "rowFormatter": {
    "elmType": "div",
    "attributes": {
      "class": "=if([$AnnouncementType] == 'Notice', 'sp-row-card sp-CSS-backgroundColor-BgCornflowerBlue', if([$AnnouncementType] == 'Warning', 'sp-row-card sp-field-severity--warning', if([$AnnouncementType] == 'Alert', 'sp-row-card sp-field-severity--severeWarning', '')))"
    },
    "children": [
      {
        "elmType": "a",
        "attributes": {
          "target": "=if([$AnnouncementLink]!='#','_blank','_self')",
          "href": "[$AnnouncementLink]"
        },
        "style": {
          "text-align": "left",
          "font-size": "1.3em",
          "text-decoration": "none",
          "color": "#212121",
          "display": "block"
        },
        "children": [
          {
            "elmType": "span",
            "attributes": {
              "iconName": "=if([$AnnouncementType] == 'Notice', 'Info', if([$AnnouncementType] == 'Warning', 'Error', if([$AnnouncementType] == 'Alert', 'Warning', '')))"
            },
            "style": {
              "padding-right": "12px",
              "display": "inline-block",
              "font-size": "1.5em"
            }
          },
          {
            "elmType": "span",
            "style": {
              "display": "inline-block",
              "vertical-align": "top"
            },
            "txtContent": "[$Title]"
          }
        ]
      }
    ]
  }
}

Paste this updated code into the editor for the custom view formatting (“Current Announcements” > “Format current view” > “Advanced mode”), and press the “Save” button once again.

Current Announcements

The “Current Announcements” list view should now look similar to this:

Current Announcement Views

Display List View on Home Page

Now that the list view has been formatted as desired, it can be added to the intranet home page.

Go to the intranet home page and press the “Edit” button in the toolbar to edit the page.

Edit Current View

Hover over a prominent area on the page where the announcements should go, then press the “+” button to add a new web part there.

Add new web part

Scroll down to the “Documents, lists, and libraries” section in the menu and choose the “List” web part.

Choose the List web part

When the web part is added to the page, it will prompt for which list to select. Choose the “Announcements” list.

Announcements

By default, the List web part displays the All Items list view. Hover over the pencil icon next to the web part to edit its properties.

Edit Announcements List

In the slide-in panel, select the “Current Announcements” view, toggle on the “Hide command bar” switch, then press the “Apply” button.

Current Announcements

The List web part should now display the formatted “Current Announcements” view without additional list command bar controls.

Web Part List

In the page toolbar, press the “Save as draft” button to save the updates, then press “Republish” to publish the new version of the home page.

Save as a Draft

Now important company announcements are displayed on the intranet home page in a prominent location.

Republish

Conclusion

This is just one example of how a low-code solution with relatively little effort can be implemented to satisfy the need for a custom feature not natively provided by SharePoint.
For additional resources and other examples, refer to Microsoft’s documentation on list formatting.

AIS has tons of experience helping companies build useful intranets. Contact us to see how we can help.

I recently had the opportunity to perform a lift-and-shift migration of a SharePoint 2016 environment to cloud Infrastructure as a Service (IaaS) in Amazon Web Services (AWS). To support the long-term goals of the client, Okta would be implemented for authentication. Additionally, the client had several product integrations, including SQL Server Reporting Services (SSRS) (Integrated Mode), Office Online Server (OOS), Gimmal, and Nintex.

One of the first problems that I ran into was that very little knowledge or lessons learned were available. Okta does provide an Integration Guide; however, it simply walks you through setting up Okta as a Trusted Claims Provider for SharePoint. The guide does not cover or even mention potential architectural, migration, or integration concerns. I found a useful article at SharePointDoctors.com that does a great job filling in some of the gaps left in the Okta documentation, and I highly recommend reviewing it. One of the most critical points made by SharePointDoctors.com was to “Test, Test, and Test Again,” which is exactly what we did to discover and solve migration issues. In this post, I will share some of the issues we encountered migrating to Okta and what we did to mitigate them.

Lesson 1: Authentication Providers and the People Picker

When configuring Okta, there is no way to switch to it entirely; Windows Authentication is still required for service accounts and search crawls. In an Okta product presentation, around the 8:20 mark, the presenter glosses over this fact. He claims that when they are ready for the final cutover, they disable Windows Authentication.

Claims Authentication Types

Initially, we had both Okta and Windows Authentication enabled for the Default Zone. If you configure SharePoint this way, users will be asked to select which credentials to use to log on to the site when they navigate to it.

Windows Authentication Sign In

If you do not want users to be prompted with this, follow these steps:

  1. Open Central Administration
  2. Select Manage Web Applications
  3. Select your web application and then Authentication Providers from the ribbon
  4. Select the Zone (usually Default)
  5. Scroll down to the Sign In Page URL and select Custom Sign In Page
  6. Enter /_trust/

Custom Sign In Page

This will force the user to use Okta when navigating to the site. You can find additional information about the login sequence here.
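
If you prefer to script this instead of using Central Administration, the following is a minimal sketch (the web application URL is an example, and the property assignment may vary slightly between SharePoint versions):

Add-PSSnapin Microsoft.SharePoint.PowerShell

$webApp = Get-SPWebApplication "https://internal.contoso.com"
$settings = $webApp.IisSettings[[Microsoft.SharePoint.Administration.SPUrlZone]::Default]

# Send users straight to the trusted provider instead of the sign-in selection page
$settings.ClaimsAuthenticationRedirectionUrl = New-Object System.Uri("/_trust/", [System.UriKind]::Relative)
$webApp.Update()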

When we configured both authentication modes in the Default Zone, we found that whenever the People Picker was used, it would return two users: The Windows User and the Okta User. We knew that this would be very confusing and decided to extend the Web Application with a different Zone. Thus, we had a web application (ex. internal.contoso.com) in the Default Zone with Windows Authentication and a second web application (ex. contoso.com) in the Internet Zone with Okta Authentication. Using this arrangement, admins, third-party integrations, and search crawls could operate in the Default Zone, and all users would have access within the Internet Zone. You will understand why we chose this as you see the issues we encountered later in this article.

If the issues that we encountered are not applicable and you decide to use both authentication types in the Default Zone, you can hide AD from the People Picker using the following script:

Add-PSSnapin Microsoft.SharePoint.Powershell
$cpm = Get-SPClaimProviderManager
$ad = get-spclaimprovider -identity "AD"
$ad.IsVisible = $false
$cpm.Update()


Lesson 2: Migrating Claims

Using Move-SPUser

When I started scripting out the migration of users, I initially started with the script provided by SharePointDoctors.com, iterating over each site collection and calling Move-SPUser for each user.

However, SharePointDoctors.com warns that migrating a user twice is bad:
“SharePoint deletes all instances of the original user and replaces it with a new blank one that owns nothing. As a result, you lose all permissions information about the user.”

This concerned me greatly, since we had four web applications and numerous site collections with the same user in several places. I worried that if Move-SPUser was called more than once for the same user, the user would be overwritten; after meeting with Microsoft, I found that this is not the case. The overwrite concern applies if a user logged into the system pre-migration, creating a scenario where both an Okta user (ex. c:0-.t|okta|chettinger) and a Windows user (ex. i:0#.w|ais\chettinger) exist in the system. Once Move-SPUser migrated the Windows user, the original Okta user would be overwritten with a new Okta user. In other words, there is no problem with calling Move-SPUser more than once if you happen to do so over multiple site collections.
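
For a single user, the claim migration looks like this sketch (example identities; your trusted provider name and encoding character may differ):

Add-PSSnapin Microsoft.SharePoint.Powershell

$web  = Get-SPWeb "https://internal.contoso.com"
$user = Get-SPUser -Web $web -Identity "i:0#.w|ais\chettinger"

# -IgnoreSid is needed when moving from a Windows claim to a trusted provider claim
Move-SPUser -Identity $user -NewAlias "c:0-.t|okta|chettinger" -IgnoreSid -Confirm:$false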

Planning Time for User Migration

Another interesting issue that we encountered was the time it took to migrate many users (20k+). After working with Microsoft and researching the logs, we found that it took longer to migrate a user on servers where third-party integrations were installed. For example, when we ran the script on the SSRS or Search servers, it would only take 1 second to migrate a user. If we ran it on one of the application servers, it would take 1.5 minutes per user. In our case, we had Nintex and Gimmal installed. After working with Microsoft and testing thoroughly, we determined that it was perfectly safe to run the migration on the faster servers and that there was no negative impact on the migration.

Using SPFarm.MigrateUser()

While working on the script to migrate groups using SPFarm.MigrateGroup(), I found that there was also an SPFarm.MigrateUser() function. It seemed more efficient to build a list of users and iterate over it, calling SPFarm.MigrateUser() for each one. Once again, we met with Microsoft, and they assured us that the SPFarm.MigrateUser() function would behave just like the Move-SPUser command, only at the farm level. Ultimately, we used this as it allowed us to batch up the user migration into PowerShell Jobs easily. This is the script that we ended up using across multiple servers.

Add-PSSnapin Microsoft.SharePoint.Powershell
$domain = "ais"
$dryrun = $true

$multiplier = 0   #The index of this server instance migrating users (0,1,2,3,etc.)
$jobCount = 20    #The number of parallel jobs
$chunkSize = 100  #The number of users to process at one time
$maxUsers = 2500  #The number of users to process on this server

$usersString = 
"<ActiveUsers>
    <ActiveUser>i:0#.w|ais\user1</ActiveUser>
    <ActiveUser>i:0#.w|ais\user2</ActiveUser>
    <ActiveUser>i:0#.w|ais\user3</ActiveUser>
    <ActiveUser>i:0#.w|ais\user4</ActiveUser>
</ActiveUsers>"

#Using a string in this example, but a separate file would be more appropriate
$ScriptRoot = Split-Path $MyInvocation.MyCommand.Path
#[xml]$UsersXml = (Get-Content "$($ScriptRoot)\ActiveUsers.xml")

[xml]$UsersXml = $usersString
$users = $UsersXml.ActiveUsers.ActiveUser

#Use Dry Run to test
$dryrunText = "[DRYRUN]"
if($dryrun -eq $false){
    $dryrunText = ""
}

if($maxUsers -ne $null){
    $users = $users | select -Skip ($maxUsers*$multiplier) | select -first $maxUsers
}

$oktaClaimChar = Get-SPClaimTypeEncoding |  Where-Object { $_.ClaimType -like '*Okta*' }

Write-Host "$($dryrunText)Start: $(Get-Date)"

#Build Chunks
$chunks = [System.Collections.ArrayList]::new()
for ($i = 0; $i -lt $users.Count; $i += $chunkSize) {
    if (($users.Count - $i) -gt ($chunkSize-1)) {
        $chunks.add($users[$i..($i + ($chunkSize-1))]) | Out-Null
    }
    else {
        $chunks.add($users[$i..($users.Count - 1)]) | Out-Null
    }
}


for ($i = 0; $i -lt $chunks.Count; $i++) {  
    $chunk = $chunks[$i]
    Write-Progress -Id 0 -Activity Updating -Status 'Progress->' -PercentComplete ($i/$chunks.Count * 100) -CurrentOperation Chunks
    $running = @(Get-Job | Where-Object { $_.State -eq 'Running' })
    if ($running.Count -ge $jobCount) {
        $running | Wait-Job -Any | Out-Null
    }
    $jobName = "Chunk$i"
    $job = Start-Job -Name $jobName {
        Add-PSSnapin Microsoft.SharePoint.Powershell
        $chunk = $using:chunk
        $dryrun = $using:dryrun
        $dryrunText = $using:dryrunText
        $i = $using:i
        
        $oktaClaimChar = $using:oktaClaimChar        
        $farm = Get-SPFarm

        for ($j = 0; $j -lt $chunk.Count; $j++) {
            $user = $chunk[$j] 
            if($user -ne $null)
            {
                $oldUserName = $user.ToLower()
                $newUserName =  $user.Replace("i:0#.w|", "i:0$($oktaClaimChar.EncodingCharacter).t|okta|")
                $newUserName =  $newUserName.Replace("$domain\", "")               
                if($oldUserName -ne $newUserName)
                {
                    Write-Host "  $($dryrunText) Moving User $oldUserName  to $newUserName"
                    if($dryrun -eq $false)
                    {     
                        try{    
                            $farm.MigrateUserAccount($oldUserName,$newUserName,$false)
                        }catch{
                            Write-Host $_
                        }
                    }              
                }     
            }                       
        }      
    }
}
Wait-Job * | Out-Null 

# Process the results
foreach($job in Get-Job)
{
    $result = Receive-Job $job
    Write-Host $job.Name
    Write-Host $result
}
Remove-Job -State Completed
Write-Host "$($dryrunText)End: $(Get-Date)" 

Lesson 3: Integration with SQL Server Reporting Services (SSRS)

As mentioned earlier, our environment was running SQL Server Reporting Services (SSRS) – Integrated Mode. There were no changes necessary for reports to work for the end-user. However, for report authors to create and edit reports, they needed to use Windows authentication. How you decide to handle this is tightly coupled with what I covered in Lessons 1 and 2. If you choose to use both Okta and Windows Authentication in a single zone, you will face issues when editing a report with the Report Builder while logged in as an Okta user.

This was the second reason why we went with two authentication zones. To edit the report, the authors would connect to the Default Zone URL (ex. https://internal.contoso.com); however, if the data source is a SharePoint list, the Internet Zone URL is used (ex. https://contoso.com). SharePoint will respect the permissions of the SharePoint user (in this case, Okta).

Input Data Source Type

For all of this to work together, we migrated content, then migrated users and groups to Okta claims, and then added new Windows groups so that certain users could log in with Windows credentials and edit reports.

CAUTION: This creates the scenario that I warned about, so make sure your user migration was successful before adding these groups and letting report authors access the system. If you migrate a Windows user to Okta, and then the user logs in with Windows credentials, there will be two claims in the system (ex. c:0-.t|okta|chettinger and i:0#.w|ais\chettinger). If you were to migrate the new Windows user a second time, it would likely overwrite the Okta user and its permissions.

Lesson 4: Integration with Microsoft Office Products

Microsoft Word

Okta did not seem to consider Microsoft Office when developing its SharePoint integration solution. Editing items in Word, Excel, and PowerPoint is an important feature, and our users wanted it to work. When the Open in Word option is used on a file in SharePoint, the Word application will open on the user’s computer and attempt to authenticate with Okta.

Open in Word

Under the hood, Office products use an outdated browser control based on Internet Explorer version 9. The Okta login page would not render correctly in the browser control due to compatibility issues; it would throw a script error, and the page controls would not load.

Script Error

We had to work with Okta to get them to change it for our login page. Microsoft loosely explains how to fix it in this article; however, Okta had to detect the browser version and add the meta tag accordingly. Ultimately, if you plan on using Office products with SharePoint, you will need to work with Okta to get your login page fixed.

Microsoft Outlook

We also ran into a unique issue with Microsoft Outlook and adding Calendars. Outlook 365 users had to go to File > Options > Trust Center > Form-based Sign-in and choose Ask me what to do for each host.

Trust Center

For Outlook clients before 365 (Outlook 2019 in our case), the Form-based Sign-In option was unavailable. We had to work with our Group Policy Object (GPO) Administrators and create a GPO to set this and add the hostname (ex. contoso.com). Unfortunately, this only partially fixed the problem; once users added the calendar, they started getting prompted for Windows credentials. After working with Microsoft, we found out that when the calendar is added to Outlook, it stores a URL, which it gets from SharePoint based on the first Zone it finds a URL in.

It checks the Zones in the following order: Intranet, Default, Extranet, Internet, Custom. If you remember, we had the following:

  • Intranet – empty
  • Default – https://internal.contoso.com (Windows Auth)
  • Internet – https://contoso.com (Okta Auth)
  • Extranet – empty
  • Custom – empty

Outlook was storing the URL from the Default Zone, which used Windows authentication, and so it prompted the user. So, what was the fix? We had to move https://contoso.com to the Intranet Zone so that Outlook would store that URL instead (a scripted sketch follows the list below).

  • Intranet – https://contoso.com (Okta Auth)
  • Default – https://internal.contoso.com (Windows Auth)
  • Internet – empty
  • Extranet – empty
  • Custom – empty
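
A sketch of the zone change with PowerShell (example URL; because our Okta site was an extended web application, your topology may instead require un-extending and re-extending into the new zone):

Add-PSSnapin Microsoft.SharePoint.PowerShell

# Move the public URL from the Internet zone to the Intranet zone
Set-SPAlternateURL -Identity "https://contoso.com" -Zone Intranet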

Lesson 5: Integration with Nintex Forms and Workflows

When we started testing Nintex Forms and Workflows, we quickly found that users had not been migrated as we had hoped. So the first thing we did was add the Nintex MigrateUser operation right after the farm user-migration call in the script above:

$farm.MigrateUserAccount($oldUserName,$newUserName,$false)
NWAdmin.exe -o MigrateUser -oldUser $oldUserName -newUser $newUserName 

According to Nintex, this command only updates user settings and history, not the workflow definitions or running workflows. So, to fix the workflows, I wrote a script that recursively goes through all of SharePoint and looks for the hidden library NintexWorkflows. For each of these libraries, the script exports each workflow, replaces the Windows claim with an Okta claim, and then redeploys the workflow. It does all of this using NWAdmin.exe operations and was approved by Nintex. Here is the script that we used:

$domain = "ais"
$dryrun = $true
$dryrunText = "DRYRUN"


$oktaClaimChar = Get-SPClaimTypeEncoding | Where-Object { $_.ClaimType -like '*Okta*' }
$encodedValue = [int]([char]($oktaClaimChar.EncodingCharacter))
$encodingCharacter = "&#$($encodedValue);"
if($dryrun -eq $false){
    $dryrunText = ""
}

function CheckWorkflow($asset)
{ 
    $text = [System.Text.Encoding]::ASCII.GetString($asset.OpenBinary())

    if($text.Contains("i:0#.w|$domain\"))
    {        

        try {
            $assetConfig = $asset.ParentFolder.Files | Where-Object Name -eq $($asset.Title +".xoml.wfconfig.xml")
            $configText = [System.Text.Encoding]::ASCII.GetString($assetConfig.OpenBinary())
            [xml]$configXml = $configText
            $listId = $configXml.WorkflowConfig.Association.ListID

            $path = $asset.Web.Url.Replace('https://','')
            $pattern = '[\\/]'
            $path = $path -replace $pattern, '-'
            $nwfFile = "C:\Temp\NintexMigration\$path\$($asset.title).nwf"


            if((Test-Path "C:\Temp\NintexMigration\$path") -eq $false){
                New-Item -ItemType Directory -Path "C:\Temp\NintexMigration\$path"
            }

            if($null -ne $listId)
            {
                $list = $asset.Web.Lists | Where-Object Id -eq $listId
                $listName = $list.Title

                $output = & NWAdmin.exe -o ExportWorkflow -siteUrl $($asset.Web.Url) -list "$($listName)" -workflowName "$($asset.title)" -fileName "$($nwfFile)" -workflowtype list           
                if ($output  -eq "Exporting complete.")
                {
                    $nwfText = Get-Content -Path "$($nwfFile)"
                    $newNwfText = $nwfText
                    $newNwfText = $newNwfText.Replace("i:0#.w|$domain\","i:0$($encodingCharacter).t|okta|")    
                    Set-Content -Path "$($nwfFile)" -Value $newNwfText

                    Write-Host "$dryrun TextChange Type=""List"" SiteUrl=""$($asset.Web.Url)"" TargetList=""$($listName)"" WorkflowName=""$($asset.title)"" NWFFile=""$($nwfFile)"" Web=""$($asset.Web.Url)"" File=""$($asset.Url)"" DateTime=""$(get-date -f MM-dd-yyyy_HH_mm_ss)"""
                    if($dryrun -eq $false) {  
                        & NWAdmin.exe -o DeployWorkflow -siteUrl $($asset.Web.Url) -targetlist "$($listName)" -workflowName "$($asset.title)" -nwffile "$($nwfFile)" -overwrite
                    }
                } else{
                    Write-Host "$dryrunText $output"
                }
            }
            else
            {
                $output = & NWAdmin.exe -o ExportWorkflow -siteUrl $($asset.Web.Url) -workflowName "$($asset.title)" -fileName "$($nwfFile)" -workflowtype site            
                if ($output  -eq "Exporting complete.")
                {
                    $nwfText = Get-Content -Path "$($nwfFile)"
                    $newNwfText = $nwfText
                    $newNwfText = $newNwfText.Replace("i:0#.w|$domain\","i:0$($encodingCharacter).t|okta|")   
                    Set-Content -Path "$($nwfFile)" -Value $newNwfText

                    Write-Host "$dryrun TextChange Type=""Site"" SiteUrl=""$($asset.Web.Url)"" WorkflowName=""$($asset.title)"" NWFFile=""$($nwfFile)"" Web=""$($asset.Web.Url)"" File=""$($asset.Url)"" DateTime=""$(get-date -f MM-dd-yyyy_HH_mm_ss)"""
                    if($dryrun -eq $false) {  
                        & NWAdmin.exe -o DeployWorkflow -siteUrl $($asset.Web.Url) -workflowName "$($asset.title)" -nwffile "$($nwfFile)" -overwrite
                    }
                } else{
                    Write-Host "$dryrunText $output"
                }
            }

        } catch {
            Write-Host $_
        }
    }   
}

function CheckWorkflows($w)
{
    foreach ($list in $w.Lists)
    {
        if ( $list.title.tolower().contains( "nintexworkflows" ) )
        {
            foreach ($item in $list.items)
            {
                $asset = $item.file
                CheckWorkflow($asset)
            }
        }
    }
    foreach($sub in $w.Webs)
    {
        CheckWorkflows($sub)    
    }
}

$spWebApps = Get-SPWebApplication
foreach ($spWebApp in $spWebApps)
{      
    foreach ($spSite in $spWebApp.Sites)
    {
        if ($null -ne $spSite)
        {
            CheckWorkflows($spSite.RootWeb)
            $spSite.Dispose()
        }
    } 
}

Conclusion

There is much to consider if you want to use Okta as your authentication provider for SharePoint on-premises. If you are using integrations such as Nintex, SSRS, and Microsoft Office, there will be a lot of work ahead of you. Hopefully, this blog post will save you some time with planning and risk mitigation. Either way, the most important takeaway is to be sure to test thoroughly.

We're hiring for SharePoint careers at AIS. Help us deliver solutions and support client SharePoint environments.

Do your users want reports in SharePoint? Yes! They crave reports and charts. Regardless of which version of SharePoint they are using, getting started now to build their data visualizations in Power BI will position the reports for seamless migration to future SharePoint versions. These are the necessary steps to take to add a simple report in a SharePoint modern page.

Important note: To embed Power BI reports in SharePoint Online, a Power BI Pro license is required.

The Process Flow

This is the flow:

The Process Flow Figure 1

Figure 1: The Process Flow

  1. Create your data in SharePoint; say a list or library.
  2. Start Power BI Desktop to connect to SharePoint to read the data and transform it and create visualizations.
  3. Publish to Power BI Online, where a link becomes available to paste into the Power BI webpart available in SharePoint Modern pages.

We’ve gone full circle! Let’s look at the steps.

SharePoint Online Data

For this example, we will read data from a simple custom list. I added the list to my SPO Dev Tenant site named Vacation Planner. Since all our “vacay” are now “staycay,” I decided to make a board game list. Along with the default Title column that you get with any new list, I added three more. Each is a number column. Then I added games to the list; I listed every game I could think of. For each one, I entered somewhat random numbers for Difficulty and Minimum age and Popularity, although I am pretty sure Candy Land is for 4-year-olds.

SharePoint Online Data Figure 2

Figure 2: Board games list

To create the list, I was logged into SPO as a fictitious test user I named Gwen Lim.

Build the Report

Install the Power BI Desktop application to build the report. It’s free: download it here.

On first use, you will be prompted to sign in. If the login type option appears, choose “Organizational” and log in with a Windows account. I logged in with fictional Gwen Lim’s account. In the app, choose “Get Data,” either from the startup splash screen or the menu.

Select Data Source Figure 4

Figure 3: Select a data source

From the Common data sources dropdown, select “More…” at the bottom. Then click the “Online Services” option, and you should see “SharePoint Online List” on the right. Select that and then click “Connect” at the bottom.

Choose SharePoint online Figure 5

Figure 4: We will choose SharePoint online list

In the SharePoint Online Lists dialog, paste the URL of the SharePoint site that contains your list. You can check the 2.0 (Beta) radio button to have the app use the default view of your list, or leave it at 1.0 if you prefer.

SharePoint Site URL Figure 6

Figure 5: Enter the SharePoint site URL

A Navigator window appears with all of the lists available in the SharePoint site in the left column, each with a checkbox. Check BoardGames to see a preview of the data on the right side of the pane, then click the Load button.

Select the List Figure 7

Figure 6: Select the List

You can start building the report. The fields of the data display on the right side. Having chosen a specific, limited column view as default for the list, along with selecting the 2.0 radio button, you will see only a few fields (aka columns) on the right, which is easy to work with.

BoardGames List App

Figure 7: The BoardGames list fields appear

Ignore the fields for a moment while you choose a visualization. Select the doughnut. Now, it’s time to apply fields to the doughnut. Drag Title into the Legend box under Visualizations. A legend appears beside the doughnut chart. Drag Popularity into the Values box, and your doughnut comes to life with color.

Pick a visualization Figure 9

Figure 8: Pick a visualization chart and add fields

When you hover over the chart, tooltips appear with data for each game. Age level, Difficulty, and Popularity values have been imported as decimal values, which would be more readable as whole numbers. To alter this, and to edit column heading text, click the ribbon’s Transform Data button.

Modify the Data Figure 10

Figure 9: Modify the data

To change a column value from a decimal to a whole number, click the column title to select it and then click the ribbon’s Data Type button. Select Whole Number, as in Figure 10. Double-click the column heading to rename the column.

Changing field titles and data types

Figure 10: Changing field titles and data types

Click the Close & Apply button on the left in the ribbon to cause the visualization to appear with the changes applied. Now when you hover your cursor over a section, Minimum Age will appear with a space and both values as whole numbers.

Ready to Publish Figure 11

Figure 11: Improved tooltips

Display in SharePoint

To display the report in SharePoint, click the Publish button in the ribbon on the right side. You will be prompted to save your report in .pbix format.

Ready to publish report figure

Figure 12: Ready to publish!

Save the file anywhere you want to keep it, and the Publish to Power BI dialog appears. Varied workspaces can be configured, but initially you only have “My Workspace” as an option, which is fine. Select it and then click “Select.”

Publishing to Power BI

Figure 13: Successful publish to Power BI Online

When you see the Success! dialog, click the link to open the .pbix in Power BI online and view your report. In the Share menu above the report, drop down the menu options and hover over Embed report. Here you should see an option for SharePoint Online.

Link to use in SharePoint page Figure 14

Figure 14: Get a link to use in a SharePoint page

This option will be missing until you upgrade to a Power BI Pro license. It is not free, but there is a 60-day trial. Once that option appears in the menu, select it, and you are rewarded with the embed link to use in SharePoint.

Embed link for SharePoint

Figure 15: Click to highlight and then copy

Click that link to highlight it and copy it. Now head over to your SharePoint site and create a page.

Locate built-in Power BI

Figure 16: Locate the built-in Power BI webpart

Click the webpart plus sign, and in the search box, type “power” to filter the results. The Power BI webpart will appear. Click on it, and the webpart will be inserted into your page. You will see a green button for Add report; click it to open the properties panel on the right. Paste in the embed link you got from Power BI online.

Apply the embed link

Figure 17: Apply the embed link

Click away from that textbox and your report will appear on the left.

Report Displayed Correctly

Figure 18: Report successfully displayed in SharePoint Online

Conclusion

This is a no-code solution and a simple demo. However, the depth of tooling provided by Power BI to enable developers and business data experts to transform and visualize organizational data is immense. The speed and ease with which we can integrate data reporting into SharePoint modern pages will be welcome to customers as they migrate to current SharePoint versions.

Links

Embed a report web part in SharePoint Online – Power BI | Microsoft Docs

I support projects where we have platforms like SharePoint, are looking towards adopting PowerApps, and have Azure Government subscriptions. While learning about containers, I wondered where they could fit into an environment where many applications are developed on top of SharePoint. This helped me better understand containers and discover the perfect use case. I thought of sharing my architecture idea through this blog post. If you would like to learn more about container architecture, read this blog post on Container Architecture Basics from our VP of Solution Engineering, Brent Wodicka.

Architecture Idea

If you are maintaining a SharePoint on-premises farm in a local data center or in the cloud, you cannot replace your SharePoint farm VMs with containers. But what you can do is take the heavy business logic code out of your custom SharePoint applications and deploy it as services using containers. Deploy your backend services using containers, then call these service endpoints (REST API endpoints) from your SharePoint Framework web parts and render the UI inside the web parts.

If your application is developed using PowerApps, I believe you can call custom service endpoints in PowerApps. So, your front end can still be deployed using SharePoint or PowerApps, but your business logic can be deployed as services inside the containers.

Where can containers fit in a multi-platform environment?

Answer: The two diagrams below illustrate the use case for containers in the multi-platform environment discussed above.
Business Logic Services Snapshot

Backend Databases and Cloud Services

Advantages of this Architecture

  • Since you have server-side code deployed inside the Azure cloud, you can easily integrate with other Azure PaaS services, including Azure SQL Database and Azure Cognitive Search.
  • Since most business logic is deployed inside the containers as services, it is easy to move these services to any other cloud provider.
  • Suppose you have complex legacy applications developed inside SharePoint, and you are storing the data in SharePoint lists. In that case, you can move that list data to an Azure SQL database and call the Azure SQL APIs from the services deployed inside the containers (see the second diagram above).

Conclusion

Suppose you have heavy business logic written in front-end JavaScript files in SharePoint. You can rewrite that code as server-side C#, deploy it as services inside containers, and call those service endpoints from SharePoint web parts. Complex application data can even be moved from SharePoint lists to Azure SQL databases. Containers solve deploying your custom code as services, but they cannot replace your SharePoint infrastructure.

In this post, I will show you how DevOps practices can add value to a variety of Office 365 development scenarios. The practices we will discuss are Infrastructure as Code, Continuous Integration, and Continuous Delivery. The advances in DevOps and SharePoint Framework (SPFx) have allowed us to make advancements in the way that we develop software and have improved our efficiency.

Infrastructure as Code (IaC)

Practicing IaC means that the infrastructure your applications depend on is created and maintained by code that is source controlled, tested, and deployed to production much like software. When discussing IaC, we’re typically talking about provisioning resources to a cloud provider. In our case, the “infrastructure” is Office 365 – a SaaS product with extensive customization and configuration options.  

While you could manage your O365 tenant with PowerShell, the code-centric and template-based PnP Provisioning Framework aligns better with this practice because: 

  1. Using the framework’s declarative XML syntax, you describe what you want to exist rather than writing code to manage how it gets created.
  2. It is easier for developers to run idempotent deployments to enact the desired state of your Office 365 tenant.  

While originally developed to support SharePoint Online and on-premises deployments, you can see in its latest schema that it has expanded to support Microsoft Teams, OneDrive, and Active Directory.
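
As an illustration, deploying a template with PnP PowerShell looks roughly like the sketch below. Cmdlet names vary by module version (older releases use Apply-PnPProvisioningTemplate; newer ones rename it Invoke-PnPSiteTemplate), and the parameter names here are illustrative:

Connect-PnPOnline -Url "https://contoso.sharepoint.com" -Interactive

# Idempotent: re-running enacts the same desired state described in the XML
Apply-PnPProvisioningTemplate -Path ".\template.xml" -Parameters @{
    "TenantUrl" = "https://contoso.sharepoint.com"
    "SiteOwner" = "admin@contoso.com"
}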

Continuous Integration (CI) 

The practice of continuously integrating first means that your team has established the habit of frequently merging small batches of changes into a central code repository. Upon that merge, we automatically build and test the code to quickly identify bugs and quality issues.  

SharePoint Framework is a commonly used tool to extend the capabilities of SharePoint Online and on-premises. Much like the Provisioning Framework, SharePoint Framework is expanding to support other Office 365 services. You can currently use it to develop for Microsoft Teams and will soon be able to use it to develop Office Add-ins.

Azure DevOps is a one-stop-shop service that provides everything you need throughout the software development lifecycle. For example, your team can version control your project’s source code in Repos. After merging changes, use Pipelines to trigger a CI process that runs the build and test tasks of your SharePoint Framework solution.

Continuous Delivery (CD)

Continuous Delivery, the practice of running automated deployments through a sequence of environments, starts after a completed CI process. Azure DevOps Pipelines will again be the tool of choice to execute the deployment procedures against each environment.

Example Solution

A solution demonstrating how to use the technologies and practices described above is available on the Applied Information Sciences GitHub account. The result is a pipeline capable of receiving frequent changes to O365 configuration and SPFx applications from one or many developers, verifying the quality of the change, and deploying it to a series of environments.

Dev Tenant Diagram

I encourage you to explore the source code using the following summary as a guide. You’ll find the solution organized into three areas – SPFx, Provisioning, and Pipeline.

SPFx

A simple hello world web part was created using the Yeoman generator. Jest was added to test the code. Npm and gulp scripts are used to build and package the source code, which produces an .sppkg file.

Provisioning

The PnP Provisioning Template XML file found here defines the desired state of the target tenant. The following is the desired state:

  1. Install the SPFx App into the tenant App Catalog.
  2. Create a site collection that will host our web parts page.
  3. Install the SPFx App to the Site Collection.
  4. Create a page that will host our web part.
  5. Add the web part to the page.

By using parameters for the tenant URL and site owner, the same template can be deployed to multiple environments. A PowerShell build script bundles the template and all required files, such as the SPFx .sppkg file, into a single .pnp file ready for deployment.
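
A sketch of that bundling step, assuming the template XML and the .sppkg sit in a local provisioning folder (folder and file names are illustrative):

# Packages the folder contents, including the .sppkg, into a single .pnp file
Convert-PnPFolderToProvisioningTemplate -Out ".\deploy-package.pnp" -Folder ".\provisioning"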

Pipeline

A multi-stage YAML pipeline defined in the Pipeline folder of the example solution runs the following process:

  1. Build, test, and package the SPFx and Provisioning Template source code.
  2. Deploy the prerequisite SharePoint infrastructure to the tenant if it does not already exist.
  3. Install and configure the SPFx web part.
  4. Repeat #2 and #3 for all environments.

Build Process Diagram

Secret variables, such as the username and password used to connect to the tenant, are only referenced in the pipeline. The values are set and encrypted in the Azure DevOps pipeline editor.

Variables Diagram with Passwords

Conclusion

In the not-too-distant past, it was high effort to write unit tests for a SharePoint solution, and most deployments were manual. In this post, I have shown you how advancements in the platform and tooling have changed this. The mentality, practices, and tools brought by DevOps can improve the pace and quality of any software development and infrastructure management project, including projects building upon Office 365.


If you didn’t catch the first two parts of this series, you can do that here and here.  In this part, we’ll get a little more technical and use Microsoft Flow to do some pretty cool things. 

Remember when we talked about the size and quality of the images we take with our Power App and store as the entity image? When saved as the Entity Image for a CDS/D365 item, the image loses quality and is no longer good for an advertisement photo.  This is done automatically, and as far as I can tell, the high-res image is gone once this conversion takes place (someone please correct me if I’m wrong on that!).  On the flip side, it doesn’t make a whole lot of sense to put all this tech together only to have my end users take two pictures of an item, one high-res and one low-res.  And we don’t want to store high-res images in a relational database for 10,000-plus items because the database could bloat immensely.

Microsoft Flow and SharePoint to the rescue!  

PRO TIP:  Dynamics 365 will crop and resize the image before saving it as the entity image.  All entity images are displayed in a 144 x 144 pixel square.  You can read more about this here.  Make sure to save/retain your original image files.  We’re going to stick ours in a SharePoint Picture Gallery App.

Objective 

Create a Microsoft Flow that handles… 

  • Pulling the original image off the Dynamics record and storing it in SharePoint. 
  • Setting the patch image to the Entity Image for the Dynamics record 
  • Create an advertisement list item for the patch 
  • Save the URLs for the ad and image back to the patch record 

Create the Flow 

We’re going to write this Flow so that it’s triggered by a Note record being created. 

 Flow screenshot with Create from blank highlighted

  • On the next page, click “Search hundreds of connectors and triggers” at the bottom of the page. 
  • Select Dynamics 365 on the All tab for connectors and triggers. 
  • Select the “When a record is created” trigger. 

 Dynamics 365 is highlighted

  • Set the properties for Organization Name and Entity Name.  Entity Name should be “Notes”. 
  • Save the Flow and give it a name. 

Verifying a Few Things 

  • Add a new step and select the Condition item. 
  • The Condition should check to see if the Note has an attachment. We do this using the “Is Document” field.  

 Condition Control is highlighted 

  • In the “Yes” side of the conditional we want to check if the Object Type is a Patch (ogs_patch in this case).  

At this point, if the Flow has made it through both conditionals with a “Yes”, we know we are dealing with a new Note record that has an Attachment and belongs to a Patch record.   

Update the Patch Record 

Now we want to update the patch record’s Entity Image field with the attachment.  First we need to get a handle on the Patch record.  We’ll do that by adding an Action to the Yes branch of our new Conditional. 

  • Add a Dynamics 365 Update a Record Action.
  • Set the Organization Name, Entity Name, and Record identifier accordingly.  For our Patch Record identifier, we’ll use the Regarding field in the Dynamic content window. 

 

  • Click on Show advanced options and find the Picture of Patch field. 
  • For the Picture of Patch field we need to get the document body of the attachment and convert it from Base-64 encoding to binary.  We do this using the “Expression” area again.  Use the “base64ToBinary” function to convert the document body like so. 

 

  • Save your work!  I can’t tell you how many times I had to retype that function. 

Create Our SharePoint Items & Clean-up 

Now that we’ve updated our entity image with the uploaded patch picture we want to do a couple of things, but not necessarily in sequence.  This is where we’ll use a parallel branch in our Flow.   

Dealing with a Parallel Branch 

  • Under the last Update a Record action, add a Conditional.  After adding this Conditional hover over the line between the Update action and the new conditional.  You should see a plus sign that you can hover over and select “Add a parallel branch.” 



  • Select this and add a Compose action.  You may need to search for the Compose action. 

 

PRO TIP:  With Modern Sites in SharePoint, we now have three solid options for displaying images in SharePoint.  The Modern Document Library allows viewing as tiles and thumbnails within a document library, the Picture Library which has often been the place to store images prior to the Modern Document Library, and then we can simply just display an image, or images, on a page directly.

Saving the Attachment as an Image in SharePoint

  • Let’s deal with the Compose branch first.  Our Compose will use the same function for its Input field as our Picture of Patch did above: base64ToBinary(triggerBody()?['documentbody'])
  • After the Compose, we’ll add a SharePoint Create File Action and use the name from our Patch record as the name for our image in SharePoint.  I’m using a Picture Gallery App in SharePoint and, for now, only the .JPG file type.  The File Content should use the Output from our Compose Action, as sketched below.
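
As a rough sketch, the Create File inputs might end up looking something like the following. The folder path is a placeholder, the action name “Compose” assumes the default name Flow gave our Compose step, and using the Note’s subject for the file name is an assumption; substitute the Dynamic content that actually holds your patch’s name:

  Folder Path: /PatchPictures
  File Name: concat(triggerBody()?['subject'], '.jpg')
  File Content: outputs('Compose')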

 

Delete the Note

  • Finally, we want to delete that Note from Dynamics (and the Common Data Service) so that the image attachment is no longer taking up space in our Common Data Service.  Add a Dynamics Delete a Record Action after the SharePoint Create File Action.  Set the Organization Name and Entity Name, and use the Dynamic content for Note as the Item identifier.

 

Creating Our Advertisement

Let’s jump back to the new Conditional we added after the Update a record Action where we set the entity image. 

  • Set the conditional to check whether the Generate Advertisement field is set to true. 
  • If it is, add a SharePoint Create Item Action and let’s set some values.  What we’re doing here is creating a new SharePoint list item that will contain some starter HTML for a Patch advertisement (see the sketch after this list). 
  • Save our work! 
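
Purely as an illustration (this is placeholder markup, not the actual HTML from this solution), the starter HTML for the ad could be composed with an expression along these lines, again assuming the Note’s subject carries the patch name:

  concat('<h2>', triggerBody()?['subject'], '</h2><p>Contact us for details.</p>')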

 

 

Updating Our Patch Record With Our URLs From SharePoint

  • Under the SharePoint Create Item Action for creating the ad, AND after the SharePoint Create File Action for creating the picture in the Picture Gallery, we’re going to add Dynamics Update a Record Actions that are identical except for one difference. 
  • The Organization Name, Entity Name, and Record Identifier (set to the Dynamic content “Regarding”) should be the same. 
  • On the Ad side, the Update record should set the SharePoint Ad for Patch field to “Link to Item”. 

 

  • On the image side, the Update record should set the SharePoint Image for Patch field to the “Path.” 

 

Seeing It In Action 

Of course, I’ve been saving my work, so let’s go ahead and give this a whirl.

  • At the top right of your Flow you’ll see a Test button.  We’re going to click that and select “I’ll perform the trigger action.” 
  • To make this more interesting, I’m going to run this from SharePoint! I’ll update a patch and kick off my Flow from the embedded Power Apps Canvas App on my SharePoint home page. 

 

  • I select the patch, then I click the edit button (pencil icon at the top right). 
  • Notice the Attach file link and the Generate Advertisement switch.  We’ll use the first for our image and the second for generating our ad item in SharePoint. 

 

  • Finally, I click the checkmark at the top right to save my changes.  This kicks off our Flow, and in less than a minute we can navigate back over to the Flow and see that it completed successfully. 

 Verifying the flow

  • I’ll hop back over to SharePoint to make sure that my ad was created and my entity image was set.  I’ll also make sure the high-quality image made it to the SharePoint Picture Library and the Note was deleted from the Patch record in Dynamics.  I also want to make sure the URLs for the ad and image in SharePoint were set back to the Patch record. 

Verifying in SharePoint

One last thing: when we store the image in a SharePoint Picture Gallery App, we retain the dimensions, size, and quality of the original image, unlike when storing the image as a Dynamics 365 entity image.  Check out the image properties in the next screen shot and compare them to the properties shown on the SharePoint page in the same screen shot.


Comparing image file sizes

Conclusion 

I hope you are enjoying this series and continue to tune in as the solution for our dad’s beloved patch collection grows.  I constantly see updates and upgrades to the Power Platform so I know Microsoft is working hard on making it even better. 

Part One: Identify, Define, Build, Migrate

An assortment of fire department patches

My dad passed away in 2015, leaving behind an extensive collection of fire trucks, patches, and other fire department (FD) memorabilia.  Before he passed, he gave us instructions to sell them and some direction on what to do with the money. After a few years of not really wanting to deal with it, my family decided to make a project out of it.  My mom, sister, wife, two daughters, and I are working our way through thousands of patches, hundreds of fire trucks, and who knows how many pendants and other trinket-like items, all while working full-time jobs (school for the kiddos) and from different locations.

Dad was great about logging his patches into a Microsoft Access database, but not so good about taking pictures of them, and even worse at logging his fire trucks and other items.  The objective and high-level steps for this project were quickly identified.

The Objective

  1. Help my mom liquidate my dad’s enormous fire department memorabilia collection.

The High-Level Steps

  1. Identify the technologies to be used. Easy!
    1. Microsoft Dynamics 365 & Common Data Service – our foundation.
    2. Microsoft Power Apps – mobile app for inventory capture.
    3. Microsoft Flow – move data and attachments around, auto-create ads.
    4. Microsoft SharePoint – store ads, images. Keep large files out of CDS.
  2. Complete a first-cut of the data schema and migrate the patches data from the Microsoft Access database.
  3. Configure a software solution for the family to use so we can all capture data to a single database. Solution must be user friendly!
  4. Configure processes that streamline the creation of advertisements and other data processing.
  5. Start capturing data and creating ads!

The Players

Not everyone in an organization has the same skill level, and this will certainly lead to some challenges.  With that in mind, let’s look at the players involved in our project.

  1. Mom – Low technical skill – Capable of using anything “Excel-like” to capture data.
  2. Sister – Low-to-Medium – Arguably more advanced than mom, works on a Mac. Enough said.
  3. Wife – Medium – Works around Excel with ease, understands what I do from a high level.
  4. Kids – Low-to-Medium – two daughters, ages 12 and 10. Both are geniuses on any touch device but have no clue how to work around Excel.
  5. Me – High – developer and technology enthusiast!

I’ve spent the better part of my career as a .Net developer working in SharePoint and Dynamics, among other things, so it was easy for me to decide on a path forward.  Let’s get rolling!

Configure Data Schema and Migrate Microsoft Access Data

Just so no one thinks I’m lying here for the sake of this blog, let’s see what my dad was working with back in the day.  Yes, he was an ND alum.

Screenshot of patch entry form in Microsoft Access

Patch data in Microsoft Access

Side note: You see that column named “Patch Locator” highlighted in that last screen shot?  My dad kept his patches in old-school photo albums that he then stored in boxes.  This ‘locator’ field was his way of finding the patch once a box was full and stored away.  Genius dad!

As you can see, defining the schema for patches was pretty much done.  If we run into anything along the way, we can certainly add it.

  1. In Dynamics I created an unmanaged solution named “Fire Department Items Solution” and added two custom entities, “Patch” and “Fire Truck.”
  2. I added all the fields my dad had in his Access database, and then I made sure that the out of box field “EntityImage” was available for displaying an image of the patch.

PRO TIP:  Dynamics 365 only allows you to have one image field on an entity and it is not configured out of the box.  To use this field, create a new field on your entity and use the data type “Image”.  This will automatically set the name of your field to “EntityImage” and the image you set there will be used as your entity image at the top of the entity form.

Screenshot of PowerApps

PowerApps details

  3. Before we save and publish, we need to enable Notes functionality for our entities. To do this, select the entity from the left pane in the solution explorer, then make sure the “Notes (includes attachments)” checkbox is selected.

PRO TIP:  When you save an image to the EntityImage field it loses a lot of its quality.  Because we are using this data for inventory, including creating ads, we don’t want to lose the quality of our images.  For this reason, we will use the attachments collection for our entity to capture the actual high-quality image.  We will then use Microsoft Flow to take that image and store it as the EntityImage (which will lose quality) but also store the high-quality version in a SharePoint library.

PowerApps note functionality

  4. Finally, be sure to publish your customizations.

Migrating the Data

Now it’s time to migrate the data.  Since this was such a simple schema, I opted to use the out-of-box data import functionality that Dynamics 365 provides.  That said, there are a few different ways to accomplish this migration; for me, the easiest was to export the Microsoft Access database to Excel, then use that file to import into Dynamics 365.

  1. Export your data into an Excel file from Microsoft Access.
  2. In Excel you’ll want to Save a Copy and save it as a CSV file.

Save a copy as a CSV file

  3. Open the Patch View in Dynamics and use the out-of-box Import from Excel functionality to load our data.
  4. Choose the CSV file we just created when we saved the copy in Excel.

Choose your CSV file

  5. On this next screen, let’s click the button to Review our Field Mappings.

Review Field Mappings

  6. Here you’ll see some of my fields are mapped and some aren’t. Let’s get those shored up before we proceed.

Resolve mapped items

  7. Now that I’ve resolved all the field mappings, you’ll see we have green check marks across the board and we’re ready to import. Click the Finish Import button and you’re off.

Finish Import button

  8. You can check the progress of the import by navigating to Settings > Data Management > Imports.

View Import progress

Summary & Next Steps

Let’s look at what we’ve done here.  On the surface, it would appear we’ve simply gone into Dynamics 365 and configured a couple of entities.  But as we know, Dynamics 365 v9 was built on the Common Data Service (CDS), and that means our Dynamics data is now available to any other application that can connect to the CDS.  Why is this important for this project, you might ask?  That answer will become clear in the next part of this blog.  For now, here are some screen shots of how things look now that we have our patch data migrated.

A look at the imported data

Keep in mind, come the end of January 2019, everyone will need to switch over to Microsoft’s Unified Interface, and that’s what we’re using here for our patches.  This is an example of a model-driven Power App, which we’ll discuss in our next entry to this blog.

If you log in to your Power Apps environment using the same credentials as your Dynamics 365 environment, you should see your entities and the data migrated in this environment too.  Remember, once it’s in Dynamics, it’s available through the CDS.

A view of the migrated data

One thing to note: if you have 10,000-plus records like I do for patches, the CDS in the browser may freeze trying to display them all.  I would hope Microsoft resolves this at some point so that it handles paging and displaying data as gracefully as the D365 web client does.

Stay tuned for my next entry where we’ll set up our SharePoint Online site, create a simple canvas Power App for inventory management on our mobile devices, and then set up a Flow to help move some things around and automate the creation of our online advertisements.

Thanks for reading!

Calling all SharePoint users and Office 365 developers! AIS is hosting this month’s Meetup for the Triangle SharePoint User Group in Morrisville, North Carolina. The Meetup is this Thursday at AIS’ North Carolina office. There are still a few spots left, so be sure to RSVP today.

About the Session:

In this session we’ll walk through building a client-side web part with the SharePoint framework. By using generic components, we can build web parts that can be reused across an entire organization or multiple clients. Time permitting, we will walk through several examples and possibly some framework extensions.

Event Agenda:

5:45 p.m.  Doors Open
5:45 to 6:15 p.m.  Networking & Dinner
6:15 p.m.  Announcements & Introductions
6:20 to 7:40 p.m. Presentation

The TriSPUG Meetups are a fantastic way for developers, IT, and business users to learn, share, and grow their knowledge in Microsoft SharePoint and Office 365. Attendance is always free and informal. All interest levels and experience levels are welcome!

RSVP Here!

With the wide variety of updated features available through Office 365, organizations can now create robust, beautiful intranets right out-of-the-box. In contrast to SharePoint classic sites, SharePoint modern sites have a clean interface, are responsive and adaptive to mobile devices, and offer significant performance improvements.

Read part one of this three-part blog series here. 

Read part two here.

Now that you set up your SharePoint libraries to use custom content types, you can add content. Go to the Documents library and upload a few documents to the library. For each document, edit the properties and choose any appropriate values for your custom site columns.

In the example below, All is selected for the AIS Office Location field, Human Resources is selected for the AIS Support Team (department) field, and the value for Show on AIS Connect Home is set to Yes.

Adding content to SharePoint

Read More…


Read part one of this three-part blog series here. 

In today’s post, we’ll move on to setting up each site in the hub. In this sample infrastructure, each department will have a communication site to share with the entire organization, and an internal team site. Create a new SharePoint site using a modern communication site design.

SharePoint Communication Site screenshot

Read More…