Do your users want reports in SharePoint? Yes! They crave reports and charts. Regardless of which version of SharePoint your users are on, building their data visualizations in Power BI now positions those reports for a seamless migration to future SharePoint versions. Here are the steps to add a simple report to a SharePoint modern page.

Important note: To embed Power BI reports in SharePoint Online, a Power BI Pro license is required.

The Process Flow

This is the flow:


Figure 1: The Process Flow

  1. Create your data in SharePoint, such as a list or library.
  2. Use Power BI Desktop to connect to SharePoint, read and transform the data, and create visualizations.
  3. Publish to Power BI Online, which provides an embed link to paste into the Power BI webpart available in SharePoint modern pages.

We’ve gone full circle! Let’s look at the steps.

SharePoint Online Data

For this example, we will read data from a simple custom list. I added the list to my SPO Dev Tenant site named Vacation Planner. Since all our “vacays” are now “staycays,” I decided to make a board game list. Along with the default Title column that you get with any new list, I added three more columns, each a number column. Then I added every game I could think of to the list. For each one, I entered somewhat random values for Difficulty, Minimum Age, and Popularity, although I am pretty sure Candy Land is for 4-year-olds.


Figure 2: Board games list

To create the list, I was logged into SPO as a fictitious test user I named Gwen Lim.

Build the Report

Install the Power BI Desktop application to build the report. It’s free: download it here.

On first use, you will be prompted to sign in. If the login type option appears, choose “Organizational” and log in with a Windows account. I logged in with fictional Gwen Lim’s account. From either the startup splash screen or the menu, choose “Get Data.”


Figure 3: Select a data source

From the Common data sources dropdown, select “More…” at the bottom. Then click the “Online Services” option, and you should see “SharePoint Online List” on the right. Select that and then click “Connect” at the bottom.


Figure 4: Choose SharePoint Online List

In the SharePoint Online Lists dialog, paste the URL of the SharePoint site that contains your list. You can check the 2.0 (Beta) radio button (see Figure 5) to have the app open the default view of your list, or leave it at 1.0 if you prefer.


Figure 5: Enter the SharePoint site URL

A Navigator window appears, listing all of the lists available in the SharePoint site in the left column with checkboxes. Check BoardGames to see a preview of the data on the right side of the pane, then click the Load button.


Figure 6: Select the List

Now you can start building the report. The data fields display on the right side. Because we chose a specific, limited column view as the default for the list and selected the 2.0 radio button, only a few fields (aka columns) appear on the right, which makes them easy to work with.


Figure 7: The BoardGames list fields appear

Ignore the fields for a moment while you choose a visualization. Select the doughnut chart. Now it’s time to apply fields to the doughnut. Drag Title into the Legend box under Visualizations, and a legend appears beside the doughnut chart. Drag Popularity into the Values box, and your doughnut comes to life with color.


Figure 8: Pick a visualization chart and add fields

When you hover over the chart, tooltips appear with data for each game. The Age level, Difficulty, and Popularity values have been imported as decimals, which would be more readable as whole numbers. To change this, and to edit the column heading text, click the ribbon’s Transform Data button.


Figure 9: Modify the data

To change a column from a decimal to a whole number, click the column title to select it, then click the ribbon’s Data Type button and select Whole Number, as shown in Figure 10. Double-click the column heading to rename the column.


Figure 10: Changing field titles and data types

Click the Close & Apply button on the left of the ribbon to apply the changes to the visualization. Now when you hover your cursor over a section, the renamed Minimum Age field appears and both values display as whole numbers.


Figure 11: Improved tooltips

Display in SharePoint

To display the report in SharePoint, click the Publish button in the ribbon on the right side. You will be prompted to save your report in .pbix format.


Figure 12: Ready to publish!

Save it anywhere you want to keep it, and then the Publish to Power BI dialog appears. Additional workspaces can be configured, but initially “My Workspace” is your only option, which is fine. Select it and then click “Select.”


Figure 13: Successful publish to Power BI Online

When you see the Success! dialog, click the link to open the .pbix in Power BI Online and view your report. In the Share menu above the report, drop down the menu options and hover over Embed report. The option you want here is SharePoint Online.


Figure 14: Get a link to use in a SharePoint page

This option will be missing until you upgrade to a Power BI Pro license. Pro is not free, but a 60-day trial is available. Once the option appears in the menu and you select it, you are rewarded with the embed link to use in SharePoint.


Figure 15: Click to highlight and then copy

Click that link to highlight it and copy. Now head over to your SharePoint site and create a page.


Figure 16: Locate the built-in Power BI webpart

Click the webpart plus sign, and in the search box, type “power” to filter the results. The Power BI webpart will appear. Click on it, and the webpart will be inserted into your page. You will see a green button for Add report; click it to open the properties panel on the right. Paste in the embed link you got from Power BI online.


Figure 17: Apply the embed link

Click away from that textbox and your report will appear on the left.


Figure 18: Report successfully displayed in SharePoint Online

Conclusion

This is a no-code solution and a simple demo. However, the depth of tooling provided by Power BI to enable developers and business data experts to transform and visualize organizational data is immense. The speed and ease with which we can integrate data reporting into SharePoint modern pages will be welcome to customers as they migrate to current SharePoint versions.

Links

Embed a report web part in SharePoint Online – Power BI | Microsoft Docs

I support projects where we have platforms like SharePoint, are looking toward adopting PowerApps, and have Azure Government subscriptions. While learning about containers, I wondered where they could fit into an environment where many applications are developed on top of SharePoint. This helped me better understand containers and discover the perfect use case. I wanted to share my architecture idea in this blog post. If you would like to learn more about container architecture, read this blog post on Container Architecture Basics from our VP of Solution Engineering, Brent Wodicka.

Architecture Idea

If you are maintaining a SharePoint on-premises farm in a local data center or in the cloud, you cannot replace your SharePoint farm VMs with containers. What you can do is take the heavy business logic code out of your custom SharePoint applications and deploy it as services using containers. Deploy your backend services using containers, call those service endpoints (REST API endpoints) from your SharePoint Framework web parts, and render the UI inside the SharePoint Framework webparts.

If your application is developed using PowerApps, I believe you can call custom service endpoints from PowerApps as well. So your front end can still be delivered through SharePoint or PowerApps, while your business logic is deployed as services inside the containers.
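
The key point in this architecture is that the containerized services are ordinary REST endpoints reachable over HTTPS, so any front end can consume them. As a minimal, hypothetical sketch (the service URL and response fields below are made up), here is what such a call looks like from PowerShell; an SPFx web part would make the equivalent call with the framework’s HttpClient and render the results in its UI.

```powershell
# Minimal sketch: call a hypothetical containerized business-logic service.
# The URL, route, and response shape are illustrative only.
$serviceUrl = "https://api.contoso.example/orders/summary"

# Call the REST endpoint; an Azure AD bearer token could be passed via -Headers.
$response = Invoke-RestMethod -Method Get -Uri $serviceUrl

# Invoke-RestMethod converts the JSON response into objects we can work with.
$response | ForEach-Object { Write-Output "$($_.customer): $($_.total)" }
```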

Where can containers fit in a multi-platform environment?

Answer: Below are two diagrams that illustrate the container use case in the multi-platform environment discussed above.
Business Logic Services Snapshot

Backend Databases and Cloud Services

Advantages of this Architecture

  • Since you have server-side code deployed in the Azure cloud, you can easily integrate with other Azure PaaS services, including Azure SQL Database and Azure Cognitive Search.
  • Since most of the business logic is deployed inside containers as services, it is easy to move these services to any other cloud provider.
  • Suppose you have complex legacy applications built inside SharePoint and you are storing the data in SharePoint lists. In that case, you can move that list data to an Azure SQL database and call it from the services deployed inside the containers (see the second diagram above).

Conclusion

Suppose you have heavy business logic written in front-end JavaScript files in SharePoint. You can rewrite that code as server-side code in C#, deploy it as services inside containers, and call those service endpoints from SharePoint webparts. Complex application data can even move from SharePoint lists to Azure SQL databases. Containers solve the problem of deploying your custom code as services, but they cannot replace your SharePoint infrastructure.

In this post, I will show you how DevOps practices can add value to a variety of Office 365 development scenarios. The practices we will discuss are Infrastructure as Code, Continuous Integration, and Continuous Delivery. Advances in DevOps and the SharePoint Framework (SPFx) have changed the way we develop software and improved our efficiency.

Infrastructure as Code (IaC)

Practicing IaC means that the infrastructure your applications depend on is created and maintained by code that is source controlled, tested, and deployed to production much like software. When discussing IaC, we’re typically talking about provisioning resources to a cloud provider. In our case, the “infrastructure” is Office 365 – a SaaS product with extensive customization and configuration options.  

While you could manage your O365 tenant with PowerShell, the code-centric and template-based PnP Provisioning Framework aligns better with this practice because: 

  1. Using the framework’s declarative XML syntax, you describe what you want to exist rather than writing code to manage how it gets created.
  2. It is easier for developers to run idempotent deployments to enact the desired state of your Office 365 tenant.  

While originally developed to support SharePoint Online and on-premises deployments, you can see in its latest schema that it has expanded to support Microsoft Teams, OneDrive, and Active Directory.
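
To make the idempotency point concrete, here is a minimal PnP PowerShell sketch (using cmdlets from the legacy SharePointPnPPowerShellOnline module; the site URL and template path are placeholders). Running it repeatedly simply re-asserts the desired state described in the XML rather than re-creating artifacts.

```powershell
# Minimal sketch: enact the desired state described in a PnP provisioning template.
# Assumes the legacy SharePointPnPPowerShellOnline module is installed.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/intranet" -UseWebLogin

# Apply the declarative template; re-running this is idempotent because PnP only
# creates or updates artifacts that differ from what the template describes.
Apply-PnPProvisioningTemplate -Path .\template.xml
```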

Continuous Integration (CI) 

The practice of continuously integrating first means that your team has established the habit of frequently merging small batches of changes into a central code repository. Upon that merge, we automatically build and test the code to quickly identify bugs and quality issues.  

SharePoint Framework is a commonly used tool for extending the capabilities of SharePoint Online and on-premises. Much like the Provisioning Framework, SharePoint Framework is expanding to support other Office 365 services. You can currently use it to develop for Microsoft Teams, and you will soon be able to use it to develop Office Add-ins.

Azure DevOps is a one-stop-shop service that provides everything you need throughout the software development lifecycle. For example, your team can version control your project’s source code in Repos. After merging changes, use Pipelines to trigger a CI process that runs the build and test tasks of your SharePoint Framework solution.
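
The build and test tasks themselves are just the standard SPFx toolchain commands, so a CI stage typically scripts something like the following (a sketch; the exact npm scripts and gulp tasks depend on how the project is configured):

```powershell
# Sketch of the commands a CI pipeline stage might run for an SPFx solution.
npm ci                                # restore dependencies from package-lock.json
npm test                              # run the project's unit tests (e.g. Jest)
npx gulp build                        # compile the SPFx project
npx gulp bundle --ship                # create release bundles
npx gulp package-solution --ship      # produce the .sppkg package for deployment
```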

Continuous Delivery (CD)

Continuous Delivery, the practice of running automated deployments through a sequence of environments, starts after a completed CI process. Azure DevOps Pipelines will again be the tool of choice to execute the deployment procedures against each environment.

Example Solution

A solution demonstrating how to use the technologies and practices described above is available on the Applied Information Sciences GitHub account. The result is a pipeline capable of receiving frequent changes to O365 configuration and SPFx applications from one or many developers, verifying the quality of each change, and deploying it to a series of environments.

Dev Tenant Diagram

I encourage you to explore the source code using the following summary as a guide. You’ll find the solution organized into three areas – SPFx, Provisioning, and Pipeline.

SPFx

A simple hello world web part was created using the Yeoman generator. Jest was added to test the code. Npm and gulp scripts are used to build and package the source code, producing an .sppkg file.

Provisioning

The PnP Provisioning Template XML file found here defines the desired state of the target tenant. The following is the desired state:

  1. Install the SPFx App into the tenant App Catalog.
  2. Create a site collection that will host our web parts page.
  3. Install the SPFx App to the Site Collection.
  4. Create a page that will host our web part.
  5. Add the web part to the page.

By using parameters for the tenant URL and site owner, the same template can be deployed to multiple environments. A PowerShell build script bundles the template and all required files, such as the SPFx .sppkg file, into a single .pnp file ready for deployment.
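
The script in the repo is the authoritative version; purely as an illustration of how such bundling can be done with the legacy PnP PowerShell cmdlets (the folder and file names below are hypothetical), it might look roughly like this:

```powershell
# Hypothetical sketch of a bundling step (not the repo's actual script).
# Package the folder containing template.xml and its assets into a single .pnp file.
Convert-PnPFolderToProvisioningTemplate -Out .\dist\provisioning.pnp -Folder .\provisioning

# Add the packaged SPFx solution so it travels inside the .pnp and can be
# referenced by the template when it installs the app.
Add-PnPFileToProvisioningTemplate -Path .\dist\provisioning.pnp `
    -Source .\sharepoint\solution\webparts.sppkg `
    -Folder "solution"
```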

Pipeline

A multi-stage YAML pipeline defined in the Pipeline folder of the example solution runs the following process:

  1. Build, test, and package the SPFx and Provisioning Template source code.
  2. Deploy the prerequisite SharePoint infrastructure to the tenant if it does not already exist.
  3. Install and configure the SPFx web part.
  4. Repeat #2 and #3 for all environments.

Build Process Diagram

Secret variables, such as the username and password used to connect to the tenant, are only referenced in the pipeline. The values are set and encrypted in the Azure DevOps pipeline editor.

Variables Diagram with Passwords
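
One common pattern (a sketch using assumed names, not necessarily how the example solution wires it up) is for the pipeline to map its secret variables to environment variables and for the deployment script to build a credential from them before connecting and applying the template with per-environment parameters:

```powershell
# Hypothetical deploy.ps1: assumes the pipeline exposes its secrets as the
# environment variables O365_USERNAME and O365_PASSWORD, and passes the
# non-secret values in as ordinary parameters.
param(
    [Parameter(Mandatory)] [string] $SiteUrl,      # target site collection URL
    [Parameter(Mandatory)] [string] $SiteOwner,    # site owner for this environment
    [Parameter(Mandatory)] [string] $PackagePath   # path to the bundled .pnp file
)

# Build a credential from the environment variables supplied by the pipeline.
$securePassword = ConvertTo-SecureString $env:O365_PASSWORD -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($env:O365_USERNAME, $securePassword)

# Connect and enact the desired state, passing environment-specific parameters.
Connect-PnPOnline -Url $SiteUrl -Credentials $credential
Apply-PnPProvisioningTemplate -Path $PackagePath -Parameters @{ SiteOwner = $SiteOwner }
```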

Conclusion

In the not-too-distant past, it was high effort to write unit tests for a SharePoint solution, and most deployments were manual. In this post, I have shown you how advancements in the platform and tooling have changed this. The mentality, practices, and tools brought by DevOps can improve the pace and quality of any software development and infrastructure management project, including projects building upon Office 365.


If you didn’t catch the first two parts of this series, you can do that here and here.  In this part, we’ll get a little more technical and use Microsoft Flow to do some pretty cool things. 

Remember when we talked about the size and quality of the images we take with our Power App and store as the entity image? When saved as the Entity Image for a CDS/D365 item, the image loses quality and is no longer good for an advertisement photo. This is done automatically and, as far as I can tell, the high-res image is gone once this conversion takes place (someone please correct me if I’m wrong on that!). On the flip side, it doesn’t make a whole lot of sense to put all this tech together only to require my end users to take two pictures of an item, one hi-res and one low-res. And we don’t want to store hi-res images in a relational database for 10,000-plus items because the database could bloat immensely.

Microsoft Flow and SharePoint to the rescue!  

PRO TIP:  Dynamics 365 will crop and resize the image before saving it as the entity image.  All entity images are displayed in a 144 x 144 pixel square.  You can read more about this here.  Make sure to save/retain your original image files.  We’re going to stick ours in a SharePoint Picture Gallery App.

Objective 

Create a Microsoft Flow that handles… 

  • Pulling the original image off the Dynamics record and storing it in SharePoint.
  • Setting the patch image as the Entity Image for the Dynamics record.
  • Creating an advertisement list item for the patch.
  • Saving the URLs for the ad and image back to the patch record.

Create the Flow 

We’re going to write this Flow so that it’s triggered by a Note record being created. 

 Flow screenshot with Create from blank highlighted

  • On the next page, click “Search hundreds of connectors and triggers” at the bottom of the page. 
  • Select Dynamics 365 on the All tab for connectors and triggers. 
  • Select the “When a record is created” trigger. 

 Dynamics 365 is highlighted

  • Set the properties for Organization Name and Entity Name.  Entity Name should be “Notes”. 
  • Save the Flow and give it a name. 

Verifying a Few Things 

  • Add a new step and select the Condition item. 
  • The Condition should check to see if the Note has an attachment. We do this using the “Is Document” field.  

 Condition Control is highlighted 

  • In the “Yes” side of the conditional we want to check if the Object Type is a Patch (ogs_patch in this case).  

At this point, if the Flow has made it through both conditionals with a “Yes”, we know we are dealing with a new Note record that has an Attachment and belongs to a Patch record.   

Update the Patch Record 

Now we want to update the patch record’s Entity Image field with the attachment. First we need to get a handle on the Patch record. We’ll do that by adding an Action to the Yes branch of our new Conditional.

  • Add a Dynamics 365 Update a Record Action.
  • Set the Organization Name, Entity Name, and Record identifier accordingly.  For our Patch Record identifier, we’ll use the Regarding field in the Dynamic content window. 

 

  • Click on Show advanced options and find the Picture of Patch field. 
  • For the Picture of Patch field we need to get the document body of the attachment and convert it from Base-64 encoding to binary.  We do this using the “Expression” area again.  Use the “base64ToBinary” function to convert the document body like so. 

 

  • Save your work!  I can’t tell you how many times I had to retype that function. 

Create Our SharePoint Items & Clean-up 

Now that we’ve updated our entity image with the uploaded patch picture we want to do a couple of things, but not necessarily in sequence.  This is where we’ll use a parallel branch in our Flow.   

Dealing with a Parallel Branch 

  • Under the last Update a Record action, add a Conditional.  After adding this Conditional hover over the line between the Update action and the new conditional.  You should see a plus sign that you can hover over and select “Add a parallel branch.” 



  • Select this and add a Compose action.  You may need to search for the Compose action. 

 

PRO TIP: With Modern Sites in SharePoint, we now have three solid options for displaying images in SharePoint: the Modern Document Library, which allows viewing images as tiles and thumbnails within a document library; the Picture Library, which has often been the place to store images prior to the Modern Document Library; and simply displaying an image, or images, directly on a page.

Saving the Attachment as an Image in SharePoint

  • Let’s deal with the Compose branch first. Our Compose will use the same function for the Input field as our Picture of Patch did above: base64ToBinary(triggerBody()?['documentbody'])
  • After the Compose, we’ll add a Create File Action for SharePoint and use the name from our Patch record as the name for our image in SharePoint.  I’m using a Picture Gallery App in SharePoint and for now, only using the .JPG file type.  The File Content should use the Output from our Compose Action. 

 

Delete the Note

  • Finally, we want to delete that Note from Dynamics (and the Common Data Service) so that the image attachment is no longer taking up space in our Common Data Service.  Add a Dynamics Delete a Record Action after the SharePoint Create file action.  Set the Organization Name, Entity Name, and use the Dynamics content for Note as the Item identifier.

 

Creating Our Advertisement

Let’s jump back to the new Conditional we added after the Update a record Action where we set the entity image. 

  • Set the conditional to check for the Generate Advertisement field being set to true. 
  • If this is true, add a SharePoint Create Item Action and let’s set some values.  What we’re doing here is creating a new SharePoint List Item that will contain some starter HTML for a Patch advertisement. 
  • Save our work! 

 

 

Updating Our Patch Record With Our URLs From SharePoint

  • Under the SharePoint Create Item Action for creating the Ad, AND after the SharePoint Create file action for creating the picture in the Picture Gallery, we’re going to add Dynamics Update record Actions that will be identical with one difference. 
  • The Organization Name, Entity Name, Record Identifier (set to Dynamic Content “Regarding”) should be the same. 
  • On the Ad side, the Update record should set the SharePoint Ad for Patch field to “Link to Item”. 

 

  • On the image side, the Update record should set the SharePoint Image for Patch field to the “Path”.

 

Seeing It In Action 

Of course, I’ve been saving my work so let’s go ahead and give this a whirl. 

  • At the top right of your Flow you’ll see a Test button.  We’re going to click that and select “I’ll perform the trigger action.” 
  • To make this more interesting, I’m going to run this from SharePoint! I’ll update a patch and kick off my Flow from the embedded Power Apps Canvas App on my SharePoint home page.

 

  • I select the patch, then I click the edit button (pencil icon at the top right). 
  • Notice the Attach file link and the Generate Advertisement switch.  We’ll use the first for our image and the second for generating our ad item in SharePoint. 

 

  • Finally, I click the checkmark at the top right to save my changes.  This kicks off our Flow in less than a minute, and when we navigate back over to the Flow we can see that it completed successfully. 

 Verifying the flow

  • I’ll hop back over to SharePoint to make sure that my ad was created and my entity image was set.  I’ll also make sure the high-quality image made it to the SharePoint Picture Library and the Note was deleted from the Patch record in Dynamics.  I also want to make sure the URLs for the ad and image in SharePoint were set back to the Patch record. 

Verifying in SharePoint

One last thing: When we store the image in a SharePoint Picture Gallery App we can retain the dimensions, size, and quality of the original image, unlike when storing the image as a Dynamics 365 entity image.  Check out the properties in the next screen shot and compare that to the properties on the SharePoint page in the same screen shot.   


Comparing image file sizes

Conclusion 

I hope you are enjoying this series and continue to tune in as the solution for our dad’s beloved patch collection grows.  I constantly see updates and upgrades to the Power Platform so I know Microsoft is working hard on making it even better. 

Part One: Identify, Define, Build, Migrate

An assortment of fire department patches

My dad passed away in 2015, leaving behind an extensive collection of fire trucks, patches, and other fire department (FD) memorabilia. Before he passed, he gave us instructions to sell them and some direction on what to do with the money. After a few years of not really wanting to deal with it, my family decided to make a project out of it. My mom, sister, wife, two daughters, and I are working our way through thousands of patches, hundreds of fire trucks, and who knows how many pendants and other trinket-like items, all while working full-time jobs (school for the kiddos) and from different locations.

Dad was great about logging his patches into a Microsoft Access database, but not so good about taking pictures of them, and even worse at logging his fire trucks and other items.  The objective and high-level steps for this project were quickly identified.

The Objective

  1. Help my mom liquidate my dad’s enormous fire department memorabilia collection.

The High-Level Steps

  1. Identify the technologies to be used. Easy!
    1. Microsoft Dynamics 365 & Common Data Service – our foundation.
    2. Microsoft Power Apps – mobile app for inventory capture.
    3. Microsoft Flow – move data and attachments around, auto-create ads.
    4. Microsoft SharePoint – store ads, images. Keep large files out of CDS.
  2. Complete a first-cut of the data schema and migrate the patches data from the Microsoft Access database.
  3. Configure a software solution for the family to use so we can all capture data to a single database. Solution must be user friendly!
  4. Configure processes that streamline the creation of advertisements and other data processing.
  5. Start capturing data and creating ads!

The Players

Not everyone in an organization has the same skill level and this will certainly lead to some challenges.  With that in mind, let’s look at the players involved in our project.

  1. Mom – Low technical skill – Capable of using anything “Excel-like” to capture data.
  2. Sister – Low-to-Medium – Arguably more advanced than mom, works on a Mac. Enough said.
  3. Wife – Medium – Works around Excel with ease, understands what I do from a high level.
  4. Kids – Low-to-Medium – two daughters, ages 12 and 10. Both are geniuses on any touch device but have no clue how to work around Excel.
  5. Me – High – developer and technology enthusiast!

I’ve spent the better part of my career as a .Net developer working in SharePoint and Dynamics, among other things, so it was easy for me to decide on a path forward.  Let’s get rolling!

Configure Data Schema and Migrate Microsoft Access Data

Just so no one thinks I’m lying here for the sake of this blog, let’s see what my dad was working with back in the day. Yes, he was an ND alum.

Screenshot of patch entry form in Microsoft Access

Patch data in Microsoft Access

Side note: You see that column named “Patch Locator” highlighted in that last screen shot?  My dad kept his patches in old-school photo albums that he then stored in boxes.  This ‘locator’ field was his way of finding the patch once a box was full and stored away.  Genius dad!

As you can see, defining the schema for patches was pretty much done. If we run into anything along the way, we can certainly add it.

  1. In Dynamics I created an un-managed solution named “Fire Department Items Solution” and added two custom entities, “Patch” and “Fire Truck.”
  2. I added all the fields my dad had in his Access database, and then I made sure that the out of box field “EntityImage” was available for displaying an image of the patch.

PRO TIP:  Dynamics 365 only allows you to have one image field on an entity and it is not configured out of the box.  To use this field, create a new field on your entity and use the data type “Image”.  This will automatically set the name of your field to “EntityImage” and the image you set there will be used as your entity image at the top of the entity form.

Screenshot of PowerApps

PowerApps details

  1. Before we save and publish, we need to enable Notes functionality for our entities. To do this select the entity from the left pane in the solution explorer, then make sure the “Notes (includes attachments)” checkbox is selected.

PRO TIP: When you save an image to the EntityImage field it loses a lot of its quality. Because we are using this data for inventory, including creating ads, we don’t want to lose the quality of our images. For this reason, we will use the attachments collection for our entity to capture the actual high-quality image. We will then use Microsoft Flow to take that image and store it as the EntityImage (which will lose quality) but also store the high-quality version in a SharePoint library.

PowerApps note functionality

  1. Finally, be sure to publish your customizations.

Migrating the Data

Now it’s time to migrate the data.  Since this was such a simple schema, I opted to use the out-of-box data import functionality that Dynamics 365 provides.  With that said, however, there are a few different ways to accomplish this migration. For me it was easy to simply export the Microsoft Access database to Excel, then use that file to import into Dynamics 365.

    1. Export your data into an Excel file from Microsoft Access.
    2. In Excel, use Save a Copy and save it as a CSV file.

Save a copy as a CSV file

    3. Open the Patch View in Dynamics and use the out-of-box Import from Excel functionality to load our data.
    4. Choose the CSV file we just created when we saved the copy in Excel.

Choose your CSV file

    5. On this next screen, let’s click the button to Review our Field Mappings.

Review Field Mappings

    6. Here you’ll see some of my fields are mapped and some aren’t. Let’s get those shored up before we proceed.

Resolve mapped items

    7. Now that I’ve resolved all the field mappings, you’ll see we have green check marks across the board and we’re ready to import. Click the Finish Import button and you’re off.

Finish Import button

    8. You can check the progress of the import by navigating to Settings → Data Management.

View Import progress

Summary & Next Steps

Let’s look at what we’ve done here.  On the surface it would appear we’ve simply gone into Dynamics 365 and configured a couple of entities.  But as we know, Dynamics 365 v9 was built on the Common Data Service (CDS) and that means our Dynamics data is now available to any other application that can connect to the CDS.  Why is this important for this project you might ask?  That answer will become clear in the next part of this blog.  For now, here are some screen shots on how things look now that we have our patch data migrated.

A look at the imported data

Keep in mind, come end of January 2019 everyone will need to switch over to Microsoft’s Unified Interface and that’s what we’re using here for our patches.  This is an example of a model-driven Power App which we’ll discuss in our next entry to this blog.

If you log in to your Power Apps environment using the same credentials as your Dynamics 365 environment, you should see your entities and the data migrated in this environment too.  Remember, once it’s in Dynamics, it’s available through the CDS.

A view of the migrated data

One thing to note, if you have 10,000-plus records like I do for patches, CDS in the browser may freeze trying to display them all.  I would hope MS resolves this at some point so that it handles paging and displaying of data as gracefully as the D365 web client does.

Stay tuned for my next entry where we’ll set up our SharePoint Online site, create a simple canvas Power App for inventory management on our mobile devices, and then set up a Flow to help move some things around and automate the creation of our online advertisements.

Thanks for reading!

Calling all SharePoint users and Office 365 developers! AIS is hosting this month’s Meetup for the Triangle SharePoint User Group in Morrisville, North Carolina. The Meetup is this Thursday at AIS’ North Carolina office. There are still a few spots left, so be sure to RSVP today.

About the Session:

In this session we’ll walk through building a client-side web part with the SharePoint framework. By using generic components, we can build web parts that can be reused across an entire organization or multiple clients. Time permitting, we will walk through several examples and possibly some framework extensions.

Event Agenda:

5:45 p.m.  Doors Open
5:45 to 6:15 p.m.  Networking & Dinner
6:15 p.m.  Announcements & Introductions
6:20 to 7:40 p.m. Presentation

The TriSPUG Meetups are a fantastic way for developers, IT, and business users to learn, share, and grow their knowledge in Microsoft SharePoint and Office 365. Attendance is always free and informal. All interest levels and experience levels are welcome!

RSVP Here!

With the wide variety of updated features available through Office 365, organizations can now create robust, beautiful intranets right out-of-the-box. In contrast to SharePoint classic sites, SharePoint modern sites have a clean interface, are responsive and adaptive to mobile devices, and offer significant performance improvements.

Read part one of this three-part blog series here. 

Read part two here.

Now that you set up your SharePoint libraries to use custom content types, you can add content. Go to the Documents library and upload a few documents to the library. For each document, edit the properties and choose any appropriate values for your custom site columns.

In the example below, All is selected for the AIS Office Location field, Human Resources is selected for the AIS Support Team (department) field, and the value for Show on AIS Connect Home is set to Yes.

Adding content to SharePoint

Read More…

With the wide variety of updated features available through Office 365, organizations can now create robust, beautiful intranets right out-of-the-box. In contrast to SharePoint classic sites, SharePoint modern sites have a clean interface, are responsive and adaptive to mobile devices, and offer significant performance improvements.

Read part one of this three-part blog series here. 

In today’s post, we’ll move on to setting up each site in the hub. In this sample infrastructure, each department will have a communication site to share with the entire organization, and an internal team site. Create a new SharePoint site using a modern communication site design.

SharePoint Communication Site screenshot

Read More…

SharePoint logo

With the wide variety of updated features available through Office 365, organizations can now create robust, beautiful intranets right out-of-the-box. In contrast to SharePoint classic sites, SharePoint modern sites have a clean interface, are responsive and adaptive to mobile devices, and offer significant performance improvements.

In the past, many intranets were built as a single large site collection with multiple levels of sub-sites underneath. The modern infrastructure can be flatter, with each department as its own site collection, but connected together through a SharePoint hub site.

Key features of a modern hub site — which make it an ideal starting point for an intranet — include:

  • Cross-site navigation:  consistent top navigation among associated sites
  • Content roll-up:  aggregated news and content among associated sites
  • Consistent look-and-feel: a common theme / branding for associated sites
  • Scoped search:  search content within associated sites

Now let’s walk through the process of creating a new, modern intranet in SharePoint. (Note that for the sake of length and readability, we’ll be publishing this process in three parts here on the blog. The entire guide will be available as a handy download, however, once the series has concluded!)

To start, create a new SharePoint site using a modern communication site design.

Creating a new SharePoint site

Read More…

Calling all SharePoint users and Office 365 developers! Once again, we invite you to attend this month’s Meetup for the Triangle SharePoint User Group in Morrisville, North Carolina. The Meetup is TOMORROW and space is limited, so RSVP today to claim your spot.

About the Session: 

Many traditional SharePoint developers have been caught off guard with the fast pace of changes to the SharePoint ecosystem in recent years. Whether it’s the rapid adoption of Office 365 or the growing investment in cloud-based infrastructure and services, it can all feel very foreign to anyone still using some of the same development approaches and tools first pioneered in SharePoint 2007.

This month’s session will break down traditional SharePoint solutions (such as features, webparts, workflows, event receivers, and timer jobs) and discuss how they translate to modern equivalents in Office 365 and the cloud. We’ll touch on popular topics like the role of SPFX, Power Apps, and Flow, and also other key Azure Services such as Logic Apps, Azure Functions, and Hybrid Data Connections.

You’ll gain an understanding of the growing role of new APIs such as Microsoft Graph, various nuances with authentication, and the importance of hybrid environments and accessing on-premises data. Along the way you’ll discover some of the tools, techniques, and approaches that will be invaluable as you decide which part of your toolbelt is the most important to upgrade!

About the Speaker: 

Josh Carlisle is a full stack software developer based out of Raleigh, North Carolina working as a Senior Solution Architect at B&R Business Solutions. He has 20 years of development experience, from the early days of VB5, COM, ASP, and the birth of .NET to his first adventures with SharePoint development in 2004. His current focus is architecting, designing, and developing solutions for Azure, Office 365, and SharePoint using the latest front-end JavaScript frameworks such as Angular and React, alongside server-side solutions based on ASP.NET Core and Node.js. Josh also enjoys sharing his knowledge and experience at regional user groups and community events.

Come join your peers and fellow developers for a great session of networking and learning. As always, this event is free but space is limited. RSVP here!