The data team of the internal AIS Microsoft Power Platform Hackathon used three different data movement techniques to solve the hackathon use case: Dataflows, Power Query, and Power Automate. Read on to learn how we did it. There are several ways to import and export data through Microsoft Dataverse, the data backbone of Microsoft Power Platform.

What is Dataverse?

Dataverse is designed to work with any data and incorporates all the significant data technologies that any organization needs – relational, non-relational, file, image, search, and data lake. Dataverse helps to store and manage data securely. Dataverse includes a set of visual designers to create, edit, and interact with data. In Dataverse, tables are used to model and manage business data. To increase productivity, Dataverse includes a set of tables known as standard tables.

We used Dataverse as a place to store all our data related to catalogs and imported and exported data as per the scenarios.

Our Approach

Our hackathon team was tasked with migrating the data of a legacy application into Dataverse. The legacy application data was created from the eShopOnWeb project and was hosted in Azure SQL. Our approach was to break this use case down into two problems:

  1. Migrate the tables (schema)
  2. Migrate the data

The Power BI Model View of our data structure in the screenshot below shows the entities and relationships we needed to migrate. Because the schema did not need to change during the migration, the diagram represents both the source and destination data models.

Power BI Model View


Migrate the Tables

We evaluated three techniques for migrating the tables into Dataverse:

  1. Create the tables automatically during the data migration when using Dataflows 
  2. Write an application that creates the columns through the Dataverse Web API 
  3. Manually create them in the portal 

Option 1: Dataflows
Tables can be created in the process of moving data into Dataverse using Dataflows. This is only an option if you do not need to modify the schema while migrating the data.

Option 2: Dataverse Web API
The Dataverse Web API provides a RESTful web service for interacting with data in Microsoft Dataverse from a wide variety of platforms and programming languages, such as C#. This is an excellent option to consider if you’d like to migrate the schema programmatically, especially when you need to change the data model in the process.
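To make Option 2 concrete, the sketch below builds the JSON body that a POST to the Web API’s EntityDefinitions endpoint expects when creating a custom table. The table and column names (new_CatalogItem, new_Name) are hypothetical, and authentication (an OAuth bearer token) is omitted, so treat this as an illustration rather than a drop-in migration tool.

```python
import json

def label(text: str) -> dict:
    """Dataverse localized-label structure (1033 = English)."""
    return {
        "@odata.type": "Microsoft.Dynamics.CRM.Label",
        "LocalizedLabels": [{
            "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
            "Label": text,
            "LanguageCode": 1033,
        }],
    }

def build_table_definition(schema_name: str, display_name: str, primary_column: str) -> dict:
    """Request body for POST {org}/api/data/v9.2/EntityDefinitions."""
    return {
        "@odata.type": "Microsoft.Dynamics.CRM.EntityMetadata",
        "SchemaName": schema_name,
        "DisplayName": label(display_name),
        "DisplayCollectionName": label(display_name + "s"),
        "HasActivities": False,
        "OwnershipType": "UserOwned",
        "Attributes": [{
            # Every table needs exactly one primary-name string column.
            "@odata.type": "Microsoft.Dynamics.CRM.StringAttributeMetadata",
            "SchemaName": primary_column,
            "IsPrimaryName": True,
            "MaxLength": 100,
            "DisplayName": label("Name"),
        }],
    }

body = build_table_definition("new_CatalogItem", "Catalog Item", "new_Name")
request_json = json.dumps(body)  # this string would be POSTed with a bearer token
```

A real migration tool would loop over the legacy schema and emit one such request per table, then add the remaining columns the same way.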

Option 3: Manual
Manually creating tables in the portal is the simplest of the three options but could be time-consuming and error-prone if you make many tables or migrate the same schema into multiple environments. However, given the time constraint of our hackathon and the simple use case, we chose to use this option.

Migrate the Data

Once the tables are in place, Dataverse is ready to receive data from the source system. We evaluated three options to migrate data from the legacy application to Dataverse.

  1. Import from a CSV file
  2. Use Power Query
  3. Use Dataflows

Option 1: Import from CSV
You can load data into Dataverse by importing it from files such as Excel or CSV. This is an excellent option if you need a one-time import of data that does not need to be transformed and does not include unsupported data types, such as time zones, images, or multi-select choices. During the hackathon, we connected to the legacy SQL database using SQL Server Management Studio and exported the data to a CSV. We then completed the import process in the portal by uploading the CSV and mapping the source and destination columns.

Map source to destination columns
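The export half of this option is easy to script. Below is a minimal sketch using only Python’s standard library, with sqlite3 standing in for the legacy Azure SQL database (with a real driver such as pyodbc, only the connect call differs); the CatalogItem table and its columns are made up for illustration.

```python
import csv
import io
import sqlite3

# sqlite3 stands in for the Azure SQL source in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE CatalogItem (Id INTEGER, Name TEXT, Price REAL)")
conn.executemany(
    "INSERT INTO CatalogItem VALUES (?, ?, ?)",
    [(1, "Roadster", 1200.0), (2, "Helmet", 89.5)],
)

# Write the rows to CSV; the header row is what the Dataverse import
# wizard uses to map source columns to destination columns.
buf = io.StringIO()
cur = conn.execute("SELECT Id, Name, Price FROM CatalogItem")
writer = csv.writer(buf)
writer.writerow([col[0] for col in cur.description])
writer.writerows(cur.fetchall())

csv_text = buf.getvalue()
```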

Option 2: Power Query
Users can filter, transform, and combine data before loading it into a new or existing Dataverse table. The source can be an online or on-premises system, including SQL Server, Salesforce, IBM Db2, Access, Excel, or a Web API. Use this option if you are moving a large volume of data or if the data being moved needs to be reshaped during the migration.
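Power Query steps are authored visually and stored in the M language, so the snippet below is only a rough stand-in: the same kind of filter-and-reshape step expressed in plain Python, to show what “reshaped during the migration” means. The rows and column names are invented.

```python
# Source rows as they might arrive from a legacy system: everything is
# text, and discontinued products should not be migrated.
rows = [
    {"Name": "Roadster", "Price": "1200.00", "Discontinued": "1"},
    {"Name": "Helmet", "Price": "89.50", "Discontinued": "0"},
    {"Name": "Gloves", "Price": "25.00", "Discontinued": "0"},
]

# Drop discontinued items and convert Price from text to a number --
# the equivalent of a couple of Power Query transformation steps.
shaped = [
    {"Name": r["Name"], "Price": float(r["Price"])}
    for r in rows
    if r["Discontinued"] == "0"
]
```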

Option 3: Dataflows
Dataflows are built upon Power Query, so they have all the same benefits but bring the added advantage of letting users trigger the migration on-demand or automatically on a schedule.

We will be diving deeper into each team, so stay tuned for more blog posts around our AIS Internal Hackathon!

Authored by Jagrati Modi (team lead)
Thank you to the Data team for sharing their experience:

  • Jagrati Modi (team lead)
  • Souradeep Banerjee
  • Nikhil Grover

Project Background

From August 2020 to this February, I worked as a Business Analyst (BA) on a project team helping a major insurance company modernize thousands of InfoPath forms with the Microsoft Power Platform. Our client needed to retain the critical business functionality and workflows of these forms before Microsoft ends its extended InfoPath support in July 2026. Our efforts included modernizing the InfoPath forms using Power Apps canvas apps, Power Automate, and some Power BI.

It was a great experience working on this project. First of all, I got to work with a project team with members specializing in different technical areas who were motivated to mentor and learn from each other. I learned about the Power Platform, Power Platform accessibility, Modern SharePoint lists, and some QA testing basics.

At the same time, as a BA with an extensive background in user experience (UX) research, I contributed to the project’s success by sharing and implementing UX best practices and helping improve the overall usability of our applications. Throughout the project, I had opportunities to:

  • Lead project discovery that followed a user-centered design process involving multiple activities and steps, within our internal AIS team and with our client stakeholders.
  • Prototype the new solution first in Axure and later on directly in the Power Platform – the low-code/no-code, connected, and secure platform, which empowers citizen developers like me to solve business problems.
  • Work with our team to improve overall form design for better usability, including improving form labels, instructions, and controls.

I describe my overall project experience in a two-part blog post:

  • Part 1 (this post) focuses on how we followed a user-centered design process to conduct InfoPath form discovery.
  • Part 2 compares prototyping in Axure and Power Apps and illustrates how we improved form design from InfoPath to Power Apps.

User-Centered Design/Discovery Process

Our project included ten members: 1 Project Manager, 1 Power Platform Solution Architect, 3 Power Platform Developers, 2 Accessibility Specialists, 1 UX Architect, 1 QA Test Engineer, and me as the BA. The Power Platform Solution Architect, UX Architect, and I formed the discovery team, with constant support and participation from our Project Manager.

Our project followed an Agile process with two-week sprints, and our discovery team of three worked on the target InfoPath forms a sprint ahead of the rest of the team members. When our Developers and Test Engineer were ready to start developing and testing the new app, we had already completed the needed discovery. We received the client’s approval on the app requirements. Figure 1 below depicts our simplified discovery process.

Design Discovery for Agile Process

Detailed Discovery Steps

In reality, multiple activities and meetings could occur in each step, and below are more details of how we conducted discovery.

  1. While familiarizing ourselves with the target InfoPath form(s), our discovery team focused on the forms themselves and reviewed the relevant SharePoint site(s), different SharePoint list views, and workflows.
    • It was helpful for us to understand the form context, which was necessary for our conversations with stakeholders.
    • I created a Stakeholder Interview Guide Template at the beginning of the Project and constantly updated the Template with tailored questions.
    • We might hold an internal meeting about the initial findings before we met with client stakeholders.
  2. Regarding our initial meeting with key stakeholders/form owners:
    • To ensure their attendance, I always tried to schedule meetings at least one week in advance.
    • During the meeting, I followed the Interview Guide, focusing on their existing use of the InfoPath forms and workflows, what worked well, what pain points they experienced, and what wish lists they wanted.
    • I would schedule follow-up meetings if needed, especially if multiple InfoPath forms would be modernized together.
  3. Armed with a good understanding of the target forms, we would meet internally to debrief and propose a new solution.
    • Sometimes a modern SharePoint list would suffice, but most frequently, a canvas app would be required to replace existing forms, especially when multiple or similar InfoPath forms needed to be modernized.
    • Our Solution Architect would also check data integration or connection with other systems outside the form(s), ensuring that such integration would continue to work in the new solution.
    • Power Automate would be used for associated workflows, such as email notifications, permission/role-based access control, and user profile fields.
  4. After our discovery team agreed that a canvas app should be used, I would create an Excel spreadsheet to document app requirements, while our UX Architect would start to create a prototype.
    • The prototype would illustrate our design ideas, provide a visual representation, and include some interactions of the new solution.
    • I always created at least three worksheets in the Excel spreadsheet:
      • Background: Summary of what we learned in client meetings, such as the purpose of the existing form(s), description of form users, and form owners’ contact info.
      • URLs: Links to the existing form(s) and SharePoint site(s).
      • Acceptance Criteria: Detailed requirements of the canvas app.
    • During this step, the UX Architect and I would meet multiple times to ensure that the documented requirements and prototype would match.
    • When needed, we would consult our Solution Architect to ensure that the requirements and prototype took full advantage of the Power Platform capabilities.
    • I acted as both a BA and UX Architect for several forms, documenting requirements and creating prototypes, which simplified this step.
  5. When we met with stakeholders again, we showed them both the prototype and requirements and emphasized that no actual coding had occurred. Doing so would encourage them to provide any additional suggestions if needed so that the new solution would meet their business needs.
    • The Excel file and prototype complemented each other well, especially in areas where the prototype did not cover. For example, instead of making workflows work in the prototype, which would require a lot of coding, I would specify that in a requirement explaining when and who would be notified when a form was submitted.
    • Sometimes we needed to go back and adjust our prototype and requirements before we met with the stakeholders again to get their final approval to our proposed solution.
  6. As the last discovery step, we would meet with our Developers and QA Test Engineer, walking through the prototype and detailed requirements accessible from a SharePoint document library.
    • The meeting acted as a knowledge transfer session from the Discovery team to all other members.
    • Based on the information, Developers would create tasks in Azure DevOps (ADO) to restructure data, estimate needed screens or forms, report data in Excel or Power BI, and create flows.
    • The Test Engineer would create the necessary test plan, cases, and steps in ADO.

Ideally, our Discovery team could answer all questions from our Developers and Test Engineer at this step. However, from time to time, they would ask us a few questions or details that we could not answer, which required us to go back to client stakeholders for further clarification. When that happened, I would update the Stakeholder Interview Template to cover such questions in future discovery.

Examples of such questions included:

  • Will form submitters be allowed to edit their submissions?
  • Should they be allowed to save a draft before submission, or should they fill out the entire form in one sitting?
  • Should there be an app Administrator screen so that stakeholders could make future updates easily within the app, or will they be comfortable enough to make updates directly from the back-end, such as a SharePoint list?

In Summary

As described above, having a designated Discovery team in a project with members specializing in UX and technology solutions worked well and significantly contributed to our project success. In general:

  • Following a user-centered design/discovery process helped the project team work efficiently, avoided surprises in later development and testing, and saved overall project time.
  • It was important for project discovery to start at least a couple of weeks ahead of the rest of the team.
  • Involving client stakeholders early and throughout discovery was crucial in ensuring that the new solution would meet the client’s business needs.
  • Discovery artifacts, such as the prototype and requirements, should be available at a central location so that all team members could easily refer to them and clearly understand how the new solution should function.

In Part 2 of this blog series, I will compare my experience prototyping in Axure and the Power Platform. I will also share a few form design and content development best practices that we implemented when we modernized InfoPath forms to Power Apps.

The landscape for Power BI licensing is complex and evolving. To begin to grasp the structure and detail, let’s break it down. Today I am asking: what are the capabilities of using a free license?

Really, It’s Free

More than one path exists to obtain a free account. For our exploration in this article, I will use two:

  • Individual Free Account
  • Office 365 (Dev Tenant) Member Free Account

Let’s frame this in a story with two fictional characters. Cary Small started a small one-person business during the pandemic and wants to send a graph of her sales data to her small business Meetup group friends. Max Rockatansky is her friend. As an employee for a small company that subscribes to Microsoft 365, he was assigned a free account by the M365 tenant admin.

Fictional Characters

Free Account User | Source of Free Account | Different Experience in Power BI
Cary Small | Individual Free Account | Cannot publish to the public
Max Rockatansky | Assigned account by M365 tenant admin* | Can publish to the public if the admin has allowed it

*admin name Imperator Furiosa 😀

Let’s See This in Action

Cary created her Individual Free account from a personal domain name email address to try out Power BI, let’s say carys@mysweetstartup.biz. A link to create this type of account is in the Links section at the end of this article. Email addresses from free email companies such as Gmail, Hotmail, or Outlook.com are not accepted. This account type is interesting because it is the free-est of the free; no Office 365 membership is needed.

Create Free Individual Account form

Power BI Desktop

With her new account, the starting point for Cary is to install the Power BI Desktop application. This application is free to download, install and use, even without ever logging in to any account. A link to the download is included at the end of this article. Power BI Desktop is the starting point to generate and shape reports and visualizations.

Install Power BI Application

Here Cary imports her spreadsheet of sales data. She can transform it and create a visualization or report all before logging in. Within an hour, Cary has completed her graph of sales data:

Cary's Sales Chart

Now she wants to share it with her Small Business Meetup friends. The report must be published to the Power BI Service. Navigate to Home -> Publish in the menu. Cary will then see a prompt to log in to her Free Account.

To Publish and Share a Login is required

A dialog asks for a destination. The only option for the free account is My Workspace, which is accessible only to the account owner. Publishing to shared Workspaces or App Workspaces for collaboration is a feature of the paid subscriptions. Cary selects My Workspace and clicks Next.

Publish workspace with Power BI

Success! Her report was published to My Workspace in the Power BI Service. A dialog appeared with the link to open the Power BI website.
Publishing to PowerBI

Profile and Account Status

Once on the Power BI site, click on your profile in the upper right corner. The dropdown detail will show your status: Free account, with a prompt to purchase a Pro upgrade.

Profile Info shows free account

Sharing the Report

Now that Cary’s report is published, she can explore her options for sharing the chart. Under the Export menu item are options to export to PowerPoint or PDF, or to Analyze in Excel; the resulting files can be emailed as attachments.

Export to file options

For a free account, this functionality is nice, but there is more on offer.

Functionality features

Under the Share menu, she finds three options (Figure 9). The first option emails a link; when she clicks it, a Pro license requirement pops up. The last option is a QR code that requires the device to be logged into Power BI. Neither of those is helpful. But the middle one, Embed report, offers a selection to Publish to web (public), so she tries it.

Publish to the Public Web

Hitting a Roadblock

We hit another impediment:

Roadblock

A recently added Power BI admin setting requires an admin to explicitly permit public sharing. As an Individual Free Account holder, Cary has no admin to approach; her account is not in a tenant that she can access. At this point, Cary can share only by exporting a file. She thinks this is pretty good for free! But she would also like to embed the report in her free GitHub Pages website.

Max comes for help

Cary calls her friend Max for help. Fortunately, Max does have the capability to publish a public Power BI report. To do this, he must gain access to Cary’s computer, so they start a Zoom meeting and follow the steps below:

  1. Close the browser window to apps.powerbi.com.
  2. Return to Power BI Desktop – Cary signs out.
  3. Max will sign in.
  4. Publish again, choose My Workspace.
  5. On the Power BI Service website, sign out as Cary and sign in as Max.
  6. Choose Embed -> Publish to Web.
  7. Now they see a better dialog (Figure 12):

Publish sharing is now possible

Click Create Embed Code. A dialog cautions you to be careful: “Public” means completely public, so any sensitive data would be compromised. In our case, this is all a demonstration and the data is not real. Let’s go! Click Publish.

Embed in a public website

Success! You now have both an iframe tag and a link that you can email. Cary tested this by emailing the link to her phone and clicking it; the report opened directly to the chart with no impediment and no login. She then embedded the link on her startup website, opened Max’s LinkedIn page, and wrote him a glowing compliment.
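For reference, the embed code that Publish to web hands back is a standard iframe along these lines; the src URL below is a placeholder, not a real embed token:

```html
<!-- Hypothetical example: Power BI generates the real value of ?r=... -->
<iframe title="Cary's Sales Chart"
        width="800" height="506"
        src="https://app.powerbi.com/view?r=PLACEHOLDER_EMBED_TOKEN"
        frameborder="0" allowFullScreen="true"></iframe>
```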

Getting started with PowerBI

A startup business might not be ready to invest in Premium capacity or Pro licenses. With an Office 365 account, even the free license offers report sharing. The sharing is limited, and there is no collaboration option, but the business can still convey data visualizations. This is a great place to get started with Power BI.

Links

Do your users want reports in SharePoint? Yes! They crave reports and charts. Regardless of which version of SharePoint they are using, getting started now to build their data visualizations in Power BI will position the reports for seamless migration to future SharePoint versions. These are the necessary steps to take to add a simple report in a SharePoint modern page.

Important note: To embed Power BI reports in SharePoint Online, a Power BI Pro license is required.

The Process Flow

This is the flow:

The Process Flow Figure 1

Figure 1: The Process Flow

  1. Create your data in SharePoint; say a list or library.
  2. Start Power BI Desktop to connect to SharePoint to read the data and transform it and create visualizations.
  3. Publish to Power BI Online, where a link becomes available to paste into the Power BI webpart available in SharePoint Modern pages.

We’ve gone full circle! Let’s look at the steps.

SharePoint Online Data

For this example, we will read data from a simple custom list. I added the list to my SPO Dev Tenant site named Vacation Planner. Since all our “vacay” are now “staycay,” I decided to make a board game list. Along with the default Title column that you get with any new list, I added three more. Each is a number column. Then I added games to the list; I listed every game I could think of. For each one, I entered somewhat random numbers for Difficulty and Minimum age and Popularity, although I am pretty sure Candy Land is for 4-year-olds.

SharePoint Online Data Figure 2

Figure 2: Board games list

To create the list, I was logged into SPO as a fictitious test user I named Gwen Lim.

Build the Report

Install the Power BI Desktop application to build the report. It’s free: download it here.

On first use, you will be prompted to sign in. If the login type option appears, choose “Organizational” and log in with a Windows account. I logged in with fictional Gwen Lim’s account. In the app, choose “Get Data,” either from the startup splash screen or from the menu.

Select Data Source Figure 4

Figure 3: Select a data source

From the Common data sources dropdown, select “More…” at the bottom. Then click the “Online Services” option, and you should see “SharePoint Online List” on the right. Select that and then click “Connect” at the bottom.

Choose SharePoint online Figure 5

Figure 4: We will choose SharePoint online list

In the SharePoint Online Lists dialog, paste the URL of the SharePoint site that contains your list. You can check the 2.0 (Beta) radio button (see Figure 5) to enable the app to open the default view of your list, or leave it at 1.0 if you prefer.

SharePoint Site URL Figure 6

Figure 5: Enter the SharePoint site URL

A Navigator window appears, listing all of the lists available in the SharePoint site in the left column with checkboxes. Check BoardGames to see a preview of the data on the right side of the pane. Click the Load button.

Select the List Figure 7

Figure 6: Select the List

You can start building the report. The fields of the data display on the right side. Having chosen a specific, limited column view as the default for the list, along with selecting the 2.0 radio button, you will see only a few fields (aka columns) on the right, which is easy to work with.

BoardGames List App

Figure 7: The BoardGames list fields appear

Ignore the fields for a moment while you choose a visualization. Select the doughnut. Now it’s time to apply fields to the doughnut: drag Title into the Legend box under Visualizations, and a legend appears beside the doughnut chart. Drag Popularity into the Values box, and your doughnut comes to life with color.

Pick a visualization Figure 9

Figure 8: Pick a visualization chart and add fields

When you hover over the chart, tooltips appear with data for each game. Age level, Difficulty, and Popularity values were imported as decimals, which would be more readable as whole numbers. To alter this, and to edit column heading text, click the ribbon’s Transform Data button.

Modify the Data Figure 10

Figure 9: Modify the data

To change a column from a decimal to a whole number, click the column title to select it and then click the ribbon’s Data Type button. Select Whole Number, as shown in the figure. Double-click the column heading to rename the column.

Changing field titles and data types

Figure 10: Changing field titles and data types

Click the Close & Apply button on the left of the ribbon to apply the changes to the visualization. Now when you hover your cursor over a section, the tooltip shows Minimum Age (renamed, with a space) and both values as whole numbers.

Ready to Publish Figure 11

Figure 11: Improved tooltips

Display in SharePoint

To display the report in SharePoint, click the Publish button in the ribbon on the right side. You will be prompted to save your report in .pbix format.

Ready to publish report figure

Figure 12: Ready to publish!

Save anywhere you want to keep it, and then the Publish to Power BI dialog appears. Varied workspaces can be configured, but initially, you only have “My Workspace” as an option, which is fine. Select it and then click “Select.”

Publishing to Power BI

Figure 13: Successful publish to Power BI Online

When you see the Success! dialog, click the link to open the .pbix in Power BI online and view your report. In the Share menu above the report, drop down the menu options and hover over Embed report. Here you want to see an option for SharePoint Online.

Link to use in SharePoint page Figure 14

Figure 14: Get a link to use in a SharePoint page

This option will be missing until you upgrade to a Power BI Pro license. Pro is not free, but there is a 60-day trial. Once you have that option in the menu and select it, you are rewarded with the Embed link to use in SharePoint.

Embed link for SharePoint

Figure 15: Click to highlight and then copy

Click that link to highlight it and copy. Now head over to your SharePoint site and create a page.

Locate built-in Power BI

Figure 16: Locate the built-in Power BI webpart

Click the webpart plus sign, and in the search box, type “power” to filter the results. The Power BI webpart will appear. Click on it, and the webpart will be inserted into your page. You will see a green button for Add report; click it to open the properties panel on the right. Paste in the embed link you got from Power BI online.

Apply the embed link

Figure 17: Apply the embed link

Click away from that textbox and your report will appear on the left.

Report Displayed Correctly

Figure 18: Report successfully displayed in SharePoint Online

Conclusion

This is a no-code solution and a simple demo. However, the depth of tooling provided by Power BI to enable developers and business data experts to transform and visualize organizational data is immense. The speed and ease with which we can integrate data reporting into SharePoint modern pages will be welcome to customers as they migrate to current SharePoint versions.

Links

Embed a report web part in SharePoint Online – Power BI | Microsoft Docs

The Internet of Things, also called the Internet of Everything or the Industrial Internet, is a technology paradigm envisioned as a global network of machines and devices capable of interacting with each other. It is a network of Internet-connected devices that communicate embedded sensor data to the cloud for centralized processing. Here we build an end-to-end solution from device to cloud (see the reference architecture diagram below), covering all aspects of an IoT application such as alerting an operator, shutting down a system, and more.

About the Microsoft Professional Program (MPP)

This program teaches the device programming, data analytics, machine learning, and solution design skills needed for a successful career in IoT. It is a collection of courses that build skills in core technology tracks, helping you keep up with the industry’s latest trends. These courses are created and taught by experts and feature hands-on labs and engaging communities.

Azure IoT reference architecture diagram

Benefits

A.T. Kearney: The potential for the Internet of Things (IoT) is enormous. It is projected that the world will reach 26 billion connected devices by 2020, with incremental revenue potential of $300 billion in services.

McKinsey & Co: The Internet of Things (IoT) offers a potential economic impact of $4 trillion to $11 trillion a year by 2025.

For a partner, having certified people makes it possible to serve customers on IoT projects, while developers can work on these projects and explore this new area.

MPP vs Microsoft Certification

The professional program helps you gain technical, job-ready skills and real-world experience through online courses, hands-on labs, and expert instruction within a specific time period. It is a good starting point to get your hands dirty with the technologies through practical work, rather than the classic certification style of learning from a book. In the MPP you are asked questions during the modules, and you must complete all the labs to be ready for the module exam, where you have to set up a solution from scratch; only if your solution is correct will your answers be correct.

This program consists of eight different courses.

  • Getting Started with IoT
    This is a generic introductory IoT course that provides a broad perspective on the IoT ecosystem. It covers the concepts and patterns of an IoT solution, how IoT can support business needs in industries like Manufacturing, Smart City/Building, Energy, Healthcare, Retail, and Transportation, and the components of an IoT architecture.
  • Program Embedded Device Hardware
    Here you will learn the basics of programming resource-constrained devices. You will also pick up programming best practices for working with embedded devices, and you will get practice developing code that interacts with hardware, SDKs, and devices that connect to various kinds of sensors.
  • Implement IoT Device Communication
    Explains the cloud gateway, Azure IoT Hub, which helps you connect and manage IoT devices and configure them for secure cloud communication. Azure IoT Hub enables secure two-way communication between devices and the cloud. You will provision simulated devices using client tools such as the Azure CLI, perform management tasks while examining aspects of device security, and use the Device Provisioning Service to provision devices at scale.
  • Analyze and Store IoT Data
    Covers analyzing data, storing it, and configuring the latest tools to meet the data analytics and storage requirements of IoT solutions. Explains concepts related to cold storage and how to set up Azure Data Lake for cold storage, plus analysis and concepts for warm storage, using Azure Cosmos DB as an endpoint to receive data from Azure Stream Analytics jobs. Also covers the analytic capabilities of the Azure IoT Edge runtime: setting up Stream Analytics to run on a simulated edge device and using its querying, routing, and analysis capabilities.
  • Visualize and Interpret IoT Data
    In this course you explore Time Series Insights, real-time streaming, predictive models, and data visualization tools: how to build visualizations with Time Series Insights and how to create secure Power BI Service dashboards for businesses. It covers the characteristics of time series data and how it can be used for analysis and prediction, how IoT telemetry is typically generated as time series data, and techniques for managing and analyzing it with Azure Time Series Insights, which can store, analyze, and instantly query massive amounts of time series data. It closes with a general introduction to Power BI, with specific emphasis on how Power BI can load, transform, and visualize IoT data sets.
  • Implement Predictive Analytics using IoT Data
    Covers predictive analytics for IoT solutions through a series of machine learning implementations that are common in IoT scenarios, such as predictive maintenance.
    It describes machine learning scenarios and algorithms commonly pertinent to IoT, how to use the IoT Solution Accelerator for Predictive Maintenance, how to prepare data for machine learning operations and analysis, how to apply feature engineering within the analysis process, and how to choose appropriate machine learning algorithms for given business scenarios. You will also identify target variables based on the type of machine learning algorithm and evaluate the effectiveness of regression models.
  • Evaluate and Design an IoT Solution
    Learn to develop business planning documents and the solution architecture for an IoT implementation. To build massively scalable Internet of Things solutions in an enterprise environment, it is essential to have tools and services that can securely manage thousands to millions of devices while providing the back-end resources required to produce useful data insights and support the business. Azure IoT services provide the scalability, reliability, and security, as well as a host of functions to support IoT solutions in any vertical marketplace (industrial/manufacturing, smart city/building, energy, agriculture, retail, etc.). In IoT Architecture Design and Business Planning, you will learn approaches to designing, proposing, deploying, and operating an IoT architecture.
  • Final Project
    The final project evaluates the knowledge and skills you acquired by completing the other IoT courses. Rather than teaching new skills, it assesses what you already know, with the emphasis placed on hands-on activities: a real-world project scenario verifies that you have the skills required to implement an Azure IoT solution. The challenge activities section includes a series of tasks in which you must design and implement an Azure IoT solution encompassing many of the technologies covered; you will need to apply many different aspects of the training within your solution in order to be successful. On completing this project, you should feel confident to start an IoT architect career, with the ability to design and implement a full IoT solution.
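The warm-path analysis described in the storage course (Stream Analytics jobs aggregating telemetry before writing to an endpoint such as Cosmos DB) can be sketched in plain Python. The function below is a hypothetical stand-in for a tumbling-window aggregation; in a real solution this logic would be written in the Stream Analytics query language, and the function and event shapes here are illustrative assumptions:

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds):
    """Group telemetry events into fixed, non-overlapping time windows and
    compute the average reading per window -- the same shape of aggregation
    a Stream Analytics tumbling window performs before output."""
    windows = defaultdict(list)
    for timestamp, value in events:
        # Each event falls into exactly one window, keyed by the window's start.
        window_start = timestamp - (timestamp % window_seconds)
        windows[window_start].append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(windows.items())}

# Simulated device readings: (unix timestamp, sensor value)
events = [(0, 20.0), (10, 22.0), (65, 30.0), (70, 32.0)]
print(tumbling_window_avg(events, 60))  # {0: 21.0, 60: 31.0}
```

Each one-minute window's average would then be written to warm storage for dashboards, while the raw events flow to cold storage such as Azure Data Lake.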

Real-World Scenario

Consider simulating a weather station located at a remote site. The station sends telemetry data to the cloud, where the data is stored for long-term analysis and monitored in real time to ensure the wind speed does not exceed safe limits. Should unsafe wind speeds be detected, the solution initiates an action that, in the real world, would send alert notifications and apply the wind farm turbines' rotor brakes so the turbines do not over-rev.
The Proof of Value must satisfy the following functional requirements and constraints:

  • Every turbine in the simulated farm will leverage several sensors to provide the telemetry information relating to turbine performance and will connect directly (and securely) to a network that provides access to the internet.
  • Demonstrate the use of Time Series Insights to view the wind turbine telemetry data.
  • Route telemetry to storage appropriate for high-speed access for Business Intelligence.
  • Create a dashboard in Power BI Desktop that displays telemetry as line charts and gauges.
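As a sketch of the monitoring requirement, the hypothetical Python snippet below simulates wind-speed telemetry and applies the safety check. In the actual solution the messages would be sent to IoT Hub by a device client and the brake action triggered in the back end; the function names and the 10 m/s threshold here are illustrative assumptions, not part of the challenge specification:

```python
import json
import random

SAFE_WIND_SPEED = 10.0  # illustrative safety limit in m/s

def read_sensor():
    """Simulate one wind-speed reading from the weather station."""
    return round(random.uniform(0.0, 15.0), 2)

def build_telemetry(turbine_id, wind_speed):
    """Package a reading as the JSON message a device client would send."""
    return json.dumps({"turbineId": turbine_id, "windSpeed": wind_speed})

def check_safety(wind_speed):
    """Return the action to take: brake the rotor if the limit is exceeded."""
    return "apply_rotor_brake" if wind_speed > SAFE_WIND_SPEED else "ok"

# Emit a few simulated readings with their resulting actions.
for _ in range(5):
    speed = read_sensor()
    print(build_telemetry("turbine-01", speed), "->", check_safety(speed))
```

Routing these JSON messages through IoT Hub to both Time Series Insights and warm storage would then satisfy the visualization requirements above.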

Time Series Insights Graph

Graph Demo 1

Graph Demo 2

Graph Demo 3

Driving value, lowering costs, and building your organization’s future with Microsoft’s next great business technology

Lately, I’ve been helping folks understand the Microsoft Power Platform (MPP) by sharing two simple diagrams.

The first one is below and is my stab (others have made theirs) at contextualizing the platform’s various components in relation to one another.

The Common Data Service (CDS) is the real magic, I tell people. No matter which app you are using, the data lives there in that one CDS across your entire environment. (And no, folks outside your organization don’t get to use it.) This means that data available to one of your apps can be re-used and re-purposed by your other apps, no wizardry or custom integration required. I promise, it just works. Think expansively about the power of this in your organization, and you’ll come up with some cockamamie/brilliant ideas about what you can do.

These are the types of data-driving-business-function that geeks like me always dreamed of.

A diagram of Microsoft Power Platform components

Then there’s Power Apps, in purple. Most folks think of this as a low-code/no-code app development tool. It is, but it’s more. Imagine that there are three flavors of Power Apps:

  1. Dynamics 365, which in the end is a set of really big Power Apps developed by Microsoft
  2. COTS apps developed by Microsoft partners (including AIS), available for organizations to license and use
  3. Custom apps you build yourself

Point Microsoft Power BI at all of this, then mash it up with data from outside of your CDS that you get to via hundreds of out-of-the-box connectors, automate it all together with workflows in Flow…and you’ve got Power Platform in a nutshell.

When I’m presenting this to a group, I turn to my next slide pretty quickly at this point.

A rearranged look at Microsoft Power Platform

Here I’ve essentially re-arranged the pieces to make my broader point: When we think about the Power Platform, the emphasis needs to be on the Platform bit. When your organization invests in this technology, say via working with an implementation partner such as AIS or purchasing Power Apps P1/P2 licenses, you’re not just getting a product or a one-off app solution.

What you’re getting is a platform on which to build your modern business. You’re not just extending Office 365. Instead, you’re creating a future where your organization’s data and business processes are deeply integrated with, driving, and learning intelligently from one another.

The more you leverage the platform, the higher the ROI and the lower the marginal costs of those licenses become. A central goal of any implementing partner ought to be guiding organizations on the journey of migrating legacy systems onto the platform (i.e., retiring legacy licensing + O&M costs) and empowering workers to make the platform even more valuable.

We don’t invest in one-off apps anymore, i.e. a CRM in one corner of your network where you run your sales, something in another where you manage your delivery, clunky Human Resources Management off over there where you take care of your people, etc. No, what we care about here is the platform where you integrate all of the above — not through monolithic one-size-fits-all ERP — but rather through elegant app experiences across all your users’ devices that tie back to that magical Common Data Service.

This is what I mean when I tell folks sky’s the limit, and thinking about your entire business is what’s called for here. It’s because Power Platform gives us the ability to learn and grow with our customers, constituents, vendors, employees, and other stakeholders like never before.

That’s what has everyone at Microsoft so excited. I am as well.

I want to learn from you. How do you make Power Platform understandable to those who haven’t thought about it too deeply? How does your organization make it valuable as a platform rather than just a product? I love to build beautiful things, so inspire me!

Join us tomorrow for a free webinar with AIS’ CTO and Microsoft MVP Vishwas Lele on Microsoft Power Apps and Flow. This webinar is designed to show you how to easily create Power Apps applications, and how to best take advantage of the recently introduced Power Apps custom visual for Power BI.

Vishwas will showcase a Power Apps application that is essentially a “portal” for existing Line of Business Enterprise Applications (inventory, contracts, etc.) and Services (Dynamics, O365, DropBox, etc.). Through the use of Power Apps features like the out-of-the-box connectors, integration with Flow and mobile enablement, you’ll learn how easy it is to build an app that allows users to have all the information they need in one location and on the device of their choice.

The webinar kicks off TOMORROW at 10 a.m. ET. Watch it right here or on Microsoft’s Power BI YouTube.

Steve Michelotti and I presented a session on AzureGov last week at Microsoft Ignite 2017 in Orlando. It focused on demonstrating the innovative capabilities in AzureGov that are specifically designed to help government agencies with their mission. We dedicated about 80% of the session to live demos.

Steve started out with a brief description of AzureGov and how to get started…along with some recent news announcements, including API Management and Key Vault. Steve then quickly transitioned into demos related to Cognitive Services, Azure IoT and Power BI. I conducted two demos related to the Cosmos DB Graph database and the CNTK deep learning toolkit on an N Series GPU machine.

Please watch the video below and let us know if you have any questions.

Recently we collaborated with Microsoft and Prospect Silicon Valley (ProspectSV) on a project to assess the viability and value of several Azure services. Specifically, we were asked to demonstrate how the cloud-based platform could be used to retrieve, store, visualize and predict trends based on data from multiple sources. In order to demonstrate these capabilities, we built an ASP.NET MVC application leveraging the following Azure components:

  • Azure App Services
  • Azure Machine Learning
  • Azure Power BI Embedded
  • Azure Storage

Figure 1 (ProspectSV Application Architecture) depicts how the system uses these four Azure components. This diagram also describes which external data sources are used and where that data is stored.

When you read about the Internet of Things, you often hear about connected cars, connected kitchen appliances, small devices that let you order things quickly, or other consumer-grade applications. In this post, I will quickly describe a recent IoT project I worked on where the devices are not small consumer-grade sensors…they are large industrial manufacturing machines.

In this case, machines arranged on a shop floor are responsible for cutting metal into various shapes. These machines must be both very powerful and very precise, and they have robotic arms that are programmed to grip specialized tools for this activity. The machines use the MTConnect protocol as the common language for communicating their operational status and the results of any action taken. The data is then streamed to a central collection point for analysis. Older machines have adapters installed so they can stream their data in the same common language.
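MTConnect agents publish machine state as XML documents over HTTP, so the collection point's job is largely parsing and aggregating those documents. The sketch below parses a simplified, hand-written fragment in the spirit of an MTConnect streams response; real agent responses are namespaced and far richer, and the element names and values here are illustrative assumptions rather than a faithful sample from this project:

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical fragment modeled on an MTConnect streams document.
SAMPLE = """
<MTConnectStreams>
  <DeviceStream name="cutter-01">
    <Events>
      <Execution dataItemId="exec" timestamp="2017-09-01T12:00:00Z">ACTIVE</Execution>
      <ToolId dataItemId="tool" timestamp="2017-09-01T12:00:00Z">T42</ToolId>
    </Events>
  </DeviceStream>
</MTConnectStreams>
"""

def machine_status(xml_text):
    """Extract each machine's execution state from a streams document."""
    root = ET.fromstring(xml_text)
    status = {}
    for device in root.iter("DeviceStream"):
        execution = device.find(".//Execution")
        if execution is not None:
            status[device.get("name")] = execution.text
    return status

print(machine_status(SAMPLE))  # {'cutter-01': 'ACTIVE'}
```

Timestamped execution events like these are what make it possible to compute cut times downstream: the span between a machine entering and leaving the active state.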

Our work on this project helped the engineers identify optimal cut times by analyzing the machine activity data. First, we needed to enhance the collection process so that all data was readily available, then to apply the appropriate business rules to identify cut times, and finally to provide quick, actionable feedback on the outcomes.