How We Modernized a Legacy App using Power Platform

In May, AIS held an internal hackathon for Microsoft Power Platform to expose our team to the platform's concepts and approaches through hands-on experience, and to demonstrate the role Power Platform plays in modernizing legacy applications in the cloud.

The Microsoft Power Platform Hackathon was an opportunity for our enthusiastic team to modernize a legacy e-commerce (eShop) application using Microsoft Power Platform. The legacy application, a deployment of eShopOnWeb, helped users find products of interest by browsing and filtering. Users could also add products to their cart and check out. The app also provided an interface for administrators to add, update, or delete products from the catalog. The idea behind this hackathon was that we could build this application on top of Microsoft Power Platform, a low-code application platform, instead of a classic stack such as ASP, ASP.NET, or GSP. We wanted to build the application in Power Apps Portals by dragging and dropping existing components rather than writing code.

A new system was developed to replace the legacy e-commerce application with complete feature parity. The solution included a Power Apps Portal with the same look, feel, and functionality. We used Dataverse as the persistence layer instead of SQL Server and integrated new communication methods, such as sending emails and text messages to users. Additionally, we used a Web API to communicate with the legacy reporting systems through Power Platform connectors, and we provided secure access to the new system using Azure Active Directory B2C.

In addition, the team explored ways to back up and source-control the solution and to automate deployment from one environment to another. The diagram below represents the architecture of our final solution.

[Architecture diagram: final solution built on Microsoft Power Platform]

Power Platform lives up to its slogan, ‘Work Less, Do More’: it reduces work that might otherwise take days or months to a few hours. So let’s dive in and learn how we arranged all these pieces and modernized the legacy application using Microsoft Power Platform.


Technical Approach

Our main motivation was to identify the appropriate approach for bringing applications like this to the cloud at scale. Our innovative approaches and technical depth have given us experience designing, implementing, and modernizing some of the most complex cloud solutions. The application was divided into components, each developed by an individual team, and all the components were then pulled together to deliver the complete Power Apps solution.

When dealing with legacy applications, we drew on approaches we have used in past engagements, outlined below:

  • Rehost: A direct cloud lift and shift. This is a faster, less resource-intensive migration that moves your apps to the cloud without code modification. Rehosting captures the on-premises environment that runs an application (the servers) and moves it directly to the cloud as virtual servers. In this approach, the environment hosting the application is modernized, but the core application itself is not significantly altered.
  • Refactor: This approach modernizes legacy applications by rearchitecting them to target cloud-native “serverless” technologies where possible. Refactoring typically requires more significant recoding of the existing application; however, it takes advantage of the best of what the public cloud has to offer: managed offerings for all application components. In this approach, we re-architect existing applications and deploy them as Platform as a Service (PaaS).
  • Replatform: Essentially, this approach sits between Rehost and Refactor: the entire VM does not move to the cloud; instead, the application is containerized and run on top of Kubernetes.
  • Reimagine: This is typically a greenfield cloud-native rewrite. The amount of code change is high, but you eventually end up with a low operating cost.

The idea of Power Platform and low-code applications is that you can get to the Reimagine approach without spending a lot of effort building a cloud-native application.

In this section, we are going to highlight these components and how they were implemented in our solution:

The front-end team focused on building the Power Apps Portal for end users and a model-driven app for administrators. The Portal allowed users to browse the product catalog, add items to the cart, place orders, view past orders, and manage their profiles. The model-driven app allowed administrators to manage the product catalog just like the legacy application. The team used the Portals Web API to fetch data from Dataverse and Liquid templates to build the web pages.
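
For a concrete flavor of that Portals Web API work, here is a minimal, hedged sketch of the kind of client-side call a portal page can make. The cr_product table, its columns, and the enabled Web API settings are hypothetical stand-ins for the real catalog schema:

```typescript
// Minimal sketch: read catalog data from Dataverse through the Portals Web API.
// Assumes a hypothetical cr_product table (entity set "cr_products") and that
// table permissions and Web API site settings are configured on the portal.
async function getProducts(): Promise<Array<{ cr_name: string; cr_price: number }>> {
  const response = await fetch("/_api/cr_products?$select=cr_name,cr_price", {
    headers: { Accept: "application/json" },
  });
  if (!response.ok) {
    throw new Error(`Portals Web API request failed: ${response.status}`);
  }
  const body = await response.json();
  return body.value; // OData responses wrap results in a "value" array
}
```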

The data team focused mainly on using Microsoft Dataverse as the persistence layer for both the Portal and the admin app. They also migrated schema and data from the legacy datastore to Dataverse, exploring various techniques including dataflows, CSV imports, and custom code.
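
To illustrate the custom-code option, the sketch below pushes one legacy catalog row into Dataverse through the standard Dataverse Web API. The environment URL, table name, and token acquisition are simplified placeholders, not the hackathon's actual values:

```typescript
// Sketch of a custom-code migration step: create a product row in Dataverse
// via the Dataverse Web API. The environment URL and cr_product table are
// hypothetical; the access token would come from an Azure AD app registration.
const ORG_URL = "https://contoso.crm.dynamics.com";

async function migrateProduct(
  accessToken: string,
  legacyRow: { name: string; price: number }
): Promise<void> {
  const response = await fetch(`${ORG_URL}/api/data/v9.2/cr_products`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
    },
    // Map legacy column names onto the new Dataverse schema.
    body: JSON.stringify({ cr_name: legacyRow.name, cr_price: legacyRow.price }),
  });
  if (!response.ok) {
    throw new Error(`Failed to create row: ${response.status}`);
  }
}
```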

The integration team focused on leveraging existing Power Platform connectors to add new functionality to the system. For example, the system sends an order confirmation email to the user using the Office 365 Outlook connector in a Power Automate flow. Similarly, it sends text messages to users through the Twilio connector. The team also leveraged the SQL Server connector for data sync so that the legacy reporting systems remained unaffected.
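
For context on what the connector replaces: under the hood, the Twilio connector calls Twilio's REST API. A rough code equivalent (with placeholder credentials and phone numbers) looks like the sketch below; in the flow itself, the connector handles all of this declaratively:

```typescript
// Rough equivalent of the Twilio connector step: send an SMS by POSTing to
// Twilio's Messages endpoint with HTTP Basic auth. Credentials and phone
// numbers are placeholders.
async function sendOrderConfirmationSms(
  accountSid: string,
  authToken: string,
  to: string,
  messageBody: string
): Promise<void> {
  const response = await fetch(
    `https://api.twilio.com/2010-04-01/Accounts/${accountSid}/Messages.json`,
    {
      method: "POST",
      headers: {
        Authorization: "Basic " + btoa(`${accountSid}:${authToken}`),
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: new URLSearchParams({
        To: to,
        From: "+15005550006", // Twilio test number as a placeholder
        Body: messageBody,
      }).toString(),
    }
  );
  if (!response.ok) {
    throw new Error(`Twilio request failed: ${response.status}`);
  }
}
```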

The DevOps team automated the portal deployment process using Power DevOps Tools and deployed the solution across three environments (dev, test, prod). Since Microsoft Power Platform does not natively support source control and versioning, the team used Azure DevOps as the solution repository and version control system.
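
As a hedged illustration of what such a pipeline can look like, the sketch below uses task names from Microsoft's Power Platform Build Tools extension for Azure DevOps as a representative example; the service connection and solution names are placeholders, not the hackathon's actual configuration:

```yaml
# Hypothetical Azure DevOps pipeline: export a solution from dev and import it
# into test. Task names are from the Power Platform Build Tools extension.
trigger:
  - main

steps:
  # Install the Power Platform tooling on the build agent.
  - task: PowerPlatformToolInstaller@2
  # Export the solution from the dev environment.
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: dev-environment-connection   # placeholder service connection
      SolutionName: EShopSolution                    # placeholder solution name
      SolutionOutputFile: $(Build.ArtifactStagingDirectory)/EShopSolution.zip
  # Import the exported solution into the test environment.
  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: test-environment-connection  # placeholder service connection
      SolutionInputFile: $(Build.ArtifactStagingDirectory)/EShopSolution.zip
```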

The identity team focused on providing secure access to the Portal for different sets of users. The team used Azure AD B2C to decouple identity and access management from the Portal application.
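
Portals wires up Azure AD B2C through site configuration rather than custom code, but as a general illustration of the sign-in pattern B2C enables, a client application using MSAL.js might be configured along these lines (the tenant, user flow, and client ID are placeholders):

```typescript
import { PublicClientApplication } from "@azure/msal-browser";

// Hypothetical B2C tenant ("contoso"), user flow, and client ID; substitute
// real values from your own B2C app registration.
const msalInstance = new PublicClientApplication({
  auth: {
    clientId: "00000000-0000-0000-0000-000000000000",
    authority: "https://contoso.b2clogin.com/contoso.onmicrosoft.com/B2C_1_signupsignin",
    knownAuthorities: ["contoso.b2clogin.com"], // B2C authorities must be listed explicitly
    redirectUri: "https://localhost:3000",
  },
});

async function signIn(): Promise<void> {
  await msalInstance.initialize(); // required by recent versions of msal-browser
  const result = await msalInstance.loginPopup({ scopes: ["openid"] });
  console.log("Signed in as", result.account?.username);
}
```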

Stay tuned! We will be publishing a blog for each team for a deeper dive into their individual focuses for this hackathon.

Lessons Learned

Ultimately, this hackathon proved that Power Platform is a great app modernization option for the following reasons:

  • We can use Portals as a modern low-code alternative for creating websites that interact with data in Dataverse.
  • Model-driven apps provide a rich no-code design environment to create and quickly share applications.
  • We can quickly build secure apps using connectors.
  • Connectors are easily customizable, so end users can change or create the content of email and SMS templates themselves.
  • With the help of Power Platform Build Tools, we can quickly deploy the solution into various environments and increase release frequency.


Thank you to the Internal Power Platform Hackathon Technical Deep Dive team
Authored by Lav

Covers experience from all teams:

Front end
Ritika Agarwal (team lead)
Devyanshi Tiwari
Pooranendu Patel

Data
Jagrati Modi (team lead)
Souradeep Banerjee
Nikhil Grover

Integration
Kranthi Kiran (team lead)
Varalika Bishnoi
Sravan Kumar
Pavan Bandi

DevOps
Vikram Reddy (team lead)

Identity
Lav Kumar (team lead)
Davood Khan

Recommended Reads

In January, AIS’ Steve Suing posted a great article titled “What to Know When Purchasing a COTS Product and Moving to the Cloud.” In his post, Steve discusses some things to consider such as security, automation, and integration. Springboarding off of those topics for this blog, I am going to discuss another major consideration: legacy data.

Many COTS implementations are done to modernize a business function that is currently being performed in a legacy application. There are some obvious considerations here. For example, when modernizing a legacy app, you might ask:

  • How much of the business needs are met purely by out-of-the-box functionality?
  • How much effort will be necessary to meet the rest of your requirements in the form of configuration and customization of the product or through changing your business process?

However, legacy data is a critical consideration that should not be overlooked during this process.

Dealing with Data When Retiring a Legacy System

If one of the motivators for moving to a new system is to realize cost savings by shutting down legacy systems, the legacy data likely needs to be migrated. If there is a considerable amount of data and a long legacy history, the effort involved in bringing that data into the new product should be carefully considered. This might seem trivial, with most products at least proclaiming to have a multitude of ways to connect to other systems, but recent experience has shown me that the cost of not properly verifying the level of effort can far exceed the savings provided by a COTS product. These types of questions can help avoid a nightmare scenario:

  1. How will the legacy data be brought into the product?
  2. How much transformation of data is necessary, and can it be done with tools that already exist?
  3. Does the product team have significant experience moving similar legacy systems (complexity, amount of data, industry space, etc.) to the product?

The first and second conversations usually start by sitting down and talking to the product team and having them outline the common processes that take place to bring legacy data into the product. During these conversations, be sure both sides have technical folks in the room. This will ensure the conversations have the necessary depth to uncover how the process works.

Once your team has a good understanding of the migration methods recommended by the product team, start exploring some actual use cases related to your data. After exploring some of the common scenarios to get a handle on the process, jump quickly into some of the toughest use cases. Sure, the product might be able to bring in 80% of the data with a simple extract, transform, load (ETL) process, but what about the other 20%? Often legacy systems were created long ago, and functionality has sometimes been squeezed into them in ways that make the data messy. Also, consider how the business processes have changed over time and how the migration will address legacy data created as rules changed over the history of the system. Don’t be afraid to look at that messy data and ask the tough questions about it. This is key.


Your Stakeholders During Legacy Data Migration

To reduce confirmation bias (the tendency to search for, interpret, and/or focus on information that confirms this is your silver bullet), be sure those involved are not too invested in the product; this way they will ask the hard questions. Bring in a good mix of technical and business thought leaders. For instance, challenge the DBAs and SMEs to work together to come up with things that simply will not work. Be sure to set up an environment in which people are rewarded for surfacing blockers and not demotivated by being seen as difficult. Remember: every blocker you come up with in this early phase could be the key to making better decisions, and it likely has a major downstream impact.

The product of these sessions should be a list of tough use cases. Now it’s time to bring in the product team again. Throw the use cases up on a whiteboard and see how the product team works through them. On your side of the table, be sure to include the people who will be responsible for the effort of bringing the new system to life. With skin in the game, these people are much less likely to allow problems to be glossed over, and they will drive a truly realistic conversation. Because this involves the tough questions, the exercise will likely take multiple sessions. Resist any pressure to get this done quickly, keeping in mind that a poor decision now can have ripple effects that last for years.

After some of these sessions, both sides should have a good understanding of the legacy system, the target product, and some of the complexities of meshing the two. With that in place, ask the product team for examples of similar legacy systems they have tackled. This question should be asked up front, but it really cannot be answered intelligently until at least some of the edge cases of the legacy system are well understood. While the product team might not be able to share details due to non-disclosure agreements, they should be able to speak in specific enough terms to demonstrate experience. Including those involved in the edge case discovery sessions is a must. With the familiarity gained during those sessions, those are the people that are going to be most likely to make a good call on whether the experience being presented by the product team relates closely enough to your needs.

Using Lessons Learned to Inform Future Data Migration Projects

Have I seen this process work? Honestly, the answer is no. My hope is that the information I have presented, based on lessons learned from projects past, helps others avoid some of the pitfalls I have faced and the ripple effects that could lead to huge cost overruns. It’s easy to lose sight of the importance of asking hard questions up front and continuing to do so in light of the pressure to get started. If you feel you are giving in to such pressure, just reach out to future you and ask them for some input.

And a final thought. As everyone knows, for complex business processes, COTS products don’t always perfectly align to your legacy system. There are two ways to find alignment. Customizing the system is the obvious route, although it often makes more sense to change business processes to align with a product you are investing in rather than trying to force a product to do what it was not designed for. If you find that this is not possible, perhaps you should be looking at another product. Keep in mind that if a product cannot be found that fits your business processes closely enough, it might make more financial sense to consider creating your own system from the ground up.

Lift n Shift Approach to Cloud Transformation

What does it mean to Rehost an application?

Rehosting is an approach to migrating business applications hosted in on-premises data center environments to the cloud by moving the application “as-is,” with little to no changes to the business functions performed by the application. It’s a faster, less resource-intensive migration approach that gets your apps into the cloud without much code modification. It is often a good first step to cloud transformation.

Organizations with applications that were initially developed for an on-premises environment commonly look to rehosting to take advantage of cloud computing benefits. These benefits may include increased availability and networking speeds, reduced technical debt, and a pay-per-usage cost structure. When defining your cloud migration strategy, it’s essential to analyze all migration approaches, such as re-platforming, refactoring, replacing, and retiring.


Read More…

In this blog post, I discuss an app modernization approach that we call “modernize-by-shifting.” In essence, we take an existing application and move it to “managed” container hosting environments like Azure Kubernetes Service or Azure Service Fabric Mesh. The primary goal of this app modernization strategy is to make the smallest possible change to the existing application codebase. This approach to modernization is markedly different from a “lift-and-shift” approach, where workloads are migrated to cloud IaaS unchanged with little to no use of cloud-native capabilities.

Step One of App Modernization by Shifting

As the first step of this approach, an existing application is broken into a set of container images that include everything needed to run a portion of the application: code, runtime, system tools, system libraries, and settings. Approaches to breaking up the application into smaller parts can vary based on the original architecture. For example, if we begin with a multi-tier application, each tier (e.g. presentation, application, business, data access) could map to a container image. While this approach will admittedly lead to coarser-grained images, compared to a puritanical microservices-based approach of light-weight images, it should be seen as the first step in modernizing the application.

Read More…

Inflexible customer solutions and business unit silos are the bane of any organization’s existence. So how does a large, multi-billion dollar insurance organization, with numerous lines of business, create a customer-centric business model while implementing configurable, agile systems for faster business transactions?

The solution is not so simple, but we helped point our large insurance client in the right direction. What began as a plan to develop a 360-degree customer profile and connect the disparate information silos between business units ultimately became the first step toward a more customer-centric organization.

A major multi-year initiative to modernize the organization’s mainframe systems onto the Microsoft technology platform will now provide significant cost savings over current systems and enable years of future business innovation. Read More…

Welcome to the first article in a series on moving enterprise systems from a mainframe-based platform to something else. That “something else” could be any number of things, but our default assumption (unless I say otherwise) is going to be a transaction processing system based on a platform like Microsoft’s .NET Framework.  While I expect that there will eventually be some technical content, the initial focus is going to be on planning, methodology, defining solutions and project management.  Moving a large system off of a legacy mainframe platform is very different from a development project that starts with a blank slate. Chances are that you’re going to be shifting paradigms for quite a few people in both your technical and business organizations — and for yourself, as well. Read More…