This blog will share ten thought-provoking moments from the Gartner IT Symposium/Xpo™ 2022 conference. The event agenda was rich with overlapping sessions; my observations are based on sessions I could attend, and I look forward to watching the recordings of sessions I could not view live.

One: The Metaverse Equation

The metaverse already has many instantiations. Microsoft alone has a consumer, a commercial, and an industrial metaverse. An example of the industrial metaverse is its collaboration with Coca-Cola Hellenic to build a “digital twin” fabric of sensors on the factory floor and then let employees experience the twin itself. AltspaceVR, a social VR platform now owned by Microsoft, offers a space for various communities while corporations explore use cases such as employee training and onboarding. Near-term and actual examples of the metaverse in action are everywhere: for Saudi Arabia, it is the Neom city; for Nike, it is Nikeland; and so on.

It does not matter whether you use AR, VR, MR, 3D simulation, 2D rendering, or any device you choose. The metaverse is about presence and experience. Your cartoony avatar in the metaverse is not about photorealism; it is about the ability to interact with others in a shared space from anywhere in the world. Most importantly, it is about the ability to control your experience.

In the session ‘Building a Digital Future: The Metaverse,’ a Gartner analyst presented the metaverse equation using three areas as pillars: transport, transform, and transact.

  • The metaverse is a shared space you need to “transport into” via a headset, spatial computing glasses, 3D glasses, a game rig, a mobile device, or a PC.
  • Once transported to the metaverse of your choice, your surroundings are transformed.
  • Finally, once you are in the metaverse, you can transact using crypto and Web3 (for example, you can buy NFT sneakers in Nikeland).

Ultimately, I think the success of the metaverse will depend on interoperability and establishing a common identity. It will be a while before we see such interoperability.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Building a Digital Future: The Metaverse
Speaker: Marty Resnik

AIS IN THE METAVERSE
AIS hosted a hiring event and decided to go remote and test a virtual reality-based format using AltspaceVR, a leading platform for live virtual events. Learn more about our experience in our blog.

Two: Talent Force Multipliers

In the opening keynote, Gartner analyst Tina Nunno spoke about IT for sustainable growth and described three force multipliers for revolutionary work. Below are my observations on each.

Take the friction out of work.
Whether dealing with cumbersome job applicant tracking systems or everyday enterprise applications, friction is like sand in your bike’s gears, making every hill feel steeper. Workers satisfied with their day-to-day applications are twice as likely to stay.

Invest aggressively in AI augmentation.
Think of AI as a way to augment our employees’ reach, range, and capabilities by helping them with everyday decision-making. Unity Health uses an AI-driven tool called Chart Watch, which looks at about 100 different variables in a patient’s chart, including lab results and vital signs, and determines whether the patient is at low, moderate, or high risk of needing ICU care. AI is not replacing emergency room doctors; it merely assists them with informed decision-making.
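
To make this augmentation pattern concrete, below is a minimal, hypothetical Python sketch of the kind of risk tiering a tool like Chart Watch performs. The variable names, weights, and thresholds are invented for illustration; they are not Unity Health’s actual model.

```python
# Hypothetical sketch of the Chart Watch pattern: score a handful of chart
# variables and bucket the result into low/moderate/high risk. The variables,
# weights, and thresholds below are illustrative only.

def risk_tier(chart: dict) -> str:
    """Return 'low', 'moderate', or 'high' for a dict of vitals/labs."""
    # Toy weighted score over a few of the ~100 variables a real model uses.
    score = (
        0.04 * max(chart["heart_rate"] - 90, 0)     # tachycardia
        + 0.10 * max(92 - chart["spo2"], 0)         # low oxygen saturation
        + 0.50 * max(chart["lactate"] - 2.0, 0)     # elevated lactate
        + 0.30 * max(chart["creatinine"] - 1.2, 0)  # kidney stress
    )
    if score >= 2.0:
        return "high"
    if score >= 0.8:
        return "moderate"
    return "low"

print(risk_tier({"heart_rate": 118, "spo2": 89, "lactate": 3.1, "creatinine": 1.6}))
# -> 'high'; the doctor still decides, the model just surfaces the signal
```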

Experiment with the highly visible, highly hyped.
Invest in emerging, highly hyped technologies, and do so publicly. We all know that a higher failure rate is associated with emerging, highly hyped projects. But organizations that are seen as innovative are more attractive to potential employees and tend to be ahead of the pack during a downturn.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Talent Force Multipliers
Speaker: Tina Nunno

Three: Continuous Modernization Value Stream to Deliver App Migration

App migration needs to be a continuous process of modernization. In the session “Use the Continuous Modernization Value Stream to Deliver Application and Cloud Migration Priorities” at the Gartner IT Symposium/Xpo, analyst Howard Dodd shared this quote: “You must continuously modernize the most valuable and vulnerable parts of your portfolio.”

This session taught me that the main idea is to think in Assess, Transform, and Evolve loops. Don’t try to do it all at once. Instead, take one opportunity, walk it to the end, learn from it, and apply the learnings to the next iteration of the loop. Such a feedback loop lets you go faster and deliver a steady stream of value. The most significant benefit of the incremental approach is that it allows you to change course. If, for some reason, the container orchestration platform you placed at the center of your app migration strategy does not work out, you can replace it.

Finding value in app modernization means aligning it with your long-term strategy; it is about the outcomes. For example, rather than talking about percentages of applications migrated, talk about outcomes. Consider this example: “As a healthcare company, we want to increase engagement with our members by helping them make cost-conscious decisions about their plans. We will do that by modernizing our web applications to offer the lowest prescription….” Notice that the app modernization is tied to an outcome with a defined measure of success. Migration is not complete once the app is deployed to the cloud; it needs to be continuously monitored and improved.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Use the Continuous Modernization Value Stream to Deliver Application and Cloud Migration Priorities
Speaker: Howard Dodd

Four: BeeKeeperAI Demo

BeeKeeperAI

Generalizing ML algorithms is complicated (think cost and time), mainly because using synthetic or de-identified training data can create a significant amount of overhead. Only 16 algorithms have achieved the “DEN” designation from the FDA.

BeeKeeperAI is attempting to solve this problem by giving algorithm developers access to real data without the data ever leaving an organization’s premises. Algorithm developers deploy their algorithm to a confidential computing-based secure enclave (created by the data owner). These secure enclaves eliminate the risk of data exfiltration and of insiders or third parties interrogating the algorithm’s IP.
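
The contract is easier to see in code. Below is an illustrative Python sketch of the enclave pattern (not BeeKeeperAI’s actual API): the algorithm travels to the data, and only aggregate metrics ever leave the data owner’s boundary.

```python
# Illustrative sketch of the secure-enclave contract: raw records never leave
# this function's scope; the caller receives only aggregate metrics.

from typing import Callable, Iterable

def run_in_enclave(algorithm: Callable[[dict], int],
                   records: Iterable[dict]) -> dict:
    total = correct = 0
    for record in records:  # raw data stays inside the data owner's boundary
        total += 1
        correct += int(algorithm(record["features"]) == record["label"])
    return {"n": total, "accuracy": correct / total}  # aggregates only

# The developer ships a model function; the data owner runs it, shares metrics.
model = lambda features: int(features["score"] > 0.5)
print(run_in_enclave(model, [{"features": {"score": 0.9}, "label": 1},
                             {"features": {"score": 0.2}, "label": 0}]))
```

In a real confidential computing enclave, hardware-based attestation additionally proves to both parties which code actually ran.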

For more information, check out the BeeKeeperAI website.
Slide: This is not a Gartner presentation, and no deck was provided. The picture above is from the BeeKeeperAI website.

Five: Future of AI

AI is becoming a companion technology. It will be pervasive across all jobs, not just as a tool but as a teammate. AI won’t replace your plumber but can check the plumber’s work.
AI will also require less data and computation by combining purely data-driven AI with logic-driven AI. It would take a neural network about 100,000 games to learn Tic Tac Toe; we can significantly reduce the learning time by telling the algorithm to start in the middle.
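
Here is a minimal Python sketch of this data-plus-logic idea, with Tic Tac Toe cells numbered 0–8: a single logic rule (take the center first) handles an opening that a purely data-driven learner would otherwise have to discover from many thousands of self-play games. The random fallback stands in for whatever policy is learned from data.

```python
# Combining logic-driven and data-driven AI for Tic Tac Toe: a hand-written
# rule covers the opening, so the learner has less to discover from data.

import random

CENTER = 4  # middle cell of a 3x3 board flattened into a 9-element list

def choose_move(board: list, learned_policy) -> int:
    empty = [i for i, cell in enumerate(board) if cell is None]
    if CENTER in empty:                  # logic-driven prior: start in the middle
        return CENTER
    return learned_policy(board, empty)  # data-driven policy handles the rest

# Stand-in for a policy learned from self-play (random over legal moves).
random_policy = lambda board, empty: random.choice(empty)

print(choose_move([None] * 9, random_policy))  # -> 4, with zero training games
```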

In the session “The Future of AI,” analyst Whit Andrews mentioned, “AI Is Accessible to More People With Less Skills and More Knowledge.” I feel that the advent of open-source algorithms and composable AI patterns will make AI more accessible to more people with fewer skills but more business knowledge to drive business outcomes.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: The Future of AI
Speaker: Whit Andrews

Six: Data Ecosystems

Cohesive cloud-based data ecosystems are on the rise and are expected to be the dominant choice in the coming years. These ecosystems include CSP-native tools and a collection of third-party ISV tools as necessary. In the session “Why CIOs Must Care About Data and Analytics Ecosystems – Adaptability, Speed and Lower Cost,” a Gartner analyst shared the common “data use cases in the data ecosystem: applications, data science, AI/ML, IoT, analytics, event processing, marketplaces, and edge computing.” Data ecosystems, especially those built primarily on CSP-native tools and services, can pose a lock-in challenge that could lead to higher prices; however, competition within the cloud data ecosystem is broad enough that costs are expected to come down. Meanwhile, the cost of a DIY data ecosystem is high because of the integration work it requires.

For example, consider Microsoft’s Intelligent Data Platform, which deeply integrates its databases, analytics, BI, and data governance products into a data ecosystem. At the recently concluded Ignite, Microsoft added a partner ecosystem for the Intelligent Data Platform.

Microsoft Intelligent Data Platform

Source: Docs.Microsoft.com
For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Why CIOs Must Care About Data and Analytics Ecosystems – Adaptability, Speed, and Lower Cost
Speaker: Donald Feinberg

Seven: Future of Cloud 2027

Cloud is transitioning from a technology core (provided by hyperscalers today) to capability enhancements that add value to core services. Cross-cloud data platforms like Snowflake, and containers as a common infrastructure and operations layer, are examples of capability enhancement.
It is expected that most customer requirements will be satisfied by CSP-native offerings rather than container-focused ones. I wonder whether CSP first-party services like AKS are considered CSP-native or CNCF-native.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Future of Cloud 2027: From Technology to Business Innovation
Speaker: David Smith

Eight: The New Economics of Technology

We must constantly reevaluate our assumptions in the face of technological disruption. In the session “The New Economics of Technology,” analyst Daryl Plummer stated, “end-user organizations must manage the risk associated with failure to anticipate new economics brought about by technology disruption.” In other words, technology disruptions change technology economics, so we must find new value stories to create new growth opportunities for our companies and our customers’ organizations.

Below are the four primary phase shifts in technology shared by analyst Plummer and my understanding of each:

  • Control to Democratization. To prepare for this shift, IT leaders must invest in a governance model that allows citizen developers to participate in content creation.
  • Heuristic to Intelligence. AI is being embedded into applications, allowing them to analyze and reason at a faster clip. To prepare for this shift, IT leaders must invest in a robust MLOps governance model.
  • Data Center Isolation to Cloud Concentration. This shift is already well underway. According to Gartner, “nearly 60% of IT spending on application software will be directed toward cloud technologies by 2024.”
  • Centralized Authority to Decentralization. Web3 and smart contracts are driving decentralized governance.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: The New Economics of Technology
Speaker: Daryl Plummer

Nine: Democratized Digital Delivery with Fusion Teams

Digital democratization is defined as making the creation and management of information technology accessible to everyone. In the session “Democratized Digital Delivery: Fusion Teams and Product Management Explained,” analyst Jaime Capellá shared statistics suggesting that as CEOs push toward digitization, they increasingly look for technology work to be done directly within the business function and less within IT. A dominant trend, “fusion teams,” is emerging to support this objective. Fusion teams are “multidisciplinary teams that blend technology or analytics and business domain expertise and share accountability for business and technology outcomes.”

The analyst referred to “Fusion Teams” as a new IT and Business interface.

For fusion teams to be successful, IT leaders need to build a consistent foundation of platform products and services such as cloud infrastructure, security, low-code solutions, and data. Additionally, IT leaders must embed cross-cutting experts, including architecture, security, and compliance specialists, across the fusion teams.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Democratized Digital Delivery: Fusion Teams and Product Management Explained
Speaker: Jaime Capellá

Ten: Major Trends in Robotic Process Automation

In the session “Magic Quadrant for Robotic Process Automation,” analyst Saikat Ray shared the following major trends in RPA:

  1. APIs complement screen scraping capabilities (see the sketch after this list).
  2. RPA vendors are constantly evolving, leading to a fluid vendor landscape.
  3. Customers are moving from RPA to hyperautomation. In my opinion, this means moving beyond task-based RPA to a platform for automation that includes resilience, orchestration, and the infusion of AI. Another critical aspect of hyperautomation is access to low-code and no-code technologies.
  4. Finally, vendors are coming out with innovative RPA pricing models, including consumption-based pricing.
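
On trend 1: where a system exposes an API, a bot can call it directly instead of replaying clicks against the UI. Here is a hedged Python sketch, with a hypothetical ERP endpoint and fields:

```python
# Sketch of an API call replacing screen scraping. The endpoint, token, and
# JSON fields are hypothetical; any real system will differ.

import requests

def get_invoice_status(invoice_id: str) -> str:
    # One reliable HTTP call replaces a brittle recorded sequence of clicks
    # (open the app, search for the invoice, read the status off the screen).
    response = requests.get(
        f"https://erp.example.com/api/invoices/{invoice_id}",
        headers={"Authorization": "Bearer <token>"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["status"]
```

Screen scraping remains the fallback for systems that offer no API, which is why the two capabilities complement rather than replace each other.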

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Magic Quadrant™ for Robotic Process Automation
Speaker: Saikat Ray

Conclusion

Did you attend the Gartner IT Symposium/Xpo™ conference? What were your takeaways? Reach out to Vishwas Lele on LinkedIn to share your thoughts.

GARTNER and IT SYMPOSIUM/XPO are registered trademarks and service marks of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved.

The AIS RPA Crash Course

I recently worked with a group including Vishwas Lele and Jesse FitzGibbon to design and execute an accelerated learning series on Robotic Process Automation (RPA) with Microsoft Power Automate, fondly dubbed the “AIS RPA Crash Course.” We set out to engage a team of roughly 20 AIS employees with diverse skills and experience levels, with the goal of giving them the practical experience to contribute to enterprise-level automation projects with AIS customers. It proved to be a rewarding experience for all of us, so we thought it would be helpful to share some details of our approach and what we learned along the way.

Week 1

The first week was all about establishing a common understanding of RPA and its importance to our customers and employees. RPA continues to be popular as the importance of hyperautomation is felt by IT and business users alike. By infusing RPA with AI/ML (Cognitive Services) and a low/no-code platform, Power Automate is democratizing access to RPA (please refer to the latest Gartner Magic Quadrant for RPA).

We began with Microsoft Power Automate and its role in hyperautomation, including the synergy between cloud and desktop flows. After our synchronous sessions were completed, we spent another day working in a hybrid model, loosely following Microsoft’s “RPA in a Day” structure, which combined guided instruction with independent, hands-on lab work. This really helped to solidify the concepts we’d spent the previous day discussing and put our team in a good position to start thinking through how these technologies could be applied to solve real-world problems.

We wrapped up the first week with a “Proof of Learning” exercise where we asked each of our participants to identify a potential real-world application for RPA and to build a proof of concept to demonstrate hyperautomation in action. This exercise really forced everyone to think through how to get the building blocks in place. It was awesome to see the excitement, interest, and confidence that everyone had acquired as they presented the results of their work.

Week 2

During the second week, we provided the participants with a new business problem to solve. We asked the team to use RPA to react when a file containing GPS coordinate data is uploaded to Azure cloud storage, then upload that file to a website (which we provided) that summarizes the data in terms of total distance, elevation, average speed, etc. Once the website had worked its magic, the process should capture the results and email them to the submitter of the original file.

The process was somewhat basic, but we asked everyone to think beyond the fundamentals of “how do we automate this” to “how do we implement this in a robust and repeatable way.” We also required that the automation be implemented using Power Automate Desktop and that it assume REST APIs were unavailable.
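
For a sense of the data handling involved, here is a minimal Python sketch of the kind of summary the website computed, assuming the .gpx file has been reduced to a list of (latitude, longitude, elevation-in-meters) points. The haversine formula gives the great-circle distance between two coordinates.

```python
# Summarize a GPS track: total distance via the haversine formula plus total
# climb. The sample points below are invented for illustration.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

def summarize(points):
    distance = sum(haversine_km(a[0], a[1], b[0], b[1])
                   for a, b in zip(points, points[1:]))
    climb = sum(max(b[2] - a[2], 0) for a, b in zip(points, points[1:]))
    return {"total_km": round(distance, 2), "elevation_gain_m": round(climb, 1)}

print(summarize([(38.80, -77.05, 10.0), (38.81, -77.04, 25.0), (38.82, -77.04, 18.0)]))
```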

We divided the team into pods. Each pod was assigned a project manager and was composed of people working across US and India time zones. The pods were tasked with developing their own feature backlogs and independently attacking solution design and implementation. Vishwas, Jesse, and I closely monitored each team’s progress during the week and provided intervention and support as needed, most often in the form of brainstorming and assistance with specific technical obstacles.

The group presentations at the end of the week were remarkable. Each team had fully automated the process and had begun implementing more advanced concepts. What I found most interesting was how each team’s diverse background was represented in its solution, with an incredible emphasis on user experience. One team even built out a canvas app as a front end to upload the .gpx files.

Week 3

Our third and final week focused on implementing best practices for application lifecycle management, securing secrets with Azure Key Vault, and ensuring the security of the data from start to finish – all of which we believe are critical to making RPA solutions ready for the enterprise. The teams continued to iterate on their solutions from the previous week and incorporated new components intended to make the applications more robust and manageable. The built-in error handling leveraged the concepts of chaos testing to ensure that failures were handled gracefully and that logging was in place to allow for future troubleshooting. Other enhancements addressed how the solutions would automatically and efficiently move from test to production tiers using Git and GitHub Power Platform Actions.
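
As one concrete example of the secrets practice, here is a minimal sketch of reading a credential from Azure Key Vault with the Azure SDK for Python (azure-identity and azure-keyvault-secrets); the vault URL and secret name are placeholders. The same principle applied to the teams’ flows: credentials live in Key Vault, never in the flow definition.

```python
# Fetch a secret from Azure Key Vault instead of hard-coding it.
# Requires: pip install azure-identity azure-keyvault-secrets

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()  # local dev login or managed identity
client = SecretClient(vault_url="https://<your-vault>.vault.azure.net",
                      credential=credential)

secret = client.get_secret("erp-service-account-password")  # placeholder name
# Authenticate with secret.value; never log it or check it into source control.
```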

Results and Takeaways

We are pretty proud of the results. Our participants started with very little, if any, knowledge of or experience with RPA and the Power Platform. At the end of three weeks, we had a team of people with a fresh set of skills they were excited to use – and not just technical skills. Throughout this exercise, our teammates embraced our continuous-learning core value by striving to improve themselves to better serve our clients in a rapidly changing technological landscape.

Many of our participants have moved on to new projects since completing the course. Better yet, several of them have joined forces with other AIS engineers working to solve critical client needs using RPA!

This experience was equally enriching and energizing for those of us charged with structuring and overseeing the course. We look forward to leveraging our lessons learned to improve on the next iteration and thank all our participants for their hard work and willingness to roll with the punches as we crashed our way through the course together.

Can The Power Automate Process Advisor Be Used as a Time Study Tool?

Microsoft has given us many versatile tools over the years. The Process Advisor, now available in commercial, GCC, GCC High, and DoD environments, is simple to understand and use but has many potential applications.

What is the Process Advisor?

The Process Advisor is a tool included within Power Automate. It records mouse clicks and keyboard keystrokes during a process. After you record a repetitive task, the recordings are uploaded and analyzed by the Process Advisor, which makes recommendations about how Power Automate can streamline the workflow.

Why is using the Process Advisor in Power Automate Important?

Automating processes saves time, and Process Advisor shows you the way. After recording a process, it breaks down the steps and recommends, on a per-step basis, what could be automated with cloud flows, desktop flows, RPA, and even AI models.

What is a Time Study?

A time study is precisely what it sounds like: studying the time it takes to complete a task. The goal of a time study is to document the steps necessary to complete a task and how long those steps take, on average. This information is then used to make the process more efficient and reduce how long the task takes. There are many ways to conduct a time study, ranging from a stopwatch and paper to expensive time study software.

Once the information has been gathered, you must consolidate, clean, and analyze it to identify potential improvements. The good news is that the Process Advisor uses AI to do all of this for you.
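
For comparison, here is a small Python sketch of the manual consolidation a spreadsheet-and-stopwatch time study requires, which Process Advisor performs for you. The step names and timings are invented.

```python
# Manual time-study consolidation: average each step's duration across
# recordings and rank steps by where the time goes.

from collections import defaultdict

recordings = [  # seconds per step, one dict per observed run (invented data)
    {"open email": 12.0, "copy invoice number": 8.5, "paste into ERP": 21.0},
    {"open email": 10.5, "copy invoice number": 9.0, "paste into ERP": 25.5},
]

step_times = defaultdict(list)
for run in recordings:
    for step, seconds in run.items():
        step_times[step].append(seconds)

# Slowest steps first: these are the best automation candidates.
for step, times in sorted(step_times.items(),
                          key=lambda kv: -sum(kv[1]) / len(kv[1])):
    print(f"{step}: avg {sum(times) / len(times):.1f}s over {len(times)} runs")
```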

Challenges and Considerations

One of the major considerations when using the Process Advisor is scope. In other words, which steps are going to be included when you record the task? For example, if you have a process for receiving requests from customers through your website, email, and physical mailings, you may wish to scope your recordings to include all three or to prioritize one.

Benefits of using a Time Study with Process Advisor

There are two tangible benefits of using the Process Advisor to conduct a time study. The first is the time saved when you receive your results in near real time, versus all the work of manually consolidating the data and turning it into actionable information. The second is the immediate recommendations that Process Advisor gives you about how to leverage individual pieces of the Power Automate offering to automate the process.

How Do I Get Started?

Install Power Automate Desktop here: https://docs.microsoft.com/en-us/power-automate/desktop-flows/install and follow the setup instructions.

After installation, open Power Automate Desktop. Create a new flow, name it descriptively, and open the Desktop Recorder or Web Recorder, depending on where the process resides.

Power Automate Desktop

Click Record, complete the task, then click Finish. Do this as many times as you would like (a minimum of two is required for the next step); having multiple people perform the same actions is recommended so the tool has more information to learn from.

After recording several examples of the same task being completed, have everyone upload their individual recordings to a shared area inside the Process Advisor, which can be found in the Power Automate section of Microsoft Office, here:

Shared Area inside Process Advisor

Here you can click Create to add a new process and share it with your team. This gives everyone a single place to upload their recordings for analysis and keeps things organized.

After you have enough recordings uploaded (the minimum requirement is two), select Processes to begin the analysis. Here you can see everything that has been submitted for analysis, who submitted it, and its status. Select Analyze at the top, and all recordings will be analyzed at once; this may take a few minutes.

Analyze recordings

Once it successfully analyzes your recorded processes, select the Analytics button.

Analyze and record

Remember all those steps that follow data gathering? The Process Advisor has already completed them. A flow chart has been created showing the variance in the process. Times have been assigned to steps to show how long actions take, and longer-running actions are color-coded with a red highlight to draw attention. Filters can also be applied down the right side of the screen to dive into your data and focus on the major culprits that increase how long a task takes.

Apply Filters to Dive into Data

This works wonders for saving you the time of cleaning and organizing data after the initial gathering phase of your time-and-motion study. But Process Advisor doesn’t stop there. At the top of the screen, there’s another helpful button.

Preview Automated activities

This will launch another window with context from your process recordings. It will then take you to a new instance of Power Automate, where it recommends the parts of these recordings that Power Automate can handle for you. In short, the Power Automate Process Advisor drastically reduces the time required to collect the data, submit it, analyze it, and turn it into recommendations for process improvements.

Outcomes

The results you receive are nearly instant and take the guesswork out of how to automate your current process. Using Power Automate to put computing power to work on repetitive tasks will save you time and money. Process Advisor will get you there faster, and using it as a time study tool has the added benefit of prioritizing the biggest wins in terms of time savings.

RPA-in-a-Day

Microsoft has built a training that demonstrates exactly how to use the Process Advisor, coupled with AI, to its fullest potential. In this training, an invoice process is analyzed, a bottleneck is identified where invoices arrive via email, and the tool then recommends an RPA process to automate that task.

Contact us to schedule your own RPA-in-a-day!

Partnering with Microsoft on the Enterprise RPA Whitepaper

Microsoft’s RPA (Robotic Process Automation) solution will transform organizations, helping teams realize cost savings and increased productivity. And the beauty of Microsoft’s RPA capabilities is that they build atop an organization’s existing Azure infrastructure, tapping into native Microsoft cloud services – like Azure Active Directory, on-demand compute, native network security tools, and role-based access control (RBAC) – rather than rebuilding these capabilities from scratch on another solution.

While automation has been around for decades, the Microsoft cloud platform provides seamless, integrated process automation services. Microsoft’s automation capabilities can extend across all departments of a large enterprise, optimizing processes and significantly trimming costs.

Many years of Azure experience and award-winning Power Platform capabilities provided the AIS team the opportunity to collaborate with Microsoft on a whitepaper for enterprise deployment of RPA. We’re grateful for the opportunity to help create opinionated guidance and resources for organizations looking to enable and govern the rollout of Power Automate for RPA and other hyperautomation scenarios. Many talented SMEs across Microsoft and AIS partnered to deliver valuable guidance for you to accelerate your path to automation. In this blog, our team shares six key learnings and takeaways from the project.

What We Learned Helping Write the RPA Whitepaper

It’s hard to beat the opportunity to work with a product team. We’ve been a dedicated Microsoft partner for nearly as long as we’ve been a company. Throughout our partnership, we’ve been lucky enough to work closely with product engineering across various teams. It’s always enlightening and inspiring and gives both of our teams an opportunity to apply product vision to customer priorities.

Opinionated first, genericized later. New technology is released at a rapid clip. We know there’s a lot of noise. You need to get things done efficiently and the last thing you need is a case of analysis paralysis. Opinionated guidance will help you do that, giving you a leg up and a head start on the best approach. We’ve enabled cloud capabilities at some of the largest and most complex enterprises for over 13 years. We’ve infused those learnings and hardened processes into opinionated guidance by identifying what’s most effective. From identity and network operations to migration and modernization approaches, we arm delivery teams with constantly improving best practices and resources. Many organizations approaching RPA won’t be starting from ground zero in the cloud; the Microsoft RPA solution, with some help from other cloud resources (like our CDF), will allow you to leverage and build on your cloud investments.

Successful enterprise programs require insights from real-world, practical experience. The result of this whitepaper is an example of the powerful impact of talented product teams and veteran systems integrators coming together. We were able to bring learnings from building out enterprise-level Azure environments. Together with Microsoft, we shaped the vision for RPA technology into practical, hands-on resources for building and supporting hyperautomation. Insight from previous experience, enterprise technologies outside the Microsoft suite, and an understanding of customers’ business outcomes allowed us to inform the product and develop the resources to enable it. The combination of product team vision and insight and hands-on, experienced practitioners is a winning formula for developing valuable customer guidance.

Checklists are always helpful. Checklists drove the format of the whitepaper to help us make the content as actionable as possible. The goal of this whitepaper was to set forth a set of optimal milestones and share the thinking to help teams make progress faster. This approach was driven by Apostolis Papaioannou, Principal Program Manager at Microsoft. His vision was to create content that was consumable for such a vast topic, building on the foundation of the Holistic Enterprise Automation Techniques (HEAT). What you will find with this whitepaper is a thorough overview and actionable steps to get a workable environment up and going quickly. There’s a wealth of additional material and documentation available today with more coming soon.

Cross-cutting teams are the means to succeed. We brought in AIS Power Platform, Azure, M365, and other architects with varying skillsets to support the whitepaper creation, in addition to the Microsoft contributors, and the range of experience and perspectives made a big impact on the outcomes and guidance. This experience was yet another example of the success and trend towards cross-functional and cross-cloud collaboration, a concept and model the AIS team has adopted with much success.

Write a lot. Trim a lot more. Repeat. This was the flow of operations. We would quickly get up to 100 pages, then trim back a lot. This helped us get all the ideas and perspectives out there, before evaluating to focus on the right message and resources. In some cases, guidance can boil the ocean. Our goal was to vet the options and provide an opinionated best path forward to support organizations in focusing on the right things.

Get Started: Administering a Low-Code Intelligent Automation Platform for Your Organization

Are you considering RPA solutions for your organization? Check out the whitepaper, Enterprise Deployment for RPA and more in Power Automate, as well as other resources below. Please share a link with your partners and colleagues and let us know what you think. If you have any questions, please reach out to the AIS team.

Access the whitepaper: https://aka.ms/autocoeadminwhitepaper

Helpful Microsoft RPA Resources and Guidance Links

• Automation CoE Blueprint https://aka.ms/autocoeblueprint
• Automation CoE Strategy https://aka.ms/autocoestrategy
• HEAT https://aka.ms/rpapnp
• HEAT video series https://aka.ms/rpapnpvideo
• Whitepaper overview blog: https://powerautomate.microsoft.com/en-us/blog/administer-and-govern-a-low-code-intelligent-automation-platform-whitepaper-enterprise-deployment-for-rpa-and-more-in-power-automate/
• Automation Admin & Governance Whitepaper https://aka.ms/autocoeadminwhitepaper
• Manage Power Automate for Desktop on Windows https://aka.ms/padonwindowspnp
• Hyperautomation SAP Playbook (https://aka.ms/MicrosoftRPAPlaybookForSAPGUI) & video series (https://aka.ms/AutomateItSAPSeries)
• Automate It video series: https://aka.ms/AutomateIt
• RPA in a Day training: https://aka.ms/RPAinaDayPackage

Acknowledgments

Thank you to Apostolis Papaioannou, Kent Weare, Pranav Rastogi, Anitha Natarajan, Jonathan Eckman, Lav Gupta, Brent Wodicka, Vishwas Lele, Gautier Chastan, Kathy Osborne, Rakesh Krishnan, Amit Bhambri, Ashvini Sharma, and Jonathan Kendall for the partnership on this whitepaper.