This blog will share ten thought-provoking moments from the Gartner IT Symposium/Xpo™ 2022 conference. The event agenda was rich with overlapping sessions; my observations are based on sessions I could attend, and I look forward to watching the recordings of sessions I could not view live.

One: The Metaverse Equation

There are many flavors of the metaverse today. Microsoft alone has a consumer, a commercial, and an industrial metaverse. An example of the industrial metaverse is its collaboration with Coca-Cola Hellenic to build a “digital twin” fabric of sensors on the factory floor and then let employees experience the twin itself. AltspaceVR, a social VR platform now owned by Microsoft, offers a space for various communities while corporations explore use cases such as employee training and onboarding. Examples of the metaverse in action, both near-term and already real, are everywhere: for Saudi Arabia, it is the Neom city; for Nike, it is NikeLand; and so on.

It does not matter whether you use AR, VR, MR, 3D simulation, 2D rendering, or any device you choose. The metaverse is about presence and experience. Your cartoony avatar in the metaverse is not about photorealism; it is about the ability to interact with others in a shared space from anywhere in the world. Most importantly, it is about the ability to control your experience.

In the session “Building a Digital Future: The Metaverse,” a Gartner analyst presented the metaverse equation using three areas as pillars: transport, transform, and transact.

  • The metaverse is a shared space you need to “transport into” (via a headset, spatial computing glasses, 3D glasses, a game rig, a mobile device, or a PC).
  • Once transported to the metaverse of your choice, your surroundings are transformed.
  • Finally, once you are in the metaverse, you can use crypto and Web3 to transact (for example, you can buy NFT sneakers in NikeLand).

Ultimately, I think the success of the metaverse will depend on interoperability and establishing a common identity. It will be a while before we see such interoperability.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Building a Digital Future: The Metaverse
Speaker: Marty Resnik

AIS hosted a hiring event and decided to go remote and test a Virtual Reality-based format using AltspaceVR, the leading platform for live, virtual events. Learn more about our experience in our blog.

Two: Talent Force Multipliers

In the opening keynote, Gartner analyst Tina Nunno spoke about IT for sustainable growth and mentioned three force multipliers for revolutionary work. Below are my observations on each.

Take the friction out of work.
Whether dealing with cumbersome job applicant tracking systems or everyday enterprise applications, friction is like sand in your bike’s gears, making every hill feel steeper. Workers satisfied with their day-to-day applications are twice as likely to stay.

Invest aggressively in AI augmentation.
Think of AI as a way to augment our employees’ reach, range, and capabilities by helping them with everyday decision-making. Unity Health uses an AI-driven tool called Chart Watch that looks at 100 different variables in a patient’s chart, including lab results and vital signs, and determines whether the patient is at low, moderate, or high risk of needing ICU care. AI is not replacing emergency room doctors; it merely assists them with informed decision-making.

Experiment with the highly visible, highly hyped.
Invest in emerging, highly hyped technologies, and do so publicly. We all know that a higher failure rate is associated with emerging, highly hyped projects. But organizations that are seen as innovative are more attractive to potential employees and tend to be ahead of the pack during a downturn.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Talent Force Multipliers
Speaker: Tina Nunno

Three: Continuous Modernization Value Stream to Deliver App Migration

App migration needs to be a constant process of modernization. In the session “Use the Continuous Modernization Value Stream to Deliver Application and Cloud Migration Priorities” at the Gartner IT Symposium/Xpo, analyst Howard Dodd shared this quote: “You must continuously modernize the most valuable and vulnerable parts of your portfolio.”

This session taught me that the main idea is to think in Assess, Transform, and Evolve loops. Don’t try to do everything at once. Instead, take one opportunity, walk it to the end, learn from it, and apply the learnings to the next iteration of the loop. Such a feedback loop lets you go faster and deliver a steady stream of value. The most significant benefit of the incremental approach is that it allows you to change. If, for some reason, the container orchestration platform you placed at the center of your app migration strategy does not work out, you can change it.

Finding value in app modernization means aligning it with your long-term strategy. It is about the outcomes. Rather than talking about percentages of applications migrated, talk about outcomes. Consider this example: “As a healthcare company, we want to increase engagement with our members by helping them make cost-conscious decisions about their plans. We will do that by modernizing our web applications to offer the lowest prescription….” Notice that the app modernization is tied to an outcome with a defined measure of success. Migration is not complete once the app is deployed to the cloud; it needs to be continuously monitored and improved.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Use the Continuous Modernization Value Stream to Deliver Application and Cloud Migration Priorities
Speaker: Howard Dodd

Four: BeeKeeperAI Demo


Generalizing ML algorithms is complicated (think cost and time), mainly because using synthetic or de-identified training data can create a significant amount of overhead. Only 16 algorithms have achieved the “DEN” designation from the FDA.

BeeKeeperAI is attempting to solve this problem by giving algorithm developers access to real data without the data ever leaving an organization’s premises. Algorithm developers deploy their algorithms to a confidential computing-based secure enclave created by the data owner. These secure enclaves eliminate the risk of data exfiltration and of interrogation of the algorithm IP by insiders and third parties.

For more information, check out the BeeKeeperAI website.
Slide: This is not a Gartner presentation, and no deck was provided. The picture above is from the BeeKeeperAI website.

Five: Future of AI

AI will be a companion technology. It will be pervasive across all jobs, not just as a tool but as a teammate. AI won’t replace your plumber, but it can check the plumber’s work.
AI will require less data and computation by combining purely data-driven AI with logic-driven AI. It would take a neural network about 100,000 games to learn Tic-Tac-Toe; we can significantly reduce learning time by telling the algorithm to start in the middle.
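As an illustration of why a logic prior helps (this is my own minimal sketch, not from the session): constraining Tic-Tac-Toe’s opening move to the center square dramatically shrinks the number of game states a learner could ever encounter, which is the same intuition behind needing far fewer training games.

```python
def wins(board, player):
    """Check all eight winning lines for the given player."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]
    return any(all(board[i] == player for i in line) for line in lines)

def count_states(first_moves):
    """Count distinct board states reachable when X's opening move is
    restricted to first_moves (play stops at a win or a draw)."""
    seen = set()

    def play(board, player):
        key = "".join(board)
        if key in seen:
            return
        seen.add(key)
        if wins(board, "X") or wins(board, "O") or "." not in board:
            return  # terminal state: do not expand further
        moves = first_moves if key == "." * 9 else range(9)
        for m in moves:
            if board[m] == ".":
                nxt = board[:]
                nxt[m] = player
                play(nxt, "O" if player == "X" else "X")

    play(["."] * 9, "X")
    return len(seen)

free = count_states(range(9))  # learner may open anywhere
center = count_states([4])     # logic prior: always open in the center
print(free, center)
```

The constrained count is a small fraction of the unconstrained one; a purely data-driven learner has to discover that structure from examples, while a hybrid approach gets it for free.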

In the session “The Future of AI,” analyst Whit Andrews mentioned, “AI Is Accessible to More People With Less Skills and More Knowledge.” I feel that the advent of open-source algorithms and composable AI patterns will make AI more accessible to more people with fewer skills but more business knowledge to drive business outcomes.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: The Future of AI
Speaker: Whit Andrews

Six: Data Ecosystems

Cohesive cloud-based data ecosystems are on the rise and are expected to be the dominant choice in the coming years. These ecosystems include CSP-native tools and, as necessary, a collection of third-party ISV tools. In the session “Why CIOs Must Care About Data and Analytics Ecosystems – Adaptability, Speed and Lower Cost,” a Gartner analyst shared the common “data use cases in the data ecosystem: applications, data science, AI/ML, IoT, analytics, event processing, marketplaces, and edge computing.” Data ecosystems, especially those built entirely on CSP-native tools and services, can pose a lock-in challenge, leading to higher prices. However, the competition within the cloud data ecosystem is broad enough that costs are expected to go down. Additionally, the cost of a DIY data ecosystem is high because of the integration costs.

For example, consider Microsoft’s Intelligent Data Platform, which deeply integrates its databases, analytics, BI, and data governance products into a data ecosystem. At the recently concluded Ignite, Microsoft added a partner ecosystem for the Intelligent Data Platform.

Microsoft Intelligent Platform

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Why CIOs Must Care About Data and Analytics Ecosystems – Adaptability, Speed, and Lower Cost
Speaker: Donald Feinberg

Seven: Future of Cloud 2027

Cloud is transitioning from the technology core (provided by hyper-scalers today) to capability enhancements that add value to core services. Cross-cloud data platforms like Snowflake, and containers as common infrastructure and operations layers, are examples of capability enhancement.
It is expected that most customer requirements will be satisfied by CSP-native offerings rather than container-focused ones. I wonder whether CSP first-party services like AKS are considered CSP-native or CNCF-native.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Future of Cloud 2027: From Technology to Business Innovation
Speaker: David Smith

Eight: The New Economics of Technology

Strive to constantly evaluate your assumptions in the face of technological disruption. In the session “The New Economics of Technology,” analyst Daryl Plummer stated, “end-user organizations must manage the risk associated with failure to anticipate new economics brought about by technology disruption.” In other words, technology disruptions change technology economics, so we must find new value stories to create new growth opportunities for our companies and customer organizations.

Below are the four primary phase shifts in technology shared by analyst Plummer and my understanding of each:

  • Control to Democratization. To prepare for this shift, IT leaders must invest in a governance model that allows citizen developers to participate in content creation.
  • Heuristic to Intelligence. AI is being embedded into applications, allowing them to analyze and reason at a much faster clip. To prepare for this shift, IT leaders must invest in a robust MLOps governance model.
  • Data Center Isolation to Cloud Concentration. This shift is already well underway. According to Gartner, “nearly 60% of IT spending on application software will be directed toward cloud technologies by 2024”.
  • Centralized Authority to Decentralization. Web3 and smart contracts are driving decentralized governance.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: The New Economics of Technology
Speaker: Daryl Plummer

Nine: Democratized Digital Delivery with Fusion Teams

Digital democratization is defined as making the creation and management of information technology accessible to everyone. In the session “Democratized Digital Delivery: Fusion Teams and Product Management Explained,” analyst Jaime Capellá shared statistics that made me think that as CEOs push toward digitization, they are increasingly looking for technology work to be done directly within the business function and less in IT. A dominant trend, “Fusion Teams,” is emerging to support this objective. Fusion teams are “multidisciplinary teams that blend technology or analytics and business domain expertise and share accountability for business and technology outcomes.”

The analyst referred to “Fusion Teams” as a new IT and Business interface.

For fusion teams to be successful, IT leaders need to build a consistent foundation of platform products and services such as cloud infrastructure, security, low-code solutions, and data. Additionally, IT leaders must embed cross-cutting experts, including architecture, security, and compliance, across the fusion teams.

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Democratized Digital Delivery: Fusion Teams and Product Management Explained
Speaker: Jaime Capellá

Ten: Major Trends in Robotic Process Automation

In the session “Magic Quadrant™ for Robotic Process Automation,” analyst Saikat Ray shared the following major trends in RPA:

  1. APIs complement screen scraping capabilities.
  2. RPA vendors are constantly evolving, leading to a fluid vendor landscape.
  3. Customers are going from RPA to hyperautomation. In my opinion, this means moving beyond task-based RPA to a platform for automation that includes resilience, orchestration, and the infusion of AI. Another critical aspect of hyperautomation is access to low-code and no-code technologies.
  4. Finally, vendors are coming out with innovative RPA pricing models, including consumption-based pricing.
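To make the first trend concrete, here is a minimal sketch of the pattern behind “APIs complement screen scraping” (my own illustration, not from the session; the function names and connectors are hypothetical stand-ins): an automation tries a structured API call first and falls back to scraping screen text only when no API exists.

```python
def fetch_invoice_total(invoice_id, api_get, scrape_screen):
    """Return an invoice total, preferring a structured API call.

    api_get:       callable returning a dict like {"total": "42.50"},
                   or raising when the vendor exposes no API.
    scrape_screen: callable returning raw screen text like "Total: 42.50".
    Both callables are hypothetical stand-ins for real connectors.
    """
    try:
        return float(api_get(invoice_id)["total"])   # robust, structured path
    except Exception:
        text = scrape_screen(invoice_id)             # brittle fallback path
        return float(text.split("Total:")[1].strip())

# Simulated connectors: the API works for one vendor but not the other.
def api_ok(_id):
    return {"total": "42.50"}

def api_missing(_id):
    raise RuntimeError("no API for this vendor")

def scrape(_id):
    return "Total: 17.25"

print(fetch_invoice_total(1, api_ok, scrape))       # → 42.5 (API path)
print(fetch_invoice_total(2, api_missing, scrape))  # → 17.25 (scrape fallback)
```

The scraping branch breaks as soon as the screen layout changes, which is exactly why vendors are adding API connectors alongside it.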

For more information, refer to the Gartner® IT Symposium/Xpo™ 2022 session
Session Name: Magic Quadrant™ for Robotic Process Automation
Speaker: Saikat Ray


Did you attend the Gartner IT Symposium/Xpo™ conference? What were your takeaways? Reach out to Vishwas Lele on LinkedIn to share your thoughts.

GARTNER and IT SYMPOSIUM/XPO are registered trademarks and service marks of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved.

Can The Power Automate Process Advisor Be Used as a Time Study Tool?

Microsoft has given us many tools over the years with multiple applications. The Process Advisor, now available in commercial, GCC, GCC High, and DoD environments, is simple enough to understand and use but has many potential applications.

What is the Process Advisor?

The Process Advisor is a tool included within Power Automate. It is used to record mouse clicks and keyboard keystrokes during a process. After recording a repetitive task, the recordings are uploaded and analyzed by the Process Advisor, making recommendations about how Power Automate can streamline the workflow.

Why is using a Process Advisor in Power Automate Important?

Automating processes saves time, and Process Advisor shows you the way. After recording a process, it will break down the steps and recommend, on a per-step basis, what could be automated with cloud flows, desktop flows, RPA, and even AI models.

What is a Time Study?

A time study is precisely what it sounds like: studying the time it takes to complete a task. The goal of a time study is to document the steps necessary to complete a task and how long those steps take, on average. This information is then used to make the process more efficient and reduce how long the task takes. There are many ways to conduct a time study, ranging from a stopwatch and paper to expensive time study software.

Once the information has been gathered, you must consolidate, clean, and analyze it to decipher potential improvements. The good news is that the Process Advisor uses AI to do all of this for you.
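For intuition, the kind of consolidation Process Advisor automates can be sketched in a few lines (a hypothetical illustration of the analysis step, not Process Advisor’s actual implementation): given several recordings of the same task, average each step’s duration and flag the slowest step as the bottleneck.

```python
from collections import defaultdict
from statistics import mean

def summarize_recordings(recordings):
    """recordings: list of recordings, each a list of (step_name, seconds).
    Returns average seconds per step and the slowest (bottleneck) step."""
    durations = defaultdict(list)
    for recording in recordings:
        for step, seconds in recording:
            durations[step].append(seconds)
    averages = {step: mean(times) for step, times in durations.items()}
    bottleneck = max(averages, key=averages.get)
    return averages, bottleneck

# Two people recorded the same three-step invoice task.
recordings = [
    [("open email", 5.0), ("copy invoice", 30.0), ("paste to ERP", 12.0)],
    [("open email", 7.0), ("copy invoice", 26.0), ("paste to ERP", 10.0)],
]
averages, bottleneck = summarize_recordings(recordings)
print(bottleneck)  # → copy invoice
```

Process Advisor does this aggregation (and much more, such as variance mapping and automation recommendations) automatically from the uploaded recordings.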

Challenges and Considerations

One of the major considerations whenever using the Process Advisor is scope. In other words, which steps will be included when you record the task? For example, if you have a process to receive requests from customers through your website, email, and physical mailings, you may wish to scope your recordings to include all three or prioritize one.

Benefits of using a Time Study with Process Advisor

There are two tangible benefits of using the Process Advisor to conduct a time study. The first is the time saved when you receive your results in real time instead of manually consolidating the data and turning it into actionable information. The second is the immediate recommendations Process Advisor gives you about how to leverage individual pieces of the Power Automate offering to automate the process.

How Do I Get Started?

Install Power Automate Desktop and follow the installation setup instructions.

After installation, open Power Automate Desktop. Create a new flow, name it descriptively, and open the Desktop Recorder or Web Recorder, depending on where the process resides.

Power Automate Desktop

Click Record, complete the task, then click Finish. Repeat this as many times as you would like (a minimum of two recordings is required for the next step); having multiple people perform the same actions is recommended so the tool has more information to glean from.

After recording several examples of the same task being completed, have everyone upload their individual recordings to a shared area inside the Process Advisor, which can be found in the Power Automate section of Microsoft Office.

Shared Area inside Process Advisor

Here you can click Create to add a new process and share it with your team. This gives them a single place to upload their recordings for analysis and keeps things organized.

After you have enough recordings uploaded (the minimum requirement is two), you can select Processes to begin the analysis. Here you can see everything that has been submitted for analysis, who submitted it, and its status. Select Analyze at the top, and all recordings will be analyzed at once; this may take a few minutes.

Analyze recordings

Once it successfully analyzes your recorded processes, select the Analytics button.

Analyze and record

Remember all those steps after completing your data gathering? The Process Advisor has already completed them. A flow chart has been created showing you the variance in the process. Times have been assigned to steps to show how long actions take. Actions that take longer have been color-coded with a red highlight to draw attention. Also, filters can be applied down the right side of the screen to dive into your data and really focus on what the major culprits are for increasing the length of time a task takes.

Apply Filters to Dive into Data

This works wonders for saving you the time of cleaning and organizing data after the initial gathering phase of your time study. But Process Advisor doesn’t stop there. At the top of the screen, there’s another helpful button.

Preview Automated activities

This will launch another window with context from your process recordings. It will then take you to a new instance of Power Automate, where it recommends parts of these recordings that Power Automate can do for you. In short, the Power Automate Process Advisor drastically reduces the time required to collect the data, submit it, analyze it, and turn it into recommendations for process improvements.


The results you receive are nearly instant and take the guesswork out of how to automate your current process. Using Power Automate to offload repetitive tasks to computers will save you time and money. Process Advisor will get you there faster, and using it as a time study tool has the added benefit of prioritizing the biggest wins in terms of time savings.


Microsoft has built a training course that demonstrates exactly how to use the Process Advisor, coupled with AI, to its fullest potential. In this training, an invoice process is analyzed: the bottleneck is identified when invoices arrive via email, and the tool then recommends an RPA process to automate that task.

Contact us to schedule your own RPA-in-a-day!

Practitioner’s Guide to Data Science: Streamlining Data Science Solutions using Python, Scikit-Learn, and Azure ML Service Platform by Nasir Mirza

In addition to all the other great work he does for AIS and our clients, Nasir Mirza found the time to put together a comprehensive guide for programmers who wish to pursue AI/ML development. This guide provides practical guidance and examples for building a solid conceptual foundation and familiarity with data science processes and frameworks. The Practitioner’s Guide to Data Science: Streamlining Data Science Solutions using Python, Scikit-Learn, and Azure ML Service Platform covers data science concepts, processes, and real-world hands-on use cases.

Nasir has been with AIS for 12 years, supporting and leading many projects across cloud and data, and helping our clients deliver business outcomes. From his deep experience and expertise, he’s provided guidance to help readers learn:

  • The applied context of data science during unprecedented growth in global data
  • Organizing Data Science projects using CRISP-DM and Microsoft TDSP
  • Hands-on and guidelines on Data acquisition, exploration, and analysis
  • Implementation of data pre-processing and Feature Engineering
  • Understanding algorithm selection, model development, and model evaluation
  • Hands-on with Azure ML Service, its architecture, and capabilities
  • Using Azure ML SDK and MLOps for implementing real-world use cases

If you’re interested in growing your career in AI/ML development, or you’re a Software Architect or Manager involved in the design and delivery of data science-based solutions, get your copy today and learn from a seasoned data science professional.