Boosting Microsoft Teams Adoption with Better Analytics


A good deal of time is spent evaluating and selecting the right conferencing and collaboration technology. And yet, without broad-based adoption, the expected return on investment for new communication, collaboration, or meeting technology is almost never achieved. Each project’s return on investment may be tied to more specific business goals, such as reducing conferencing or travel costs, enabling remote work, or providing business continuity when an office location is inaccessible. In virtually every case, achieving unified communications and collaboration (UC&C) project goals requires strong adoption.

Before discussing how to measure and drive adoption, it is important to understand the difference between usage and adoption. Here’s how I distinguish between the two:

Usage: quantity of things—calls, meetings, minutes, messages

Adoption: which people—who is logging in daily, who is organizing meetings, who is attending meetings, who is using video (and often of equal importance, who is not adopting specific features and modalities).
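To make the distinction concrete, the same call log can be summarized both ways. Here is a minimal sketch in Python; the record layout is hypothetical, and real platform exports will differ:

    # Sketch: one call log, two views. Record fields are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class CallRecord:
        user: str          # who participated
        minutes: float     # session length
        organized: bool    # did this user organize the session?
        used_video: bool   # did this user turn on video?

    def usage(records):
        """Usage: quantity of things -- sessions and minutes."""
        return {"sessions": len(records),
                "minutes": sum(r.minutes for r in records)}

    def adoption(records, all_users):
        """Adoption: which people -- including who is NOT showing up."""
        active = {r.user for r in records}
        return {
            "active_users": active,
            "organizers": {r.user for r in records if r.organized},
            "video_users": {r.user for r in records if r.used_video},
            "non_adopters": set(all_users) - active,
        }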

Microsoft Teams, and Microsoft Office more broadly, includes a number of adoption-related reports. Skype for Business, like many other UC&C tools, does not include adoption reporting; however, it is possible to develop custom SQL queries against its monitoring database to generate adoption reports.
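For example, the sketch below pulls a daily count of distinct sign-ins from the Skype for Business monitoring database. It assumes the standard LcsCDR database with its dbo.Registration and dbo.Users tables; verify the table and column names against your deployment, and note the server name is a placeholder:

    # Sketch: daily distinct sign-ins from the Skype for Business
    # monitoring database (LcsCDR). Table/column names follow the
    # commonly documented layout; confirm against your schema.
    import pyodbc

    QUERY = """
    SELECT CAST(r.RegisterTime AS date) AS SignInDate,
           COUNT(DISTINCT u.UserUri)   AS ActiveUsers
    FROM   dbo.Registration r
    JOIN   dbo.Users u ON u.UserId = r.UserId
    GROUP  BY CAST(r.RegisterTime AS date)
    ORDER  BY SignInDate;
    """

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sqlserver.example.com;DATABASE=LcsCDR;"  # placeholder server
        "Trusted_Connection=yes;"
    )
    for sign_in_date, active_users in conn.execute(QUERY):
        print(sign_in_date, active_users)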


Certainly, quality and reliability are important factors that can increase both usage and adoption; however, experience has shown that user satisfaction ratings are a far better predictor of adoption. It turns out that quality, the focus of many reporting tools, has virtually no correlation to user satisfaction. There are several reasons for this. First, quality is often expressed as a Mean Opinion Score (MOS), a numerical calculation intended to quantify a subjective user opinion; creating an algorithm to estimate opinions is problematic. Second, in many systems the MOS reflects only the network’s effect on quality. This is a holdover from the legacy telephony days, when fixed, purpose-built desk phones were the primary communication device and the network was the only variable that could be changed. Today, overall communication quality is shaped by numerous factors: the endpoint device, the driver versions installed, what other applications are running at the same time, which audio and video devices are in use (built-in or USB-attached), whether the network connection is wired or wireless, whether the user is moving during the call, whether the user is connected to a VPN, and more.
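The weak link between calculated quality and satisfaction is easy to check against your own data. A minimal sketch, assuming you can export per-call MOS values paired with the star ratings users gave those same calls (the sample values below are illustrative only):

    # Sketch: how well does per-call MOS track user star ratings?
    # Paired values are illustrative sample data, not real results.
    from statistics import correlation  # Python 3.10+

    mos_scores   = [3.9, 4.1, 2.8, 4.3, 3.6]   # per-call MOS
    star_ratings = [4,   2,   4,   5,   1]     # paired rate-my-call stars

    r = correlation(mos_scores, star_ratings)  # Pearson's r
    print(f"MOS vs. satisfaction correlation: {r:+.2f}")
    # A value near 0 on your real data supports the argument that
    # calculated quality is a weak predictor of user satisfaction.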

Reliability is arguably a better metric than quality to focus on. Most typically, reliability is defined as the ability to join a call or meeting (sometimes referred to as dial-tone reliability) along with the ability to complete a call or meeting without an unexpected error terminating the session. This is far less subjective than a calculated quality score. And yet, a session reported as having perfect reliability and quality could still have subjected a user to a long wait to join the call or meeting, yielding a poor user experience.
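Here is a sketch of reliability measured this way, with time-to-join included to catch the long-wait case that pure success rates miss; the session fields are hypothetical:

    # Sketch: reliability as two failure rates, plus time-to-join,
    # which a perfect success rate can hide. Fields are hypothetical.
    from statistics import median

    def reliability(sessions):
        attempted = len(sessions)
        setup_failures = sum(1 for s in sessions if not s["joined"])
        completed = attempted - setup_failures
        drops = sum(1 for s in sessions
                    if s["joined"] and s["dropped_unexpectedly"])
        join_times = [s["seconds_to_join"] for s in sessions if s["joined"]]
        return {
            "setup_failure_rate": setup_failures / attempted,
            "mid_call_drop_rate": drops / completed if completed else 0.0,
            "median_seconds_to_join": median(join_times) if join_times else None,
        }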

Better than algorithmically estimated quality ratings, and even better than reliability ratings, user satisfaction is the best predictor of adoption. Or, as Kramer from Seinfeld might say, “Why don’t you just ask them how they like it?”

Unlike the majority of UC&C tools, Microsoft Teams and Skype for Business both have a built-in “rate my call” mechanism that can be used to automatically solicit feedback from the user after a set percentage of calls (by default, approximately one in 10). Users are prompted to score a call between 1 and 5 stars (very poor to excellent). Additionally, users can note the occurrence of one or more audio and/or video problems (if video was used) by selecting check boxes when prompted.
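Once collected, these responses are straightforward to summarize. A minimal sketch, assuming a hypothetical export format with a star score and a list of checked problem boxes per response:

    # Sketch: summarize rate-my-call responses. The export layout
    # (star score plus checked problem boxes) is hypothetical.
    from collections import Counter

    def summarize(responses):
        stars = [r["stars"] for r in responses]
        problems = Counter(p for r in responses for p in r["problems"])
        return {
            "responses": len(stars),
            "avg_stars": round(sum(stars) / len(stars), 2),
            "pct_poor": round(100 * sum(1 for s in stars if s <= 2) / len(stars), 1),
            "top_problems": problems.most_common(3),
        }

    print(summarize([
        {"stars": 5, "problems": []},
        {"stars": 2, "problems": ["echo", "frozen video"]},
        {"stars": 4, "problems": ["echo"]},
    ]))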

Rate My Call

Even knowing that you should monitor user satisfaction, the challenge becomes sorting through the proliferation of built-in and third-party reporting tools. Over time, the focus appears to have landed on the quantity and “sizzle” of reports.

With Teams, Microsoft offers no fewer than four reporting tools that provide analytics related to usage and adoption, quality, reliability, and user satisfaction (one programmatic route into this data is sketched after the list):

  • Office 365 Admin Center Reports
  • Teams Admin Center Reports
  • Call Quality Dashboard Reports (including the ability to create and import new report templates)
  • Power BI Reports (a pre-built Office 365 Usage and Adoption report along with seven new Call Quality Dashboard reports)
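As one example of that programmatic route, the usage data behind the admin center reports can also be pulled from the Microsoft Graph reports API. A minimal sketch, assuming an Azure AD app registration with the Reports.Read.All permission and an already-acquired access token (the token below is a placeholder):

    # Sketch: pull the Teams user-activity detail report via Microsoft
    # Graph -- the same data behind the Office 365 usage reports.
    import requests

    TOKEN = "<access token from your Azure AD app>"  # placeholder
    url = ("https://graph.microsoft.com/v1.0/reports/"
           "getTeamsUserActivityUserDetail(period='D30')")

    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    print(resp.text[:500])  # CSV: one row per user with activity counts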

The new Power BI Call Quality Dashboard reports (available for download from GitHub) are especially well designed visually; and yet, this beauty can distract from the key purpose to which all reports should aspire: helping deliver improved business outcomes. In other words, reporting and analytics need to help advance UC&C adoption, which in turn helps deliver on the business objectives you defined for your UC&C implementation. (You did define, document, and build consensus around your project objectives, right?)

A New ‘Beautiful’ Call Quality Dashboard

But even once you can decipher the reporting tools and measure adoption, how exactly do you increase it? It turns out that the primary drivers of adoption are not technical. Communications, ongoing training, and change management are key:

  • Communications: Why is the organization changing tools, how does it benefit the organization, and how does it benefit the individual?
  • Training: incremental and ongoing
  • Change management: detailed instructions explaining the old process and the new one (ideally emphasizing where the new process is an improvement)

The form that communications and training should take is often debated. The reality is “the proof is in the pudding.” What is best is what works and yields measurable results (adoption) in your organization. Cultures differ, workforce demographics differ, incentive plans and motivations differ. What remains consistent is that you need to work, measure, and then refine. 

Working to drive adoption, and then analyzing and refining your efforts, is complicated but necessary. Training your existing IT professionals to do this work, or finding external resources to assist you, greatly increases the probability that your UC&C project will be successful.

Kevin Kieller is a UC & collaboration success advisor and partner at EnableUC, a company that helps measure, monitor and improve UC and collaboration usage and adoption through a unique set of products and services.
