MICROSOFT SUPPORT SITE

Microsoft SMC (support.microsoft.com) is Microsoft’s main support and information website for its ecosystem of products. Although the site is visited by millions of users per month, little is known about who those users are, and telemetry data shows a low customer satisfaction (CSAT) score.

The Microsoft MARVEL team approached us to help them understand who the SMC site users are, why they visit, what challenges they face, and how this service can be improved. A business goal identified by the MARVEL team was to reduce the number of people who contact live customer support, as live support is costly.

TEAM

Freda Hu

Michelle Lee

Sofia Rodriguez

Caleb Tan

TIMELINE

January - June 2021

Role

UX Designer

Tools & methods

Mindsets

Journey Maps

Competitive Analysis

Telemetry Data

Survey & Interviews

Crazy Eights

Figma

Usability Testing

Primary and secondary research were conducted during the research stage.

Secondary Research

Secondary research was conducted as due diligence prior to our primary research. With secondary research, we were able to gain a thorough understanding of the support space and the current state of Microsoft SMC before meeting with actual users.

TELEMETRY DATA

We leveraged Microsoft’s existing telemetry data to quickly gain a basic understanding of who users are and their site-usage habits. We were also able to identify pages with high traffic and click through rates but low CSAT scores as pages of interest.
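As a rough illustration of this kind of triage, high-traffic, high click-through pages with low CSAT scores can be flagged programmatically. The schema, field names, and thresholds below are hypothetical, not Microsoft’s actual telemetry format:

```python
# Hypothetical telemetry records; URLs, metrics, and thresholds are
# illustrative only, not Microsoft's real data or schema.
pages = [
    {"url": "/windows-update", "visits": 120_000, "ctr": 0.42, "csat": 2.1},
    {"url": "/office-install", "visits": 95_000, "ctr": 0.38, "csat": 4.3},
    {"url": "/reset-password", "visits": 8_000, "ctr": 0.12, "csat": 2.0},
]

def pages_of_interest(pages, min_visits=50_000, min_ctr=0.3, max_csat=3.0):
    """Flag high-traffic, high click-through pages with low CSAT scores."""
    return [
        p["url"]
        for p in pages
        if p["visits"] >= min_visits
        and p["ctr"] >= min_ctr
        and p["csat"] <= max_csat
    ]

print(pages_of_interest(pages))  # only /windows-update meets all criteria
```

Pages surfaced this way became natural candidates for survey placement and deeper qualitative research.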

MICROSOFT-RUN USERTESTING

While we were planning our research, a member of the Microsoft team deployed a UserTesting study with five participants. This study was similar in nature to our intended research and afforded us the opportunity to learn from this initial effort.

Secondary Research Key Insights

COMPETITIVE ANALYSIS

01.

An area of opportunity is to clearly indicate types of content and to improve page navigation.

02.

Community-generated responses need to be more clearly delineated if SMC wants to create tie-ins with Microsoft’s community forum.

03.

Integrating search result filtering could help users find content more easily and streamline their search process.

TELEMETRY DATA

01.

More than 90% of users don’t sign in, so our solution would need to be accessible without authentication.

02.

52.2% of authenticated users have consumer or family accounts, suggesting that the majority are self-solving.

03.

23.4% of users get to SMC via external search, which means not all users land on the homepage.

MICROSOFT-RUN USERTESTING

01.

Users generally confused SMC with answers.microsoft.com (Microsoft’s community forum) and vice versa.

02.

If users didn’t find their answer at first, they would ‘pogo-stick’ to other websites until they found what they needed.

03.

Users like to contact live support because it is faster and offers a more personalized problem-solving experience.

Primary Research

We were granted access to Microsoft’s proprietary UX research tools for our primary research. These tools allowed us to recruit and speak with actual SMC users for each of our primary research methods: Intercept interviews, Greenroom interviews, and a survey hosted on SMC.

INTERCEPT IN-SITU INTERVIEWS

Microsoft’s UX Research suite includes a tool that allows researchers to speak to anonymous participants in-situ. This method allowed us to meet users where they were and speak to them at the height of their problem-solving process.

8 participants

15 minutes each

SURVEY

With a large number of daily users, surveys would help gather a breadth of experiences. We placed our surveys across several pages on SMC – these included the homepage, as well as the high-traffic, high click-through pages with low CSAT scores identified through telemetry data.

Total responses: 798

Valid responses: 347

GREENROOM IN-DEPTH INTERVIEWS

Greenroom is Microsoft’s tool for creating interview panels. By using Greenroom, we were able to establish a sense of direct contact with Microsoft for interviewed SMC users. This also increases the impact of our research, as Microsoft can reach out to these participants in the future.

10 participants

8 countries

Key Research Takeaways

1. Designing for low confidence users as the lowest common denominator

“If I’m not tech-savvy, having the visual troubleshooting step will help me more than just the text”

Our survey data showed that low confidence users are the main driver of the low CSAT score. We hypothesized that designing for low computer-confidence users would have a bottom-up effect, improving the self-solving experience for medium and high confidence users as well.

2. Users need help with describing and specifying their problem

“I'm not looking at the right places so it's stressing me”

Users’ understanding of their problem is usually vague and incomplete. They don’t know how to describe their problem in a way the system understands. We decided to lower this barrier by helping the user recognize rather than recall their problem.

3. Users struggle to distinguish between similar content and find what they need

“It was frustrating to not be able to find a solution. I probably did not spend enough time searching for the solution”

Users tend to use search to find a solution. However, search results often don’t match the user’s search intent. Many articles look similar but ultimately differ, causing confusion. Our design helps users find the solutions that match their specific needs.

4. Users find content on SMC to be overly technical and hard to parse

"It requires in-depth knowledge of background of the computer, like to be a surgeon"

There’s a mismatch between the language used by the website and that of the user. Users often find the content too technical and difficult to understand. We provide solutions that help users understand technical concepts and navigate faster.

Research Artifacts

Synthesizing our research, we developed Mindsets and Journey Maps. I was responsible for synthesizing findings and creating the Mindsets.

Mindsets were chosen over personas because they better represent the situational nature of being a low-confidence user. We noticed instances where participants would rate themselves as confident yet still struggle on SMC. Mindsets also avoid the subconscious biases that can attach to traits like age and race. Instead, naming a mindset “Stranded on an Island” builds greater empathy by conveying the feeling of being lost and alone.

Journey maps were used to capture the turning points in experiences across the self-solving process on SMC. These artifacts helped us communicate the emotions, pain points, and reactions that occur and the value of streamlining the support experience to the Microsoft team.

Design Question

How might we help low computer-confidence users clearly articulate their problems and distinguish the information they need so that they can more effectively locate their solution?

Brainstorming

Two areas across the experience presented opportunities to address our design question – during search and within articles.
We used the “Crazy Eights” design method to generate solutions. From this eight-minute sketching session, four major potential design directions surfaced. We sought feedback and iterated on these four design concepts as our core features.

In-Article Assistance

Conversational Audio Input

Personalized search results

Chatbot/Virtual Agent AI

After feedback from industry mentors to innovate further and scaffold more for low confidence users, we melded the conversational style with personalized search results, creating a more dynamic filtering process, and incorporated hover-over explanations to increase understanding in a way that stays friendly to low-tech users.

Initial Design

Users like the responsive and personalized experience of talking to a support agent

A compilation of the initial design of the search results page with the search assistant.

Users do not immediately seek live technical support, but desire a personalized experience. This finding inspired us to create a personalized and responsive digital experience that can guide users through their self-solving journey. A dynamic, voice-accessible search filtering tool guides problem articulation and identification, and helps with distinguishing information.

A quarter of SMC visitors enter via external search and never see the homepage.

A compilation of the initial design of in-article assistance, including community forum integration.

23.4% of SMC visitors land on an article page directly from external search (e.g., Bing, Google). This meant that an in-article solution would be equally important in the self-solving process. Hover-over explanations, community responses, and the ability to highlight confusing terms were created to address our design question directly within articles.

We conducted three rounds of usability testing with low-to-medium confidence users on UserTesting. Through a screener, we recruited users who had prior experience with Microsoft Support but did not feel confident.

We started with moderated usability testing because we wanted to ensure we collected sufficiently rich data on the initial design. It also let us provide live assistance if the initial design proved overly complex. Finally, we were Wizard-of-Oz-ing the voice feature, so a moderated setting was the most feasible option.

We then pivoted to unmoderated tests to assess the natural discoverability of features and navigation and to identify blockers. This also reduced participant social desirability bias. We adjusted testing scenarios and tasks accordingly.

Design iterations

Search assistant - Dynamic Search Result Filtering Tool

Finding

Users thought the search assistant was essentially a chatbot, and the voice input feature reinforced this impression. Therefore, they expected the search assistant to be smarter and more capable.

Improvements

The tone of the search assistant was adjusted to be less human-like but still friendly and easy to understand. Voice search was separated from the search assistant and integrated into the search box to reduce confusion.

A close up of the search assistant with the wording, "Hi, I'm your virtual search assistant!" and "I have refined the search result to match your problem".

New iteration of the search tone saying, "Share more information to get the most relevant search results", with the voice option moved to the search bar.

In-Article assistance

Finding

The in-article assistant needed work for several reasons: it was not very noticeable, and participants disliked having to highlight confusing terms themselves.

Improvements

The Search Assistant is carried over into the article to create continuity, and users no longer have to highlight confusing terms themselves; instead, their search keywords are pre-highlighted and accessible through the Search Assistant.

The initial in-article solution with modal call-outs and a user highlighting feature.

The second iteration of the in-article solution, where the search assistant sits at the side of the article with navigation buttons. Highlights now reflect search keywords.

“...Accessibility [is] very easy, [the functionality is] very helpful compared to my experience of not just Microsoft but help sites I use for all kinds of technical stuff”

After three rounds of usability testing, our solution was generally well received and understood. Participants mainly noted improvements in accessibility and functionality.

Our design solution focuses on dynamic search result filtering plus in-article technical term explanation and navigation aids to help users locate their solution easily. In addition, we redesigned the search results page, incorporated access to community answers, and included voice accessibility to aid the self-solving process.

By offering a similar experience to the support process provided by support professionals, it also helps Microsoft’s business goal of lowering live customer support tickets.

Dynamic search result filtering

Search assistant provides users with a low effort, visual, and conversational filter to help them specify their problem.

By proactively asking for more information (product, device, version, etc.) about the problem, the search engine can better understand the user’s specific problem and provide the most applicable solution.

Search results and answer snippets are narrowed down in real time to present the solution most relevant to users’ specific needs.

1

A low effort and conversational Search assistant.

2

Real time updated search results.

A long list of search results with the search assistant, the top search result highlighting an update. Next to it is the hierarchy of questions for narrowing down a search.
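The narrowing behavior described above can be sketched as successive filters: each answer to the assistant’s questions removes results that don’t match. The article data, field names, and facets below are hypothetical, not SMC’s actual search implementation:

```python
# Toy article index; fields and titles are invented for illustration.
articles = [
    {"title": "Update Windows 11 on a PC", "product": "windows", "version": "11"},
    {"title": "Update Windows 10 on a PC", "product": "windows", "version": "10"},
    {"title": "Update Office apps", "product": "office", "version": None},
]

def narrow(results, **facets):
    """Keep only results matching every facet the user has specified so far."""
    for key, value in facets.items():
        results = [r for r in results if r.get(key) == value]
    return results

# Each assistant question adds one facet, shrinking the result set in real time.
step1 = narrow(articles, product="windows")  # two Windows articles remain
step2 = narrow(step1, version="11")          # a single article remains
print([r["title"] for r in step2])
```

In a real system the facets would come from the assistant’s product/device/version questions, but the narrowing principle is the same.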

In-Article Navigation Aids

Upon entering an article, the Search Assistant continues to support users’ self-solving.

The Search Assistant highlights keywords from their search and provides quick and easy navigation. This reduces the time needed to scan for the most relevant information.

Low confidence users can view technical-term explanations without being taken off the page. Technical terms are likewise easy to identify and navigate to with the support of the Search Assistant.

3

Search keywords for quicker information parsing.

4

Easier navigation of article with navigation aids.

5

Quick insight into technical terms with previews.
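The keyword pre-highlighting described above could be sketched as a simple substitution that wraps the user’s search terms in markup. The `<mark>` tag and the `highlight` helper are illustrative assumptions, not SMC’s actual implementation:

```python
import re

def highlight(article_text, keywords):
    """Wrap each search keyword in <mark> tags, case-insensitively.

    Hypothetical sketch: a production version would also handle
    stemming, overlapping matches, and HTML-safe insertion.
    """
    for kw in keywords:
        article_text = re.sub(
            rf"(?i)\b({re.escape(kw)})\b",
            r"<mark>\1</mark>",
            article_text,
        )
    return article_text

print(highlight("Restart the Windows Update service.", ["windows update"]))
```

Pre-marked spans like these give the assistant anchors to jump between, which is what enables the next/previous navigation buttons.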

Content clarity and organization

The search result page redesign helps users identify the information they need.

Users can better distinguish between similar content with clearer article hierarchy, content type, and article previews.

With the help of suggested related search terms and questions others also asked, users can easily recognize their problem if they are not sure how to describe it.

Check out our interactive prototype and demo video!

Reflecting on my experience

This project offered me many new experiences: working closely with a company of Microsoft’s size, reach, and resources, and collaborating with their diverse team. In the month of May alone, Microsoft SMC received a billion page visits. I am glad we decided to expand our toolbox by learning about Mindsets, as personas would not have captured the complexity of SMC’s users.

One major challenge we experienced during the project was using the Microsoft UX Research tools. While they helped us establish a sense of direct communication between users and Microsoft, any time we tweaked our research we had to go through the Microsoft team, which delayed our research process.

If I had more time, I would have continued with A/B testing of the in-article interactions and explanations. I would also design for the mobile experience, as we heard that some users prefer visiting the site on their phones. Lastly, there is an opportunity to further build out the hierarchy of information within the Search Assistant’s filtering questions.