Microsoft SMC (support.microsoft.com) is Microsoft’s main support and information website for its ecosystem of products. Even though the site is visited by millions of users per month, the users’ characteristics are unclear, and telemetry data shows a low customer satisfaction (CSAT) score.
The Microsoft MARVEL team approached us to help them understand who SMC's users are, why they visit, what challenges they face, and how the service can be improved. A business goal identified by the MARVEL team was to lower the number of people who contact live customer support, since live support is costly.
Freda Hu
Michelle Lee
Sofia Rodriguez
Caleb Tan
January - June 2021
UX Designer
Mindsets
Journey Maps
Competitive Analysis
Telemetry Data
Survey & Interviews
Crazy Eights
Figma
Usability Testing
We conducted both primary and secondary research during the research stage.
Secondary research was conducted as due diligence prior to our primary research. With secondary research, we were able to gain a thorough understanding of the support space and the current state of Microsoft SMC before meeting with actual users.
We leveraged Microsoft's existing telemetry data to quickly build a basic understanding of who users are and their site-usage habits. We also identified pages with high traffic and click-through rates but low CSAT scores as pages of interest.
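As a minimal sketch of that selection step, the page-of-interest criteria can be expressed as a simple filter over per-page telemetry aggregates. The field names and thresholds below are hypothetical, not Microsoft's actual telemetry schema:

```typescript
// Hypothetical per-page telemetry aggregate; field names and
// thresholds are illustrative only.
interface PageStats {
  url: string;
  monthlyVisits: number;
  clickThroughRate: number; // 0..1
  csat: number;             // average score on a 1..5 scale
}

// Pages of interest: high traffic and click-through, but low CSAT.
function pagesOfInterest(pages: PageStats[]): PageStats[] {
  return pages
    .filter(p => p.monthlyVisits >= 50_000) // assumed traffic cutoff
    .filter(p => p.clickThroughRate >= 0.3) // assumed CTR cutoff
    .filter(p => p.csat < 3.0)              // assumed "low CSAT" cutoff
    .sort((a, b) => a.csat - b.csat);       // worst-rated pages first
}
```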
While we were planning our research, a member of the Microsoft team deployed a UserTesting study with five participants. This study was similar in nature to our intended research and afforded us the opportunity to learn from this initial effort.
An area of opportunity is to clearly indicate types of content and to improve page navigation.
Community-generated responses need to be more clearly delineated if SMC wants to create tie-ins with Microsoft’s community forum.
Integrating search result filtering could help users better find content and streamline their search process.
More than 90% of users don’t sign in, so our solution would need to be accessible without authentication.
52.2% of authenticated users have consumer or family accounts, indicating that the majority are self-solving.
23.4% of users get to SMC via external search, which means not all users land on the homepage.
Users generally confused SMC with answers.microsoft.com (Microsoft’s community forum) and vice versa.
If users didn't find their answer right away, they would 'pogo-stick' between websites until they found what they needed.
Users like contacting live support because it is faster and offers a more personalized problem-solving experience.
We were granted access to Microsoft's proprietary UX research tools for our primary research. We chose these tools because they allowed us to recruit and speak with actual SMC users for each of our primary research methods: intercept interviews, Greenroom interviews, and a survey hosted on SMC.
Microsoft's UX research suite includes a tool that lets researchers speak to anonymous participants in situ. This method allowed us to meet users where they were and speak to them at the height of their problem-solving process.
8 participants
15 minutes each
With a large number of daily users, surveys helped us gather a breadth of experiences. We placed our surveys across several pages on SMC, including the homepage and the high-traffic, high click-through, low-CSAT pages identified through telemetry data.
Total responses: 798
Valid responses: 347
Greenroom is Microsoft's tool for creating interview panels. By using Greenroom, we gave interviewed SMC users a sense of direct contact with Microsoft. It also amplifies the impact of our research, since Microsoft can reach out to these participants in the future.
10 participants
8 countries
“If I’m not tech-savvy, having the visual troubleshooting step will help me more than just the text”
Our survey data showed that low-confidence users are the main driver of the low CSAT score. We hypothesized that designing for users with low computer confidence would have a bottom-up effect, improving the self-solving experience for medium- and high-confidence users as well.
“I'm not looking at the right places so it's stressing me”
Users’ understanding of their problem is usually vague and incomplete. They don’t know how to describe their problem in a way the system understands. We decided to lower this barrier by helping the user recognize rather than recall their problem.
“It was frustrating to not be able to find a solution. I probably did not spend enough time searching for the solution”
Users tend to use search to find a solution. However, search results often don’t match the user’s search intent. Many articles look similar but ultimately differ, causing confusion. Our design helps users find the solutions that match their specific needs.
"It requires in-depth knowledge of background of the computer, like to be a surgeon"
There's a mismatch between the language adopted by the website and that of the user. Users often find the content too technical and difficult to understand. Our solutions help users understand technical concepts and navigate faster.
Synthesizing our research, we developed Mindsets and Journey Maps. I was responsible for synthesizing findings and creating the Mindsets.
Mindsets were chosen over personas because they better represent the situational nature of being a low-confidence user. We noticed instances where participants rated themselves as confident yet struggled on SMC all the same. Mindsets also avoid the subconscious biases that can attach to traits like age and race. Instead, naming a mindset "Stranded on an Island" builds greater empathy by conveying the feeling of being lost and alone.
Journey maps were used to capture the turning points in experiences across the self-solving process on SMC. These artifacts helped us communicate the emotions, pain points, and reactions that occur and the value of streamlining the support experience to the Microsoft team.
Our final mindset, "At the Crossroads," describes someone who is aware they don't know everything but is optimistic and knows how to seek help. Users in this mindset tend to stay more positive because they are already aware of their own limitations, and they are motivated to self-solve as much as they can.
Two areas across the experience presented opportunities to address our design question – during search and within articles.
We used the "Crazy Eights" design method to generate innovative solutions. From this eight-minute sketching session, four major potential design directions surfaced. We sought feedback and iterated on these four design concepts as our core features.
After industry mentors encouraged us to innovate and scaffold more for low-confidence users, we melded the conversational style with personalized search results to create a more dynamic filtering process, and incorporated hover-over explanations to build understanding in a way that is friendly to low-tech users.
Users do not immediately seek live technical support, but desire a personalized experience. This finding inspired us to create a personalized and responsive digital experience that can guide users through their self-solving journey. A dynamic, voice-accessible search filtering tool guides problem articulation and identification, and helps with distinguishing information.
23.4% of SMC visitors land on an article page directly from external search (e.g., Bing, Google), which makes in-article solutions equally important in the self-solving process. Hover-over explanations, community responses, and the ability to highlight confusing terms were created to directly address our design question within articles.
We conducted three rounds of usability testing with low-to-medium-confidence users on UserTesting. Through a screener, we recruited users who had prior experience with Microsoft Support but did not feel confident.
We started with moderated usability testing because we wanted to ensure we collected sufficiently rich data on the initial design. It also let us provide live assistance if the initial design proved overly complex. Finally, we were Wizard-of-Oz-ing the voice feature, so a moderated setting was the most feasible.
We pivoted to unmoderated tests with the goal of assessing the natural discoverability of features and navigation, and identifying blockers. This would also account for participant social desirability bias. We adjusted testing scenarios and tasks accordingly.
The search assistant icon in the top left is inconspicuous and easy to miss. The left-right layout after expanding is also unfriendly to users with small screens.
We expanded the search assistant by default to increase discoverability and recognition. It was moved to the top center of the page to increase prominence and fit various screen sizes.
Participants are very focused on the search assistant pane and don't notice when search results have updated.
We added a notification and a color change to call out the updated search results, along with a CTA button guiding users to the refined results.
Users thought the search assistant was essentially a chatbot, and the voice input feature reinforced this impression. Therefore, they expected the search assistant to be smarter and more capable.
The tone of the search assistant was adjusted to be less human-like but still friendly and easy to understand. Voice search was separated from the search assistant and integrated into the search box to reduce confusion.
The in-article assistant needed work for several reasons: it was not very noticeable, and participants disliked having to highlight confusing terms on their own.
The Search Assistant is carried over into the article to create continuity, and users no longer have to highlight confusing terms themselves; instead, their search keywords are pre-highlighted and accessible through the Search Assistant.
Participants were confused about the purpose of their search keywords being highlighted.
Users generally take well to hover effects, so we used hover cards to explain the purpose of the highlighted search keywords. Additionally, we chose orange for the visual cues to evoke warmth and friendliness during a stressful process.
After three rounds of usability testing, our solution was generally well received and understood. Participants mainly noted improvements in accessibility and functionality.
Our design solution focuses on dynamic search result filtering plus in-article technical term explanation and navigation aids to help users locate their solution easily. In addition, we redesigned the search results page, incorporated access to community answers, and included voice accessibility to aid the self-solving process.
By offering an experience similar to working with a support professional, our design also advances Microsoft's business goal of reducing live customer support tickets.
The Search Assistant provides users with a low-effort, visual, and conversational filter to help them specify their problem.
By proactively asking for more information (product, device, version, etc.) about the problem, the search engine can better understand the user’s specific problem and provide the most applicable solution.
Search results and answer snippets are narrowed down in real time to present the solution most relevant to users’ specific needs.
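To illustrate the filtering idea, here is a minimal sketch in which each answer the assistant collects becomes a facet that narrows the visible result set. The article shape, facet names, and sample data are all hypothetical; the real SMC search index differs:

```typescript
// Hypothetical article shape; the real SMC index and facets differ.
interface Article {
  title: string;
  product?: string;
  device?: string;
  version?: string;
}

type Facets = Partial<Pick<Article, "product" | "device" | "version">>;

// Narrow the current result set each time the assistant learns a
// new fact about the problem (product, device, version, ...).
function refineResults(results: Article[], facets: Facets): Article[] {
  return results.filter(article =>
    Object.entries(facets).every(
      ([key, value]) => article[key as keyof Facets] === value
    )
  );
}

const allArticles: Article[] = [
  { title: "Set up email in Outlook", product: "Outlook", version: "2021" },
  { title: "Fix OneDrive sync issues", product: "OneDrive" },
];

// The assistant asks "Which product?"; the user answers "Outlook".
let results = refineResults(allArticles, { product: "Outlook" });
// A follow-up question narrows the list further, in real time.
results = refineResults(results, { product: "Outlook", version: "2021" });
```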
1
A low-effort, conversational Search Assistant.
2
Search results updated in real time.
Upon entering an article, the Search Assistant continues to support users’ self-solving.
The Search Assistant highlights keywords from their search and provides quick and easy navigation. This reduces the time needed to scan for the most relevant information.
Low-confidence users have an easier time with technical-term explanations available without being taken off the page. Technical terms are likewise easy to identify and navigate to with the support of the Search Assistant.
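One way the pre-highlighting could work is to wrap the user's search keywords in marked elements when the article renders, so the Search Assistant can list and jump between them. This sketch uses hypothetical names and a simple case-insensitive match; a production renderer would annotate the DOM rather than rewrite raw HTML strings:

```typescript
// Escape regex metacharacters in user-supplied keywords.
function escapeRegExp(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

// Wrap every occurrence of a search keyword in a <mark> element so
// the Search Assistant can link to and navigate between highlights.
function highlightKeywords(articleHtml: string, keywords: string[]): string {
  const pattern = new RegExp(
    `\\b(${keywords.map(escapeRegExp).join("|")})\\b`,
    "gi"
  );
  return articleHtml.replace(pattern, '<mark class="smc-keyword">$1</mark>');
}

// Example: the user searched for "outlook sync error".
const articleBody = "<p>If Outlook reports a sync error, restart the app.</p>";
highlightKeywords(articleBody, ["outlook", "sync", "error"]);
```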
3
Highlighted search keywords for quicker information parsing.
4
Easier navigation of article with navigation aids.
5
Quick insight into technical terms with previews.
The search result page redesign helps users identify the information they need.
Users can better distinguish between similar content with clearer article hierarchy, content type, and article previews.
With the help of suggested related search terms and questions others also asked, users can easily recognize their problem if they are not sure how to describe it.
This project offered me many new experiences: working closely with a company of Microsoft's size, reach, and resources, and collaborating with their diverse team. In the month of May alone, Microsoft SMC received a billion page visits. I am glad we decided to expand our toolbox by learning about Mindsets, as personas would not have captured the complexity of SMC's users.
One major challenge we experienced during the project was working with the Microsoft UX research tools. While they helped give users a sense of direct communication with Microsoft, any time we tweaked our research we had to go through the Microsoft team, which delayed our research process.
If I had more time, I would have continued with A/B testing of the in-article interactions and explanations. I would also design for the mobile experience, as we heard that some users prefer visiting the site on their phones. Lastly, there is an opportunity to further build out the hierarchy of information in the Search Assistant's filtering questions.