Autodesk guided troubleshooting

Guiding customers along a solution path to solve complex software issues

content modeling - navigation - A/B testing

Problem Statement

Customers were sometimes failing to self-service when faced with troubleshooting info in the form of complex solution paths—a series of diagnostic articles users step through to find their solution. Customers were often presented with self-service articles that:

  • Were too long

  • Contained multiple solutions (“which one should I try?”). Negative helpfulness votes correlated with overly complex articles that crammed every possible solution into one place, which overwhelms users and violates the principle of progressive disclosure

  • Contained daunting and complex instructions

  • Were written as generic “how-to” articles that didn’t adequately capture the customer’s context or issue, making the actual solution hard to find

  • Were not served up in context: for instance, most error messages were dead ends, requiring users to exit the product and research the issue instead of receiving contextual help

Goals

  • Create an easier way for customers to diagnose their specific issue and be directed to the right troubleshooting content solution

  • Create an easier way for KDEs to author and display the relevant information

Team

As the sole UX designer on this project, I worked with product management, content strategy, visual design, Autodesk product KDEs (knowledge domain experts), and customer support agents to research and design a solution. I consulted with UX research to discuss options for user testing and collecting customer feedback.

Discovery & Research

Guided troubleshooting flows are common in software documentation and help content. Along with a competitive analysis of existing flows including Microsoft support pages, Adobe help, and online symptom checkers from healthcare sites, I consulted with Autodesk KDEs and customer support agents. I learned that candidates for guided troubleshooting articles are grouped in two categories:

  • Common Symptom and Many Solutions: Per the Knowledge-Centered Support (KCS) guidelines, “each step in this diagnostic process is itself an article. The appropriate resolution path is dictated by the outcome of each step. Computer programmers might think of this process as a series of if/then steps, and the procedural articles as reusable subroutines.”

  • Many Symptoms and Common Solution: Users are not necessarily even aware they’ve entered into a solution-path format. Improving both the content structure and UX for these paths helps customers troubleshoot problems themselves more easily. KDEs struggle to author these paths: the content standard does not address them, and there is no clear definition of how to structure them.
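The KCS “if/then” model above can be pictured as a small decision tree: each diagnostic step is itself an article, and each answer routes the customer either to another step or to a reusable resolution article. A minimal sketch of that content model, with all article IDs and the licensing-error example entirely hypothetical:

```python
# Sketch of a KCS-style solution path: diagnostic steps are articles whose
# answers point to either another step or a resolution article ("subroutine").
from dataclasses import dataclass, field


@dataclass
class Step:
    """One diagnostic article in a solution path."""
    article_id: str
    question: str
    # Maps a customer's answer to the next Step, or to a resolution article ID.
    answers: dict = field(default_factory=dict)


def walk_path(step, choose):
    """Traverse the path, using `choose(step)` to pick an answer at each node.

    Returns the ordered list of article IDs the customer passed through.
    """
    visited = [step.article_id]
    while True:
        nxt = step.answers[choose(step)]
        if isinstance(nxt, Step):
            step = nxt
            visited.append(step.article_id)
        else:  # reached a resolution article (a reusable endpoint)
            visited.append(nxt)
            return visited


# Hypothetical licensing-error flow: one diagnostic fork, two resolutions.
license_type = Step("KB-diag-license-type", "Network or standalone license?",
                    {"network": "KB-fix-network-license",
                     "standalone": "KB-reset-standalone-license"})
start = Step("KB-diag-start", "Do you see a licensing error at launch?",
             {"yes": license_type, "no": "KB-general-startup"})

print(walk_path(start, lambda s: "yes" if s is start else "network"))
# → ['KB-diag-start', 'KB-diag-license-type', 'KB-fix-network-license']
```

Because resolution articles are plain IDs, the same resolution can be referenced from multiple paths, which mirrors the “reusable subroutines” framing in the KCS guidance.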

At the same time, customer support agents had begun to author similar troubleshooting articles using Qualtrics and Microsoft Forms, with generally positive feedback from customers.

Design

Based on the success of the initial troubleshooting forms, and following principles of designing simple flows for complex and detailed information, I designed a “symptom checker” that would walk customers through error messages and potential solutions.

KPIs & Customer Impact

We planned to monitor the helpfulness score on articles before and after implementation, and the total number of support cases generated from each article.

For example, pre-launch, 63 complex solution path articles showed these metrics in a single, low-activity month (July):

  • Pageviews: 32,633

  • Total Votes: 216

  • Helpfulness: 21.07%

  • Total Cases: 193

This data provided partial insight into how complex solution paths limit the effectiveness of our self-service capability. Roughly 200 cases per month is approximately four headcounts in technical support. Even with this conservative estimate, and assuming lower-cost specialists handled the cases, there was a real opportunity to increase case deflection, improve self-service rates, and lower costs.

Retrospective

Our initial proof of concept of two articles did not perform significantly better than before, despite the earlier success of form-based troubleshooting flows. Meanwhile, support agents continued to author more of those flows and reported high customer satisfaction. We hypothesized that we may not have chosen ideal pilot articles, or may not have found the right balance between the initial and secondary information revealed to customers troubleshooting those particular problems.

Initial usability testing on the wireframes received positive feedback from participants, but UX research emphasized the difficulty of accurately gauging success when it is impossible to replicate, or even approximate, the real-world conditions in which customers encounter the tool: in this case, likely frustrated and short on patience after what may have been hours of troubleshooting an issue with complex design software. A/B testing several candidate designs might also have given us the insight we needed before implementation. However, due to limited resources and conflicting team priorities at the start of the pandemic, product management de-prioritized further iterations, as well as the migration of additional complex solution articles to the guided troubleshooting format.

In Q1 2022, we were able to revisit the guided troubleshooting format, migrating several more of the Qualtrics “complex solution” flows that received positive customer feedback. The updated designs were implemented in the AutoCAD in-product help container as well as on the Autodesk website. As of late June 2022, customer feedback and helpfulness scores are being monitored, pending analysis.
