Giving users access to more information

I steered a complex task of adding more data to a tool already overflowing with information by researching actual user needs, challenging the feature request, and iterating on the design through usability testing.

Background

Metabase is an internal, data- and information-heavy tool for highly specialized users. It's a mix of a medical spreadsheet and Wikipedia on steroids. As a complex product serving many user groups, it consists of modules that could be separate products in themselves. Simulation is one such module: a testing environment used by the Medical Team to reproduce medical cases, improve the quality of the medical content and the accuracy of the diagnosis.

Unfortunately, doctors were often left to guesswork, as not all of the data from the engine was visible in the tool's UI. The Medical Team didn't know why a certain question was asked or why the engine proposed a specific triage. Some super-users could access additional information, but it required workarounds and a moderate understanding of code.

Responsibilities

  • user research (surveys)

  • usability testing

  • UX and UI design

Results

  • clickable prototypes

  • 3.5 to 4.8 increase in ease-of-use rating

  • test summary with actionable insights

Challenges

I was tasked with adding information from the engine to the tool's UI. The messages I was asked to surface in the UI were taken straight out of the code: full of brackets, quotation marks, ID numbers and other code elements that made them unreadable to our user group. My first challenge was to rephrase the request as a problem statement and convince the PO and developers to invest more time in the message itself.

How might we provide users with all the information they need without overwhelming them with its amount and complexity?

Like the rest of Metabase, many features were added to Simulation over the years without design input, which caused problems with UX and information hierarchy. Although Simulation could probably use a complete redesign, I had to scope the design work realistically, taking into account time, technical constraints, resources and business priorities. While focusing on exposing valuable information to users, I also made minimal viable changes wherever I could to fix existing UX issues.

Process

First, I had to understand the actual needs of the users. Simply surfacing all the information we could get from the engine was not the answer: information had to be contextual and meaningful. At the same time I had to take into account technical constraints and the format of the data. I consulted developers on the message format to see what could be done to make the messages easier to understand. We agreed on replacing abstract IDs with linked names and on formatting details like bulleted lists and bolded text.

Research

With the technical limitations in mind, I prepared three example messages:

  1. Version A – low complexity

  2. Version B – high complexity

  3. Version C – combination

I prepared a survey to check how granular the data had to be to ensure ease of understanding without oversimplifying. In the survey, 16 users were presented with a message version and asked to mark their agreement with the following statements on a 5-point Likert scale. The same questions were repeated for each version.

  • This information is easy to understand

  • This information is useful to me

  • This information is sufficient

  • This amount of information is overwhelming
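Responses like these are typically summarised as a mean rating per statement and version. A minimal sketch in Python of that aggregation, with entirely made-up ratings (the values below are illustrative, not the actual survey data):

```python
# Hypothetical Likert aggregation: ratings are invented for illustration;
# the real survey had 16 respondents answering per message version.
from statistics import mean

# responses[version][statement] -> list of 1-5 Likert ratings
responses = {
    "A": {"easy to understand": [5, 4, 5, 4], "sufficient": [2, 3, 2, 3]},
    "C": {"easy to understand": [5, 4, 4, 5], "sufficient": [5, 4, 5, 5]},
}

for version, statements in responses.items():
    for statement, ratings in statements.items():
        print(f"Version {version} | {statement}: mean {mean(ratings):.2f}")
```

Comparing the per-statement means across versions is what makes trade-offs visible, such as a version being easy to understand but rated insufficient.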

Learnings

Version B (high complexity) turned out to be the least easy to understand, so much so that I could not introduce it in good conscience. Happily, version A (low complexity) was rated as easy to understand as version C, and all three versions were rated similarly useful. Interestingly, most users were not satisfied with the amount of information in version A; only version C received a unanimously positive rating for sufficiency.

Thanks to the results I knew which direction to take, so I could explore design options, prepare mockups and run task-based user testing to validate the solution. During testing I learned about issues with element visibility, copy comprehension and expected behaviour, which let me iterate on the designs with confidence.

Results

We measured user satisfaction with a simple four-question form:

  • Simulation has everything that I need

  • Simulation works fast

  • Simulation is intuitive

  • Simulation is easy to use

We surveyed the Medical Team before the changes and two weeks after implementation. The changes raised the perceived ease of use from 3.5 to 4.8 points. We also learned a lot from attentively researching user needs and testing: for example, we could prioritise the implementation of changes based on what users viewed as most valuable to their workflow, making the most impact from the start.