User Experience Design / UX Research / Service Design

Gates Foundation: Data Exchange user research studies

Project description

Gates Data Exchange (GDx) was launched in 2019 as an initiative to establish a centralized repository for data collected across the foundation by program teams and divisions. GDx was co-designed with internal data scientist users and built on CKAN, a third-party platform with out-of-the-box features.

By summer 2020, GDx was well out of its beta phase. The team knew the platform needed improving, but we didn’t have data telling us which parts of the platform to focus on. I designed and conducted three studies to collect that user data and to evaluate the site against established user experience heuristics.

Summary

My role: UX Researcher
Skills:
Research study design, participant interviewing, data analysis, report writing and presenting
Dates: 3 studies conducted between Sept 2020 and March 2021

Outcome

The time and care taken to evaluate GDx and listen to its users gave us deep insights into the direction it needs to go to fulfill its potential. These three studies helped shape the GDx roadmap and identified improvement opportunities that set a course for improved access, availability, and use of the foundation's key asset: the valuable data generated by its programs and partners around the world.

Our team quickly prioritized improvements based on these studies, and I pivoted to UX mode and have been rapidly designing those solutions.

Prioritized enhancement recommendations

  • Help new users successfully navigate GDx by simplifying the dataset creation experience and improving contextual support content

  • Communicate users’ status to them so the actions they can take are clear

  • Allow contributor users to both Search for and Contribute datasets from the homepage

  • Give consumers and new users a quick “how to add files” guide if they’re interested in contributing

  • Bring primary actions higher up in workflows, such as the “add a resource” action

Goals of the studies

  • Note what other best-in-class data repositories are doing to collect and display data, and what we can learn from their practices

  • Identify areas of the platform where user needs are not being met or are failing heuristic benchmarks

  • Observe the current experience of users on GDx and understand where pain points occur

  • Make GDx as self-serve as possible and reduce friction in users’ main activities on the platform

  • Gather data to identify a clear roadmap direction and quick-fix feature improvements

 

The Three Studies

Array of data repository homepage screens

1. Competitive Analysis

By conducting a competitive analysis, the GDx HCD team gets a baseline for the current data exchange landscape. We can better understand the strengths and weaknesses of similar platforms, gain insights, and make recommendations on how to better meet the needs of internal and select external GDx users.

  • What are other best-in-class data repositories doing to collect and display data?

  • What takeaways might we be able to apply to GDx?

Slide from audit results deck


2. Heuristic Audit

A heuristic audit of GDx lets us score the platform's current usability. From the evaluation, we can better understand the strengths and weaknesses of the platform and identify priority areas for attention. Additionally, the methodology established in this audit gives us a way to evaluate future iterations of GDx using the same rubric.

  • How does GDx score against established characteristics of easy-to-use sites?

  • Collect examples of site inconsistencies that contribute to user confusion or aesthetic disharmony



3. Baseline usability study

In a baseline usability study, we observe and interview users as they move through specific tasks on today's GDx. Observations show us where pain points occur, identify areas where user needs are not being met, and in many cases, reinforce findings from the prior two studies.

  • How are GDx users navigating our site today to do their tasks?

  • Where are they running into difficulty?

Methodology

I created a repeatable methodology for these three studies using a task success metric. For the first two studies, I assessed ease of completion for five main user tasks, using a semantic differential scale to measure success.

For the usability study, my product manager and I interviewed internal users, both observing them as they worked and asking them to report their own impressions right after each task.

Graphic I created to illustrate the task success color coding metric


 

Five tasks evaluated for each study

Task 1: Access site and learn more
A. Homepage
B. Navigation
C. My profile information
D. Help & support

Task 2: Find a dataset
A. Data you own or know about
B. Data you seek
C. Search
D. Browse

Task 3: Engage with a dataset
A. Explore a dataset detail page
B. Find the information you’re looking for

Task 4: Create a dataset
A. Enter metadata
B. Upload or link to data files

Task 5: Manage a dataset
A. Edit/delete
B. Add a new resource

 

GDx user profiles

All studies followed the typical tasks of users who either want to find specific data or want to share specific data on the repository for others to find. Secondary users included those who would want to transform or analyze the data further, possibly posting new datasets as a result.


 

Competitive analysis results

I assessed external sites using our task method, keeping GDx's unique needs in mind while gleaning findings on best practices for making data easy to find and relatively easy to contribute (given the sharing and licensing challenges of much of the data out there). Some noteworthy data-sharing sites reviewed:

  • Humdata.org

  • Harvard Dataverse

  • GitHub

Key takeaways from external sites

  • Exemplar sites reviewed are optimized for data discovery – robust search and browse, showcasing data, and putting data into real-world context for users

  • Datasets are communicated as living documents – keeping data up to date is an encouraged norm (and socially validated in the best cases)

  • Help content is available globally – also supplemented by content written in clear, friendly language, with visual examples

  • Through strategic content and design, successful data exchanges help their users navigate their sites successfully

A snippet of notes taken on Humanitarian Data Exchange


 

Heuristic audit results

Summary of collected findings for a specific task during the heuristic audit


Next, I evaluated GDx using the same rubric, while also capturing a physical map of the site and the steps for each task. Taking our known business objectives and our users' motives, needs, and tasks, I evaluated GDx against the Nielsen Norman Group's characteristics of easy-to-use user interfaces.

Some takeaways from the GDx site usability audit:

  • GDx and its homepage are not optimized for primary user flows (find and add data)

  • Search, Add, and Edit actions appear inconsistently across site features (labels, placement in the context of the action)

  • Lost opportunities to increase engagement and findability and to gain user trust (long forms with little contextual help for filling them out, missing search functionality)

 

Baseline usability test results

The last study was the most involved. Our team recruited a mix of internal users from across program areas for the final baseline usability study. We deliberately sought users who would only browse for data on the site, as well as users who had not yet added datasets to the site themselves but might do so in the future (and had edit permissions). We conducted nine hour-long interviews over video and transcribed them in Dovetail.

What’s working
The power users who regularly upload to GDx do so with ease. Many of our early adopters helped shape what GDx is today; it’s designed for them to put data there. Where it is in use, the site is a reliable and stable place to store and keep data updated.

User quotes
 
Baseline usability task completion chart


What could stand improvement
Unfortunately, some of the interactions on GDx are making people feel dumb. Smart, time-crunched people have to think harder than they should, and it’s frustrating to them. In addition, there is a lack of contextual help, and dense terminology that not every user understands or has time to decipher.