Walmart Tech
B2B Marketplace Dashboard

Turning overwhelming performance data into clarity

My Role: UX Lead
Scope: End-to-end redesign · 3 phases
Partners: Eng · PM · Research · Data Science · Content
Tools: Figma
TL;DR — The outcome
Phase 3 dashboard
2800%
CTR increase
More sellers engaging with their performance metrics after gaining clearer visibility
73%
Projected OTD improvement
Expected on-time delivery uplift across the expanded seller base
60%
Sellers previously unevaluated
Phase 2's extended lookback targeted the ~60% of sellers who had fallen outside performance tracking
3
Phases shipped
Each phase building on the last

Walmart’s most-used seller tool was failing its users — metrics were hard to read, urgency was unclear, and most sellers had little awareness of where they stood. Across 3 phases, I redesigned the dashboard to surface clearer data, benchmark standards, and urgency signals. Better awareness led to measurable engagement and performance uplift across the board.

The problem

Sellers had no clear picture of where they stood.

The Seller Performance Dashboard was Walmart Seller Center’s most frequently visited page — yet it was failing its users. Metrics were raw numbers with no context, no urgency indicators, and no benchmark to measure against. High-volume sellers missed optimisation opportunities. New sellers felt overwhelmed and disengaged.

How might we redesign the Seller Performance Dashboard so sellers can instantly understand their standing and know where they need to improve — without a steep learning curve?

Existing Dashboard Design

Information-dense layout, confusing visuals, too many CTAs

Existing dashboard

Who was affected

  • High-volume domestic sellers — The dashboard didn’t support them in quickly synthesising performance data, leading to missed optimisation opportunities
  • New sellers — Found the dashboard overwhelming and hard to navigate, which hindered their ability to improve and grow on Walmart’s platform
  • All existing and future sellers — Secondary beneficiaries of any improvement to dashboard functionality

Problems to solve

  • Usability — Complex and non-intuitive; sellers had to already understand the metrics to make sense of the display
  • Data interpretation — Confusing visual treatment meant sellers couldn’t tell at a glance whether their performance was good, at risk, or critical
  • Engagement drop-off — Sellers who didn’t understand the dashboard couldn't take necessary actions, hurting their own performance
  • Channel inconsistency — A parallel email comms redesign was happening in isolation; I partnered to align the two experiences

Research & discovery

Data told us what to prioritize. Patterns told us how to present it.

A key finding shaped our entire approach: 6 out of 7 metrics most frequently used by sellers already lived on the Performance Dashboard. The problem wasn’t missing data — it was that sellers couldn’t read what was already there. The solution was to make existing data legible, contextualized, and awareness-building. Based on this, we decided to surface 8 metric tiles — focused and curated around what sellers actually use.

Research findings

6 of the 7 most-used metrics lived on the Performance Dashboard

Existing patterns audit

Before designing new components, I audited existing UI patterns across Seller Center. A familiar pattern reduces cognitive load — sellers already know how to read it, so they can focus on the data rather than the interface.

Existing Pattern 1

Seller Center UI patterns referenced for familiarity

Existing Pattern 2
Design process

Exploring, shortlisting, and stress-testing the metric tile

The metric tile was the atomic unit of the dashboard — getting it right was everything. I explored a wide solution space before converging on three proposals to evaluate with stakeholders.

Explorations

A glimpse into the exploration space — many directions considered before narrowing

Three shortlisted proposals

Proposal 1

Proposal 2

Proposal 3

Comparison & stakeholder alignment

Each proposal addressed the shortcomings of the original tile design — removing the confusing data density, reducing the number of CTAs, and simplifying the visual hierarchy. After socialising with product and business partners, we converged on the approach that surfaced the benchmark standard most clearly, and reinstated an urgency tag system (refined with content designers) to signal how quickly each metric needed attention.

Existing tile vs initial proposal

Existing tile design vs initial proposal

Final tile

Final tile — benchmark standard + urgency tags

Final design
Phase 1

Redesigned dashboard — simplified tiles, clear urgency, focused CTAs

The new dashboard introduced simplified metric tiles with performance benchmarks, urgency tags, and a single focused call-to-action per metric. We also recommended — and implemented — moving “Performance” to Level 1 in the Seller Center navigation, out from under “Analytics” where most sellers couldn’t find it.

Final dashboard

Phase 1 final dashboard — simplified metric tiles, benchmark standards, urgency tags, focused CTAs

Navigation update

Performance was the most-visited area of Seller Center yet buried under “Analytics.” We recommended and implemented elevating it to Level 1 for direct access.

Navigation update

Performance elevated to Level 1 navigation

Comms design — aligning email with the dashboard

Partnering with the email comms designer, we extended the new tile design into seller emails — ensuring consistent performance data presentation across every touchpoint.

Old Template

New Template

Phase 1 results

A clear jump in seller activity from July 10, 2024. Some data is redacted due to its sensitive nature.

Performance overview

Clear activity jump from launch date — July 10, 2024

OTD

On-time delivery CTR

VTR

Valid tracking rate CTR

Seller response

Seller response rate CTR

Cancellations

Cancellations CTR

Refunds

Refunds CTR

Ratings

Ratings & reviews CTR

Phase 2

Extended lookback — bringing 60% more sellers into the picture

Around 60% of marketplace sellers weren’t being evaluated at all — they didn’t hit the 50-order threshold within the 14-day lookback window. We extended the lookback for OTD, VTR, and SSR from 14 to 30 days, and updated the dashboard and warning communications to match.

Phase 2 dashboard

Phase 2 — 30-day standard applied across all metrics

17%
More sellers evaluated
Increase in sellers evaluated for performance after extending the lookback window
31%
More warning recipients
More sellers received performance non-adherence warnings, building awareness of their standing
73%
Expected OTD improvement
Projected on-time delivery improvement across the expanded seller base
Phase 3

SSS metrics — protecting sellers from penalties they didn’t earn

Sellers in the Simplified Shipment Settings (SSS) program were being penalised for delivery issues caused by Walmart or the carrier, not by the sellers themselves. We upgraded the dashboard to surface clearer shipping performance data and introduced a protection system that shields sellers from unfair penalties.

Phase 3 dashboard

Phase 3 — new sections and SSS-specific metrics introduced

86%
SSS sellers protected
Of SSS sellers received On-Time Delivery protections after the system launched
73%
Orders granted OTD protection
Of orders were granted OTD protection, shielding sellers from carrier and Walmart-caused delays
5%
Seller OTD improved
Of sellers saw an OTD improvement after receiving protection

Other contributions

I also designed the “View Details” pages for each SSS metric — giving sellers a deeper look at the specific factors driving their scores, so they could understand exactly where and why performance was flagged.

Carrier method accuracy details
On-time shipment details
Ship-from location accuracy details

Carrier method accuracy details

I also designed the Regional Performance views — a map-based breakdown showing sellers how their OTD performance varied across different states, color-coded by severity.

Regional performance critical
Regional performance at risk
Regional performance good standing

Regional performance — critical state

Reflection

What I took from this project

💡
What I learned
  • Keeping the business team involved throughout — not just at review checkpoints — leads to better-informed decisions, especially in B2B where data drives everything
  • Products built on data-backed hypotheses have far greater potential for measurable success than those built on assumptions
  • Proactively partnering across workstreams (like the comms redesign) creates multiplied impact beyond the immediate scope
🔭
What I’d do differently
  • Test a few design variants before shipping — particularly around the tile layouts that I explored — to validate with real sellers
  • Collect qualitative data on the new design to understand not just whether sellers click, but whether they feel more confident about their performance