Prioritizing business KPIs to shape the information architecture of a customer account performance dashboard.

📌 Project Scope:

  • Timeline: 7 weeks (Sept ’24 - Nov ’24)

  • Role: UX Researcher

  • UXR Methods: Card Sorting

  • Tools: Optimal Workshop, Lucidspark, MS Office, MS Teams

  • Stakeholders: Product Owners / Asset Digitization

What prompted the research intervention?

Based on the insights gathered through my previous research initiatives, it was established that Brambles Customer Account Managers are facing the following challenges:

  1. Difficulty in tracking account performance

  2. Lack of alignment with source systems

  3. Lack of visibility into customer behavior

If every digitized asset can tell a story, and we can turn those stories into insights…

“How might we leverage Smart Asset data to create valuable business and customer insights for better performance monitoring?”

Project Overview

🚀 Business Goals & Objectives:

To reshape the Asset Tracking and Behavior Tracking application into a unified view of account performance that delivers value through Smart Asset data insights, illuminates the darkest parts of the supply chain, helps identify inefficiencies and missed opportunities, and enables data-driven decisions.

🔎 My Role & Approach:

I conducted a card sorting study to understand how users across business regions, teams, and contexts perceive and use business KPIs to monitor customer performance at three levels — Portfolio, Account, and Location — and to gather insights that could shape the initial design concepts.

🔐 Project Outcome:

The identified KPIs, along with additional insights drawn from the combined mental models of users across business regions, teams, and experience levels, were used to shape the customer account dashboard and create intuitive user flows across the Account Performance solution within the app.

Research Questions:

KPI usage, interpretation, and translation

  • Which KPIs are most relevant to each user group within each of the 6 tasks? (e.g. Cycle Time, FTR, Leakage etc.)

  • How does each user group prioritize relevant KPIs at an account level vs. a location level? How do they differ?

  • Which KPIs does the user monitor over a period of time? What is that period of time? (historic trends and patterns)

  • What does the potential difference in KPI relevance tell us?

  • Does the user compare, contrast or correlate a KPI with other KPIs? When and how?

  • Does the user extrapolate the KPI data for a given context? When, why, and how?

  • What actions or insights does the change in a KPI value suggest or prompt? 

  • Hypothesis to test: The KPIs used to review Account Portfolio Performance vs. Individual Account Performance are different.

Why is this important? 

Providing relevant KPIs upfront will improve usability and motivate users to adopt and engage with the Smart Asset data insights. Answering these questions will help us define and understand what content and KPIs to prioritize and surface for each target user group for every task within the ‘new PP app’. Using the answers, we can design the information hierarchy and navigation systems (including filters and data-processing workspaces) more confidently and accurately.

Research Methodology:

💡 Card sorting: helps reveal how users perceive and organize information, which is crucial for creating an intuitive navigation and information architecture based on the combined mental models of the users.

Study Setup:

50 business KPIs were listed in the form of cards, along with 6 pre-defined categories framed as ‘questions’ (based on past evidence collected on key tasks), and arranged in the workspace. Since KPI usage and interpretation can be fairly nuanced and varied across users, this approach helped reveal how users interpreted each KPI’s value and used it across their workflows — allowing me to dig deeper into the ‘why’ behind their sorting choices.

⚠️ Challenge: Participants might use the same KPI to monitor performance at both a Portfolio level and an Account level. The card sorting method generally does not accommodate this.

In the interest of time, I resorted to a makeshift solution: providing multiple copies of each KPI card, so that participants could sort the same KPI into different categories. Since this was a moderated study, I was able to guide participants as needed.

  • Tool: OptimalSort

  • Features enabled: Ranking of cards (within each category); Tooltips (for KPI definitions)

  • Type: Hybrid (participants can create new categories)

  • Study: Remote, moderated

  • Duration: 60 mins. (incl. screen-sharing)
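Card-sort results like these are commonly summarized with a similarity (co-occurrence) matrix: the percentage of participants who placed each pair of cards in the same category. OptimalSort produces this analysis itself; the sketch below is only a minimal illustration of the idea, using invented KPI names and placements, not the study’s actual data.

```python
# Minimal sketch of a card-sort similarity matrix.
# The participant data below is invented for illustration only.
from itertools import combinations
from collections import defaultdict

# One dict per participant: {category: [cards placed in it]}
sorts = [
    {"Q1": ["Cycle Time", "FTR"], "Q2": ["Leakage"]},
    {"Q1": ["Cycle Time"], "Q2": ["FTR", "Leakage"]},
    {"Q1": ["Cycle Time", "FTR", "Leakage"]},
]

def similarity_matrix(sorts):
    """Percent of participants who grouped each pair of cards together."""
    counts = defaultdict(int)
    for sort in sorts:
        for cards in sort.values():
            # Count each pair of cards that share a category.
            for a, b in combinations(sorted(cards), 2):
                counts[(a, b)] += 1
    n = len(sorts)
    return {pair: round(100 * c / n) for pair, c in counts.items()}

matrix = similarity_matrix(sorts)
print(matrix[("Cycle Time", "FTR")])  # 67: two of three participants grouped them
```

Note that because this study allowed duplicate cards, a real analysis would need a rule for pairs that co-occur in more than one category per participant; the sketch above simply counts per-category co-occurrence.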

Participant Overview:

14 participants

4 user groups

NA (USA/CA), LATAM, EU (NE/SE/CEE)


📂 Card Sorting — Summary of Findings

Stay tuned for more…

Previous

Mapped complex business workflows to discover user needs, behaviors & pain-points

Next

Validated design direction and gathered user feedback early in the process for making refinements