What I Worked On

Data visualization · Interaction design · Dashboard UX/UI · Metric definition · Data mapping · Filtering logic · KPI framework · Cross-platform analytics

U.S.-Based Digital Marketing Agency

A U.S.-based digital marketing and communications agency specializing in data-driven growth, performance marketing, and strategic brand development for B2B and enterprise clients.

The Problem

The existing dashboard experience did not meet brand standards or match the quality of comparable products in the market. Inconsistent UI and color usage resulted in a fragmented experience and weakened the brand's product identity.

  • Inconsistent dashboard designs and brand colors created a fragmented client experience
  • Lack of a unified design system weakened the brand’s overall product identity
  • Poor visual hierarchy made key metrics hard to scan and understand
  • Data visualizations were unclear, increasing confusion for clients
  • Limited filtering made it difficult to explore and compare performance data
  • Inconsistent structures made dashboards hard to maintain and scale for the team

The core question we needed to answer:

Does the dashboard help users understand their data effortlessly, or does it only look visually appealing?

The Goal

The goal was to revamp the existing dashboard experience to:

  • Establish a unified and consistent dashboard design system across all clients
  • Improve data visualization to make key metrics easy to scan and understand
  • Create a clear visual hierarchy that highlights the most important KPIs
  • Enable flexible filtering so users can explore and compare performance data easily
  • Support scalability with reusable components for faster setup and maintenance
  • Deliver a dashboard experience that aligns with brand standards and market expectations

Design Process

The process began by understanding user needs and key metrics, followed by a UX and data audit to uncover usability gaps. Insights were used to design a scalable dashboard structure and clear data visualizations, which were refined through ongoing validation and iteration.


Stakeholder & KPI Alignment

What I Did

Aligned internal teams and clients on success metrics before designing the dashboard. KPIs were defined and prioritized based on business goals, reporting needs, and decision-making workflows.
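To show what "defined and prioritized" KPIs can look like as a working artifact, here is a minimal Python sketch of a KPI registry. The metric names, formulas, and priority levels are illustrative placeholders, not the agency's actual definitions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KPI:
    name: str      # metric label shown on the dashboard
    formula: str   # human-readable definition agreed with stakeholders
    decision: str  # the decision this metric is meant to drive
    priority: int  # 1 = executive overview, 2+ = drill-down levels

# Hypothetical registry; real entries would come from stakeholder alignment.
KPI_REGISTRY = [
    KPI("ROAS", "revenue / ad_spend", "reallocate budget across channels", 1),
    KPI("Conversions", "count of goal completions", "judge campaign health", 1),
    KPI("CTR", "clicks / impressions", "diagnose creative fatigue", 2),
]

def kpis_for_level(level: int) -> list[str]:
    """Return the KPI names surfaced at a given dashboard level."""
    return [k.name for k in KPI_REGISTRY if k.priority == level]
```

Capturing KPIs as data rather than leaving them implicit in individual charts is one way to keep every dashboard tied to an agreed definition and an intended decision.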

What It Solved

Eliminated reporting misalignment between internal teams and clients.
Shifted focus from vanity metrics to actionable KPIs.
Built a dashboard designed for real decisions, not just visibility.

Data & User Research

Research into existing reports and client workflows ensured the dashboard reflected real reporting needs, helping prioritize the metrics that mattered most to both clients and internal teams.

UX & Data Audit

Key Audit Findings

  • Inconsistent data visualization patterns across dashboards
  • Weak visual hierarchy made critical metrics hard to scan
  • Charts prioritized volume over insight, increasing cognitive load
  • Similar metrics appeared in multiple places with different contexts
  • Limited filtering and comparison reduced exploratory analysis
  • Dashboards reported data but did not guide decisions

  • Mapped data sources and logic
  • Audited hierarchy, scannability, and load
  • Evaluated cognitive load across views
  • Improved clarity and decision flow
  • Benchmarked best-in-class dashboards
  • Designed insight-ready experiences

Dashboard Architecture & Decision Flow

01 GLOBAL NAVIGATION

02 MENTAL MODEL

Dashboard Architecture – Mental Model

03 PROGRESSIVE DISCLOSURE

Progressive Disclosure Flow

LEVEL 1 — EXECUTIVE OVERVIEW

Executive Overview (GA4 Summary)

KPI Cards
Trend Chart
Channel Mix
Geography Map

DRILL-DOWNS

KPI > Channel
Trend > Paid / Organic
Geography > Location

LEVEL 2 — CHANNEL PERFORMANCE

Paid Media Overview

KPI Summary
Platform Performance
Campaign Table
Creative Performance

DRILL-DOWNS

Platform > Campaign
Campaign > Ad Group
Creative > Asset

LEVEL 2 — CHANNEL PERFORMANCE

Display Overview

KPI Summary
Trends
Top Campaigns
Creative Performance

DRILL-DOWNS

Platform > Campaign
Campaign > Ad Group
Creative > Asset

LEVEL 3 — OPTIMIZATION & CONTROL

Account Pacing

Budget vs Spend
Remaining vs Burn
Monthly Projection
Forecast Accuracy

DRILL-DOWNS

Platform > Account
Account > Daily Pacing
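To make the pacing views concrete, here is a minimal Python sketch of the linear projection behind Budget vs Spend, Remaining vs Burn, and Monthly Projection. This is standard pacing arithmetic, not the agency's actual model, and the numbers below are hypothetical.

```python
def pacing(budget: float, spend_to_date: float,
           days_elapsed: int, days_in_month: int) -> dict:
    """Project month-end spend from the average daily burn rate so far."""
    burn_rate = spend_to_date / days_elapsed   # average spend per day
    projected = burn_rate * days_in_month      # linear month-end projection
    return {
        "remaining": budget - spend_to_date,
        "burn_rate": burn_rate,
        "projected_spend": projected,
        "projected_pct_of_budget": projected / budget,
    }

# Example: $30k budget, $12k spent by day 10 of a 30-day month
# projects to $36k, i.e. 120% of budget — an overpacing signal.
status = pacing(budget=30_000, spend_to_date=12_000,
                days_elapsed=10, days_in_month=30)
```

Surfacing the projected percentage of budget, rather than raw spend alone, is what turns a pacing chart into a decision prompt.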

LEVEL 3 — OPTIMIZATION & CONTROL

KPI Pacing

KPI Summary
Trends
Top Campaigns
Creative Performance

DRILL-DOWNS

Platform > Account
Account > Daily Pacing

LEVEL 4 — DEEP DIVES

Landing Page Performance

Conversion Rate
Impressions
Cost Per Click

Keywords Heatmap

Click-through Rate
Quality Score
Position Trends

Video Performance

View-through Rate
Engagement
Completion Rate

04 END TO END FLOW

End-to-End Decision Flow
01 Executive Summary
02 Platform → Campaign → Creative
03 Channel Performance
04 Deep Dives
05 Pacing & Control
06 Decision & Optimization

Data Visualization Design

01| KPI Summary & Trends

UX Focus: Clear hierarchy and fast scannability

Top-level KPIs like Revenue, ROAS, Spend, and Conversions were surfaced first, supported by trend indicators that provide instant performance context before deeper analysis.

02| Performance Over Time & Comparison

UX Focus: Time context & comparison for decision-making

Time-series and comparative charts were used to show how performance changed over time and across channels, enabling quick identification of growth patterns & underperforming spend.
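The comparison logic behind those charts reduces to a simple period-over-period calculation. Here is a minimal Python sketch, assuming equal-length periods of a daily metric; the function name and shape are illustrative, not part of the actual build.

```python
def period_over_period(current: list[float], previous: list[float]) -> dict:
    """Compare two equal-length periods of a daily metric and
    classify the change as growth, decline, or flat."""
    cur, prev = sum(current), sum(previous)
    change = (cur - prev) / prev if prev else float("inf")
    return {
        "current": cur,
        "previous": prev,
        "pct_change": change,
        "trend": "up" if change > 0 else "down" if change < 0 else "flat",
    }

# Example: this period's daily conversions vs. the prior period's.
result = period_over_period([100, 120, 140], [100, 100, 100])
```

Pairing each time-series with its prior-period delta is what lets a viewer spot growth patterns or underperforming spend at a glance instead of reading two charts side by side.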

03| Funnel & Flow Visibility

UX Focus: Reduced cognitive load through clear progression

A simplified funnel connected clicks, visits, conversions, and revenue to reflect how users naturally understand performance flow and identify drop-offs.
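The drop-off logic behind that funnel is simple to state precisely. Here is a hedged Python sketch that computes step-to-step conversion rates for an ordered set of stages; stage names and counts are hypothetical.

```python
def funnel_dropoffs(stages: dict[str, int]) -> dict[str, float]:
    """Step-to-step conversion rates for an ordered funnel
    (e.g. clicks -> visits -> conversions)."""
    names = list(stages)  # insertion order defines the funnel order
    rates = {}
    for a, b in zip(names, names[1:]):
        rates[f"{a}->{b}"] = stages[b] / stages[a] if stages[a] else 0.0
    return rates

# Example with illustrative counts:
rates = funnel_dropoffs({"clicks": 1000, "visits": 800, "conversions": 40})
```

Showing the rate at each step, rather than only the stage totals, is what lets users locate the drop-off instead of inferring it.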

04| Precision, Filters & System Consistency

UX Focus: Insight-first design with scalable system thinking

Tables and flexible filters supported deeper validation, while consistent color logic and interaction patterns ensured clarity across dashboards and clients.
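The filtering behavior can be sketched as a small function: every active dimension filter narrows the data, and an unset filter means "all", mirroring a dashboard's default state. This is an illustration of the interaction model, not how Looker Studio implements filters internally.

```python
def apply_filters(rows: list[dict], **filters) -> list[dict]:
    """Keep rows matching every provided dimension filter.

    A filter value of None means 'show all' for that dimension,
    matching the default state of a dashboard filter control.
    """
    active = {k: v for k, v in filters.items() if v is not None}
    return [row for row in rows
            if all(row.get(k) == v for k, v in active.items())]

# Hypothetical rows; dimensions are illustrative.
rows = [
    {"channel": "paid", "geo": "US"},
    {"channel": "organic", "geo": "US"},
    {"channel": "paid", "geo": "CA"},
]
paid_us = apply_filters(rows, channel="paid", geo="US")
```

Treating filters as composable predicates is also what makes comparison easy: the same view rendered under two filter states is a valid side-by-side comparison.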

What Went Wrong

Where usability, scalability, and system constraints surfaced gaps

  • Limited Scalability for Growing Data

    Some visual patterns didn’t adapt well as data volume increased, leading to clutter and reduced readability at scale.

  • Tool Constraints in Looker Studio

    Certain design ideas were difficult to implement due to Looker Studio’s layout, interaction, and customization limitations.

  • Over-Designed Low-Value Metrics

    Some metrics were overemphasized when simpler, more scannable visuals would have worked better.

  • Iteration Exposed System Constraints

    Scalability and feasibility considerations became clearer as designs were tested against real data and platform limits.

Validation & Iteration

Identifying these gaps helped refine the approach, ensuring future iterations balance clarity, scalability, and platform constraints more effectively.

Collected feedback focused on insight clarity, ease of scanning, and decision speed without additional explanation.

Identified which metrics drove action versus friction, then reduced visual density and deprioritized secondary metrics.

Refined patterns to scale reliably across datasets and clients, aligning the dashboard with actual usage behavior rather than assumptions.

  • 30+ increase in conversions
  • 4× improvement in dashboard usability
  • 100% compatibility with Looker Studio

Iteration was guided by observed user behavior and platform constraints. Design decisions prioritized clarity and scalability over visual preference.

What I’d Do Differently

  • Validate scalability and Looker Studio constraints earlier with live data
  • Stress-test visual patterns against extreme data cases sooner
  • Prioritize decision clarity over visual variation from the first iteration

This project reinforced how I approach dashboard design. I start with clear intent, validate ideas through real usage, and build systems that scale with both data and decision making.