Future-Proofing Georgia Tech’s Digital Front Door
Team
2 UX Researchers, 1 Content Strategist, 1 PM
Duration
18 months [Jan'24 - Jun'25]
Methodology
Heuristic Evaluation, Clarity Heat Maps, Behavioral Analytics, Info Architecture, User Interviews, Focus Groups
(TLDR)
63% Faster Paths, 50+ Sites Unified
Roadmapping Georgia Tech’s digital future
Overview
Georgia Tech’s digital presence had become a sprawling, decentralized ecosystem of 50+ websites. Inconsistent navigation and duplicated content left students frustrated and doubled task completion time, leading to increased support tickets and potential enrollment losses.
My Role
As part of the UX Research team, I led a multi-method research initiative to uncover hidden usability failures and quantify inefficiencies.
Final Outcome
My research directly informed an evidence-based homepage-first redesign and led to a unified IA and CMS roadmap.
Key Impact
Reduced average time-on-task significantly for prospective students.
Improved findability of critical actions like “Apply” and “Schedule a Visit.”
Sparked a campus-wide digital transformation that is now aligning 50+ sites under a unified system.
(Context and Challenge)
15-20 clicks for tasks that should take 3
Expectation
When prospective students land on a university website, they expect clarity: explore majors, review program details and costs, and apply, all in three smooth clicks.

Reality
At Georgia Tech, what should have been a clear funnel instead became a patchwork of disconnected journeys, forcing students to pogo-stick between departmental sites, wade through deep menus, and land in dead ends.

A Fractured First Impression
Instead of building excitement, the experience drained confidence and undermined trust at the critical decision point of choosing and applying to college.
Design Statement
How might we unify fragmented sites into one streamlined journey for students applying to Georgia Tech?
(Our Approach)
Triangulating Quantitative Research, User Behavior, and Stakeholder Pain Points
GA4 and Microsoft Clarity Quantified Inefficient Paths
I audited 6 flagship sites using behavioral analytics to track click paths, scroll depth, and pogo-sticking (a detection sketch follows this list).
Scroll maps revealed key CTAs like “Apply” and “Schedule a Visit” buried 3+ folds deep.
Exposed mobile drop-off points where primary actions disappeared from view.
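For illustration, here is a minimal Python sketch of how pogo-sticking (quick A → B → A bounces) can be flagged in an exported clickstream. The file name and the `session_id`, `timestamp`, and `page_path` columns are hypothetical stand-ins for whatever a GA4 or Clarity export actually provides.

```python
# Sketch: flag pogo-sticking (A -> B -> A page bounces) in a clickstream
# export. Column names are hypothetical; adapt to your GA4/Clarity export.
import pandas as pd

df = pd.read_csv("clickstream_export.csv")
df = df.sort_values(["session_id", "timestamp"])

def count_pogo_sticks(pages: list[str]) -> int:
    """Count A -> B -> A patterns: a visit-and-immediate-return between two pages."""
    return sum(
        1
        for a, b, c in zip(pages, pages[1:], pages[2:])
        if a == c and a != b
    )

pogo = df.groupby("session_id")["page_path"].apply(
    lambda s: count_pogo_sticks(s.tolist())
)
print(f"Sessions with at least one pogo-stick: {(pogo > 0).mean():.0%}")
```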

Heuristic Evaluation Flagged 38 Usability Violations
Conducted a heuristic evaluation against Nielsen’s 10 Usability Heuristics, scoring every violation for severity (a tally sketch follows below).
The heaviest violations clustered under “Recognition Rather than Recall” (buried CTAs, unclear labels) and “Consistency and Standards” (inconsistent navigation, mismatched button styles).
Despite being the main entry point, the homepage scored lowest overall.
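Nielsen’s standard severity scale runs 0 (not a problem) to 4 (usability catastrophe). The sketch below shows one way findings like ours can be tallied into severity-weighted, per-heuristic scores; the example findings are placeholders, not the audit’s actual data.

```python
# Tally heuristic violations weighted by Nielsen's 0-4 severity scale.
# The findings below are illustrative placeholders.
from collections import defaultdict

findings = [
    ("Recognition rather than recall", 4),  # e.g., "Apply" CTA buried below the fold
    ("Consistency and standards", 3),       # e.g., nav labels differ across sites
    ("Recognition rather than recall", 3),  # e.g., unclear menu wording
]

totals = defaultdict(int)
for heuristic, severity in findings:
    totals[heuristic] += severity

# Rank heuristics by their severity-weighted violation score.
for heuristic, score in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{heuristic}: weighted score {score}")
```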

Usability Testing Revealed the Gap Between Perception and Performance
Ran 11 task-based usability tests with undergraduate and graduate students, tracking completion, route type, and satisfaction.
SUS scores (average = 67) suggested usability was “acceptable,” masking hidden inefficiencies (a scoring sketch follows this list).
Lostness scores ranged from 0.42 to 0.63, well into high-difficulty territory.
Grad testers looped through 3–4 sites before finding tuition info.
UG students failed to find “Apply” via the homepage in 5 of 6 sessions.
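For readers unfamiliar with SUS arithmetic: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score, where roughly 68 is the commonly cited average. A minimal scorer with made-up responses:

```python
# Standard SUS scoring. The sample responses are illustrative,
# not data from the 11 test sessions.
def sus_score(responses: list[int]) -> float:
    """responses: ten 1-5 Likert answers, in questionnaire order."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (an odd item)
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```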

Lostness Mapping & Critical Path Analysis Made Detours Visible
Mapped actual click sequences vs. ideal flows for 5 high-value tasks.
Median lostness exceeded 0.4 across all major conversion flows.
Tuition, degree details, and housing were the worst offenders, often requiring users to pass through 3+ irrelevant pages.
Extra clicks added an average of 90–120 seconds of task time (the lostness measure behind these numbers is sketched below).
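The lostness figures above follow Smith’s (1996) measure, which compares the path a user actually took against the shortest possible one. A worked sketch with hypothetical path counts:

```python
# Smith's (1996) lostness: L = sqrt((N/S - 1)^2 + (R/N - 1)^2), where
# S = total pages visited, N = unique pages visited, and R = minimum
# pages required by the ideal path. 0 is a perfect path; values above
# roughly 0.4 are commonly read as "lost".
from math import sqrt

def lostness(total_visited: int, unique_visited: int, minimum_required: int) -> float:
    s, n, r = total_visited, unique_visited, minimum_required
    return sqrt((n / s - 1) ** 2 + (r / n - 1) ** 2)

# Hypothetical tuition task: 3-page ideal route, 9 pages actually
# visited (7 of them unique).
print(round(lostness(total_visited=9, unique_visited=7, minimum_required=3), 2))  # -> 0.61
```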


Usability Test Example 2A — Locating Undergraduate Tuition & Costs: Ideal Path (Left) vs. Actual User Paths (Right)
Focus Groups Exposed CMS Bottlenecks
Held multiple sessions with editors and department leads (n = 43) across Mercury, CampusPress, and Drupal.
Found no GT-specific documentation or onboarding, creating a “figure-it-out-yourself” culture.
Editors lost work due to lack of staging environments.
Stakeholder priorities revealed a gap between marketing-led homepage design and transactional user needs.

IA Testing Validated New Structures
Ran open/closed card sorts and tree tests with target audiences to validate menu re-labels and groupings.
Flagged vague category names (e.g., “Academic Environment”) for replacement with direct, user-friendly terms.
Improved findability scores for key tasks and cut average nav depth by 1–2 levels (a scoring sketch follows below).
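A sketch of how tree-test sessions can be rolled up into findability scores; “direct” here means the participant reached the correct node without backtracking. The records are illustrative, not the study’s data.

```python
# Roll tree-test outcomes up into success and directness rates.
# Records are illustrative placeholders.
results = [
    {"task": "Find tuition", "success": True,  "direct": True},
    {"task": "Find tuition", "success": True,  "direct": False},
    {"task": "Find tuition", "success": False, "direct": False},
]

success_rate = sum(r["success"] for r in results) / len(results)
direct_rate = sum(r["direct"] for r in results) / len(results)
print(f"Success: {success_rate:.0%}, direct success: {direct_rate:.0%}")
```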


Tree Test (Left), Card Sorting (Right)
Competitive Benchmarking Informed Navigation & Content Redesign
Conducted two parallel reviews:
A. Homepage, majors, and degree-specific content flows.
B. Top navigation & footer patterns at peer institutions.
Found that peer institutions favored clear, plain-language labels over marketing jargon.
