Expectation
Explore programs → Review details → Apply

Reality
At Georgia Tech, a clear funnel became a maze that turned simple tasks into treasure hunts, with extensive backtracking and abandonment.
The Problem
GT's homepage was the front door to a 2.7M-visitor
digital ecosystem, yet it was failing to guide students forward.
15+
Average clicks to apply
68%
Dropped off after 10% scroll
16s
Average session duration
Despite being rich in content, it was poor in discoverability. Multiple entry points led nowhere, creating confusion instead of confidence and eroding trust.
Redesigning Georgia Tech's Front Door
Team
Institute Communications
(2 UX Researchers, 1 Content Strategist, 1 PM)
Timeline
Jan 2024 - May 2025
(18 months)
Methodology
Heuristic Evaluation, Behavioral Analytics, User Interviews, Focus Groups
My Role & Approach
Triangulating end-to-end, mixed-methods research
As the UX Researcher, I aimed to uncover why students struggled and to validate the problem before redesign. I led foundational research, a quick-win homepage refresh, usability testing, and competitively informed IA redesign recommendations for menus and degree navigation.
Methods Applied
GT’s decentralization demanded methods that exposed both usability issues and deeper structural gaps. Triangulation kept us focused on systemic challenges, not isolated bugs.
Stakeholder Focus Groups
• 5 sessions with editors and department leads (n=43)
• CMS pain points, KPI gaps, resource constraints
Defining shared needs
Behavioral & Heuristic Analysis
• Analytics (click paths, scroll depth) + severity-scored heuristic evaluation
• Homepage scored lowest
• Buried CTAs, false affordances, navigation loops
Uncovering usability gaps
Task-based Usability Testing
• Moderated sessions with prospective students, tracking task completion and path clarity
• Median lostness >0.4
• Users took inefficient, looping routes
Validating with real users
Competitive Landscaping
• Benchmarked home, majors, and degree-specific flows
• Peers showed clearer nav patterns and stronger CTA hierarchy
Learning from peers
Impact Across the Funnel
Our research informed a homepage-first redesign that centralized key actions, unified the IA, and set the foundation for a campus-wide CMS migration.

Results from Phase 1 Homepage Refresh
2.3x
Scroll engagement increase
3x
Time on page increase
+180%
CTA click increase
+52%
Program exploration increase
Understanding Organizational Reality
Stakeholder Focus Groups Revealed Systemic Chaos
Before evaluating the user-facing experience, I needed to understand the internal challenges facing the people who create and maintain Georgia Tech's websites. This would surface technical constraints and organizational friction that would impact any redesign effort.
50+ fragmented sites often duplicating or contradicting each other
5+ CMS versions (Drupal 6–10, Mercury, CampusPress) creating inconsistent layouts
Incomplete GA4 setups left major user journeys completely untracked
No staging/version control meant updates often broke live pages
While infrastructural challenges (CMS limitations and staffing) were beyond our control, I documented them to set realistic boundaries, which helped:
✔ Validate that usability problems stemmed from systemic issues
✔ Establish missing KPIs with stakeholders

Parallel Discovery
Identifying Problems Through Expert + Behavioral Lenses
I moved beyond anecdotes by pairing expert evaluation with behavioral analytics, triangulating insights to build clear evidence and conviction.
1 in 3
clicks were rage or dead clicks
68%
never scrolled past the hero section
127
Usability violations documented
70/100
Homepage heuristic evaluation score
Expert Lens
Heuristic Evaluation
Reviewed 6 high-traffic sites against Nielsen’s Heuristics + WCAG
→ Navigation consistency
→ Visual hierarchy & accessibility
→ CTAs & error prevention
→ Information architecture
Behavioral Lens
Microsoft Clarity + GA4
Analyzed 10,000+ sessions across the GT ecosystem (analysis sketch after this list)
→ Scroll depth heatmaps
→ Path loops and backtracking
→ Funnel analysis and drop-off points
→ Dead and rage clicks
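As a simple illustration of this kind of clickstream analysis (not the actual Clarity/GA4 pipeline), the sketch below assumes sessions exported as ordered page paths; the funnel steps and sample data are hypothetical.

```typescript
// Illustrative session analysis: funnel drop-off and backtracking detection.
// Sessions are assumed to be exported as ordered page paths; data below is made up.
type Session = { id: string; pages: string[] };

const funnelSteps = ["home", "programs", "apply"]; // hypothetical enrollment funnel

// Count how many sessions reach each funnel step, in order.
function funnelReach(sessions: Session[]): number[] {
  const reached = funnelSteps.map(() => 0);
  for (const session of sessions) {
    let step = 0;
    for (const page of session.pages) {
      if (step < funnelSteps.length && page === funnelSteps[step]) {
        reached[step] += 1;
        step += 1;
      }
    }
  }
  return reached;
}

// A session "loops" if it revisits a page it has already seen.
function hasBacktracking(session: Session): boolean {
  const seen = new Set<string>();
  for (const page of session.pages) {
    if (seen.has(page)) return true;
    seen.add(page);
  }
  return false;
}

const sessions: Session[] = [
  { id: "a", pages: ["home", "programs", "home", "programs", "apply"] },
  { id: "b", pages: ["home", "news"] },
];
console.log(funnelReach(sessions));                                      // [2, 1, 1] → drop-off after "home"
console.log(sessions.filter(hasBacktracking).length / sessions.length);  // 0.5 → share of looping sessions
```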


Four Convergent Findings
Expert evaluation reveals what should work; behavioral data shows what actually happens. When both methods surface the same issues, it confirms both their severity and their frequency.
Finding 1: Critical CTAs Buried Below the Fold
63% of applicants never saw the application entry point.
“Apply” and “Schedule a Visit” were placed 3+ folds down, often invisible on mobile.
Heuristic Evaluation
Visibility & Accessibility Violation
Buried, low-contrast CTAs, missing alt text, and poor focus states made navigation difficult
+
Behavioral Analytics
Invisible CTAs in Practice
Only 9.6% of users scrolled far enough to see primary CTAs; engagement was negligible
Finding 2: Mid-Funnel Drop-Offs Signal Unclear IA
Users bounced between homepage → programs → faculty, then abandoned entirely. Path looping added 4.7 minutes to task time.
Heuristic Evaluation
Circular Navigation
Redundant paths, inconsistent hierarchy, and content loops routinely sent users backward instead of forward
+
Behavioral Analytics
Search Dependency
On desktop, search ranked among the top clicks, with users bypassing menus entirely to complete basic tasks
Finding 3: False Affordances & Inconsistent Design Eroded Trust
Fragmented CMS templates caused identical patterns to behave differently across pages, creating misleading cues that led to learned helplessness and 34% higher bounce rates.
Heuristic Evaluation
Inconsistent Visual Signifiers
Non-clickable elements styled as links, decorative containers mimicking buttons, and mismatched patterns violated UX standards
+
Behavioral Analytics
2,400+ Rage Clicks → Higher Hesitation
Heatmaps showed clusters of rage clicks on elements styled as interactive components but implemented as static content
Finding 4: Content Density Drove Early Exits
Text-heavy pages caused cognitive fatigue and early drop-off, leading users to miss critical information.
Heuristic Evaluation
Content Overload Without Structure
Dense blocks of text with few visual anchors. Important info lacked clear prioritization
+
Behavioral Analytics
Shallow Scroll Depth
64% of users abandoned pages within the first quarter scroll
Usability Testing
Validating the Friction We Saw Everywhere Else
Next, we conducted task-based usability testing with both undergraduate (n=6) and graduate students (n=5) to understand exactly where the experience was breaking down.
Task Topics
8 tasks reflecting real enrollment journeys
Finding where/how to apply
Understanding the application process
Finding a major or degree
Locating housing, tuition, & research opportunities
Metrics Captured
Task success rates, lostness scores (L = √[(N/S − 1)² + (P/N − 1)²], where S = total pages visited, N = unique pages visited, and P = minimum pages required; worked example below), SUS, path efficiency, behavioral observations
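To make the lostness metric concrete, here is a minimal sketch of how a score can be computed from an observed click path; the page names, path, and optimal length are illustrative, not actual test data.

```typescript
// Lostness (Smith, 1996): L = sqrt((N/S - 1)^2 + (P/N - 1)^2)
// S = total pages visited, N = unique pages visited, P = minimum pages required.
function lostness(visitedPages: string[], minimumPages: number): number {
  const S = visitedPages.length;          // every page view in the attempt
  const N = new Set(visitedPages).size;   // unique pages visited
  const P = minimumPages;                 // shortest possible path for the task
  return Math.sqrt((N / S - 1) ** 2 + (P / N - 1) ** 2);
}

// Hypothetical looping route to the tuition page vs. an ideal 3-page path.
const observedPath = ["home", "admissions", "home", "programs", "admissions", "tuition"];
console.log(lostness(observedPath, 3).toFixed(2)); // ≈ 0.42, above the 0.4 "lost" threshold
```

A perfectly efficient path yields L = 0; scores above roughly 0.4 are conventionally read as users being lost.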

What We Learned
The Satisfaction Paradox Emerges
Students felt confident because the homepage looked professional. But surface improvements didn't solve deeper structural issues: confusing menus, unclear pathways, inconsistent terminology across 50+ fragmented sites.
01
Perception ≠ Performance
Undergraduates rated the site highly (SUS 81), yet struggled to complete fundamental tasks; graduate students rated it SUS 51, mirroring the looping, backtracking, and restarts we observed.
Lostness scores routinely exceeded the 0.4 “extreme difficulty” threshold, confirming deep navigational friction.


Usability Test Example 2A — Locating Undergraduate Tuition & Costs: Ideal Path (Left) vs. Actual User Paths (Right)
02
Content, Pathways, and Labels Broke Users’ Momentum
Lack of clear cues caused users to loop 3–5 times between key pages in a single task.
Critical pathways (like Apply) were visually overshadowed or inconsistently styled
Majors & Degrees didn't match students' mental models (too much scrolling, unclear grouping, unfamiliar terminology)
Tuition and fees were buried in PDFs, designed for print and not digital wayfinding
High-value content, like research opportunities, was impossible to find
These weren’t isolated issues; they were systemic mismatches between user expectations and how information was presented.
03
Micro-interactions created macro-friction
Students repeatedly used the GT logo as a “reset,” only to find it didn’t reliably return home, reinforcing that the issue wasn’t content complexity, but interface unpredictability.
Benchmarking Navigation and Content Strategy
What Best-in-Class Universities Do Differently
I conducted a systematic review of 9 peer institutions across two rounds: Round 1 (Jan-Mar) focused on homepage patterns, CTAs, and brand presence. Round 2 (Jun-Sep) specifically targeted menu structures, navigation hierarchies, and majors/degrees pathways, informed by usability testing insights.

Key Observations
01
Simplified Wayfinding and Audience Paths
Prominent, above-the-fold search with scoped queries (“Programs, People, Courses”) and audience-based entry points (“For Students / Faculty / Alumni”).

02
Dynamic Storytelling with Clear Visual Hierarchy
Hero videos, student stories, and stats build quick emotional connection and convey culture.
Clean sections, clear headings, and strong imagery keep content easy to scan.

03
Strong CTAs + Quick-Action Tiles
Key actions (Apply, Visit, Request Info) surfaced above the fold and reinforced through quick-action modules to reduce cognitive load and guide users straight to high-value tasks.

Where GT Falls Short
GT prioritized comprehensiveness over findability
While peers had shifted to utility-first design, GT remained locked in a content-dense model that overwhelmed students and buried essential information.
From Insights to Actions
Recommendations That Shaped The Roadmap
Based on the triangulated findings across all research methods, I delivered a comprehensive recommendation framework. Each recommendation explicitly ties back to specific research methods that revealed the issue.
Design Principles
Findability Over Completeness
Surface critical CTAs above the fold. Reduce clicks to high-value content
Clarity Over Density
Chunk information into scannable sections. Use visual hierarchy to guide attention
Consistency Over Customization
Establish design system patterns. Reduce cognitive load across pages
Guidance Over Gatekeeping
Create clear audience pathways and reduce navigation loops so users feel grounded
Quick Wins | 0–3 months
Simplify Top-Level Navigation
Implement persistent navigation with plain-language labels, elevate high-priority CTAs (Apply, Visit, Give), and add clear visual hierarchy cues (hover states, icons). Move from vague categories to task-first orientation.
Fix Label Clarity & Consistency
Replace vague terms ("Academic Environment," "FYSA") with user-centered language validated through card sorting. Standardize terminology across sites to reduce cognitive load. Differentiate content for undergraduate vs graduate audiences clearly.
Enhance Search Visibility & Function
Implement visible global search with robust filters (program type, audience, content type). Optimize indexing for high-value terms (Housing, Study Abroad, Deadlines). Add synonym management and improve result relevance ranking.
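As an illustration only (not GT's actual search stack), a synonym layer can be as simple as expanding a query into the campus terms students actually use; the mappings below are hypothetical examples.

```typescript
// Hypothetical synonym map for high-value queries; terms are illustrative only.
const synonyms: Record<string, string[]> = {
  tuition: ["cost of attendance", "fees"],
  majors: ["degrees", "programs of study"],
  housing: ["residence halls", "on-campus living"],
};

// Expand a raw query into itself plus any mapped synonyms before searching.
function expandQuery(query: string): string[] {
  const normalized = query.toLowerCase().trim();
  return [normalized, ...(synonyms[normalized] ?? [])];
}

console.log(expandQuery("Majors")); // ["majors", "degrees", "programs of study"]
```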
Foundational Structural Fixes | 1–6 months
Rebuild Core Information Architecture Around Student Mental Models
Use tree-testing and task analysis to reorganize top-level nav categories into predictable, action-driven pathways:
Explore Programs
Admissions & Requirements
Tuition & Aid
Student Experience
Research & Opportunities
Prioritize Page Purpose for Every Template
Define primary/secondary content for each page type (program, admissions, faculty, research) so visual hierarchy supports decision-making, and introduce shared templates, naming rules, and publishing workflows across decentralized academic units to maintain consistency over time.
Complete GA4 Setup & Conversion Tracking
Instrument the journeys left untracked by incomplete GA4 configurations and define conversion events with stakeholders so the missing KPIs surfaced in research can finally be measured (a minimal instrumentation sketch follows).
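As a hedged sketch of what that instrumentation could look like, the snippet below fires a GA4 event when a primary CTA is clicked; it assumes gtag.js is already loaded, and the element ID, event name, and parameters are hypothetical rather than Georgia Tech's actual tracking plan.

```typescript
// Hypothetical GA4 event for a homepage "Apply" CTA (assumes gtag.js is loaded).
declare function gtag(...args: unknown[]): void;

document.querySelector("#apply-cta")?.addEventListener("click", () => {
  gtag("event", "apply_click", {
    placement: "homepage_hero",            // hypothetical parameter
    audience: "prospective_undergraduate", // hypothetical parameter
  });
});
// In the GA4 admin interface, "apply_click" would then be marked as a key event
// so funnel reports can measure progress toward application.
```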
Long-Term Opportunities | 6–12 months
Build a Modular Design System for Academic Pages
Resolve the fragmented CMS experience by aligning typography, spacing, cards, CTAs, and grid systems across all academic units.
Strengthen Search Intelligence
Improve indexing, expand keyword matching, and optimize program queries so search becomes supportive, not compensatory.
Launch Continuous Usability & Analytics Monitoring
Establish recurring UX reviews, heatmap monitoring, and funnel analysis to ensure the redesign continues to meet evolving student needs.
Reflection and Learnings
What This Project Taught Me
01
Triangulation Builds Conviction
When multiple research methods converge on the same findings (buried CTAs, navigation issues, weak brand), stakeholders can't ignore it. Convergence creates undeniable evidence.
02
Metrics Reveal Hidden Struggles
The perception vs. reality paradox (high SUS, low task success) taught me to never trust self-reported confidence alone. Behavioral data exposes what users won't articulate.
03
Strategic Sequencing Matters
Leading with quick wins (Phase 1) built organizational trust before proposing systemic changes (Phase 3). Without early momentum, larger transformations face resistance.
This case study represents Phase 1 research and initial design iteration at Institute Communications, Georgia Tech.

