UX Research UI Design Boot Camp Project

NASA Website
UI Redesign

One of the most awe-inspiring institutions on the planet had a website that made it genuinely difficult to find anything. I set out to understand why, and to design something better.

5 tests
Usability Sessions
2.8/5
Avg Opinion Score
4 tasks
Avg 60% Success
REDESIGNED NAV
3 clear categories · no redundancy
Tools Used: Figma · FigJam · Otter AI · Zoom · ChatGPT · Awesome Screen Recorder · CSS Peeper

Research & Design Process

How I approached it

01
Heuristic Evaluation
Scored the existing site against Nielsen's 10 principles. Identified 5 heuristics needing improvement, 2 performing acceptably, and 3 doing well.
02
Persona + User Flow
Built "Lisa Strong" from NASA's government analytics data. Mapped her full task flow — a 20+ step path just to find one news article.
03
Usability Testing
5 moderated sessions over Zoom. 4 tasks. Recorded with Otter AI. Average success rate: 60%. Average opinion score: 2.8/5.
04
IA Restructure
Card sorting exercise to rationalize a 200+ item sitemap into 3 top-level categories. Eliminated redundant menus and ambiguous labels.
05
Prototype & Iterate
Lo-fi wireframes → mid-fi prototype → hi-fi responsive compositions with a full design system including style tile and component library.

Meet Lisa

Most personas are invented. Lisa was built from evidence. I started with NASA's own government analytics data to understand who actually visits the site — then cross-referenced degree programs and career pathways that would realistically attract women into science fields in the late 1990s.

That research discipline matters. A persona grounded in real data creates design decisions you can defend, not just justify.

Note: While a proto-persona is informed by secondary research rather than primary interviews, grounding it in government analytics data gives it more credibility than assumption alone.

LS
Lisa Strong
Climate Researcher · Houston, TX · Age 45 · M.S. Meteorology, FSU
  Interests
  • A passion for science
  • Staying updated on the latest science, NASA missions, and discoveries
  • Access to resources for research projects
  • Engaging with the NASA community through forums and discussions
  Goals
  • Learn what NASA is doing in relation to climate change
  • Find captivating images and videos for inspiration and presentations
  Frustrations
  • Searching for current, relevant, and accurate information is a struggle
  • Wants to find what she needs and get on with it
  • Enjoys educational videos and webinars, but finds them hard to locate at the scientific level she is looking for
  • Likes in-depth white papers and articles, but feels credible ones can be hard to find
  • Being a woman in a field that largely attracts men

Built from NASA's government website analytics data and from research into educational pathways that would realistically have supported women entering meteorology and climate science in the late 1990s.

Usability Testing

What the data actually showed

Five moderated usability sessions over Zoom. Four distinct tasks, each representing a realistic goal a NASA visitor might have. The results weren't subtle.

70%
Task 1 Success
Find aeronautics article
60%
Task 2 Success
Locate specific content
50%
Task 3 Success
Find podcast content
60%
Task 4 Success
Date-filtered search

Average success rate: 60% · Average website opinion score: 2.8 / 5 · Moderated via Zoom · Transcribed via Otter AI
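
The summary figures above reconcile with the per-task results. As a quick arithmetic check (task labels as reported above; this is an illustration, not part of the study tooling):

```python
# Per-task success rates from the four moderated test tasks (percent).
task_success = {
    "Find aeronautics article": 70,
    "Locate specific content": 60,
    "Find podcast content": 50,
    "Date-filtered search": 60,
}

def average(values):
    """Simple arithmetic mean."""
    values = list(values)
    return sum(values) / len(values)

avg_success = average(task_success.values())  # (70 + 60 + 50 + 60) / 4
print(f"Average task success: {avg_success:.0f}%")  # → 60%
```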

"Yeah… I never would have found it, if you got like 6 billion things out there."
Usability Test Participant · Task 1
"The filters aren't very helpful. 'Last year' must mean something different to me."
Usability Test Participant · Task 4
"You don't know how people classify stuff. I think the search algorithms are a little too loose."
Usability Test Participant · Task 2
"That felt oddly difficult. I just think they should be able to do better."
Usability Test Participant · Task 3

Where NASA stands

I scored NASA.gov against Nielsen's 10 Usability Heuristics — the industry standard for identifying structural UX issues without user testing. The site has genuine strengths. It's visually stunning, loads quickly, and houses an extraordinary volume of content. The problems cluster around navigation, information scent, and search.

Notably, the site's readability skews high-academic: a Dale-Chall score of 9.90–10.13 places homepage articles at a college-level reading threshold — appropriate for researchers like Lisa, but worth flagging for general public outreach goals.
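
For context, the (new) Dale-Chall score combines the share of "difficult" words with average sentence length. A minimal sketch of the formula, with illustrative inputs chosen to land near the range reported above (the real metric counts words against the ~3,000-word Dale-Chall familiar-word list, which is omitted here):

```python
def dale_chall_score(pct_difficult_words: float, avg_sentence_length: float) -> float:
    """New Dale-Chall readability formula (1995 revision):
    raw score = 0.1579 * %difficult words + 0.0496 * avg sentence length,
    with a 3.6365 adjustment added when difficult words exceed 5%."""
    score = 0.1579 * pct_difficult_words + 0.0496 * avg_sentence_length
    if pct_difficult_words > 5:
        score += 3.6365
    return score

# Hypothetical inputs, NOT measured from NASA.gov: 33% difficult words,
# 26-word average sentences — dense, academic prose.
score = dale_chall_score(33, 26)
print(round(score, 2))  # → 10.14; scores of 9.0+ map to college-level reading
```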

Great
Visibility
Navigation icons and search boxes are mostly consistent, easy to recognize, and make moving through the site straightforward.
Great
Mapping (Metaphors & Language)
The site uses familiar metaphors and language that users can recognize and understand throughout.
Great
Consistency
Navigation icons and interface elements use the same patterns and language consistently throughout the site.
Acceptable
Freedom
The site provides reasonable defaults and the ability to undo or back out of actions.
Acceptable
Flexibility
Advanced tasks are reasonably fluid, though power users and novices have limited ability to customize their experience.
Needs Work
Error Prevention
Links ending in generic 404 errors and loose search algorithms return results that don't match user intent, with no preventive guidance.
Needs Work
Recognition (Discovery)
Information is difficult to discover without simply using a keyword search — and even then it's difficult to know if the correct result is being returned.
Needs Work
Minimalism
The Home page requires excessive scrolling, forcing users through motion waste to reach information. Menus contain redundancy that adds noise rather than value.
Needs Work
Error Recovery
Broken links end in generic 404 errors with no recovery path; users must manually re-enter the URL to return to NASA.gov.
Needs Work
Help
The site lacks proactive, in-place hints to guide users — particularly in search and navigation, where users most frequently got lost.

Accessibility Audit

Contrast ratios by the numbers

Sampled key UI elements against WCAG AA thresholds: 4.5:1 for normal text and 3:1 for large text (the 3:1 minimum for UI components was added in WCAG 2.1). All tested elements passed; several significantly exceeded the minimum.

18.42:1
Small buttons on main page
✓ Passed · WCAG AA
21:1
Search box
✓ Passed · Exceeds AA standard
4.61:1
Red CTA buttons
✓ Passed · Above 4.5:1 minimum
7.09:1
Small article links
✓ Passed · WCAG AA
21:1
Font over hero imagery
✓ Passed · Maximum contrast
NASA's stated goal
Compliance with Section 504/508, targeting WCAG 2.1 AA and above — exceeding the federal baseline.
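
The ratios above follow directly from the WCAG relative-luminance definition. A minimal sketch of how such an audit is computed (the color values below are illustrative, not sampled from NASA.gov):

```python
def channel_lum(c8: int) -> float:
    """Linearize one 8-bit sRGB channel per the WCAG 2.x definition."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb) -> float:
    r, g, b = (channel_lum(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """(L_lighter + 0.05) / (L_darker + 0.05); ranges from 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white field: the maximum possible ratio.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # → 21.0
# An AA pass for normal text is a simple threshold check:
passes_aa = contrast_ratio((0, 0, 0), (255, 255, 255)) >= 4.5  # True
```

This is why pure black-on-white elements like the search box report exactly 21:1.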

Synthesized Findings

What the research converged on

Across heuristic evaluation, usability testing, competitor analysis, and card sorting, three root problems emerged consistently. Every design decision in the prototype traces back to one of these.

🔍
Navigation doesn't communicate content
Users defaulted to the search bar after only 1–2 menu attempts. Top-level labels ("Explore") didn't reveal what was inside them, creating what UX researchers call low information scent. All 5 test participants abandoned menu-based navigation.
→ Led to: IA restructure, card sorting, breadcrumb redesign
🔎
Search returns volume, not relevance
Keyword searches returned thousands of results with no reliable way to filter by recency, type, or relevance. Users couldn't determine publication dates on articles, making it impossible to evaluate currency — critical for a research audience.
→ Led to: Metadata-driven search recommendations, filter redesign
📜
Homepage length creates motion waste
The homepage required users to scroll past image-heavy content sections to reach information they wanted. The sitemap — technically accessible — sat so far down the page that no test participant discovered it. Information architecture buried in scroll is architecture that doesn't exist.
→ Led to: Condensed homepage, drop-down nav system, sitemap restructure

Design Decisions

Research → Redesign

Every change in the prototype is traceable to a specific research finding. Here's how the evidence translated into design.

Decision 01

Consolidated navigation from 5 sections to 3

Research basis

A card sorting exercise revealed that users grouped NASA content into three primary mental models: topic-based exploration, multimedia, and news/events. The existing 5-section nav mixed formats with subjects, creating ambiguity.

Design response

Redesigned top-level nav to: "Explore by Topic," "Multimedia," and "News & Events." Logo moved to standard top-left position. Explore search field renamed to "Explore Topics" and repositioned for clarity.

Decision 02

Replaced scroll-driven homepage with drop-down architecture

Research basis

No usability test participant scrolled far enough to find the sitemap. All participants searched rather than scrolling to browse. Eliminating homepage scrolling was flagged in the 2×2 prioritization matrix as High Value / Low Effort.

Design response

Wireframed a condensed homepage with a single hero image and drop-down navigation exposing content categories on hover — reducing scroll requirement and surfacing information scent at the top level.

Decision 03

Redesigned Podcasts section with episode-level visibility

Research basis

Task 3 (podcast search) had the lowest success rate at 50%. Users who found the Podcasts page saw only 5 series titles with no episode listings — leading them to question whether the content even existed.

Design response

Recommended a Podcasts page restructure that shows both series names and individual episode listings — matching users' mental model for how podcast content is typically organized.

Decision 04

Prioritized metadata-driven search improvements

Research basis

Multiple participants called out ambiguous date filters ("Last year — does that mean calendar year or 12 months?") and search results returning articles without publication dates. The 2×2 matrix flagged search as High Value / High Effort: a longer-term investment, not a quick fix.

Design response

Framed as a phased recommendation: immediate — add publication dates to all article cards. Longer term — invest in metadata tagging and filter UX improvements that give users meaningful control over search results.

Competitive Landscape

How NASA compares to peers

I benchmarked NASA against both direct competitors (organizations competing for the same audience) and indirect ones (organizations NASA could learn navigation and search patterns from).

NASA · Direct
  Feature Analysis: Tons of articles, press releases, multimedia, and podcasts on science and space exploration for all ages.
  Competitive Advantage: Vast collection of information on science and space exploration across many media formats.
  Heuristic Analysis: It's difficult to find information without simply using a keyword search, and even then it's difficult to know if the correct result is being returned. The filtering system is difficult to use and articles are missing dates.

Library of Congress · Direct Competitor
  Feature Analysis: Excellent top-level search filters, followed by search by catalog, collections, visitor type (researcher, visitor, teacher), blogs, US Copyright Office, trending topics, services & programs, and more. The photo on the main page links to their events page. Very nicely organized site.
  Competitive Advantage: More information than NASA and extremely well organized.
  Heuristic Analysis: Efficient navigation, clear labeling, well-thought-out graphics, consistent design, and a highly robust filtering system to refine search results; easier to navigate and filter than NASA's site. Visually appealing.

Space Force · Indirect Competitor
  Feature Analysis: Not really a competitor; the website is a military recruiting site, not an educational one, directly targeting US Space Force recruiting.
  Competitive Advantage: Animation includes good-quality scientific drawings, photos, and fact sheets, but the site is very limited in scope compared to NASA.
  Heuristic Analysis: The site contains a fraction of the content carried by NASA, so the navigation is very limited in scope.

SpaceX · Indirect Competitor
  Feature Analysis: Heavy on cool animations, light on content and in-depth science articles. Much more of a commercial site.
  Competitive Advantage: Papers on payload (User's Guide).
  Heuristic Analysis: Efficient navigation (even with heavy animation and high-end graphics), organized, clear labeling, consistent design, matched my expectations of what a SpaceX website might look like, visually appealing, easy to scan and read.

Key insight: The Library of Congress, comparable to NASA in content volume, achieves dramatically better findability through audience-type navigation and robust search filtering. It is NASA's closest model for what "better" could look like.