New Quest headset users face high cognitive load from unfamiliar virtual reality (VR) interactions and hand gestures. Meta's research showed interactive training outperforms static tutorials in building confidence and engagement. This onboarding redesign reimagined how millions would learn Meta's first major spatial user interface update in five years.
Redesign the onboarding tutorial for Meta Quest's biggest OS update in five years, reaching 100% of headset users. Success meant helping people master VR basics quickly, discover value immediately, and stay engaged beyond the critical first 30 days.
Core Team: Content Designer (me), Product Designers, UX Researcher, Software Engineers, Product Manager, Data Scientist
Key Partners: Art Directors, Sound Design, Voice Recording Talent, Internationalization/Localization Team, Accessibility Team
My Role: Content strategy lead, narrative direction owner, cross-functional hub between feature teams
Research led to a voice that felt like a knowledgeable friend sharing a favorite experience: empowering, conversational and direct, with Meta's signature big-hearted nerd energy.
The tone shifted based on context: celebratory when users succeeded, instructive during learning moments and encouraging throughout to balance excitement with clarity.
9 months (Feb 2025–Oct 2025)
Cross-functional partnership with Sound Design, Engineering, Legal, Privacy and Accessibility teams.
7 full iterations, designing onboarding for a product that was evolving in real-time.
5 rounds of in-person user research sessions.
Localization for 25 languages with voice-over recording across 6 languages.
Rapid rebuild in July after Mark Zuckerberg requested a simplified experience.
Influence product roadmap with content-driven insights
Prove simplified, human language increases engagement
Create modular scripting approach for faster iteration
Build localization-first writing process
Reduce cognitive load through strategic tone framework
Decrease support burden and improve user confidence
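One goal above was a modular scripting approach for faster iteration. A minimal sketch of that idea: treat each voiceover line as a standalone module keyed by ID, so a single line can be rewritten and re-recorded without touching the rest of the script. The IDs, text, and function names here are illustrative assumptions, not the shipped content or tooling.

```python
# Hypothetical sketch: each VO line is an independent module keyed by ID,
# so one line can be swapped out without re-recording the whole script.
# All IDs and strings below are illustrative, not actual tutorial copy.

SCRIPT_MODULES = {
    "intro.welcome": "Welcome! Let's learn the basics.",
    "hands.pinch":   "Pinch your thumb and index finger to select.",
    "hands.success": "Nice! You've got it.",
}

# The tutorial flow is just an ordered list of module IDs, so reordering
# or inserting a step never requires editing existing lines.
FLOW = ["intro.welcome", "hands.pinch", "hands.success"]

def build_script(flow, modules):
    """Assemble the ordered script; a missing ID fails loudly before recording."""
    return [modules[step_id] for step_id in flow]

for line in build_script(FLOW, SCRIPT_MODULES):
    print(line)
```

The payoff is in iteration: when one instruction tests poorly, only that module's text and audio file change, which keeps re-recording and re-translation costs proportional to the edit rather than the whole tutorial.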
95% completion rate (up from 87%)—the primary success metric
70% faster to complete (2.5 minutes vs. 8 minutes) by consolidating three tutorials into one
Instructional patterns adopted across multiple feature teams
Worked with the team to define what we wanted the tutorial to accomplish, what users must learn and why.
Brought together stakeholders from PM, Engineering, UX and Art to get everyone on the same page from the start.
Dug into who our users were, what frustrated them and what made VR different from anything they'd used before.
Looked at what other companies were doing with VR tutorials to see what worked and where we could improve for our users.
Mapped out the entire experience step by step, thinking through what users would see and do at each moment.
Chose to focus on hand-tracking (vs. using controllers) because our research showed it was the trickiest thing for people to learn. We ensured controller options and instructions were always available as a backup.
Shared early ideas and brainstormed with the team in Figma to get their input and make sure we were headed in the right direction.
Wrote the instructional text, voiceover scripts and UI copy, keeping everything as clear and simple as possible.
Made sure everything followed Meta's guidelines for content, accessibility and translation.
Partnered with UX Research to find the right voice and tone.
Figured out where people might get stuck and added contextual help in those places.
Teamed up with UX Research to prepare content-focused questions, watch real users go through the tutorial, and hear what worked and what didn't.
Rewrote content based on UXR findings, fixing anything that confused users.
Double-checked that the tutorial was accessible for everyone and would translate well across languages.
Worked with Legal and Policy to make sure we weren't missing anything on compliance or privacy.
Coordinated with Sound Design and Localization when the script was ready for voice recordings and translations.
Limited tooltips to three lines in English so text wouldn't be cut off when translated into languages that run longer.
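The three-line English limit exists because translations often expand. A simple pre-flight check like the one below could flag strings at risk before they reach localization; the character budget, expansion factors, and function name are all assumptions for illustration, not Meta's actual tooling or numbers.

```python
# Hypothetical sketch: flag tooltip strings that may overflow after
# translation. The line width and per-language expansion factors are
# illustrative guesses, not real localization data.

MAX_LINES = 3
CHARS_PER_LINE = 40  # assumed tooltip width in characters

# Rough text-growth factors relative to English (assumed values).
EXPANSION = {"en": 1.0, "de": 1.35, "fi": 1.3}
DEFAULT_EXPANSION = 1.3  # conservative fallback for unlisted languages

def fits_tooltip(text: str, lang: str = "en") -> bool:
    """Return True if the string should still fit after expected expansion."""
    budget = MAX_LINES * CHARS_PER_LINE / EXPANSION.get(lang, DEFAULT_EXPANSION)
    return len(text) <= budget

print(fits_tooltip("Pinch your thumb and index finger to select.", "de"))
```

Running a check like this over every string in the batch turns "will this fit in German?" from a post-translation surprise into a writing-time constraint.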
Managed the handoff to Engineering of all the tested files: 6 languages with voiceovers and 25 languages total.
Adapted quickly when leadership changed direction, rewriting, re-recording and re-translating as business priorities shifted and as user research findings and internal testing feedback came in.
Tested AI-generated voices but chose to continue with human voice recordings, since AI voices didn't yet meet our quality bar (as of spring 2025).
Walked stakeholders through the final tutorial and got their sign-off.
Kept an eye on how users were responding after launch, tracking what resonated and what needed changing.
The tutorial launched October 2025 across all new and updated Quest headsets following a major simplification overhaul requested by Mark Zuckerberg in July.
Reached 100% of headset users (~10 million): the first experience on every new headset, and delivered to all existing users via software update.
Scaled to 25 languages with human voiceover for the top six.
Overall completion rate increased from 86% to 93%
Daily active users increased nearly 6% (measured after 30 days).