Most new Meta Quest headset owners are excited on unboxing day. A month later, a lot of them have stopped opening the box.
We were losing ~40% of new users within their first month, and only 45% were hitting the two-hour mark in their first week. That was the threshold we knew predicted whether someone would stick around.
The First 28 Days onboarding program was our answer. My education team joined with lifestyle marketing to create seven emails and rule-based notifications designed to guide users through that critical window. Early signals looked good: 35%+ open rates, click-through rates outperforming benchmarks, and positive notification engagement.
Then the experiment readout came back. No statistically significant lift in weekly active users at day 28.
That gap between "people clicked" and "people stayed" is where the real work began.
Core Team: Content Designer (me), Lifestyle Marketing, Product Designers, UX Researcher, Product Manager, Data Scientist, Product Marketing Manager, Engineering, Internationalization/Localization
6+ months (V1 → V2 iteration)
I didn't just change the content; we reframed what the onboarding was meant to do.
Localization across 5+ markets
Data science readouts and retention modeling
Improve new user retention by guiding Meta Quest users through their first 28 days with relevant, confidence-building education and content discovery.
Reframe onboarding as a curriculum, not a campaign
Reduce spam perception and unsubscribes
Personalize based on onboarding stage and behavior
Integrate device education + content discovery
Improve clarity and emotional tone of lifecycle messaging
Version 1 (October release)
V1 launched with encouraging signs. Open rates hit 35.89% and click-through came in at 2.67%, outperforming prior benchmarks by 88%. Early engagement was up across the board.
Then the full experiment readout came back. Notification interaction was positive and email clicks-over-seen were up 15%, but unsubscribes had spiked 48% on high-frequency days, and the number that mattered most, weekly active users at day 28, hadn't moved.
That told us something important: we weren't solving the right problem. Users were engaging with the messages. They just weren't changing their behavior.
Version 2 (December release)
V2 was built on that diagnosis. We added structured educational content and reprioritized casting and key device behaviors. I rewrote headlines to lead with value over features, and we restructured the send cadence to reduce the perception of spam.
The click-through rate came down to 1.2%, a trade we understood. Unsubscribes stabilized at 0.06%, back in line with baseline. Retention still didn't move, which confirmed what the data had been pointing toward all along: the friction wasn't in the messages.
What the program did establish was durable. The guardrails and measurement framework built during this project now inform how lifecycle onboarding gets scoped and evaluated. The unsubscribe risk patterns identified on days when both an email and a notification were sent, Day 3 and Day 5, gave the team something concrete to act on going forward.
I reviewed:
All 7 lifecycle emails across Days 1–16
Rule-based notifications
Overlapping marketing touchpoints
Competing programs (Horizon mobile, Meta Quest+, etc.)
What I found:
Redundant messaging
Competing calls-to-action
Heavy emphasis on content discovery, light on skill-building
Headlines that described features but didn’t frame value
No clear narrative arc across 28 days
We weren’t onboarding. We were broadcasting.
Broadcasting assumes the audience needs more information. Onboarding assumes they need a path. V2 was built on that distinction.
We restructured the program around five confidence stages:
System basics (so they can confidently use the headset)
Key behaviors (casting, theater mode)
Social confidence
Low-risk discovery (free apps, free returns)
Re-inspiration moments
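As a sketch, the stage logic behind this structure could be a simple decision ladder: a user advances only after demonstrating the behavior the current stage teaches. The stage names come from the program; the user signals and their field names below are my illustrative assumptions, not the production logic.

```python
from dataclasses import dataclass

@dataclass
class UserSignals:
    # Hypothetical behavioral signals; real signal names may differ.
    completed_setup: bool      # finished system basics
    has_cast: bool             # tried casting or theater mode
    added_friend: bool         # took a social action
    installed_free_app: bool   # made a low-risk discovery

def onboarding_stage(u: UserSignals) -> str:
    """Return the earliest confidence stage the user hasn't yet demonstrated."""
    if not u.completed_setup:
        return "system_basics"
    if not u.has_cast:
        return "key_behaviors"
    if not u.added_friend:
        return "social_confidence"
    if not u.installed_free_app:
        return "low_risk_discovery"
    return "re_inspiration"
```

The point of the ladder is that content targets the next skill, not the next app: a user who hasn't cast yet gets the casting message, regardless of what's new in the store.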
Instead of pushing apps, we paired actions + outcomes.
Before:
“Explore these top mixed reality titles.”
After:
“You can cast directly to your computer, TV, or smartphone with the Meta Horizon app, making it easy to share what's happening in headset.”
I introduced:
Value-forward headlines (benefit > feature)
Shorter body copy (cut by ~30%)
Clearer primary calls to action
Reduced competing links
Stage-based content logic
Emotional tone shifts (confidence-building vs marketing hype)
We also explored:
Gamification mechanics
Education modules
Entertainment modules
Before we could measure progress, we identified what progress meant.
North star & leading indicators
Weekly active users at day 28 was our north star, the single metric that would tell us whether the program was actually changing behavior. But we knew that metric would take time to move, so we defined a set of leading indicators to track momentum along the way: how often users were visiting product detail pages, whether they were installing apps, whether they were adopting key device behaviors like casting, and how much time they were spending in apps.
Guardrails
We also built in guardrails to protect the user relationship while we iterated. Email unsubscribes had to stay at or below 0.03% per send, and mobile push opt-outs couldn't exceed 0.12% per day. Crossing either line was a signal that the program was eroding trust faster than it was building it.
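A guardrail check like this is mechanically simple, which is the point: it turns "protect the relationship" into a per-send pass/fail. The thresholds are the ones from the program; the function and parameter names are illustrative assumptions.

```python
# Guardrail thresholds from the program, expressed as fractions.
EMAIL_UNSUB_MAX = 0.0003   # 0.03% unsubscribes per email send
PUSH_OPTOUT_MAX = 0.0012   # 0.12% push opt-outs per day

def guardrail_breaches(emails_delivered: int, unsubscribes: int,
                       push_audience: int, opt_outs: int) -> list[str]:
    """Return the names of any guardrails a send/day has crossed."""
    breaches = []
    if emails_delivered and unsubscribes / emails_delivered > EMAIL_UNSUB_MAX:
        breaches.append("email_unsubscribe_rate")
    if push_audience and opt_outs / push_audience > PUSH_OPTOUT_MAX:
        breaches.append("push_opt_out_rate")
    return breaches
```

For example, 60 unsubscribes on 100,000 delivered emails is 0.06%, double the guardrail, which is exactly the pattern the high-frequency days produced.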
Clicks and open rates can tell you a message landed. They can't tell you whether someone picked up the headset. We can't message someone into a habit. We can only reduce the friction preventing it. Optimizing engagement without addressing that friction is measuring the wrong thing.
If the friction is in the headset, the fix needs to be in the headset. Users struggling with comfort, interaction or content overwhelm aren't going to be rescued by an email. Off-platform communication has real limits when the barrier to use is physical and immediate.
We mapped send schedules across teams to avoid piling on. It was a surprise how much we were already sending, so we worked with teams on timing, but it wasn't enough. Unsubscribe spikes clustered around our own high-frequency days, Day 3 and Day 5, so the problem wasn't just volume across programs; it was our cadence. Frequency without clear value reads as noise.
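That mapping exercise amounts to a small audit: list each program's scheduled send days, then flag the days where touchpoints stack. The program names and schedule days below are illustrative, not the actual calendars.

```python
from collections import Counter

# Hypothetical per-program send schedules (day of the 28-day window).
schedules = {
    "onboarding_email": [1, 3, 5, 8, 12, 16],
    "onboarding_push":  [3, 5, 10],
    "marketing_promo":  [5, 14],
}

def high_frequency_days(schedules: dict[str, list[int]],
                        threshold: int = 2) -> list[int]:
    """Return days on which at least `threshold` touchpoints land."""
    counts = Counter(day for days in schedules.values() for day in days)
    return sorted(day for day, n in counts.items() if n >= threshold)
```

In this illustrative calendar, Days 3 and 5 stack an email on top of a notification, which matches where our unsubscribe spikes actually clustered.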
The retention needle didn't move. The program's impact did.
As the content designer on this project, I helped establish a measurement framework that didn't exist before, contributed to the guardrails that now shape how Lifecycle messaging gets scoped and surfaced the unsubscribe risk patterns that had gone untracked across send cadences. The work shifted content quality and strategic framing inside the program and changed how the broader team talked about onboarding education.
The most important thing it proved was harder to quantify. Onboarding isn't a marketing function; it's a product experience. No amount of well-crafted emails can substitute for reducing friction where it actually lives: inside the headset.
The groundwork my team laid gave the Lifecycle team a foundation to build from when they took over for lighter optimization. That handoff was the point.