Layers

Retention-driven growth

Why every acquisition playbook fails on broken retention. What retention actually means, how to measure it, and what to fix before anything else.


Who this is for

You're running acquisition — organic social, paid ads, referrals, whatever — and the numbers feel like they're going in circles. You acquire users, they sign up, and somehow you're not growing. Or you're growing slowly in a way that doesn't match the acquisition investment. You suspect retention is a problem but you're not sure how to prove it, measure it, or fix it.

This playbook is not a growth tactic. It's the upstream problem. If you only read one playbook in this series, read this one first. Every other acquisition strategy in this collection assumes retention is working. If it isn't, those strategies waste money.

When to use this

  • You've acquired 500+ users and growth feels slower than the acquisition numbers suggest it should
  • You're about to start paid ads and want to know if your funnel is ready
  • Your week-over-week active user growth is lagging signup growth
  • Something feels off and you're not sure what

If you have literally no users yet, this is too early. Get to a few hundred users first using the first users with no budget playbook. Then come back.

The acquisition delusion

Here's the pattern that kills most consumer apps:

  1. Team focuses on acquisition. Posts, ads, PR, launches.
  2. User count grows. Everyone celebrates.
  3. Revenue / DAUs / weekly actives don't grow proportionally.
  4. Team doubles down on acquisition. "We just need more users."
  5. Growth stalls. Funds run out. Product dies.

What happened: they had a bucket with a hole. Pouring more water in doesn't fix the hole; the water just leaks out faster.

Retention is the hole. Before you pour more in, plug it.

The retention curve

The single most important chart you will ever look at for your product is your retention curve:

100% ─┐
      │●
      │ ●
      │  ●
  50% ─┤   ●
      │    ●●
      │      ●●●●●●●●●●●●●●●●●●●●
   0% ─┴────────────────────────────
       0   7   14  21  28  60  90    days after signup

The x-axis is days since signup. The y-axis is the percentage of users who are still active that day. Active can mean "opened the app," "did the key action," "logged in," depending on your product — pick one definition and stick with it.

Three shapes to know:

Shape 1: Smiling curve (good). Drops in the first week, then flattens high. Users who stick past the first week tend to stay for months. This is what product-market fit looks like on the retention chart.

Shape 2: Exponential decay (bad). Curve keeps dropping forever. No one who signs up is safe. This is a product that isn't sticky.

Shape 3: Spike and die (worst). Everyone active on day 1, almost no one on day 7. Users are trying your product but nothing's pulling them back.

You want shape 1. You'll probably have shape 2 or 3 to start with. The work is to turn the curve into shape 1.

What "good" retention looks like by vertical

Rough benchmarks for day-30 retention (percentage of signups still active at day 30):

Vertical                    "Good" d30 retention
Social apps                 25-35%
Productivity / utilities    20-30%
Gaming (casual)             10-20%
Gaming (core)               20-35%
Subscription media          15-25%
E-commerce                  10-20%
Fitness / health            15-25%
Dating                      30-45%

These are directional, not absolute. Your vertical's real benchmark depends on your specific product and audience. What matters is whether your retention is above, at, or below the benchmark. If you're at 5% d30 in a category where "good" is 25%, acquisition is not your problem.

Measure the right thing

Most teams measure "retention" and mean six different things. Pin down three specific metrics:

1. Day-1, Day-7, Day-30 retention

Of users who signed up on day N, what percentage were active 1, 7, and 30 days later? This is the cleanest definition. Use cohort analysis — group users by signup week, then track each cohort's retention over time.
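As a minimal sketch, assuming you can export each user's signup date and the set of dates they were active (the data below is illustrative, not a real analytics export):

```python
from datetime import date

# Each record: signup date plus the set of dates the user was active.
users = [
    {"signup": date(2024, 1, 1), "active": {date(2024, 1, 2), date(2024, 1, 8)}},
    {"signup": date(2024, 1, 1), "active": {date(2024, 1, 2)}},
    {"signup": date(2024, 1, 1), "active": set()},
]

def retention(users, day):
    """Share of users active exactly `day` days after their signup."""
    hits = sum(1 for u in users
               if any((d - u["signup"]).days == day for d in u["active"]))
    return hits / len(users)

for day in (1, 7, 30):
    print(f"d{day}: {retention(day=day, users=users):.0%}")
```

In practice you'd group users by signup week first (one `users` list per cohort) and run the same function per cohort, so you can see whether newer cohorts retain better than older ones.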

2. Activation rate

What percentage of signups complete the "aha" event — the thing that makes them understand the product? Examples:

  • Social app: made their first post, or followed 3 people
  • Productivity: completed their first task
  • Fitness: completed their first workout
  • E-commerce: made their first purchase, or added to cart

Activation rate is the upstream predictor of retention. Users who activate retain at 3-5x the rate of users who don't. If your activation rate is under 50%, everything downstream is struggling to compensate for an onboarding problem.

3. Engaged users / total users

Of all your signed-up users, what percentage were active in the last 7 days? This is the health of your whole base, not just new cohorts. Growing total users while this number shrinks means you're adding users faster than you're keeping them.

If you have the Layers SDK installed for attribution, you already have the event stream you need for these. Fire a signup event, an activation event (your choice), and an app_open event on launch. Everything else falls out of those three.
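Given those three events, activation rate and the 7-day engaged share fall out of simple set arithmetic. A sketch over a raw stream of (user_id, event, date) tuples, with illustrative data:

```python
from datetime import date, timedelta

# Raw event stream: (user_id, event_name, date). The three event names
# match the ones suggested above; the rows are made up.
events = [
    ("u1", "signup",     date(2024, 3, 1)),
    ("u1", "activation", date(2024, 3, 1)),
    ("u1", "app_open",   date(2024, 3, 20)),
    ("u2", "signup",     date(2024, 3, 2)),
    ("u2", "app_open",   date(2024, 3, 2)),
    ("u3", "signup",     date(2024, 3, 3)),
]

def metrics(events, today):
    signed_up = {u for u, e, _ in events if e == "signup"}
    activated = {u for u, e, _ in events if e == "activation"}
    week_ago = today - timedelta(days=7)
    engaged = {u for u, e, d in events if e == "app_open" and d > week_ago}
    return {
        "activation_rate": len(activated & signed_up) / len(signed_up),
        "engaged_share": len(engaged & signed_up) / len(signed_up),
    }

print(metrics(events, today=date(2024, 3, 21)))
```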

Find the leak

Look at your signup funnel step by step. Where do people drop off?

A typical mobile app funnel:

100%  Install
 85%  First app open
 55%  Completed onboarding
 35%  Key action performed (activation)
 18%  Active on day 2
 12%  Active on day 7
  6%  Active on day 30

The most costly leak in that example is between "key action performed" and "active on day 2": 35% to 18%. Almost half of activated users don't come back on day 2.
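Per-step conversion makes leaks like this easy to spot. A sketch over the example funnel, with the numbers copied from the table above:

```python
# Example funnel: (step name, % of installs reaching this step).
funnel = [
    ("Install", 100),
    ("First app open", 85),
    ("Completed onboarding", 55),
    ("Key action performed", 35),
    ("Active on day 2", 18),
    ("Active on day 7", 12),
    ("Active on day 30", 6),
]

# Print each step-to-step conversion rate.
for (step, pct), (next_step, next_pct) in zip(funnel, funnel[1:]):
    print(f"{step} -> {next_step}: {next_pct / pct:.0%} convert")
```

Scan for the lowest step-to-step conversions rather than the biggest absolute drops; a step where 30 percentage points vanish early can still convert better than a step losing half its survivors late.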

For most products, the biggest leaks are one of four places:

1. Install → first open. If 15%+ of installs never open the app, you have a store-listing or expectation problem. Users installed based on a promise, then lost interest before ever launching. Usually a creative/brand mismatch between the ad or listing and the app itself.

2. First open → onboarding complete. If less than 65% of openers finish onboarding, your onboarding is too long or too confusing. Count the screens. Cut two.

3. Onboarding complete → key action. If less than 50% of onboarded users hit your key action within 24 hours, onboarding technically completed but didn't actually activate the user. You onboarded them to a menu, not to doing the thing.

4. Day 1 → Day 2. If more than half of day-1 users don't return on day 2, they didn't have a reason to come back. No notification, no hook, no reason. This is where retention loops matter.

You can have leaks at all four places, and they compound multiplicatively. A 50% leak at each of four steps leaves about 6% at the end (0.5⁴ = 6.25%). Improving a single step from 50% to 75% lifts the end-to-end number to about 9%, a 50% gain from one fix.
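A quick sanity check on how step-level leaks compound, in plain Python:

```python
from math import prod

# Survival rate at each of four funnel steps.
steps = [0.5, 0.5, 0.5, 0.5]
print(f"baseline: {prod(steps):.1%}")   # 0.5^4 = 6.2%

# Improve just one step from 50% to 75%.
fixed = [0.75] + steps[1:]
print(f"one fix:  {prod(fixed):.1%}")   # 0.75 * 0.5^3 = 9.4%
```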

What failure looks like

You look at your retention chart, see it's bad, and decide to "do a retention push." You send a flurry of notifications, launch a referral program, and build a "week 1 engagement streak" feature. None of it moves the number because the actual leak was in step 2 of onboarding, where users hit a confusing permissions screen. Fix: find the specific leak step before building any retention feature.

The first retention fix: shorten onboarding

For 70% of consumer apps, the fastest retention improvement comes from cutting onboarding length. Count your onboarding screens right now. Whatever the number, cut it in half.

Yes, half. Yes, you'll lose information you think is critical. Users don't need that information to get started. They'll figure out the advanced stuff once they're hooked.

Onboarding rules that work:

  • No more than 3 screens before the user does something real. One intro, one value prop, one permission/setup, then straight to the product.
  • Defer permission requests until the moment they're needed. "Allow notifications?" on screen 2 gets denied. "Allow notifications so we can tell you when your workout is ready?" asked in context, after they've set up a workout, gets accepted.
  • No feature tutorials. Show, don't tell. Let them explore and discover.
  • One clear CTA per screen. If there are three buttons, people pick none.

Most apps double their onboarding-complete-to-activation rate just from cutting onboarding length. This is the single highest-leverage retention intervention available to most teams.

The second retention fix: day-2 hook

If users don't come back on day 2, nothing else matters. Your job is to give them a reason.

Options that work:

Notification hooks. One notification on day 1 evening or day 2 morning, referencing something specific the user did. Not "come back!" — something like "You started a plan — ready for day 2?"

Email hooks (if applicable). Same principle, different channel. Works better for utility apps, worse for social/gaming.

Feature completion loops. Design one feature that has a multi-day cadence — a streak, a scheduled session, a "check back tomorrow" rhythm. This pulls users back naturally.

Social feeds. If there's a social component, populated feeds pull users back. Empty feeds don't.

Options that rarely work:

  • Generic "we miss you!" re-engagement pushes
  • Long welcome email sequences
  • "Tip of the day" broadcasts

The common thread: what works is specific to what the user already did. What doesn't work is generic broadcast.

The third retention fix: week-1 compound value

For users who come back on day 2, the next question is whether they come back on day 7. Retention past the first week usually means the product has started compounding for them.

Compounding value comes in three flavors:

1. Investment. The user has put something into the product that's worth coming back to. Data, customizations, progress, a streak, content they created, connections they made. Dating apps work because of the profile investment. Productivity apps work because of the task history. Gaming works because of progress.

2. Habit. The user has built a routine that includes your app. Notification at the right time, placement on their home screen, muscle memory. This takes weeks to build; you can only help it along.

3. Social gravity. Their friends are on the app, or their friends will be if they keep using it. This is the strongest form but the hardest to manufacture.

Your product needs at least one of these three to retain past week 1. If users finish your onboarding and there's no investment, no habit formation, and no social layer, they will leave — not because you did anything wrong, but because there's no reason to stay.

This is not a marketing fix. It's a product fix.

When retention is "fixed enough" to restart acquisition

You don't need perfect retention to turn acquisition back on. You need retention that's proven enough that acquisition spend pays back.

Specifically:

  • Activation rate above 40% of signups
  • Day 7 retention within range of your vertical's benchmark (doesn't have to match; has to be in the same order of magnitude)
  • Day 30 retention on an upward trend — even if it's not at benchmark yet, it's improving cohort over cohort
  • LTV > CAC with reasonable assumptions for your target CAC
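The LTV > CAC check can start as a back-of-envelope model. A sketch assuming a flat monthly ARPU and geometric monthly retention; every number here is a placeholder, not a benchmark:

```python
def simple_ltv(arpu_monthly, monthly_retention, horizon_months=12):
    """Expected revenue per signup over the horizon, assuming ARPU is flat
    and the share of a cohort still active decays geometrically each month."""
    return sum(arpu_monthly * monthly_retention**m for m in range(horizon_months))

# Placeholder inputs: $4/month ARPU, 60% month-over-month retention, $7.50 CAC.
ltv = simple_ltv(arpu_monthly=4.0, monthly_retention=0.6)
cac = 7.5
print(f"LTV ${ltv:.2f} vs CAC ${cac:.2f} -> {'go' if ltv > cac else 'hold'}")
```

Once you have real cohort data, replace the geometric decay with your measured month-by-month retention curve; a smiling curve (shape 1) will give a noticeably higher LTV than the geometric approximation.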

Once these are true, acquisition starts working. Before they're true, acquisition wastes money no matter how well-targeted.

This is why the first ads on Meta and Apple Search playbook lists retention as a prerequisite. Without it, the ad platforms optimize toward cheap installs, CAC looks great, you feel like you're winning, and the users churn out the back. A year later you've spent a lot on ads and have the same DAU you started with.

The retention work is the product work

Most of the fixes above are product changes, not marketing changes. Shorter onboarding, better day-2 hooks, compounding value loops — these are product design problems.

This is why the best growth teams sit near the product team, not inside marketing. Growth driven by acquisition alone plateaus. Growth driven by retention compounds.

Marketing tools — including Layers — can drive distribution and measurement and creative rotation, but they can't fix a product that doesn't retain. If your retention curve is broken, no amount of organic social, paid ads, UGC, or referral program will fix it. You have to fix it in the product.

The measurement loop you need forever

Even after retention is "working," you need an ongoing measurement loop. Things that should happen weekly:

  • Cohort retention chart — is it holding up or drifting?
  • Activation rate per signup source — are paid users activating at the same rate as organic users?
  • Week-over-week engaged users — growing or shrinking?

Things that should happen monthly:

  • LTV analysis by cohort and acquisition source
  • Churn reason tagging for users who drop off
  • One user interview with a churned user

These habits differentiate growing products from stalling ones. The stalling ones keep acquiring; they don't keep measuring whether acquisition is still earning.
