
Our investor deck said "2 million MAU." Impressive, right?
Then I looked at what "active" meant in our analytics.
Active = opened the app once. That's it. One session of any length counted.
2 million people opened our app at least once in a month. Of those:
- 50,000 actually did anything meaningful (completed a core workflow)
- Of those 50,000, only 8,000 came back the following week
We had a "2 million MAU" product with 8,000 real users.
MAU was our primary metric. It was in every board deck, every investor update, every internal dashboard. It made us feel successful while the product rotted underneath.
When we switched to "Weekly Engaged Users," the truth was brutal. But at least it was true.
Here's why MAU is lying to you — and what to measure instead.
Section 1: Why MAU Became the Default
Understanding why MAU is everywhere helps explain why it's so dangerous.
It's Easy to Measure:
MAU is trivial to implement. Fire an event on app open. Count unique users per month. Done.
More meaningful metrics require defining what "engagement" means, instrumenting specific events, analyzing cohorts. That's harder. MAU is the path of least resistance.
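To make "trivial" concrete, here's roughly what that path of least resistance looks like. This is a sketch: the event-log shape is a made-up stand-in for whatever your analytics warehouse actually stores.

```python
from datetime import date

# Hypothetical raw event log as (user_id, event_name, date) tuples.
# The shape is an assumption; in practice this lives in your warehouse.
events = [
    ("u1", "app_open", date(2024, 3, 2)),
    ("u2", "app_open", date(2024, 3, 5)),
    ("u2", "task_completed", date(2024, 3, 5)),
    ("u3", "app_open", date(2024, 3, 28)),
]

def mau(events, year, month):
    # MAU as usually computed: unique users with ANY event in the month.
    return len({uid for uid, _, d in events if d.year == year and d.month == month})

print(mau(events, 2024, 3))  # 3 -- every opener counts, engaged or not
```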
Everyone Tracks It:
VCs ask for it. Benchmarks exist. Comparisons are easy. "We have 2M MAU" means something socially, even if it doesn't mean much substantively.
It's a network effect of metric adoption: because everyone uses MAU, everyone keeps using MAU.
"Monthly" Is Forgiving:
A month is a long window. Any activity in 30 days counts.
Someone who uses your product daily counts the same as someone who opened it once and churned. The monthly aggregation obscures this difference.
Weekly metrics are harsher. Daily metrics are harshest. Monthly is the most forgiving — which is why it's preferred by people who want flattering numbers.
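Here's a quick sketch of how the window alone changes the answer, using two invented users: one who shows up daily and one who opened the app once and left.

```python
from datetime import date, timedelta

# Hypothetical activity log: one user active every day, one one-and-done.
activity = {
    "daily_user":   [date(2024, 3, 1) + timedelta(days=i) for i in range(30)],
    "one_and_done": [date(2024, 3, 1)],
}

def active_in_window(activity, start, days):
    # Users with ANY activity inside [start, start + days).
    end = start + timedelta(days=days)
    return {u for u, ds in activity.items() if any(start <= d < end for d in ds)}

# The monthly window counts both; by week two the tourist has vanished.
print(len(active_in_window(activity, date(2024, 3, 1), 30)))  # 2
print(len(active_in_window(activity, date(2024, 3, 8), 7)))   # 1
```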
"Active" Is Undefined:
There's no standard definition of "active." Companies pick the loosest definition that produces the largest number.
- Opened the app? Active.
- Received a push notification? Active (yes, some companies count this).
- Had a session longer than 0 seconds? Active.
The incentive is to inflate. Nobody gets fired for reporting big MAU numbers.
Designed for Social Networks:
MAU was popularized by Facebook, and it made sense there. Social networks are about showing up. Opening Facebook is the product.
But MAU got misapplied to everything: productivity tools, e-commerce, B2B SaaS. For these products, opening the app is not the product. Doing something is.
Metric adoption without context. A classic error.
Section 2: How MAU Lies
Let me be specific about the lies MAU tells.
Definition Gaming:
We gamed our own definition without even realizing it.
Initially, "active" meant "logged in." That produced some number; call it X.
Then we changed to "opened the app" (which includes users who didn't need to log in because they were already authenticated). That produced 1.5X.
Then we included users who received a push notification and didn't open the app (because our analytics SDK fired on notification receipt). That produced 2X.
None of this was fraud — just incremental loosening that made the number grow. The incentives always push toward inflation.
Cohort Hiding:
MAU doesn't tell you if the same users are coming back month after month, or if you're churning users and replacing them with new ones.
A "flat" MAU could mean: stable, healthy product with consistent users.
Or it could mean: 50% churn per month, offset by 50% new user acquisition.
These are radically different states. MAU treats them identically.
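To see how those two states produce the same top-line number, here's a toy simulation. Every figure is invented for illustration.

```python
# Toy model: MAU stays perfectly flat while the user base turns over completely.
mau = 1_000_000
churn_rate = 0.50               # half the base leaves every month
new_users_per_month = 500_000   # acquisition exactly backfills the churn

for month in range(1, 7):
    retained = int(mau * (1 - churn_rate))
    mau = retained + new_users_per_month
    print(f"month {month}: MAU = {mau:,}")  # 1,000,000 every single month
```

Six months in, the dashboard line is flat and reassuring, yet almost nobody from month one is still around.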
Engagement Invisibility:
A user who uses the product 30 times per month counts the same as a user who used it once.
Power users and churning tourists are equivalent in MAU. This is insane. Your power users are 100x more valuable. MAU pretends they're the same.
MAU Is a Ceiling, Not a Floor:
What MAU actually tells you: "This is the upper bound of people who have touched our product recently."
It doesn't tell you: how many are engaged, how many will return, how many would pay, how many love the product.
It's a ceiling measurement presented as if it's a floor. Misleading by framing.
Section 3: Metrics That Actually Matter
After abandoning MAU as our north star, we adopted metrics that actually reflect product health.
WAU/MAU Ratio (Stickiness):
This ratio tells you what percentage of monthly users are weekly users.
- Healthy products: > 40%
- Okay products: 25-40%
- Dying products: < 25%
Our ratio was 22%. People were trying the product monthly but not returning weekly. Classic sign of a leaky bucket.
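Here's one common way to compute it, averaging the four weekly counts across the month; some teams use a single point-in-time WAU instead, and the activity-log shape below is an assumption.

```python
from datetime import date, timedelta

def stickiness(activity, month_start):
    # WAU/MAU: average weekly actives across four weeks, divided by monthly
    # actives. `activity` maps user_id -> list of active dates (assumed shape).
    month_end = month_start + timedelta(days=28)
    mau = {u for u, ds in activity.items()
           if any(month_start <= d < month_end for d in ds)}
    if not mau:
        return 0.0
    weekly = []
    for w in range(4):
        ws = month_start + timedelta(days=7 * w)
        we = ws + timedelta(days=7)
        weekly.append(len({u for u, ds in activity.items()
                           if any(ws <= d < we for d in ds)}))
    return (sum(weekly) / len(weekly)) / len(mau)

# One weekly regular plus three one-time tourists -> 44% stickiness.
activity = {
    "regular":  [date(2024, 3, 3), date(2024, 3, 10),
                 date(2024, 3, 17), date(2024, 3, 24)],
    "tourist1": [date(2024, 3, 4)],
    "tourist2": [date(2024, 3, 12)],
    "tourist3": [date(2024, 3, 20)],
}
print(f"{stickiness(activity, date(2024, 3, 1)):.0%}")
```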
D7 and D28 Retention:
What percentage of new users are still active on Day 7? On Day 28?
- D7: Did they survive the first week?
- D28: Did they become a habit?
Our D28 was 4%. Of users who signed up, only 4% were still using the product a month later. The other 96% churned.
MAU hid this because it showed the total of everyone who used the product in a month — including the massive churn.
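A minimal sketch of the computation, assuming signup dates and active dates keyed by user. Note it uses the strict "active on exactly day N" definition; plenty of teams count a window around day N instead.

```python
from datetime import date, timedelta

def dn_retention(signups, activity, n):
    # Share of a signup cohort active exactly N days after signing up.
    # `signups`: user_id -> signup date; `activity`: user_id -> active dates.
    # Both shapes are assumptions about how your data is stored.
    if not signups:
        return 0.0
    retained = sum(
        1 for u, signed_up in signups.items()
        if signed_up + timedelta(days=n) in set(activity.get(u, []))
    )
    return retained / len(signups)

signups = {"u1": date(2024, 3, 1), "u2": date(2024, 3, 1)}
activity = {"u1": [date(2024, 3, 8), date(2024, 3, 29)],
            "u2": [date(2024, 3, 2)]}
print(f"D7:  {dn_retention(signups, activity, 7):.0%}")   # u1 only -> 50%
print(f"D28: {dn_retention(signups, activity, 28):.0%}")  # u1 only -> 50%
```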
Core Action Adoption:
What percentage of users complete the core workflow your product is built for?
If your product is for task management, how many users create tasks? If it's for collaboration, how many users share documents? If it's for analytics, how many users view reports?
This is the most important metric. It measures if people are actually using the product for its intended purpose.
Our core action adoption was 2.5%. 97.5% of "users" never did the thing the product was built for.
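As a sketch, with invented event names standing in for whatever your core workflow actually fires:

```python
# Invented event names; substitute your product's real core-workflow events.
CORE_EVENTS = {"task_created", "task_completed"}

def core_action_adoption(events):
    # `events` is a list of (user_id, event_name) pairs -- an assumed shape.
    all_users = {u for u, _ in events}
    core_users = {u for u, e in events if e in CORE_EVENTS}
    return len(core_users) / len(all_users) if all_users else 0.0

events = [("u1", "app_open"), ("u2", "app_open"), ("u2", "task_created")]
print(f"{core_action_adoption(events):.0%}")  # 50%: u2 did the thing, u1 didn't
```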
The Principle:
If your product exists to do X, measure how many users do X. Not how many users exist.
Section 4: How We Fixed Our Metrics
Here's how we transitioned from MAU to meaningful metrics.
Step 1: Define "Engaged User":
We got specific. An "engaged user" in our product is someone who completes at least one core workflow per week.
Not opens the app. Not logs in. Actually uses the product for its purpose.
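In code, the definition becomes a filter on core events rather than app opens. A minimal sketch with assumed event names and log shape; the same function produces the Weekly Engaged Users count we switched to in the next step.

```python
from datetime import date, timedelta

CORE_EVENTS = {"task_created", "task_completed"}  # assumed core-workflow events

def weekly_engaged_users(events, week_start):
    # WEU: unique users with at least one core-workflow event in the week.
    # `events` is a list of (user_id, event_name, date) tuples (assumed shape).
    week_end = week_start + timedelta(days=7)
    return {u for u, e, d in events
            if e in CORE_EVENTS and week_start <= d < week_end}

events = [
    ("u1", "app_open", date(2024, 3, 4)),        # opened, did nothing: not engaged
    ("u2", "task_completed", date(2024, 3, 5)),  # engaged
]
print(len(weekly_engaged_users(events, date(2024, 3, 4))))  # 1
```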
Step 2: Switch Primary Metric:
We replaced MAU with WEU (Weekly Engaged Users).
The dashboard went from 2 million to 50,000. Overnight.
It was painful. Board decks looked worse. Investor updates were awkward. But honest.
Step 3: Instrument Granularly:
We added tracking for:
- Each step of the core workflow
- Time spent in meaningful activities (not just "in app")
- Feature adoption (who uses what?)
- Cohort retention (how do specific cohorts behave over time?)
This gave us the data to actually understand user behavior, not just count heads.
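The shape of that instrumentation looks roughly like this; `track` just prints here, standing in for whatever your analytics pipeline actually receives.

```python
import json
import time

def track(user_id, event, **properties):
    # One granular event per meaningful step, with properties for analysis.
    record = {
        "user_id": user_id,
        "event": event,
        "ts": time.time(),
        "properties": properties,
    }
    print(json.dumps(record))  # stand-in for sending to the pipeline

# Every step of the core workflow fires its own event, not just "app_open":
track("u42", "workflow_started", workflow="task_review")
track("u42", "workflow_step_completed", workflow="task_review", step=1)
track("u42", "workflow_completed", workflow="task_review", duration_s=34.2)
```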
Step 4: Align Incentives:
We tied team goals and bonuses to WEU and retention, not MAU.
Product decisions were evaluated by: "Will this improve weekly engagement?"
Growth tactics were judged by: "Will these users retain, or just inflate the top of funnel?"
Results:
Within 6 months of switching metrics, product decisions improved dramatically. We stopped building features that inflated MAU but didn't drive engagement. We focused on retention over acquisition.
WEU grew from 50k to 85k. MAU actually declined (we acquired fewer low-quality users). But the business got healthier.
Conclusion
MAU is a vanity metric. It tells you how many people touched your product. It doesn't tell you how many care.
For most products, MAU is actively misleading. It hides churn, inflates importance of tourists, and obscures engagement patterns.
If you're still using MAU as your north star, you're navigating with a broken compass.
Switch to engagement metrics. Define what "real use" means. Measure that. It will be painful at first. The number will be smaller. But it will be true.
Measure engagement, not existence.
Written by XQA Team