Know which features are landing and which are being ignored, every Monday.
Every Monday, Doe pulls WAU for every tracked feature from Amplitude, computes week-over-week adoption velocity, and flags any feature where engagement is dropping so your PM sees the decline before it hits retention.
Doe queries Amplitude for feature-level usage, ranks features by adoption velocity, flags declining engagement, and delivers a Google Sheets health report every Monday with WAU, trends, and adoption rates.
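Under the hood, the velocity-and-flag logic is simple percent-change math. A minimal Python sketch, assuming the Amplitude pull has already landed in plain dicts (feature → WAU); the 10% threshold and the feature figures are illustrative, not Doe's actual configuration:

```python
DECLINE_THRESHOLD = -0.10  # flag features losing more than 10% WAU week over week

def adoption_velocity(wau_this_week: int, wau_last_week: int) -> float:
    """Week-over-week change in WAU, as a fraction (0.12 == +12%)."""
    if wau_last_week == 0:
        return 0.0
    return (wau_this_week - wau_last_week) / wau_last_week

def flag_declining(this_week: dict, last_week: dict) -> list[str]:
    """Return the features whose WAU dropped past the threshold."""
    return [
        feature
        for feature, wau in this_week.items()
        if adoption_velocity(wau, last_week.get(feature, 0)) < DECLINE_THRESHOLD
    ]

this_week = {"new_search": 8400, "collaboration": 1240, "bulk_export": 89}
last_week = {"new_search": 7500, "collaboration": 1460, "bulk_export": 90}

print(flag_declining(this_week, last_week))  # ['collaboration']
```

The threshold is the only tunable here; a noisier product might want a two-week moving window before flagging.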
What changes
| Dimension | Before | With Doe |
|---|---|---|
| Feature coverage | Whoever remembers to check their dashboard | All 12 features reviewed every week automatically |
| Trend detection speed | Weeks or months, found in quarterly reviews | Declining features flagged within one week |
| Cross-feature comparison | No single view comparing features | All features ranked by adoption velocity in one report |
| Investment signal quality | Executive opinions and anecdotes | WAU, trend direction, and segment adoption rates |
How Doe tracks feature adoption
1. **Query.** 12 tracked features returned. Collaboration feature: 1,240 WAU, down from 3,100 at launch (60% decline over 4 weeks). New search: 8,400 WAU, up 12% WoW. Bulk export: 89 WAU, niche but stable. 3 features sit below 50 WAU.
2. **Rank and flag.** All 12 features ranked. Top 3: search (accelerating), notifications v2 (steady at 6,200 WAU), API access (growing 8% WoW in the developer segment). Bottom 3: collaboration (declining 15% WoW), inline comments (42 WAU, never gained traction), smart folders (launched 2 weeks ago, below the baseline for new features at this stage). Collaboration flagged for PM review with its decline curve.
3. **Report.** Spreadsheet updated: 12 features with current WAU, trend direction, adoption rate in the target segment, and time-to-first-use for new features. Shared with the product team.
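The ranking step above can be sketched in a few lines. Feature names and WAU figures echo the example (any not named there are hypothetical), and the report columns are illustrative rather than Doe's exact sheet layout:

```python
features = [
    # (name, current WAU, WoW velocity) -- illustrative values
    ("new_search", 8400, 0.12),
    ("notifications_v2", 6200, 0.00),
    ("api_access", 1900, 0.08),
    ("collaboration", 1240, -0.15),
    ("smart_folders", 310, 0.03),
    ("inline_comments", 42, -0.02),
]

# Rank by adoption velocity, fastest-growing first.
ranked = sorted(features, key=lambda f: f[2], reverse=True)

# Emit one report row per feature: name, WAU, trend, velocity.
for name, wau, velocity in ranked:
    trend = "up" if velocity > 0 else "down" if velocity < 0 else "flat"
    print(f"{name:<18} {wau:>6} WAU  {trend:<5} {velocity:+.0%}")
```

In the real workflow these rows would be written to the Google Sheet instead of printed.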
You have analytics dashboards for every feature. Nobody checks them consistently.
You launched a new collaboration feature 6 weeks ago. The PM checked the dashboard on launch day and once the following week. Four weeks later, weekly active usage dropped 60%. Nobody noticed. The PM finds out in the quarterly review when retention data shows the feature isn't sticky.
The product team ships 3-4 features per quarter. Each has an Amplitude dashboard. Nobody compares them. Feature A has 40% adoption in its target segment. Feature B has 8% but the team keeps investing because the executive sponsor loves it. No single view ranks features by actual usage, so investment decisions run on opinions.
Get started in under 10 minutes
Connect your tools
One-click OAuth for each integration. No API keys, no engineering.
Describe what you need
“Every Monday, pull feature usage data from Amplitude for all 12 tracked features. Rank by adoption velocity and flag any feature where weekly active users declined more than 10% week over week.”
It runs on schedule
Every Monday, the feature health report lands in Google Sheets with rankings, trends, and flags.
Feature Adoption Tracker FAQ
What is adoption velocity?
Adoption velocity is the week-over-week change in weekly active users for a given feature. A feature gaining 12% WAU each week is accelerating; a feature losing 15% WAU each week is declining. It is computed from raw Amplitude event data.
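The formula behind that answer is a one-line percent change. The WAU figures below are hypothetical, chosen to reproduce the 12% and 15% examples:

```python
def adoption_velocity(wau_now: int, wau_prev: int) -> float:
    """Week-over-week fractional change in weekly active users."""
    return (wau_now - wau_prev) / wau_prev

# A feature gaining 12% WAU each week is accelerating:
assert round(adoption_velocity(1120, 1000), 2) == 0.12
# A feature losing 15% WAU each week is declining:
assert round(adoption_velocity(850, 1000), 2) == -0.15
```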
Stop doing the work your tools should do for you.
Set it up once. Doe runs it every time.