
The new App Store Connect Analytics, explained: 100+ metrics, cohorts, and what indie devs should actually track

Apple's biggest App Store Connect Analytics update since launch adds 100+ metrics, cohorts, and peer benchmarks. Here's what to use and what to ignore.

Carlton Aikins · 8 min read

If you've logged into App Store Connect in the last month, you've probably noticed that the Analytics tab looks nothing like it did in February. Apple rolled out the biggest Analytics overhaul since the feature launched — more than 100 new metrics, cohort analysis, peer group benchmarks, and the beginning of the end for Sales and Trends, which is being folded into Analytics through 2027.

For most indie developers, Analytics used to be a courtesy feature: you'd glance at downloads, check the conversion rate, and close the tab. That's no longer a reasonable posture. The new Analytics is closer in spirit to a product analytics tool like Amplitude or Mixpanel than to the sparse dashboards of 2023, and if you're not using it, you're leaving roadmap decisions on the table.

This guide walks through what actually changed, which of the new metrics are worth your attention, how to use cohorts without getting lost in the weeds, and how to build a weekly review habit that takes fifteen minutes and actually tells you something useful.

What actually changed

Apple announced the update at the end of March in the April Hello Developer bulletin. The short version:

The new Analytics supports up to seven filters at once, which means you can finally answer questions like "how did paid users from Germany who downloaded through Search Ads in the last 30 days convert compared to organic users from the same region?" without exporting to a spreadsheet. The new metrics cover in-app purchase and subscription performance in a way that previously required digging through Sales and Trends reports that nobody loved reading. And there's a new App Store Analytics Guide in App Store Connect Help that's actually worth reading.

The Sales and Trends migration starts in mid-2026 with the deprecation of the Subscriptions dashboards, then continues through 2027 as the remaining dashboards and reports move into Analytics. If you've built any internal tooling that scrapes Sales and Trends CSVs, note that down now — it's a long fuse but the fuse is lit.

The metrics that actually matter

The "100+ new metrics" number is impressive but also misleading. Most of the new metrics are slight variations on ones you already know, or very specific slices that matter to large publishers but not to someone with a one-person app. Here's what's actually new and worth tracking if you run an indie iOS business.

Proceeds per download. This is the single most useful number Apple has ever added to Analytics. It's exactly what it sounds like: for every download of your app, how much money do you ultimately make after Apple's cut? This lets you reason about acquisition cost ceilings without building a data pipeline. If your proceeds per download is $0.80, you know any paid acquisition channel with a CPI above $0.80 is losing money on day one and needs to win on retention.
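The arithmetic above is worth making concrete. A minimal sketch, with illustrative numbers rather than figures from any real app:

```python
# Back-of-the-envelope check from the text: a paid channel only makes
# sense on day one if its cost per install (CPI) stays below your
# proceeds per download. All numbers here are made up for illustration.

def breakeven_cpi(proceeds: float, downloads: int) -> float:
    """Proceeds per download doubles as the day-one CPI ceiling."""
    return proceeds / downloads

# Example: $1,600 in proceeds across 2,000 downloads.
ceiling = breakeven_cpi(1600.0, 2000)
print(f"Proceeds per download: ${ceiling:.2f}")  # $0.80

# Any channel with CPI above the ceiling loses money on day one and
# has to make it back through retention and renewals.
for cpi in (0.50, 0.80, 1.20):
    verdict = "profitable day one" if cpi < ceiling else "needs retention to win"
    print(f"CPI ${cpi:.2f}: {verdict}")
```

The point isn't the trivial division; it's that Apple now does it for you continuously, so the ceiling is a number you can check weekly instead of recompute quarterly.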

Download-to-paid conversion. Also new as a first-class metric. Previously you could compute this yourself from Sales and Trends; now it's sitting in the dashboard with peer benchmarks next to it. If your number is below your peer group, that's a pricing, onboarding, or paywall problem. If it's above, you have room to raise prices or lower your paywall aggressiveness.

In-app event exposure and conversion. If you're using in-app events (the App Store feature that lets you promote timed events inside your listing), the new Analytics finally gives you conversion data on them. If you've been running events on vibes, you can now actually see whether they move the needle.

Subscription offer performance. The new subscription data lets you see, per offer, how many users took it, how many converted to paid, and how many churned. This used to require Sales and Trends plus a spreadsheet plus coffee. Now it's one screen.

The metrics I'd quietly ignore unless you have a specific reason:

  • Territory-level slices below your top five markets — you'll spend more time interpreting noise than signal.
  • Session-length metrics — they're there, but App Store Connect has no view into what happens inside your app, so these are proxies at best.
  • Engagement metrics that overlap with what you already track in a first-party analytics SDK.

Cohorts, without getting lost

The new cohort feature is the headline capability, and also the one most likely to make an indie dev stare at a chart for an hour and close the tab. A few rules that keep it useful.

A cohort is just a group of users who share an attribute. Apple lets you define them by download date, download source (Search, Browse, Search Ads, referrer, etc.), offer start date, region, and device class. The point of a cohort isn't to look at it — it's to compare it to another cohort.

So the question isn't "how are my April downloads performing?" The question is "how are my April downloads performing compared to March, and what changed?" The first question gives you a chart. The second gives you a direction.

Concrete examples:

  • Compare users who downloaded through Search Ads against users who downloaded organically in the same week. Search Ads users almost always have lower retention and lower proceeds per download. If the gap is small, you have a good ads setup. If the gap is huge, your ads creatives are attracting the wrong people.
  • Compare users who accepted your introductory offer against users who didn't. If offer-takers have higher long-term retention, lean into the offer. If they churn as soon as the offer ends, the offer is buying loud signups that don't last.
  • Compare users from a new region against users from your established markets after you've localized. This is the cohort that actually tells you whether the localization effort paid off.
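The comparison habit behind all three examples is the same, and it's small enough to sketch. The retention figures below are invented for illustration:

```python
# A minimal sketch of the cohort habit described above: two cohorts,
# one metric, and the question is the gap, not the absolute number.
# Figures are illustrative, not from any real app.

def compare(label_a: str, a: float, label_b: str, b: float) -> float:
    """Relative gap of cohort B versus cohort A on one metric."""
    gap = (b - a) / a
    print(f"{label_b} vs {label_a}: {gap:+.0%}")
    return gap

# Day-30 retention: organic vs Search Ads downloads from the same week.
gap = compare("organic", 0.22, "Search Ads", 0.15)

# Rule of thumb from the text: a small gap means the ads setup is fine;
# a large one means the creatives are attracting the wrong people.
print("ads audience looks healthy" if abs(gap) < 0.15 else "review ad creatives")
```

Pick one comparison, keep it open for a few weeks, and retire it when it stops answering a live question.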

Peer group benchmarks: useful, with an asterisk

Apple added peer group benchmarks for two new monetization metrics: download-to-paid conversion and proceeds per download. You'll see how your numbers compare against other apps in your category of similar size.

This is genuinely useful — before, you had no idea if your 3% conversion rate was good, bad, or ordinary. Now you can see whether you're above or below the median for productivity apps with your download volume.

The asterisk: the peer group is defined by category and size, not by business model. If you're a subscription-only productivity app, you're benchmarked against a peer group that includes one-time-purchase productivity apps, which have wildly different conversion curves. Treat the benchmark as a directional signal, not a target. "I'm below the median" means "look at this," not "I'm failing."

Plan for the Sales and Trends migration

Sales and Trends isn't disappearing tomorrow. But if you've built anything on top of it — a revenue tracker, a cohort tool, a daily digest — you need a plan.

The Subscriptions dashboards in Sales and Trends are the first to be deprecated, starting mid-2026. The rest follow through 2027. Apple has not, as of this writing, committed to a stable replacement API for the CSV-style exports that Sales and Trends has always offered, so if your tooling relies on those, monitor the App Store Connect API release notes closely over the summer.

Practical prep steps:

  1. Audit anything internal that reads Sales and Trends data. List it.
  2. For each, identify whether the underlying data is now available in Analytics. Most of it is.
  3. Where Analytics covers it, migrate. Where it doesn't yet, flag it and set a calendar reminder for July 2026 to check the API situation.
  4. If you have a commercial tool (RevenueCat, Appfigures, etc.) sitting between you and Sales and Trends, your vendor is almost certainly ahead of you on this. Check their changelog.
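If your audit in step 1 turns up custom tooling, it most likely talks to the existing App Store Connect API `salesReports` endpoint. A hedged sketch of what that fetch looks like today, so you know what you'd be replacing — it assumes an API key from Users and Access, plus `pip install pyjwt[crypto] requests`; the key, issuer, and vendor values are placeholders you supply yourself:

```python
# Sketch: download one gzipped TSV sales report via the App Store
# Connect API (GET /v1/salesReports). Anything built on this is what
# the 2026-2027 migration puts at risk.

import gzip
import time

def report_params(vendor_number: str, report_date: str) -> dict:
    """Query parameters for a daily sales summary report."""
    return {
        "filter[frequency]": "DAILY",
        "filter[reportType]": "SALES",
        "filter[reportSubType]": "SUMMARY",
        "filter[vendorNumber]": vendor_number,
        "filter[reportDate]": report_date,  # YYYY-MM-DD
    }

def fetch_sales_report(issuer_id, key_id, private_key, vendor, date):
    """Fetch and decompress one report; returns tab-separated text."""
    import jwt       # pyjwt, third-party
    import requests  # third-party
    token = jwt.encode(
        {"iss": issuer_id, "aud": "appstoreconnect-v1",
         "exp": int(time.time()) + 600},
        private_key, algorithm="ES256", headers={"kid": key_id},
    )
    resp = requests.get(
        "https://api.appstoreconnect.apple.com/v1/salesReports",
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/a-gzip"},
        params=report_params(vendor, date),
    )
    resp.raise_for_status()
    return gzip.decompress(resp.content).decode("utf-8")
```

If this shape looks like something in your codebase, that's the code to flag in step 3.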

A 15-minute weekly analytics review

The new Analytics is powerful enough that you can waste an entire afternoon in it. Don't. Here's a weekly review that works:

  1. Impressions and product page views. Up or down vs. last week. If down, check whether you shipped a metadata change recently. If up without an obvious reason, check the featured lists in your category.
  2. Download-to-paid conversion. Glance at the number, glance at the peer benchmark. If it moved more than 10% week-over-week, something changed — a pricing tweak, a new onboarding, a new paywall. Figure out which.
  3. Proceeds per download. Same pattern. If this moves meaningfully, something structural changed (price, offer mix, refund rate).
  4. Your one active cohort comparison. Whatever question you're currently answering — look at it once, write down what you see.
  5. Top rejection and crash alerts from the side panel. If there's a red flag, triage. If not, close the tab.
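Steps 2 and 3 are the same check wearing two hats: did a headline metric move more than 10% week-over-week? If you jot the two numbers down each week, the check is a few lines. The figures and threshold below are illustrative; Analytics has no alerting API for this, so this assumes your own weekly notes:

```python
# The week-over-week check from steps 2-3 of the review, as a tiny
# script. Metric values are made up for illustration.

THRESHOLD = 0.10  # flag anything that moved more than 10% week-over-week

def flag_moves(last_week: dict, this_week: dict, threshold: float = THRESHOLD) -> dict:
    """Return metrics whose relative change exceeds the threshold."""
    flags = {}
    for name, prev in last_week.items():
        delta = (this_week[name] - prev) / prev
        if abs(delta) > threshold:
            flags[name] = delta
    return flags

last = {"download_to_paid": 0.031, "proceeds_per_download": 0.80}
now = {"download_to_paid": 0.027, "proceeds_per_download": 0.81}

for metric, delta in flag_moves(last, now).items():
    print(f"{metric} moved {delta:+.0%} — figure out what changed")
```

Anything the function flags earns five of your fifteen minutes; anything it doesn't, you skip.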

Fifteen minutes. Not more. Analytics is a tool for informing decisions you'd make anyway, not a place to spend your evening.

Closing it out

The overhauled Analytics is a genuine upgrade, and for once it's an upgrade that indie devs benefit from as much as large publishers do. The new metrics are the ones a one-person business actually needs — proceeds per download, download-to-paid conversion, offer performance — and the peer benchmarks finally give you a reference point that isn't "I guess that's a normal number?"

The danger is spending more time in the dashboard than in your product. The metrics only matter when they change a decision you're about to make. If you build the habit of checking once a week, acting on what moved, and otherwise leaving the tab closed, this update pays for itself quickly.

On the operational side, Stora pipes Analytics metrics into your release workflow so they're already in front of you when you're about to ship an update — proceeds per download, conversion rate, the peer benchmark. If something moved since your last release, you see it before you hit publish, not three weeks later when you open the dashboard and wonder why Tuesday looked weird. The tooling is there. Whether you use it is the question.