Product analytics for startups: what to measure from day 1

15/04/2026
campo de trigo

Product analytics isn't about accumulating data, but about asking better questions. Google Analytics gives you a good foundation for acquisition and traffic, but tools like PostHog are specifically designed to understand what happens within the product: where users drop off, what they find valuable, and why they return (or don't). This article is your minimum viable starting point for measuring what matters from day one.

You're about to launch your product, and it's time to install an analytics tool. The default answer is almost always the same: Google Analytics. Everyone does it, it's free, all you need is a script, and you'll have data.

GA is a very powerful tool. It's primarily focused on understanding web traffic and marketing campaigns—where visits come from, which pages they view, which campaigns convert—but the current version (GA4) also incorporates features that allow for product analysis: custom events, funnels, and retention.

But if you're building a product—a SaaS, an app, a tool—the questions that matter are of a different nature:

  • Do my users understand how the product works?
  • How many reach the point where it adds value for them?
  • Why do those who leave, leave?
  • Which features are used and which are not?

Google Analytics can answer these questions to a certain extent, but its UX and data model are designed around the browsing session, not around the user and their actions within the product. That's why there's a category of specific tools called product analytics: designed from the ground up to focus on the user and their behavior. In practice, it's common for both to coexist: Google Analytics for marketing and acquisition, and a separate product analytics tool for everything that happens once the user enters the product.

Before installing anything, the right question is different:

What decisions do you want to make with your data?

This article starts from there: defining what you need to know and how a tool like PostHog helps you answer it from day one.

What you should be able to answer from day 1

The goal of product analytics is not to collect data, but to answer three questions. If you can't answer them, you're not measuring what matters.

1. Do users reach the moment of value?

Every product has a specific moment when the user thinks, "This is useful to me." In the industry, this is called activation, and there are famous examples:

  • Slack: a team that has sent 2,000 messages
  • Dropbox: a file placed in the Dropbox folder on at least one device
  • Facebook (in its early days): 7 friends added in the first 10 days

For your product, you need a concrete answer to this question before implementing anything. If you don't have it, it's not an analytics problem—it's a product problem that you need to solve first.

Once you've defined it, you want to know: what percentage of users who sign up actually get there? How long does it take them? Where do they drop off along the way?

2. What do they actually do inside the product?

What you imagine your users do and what they actually do are two different things. Almost always.

You want to know which features are being used, which ones are going unnoticed, and where there's friction. This information allows you to prioritize better: stop polishing things no one uses and invest where real-world behavior demands it.

3. Are they coming back?

Acquiring users is just the starting point. What makes a product grow is whether they come back. You want to know what percentage returns after the first use, how often, and which first-day behaviors are correlated with higher retention. That correlation is gold: it tells you what you should optimize in onboarding.
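If you export your events as flat rows, that return rate is a short script away. The sketch below assumes each event looks like `{ userId, ts }` with a millisecond timestamp; adapt the field names to whatever your export actually produces. A second visit at least one full day after the first counts as a return.

```javascript
// Sketch: share of users who come back within N days of their first event.
// Assumes events of the form { userId, ts } (ts in milliseconds).
const DAY = 24 * 60 * 60 * 1000;

function returnRate(events, windowDays = 7) {
  const firstSeen = new Map(); // userId -> timestamp of first event
  const returned = new Set();  // users with a qualifying second visit
  const sorted = [...events].sort((a, b) => a.ts - b.ts);
  for (const { userId, ts } of sorted) {
    if (!firstSeen.has(userId)) {
      firstSeen.set(userId, ts);
    } else {
      const delta = ts - firstSeen.get(userId);
      // A revisit counts only if it happens at least a day later,
      // and inside the retention window.
      if (delta >= DAY && delta <= windowDays * DAY) returned.add(userId);
    }
  }
  return firstSeen.size ? returned.size / firstSeen.size : 0;
}
```

Run the same function per signup cohort and per first-day behavior (did they invite a teammate? did they finish onboarding?) and the correlations mentioned above start to fall out of a simple comparison of rates.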

If you can answer these three questions, you're already making better decisions than most startups we know.

Why PostHog

There are several product analytics tools on the market, and the choice depends on each team's context. PostHog is one we recommend for early-stage startups due to the following advantages:

Everything in one tool. Events, funnels, retention, session replay, feature flags, and A/B testing all in one product. For a small team, not having to integrate and pay for three or four different tools is a real operational advantage.

EU Cloud available. For European startups, this is more important than it might seem. You can choose to have your data processed and stored on European infrastructure, which significantly simplifies communication with your Data Protection Officer (DPO) and your privacy policy.

Usable free tier. The free plan covers realistic volumes for a startup in the early traction phase — around one million events per month at the time of writing (check the current figures on the pricing page before counting on it). You can get far before making any paid decisions.

What to configure from day one

Everyone starts with the temptation to instrument everything. Six months later, nobody remembers what an event with a cryptic name like "button_click_3" means, and the dashboard is terrifying. The rule of thumb is the opposite: instrument little, instrument well, and expand when you have questions you truly can't answer.
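One cheap way to keep "instrument little, instrument well" honest is to reject badly named events at capture time. The convention below (snake_case, object first, verb last) is our assumption, not a PostHog requirement: the specific rule matters less than writing one down and enforcing it.

```javascript
// Sketch: enforce an object_action naming convention so events like
// "button_click_3" never reach the dashboard in the first place.
const EVENT_NAME = /^[a-z]+(_[a-z]+)+$/; // e.g. signup_completed

function safeCapture(client, name, props = {}) {
  if (!EVENT_NAME.test(name)) {
    throw new Error(`Event "${name}" breaks the naming convention`);
  }
  // `client` is anything with a capture(name, props) method,
  // such as the posthog-js singleton.
  client.capture(name, props);
}
```

Wrap your analytics client once, call `safeCapture` everywhere, and the convention enforces itself.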

These are the four fundamentals you need to have from day one.

Correct user identification

This is the point where most implementations fall apart, and the team doesn't notice for months — until the funnels start producing numbers that don't add up.

PostHog—and any serious product analytics tool—works with two types of identifiers: anonymous (generated in the browser the first time someone visits your site) and identified (your internal user ID, which you assign when the user registers or logs in). The critical moment is signup: right after creating the user, you have to call the tool's identification function to link the anonymous session with the definitive user ID. If you don't do this correctly, the anonymous visitor's events are orphaned and not linked to the registered user. The result: the "landing → signup → onboarding" funnel breaks down because, before and after signup, the tool considers them two different users.

Conversely, when logging out, you have to reset the session to avoid mixing data from different users on shared devices — think of an office computer or a mobile phone that is lent to someone.

If you have a mobile app and a website, consider from the outset how you'll identify the same user on both platforms (this usually involves passing the user ID from the backend upon login). It's a half-hour conversation with your team that will save you weeks of data cleanup later on.
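In posthog-js the real calls are `posthog.identify(userId)` right after signup and `posthog.reset()` on logout. The in-memory toy below is not PostHog; it only illustrates why the linking step matters and what breaks without it.

```javascript
// Toy model of anonymous -> identified stitching. Real code would call
// posthog.identify(userId) after signup and posthog.reset() on logout.
class AnalyticsStore {
  constructor() {
    this.currentId = `anon-${Math.random().toString(36).slice(2)}`;
    this.alias = new Map();   // anonymous id -> definitive user id
    this.events = [];
  }
  capture(name) {
    this.events.push({ distinctId: this.currentId, name });
  }
  identify(userId) {
    // Link everything captured under the anonymous id to the real user.
    this.alias.set(this.currentId, userId);
    this.currentId = userId;
  }
  reset() {
    // Fresh anonymous id so the next person on this device isn't mixed in.
    this.currentId = `anon-${Math.random().toString(36).slice(2)}`;
  }
  eventsFor(userId) {
    return this.events.filter(
      (e) => e.distinctId === userId || this.alias.get(e.distinctId) === userId
    );
  }
}
```

Skip the `identify` call and `eventsFor` would return only the post-signup events: exactly the broken "landing → signup → onboarding" funnel described above.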

A single, well-defined funnel

Resist the temptation to set up ten funnels on the first day. Start with one: the path from the first visit to the moment of value.

For a typical SaaS, it would look something like this: visit to the landing page → start of signup → signup completed → onboarding completed → first valuable action. The key is the last step: that "first valuable action" isn't a generic event; it's the specific action you defined as the value moment before starting.

This funnel will give you the most important metric you'll measure in the first few months: the percentage of new users who reach value. If it's below 20-30%, you don't have an acquisition problem — you have a product problem. And no amount of marketing investment will solve that.
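For intuition, here is roughly what the tool computes for you. The sketch assumes you already know, per user, which steps were completed; a real funnel also enforces event ordering and a conversion window, which we omit here.

```javascript
// Sketch: step-by-step funnel conversion from per-user step sets.
// `usersSteps` maps userId -> Set of completed step names; the `steps`
// array is the funnel definition. Ordering/time windows are omitted.
function funnelConversion(usersSteps, steps) {
  let remaining = [...usersSteps.values()];
  return steps.map((step) => {
    remaining = remaining.filter((done) => done.has(step));
    return {
      step,
      users: remaining.length,
      pctOfTop: remaining.length / usersSteps.size, // share of all entrants
    };
  });
}
```

The `pctOfTop` of the last step is the number this whole section is about: the percentage of new users who reach value.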

Session replay, with precautions not mentioned in the tutorials

Session replay is powerful, especially for detecting onboarding friction that no funnel will show you. But it has two real costs that you need to be aware of before activating it in production.

Privacy. By default, the replay can capture the content of inputs. This includes passwords, card details, and private messages. You must enable input masking in the settings and explicitly mark sensitive elements to prevent recording (all session replay tools have mechanisms for this). If you process medical, financial, or minors' data, consulting with your compliance officer is mandatory.
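In posthog-js, masking is set in the init config. The option names below match the PostHog docs at the time of writing, but confirm them before shipping; the API key and EU host are placeholders you'd replace with your own.

```javascript
// Hedged sketch of posthog-js replay privacy settings; verify option
// names against the current PostHog documentation before relying on them.
posthog.init("<your-project-api-key>", {
  api_host: "https://eu.i.posthog.com", // EU Cloud endpoint
  session_recording: {
    maskAllInputs: true, // never record what users type into inputs
  },
});

// For sensitive non-input elements (chat bubbles, document previews),
// PostHog skips anything carrying the ph-no-capture class:
//   <div class="ph-no-capture">private content</div>
```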

Cost. Replays generate a lot of data. Recording 100% of sessions quickly increases the cost. A sampling rate of 10-25% is sufficient for most cases, and you can temporarily increase it when investigating a specific problem.
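PostHog lets you configure replay sampling from the project settings, but if you prefer to gate it in code (for example, initializing with `disable_session_recording: true` and calling `posthog.startSessionRecording()` yourself, both of which posthog-js supports), a deterministic hash keeps the decision stable per user. This is a generic technique sketch, not PostHog API.

```javascript
// Deterministic sampling: the same user is always in or out of the
// sample, so an investigation spanning several sessions doesn't lose
// people between visits.
function inSample(userId, rate) {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit hash
  }
  return (hash % 100) / 100 < rate;
}

// Usage sketch, assuming recording was disabled at init:
// if (inSample(currentUserId, 0.25)) posthog.startSessionRecording();
```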

Minimum segmentation

You don't need to set up a complex audience system. But you do need to capture three properties from the first event:

  • Source channel (UTM parameters captured on the first touch and persisted to the user, not just the session).
  • Account type (free, trial, paid — updated when it changes).
  • Signup cohort (month of signup, at minimum).
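The tricky one is first-touch persistence: a returning visitor must keep their original channel, not the latest one. In PostHog you'd typically send these as person properties via `$set_once` so later visits can't overwrite them. The sketch below takes a `storage` object standing in for `localStorage`, purely so the logic is testable anywhere.

```javascript
// Sketch: capture UTM parameters on the very first touch and persist
// them. `storage` is any object with string properties (localStorage
// in a browser); the function is idempotent after the first call.
function firstTouchUtms(url, storage) {
  if (storage.utms) return JSON.parse(storage.utms); // already captured
  const params = new URL(url).searchParams;
  const utms = {};
  for (const key of ["utm_source", "utm_medium", "utm_campaign"]) {
    if (params.has(key)) utms[key] = params.get(key);
  }
  storage.utms = JSON.stringify(utms); // persist the first touch
  return utms;
}
```

Call it on every page load; only the first visit with UTMs ever writes anything, which is exactly the "persisted to the user, not just the session" behavior the list above asks for.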

With just these three attributes, you can answer 80% of the questions you'll ask yourself in the first six months: Is retention worse for users coming from paid social? Do free plan users complete onboarding at the same rate as trial users? Does the product work better for this month's cohort than it did for the one from three months ago?

The rest of the segmentation can wait until you have questions that these three don't answer.

Common mistakes

We see these mistakes repeatedly in startups that ask us for help with their analytics:

Measuring too many things from the start. Fifty events tracked in the first month is a symptom, not an achievement. It means you haven't decided which questions matter. Six months later, no one remembers what each event measures, and everyone avoids adding new ones for fear of breaking something. Start with a few well-defined events.

Not having a defined moment of value. "I want to see how the product is used" isn't a question. "What percentage of users create their first project within the first 7 days?" is. If you can't frame it this way, implementation is premature.

Relying solely on dashboards. Numbers tell you what's happening, not why. A funnel that fails during onboarding tells you where, but you need to watch three session replays to understand that users can't find a button. Always combine quantitative and qualitative data.

Ignoring the data. Measuring and changing nothing is worse than not measuring at all: it creates a sense of control without delivering results. If your activation rate has been stuck at 18% for three months and you haven't touched it, analytics isn't your problem — prioritization is.

Treating instrumentation debt as invisible. Every misnamed event, every inconsistent property, every funnel that doesn't add up because someone changed the flow and didn't notify you—that's technical debt, just like a broken test. Include it in your planning; don't let it drag on until no one trusts the numbers anymore.

Conclusion

At first, it's easy to think the problem is having more data. It almost never is. The problem is asking better questions—and, above all, acting on the answers.

Analytics that doesn't change decisions is overhead in disguise: it creates a sense of control without delivering results. When you implement something new, the first question shouldn't be "how do I measure it" but "what will I do differently when I have the answer?" If you don't have a clear answer to the second question, it's not worth answering the first.