Analytics & Tracking Glossary

Clear definitions for the terms product, engineering, and data teams use when discussing analytics events, tracking plans, and data quality.


A

Analytics Event

A discrete, timestamped record that something meaningful happened in your product - a page viewed, a button clicked, a purchase completed. Events are the fundamental unit of product analytics: every funnel, cohort analysis, and retention metric is built from event data. Each event has a name and a set of properties that describe the context in which it occurred.
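As a concrete sketch, an event is just a named, timestamped record plus context. The shape and field names below are illustrative, not a standard format:

```typescript
// A minimal shape for an analytics event (field names are illustrative).
interface AnalyticsEvent {
  name: string;                          // e.g. "Purchase Completed"
  timestamp: string;                     // ISO-8601 time the event occurred
  properties: Record<string, unknown>;   // context describing the event
}

const purchaseCompleted: AnalyticsEvent = {
  name: "Purchase Completed",
  timestamp: new Date().toISOString(),
  properties: { amount_usd: 99.0, plan_tier: "pro" },
};
```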

Analytics Pipeline

The series of systems that move event data from your product to the tools where it is analysed: SDK or HTTP call, event collector (e.g. Segment, RudderStack), data warehouse (e.g. BigQuery, Snowflake), and BI tool or analytics platform. Each step in the pipeline is an opportunity for data to be dropped, transformed, or corrupted - which is why schema validation at the collection stage matters.

B

Behavioral Analytics

The analysis of how users interact with a product over time, based on sequences of events rather than aggregate counts. Behavioral analytics answers questions like “what actions do users take before churning?” or “which onboarding paths lead to activation?” - using the event stream to reconstruct and compare user journeys. It is closely related to product analytics but emphasizes sequences and patterns rather than single-event metrics.

C

Clickstream Data

A chronological record of every action a user takes in a digital product - clicks, page views, searches, scrolls - captured as a stream of events. Clickstream data is the raw material for understanding user journeys and product engagement. In practice, raw clickstream data is rarely used directly; it is filtered, aggregated, and enriched into the metrics and funnels teams actually work with.

Code Generation

The automatic production of platform-specific tracking code from a structured event schema. Instead of manually translating a tracking plan into Swift, Kotlin, and TypeScript, a code generator reads the schema and produces type-safe SDK methods for each platform. This eliminates the manual translation step that introduces errors and ensures that the tracking implementation stays in sync with the plan. Ordaze generates tracking code for iOS, Android, and web from a single registry.
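To make the idea concrete, here is a hand-written sketch of the kind of wrapper a generator might emit for a hypothetical Cart Item Added schema (this is not Ordaze's actual output):

```typescript
// Hypothetical example of generator output for "Cart Item Added".
interface CartItemAddedProps {
  item_id: string;     // required
  quantity: number;    // required
  coupon?: string;     // optional
}

type TrackFn = (name: string, props: Record<string, unknown>) => void;

// The generated wrapper: the only way to fire this event is with
// correctly typed properties, so a typo or wrong type is a compile error.
function trackCartItemAdded(track: TrackFn, props: CartItemAddedProps): void {
  track("Cart Item Added", { ...props });
}
```

Because the event name is hard-coded inside the wrapper, it cannot drift between platforms that use the same generated code.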

D

Data Governance

The set of processes and standards that ensure data is accurate, consistent, and trustworthy. In the context of product analytics, data governance means defining every tracked event, validating events against their schemas before they reach the warehouse, and monitoring for unexpected changes. A practical governance framework for analytics covers three stages: Define, Validate, and Monitor. See the analytics data governance guide for a practical breakdown.

Data Quality

A measure of how accurately and completely your analytics data reflects real user behaviour. High data quality means events fire when and only when they should, properties contain the right values and types, and coverage is consistent across platforms. Data quality degrades gradually and silently - a missing required property, a type mismatch between platforms, a deprecated event that keeps firing - until dashboards stop reflecting reality.

E

Event Drift

The gradual divergence between a tracked event’s documented schema and what is actually being sent at runtime. Drift happens when code changes without corresponding tracking plan updates: a property gets renamed, a new required field is added to one platform but not others, an enum gains new values that were not defined. Over time, drift makes historical data untrustworthy and cross-platform analysis inconsistent. Preventing drift requires version control on schemas and validation against them at runtime.
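One simple way to surface drift is to diff the keys a runtime payload actually carries against the property names the documented schema declares. A minimal sketch, with an assumed schema shape:

```typescript
// Compare a runtime payload's keys against a documented schema's
// declared property names. The schema shape here is an assumption.
interface DocumentedSchema {
  name: string;
  properties: string[];  // declared property names
}

function detectDrift(
  schema: DocumentedSchema,
  payload: Record<string, unknown>
): { missing: string[]; undocumented: string[] } {
  const sent = Object.keys(payload);
  return {
    missing: schema.properties.filter((p) => !sent.includes(p)),
    undocumented: sent.filter((k) => !schema.properties.includes(k)),
  };
}
```

Anything in `undocumented` is a candidate for a renamed or untracked property; anything in `missing` suggests an implementation stopped sending a declared field.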

Event Naming Convention

A documented standard for how analytics event names are structured across a product. Common conventions include object-action (Cart Item Added), action-object (Added Cart Item), and namespace-prefixed (checkout.cart.item_added). The specific convention matters less than consistency: events named differently on different platforms fragment cohort analysis and make dashboards unreliable. See the complete naming conventions guide.
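Whichever convention a team picks, it can be enforced mechanically. A sketch for one of the conventions above, namespace-prefixed snake_case like checkout.cart.item_added (the regex is an assumption to adapt to your own standard):

```typescript
// Enforce a namespace-prefixed snake_case convention,
// e.g. "checkout.cart.item_added". Adjust the pattern to your standard.
const NAMESPACED_SNAKE = /^[a-z]+(\.[a-z]+(_[a-z]+)*)+$/;

function isValidEventName(name: string): boolean {
  return NAMESPACED_SNAKE.test(name);
}
```

Running a check like this in CI or at schema-creation time prevents inconsistent names from ever entering the tracking plan.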

Event Property

A key-value pair attached to an analytics event that provides context about when or how it occurred. For a purchase_completed event, properties might include amount_usd, plan_tier, and promo_code_applied. Properties are what make events segmentable and filterable. Each property should have a defined type (string, integer, boolean, enum) and a required/optional designation in the event schema.

Event Schema

The formal definition of a single analytics event: its name, description, every property it carries with names and types, which properties are required, and which platforms implement it. An event schema is the source of truth for how an event should be implemented. A collection of event schemas across a product constitutes a tracking plan. Schemas should be versioned so that changes are recorded and breaking changes can be detected.
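One possible way to represent such a schema as data, with illustrative field names rather than a standard format:

```typescript
// An illustrative in-code representation of an event schema.
interface PropertySpec {
  type: "string" | "integer" | "boolean" | "enum";
  required: boolean;
  values?: string[];  // allowed values when type is "enum"
}

interface EventSchema {
  name: string;
  description: string;
  version: number;
  platforms: string[];
  properties: Record<string, PropertySpec>;
}

const purchaseCompletedSchema: EventSchema = {
  name: "purchase_completed",
  description: "Fired when a payment succeeds",
  version: 2,
  platforms: ["ios", "android", "web"],
  properties: {
    amount_usd: { type: "integer", required: true },
    plan_tier: { type: "enum", required: true, values: ["free", "pro"] },
    promo_code_applied: { type: "string", required: false },
  },
};
```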

Event Taxonomy

The organizational structure that groups analytics events into logical categories. A taxonomy gives shape to what would otherwise be a flat list of hundreds of event names - for example, grouping events by product area (onboarding, billing, collaboration) or by user journey stage (acquisition, activation, retention). A well-designed taxonomy makes it easier to find events, identify coverage gaps, and communicate about analytics across teams.

Event Validation

The process of checking that an analytics event conforms to its defined schema before it is accepted into the analytics pipeline. Validation can happen at compile time (via type-safe generated code), at runtime before an event is sent, or at the collection layer in a CDP. Events that fail validation can be dropped, flagged, or quarantined. Validation is the enforcement mechanism that makes a tracking plan a real constraint rather than aspirational documentation.
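A minimal runtime-validation sketch, checking required properties and types against an assumed rule shape before an event is sent:

```typescript
// Validate a payload against per-property rules before sending.
// The rule shape is illustrative.
type PropType = "string" | "number" | "boolean";

interface PropRule {
  type: PropType;
  required: boolean;
}

function validateEvent(
  rules: Record<string, PropRule>,
  payload: Record<string, unknown>
): string[] {
  const errors: string[] = [];
  for (const [key, rule] of Object.entries(rules)) {
    const value = payload[key];
    if (value === undefined) {
      if (rule.required) errors.push(`missing required property: ${key}`);
      continue;
    }
    if (typeof value !== rule.type) {
      errors.push(`wrong type for ${key}: expected ${rule.type}`);
    }
  }
  return errors;  // an empty array means the event passes validation
}
```

An event that returns a non-empty error list can then be dropped, flagged, or quarantined per the team's policy.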

Event Volume

The total count of events fired over a given time period, either across a product or for a specific event type. Event volume is a key monitoring signal: a sudden drop often indicates a broken implementation; a sudden spike can indicate a tracking bug firing events in a loop. Most analytics platforms can alert on volume anomalies. Volume is also relevant to pricing on many analytics tools, which charge per event ingested.
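A toy version of such an anomaly check compares today's count against a trailing baseline; the 50% threshold below is an arbitrary illustration, not a recommendation:

```typescript
// Flag a count that deviates from the trailing average by more than
// a threshold fraction. The default threshold is illustrative.
function isVolumeAnomaly(
  history: number[],
  today: number,
  threshold = 0.5
): boolean {
  if (history.length === 0) return false;   // no baseline to compare against
  const baseline = history.reduce((a, b) => a + b, 0) / history.length;
  if (baseline === 0) return today > 0;     // event was previously silent
  return Math.abs(today - baseline) / baseline > threshold;
}
```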

F

First-Party Data

Data collected directly from users through your own product, as opposed to data purchased from third parties or inferred from third-party cookies. Analytics events are a form of first-party data: you control how they are collected, what they contain, and how they are used. As third-party cookie tracking becomes less reliable, first-party event data collected through your own SDK becomes increasingly important for understanding user behaviour.

Funnel Analysis

A technique for measuring the percentage of users who complete each step in a defined sequence of events - for example, from sign-up to first action to paid conversion. Funnel analysis depends entirely on the quality of the underlying events: if the events that define each step are missing, inconsistently named across platforms, or firing at the wrong time, funnel metrics will be wrong. Accurate event schemas are a prerequisite for reliable funnel analysis.
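The core computation can be sketched directly: given each user's time-ordered event names, count how many users reach each step in order. Step names below are hypothetical:

```typescript
// Count how many users complete each funnel step, in order.
// A user only counts for step i if step i occurs after step i-1.
function funnelCounts(
  steps: string[],
  journeys: string[][]  // one array of event names per user, in time order
): number[] {
  const counts = steps.map(() => 0);
  for (const journey of journeys) {
    let from = 0;
    for (let i = 0; i < steps.length; i++) {
      const idx = journey.indexOf(steps[i], from);
      if (idx === -1) break;   // user dropped off before this step
      counts[i] += 1;
      from = idx + 1;          // later steps must occur after this one
    }
  }
  return counts;
}
```

Note how directly the result depends on event quality: a misnamed or missing step event silently shows up as drop-off.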

I

Identity Resolution

The process of associating anonymous and authenticated event streams from the same user into a single identity. Before a user signs in, events are attributed to an anonymous ID; after sign-in, an identify call or alias maps the anonymous ID to the authenticated user ID. Most analytics SDKs handle this natively. Incomplete identity resolution creates inflated user counts and broken pre/post-signup funnels.
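Stripped to its essence, the mechanism is a mapping from anonymous IDs to user IDs that events are resolved through. A minimal sketch (real SDKs also persist this mapping and merge historical events):

```typescript
// A toy identity map: identify() records the alias, and
// resolveIdentity() translates an event's ID through it.
const aliasMap = new Map<string, string>();

function identify(anonymousId: string, userId: string): void {
  aliasMap.set(anonymousId, userId);
}

function resolveIdentity(eventId: string): string {
  // Unmapped IDs stay anonymous until an identify call arrives.
  return aliasMap.get(eventId) ?? eventId;
}
```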

Implementation Coverage

A measure of what percentage of events defined in a tracking plan are actually implemented in the codebase. An event that exists in the plan but is never fired in production represents a coverage gap - the data you expected does not exist. Coverage is typically measured per platform (iOS coverage, Android coverage, web coverage) since the same event may be implemented on some platforms but not others. Ordaze’s codebase scanner reports implementation coverage automatically.
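The underlying arithmetic is straightforward: the share of planned events that a codebase scan actually finds, computed per platform. A sketch with assumed inputs:

```typescript
// Coverage for one platform: the percentage of planned event names
// found in that platform's scan results.
function coveragePercent(planned: string[], implemented: string[]): number {
  if (planned.length === 0) return 100;  // nothing planned, nothing missing
  const found = planned.filter((e) => implemented.includes(e)).length;
  return Math.round((found / planned.length) * 100);
}
```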

Instrumentation

The act of adding analytics tracking calls to a product’s code. When an engineer implements a new analytics event, they are instrumenting that feature. Instrumentation quality depends on how clearly the event is specified in the tracking plan and whether the implementation is validated against the schema. Manual instrumentation from a spreadsheet is error-prone; generated code reduces instrumentation errors by making correct usage the path of least resistance.

P

Product Analytics

The practice of using event data to understand how users interact with a product - to measure engagement, identify friction, improve retention, and guide product decisions. Product analytics tools (Amplitude, Mixpanel, PostHog, and others) provide the querying and visualisation layer on top of raw event data. The insights these tools can produce are only as accurate as the event data fed into them, which is why event quality and governance matter.

S

Schema Versioning

The practice of recording changes to an event schema over time so that the full history of how each event was defined is preserved. Schema versioning makes it possible to understand historical data in context (this event had a different property definition before version 2.0), detect breaking changes before they ship, and debug data anomalies by comparing the current schema to what was in place when the anomaly occurred. Ordaze versions every schema change automatically.

SDK (Software Development Kit)

A set of libraries and tools that enable a developer to integrate a service into their application. In the context of analytics, an SDK is the client-side or server-side library that provides methods for firing events - for example, analytics.track("Purchase Completed", { amount: 99.00 }). Analytics SDKs are provided by platforms like Segment, Amplitude, and Mixpanel. Type-safe tracking SDKs generated from a tracking plan are a form of code-generated wrapper around these base SDKs.

Super Properties

Properties that are automatically attached to every event fired from a given SDK instance, without needing to include them manually in each tracking call. Typically used for persistent context like plan_tier, user_role, or app_version. Super properties need to be documented in the tracking plan alongside event-specific properties, since they affect every event payload.
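Mechanically, super properties are values registered once and merged into every payload. In the sketch below, per-call properties win on a key conflict; that precedence rule is an assumption, so check how your SDK actually resolves conflicts:

```typescript
// Register-once properties merged into every event payload.
const superProps: Record<string, unknown> = {};

function registerSuperProperties(props: Record<string, unknown>): void {
  Object.assign(superProps, props);
}

function buildPayload(
  props: Record<string, unknown>
): Record<string, unknown> {
  // Per-call properties override super properties on conflict
  // (an assumption here; SDKs differ).
  return { ...superProps, ...props };
}
```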

T

Tracking Plan

A document or structured registry that defines every analytics event a product tracks: what each event is called, when it fires, what properties it carries, which platforms implement it, and who owns keeping it accurate. The tracking plan is the contract between product (who decides what to measure) and engineering (who implements the measurement). Without a tracking plan, event names drift, properties become inconsistent, and coverage becomes unknown. See What Is a Tracking Plan for a full primer, or How to Build a Tracking Plan Template for a step-by-step guide. Ordaze provides a structured tracking plan registry with versioning, typed properties, and code generation.

Type Safety

A property of a codebase or generated SDK where the compiler or type checker enforces that analytics events are called with the correct property types at build time. With type-safe tracking code, passing a string where a float is expected - or omitting a required property - is a compile error rather than a silent runtime bug that only surfaces in dashboard data. Type safety is the most powerful technical mechanism for preventing event drift and maintaining data quality.
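In TypeScript, one way to get this guarantee is a discriminated union of event definitions, so each event name only accepts its own property shape. Event and property names below are illustrative:

```typescript
// Each event name is paired with exactly one property shape,
// so the compiler rejects mismatched payloads.
type TrackedEvent =
  | { name: "Purchase Completed"; props: { amount_usd: number; plan_tier: string } }
  | { name: "Page Viewed"; props: { path: string } };

function track(event: TrackedEvent): string {
  // A real SDK would send the event; here we just serialize it.
  return JSON.stringify(event);
}

// track({ name: "Purchase Completed", props: { amount_usd: "99", plan_tier: "pro" } })
// would be a compile error: amount_usd must be a number, not a string.
```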

U

User Journey

The sequence of interactions a user has with a product over time, reconstructed from their event stream. Analysing user journeys reveals the paths that lead to desired outcomes (activation, conversion, retention) and the paths that lead to drop-off. Journey analysis requires that events are fired consistently and that identity resolution correctly stitches together sessions. Gaps in implementation coverage create blind spots in journey analysis.

Put these concepts into practice with Ordaze.