Visit Scoring

Upper-funnel campaigns rarely generate immediate conversions — leaving marketers blind to what actually works. Visit Scoring uses ML to reveal which sources drive high-intent visitors, giving you a compass where conversion data can't reach.

7 min read | Updated March 2026

01.Upper-Funnel Ads Lack Conversion Signals

Consideration-focused campaigns — Prospecting, Discovery, Awareness — rarely produce immediate conversions. Users browse, research, compare. The purchase happens later, often from a different device or browser. As far as your analytics are concerned, those campaigns produced clicks and nothing else.

Consider a typical high-value purchase — an expensive bike, a sofa, or a vacation. A user clicks on an Instagram ad and explores the website from their phone. They actively research, engage with product pages, compare options. High-quality behavior by any measure.

But they don't convert on the spot. They come back the next day — directly from desktop — and buy immediately. Two sessions, two devices. The first session — the one that actually acquired the customer — receives zero credit.

Day 1 · Mobile
Instagram ad → click → site visit → browses products → leaves, no purchase.

14 days later · outside Meta's 7-day attribution window

Day 15 · Desktop (different device)
Google brand search → returns directly → site visit → purchase ($349.00).

Platform attribution: Instagram Ad receives 0% of the credit; Google Brand Search receives 100%.

Instagram drove the acquisition, but the converting session on desktop gets 100% of the credit. The first session is invisible.
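The credit split above is what last-click attribution produces, which is the default in most platform reports. A minimal sketch of that logic, with hypothetical source names, shows why the first session disappears:

```python
# Minimal last-click attribution: all credit goes to the final
# touchpoint before purchase; earlier sessions (like the Day 1
# Instagram visit) receive nothing. Source names are illustrative.
journey = [
    {"day": 1,  "source": "instagram_ad", "device": "mobile"},
    {"day": 15, "source": "google_brand", "device": "desktop"},
]

def last_click_credit(touchpoints):
    credit = {t["source"]: 0.0 for t in touchpoints}
    credit[touchpoints[-1]["source"]] = 1.0  # 100% to the converting session
    return credit

print(last_click_credit(journey))
# {'instagram_ad': 0.0, 'google_brand': 1.0}
```

However many engaged sessions precede the purchase, this rule always hands the entire conversion to the last one.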

Marketers look at the report and conclude that Instagram doesn't work. They cut the budget. And then total sales drop — because the campaigns that were actually feeding the top of the funnel are gone. The problem isn't that upper-funnel ads don't work. The problem is that your measurement can't see that they do.

02.Too Few Conversions to Decide

Thousands of dollars spent. Only a handful of tracked conversions in the report. How do you decide which campaign to scale and which to cut?

Three campaigns, $16,700 in spend, 9,300 clicks — and only 2, 4, and 6 tracked conversions respectively. Which one is actually working?

Some marketers fall back on clicks and CPC. Others look at bounce rate, time on site, or pages per session. But these are surface-level metrics — random website events that often have no real correlation with actual propensity to convert. A low bounce rate doesn't mean someone will buy. A high CPC doesn't mean the traffic is worthless. These proxies lead to budget decisions that feel data-driven but are fundamentally disconnected from revenue.

When there are too few conversions to measure against, marketers default to proxy metrics that don't predict revenue. The result: budget decisions based on noise, not signal.

03.A Signal That Predicts Conversion Intent

Visit Scoring is ML-powered probabilistic scoring of every session. The model evaluates engagement patterns in real time and assigns each visit a conversion probability — how likely this user is to eventually convert, based on how similar their behavior is to users who typically end up purchasing.

This isn't bounce rate. It's not time on site. It's a trained model that has learned the actual behavioral patterns that precede a purchase — page depth, content engagement, product interactions, session velocity — and scores every visit against those patterns.

The result: a powerful signal that can be used in addition to whatever conversions you were able to track. It reveals which sources, campaigns, creatives, and keywords actually generate high-quality, high-intent visitors who behave exactly like users that end up converting — not just random visits.
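Under the hood, a score like this can be produced by any probabilistic classifier over engagement features. A minimal sketch, assuming a logistic model with illustrative, hand-picked weights (a real model learns these from your conversion history, over far more signals):

```python
import math

# Hypothetical engagement features for one session. A production model
# uses many more signals: page depth, content engagement, product
# interactions, session velocity, and so on.
session = {
    "pages_viewed": 7,
    "seconds_on_site": 252,
    "viewed_pricing": 1.0,
    "product_interactions": 3,
}

# Illustrative weights; in practice these are learned from historical
# converting vs. non-converting sessions, not set by hand.
weights = {
    "pages_viewed": 0.18,
    "seconds_on_site": 0.004,
    "viewed_pricing": 1.1,
    "product_interactions": 0.25,
}
bias = -4.44

def conversion_probability(features, weights, bias):
    """Logistic score: how closely this session resembles past converters."""
    z = bias + sum(weights[k] * features[k] for k in weights)
    return 1.0 / (1.0 + math.exp(-z))

p = conversion_probability(session, weights, bias)
print(f"Conversion probability: {p:.0%}")  # → Conversion probability: 42%
```

The output is a probability, not a binary label, which is what makes the scores aggregable and comparable across campaigns.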

Session vis_4E82: 4m 12s on site, 7 pages, pricing viewed, high engagement.

Conversion Probability: 42% (next 14 days)
× Predicted Conv. Value: $349 (based on behavior)
= Synthetic Conversion: $146.58, sent via CAPI

Each visit scored with conversion probability based on real-time engagement signals. Only high-intent sessions rise above the threshold.
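Combining the two numbers on the card is simple expected-value arithmetic: the synthetic conversion value is the conversion probability times the predicted order value, with sub-threshold visits contributing nothing. A sketch, where the 10% cutoff is an assumed parameter, not a documented default:

```python
def synthetic_conversion_value(p_convert, predicted_value, threshold=0.10):
    """Expected revenue of a visit: probability times predicted order value.

    Visits below the probability threshold contribute nothing, so only
    high-intent sessions generate a synthetic conversion.
    """
    if p_convert < threshold:
        return 0.0
    return round(p_convert * predicted_value, 2)

# The vis_4E82 session: 42% probability, $349 predicted value.
print(synthetic_conversion_value(0.42, 349.00))  # 146.58
# A low-intent bounce stays at zero.
print(synthetic_conversion_value(0.05, 349.00))  # 0.0
```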

Visit Scoring is the foundational measurement signal that both Signal Quality (synthetic conversions for ad platforms) and attribution reporting build on. Signal Quality uses these scores to feed ad platform algorithms. Here, the same scores serve a different purpose: measurement and budget allocation.

04.Direction Over Precision

The primary use case of visit scores in reports is not about absolute numbers. It's about direction: which creatives to scale, which campaigns to cut, which keywords to invest in — when there is no other number available.

When you've spent thousands of dollars and have zero tracked conversions, you need a signal that tells you something meaningful about the quality of the traffic you're buying. Visit scores give you exactly that — a relative measure of engagement quality that correlates with actual conversion outcomes.

Without visit scoring, all campaigns look identical — zero conversions each. With it, you see that Instagram drives nearly 3x more high-quality visits than the others.

This makes Visit Scoring a necessary solution to unlock effective upper-funnel measurement and budget reallocation. When you can see which campaigns drive engaged, high-intent visitors, you can confidently invest in them — even if the actual conversions come later from another source, browser, or device.

Visit scores don't replace conversions. They fill the gap where conversions can't reach — giving you a compass for budget decisions that would otherwise be based on guesswork.

05.How It Works

1. Train on real conversion data

The ML model trains on your actual conversion history — users who purchased, submitted a lead, or completed any tracked goal. It learns the behavioral fingerprints that distinguish converting sessions from non-converting ones: engagement depth, content interactions, navigation patterns, and session characteristics.

2. Score every visit in real time

Every session is evaluated against the learned patterns and assigned a conversion probability. High-probability visits — those that exhibit the same engagement signals as users who typically convert — rise to the top. Low-intent bounces remain at zero.
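Steps 1 and 2 together can be sketched as a toy logistic model trained by gradient descent on synthetic labelled sessions. The two features and the generated data are purely illustrative; a production model trains on real conversion history with far richer signals:

```python
import math
import random

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

# Toy labelled history: (pages_viewed, minutes_on_site) -> converted?
random.seed(0)
history = (
    [((random.gauss(8, 2), random.gauss(5, 1)), 1) for _ in range(200)]    # converters
    + [((random.gauss(2, 1), random.gauss(1, 0.5)), 0) for _ in range(200)]  # bounces
)

# Step 1: learn weights via plain per-sample gradient descent on log loss.
w = [0.0, 0.0]
b = 0.0
lr = 0.05
for _ in range(300):
    for (pages, minutes), label in history:
        err = sigmoid(w[0] * pages + w[1] * minutes + b) - label
        w[0] -= lr * err * pages
        w[1] -= lr * err * minutes
        b -= lr * err

# Step 2: score any new session against the learned patterns.
def score_visit(pages, minutes):
    """Conversion probability for a new session, scored at visit time."""
    return sigmoid(w[0] * pages + w[1] * minutes + b)

print(round(score_visit(7, 4), 2))    # engaged visit: high probability
print(round(score_visit(1, 0.2), 2))  # quick bounce: near zero
```

The point of the sketch is the shape of the pipeline: train once on labelled outcomes, then score every incoming session cheaply in real time.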

3. Aggregate scores by source

Visit scores are aggregated in your cross-channel attribution reports alongside traditional conversion metrics. Now you can compare campaigns not just by clicks or conversions, but by the quality of traffic they generate. The campaigns that attract visitors with the highest conversion probability are the ones worth investing in.
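The aggregation in step 3 reduces to summing per-visit scores by campaign and normalising by spend. A sketch with illustrative figures (the "per $1k" normalisation is one reasonable choice, not a documented formula):

```python
# campaign: (sum of per-visit conversion probabilities, spend in $)
campaigns = {
    "Prospecting": (72.0, 5_200),
    "Awareness":   (44.0, 3_400),
    "Demand Gen":  (25.0, 8_100),
}

# Visit-score points generated per $1,000 of spend.
score_per_1k = {
    name: round(score / spend * 1_000, 1)
    for name, (score, spend) in campaigns.items()
}

for name, value in sorted(score_per_1k.items(), key=lambda kv: -kv[1]):
    print(f"{name:<12} {value:>5.1f} visit-score points per $1k")
```

Ranking campaigns by score-per-dollar rather than raw score keeps a big-budget campaign from looking good just because it buys more traffic.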

4. Reallocate with confidence

When visit scores reveal that a campaign consistently drives high-intent traffic despite showing no immediate conversions, you have the evidence to keep investing. And when a campaign shows high click volume but near-zero visit quality, you can cut it before wasting more budget. The eventual real conversions — whether they come from another session, device, or channel — validate the direction that visit scoring already pointed toward.

Upper-Funnel Campaign Analysis

Campaign       Spend     Conv.   Visit Score   Score per $1k
Prospecting    $5,200    3       72            13.8
Awareness      $3,400    7       44            12.9
Demand Gen     $8,100    5       25            3.1

Recommendation: scale Meta Prospecting; cut Google Demand Gen.

Ask which campaigns to scale or cut — get actionable recommendations backed by visit quality data, not guesswork.
