Some conversions happen fast, others take time, and platforms like Google don’t always report everything right away. The result? Recent data is often incomplete, which makes daily decisions harder than they should be. The culprit is called conversion lag: the delay between when someone takes action and when that conversion is actually reported. It’s a small gap, but one that makes a big difference when budgets are adjusted daily.
So, what do we do? We built a system that estimates what’s really happening behind those recent numbers. You’ll find the calculation of your conversion lag in your Amanda AI Insights.
Want to know how it works? Read on for a detailed explanation.
What is conversion lag, anyway?
In a nutshell, conversion lag is the delay between when a customer first interacts with your ad and when that interaction finally turns into a recorded conversion. Think of it like planning a dinner date—the initial spark might happen instantly, but the actual dinner (or conversion) might take a little more time to materialize. This delay can vary: sometimes it’s minutes, other times it’s days or even weeks.
The Amanda AI approach to conversion lag
Our method for addressing conversion lag is both systematic and data-driven. We break it down into three key steps:
Step 1: Saving the full reporting timeline
Every night, Amanda AI downloads performance reports from Google and tags them with the date they were fetched. Instead of replacing old data, we store each version—keeping a history of how reports evolve over time. This lets us see how conversions for a specific day fill in over the following days.
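The idea can be sketched in a few lines of Python. This is an illustration, not Amanda AI's actual pipeline: the store, function names, and dates are all made up.

```python
from collections import defaultdict
from datetime import date

# Hypothetical snapshot store: every nightly fetch is appended, never
# overwritten. Keyed by the report's performance date; each entry is a
# (fetch_date, conversions) pair showing how the figure evolves.
snapshots = defaultdict(list)

def save_report(performance_date, fetch_date, conversions):
    """Keep tonight's fetched figure alongside the earlier versions."""
    snapshots[performance_date].append((fetch_date, conversions))

# Conversions for February 10 as reported on successive nights:
save_report(date(2025, 2, 10), date(2025, 2, 11), 1)
save_report(date(2025, 2, 10), date(2025, 2, 12), 3)
save_report(date(2025, 2, 10), date(2025, 2, 13), 5)

# All three versions survive, oldest first, so the fill-in pattern
# for February 10 can be read straight out of the history.
history = snapshots[date(2025, 2, 10)]
```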
Step 2: Calculating the lag
With multiple versions of the same report, we can calculate what share of conversions was reported after 1, 2, 3… up to 90 days.
For example, say February 10 eventually ends up with 5 conversions. In the report fetched on February 11 you might only see 1 of them, and by February 12 a cumulative 3, and so on. By calculating the share of the total reported at each time delta (20% after one day, 60% after two days, etc.), we build a sequence that describes the lag. We then crunch these numbers using statistical tools—computing the mean, standard deviation, and even the 95% confidence interval for each time delay. This gives us a robust picture of how quickly (or slowly) conversions are reported.
Here’s a simplified example with conversions from February 10:
- February 11 (1 day later): 1 conversion reported, 20% of the eventual total
- February 12 (2 days later): 3 conversions reported, 60%
- February 13 (3 days later): all 5 conversions reported, 100%
By aggregating thousands of sequences like this, we calculate the average share reported per day, plus a 95% confidence interval—so we have a safe upper bound to work with. That way, we’re cautious not to overestimate.
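Here's a minimal sketch of that aggregation in Python, using made-up share sequences and a simple normal-approximation confidence interval (the statistics behind the real model may differ):

```python
import math
import statistics

# Each sequence is the cumulative share of a day's eventual conversions
# reported after 1, 2, 3... days. Illustrative numbers; the February 10
# example above would be [0.20, 0.60, 1.00].
sequences = [
    [0.20, 0.60, 1.00],
    [0.30, 0.55, 0.95],
    [0.25, 0.70, 1.00],
    [0.15, 0.50, 0.90],
]

def lag_profile(seqs, z=1.96):
    """Mean, standard deviation, and 95% CI upper bound per day delta."""
    profile = []
    for day in range(len(seqs[0])):
        shares = [s[day] for s in seqs]
        mean = statistics.mean(shares)
        std = statistics.stdev(shares)
        # Normal-approximation CI on the mean, clamped at 100%.
        upper = min(1.0, mean + z * std / math.sqrt(len(shares)))
        profile.append({"day": day + 1, "mean": mean, "std": std, "upper": upper})
    return profile

profile = lag_profile(sequences)
```

With real data the list would hold thousands of sequences, and the upper bound per day is what feeds into the budget adjustment.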
Step 3: Adjusting for the calculated lag
This is where it gets practical. When it comes time to allocate your marketing budget, we look at the last 30 days of data. By comparing the reported conversion count to our calculated lag (using the upper bound of our confidence interval to stay on the safe side), we can predict the true number of conversions.
For example, let’s say it’s February 16. Here’s how we’d adjust the numbers:
- February 15 (1 day old): 3 conversions reported so far. If the upper bound of our lag profile says roughly 50% of conversions are typically reported after one day, we predict 3 / 0.5 = 6 conversions in total.
So, instead of assuming only 3 conversions happened yesterday, we predict 6. It’s not about inflating results—it’s about making sure we don’t miss performance that hasn’t been reported yet.
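That adjustment is a one-liner once the lag profile exists. A sketch, with hypothetical upper-bound shares:

```python
# Hypothetical upper-bound shares: the fraction of a day's conversions we
# expect to have seen after N days, taken from the lag profile's 95% CI
# upper bound (using the upper bound keeps the prediction conservative).
upper_share_by_day = {1: 0.50, 2: 0.75, 3: 0.90}

def predict_true_conversions(reported, days_old):
    """Scale the reported count up by the share we expect to have seen."""
    share = upper_share_by_day.get(days_old, 1.0)  # old data: assume complete
    return round(reported / share)

# Yesterday (1 day old) shows 3 conversions; with only ~50% expected to
# have been reported so far, we predict 6 in total.
predicted = predict_true_conversions(3, 1)
```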
And for conversion value? For now, we keep it simple: we use the average value per conversion and multiply it by the added conversions.
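In code, with illustrative numbers:

```python
# Value of the not-yet-reported conversions: average value per reported
# conversion times the number of added conversions (illustrative figures).
reported_value = 450.0
reported_conversions = 3
predicted_conversions = 6

avg_value = reported_value / reported_conversions      # value per conversion
added = predicted_conversions - reported_conversions   # conversions still to come
predicted_value = reported_value + avg_value * added
```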
Why conversion lag matters
Understanding conversion lag isn’t just a numbers game—it’s a critical part of performance analysis. When conversions trickle in over time, early data can be misleading. If you base your budget adjustments on incomplete numbers, you risk cutting off campaigns that are actually gaining momentum. A savvy approach to conversion lag lets you:
- Attribute conversions fairly: Recognize all the touchpoints that lead to a conversion rather than just the last click.
- Set realistic expectations: Know that some campaigns need more time to fully bloom.
- Optimize budgets: Adjust spending based on more accurate, forecasted conversion counts.
This view is well supported by industry insights, such as those shared on the SavvyRevenue blog, which reminds us that understanding and accounting for conversion lag is crucial for effective campaign management.
The upside and the ongoing journey
While our current model does an excellent job at forecasting conversion counts, predicting the actual conversion value adds another layer of complexity—after all, every order is unique. For now, we simplify this by calculating an average value per conversion and multiplying that by the number of predicted extra conversions. And yes, we’re always exploring new methods to refine this further!
What about other methods?
Sure, you could estimate conversions using click data and average conversion rates, but those rates can vary greatly. Our method focuses on tracking actual reported conversions over time and learning from them. Because Amanda AI logs every prediction it makes, we can keep testing and improving this over time.
It’s not time travel. But very close.