“We're Held Accountable to Forecast Accurately”: A Lesson from a Revenue Leader
This week I sat in on a demo with a revenue leader at a cybersecurity company. About ten minutes in, he said something that stuck with me:

"If we can get early detection signals, it helps the revenue side forecast accurately — which is kind of what we're held accountable to."

That one sentence captured something I've been hearing across hundreds of conversations with revenue and customer success leaders.

The accountability is real. The pressure is real.

But the tools they're relying on to meet that accountability? Most of them were built for something else entirely.

The World These Leaders Are Living In

This particular leader oversees retention at a B2B technology company. Smart team, serious product, sophisticated customers. The kind of company that has Salesforce, Gong, and a growing stack of customer data spread across systems that don't naturally talk to each other.

When I asked about churn, his instinct was immediate: of course it's on the radar. Every tech company right now is watching for contraction. But when we started digging into how they were actually tracking it, the picture got complicated fast.

The account data lived in Salesforce. The customer conversations were in Gong. Usage behavior? Somewhere else. And nobody had yet done the work of connecting those pieces into something that could actually predict what was coming.

This is not unusual. This is the norm.

The Tool Confusion That's Costing Revenue Teams

Here's where the conversation took an interesting turn. About halfway through, one of his colleagues asked: "Is this different from Gong?"

It's a fair question. And honestly, it revealed something I think is worth spending time on.

Gong is a genuinely excellent tool. It captures customer conversations, generates summaries, surfaces themes. A lot of revenue teams love it. But it's a conversation intelligence platform — it tells you what was said and by whom. It doesn't have a data warehouse underneath it. It can't blend behavioral signals with account-level trends. It can't run a model that says "here are the 12 customers most likely to contract in the next 45 days, and here's why."

The problem isn't that revenue leaders are using the wrong tools. The problem is that the lines between categories have blurred so much that it's genuinely hard to know what you're missing. If your tool gives you a summary, it feels like analysis. If it surfaces a trend, it feels like a prediction.

But there's a meaningful difference between a tool that surfaces content and one that runs structured analysis on behavioral data at scale. And that difference matters most when someone's holding you accountable to a forecast number.

What Actually Predicts Churn

What most retention-focused teams are really after isn't hard to describe: they want to know which accounts are at risk before it's too late to do anything about it.

But churn rarely has a single cause. By the time a customer says they're leaving, you're usually looking at the convergence of three or four signals that were detectable weeks or months earlier — support ticket frequency, engagement drop-off, a key champion who went quiet, a shift in product usage patterns. No single data source tells that full story.

The leaders who are actually getting ahead of churn are doing something specific: they're pulling data from multiple sources — a CRM, behavioral systems, customer success platforms — blending it into a unified view, and then running analysis across all of it to find the patterns that correlate with accounts that actually churned. Not guessing. Not running pivot tables on a single source. Building a model from the full picture.
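As a sketch of what that blending step looks like in practice (the account IDs, column names, and risk formula below are purely illustrative, not any vendor's schema): pull an extract from each system, join them on a shared account key, and turn the raw signals into a comparable risk score.

```python
import pandas as pd

# Hypothetical extracts from three systems, keyed on account_id
# (column names are illustrative, not any real vendor's schema).
crm = pd.DataFrame({
    "account_id": [101, 102, 103, 104],
    "arr_k": [50, 120, 30, 80],          # annual recurring revenue, $k
})
usage = pd.DataFrame({
    "account_id": [101, 102, 103, 104],
    "weekly_logins": [2, 40, 1, 25],
})
support = pd.DataFrame({
    "account_id": [101, 102, 103, 104],
    "tickets_90d": [9, 3, 7, 2],
})

# Step 1: blend the sources into a single account-level view.
blended = crm.merge(usage, on="account_id").merge(support, on="account_id")

# Step 2: normalize the signals so they're comparable (z-scores:
# above-average tickets and below-average usage both raise risk).
z = lambda s: (s - s.mean()) / s.std()
blended["risk_score"] = z(blended["tickets_90d"]) - z(blended["weekly_logins"])

# Step 3: rank accounts so the team knows where to look first.
at_risk = blended.sort_values("risk_score", ascending=False)
print(at_risk[["account_id", "risk_score"]])
```

A real version would swap the hand-rolled score for a model trained on accounts that actually churned, but the shape of the work is the same: join, normalize, rank.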

Most teams know this is what they should be doing. Very few are actually doing it — not because they don't want to, but because it sounds like a data engineering project. Something that requires SQL, a data warehouse, a dedicated analyst, or months of setup.

That's the gap we keep running into.

The Forecast Problem Is Really a Data Blending Problem

Here's the market insight I keep coming back to after conversations like this one: the accountability that revenue leaders feel — hitting forecast, reducing churn, justifying the budget — is growing faster than the infrastructure most companies have built to support it.

Leaders are being asked to predict outcomes with a level of precision that requires ML-grade analysis. But most don't have a data scientist on their team. And even if they did, the analyst's time would be consumed by ad-hoc requests, not building the early warning systems that would actually change behavior.

What changes the dynamic isn't more headcount. It's closing the gap between "we have this data" and "we can act on this data" — without requiring a six-month data warehouse project to get there.

If you're a revenue leader who can work in spreadsheets, you already have the skills for the kind of data blending and transformation that most BI tools reserve for engineers. That's a capability most people in your position don't know they have access to.

The Question Worth Asking This Week

The leader we spoke with ended the call thoughtfully. He's evaluating options, watching the budget, doing this right rather than fast. I respect that.

But the question he was really asking — the one underneath all the product questions — was this: Is there a way to know earlier? Can we build a system that tells us something is wrong before it's too late to fix it?

The answer is yes. It doesn't require a data team. It doesn't require a six-month implementation. It requires connecting the data you already have, building a blended view of your customer behavior, and letting a model surface the signals that human eyes miss.

If churn prediction is on your radar this year — and in this environment, it probably should be — the best place to start is with the data you already have. Pull it together. Look for the patterns in accounts that actually churned. Then ask what those accounts had in common three months before they left.
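That "what did they have in common three months before they left" question is itself a simple analysis once the data is in one place. A minimal sketch, using made-up accounts and signal names: group historical accounts by outcome and compare their averages in the window before renewal.

```python
import pandas as pd

# Hypothetical history: one row per account, with signals measured
# three months before the renewal date and the eventual outcome.
history = pd.DataFrame({
    "account_id":      [1, 2, 3, 4, 5, 6],
    "logins_90d_out":  [4, 55, 3, 60, 6, 48],  # activity 3 months pre-renewal
    "tickets_90d_out": [8, 2, 9, 1, 7, 3],
    "churned":         [1, 0, 1, 0, 1, 0],
})

# What did churned accounts look like three months before they left,
# compared with the accounts that renewed?
profile = history.groupby("churned")[["logins_90d_out", "tickets_90d_out"]].mean()
print(profile)
```

In this toy data, the accounts that churned were logging in far less and filing far more tickets a full quarter before the renewal date. That gap between the two rows is your early warning signal.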

That's not a technical project. That's an accountability project.

And if you want to see what that looks like in practice, we're happy to show you with your actual data — not a generic demo, but your customers, your signals, your patterns.

Brad Peters

At Scoop, we make it simple for ops teams to turn data into insights. With tools to connect, blend, and present data effortlessly, we cut out the noise so you can focus on decisions—not the tech behind them.

