Why Business Intelligence Isn't Actually Intelligent

This week I sat down with the head of a fast-growing B2B software company — someone who's been around the technology industry long enough to have opinions about everything from enterprise BI to the right way to use Claude. About ten minutes in, he said something I've been thinking about ever since:

"I have an operations team that does all of that. I just go in and look at the dashboards and play around with them. And if I want another field added, they'll add it for me."

He wasn't embarrassed about it. He said it matter-of-factly, the way you'd describe any reasonable division of labor. And that's exactly what made me stop.

Here's a leader who clearly understands data. He's tracked lead scoring models, discussed CRM attribution across Salesloft, Gong, and Gainsight, and built AI-powered account scoring using 50+ Salesforce attributes. He knows what "intent signals" are and why they matter. He's not disengaged from his company's data — he's deeply engaged with the outputs. He just has no path to the inputs that doesn't run through someone else's queue.

That dynamic is everywhere. And I think it tells us something important about why self-service analytics has mostly failed — and what it would actually take to fix it.

  
    


The 80-Dashboard Problem

At some point in our conversation, I mentioned that we've been working with large retail companies that have accumulated enormous dashboard libraries — organizations where someone had built dozens of views, each with a dozen more filters, and where the people making decisions had quietly stopped using most of them.

He laughed.

"We definitely do that. We have at least 80 dashboards."

And then the line that really landed: "Each dashboard has like 15 prompts."

Think about what that means in practice. Eighty dashboards times fifteen filters each is 1,200 pre-built views of a business. And yet, if you want something that isn't already there — a slightly different cut, a field you hadn't thought to add when the dashboard was built — you file a request and wait.
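The arithmetic is worth making concrete. A quick back-of-the-envelope sketch — only the 80 dashboards and 15 filters come from the conversation; the values-per-filter count is a made-up assumption for illustration:

```python
# Figures from the conversation above:
dashboards = 80
filters_per_dashboard = 15

# Counting each filter as one alternate view gives the "1,200 views" figure:
flat_views = dashboards * filters_per_dashboard
print(flat_views)  # 1200

# But filters combine. Assume (hypothetically) each filter has just 3
# possible settings; the space of distinct views per dashboard is then
# 3^15, and no library of pre-built views can cover it:
values_per_filter = 3
combined_views = dashboards * values_per_filter ** filters_per_dashboard
print(f"{combined_views:,}")  # 1,147,912,560
```

The point isn't the exact number — it's that any fixed library of views is a vanishing fraction of the questions a business can ask.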

This is the state of self-service analytics at most mid-to-large companies: beautiful, well-maintained infrastructure that answers the questions somebody thought to ask last quarter. For the question you have right now, today, the one that showed up in your inbox at 7am — you're mostly on your own.

The Talent Instinct Gets Ahead of Reality

What struck me next was something he said almost as an aside, but I think it cuts to something real in how businesses think about data maturity.

He'd recently hired a new data leader — someone with strong credentials, experience at major enterprise software companies, deep technical skills. He was glad to have her. And then he said: "I worry she's not using enough AI in her modeling and trying to hire too many expensive resources. I don't think you need as many data scientists as you used to."

He quickly added: "I could be wrong, I could be wrong."

But I don't think he's wrong. I think he's pointing at a transition that most organizations are somewhere in the middle of right now — one where the instinct to hire more technical capacity is colliding with the reality that a lot of what those teams do can now be augmented, accelerated, or in some cases replaced by AI-assisted tooling.

The question isn't whether to have a data team. It's what that team should actually be doing. There are high-leverage problems — building clean data infrastructure, encoding domain-specific business logic into analytical workflows, governing how insights move through the organization — that genuinely require human expertise and judgment. And then there's a long tail of ad hoc questions, one-off analyses, and "can you just pull this for me" requests that eat up a data team's time and rarely require their real skills.

The teams that figure out how to separate those two categories — and route the second category to AI — are going to have a real structural advantage.

  
    


"You Have to Have a Clean Data Store First"

One of the most grounded moments in the conversation was when he talked about where they actually are in their data journey. A company his size, he noted, was only now standing up a proper Snowflake instance — finally consolidating data that had been scattered across systems for years.

"You would think we would have already had this done," he said. "But it is what it is."

And then something I think is worth lingering on: "You have to have a clean data store to be able to leverage AI to really get the full benefit."

He's right, and it's an insight that's easy to skip past in the excitement around AI capabilities. The models are getting remarkably good. The interfaces are getting genuinely accessible. But the quality of what comes out is still constrained by the quality of what goes in. An organization that's still wrangling disconnected CRM data, spreadsheet exports, and manual reconciliations isn't going to unlock meaningful value from AI analytics no matter how good the tooling gets — at least not until the foundation is in place.

For a lot of companies, the actual work right now isn't choosing between analytics tools. It's getting the data architecture right so that any tool can do its job.

The Gap Between "What Happened" and "Why"

The part of the conversation that felt most relevant to what we're building at Scoop was a moment where we were talking about what dashboards are actually good for.

He got it immediately: dashboards show you what happened. They don't tell you why.

"You want to know what happened, or just deep investigation or exploration," he said at one point. "You either have to go get somebody to do that work for you, or you take a dump into Excel and play around with it."

That's a brutally accurate description of where most businesses are. And it's the thing that most traditional self-service analytics never actually solved. The tools got better at making dashboards easier to build and share. They didn't fundamentally change who could ask the investigative question — the one that starts with "why" instead of "what."

The promise of what we're building — and what I think this whole generation of AI-native analytics tools is really about — is closing that gap. Not by making dashboards faster to create, but by making the investigative layer genuinely accessible. The kind of analysis that used to require a data analyst to sit down, form a hypothesis, pull the data, test it, and iterate — that's the thing that should be self-service. Not just the chart.

  
    


What This Means If You're Building Your Data Practice Right Now

If you're in a similar position to the leader I talked to this week — standing up new data infrastructure, thinking about team structure, wondering how much of your analytics capability should be human versus AI-assisted — I'd offer a few things worth sitting with.

First, the 80-dashboard problem is a signal, not a solution. If the answer to "I need an insight" is always "file a request," your self-service analytics isn't actually self-service. The goal should be a system where an operations leader or a regional manager can get an answer without being a technical user — and without waiting.

Second, data team talent is most valuable when it's working on the things AI can't do yet: building the logic, structuring the questions, defining what good looks like. The more of the execution layer you can route to AI-assisted tools, the more leverage that talent actually has.

Third, the infrastructure investment is worth it, but not on its own. A clean data warehouse is the prerequisite. What you do with it after that is the question.

If you're curious how Scoop fits into that picture — especially if you're thinking about what comes after the dashboard layer — I'd love to show you. The question we're trying to answer isn't "how do we make reporting faster?" It's "how do we make investigation available to everyone who needs it?"

That's the problem worth solving.

Brad Peters

At Scoop, we make it simple for ops teams to turn data into insights. With tools to connect, blend, and present data effortlessly, we cut out the noise so you can focus on decisions—not the tech behind them.

