Field Notes
BI Modernization · Self-Service Analytics · Platform Strategy

The Self-Service Analytics Revolution that Never Happened

After millions spent on modernizing analytics infrastructure, everyone still works like it's 2003. Here's why—and what we can do about it.

Mat Hughes

There’s a story we tell ourselves about data in organizations. It goes something like this: Information wants to be free. Give people access to data, “democratize” it as consultants like us tend to say, and better decisions will flow throughout the organization. Empowered employees will spot trends their managers miss. Small teams will outmaneuver large competitors. The pyramid of corporate hierarchy will flatten into something more like a network, with insights flowing from anywhere to everywhere. “Data is the new oil”, after all, and most of the problem was about access and distribution of this valuable commodity.

It’s a beautiful story. It’s also, by most measures that matter, not happening.

Over the past couple of years, I’ve been working closely with multiple Fortune 100 companies as they plan “what’s next” for their analytics platforms. That involves a lot of discussion with their analysts and users about their experiences working with data. And I keep coming back to a story we heard over and over again. Analysts describe their morning routines: Log into Tableau. Export to Excel. Go find a different dashboard. Export. Repeat. Clean the data. Build the real analysis. Switch systems. Email the report. After millions spent on modernizing analytics infrastructure, everyone still works like it’s 2003.

This is the kind of observation that should stop us cold. How is it possible that after the self-service analytics revolution, the cloud data revolution, and endless conferences about “data-driven decision making,” the most sophisticated analytics users at our most sophisticated companies are still exporting to a 40-year-old business tool like Excel?

The answer tells us something important not just about data and analytics technology, but about the gap between how we hope technology will change organizations and how change actually happens.

The Problem Isn’t What You Think

Let me start with what the problem isn’t. It isn’t that people resist change. Anyone who’s watched the rapid adoption of AI tools, smartphones, or video conferencing during the pandemic knows that people will embrace new tools when those tools genuinely make their lives easier.

It isn’t ignorance, either. The employees exporting dashboards to Excel aren’t Luddites. They’re often experts in their business domain, power users who know exactly what these platforms can do. But they also know what the platforms can’t do.

And it isn’t, despite what vendors claim, simply a failure of “change management” or insufficient training. We’ve worked with companies that have run large skill certification programs, created centers of excellence, and driven adoption programs. We’ve helped design and implement these programs. Yet adoption remains uneven, and extracting value often feels like it happens despite the tools rather than because of them.

The real problem is more fundamental. Think about what modern BI platforms optimize for. They process vast amounts of data quickly. They create beautiful visualizations. They scale across organizations. These are impressive technical achievements. They’re also, from the perspective of someone trying to make a decision at 3 PM on a Tuesday, largely beside the point.

How Analytics Actually Works

Let me paint you a picture of how analytics actually works in most organizations. Not how it’s supposed to work, but how it does work.

Sarah is a marketing manager. She needs to figure out why customer churn spiked last month. In theory, she opens her company’s analytics platform, explores the data, finds insights, and takes action. Clean. Efficient. Data-driven.

In practice? She opens the platform and confronts dozens of dashboards, none quite answering her question. She finds one that’s close, but it shows aggregate data when she needs segment-level detail. She tries to filter it, but the filters don’t match her mental model of the business. She exports to Excel, not because she loves Excel, but because in Excel she can add the context the dashboard lacks. She can annotate. She can combine it with data from other sources. She can shape it into something that actually answers her question. Then she can act on the findings, and track those actions, in the same spreadsheet.

By the time she’s done, she’s spent three hours doing what the analytics platform was supposed to make instant. And the worst part: Next week, she’ll have to do it all over again.

Sarah’s struggles illustrate four fundamental gaps in how we’ve built analytics tools:

Rigid, predefined views that can’t adapt to her actual questions. She needs the ability to explore data fluidly - whether through intuitive interfaces or by simply asking “Why did churn spike in our enterprise segment?” in plain language.

Read-only experiences that force her to export data just to add context or annotations. Analytics should be a two-way conversation where she can capture insights and institutional knowledge directly in the tool.

Incomplete workflows that show her the problem but provide no path to action. She should be able to identify rising churn rates AND trigger retention campaigns or adjust onboarding - all in one interface.

Segregated insights that live in a separate platform from her actual work. The churn data should appear in her campaign management tool, not force her to context-switch to a dashboard.

But Sarah’s struggles are just the beginning. When she takes her analysis to the leadership meeting, she discovers that the CFO has different churn numbers entirely. Customer success reports 8% monthly churn. Finance claims 12% based on their revenue calculations. Marketing shows 6% influenced churn from their attribution model. Three departments, three “sources of truth,” zero trust.

The meeting that should have focused on solving the churn problem instead becomes an archaeological expedition. Which numbers are right? Why are they different? Whose dashboard should we believe? By the time they’ve excavated the truth from layers of conflicting metrics, the meeting is over and nothing has been decided.
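To make the divergence concrete, here is a minimal sketch (in Python, with invented names and figures) of how three teams can run individually defensible calculations over the same customer data and report three different churn numbers:

```python
# Hypothetical sketch: five customers, three defensible churn definitions,
# three different "truths". All names and figures are invented.
customers = [
    # (name, monthly_revenue, churned, marketing_touched)
    ("A", 500,  True,  True),
    ("B", 3000, True,  False),
    ("C", 800,  False, False),
    ("D", 1200, False, True),
    ("E", 500,  False, False),
]

churned = [c for c in customers if c[2]]

# Customer Success counts logos: churned accounts over total accounts.
logo_churn = len(churned) / len(customers)

# Finance counts dollars: lost revenue over total revenue.
revenue_churn = sum(c[1] for c in churned) / sum(c[1] for c in customers)

# Marketing counts influence: churned accounts it touched over total accounts.
influenced_churn = sum(1 for c in churned if c[3]) / len(customers)

print(logo_churn, revenue_churn, influenced_churn)  # three numbers, one dataset
```

None of these calculations is wrong; they simply answer different questions. Which is exactly why the meeting dissolves into forensic accounting.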

This erosion of trust in data might be the most insidious problem of all. When metrics become contested territory, when nobody believes the numbers, when every discussion starts with 30 minutes of forensic accounting, then data-driven decision making becomes impossible.

The cruel irony? The proliferation of dashboards often makes this worse, not better. More dashboards mean more places for numbers to diverge. More visualizations mean more opportunities for misinterpretation. The tools that promised clarity instead delivered chaos.

This isn’t a story about one person or one company. This is the reality of enterprise analytics: millions of knowledge workers, every day, working around and against the very tools that were supposed to empower them, unable to trust the very numbers that should guide their decisions.

The Incentive Problem

To understand why this keeps happening, I like to think about incentives. Not the incentives of users; their incentives are clear. They want to do their jobs well with minimal friction. The incentives that matter here are those of the software companies that sell these platforms.

What does success look like for a BI vendor? It’s not actually about making users more effective or organizations more “data driven.” It’s about winning the RFP. It’s about checking feature boxes. It’s about creating enough lock-in that migration costs become prohibitive. It’s about selling to the economic buyer, usually IT or finance, not the end user.

But there’s another perverse incentive at play: the platform play. Major vendors aren’t just selling you analytics, they’re selling you an ecosystem. They want you using their hobbled data prep tool, their adequate visualization platform, their barely-functional integration features. Not because this collection is best for your users, but because it’s best for their revenue. Lock-in isn’t just about switching costs anymore; it’s about making you dependent on an entire constellation of interconnected, mutually dependent tools. (And in the new world of consolidation via acquisition, it’s not even guaranteed that these disparate tools work well together.)

This creates a profound misalignment. The metrics that matter for procurement (total cost of ownership, vendor stability, “unified platform benefits”) aren’t the metrics that matter when you’re trying to figure out why churn spiked and do something about it.

Even more perversely, vendors often benefit from this complexity. The harder the tool is to use, the more services and training they can sell. The more frustrated users become, the more likely IT is to buy premium support contracts, customer success programs, or that new AI-powered interface that promises to finally make everything simple. The problem becomes the business model.

It’s not just end users who suffer. The people building these tools face their own frustrations: endless backlogs of dashboard requests, rigid platforms that force compromises, and the impossibility of anticipating every question users might ask. We’ve created a system where both creators and consumers are set up to fail.

Let me be clear: I’m not saying all BI vendors are bad actors. Data and analytics tools are incredibly valuable as part of a well-designed stack. Many vendors genuinely want to help their customers succeed. But the current incentive structure pushes even well-intentioned companies toward complexity and lock-in rather than user empowerment.

What Would Better Look Like?

This is where things get interesting. We’re not here just to throw rocks at enterprise BI vendors, many of whom we partner with and respect. But we do think there’s a better way, and we can see it emerging in unexpected places.

We see it in the shadow systems that employees build when official tools fail them. We see it in the workarounds that become standard practice. We see it in the tools that people actually choose to use when they have a choice.

At InterWorks, we believe analytics tools should be treated as digital products - carefully designed around real user needs, not technical capabilities. Success shouldn’t be measured by dashboards created, but by decisions made, understanding increased, and problems actually solved.

Better would mean three fundamental shifts:

Embedded Intelligence: Marketing tools that surface churn insights automatically. CRMs that highlight at-risk accounts with supporting data. Project management platforms that predict timeline risks based on historical patterns. This isn’t about better dashboards - it’s about bringing insights directly into the tools where work actually happens.

Flexible Exploration: Tools that let Sarah start with her churn dashboard but seamlessly pivot to investigate segments, time periods, or channels she didn’t anticipate needing. Whether through drag-and-drop interfaces or conversational AI that understands “Show me churn rates by customer tier for enterprise accounts in the last 90 days.” The rigid boundaries between dashboards, reports, and ad-hoc analysis need to dissolve.

Complete Workflows: Purpose-built applications that don’t just show rising churn rates but let her create retention campaigns, adjust onboarding flows, or trigger personalized outreach directly from the insight. Marketing apps, sales territory planning tools, and operational dashboards that bridge the gap between insight and action.
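As a sketch of what a complete workflow means in code, the same path that computes the insight can also decide the action. Everything here - the threshold, the segment counts, the campaign payload - is hypothetical:

```python
# Hypothetical sketch: insight and action in one code path, instead of a
# read-only chart. Threshold, segments, and the action payload are invented.
CHURN_ALERT_THRESHOLD = 0.10

def monthly_churn(segments):
    """Compute churn rate per segment from (churned, total) counts."""
    return {name: churned / total for name, (churned, total) in segments.items()}

def plan_actions(rates, threshold=CHURN_ALERT_THRESHOLD):
    """Bridge insight to action: list the retention campaigns to launch."""
    return [
        {"segment": name, "action": "launch_retention_campaign", "rate": rate}
        for name, rate in rates.items()
        if rate > threshold
    ]

segments = {"enterprise": (25, 210), "smb": (70, 880)}
actions = plan_actions(monthly_churn(segments))
print(actions)  # only segments whose churn crosses the threshold
```

In a real system, `plan_actions` would hand off to the marketing platform rather than print; the point is that the insight never strands the user at a chart.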

But building better requires something the industry has been reluctant to do: putting human needs above technical capabilities. Agency requires tools that amplify human judgment, not replace it.

Attention is the scarcest resource a knowledge worker has. Our tools should preserve and focus that attention, not scatter it across dozens of dashboards.

The emerging world of unstructured data - from customer conversations to process documentation to market intelligence - demands smarter semantic layers that can make sense of not just traditional metrics but the full context of business operations. This opens possibilities for not just automated insights but automated actions: systems that don’t just flag at-risk customers but proactively adjust their experience based on predictive models.

Most importantly, this requires building a custom-tailored set of tools and experiences that matches your organization’s specific needs, workflows, and decision-making patterns. The one-size-fits-all platform approach has failed because organizations aren’t one-size-fits-all.

The Path Forward

So where does this leave us?

First, we need to be honest about what’s not working. The analytics revolution, as currently conceived, has failed to deliver on its promise. Not because the technology isn’t powerful, but because power without usability is just complexity.

Second, we need to think differently about how change happens in organizations. Real transformation doesn’t come from deploying new tools and hoping people adapt. It comes from understanding how people actually work and building tools that enhance those workflows. It comes from human-centered design that starts with user needs, not platform capabilities.

Third, and most importantly, we need to realign incentives. As long as software companies are rewarded for complexity, lock-in, and platform proliferation rather than user success, we’ll keep getting tools that demo well and work poorly.

The good news is that change is possible. We’ve seen organizations that have figured this out, that have built analytics systems that people actually want to use. They didn’t do it by buying better dashboards or the latest high-cost SKU from their BI vendor. They did it by starting with a different question: Not “what data do we have?” but “what decisions do we need to make?”

That shift, from data-centric to decision-centric, changes everything. It changes what you build. It changes how you build it. It changes who’s involved in building it.

Most of all, it changes the story we tell ourselves. Not a story about democratizing data, but about empowering people. Not about making everyone an analyst, but about making everyone more effective at their actual job.

Building Something Better

There’s much more to explore on how to make this vision practical:

Modern Data Applications: Tools like Sigma Analytics that make building purpose-built business applications accessible to analysts, not just developers - solving complete problems rather than just showing data.

Conversational Analytics: Practical approaches to AI-powered exploration that actually accelerate insight discovery, letting users ask follow-up questions naturally rather than filing requests with the data team.

Embedded Intelligence: Proven strategies for bringing insights directly into CRMs, email tools, and project management platforms where decisions actually happen.

Flexible Metric Layers: Creating business-friendly metrics that can be composed and recomposed to answer evolving questions without building new dashboards for every slight variation.
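A minimal sketch of the metric-layer idea, in Python with invented data: define the metric once, then compose it with whatever dimension the next question needs, instead of building a new dashboard per variation.

```python
# Hypothetical sketch of a composable metric layer. The rows, dimensions,
# and metric definition are all invented for illustration.
from collections import defaultdict

rows = [
    {"segment": "enterprise", "month": "2024-05", "customers": 200, "churned": 12},
    {"segment": "smb",        "month": "2024-05", "customers": 900, "churned": 81},
    {"segment": "enterprise", "month": "2024-06", "customers": 210, "churned": 21},
    {"segment": "smb",        "month": "2024-06", "customers": 880, "churned": 70},
]

def churn_rate(group):
    """One canonical definition, reused for every slice."""
    return sum(r["churned"] for r in group) / sum(r["customers"] for r in group)

def metric_by(rows, dimension, metric):
    """Compose any metric with any dimension on demand."""
    groups = defaultdict(list)
    for r in rows:
        groups[r[dimension]].append(r)
    return {key: metric(group) for key, group in groups.items()}

print(metric_by(rows, "segment", churn_rate))  # churn by segment
print(metric_by(rows, "month", churn_rate))    # same metric, new question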

We’ll dive into each of these topics with real examples, practical guidance, and tools you can start using immediately.

The analytics revolution isn’t about better dashboards - it’s about intelligence that works the way people do. Tools that meet users where they work, adapt to how they think, and bridge the gap between insight and action. This isn’t a far-future vision requiring massive technical investments. The tools exist today. What’s missing is the willingness to put human needs above platform features.

Because somewhere right now, someone is exporting a dashboard to Excel. And they’ll keep doing it until we build something better.
