- The 5 Building Blocks: A Digital Transformation Framework That Actually Works
- The Pattern Behind the Building Blocks: Why Each One Matters
- Operational Excellence: The Foundation That Makes AI Transformation Possible
- Customer Intelligence: Why B2B Companies Can’t Use the Data They Already Have
- Product Intelligence: Why Your Smart Products Are Failing (And What Actually Works)
- Data Mastery: The Information Flow Problem Behind Every AI Initiative
- Team Dynamics: Why Your Biggest AI Challenge Isn’t Technical
The best AI implementations I’ve seen all started with something unexpected: companies running 15-year-old ERP systems. Not the newest platforms. Not the cloud-native architectures. Old, stable, deeply understood operational systems that nobody had bothered to replace because they just worked. Those ancient systems are sitting on gold – decades of clean operational data, documented processes, transactional discipline built up over years of monthly closes and inventory reconciliations.
Here’s the problem: the gold is locked in a vault with a rusty door. Getting data out of these systems without cutting and pasting into Excel spreadsheets? That’s the real challenge most companies face. And it turns out, that’s what most “ERP modernization” projects are actually about – not replacing the system that works, but building a decent door to the vault so people can actually use what’s inside.
AI doesn’t need a new ERP system. It needs accessible operational data. And if you’ve been living in Excel hell for years, you already know exactly where the access problems are. The question is whether you’ve done anything about it.
This is why Operational Excellence is the first of the building blocks for your digital transformation. Without clean, accessible operational data, everything else you try to build – customer intelligence, product innovation, AI capabilities – sits on a shaky foundation. Get this right, and the rest becomes possible. Skip it, and you’ll spend years fighting data quality issues instead of creating value.
The Data You Have But Cannot Touch
I’ve worked with dozens of companies built through mergers and acquisitions – private equity portfolios, multi-divisional industrials, roll-ups in manufacturing and distribution. The pattern is always the same. Each business unit runs its own ERP system, often a different one from the company next door. This happened for perfectly rational reasons. The acquisition moved quickly, integration was expensive and risky, and the acquired company’s systems worked fine for their operations. Why force a migration when you can just consolidate the financials at corporate and call it done?
But then corporate needs something that sounds simple: a consolidated view of purchasing across the portfolio to negotiate better pricing with key suppliers. Or visibility into inventory levels across all facilities to optimize working capital. Or coordination between regional distribution centers and manufacturing plants to improve fulfillment speed. Suddenly, the fact that each business unit speaks a different operational language becomes a serious problem.
The value isn’t in rolling up financial statements – you solved that years ago with consolidation tools. The real opportunity is in operational data: inventory turns, supplier performance, production schedules, quality metrics. That’s where purchasing power lives. That’s where working capital gets optimized. That’s where customer service improves because you can actually see what’s available and where it’s located.
But that operational data is trapped. Each ERP system has its own schema, its own part numbering logic, its own customer master files. The item that Business Unit A calls “Widget-Blue-Large” is the same physical product that Business Unit B calls “WDG-BL-L” and Business Unit C tracks with a numeric code that means nothing to anyone outside their four walls. You can’t aggregate what you can’t reconcile. You can’t analyze patterns you can’t access.
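To make that concrete, here's a minimal sketch of the reconciliation pattern, assuming a hand-maintained cross-reference table that maps every local code to one canonical ID. The table and column names are hypothetical illustrations, not any particular ERP's schema:

```python
import pandas as pd

# Each business unit's item master, exported from its own ERP
bu_a = pd.DataFrame({"item": ["Widget-Blue-Large"], "on_hand": [120]})
bu_b = pd.DataFrame({"item": ["WDG-BL-L"], "on_hand": [45]})
bu_c = pd.DataFrame({"item": ["100482"], "on_hand": [200]})

# Cross-reference table: every local code maps to one canonical ID
xref = pd.DataFrame({
    "local_item": ["Widget-Blue-Large", "WDG-BL-L", "100482"],
    "business_unit": ["A", "B", "C"],
    "canonical_id": ["WIDGET-BLUE-L", "WIDGET-BLUE-L", "WIDGET-BLUE-L"],
})

def harmonize(items: pd.DataFrame, unit: str) -> pd.DataFrame:
    """Tag a unit's export and translate its local codes to canonical IDs."""
    tagged = items.assign(business_unit=unit)
    return tagged.merge(
        xref,
        left_on=["item", "business_unit"],
        right_on=["local_item", "business_unit"],
        how="left",
    )

combined = pd.concat(
    [harmonize(bu_a, "A"), harmonize(bu_b, "B"), harmonize(bu_c, "C")]
)

# Portfolio-wide aggregation becomes a one-liner...
print(combined.groupby("canonical_id")["on_hand"].sum())
# ...and unmapped codes surface immediately as reconciliation work
print(combined[combined["canonical_id"].isna()])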
Everyone talks about “ERP harmonization” like the answer is replacing all those systems with one big shiny new platform. Sometimes that’s the right move. But I’ve seen plenty of companies spend years and millions on harmonization projects that deliver disappointing results because they were solving the wrong problem. The systems weren’t the issue. Data access was the issue. The companies that figured out how to get operational data out of their existing systems – consistently, quickly, with reasonable effort – created more value than the ones that tried to replace everything.
This is the operational excellence foundation that AI actually needs. Not modern systems. Not unified platforms. Accessible, clean, reconcilable operational data that people can actually work with.
Why the Financial Close Taught Us Everything
Here’s what finally forces companies to solve the data access problem: the monthly financial close. Nothing exposes data quality issues and access friction like trying to close the books across multiple business units running different systems. Every month, the same painful ritual plays out. Finance teams export data from each system, reconcile differences in spreadsheets, hunt down discrepancies, adjust entries, and eventually produce consolidated financial statements. It takes too long, requires too much manual effort, and introduces too many opportunities for errors.
The cost of this process – measured in fully loaded FTE hours, delayed decision-making, and audit exposure – eventually justifies investment in better tools. Companies implement financial consolidation platforms, build data warehouses, create standard mapping tables, and establish governance processes to keep everything aligned. The stated goal is “faster close cycles” or “improved financial reporting.” The actual achievement is something more fundamental: consistent, accessible financial data flowing out of disparate operational systems.
What’s interesting is that solving financial data access creates patterns and infrastructure that extend far beyond the finance department. The mapping tables that reconcile different chart of accounts structures? Those same techniques work for reconciling part numbers and customer lists. The ETL processes that extract financial data from each ERP? Those same pipelines can pull operational data. The data governance disciplines that ensure financial data quality? Those same practices improve operational data quality.
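Here's a sketch of what that reuse looks like in code. The connection URLs, queries, and mapping tables are placeholders – the point is that the pipeline finance built for the close is the same function called with different parameters:

```python
import pandas as pd
from sqlalchemy import create_engine

def extract_map_load(source_url: str, query: str, mapping: pd.DataFrame,
                     key: str, target_url: str, target_table: str) -> None:
    """Pull rows from a source system, translate local codes to the
    canonical ones in `mapping`, and land the result in the warehouse."""
    rows = pd.read_sql(query, create_engine(source_url))
    rows = rows.merge(mapping, on=key, how="left")
    rows.to_sql(target_table, create_engine(target_url),
                if_exists="append", index=False)

# Finance built this for the close (chart-of-accounts mapping):
#   extract_map_load(erp_a_url, "SELECT * FROM gl_entries", coa_map,
#                    "local_account", warehouse_url, "gl_consolidated")
# The same function, pointed at operational tables, feeds AI work:
#   extract_map_load(erp_a_url, "SELECT * FROM po_receipts", item_map,
#                    "local_item", warehouse_url, "receipts_consolidated")
```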
Companies that invested in solving the financial close problem built data access capabilities that turned out to be their AI readiness foundation. They just didn’t know it at the time.
The ERP Era Built This Foundation
To understand why operational data matters so much for AI, you need to understand what the ERP wave actually accomplished. This was the first major technology transformation wave: it began with the MRP systems of the 1960s and matured into full ERP suites through the personal computer revolution of the 1980s and the decades that followed. The goal wasn’t just automation – it was capturing implicit operational knowledge in explicit systems.
Before ERP, operations ran on expertise trapped in people’s heads and filing cabinets. The purchasing manager knew which suppliers were reliable. The production scheduler understood which jobs to prioritize. The inventory clerk could tell you what was really in stock versus what the paperwork claimed. This knowledge worked fine as long as those people stayed in their roles. But when they left, the knowledge left with them.
ERP systems forced companies to make that knowledge explicit. Every transaction followed defined rules. Every process got documented. Every exception required a decision that could be tracked and analyzed. The goal was “do more with less” through automation, but the lasting value came from transforming tacit knowledge into operational data.
This created something that didn’t exist before: a complete, detailed, continuously updated record of how the business actually operates. Order patterns. Supplier performance. Production yields. Quality trends. Inventory turns. All captured in transactional detail, all queryable if you could figure out how to get at it.
The companies that took ERP seriously – that invested in data quality, process discipline, and system understanding – built up decades of clean operational data. The companies that treated ERP as a necessary evil to satisfy the finance department ended up with systems full of garbage data and workarounds.
AI doesn’t care whether your ERP is old or new. AI cares deeply whether your operational data is clean and accessible. The companies with ancient ERP systems and strong data discipline have a massive advantage over companies with modern systems and sloppy data practices.
How AI Amplifies What ERP Started
Here’s what AI does with operational data that previous technologies couldn’t: it finds patterns humans miss, predicts outcomes before they happen, and generates recommendations at machine speed. But AI is entirely dependent on the quality and accessibility of the data going in. Garbage in, garbage out – that rule hasn’t changed since the mainframe era.
Think about demand forecasting. Traditional approaches used statistical models based on historical patterns. If you sold 100 units last January, you’d probably sell somewhere around 100 units this January, adjusted for growth trends and known events. These models worked okay when demand patterns were stable and predictable.
AI-powered demand forecasting analyzes operational data across multiple dimensions simultaneously: historical sales patterns, supplier lead times, production capacity constraints, inventory positions, customer order behavior, even external signals like weather or economic indicators. It identifies correlations that no human analyst would spot. It learns from forecast errors and continuously improves its predictions. It generates demand plans that optimize across competing objectives – customer service levels, inventory costs, production efficiency.
But here’s what makes or breaks the AI: data access and quality. If sales data is trapped in the CRM and disconnected from inventory data in the ERP, the AI can’t see the full picture. If part numbers aren’t consistent across systems, the AI can’t aggregate demand properly. If customer master data is full of duplicates and errors, the AI can’t identify buying patterns. If production schedules aren’t reliably updated, the AI can’t account for capacity constraints.
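Here's a deliberately tiny sketch of that dependency. The model at the bottom is trivial; the join above it is the point, and it only works because the item IDs reconcile across systems. Every name and number is made up:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Monthly demand history per item (from order/CRM extracts)
sales = pd.DataFrame({
    "canonical_id": ["W1"] * 6,
    "month": pd.period_range("2025-01", periods=6, freq="M").astype(str),
    "units": [100, 110, 95, 130, 125, 140],
})
# Supplier lead time and inventory position (from the ERP)
supply = pd.DataFrame({
    "canonical_id": ["W1"], "lead_time_days": [21], "on_hand": [60],
})

# The join that fails when IDs don't reconcile across systems
features = sales.merge(supply, on="canonical_id")
features["prev_units"] = features.groupby("canonical_id")["units"].shift(1)
features = features.dropna()

# A real model would span many items and far richer features;
# this one exists only to show the data dependency
model = LinearRegression()
model.fit(features[["prev_units", "lead_time_days", "on_hand"]],
          features["units"])
print(model.predict([[140, 21, 60]]))  # naive next-month estimate
```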
The companies that solved these data access and quality problems for financial close or operational reporting already have most of what AI needs. The companies still cutting and pasting data into Excel every month face a much harder path.
Consider predictive maintenance on manufacturing equipment. IoT sensors generate telemetry data showing vibration, temperature, energy consumption, cycle times. AI analyzes these signals to predict when equipment will fail and recommend preventive maintenance before breakdowns occur. This is one of the most proven AI use cases in manufacturing.
But predictive maintenance doesn’t run on sensor data alone. It needs context from your operational systems. When was the equipment last serviced? What parts were replaced? What products were being manufactured when the anomaly occurred? Were there any quality issues downstream? This contextual operational data determines whether the AI can distinguish between normal variation and actual warning signs.
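Here's a sketch of that context join, assuming machine IDs are consistent between the sensor historian and the maintenance system. It uses pandas' merge_asof to pair each reading with the most recent prior service record; the column names are illustrative:

```python
import pandas as pd

# Sensor readings (from the equipment historian)
telemetry = pd.DataFrame({
    "machine_id": ["M1", "M1"],
    "ts": pd.to_datetime(["2026-01-10 08:00", "2026-01-14 09:30"]),
    "vibration_mm_s": [2.1, 7.8],
}).sort_values("ts")

# Service history (from the maintenance module of the ERP/CMMS)
maintenance = pd.DataFrame({
    "machine_id": ["M1"],
    "ts": pd.to_datetime(["2026-01-12 14:00"]),
    "work_done": ["bearing replaced"],
}).sort_values("ts")
maintenance["service_ts"] = maintenance["ts"]  # preserve the service time

# Attach "what was last done to this machine, and when" to each reading
context = pd.merge_asof(telemetry, maintenance, on="ts",
                        by="machine_id", direction="backward")
context["days_since_service"] = (
    context["ts"] - context["service_ts"]
).dt.days
print(context[["ts", "vibration_mm_s", "work_done", "days_since_service"]])
```

A vibration spike two days after a bearing replacement tells a very different story than the same spike on a machine nobody has touched in a year – and that distinction lives entirely in your operational data.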
If your operational data is accessible and reliable, connecting it to sensor data is straightforward. If your operational data is trapped in siloed systems with questionable quality, you’ll spend more time cleaning data and reconciling systems than actually using AI.
What Operational Excellence Really Means for AI Readiness
When executives ask me about AI readiness, they usually want to talk about technology infrastructure, data science talent, or algorithm sophistication. Those things matter, but they’re not the foundation. The foundation is operational excellence – specifically, the data access and quality practices built up over decades of running your business.
Here’s how to assess whether your operational foundation is ready for AI:
Can you answer basic operational questions without heroic effort? If someone asks “What were our top 20 selling products last quarter across all business units?” or “Which suppliers have the best on-time delivery performance?” or “What’s our average inventory turn by product category?” – can you answer these questions within hours instead of days? If yes, your data access patterns are probably solid. If no, you have fundamental access problems that will block AI initiatives. (A sketch of what this looks like in practice follows these questions.)
Do different people get the same answers to the same questions? If Finance, Operations, and Sales all run reports on “Q4 revenue” and get different numbers, you have data quality and reconciliation issues. AI will amplify these inconsistencies, not fix them. The companies ready for AI have already invested in master data management and data governance to ensure one version of the truth.
Can new employees or new systems get data without extensive custom development? If every new hire needs weeks of training to understand your data structures, or every new reporting request requires IT to write custom queries, your data isn’t accessible enough for AI. AI initiatives need to iterate quickly, testing hypotheses and refining approaches. That requires data infrastructure that supports exploration, not just pre-defined reports.
Do you know what “good” looks like for your operational data? AI depends on training data that represents normal operations and desired outcomes. If you can’t define what good operational performance looks like in data terms – acceptable lead times, target inventory levels, quality thresholds – AI can’t learn to optimize toward your goals.
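To make the first question concrete, here's roughly what “top 20 selling products last quarter across all business units” looks like once harmonized data lands in a warehouse. The connection URL and table name are placeholders for whatever your consolidation pipeline produces:

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///warehouse.db")  # placeholder warehouse

# Assumes a sales_consolidated table fed by the harmonization pipeline,
# with canonical item IDs and ISO-formatted order dates
top_20 = pd.read_sql(
    """
    SELECT canonical_id, SUM(units) AS units_sold
    FROM sales_consolidated
    WHERE order_date >= '2025-10-01' AND order_date < '2026-01-01'
    GROUP BY canonical_id
    ORDER BY units_sold DESC
    LIMIT 20
    """,
    engine,
)
print(top_20)
```

If a question like this takes one query, your access problem is solved. If it takes three exports and a pivot table, it isn't.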
The companies that score well on these questions didn’t get there by accident. They got there through years of operational discipline, data quality initiatives, and investments in making data accessible for decision-making. Some of that work happened during ERP implementations. Some happened during financial consolidation projects. Some happened when business intelligence tools arrived and exposed how poor the underlying data really was.
The work wasn’t glamorous. It wasn’t innovative. It was the boring discipline of getting operational data right. But that boring foundation is what makes AI possible.
The Opportunity Hiding in Plain Sight
Here’s the counterintuitive insight that most companies miss: if you’ve been frustrated by how hard it is to get operational data out of your systems, you’re closer to AI readiness than you think. You’ve already identified the access points that matter. You know which data quality issues cause problems. You understand which operational metrics drive decisions.
The companies struggling with AI aren’t the ones with old systems. They’re the ones that never invested in solving the data access problem. They’re still cutting and pasting into Excel because nobody prioritized building proper extraction tools. They’re still reconciling data manually because master data governance seemed too expensive. They’re still arguing about which numbers are correct because they never established clear definitions and ownership.
Your operational excellence foundation doesn’t need to be perfect for AI. It needs to be good enough that you can get clean, consistent, accessible data out when you need it. The same patterns that solved financial consolidation solve AI data access. The same disciplines that enable fast month-end close enable AI model training. The same governance processes that maintain customer master data maintain training datasets.
AI is the fourth wave of business transformation, but it builds directly on that first wave, the one that began six decades ago with MRP and ERP. The companies that invested in operational excellence during the ERP era have advantages that can’t be quickly replicated. That ancient system running your operations might be sitting on exactly the gold your AI initiatives need – if you can figure out how to open the vault door and actually use it.
The question isn’t whether to replace your old ERP before starting AI. The question is whether you’ve invested in making your operational data accessible. If you have, you’re further ahead than most companies with newer systems. If you haven’t, that’s the foundation work that comes first – and it has nothing to do with how old your technology is.
Recommended Books
- The Goal by Eliyahu Goldratt – The classic on operational excellence and constraint management
- Data Quality: The Accuracy Dimension by Jack Olson – Foundational text on why data quality determines system value
- Competing on Analytics by Thomas Davenport – How operational data becomes competitive advantage
If you’re wrestling with operational data challenges or preparing your organization for AI, I’d love to hear what’s working and what’s not. Join my mailing list for regular insights on digital transformation that actually works.
16 January 2026