Vuong Nguyen

AI Everywhere, Rockets on Donkeys

[Image: a rocket strapped to a donkey, a visual metaphor for bolting AI onto broken systems]

Walk into any tech conference right now and count how many booths promise to “add AI to your workflow.” At Money20/20 last month, I stopped counting after the first hour. Every vendor, every pitch deck, every product demo followed the identical pattern: take your existing process, sprinkle some machine learning on top, and watch the magic happen.

Except the magic rarely happens.

What you get instead is motion without movement. Noise without progress. A donkey with a rocket strapped to its back, still trudging along the same path, just louder and more expensive.

The Reconciliation Theater

I watched an impressive demo of an AI-powered reconciliation engine at Money20/20. The pitch was slick: feed it transaction data from multiple sources, let the model do the matching, save hundreds of manual hours per month. The engineering was solid. The model accuracy looked good. The ROI spreadsheet was convincing.

Then I asked about their data pipeline.

Turns out, before the AI could do its work, someone was manually exporting CSVs from three different systems, cleaning the date formats in Excel, matching merchant names by hand, and then uploading the “prepared” data to the AI system. The AI matched transactions that were already 80% matched by humans.

Six months of AI development. Zero process improvement.

The broken part was not the matching algorithm. It was the fragmented systems, inconsistent data standards, and lack of API integration between platforms. The AI did not fix those problems. It just added another layer on top, creating the illusion of automation while humans still did the hard work in the shadows.
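To make the contrast concrete, here is a minimal sketch of what pulling that “prep” work into the pipeline itself could look like, so matching runs on raw exports rather than hand-cleaned spreadsheets. The file names, column names, and date formats below are illustrative assumptions, not the vendor’s actual schema.

```python
import pandas as pd

def load_and_normalize(path, date_col, merchant_col, amount_col, date_format=None):
    """Read one system's export and normalize the fields used for matching."""
    df = pd.read_csv(path)
    # Dates: parse whatever format the source system emits into a single ISO date.
    df["txn_date"] = pd.to_datetime(df[date_col], format=date_format, errors="coerce").dt.date
    # Merchant names: lowercase, drop punctuation, collapse whitespace.
    df["merchant_key"] = (
        df[merchant_col]
        .str.lower()
        .str.replace(r"[^a-z0-9 ]", "", regex=True)
        .str.replace(r"\s+", " ", regex=True)
        .str.strip()
    )
    # Amounts: coerce to numeric and round to cents.
    df["amount"] = pd.to_numeric(df[amount_col], errors="coerce").round(2)
    return df

# Hypothetical exports from two of the three source systems.
bank = load_and_normalize("bank_export.csv", "Date", "Description", "Amount", date_format="%d/%m/%Y")
ledger = load_and_normalize("ledger_export.csv", "posted_at", "merchant", "amount")

# Deterministic matching on the normalized keys; only what falls out the
# bottom is a genuine candidate for a model (or a human) to resolve.
merged = bank.merge(
    ledger, on=["txn_date", "merchant_key", "amount"],
    how="left", indicator=True, suffixes=("_bank", "_ledger"),
)
unmatched = merged[merged["_merge"] == "left_only"]
```

Even a naive exact match like this makes the point: much of the workload being sold as “AI” was really just missing normalization and integration.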

The Pattern Repeats

This is not unique to fintech reconciliation; I see the same pattern everywhere.

The AI is not the problem. The AI works fine, usually. The problem is we are asking it to optimize systems that should not exist in their current form.

Why We Keep Doing This

Adding AI is easier than fixing foundations. It is a forward-looking narrative. It sounds innovative in board meetings. It photographs well in marketing materials. And critically, it does not require us to admit that our current processes are fundamentally broken.

Rebuilding a system means confronting uncomfortable truths.

These are hard conversations. They require executive buy-in, budget allocation, temporary productivity hits, and cultural change. Much easier to add an AI layer and declare victory.

What Real Innovation Looks Like

Real innovation happens when you rebuild the system itself, not when you add rockets to donkeys.

In production, I have seen this play out clearly. The companies that get AI right do not start with AI. They start by asking: what is the actual problem, and what would the ideal system look like if we built it from scratch today?

Sometimes the answer involves AI. Often it does not.

One team I worked with was exploring AI for invoice processing. Before building anything, they mapped their entire accounts payable workflow. What they discovered: 60% of invoice delays came from unclear approval chains, not from data extraction accuracy. They fixed the approval routing first. Manual process. No AI. Cut processing time by half.

Then they added AI for data extraction. It worked beautifully because it was solving a real bottleneck, not papering over organizational dysfunction.

Another example: a payments company wanted AI to detect fraud patterns. Smart idea. But their fraud data was siloed across three systems with inconsistent tagging. They spent three months unifying their data infrastructure before touching any ML models. When they finally deployed AI, it worked immediately because it had clean, structured data to learn from.
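A hedged sketch of what that unification step might involve: mapping each system’s inconsistent fraud labels onto one canonical taxonomy before any model trains on the data. The system names, labels, and mapping below are invented for illustration, not that company’s actual schema.

```python
import pandas as pd

# Illustrative mapping: (source system, source label) -> canonical label.
# A real mapping would come from the teams that own each system.
CANONICAL_LABELS = {
    ("card_gateway", "cnp_fraud"): "card_not_present",
    ("card_gateway", "stolen_card"): "stolen_credentials",
    ("risk_ops", "CNP"): "card_not_present",
    ("risk_ops", "ATO"): "account_takeover",
    ("chargebacks", "fraud_10_4"): "card_not_present",
}

def unify_fraud_labels(frames):
    """frames: dict of system name -> DataFrame with a 'fraud_label' column."""
    parts = []
    for system, df in frames.items():
        df = df.copy()
        df["source_system"] = system
        df["canonical_label"] = [
            CANONICAL_LABELS.get((system, label)) for label in df["fraud_label"]
        ]
        parts.append(df)
    unified = pd.concat(parts, ignore_index=True)

    # Surface unmapped labels instead of silently dropping rows: these gaps
    # are exactly the data-unification work that has to happen before ML.
    unmapped = unified.loc[unified["canonical_label"].isna(), "fraud_label"].unique()
    if len(unmapped) > 0:
        print("Labels still to map:", sorted(unmapped))
    return unified
```

The point of a step like this is not sophistication; it is that the model only ever sees one consistent definition of each fraud type, which is what made the eventual deployment work on the first try.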

The pattern is consistent: fix the foundation, then add intelligence.

The Honest Conversation We Need

Most of us are still donkey mechanics. We are patching legacy systems, working around bad data, and trying to make incremental improvements without breaking things that barely work today. And that is okay. Every system needs iteration before it can evolve.

But we should be honest about it.

If you are adding AI to a broken process, you are not innovating. You are automating dysfunction. The donkey moves the same way. It just makes more noise and costs more to maintain.

The honest version sounds like this:

“We know our data pipeline is a mess. We are going to spend six months cleaning it up before we even think about AI.”

“Our customer support knowledge base is disorganized. We need to restructure it first, then consider automation.”

“Our reconciliation process is broken because we have five disconnected systems. We are going to fix the integration layer before adding intelligence on top.”

These statements do not photograph well. They do not win innovation awards. But they lead to systems that actually work.

The Question Worth Asking

Before you add AI to anything, ask this: if you were building this system from scratch today, would it look anything like what you have now?

If the answer is no, do not add AI yet. Fix the system first.

If your data is messy, clean it.

If your workflow is inefficient, redesign it.

If your systems do not talk to each other, integrate them.

If your organizational structure creates bottlenecks, restructure it.

Then, and only then, add intelligence where it actually multiplies value instead of amplifying dysfunction.

AI is powerful. It deserves better than being strapped to donkeys.

What is one process in your world that truly deserves a rebuild before adding AI?

💬 Join the conversation on LinkedIn or X/Twitter