You spent how much on NetSuite?

And your team is still reconciling data in Excel.

I see this pattern everywhere. Companies talk about their NetSuite implementation like it's a badge of honor, this massive investment that transformed their business. Then I ask someone to show me their daily workflow.

They open NetSuite. Pull a report. Export it.

Open another system. Pull data from there.

Then spend 20 minutes in Excel doing VLOOKUPs to reconcile everything.

Here's the thing that gets me: they're not even frustrated anymore. This routine has become so normal that they've stopped questioning it. It's just how things work.

That's when I know. The system they paid six figures for? It's basically a really expensive database storing information, not an operational tool actually running their business.

The capacity is sitting there. The features exist. But the integration architecture that would let them actually use it? That was never built.

The 30% Problem

Here's a pattern I keep seeing: most companies use maybe 30% of their NetSuite capacity.

Think about that for a second.

They're paying for enterprise software, and they're using less than a third of what they bought. Yet they keep buying more tools to solve problems NetSuite already handles.

Why?

The integration architecture was never properly built in the first place.

When I trace this back, it's always the same story. The data exists in NetSuite, but it's fragmented across different modules or records in ways that don't match how the actual workflow operates.

Inventory quantities live in one place. The inventory lots and their status? Somewhere else. Pending transfer information? That's in a third spot.

So even if I showed them the "right" report, it wouldn't give them the complete picture they need to make a decision. They'd still need to pull two or three different views and mentally piece it together.
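
To make it concrete, here's roughly what that Excel ritual amounts to, sketched as a pandas merge. The file names and columns are hypothetical stand-ins for three separate NetSuite exports; the point is that the complete picture only exists after a join the system never hands them.

```python
# A minimal sketch of the manual reconciliation, with hypothetical exports.
# Each CSV stands in for a separate NetSuite saved-search export.
import pandas as pd

on_hand = pd.read_csv("onhand_export.csv")        # item, location, qty_on_hand
lots = pd.read_csv("lot_status_export.csv")       # item, lot, status
transfers = pd.read_csv("pending_transfers.csv")  # item, qty_in_transit

# The "VLOOKUP" step: stitch three partial views into the one picture
# the operations team actually needs.
view = (
    on_hand
    .merge(lots, on="item", how="left")
    .merge(transfers, on="item", how="left")
)

view["qty_in_transit"] = view["qty_in_transit"].fillna(0)
view["net_available"] = view["qty_on_hand"] - view["qty_in_transit"]
usable = view[view["status"] != "On Hold"]
print(usable[["item", "location", "lot", "net_available"]])
```

Twenty minutes of lookups, every day, to produce a handful of columns that no single screen shows.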

The Excel VLOOKUP isn't them being inefficient. It's them solving a real structural problem that the implementation never addressed.

The system was configured based on how NetSuite organizes data, not based on how their operations team actually thinks about inventory.

And that's the integration architecture failure right there.

When The Misalignment Gets Locked In

This misalignment? It happens during the initial implementation. And then it gets locked in.

Most implementations follow the same pattern:

Consultant asks what the company does. Maps it to NetSuite's standard modules. Configures those modules. Calls it done.

They're optimizing for "go live" speed, not for how information actually needs to flow through the organization.

The problem is they're working from NetSuite's data model backward, instead of working from the operational reality forward.

You end up with a system that technically works. All the data is there. Nothing is broken. But it doesn't match the team's mental model of their own work.

And once it's live, that structure becomes really hard to change.

People build workarounds instead of fixing the architecture. The Excel reconciliation becomes the integration layer. It's like they're using spreadsheets to translate between how NetSuite organizes information and how they actually need to see it.

The business might change over time, but that original structural mismatch? That gets set in concrete during implementation.

Research shows that more than 70% of ERP implementations fail to reach their original business case goals. Most companies don't even realize it's a problem until way later, when they're wondering why this expensive system isn't delivering the efficiency they expected.

How I Decide What's Worth Fixing

When I come in to look at these situations, I'm not always rebuilding everything. Sometimes the fix is easy. Sometimes there are many layers of complexity.

But if fixing the complex problem would bring more relief, it's worth estimating the effort and going for it.

I look at three things: frequency, propagation, and permanence.

Frequency is straightforward. How often does this problem create friction?

If someone's doing that Excel reconciliation once a month, maybe the workaround is fine. But if it's happening daily, or multiple times a day, that's compounding waste.

Propagation is about how far the problem spreads.

Does this broken integration affect just one person, or does it create downstream issues for five other people who depend on that data?

If the inventory manager's workaround means the purchasing team is working with stale information, which means the finance team can't close books accurately, that's propagation. The initial problem multiplies.

Permanence is the big one. Will fixing this actually solve the problem, or are we just moving the friction somewhere else?

I need to trace whether the root cause is really in the integration architecture, or if there's something deeper in how the business process itself is structured.

If it's high frequency, wide propagation, and I can see a permanent fix that addresses the structural cause, then yeah, it's worth the effort even if it's complex.

Because the alternative is that compound waste continuing forever.
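
Written down, the triage looks something like the sketch below. The thresholds and field names are illustrative, not a formula I apply mechanically; the judgment lives in getting honest inputs.

```python
# An illustrative encoding of the triage, not a mechanical formula.
# Thresholds are assumptions chosen for the example.
from dataclasses import dataclass

@dataclass
class IntegrationIssue:
    name: str
    occurrences_per_week: int   # frequency: how often the friction hits
    people_affected: int        # propagation: downstream dependents
    fix_is_structural: bool     # permanence: does the fix remove the cause?

def worth_fixing(issue: IntegrationIssue) -> bool:
    """High frequency or wide propagation, plus a structural fix, wins."""
    frequent = issue.occurrences_per_week >= 5   # roughly daily or worse
    spreads = issue.people_affected > 1
    return issue.fix_is_structural and (frequent or spreads)

reconciliation = IntegrationIssue(
    name="daily Excel inventory reconciliation",
    occurrences_per_week=10,
    people_affected=6,          # purchasing and finance depend on the output
    fix_is_structural=True,
)
print(worth_fixing(reconciliation))  # True: compounding waste, permanent fix
```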

When Automation Makes Things Worse

I had a client where the sales team was manually updating opportunity data in NetSuite after every customer call because the information from their CRM wasn't syncing properly.

Seemed obvious. Fix the CRM integration, automate the data flow, problem solved.

So I mapped out the integration, built the connector, got the systems talking. Data started flowing automatically. Should have been a win.

But then I started getting complaints from the sales team.

They were saying the data coming through wasn't accurate. Things were being categorized wrong. They were having to go back and fix it anyway.

I thought maybe it was a mapping issue, so I refined the field mappings. Still problems.

Finally, I sat down with one of the sales reps and watched what was actually happening.

Turned out, during those manual updates, they weren't just copying data. They were interpreting it.

A customer might say one thing in a call, but the rep knew from context that it meant something slightly different for how it should be recorded in NetSuite. That interpretation step was actually valuable.

By automating it, I didn't eliminate the work. I just moved it.

Now instead of doing interpretation during the update, they were doing cleanup after the automated sync. Same friction, different location.

Maybe even worse, because now they had to undo incorrect automation instead of just entering it right the first time.

The real problem wasn't the manual update. It was that the CRM and NetSuite had different data models that required human judgment to translate between them.

The fix wasn't integration automation. It was either accepting the manual step as necessary, or rebuilding how one of the systems structured that information.

I had solved a symptom and called it architecture.

Data Gaps vs. Judgment Gaps

This taught me to ask whether a connector is bridging a data gap or a judgment gap.

If the problem is that data exists in System A and needs to exist in System B, and there's a clear, consistent transformation between them—no interpretation required—then a connector like the NetSuite MCP connector can be the right solution.

It's solving a real structural problem: the systems don't talk, and they should.

But if the problem is that the data needs human context to be meaningful, or the transformation logic changes based on business circumstances, or there's interpretation happening in that gap, then the connector just automates the wrong thing.

You end up with faster bad data.

The way I test this is by asking: if we automate this connection, what happens when something goes wrong?

If the answer is "someone will catch it and fix it," that tells me the human is still doing the real integration work. They're just doing it as quality control instead of upfront.

That's not architecture. That's just moving the friction.

How To Spot The Difference

When I'm in the diagnostic phase with a client, I watch what happens when the data is wrong.

If someone gets incorrect data and they immediately know it's wrong and can explain why, that's a judgment gap. They're applying context that doesn't exist in the system.

But if incorrect data sits there and nobody notices until it causes a downstream problem, that's usually a data gap. The information just isn't flowing.

I also ask people to walk me through their decision-making process.

"You're looking at this screen—what are you actually deciding?"

If they say "I'm checking if these numbers match," that's data. If they say "I'm determining whether this customer situation requires an exception," that's judgment.

Another signal is consistency. I'll ask "does this transformation always work the same way, or does it depend?"

If the answer involves words like "usually" or "it depends on" or "sometimes," that's judgment creeping in.

Data gaps have consistent rules. Judgment gaps have contextual rules.

The tricky part is that most people don't realize they're applying judgment. They think they're just "doing the process."

So I can't just ask them directly. I have to observe the work, watch what happens when edge cases show up, and see where the human is actually adding value versus just moving information around.

If I can write down the transformation logic in a flowchart without any "consult someone" or "use your best judgment" boxes, it's probably a data gap.

If I can't, it's a judgment gap, and automation is going to create more problems than it solves.
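
If it helps, the flowchart test translates directly into code. A data gap is a transformation you can write as a plain rule or lookup; a judgment gap bottoms out in an exception only a person can resolve. Everything below, field names and thresholds included, is hypothetical.

```python
# A data gap: the rule is deterministic, so a connector can own it.
STAGE_MAP = {"Qualified": "prospect", "Negotiation": "pending", "Won": "customer"}

def map_stage(crm_stage: str) -> str:
    return STAGE_MAP[crm_stage]   # no branch says "ask a human"

# A judgment gap: the rule bottoms out in context neither system stores.
class NeedsHumanJudgment(Exception):
    pass

def classify_opportunity(crm_record: dict) -> str:
    if crm_record["amount"] < 10_000:
        return "standard"
    # Above this, reps reclassify based on what the customer actually meant
    # on the call, which exists in no field of either system.
    raise NeedsHumanJudgment(f"review opportunity {crm_record['id']}")
```

If your connector design needs the second function, automating it just produces faster bad data.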

When People Realize Their Value

Something interesting happens when people realize the judgment they're applying is actually valuable.

Before that realization, they're usually apologetic about the manual work. They think they're being inefficient, like they're failing to use the system properly.

So when I show up, they expect me to automate everything and make their jobs easier.

But when they see that their judgment is preventing errors, adding context that makes the data useful, or catching edge cases the system can't handle, the whole dynamic changes.

Suddenly they're not asking "can you automate this?" They're asking "should we automate this?"

They start protecting the parts of their work that actually matter, instead of assuming everything manual is bad.

And honestly, that's when the real diagnostic work can happen.

Because now they're helping me understand which parts of the process are just tedious data movement that should be automated, and which parts are where their expertise actually lives.

They become collaborators in designing the architecture instead of just users waiting for me to fix things.

The irony is that sometimes this means we automate less than they originally wanted.

But what we do automate actually works, because we're not trying to replace judgment with rules. We're freeing them up to apply their judgment where it actually creates value, instead of wasting it on reconciling spreadsheets.

What Should Come First

If I could redesign how NetSuite implementations happen, I'd spend the first phase just mapping information flow. Not systems.

I wouldn't touch NetSuite configuration at all.

I'd follow how decisions actually get made in the organization. Who needs what information, when do they need it, what do they do with it, and where does it go next?

I'd watch people work for a week. Not ask them to explain their process—actually watch them.

Because what people say they do and what they actually do are usually different.

I'd document every time someone exports data, every time they ask a colleague for information, every time they make a decision based on something they saw.

Then I'd map that out as information flow, not as system requirements.

This person needs inventory status to make purchasing decisions. That information comes from three different sources right now. They reconcile it manually because the sources don't agree.

Why don't they agree? What's the source of truth? Who else depends on that decision?
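
For what it's worth, mapping a flow doesn't require special tooling. One record like the sketch below per decision is enough to surface the disagreeing sources and the downstream dependents; the values are hypothetical.

```python
# One way to write down an information flow instead of a system requirement.
# Every value is an observation from watching work, not a config item.
from dataclasses import dataclass, field

@dataclass
class InformationFlow:
    decision: str                  # what is actually being decided
    decision_maker: str
    inputs: list[str]              # where the information comes from today
    cadence: str                   # when the decision has to happen
    downstream: list[str] = field(default_factory=list)  # who depends on it

purchasing = InformationFlow(
    decision="reorder quantity per item",
    decision_maker="inventory manager",
    inputs=["NetSuite on-hand report", "lot status export", "transfer spreadsheet"],
    cadence="daily, before the purchasing cutoff",
    downstream=["purchasing team", "finance month-end close"],
)
```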

Only after I understand how information actually flows through the organization would I start thinking about NetSuite configuration.

Because then I'm designing the system architecture to match the operational reality. Not trying to force operational reality into NetSuite's default structure.

Most implementations skip this entirely. They go straight from "what does your company do?" to "here's how we'll configure NetSuite."

That's why you get systems that technically work but don't match how people actually think.

The Resistance To Diagnostic Work

When I propose spending a week just watching people work before touching any configuration, the usual reaction is resistance.

They think I'm wasting time.

They've already decided NetSuite is the solution. They've already spent money on licenses. They want to see progress, and progress to them means configuration happening, modules being set up, data being migrated.

Watching people work doesn't feel like progress.

I get pushback like "we already know our processes" or "we documented everything in the requirements phase" or "can't we just start and adjust as we go?"

There's this pressure to show visible forward movement.

And I get it. From their perspective, they're paying for implementation and the clock is ticking. Spending a week on observation feels like delay.

Especially when the sales process probably promised them a timeline that assumed we'd skip this entire phase.

But here's what I've learned to say: "You can spend one week now understanding how work actually happens, or you can spend six months after go-live fixing why the system doesn't match reality. Your choice."

Because that's what actually happens.

The companies that rush through discovery end up in this endless loop of customizations, workarounds, and user complaints. They go live, realize it doesn't work how people need it to work, and then spend way more time and money trying to retrofit the architecture.

The ones who let me do the diagnostic work upfront? They go live slower, but when they do, the system actually gets used. People trust it. The adoption is real, not forced.

It's a hard sell though. The industry has trained people to optimize for speed, and I'm asking them to optimize for permanence.

That's a different value system, and not everyone is ready for it.

The Question That Reveals Everything

When you're evaluating whether to use something like the NetSuite MCP connector, there's one question you should ask yourself before you buy it.

"What problem are we solving with this connector, and how do we know we've correctly diagnosed the root cause?"

If the answer is "we need to get data from System A to System B" or "we need to automate this manual process," that's a symptom-level answer.

You're optimizing for speed. Just make the immediate pain go away.

But if the answer is "we've mapped our information flow, identified that these two systems need to share data in this specific way to support this business decision, and we've verified that the transformation between them doesn't require human judgment," that's a structural answer.

You've done the diagnostic work.

The connector itself is neutral. It's just a tool.

But whether it's the right tool depends entirely on whether you understand the actual problem you're solving.

Most companies buy it because they see the friction—the manual exports, the reconciliation, the duplicate entry—and they want it gone. That's speed thinking.

Get rid of the visible pain as fast as possible.

Permanence thinking asks: why does this friction exist in the first place?

Is it because the systems don't talk, or is it because we never designed how they should talk? Is the manual step covering up a judgment gap we haven't acknowledged? Will automating this create new problems we haven't anticipated?

If you can't answer those questions, you're not ready to buy the connector.

You're just going to automate your way into a different set of problems.

What's your experience been with NetSuite integration? Are you dealing with manual workarounds that feel like they should be automated, or have you found places where the manual step is actually doing important work?
