
‘Change management’ is not an answer

2026/05/07 00:02
8 min read

By Erika Fille T. Legara

ASK any leadership team why their AI investment didn’t deliver, and the answer is almost always some version of change management. People weren’t ready. Culture is resistant. Process wasn’t aligned. The answer is offered almost reflexively, like a diagnosis that doubles as an exit from the conversation. Every MBA cohort learns change management.

At the Asian Institute of Management, even the Master of Science in Data Science program, arguably the premier enterprise data science program in the region, includes Leadership and Change Management as a core course. The recognition that people and process are central to any transformation is not new, and it is not confined to business generalists. And yet, knowing about it has not made it easier to do.

Two words. Endless invocation. Almost no specificity about what, exactly, broke down and where.

I want to push back on the lazy use of this phrase, not because change management is wrong as a category, but because invoking it without unpacking it has become a way to avoid the harder work of actually diagnosing what broke. Throwing “change management” at a failed AI deployment is like an auditor writing “numbers don’t add up” in a findings report. Technically true, but professionally useless.

PROCESS AS MUSCLE MEMORY
The more I’ve worked with organizations trying to embed AI into their operations, the more I’ve found that the root problem is almost always the same: nobody has a shared understanding of the process.

I mean this precisely. Ask leadership how a particular workflow operates, and they’ll give you a confident answer. Then ask the department head. Then ask someone three levels down who actually runs the workflow day to day. You will get three different descriptions of the same process. And, oftentimes, these aren’t just minor variations; they are genuinely different accounts of how work moves, who owns what, and what “done” looks like.

In most organizations, process knowledge lives in people’s bodies, not in documentation. It is muscle memory; habits and assumptions that were never written down because they never had to be. It mostly works because everyone has learned to navigate the same ambiguity in roughly compatible ways.

I discovered this firsthand while working with my team to centralize data across several offices. We were not trying to build anything complicated: simply a shared data structure, with no AI model or deployment required. Leadership was aligned. Individual offices were enthusiastic. By every conventional measure, the conditions for success were there.

And then the actual work started, and suddenly we were in a room where no two people could agree on how the existing process worked, who held which piece of it, or what we were even trying to replace. The root of the problem was not resistance; it was a process that no one could clearly articulate.

CODIFYING THE IMPLICIT
This is where the change management framing fails. It locates the problem in attitudes — in willingness, in culture, in openness to transformation. The organizations I’ve seen struggle with AI are not, by and large, staffed by people who refuse to change. They want to change. They know they need to. The confusion arises when the change requires them to surface a process they’ve only ever understood intuitively and to hand accountability for it to a specific person.

I’ve seen this in two forms, and both are instructive. A financial institution we worked with had established thresholds for certain decisions; everyone on the team knew the thresholds and agreed they had been discussed and settled. But when a third-party audit asked them to trace how those thresholds were derived, the documentation wasn’t there. Sure, the number existed, but the rationale had evaporated.

In another case, we asked an organization to share a dataset and discovered that the data had been mapped in a particular way, presumably for a legitimate reason, perhaps privacy, perhaps something operational, but nobody had written it down. Each person we spoke to had a slightly different account of why the data looked the way it did. No single source of truth, because the decision had lived in the room where it was made and nowhere else.

I wouldn’t call this negligence, because truly, it’s how most organizations actually operate. Process knowledge accumulates in people, in habits, in shared assumptions that never had to be made explicit because everyone around them already understood. Until someone asks.

That surfacing is hard. And it is load-bearing. Because you cannot plug an AI system into a process that isn’t defined. You can bolt it on, maybe. You can run it in parallel, with a workaround, on the side. But you cannot integrate it.

I’ve watched both scenarios play out. In one instance, a company came to us at the ACCeSs Lab of the Asian Institute of Management to evaluate potential sites and forecast the success of a new branch. The workflow was admittedly rough; data was sent via e-mail, processed through the model, and returned as a report. Despite this, the outputs were useful, and the company continued to rely on them.

In a different case, an organization invested hundreds of millions of pesos in building a system that was technically sound in every respect. It produced exactly what it was designed to produce, yet it ultimately went unused. The issue was not technical performance, but integration. No one could determine where the system fit within actual operations. It was never embedded into the workflow — not due to resistance, but because the process it needed to connect to had not been defined clearly enough to provide a point of integration.

A third pattern emerges on the user side. Some systems were developed to meet real needs and delivered clear gains in accuracy, efficiency, and reliability, yet staff did not adopt them. The existing method was slower and less precise, but it was familiar. The discomfort of shifting away from a practiced routine, even toward a better alternative, was enough to stall adoption. The status quo carries weight, and comfort zones are real.

Labeling this as “change management” acknowledges the issue, but does little to clarify which specific sources of friction need to be addressed.

What actually helps is treating the process as the primary object of analysis, before any conversation about tools or systems begins. This means making the process visible, through RACI matrices, process mapping, structured interviews, and whatever gets everyone’s implicit understanding into an explicit shared form. It means accepting that the map you draw will reveal contradictions, because the same task is being done differently by different people, and nobody has realized it. It means doing that diagnostic work before deciding where AI fits, not after.

But the RACI matrix is only a document. What makes it meaningful is the conversation that produces it. In my experience, nothing replaces bringing people into the same room and working through the process together. In a real working session, someone might say, “I thought you handled that step,” and someone else says, “We’ve never handled that step,” and the gap finally becomes visible to everyone at once. Regular alignment meetings, sustained over time, help ensure that this shared understanding does not fade back into comfortable assumptions.
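For readers who want to see what that diagnostic looks like in practice, the contradictions a RACI exercise surfaces can even be checked mechanically. The sketch below is a purely hypothetical illustration, not a real tool or any organization's actual matrix: each task maps roles to one of the standard RACI codes, and a simple pass flags tasks with no accountable owner, or more than one — exactly the kind of gap that "I thought you handled that step" reveals.

```python
# Hypothetical sketch: checking a RACI matrix for accountability gaps.
# Codes: "R" = Responsible, "A" = Accountable, "C" = Consulted, "I" = Informed.
# Task and role names below are invented for illustration.

raci = {
    "Collect branch data":    {"Ops": "R", "Analytics": "C", "Finance": "I"},
    "Validate thresholds":    {"Risk": "A", "Ops": "A", "Analytics": "R"},
    "Publish monthly report": {"Analytics": "R", "Ops": "I"},
}

def find_gaps(matrix):
    """Flag tasks without exactly one Accountable owner -- the gaps
    that only become visible once the implicit process is written down."""
    issues = []
    for task, roles in matrix.items():
        accountable = [role for role, code in roles.items() if code == "A"]
        if len(accountable) == 0:
            issues.append(f"{task}: nobody is Accountable")
        elif len(accountable) > 1:
            owners = ", ".join(accountable)
            issues.append(f"{task}: multiple Accountable owners ({owners})")
    return issues

for issue in find_gaps(raci):
    print(issue)
```

In this invented example, every task fails the check in one of the two ways, which is the point of the exercise: the document does not create accountability, it merely makes its absence undeniable.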

THE LOGIC OF ITERATIVE DEPLOYMENT
Once the process is legible and ownership is established, the question of where to introduce AI becomes tractable. And here, the instinct to start big is usually the wrong one. A small use case that cuts across at least two offices or functions is far more instructive than a contained pilot within a single team. Cross-functional work surfaces the integration problems that are most critical — the handoffs, the data dependencies, the accountability gaps. Get that working well, measure it, understand what held and what didn’t, and then build from there.

This is slow. It is supposed to be slow. Traditional, non-digital-native organizations that have made AI genuinely work in their operations did not get there through a bold deployment or a large budget. They got there by being methodical: mapping processes, resolving ownership, running small proofs, and iterating. The complexity compounds because organizations are not machines. They are networks of people who have adapted to each other over time, and the whole behaves in ways that no single part intended or can fully observe. Shortcuts through that complexity don’t actually shorten the timeline; they push the reckoning further down the road.

Calling that “change management” is not wrong. But it’s not enough to be useful.

Erika Fille T. Legara, Ph.D. is a physicist, educator, and data science and AI practitioner working across government, academia, and industry. She is the inaugural Managing Director and Chief AI and Data Officer of the Education Center for AI Research, and an associate professor and Aboitiz Chair in Data Science at the Asian Institute of Management. She serves on corporate boards, is a fellow of the Institute of Corporate Directors, an IAPP Certified AI Governance Professional, and a co-founder of CorteX Innovations, Corp.
