According to a 2025 MIT study on enterprise generative AI, 95% of organisations are seeing no measurable return on an estimated $30–40 billion in global GenAI investment.
These are zero returns, not merely underperforming ones.
Importantly, this failure is not due to technological immaturity. The tools work and the platforms are capable. Adoption is widespread.
Over the past 18 months, working with mid-sized financial and professional services firms in the UAE and GCC, we have observed the same pattern repeatedly. Organisations acquire high-end AI tools, deploy them to various teams, and then struggle to describe what changed. Revenue does not shift materially. Costs do not decline in ways that can be measured. Team workload remains largely unchanged.
Now, the data confirms the underlying issue: most organisations are deploying technology without establishing baselines, without defining success metrics, and without understanding the workflows they aim to augment.
Consider the following examples:
This is the core pattern: implement first, measure later, and in many cases, never measure at all.
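A baseline does not need to be elaborate. As a minimal, hypothetical sketch in Python of what "establish a baseline and define a success metric" can mean in practice (the metric name, figures, and threshold below are illustrative, not drawn from any client engagement):

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """A single workflow metric with a pre-deployment baseline and an agreed target."""
    name: str
    baseline: float            # measured before any AI tooling is introduced
    target_change_pct: float   # improvement required for the initiative to count as a success

def is_success(metric: Metric, measured_after: float, lower_is_better: bool = True) -> bool:
    """Compare a post-deployment measurement against the agreed target."""
    if lower_is_better:
        change_pct = (metric.baseline - measured_after) / metric.baseline * 100
    else:
        change_pct = (measured_after - metric.baseline) / metric.baseline * 100
    return change_pct >= metric.target_change_pct

# Hypothetical example: proposal turnaround time, in hours.
turnaround = Metric(name="proposal_turnaround_hours", baseline=36.0, target_change_pct=25.0)
print(is_success(turnaround, measured_after=24.0))  # True: a 33% reduction beats the 25% target
```

The point is not the code; it is that the baseline is captured before deployment and the success condition is agreed in advance, so "did it work?" has an answer.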
Avanade’s Chief Technology Officer recently observed that AI progress stalled in 2025 not because the technology was poor, but because organisations were unable to prioritise and define the processes suitable for AI implementation.
Most companies lacked:
As a result, AI efforts were launched without clear operational anchors. Many attempted to "make AI work" within fragmented systems, rather than reimagining their workflows to take advantage of AI’s strengths.
This is a strategic oversight. If an organisation cannot clearly explain how a lead converts into a customer, or how client servicing is structured across departments, it is not ready for intelligent automation. Similarly, if key data is split across six incompatible platforms, any AI layer added on top will lack the inputs needed to function.
The research is clear: AI can only perform as well as the workflows and data infrastructure that support it.
This is why our framework begins with process clarity; tool selection follows only once the processes are understood.
The following methodology is used in all our AI transformation engagements. Each step is reinforced by sector research and practical application in the UAE mid-market.
The first step is not selecting a tool. It is understanding the people.
This involves more than reading an organisational chart. It includes:
Why this matters: Research shows that successful AI efforts begin with centralised, not scattered, decision-making. Leadership must define where AI investment will be focused. Without clarity on role ownership, AI cannot integrate effectively into workflows.
We use structured interviews, role-based process mapping, and capability assessments to build this understanding.
We use a simple methodology known as “Making Toast”. Each team member receives sticky notes and a marker and captures one step per note, using simple drawings rather than words.
We start by mapping the process of making toast. The exercise quickly reveals how differently people interpret even basic sequences. We then apply the same visual mapping technique to actual business processes, such as onboarding, lead conversion, or proposal generation.
Why this matters: You cannot automate a process you cannot articulate. This step surfaces hidden bottlenecks and divergent mental models. It creates a shared understanding of how the business functions.
Most clients experience 30–40% efficiency gains before any technology is introduced.
Once the process is mapped, we focus on four priorities:
Why this matters: According to PwC, 80% of AI value comes not from the AI itself, but from redesigning the workflows AI will support. Without this optimisation, automation simply accelerates inefficiency.
Case example: A UAE-based executive search firm reduced its talent acquisition cycle from 18 to 6 weeks after realigning its client validation checks earlier in the process.
This step defines how work is distributed across people and systems. It includes:
Why this matters: This is where we identify the gaps between system support and actual work. AI cannot replace or enhance workflows it does not understand.
The output is a detailed map of responsibilities, dependencies, and constraints, which becomes the technical input for system design.
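As an illustration only, a fragment of such a map can be captured in structured form; the roles, systems, and constraints in the Python sketch below are hypothetical placeholders rather than output from a real engagement.

```python
# Hypothetical fragment of a responsibility/dependency map used as input to system design.
process_map = {
    "lead_conversion": {
        "owner": "Business Development",                        # accountable role
        "steps": ["qualify_lead", "prepare_proposal", "client_validation", "sign_off"],
        "dependencies": {
            "prepare_proposal": ["CRM", "document_management"],  # systems each step relies on
            "client_validation": ["compliance_checklist"],
        },
        "constraints": {
            "client_validation": "must complete before the proposal is sent",
            "sign_off": "partner approval required above a set threshold",
        },
    },
}

# System design can then ask simple, answerable questions, for example:
for step, systems in process_map["lead_conversion"]["dependencies"].items():
    print(f"{step}: depends on {', '.join(systems)}")
```

Once responsibilities and dependencies are explicit in this way, gaps between what systems support and what people actually do become visible rather than anecdotal.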
Next, we translate process requirements into system requirements.
Why this matters: PwC research confirms that workflow architecture is the largest driver of value. This step designs how AI will function in practice: what it will trigger, how it will learn, and when it should escalate.
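The sketch below, again with hypothetical names and thresholds, illustrates the kind of trigger-and-escalation logic this step produces: a rule fires automatically within agreed limits, and hands off to a person when a confidence or policy limit is breached.

```python
# Minimal sketch of a workflow routing rule: automate when confident, escalate when not.
# Thresholds and field names are illustrative assumptions, not a production design.

CONFIDENCE_THRESHOLD = 0.85
APPROVAL_LIMIT = 50_000  # e.g. transactions above this amount always go to a human

def route_task(task: dict) -> str:
    """Decide whether a task is handled automatically or escalated to a reviewer."""
    if task["amount"] > APPROVAL_LIMIT:
        return "escalate_to_reviewer"      # policy limit: always human-approved
    if task["model_confidence"] < CONFIDENCE_THRESHOLD:
        return "escalate_to_reviewer"      # low confidence: human in the loop
    return "auto_process"                  # within limits: the system completes the step

print(route_task({"amount": 12_000, "model_confidence": 0.93}))  # auto_process
print(route_task({"amount": 80_000, "model_confidence": 0.97}))  # escalate_to_reviewer
```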
Case example: A wealth advisory firm saved 18 hours per week by integrating three systems and automating internal reporting processes.
At this stage, we select and implement tools based on the architecture designed in Step 5.
System selection is guided by:
Why this matters: Tool selection without prior workflow design often leads to overlapping systems, shadow IT, and low adoption. Systems chosen against a defined architecture, by contrast, support business needs at every level.
Case example: A financial services firm eliminated an entire redundant platform and improved operational efficiency by 55%.
Only after systems are implemented and connected can AI be layered on top.
Three integration steps are required:
Why this matters: AI augmentation works only when systems and data are prepared. Agentic AI (autonomous systems operating across workflows) is now achievable, but only with full integration.
This final step typically takes 8–12 weeks, covering:
Industry evidence shows that businesses are moving from AI-assisted tools, which wait for human input, to agentic AI: systems that initiate actions autonomously based on pre-defined rules.
In sectors like commercial credit, client onboarding, and treasury, this shift is already underway.
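The difference is easiest to see side by side: in the assisted pattern the system recommends and a person decides; in the agentic pattern the system acts within pre-defined rules and only surfaces exceptions. The sketch below is a simplified, hypothetical illustration (function names, thresholds, and fields are assumptions), not a production design.

```python
# Hypothetical illustration of the shift from AI-assisted to agentic behaviour.

def assisted_flow(invoice: dict) -> str:
    """AI-assisted: the system recommends, a human decides."""
    recommendation = "approve" if invoice["risk_score"] < 0.2 else "review"
    return f"Recommendation for a human: {recommendation}"

def agentic_flow(invoice: dict) -> str:
    """Agentic: the system acts within pre-defined rules and escalates exceptions."""
    if invoice["risk_score"] < 0.2 and invoice["amount"] <= 10_000:
        return "approved automatically"     # inside the rules: act without waiting
    return "escalated to a human reviewer"  # outside the rules: surface as an exception

sample = {"amount": 4_500, "risk_score": 0.05}
print(assisted_flow(sample))  # Recommendation for a human: approve
print(agentic_flow(sample))   # approved automatically
```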
The UAE ranks first globally in AI adoption (FinTech News, 2025), but sustained advantage will accrue only to organisations that apply structured implementation rather than tool-first adoption.
For financial and professional services firms between 50 and 200 employees, the opportunity is real, but only if tackled with the correct sequencing.
The market is ready. The tools are mature. But none of this matters without:
Firms that apply this discipline will unlock compounding gains. Those that skip it will face increased costs, complexity, and stagnant outcomes.
95% of AI initiatives fail not because of bad tools, but because foundational work is skipped.
The 5% that succeed follow a process-first approach:
This is an operating discipline; following it closely is what separates the 5% that succeed from the 95% that do not.