The cloud transition cost enterprises billions in wasted spend and a decade of false starts before the industry converged on the model that worked. The same inflection point is happening now with AI-native development. The difference is: this time, we know what the right model looks like — and the window to adopt it is measured in months, not years.
When cloud computing arrived, every enterprise faced the same question: do we retrain our existing infrastructure teams, or do we build something new? Most chose retraining. Most failed. The ones that succeeded built a Cloud Centre of Excellence (CCoE) — a dedicated, structurally independent team that established the patterns before the wider organisation adopted them.
- **Cost overruns.** Cloud bills ran 2–3× over budget because nobody optimised for cloud-native patterns: always-on VMs instead of auto-scaling, no FinOps discipline, no right-sizing.
- **Security failures.** Traditional engineers didn't understand IAM, shared responsibility models, or cloud-native security postures. Public S3 buckets. Exposed credentials. Data breaches that forced executive attention.
- **Failed migrations.** 12–18 month migration projects that delivered nothing usable. The team didn't have the skills. The architecture was wrong. The timeline was a fiction.
- **Talent loss.** The best engineers left for companies that were doing cloud properly. The company was left with the people least equipped to fix the mess — and a reputation that made hiring harder.
The universal mistake was the same every time: assuming that existing teams, with existing skills and existing processes, could adopt a fundamentally new paradigm by simply being told to do so. It did not work for cloud. It will not work for AI.
The cloud transition and the AI transition share an identical structural shape. The technology is different. The organisational mistake is the same. The only difference is speed — what took cloud a decade is happening with AI in months.
The risk is not that the engineering team cannot learn AI development. They can — given time, reference implementations, and proven patterns to follow. The risk is that without those things, they will build AI the way they build traditional software: treating agents as API endpoints, wrapping existing processes around new tools, and missing the structural advantages that make the paradigm worth adopting.
Cloud infrastructure evolved on annual release cycles. Enterprises had years to recover from early mistakes. AI models and tooling evolve on weekly cycles. A new model release can obsolete an architectural decision overnight. The competitive window for organisations that move first is narrower than it was with cloud — and the cost of moving slowly is not just inefficiency. It is irrelevance.
Before asking what needs to change, it is worth understanding what the current structure actually produces. The data is unambiguous: the vast majority of an engineering team's time is consumed by structure, not by building.
The 52-minute median is not a criticism of developers — it is a description of the structure they work inside. The overhead is not waste. In a traditional team it is structurally necessary: coordination, alignment, context transfer between people. Remove those structural conditions — through domain ownership, parallel agent streams, and conversation as the record — and the overhead disappears with them. The number is not achieved by asking developers to work harder. It is achieved by changing the structure they work within.
The instinct when velocity is low is to hire more engineers. But every new person adds communication channels — not just one, but one to each existing team member: N new channels for a team of N. Total channels grow quadratically, as N(N−1)/2, so at a certain team size the majority of effort is spent on synchronisation, not on building.
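The quadratic growth is easy to make concrete. A minimal sketch (the `channels` helper here is illustrative, not from the source):

```python
def channels(n: int) -> int:
    """Pairwise communication channels in a fully connected team of n people."""
    return n * (n - 1) // 2

def added_channels(n: int) -> int:
    """New channels created when one person joins a team of n."""
    return channels(n + 1) - channels(n)

for n in (5, 10, 20, 50):
    print(f"team of {n:>2}: {channels(n):>4} channels "
          f"(+{added_channels(n)} if one more person joins)")
```

A team of 10 already carries 45 channels; doubling headcount to 20 more than quadruples them to 190. This is why adding people past a certain point buys synchronisation, not output.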
Every ceremony in a traditional engineering team exists to manage communication between people with overlapping context. Remove the overlap, and the ceremony loses its reason to exist.
| Ceremony | Exists because | In the agentic model |
|---|---|---|
| Daily standup | No shared context — each developer holds a different mental model of the system state | Domain ownership eliminates overlap. Context lives in the conversation log, not in someone's head. |
| Sprint planning | Work must be allocated manually across people with overlapping capabilities | Domain ownership determines routing. The developer self-selects from their domain. |
| Refinement | Complexity is unclear because no one has explored the implementation | Parallel probing surfaces complexity directly. The proof is in the conversation. |
| Retrospective | Process friction is invisible until explicitly surfaced in a meeting | Friction is visible in LOC trends, conversation logs, and domain boundary pressure — directly observable. |
| Ticket authoring | Requirements must be translated into structured artefacts before work begins | The conversation context is the ticket. It already contains the requirement, reasoning, and constraints. |
This is not an argument against process. It is an argument that the ceremonies of traditional software engineering were designed to solve a specific structural problem — coordinating many people across many communication channels. Reduce the channels through domain ownership and conversation-as-record, and those ceremonies become overhead without a purpose. The functions they performed still exist. They are performed better, and automatically, by the structure itself.
Left: every developer is a channel for every other — coordination is permanent overhead. Right: one developer, five agent streams — zero inter-stream coordination.
Just as the CCoE was the organisational model that made cloud adoption work, the ACoE is the model that makes the transition to agentic development work. A dedicated, structurally independent team that builds the golden path — the proven patterns, the reference implementations, the operational playbook — before the wider engineering organisation adopts it.
- **Build.** Build the first product using agentic development. Create the reference implementation that demonstrates the model works — not in theory, but in shipped, production-quality software.
- **Codify.** Document the patterns, the domain ownership model, the conversation discipline, the testing standards. Create the playbook the wider team will follow — not from theory, but from what actually worked.
- **Onboard.** Developers enter the ACoE structure and adopt its norms. They learn by doing — working on real problems, inside a working model — not by attending a training course.
The instinct is to incubate the agentic model inside the existing engineering team — a small group that proves the concept before wider adoption. It is the safest-sounding approach. It is also the one most likely to fail. Not because the people are wrong, but because the structure is.
Existing teams are containers. They have their own processes, measurement frameworks, reporting lines, and cultural norms — all of which were built for the traditional model. A new team inside that container does not escape the gravity of those structures. It just experiences the pull more slowly — until a deadline, a resourcing decision, or a reporting conversation bends it back into the shape the container expects.
| Incubating inside the existing org | Building the ACoE independently |
|---|---|
| Sprint ceremonies pull the new team back in. Work must be represented as tickets to fit the upstream reporting structure. The agentic model becomes a delivery method for Jira tickets. | The ACoE sets the norms. Incoming developers enter an environment where the discipline is already established and the old habits have nowhere to take root. |
| The existing org borrows people from the new team when deadlines slip elsewhere. Domain ownership fragments. The parallel model collapses back into coordination overhead. | New developers are onboarded into the conversation record. They learn by reading existing sessions, picking up a bounded domain, and producing output under the existing discipline. |
| The new team is measured on the same metrics as the old team. Velocity that looks different, ceremonies that are visibly absent — all of this attracts pressure to conform. | The ACoE defines its own measurement baseline. Productive LOC, delivery pace, and production quality are established as the standard from day one. |
| Senior traditional developers mentor the new team in old practices. Expertise flows in the wrong direction. | Traditional developers who join the ACoE see the model working before they are asked to commit to it. Conviction follows evidence. |
Existing processes close around the new team. The org's gravity shapes everything inside it.
The ACoE is the core. Developers enter its structure — and adopt its norms.
The right question is not “how do we retrain our existing team?” It is: “how do we build the thing we want, and invite people into it?” Start small. Keep the team structurally independent. Protect domain ownership, conversation discipline, and measurement standards from the gravity of the existing org. Then introduce traditional developers one at a time — into the new structure, on the new terms.
The transition to agentic development is not a single initiative. It is two parallel tracks managed in concert: converting developers to the new model, and expanding the product surface to absorb their increased capacity. One without the other fails.
Capacity grows. Product stays flat. Surplus capacity has nowhere to go.
An innovation function expands product scope in parallel. New capacity is absorbed by new opportunities.
The developers who enter the ACoE and succeed will carry that experience back into every future context they work in. The model propagates not by mandating it from the top down, but by making it demonstrably better to work inside than outside — and letting the people who have experienced it become its advocates.
The CCoE took a decade to become the standard. It cost enterprises billions in wasted migrations, failed projects, and lost talent before the evidence was overwhelming enough to end the debate. The ACoE is the same model, informed by the same lessons, applied to the next paradigm shift. The question is not whether this transition will happen. It is whether it happens with a plan — or with a decade of damage first.