Where most AI investment actually goes

BCG research into high-performing AI organisations identifies a consistent investment pattern: roughly 10% of programme effort on algorithms and model selection, 20% on technology and data infrastructure, and 70% on people and processes. The vast majority of organisations allocate roughly the inverse of this. They invest most heavily in model selection, tool deployment and data infrastructure, and treat people and process design as an afterthought or a later phase.

This inversion is not irrational. Technology spending is visible and justifiable. Tools can be demonstrated. Infrastructure can be measured. The people and process work is harder to scope, harder to quantify upfront and harder to explain to a board. So organisations systematically underinvest in the 70% that generates most of the value.

The result is a recognisable pattern: strong technology deployment, uneven adoption, individual results that do not compound, and a return on investment that falls short of the case made when the programme was approved.

What the 70% produces that the other 30% cannot

The 70% is not a soft investment. It is the work of building the operating model conditions that allow technology to generate institutional value.

It includes workflow redesign: the work of asking what a process should look like given what AI enables, then redesigning it from that starting point rather than augmenting what already exists. This is where most of the productivity and quality gains from AI are generated. Augmentation produces faster workflows. Redesign produces different workflows.

It includes governance design: assigning decision rights, classifying risk, establishing data handling rules and building the review cadence that keeps the framework current. Governance built before deployment governs effectively. Governance assembled in response to problems governs reactively.

It includes change management: the specific work of equipping practitioners with the judgment, checkpoint criteria and role clarity they need in redesigned workflows, rather than generic AI literacy training. And it includes the operating rhythm: the recurring governance and decision cadences that sustain AI capability after the programme ends.

None of this work produces the same result if it is sequenced after technology deployment. The 70% must precede or run in parallel with the technology work. Organisations that deploy technology first and address people and process second consistently find that the technology produces individual results rather than institutional ones.

Rebalance before the next programme design decision

Audit the investment split in your current or planned AI programme across three categories: technology and model costs, data and infrastructure, and people and process design. Where does the 70% sit?
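The audit itself is simple arithmetic. A minimal sketch, with entirely illustrative category names and figures (the 60/30/10 budget below is a hypothetical example of the inverted pattern, not data from any real programme):

```python
def audit_split(budget, benchmark):
    """Return (category, actual share, gap vs benchmark) for each category."""
    total = sum(budget.values())
    return [
        (cat, spend / total, spend / total - benchmark[cat])
        for cat, spend in budget.items()
    ]

# Hypothetical programme budget, showing the common inverted allocation.
budget = {
    "algorithms_and_models": 600_000,
    "technology_and_data":   300_000,
    "people_and_process":    100_000,
}

# The 10/20/70 benchmark pattern described above.
benchmark = {
    "algorithms_and_models": 0.10,
    "technology_and_data":   0.20,
    "people_and_process":    0.70,
}

for cat, actual, gap in audit_split(budget, benchmark):
    print(f"{cat:24s} actual {actual:6.1%}  gap vs benchmark {gap:+6.1%}")
```

Run against real programme figures, the sign of the people-and-process gap answers the question directly: a large negative gap there is the inverted pattern.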

If the largest share sits in the first two categories, the programme is likely to produce deployment metrics rather than institutional value. Shifting the balance toward people and process design is where the value differential is created. The technology decisions become easier once the operating model decisions have been made. And the return on the technology investment becomes substantially higher when the conditions for institutional value are built alongside it.