
Turning AI Hype into Real ROI: Data, Stability, and Frugal Economics

'Enterprises can achieve AI returns by treating proprietary data as a strategic asset, prioritizing stability over constant model churn, and designing for cost-efficient, user-centered workflows.'

Why AI ROI Looks Elusive

Three years after ChatGPT's launch, many commentators call generative AI a 'bubble' because measurable returns have been limited outside a few technology suppliers. High-profile reports and panels have amplified this perception: MIT noted that a large share of pilots fail to scale or deliver clear ROI, while other analysts pointed to agentic AI as a promising path to operational gains. Some industry leaders have even advised CIOs to stop obsessing over exact ROI because measurement is difficult and often misleading.

This leaves technology leaders with a hard question: if your current stack reliably runs the business, what compelling upside justifies swapping in new tooling?

Principle 1: Your data is the value

Enterprise AI success usually starts with data. Many implementations are effective simply because they constrain an AI model to work against your own, proprietary files and repositories. Uploading attachments or targeted datasets into a model narrows its domain, reduces unnecessary prompting, and speeds up accurate answers.
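The grounding pattern described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the keyword-overlap retriever, the document names, and the prompt template are all illustrative assumptions standing in for a real retrieval pipeline.

```python
# Minimal sketch: ground a model prompt in proprietary documents so the
# model answers from your data rather than general knowledge. The
# keyword-overlap retriever and prompt template are illustrative
# assumptions, not a specific vendor API.

def retrieve(question: str, documents: dict[str, str], top_k: int = 2) -> list[str]:
    """Rank proprietary documents by word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        documents.items(),
        key=lambda kv: len(q_words & set(kv[1].lower().split())),
        reverse=True,
    )
    return [name for name, _ in scored[:top_k]]

def build_grounded_prompt(question: str, documents: dict[str, str]) -> str:
    """Constrain the model to answer only from the selected documents."""
    names = retrieve(question, documents)
    context = "\n\n".join(f"[{n}]\n{documents[n]}" for n in names)
    return (
        "Answer using ONLY the documents below. "
        "If the answer is not present, say so.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

docs = {
    "travel_policy.txt": "Employees may book economy flights under 6 hours.",
    "expense_policy.txt": "Meal expenses are reimbursed up to 60 USD per day.",
    "org_chart.txt": "The platform team reports to the VP of Engineering.",
}
prompt = build_grounded_prompt("What is the daily meal expense limit?", docs)
```

Narrowing the context this way is also what reduces prompting overhead: the model sees a small, relevant slice of the business rather than an open-ended question.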

That approach, however, requires sending non-public business data into third-party models, which raises two linked concerns: enforcing confidentiality and negotiating vendor access. Large model providers cannot advance without access to high-quality primary data, and several major deals have recently reflected this need.

From a commercial perspective, it can make sense to trade selective, governed access to your data for better pricing or tailored services. Treat model procurement not as a standard supplier transaction but as a potential partnership that advances both your vendor's model and your business adoption.

Principle 2: Build for stability — boring by design

The model landscape moves fast: hundreds of new generative models can appear in a year and disappear just as quickly. Businesses that relied on now-obsolete models discovered their previously stable workflows broke when vendors shifted offerings.

Operational environments value predictability. Back-office processes and compliance-heavy tasks need tools that run reliably in the background. The most durable AI deployments focus on augmenting repetitive, mandated workflows — for example, automating cross-checks in legal reviews or expense audits while keeping final decisions with humans.

These use cases rarely require bleeding-edge models. Abstracting business workflows away from direct model APIs can provide long-term stability while preserving the option to upgrade underlying engines at a business-appropriate pace.
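One way to implement that abstraction is a thin interface between business logic and any model backend. A minimal sketch follows; the `TextModel` protocol and the two stub backends are illustrative assumptions, where a real backend would wrap a vendor SDK call.

```python
# Minimal sketch of an abstraction layer between business workflows and
# model APIs. TextModel and the stub backends are illustrative
# assumptions; real backends would wrap vendor SDK calls.
from typing import Protocol

class TextModel(Protocol):
    def complete(self, prompt: str) -> str: ...

class LegacyModel:
    """Stands in for the model version the workflow was validated against."""
    def complete(self, prompt: str) -> str:
        return f"legacy:{prompt}"

class NextGenModel:
    """Stands in for a newer engine, adoptable on the business's schedule."""
    def complete(self, prompt: str) -> str:
        return f"nextgen:{prompt}"

def expense_cross_check(model: TextModel, claim: str) -> str:
    # The workflow depends only on the protocol, never on a vendor SDK,
    # so swapping engines never touches business logic.
    return model.complete(f"Flag policy violations in: {claim}")

result = expense_cross_check(LegacyModel(), "Dinner, 120 USD")
```

Because the workflow targets the protocol, retiring `LegacyModel` for `NextGenModel` is a one-line change made when the business is ready, not when a vendor deprecates an endpoint.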

Principle 3: Design for frugality — 'mini-van' economics

Avoid buying technology based on vendor-led benchmarks or the allure of high-end specs. Instead, design systems around what your users consume and at what pace. High-performance or benchmark-focused buys often add unnecessary cost without improving real-world productivity.

Every external model call, remote server, or extra inference adds operational cost. Reconfigure workflows to minimize third-party usage where possible. Many companies have found that customer-support AI implementations ballooned operating expenses; others that constrained throughput to human-readable speeds (for example, under 50 tokens per second) scaled successfully with modest overhead.
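Rate-capping streamed output is straightforward to implement. The sketch below is illustrative: the 50-tokens-per-second default mirrors the figure above, and the token list stands in for a real streaming source.

```python
# Minimal sketch: cap streamed output at a human-readable token rate so
# throughput (and the spend it drives) stays predictable. The default of
# 50 tokens/second mirrors the figure in the text; the token list is a
# stand-in for a real streaming source.
import time
from collections.abc import Iterable, Iterator

def throttle(tokens: Iterable[str], max_tokens_per_sec: float = 50.0) -> Iterator[str]:
    """Yield tokens no faster than the configured rate."""
    interval = 1.0 / max_tokens_per_sec
    next_slot = time.monotonic()
    for tok in tokens:
        wait = next_slot - time.monotonic()
        if wait > 0:
            time.sleep(wait)  # pace the stream to the human reader
        next_slot = time.monotonic() + interval
        yield tok

start = time.monotonic()
out = list(throttle(["AI", "ROI", "is", "earned"], max_tokens_per_sec=200.0))
elapsed = time.monotonic() - start
```

The same cap can double as a budget control: a workflow that cannot emit more than 50 tokens per second also cannot silently multiply inference spend.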

Practical steps to capture ROI

  • Treat your proprietary data as an asset and plan governance and vendor negotiations accordingly.
  • Prioritize stability: choose solutions that integrate without destabilizing core operations and consider abstraction layers between workflows and model APIs.
  • Optimize for the pace of human work and the real needs of users rather than vendor benchmarks; measure cost per user and not just technical metrics.
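The cost-per-user metric in the last point can be computed directly. All prices and usage figures below are illustrative assumptions, not benchmarks.

```python
# Minimal sketch: measure cost per user, not just technical metrics.
# Every price and usage figure here is an illustrative assumption.
def monthly_cost_per_user(
    calls_per_user: int,
    avg_tokens_per_call: int,
    price_per_million_tokens: float,
    fixed_monthly_cost: float,
    active_users: int,
) -> float:
    """Blend per-token (variable) and hosting (fixed) costs per active user."""
    variable = (
        active_users * calls_per_user * avg_tokens_per_call
        * price_per_million_tokens / 1_000_000
    )
    return (variable + fixed_monthly_cost) / active_users

cost = monthly_cost_per_user(
    calls_per_user=200,          # assumed monthly usage per user
    avg_tokens_per_call=1_500,   # prompt plus completion, assumed
    price_per_million_tokens=2.0,
    fixed_monthly_cost=5_000.0,  # hosting and support, assumed
    active_users=1_000,
)
```

Tracking this number per workflow, rather than raw token counts, makes it visible when a 'benchmark-grade' model upgrade raises cost per user without raising productivity.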

Start small, design for independence in underlying components, and leverage the fact that your business data has value to model providers. By aligning vendor incentives with your data strategy, focusing on predictable solutions, and managing costs deliberately, enterprises can move from AI experimentation to sustained returns.

Note on provenance

This content was produced by Intel. It was not written by MIT Technology Review's editorial staff.
