Intermountain Pushes a New EHR Playbook for AI: Standardize First, Scale Second
At HIMSS 2026, Intermountain Health argued that the key to scaling healthcare AI lies not in simply deploying models but in building a standardized cloud data layer from EHR systems. The announcement underscores a growing industry view that interoperability and data normalization are now the limiting factors for AI value in care delivery.
A recurring lesson in healthcare AI is becoming harder to ignore: the bottleneck is often not the model, but the data architecture beneath it. MobiHealthNews reported on March 9, 2026, that Intermountain Health is building a unified cloud data layer that extracts information from electronic health records, standardizes it with common semantic models, and prepares it for AI analysis.
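To make the "standardize with common semantic models" step concrete, here is a minimal, hypothetical sketch of what normalizing records from two EHR exports into one shared schema might look like. The field names, value mappings, and target schema are invented for illustration; they are not Intermountain's actual data model, and a production layer would typically build on established standards rather than ad hoc dictionaries.

```python
# Hypothetical sketch: normalize records from two EHR exports into one
# common schema so downstream AI pipelines see comparable inputs.
# All field names and mappings here are invented for illustration.

# Raw records as two different sites might export them.
site_a_record = {"pt_id": "A-100", "sys_bp": "128", "dia_bp": "82", "sex": "M"}
site_b_record = {"patient": "B-200", "bp": "130/85", "gender": "male"}

# Map site-specific codings onto one canonical vocabulary.
CANONICAL_SEX = {"m": "male", "male": "male", "f": "female", "female": "female"}

def normalize_site_a(rec):
    """Map site A's export format onto the common schema."""
    return {
        "patient_id": rec["pt_id"],
        "systolic_mmhg": int(rec["sys_bp"]),
        "diastolic_mmhg": int(rec["dia_bp"]),
        "sex": CANONICAL_SEX[rec["sex"].lower()],
    }

def normalize_site_b(rec):
    """Map site B's combined 'bp' field onto the common schema."""
    systolic, diastolic = rec["bp"].split("/")
    return {
        "patient_id": rec["patient"],
        "systolic_mmhg": int(systolic),
        "diastolic_mmhg": int(diastolic),
        "sex": CANONICAL_SEX[rec["gender"].lower()],
    }

# After normalization, both records are directly comparable.
unified = [normalize_site_a(site_a_record), normalize_site_b(site_b_record)]
print(unified)
```

The point of the sketch is the shape of the work, not the code itself: each source system gets a mapping into one agreed schema and vocabulary, and every downstream consumer reads only the unified form.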
That message came from a HIMSS 2026 keynote on AI and interoperability, and it cuts against much of the sector’s marketing narrative. Health systems have spent years hearing about smarter models, yet many still struggle to operationalize AI because their EHR data remain siloed, inconsistently coded or difficult to normalize across sites and workflows. Intermountain’s approach suggests that the next competitive edge may come from infrastructure discipline rather than algorithmic novelty.
The implications are broad. Standardized data pipelines can support everything from predictive analytics and operational optimization to clinician copilots and patient engagement tools. They also create a path toward safer governance, since organizations can more consistently monitor model inputs, drift and performance when the underlying data are structured in a repeatable way.
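The governance point can be sketched too. When inputs are structured the same way everywhere, drift monitoring reduces to a repeatable comparison against a reference distribution. The statistic, feature, and threshold below are illustrative assumptions, not Intermountain's method.

```python
# Hypothetical sketch: with standardized inputs, a drift check becomes a
# simple, repeatable comparison against a reference distribution.
# Feature values and the alert threshold are invented for illustration.
from statistics import mean, stdev

def mean_shift(reference, current):
    """Shift of the current batch mean, in reference standard deviations."""
    return abs(mean(current) - mean(reference)) / stdev(reference)

reference_ages = [34, 45, 52, 61, 29, 48, 55, 40]  # training-time inputs
current_ages = [62, 70, 58, 66, 71, 64, 69, 60]    # latest production batch

shift = mean_shift(reference_ages, current_ages)
if shift > 1.0:  # illustrative alert threshold
    print(f"Input drift detected: mean shifted by {shift:.2f} sd")
```

Real deployments use richer tests than a mean shift, but the enabling condition is the same: the check is only cheap and consistent across models when the underlying data layer guarantees comparable inputs.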
For the wider market, this is an important reality check. AI in EHRs is no longer just about adding a chatbot or scribe to the interface. The more durable shift is the re-engineering of health-system data estates so that AI can run on reliable, comparable inputs. That may not be the flashiest healthcare AI story of 2026, but it may prove to be one of the most consequential.