Microsoft's Model Switch: Why AI Middleware is the Next Platform
Microsoft will now switch between OpenAI and Anthropic models for its Office 365 products
Microsoft just announced it will switch between OpenAI and Anthropic models in Office 365. This is bigger than it sounds.
Today, only 4 million Office users pay for AI features (priced at an additional $30/month), a brutal 1% conversion after a year or two of hype. Users cite bugs and budget cuts as reasons to hold off.
Microsoft's response is clear: stop betting on one model. Start arbitraging between many.
Meanwhile, Google rolled out Gemini for free to all Business and Enterprise users but jacked up base prices 30%. A risky move if the model underperforms, but a smarter play for distribution.
Models Become Middleware
Switching between AI models will become the norm for enterprises. They won't bet on just one model; instead, they will build great routers. The value isn't in an exclusive partnership with GPT-5 or Claude Sonnet, but in knowing when to use which.
For foundation model providers, each new release could make or break billions in enterprise contracts; a single model that performs below expectations puts those deals at risk. The pressure is high to ship fast, but also to ship quality.
What This Points To:
Model routing protocols and startups (the ability to pick the right model for each query; a rough sketch follows this list)
Enterprise memory layer (the ability to know that Sarah from marketing always uses AP style, Jim from legal needs citations, and the CEO wants everything in 3 bullets)
Performance arbitrage (the cheapest model that still clears the quality bar)
Real-time user research (the ability to quickly understand users’ needs and prioritize models based on the responses they need to deliver)
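To make the routing and arbitrage ideas concrete, here is a rough sketch of what query-level routing could look like. The model names, prices, and the classification heuristic are made up for illustration, not anyone's actual catalog or pricing.

```python
# A minimal illustration of query-level model routing: estimate the capability
# the task needs, then pick the cheapest model that clears that bar.
# Model names, prices, and the heuristic below are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class ModelOption:
    name: str
    cost_per_1k_tokens: float  # illustrative pricing, not real rates
    capability: int            # 1 = lightweight, 3 = frontier


CATALOG = [
    ModelOption("fast-small", 0.0005, 1),
    ModelOption("balanced-mid", 0.003, 2),
    ModelOption("frontier-large", 0.015, 3),
]


def required_capability(query: str) -> int:
    """Crude stand-in for a real task classifier."""
    text = query.lower()
    if len(query) > 2000 or "legal" in text:
        return 3
    if any(k in text for k in ("summarize", "draft", "rewrite")):
        return 2
    return 1


def route(query: str) -> ModelOption:
    """Performance arbitrage: cheapest model that meets the capability bar."""
    needed = required_capability(query)
    candidates = [m for m in CATALOG if m.capability >= needed]
    return min(candidates, key=lambda m: m.cost_per_1k_tokens)


print(route("Summarize this quarterly report in three bullets").name)  # balanced-mid
```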
For Builders:
Stop building for one model. Build for model agnosticism (a minimal adapter sketch follows this list).
AI features need to be embedded within the product, not locked in a premium walled garden. They should be table stakes.
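As a rough illustration of what model agnosticism can mean in practice: the product depends on one interface, and providers plug in behind it, so swapping or routing between models is a configuration change rather than a rewrite. The adapter classes here are stubs, not real SDK calls.

```python
# A minimal sketch of a model-agnostic interface. Product code talks only to
# ChatModel; each provider-specific adapter is a placeholder stub.
from typing import Protocol


class ChatModel(Protocol):
    def complete(self, prompt: str) -> str: ...


class OpenAIAdapter:
    def complete(self, prompt: str) -> str:
        # In practice this would call the OpenAI API; stubbed out here.
        return f"[openai] {prompt[:40]}..."


class AnthropicAdapter:
    def complete(self, prompt: str) -> str:
        # In practice this would call the Anthropic API; stubbed out here.
        return f"[anthropic] {prompt[:40]}..."


def draft_email(model: ChatModel, notes: str) -> str:
    # Product code depends only on the interface, never on a specific provider.
    return model.complete(f"Draft a short email from these notes: {notes}")


print(draft_email(AnthropicAdapter(), "launch moved to Q2, budget unchanged"))
```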
The Prediction: Within 12 months, every enterprise AI product will support multiple models. The companies building the routing, curation, and memory layers to support that will capture massive value.
The Sleeper Opportunity: Enterprise Memory
Knowing that your marketing team uses AP style, your legal team needs everything cited, and your CEO wants everything in 3 bullets. This context layer (e.g. user preferences, company standards, internal data points, team dynamics) is what makes AI actually useful vs. just smart.
The company that builds the memory layer that travels across models will own enterprise AI. We still haven’t seen many effective products in that space.
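Here is a rough sketch of what that kind of memory layer could look like: preferences live outside any one provider and get injected into every prompt, whichever model the router picks. The preference data and helper function are purely illustrative.

```python
# A minimal sketch of a cross-model memory layer: team/user preferences are
# stored independently of any provider and prepended to every request.
# The data and names below are illustrative, not a real product's schema.
PREFERENCES = {
    "marketing": {"style": "AP style"},
    "legal": {"style": "include citations for every claim"},
    "ceo": {"style": "exactly three bullet points"},
}


def with_context(team: str, request: str) -> str:
    """Prepend stored preferences so any model receives the same context."""
    style = PREFERENCES.get(team, {}).get("style", "clear, plain prose")
    return f"Follow this output style: {style}.\n\nTask: {request}"


# The enriched prompt can be sent to whichever model the router selects.
print(with_context("ceo", "Summarize the Q3 roadmap"))
```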
Source: The Information
Free Workshop: How To Build For Real-time Video
We are co-hosting a free hands-on workshop with VideoDB this Wednesday in San Francisco! Join the founders as they walk through practical examples of how to build agentic workflows for real-time video and media.