Microsoft is intensifying efforts to reduce its reliance on OpenAI by developing proprietary AI reasoning models. The company has trained a family of internally developed models called MAI (Microsoft AI), designed to rival top-tier solutions from partners and competitors. Early benchmark tests indicate MAI models perform comparably to OpenAI’s GPT-4 and Anthropic’s Claude 3, signaling a potential shift in the $14 billion partnership that initially positioned Microsoft as an AI industry leader.
According to sources familiar with the project, Microsoft is experimenting with substituting OpenAI’s technology in its Copilot platform. The company has tested AI systems from Elon Musk’s xAI, Meta’s open-source Llama models, and Chinese startup DeepSeek as potential alternatives. This diversification strategy aims to optimize costs while maintaining performance standards for enterprise users.
The MAI framework incorporates chain-of-thought reasoning techniques that enable AI systems to break complex queries into logical sequences. Unlike standard large language models that generate direct responses, these architectures expose intermediate problem-solving steps, a critical feature for enterprise applications requiring audit trails and decision transparency.
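As a rough illustration of how exposed intermediate steps could feed an audit trail, the sketch below builds a chain-of-thought prompt and parses a model's numbered steps and final answer from its response. The prompt template, response format, and function names are assumptions for illustration, not Microsoft's actual MAI interface; a real deployment would call a model API rather than use a mock string.

```python
import re

def build_cot_prompt(question: str) -> str:
    # Hypothetical prompt template: asks the model to show numbered
    # intermediate steps before the final answer.
    return (
        f"Question: {question}\n"
        "Think step by step. Number each step, then give the final "
        "answer on a line starting with 'Answer:'."
    )

def parse_cot_response(response: str) -> tuple[list[str], str]:
    # Split a chain-of-thought response into its audit trail
    # (the numbered steps) and the final answer.
    steps = re.findall(r"^\d+\.\s*(.+)$", response, flags=re.MULTILINE)
    match = re.search(r"^Answer:\s*(.+)$", response, flags=re.MULTILINE)
    answer = match.group(1).strip() if match else ""
    return steps, answer

# Mock model output in the assumed format; in practice this string
# would come from a model API call.
mock_response = (
    "1. The invoice total is $120.\n"
    "2. A 10% discount applies, so subtract $12.\n"
    "Answer: $108"
)
steps, answer = parse_cot_response(mock_response)
print(steps)   # the two numbered reasoning steps
print(answer)  # $108
```

Storing the parsed steps alongside the answer is what makes the reasoning auditable: a reviewer can later check each step rather than only the final output.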
Microsoft’s AI division, led by Mustafa Suleyman, completed MAI training in Q4 2024, achieving parity with market leaders on 23 industry-standard benchmarks. A spokesperson confirmed plans to offer MAI models through public APIs by late 2025, enabling developers to integrate Microsoft’s AI directly into third-party applications. This move positions Microsoft as both collaborator and competitor in the AI ecosystem, supplementing its Azure OpenAI Service with native solutions.
The development signals Microsoft’s strategic hedging in the $1.2 trillion AI market. While maintaining its OpenAI partnership through 2030 per contractual agreements, the company is building parallel capabilities to control core AI infrastructure. Enterprises using Microsoft 365 Copilot could benefit from reduced latency and tighter Microsoft ecosystem integration, though pricing models for MAI-powered services remain undisclosed.