Microsoft Expands AI Footprint with New Tools for Developers and Enterprises

"It's all about expanding opportunity for developers across every layer of the stack so that you all can build the apps, the agents that can empower every person and every organization on the planet," says Satya Nadella, CEO, Microsoft

Microsoft has announced changes across its developer platforms at its annual Build 2025 conference, reflecting how AI is reshaping the software development lifecycle. GitHub Copilot is being extended into an asynchronous coding agent embedded within the platform, with added features such as prompt management, model evaluations and enterprise-grade controls. Copilot Chat has also been open-sourced in Visual Studio Code, marking a continued push toward open collaboration in AI development.
"This is the next big step forward, which is a full coding agent built right into GitHub, taking Copilot from being a pair programmer to a peer programmer. You can assign issues to Copilot, bug fixes, new features, code maintenance, and it'll complete these tasks autonomously. Today, I'm super excited that it's now available to all of you," explained Satya Nadella, CEO, Microsoft.
"It's all about expanding opportunity for developers across every layer of the stack so that you all can build the apps, the agents that can empower every person and every organization on the planet," said Nadella.
Windows AI Foundry, introduced at Build, is being positioned as a unified platform for managing training and inference tasks across local and cloud environments. It supports both open-source and proprietary models and offers simplified APIs for common language- and vision-based tasks.
Meanwhile, Azure AI Foundry now hosts over 1,900 AI models, including new additions such as Grok 3 and Grok 3 Mini from xAI. Tools like the Model Leaderboard and Model Router have been introduced to help developers evaluate and deploy models more effectively.
In line with the broader industry conversation around open AI infrastructure, Microsoft is expanding support for the Model Context Protocol (MCP) across its platforms and has joined the MCP Steering Committee. The protocol is aimed at standardising how AI agents interact with data and services online, and new contributions include a server registry and updated authorisation mechanisms.
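To give a sense of what the protocol standardises, here is a minimal sketch of an MCP server built with the open-source MCP Python SDK. The server name, tool and stub data are illustrative assumptions for this article, not part of Microsoft's announcement.

```python
from mcp.server.fastmcp import FastMCP

# Hypothetical server exposing one tool; the name and data are placeholders.
mcp = FastMCP("inventory-demo")

@mcp.tool()
def check_stock(product_id: str) -> str:
    """Return a stock summary for the given product (stub data)."""
    # A real deployment would query an internal database or service here.
    return f"Product {product_id}: 42 units in stock"

if __name__ == "__main__":
    # The stdio transport lets an MCP-aware AI agent discover and call
    # the tool over standard input/output.
    mcp.run(transport="stdio")
```

The point of the standard is that any MCP-compatible agent can list this server's tools and call them without bespoke integration work.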
Microsoft also introduced NLWeb, a new initiative that aims to let websites offer conversational, agent-compatible interfaces to their content. NLWeb endpoints would also function as MCP servers, giving websites the option to make their data and services accessible to AI agents under agreed protocols.
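Because NLWeb endpoints are described as also functioning as MCP servers, an agent could in principle discover and call them the same way it calls any MCP server. The sketch below uses the MCP Python SDK's stdio client against the demo server from the previous example; the script name, tool name and arguments are assumptions for illustration, not NLWeb's actual interface.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the hypothetical demo server as a subprocess. A real NLWeb
    # endpoint would instead be reached over its published transport.
    params = StdioServerParameters(command="python", args=["inventory_server.py"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server exposes, then invoke a tool by name.
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

            result = await session.call_tool(
                "check_stock", arguments={"product_id": "SKU-123"}
            )
            print(result.content)

asyncio.run(main())
```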
Nadella also highlighted how Teams is being transformed into a multiplayer AI environment, where agents can be summoned, assigned tasks, and embedded directly into conversations.
"I feel we are at that age where we are now gonna put expertise at your fingertips. Teams takes all that and makes it multiplayer, right? All of the agents you build can now show up in Teams and in Copilot. You can ask questions, assign action items, or kick off a workflow by just at-mentioning an agent in a chat or meeting and with the Teams AI library, building multiplayer agents is easier than ever. It now supports MCP. With just one line of code, you can even have it enable A2A. You can add things like episodic or semantic memory by using Azure Search and a new retrieval system and as a developer, you can now publish. This is the biggest thing, right? Now you can build an agent. You can publish your agent to the agent store and have them discovered and distributed across both Copilot and Teams, providing you access to the hundreds of millions of users and unlocking that opportunity," added Nadella.
Microsoft has also introduced a new research platform, Microsoft Discovery, which aims to apply agentic AI to scientific work. The tool is designed to support researchers across disciplines by streamlining discovery workflows, from data analysis to product development.