23 results
Azure AI Foundry has been renamed Microsoft Foundry. The October–November 2025 update highlights new announcements across agents, models, tools, and related platform improvements.
Foundry IQ Knowledge Bases can be integrated into the Microsoft Agent Framework in about 20 lines of Python. The Azure AI Search Context Provider enables intelligent, multi-hop retrieval so agents can use consolidated knowledge without fragmented pipelines.
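The Context Provider wires retrieval into the Agent Framework for you; as a rough, hypothetical sketch of the underlying pattern (not the Context Provider API itself), the snippet below grounds a question against an Azure AI Search index using the azure-search-documents SDK. The endpoint, index name, and content field are placeholders.

```python
# Illustrative sketch only: grounding an agent prompt with Azure AI Search results.
# The endpoint, index name, and 'content' field below are assumptions; the actual
# Azure AI Search Context Provider in the Agent Framework handles this wiring.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",  # placeholder
    index_name="knowledge-base",                                  # placeholder
    credential=AzureKeyCredential("<search-api-key>"),
)

def retrieve_context(question: str, top: int = 3) -> str:
    """Fetch the top matching chunks and concatenate them for the agent prompt."""
    results = search_client.search(search_text=question, top=top)
    return "\n\n".join(doc["content"] for doc in results)  # field name is an assumption

grounding = retrieve_context("How do I rotate storage account keys?")
print(grounding)
```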
Microsoft announced the Foundry MCP Server (preview), building on the Model Context Protocol (MCP) to help AI agents securely connect with apps, data, and systems. At Ignite, Microsoft Foundry also introduced Foundry Tools as a central hub to discover, connect, and manage both public and private MCP tools securely, simplifying integration and expanding interoperability.
AI Dev Days is a two-day virtual event from Microsoft Reactor that highlights the latest innovations across Microsoft Azure, Foundry, and GitHub. It focuses on modernizing apps, building with agents, and exploring new AI models, helping developers level up and connect with experts.
Microsoft’s Foundry Agent Service introduces a memory capability so conversational agents can retain context across interactions rather than being stateless. This aims to prevent agents from repeating questions and removes the need for developers to build custom memory solutions using embeddings and external databases.
This Microsoft Foundry Blog post, “Translation Customization: A Developer’s Guide to Adaptive Custom Translation,” argues that translation is more than word-for-word conversion and is essential for global business communication. It highlights the need for accuracy, speed, and domain-specific terminology, noting that 70% of consumers prefer content in their native language, and frames a developer-focused guide for implementing adaptive custom translation to enable real-time, high-quality multilingual experiences.
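As a minimal baseline for the kind of call the guide builds on, the sketch below sends a request to the Translator text translation REST API with the requests library; the key, region, and target languages are placeholders, and the adaptive custom translation features discussed in the post layer on top of this.

```python
# Minimal sketch of a Translator text-translation call; key, region, and target
# languages are placeholders, and adaptive customization is not shown here.
import requests

endpoint = "https://api.cognitive.microsofttranslator.com/translate"
params = {"api-version": "3.0", "to": ["de", "ja"]}
headers = {
    "Ocp-Apim-Subscription-Key": "<translator-key>",      # assumption: key-based auth
    "Ocp-Apim-Subscription-Region": "<resource-region>",
    "Content-Type": "application/json",
}
body = [{"Text": "Your order has shipped and will arrive in two business days."}]

response = requests.post(endpoint, params=params, headers=headers, json=body)
for translation in response.json()[0]["translations"]:
    print(translation["to"], "->", translation["text"])
```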
Microsoft Foundry introduces Multi-Agent Workflows in the Foundry Agent Service to help organizations move beyond single-purpose agents toward orchestrated, multi-step, role-based AI processes that include stronger governance and operational dependability for enterprise use.
Microsoft announced that Azure Content Understanding in Foundry Tools is now generally available. The GA follows preview usage across multiple industries and incorporates customer feedback; the release emphasizes flexibility and control with models.
Microsoft announces Foundry Local for Android, a preview that lets developers run AI models directly on Android devices without cloud round trips. The release also introduces on-device speech capabilities, on-premises deployment options, and a simplified SDK to make integrating local AI into mobile apps easier. A gated preview signup is available.
This tutorial shows how to evaluate and optimize retrieval-augmented generation (RAG) agents in Microsoft Foundry using Foundry Observability. It recommends running end-to-end, reference-free RAG triad evaluators (Groundedness and Relevance) and applying advanced search and retrieval tuning before deployment.
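As a hedged sketch of what a reference-free check can look like with the azure-ai-evaluation package (the tutorial itself centers on Foundry Observability), the snippet below scores a single query/context/response triple for groundedness and relevance; the endpoint, key, and judge deployment are assumptions, and exact evaluator keyword arguments may differ by SDK version.

```python
# Hedged sketch using azure-ai-evaluation; endpoint, key, and judge deployment
# are placeholders, and keyword arguments may vary across SDK versions.
from azure.ai.evaluation import GroundednessEvaluator, RelevanceEvaluator

model_config = {
    "azure_endpoint": "https://<your-foundry-resource>.openai.azure.com",  # assumption
    "api_key": "<api-key>",
    "azure_deployment": "gpt-4o",  # assumption: deployment used as the judge model
}

groundedness = GroundednessEvaluator(model_config)
relevance = RelevanceEvaluator(model_config)

sample = {
    "query": "What is the refund window?",
    "context": "Refunds are accepted within 30 days of purchase with a receipt.",
    "response": "You can request a refund within 30 days if you have a receipt.",
}

# Each evaluator returns a dict of scores for the single example.
print(groundedness(query=sample["query"], context=sample["context"], response=sample["response"]))
print(relevance(query=sample["query"], response=sample["response"]))
```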
A practical developer guide from Microsoft Foundry showing how to start using Azure OpenAI’s next-generation GPT-4o audio models to build transcription and text-to-speech applications. The post explains the capabilities of the audio models, the required setup in Azure OpenAI and Foundry, best practices for integration, and considerations for testing, deployment, and costs.
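A minimal sketch of both directions, assuming the transcription and TTS models are already deployed in your resource; the endpoint, API version, and deployment names are placeholders rather than values from the post.

```python
# Minimal sketch: speech-to-text and text-to-speech against an Azure OpenAI resource.
# Endpoint, API version, and deployment names below are assumptions.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<api-key>",
    api_version="2025-01-01-preview",  # assumption: use a version your resource supports
)

# Speech-to-text: transcribe an audio file with a GPT-4o transcription deployment.
with open("meeting.wav", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="gpt-4o-transcribe",  # assumption: your transcription deployment name
        file=audio_file,
    )
print(transcript.text)

# Text-to-speech: synthesize audio with a GPT-4o TTS deployment and save it to disk.
with client.audio.speech.with_streaming_response.create(
    model="gpt-4o-mini-tts",  # assumption: your TTS deployment name
    voice="alloy",
    input="Thanks for calling; your request has been logged.",
) as speech:
    speech.stream_to_file("reply.mp3")
```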
Microsoft announced the integration of Azure Language into Foundry Tools to support building enterprise-grade agents that emphasize deterministic behavior and stronger privacy protections. The update is positioned to help teams create predictable, secure agentic applications with tighter integration across their AI stack.
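The Foundry Tools integration itself is configured in Foundry, but as an illustration of the kind of deterministic, privacy-oriented capability Azure Language brings, the sketch below redacts PII with the standalone azure-ai-textanalytics SDK; the endpoint and key are placeholders.

```python
# Illustrative sketch of Azure Language PII detection and redaction using the
# standalone azure-ai-textanalytics SDK; endpoint and key are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com",  # placeholder
    credential=AzureKeyCredential("<language-key>"),
)

documents = ["Call Jane Doe at 555-0143 about invoice 8891 before Friday."]
results = client.recognize_pii_entities(documents)

for doc in results:
    if not doc.is_error:
        print(doc.redacted_text)  # text with PII spans masked
        for entity in doc.entities:
            print(entity.category, "->", entity.text)
```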
Microsoft Foundry now offers major public-preview enhancements for models and agentic AI pipelines, including an AI Red Teaming Agent that helps organizations proactively identify and mitigate safety and security risks in models and agentic systems before they enter production.
This post is a curated session guide for Microsoft Ignite 2025 focused on Azure AI Foundry, highlighting must-see developer sessions for hands-on learning, practical demos, and expert insights to help developers plan their conference schedule.
This guide demonstrates how to quickly improve image classification accuracy by fine-tuning GPT-4o (a vision-language model) on Azure OpenAI via Azure AI Foundry. It targets ML practitioners, app developers, and curious users, and emphasizes that boosting performance can be done without deep learning expertise through a step-by-step walkthrough.
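A hedged sketch of the flow the walkthrough describes: write labeled image examples as JSONL in the vision fine-tuning format, upload the file, and start a fine-tuning job. The endpoint, API version, and fine-tunable model version are assumptions that depend on your Azure OpenAI region.

```python
# Hedged sketch of vision fine-tuning for classification; endpoint, API version,
# example URL, and the fine-tunable model version are assumptions.
import json
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<api-key>",
    api_version="2025-01-01-preview",  # assumption
)

# One training example: an image plus the expected class label as the assistant reply.
example = {
    "messages": [
        {"role": "system", "content": "Classify the product image as 'defective' or 'ok'."},
        {"role": "user", "content": [
            {"type": "image_url", "image_url": {"url": "https://example.com/widget_017.jpg"}},
        ]},
        {"role": "assistant", "content": "defective"},
    ]
}
with open("train.jsonl", "w") as f:
    f.write(json.dumps(example) + "\n")  # repeat for every labeled image

# Upload the data and start the fine-tuning job.
training_file = client.files.create(file=open("train.jsonl", "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-2024-08-06",  # assumption: a fine-tunable GPT-4o version in your region
)
print(job.id, job.status)
```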
Azure AI Foundry provides a streamlined platform for smarter, faster, and more accessible fine-tuning, letting developers customize models for practical business problems ranging from reasoning agents to adaptive tools and scalable workflows. The post pairs this with best practices, hands-on resources, and recent innovations to accelerate development, testing, and deployment.
Azure AI Foundry’s September 2025 update announces GA for GPT-5-Codex and Voice Live, previews for Sora video-to-video, Browser Automation, and Key Vault, the release of Grok 4 Fast, and new knowledge sources for Azure AI Search.
Microsoft announced the open-source Microsoft Agent Framework, an engine designed for building agentic AI applications. The framework addresses the evolving needs of AI agents — which go beyond chatbots and copilots — by enabling autonomous components that can reason about goals, call tools and APIs, collaborate with other agents, and adapt dynamically. The announcement appeared on the Azure AI Foundry Blog.
Foundry Local is a high-performance local AI runtime from Azure AI Foundry that brings model execution and hardware acceleration to client devices. It enables developers to build and ship cross-platform AI apps that run accelerated models on a wide range of silicon.
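A minimal sketch of the common usage pattern, assuming the foundry-local-sdk package and the OpenAI-compatible local endpoint described in the Foundry Local quickstarts; the model alias is a placeholder and the exact manager helpers may differ by SDK version.

```python
# Hedged sketch based on the Foundry Local quickstart pattern; the alias and the
# FoundryLocalManager helpers are assumptions that may vary by SDK version.
from foundry_local import FoundryLocalManager
from openai import OpenAI

alias = "phi-3.5-mini"                # placeholder model alias
manager = FoundryLocalManager(alias)  # starts the local service and loads the model

# Foundry Local exposes an OpenAI-compatible endpoint, so the standard client works.
client = OpenAI(base_url=manager.endpoint, api_key=manager.api_key)

response = client.chat.completions.create(
    model=manager.get_model_info(alias).id,  # resolve the alias to the local model id
    messages=[{"role": "user", "content": "Summarize why on-device inference helps privacy."}],
)
print(response.choices[0].message.content)
```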
The post highlights how running AI locally on developers’ devices addresses long-standing concerns about data privacy, dependency on cloud services, and limited control over tooling. By using local models, teams — especially those working on sensitive or regulated projects — can keep data on-device, customize workflows, reduce latency, and retain full control over model behavior. The article positions local models as a practical approach for AI-assisted development and is published on the Azure AI Foundry Blog.
Azure AI Foundry can now create a production-ready Azure AI Search vector index natively during agent grounding, enabling immediate indexing with no prior search setup and speeding agent deployment to production.
Microsoft announced the Computer Use tool (Preview) for the Azure AI Foundry Agent Service. The preview brings feature parity with the Azure OpenAI Responses API while offering seamless integration into the Foundry agent runtime and enterprise-grade security, allowing developers to build agents that reason and use tools within a secure environment.
August 2025 update for Azure AI Foundry: GPT-5 is available, Model Router adds GPT-5 support, Responses API reaches general availability, Browser Automation enters public preview, and there are additions and updates including Sora, Mistral Document AI, FLUX image models, OpenAI gpt-oss with Foundry Local, plus SDK and documentation improvements.