Azure’s AI ecosystem can be overwhelming, with terms like Azure OpenAI Service, Azure AI Services, Azure OpenAI Studio, Azure AI Studio, and Azure AI Foundry all floating around. If you used Azure a while back, returned after some time away, and are now confused, don’t worry: you are in the right place to clear up the confusion and understand what each term really means.
You might think that the services shown in the image are no longer available in Azure, and you’re right: they are gone. This article is intended for those who used Azure AI services around six months ago and are now trying to use them again, but are stuck due to the many recent changes.
These services and platforms are part of Microsoft’s Azure ecosystem, designed to empower developers and businesses to integrate artificial intelligence into their applications.
Due to rebranding and overlapping functionality, understanding their distinctions is crucial for selecting the right tool for your needs. Below, we explore each offering, its features and use cases, and how they interrelate.
1. Azure OpenAI Service: Now it’s Azure OpenAI
Azure OpenAI Service is a fully managed service that provides access to OpenAI’s advanced models, such as GPT-4o, GPT-3.5-Turbo, Codex, and DALL·E. It lets us build applications with capabilities like natural language processing, chat completions, content generation, and code completion, all while leveraging Azure’s security and compliance features.
Key Features
- Model Access: Includes models like o3, GPT-4 Turbo with Vision, GPT-3.5-Turbo, and Embeddings for tasks such as text generation, image understanding, and semantic search.
- Fine-Tuning: Customize models with your own data to improve performance for specific use cases.
- Integration: Seamlessly connects with Azure services like Azure Cognitive Search, Azure Storage, and Azure Key Vault.
- Security and Compliance: Offers data residency, encryption, and compliance with standards like GDPR and HIPAA.
- APIs and SDKs: Accessible via REST APIs and SDKs in Python, C#, JavaScript, Java, and Go.
- Content Safety: Includes Azure AI Content Safety filters to ensure responsible AI use.
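To make the API access above concrete, here is a minimal, stdlib-only sketch that builds (but does not send) a chat-completions REST request against an Azure OpenAI resource. The endpoint, deployment name, key, and `api-version` are placeholder assumptions; substitute your own resource’s values and check the current REST reference for a supported API version.

```python
import json
from urllib import request

API_VERSION = "2024-02-01"  # assumption: use a version your resource supports

def build_chat_request(endpoint: str, deployment: str, api_key: str, messages: list):
    """Build the REST request for an Azure OpenAI chat completion call.

    Note the URL targets the *deployment name*, not the base model name.
    """
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={API_VERSION}")
    headers = {"Content-Type": "application/json", "api-key": api_key}
    body = json.dumps({"messages": messages}).encode()
    return request.Request(url, data=body, headers=headers, method="POST")

# Placeholder values -- no network call happens until urlopen() is invoked.
req = build_chat_request(
    "https://my-resource.openai.azure.com",  # hypothetical resource endpoint
    "gpt-4o-deployment",                     # hypothetical deployment name
    "<api-key>",
    [{"role": "user", "content": "Summarize Azure OpenAI in one sentence."}],
)
print(req.full_url)
# To actually call the service: response = request.urlopen(req)
```

The official Python, C#, JavaScript, Java, and Go SDKs wrap exactly this kind of request, so in practice you would usually use one of those instead of raw REST.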
2. Azure AI Service
The term Azure AI Service does not correspond to a specific, standalone Azure service. It most likely refers to Azure AI services, a broad category encompassing the various AI services (previously Cognitive Services) and platforms offered by Azure. These services let us integrate AI capabilities into applications without requiring extensive AI expertise.
Key Features
- Azure Cognitive Services: Pre-built APIs for vision (e.g., Computer Vision), speech (e.g., Speech-to-Text), language (e.g., Text Analytics), and decision-making.
- Azure Machine Learning: Tools for building, training, versioning and deploying machine learning models.
- Azure Bot Service: For creating and managing intelligent bots.
- Azure OpenAI Service: Access to OpenAI models is also grouped under this umbrella.
- Azure AI Vision: For advanced computer vision tasks.
- Azure AI Language: For natural language processing tasks.
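As an illustration of the pre-built APIs listed above, here is a stdlib-only sketch that builds a sentiment-analysis request for Azure AI Language. The endpoint is a placeholder, and the `:analyze-text` route shape and `api-version` are assumptions based on the Language service’s REST API; verify them against the current documentation before relying on this.

```python
import json
from urllib import request

def build_sentiment_request(endpoint: str, api_key: str, text: str):
    """Build a REST request for Azure AI Language sentiment analysis.

    Assumed route and api-version -- confirm against the Language API docs.
    """
    url = f"{endpoint}/language/:analyze-text?api-version=2023-04-01"
    body = json.dumps({
        "kind": "SentimentAnalysis",
        "analysisInput": {
            "documents": [{"id": "1", "language": "en", "text": text}]
        },
    }).encode()
    headers = {
        "Content-Type": "application/json",
        # The usual Cognitive Services key header:
        "Ocp-Apim-Subscription-Key": api_key,
    }
    return request.Request(url, data=body, headers=headers, method="POST")

req = build_sentiment_request(
    "https://my-language-resource.cognitiveservices.azure.com",  # placeholder
    "<api-key>",
    "The new portal experience is great.",
)
print(req.full_url)
# response = request.urlopen(req)  # sends the call; parse json.load(response)
```

In real projects the `azure-ai-textanalytics` SDK hides this plumbing, but the raw request makes it clear that each service is just a keyed REST endpoint.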
3. Azure AI Studio
Azure AI Studio was the previous name for Azure AI Foundry, a web-based GUI platform designed to help developers build, test, and deploy AI applications using a variety of models and tools. The rebranding to Azure AI Foundry reflects an expansion of its capabilities into a more comprehensive AI development environment.
- Azure AI Studio was introduced as a generative AI application development platform with support for model filtering, benchmarking, prompt engineering, and AI safety guardrails (InfoWorld).
- It has been fully integrated into Azure AI Foundry, and references to Azure AI Studio now point to this platform.
4. Azure OpenAI Studio: Now it’s Azure OpenAI
Azure OpenAI Studio was a web-based interface specifically for interacting with Azure OpenAI Service. It allowed users to deploy, manage, and experiment with OpenAI models like GPT-4 and DALL·E. Key functionalities included model deployment, fine-tuning, and testing through a user-friendly interface.
- Model Deployment: Deploy Azure OpenAI models to a user’s Azure resource.
- Management: Scale and monitor model deployments.
- Interaction: Test models via a chat interface or completion generation.
- Fine-Tuning: Customize models with user-provided data.
5. Azure AI Foundry
Azure AI Foundry (formerly Azure AI Studio) is the current, unified platform for building, testing, and deploying AI applications on Azure. Launched as a comprehensive hub, it supports the full lifecycle of AI development, from ideation to production, and integrates functionality from both Azure AI Studio and Azure OpenAI Studio. It is designed for developers and enterprises seeking to create scalable, secure, and responsible AI solutions.
Key Features
- Model Catalog: Access to over 1,600 models from providers like Microsoft, OpenAI, Meta, Cohere, and others (Azure AI Foundry).
- Project-Based Workflow: Organizes work into projects for collaboration, resource management, and iteration from prototype to production.
- Prompt Engineering: Tools to optimize prompts for large language models (LLMs).
- Retrieval Augmented Generation (RAG): Enhances models with domain-specific data for more accurate responses.
- Safety and Compliance: Configurable content safety filters and responsible AI tools to mitigate risks like harmful content generation.
- Development Integration: Supports GitHub, Visual Studio Code, LangChain, Semantic Kernel, and AutoGen for seamless development.
- Multimodal Support: Handles text, images, and audio, enabling versatile applications.
- Generative AI Agents: Build and deploy enterprise-ready agents to automate business processes.
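The Retrieval Augmented Generation pattern listed above boils down to three steps: embed the query and documents, retrieve the most similar documents, and prepend them to the prompt. The sketch below uses a toy bag-of-words “embedding” purely for illustration; a real Foundry project would use an embedding model and a vector index such as Azure AI Search instead.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words count. Real RAG uses an embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list, k: int = 2) -> list:
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, documents: list) -> str:
    """Ground the model's answer in the retrieved context."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Refunds are processed within 5 business days.",
    "Our office is open Monday to Friday.",
    "Shipping is free on orders over $50.",
]
print(build_prompt("How long do refunds take?", docs))
```

The augmented prompt is then sent to the LLM as usual; the only difference from a plain chat call is the retrieved context stitched in front of the question.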
Use Cases
- Custom AI Applications: Develop applications using a mix of models for unique business needs.
- Conversational Agents: Create chatbots or virtual assistants with advanced language and vision capabilities.
- Content Generation: Produce personalized text, images, or multimedia content at scale.
- Business Automation: Deploy AI agents to streamline workflows, such as customer service or data analysis.
A Simple Example
A company might use Azure AI Foundry to build a customer service agent that combines OpenAI’s GPT-4o for text responses, Meta’s LLaMA for cost-effective processing, and Azure AI Vision for analyzing customer-uploaded images, all managed within a single project.
Note
- Azure AI Foundry is the recommended platform for most AI development on Azure, offering the flexibility to work with Azure OpenAI models or models from other providers.
- However, we can still use Azure AI Services or Azure OpenAI independently to access their respective models without creating a full project in Azure AI Foundry. This is convenient in scenarios where we only need an API key to access a model and don’t require the additional tools or infrastructure.
- Based on my exploration, most models are available in West US, but that is not always the case: some models, like Babbage, are only available in regions such as Sweden Central. So it is always a good idea to refer to the official Microsoft Learn documentation.
- It can be frustrating at times because not everything useful is clearly documented. For example, when I tried batch processing in Prompt Flow using a JSONL or CSV file, I encountered an error and couldn’t find any helpful solutions; only later did I learn that batch runs have a sample-size limit. Things also change frequently, often without a smooth transition. But despite that, many of the services are quite impressive.
- There is no longer anything called Azure OpenAI Service or Azure OpenAI Studio; it is just Azure OpenAI in the portal.
Deployment Types In Azure For LLMs
Azure provides several deployment types for large language models (LLMs), each tailored to specific performance, scalability, and compliance needs. Whenever we deploy an LLM, we see the following options for the deployment type.
1. Standard Deployment
- Description: A pay-as-you-go model suitable for low to medium-volume workloads with high burstiness.
- Data Processing: Within the specific Azure region where the resource is deployed.
- Use Case: Ideal for scenarios requiring strict data residency and compliance, as data processing remains within the designated region.
- Billing: Based on token usage.
2. Global Standard Deployment
- Description: Leverages Azure’s global infrastructure to route requests to the data center with the best availability, regardless of geographic location.
- Data Processing: Across any Azure OpenAI location.
- Use Case: Recommended for most scenarios, except those with strict data residency requirements.
- Billing: Based on token usage.
3. Global Batch Deployment
- Description: Designed for large-scale, asynchronous processing tasks, offering cost efficiency and separate quota management.
- Data Processing: Across Azure’s global infrastructure.
- Use Case: Suitable for scenarios like document summarization, content generation, and data analysis.
- Billing: Reduced costs with a 24-hour target turnaround time.
4. Provisioned Managed Deployment
- Description: Involves reserving dedicated model processing capacity, suitable for high and predictable throughput requirements.
- Data Processing: Within a specific Azure region or across Azure’s global infrastructure, depending on configuration.
- Use Case: Beneficial for applications with stringent compliance and data residency needs.
- Billing: Based on provisioned throughput units.
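The guidance above can be condensed into a small decision helper. This is my own simplification for illustration, not an official Azure decision matrix, and real choices should also weigh quota, pricing, and model availability.

```python
# Assumed simplification of the deployment guidance above; not official Azure logic.
def recommend_deployment(needs_data_residency: bool,
                         predictable_high_throughput: bool,
                         batch_workload: bool) -> str:
    """Map workload traits to one of the four LLM deployment types."""
    if predictable_high_throughput:
        return "Provisioned Managed"  # reserved capacity, billed per throughput unit
    if batch_workload:
        return "Global Batch"         # asynchronous, ~24h turnaround, reduced cost
    if needs_data_residency:
        return "Standard"             # processing stays in the resource's region
    return "Global Standard"          # routed globally for best availability

print(recommend_deployment(needs_data_residency=True,
                           predictable_high_throughput=False,
                           batch_workload=False))
```

For example, a compliance-bound chatbot with bursty traffic lands on Standard, while a nightly document-summarization job lands on Global Batch.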
Comparison Table

| Deployment Type | Data Processing | Billing | Best Suited For |
| --- | --- | --- | --- |
| Standard | Within the resource’s Azure region | Per token | Strict data residency and compliance needs |
| Global Standard | Any Azure OpenAI location | Per token | Most scenarios without residency constraints |
| Global Batch | Azure’s global infrastructure | Per token at reduced rates (24-hour target turnaround) | Large-scale asynchronous jobs |
| Provisioned Managed | Regional or global, depending on configuration | Per provisioned throughput unit | High, predictable throughput |
Considerations Before Using Some Services
- Access and Setup: Azure OpenAI may require registration for certain models due to limited-access policies. Azure AI Foundry projects are straightforward to set up via the portal.
- Cost: Pricing varies by service. Azure OpenAI uses a pay-as-you-go model based on token usage, while Azure AI Foundry may involve costs for compute resources and model inference (Azure OpenAI Pricing).
- Learning Curve: Azure AI Foundry offers a user-friendly interface but may require familiarity with AI concepts for advanced features like RAG or prompt engineering.
- Community Insights: Discussions I have seen on Reddit suggest that Azure AI Foundry provides access to a broader range of models but may lag behind Azure OpenAI in offering the latest OpenAI models.
A Short And Quick Recap
- Azure OpenAI Service provides access to OpenAI’s advanced language models, like GPT-4, for tasks such as text generation and code completion, tailored for enterprise use.
- Azure AI Service likely refers to the broader category of Azure AI services, a collection of AI tools including vision, speech, and machine learning, though it may be a misnomer in this context.
- Azure AI Studio is the former name of Azure AI Foundry, a platform for building AI applications with various models.
- Azure OpenAI Studio was a dedicated interface for Azure OpenAI Service, now integrated into Azure AI Foundry.
- Azure AI Foundry is the current, comprehensive platform for developing AI applications, supporting multiple models, including those from Azure OpenAI.
What’s There In Azure Right Now
- Azure OpenAI (Previously Azure OpenAI Studio and Azure OpenAI Service)
- Azure AI Services (Previously Cognitive Services)
- Azure AI Foundry (New For Complete End-To-End Project Development)
That’s It
I hope you now have a clear understanding of the terms that existed before and what exists now. I have also shared my thoughts along the way. Now go ahead, start using Azure, and get hands-on building the powerful AI systems you’ve been thinking about!
About the Author
Harisudhan S
Data Science @Verchool Holdings | Python | Machine Learning | Deep Learning | NLP | LLMs | Agentic AI | Certified Microsoft Azure AI Engineer
Harisudhan, S (2025). Azure Open AI Vs Azure AI Services Vs Azure AI Foundry. Available at: Azure Open AI Vs Azure AI Services Vs Azure AI Foundry | by Harisudhan.S | Medium [Accessed: 7th August 2025].