
What is an Azure Machine Learning Workspace?

Workspaces are places to collaborate with colleagues to create machine learning artifacts and group related work. They provide a centralized environment for experiments, jobs, datasets, models, components, and inference endpoints.
The workspace is the top-level resource for Azure Machine Learning, keeping a history of all jobs, logs, metrics, output, and script snapshots.

Tasks Performed Within a Workspace

For machine learning teams, the workspace organizes the following activities:

Create Jobs

Training runs to build models, grouped into experiments for metric comparison

Author Pipelines

Reusable workflows for training and retraining models

Register Data Assets

Manage data used for model training and pipeline creation

Register Models

Version and track models ready for deployment

Create Endpoints

Deploy registered models for real-time or batch inference

Manage Compute

Configure compute targets for running experiments
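As a sketch of the compute-management task above, a compute cluster can be created with the Azure CLI `ml` extension. The names below (`cpu-cluster`, `my-rg`, `my-workspace`) are placeholders, and the command assumes an existing workspace:

```shell
# Create an autoscaling compute cluster (0 nodes when idle)
az ml compute create \
  --name cpu-cluster \
  --type AmlCompute \
  --size Standard_DS3_v2 \
  --min-instances 0 \
  --max-instances 4 \
  --resource-group my-rg \
  --workspace-name my-workspace
```

Setting `--min-instances 0` lets the cluster scale to zero between jobs so idle nodes do not accrue VM charges.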

Workspace Components

Resource Configurations

Workspaces host the following compute configurations for running experiments and training jobs:
  • Compute instances: Development workstations
  • Compute clusters: Scalable training infrastructure
  • Serverless compute: On-demand compute without infrastructure management
  • Inference clusters: AKS for model deployment
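A compute instance (the development-workstation option above) can be provisioned the same way; the names here are placeholders:

```shell
# Create a single-user development workstation
az ml compute create \
  --name ci-dev \
  --type ComputeInstance \
  --size Standard_DS3_v2 \
  --resource-group my-rg \
  --workspace-name my-workspace
```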

Organizing Workspaces

For team leads and administrators, workspaces serve as containers for access management, cost management, and data isolation.

Best Practices

Manage permissions between users with predefined roles:
  • Owner: Full workspace access including role assignment
  • Contributor: Create and manage resources except role assignment
  • Reader: View workspace resources only
  • Custom roles: Define specific permissions for your needs
For example, grant a predefined role at the workspace scope:
az role assignment create \
  --assignee user@example.com \
  --role "AzureML Data Scientist" \
  --scope /subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.MachineLearningServices/workspaces/<workspace>
Use Microsoft Entra user groups instead of individual users:
  • Simplifies permission management
  • Consistent access across resources
  • Easier onboarding/offboarding
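To apply the group-based approach above, assign the role to an Entra group rather than a user. `<group-object-id>` is the group's object ID in Microsoft Entra (a placeholder here):

```shell
# Assign a role to an Entra group at workspace scope
az role assignment create \
  --assignee-object-id <group-object-id> \
  --assignee-principal-type Group \
  --role "AzureML Data Scientist" \
  --scope /subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.MachineLearningServices/workspaces/<workspace>
```

Membership changes in the group then take effect without touching the role assignment itself.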
Limit to one project per workspace for:
  • Project-level cost reporting
  • Scoped datastore configuration
  • Better resource organization
  • Clear ownership boundaries
Share associated resources between workspaces:
  • Storage accounts
  • Key Vaults
  • Application Insights
  • Container Registries
Reduces repetitive setup and infrastructure costs.
IT admins can:
  1. Precreate and secure associated resources
  2. Grant appropriate RBAC roles to data scientists
  3. Allow teams to create workspaces independently
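Step 1 above can be realized by passing the resource IDs of precreated resources when the workspace is created. This is a sketch; all names and `<...>` IDs are placeholders:

```shell
# Create a workspace that reuses precreated, secured resources
az ml workspace create \
  --name team-workspace \
  --resource-group my-rg \
  --storage-account <storage-account-resource-id> \
  --key-vault <key-vault-resource-id> \
  --application-insights <app-insights-resource-id> \
  --container-registry <acr-resource-id>
```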
Balances governance with team autonomy.
Use a hub workspace to group multiple project workspaces with shared:
  • Security settings and connections
  • Compute resources
  • Centralized governance
Hub workspaces work with both Azure ML studio and Microsoft Foundry.

Associated Azure Resources

When you create a workspace, Azure Machine Learning automatically provisions these resources:
  • Azure Storage Account: Stores job logs, notebooks, and uploaded data. Default datastore for the workspace.
  • Azure Container Registry: Stores Docker images for custom environments. Created on demand when images are first built.
  • Azure Application Insights: Monitors and collects diagnostic information from inference endpoints.
  • Azure Key Vault: Stores secrets, connection strings, and keys used by compute and datastores.
Storage Account Limitations

You cannot use:
  • BlobStorage accounts
  • Premium storage accounts (Premium_LRS/Premium_GRS)
  • Accounts with hierarchical namespace enabled
You can attach premium or hierarchical namespace storage as additional datastores.
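Attaching an additional datastore uses a small YAML definition with the v2 CLI. A minimal sketch, with placeholder account and container names:

```shell
# Write a datastore definition (v2 YAML schema)
cat > datastore.yml <<'EOF'
name: premium_blob_store
type: azure_blob
account_name: mypremiumaccount
container_name: training-data
EOF

# Register it with the workspace
az ml datastore create --file datastore.yml \
  --resource-group my-rg --workspace-name my-workspace
```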

How Content is Stored

The workspace keeps a history of all training runs with:
  • Logs: Console output and error messages
  • Metrics: Tracked numeric values and visualizations
  • Output: Model files and artifacts
  • Lineage metadata: Dataset versions and relationships
  • Script snapshots: Code used for each run
Artifacts and metadata are stored in the workspace and associated Azure resources.

Create a Workspace

Multiple methods are available for workspace creation:
Quick creation with default settings:
  1. Navigate to ml.azure.com
  2. Select Create workspace
  3. Provide name, subscription, and resource group
  4. Select Create
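The same quick creation can be done from the command line; `my-rg`, `my-workspace`, and the location are placeholders:

```shell
# Create a resource group, then a workspace with default settings
az group create --name my-rg --location eastus
az ml workspace create --name my-workspace --resource-group my-rg
```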

Workspace Subresources

Compute clusters and instances create additional subresources:
  • Virtual Machines: Provide compute capacity for training and inference
  • Load Balancer: Distributes traffic to cluster nodes (billed even while the compute is stopped)
  • Virtual Network: Enables secure communication between resources
  • Bandwidth: Charged for outbound data transfer
These subresources are managed automatically by Azure Machine Learning.

Management Tools

Interact with your workspace using:

Azure Portal

Full Azure resource management

ML Studio

ML-specific interface at ml.azure.com

Python SDK

Programmatic workspace access

Azure CLI

Command-line automation

VS Code Extension

Integrated development experience

REST API

Direct API integration
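When working with the Azure CLI, defaults can be set once so later `az ml` commands can omit the resource group and workspace flags; the names below are placeholders:

```shell
# Set defaults so subsequent az ml commands can omit -g/-w
az configure --defaults group=my-rg workspace=my-workspace

# Now this picks up the defaults above
az ml workspace show
```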

Cost Management

Workspace costs come from:
  1. Compute resources: VMs for training and inference
  2. Storage: Blob storage for data and models
  3. Networking: Private Link, VNet integration
  4. Monitoring: Application Insights and Log Analytics
To monitor costs:
az consumption usage list \
  --start-date 2024-01-01 \
  --end-date 2024-01-31 \
  | jq '[.[] | select(.instanceName | contains("ml-workspace"))]'

Next Steps

Create Your Workspace

Follow the quickstart guide

Compute Targets

Learn about compute resources

Security

Configure workspace security settings

Hub Workspaces

Explore enterprise workspace organization