Azure SQL to Snowflake via Microsoft Fabric Mirroring and OneLake Catalog Integration
Last updated: February 2026
Quick answer: To sync Azure SQL to Snowflake using Fabric Mirroring, create a mirrored database in Microsoft Fabric that replicates your Azure SQL tables to OneLake in Delta format. Fabric automatically virtualizes Delta as Iceberg. Then configure a Snowflake catalog integration pointing to OneLake's Iceberg REST endpoint and create Iceberg tables in Snowflake to query the mirrored data -- no ETL pipelines required.
Introduction
Microsoft Fabric Mirroring with Snowflake catalog integration enables near-real-time data synchronization from Azure SQL to Snowflake without traditional ETL pipelines. This guide covers mirroring an Azure SQL Database to OneLake and integrating the mirrored data with Snowflake using Iceberg REST Catalog integration for seamless cross-platform data access.
By combining Fabric Mirroring with Snowflake's Iceberg catalog integration, organizations get a continuous flow of data from Azure SQL to Snowflake. For a pipeline-based alternative, see our guide on Azure SQL to Snowflake via Fabric and ADLS Gen2. The mirrored data in OneLake is automatically virtualized from Delta to Iceberg format, making it accessible to Snowflake through its native Iceberg REST Catalog support.
Step 1 - Load Data into Azure SQL Database
The first step is to ensure your source data is available in Azure SQL Database. In this example, API data is loaded into an Azure SQL Database table dbo.api using Azure Data Factory. This table serves as the source for Fabric Mirroring.
Azure Data Factory pipelines can be configured to ingest data from various APIs, flat files, or other databases into your Azure SQL Database on a scheduled or event-driven basis.
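Azure Data Factory handles this shaping for you, but the transformation it performs on an API response can be sketched in a few lines. The snippet below is illustrative only; the column names (id, name, updated_at) are hypothetical and should be matched to your actual dbo.api table definition:

```python
import json

def records_to_rows(payload, fields=("id", "name", "updated_at")):
    """Flatten a JSON API response into row tuples ready for a bulk
    insert into a staging table such as dbo.api.
    The field names here are hypothetical examples."""
    records = json.loads(payload)
    return [tuple(rec.get(f) for f in fields) for rec in records]

payload = '[{"id": 1, "name": "alpha", "updated_at": "2026-01-01"}]'
rows = records_to_rows(payload)
# rows == [(1, "alpha", "2026-01-01")]
```

Each tuple can then be bound to a parameterized INSERT against the staging table; in practice, ADF's Copy activity performs this mapping for you.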
Step 2 - Create Mirrored Database in Microsoft Fabric
To create a mirrored database in Microsoft Fabric, follow these steps:
- Open the Fabric Portal (app.fabric.microsoft.com)
- Navigate to your Workspace
- Click New and select Mirrored Database
- Select Azure SQL Database as the source type
- Enter the server name, database name, and authentication details
- Select the tables you want to mirror
- Click Start Mirroring
Fabric will begin continuously replicating the selected tables from Azure SQL into OneLake in Delta format, which is then automatically virtualized as Iceberg for external catalog access.
Step 3 - Configure Managed Identity in Azure SQL Server
During connection setup, Fabric may require that your Azure SQL Server have a System Assigned Managed Identity enabled before it can access the server. If you encounter connectivity issues, follow these steps to resolve them:
- Go to the Azure Portal
- Navigate to your SQL Server resource (not the database, but the server)
- Click Identity in the left menu
- Enable System Assigned Managed Identity and save
- Go to Networking settings
- Enable Allow Azure services and resources to access this server
This grants Fabric the necessary permissions to connect to and read data from your Azure SQL Database for mirroring purposes.
Step 4 - Verify Mirroring Status
Once mirroring is started, you can verify its status in the Fabric Portal. Look for the following indicators:
- Status: Shows as Running when mirroring is active
- Rows Replicated: The count increases as data is synced
- Last Completed: Timestamp updates to show the most recent sync
Monitor these metrics to confirm that your Azure SQL data is being continuously replicated into OneLake.
Step 5 - Extract Workspace and Item IDs
You will need the Workspace ID and Mirrored Database Item ID for configuring Snowflake's catalog integration. These can be extracted from the Fabric portal URLs:
- Workspace ID: Found in the URL when viewing your workspace (e.g., app.fabric.microsoft.com/groups/<workspace-id>/...)
- Mirrored DB Item ID: Found in the URL when viewing the mirrored database (e.g., .../artifacts/<mirroreddb-id>/...)
Save both IDs as they will be used in the Snowflake integration scripts in Step 8.
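If you copy these URLs often, the two GUIDs can be pulled out programmatically. The sketch below uses hypothetical example URLs and GUIDs; only the /groups/ and /artifacts/ path segments are taken from the portal URL patterns above:

```python
import re

# Hypothetical URLs as copied from the browser address bar
workspace_url = "https://app.fabric.microsoft.com/groups/11111111-2222-3333-4444-555555555555/list"
mirrored_db_url = ("https://app.fabric.microsoft.com/groups/"
                   "11111111-2222-3333-4444-555555555555/artifacts/"
                   "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/details")

GUID = r"[0-9a-fA-F]{8}-(?:[0-9a-fA-F]{4}-){3}[0-9a-fA-F]{12}"

def extract_fabric_ids(url):
    """Return the workspace ID and, if present, the mirrored-database
    item ID embedded in a Fabric portal URL."""
    ids = {}
    ws = re.search(rf"/groups/({GUID})", url)
    if ws:
        ids["workspace_id"] = ws.group(1)
    item = re.search(rf"/artifacts/({GUID})", url)
    if item:
        ids["mirroreddb_id"] = item.group(1)
    return ids

print(extract_fabric_ids(mirrored_db_url))
```

The returned dictionary holds both values needed for the CATALOG_NAME and STORAGE_BASE_URL placeholders in Step 8.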
Step 6 - Azure App Registration for Snowflake
Snowflake needs OAuth credentials to authenticate with Microsoft Fabric's Iceberg REST Catalog API. Create an Azure App Registration by following these steps:
- Go to the Azure Portal
- Navigate to Microsoft Entra ID (formerly Azure Active Directory)
- Click App registrations → New registration
- Provide a name (e.g., Snowflake-Fabric-Integration) and register
- Go to API permissions → Add a permission
- Select Azure Storage → Delegated permissions → user_impersonation
- Grant admin consent for the permission
- Go to Certificates & secrets → New client secret
- Generate a secret and copy the Value immediately (it will not be shown again)
Collect the following values from the App Registration:
- Tenant ID: Found on the Overview page
- Client ID (Application ID): Found on the Overview page
- Client Secret Value: The secret value you generated above
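Before wiring these values into the Step 8 SQL, it can help to confirm they actually produce a token. The sketch below assembles the same Microsoft Entra client-credentials token request that Snowflake's catalog integration performs internally; the endpoint and scope mirror the OAUTH_TOKEN_URI and OAUTH_ALLOWED_SCOPES used later, but this is a debugging aid, not part of the integration itself:

```python
def build_token_request(tenant_id, client_id, client_secret):
    """Assemble the Microsoft Entra client-credentials token request
    (OAuth 2.0) for the Azure Storage scope used by the integration."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    form = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://storage.azure.com/.default",
    }
    return url, form

url, form = build_token_request("<tenant-id>", "<client-id>", "<client-secret>")
# POSTing `form` to `url` (e.g. requests.post(url, data=form)) should return
# a JSON body containing an access_token when the credentials are valid.
```

If the POST returns an error such as invalid_client, fix the App Registration before troubleshooting the Snowflake side.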
Step 7 - Grant Contributor Access in Fabric Workspace
The App Registration created in Step 6 must be granted access to your Fabric Workspace so that Snowflake can read the mirrored data:
- Open your Fabric Workspace
- Click Manage access (or Workspace settings → Access)
- Add the App Registration (search by name or Client ID)
- Assign the Contributor role
This allows the registered application (used by Snowflake) to read data from OneLake within the workspace.
Step 8 - Snowflake Integration Script
Now configure Snowflake to connect to the mirrored data in OneLake. Execute the following SQL statements in Snowflake, replacing all placeholder values (<workspace-id>, <tenant-id>, and so on) with your actual values.
1. Create Catalog Integration
This creates the connection between Snowflake and Fabric's Iceberg REST Catalog API:
CREATE CATALOG INTEGRATION onelake_catalog_int
  CATALOG_SOURCE = ICEBERG_REST
  TABLE_FORMAT = ICEBERG
  REST_CONFIG = (
    CATALOG_URI = 'https://onelake.table.fabric.microsoft.com/iceberg'
    CATALOG_NAME = '<workspace-id>/<mirroreddb-id>'
  )
  REST_AUTHENTICATION = (
    TYPE = OAUTH
    OAUTH_TOKEN_URI = 'https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token'
    OAUTH_CLIENT_ID = '<client-id>'
    OAUTH_CLIENT_SECRET = '<client-secret>'
    OAUTH_ALLOWED_SCOPES = ('https://storage.azure.com/.default')
  )
  ENABLED = TRUE;
2. Create External Volume
This defines the storage location in OneLake where Snowflake will read the mirrored data:
CREATE EXTERNAL VOLUME onelake_vol
  STORAGE_LOCATIONS = (
    (
      NAME = 'onelake_loc'
      STORAGE_PROVIDER = 'AZURE'
      STORAGE_BASE_URL = 'azure://onelake.dfs.fabric.microsoft.com/<workspace-id>/<mirroreddb-id>/Tables/'
      AZURE_TENANT_ID = '<tenant-id>'
    )
  )
  ALLOW_WRITES = FALSE;
3. Grant Fabric Access to External Volume Identity
After creating the external volume, retrieve the multi-tenant app name and add it as a Contributor in your Fabric workspace:
DESC EXTERNAL VOLUME onelake_vol;
-- Look for the AZURE_MULTI_TENANT_APP_NAME property in the output.
-- Add this app name as a Contributor in your Fabric workspace
-- (same process as Step 7).
4. Create Linked Database (Auto-Syncs Every 30 Seconds)
This creates a Snowflake database that is automatically linked to the OneLake catalog. It refreshes approximately every 30 seconds:
CREATE DATABASE FABRIC_LINKED_DB
  LINKED_CATALOG = (CATALOG = 'onelake_catalog_int')
  EXTERNAL_VOLUME = 'onelake_vol';
5. Verify and Query the Linked Database
Use these commands to verify the catalog link status and explore the mirrored data:
-- Check catalog link status
SELECT SYSTEM$CATALOG_LINK_STATUS('FABRIC_LINKED_DB');
-- Explore schemas and tables
SHOW SCHEMAS IN DATABASE FABRIC_LINKED_DB;
SHOW TABLES IN DATABASE FABRIC_LINKED_DB;
-- Query mirrored data
SELECT * FROM FABRIC_LINKED_DB."dbo"."<table>" LIMIT 10;
Conclusion
This integration provides a powerful, near-real-time data synchronization pipeline between Azure SQL Database and Snowflake using Microsoft Fabric Mirroring and OneLake Catalog Integration. Here are the key benefits of this approach:
- Automatic sync: Fabric Mirroring continuously replicates Azure SQL data to OneLake without manual intervention
- No ETL pipeline needed: Eliminates the need to build and maintain traditional ETL/ELT pipelines for moving data between Azure SQL and Snowflake
- Iceberg format for interoperability: Fabric's automatic Delta-to-Iceberg virtualization enables seamless access from Snowflake via its native Iceberg REST Catalog support
- Linked database auto-refreshes: Snowflake's linked database syncs approximately every 30 seconds, providing near-real-time access to the latest data
By leveraging Microsoft Fabric's mirroring capabilities and Snowflake's Iceberg catalog integration, teams can significantly reduce data pipeline complexity while achieving near-real-time data availability across both platforms. Ensure proper data access control strategies are in place when sharing data across platforms.
Frequently Asked Questions
Q: What is the difference between Fabric Mirroring and traditional ETL pipelines?
Mirroring provides continuous, automatic replication of Azure SQL data into Fabric OneLake without building or managing ETL pipelines. Changes in the source database are automatically reflected in OneLake.
Q: How often does Snowflake's linked database sync with OneLake?
Snowflake's catalog-linked database automatically syncs approximately every 30 seconds, providing near-real-time access to mirrored data.
Q: Why is an Azure App Registration needed?
The App Registration provides OAuth credentials that allow Snowflake to authenticate with Microsoft Fabric's Iceberg REST Catalog API and access OneLake storage.