Azure SQL to Snowflake via Microsoft Fabric Mirroring and OneLake Catalog Integration

Celestinfo Software Solutions Pvt. Ltd. Jan 29, 2026

Last updated: February 2026

Quick answer: To sync Azure SQL to Snowflake using Fabric Mirroring, create a mirrored database in Microsoft Fabric that replicates your Azure SQL tables to OneLake in Delta format. Fabric automatically virtualizes Delta as Iceberg. Then configure a Snowflake catalog integration pointing to OneLake's Iceberg REST endpoint and create Iceberg tables in Snowflake to query the mirrored data -- no ETL pipelines required.

Introduction

Microsoft Fabric Mirroring with Snowflake catalog integration enables near-real-time data synchronization from Azure SQL to Snowflake without traditional ETL pipelines. This guide covers mirroring an Azure SQL Database to OneLake and integrating the mirrored data with Snowflake using Iceberg REST Catalog integration for seamless cross-platform data access.


By combining Fabric Mirroring with Snowflake's Iceberg catalog integration, organizations achieve continuous data flow from Azure SQL to Snowflake. For a pipeline-based alternative, see our guide on Azure SQL to Snowflake via Fabric and ADLS Gen2. The mirrored data in OneLake is automatically converted from Delta to Iceberg format, making it accessible to Snowflake through its native Iceberg REST Catalog support.


Step 1 - Load Data into Azure SQL Database


The first step is to ensure your source data is available in Azure SQL Database. In this example, API data is loaded into an Azure SQL Database table dbo.api using Azure Data Factory. This table serves as the source for Fabric Mirroring.


Azure Data Factory pipelines can be configured to ingest data from various APIs, flat files, or other databases into your Azure SQL Database on a scheduled or event-driven basis.
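Before starting mirroring, it can help to confirm the source table is populated. A minimal check, assuming the dbo.api table loaded in this step:

SQL - Verify Source Data (Azure SQL)
-- Confirm the source table has rows
SELECT COUNT(*) AS row_count FROM dbo.api;

-- Preview a few records
SELECT TOP 10 * FROM dbo.api;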


Step 2 - Create Mirrored Database in Microsoft Fabric


To create a mirrored database in Microsoft Fabric, follow these steps:


  1. Open the Fabric Portal (app.fabric.microsoft.com)
  2. Navigate to your Workspace
  3. Click New and select Mirrored Database
  4. Select Azure SQL Database as the source type
  5. Enter the server name, database name, and authentication details
  6. Select the tables you want to mirror
  7. Click Start Mirroring

Fabric will begin continuously replicating the selected tables from Azure SQL into OneLake in Delta format, which is then automatically virtualized as Iceberg for external catalog access.


Step 3 - Configure Managed Identity in Azure SQL Server


During the connection setup, Fabric may request a System Assigned Managed Identity to access your Azure SQL Server. If you encounter connectivity issues, follow these steps to resolve them:


  1. Go to the Azure Portal
  2. Navigate to your SQL Server resource (not the database, but the server)
  3. Click Identity in the left menu
  4. Enable System Assigned Managed Identity and save
  5. Go to Networking settings
  6. Enable Allow Azure services and resources to access this server

This grants Fabric the necessary permissions to connect to and read data from your Azure SQL Database for mirroring purposes.
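The networking change can also be confirmed with T-SQL. In Azure SQL, server-level firewall rules are visible in the master database, and the Allow Azure services option appears as a rule spanning 0.0.0.0 (the rule name may vary):

SQL - Confirm Firewall Rules (run against the master database)
-- "Allow Azure services" shows up as a rule with start and end IP 0.0.0.0
SELECT name, start_ip_address, end_ip_address
FROM sys.firewall_rules;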


Step 4 - Verify Mirroring Status


Once mirroring is started, you can verify its status in the Fabric Portal by opening the mirrored database and selecting Monitor replication. Typical indicators include the replication status (for example, Running), the last completed sync time, and the total rows replicated per table.

Monitor these metrics to confirm that your Azure SQL data is being continuously replicated into OneLake.
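Replication can also be inspected from the source side. Azure SQL exposes metadata for the change feed that mirroring relies on; a hedged example (the sp_help_change_feed procedure is part of the change feed tooling, and its availability may vary by service tier):

SQL - Check Change Feed on the Source (Azure SQL)
-- Lists tables enabled for the change feed used by Fabric Mirroring
EXEC sys.sp_help_change_feed;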


Step 5 - Extract Workspace and Item IDs


You will need the Workspace ID and Mirrored Database Item ID for configuring Snowflake's catalog integration. Both are GUIDs that can be read directly from the Fabric portal URL while the mirrored database is open; the URL typically follows the pattern https://app.fabric.microsoft.com/groups/<workspace-id>/mirroreddatabases/<item-id>.

Save both IDs, as they will be used in the Snowflake integration scripts in Step 8.


Step 6 - Azure App Registration for Snowflake


Snowflake needs OAuth credentials to authenticate with Microsoft Fabric's Iceberg REST Catalog API. Create an Azure App Registration by following these steps:


  1. Go to the Azure Portal
  2. Navigate to Microsoft Entra ID (formerly Azure Active Directory)
  3. Click App registrations → New registration
  4. Provide a name (e.g., Snowflake-Fabric-Integration) and register
  5. Go to API permissions → Add a permission
  6. Select Azure Storage → Delegated permissions → user_impersonation
  7. Grant admin consent for the permission
  8. Go to Certificates & secrets → New client secret
  9. Generate a secret and copy the Value immediately (it will not be shown again)

Collect the following values from the App Registration:

  1. Directory (tenant) ID -- used as <tenant-id> in Step 8
  2. Application (client) ID -- used as <client-id>
  3. Client secret Value -- used as <client-secret>

Step 7 - Grant Contributor Access in Fabric Workspace


The App Registration created in Step 6 must be granted access to your Fabric Workspace so that Snowflake can read the mirrored data:


  1. Open your Fabric Workspace
  2. Click Manage access (or Workspace settings → Access)
  3. Add the App Registration (search by name or Client ID)
  4. Assign the Contributor role

This allows the registered application (used by Snowflake) to read data from OneLake within the workspace.


Step 8 - Snowflake Integration Script


Now configure Snowflake to connect to the mirrored data in OneLake. Execute the following SQL statements in Snowflake. Replace all placeholder values (e.g., <workspace-id>, <tenant-id>, etc.) with your actual values.


1. Create Catalog Integration

This creates the connection between Snowflake and Fabric's Iceberg REST Catalog API:

SQL - Snowflake Catalog Integration
CREATE CATALOG INTEGRATION onelake_catalog_int
  CATALOG_SOURCE = ICEBERG_REST
  TABLE_FORMAT = ICEBERG
  REST_CONFIG = (
    CATALOG_URI = 'https://onelake.table.fabric.microsoft.com/iceberg'
    CATALOG_NAME = '<workspace-id>/<mirroreddb-id>'
  )
  REST_AUTHENTICATION = (
    TYPE = OAUTH
    OAUTH_TOKEN_URI = 'https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token'
    OAUTH_CLIENT_ID = '<client-id>'
    OAUTH_CLIENT_SECRET = '<client-secret>'
    OAUTH_ALLOWED_SCOPES = ('https://storage.azure.com/.default')
  )
  ENABLED = TRUE;
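
After creating the integration, it is worth confirming that Snowflake registered it and can reach the Fabric endpoint. A quick check (SYSTEM$VERIFY_CATALOG_INTEGRATION validates connectivity; SHOW CATALOG INTEGRATIONS simply confirms the object exists):

SQL - Verify Catalog Integration
-- Confirm the integration object exists
SHOW CATALOG INTEGRATIONS LIKE 'onelake_catalog_int';

-- Test connectivity to the Fabric Iceberg REST endpoint
SELECT SYSTEM$VERIFY_CATALOG_INTEGRATION('onelake_catalog_int');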

2. Create External Volume

This defines the storage location in OneLake where Snowflake will read the mirrored data:

SQL - Snowflake External Volume
CREATE EXTERNAL VOLUME onelake_vol
  STORAGE_LOCATIONS = (
    (
      NAME = 'onelake_loc'
      STORAGE_PROVIDER = 'AZURE'
      STORAGE_BASE_URL = 'azure://onelake.dfs.fabric.microsoft.com/<workspace-id>/<mirroreddb-id>/Tables/'
      AZURE_TENANT_ID = '<tenant-id>'
    )
  )
  ALLOW_WRITES = FALSE;

3. Grant Fabric Access to External Volume Identity

After creating the external volume, retrieve the multi-tenant app name and add it as a Contributor in your Fabric workspace:

SQL - Describe External Volume
DESC EXTERNAL VOLUME onelake_vol;
-- Look for the AZURE_MULTI_TENANT_APP_NAME property in the output.
-- Add this app name as a Contributor in your Fabric workspace
-- (same process as Step 7).
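
Once the multi-tenant app has been granted workspace access, the volume can be validated end to end (SYSTEM$VERIFY_EXTERNAL_VOLUME performs a test operation against the storage location, so it will fail until the access grant is in place):

SQL - Verify External Volume
-- Attempts a test operation against the OneLake storage location
SELECT SYSTEM$VERIFY_EXTERNAL_VOLUME('onelake_vol');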

4. Create Linked Database (Auto-Syncs Every 30 Seconds)

This creates a Snowflake database that is automatically linked to the OneLake catalog. It refreshes approximately every 30 seconds:

SQL - Snowflake Linked Database
CREATE DATABASE FABRIC_LINKED_DB
  LINKED_CATALOG = (CATALOG = 'onelake_catalog_int')
  EXTERNAL_VOLUME = 'onelake_vol';

5. Verify and Query the Linked Database

Use these commands to verify the catalog link status and explore the mirrored data:

SQL - Verify and Query
-- Check catalog link status
SELECT SYSTEM$CATALOG_LINK_STATUS('FABRIC_LINKED_DB');

-- Explore schemas and tables
SHOW SCHEMAS IN DATABASE FABRIC_LINKED_DB;
SHOW TABLES IN DATABASE FABRIC_LINKED_DB;

-- Query mirrored data
SELECT * FROM FABRIC_LINKED_DB."dbo"."<table>" LIMIT 10;
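
Linked tables can be consumed like any other Snowflake object. As a sketch (the ANALYTICS database and view name here are illustrative, and <table> is your mirrored table), a view in a local database can expose the mirrored data to downstream users:

SQL - Example Downstream View (illustrative names)
-- ANALYTICS.PUBLIC.API_DATA_V is hypothetical; replace with your own objects
CREATE VIEW ANALYTICS.PUBLIC.API_DATA_V AS
  SELECT * FROM FABRIC_LINKED_DB."dbo"."<table>";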

Conclusion


This integration provides a powerful, near-real-time data synchronization pipeline between Azure SQL Database and Snowflake using Microsoft Fabric Mirroring and OneLake Catalog Integration. Key benefits of this approach:

  1. No ETL pipelines to build or maintain -- mirroring replicates changes continuously
  2. Near-real-time availability -- the linked database refreshes approximately every 30 seconds
  3. Open table format -- Delta data in OneLake is virtualized as Iceberg, so Snowflake reads it in place
  4. A single copy of the data in OneLake, queryable from both Fabric and Snowflake

By leveraging Microsoft Fabric's mirroring capabilities and Snowflake's Iceberg catalog integration, teams can significantly reduce data pipeline complexity while achieving near-real-time data availability across both platforms. Ensure proper data access control strategies are in place when sharing data across platforms.

Frequently Asked Questions

Q: What is the difference between Fabric Mirroring and traditional ETL pipelines?

Mirroring provides continuous, automatic replication of Azure SQL data into Fabric OneLake without building or managing ETL pipelines. Changes in the source database are automatically reflected in OneLake.

Q: How often does Snowflake's linked database sync with OneLake?

Snowflake's catalog-linked database automatically syncs approximately every 30 seconds, providing near-real-time access to mirrored data.

Q: Why is an Azure App Registration needed?

The App Registration provides OAuth credentials that allow Snowflake to authenticate with Microsoft Fabric's Iceberg REST Catalog API and access OneLake storage.
