Power BI with Snowflake: DirectQuery vs Import Mode and When to Use Each

Celestinfo Software Solutions Pvt. Ltd. Aug 14, 2025

Last updated: September 2025

Quick answer: Use Import mode when your dataset is under 10M rows, doesn't need real-time freshness, and requires complex DAX. Use DirectQuery when the dataset is large, freshness matters, or many users need different data slices. The hybrid approach works best for most teams: import slowly-changing dimensions, DirectQuery for fact tables. Big gotcha: Power BI fires a separate Snowflake query per visual -- a page with 12 visuals means 12+ queries per page load.

Import Mode: How It Works

Import mode pulls data from Snowflake into Power BI's in-memory VertiPaq engine during a scheduled refresh. Once imported, all interactions (filtering, slicing, drilling) happen locally in memory. Snowflake isn't involved at all until the next refresh.

The upside: dashboards are fast. Visuals render in milliseconds because the data lives in Power BI's compressed columnar store. Complex DAX calculations (time intelligence, running totals, percentile calculations) execute instantly because they operate on local data.

The downside: data staleness. If your refresh runs at 6am, users see 6am data all day. You can configure refresh schedules up to 8 times per day (or 48 with a Premium capacity), but that's still not real-time. There's also a 1GB dataset size limit on shared capacity and 10GB on Premium per dataset. For a 500M-row fact table, that's usually not enough.

DirectQuery Mode: How It Works

DirectQuery doesn't import data. Every interaction in the dashboard sends a SQL query to Snowflake in real time. Click a filter? Snowflake runs a query. Drill into a chart? Another query. Change a slicer? Every visual on the page re-queries Snowflake.

The upside: data is always fresh. Users see the latest data every time they interact with the dashboard. There's no dataset size limit because the data stays in Snowflake. You can query tables with billions of rows without importing anything.

The downside: performance depends entirely on how fast Snowflake can execute those queries. And here's the part that surprises people: Power BI sends individual queries per visual. A dashboard page with 12 visuals, 3 slicers, and 2 cards fires at least 17 separate Snowflake queries every time the page loads. Change one slicer, and most of those queries fire again. At $3/credit and 1 credit/hour for an X-Small warehouse, those queries add up fast if the warehouse is running all day.

When to Use Import

- The dataset is under roughly 10M rows and fits comfortably in memory.
- You need complex DAX (time intelligence, running totals, percentile calculations).
- Staleness between scheduled refreshes is acceptable to your users.

When to Use DirectQuery

- The dataset is large (hundreds of millions of rows or more) or exceeds the import size limits.
- Real-time freshness matters.
- Many concurrent users need different slices of the data.

The Hybrid Approach: Best of Both

Most production deployments we've built use a composite model. Here's the pattern: import your dimension tables (products, customers, dates, regions - typically small, slowly-changing tables that compress well) and use DirectQuery for your fact tables (transactions, events, logs - large, frequently-updated tables).

This gives you the best of both modes. Slicer interactions against dimension tables are instant (in-memory). Fact table aggregations query Snowflake in real time for fresh data. The dimension tables act as filters that get pushed down to the Snowflake query, so the fact table queries are scoped to just the relevant partition. In Power BI Desktop, you set this up by choosing "Dual" storage mode for dimensions and "DirectQuery" for facts when configuring table storage.
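To make the pushdown concrete, here's the shape of SQL a fact-table visual typically generates under this setup. The table and column names are invented for illustration, not taken from any specific schema:

SQL - Illustrative pushed-down fact query (names are hypothetical)
SELECT d.region_name,
       SUM(f.sales_amount) AS total_sales
FROM   sales_facts f                  -- DirectQuery fact table, stays in Snowflake
JOIN   dim_region d                   -- dimension held in Dual storage mode
  ON   f.region_id = d.region_id
WHERE  f.order_date BETWEEN '2025-01-01' AND '2025-03-31'
  AND  d.region_name = 'EMEA'         -- slicer selection pushed down as a filter
GROUP BY d.region_name;

Because the slicer value arrives as a WHERE predicate, Snowflake can prune partitions on the fact table instead of scanning it whole.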

Setting Up the Snowflake Connection

Power BI connects to Snowflake via the Snowflake ODBC driver. Download the latest driver from Snowflake's website (not the generic ODBC driver - it needs to be the Snowflake-specific one). In Power BI Desktop, go to Get Data → Snowflake → enter your account identifier and warehouse name.

For the Power BI Service (published reports), you've got two authentication options:

- Azure AD single sign-on (SSO): each report viewer's identity is passed through to Snowflake, so Snowflake sees the individual user on every query.
- A Snowflake service account: one set of credentials stored with the dataset's data source settings, so every query arrives as that one Snowflake user.
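If you go the service-account route, the Snowflake side is a read-only role plus a dedicated user. A minimal sketch, with placeholder names throughout:

SQL - Hypothetical service account for the Power BI Service
CREATE ROLE powerbi_reader;
GRANT USAGE ON WAREHOUSE powerbi_wh TO ROLE powerbi_reader;
GRANT USAGE ON DATABASE analytics TO ROLE powerbi_reader;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE powerbi_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE powerbi_reader;

CREATE USER powerbi_svc
  PASSWORD = '<store-in-a-secret-manager>'  -- never hardcode in scripts
  DEFAULT_ROLE = powerbi_reader
  DEFAULT_WAREHOUSE = powerbi_wh;
GRANT ROLE powerbi_reader TO USER powerbi_svc;

Scoping the role to SELECT on one reporting schema keeps the blast radius small if the credentials ever leak.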

Warehouse Sizing for Power BI

Create a dedicated Snowflake warehouse for Power BI. Don't share it with ETL workloads - a heavy ETL job will starve your dashboard queries. Start with X-Small and auto-suspend at 60 seconds:

SQL - Create a dedicated Power BI warehouse
CREATE WAREHOUSE powerbi_wh
  WITH WAREHOUSE_SIZE = 'X-SMALL'   -- 1 credit/hour; enough for typical dashboard queries
  AUTO_SUSPEND = 60                 -- suspend after 60 idle seconds to stop burning credits
  AUTO_RESUME = TRUE                -- wake automatically on the next dashboard query
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 2             -- spin up a second cluster under concurrency
  SCALING_POLICY = 'ECONOMY';       -- prefer brief queuing over adding clusters eagerly

Why X-Small? Power BI's DirectQuery queries are typically simple: SELECT col1, SUM(col2) FROM table WHERE date_col BETWEEN x AND y GROUP BY col1. These queries don't need a lot of compute. An X-Small warehouse (1 credit/hour) handles most dashboard workloads fine. If you've got 50+ concurrent users hitting the dashboard simultaneously, set MAX_CLUSTER_COUNT = 3 or 4 and let multi-cluster warehouses handle the concurrency.
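Scaling out for concurrency is then a one-line change. With ECONOMY scaling, Snowflake only adds clusters when queries actually start queuing:

SQL - Raise the cluster ceiling for high concurrency
ALTER WAREHOUSE powerbi_wh SET MAX_CLUSTER_COUNT = 4;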

Row-Level Security: Snowflake-Side vs Power BI-Side

You've got two options for RLS, and the right choice depends on your authentication setup:

- Snowflake-side: enforce RLS with secure views (or row access policies) keyed to the querying user. This is the natural fit with Azure AD SSO, since each user's identity reaches Snowflake and data never leaves Snowflake unless authorized.
- Power BI-side: define RLS roles with DAX filters in the dataset. This is the fallback with a shared service account, because every query arrives at Snowflake as the same user.
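For the Snowflake-side option, a secure view keyed to CURRENT_USER() is the simplest pattern. This sketch assumes a mapping table (here called user_region_map, a name we've invented) linking each Snowflake login to the rows it may see:

SQL - Sketch of Snowflake-side RLS with a secure view (names are illustrative)
CREATE SECURE VIEW reporting.sales_secure AS
SELECT s.*
FROM   sales s
JOIN   user_region_map m               -- maps each login to its allowed regions
  ON   s.region_id = m.region_id
WHERE  m.user_name = CURRENT_USER();   -- resolves per query to the SSO identity

Point Power BI at the secure view instead of the base table. If you're on a Snowflake edition that supports them, row access policies achieve the same result without wrapping every table in a view.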

Performance Tips

- Keep visuals under 10 per page - every visual is a separate Snowflake query.
- Avoid bidirectional cross-filtering, which multiplies the queries generated per interaction.
- Build aggregation tables for high-level views on high-traffic dashboards.
- Move computed columns into Snowflake views instead of recomputing them per query.
- Run Power BI's Performance Analyzer to find slow visuals before publishing.

The Multi-Query-Per-Visual Gotcha

This is the #1 surprise for teams new to Power BI + Snowflake. When a user opens a dashboard page with 12 visuals, Power BI sends 12+ independent SQL queries to Snowflake. It doesn't batch them or send one big query - each visual gets its own query. If one visual has a complex filter, it sends an even more complex SQL statement.

The cost implications: if you've got 50 users each opening this dashboard 10 times a day, that's 50 * 10 * 12 = 6,000 Snowflake queries per day from one dashboard. On an X-Small warehouse the compute cost isn't huge, and Snowflake's result cache helps: if two users run the same query (same SQL, same underlying data), the second is served from the result cache without using warehouse compute at all. Design your dashboards with shared default views to maximize cache hit rates.
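Since many of those 6,000 queries compute the same high-level aggregates, a pre-aggregated table in Snowflake makes them dramatically cheaper to serve. A rough sketch, with illustrative table and column names:

SQL - Illustrative pre-aggregated table for high-traffic dashboard pages
CREATE OR REPLACE TABLE reporting.daily_sales_agg AS
SELECT order_date,
       region_id,
       SUM(sales_amount) AS total_sales,
       COUNT(*)          AS order_count
FROM   sales_facts
GROUP BY order_date, region_id;

Point high-level visuals (or a Power BI aggregation mapping) at this table so routine queries scan thousands of rows instead of hundreds of millions, and refresh it on a schedule, for example with a Snowflake task.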

Conclusion

The TL;DR is: start with a hybrid composite model (import dimensions, DirectQuery facts), use a dedicated X-Small Snowflake warehouse with 60-second auto-suspend, keep visuals under 10 per page, and build aggregation tables for high-traffic dashboards. The combo of Power BI's in-memory engine for dimensions and Snowflake's compute for facts gives you fast dashboards with fresh data - without the cost of importing a billion-row fact table into Power BI's memory eight times a day.

Kiran, Digital Marketing & BI Analyst

Kiran is a Digital Marketing & BI Analyst at CelestInfo specializing in Power BI, dashboard design, reporting best practices, and data-driven marketing strategies.

Burning Questions

Simple answers to make things clear.

When should I use Import vs DirectQuery?

Import for datasets under 10M rows with complex DAX needs and acceptable staleness. DirectQuery for large datasets, real-time freshness, or many concurrent users. The hybrid approach (import dimensions, DirectQuery facts) works best for most teams.

How many Snowflake queries does one dashboard page generate?

At least one query per visual. A page with 12 visuals fires 12+ independent Snowflake queries on every page load or filter change. This is the biggest cost and performance consideration when using DirectQuery.

What warehouse size should I start with?

Start with X-Small (1 credit/hour) with auto-suspend at 60 seconds. Power BI DirectQuery queries are typically simple aggregations that don't need large compute. Scale up only if you observe consistent query slowness in Performance Analyzer.

Should row-level security live in Snowflake or in Power BI?

If you're using Azure AD SSO, enforce RLS in Snowflake with secure views -- it's more secure because data never leaves Snowflake unless authorized. With a service account, implement RLS in Power BI using DAX filters.

How do I speed up a slow DirectQuery dashboard?

Reduce visuals to under 10 per page, avoid bidirectional cross-filtering, use aggregation tables for high-level views, and move computed columns into Snowflake views. Use Power BI's Performance Analyzer to identify slow visuals before publishing.
