Direct Lake is a storage mode for Power BI that reads straight from Delta tables in OneLake. You get the speed of Import without the refresh copy, and the freshness of DirectQuery without the SQL round-trip. It does require a Fabric capacity.
Direct Lake is a storage mode for semantic models in Microsoft Fabric. A report reads straight from Delta tables in OneLake, without importing the data first and without sending every query to the source system.
Where Import loads a full copy into your capacity's memory and DirectQuery forwards every question to SQL, Direct Lake takes a different route: it pulls only the columns and rows a visual actually needs, at the moment it needs them, from the Parquet files in OneLake. That is fast, because the VertiPaq engine Power BI already uses for Import reads the same column layout.
Think of a library that only fetches a book when someone asks for it, but has the full catalogue ready to go. No overnight refresh window, no "data is 12 hours old", and still no slow SQL round-trip for every screen a user opens.
Anyone reporting on large data volumes in Power BI runs into a familiar trade-off.
Import is fast but heavy
You load all data into memory in one go. Your capacity has to be big enough, your refresh window takes time, and users only see new numbers the next morning.
DirectQuery is live but slow
Every click sends a query to the source. If that is a SQL warehouse, you keep the load out of Power BI's memory but shift it to the SQL engine instead, with the usual risk of sluggish reports.
Microsoft built Direct Lake to sidestep that trade-off inside a Fabric stack. The data already sits in OneLake in Delta format. VertiPaq can read Parquet directly. An extra import copy is redundant at that point.
Framing instead of refresh
A classic Import refresh copies every row into the model. A Direct Lake refresh only updates metadata: which Parquet files belong to which table right now? That operation is called framing and takes seconds, not hours.
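Conceptually, framing is just a metadata swap. A toy Python sketch of the difference (all names and structures are hypothetical, not a Fabric API):

```python
# Toy model of framing: the semantic model tracks *which* Parquet files
# make up a table, not the rows themselves. Illustrative only.

def import_refresh(parquet_files):
    """Classic Import: copy every row into the model's memory."""
    rows = []
    for f in parquet_files:
        rows.extend(f["rows"])          # expensive: full data copy
    return {"rows": rows}

def framing_refresh(model, parquet_files):
    """Direct Lake framing: only update the list of current files."""
    model["files"] = [f["name"] for f in parquet_files]  # cheap: metadata only
    return model

files = [
    {"name": "part-0001.parquet", "rows": [1, 2, 3]},
    {"name": "part-0002.parquet", "rows": [4, 5]},
]

imported = import_refresh(files)              # holds all 5 rows in memory
framed = framing_refresh({"files": []}, files)
print(framed["files"])                        # only file names, no rows
```

The point of the sketch: the cost of `import_refresh` grows with the number of rows, while `framing_refresh` grows only with the number of files, which is why framing finishes in seconds regardless of table size.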
Transcoding on demand
The moment a visual or DAX query touches a column, that column is converted from the Parquet file into the VertiPaq format. Once loaded, the column stays in memory until capacity needs the space for something else.
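The load-on-first-touch behaviour resembles a lazy cache with eviction. A minimal sketch, with hypothetical names (this is not the actual engine):

```python
# Sketch of transcoding on demand: a column is only converted and held
# in memory once a query touches it, and can be evicted when the
# capacity needs the space. Purely illustrative.

class ColumnCache:
    def __init__(self, parquet_source):
        self.source = parquet_source      # column name -> raw values
        self.loaded = {}                  # columns currently in memory

    def get(self, column):
        if column not in self.loaded:     # first touch: transcode now
            self.loaded[column] = list(self.source[column])
        return self.loaded[column]

    def evict(self, column):              # capacity pressure frees memory
        self.loaded.pop(column, None)

cache = ColumnCache({"sales": [10, 20], "region": ["EU", "US"]})
cache.get("sales")                        # loads only the 'sales' column
print(sorted(cache.loaded))               # 'region' was never touched
```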
Delta tables as the foundation
Direct Lake only works with tables in Delta format. That gives you the transaction logs, schema guarantees and time travel of Delta Lake on every report by default.
Automatic updates
When the underlying Delta table changes (new row, updated dimension), Direct Lake detects it and makes sure the next query sees the new version. You can let those updates happen automatically or drive them programmatically.
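Change detection leans on the Delta transaction log: every commit adds a numbered JSON file under `_delta_log`, so comparing the highest version against the version the model was last framed at tells you whether a reframe is due. A simplified sketch of that check (the directory layout follows the Delta protocol; the reframe logic itself is hypothetical):

```python
import json
import os
import tempfile

# Sketch of Direct Lake-style change detection against a Delta table's
# transaction log. Each commit writes _delta_log/<20-digit version>.json;
# a higher version than the framed one means new data. Illustrative only.

def latest_version(delta_log_dir):
    versions = [
        int(name.split(".")[0])
        for name in os.listdir(delta_log_dir)
        if name.endswith(".json")
    ]
    return max(versions, default=-1)

with tempfile.TemporaryDirectory() as log:
    for v in range(3):                  # simulate three commits
        path = os.path.join(log, f"{v:020d}.json")
        with open(path, "w") as f:
            json.dump({"commitInfo": {"version": v}}, f)

    framed_at = 1                       # model last framed at version 1
    current = latest_version(log)       # newest commit is version 2
    needs_reframe = current > framed_at
    print(current, needs_reframe)
```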
Large volumes in a lakehouse architecture. You have gold Delta tables in Fabric and want to report on them directly, without a separate ETL into a Power BI dataset.
Near-real-time needs without DirectQuery pain. You want a new row to show up in the report within minutes, without firing SQL queries on every click.
IT-led projects with a warehouse or lakehouse in Fabric. Microsoft recommends Direct Lake explicitly for the gold layer in a medallion architecture.
Saving capacity on refresh cycles. An import refresh of billions of rows costs CPU and memory. Framing does the equivalent in seconds.
There are two flavours and they behave differently.
Direct Lake on OneLake
Reads straight from Delta tables in OneLake, no detour through SQL. No DirectQuery fallback: if a query cannot run, you get an error. Composite models are allowed, so you can mix Direct Lake tables with Import tables in the same model. This is the direction Microsoft is steering towards.
Direct Lake on SQL endpoint
Goes through the SQL endpoint of a lakehouse or warehouse. Supports a DirectQuery fallback when a table is a SQL view or when SQL-based security is active. Composite models with other storage modes are not supported. It does offer better integration with SQL RLS and object-level security.
Import remains the natural choice for smaller self-service datasets and for sources that do not live in OneLake. You do not need a Fabric capacity for it.
DirectQuery suits scenarios where you genuinely want to hit the source database per query (for instance, live operational dashboards on OLTP systems) or where compliance means you may not copy the data.
Direct Lake is the default choice for analytical reports on top of a Fabric lakehouse or warehouse whenever you have volumes too big for Import. You combine Import-style performance with DirectQuery-style freshness.
A Fabric capacity is required
Direct Lake only runs on an F (or legacy P) SKU. An organisation with only Pro licences cannot use it.
Row and model-size limits per SKU
On F2 through F32, a single table is capped at 300 million rows and the model at 10 to 40 GB on disk, depending on the SKU. Exceed that and the query fails, or, on Direct Lake on SQL, falls back to DirectQuery. Large models need F64 or higher, which lifts the model-size cap and allows 1.5 billion rows per table.
No calculated columns, no complex Delta types
Calculated columns are not supported on Direct Lake tables. Binary and GUID columns need casting to string. Handle transformations upstream in Spark, SQL or a Dataflow, not in the model.
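The kind of cast you would push upstream, shown here in plain Python purely for illustration; in practice this transformation runs in Spark, SQL or a Dataflow before the Delta table is written, and the column names are made up:

```python
import uuid

# Illustrative upstream transformation: Direct Lake tables cannot carry
# binary or GUID columns, so cast them to strings before writing the
# Delta table. Toy rows; the real job belongs in Spark or SQL.

def prepare_row(row):
    out = dict(row)
    out["customer_id"] = str(row["customer_id"])   # GUID -> string
    out["doc_hash"] = row["doc_hash"].hex()        # binary -> hex string
    return out

row = {
    "customer_id": uuid.UUID("12345678-1234-5678-1234-567812345678"),
    "doc_hash": b"\x0f\xa1",
    "amount": 42.0,
}
clean = prepare_row(row)
print(clean["customer_id"])   # '12345678-1234-5678-1234-567812345678'
print(clean["doc_hash"])      # '0fa1'
```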
Delta tables need to be well maintained
Too many small Parquet files or badly shaped row groups will hurt performance. Run OPTIMIZE and V-Order on your tables on a regular schedule.
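Why file count matters, in a toy cost model: every file opened pays a fixed overhead on top of the rows read, and compaction (what OPTIMIZE does) amortises that overhead across far fewer, larger files. All numbers and names below are illustrative:

```python
# Toy cost model for the small-file problem. Opening a file costs a
# fixed overhead on top of the rows it holds; OPTIMIZE-style compaction
# rewrites many small files into a few large ones. Illustrative only.

PER_FILE_OVERHEAD = 100   # hypothetical fixed cost per file open
PER_ROW_COST = 1          # hypothetical cost per row scanned

def scan_cost(file_sizes):
    return sum(PER_FILE_OVERHEAD + rows * PER_ROW_COST for rows in file_sizes)

def compact(file_sizes, target=10_000):
    """Merge files until each holds roughly `target` rows."""
    total = sum(file_sizes)
    n_files = max(1, total // target)
    return [total // n_files] * n_files

small = [100] * 500                      # 500 tiny files, 50k rows total
compacted = compact(small)               # 5 files of 10k rows each

print(scan_cost(small))                  # overhead dominates
print(scan_cost(compacted))              # mostly row cost
```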
No cross-region
The semantic model has to run in the same Fabric region as the lakehouse or warehouse it reads from. If you need to bridge regions, you work with shortcuts or separate models.
Object and row-level security behave differently
On Direct Lake on OneLake, SQL-based RLS is not applied; you have to define RLS in the semantic model itself. On Direct Lake on SQL, RLS works through the SQL endpoint but queries then fall back to DirectQuery, which affects performance.