Thanks for the explanation for us mere mortals trying to keep up with all of these advancements!
Happy to help!
Hey Cris, can you cover more about how this new SQL database is stored in OneLake? Is it Delta format? And do we need to apply the same "housekeeping" we'd do for a regular Azure SQL database to this new one?
Great question. The SQL DB is in whatever format Azure SQL keeps it in. The warehouse side, as I understand it, is kept in Delta format and stays in sync with the SQL DB entirely in the backend, all without you having to do anything.
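For anyone curious what that Delta copy looks like in practice, here's a minimal sketch of reading it from a Fabric notebook with PySpark. The workspace, item, and table names in the path are placeholders, not the actual OneLake path for your database:

```python
# Minimal sketch (Fabric notebook, PySpark): reading the Delta copy that Fabric
# keeps in OneLake for a SQL database. The workspace/item/table names below are
# placeholders -- check the actual OneLake path for your own database item.
delta_path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyFabricDb.SQLDatabase/Tables/dbo/Customers"  # hypothetical path
)

# The 'spark' session is provided automatically in Fabric notebooks.
df = spark.read.format("delta").load(delta_path)  # read-only Delta replica
df.show(5)

# The OLTP side stays in SQL Server format; this Delta copy is maintained by
# Fabric in the background, so no VACUUM/OPTIMIZE housekeeping is needed from us.
```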
Great video! I'd rather connect my Power BI import mode semantic models to the OLTP Fabric Database vs the Fabric Warehouse. I have found that Power BI processing time is 15-50% faster using the OLTP DB (and no latency from the Delta conversion).
Great tip! I will have to test that out!
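If you want to reproduce that comparison yourself, here's a rough sketch that times the same query against both SQL endpoints with pyodbc. The server names, database, and table are placeholders, not real endpoints:

```python
# Rough sketch: time one query against the Fabric SQL database endpoint and the
# Warehouse endpoint. All connection details and the query are placeholders.
import time
import pyodbc

QUERY = "SELECT COUNT(*) FROM dbo.Sales"  # hypothetical table

def time_query(server: str) -> float:
    """Run QUERY against the given SQL endpoint and return elapsed seconds."""
    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database=MyDb;"            # placeholder database name
        "Authentication=ActiveDirectoryInteractive;"  # interactive Entra ID sign-in
        "Encrypt=yes;"
    )
    start = time.perf_counter()
    conn.cursor().execute(QUERY).fetchall()
    conn.close()
    return time.perf_counter() - start

for label, server in [
    ("Fabric SQL DB", "myfabricdb.database.fabric.microsoft.com"),        # placeholder
    ("Warehouse", "mywarehouse.datawarehouse.fabric.microsoft.com"),      # placeholder
]:
    print(f"{label}: {time_query(server):.2f}s")
```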
Thanks for the video. A great feature to add to Fabric. To me, the downside is yet another workload competing for a limited pool of Capacity Units. Sure, we can upgrade a capacity to cope with the additional load, but that roughly doubles the spend. It sort of feels like we will need to isolate a Fabric DB in its own capacity.
Yeah, we are just going to have to have more capacities. The upside is that every capacity is multi-use and already integrated with everything else, which lets you get the most out of your spend.
Maybe it could be used with PPU?
I was just thinking about this today... do Datamarts continue to get replaced by the Warehouse, or should the recommendation now be Databases? I've got to think on this.