I love Patrick and Adam. But is there anything in Fabric that doesn't feel complicated? So many options = too much confusion.
Options is the key word. Coming from an analyst's view, it's great to have all these lakehouse/warehouse and now SQL DB options, but where to start and what to choose?
Too many cooks in the kitchen. Their product roadmap needs to get tightened up instead of shoehorning all these products together with a wrapper and calling it a framework.
Your semantic model has a diamond next to it, which usually indicates a Premium feature. Is all of this a Premium service? (It would be nice if these videos were clear about what requires Premium, so folks don't get all excited about a cool feature only to find out it's Premium, which generally breaks the bank for smaller companies.)
Thanks for all the videos!
If a semantic model takes 30 minutes to refresh, how much time is added by enabling this feature? Naturally it takes time to write to OneLake. Does this mean the semantic model refresh will be slower?
I'd love to know how to use Fabric Direct Lake mode with Power BI embedded reports, as I don't think it is possible. Thanks
Hi,
Can RLS be created once in the data lake and then applied to Power BI reports, instead of defining RLS for every single PBIX file?
If not, what are the alternative ways?
Hi!
I followed the steps, but when I downloaded OneLake File Explorer it was empty. Do you have any idea why that might be?
First, make sure OneLake File Explorer is running. Once it is, try right-clicking the folder and choosing Sync from OneLake.
So is it also possible to use this with macOS?
This is a great feature indeed, but I prefer consuming semantic models through Semantic Link for two reasons: you get access to the full business logic since you can call Measures, and you don't replicate the data and increase the OneLake storage cost.
But the beauty of Fabric is that everyone can pick what suits them best.
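For anyone curious, here is a minimal sketch of what calling a measure through Semantic Link can look like in a Fabric notebook. The model name "Sales Model", measure "Total Sales", and column "Customer[Country]" are placeholders, not anything from the video:

import sempy.fabric as fabric  # semantic-link package, available in Fabric notebooks

# Evaluate an existing measure grouped by a column - the business logic stays
# in the model and nothing is replicated into OneLake storage.
df = fabric.evaluate_measure(
    dataset="Sales Model",                  # placeholder semantic model name
    measure="Total Sales",                  # placeholder measure name
    groupby_columns=["Customer[Country]"],  # placeholder grouping column
)
print(df.head())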
Consuming from Semantic Link means you access the model through a DirectQuery connection, right? Is there a way of changing it to Import mode?
@pabloaschieri305 Semantic Link, you could say, is a library / API collection that lets you establish a connection between a semantic model and the data science / data engineering workloads of Fabric. Until now you could do something similar by calling the REST API or using Power Automate to query the semantic model with DAX code. Semantic Link does the same but within Fabric notebooks, so the data can be used by other personas - data engineers / scientists. The storage mode set on the semantic model doesn't matter, as far as I know.
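To make that concrete, a rough sketch of running a DAX query against a semantic model from a Fabric notebook with Semantic Link - the model name and the DAX itself are placeholders, and the same query could also be sent through the executeQueries REST API instead:

import sempy.fabric as fabric

# Placeholder DAX query - swap in your own tables and measures.
dax = """
EVALUATE
SUMMARIZECOLUMNS('Date'[Year], "Sales", [Total Sales])
"""

# Returns a pandas DataFrame that data engineers / scientists can work with directly.
result = fabric.evaluate_dax(dataset="Sales Model", dax_string=dax)
print(result)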
Geaux Tigers!!!