Hey everyone, this is video FIVE in our Power BI to Fabric series. You can download the data engineering notebook used in this example here: www.skool.com/microsoft-fabric/classroom/c75b239c?md=939ef9bdae2145389f6d7244c622349f
These videos are really good. Haven't watched them all yet, but they are on my to-do list! Many thanks!
Thanks for watching and thanks for the feedback... there's lots to catch up on!! 😂 And also lots more to come in the future 😊🙌
The series Will is putting together here is indeed very good! I dare say it's the very best of all the "getting started with Fabric" content out there.
@hansvetters8026 Very kind of you to say, thanks Hans! The next one is out on Friday 🙌🙌
This is another great one, continuing my journey of going through this series. Many thanks. I see you as an influencer in this domain of introducing MS Fabric on YT; soon you will be a top voice.
Thanks for watching 🙌🏽
Great video, Will. I thought I was done for now, but your teaser for the next video was too compelling. Off I go to that one.
Thanks for watching!
Great video! Thank you for sharing your knowledge and experience.
No worries, thanks for watching!!
Thanks!
Thanks a lot for your support 😃🙏
Please explain the date table hierarchy in Direct Lake.
Hey there, enjoying your videos on Fabric! In this case I believe it would be important to mention that any user interaction consumes CUs from your capacity. So depending on the number of users and their level of interaction, Direct Lake can be quite expensive in CU terms compared to Import mode.
Hey thanks for the comment! Really good point, thanks for drawing my attention to that. It's not something I've investigated in detail yet (CU usage of Direct Lake), but definitely will in the future!
Thanks for your videos!!! They are all really useful in my job.
Just one question: is there a way to create additional variables/columns in Power Query if we use OneLake / lakehouse?
Will, thank you for sharing your knowledge!
Great videos! Are you able to use a shortcut as a source for a semantic model or do you need to move the data out of a shortcut first? Shortcuts don’t show up as tables when I try to pull them into a new model.
Yes, I am able to bring shortcut data into my semantic model. Are you creating a new semantic model (and then clicking refresh when you get the popup to select the tables to include in the semantic model)?
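If it helps to sanity-check that the shortcut is landing in the right place, you can also look at it from a notebook attached to the lakehouse. A quick sketch (the table name "sales" is just a placeholder for whatever your shortcut is called):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # already provided for you in a Fabric notebook

# Shortcut tables in the Tables area are listed alongside native Delta tables
print([t.name for t in spark.catalog.listTables()])

# And they read exactly like a native table
df = spark.read.table("sales")
df.printSchema()

If the shortcut shows up there but not in the semantic model editor, the refresh button in the table picker is usually what's missing.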
Can one create a semantic model based on tables from external sources such as an on-prem database, Azure SQL, or Synapse workspaces?
Hello Will, thanks for the videos. I have two questions:
1- If, in the semantic model itself, you created a table using DAX, for example a date table using CALENDAR, would Direct Lake fall back to DirectQuery?
2- I am having a problem when creating a semantic model from 2 large Delta tables (billions of records, yes billions not millions). I am getting a "can't add tables" error saying the connection timed out, and I am not sure why. I thought the semantic model would not be creating a duplicate of the data that already exists in the lakehouse?
Thanks in advance
What permissions do people need if they want to see the data in a report that uses Direct Lake?
You can read more about how the Fabric permissions model works here: learn.microsoft.com/en-us/fabric/security/permission-model
Hello, really nice video! Do you know if there is a way to transform the data of that semantic model in a Power Query-like way, or are we obliged to transform the data upstream?
Thanks - in Fabric, we have Dataflow Gen2, which I think is what you're asking about? You can use Power Query there to transform data before Power BI semantic modelling.
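If you'd rather keep the transformation upstream in a notebook instead of Dataflow Gen2, a PySpark sketch looks roughly like this - the table name raw_orders and the column names are all hypothetical, just to show the shape of it:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()   # provided for you in a Fabric notebook

orders = spark.read.table("raw_orders")                           # hypothetical source table in the lakehouse
orders_clean = (
    orders
    .withColumn("order_date", F.to_date("order_timestamp"))       # derive a proper date column
    .withColumn("net_amount", F.col("amount") - F.col("discount"))
    .dropDuplicates(["order_id"])
)

# Overwrite the curated Delta table that the Direct Lake semantic model points at
orders_clean.write.format("delta").mode("overwrite").saveAsTable("orders")

Either way the principle is the same: shape the data before it reaches the semantic model, because Direct Lake reads the Delta tables as-is.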
Hey Will, how much have you tried Fabric with a paid capacity versus trial? I run into a lot of snafus when I use my paid capacity, where it says I need to upgrade to trial, though it feels like that should be a downgrade! Any similar experiences?
Hmm that doesn’t sound right, I would raise a support ticket with Microsoft - haven’t heard of that before. Mostly I’ve been using trial capacities with some paid capacity to test features. Haven’t experienced what you’re describing
I am facing an error when I want to copy a semantic model to another workspace and attach it to the new workspace's lakehouse, but there isn't an option to download the semantic model, so do I have to manually re-enter the relationships in the new workspace?
Can someone please help?
Try this: learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/create-rules?tabs=new
I can't create a semantic model in a workspace that doesn't have a lakehouse, right? And if I have 2 lakehouses in 1 workspace, I can't connect to tables from both when creating a semantic model? I'm guessing I should avoid creating an architecture based on this - thoughts?
On your first question: you can create semantic models from a Lakehouse or a Data Warehouse (KQL databases are a bit different atm); you need at least one.
Q2: if you have two lakehouses and you want to use Direct Lake mode, you will need to shortcut the tables you need from one lakehouse into the other, then build your model from there - there's a rough sketch of scripting that below.
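You'd normally create the shortcut through the lakehouse UI (New shortcut, then Microsoft OneLake), but if you want to script it, the OneLake shortcuts REST API can do it. This is only a sketch from my side - all the IDs are placeholders and the exact endpoint and payload shape should be double-checked against the Fabric REST API docs:

import requests

token = "<bearer token with Fabric API scope>"                     # placeholder - obtain via azure-identity or similar
target_ws, target_lh = "<workspace-id>", "<lakehouse-id>"          # lakehouse the Direct Lake model will sit on
source_ws, source_lh = "<other-workspace-id>", "<other-lakehouse-id>"

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{target_ws}/items/{target_lh}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "Tables",            # put the shortcut in the Tables area so it shows up as a table
        "name": "sales",             # hypothetical table name
        "target": {"oneLake": {"workspaceId": source_ws, "itemId": source_lh, "path": "Tables/sales"}},
    },
)
resp.raise_for_status()

Once the shortcut lands in the Tables area, it appears as a regular table when you build the Direct Lake model.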
What is the benefit of Direct Lake on data imported and transformed once a day?
If it's only once per day, and your dataset is small enough to use import mode, probably best just to use that 👍 I talk a bit more about choosing between them here ruclips.net/video/1KK3UKWkcN8/видео.htmlsi=td0uYkvkgYL-cUt_&t=1186
Thanks so much for this. Did you make it to FabCon, Will?
Glad you enjoyed! I didn’t make it this year, but the photos and announcements looked amazing, maybe next year ☺️ did you go?
@LearnMicrosoftFabric I did - so much of the reason we are getting through the initial steps of our Fabric journey is owed to your content. Thank you!! Hope to see you next year!
⏳
⌛
Thanks for watching!!