Want to learn more about Semantic Link? I just released a new 1-hour workshop on Semantic Link: ruclips.net/video/Xj0AnZ8qT58/видео.html
A big thanks to Markus and his team for this one!
How will you be using Semantic Link?! So many opportunities!
This is very exciting. Lots of possibilities. Perhaps a slightly prosaic use-case, but I can see this coming in very handy for documenting our semantic models.
It's as if you are reading from my list of future videos I am creating on Semantic Link, Simon! I think documenting Power BI models is a great use case - watch this space.
Very interesting feature... it's indeed going to be a competitive advantage for Microsoft.
It's so useful, and they're adding even more capability to it regularly too. I need to record an update video... 🙌
How can I use Semantic Link in a Python script outside of the Fabric environment?
Last time I checked, I think it was a no, because the Fabric notebook handles all the authentication for you.
@LearnMicrosoftFabric We can create a service principal and authenticate, right?
Or is there any way that I can access semantic model data outside of the Fabric environment using programming languages?
Thanks in advance
Hi Will. I really like the format of the video and how you organize it. It was very valuable, thank you.
Thanks for watching!
Hey, how do I export these tables to CSV for documentation purposes? I am
unable to export these DataFrames in any format.
Not sure I understand the question. Are you looking to export a CSV of the data returned by Semantic Link? You should be able to write out the DataFrame using pandas' df.to_csv('Files/file_path').
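A minimal sketch of that export (the DataFrame here is hard-coded so it runs anywhere; in a Fabric notebook the frame would come from Semantic Link, and with a Lakehouse attached you would write to the mounted Files path instead of a local one):

```python
import pandas as pd

# Stand-in for a DataFrame returned by Semantic Link
df = pd.DataFrame({
    "Table": ["Sales", "Customers"],
    "Rows": [120000, 5400],
})

# In a Fabric notebook with a default Lakehouse attached, a path like
# "/lakehouse/default/Files/model_tables.csv" lands the file in Files;
# locally, any path works.
df.to_csv("model_tables.csv", index=False)

# Read it back to confirm the round trip
check = pd.read_csv("model_tables.csv")
print(check.shape)  # (2, 2)
```

From there the CSV can be picked up by whatever documentation tooling you use.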
Great session indeed! Really found it interesting, and I'm going to follow each video you post related to this in future.
Thanks Shashi! You might enjoy this follow-up video: ruclips.net/video/Xj0AnZ8qT58/видео.html
Great stuff, Will. Appreciate all your work on this stuff.
Hey thanks for the comment Kenneth! And no worries - I actually really enjoy making them and exploring all the new ways to use these amazing tools!
Thank you for this valuable review of Semantic Link.
IMHO: Semantic Link feels a bit like a solution in search of problems that should have already been resolved in ETL. You do your data validations there, you do your fact and dimension value validations and violation alerts... so this should all be done long before you ever get your data into your data warehouse. As for updating your datasets, this can easily be done in a PowerShell script using a couple of REST APIs, and you then execute a sproc that runs the PowerShell script to update your dataset.
Getting metadata on your datasets... yeah, I see some value here that alleviates some manual work.
The biggest benefit that I see is allowing other data consumers access to your measures, and of course helping these consumers get insight into the DAX that created the measures.
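For comparison, the dataset refresh described above in PowerShell can also be triggered from Python. The `refreshes` route below is the real Power BI REST API endpoint for refreshing a dataset in a workspace; the workspace/dataset IDs are placeholders, and acquiring a service-principal token is out of scope here:

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(workspace_id: str, dataset_id: str) -> str:
    """Build the Power BI REST API dataset-refresh endpoint URL."""
    return f"{API_ROOT}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(workspace_id: str, dataset_id: str, token: str) -> None:
    """POST a refresh request; the bearer token would come from your
    service principal (token acquisition omitted in this sketch)."""
    req = urllib.request.Request(
        refresh_url(workspace_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # the API returns 202 Accepted on success

# Placeholder IDs only; no network call is made here
print(refresh_url("<workspace-id>", "<dataset-id>"))
```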
Hey thanks for your thoughtful comments!
You're right that there are many places in an end-to-end pipeline where we should be doing data validation, with most of the validation done as far upstream as possible. However, I do see a place for Semantic Link in validating that semantic models are built and functioning as designed (how or where else would you validate this?).
For example, for those measures you mentioned written by a Power BI dev, it would be nice to consistently monitor their outputs, and cross-table relationships too.
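As a sketch of that kind of measure monitoring: in Fabric the frame below would come from Semantic Link's `evaluate_measure`, but here it is hard-coded so the example runs anywhere, and the column names are made up for illustration:

```python
import pandas as pd

# Stand-in for the output of Semantic Link's evaluate_measure
measure_out = pd.DataFrame({
    "Region": ["EMEA", "Americas", "APAC"],
    "Total Sales": [1_200_000.0, 980_000.0, -50.0],
})

# Simple model-level checks: no missing groupby values, no negative totals
assert measure_out["Region"].notna().all(), "missing Region values"
violations = measure_out[measure_out["Total Sales"] < 0]
print(f"{len(violations)} violation(s) found")  # 1 violation(s) found
```

Run on a schedule, checks like this flag a measure whose output drifts out of its expected range before report consumers notice.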
Great video 😊. I would very much like to see more videos on the data validation scenarios 👍
Thanks! Will definitely be covering that very soon ☺️
The biggest benefit I see from this is the ability to write tests in Python on your dataset.
That is absolutely where I see the biggest benefit too! You should watch this video next, where I talk in detail about data validation strategies using Semantic Link: ruclips.net/video/Xj0AnZ8qT58/видео.html
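Such a test can be as small as a referential-integrity check in pandas. The tables below are toy stand-ins; in Fabric you would pull them with Semantic Link's `read_table` instead of building them inline:

```python
import pandas as pd

# Toy fact and dimension tables standing in for read_table output
dim_product = pd.DataFrame({"ProductID": [1, 2, 3]})
fact_sales = pd.DataFrame(
    {"ProductID": [1, 2, 2, 4], "Amount": [10, 20, 5, 7]}
)

# Test the relationship: every fact row should match a dimension key
merged = fact_sales.merge(
    dim_product, on="ProductID", how="left", indicator=True
)
orphans = merged[merged["_merge"] == "left_only"]
print(f"{len(orphans)} orphaned fact row(s)")  # 1 orphaned fact row(s)
```

Wrapped in assertions, checks like this can run in a notebook on a schedule and fail loudly when a relationship in the model breaks.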
Great videos. Cheers!
Thanks for watching!
This is a great feature. It would be great to access datasets and measures with no limitations. Previously, when we wanted to use data from a Power BI matrix, we needed to export a CSV, and there were row limitations. Now we will be able to access it without exporting and use it in a loop.
Can you do a deep-dive video on DAX for the above scenario?
Thanks for the video.
Hey! Yes, I’ll be covering these aspects in more detail soon ☺️
Exciting video, thanks a lot! It would be great to see a video on EDA (exploratory data analysis) using a Fabric Notebook and Power BI, covering the analysis we can currently do with Pandas Profiling or SweetViz, and maybe then take it one step further and create a machine learning model (selecting the right model for the right use case, training/testing, etc.).
Thanks Andy, yeah, absolutely, I think there's a lot of scope to do EDA directly on Power BI datasets now. I'm on the case :) And yes, I definitely plan on doing an ML video: getting data from Power BI, training a model, writing predictions back to the Lakehouse, and then integrating them into your Power BI visualisations. Thanks for your comment!
Awesome!
Exciting one.
It sure is! Have you used Semantic Link yet?