If you go the dbt-fabric + unified star schema route, you get lineage as a bonus; otherwise you end up with a lot of Python code and need very good housekeeping to find where the actual code for a given piece of functionality lives. The latest approach is to put all the code you wrote into a metadata table and update it whenever the code changes. The key is how the single source of data is built.
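The "code in a metadata table" housekeeping idea could be sketched roughly like this. This is a hypothetical illustration, not a dbt-fabric feature: the table name `code_metadata`, its columns, and the `register_code` helper are all made up, and a hash is used to detect when stored code has changed.

```python
import hashlib
import sqlite3

# Illustrative metadata store (sqlite in-memory stands in for a warehouse table).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE code_metadata (
        name TEXT PRIMARY KEY,
        source TEXT NOT NULL,
        source_hash TEXT NOT NULL
    )
""")

def register_code(name: str, source: str) -> bool:
    """Insert or refresh the stored source; return True if anything changed."""
    digest = hashlib.sha256(source.encode()).hexdigest()
    row = conn.execute(
        "SELECT source_hash FROM code_metadata WHERE name = ?", (name,)
    ).fetchone()
    if row is not None and row[0] == digest:
        return False  # code unchanged, metadata left alone
    conn.execute(
        "INSERT INTO code_metadata (name, source, source_hash) VALUES (?, ?, ?) "
        "ON CONFLICT(name) DO UPDATE SET source = excluded.source, "
        "source_hash = excluded.source_hash",
        (name, source, digest),
    )
    conn.commit()
    return True

# First registration records the code; re-registering identical code is a no-op.
changed = register_code("build_dim_customer", "SELECT * FROM raw.customer")
```

The hash comparison is what keeps the table in sync with the codebase without rewriting unchanged rows.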
Just think of all the reverse ETL you can now do! 😍
This is really interesting. A much easier way to run data quality assertions comparing the Lakehouse with the PBI Dataset
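One way such a data quality assertion might look, as a minimal sketch: in Fabric you would presumably pull both sides with semantic link (e.g. `sempy.fabric.read_table`, assumed here); below, stand-in DataFrames are used so the comparison logic itself is runnable. The column names and the `assert_tables_match` helper are illustrative only.

```python
import pandas as pd

# Stand-ins for the Lakehouse table and the same table read from the PBI dataset.
lakehouse_sales = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
dataset_sales = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

def assert_tables_match(lh: pd.DataFrame, ds: pd.DataFrame, key: str) -> dict:
    """Compare row counts, key sets, and an amount checksum between the two sides."""
    return {
        "row_count_match": len(lh) == len(ds),
        "key_match": set(lh[key]) == set(ds[key]),
        "amount_sum_match": lh["amount"].sum() == ds["amount"].sum(),
    }

report = assert_tables_match(lakehouse_sales, dataset_sales, key="order_id")
```

Checksums and key-set comparisons like these are cheap first-line checks before reaching for full row-by-row diffs.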
Loved it. Would be interesting to see if it works in Databricks.
Yeah, same here! Is there a way to connect Azure Databricks directly to a FabricDataFrame?
It's the Unified Star Schema, aka the Puppini bridge, that does the work behind semantic link.
Is this functionality available in Azure Databricks, Simon?
Would this work for a model with direct connections against something like DBSQL?
Buh-bye Metric Flow 😂
The job you've done in this video is insufficient. Why didn't you show every step instead of cutting the video and doing lots of talking?