Great Video!
Very nicely explained.. you now have a new subscriber. keep creating more content.
Great explanation! Subscribed immediately
Thanks for the great explanation! Does this mean that 15 minutes is as low as we can get in terms of latency? What solutions would you recommend if the requirement is to detect data changes in the sub-minute (or ideally a couple of seconds) range? Thanks
Yes, 15 minutes is the lowest latency here for Delta merging. For sub-minute latency you could look at the normal CSV export process, but even then Microsoft states "near-real time", which could mean up to a few minutes before any changed data in Dynamics is available in Synapse for querying.
Good afternoon, I am having problems developing this process. I have done everything, but the sync status in my Azure Synapse Link goes from "Initial sync in progress" to "Error" without giving any further information. If I go to my data lake, the selected table is inside it, but in CSV format, not Delta. The only difference is that the connection is from Dynamics F&O. Do you think the problem could come from LCS Dynamics F&O? Thanks in advance.
I have exactly the same problem. The CSVs are loaded successfully to the data lake, but the Spark job fails when converting to Delta format.
Why is the Spark pool not being picked up in our setup? It's in the same workspace and resource group.
Very nice video! I was looking at your channel to see if I could find a way to set up the "BYOL" concept through a Synapse Link. According to Microsoft TechTalks, it should be possible to export Dynamics tables into the data lake without an Analytics workspace. I even saw it briefly during one of the TechTalks, however they never explained it in detail.
When I tried it, my Finance and Operations tables weren't visible unless I chose the Analytics workspace and a Spark pool.
I'm finding the Microsoft documentation extremely confusing regarding this.
Any ideas?
Great explanation.
When I select my Spark pool, the storage account dropdown is empty. Otherwise, the storage account dropdown is populated. Any idea?
Thanks for the great video. I have a requirement to get this data into Azure SQL DB: CRM -> Synapse Link -> Microsoft Fabric -> SQL. Does this make sense?
That's a lot of steps. If you just want the data in Azure SQL DB, you can configure the Dataverse export to Azure Data Lake and then import into Azure SQL DB: learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines?tabs=synapse-analytics
In the documentation Microsoft mentions: "For the Dataverse configuration, append-only is enabled by default to export CSV data in appendonly mode. But the delta lake table will have an in-place update structure because the delta lake conversion comes with a periodic merge process." Does that mean that when we delete a row in Dataverse, the latest version of the Delta table has no trace of that row? If so, do older versions of the Delta file still contain the deleted row, or does the 'once per day optimize job' remove that history?
In append-only mode, a flag is added to the destination table indicating whether the source row has been deleted. The row is not hard-deleted from the Delta tables.
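To make that behaviour concrete, here's a minimal plain-Python sketch of the idea (purely illustrative, not the actual Synapse Link merge job; the `IsDelete` column name reflects how Dataverse flags soft deletes in append-only mode, and the sample rows are made up):

```python
# In append-only mode, every change lands as a NEW row in the change log;
# a delete lands as a row flagged IsDelete=True instead of removing anything.

def latest_state(append_log):
    """Collapse an append-only change log into the current table state,
    keeping soft-deleted rows (still present, just flagged)."""
    state = {}
    for row in append_log:      # rows arrive in commit order
        state[row["Id"]] = row  # a later version overwrites the earlier one
    return list(state.values())

log = [
    {"Id": 1, "name": "Contoso",  "IsDelete": False},
    {"Id": 2, "name": "Fabrikam", "IsDelete": False},
    {"Id": 1, "name": "Contoso",  "IsDelete": True},   # soft delete of Id 1
]

current = latest_state(log)
deleted_ids = [r["Id"] for r in current if r["IsDelete"]]
print(deleted_ids)  # [1] -> Id 1 is still in the table, only flagged as deleted
```

So a downstream query that wants "live" rows only has to filter on the flag; the deleted row itself is never physically removed by the merge.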
@DatahaiBI Does this mean that if we export data in Delta Lake format, we won't have a history of records available in Delta Lake?
If something is deleted, can I still query it from Delta Lake?
How can I use the time travel feature of Delta Lake?
My requirement is to query all the historical data. Will exporting to Delta Lake format provide this feature or not?
@nishantshah38 Yes, exporting to Delta will give you the features of Delta out of the box. However, part of the Synapse Link process is to run daily OPTIMIZE and VACUUM jobs to remove "old" data; this defaults to a 7-day retention period.
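To illustrate why that retention period limits time travel, here's a small plain-Python simulation (not Delta itself; the 7-day window matches the default above, and the version numbers and timestamps are invented for the example):

```python
# Delta time travel reconstructs old versions from old data files referenced
# in the transaction log. VACUUM physically removes files older than the
# retention window, so versions beyond that window can no longer be read.

from datetime import datetime, timedelta

RETENTION = timedelta(days=7)  # Synapse Link's default retention period

def versions_still_queryable(version_timestamps, now):
    """Return the versions whose underlying files survive a VACUUM at `now`."""
    return [v for v, ts in version_timestamps.items() if now - ts <= RETENTION]

now = datetime(2024, 1, 15)
history = {
    0: datetime(2024, 1, 1),   # older than 7 days -> files vacuumed away
    1: datetime(2024, 1, 10),
    2: datetime(2024, 1, 14),
}
print(versions_still_queryable(history, now))  # [1, 2]
```

In Spark you'd read an older version with something like `spark.read.format("delta").option("versionAsOf", 1).load(path)`, but only for versions whose files are still inside the retention window, so with the default daily VACUUM you can't rely on Delta time travel for history beyond about 7 days.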