Configuring Delta Lake Export in Synapse Link for Dataverse

  • Published: 25 Oct 2024

Comments • 17

  • @samial2341
    @samial2341 15 days ago

    Great Video!

  • @pankajnakil6173
    @pankajnakil6173 1 year ago

    Very nicely explained. You now have a new subscriber, keep creating more content.

  • @JonathanCrotz
    @JonathanCrotz 8 months ago

    Great explanation! Subscribed immediately

  • @radekou
    @radekou 1 year ago +3

    Thanks for the great explanation. Does this mean that 15 minutes is as low as we can get in terms of latency? What solutions would you recommend if the requirement is to detect changes in data in the sub-minute (or ideally couple-of-seconds) range? Thanks

    • @DatahaiBI
      @DatahaiBI 1 year ago

      Yes, 15 minutes is the lowest latency here for Delta merging. For sub-minute latency you could look at the normal CSV export process, but even then Microsoft states "near real time", which could mean up to a few minutes before any changed data in Dynamics is available in Synapse for querying

  • @marcosmartin3148
    @marcosmartin3148 9 months ago +1

    Good afternoon, I am having problems developing this process. I have done everything, but the sync status in my Azure Synapse Link goes from "Initial sync in progress" to "Error" without giving any further information. If I go to my data lake, the selected table is inside it, but in CSV format, not Delta. The only difference is that the connection is from Dynamics F&O. Do you think the problem could come from LCS Dynamics F&O? Thanks in advance.

    • @artem77788324
      @artem77788324 7 months ago

      I have exactly the same problem. CSVs are loaded successfully to the data lake, but the Spark job fails when converting to Delta format.

  • @marvinalmarez4458
    @marvinalmarez4458 4 months ago

    Why is the Spark pool not being picked up in our setup? It's in the same workspace and resource group.

  • @rwnemocsgo2542
    @rwnemocsgo2542 9 months ago

    Very nice video! I was looking at your channel to see if I could find a way to set up the "BYOL" concept through a Synapse Link. According to Microsoft TechTalks, it should be possible to export Dynamics tables into a data lake without an Analytics workspace. I even saw it briefly during one of the TechTalks, however they never explained it in detail.
    When I tried it, my Finance and Operations tables weren't visible to me unless I chose the Analytics workspace and a Spark pool.
    I'm finding the Microsoft documentation extremely confusing regarding this.
    Any ideas?

  • @bilalshafqat1634
    @bilalshafqat1634 5 months ago

    Great explanation.

  • @Blade17534
    @Blade17534 8 months ago

    When I select my Spark pool, the storage account dropdown is empty; when I don't, the storage account dropdown is populated. Any idea?

  • @sukumarm5926
    @sukumarm5926 6 months ago

    Thanks for the great video. If I have a requirement to get this data into Azure SQL DB: CRM -> Synapse Link -> Microsoft Fabric -> SQL DB. Does this make sense?

    • @DatahaiBI
      @DatahaiBI 6 months ago

      There are a lot of steps there. If you just want the data in Azure SQL DB, you can configure the Dataverse export to Azure Data Lake and then import into Azure SQL DB: learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines?tabs=synapse-analytics

  • @MarnixLameijer
    @MarnixLameijer 11 months ago

    In the documentation Microsoft mentions: "For the Dataverse configuration, append-only is enabled by default to export CSV data in appendonly mode. But the delta lake table will have an in-place update structure because the delta lake conversion comes with a periodic merge process." Does that mean that when we delete a row in Dataverse, the latest version of the Delta table has no trace of that record? If so, do older versions of the Delta files still contain the deleted record, or does the once-per-day optimize job remove that history?

    • @DatahaiBI
      @DatahaiBI 11 months ago

      In append-only mode a flag is added to the destination table which indicates whether the source row has been deleted. The row is not hard-deleted from the Delta tables.
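
      The soft-delete behaviour described above can be sketched in plain Python (a conceptual illustration only: the flag column name `IsDelete` and the sample rows are assumptions for this sketch, not real Dataverse output):

      ```python
      # Sketch: filtering soft-deleted rows out of an append-only Synapse Link export.
      # "IsDelete" is assumed as the delete-flag column name; rows are illustrative.
      rows = [
          {"accountid": "a1", "name": "Contoso",  "IsDelete": None},
          {"accountid": "a2", "name": "Fabrikam", "IsDelete": True},  # deleted in Dataverse
      ]

      # Active rows are those where the delete flag is not set.
      active = [r for r in rows if not r["IsDelete"]]
      print([r["name"] for r in active])  # -> ['Contoso']
      ```

      In a real query you would apply the same filter on the flag column so that downstream reports only see rows still present in Dataverse.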

    • @nishantshah38
      @nishantshah38 7 months ago

      @DatahaiBI Does this mean that if we export data in Delta Lake format, we won't have a history of records available in Delta Lake?
      If something is deleted, can I still query it from Delta Lake?
      How can I use the time travel feature of Delta Lake?
      My requirement is to query all the historical data. Will exporting to Delta Lake format provide this feature or not?

    • @DatahaiBI
      @DatahaiBI 6 months ago +2

      @nishantshah38 Yes, exporting to Delta gives you the features of Delta out of the box. However, part of the Synapse Link process is to run daily OPTIMIZE and VACUUM jobs to remove "old" data; this defaults to a 7-day retention period, so time travel is limited to that window.
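
      The retention point above can be sketched conceptually: each Delta table version is backed by data files, and VACUUM removes the files behind versions older than the retention period, so time travel only reaches versions inside that window. This is a plain-Python illustration, not real Delta Lake API calls; the dates and version numbers are made up.

      ```python
      # Conceptual sketch: why a daily VACUUM with 7-day retention limits time travel.
      # Each entry stands in for a Delta table version and its commit timestamp.
      from datetime import datetime, timedelta

      now = datetime(2024, 10, 25)
      versions = {  # version number -> commit timestamp (illustrative)
          0: now - timedelta(days=10),
          1: now - timedelta(days=5),
          2: now - timedelta(days=1),
      }

      retention = timedelta(days=7)  # the thread above states 7 days as the default

      def queryable_after_vacuum(versions, now, retention):
          """Versions whose underlying files survive VACUUM and stay time-travelable."""
          return sorted(v for v, ts in versions.items() if now - ts <= retention)

      print(queryable_after_vacuum(versions, now, retention))  # -> [1, 2]
      ```

      So for a requirement to query all historical data indefinitely, the Delta export alone is not enough; the history would need to be copied out before the retention window expires.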