Scenario-10: Copy multiple files from Blob Storage to Azure SQL Database

  • Published: 3 Jan 2025

Comments • 18

  • @KaraokeVN2005
    @KaraokeVN2005 9 months ago +1

    How can we implement an incremental load when we get a new file and the records in the SQL table need to be updated, continuing from the old data already imported? This is my use case.
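
A common answer to this (not shown in this scenario) is a high-watermark pattern: record the last value loaded, copy only rows newer than it, then advance the watermark. A minimal sketch in Python with pyodbc, assuming hypothetical table and column names (WatermarkTable, SalesData, SalesData_Staging, LastModified) and a staging table filled by the copy activity:

```python
import pyodbc

# Hypothetical names throughout: WatermarkTable, SalesData, SalesData_Staging,
# LastModified. The staging table is assumed to be filled by the copy activity.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<db>;UID=<user>;PWD=<password>"
)
cur = conn.cursor()

# 1. Read the watermark left by the previous run.
cur.execute("SELECT WatermarkValue FROM dbo.WatermarkTable WHERE TableName = ?",
            "SalesData")
old_watermark = cur.fetchone()[0]

# 2. Append only the rows newer than the watermark, continuing the old data.
cur.execute(
    """
    INSERT INTO dbo.SalesData (Id, Amount, LastModified)
    SELECT Id, Amount, LastModified
    FROM dbo.SalesData_Staging
    WHERE LastModified > ?
    """,
    old_watermark,
)

# 3. Advance the watermark so the next run starts where this one ended.
cur.execute(
    "UPDATE dbo.WatermarkTable SET WatermarkValue ="
    " (SELECT MAX(LastModified) FROM dbo.SalesData_Staging)"
    " WHERE TableName = ?",
    "SalesData",
)
conn.commit()
```

In ADF itself the same flow is usually built as Lookup (read the old watermark) → Copy (source query filtered by it) → Stored Procedure (advance it).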

  • @Mayank-gd6jb
    @Mayank-gd6jb 1 year ago

    Excellent video sir

  • @sohaibshah1771
    @sohaibshah1771 8 days ago

    In this use case, do we need to map the source table's columns to the sink table's columns or not?

    • @saicloudworld
      @saicloudworld  7 days ago

      Hi, thank you for your question. We need to map the data types between source and sink.

    • @saicloudworld
      @saicloudworld  6 days ago

      Yes, we need to map data types from source to target (sink).
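
For reference, when an explicit mapping is needed, the copy activity stores it as a TabularTranslator. A minimal sketch of that JSON written as a Python dict; the column names here are hypothetical:

```python
import json

# Explicit column mapping for a copy activity (TabularTranslator).
# Column names here are hypothetical examples.
translator = {
    "type": "TabularTranslator",
    "mappings": [
        {"source": {"name": "emp_id"},   "sink": {"name": "EmployeeId"}},
        {"source": {"name": "emp_name"}, "sink": {"name": "EmployeeName"}},
        {"source": {"name": "salary"},   "sink": {"name": "Salary"}},
    ],
    # Let ADF convert compatible types (e.g., string -> int) during the copy.
    "typeConversion": True,
}

print(json.dumps(translator, indent=2))
```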

  • @sohaibshah1771
    @sohaibshah1771 8 days ago

    How many source datasets and sink datasets do we need to create in this case?

    • @saicloudworld
      @saicloudworld  7 days ago

      We need to create a separate dataset for each file format.

    • @sohaibshah1771
      @sohaibshah1771 7 days ago

      @saicloudworld But here the file format is only CSV, so how many?

    • @saicloudworld
      @saicloudworld  6 days ago

      Hi, here we will use one source dataset and one sink dataset.
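
A single dataset can cover all the CSV files when the file name is a dataset parameter that each iteration fills in. A sketch of such a parameterized source dataset as a Python dict; the dataset, linked service, and container names are hypothetical:

```python
import json

# One DelimitedText dataset reused for every CSV file; the file name is a
# parameter supplied per iteration (e.g., from a ForEach over Get Metadata
# childItems). Dataset, linked service, and container names are hypothetical.
source_dataset = {
    "name": "SourceCsv",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "BlobStorageLS",
            "type": "LinkedServiceReference",
        },
        "parameters": {"fileName": {"type": "string"}},
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input",
                "fileName": {"value": "@dataset().fileName", "type": "Expression"},
            },
            "firstRowAsHeader": True,
        },
    },
}

print(json.dumps(source_dataset, indent=2))
```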

  • @nadeemrajabali3166
    @nadeemrajabali3166 1 year ago +1

    Great video. What if I want to delete all records from the table first and then write the new data? Also, I have 13 files with a large amount of data; some are around 100 MB. It works perfectly fine up to 6 files, but after that the pipeline gets stuck at 500,000 records. Any suggestions?

    • @amritpalsingh4913
      @amritpalsingh4913 1 year ago +1

      Add a TRUNCATE statement in the sink's pre-copy script. To raise copy activity throughput, increase the degree of copy parallelism and the DIUs.
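
For concreteness, both suggestions map to copy activity settings: the sink's preCopyScript empties the table before each load, and parallelCopies / dataIntegrationUnits raise throughput. A sketch of the relevant fragment; the table name and the numbers are illustrative, not tuned values:

```python
# Relevant fragment of a copy activity's typeProperties; the table name and
# the numbers are illustrative, not tuned values.
copy_type_properties = {
    "source": {"type": "DelimitedTextSource"},
    "sink": {
        "type": "AzureSqlSink",
        # Runs once, before any rows are written, so each load starts empty.
        "preCopyScript": "TRUNCATE TABLE dbo.TargetTable",
    },
    # Throughput knobs for large files.
    "parallelCopies": 8,          # degree of copy parallelism
    "dataIntegrationUnits": 16,   # DIUs allocated to this copy
}
```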

  • @sravanthiyethapu9970
    @sravanthiyethapu9970 2 years ago

    When I pass a parameter at the dataset level, the Get Metadata activity automatically asks for a value.
    Please help me fix this.
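
That prompt appears because any activity referencing a parameterized dataset must supply a value for each parameter. In the Get Metadata activity's JSON the value goes under the dataset reference; a sketch, assuming a parameterized dataset named SourceCsv (as above) where an empty file name lets it resolve to the folder:

```python
# Get Metadata activity referencing a parameterized dataset; the fileName
# value must be supplied here, otherwise the UI keeps asking for it. An empty
# string lets the dataset resolve to the folder so childItems can be listed.
get_metadata_activity = {
    "name": "GetFileList",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": {
            "referenceName": "SourceCsv",
            "type": "DatasetReference",
            "parameters": {"fileName": ""},
        },
        "fieldList": ["childItems"],
    },
}
```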

  • @gopavarammohankumar9053
    @gopavarammohankumar9053 2 years ago

    I have 11 collections in my Azure Cosmos DB for MongoDB, and I have 11 JSON files in my Azure Blob Storage container. I'm using a Data Factory copy activity to copy the JSON files from blob to the MongoDB API, but I'm only able to copy one file to one collection. I need to copy all the JSON files to their collections in the same way. How do I copy multiple JSON files to multiple collections using Data Factory?

    • @saicloudworld
      @saicloudworld  2 years ago

      Please refer to this: ruclips.net/video/gASkX3BFUcY/видео.html&feature=share
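
The usual shape of that solution is a ForEach over file/collection pairs, with both datasets parameterized and one Copy per pair. A sketch of the pipeline fragment; the dataset names, parameter names, and the filePairs pipeline parameter are all hypothetical:

```python
# ForEach that runs one parameterized copy per (file, collection) pair.
# Dataset names, parameter names, and the filePairs parameter are hypothetical,
# e.g. filePairs = [{"file": "users.json", "collection": "users"}, ...].
foreach_activity = {
    "name": "CopyEachJson",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.filePairs",
            "type": "Expression",
        },
        "activities": [{
            "name": "CopyJsonToCollection",
            "type": "Copy",
            "inputs": [{
                "referenceName": "BlobJson",
                "type": "DatasetReference",
                "parameters": {
                    "fileName": {"value": "@item().file", "type": "Expression"},
                },
            }],
            "outputs": [{
                "referenceName": "CosmosMongoCollection",
                "type": "DatasetReference",
                "parameters": {
                    "collectionName": {"value": "@item().collection", "type": "Expression"},
                },
            }],
            "typeProperties": {
                "source": {"type": "JsonSource"},
                "sink": {"type": "CosmosDbMongoDbApiSink"},
            },
        }],
    },
}
```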

  • @ranjansrivastava9256
    @ranjansrivastava9256 1 year ago +1

    I need to copy multiple files from ADLS to multiple tables in Azure SQL Server... :)

  • @danishthev-log2264
    @danishthev-log2264 1 year ago

    Instead of SQL Server, can we import from MySQL Workbench? Is that possible or not?