18. Copy multiple tables in bulk by using Azure Data Factory

  • Published: 8 Feb 2025
  • In this video, I discussed copying multiple tables in bulk from a SQL database to Azure Blob Storage using Azure Data Factory
    Link for Azure Databricks Play list:
    • 1. Introduction to Az...
    Link for Azure Functions Play list:
    • 1. Introduction to Azu...
    Link for Azure Basics Play list:
    • 1. What is Azure and C...
    Link for Azure Data factory Play list:
    • 1. Introduction to Azu...
    Link for Azure Data Factory Real time Scenarios
    • 1. Handle Error Rows i...
    Link for Azure LogicApps playlist
    • 1. Introduction to Azu...
    #Azure #ADF #AzureDataFactory
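The bulk-copy pattern discussed in the video typically starts with a Lookup activity that lists the tables to iterate over. A minimal sketch of such a query, using standard SQL Server metadata views (the query text itself is not quoted from the video):

```sql
-- Lookup activity query: list every user table (schema + name) to drive the ForEach
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';
```

A ForEach activity then iterates over the Lookup output, passing `@item().TABLE_SCHEMA` and `@item().TABLE_NAME` into parameterized source and sink datasets.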

Comments • 64

  • @diogodallorto1
    @diogodallorto1 3 years ago +5

    I've watched so many of his classes that now I can perfectly understand his English. Thank you, bro!

  • @samrattidke896
    @samrattidke896 3 years ago +2

    Awesome...you deserve 80k+ subscribers...👍👍

  • @RahulKumar-jg5ly
    @RahulKumar-jg5ly 3 years ago +1

    Superb explanation and presentation... I'm watching all of your videos. 👍👍👍

  • @BeingSam7
    @BeingSam7 3 months ago

    This was explained unexpectedly simply. Awesome, brother. After learning from you, maybe one day I will create my own channel called 'Bewafa Studies' LOL.. :)

  • @patrickmuaenah3733
    @patrickmuaenah3733 2 years ago +1

    Thanks so much for this, saved my week!

  • @sudipmazumdar5358
    @sudipmazumdar5358 2 years ago +1

    Great topic & very helpful for understanding several key activities. Thanks a lot :)

  • @rk-ej9ep
    @rk-ej9ep 2 years ago +1

    Clear explanation. 👍👍 Many thanks!

  • @sudheerjajjarapu6876
    @sudheerjajjarapu6876 3 years ago +3

    Scenario 1: Copy all table data from SQL DB to Azure Blob Storage. The output file name will be schema name + '_' + table name + '.csv'.
    1. Copy all tables in the Azure SQL DB.
    2. Add a condition: if a table has only one row, or zero rows, its data will not be copied.
    3. Copy the data to a Blob Storage path as a CSV file.
    In this scenario, how and where do we put the 2nd condition?

    • @bobbyj2492
      @bobbyj2492 3 years ago +1

      Use a Lookup inside the ForEach to count the number of rows in that table; if the count matches the condition, then copy.
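      A sketch of this suggestion, assuming a per-table Lookup inside the ForEach followed by an If Condition (the activity name `LookupRowCount` is illustrative, not from the video):

      ```sql
      -- Lookup query inside the ForEach: count rows in the current table
      SELECT COUNT(*) AS RowCnt
      FROM [@{item().TABLE_SCHEMA}].[@{item().TABLE_NAME}];
      ```

      The If Condition expression would then be something like `@greater(activity('LookupRowCount').output.firstRow.RowCnt, 1)`, with the Copy activity placed in the True branch.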

  • @Rafian1924
    @Rafian1924 3 years ago +1

    Really good man. Thanks a lot.

  • @varunvengli1762
    @varunvengli1762 11 months ago +1

    If we want to copy only a few tables and automate it, what's the method? No one on YouTube has explained this real-time scenario. Kindly explain.

    • @nigereast3796
      @nigereast3796 9 months ago

      I think you can use a stored procedure for this

  • @MsWishnew
    @MsWishnew 9 months ago

    Please discuss restartability. For example, if 10 out of 100 tables fail, I need to automatically restart only the failed ones. How would this scenario be handled? BTW, thanks for this tutorial :)

  • @acverse
    @acverse 3 years ago +1

    Very helpful! Thank you :)

  • @Montreal_powerbi_connect
    @Montreal_powerbi_connect 3 years ago +1

    Good one, thanks for sharing.

  • @vijaybodkhe8379
    @vijaybodkhe8379 1 year ago

    Thanks for Sharing🎉

  • @MaheshReddyPeddaggari
    @MaheshReddyPeddaggari 3 years ago +2

    Will you please make more real-time videos?
    That would be very helpful to new people.

  • @narra_rakesh_2021
    @narra_rakesh_2021 3 years ago +1

    Bro, make a video on "Copy data from SAP to Azure Data Lake using ADF".

  • @manjunathbn9513
    @manjunathbn9513 3 months ago

    If one of the tables has 1 billion records in it, what is the approach to copy that table?

  • @saketsrivastava84
    @saketsrivastava84 1 year ago

    How can we use a similar approach if we want to copy data from Snowflake to Azure SQL DB?

  • @vishalshrivastava1073
    @vishalshrivastava1073 1 year ago

    Once our pipeline starts, if any table throws an error while copying, how do we make sure it won't stop the others from being copied?

  • @anujgupta-lc1md
    @anujgupta-lc1md 3 years ago +1

    Same situation here; please explain incremental load or delta load.

    • @WafaStudies
      @WafaStudies  3 years ago

      Will do soon

    • @anujgupta-lc1md
      @anujgupta-lc1md 2 years ago

      @@WafaStudies Till now there's been no update on delta load and how to do it from Azure SQL tables to Blob.

  • @codeworld8981
    @codeworld8981 3 years ago +1

    Hi, I tried this, but the problem is that only one table got copied... When I run the pipeline, it's prompting for the table name, even though I have parameterized the table. Please can you clarify?

  • @prajvalsingh810
    @prajvalsingh810 2 years ago

    At 11:00 I am getting only the schema parameter, although I have created the table parameter as well.

  • @ajaiar
    @ajaiar 8 months ago

    When I click 'Preview data', it asks me to enter values for the schema name and table name.

  • @Offical_PicturePerfect
    @Offical_PicturePerfect 3 years ago

    Can we copy data from multiple different sources to ADLS using a single pipeline?

  • @ABDULSAMADKHAN2007
    @ABDULSAMADKHAN2007 3 years ago +1

    Rather than Blob Storage, can we insert the same table records into another table directly, without using a storage service?

  • @rathikavenkatesh2510
    @rathikavenkatesh2510 2 years ago

    I have a doubt: suppose a table is deleted in the source database, how do we handle that?

  • @MuhammadAzimi-n6i
    @MuhammadAzimi-n6i 1 year ago

    How can I also get column headers in the CSV files? I am able to get all the data, but the first row is not the column header.

  • @thesujata_m
    @thesujata_m 3 years ago

    How do I copy folders and files without changing the hierarchy structure, when I don't know the depth of the hierarchy?

  • @himanshutrivedi4956
    @himanshutrivedi4956 3 years ago +1

    Good 1..

    • @WafaStudies
      @WafaStudies  3 years ago +1

      Thanks 😊

    • @himanshutrivedi4956
      @himanshutrivedi4956 3 years ago

      @@WafaStudies I have just replied on your SCD2 video; if you add those 2 date columns, it would be great.

  • @souranwaris142
    @souranwaris142 2 years ago

    Hello sir, this is really nice. I have built my pipeline perfectly, thanks for this. Actually, I would like to ask one more thing: I want to add a trigger after this pipeline so that if data is updated in any of the tables, it gets updated in our tables as well, but only the new data, without changing past rows. Could you help me with this?

  • @RKTECH1021
    @RKTECH1021 2 years ago

    Hi Maheer, I found your channel looking for ADF help. Great content here: sequential series, one by one, for each area of ADF. I have been continuously watching the ADF videos one by one and learning a lot. I have one question: we have Dataverse folders in the Data Lake, and I want to load the files from those folders into an Azure SQL database. Each folder has CSVs partitioned on a monthly basis.

  • @priyaperfect9485
    @priyaperfect9485 3 years ago +1

    Lookup has a limitation of fetching 5000 rows. What can we do to fetch more than 5000 rows?

    • @WafaStudies
      @WafaStudies  3 years ago

      I will try to create a video on this soon and post it 🤗

    • @priyaperfect9485
      @priyaperfect9485 3 years ago

      @@WafaStudies 😊
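      The 5000-row (and 4 MB) cap on the Lookup activity is a documented ADF limit. One common workaround, sketched here under assumed names, is to page through the source inside an Until loop, keeping the current offset in a pipeline variable (`offset` below is an illustrative variable name):

      ```sql
      -- Lookup query for one page; the offset comes from an assumed pipeline variable
      SELECT TABLE_SCHEMA, TABLE_NAME
      FROM INFORMATION_SCHEMA.TABLES
      WHERE TABLE_TYPE = 'BASE TABLE'
      ORDER BY TABLE_SCHEMA, TABLE_NAME
      OFFSET @{variables('offset')} ROWS FETCH NEXT 5000 ROWS ONLY;
      ```

      After each pass, the loop increments the variable by 5000; the Until condition exits once a page returns fewer than 5000 rows.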

  • @bmaheshwarreddy4911
    @bmaheshwarreddy4911 Год назад

    Hi Maheer. Your channel really worthy one for ADF Developer community. Thank you for providing videos and suggesting developers. I have a small doubt. As per 18 scenario you are moving sql tables to blob. but i am trying to snowflake tables to blob which causes ab error "
    ErrorCode=UserErrorInvalidCopyBehaviorBlobNameNotAllowedWithPreserveOrFlattenHierarchy,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot adopt copy behavior PreserveHierarchy when copying from folder to a single file.,Source=Microsoft.DataTransfer.ClientLibrary,'
    can you please help me in this regard

  • @nivethashrishabhat2610
    @nivethashrishabhat2610 1 year ago

    Hi sir,
    I have seen your video on creating an external table from linked data (e.g. Blob). My question is: can we insert data into the external table?

    • @nivethashrishabhat2610
      @nivethashrishabhat2610 1 year ago

      The use case here is: whenever the blob file gets modified, I need to insert it into the external table using a serverless SQL pool.

  • @TechnoSparkBigData
    @TechnoSparkBigData 2 years ago

    Suppose I have some 200 tables in an on-premises SQL Server and want to migrate some 100 of those to Azure SQL Server. Since each table has different columns, how can I dynamically create those tables in Azure SQL Server with their different schemas and copy the data?

  • @chandruk3729
    @chandruk3729 2 years ago

    Hi bro, I want to do this in the reverse way: I need to copy Excel sheets from Blob Storage to different SQL tables. Can you please suggest how?

  • @mrpoola49
    @mrpoola49 3 years ago +1

    That's really cool... I have a similar requirement, but I need only a subset of columns from the source at the destination. How can I do it? For example, tbl_employee has 100 columns, but I am interested in copying only 10 columns to the destination, and I also need only a few employees (a WHERE clause) from the source. How is that possible? Thanks.

    • @swagatmohanty5818
      @swagatmohanty5818 3 years ago +1

      Follow the above steps to get all the columns, and then under the Mapping tab of the Copy activity, you can import the schema and choose only the required columns.
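      Besides trimming columns on the Mapping tab, the projection and filter can also be pushed into the Copy activity's source query. A sketch, reusing `tbl_employee` from the question (the column names and WHERE clause are purely illustrative):

      ```sql
      -- Copy activity source query: only the needed columns and rows
      SELECT emp_id, emp_name, dept, salary   -- the 10 needed columns instead of all 100
      FROM tbl_employee
      WHERE dept = 'Sales';                   -- illustrative row filter
      ```

      Pushing the filter to the source also reduces the data moved, which usually matters more than sink-side trimming.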

  • @rathikavenkatesh2510
    @rathikavenkatesh2510 2 years ago

    Hi, I have a doubt: if some tables get deleted in the source DB, will that impact the destination, or will it create only the tables that exist in the source? I delete the DB daily and then run the copy activity from the source, as the source changes on a daily basis. Please confirm.

  • @bhawnabedi9627
    @bhawnabedi9627 3 years ago +1

    👍🏻👍🏻

  • @padmajakotturi239
    @padmajakotturi239 2 years ago +1

    Is it possible to copy multiple files from Blob Storage to SQL DB without using a ForEach loop? (I got this one as an interview question.)

    • @WafaStudies
      @WafaStudies  2 years ago

      Use wildcard file paths.
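      A sketch of the wildcard approach: in the Copy activity's source store settings, a wildcard matches all files in one go, so no ForEach is needed (the folder path below is illustrative):

      ```json
      "storeSettings": {
          "type": "AzureBlobStorageReadSettings",
          "recursive": true,
          "wildcardFolderPath": "input",
          "wildcardFileName": "*.csv"
      }
      ```

      This copies every matching CSV in a single Copy activity run, which is why it answers the "without a ForEach loop" interview question.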

    • @amanahmed6057
      @amanahmed6057 2 years ago

      Hello, I need your help.
      I have an interview on Azure coming up.
      Can you guide me? 🙇🙇🙇🙇

  • @Ali-q4d4c
    @Ali-q4d4c 1 year ago

    👍👍👍👍