18. Copy multiple tables in bulk by using Azure Data Factory
- Published: 8 Feb 2025
- In this video, I discussed copying multiple tables in bulk from a SQL Database to Azure Blob Storage using Azure Data Factory.
Link for Azure Databricks playlist:
• 1. Introduction to Az...
Link for Azure Functions playlist:
• 1. Introduction to Azu...
Link for Azure Basics playlist:
• 1. What is Azure and C...
Link for Azure Data Factory playlist:
• 1. Introduction to Azu...
Link for Azure Data Factory Real-time Scenarios:
• 1. Handle Error Rows i...
Link for Azure Logic Apps playlist:
• 1. Introduction to Azu...
#Azure #ADF #AzureDataFactory
I've watched so many of his classes that now I can perfectly understand his English. Thank you bro!
Thank you 🙂
Awesome...you deserve 80k+ subscribers...👍👍
Thank you 🙂
Superb explanation and presentation... I'm watching all your videos. 👍👍👍
Thank you ☺️
This was explained very simply, which I didn't expect. Awesome brother. After learning from you, maybe one day I will create my own channel called 'Bewafa Studies' LOL.. :)
Thanks so much for this, Saved my week
Welcome 🤗
Great topic & very helpful for understanding several key activities. Thanks a lot :)
Thank you 😊
Clear explanation.👍👍Many Thanks
Scenario 1: Copy all table data from the SQL DB to Azure Blob Storage. The output file name will be <schema name>_<table name>.csv
1. Copy all tables in the Azure SQL DB
2. Add a condition: if a table has only one row or zero rows, its data will not be copied.
3. Copy data in a BLOB storage path as CSV file
In this scenario, how and where do we put the 2nd condition?
Use a Lookup inside the ForEach to count the number of rows in that table; if the count satisfies the condition, then copy.
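For reference, here is a minimal Python sketch of the same three steps outside ADF, assuming placeholder connection strings and a hypothetical container named "tables" (none of these values come from the video): enumerate the tables (the Lookup), loop over them (the ForEach), skip tables with at most one row (the If Condition), and write the rest to Blob Storage as schema_table.csv.

```python
# Minimal sketch of the scenario's logic; connection strings and the
# container name are placeholders, not values from the video.
import csv
import io

import pyodbc
from azure.storage.blob import BlobServiceClient

SQL_CONN_STR = "Driver={ODBC Driver 18 for SQL Server};Server=...;Database=...;Uid=...;Pwd=..."
STORAGE_CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;..."
CONTAINER = "tables"  # assumed target container

sql = pyodbc.connect(SQL_CONN_STR)
blob_service = BlobServiceClient.from_connection_string(STORAGE_CONN_STR)
cur = sql.cursor()

# Step 1: enumerate every user table (plays the role of the Lookup activity).
cur.execute(
    "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
    "WHERE TABLE_TYPE = 'BASE TABLE'"
)
tables = cur.fetchall()

for schema, table in tables:  # plays the role of the ForEach activity
    # Step 2: skip tables with zero or one row (the If Condition).
    cur.execute(f"SELECT COUNT(*) FROM [{schema}].[{table}]")
    if cur.fetchone()[0] <= 1:
        continue

    # Step 3: copy the data to Blob Storage as <schema>_<table>.csv.
    cur.execute(f"SELECT * FROM [{schema}].[{table}]")
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur.fetchall())

    blob = blob_service.get_blob_client(container=CONTAINER,
                                        blob=f"{schema}_{table}.csv")
    blob.upload_blob(buf.getvalue(), overwrite=True)
```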
Really good man. Thanks a lot.
Thank you ☺️
If we want to copy only a few tables and automate it, what's the method? No one on RUclips has explained this real-time scenario. Kindly explain.
I think you can use a stored procedure for this
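One common way to do that (my own suggestion, not something shown in the video) is a small control table listing the tables to copy, which the Lookup query or a stored procedure reads so that only the listed tables drive the ForEach. A rough sketch, with the name etl.CopyTableList purely as a hypothetical example:

```python
# Hypothetical control-table pattern: etl.CopyTableList is a table you create
# and maintain yourself; it is not part of the video's pipeline.
CONTROL_TABLE_DDL = """
CREATE TABLE etl.CopyTableList (
    TABLE_SCHEMA sysname NOT NULL,
    TABLE_NAME   sysname NOT NULL,
    IsEnabled    bit     NOT NULL DEFAULT 1
);
"""

# Query for the Lookup activity (or a stored procedure wrapping it):
# only enabled rows feed the ForEach, so flipping IsEnabled adds or
# removes a table from the automated copy.
LOOKUP_QUERY = """
SELECT TABLE_SCHEMA, TABLE_NAME
FROM etl.CopyTableList
WHERE IsEnabled = 1;
"""
```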
Please discuss restartability. Say, for example, out of 100 tables 10 fail; I need to restart only the failed ones automatically. How would this scenario be handled? BTW thanks for this tutorial :)
Very helpful! Thank you :)
Welcome 😊
good one, thanks for sharing.
Thank you 😊
Thanks for Sharing🎉
Will you please make more real-time scenario videos?
That would be very helpful to new people.
Sure
Bro, make a video on "Copy data from SAP to Azure Data Lake using ADF".
If one of the tables has 1 billion records in it, then what is the approach to copy that table?
How can we use a similar approach if we want to copy data from Snowflake to Azure SQL DB?
Once our pipeline starts, if any of the tables throws an error while copying, how do we make sure it won't stop the others from copying?
In the same situation, please explain incremental load or delta load.
Will do soon
@@WafaStudies Till now there is no update on the delta load, how to do it from Azure SQL tables to blob.
Hi, I tried this but the problem is only one table is copied... When I run the pipeline it prompts for the table name, but I have parameterized the table... Please can you clarify?
At 11:00 I am getting only the schema parameter, while I have created the table parameter also.
When I click preview data, it asks me to enter the values for schema name and table name.
Copy data from multiple different sources to ADLS by using a single pipeline?
Rather than Blob Storage, can we insert the same table records into another table directly, without using a storage service?
Yeahhh same process for a sink DB
Have a doubt. Suppose a table is deleted in the source database, how do we handle that?
How can I also get column headers in the CSV files? I am able to get all the data, but the first row is not the column header.
How do I copy folders and files without changing the hierarchy structure, when I don't know the depth of the hierarchy?
Good 1..
Thanks 😊
@@WafaStudies I have just replied on your SCD2 video; if you add those 2 date columns it would be great.
Hello Sir, this is really nice. I have built my pipeline perfectly, thanks for this. Actually, I would like to ask one more thing. I want to add a trigger after this pipeline so that if data is updated in any of the tables, it updates our tables as well, but only the new data, without changing past rows. Could you help me with this?
Hi Maheer, I found your channel looking for ADF help. Great content here, with sequential series covering each area of ADF. I have been continuously watching the ADF videos one by one and learning a lot. I have one question: we have Dataverse folders in the Data Lake and I want to load those folder-based files into an Azure SQL database; each folder has CSVs partitioned monthly.
The Lookup activity has a limitation of fetching 5000 rows. What can we do to fetch more than 5000 rows?
I will try to create a video on this soon and post it 🤗
@@WafaStudies 😊
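Until that video is out, a common workaround is to page the source query so that no single Lookup call has to return more than 5000 rows; in ADF you would drive the paging from an Until loop. The same idea in plain Python, with dbo.SourceTable, the Id ordering column, and the connection string all as hypothetical placeholders:

```python
# Paging workaround for the 5000-row Lookup cap: fetch rows in fixed-size
# pages using OFFSET/FETCH. dbo.SourceTable, the Id column, and the
# connection string are hypothetical placeholders.
import pyodbc

SQL_CONN_STR = "Driver={ODBC Driver 18 for SQL Server};Server=...;Database=...;Uid=...;Pwd=..."
PAGE_SIZE = 5000  # one Lookup call per page in the ADF equivalent

conn = pyodbc.connect(SQL_CONN_STR)
cur = conn.cursor()

offset = 0
all_rows = []
while True:
    cur.execute(
        "SELECT * FROM dbo.SourceTable "
        "ORDER BY Id "                        # paging needs a stable ORDER BY
        "OFFSET ? ROWS FETCH NEXT ? ROWS ONLY",
        offset, PAGE_SIZE,
    )
    page = cur.fetchall()
    all_rows.extend(page)
    if len(page) < PAGE_SIZE:                 # last (partial) page reached
        break
    offset += PAGE_SIZE
```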
Hi Maheer. Your channel is really a worthy one for the ADF developer community. Thank you for providing the videos and guiding developers. I have a small doubt. As per scenario 18 you are moving SQL tables to blob, but I am trying to move Snowflake tables to blob, which causes an error:
ErrorCode=UserErrorInvalidCopyBehaviorBlobNameNotAllowedWithPreserveOrFlattenHierarchy,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot adopt copy behavior PreserveHierarchy when copying from folder to a single file.,Source=Microsoft.DataTransfer.ClientLibrary,'
Can you please help me in this regard?
Hi sir,
I have seen your video on creating an external table from linked data (e.g. blob). My question is: can we insert data into the external table?
The use case here is that whenever the blob file gets modified, I need to insert it into the external table using a serverless SQL pool.
Suppose I have some 200 tables in an on-premises SQL Server and want to migrate about 100 of those to Azure SQL Server. How can I dynamically create the tables in Azure SQL Server when each table has a different schema? Each table will have different columns, so how do I dynamically create these tables and copy the data?
Hi bro, I want to do this the reverse way: I need to copy Excel sheets from blob storage into different SQL tables. Can you please suggest an approach?
That's really cool... I have a similar requirement, but I need only a subset of columns from the source in the destination. How can I do it? For e.g. tbl_employee has 100 columns but I am interested in copying only 10 columns to the destination, and I also need only a few employees (WHERE clause) from the source to the destination. How is that possible? Thanks
Follow the above steps to get all the columns, and then under the Mapping tab of the Copy activity you can import the schema and choose only the required columns.
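Another option (my suggestion, not part of the reply above) is to give the Copy activity's source a query instead of the whole table, which covers both the column subset and the WHERE clause. A sketch, with tbl_employee's column names and the filter purely as hypothetical examples:

```python
# Hypothetical source query for the Copy activity: only the columns you need,
# filtered by a WHERE clause, instead of the full tbl_employee table.
SOURCE_QUERY = """
SELECT EmployeeId, FirstName, LastName, Department, HireDate,
       Email, ManagerId, Country, Salary, IsActive    -- the 10 columns you need
FROM dbo.tbl_employee
WHERE Department = 'Finance';                         -- only the employees you need
"""
```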
Hi, I have a doubt: if some tables get deleted in the source DB, will it impact the destination, or will it create only the tables that currently exist in the source? I delete the DB daily and do the copy activity from the source, as the source changes on a daily basis. Please confirm.
👍🏻👍🏻
Thank you ☺️
Is it possible to copy multiple files from blob storage to SQL DB without using a ForEach loop? (This one I got as an interview question.)
Use wildcard file paths.
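For context, a wildcard file path lets one Copy activity pick up every matching file in a single run instead of looping per file. The equivalent logic outside ADF looks roughly like this, with the connection string, container name, and pattern as hypothetical placeholders:

```python
# Equivalent of a wildcard file path outside ADF: list the blobs under a
# prefix and keep the ones matching a pattern, then load them in one pass
# instead of a per-file ForEach. Container and pattern names are hypothetical.
from fnmatch import fnmatch

from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;...",
    container_name="input",
)

matching = [
    b.name
    for b in container.list_blobs(name_starts_with="sales/2024/")
    if fnmatch(b.name, "sales/2024/*.csv")
]
```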
Hello, I need your help.
I have an interview on Azure coming up.
Can you guide me? 🙇🙇🙇🙇
👍👍👍👍