How can we implement incremental load when we have a newest file and the records need to be updated in the SQL table, continuing on from the old data already imported? This is my use case.
Did you ever find a solution in ADF for this use case?
@allthingsmicrosoft365 Not yet.
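In case it helps, a common ADF pattern for this kind of incremental load (not confirmed in this thread) is to filter the source files on their last-modified time and let the Azure SQL sink upsert on a key column. A minimal copy-activity sketch, where lastLoadTime, the *.csv wildcard, and the Id key column are assumed names for illustration, and the dataset references are omitted:

{
  "name": "CopyIncrementalFiles",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "DelimitedTextSource",
      "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "wildcardFileName": "*.csv",
        "modifiedDatetimeStart": {
          "value": "@pipeline().parameters.lastLoadTime",
          "type": "Expression"
        }
      }
    },
    "sink": {
      "type": "AzureSqlSink",
      "writeBehavior": "upsert",
      "upsertSettings": { "useTempDB": true, "keys": [ "Id" ] }
    }
  }
}

Only files modified after the stored watermark are read; rows that already exist with the same Id are updated and new rows are appended. The watermark value is usually kept in a small control table and refreshed at the end of the pipeline run.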
Excellent video sir
In this use case, do we need to map the source table's columns to the sink table's columns or not?
Hi, thank you for your question. Yes, we need to map the data types between source and sink.
Yes, we need to map data types from source to target (sink).
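For reference, that column-and-type mapping lives in the copy activity's translator. A minimal sketch, where the column names emp_id/EmployeeId and emp_name/EmployeeName are made up for illustration:

"translator": {
  "type": "TabularTranslator",
  "mappings": [
    {
      "source": { "name": "emp_id", "type": "String" },
      "sink": { "name": "EmployeeId", "type": "Int32" }
    },
    {
      "source": { "name": "emp_name", "type": "String" },
      "sink": { "name": "EmployeeName", "type": "String" }
    }
  ]
}

If the source and sink columns already share names and compatible types, the translator can be left out and the copy activity maps by name.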
How many source datasets and sink datasets do we need to create in this case?
We need to create a separate dataset for each file format.
@saicloudworld But here the file format is only CSV, so how many?
Hi, here we will use one source dataset and one sink dataset.
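A single dataset can cover all the CSV files if the file name is a dataset parameter. A sketch of such a source dataset, where the linked service name AzureBlobStorageLS and the container input are assumed for illustration:

{
  "name": "GenericCsvSource",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "AzureBlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "fileName": { "type": "string" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "input",
        "fileName": { "value": "@dataset().fileName", "type": "Expression" }
      },
      "columnDelimiter": ",",
      "firstRowAsHeader": true
    },
    "schema": []
  }
}

The Azure SQL sink dataset can be parameterized the same way with schema and table name parameters, so one source dataset and one sink dataset serve every file and table.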
Great video. What if I want to delete all records from the table first and then write the new data? Also, I have 13 files with large amounts of data, some around 100 MB. It works perfectly fine up to 6 files, but then the pipeline gets stuck at 500,000 records. Any suggestions?
Add a TRUNCATE statement in the pre-copy script on the sink tab. For the throughput issue, increase the copy activity's degree of copy parallelism and its DIUs (Data Integration Units).
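In pipeline JSON those settings sit on the copy activity roughly like this (a sketch; the table name dbo.Staging and the DIU/parallelism numbers are placeholders to tune for your data):

{
  "name": "CopyCsvToSql",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": {
      "type": "AzureSqlSink",
      "preCopyScript": "TRUNCATE TABLE dbo.Staging"
    },
    "dataIntegrationUnits": 16,
    "parallelCopies": 8
  }
}

One caveat: if this copy runs inside a ForEach over the 13 files, the pre-copy script executes on every iteration and only the last file's rows would survive, so truncate once before the loop (for example with a Script or Stored Procedure activity) instead.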
When I pass a parameter at the dataset level, the Get Metadata activity automatically asks for a value.
Please help me figure out how to fix it.
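That prompt is expected behaviour: once a dataset declares parameters, every activity that references it, Get Metadata included, has to supply a value for each one. If you only need the folder's childItems, an empty string is usually enough, as in this sketch (reusing the hypothetical GenericCsvSource dataset above):

{
  "name": "GetFileList",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": {
      "referenceName": "GenericCsvSource",
      "type": "DatasetReference",
      "parameters": { "fileName": "" }
    },
    "fieldList": [ "childItems" ]
  }
}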
I have 11 collections in my Azure Cosmos DB for MongoDB account and 11 JSON files in my Azure Blob Storage container. I'm using a Data Factory copy activity to copy the JSON files from Blob to the MongoDB API, but I can only copy one file to one collection. I need to copy all the JSON files to their collections the same way. How can I copy multiple JSON files to multiple collections using Data Factory?
Please refer to this: ruclips.net/video/gASkX3BFUcY/видео.html&feature=share
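One way to do it (a sketch, not taken from the linked video) is a ForEach over a pipeline parameter listing file/collection pairs, with both the blob dataset and the Cosmos DB for MongoDB dataset parameterized; GenericJsonBlob, GenericMongoCollection, and fileCollectionPairs are assumed names:

{
  "name": "ForEachJsonFile",
  "type": "ForEach",
  "typeProperties": {
    "items": { "value": "@pipeline().parameters.fileCollectionPairs", "type": "Expression" },
    "isSequential": false,
    "activities": [
      {
        "name": "CopyJsonToCollection",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "GenericJsonBlob",
            "type": "DatasetReference",
            "parameters": {
              "fileName": { "value": "@item().fileName", "type": "Expression" }
            }
          }
        ],
        "outputs": [
          {
            "referenceName": "GenericMongoCollection",
            "type": "DatasetReference",
            "parameters": {
              "collectionName": { "value": "@item().collectionName", "type": "Expression" }
            }
          }
        ],
        "typeProperties": {
          "source": { "type": "JsonSource" },
          "sink": { "type": "CosmosDbMongoDbApiSink" }
        }
      }
    ]
  }
}

The fileCollectionPairs parameter would be an array such as [{ "fileName": "orders.json", "collectionName": "orders" }, ...], with one entry per file/collection pair.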
Need to copy multiple files from ADLS to multiple tables in Azure SQL Server...:)
Instead of SQL Server, can we import from MySQL Workbench? Is that possible or not?