2. Get File Names from Source Folder Dynamically in Azure Data Factory
- Published: 8 Sep 2024
- In this video, I discussed getting file names dynamically from a source folder in Azure Data Factory.
Link for Azure Functions playlist:
• 1. Introduction to Azu...
Link for Azure Basics playlist:
• 1. What is Azure and C...
Link for Azure Data Factory playlist:
• 1. Introduction to Azu...
#Azure #ADF #AzureDataFactory
thank you sir, you're a legend, respect from Brazil
Nice videos, Clear Steps.. Please keep uploading. Thank You !
Expertly done!! You explained this perfectly for me. Thank you for sharing your expertise.
Thank you 😊
Excellent content!! Thanks, mate, for taking the time. Could you please do a series on Data Factory DevOps integration, building a CI/CD pipeline using library variables?
Very good content, practical scenarios are helpful
Thank you ☺️
Very good explanation. Thank you for this video!
Very good explanation 👌. It's very helpful
Thank you 🙂
Amazing! Thank you my friend!
Welcome 🤗
Good one, very helpful and practical scenario. You made it exactly as it is needed!
Wonderful content ✌️. Really helpful
Thank you 🙂
Amazing and really helpful
Thank you 🙂
Good content , really helpful, clear steps , thanks for the video
Very good explanation
very helpful sir
Thank you 😊
What content, boss! Really very impressive. May I know which videos I should refer to to get started with Azure Cloud, as I am relatively new to this? I know MSBI and want to upgrade myself to Azure Cloud. Kindly suggest. Your content is awesome. Hats off to you! 🤟👏
You can start with Azure Data Factory and Azure Synapse Analytics.
Can you please create a video on how to upload data from multiple Excel files into SQL Server using Data Flows, and please also cover data conversion? It doesn't seem to be as easy as it is in SSIS.
It would be very helpful if you uploaded one.
Thank you for great explanation. Please could we expect learning videos on Azure Synapse Analytics?
I created a playlist on Synapse analytics. Kindly check it. Link below.
ruclips.net/p/PLMWaZteqtEaIZxPCw_0AO1GsqESq3hZc6
@@WafaStudies Tons of thanks, dear Sir. Very, very helpful.
@@deepaksahirwar welcome 😁
If possible, try a video on creating global parameters and passing the values dynamically to different databases.
thank you, this helped me...
Very useful videos. Please make Databricks videos also.
Hi, I am already doing a Databricks playlist. Kindly check it. Currently that playlist is in progress.
If possible, can you make it in Telugu also?
Hi Maheer, thanks for the detailed explanation. For this topic, shouldn't the title be "Read Files from Source Folder Dynamically in Azure Data Factory" instead of "Get File Names"? We are not just reading/getting the file names, right? The files are actually being copied from source to sink.
First he is getting the file names, and then using those file names he is copying the files.
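The two-step flow described in that reply can be sketched as pipeline JSON (a trimmed, illustrative sketch rather than the exact JSON from the video; activity names are placeholders):

```json
{
  "activities": [
    {
      "name": "GetFileNames",
      "type": "GetMetadata",
      "typeProperties": {
        "fieldList": [ "childItems" ]
      }
    },
    {
      "name": "ForEachFile",
      "type": "ForEach",
      "dependsOn": [
        { "activity": "GetFileNames", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "items": {
          "value": "@activity('GetFileNames').output.childItems",
          "type": "Expression"
        },
        "activities": [
          { "name": "CopyOneFile", "type": "Copy" }
        ]
      }
    }
  ]
}
```

Inside the Copy activity, the source dataset's file-name parameter is set to `@item().name`, so each iteration copies one of the files returned by Get Metadata.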
Nice work!
Could you please make a video on how to check for 0 KB CSV files / zero-row records in the source, and trigger an email in Azure Data Factory if a zero-KB file or zero records are found?
Thanks in advance.
Hello Maheer,
could you please make a video about copying Excel files? If we implement it as in the above video, for Excel files it asks for a sheet name.
Amazing! Please add some videos on incremental data load handling, and on how to check whether files are present in blob or not using the Validation or Get Metadata activity.
Sure will add videos on those scenarios. Thank you
Can we pass the file path dynamically? I have a SQL table from which I can take the file path. This file path needs to be passed to Get Metadata to list the files.
Looking forward to your help. Thank you so much!
Hi, this tutorial is really helpful, but I find it difficult to point the files to a database instead of the same directory as the source file, as is the case in most real-life scenarios. Please help on how to get these files into a SQL Server database.
What if one wants to do a similar thing but with .txt or .sql files (stored in an ADLS Gen2 container)?
Hi, please tell us how we import multiple files from different sources in ADF (this is an interview question).
Hello, thanks for your wonderful videos. Can you please give an idea on pushing all folders along with files into the data lake in one go?
I can't find the "recursively" option anymore in the Get Metadata activity. Can you please let me know how to get all the files recursively in Get Metadata? Thanks.
At 4:27 in the video we can see the recursive property; however, that's not the case for me. The software has been updated and the video might be outdated. Can you please help me, as I cannot find the recursive property?
Hi Maheer, do you have a video where we copy a CSV file from a dynamic folder in ADLS to a new folder in ADLS and store it as Parquet?
Hi... I couldn't see the recursive option in Synapse Analytics. Can you suggest some ways to get the subfolders?
Thanks for the video first. My question is: what should I do if I want to copy only specific file types from my input folder (e.g. just CSV files) in this ForEach loop example?
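One way to do this (my assumption; it is not shown in the video) is to put a Filter activity between Get Metadata and the ForEach; the activity name `GetFileNames` is a placeholder:

```json
{
  "name": "FilterCsvOnly",
  "type": "Filter",
  "typeProperties": {
    "items": {
      "value": "@activity('GetFileNames').output.childItems",
      "type": "Expression"
    },
    "condition": {
      "value": "@endswith(item().name, '.csv')",
      "type": "Expression"
    }
  }
}
```

The ForEach then iterates over `@activity('FilterCsvOnly').output.value` instead of the raw childItems.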
The dataset which you use to get the file names is not the same as the one you select inside the Copy activity. Ideally, shouldn't both be the same?
nice
Thank you 😊
Hi,
I am using a package parameter StartDate, but it was not picking up the parameter due to the UTC time zone. How can we change this dynamically?
Hi, thanks for uploading amazing content. I have one question here: what if we have different file types in the source blob container, like .txt, .csv, Parquet, and ORC files, and I want to copy each of these to a different path? This was asked in one of my interviews. Can you suggest what can be done here? TIA.
In this scenario you can use a Lookup activity. Create a config file, store all the dynamic parameters you want to pass to the pipeline, then after the Lookup use a ForEach and a Copy activity. Sorted.
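The config-file pattern suggested in this reply could look like the following (a hypothetical example; the file and folder names are made up). The Lookup activity reads this JSON file, and each entry drives one ForEach iteration:

```json
[
  { "fileName": "sales.csv",      "sinkFolder": "csv-out" },
  { "fileName": "orders.txt",     "sinkFolder": "txt-out" },
  { "fileName": "events.parquet", "sinkFolder": "parquet-out" }
]
```

With "First row only" unchecked on the Lookup, the ForEach items expression would be `@activity('LookupConfig').output.value`, and inside the loop you reference `@item().fileName` and `@item().sinkFolder`.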
In your Get Metadata activity there is a "recursively" option, but in mine I did not see this option.
Hi, please tell us how we import data from different sources for multiple files (this is an interview question).
Can I use item().name inside an activity which is inside an If Condition activity, which is itself inside a ForEach activity? I hope my question makes sense.
Yes, you can use it, as long as that expression is within the ForEach.
If I want to copy different files into different output subfolders, how could I do it?
Could it be possible to automatically create a destination folder structure matching the source folder structure while uploading files to a data lake using Data Factory?
Example -
I have 3 files in the "C:\Test\Upload" folder. This is on-premises.
Now I want to upload those files to the data lake using Data Factory, and the destination folder structure "Test\Upload" should be created automatically.
Please advise.
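To the best of my knowledge (this is not covered in the video), the Copy activity's copyBehavior setting handles this: PreserveHierarchy on the sink keeps the relative folder structure of the source in the destination. A minimal sketch, assuming a self-hosted IR file-system source and an ADLS Gen2 sink:

```json
{
  "name": "CopyPreservingFolders",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "FileSystemSource", "recursive": true },
    "sink": { "type": "AzureBlobFSSink", "copyBehavior": "PreserveHierarchy" }
  }
}
```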
Nice video, but if I'd like to store/capture the filename in a SQL table column, how can we do that? That is, load the three files into a SQL table with the respective filename in a column. Could you please update us here? Thanks.
Hi Bro,
Any workaround for CSV files which have multiple headers, so we can merge them into one header? The source is FTP; some files are good and some files have multiple headers.
If I want to copy multiple files from blob to a SQL DB, how can I do so?
Can I use this with a configured wildcard?
How can we execute Data Flow tasks in parallel in ADF? Any idea, please?
To execute data flows in ADF you need to use the Data Flow activity. You can put your data flow inside a ForEach and execute it for multiple iterations in parallel.
@@WafaStudies Thanks bro....
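The parallel-execution advice above can be sketched as a ForEach with isSequential set to false (the `tableList` parameter and activity names are hypothetical):

```json
{
  "name": "RunDataFlowsInParallel",
  "type": "ForEach",
  "typeProperties": {
    "isSequential": false,
    "batchCount": 10,
    "items": {
      "value": "@pipeline().parameters.tableList",
      "type": "Expression"
    },
    "activities": [
      { "name": "RunOneDataFlow", "type": "ExecuteDataFlow" }
    ]
  }
}
```

With isSequential false, up to batchCount iterations (here 10) run concurrently.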
If the schema varies, how does ForEach detect the metadata of the different tables or files and load the data into different tables or files?
@@krishnakuraku6853 You need to handle that. The "Allow schema drift" option lets you handle it. Please check my video on the schema drift concept in data flows.
Is there any possibility to change the linked service dynamically at the trigger level? If yes, please share your video here.
ruclips.net/video/M22Mj0rcBcs/видео.html
How to contact you?
ANOTHER video that shows the same thing: getting files from a single folder level. Too easy. How about getting all the files from a nested folder structure where the actual files may be n levels down (n is unknown)???
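For the nested-folder case raised here, one common workaround (my assumption; it is not what this video shows) is to skip Get Metadata entirely and let a single Copy activity recurse through all subfolders via its store settings, assuming delimited-text files in ADLS Gen2:

```json
{
  "name": "CopyAllNested",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "DelimitedTextSource",
      "storeSettings": {
        "type": "AzureBlobFSReadSettings",
        "recursive": true,
        "wildcardFileName": "*.csv"
      }
    },
    "sink": {
      "type": "DelimitedTextSink",
      "storeSettings": {
        "type": "AzureBlobFSWriteSettings",
        "copyBehavior": "PreserveHierarchy"
      }
    }
  }
}
```

The trade-off: this copies every matching file n levels down, but it does not surface the individual file names; if you need those, a queue-style pattern (an Until loop calling Get Metadata per folder) is required instead.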
Hi.......