2. Get File Names from Source Folder Dynamically in Azure Data Factory

  • Published: 8 Sep 2024
  • In this video, I discuss getting file names dynamically from a source folder in Azure Data Factory.
    Link for Azure Functions playlist:
    • 1. Introduction to Azu...
    Link for Azure Basics playlist:
    • 1. What is Azure and C...
    Link for Azure Data Factory playlist:
    • 1. Introduction to Azu...
    #Azure #ADF #AzureDataFactory
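
The technique the video covers, getting file names dynamically, is normally built on a Get Metadata activity that reads the folder's childItems. A minimal pipeline-JSON sketch, with all activity and dataset names illustrative rather than taken from the video:

```json
{
  "name": "GetFileNames",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": {
      "referenceName": "SourceFolderDataset",
      "type": "DatasetReference"
    },
    "fieldList": [ "childItems" ]
  }
}
```

The activity output exposes an array of { name, type } objects that downstream activities can iterate over with @activity('GetFileNames').output.childItems.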

Comments • 83

  • @rbor-xb5eg
    @rbor-xb5eg 10 months ago +1

    thank you sir, you're a legend, respect from Brazil

  • @damayantibhuyan7147
    @damayantibhuyan7147 4 years ago +2

    Nice videos, clear steps. Please keep uploading. Thank you!

  • @GunNut37086
    @GunNut37086 1 year ago +1

    Expertly done!! You explained this perfectly for me. Thank you for sharing your expertise.

  • @debasisroy9625
    @debasisroy9625 4 years ago +4

    Excellent content! Thanks, mate, for taking out your time. Could you please do a series on Data Factory DevOps integration, building a CI/CD pipeline using library variables?

  • @rakeshupadhyay1
    @rakeshupadhyay1 2 years ago +1

    Very good content, practical scenarios are helpful

  • @tipums123
    @tipums123 4 years ago +1

    Very good explanation. Thank you for this video!

  • @aruntejvadthya1309
    @aruntejvadthya1309 4 years ago +1

    Very good explanation 👌. It's very helpful

  • @MarceloSilva-us1gh
    @MarceloSilva-us1gh 3 years ago +1

    Amazing! Thank you my friend!

  • @empowerpeoplebytech2347
    @empowerpeoplebytech2347 3 years ago

    Good one, very helpful and practical scenario. You made it exactly as it is needed!

  • @SatyaPParida
    @SatyaPParida 4 years ago +1

    Wonderful content ✌️. Really helpful

  • @bhavnakamble975
    @bhavnakamble975 3 years ago +1

    Amazing and really helpful

  • @sabinsunny655
    @sabinsunny655 3 years ago

    Good content, really helpful, clear steps, thanks for the video

  • @abnavalagatti
    @abnavalagatti 3 years ago

    Very good explanation

  • @gubscdatascience1805
    @gubscdatascience1805 2 years ago +1

    very helpful sir

  • @siddheshamrutkar8684
    @siddheshamrutkar8684 2 years ago +1

    What content, boss! Really very impressive. May I know which videos I should refer to to get started with Azure Cloud, as I am relatively new to this? I know MSBI and want to upgrade myself to Azure Cloud. Kindly suggest. Your contents are awesome. Hats off to you! 🤟👏

    • @WafaStudies
      @WafaStudies  2 years ago

      You can start with Azure Data Factory and Azure Synapse Analytics

  • @subhashkomy
    @subhashkomy 3 years ago +2

    Can you please create a video on how to upload multiple Excel files' data into SQL Server using Data Flows, and please also use data conversion. It doesn't seem to be as easy as it is in SSIS.

  • @deepaksahirwar
    @deepaksahirwar 2 years ago +1

    Thank you for the great explanation. Could we expect learning videos on Azure Synapse Analytics?

    • @WafaStudies
      @WafaStudies  2 years ago +1

      I created a playlist on Synapse Analytics. Kindly check it. Link below.
      ruclips.net/p/PLMWaZteqtEaIZxPCw_0AO1GsqESq3hZc6

    • @deepaksahirwar
      @deepaksahirwar 2 years ago +1

      @@WafaStudies Tons of thanks, dear Sir. Very helpful

    • @WafaStudies
      @WafaStudies  2 years ago +1

      @@deepaksahirwar welcome 😁

  • @YanadireddyBonamukkala
    @YanadireddyBonamukkala 4 years ago

    If possible, try a video on creating global parameters and passing the values dynamically with different databases

  • @Kannan-lt1ud
    @Kannan-lt1ud 3 years ago

    thank you, this helped me...

  • @sirisiri3797
    @sirisiri3797 2 years ago +1

    Very useful videos. Please make Databricks videos also

    • @WafaStudies
      @WafaStudies  2 years ago

      Hi, I am already doing a Databricks playlist. Kindly check it. Currently that playlist is in progress

    • @sirisiri3797
      @sirisiri3797 2 years ago

      If possible, can you make it in Telugu also?

  • @venukumar1094
    @venukumar1094 3 years ago +2

    Hi Maheer, thanks for the detailed explanation. For this topic the title should be "Read Files from Source Folder Dynamically in Azure Data Factory" instead of "Get File Names". We are not just reading/getting file names, right? The files are being copied from source to sink.

    • @tallaravikumar4560
      @tallaravikumar4560 1 year ago

      First he is getting the file names, and then using those file names he is copying the files.
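
The two-step flow this reply describes (get the names, then copy each file) is usually a ForEach over the Get Metadata output, passing each name to a parameterized dataset. A sketch under those assumptions; every name here is illustrative:

```json
{
  "name": "ForEachFile",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('GetFileNames').output.childItems",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyOneFile",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "SourceFileDataset",
            "type": "DatasetReference",
            "parameters": { "fileName": "@item().name" }
          }
        ],
        "outputs": [
          { "referenceName": "SinkFolderDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```

Inside the loop, @item().name resolves to the current file's name from childItems.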

  • @sandeep5996
    @sandeep5996 4 years ago +1

    Nice work!
    Could you please make a video on how to check for 0 KB CSV files / zero-row records in the source, and trigger an email if a zero-KB file or zero records are found, in Azure Data Factory.
    Thanks in advance.

  • @rajeshgowd
    @rajeshgowd 1 year ago

    Hello Maheer,
    could you please make a video about copying Excel files? If we implement it as in the above video, for Excel files it asks for a sheet name.
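
On the Excel question: the Excel dataset type has a sheetName property that can itself be parameterized, so the sheet does not have to be hard-coded per file. A sketch, with dataset, linked service, and parameter names all illustrative:

```json
{
  "name": "SourceExcelDataset",
  "properties": {
    "type": "Excel",
    "linkedServiceName": {
      "referenceName": "BlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "parameters": { "sheet": { "type": "string" } },
    "typeProperties": {
      "sheetName": { "value": "@dataset().sheet", "type": "Expression" },
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "input"
      }
    }
  }
}
```

A Copy activity referencing this dataset would then supply the sheet name through the dataset's parameters, just as the video supplies a file name.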

  • @anujgupta-lc1md
    @anujgupta-lc1md 4 years ago +1

    Amazing! Please add some videos on incremental load handling, and how to check whether files are present in blob or not using the Validation or Get Metadata activity.

    • @WafaStudies
      @WafaStudies  4 years ago

      Sure, will add videos on those scenarios. Thank you

  • @pradeepert
    @pradeepert 2 years ago

    Can we pass the file path dynamically? I have a SQL table from which I can take the file path. This file path needs to be passed to Get Metadata to list the files.
    Looking for your help. Thank you so much!
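
One common answer to this question is a Lookup activity that reads the path from the SQL table, whose firstRow output is then passed to Get Metadata through a dataset parameter. A hedged sketch; the table, column, and dataset names are made up for illustration:

```json
{
  "name": "LookupFolderPath",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT TOP 1 FolderPath FROM dbo.PipelineConfig"
    },
    "dataset": {
      "referenceName": "ConfigSqlDataset",
      "type": "DatasetReference"
    },
    "firstRowOnly": true
  }
}
```

The Get Metadata activity's dataset would declare a folder parameter and receive @activity('LookupFolderPath').output.firstRow.FolderPath.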

  • @yusufuthman8571
    @yusufuthman8571 3 years ago +1

    Hi, this tutorial is really helpful, but I find it difficult to point the files to a database instead of the same directory as the source file, as is the case in most real-life scenarios. Please help on how to get these files into a SQL Server database.

  • @lib133
    @lib133 1 month ago

    What if one wants to do a similar thing but with .txt or .sql files (stored in an ADLS Gen2 container)?

  • @itech7313
    @itech7313 1 year ago

    Hi, please tell us how we import multiple files from different sources in ADF (this is an interview question)

  • @manognadamacharla3346
    @manognadamacharla3346 3 years ago

    Hello, thanks for your wonderful videos. Can you please give an idea on pushing all folders along with files into the data lake in one go?

  • @mohangonnabathula2261
    @mohangonnabathula2261 2 years ago

    I can't find the "recursively" option anymore in the Get Metadata activity. Can you please let me know how to get all the files recursively in Get Metadata? Thanks.

  • @viveknimmagadda2397
    @viveknimmagadda2397 1 year ago

    At 4:27 in the video we can see the recursive property; however, that's not the case for me. The software has been updated and the video might be outdated. Can you please help me, as I cannot find the recursive property?

  • @esrasultan8963
    @esrasultan8963 11 months ago

    Hi Maheer, do you have a video where we copy a CSV file from a dynamic folder in ADLS to a new folder in ADLS and store it as Parquet?

  • @princyshinas8740
    @princyshinas8740 2 years ago

    Hi... I couldn't see the recursive option in Synapse Analytics. Can you suggest some way to get subfolders?

  • @ArabaEfsanesi
    @ArabaEfsanesi 3 years ago

    Thanks for the video, first of all. My question is: what should I do if I want to copy only specific file types from my input folder (e.g. just CSV files) in this ForEach loop example?
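
One way to answer the file-type question above is to insert a Filter activity between Get Metadata and the ForEach, so the loop only sees matching names. A sketch with illustrative names; @endswith is a standard ADF string function:

```json
{
  "name": "FilterCsvOnly",
  "type": "Filter",
  "typeProperties": {
    "items": {
      "value": "@activity('GetFileNames').output.childItems",
      "type": "Expression"
    },
    "condition": {
      "value": "@endswith(item().name, '.csv')",
      "type": "Expression"
    }
  }
}
```

The ForEach would then iterate the filtered array exposed on the Filter activity's output (visible in the debug output) instead of the raw childItems.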

  • @sid0000009
    @sid0000009 3 years ago

    The dataset which you use to get the file names is not the same as what you select inside the Copy activity. Ideally both should be the same?

  • @srinivasu5984
    @srinivasu5984 3 years ago +1

    nice

  • @sureshch8328
    @sureshch8328 3 years ago

    Hi,
    I am using a package parameter Startdate which is false, but it was not picking up the parameter due to the UTC time zone. How can we change this dynamically?

  • @mathangiananth6599
    @mathangiananth6599 3 years ago +1

    Hi, thanks for uploading amazing content. I have one question here: what if we have different file types in the source blob container, like .txt, .csv, Parquet and ORC files, and I want to copy all of these to a different path? This was asked in one of my interviews. Can you suggest what can be done here? TIA

    • @sumitbarde3677
      @sumitbarde3677 2 years ago

      In this scenario you can use a Lookup activity. Create a config file storing all the dynamic parameters you want to pass to the pipeline; after the Lookup, use ForEach and a Copy activity. Sorted.
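
The config-file pattern in this reply can be sketched as a Lookup with firstRowOnly disabled feeding a ForEach, each config row carrying its own copy parameters. All names here are illustrative assumptions:

```json
{
  "name": "LookupConfig",
  "type": "Lookup",
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "dataset": {
      "referenceName": "ConfigFileDataset",
      "type": "DatasetReference"
    },
    "firstRowOnly": false
  }
}
```

The ForEach items expression would be @activity('LookupConfig').output.value, and inside the loop the Copy activity reads per-row fields such as @item().sourcePath (a hypothetical column name in the config file).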

  • @vinayraghuwanshi9419
    @vinayraghuwanshi9419 4 years ago

    In your Get Metadata there is a "recursively" option, but in mine I did not see this option.

  • @itech7313
    @itech7313 1 year ago

    Hi, please tell us how we import data from different sources for multiple files (this is an interview question)

  • @mobinbagwan5747
    @mobinbagwan5747 3 years ago +1

    Can I use items().name inside an activity which is inside an If Condition activity, which is itself inside a ForEach activity? I hope my question makes sense.

    • @WafaStudies
      @WafaStudies  3 years ago +1

      You can, if that expression is within the ForEach
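
As the reply says, item() inside an If Condition nested in a ForEach still resolves to the current ForEach element. A sketch with illustrative names (the Wait activity is just a placeholder for whatever runs on the true branch):

```json
{
  "name": "IfIsCsv",
  "type": "IfCondition",
  "typeProperties": {
    "expression": {
      "value": "@endswith(item().name, '.csv')",
      "type": "Expression"
    },
    "ifTrueActivities": [
      {
        "name": "WaitDemo",
        "type": "Wait",
        "typeProperties": { "waitTimeInSeconds": 1 }
      }
    ]
  }
}
```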

  • @anupgupta5781
    @anupgupta5781 3 years ago

    If I want to copy different files into different output subfolders, how could I do it?

  • @kunalratnaparkhi828
    @kunalratnaparkhi828 3 years ago

    Could it be possible to create a destination folder structure the same as the source folder structure automatically while uploading files to a data lake using Data Factory? The destination folder structure should be created automatically.
    Example -
    I have 3 files in the "C:\Test\Upload" folder. This is on-premises.
    Now I want to upload those files to the data lake using Data Factory, and the destination folder structure should be "C:\Test\Upload", created automatically.
    Please advise.

  • @rakeshmishra4650
    @rakeshmishra4650 3 years ago

    Nice video, but if I'd like to store/capture the filename in a SQL table column, how can we do that? Could you please update us here: that means load the three files into a SQL table with the respective filename in a SQL table column. Thanks.
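
One documented way to land the source file name in a SQL column is the Copy activity's additionalColumns setting with the $$FILEPATH reserved variable. A source-side sketch (the column name SourceFileName is illustrative):

```json
"source": {
  "type": "DelimitedTextSource",
  "additionalColumns": [
    { "name": "SourceFileName", "value": "$$FILEPATH" }
  ],
  "storeSettings": {
    "type": "AzureBlobStorageReadSettings",
    "recursive": true
  }
}
```

The extra column then maps to a SQL sink column like any other source column in the Copy activity's mapping.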

  • @vijaysagar5984
    @vijaysagar5984 2 years ago

    Hi bro,
    any workaround for CSV files which have multiple headers, so we can merge them into one header? The source is FTP; some files are good and some files have multiple headers.

  • @ravisamal3533
    @ravisamal3533 3 years ago

    If I want to copy multiple files from Blob to SQL DB, how can I do so?

  • @fannymaryjane
    @fannymaryjane 3 years ago

    Can I use this with a configured wildcard?
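
On the wildcard question: the Copy activity source supports wildcard paths directly, which can replace the Get Metadata/ForEach pattern when per-file control isn't needed. A sketch for a blob source (folder name and pattern are illustrative):

```json
"source": {
  "type": "DelimitedTextSource",
  "storeSettings": {
    "type": "AzureBlobStorageReadSettings",
    "recursive": true,
    "wildcardFolderPath": "input",
    "wildcardFileName": "*.csv"
  }
}
```

With recursive set to true, the wildcard also matches files in subfolders under the wildcard folder path.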

  • @krishnakuraku6853
    @krishnakuraku6853 4 years ago +1

    How can we execute Data Flow activities in parallel in ADF? Any idea, please?

    • @WafaStudies
      @WafaStudies  4 years ago +1

      To execute data flows in ADF you need to use the Data Flow activity. You may have your data flow in a ForEach and execute it for multiple iterations in parallel.

    • @krishnakuraku6853
      @krishnakuraku6853 4 years ago

      @@WafaStudies Thanks bro....

    • @krishnakuraku6853
      @krishnakuraku6853 4 years ago

      If the schema varies, how does ForEach detect the metadata of different tables or files and load data into different tables or files?

    • @WafaStudies
      @WafaStudies  4 years ago

      @@krishnakuraku6853 You need to handle that. The "allow schema drift" option lets you handle it. Please check my video on the schema drift concept in data flows.
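
The parallel execution described in this thread is the ForEach's non-sequential mode wrapping an Execute Data Flow activity. A sketch, with all names illustrative; batchCount caps the concurrency:

```json
{
  "name": "ForEachTable",
  "type": "ForEach",
  "typeProperties": {
    "isSequential": false,
    "batchCount": 4,
    "items": {
      "value": "@pipeline().parameters.tableList",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "RunDataFlow",
        "type": "ExecuteDataFlow",
        "typeProperties": {
          "dataflow": {
            "referenceName": "MyDataFlow",
            "type": "DataFlowReference"
          }
        }
      }
    ]
  }
}
```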

  • @YanadireddyBonamukkala
    @YanadireddyBonamukkala 4 years ago +1

    Is it possible to change the linked service dynamically at the trigger level? If yes, please share your video here.

    • @WafaStudies
      @WafaStudies  4 years ago +1

      ruclips.net/video/M22Mj0rcBcs/видео.html

  • @bandarurohithkumar439
    @bandarurohithkumar439 2 years ago

    How to contact you?

  • @stangoodvibes
    @stangoodvibes 1 year ago

    ANOTHER video that shows the same thing: getting files from a single folder level. Too easy. How about getting all the files from a nested folder structure where the actual files may be n levels down (n unknown)???

  • @ayubshaik2415
    @ayubshaik2415 3 years ago

    Hi.......