How to Load Multiple CSV Files to Multiple Tables According to File Name in Azure Data Factory

  • Published: 2 Feb 2025

Comments • 41

  • @mairaacevedo8960
    @mairaacevedo8960 2 years ago +4

    Fantastic content. I combined some of your videos to do a fairly complex task, and I'm so happy it worked! Thanks heaps!

  • @jayong2370
    @jayong2370 2 years ago +1

    Thank you! This is exactly what I was trying to configure.

  • @piraviperumal2544
    @piraviperumal2544 2 years ago +4

    Hi Brother,
    Not sure whether the Azure team fixed it or not, but @replace(item().name,'.txt','') is working fine. I guess you missed the @ sign before the replace function in your attempt.
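
    A minimal sketch of the same idea for the CSV case covered in the video, entered as the dynamic content of the sink dataset's table-name parameter inside the ForEach (the parameter name is whatever you defined on the dataset):

        @replace(item().name, '.csv', '')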

  • @bitips
    @bitips 2 years ago +1

    Thanks for sharing this knowledge. It is fantastic!

  • @TheMadPiggy
    @TheMadPiggy 3 years ago +3

    Works like a charm; however, the auto-created tables are all nvarchar(MAX). Not the best for database size or usability. Any way around this?

    • @TechBrothersIT
      @TechBrothersIT  3 years ago

      I noticed that too; the data type is nvarchar(MAX). You might want to treat these as staging tables: once the data is loaded, create final tables with the correct data types and a stored procedure that loads the data from the staging tables into your destination tables, as sketched below. If you have already created the tables with the correct data types, then you will be fine too.
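
      A minimal T-SQL sketch of that staging-to-final pattern; the table, column, and procedure names (dbo.stg_Emp, dbo.Emp, usp_Load_Emp, EmpID, etc.) are assumed examples, not from the video:

        -- Final table with proper data types (assumed example schema)
        CREATE TABLE dbo.Emp
        (
            EmpID    INT          NOT NULL,
            EmpName  VARCHAR(100) NULL,
            HireDate DATE         NULL
        );
        GO

        -- Procedure that casts the nvarchar(MAX) staging columns into the
        -- typed final table, then clears the staging table for the next load
        CREATE OR ALTER PROCEDURE dbo.usp_Load_Emp
        AS
        BEGIN
            SET NOCOUNT ON;

            INSERT INTO dbo.Emp (EmpID, EmpName, HireDate)
            SELECT CAST(EmpID AS INT),
                   EmpName,
                   CAST(HireDate AS DATE)
            FROM   dbo.stg_Emp;   -- auto-created staging table (all nvarchar(MAX))

            TRUNCATE TABLE dbo.stg_Emp;
        END;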

  • @marcuslawson-composer2892
    @marcuslawson-composer2892 2 years ago +1

    Very helpful video. Thank you!

  • @devops840
    @devops840 2 years ago +1

    Hi Sir,
    I am able to insert the data using dynamic CSV files. Could you please help me with upserting the data?
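
    One common pattern (not shown in the video) is to land each file into a staging table with the Copy activity and then upsert into the target with a stored procedure. A minimal T-SQL MERGE sketch, where dbo.stg_Emp, dbo.Emp, and the EmpID key are assumed example names:

        -- Upsert staged rows into the target, keyed on EmpID (assumed key column)
        MERGE dbo.Emp AS tgt
        USING dbo.stg_Emp AS src
            ON tgt.EmpID = src.EmpID
        WHEN MATCHED THEN
            UPDATE SET tgt.EmpName  = src.EmpName,
                       tgt.HireDate = src.HireDate
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (EmpID, EmpName, HireDate)
            VALUES (src.EmpID, src.EmpName, src.HireDate);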

  • @neftalimich
    @neftalimich 3 years ago +1

    Thank you very much, it was really helpful.

  • @insane2093
    @insane2093 9 months ago

    Small query, sir: once the table is created, if new data or new files come with a suffix change (like a date change), will it create a new table again or insert the data into the already created table, since you are using the auto-create option? Thank you in advance

  • @vishal-xf6ev
    @vishal-xf6ev 3 years ago +1

    Hi Brother,
    Great video & thanks for sharing :-)

  • @tomasoon
    @tomasoon 1 year ago

    Great tutorial. I have a question: if I run the pipeline and there's a new CSV file in the bucket with the same schema as the others, will this method append the data to the table with the same schema, or will it create another one?

  • @maartencastsbroad
    @maartencastsbroad 2 years ago

    Great video, exactly what I needed!

  • @williamtenhoven8405
    @williamtenhoven8405 1 year ago

    Hi, thanks for this! One question: suppose I wanted to convert the CSV files to Parquet files, how would I proceed? I used the concat/replace, but looking at the target Parquet files they seem to be corrupted: "The file 'Emp1' may not render correctly as it contains an unrecognized extension." @concat(replace(item().name,'csv','parquet')) does not work either... Any suggestions? Thanks
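
    Renaming the output file alone does not convert the data; for real Parquet output the sink dataset itself needs to be a Parquet-format dataset, so the Copy activity writes Parquet rather than CSV with a new extension. With that in place, a minimal sketch of the sink file-name expression (assuming the ForEach iterates the Get Metadata child items) is:

        @replace(item().name, '.csv', '.parquet')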

  • @harshanakularatna
    @harshanakularatna 3 years ago +1

    You are awesome. Keep it up!

  • @ayushikankane530
    @ayushikankane530 7 months ago

    If the CSV file has some columns with a JSON structure, then how do I proceed?

  • @Eraldi2323
    @Eraldi2323 2 years ago +1

    Hi TechBrothers, thanks for this very useful video.
    I had a question: I am trying to truncate the tables with the following
    @{concat('truncate table',item().name)} but it is not working for me and gives an error.
    Please advise.
    Thank you

    • @niranjanchinnu8295
      @niranjanchinnu8295 2 years ago

      I tried this today as well. My implementation idea is to truncate and then insert into the tables. For that I truncated the table with TRUNCATE TABLE [SCHEMA_NAME].@{item().name}. After this step, if the table already exists it will be truncated. Otherwise, try pointing a fail-output line to the same block that your success output points to. That way, if the table doesn't exist it will go down the fail path and execute that block, and if it is present it will be truncated and give you the appropriate results.
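
      For reference, a minimal sketch of the truncate expression with the missing space added, the .csv extension stripped, and the table name bracketed (the dbo schema is only an assumed example; note that item() needs parentheses):

        @{concat('TRUNCATE TABLE [dbo].[', replace(item().name, '.csv', ''), ']')}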

  • @viswaanand4578
    @viswaanand4578 2 years ago

    Hi
    I can see my CSV files in SSMS, but I cannot see them in table format in SSMS; the data is still in CSV format. Did I miss anything?

  • @kiranreddy9103
    @kiranreddy9103 2 years ago

    Hi, if the file names are like emp1, emp2, emp3, etc., how can we write an expression to remove the numbers in REPLACE? Could you help us?
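
    ADF expressions have no regex replace, so one workaround is to nest replace() calls for each digit that can appear in the name. A sketch, assuming the names only use the digits 1-3 and end in .csv:

        @replace(replace(replace(replace(item().name, '.csv', ''), '1', ''), '2', ''), '3', '')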

  • @gebakjes1099
    @gebakjes1099 3 years ago +1

    Thanks! Really helpful!

  • @SRINIVASP-fx5kz
    @SRINIVASP-fx5kz 2 years ago

    Excellent video, super!

  • @uditbhargava8762
    @uditbhargava8762 2 years ago

    Sir, can we use the split() function to remove .txt?
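
    A hedged sketch of that idea: split() works here as well, with first() taking the part before the dot (assuming the file names contain a single dot):

        @first(split(item().name, '.'))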

  • @rohitsethi5696
    @rohitsethi5696 1 year ago

    Hi, I'm Rohit. Can we use the Copy Data activity with CSV files? If not, why?

  • @purushothamnaidu5544
    @purushothamnaidu5544 3 years ago +4

    Sir... Can you show how to load the files available in a blob container into multiple existing tables in an Azure SQL database? That would be really helpful to me.

    • @ambatiprasanth4292
      @ambatiprasanth4292 1 year ago

      Brother, I was looking for the same... Did you find out how to do it?

  • @boatengappiah2116
    @boatengappiah2116 3 years ago +1

    Great videos. However, I don't see any video on SharePoint with ADF. Do you have one, or can you make one? Thank you

    • @TechBrothersIT
      @TechBrothersIT  3 years ago

      Hoping to have one soon. Working on many videos and scenarios. Thanks for the feedback.

  • @Deezubwun
    @Deezubwun 2 years ago

    Hi. This was a great help to me. One issue I am having is that the data is failing to load due to multiple data-type errors (such as String to DATETIME). As the data in the CSV is exported as strings, do you have a way of mapping the formatting of each problem field, bearing in mind the columns may be named differently?

  • @thyagaraj1124
    @thyagaraj1124 3 years ago

    Is it possible to load the different source files into existing tables in SQL Server when the source file names do not match the existing table names?

    • @TechBrothersIT
      @TechBrothersIT  3 years ago +1

      Hi, yes that is possible, but you have to provide some type of source-to-destination mapping. If the file names are different, you can group them on the source side and the destination table can stay the same; see the sketch below.
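
      A minimal sketch of that mapping idea: instead of iterating the Get Metadata output directly, pass a pipeline array parameter with file-to-table pairs and iterate that in the ForEach (all names below are assumed examples):

        FileTableMap (pipeline array parameter):
        [
          { "fileName": "emp_export_2021.csv", "tableName": "Emp" },
          { "fileName": "sales_q1.csv",        "tableName": "Sales" }
        ]

        Inside the ForEach (Items: @pipeline().parameters.FileTableMap):
          source file name : @item().fileName
          sink table name  : @item().tableName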

  • @niteshsoni2282
    @niteshsoni2282 2 years ago +1

    THANKS SIR

  • @vijaysagar5984
    @vijaysagar5984 2 years ago

    Hi Bro,
    Is there any workaround for CSV files that have multiple headers, so we can merge them into one header? The source is FTP; some files are good and some files have multiple headers.

    • @TechBrothersIT
      @TechBrothersIT  2 years ago

      One way could be to load the data without header information into a staging table, then remove the bad header rows and use only the clean data, as sketched below.
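
      A minimal T-SQL sketch of that clean-up step, assuming the file was loaded with "First row as header" turned off into a staging table dbo.stg_File whose first column is Column1, and that the stray header rows carry the literal column name 'EmpID' (all assumed example names):

        -- Delete the extra header rows that were loaded as data rows
        DELETE FROM dbo.stg_File
        WHERE Column1 = 'EmpID';   -- assumed literal header value

        -- The remaining rows are clean data, ready to load into the final table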

  • @kirubababu7127
    @kirubababu7127 3 years ago

    How can I do this when the source is an HTTP server?

  • @sujiths4165
    @sujiths4165 23 days ago

    good