Azure Data Factory | Copy multiple tables in Bulk with Lookup & ForEach

  • Published: 20 Apr 2020
  • With Azure Data Factory's Lookup and ForEach activities you can perform dynamic copies of your data tables in bulk within a single pipeline.
    In this episode I show how to perform bulk copies with ADF, including a live demo of the Lookup activity in a SQL-to-Blob export scenario and how to drive it from a metadata table.
    Source code for demos: github.com/MarczakIO/azure4ev...
    Next steps for you after watching the video
    1. ADF bulk copy scenario
    - docs.microsoft.com/en-us/azur...
    2. Lookup Activity docs
    - docs.microsoft.com/en-us/azur...
    3. ForEach Activity docs
    - docs.microsoft.com/en-us/azur...
    Want to connect?
    - Blog marczak.io/
    - Twitter / marczakio
    - Facebook / marczakio
    - LinkedIn / adam-marczak
    - Site azure4everyone.com
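The Lookup-plus-ForEach pattern the video demonstrates can be sketched in plain Python (this is not the ADF SDK; the schema/table names and sink path are hypothetical): the Lookup activity supplies metadata rows, and each ForEach iteration runs one parameterized copy built from that row.

```python
# Illustrative sketch of the metadata-driven bulk copy pattern:
# a Lookup reads a metadata table, a ForEach runs one copy per row.

metadata_rows = [  # what the Lookup activity would return from the metadata table
    {"schema_name": "SalesLT", "table_name": "Customer"},
    {"schema_name": "SalesLT", "table_name": "Product"},
]

def copy_table(schema: str, table: str) -> str:
    """Stands in for the parameterized Copy activity: builds the dynamic
    source query ADF would compose from @item().schema_name / @item().table_name."""
    query = f"SELECT * FROM [{schema}].[{table}]"
    sink_path = f"output/{schema}.{table}.csv"
    # a real pipeline would run `query` against SQL and write to Blob at sink_path
    return f"{query} -> {sink_path}"

# the ForEach iterates over the Lookup output
results = [copy_table(r["schema_name"], r["table_name"]) for r in metadata_rows]
```

Adding a table to the bulk copy then means adding a row to the metadata table, with no pipeline changes.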

Comments • 355

  • @apogeeaor5531
    @apogeeaor5531 19 days ago

    Thank you, Adam. I rewatch this video at least twice a year. Thank you for all you do.

  • @rapchak2
    @rapchak2 2 years ago +26

    Cannot thank you enough for your incredibly well laid out, thorough explanations. The world needs more folks like you :)

  • @quyenpn318
    @quyenpn318 2 years ago +4

    I really, really like how you guide step by step like this; it is quite easy to understand. You are the best “trainer” I’ve seen. I really appreciate the time you spend creating these useful videos.

  • @albertoarellano1494
    @albertoarellano1494 4 years ago +13

    You're the best Adam! Thanks for all the help, been watching your tutorials on ADF and they're very helpful. Keep them coming!

  • @genniferlyon8577
    @genniferlyon8577 2 years ago +3

    Thank you Adam! I had been trying to follow some other written content to do exactly what you showed with no success. Your precise steps and explanation of the process were so helpful. I am successful now.

  • @ahmedroberts4883
    @ahmedroberts4883 1 year ago

    Excellent, Excellent video. This has truly cemented the concepts and processes you are explaining in my brain. You are awesome, Adam!

  • @satishutnal
    @satishutnal 3 years ago +13

    You are the example for how teaching should be. Just awesome 👍

  • @shaileshsondawale2811
    @shaileshsondawale2811 1 year ago

    What wonderful content you have placed on social media. What a world-class personality you are. People certainly fall in love with your teaching.

  • @markdransfield9520
    @markdransfield9520 1 year ago

    Brilliant teaching style Adam. Very watchable. I particularly like how you explain the background. I've subscribed and will watch more of your videos.

  • @pratibhaverma7857
    @pratibhaverma7857 2 years ago

    Your videos are great. This is the best channel on RUclips platform to learn about ADF. THANKS 🙏😊

  • @rajanarora6655
    @rajanarora6655 2 years ago

    Awesome explanation, the way you teach assuming in layman terms is pretty great, thanks!!

  • @gunturulaxmi8037
    @gunturulaxmi8037 1 year ago

    Videos are very clear for people who would like to learn and practice. Thanks a lot, your hard work is appreciated.

  • @elisehunter3424
    @elisehunter3424 2 years ago

    Brilliant tutorial. Easy to follow and it all works like a charm. Thank you!!

  • @hollmanalu
    @hollmanalu 4 years ago +1

    Adam, thanks for all your great videos! I appreciate your work very much! Keep up your great work!

  • @anderschristoffersen2513
    @anderschristoffersen2513 3 years ago +2

    Great and simple walk through, good job Adam

  • @sumanthdixit1203
    @sumanthdixit1203 3 years ago +1

    Fantastic clear-cut explanation. Nice job!

  • @AVADHKISHORE
    @AVADHKISHORE 3 years ago

    Thank you Adam!! These videos are really very helpful and build the foundation for understanding ADF.

  • @priyankapatel9461
    @priyankapatel9461 3 years ago +1

    You have in-depth knowledge of every service. I learned from scratch using your channel. Keep posting. Thank you and God bless you.

  • @RavinderApril
    @RavinderApril 2 months ago

    Incredibly simplified to learn... Great!!

  • @geoj9716
    @geoj9716 3 years ago +2

    You are a very good teacher.

  • @jakirajam
    @jakirajam 1 year ago

    The way you explain is super Adam. Really nice

  • @amarnadhgunakala2901
    @amarnadhgunakala2901 4 years ago +2

    Thanks Adam, I've been waiting for a video like this on ADF. Please post regularly...

  • @xiaobo1134
    @xiaobo1134 3 years ago +1

    Thanks Adam, your tutorials are very useful, hope to see more in the future

  • @paullevingstone4834
    @paullevingstone4834 3 years ago +1

    Very professionally demonstrated and very clear to understand. Thank you very much

  • @NewMayapur
    @NewMayapur 3 years ago

    fantastic video Adam!! Really helpful to understand the parametrisation in ADF.

  • @maimemphahlele1102
    @maimemphahlele1102 3 years ago +1

    Hi Adam
    Your videos are just too brilliant. This is a subscription I wouldn't mind paying for to support you. Your lessons are invaluable for learning.

  • @pdsqsql1493
    @pdsqsql1493 2 years ago

    Wow! What a great video; very easy step-by-step tutorials and explanations. Well done!

  • @anacarrizo2209
    @anacarrizo2209 10 months ago

    THANK YOU SO MUCH for this! The step-by-step really helped with what I needed to do.

  • @santanughosal9785
    @santanughosal9785 1 year ago

    I was looking for this video. Thanks for making this. It helps a lot. Thanks again.

  • @amtwork5417
    @amtwork5417 3 years ago +1

    Great video, easy to follow and to the point; it really helped me quickly get up and running with Data Factory.

  • @garciaoscar7611
    @garciaoscar7611 10 months ago

    This video was really helpful! You have leveled up my Azure skills. Thank you sir, you have gained another subscriber.

  • @avicool08
    @avicool08 2 years ago +1

    very simple yet powerful explanation

  • @deoroopnarine6232
    @deoroopnarine6232 4 years ago

    Your videos are awesome man. Gave me a firm grasp and encouraged me to get an azure subscription and play around some more.

  • @amoldesai4605
    @amoldesai4605 3 years ago

    I am a beginner in Azure Data Engineering and you made it simple to learn all the tactics.. thanks

  • @wouldyoudomeakindnes
    @wouldyoudomeakindnes 3 years ago +1

    Your skills are tops, thanks. Love to see your channel grow.

  • @waseemmohammed1088
    @waseemmohammed1088 3 years ago +1

    Thank you so much for the clear and nice explanation, I am new to ADF and learning a lot from your channel

  • @radiomanzel8570
    @radiomanzel8570 1 year ago

    It was so perfect, I was able to follow and copy data on the first attempt. Thanks!

  • @CoolGuy
    @CoolGuy 2 years ago

    You are a legend. Next level editing and explanation

  • @eatingnetwork6474
    @eatingnetwork6474 3 years ago +1

    Thanks Adam, amazing workshop, very clear and easy to follow, thanks for helping, i am wiser now :)

  • @naseemmca
    @naseemmca 3 years ago +1

    Adam you are just awesome man! The way you are teaching is excellent. Keep it up.. you are the best...

  • @leonkriner3744
    @leonkriner3744 1 year ago

    Amazingly simple and informative!

  • @Vick-vf8ug
    @Vick-vf8ug 3 years ago +1

    It is extremely hard to find information online about this topic. Thank you for making it easy!

  • @diatprojects7220
    @diatprojects7220 1 year ago

    Very well explained. Thank you so much!

  • @Cheyenne9663
    @Cheyenne9663 1 year ago

    Wow this was explained so well. Thank you!!!

  • @eversilver99
    @eversilver99 3 years ago +1

    Excellent video and knowledge sharing. Great Job!

  • @rubensanchez6366
    @rubensanchez6366 3 years ago +1

    Very interesting video Adam. I found your idea of storing metadata quite enlightening. It could probably be extended to track the last record loaded, so we could use it as an input for delta loads through queries instead of reloading the full table on each run.

    • @AdamMarczakYT
      @AdamMarczakYT  2 years ago

      You can use either the watermark or the change-tracking pattern; check this out: docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-overview?WT.mc_id=AZ-MVP-5003556
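The high-watermark pattern Adam points to can be sketched in plain Python (table and column names are hypothetical; a real pipeline would persist the watermark in a database table): store the last loaded value per table, query only newer rows, then advance the watermark after a successful copy.

```python
# Minimal sketch of the high-watermark delta-load pattern.
watermarks = {"SalesLT.Customer": "2020-01-01T00:00:00"}  # persisted metadata

def build_delta_query(table: str, watermark_column: str) -> str:
    """Builds the incremental source query: only rows newer than the stored watermark."""
    last = watermarks[table]
    return (f"SELECT * FROM {table} "
            f"WHERE {watermark_column} > '{last}'")

def advance_watermark(table: str, new_value: str) -> None:
    # after a successful copy, record the max value seen so the next run
    # picks up only rows added since
    watermarks[table] = new_value

q = build_delta_query("SalesLT.Customer", "ModifiedDate")
advance_watermark("SalesLT.Customer", "2020-04-20T00:00:00")
```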

  • @ashokveluguri1910
    @ashokveluguri1910 3 years ago

    You are awesome Adam. Thank you so much for detailed explanation.

  • @e2ndcomingsoon655
    @e2ndcomingsoon655 2 years ago

    Thank you! I really appreciate all you share, it truly helps me

  • @anilchenchu1017
    @anilchenchu1017 1 year ago

    Awesome Adam, there can't be a better way to explain this.

  • @shivangrana02
    @shivangrana02 3 years ago +1

    You are really best Adam! Your tutorial helped me a lot. Thanks

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Happy to hear that!

    • @shivangrana02
      @shivangrana02 3 years ago

      @@AdamMarczakYT You are welcome. Please keep up the good work.

  • @jacobklinck8011
    @jacobklinck8011 3 years ago +1

    Great session!! Thanks Adam.

  • @dev.gaunau
    @dev.gaunau 3 years ago +1

    Thank you so much for sharing these valued knowledge. It's very helpful for me.

  • @abhishektrivedi3406
    @abhishektrivedi3406 3 years ago

    You're awesome Adam, thanks for such a great tutorial. I also tweeted this video. Thanks.!!

  • @TheSQLPro
    @TheSQLPro 3 years ago +1

    Great content, easy to follow!!

  • @sameeranavalkar9352
    @sameeranavalkar9352 1 year ago

    Thank you so much for this. It helped a lot

  • @Ro5ho19
    @Ro5ho19 3 years ago +2

    Thank you! It's underappreciated how important it is to name things something other than "demoDataset"; it makes a big difference both for understanding concepts and for maintainability.

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +2

      Glad it was helpful! You are of course correct; if it's not a demo, take care of your naming conventions.

  • @ivanovdenys
    @ivanovdenys 4 years ago

    It is really cool that you make it so simple :)

  • @kamalnathvaithinathan5737
    @kamalnathvaithinathan5737 3 years ago

    Awesome Adam!! you are the best. Thank you so much

  • @nathalielink3869
    @nathalielink3869 3 years ago +1

    Awesome. Thank you so much Adam!

  • @ElProgramadorOficial
    @ElProgramadorOficial 3 years ago +1

    Adam, You are the best!. Thanks man!

  • @lucasortiz537
    @lucasortiz537 3 years ago +1

    These tutorials are so useful!

  • @prasadsv3409
    @prasadsv3409 3 years ago

    Really great stuff sir, this is what I am looking for on YouTube.

  • @mosestes9417
    @mosestes9417 3 years ago

    Really helpful! you made it very easy!

  • @GaneshNaik-lv6jh
    @GaneshNaik-lv6jh 2 months ago

    Thank You so much.... Very good explanation, Just Awesome

  • @verakso2715
    @verakso2715 2 years ago +1

    Thanks for your awesome video, it helped me out a great deal

  • @joncrosby1
    @joncrosby1 2 years ago +2

    Thank you for your great videos, they have been super helpful. I'm working on a proof of concept similar to this video, however it's SQL to Azure SQL. Any links or references you can offer to help me with parameterizing the Azure SQL sink side?

  • @RajivGuptaEverydayLearning
    @RajivGuptaEverydayLearning 3 years ago +1

    Very nice video with good explanation.

  • @arjunsaagi590
    @arjunsaagi590 4 years ago

    Thank you Adam, a very informative video.

  • @southernfans1499
    @southernfans1499 2 years ago

    👍👍👍 very good explanation.. 👍👍.

  • @vicvic553
    @vicvic553 3 years ago

    Hey, one thing about English - please guys correct me if I am wrong, but I am pretty sure what I am talking about - you shouldn't say inside a sentence "how does it work", but "what it works". Despite that, the content is awesome!

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      You can if you ask a question. "How does it work" is a question structure, not a statement. It should be "how it works" if I'm stating a fact. You wrote "What it works" but I assume that's a typo. It's one of my common mistakes; my English teacher tries to fix it but it is still a common issue for me ;) Thanks for watching!

  • @veerboot81
    @veerboot81 3 years ago

    Hi Adam, very nice work. I built this for a client of mine and found out one important thing: within the ForEach, the activities are not executed atomically per iteration. If you run iterations in parallel and the ForEach contains two activities, say A and B, connected through parameters (item()), then activity A processing item X will not necessarily be paired with activity B processing the same item X, even though they are connected.
    So one extra piece of advice: use at most one parameterized activity inside a ForEach, or, if you need more than one, start a separate pipeline from within the ForEach that contains the multiple activities. These pipelines are started as separate children and do the work in the correct order.
    With kind regards,
    Jeroen

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Hey, not sure I understood what you meant here. Using parameters does not make any connection between the activities.

    • @veerboot81
      @veerboot81 3 years ago

      @@AdamMarczakYT I'm using a ForEach loop to load tables with dynamic statements. If I need more than one activity per iteration (like a logging call to SQL Server, a copy activity to load the data, and a logging activity after the load is done), those activities can sit in the ForEach itself, but if you load multiple tables in parallel the activities will not follow each other sequentially; they run interleaved, so a logging call may not belong to the copy activity it sits next to. I will see if I can make an example if I find the time. To solve this I always start another pipeline within the ForEach and put the activities in that pipeline. This creates child pipelines in the ForEach loop, ensuring the right order of execution (logging start, copy, logging end).
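Jeroen's child-pipeline workaround can be illustrated with a plain-Python analogy (hypothetical table names; a thread pool stands in for the parallel ForEach, and a function stands in for the child pipeline): grouping the per-item steps in one unit guarantees their order for each item, even when items run concurrently.

```python
# Sketch: steps grouped per item (ADF: a child pipeline) keep their order,
# even though different items run in parallel.
from concurrent.futures import ThreadPoolExecutor

def process_table(table):
    """Stands in for a child pipeline: log-start, copy, log-end for one table."""
    events = []
    events.append(f"log-start {table}")
    events.append(f"copy {table}")
    events.append(f"log-end {table}")
    return events  # ordering inside the unit is guaranteed

tables = ["Customer", "Product", "Address"]
with ThreadPoolExecutor() as pool:  # the parallel ForEach
    per_item = list(pool.map(process_table, tables))
```

Activities placed loosely inside a parallel loop have no such per-item grouping, which is the interleaving Jeroen describes.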

  • @musafirhainyaaron1720
    @musafirhainyaaron1720 3 years ago +3

    Hi Adam, this is an awesome session. One question: where in the Azure documentation can I find information about all possible output properties of a given activity? E.g. when you were explaining the Lookup activity, you talked about the property "firstRow". Where can I find the properties supported by all activities?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Thanks! For details, always check the docs by googling "ADF" plus the activity name. For Lookup this would pop up, which explains everything you asked: docs.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
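As a hedged illustration of the Lookup output shapes the linked docs describe (the row contents here are hypothetical): with "First row only" checked, the output exposes a `firstRow` object; unchecked, it exposes a `value` array (plus a row `count`), which is what a ForEach typically iterates over.

```python
# Python dicts mimicking the two Lookup output shapes.
lookup_first_row_only = {
    "firstRow": {"schema_name": "SalesLT", "table_name": "Customer"}
}
lookup_all_rows = {
    "count": 2,
    "value": [
        {"schema_name": "SalesLT", "table_name": "Customer"},
        {"schema_name": "SalesLT", "table_name": "Product"},
    ],
}
# ADF expressions reference these as, e.g.:
#   @activity('Lookup1').output.firstRow.table_name
#   @activity('Lookup1').output.value   (typical ForEach items source)
table = lookup_first_row_only["firstRow"]["table_name"]
```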

  • @satyabratabarik49
    @satyabratabarik49 1 year ago

    Great Explanation !!!!

  • @szymonzabiello2622
    @szymonzabiello2622 3 years ago +1

    Hey Adam. Great video! Two questions regarding the pipeline itself. 1. How do we approach Source Version Control of the pipeline? In SSIS we could export a package and commit to Git or use TFS. How do we approach versioning in Azure? 2. What is the approach to deploy this pipeline in upper environment? Assuming that this pipeline was created in dev, how do I approach deployment in i.e. UAT?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      I think this page describes and answers both of your questions. docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment?WT.mc_id=AZ-MVP-5003556 thanks for watching :)

  • @pavankumars9313
    @pavankumars9313 1 year ago

    You are very good 👍 explained well thanks 😊

  • @sunnygoswami5358
    @sunnygoswami5358 2 years ago

    Amazing explanation Adam! Thank you for this! qq- Can the For Each activity run things concurrently? i.e. in this example, can it pass the 3 table_name, schema_name values to the Copy Data activity at the same time?

  • @feliperegis9989
    @feliperegis9989 3 years ago +1

    Hey Adam, awesome work and explanation! Do you have another video explaining how to deal with massive data copies from tables in bulk using ADF, one that addresses the limits on maximum data or row counts? Could you make a video with a demo covering the scenarios you mentioned are "a story for another day"? Thanks a lot in advance!! =D

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Thanks. Well, Lookup shouldn't be used for data but for a metadata-driven approach, so the 5000-row limit is very reasonable here. It is rare that you will copy over 5000 tables/files with different structures, etc. If you do, there are other techniques, but in those cases I would probably shift the approach entirely. Will think about this.

  • @jac94
    @jac94 4 years ago

    Thanks Adam! very clear!

  • @krzysztofrychlik9913
    @krzysztofrychlik9913 3 years ago +1

    Thanks! Very helpful videos!

  • @jozefmelichar5960
    @jozefmelichar5960 3 years ago +1

    Fine tutorial. Thanks.

  • @dataisfun4964
    @dataisfun4964 7 months ago

    Thanks, this is beautiful.

  • @aks541
    @aks541 4 years ago

    Very well explained & succinct. One request: if possible, create a video on loading the ADW (Synapse) data warehouse with ADF.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks! I'm waiting for synapse new workspace experience to be released to make video about it ;)

  • @marvinvicente3138
    @marvinvicente3138 2 years ago

    Hi Adam, I really appreciate your videos. Thanks! I hope you can also create a video with ODBC as a data source.

  • @matthewmark7224
    @matthewmark7224 5 months ago

    amazing work. thanks.

  • @nalinbuddhika
    @nalinbuddhika 4 years ago

    Thanks for the videos. I am letting the ads run till the end :)

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Awesome thank you for your support! ♥

  • @BACHARBOUAZZA
    @BACHARBOUAZZA 1 year ago

    Thank you! well done.

  • @mk42948
    @mk42948 1 year ago

    Hi Adam, thanks for the video! I am wondering if it is possible to select just one row from the table by id and copy it? Thanks in advance!

  • @surafeltilahun7404
    @surafeltilahun7404 3 years ago +1

    Excellent!!!!!

  • @MrPadmanabham
    @MrPadmanabham 4 years ago

    Very good and clear..

  • @YenBui-dn2ek
    @YenBui-dn2ek 2 months ago

    Thanks for your great video. In your experience, is there any limitation in terms of the number of tables or table size when copying multiple tables in bulk?

  • @sujanbhattacharya6029
    @sujanbhattacharya6029 4 years ago

    Great video Adam!! Simple and elegant. Thanks!!
    Quick question: is it possible to make the copy activities happen in parallel instead of in a sequential loop?
    Also, it would be great if you could make another video showing how to incrementally update data in these tables from blob to a sink/SQL Server. Maybe you have already done it; I just couldn't find it.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +2

      Yes you can! Actually, by default ForEach runs in parallel unless you select the 'Sequential' checkbox on it.

    • @sujanbhattacharya6029
      @sujanbhattacharya6029 4 years ago +1

      @@AdamMarczakYT - Thank you for answering my question.

    • @klklk3162
      @klklk3162 3 years ago

      @@AdamMarczakYT, Hi Adam. Thank you for the awesome videos. I have a requirement to copy in parallel, and I have more than 40 tables to copy. How many tables can I run in parallel at a time? Please let me know. Thank you so much.
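Adam's answer above can be sketched in plain Python (a thread pool stands in for the ForEach; the table names are hypothetical, and the batch-size value mirrors the activity's batch count setting, which per the ForEach docs defaults to 20): parallel mode processes items in waves of that size, while sequential mode processes one at a time.

```python
# Sketch: parallel ForEach (capped by a batch count) vs. sequential mode.
from concurrent.futures import ThreadPoolExecutor

def copy_table(table):
    """Stands in for one parameterized copy."""
    return f"copied {table}"

tables = [f"table_{i}" for i in range(40)]  # the commenter's 40 tables
batch_count = 20  # assumed default cap; configurable on the ForEach activity

with ThreadPoolExecutor(max_workers=batch_count) as pool:
    done = list(pool.map(copy_table, tables))  # order of results is preserved

# Sequential mode is equivalent to a plain loop:
done_sequential = [copy_table(t) for t in tables]
```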

  • @gouravjoshi3050
    @gouravjoshi3050 3 years ago +1

    Good one, Adam sir.

  • @dannythedabbler
    @dannythedabbler 1 year ago

    Thank you for the great information. I wonder how much of this gui-based development can be replaced by code only. In my experience it is easier to troubleshoot a thousand lines of code than gui settings across 50 forms.

  • @skyhorseflying
    @skyhorseflying 4 years ago

    Thanks Adam. Awesome videos for Azure! Do you have one for Azure CI/CD pipeline?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Not yet! I plan to work on this but it's tricky to get it right with Data Factory. In the meantime, check out the MS article on this: docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment

  • @donovan6094
    @donovan6094 2 years ago

    Super cool video sir. You are my first stop for searching about Azure, after Stack Overflow LOL.
    Sir, how about the cost? Is it better to use a sequential loop or parallel? Because a sequential loop gives you extra running time.

  • @akshaybhardwaj7626
    @akshaybhardwaj7626 2 years ago

    Thanks for all your helpful videos, thank you so much. I have one query:
    How can we run a pipeline in parallel to copy data from 5 different sources to 5 respective targets? 1. Is it possible by passing 5 different source and target connection strings? 2. Can we have a master pipeline in which a ForEach activity calls this one pipeline and runs it in parallel for the 5 different source-to-target movements?

  • @rafalziolkowski2860
    @rafalziolkowski2860 3 years ago

    This is really great stuff, thank you for the really easy explanation. It looks like you are using some kind of tool to access the Azure Portal; what is that tool?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      I use only the Chrome browser. Thanks for checking the video out Rafal! ;)

  • @asasdasasdasdasdasdasdasd
    @asasdasasdasdasdasdasdasd 4 years ago +1

    Hi Adam, how would you add a system/custom column in a bulk copy? For example I want to add a pipeline name, date or the value '1' in a column that is shared on all tables.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      There are many ways to do this. Simplest and most similar would be loading data into staging tables and calling stored procedure with merge in it. Then you can apply any additional logic you need.

  • @alikh1984
    @alikh1984 1 year ago

    Great one!

  • @mpramods
    @mpramods 3 years ago

    Awesome video Adam.
    I would like to understand the next step: how to loop through the files and load them into tables. Do you have a video on that, or could you point me to a link with that info?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      No video on this, but it's very similar; just use the GetMetadata activity instead of the Lookup :)