Building Dynamic Pipelines in Azure Data Factory v2

  • Published: Dec 18, 2024

Comments • 28

  • @RedmondGamer
    @RedmondGamer 5 years ago

    Went through this video before starting with ADF, and have returned at least 3-4 times since to fill in the gaps. This was a solid walkthrough that has been very helpful in getting my head wrapped around the ADF structure. Thank you for putting this together!

  • @robertonavarro2487
    @robertonavarro2487 6 years ago

    This has been the best video about ADF v2; the others I saw covered only the Copy activity. Thanks a lot.

  • @millisecondhmd9076
    @millisecondhmd9076 5 years ago +1

    Hi, thank you for the video, it was great!
    At 42:00, the @activity("") reference: can't it be replaced with something dynamic? What if you rename your activity? Thanks!
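
    For context on the question above, the expression at 42:00 references an activity's output by its literal name, so the reference is not dynamic: renaming the activity means updating every expression that mentions it. A sketch of the pattern being discussed, with illustrative activity and column names:

    ```json
    {
      "name": "Set LastLoadDate",
      "type": "SetVariable",
      "dependsOn": [
        { "activity": "Lookup Watermark", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "variableName": "LastLoadDate",
        "value": {
          "value": "@activity('Lookup Watermark').output.firstRow.LastLoadDate",
          "type": "Expression"
        }
      }
    }
    ```

    The string inside @activity('…') must match the activity's name exactly; there is no indirection through a variable here.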

  • @Subho1919
    @Subho1919 3 years ago +1

    Awesome Video.

  • @sameerm683
    @sameerm683 5 years ago +1

    After a lot of searching, I can now say this is the best tutorial for any beginner 😀😀

  • @NeumsFor9
    @NeumsFor9 3 years ago +2

    Best way to navigate the JSON of an activity's output: put the JSON from the output in VS Code. You'll automatically get the path of what you're looking for, so long as you have the right extension for JSON viewing.

    • @PragmaticWorks
      @PragmaticWorks 3 years ago

      That's awesome Steven, I will check this out and thanks for sharing! -Mitchell Pearson

    • @NeumsFor9
      @NeumsFor9 3 years ago

      @@PragmaticWorks No prob. Really wish they would allow > 40 activities in a pipeline, or at least not count setting a variable value as an activity. My guess is they don't want the JSON payload getting too big. There's got to be a better way.

    • @TheyCalledMeT
      @TheyCalledMeT 1 year ago

      @@NeumsFor9 See a pipeline as a container: break down what your 40+ steps would do into definable sub-tasks in individual pipelines called by a parent pipeline?
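
    The parent/child split suggested above maps to ADF's Execute Pipeline activity, and since the activity limit applies per pipeline, each child gets its own budget. A minimal sketch of the parent-side activity, with illustrative pipeline and parameter names:

    ```json
    {
      "name": "Run Stage Load",
      "type": "ExecutePipeline",
      "typeProperties": {
        "pipeline": { "referenceName": "PL_Stage_Load", "type": "PipelineReference" },
        "waitOnCompletion": true,
        "parameters": {
          "TableName": "@pipeline().parameters.TableName"
        }
      }
    }
    ```

    Setting waitOnCompletion to true makes the parent block until the child finishes, so downstream dependencies in the parent still sequence correctly.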

  • @yuvrajthorat8583
    @yuvrajthorat8583 5 years ago +1

    This is wonderful video training content, please keep it up.

  • @Ferruccio_Guicciardi
    @Ferruccio_Guicciardi 5 years ago +1

    Thanks Mitchell for the great presentation. What about doing the same solution but using a Linked Service for a "file system" data source for CSV file ingestion?

  • @riteshsharma344
    @riteshsharma344 5 years ago

    Thanks Mitchell for the great demo

  • @bhanuprathap189
    @bhanuprathap189 6 years ago +2

    Hi, how do you load multiple tables from one source into another data store in parallel using Azure Data Factory v2?
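
    A common pattern for the question above is a Lookup activity that returns the table list, feeding a ForEach that runs a parameterized Copy activity in parallel. A minimal sketch under that assumption, with illustrative activity, dataset, and source/sink type names:

    ```json
    {
      "name": "ForEach Table",
      "type": "ForEach",
      "dependsOn": [
        { "activity": "Lookup Tables", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "items": { "value": "@activity('Lookup Tables').output.value", "type": "Expression" },
        "isSequential": false,
        "batchCount": 10,
        "activities": [
          {
            "name": "Copy Table",
            "type": "Copy",
            "inputs": [
              { "referenceName": "DS_Source_Table", "type": "DatasetReference",
                "parameters": { "TableName": "@item().TableName" } }
            ],
            "outputs": [
              { "referenceName": "DS_Sink_Table", "type": "DatasetReference",
                "parameters": { "TableName": "@item().TableName" } }
            ],
            "typeProperties": {
              "source": { "type": "AzureSqlSource" },
              "sink": { "type": "AzureSqlSink" }
            }
          }
        ]
      }
    }
    ```

    With isSequential set to false, batchCount caps how many table copies run concurrently; the datasets must be parameterized on TableName for this to work.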

  • @kunv4220
    @kunv4220 4 years ago

    Thanks Mitchell, can you also please explain how we can transfer REST API data (in CSV format) into Azure Data Lake?

  • @TheAkhanadBharat
    @TheAkhanadBharat 5 years ago

    Thanks Mitchell for this great video. Can you please explain how to do CI/CD of these dynamic pipelines using Azure DevOps? I am able to deploy normal pipelines successfully, but dynamic pipelines give an error while deploying. The error is about the dynamic parameters of the linked service.

  • @LJG999ab
    @LJG999ab 3 years ago

    The If Condition step's expression requires that both the parameter value and the Lookup output be strings. The less-than-or-equals expression cannot compare two dates. Having visibility into the simple stored procedure used in the Lookup task would have made this clearer.
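
    For anyone hitting the same comparison issue: ADF's lessOrEquals() compares strings and numbers, not date objects, so one workaround is to convert both sides with ticks() (or to a sortable string via formatDateTime) before comparing. A sketch of the If Condition expression, assuming an illustrative Lookup activity returning a firstRow.LastModified column and a CutoffDate pipeline parameter:

    ```
    @lessOrEquals(
        ticks(activity('Lookup Dates').output.firstRow.LastModified),
        ticks(pipeline().parameters.CutoffDate)
    )
    ```

    Since ticks() yields an integer count, the comparison is unambiguous regardless of how the source formats its date strings, as long as both values parse as timestamps.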

  • @mikeheavypecs1
    @mikeheavypecs1 5 years ago

    Awesome VID! totally awesome!

  • @dynamite9832
    @dynamite9832 6 years ago

    Hi Mitchell. Great video, thank you. Quick question: have you been able to get any performance metrics for big data loads?

  • @RavikiranS
    @RavikiranS 5 years ago

    good! keep them coming

  • @pratikfutane8131
    @pratikfutane8131 5 years ago

    Great Content!!

  • @taschadevoll9533
    @taschadevoll9533 5 years ago

    Awesome video

  • @velupalani2496
    @velupalani2496 5 years ago

    Nice one

  • @Kal-zo5ym
    @Kal-zo5ym 5 years ago

    It was very interesting and well done. Hopefully you can do something on working with a series of analytics processes. Typical useless and vague Microsoft error messages.