#102

  • Published: 26 Nov 2024

Comments • 46

  • @baladenmark
    @baladenmark 2 years ago +3

    You saved my job with this video. Can't thank you enough!

  • @AnandKumar-dc2bf
    @AnandKumar-dc2bf 3 years ago +3

    Very complex pipeline. Thanks for this video;
    such are the scenarios that we face in real time!

  • @revathyvijaymukundan8672
    @revathyvijaymukundan8672 1 year ago

    Nice explanation 👏 Thanks for this real-time scenario. I hope it will be useful for me. Let me try it; thanks again.

  • @harinim3976
    @harinim3976 3 years ago +1

    Thank you. I have an additional scenario: do you have a video where Lookup data is passed as an array into a Set Variable activity, and finally the array is written into a JSON file? If so, please let me know. Thanks.

  • @himanshubharambe5460
    @himanshubharambe5460 3 months ago

    In the iteration you have a warning; it throws an error when I run my pipeline with the same logic.
    For example, with the iteration variable @range(1, add(div(int(variables('rowcount')), 1000), 1)) I get:
    The function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type
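
The `int()` error in the comment above usually points at the contents of the `rowcount` variable rather than at the expression itself. The arithmetic the ADF expression performs can be sketched in Python; the function name and sample values here are illustrative, not from the video:

```python
def iteration_count(rowcount: int, batch_size: int = 1000) -> int:
    # Mirrors the ADF expression add(div(int(variables('rowcount')), 1000), 1):
    # integer division, plus one extra iteration to cover any remainder.
    return rowcount // batch_size + 1

# The runtime error typically means variables('rowcount') holds something
# int() cannot parse -- e.g. the whole Lookup output object or an empty
# string -- rather than a plain numeric value like 6776.
print(iteration_count(6776))  # 7
```

Note that for a row count that divides evenly (e.g. exactly 1000), this formula yields one extra, empty iteration; that matches the expression in the comment, not a ceiling division.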

  • @sangeethasreenath834
    @sangeethasreenath834 3 years ago +1

    Hi - I am looking for a similar solution, but my source file is in a data lake, not in a database.
    So can we do the same steps for a file in the data lake as well?

  • @reallifevideos6770
    @reallifevideos6770 3 years ago

    👌 Nice explanation and clear example. And yes, I agree with Anand Kumar that this was a real-world example.
    Just one suggestion, if I may: if the variables and activities were named properly, it would be easier to understand, i.e. something meaningful instead of variable8, variable9, Lookup1, etc. This would also set a great example and help people learn how to use naming conventions in ADF.

    • @AllAboutBI
      @AllAboutBI  3 years ago

      Valid suggestion. I'll keep it in mind. Thanks

  • @AshwaniAshish
    @AshwaniAshish 3 years ago

    That's awesome 😊 Do keep bringing more real-time scenarios like these.

    • @AllAboutBI
      @AllAboutBI  3 years ago

      It's you who gave the idea of implementing it, Ashwani saab 😊

  • @avinashkale5019
    @avinashkale5019 9 months ago

    Superb video, mam!
    Can we handle the same thing when the source is a file present in a blob container?

  • @sidneyhodieb5755
    @sidneyhodieb5755 1 year ago

    Thanks a lot!
    In my case I have an XML file;
    when I do a lookup it goes straight into an error.
    I can't get the number of rows because the file is too big to be read,
    and I don't want to use a data flow because it's too expensive for the client.

  • @sreelakshmins2491
    @sreelakshmins2491 2 years ago

    Hi,
    Very useful video.
    Can you please help me with a solution for overcoming the Lookup activity output count limit when connecting to Salesforce Service Cloud? The SOQL Bulk API query doesn't support the OFFSET function. Is there any function other than OFFSET that we can use while connecting to Salesforce from ADF?
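
A common workaround when the source query language has no OFFSET, as the comment describes for SOQL Bulk API queries, is keyset pagination: order by a unique key and resume after the last value seen. This is a sketch of that idea, not the video's method; `run_query` is a hypothetical stand-in for whatever actually executes the query against the source:

```python
def fetch_all(run_query, page_size=2000):
    """Pull all rows page by page without OFFSET (keyset pagination)."""
    rows, last_id = [], ""
    while True:
        # Order by a unique key and resume after the last seen value
        # instead of skipping a row count with OFFSET.
        soql = (
            f"SELECT Id, Name FROM Account "
            f"WHERE Id > '{last_id}' ORDER BY Id LIMIT {page_size}"
        )
        page = run_query(soql)
        if not page:
            return rows
        rows.extend(page)
        last_id = page[-1]["Id"]  # resume point for the next page
```

In ADF terms, the same pattern would mean carrying the last seen Id in a pipeline variable across Until-loop iterations instead of computing an offset.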

  • @junaidmalik9593
    @junaidmalik9593 2 years ago

    Great scenarios, will be very helpful, thanks so much :)

  • @roshankumargupta46
    @roshankumargupta46 3 years ago

    Thanks for the video, mam.. very helpful!
    Can you also help with how to unit-test table rows and columns, i.e. to check whether the copied table contents are correct? Thanks.

  • @sivasubrahmanyam8471
    @sivasubrahmanyam8471 3 years ago

    Great workaround, madam. Thank you

  • @ShriramVasudevan
    @ShriramVasudevan 3 years ago +1

    Lovely teaching

  • @praveenkumar-jj1or
    @praveenkumar-jj1or 3 years ago

    Hi, could you please help me read a 5 MB JSON file from Blob Storage through the Lookup activity?

    • @AllAboutBI
      @AllAboutBI  3 years ago

      Split the file into multiple small files and apply the lookup
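
The reply's suggestion (split the file, then Lookup each piece) can be sketched outside ADF with a small script. This assumes the large file is a single JSON array; the file naming and chunk size are illustrative:

```python
import json


def split_json_array(src_path, rows_per_file=1000):
    """Split one large JSON array file into smaller files so each
    piece stays within the Lookup activity's size and row limits."""
    with open(src_path) as f:
        rows = json.load(f)
    out_paths = []
    for i in range(0, len(rows), rows_per_file):
        # e.g. big.json.part0.json, big.json.part1.json, ...
        out = f"{src_path}.part{i // rows_per_file}.json"
        with open(out, "w") as f:
            json.dump(rows[i:i + rows_per_file], f)
        out_paths.append(out)
    return out_paths
```

Each output file can then be iterated with a Get Metadata + ForEach + Lookup pattern in the pipeline.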

  • @AnandKumar-dc2bf
    @AnandKumar-dc2bf 3 years ago +1

    Mam,
    can you show one scenario with the ADF U-SQL activity?

  • @muerte9945
    @muerte9945 2 years ago

    Awesome thank you for this

  • @debasreepal6424
    @debasreepal6424 3 years ago

    Hi, I am a huge fan of your videos on ADF. Could you please make the same on Azure Synapse and Azure Databricks too? Thanks in advance.

  • @rafikgyurjyan2410
    @rafikgyurjyan2410 2 years ago

    Thank you, this is really helpful

  • @sinikishan1408
    @sinikishan1408 3 years ago

    Good session, mam. Thanks

  • @Ap-ki-bokka316
    @Ap-ki-bokka316 2 years ago

    Hi, I saw your videos and they're good, but
    could you please provide the if conditions for the divide variable?

    • @AllAboutBI
      @AllAboutBI  2 years ago

      Sorry, I'm not sure what the ask is. Please elaborate

    • @Ap-ki-bokka316
      @Ap-ki-bokka316 2 years ago

      @@AllAboutBI
      Thank you for responding.
      I fixed the if condition on my side (at the n+1 condition).
      One more thing: in the video you posted, the lookup query hits the source server on every loop iteration.
      Could you please tell me whether ADF allows a temp table?
      My plan is to first load the data into a temp table, and later load the target table in JSON format, e.g. 10 rows per JSON file.

    • @Ap-ki-bokka316
      @Ap-ki-bokka316 2 years ago

      Could you please share your contact if possible?
      I have a few queries regarding ADF

  • @விரேவதி
    @விரேவதி 3 years ago

    Very well done

  • @pavanchakilam7713
    @pavanchakilam7713 1 year ago

    Looks like it's not working. In the first iteration the offset is 0 and fetch next rows is 5000; this works fine. In the second iteration, the offset is 5000 and fetch next rows is 6776, so it will fetch 6776 records. It should fetch the next set of 5k, right? Could you please check and clarify.

    • @AllAboutBI
      @AllAboutBI  1 year ago

      Hmm, I'll check and tell you

    • @AllAboutBI
      @AllAboutBI  1 year ago +1

      Anyhow, if you move the pointer to 5000 records and ask it to bring 6000 records, it will still fetch only 5k records, as per the design
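
The behaviour discussed in this thread can be checked with a tiny sketch of OFFSET/FETCH semantics (plain Python, with the row counts taken from the comment): even when the FETCH value is larger than intended, only the rows that actually remain past the offset are returned, so an over-sized FETCH cannot duplicate or over-fetch rows.

```python
def page(rows, offset, fetch):
    # Mimics SQL "OFFSET {offset} ROWS FETCH NEXT {fetch} ROWS ONLY":
    # skip `offset` rows, then return at most `fetch` of what remains.
    return rows[offset:offset + fetch]

rows = list(range(6776))         # pretend the source has 6,776 rows
first = page(rows, 0, 5000)      # 5,000 rows
second = page(rows, 5000, 6776)  # only 1,776 rows remain past the offset,
                                 # so asking for 6,776 still returns 1,776
```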

  • @kalpanamore1530
    @kalpanamore1530 3 years ago

    Hi mam, thank you for this video.
    I have tried this, but in my case it takes 1 hour to execute 5000 records. Please help me with how to optimize the time.

  • @oriono9077
    @oriono9077 3 years ago +1

    Excellent 👍👍

  • @kunalr_ai
    @kunalr_ai 18 days ago

    No comments; please put up the alternative approaches

  • @Ap-ki-bokka316
    @Ap-ki-bokka316 2 years ago

    Hi

  • @DivyaM-r2t
    @DivyaM-r2t 1 year ago

    Hi, how can I get the count of each item in an array in ADF? Which activity needs to be used?
    For example, in the Lookup activity I get the output below:
    "value":[{"id":"1"},{"id":"118631"},{"id":"111637"},{"id":"110848"},{"id":"111636"},{"id":"110801"},{"id":"118631"},{"id":"2"}]
    I need to use an activity that gives me the output below:
    id = 118631 ; count = 2, and the rest of the ids with count = 1
    Can someone kindly help me with the above?

    • @AllAboutBI
      @AllAboutBI  1 year ago

      Hey, is it a lookup from a table?
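
For the per-id count asked about above: ADF has no single built-in activity that aggregates counts per array item (it is usually done with a Data Flow aggregate or a script outside the pipeline). As a sanity check of the expected output, the same logic in Python on the sample from the comment:

```python
from collections import Counter

# The Lookup-style output array from the comment above.
value = [{"id": "1"}, {"id": "118631"}, {"id": "111637"}, {"id": "110848"},
         {"id": "111636"}, {"id": "110801"}, {"id": "118631"}, {"id": "2"}]

# Tally how many times each id occurs.
counts = Counter(item["id"] for item in value)
print(counts["118631"])  # 2; every other id appears exactly once
```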