You saved my job with this video. Can't thank you enough!
Very complex pipeline. Thanks for this video.
Such are the scenarios that we face in real time!
Thanks.
Nice explanation 👏 Thanks for this real-time scenario. Hope it will be useful for me. Let me try it. Thanks again!
Thank you, I have an additional scenario. Do you have a video where Lookup data is passed as an array into a Set variable activity, and the array is finally written to a JSON file? If so, please let me know. Thanks.
In the iteration you have a warning; it throws an error when I run my pipeline with the same logic.
For example, with the iteration variable @range(1, add(div(int(variables('rowcount')), 1000), 1)) I get:
The function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type.
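That error usually means variables('rowcount') does not hold a plain numeric string at the moment int() runs, e.g. the variable was set to the whole Lookup output object instead of just the count field. A minimal sketch of one way to wire it up, assuming a Lookup named Lookup1 whose query returns a single column named rowcount (both names are placeholders, not from the video):

    Set variable "rowcount" (String type):
        @string(activity('Lookup1').output.firstRow.rowcount)

    ForEach items:
        @range(1, add(div(int(variables('rowcount')), 1000), 1))

Since pipeline variables here are Strings, wrapping the Lookup value in string() when setting the variable and converting back with int() inside the range keeps the types consistent.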
Hi - I am looking for a similar solution, but my source file is in a data lake, not in a database.
Can we do the same steps for a file in the data lake as well?
👌 Nice explanation and clear example. And yes, I agree with Anand Kumar that this was a real-world example.
Just one suggestion, if I may: if the variables and activities were named properly, it would be much easier to understand. Instead of variable8, variable9, Lookup1, etc., use something meaningful. This would also set a great example and help people learn how to use naming conventions in ADF.
Valid suggestion. I'll keep it in mind. Thanks
That's awesome 😊 Do keep bringing more real-time scenarios like these!
It's you who gave me the idea of implementing it, Ashwani saab 😊
Superb video, mam!
Can we handle the same thing when the source is a file present in a blob container?
Thanks a lot! In my case I have an XML file; when I do a lookup, it goes straight into error. I can't get the number of rows because the file is too big to be read, and I don't want to use a data flow because it's too expensive for the client.
Hi, very useful video. Can you please help me with a solution for overcoming the Lookup activity's output count limit when connecting to Salesforce Service Cloud? The SOQL bulk API query doesn't support the OFFSET function. Is there any function apart from OFFSET we can use while connecting to Salesforce from ADF?
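Not covered in the video, but a common workaround when OFFSET is unavailable is keyset pagination on Id: each page filters on the last Id returned by the previous page. A rough sketch as an ADF dynamic query, assuming a String variable lastId and the Account object (both are placeholders); note the Bulk API also restricts ORDER BY and LIMIT, so this typically needs the standard query path. Verify against your connector before relying on it:

    Lookup query (dynamic content):
        @concat('SELECT Id, Name FROM Account WHERE Id > ''',
                variables('lastId'), ''' ORDER BY Id LIMIT 5000')

    Set variable "lastId" after each page:
        @string(last(activity('Lookup1').output.value).Id)

This works because Salesforce Ids sort consistently, so Id > lastId always moves the window forward without needing OFFSET.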
Great scenarios, will be very helpful, thanks so much :)
Thanks for the video, mam.. very helpful!
Can you also help with how to unit test table rows and columns, e.g. to check whether the copied table contents are correct or not? Thanks
Great workaround, madam. Thank you.
Lovely teaching
Thanks sir.
Hi, could you please help me read a 5 MB JSON file from Blob Storage through the Lookup activity?
Split the file into multiple smaller files and apply the lookup on each.
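One way to do the split, as a sketch only: stage the big file through a Copy activity whose sink caps rows per output file, then use Get Metadata + ForEach + Lookup over the small files. The snippet below assumes a delimited-text staging sink; maxRowsPerFile is recalled from the Copy sink settings, so confirm the property name and placement in your ADF version:

    "sink": {
        "type": "DelimitedTextSink",
        "storeSettings": {
            "type": "AzureBlobStorageWriteSettings",
            "maxRowsPerFile": 4000
        }
    }

Each chunk then stays safely under the Lookup activity's row and size caps.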
Mam, can you show one scenario with the ADF U-SQL activity?
I'll try
Awesome, thank you for this!
Hi, I am a huge fan of your videos on ADF. Could you please make the same on Azure Synapse and Azure Databricks too? Thanks in advance.
Sure
Thank you, this is really helpful.
Good session mam.. thanks
Hi, I saw your videos and they're good, but could you please share the If condition for the divide variable?
Sorry, I'm not sure what the ask is. Please elaborate.
@@AllAboutBI Thank you for responding. I fixed the If condition on my side (the n+1 condition placement).
One more thing: in the video you posted, the lookup query hits the source server on every loop. Could you please suggest whether ADF allows a temp table? My plan is to first load the data into a temp table, and later load the target table in JSON format, like 10 rows per JSON file.
Could you please share your contact if possible? I have a few queries regarding ADF.
Very well done
Looks like it's not working. In the first iteration, offset is 0 and fetch next rows is 5000; this works fine. In the second iteration, offset is 5000 but fetch next rows is 6776, so it fetches 6776 records. It should fetch the next set of 5k, right? Could you please check and clarify?
Hmm, I'll check and tell you.
Anyhow, even if you move the pointer to 5000 records and ask it to bring 6000 records, it will still fetch only 5k records, since the Lookup activity caps its output at 5000 rows by design.
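For reference, a sketch of the usual fixed-page pattern, where the fetch size stays constant at 5000 and only the offset moves. It assumes the ForEach iterates over the @range(...) variable from the video and uses placeholder table and column names:

    Lookup query (dynamic content, inside the ForEach):
        @concat('SELECT * FROM dbo.MyTable ORDER BY Id ',
                'OFFSET ', string(mul(sub(item(), 1), 5000)), ' ROWS ',
                'FETCH NEXT 5000 ROWS ONLY')

With @range(1, n), item() is 1-based, so iteration 1 gets OFFSET 0 and iteration 2 gets OFFSET 5000; a deterministic ORDER BY is required for OFFSET/FETCH and keeps the pages from overlapping.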
Hi mam, thank you for this video.
I have tried this, but in my case it takes 1 hour to execute 5000 records. Please help me with how to optimize the time.
Oh is it
@@AllAboutBI yes
Excellent 👍👍
Thanks
No comments; please post the alternative approaches.
Hi
Hi, how can I take the count of each item in an array in ADF? Which activity should be used?
For example, in the Lookup activity I am getting the output below:
"value":[{"id":"1"},{"id":"118631"},{"id":"111637"},{"id":"110848"},{"id":"111636"},{"id":"110801"},{"id":"118631"},{"id":"2"}]
I need an activity that gives me output like this:
id = 118631; count = 2, and the rest of the ids with count = 1.
Can someone kindly help me with the above?
Hey, is it a lookup from a table?
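If it does come from a table, a GROUP BY in the Lookup query itself is the simplest route. Purely with pipeline activities, a Filter activity can count occurrences of one id at a time; a rough sketch, assuming the Lookup is named Lookup1 (names are placeholders):

    Filter activity "FilterById":
        Items:     @activity('Lookup1').output.value
        Condition: @equals(item().id, '118631')

    Count for that id:
        @activity('FilterById').output.FilteredItemsCount

To get counts for every distinct id you would have to loop over the ids and run the filter per value, which gets clumsy; pushing the aggregation into the source query (or a data flow Aggregate) is usually the better fit.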