Keep it up, excellent content
Thank you 😇
Hi Maheer, a small correction at the 6:10 timestamp.
You have passed the list of 3 strings into an array, i.e. [ output1, output2, output3 ], as the default value.
Since they are a list of strings, they should be passed in quotes, as [ "output1", "output2", "output3" ].
Otherwise, validation will throw the warning "Parameter value should be a valid array".
Hope this helps the rest of the learners from your channel :)
Yes thanks for correcting it.
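For reference, a minimal sketch of how the pipeline parameter might look in the pipeline JSON once the strings are quoted; the pipeline name here is a placeholder, and OutputFolders is the parameter name mentioned in the comments:

{
  "name": "ForEachDemoPipeline",
  "properties": {
    "parameters": {
      "OutputFolders": {
        "type": "Array",
        "defaultValue": ["output1", "output2", "output3"]
      }
    },
    "activities": []
  }
}

With the strings quoted like this, the "Parameter value should be a valid array" warning should no longer appear during validation.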
Thanks for the video.
One suggestion: please take one example and run it. This will help us understand what output we are getting.
Second that!
Why weren't my engineering lectures like yours..!!
After 5 hours of going through several videos, I finally managed to make sense of the "For each" component using your video.
I am able to append today's date to a file while copying it from one blob to another.
Zillion thanks.
A small suggestion: after showing all this, you could have simply run the pipeline to show what happens, and in the process you might have caught the array quotes issue.
Thank you for the kind words 🙂. At the time of making these videos I didn't have a proper Azure subscription and didn't want to spend money on executions; I was also worried about how much it would cost. But in all videos after the initial few, everything has been run. Sorry for the trouble in the initial videos.
Hi Maheer,
Thank you for creating such a good video, but one thing I would like to point out: when you create a parameter like OutputFolders, the value should be within double quotes, like ["output1","output2","output3"], otherwise it throws an invalid parameter value error.
I had the same issue. Had to look it up and solved the problem.
Maheer did not run the pipeline; he just explained the blueprint.
waste fellow
Without running the pipeline, how do we know whether it's working or not?
Thanks a ton, Maheer. This is one of the most informative playlists I have viewed on YouTube.
Thank you 🙂
I like the way you explain. Very informative and helpful. Thanks a bunch!!
Thank you for the information, but I have a small suggestion. Your explanation of the activity is excellent, but it would help if you also demonstrated the pipeline execution and the output, which would be useful for beginners.
Hi Maheer, the way you explain the concepts is understandable. Awesome sessions, and we're 💯% satisfied. Thank you so much.
Welcome 😊
Can we use @{item().OutputFolder}, i.e. the output from the pipeline parameter, on the 'Copy Data Sink' part instead of only @item()?
In the pipeline parameter, you have given the value as [Output1, Output2, Output3], but I believe it should be in double quotes: ["Output1", "Output2", "Output3"]. Correct me if I'm wrong.
Short and clear videos. Thanks, Maheer.
Welcome 🤗
Do you have any theoretical material for what you are teaching?
What if I want more than 10 outputs? Do I need to give ["output1","output2",...,"output10"]?
Hello, instead of hardcoding the array, is it possible to pass the folder values dynamically?
How to handle this case? Example: file1.xlsx > (sheet1).json, (sheet2).json.
I don't know how many files are in the input folder, and I don't know how many sheets are in each file, but each output should be named after the sheet (within the xlsx) with a .json extension.
Your classes are really useful! Just keep making them!
Thank you a lot!
Is it possible to put all of the files in the same output folder? I tried adding another parameter in the pipeline, outputFolder, as a string value, then created another parameter, outputFile, of array type. Instead of creating 3 different files in a single folder, it created 3 separate output folders, one for each file.
I have been going through these videos for the last week. Your explanations & examples are really very helpful to everyone. Thank you @wafastudies
Thank you for your kind words 🙂
Hello, how do I transform multiple files in a single dataflow under the ForEach?
Where has the array example [1,2,3,4,5,6,7] vanished to? If it exists, can you share its number in the list?
Hi Maheer, on this ForEach topic, can you provide a real-time demo? The demo you provided is not explained thoroughly enough.
Which of your playlists should be watched for the Azure Data Engineer role? Thanks in advance, bro.
ADF, Synapse, Databricks
@@WafaStudies Thanks brother
Inside the container I'm not getting the names like output1, output2, output3.
Could you please execute your pipeline activity once? Then we can understand clearly.
Hi, when I did this I got an error message while publishing: "Parameter value should be a valid array". Can you help me sort it out?
The array is not in the correct format, I guess. Please share your format in a comment.
@@WafaStudies Thank you for the reply. Your videos are great. I got it correct when I went through someone's comment.
["output1","output2"]
Should the parameter be an array? Can this copy already existing folders?
Hi Maheer, firstly, big thanks for your great efforts; I hope your channel grows exponentially. Can we try a nested ForEach activity in ADF?
Thank you, Imran, for the kind words. No, we can't use a nested ForEach in ADF. Currently it's not supported.
Many, many, many thanks, friend. Great video.
Thank you ☺️
I don't see the output in my container. My debugging went well. What's the problem?
How do we handle the SSIS object type in SSIS?
Good explanation... keep it going!
Thank you 😊
Please upload more videos, e.g. on incremental files coming into blob storage and how to store them in Azure SQL, and vice versa.
Super Video, Great work
Great Work!!
Parameter value should be a valid array
When I validate the pipeline, it shows the above parameter error.
It should be ["output1","output2","output3"]
@@maheshgaddam5729 Thanks :) You are right. After changing this, I got the output.
In every video you just show the flow, but you don't execute the pipeline.
Good work...carry on
Thank you 😊
I am getting an error saying "Parameter should be a valid array". Please help me with this.
It should be ["output1", "output2", "output3"] since the Folder names are list of Strings
Thank you, sir. Great work.
Welcome 🤗
Good work. I want to copy each record of the table separately. Does this concept work? Would you mind sharing your e-mail for further communication about it?
very informative
Thank you 🙂
Tried it, but it didn't work. After adding the value to the folder name (@item()), it's showing as invalid.
To access items in the ForEach array, you need to use @item().
@@WafaStudies I did that. Please check your video; it is showing invalid for you too (when you gave the value for the folder name, it showed as invalid).
Thank you for your videos; they are helping me in my career.
@@prudhviraj3315 You should define the array parameter like ["output1","output2","output3"]; only then will it work. I tried it and it is working fine. Thank you.
@@sateeesh1000 I don't see the output in my container. My debugging went well. What's the problem?
@@sateeesh1000 Thanks Sateesh. You saved my time.
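For anyone hitting the @item() question in this thread, here is a minimal sketch of how the ForEach and the sink folder expression might be wired together in the pipeline JSON; the dataset names, the folderName dataset parameter, and the source/sink types are placeholder assumptions, not taken from the video:

{
  "name": "ForEach1",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@pipeline().parameters.OutputFolders",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyToFolder",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
        "outputs": [
          {
            "referenceName": "SinkDataset",
            "type": "DatasetReference",
            "parameters": {
              "folderName": { "value": "@item()", "type": "Expression" }
            }
          }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}

Here @item() is the current string from the OutputFolders array, so it can be passed straight to the sink dataset's folder parameter. If the array held objects instead of plain strings, you would pick a property with @item().OutputFolder, or @{item().OutputFolder} when it is embedded inside a longer string.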
Good demo
Thank you 😊
Please show me a video on how to delete files smaller than 50 KB in blob storage.
kindly check below
ruclips.net/video/gxqiRu9w8EQ/видео.html
Why do you not run the pipeline?
Only if you run the pipeline will we get to know whether it's working correctly or not.
Thank you Sir
Welcome 😊
Very confusing 😢
I can't understand ANYTHING.