Went through this video before starting with ADF, and have returned at least 3-4 times since to fill in the gaps. This was a solid walkthrough that has been very helpful in getting my head wrapped around the ADF structure. Thank you for putting this together!
This has been the best video about ADF v2; the other one I saw was only about the Copy activity. Thanks a lot.
Hi, thank you for the video, it was great!
@42:00 @activity("") Can't it be replaced with something dynamic? What if you rename your activity? thanks!
Awesome Video.
Thanks Subhasis! Glad you liked it.
After a lot of searching, I can now say this is the best tutorial for any beginner 😀😀
Best way to navigate the JSON of activity output: put the JSON from the output in VS Code. You'll automatically get the path of what you're looking for, so long as you have the right extension for JSON viewing.
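For example (a made-up fragment, with Lookup1 and LastLoadDate just as placeholder names), paste a Lookup output like this into VS Code:

    {
      "firstRow": {
        "LastLoadDate": "2019-01-01T00:00:00Z"
      }
    }

and the path VS Code shows you (firstRow.LastLoadDate) maps straight onto the ADF expression:

    @activity('Lookup1').output.firstRow.LastLoadDate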
That's awesome Steven, I will check this out and thanks for sharing! -Mitchell Pearson
@@PragmaticWorks No prob. Really wish they would allow more than 40 activities in a pipeline, or at least not count setting a variable value as an activity. My guess is they don't want the JSON payload getting too big. There's got to be a better way.
@@NeumsFor9 Why not see a pipeline as a container and break what your 40+ steps would do into definable sub-tasks in individual pipelines, called by a parent pipeline?
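Roughly what I mean, as a sketch (the pipeline and activity names here are made up): a parent pipeline whose activities are just Execute Pipeline calls to the child pipelines, so each child stays well under the 40-activity limit:

    {
      "name": "PL_Parent",
      "properties": {
        "activities": [
          {
            "name": "Run Stage Load",
            "type": "ExecutePipeline",
            "typeProperties": {
              "pipeline": { "referenceName": "PL_Child_StageLoad", "type": "PipelineReference" },
              "waitOnCompletion": true
            }
          },
          {
            "name": "Run Dimension Load",
            "type": "ExecutePipeline",
            "dependsOn": [ { "activity": "Run Stage Load", "dependencyConditions": [ "Succeeded" ] } ],
            "typeProperties": {
              "pipeline": { "referenceName": "PL_Child_DimLoad", "type": "PipelineReference" },
              "waitOnCompletion": true
            }
          }
        ]
      }
    }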
This is wonderful video training content, please keep it up.
Thanks Mitchell for the great presentation. What about doing the same solution but using a Linked Service for a "file system" data source for CSV file ingestion?
Thanks Mitchell for the great demo.
Hi, how do you load multiple tables from one source into another data store in parallel using Azure Data Factory v2?
Thanks Mitchell, can you also please explain how we can transfer REST API data (in CSV format) into Azure Data Lake?
Thanks Mitchell for this great video. Can you please explain how to do CI/CD of these dynamic pipelines using Azure DevOps? I am able to deploy normal pipelines successfully, but the dynamic pipelines give an error while deploying. The error is about the dynamic parameters of the linked service.
The If Condition expression requires that both the parameter value and the Lookup output be strings; the less-than-or-equals statement cannot compare two dates. Having visibility into the simple stored procedure code used in the Lookup task would have made this clearer.
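If the string-only comparison is the blocker, one workaround sketch (assuming the Lookup returns a LastLoadDate column and the pipeline has a WindowEndDate parameter; both names are made up here) is to convert both sides to ticks, so the If Condition compares integers instead of date strings:

    @lessOrEquals(
        ticks(activity('LookupLastLoad').output.firstRow.LastLoadDate),
        ticks(pipeline().parameters.WindowEndDate)
    )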
Awesome VID! totally awesome!
Hi Mitchell. Great Video. Thank you. Quick question. Have you been able to get any Performance metrics based on big data loads ?
good! keep them coming
Great Content!!
Awesome video
Nice one
It was very interesting and well done. Hopefully you can do something working with a series of analytics processes. Typical useless and vague Microsoft error messages.