This is really informative. May I ask if there is a way to send a text file to an Office 365 account as an attachment? Could we add a trigger on that as well? Let's say if the txt file is updated then send the attachment, else do nothing? Thank you!
Hey, I want to perform a delete activity in ADF but first want to ask for user permission. For example, the user gets an email or alert saying the pipeline has failed and asking "do you want to delete all data?". If the user opts for yes, then the delete activity is performed. Can you help? I have tried "send email with options" in Logic Apps but it's not working. I want my pipeline to run based on that answer: if yes, delete; if no, don't delete.
Hi Adam, your tutorials are excellent. Can you guide me on how to connect ADF and Snowflake? I would like to copy flat files from Azure Blob to dynamic tables in Snowflake.
@@AdamMarczakYT I tried the linked service for Snowflake. By creating a SAS token URL for the blob storage, I established a connection between Snowflake and Azure Blob Storage. However, the CSV files read using the copy activity couldn't be converted to dynamic tables in Snowflake; the files only get copied when the appropriate table pre-exists in Snowflake. Is there a way to create dynamic tables for the appropriate files?
Thank you for this, this is just the approach we need for our project. I am having one small issue though. If I use "@{activity('Execute Pipeline').output}" for "message", I get this error: {"error":{"code":"InvalidRequestContent","message":"The request content is not valid and could not be deserialized: 'After parsing a value an unexpected character was encountered: p. Path 'message', line 3, position 14.'."}}. If I use "@{activity('Execute Pipeline').error.message}" for "message", I get this error: {"error":{"code":"InvalidRequestContent","message":"The request content is not valid and could not be deserialized: 'After parsing a value an unexpected character was encountered: S. Path 'message', line 3, position 52.'."}}. Wondering if I missed a step somewhere.
My pleasure! You probably missed some step, like linking error path (red line) to error message. Your error states that there is no property error.message on the object you are referring to.
@@AdamMarczakYT It looks like they don't allow us to pull the error messages from the pipeline any more. Found this as I've been researching it (scroll down to the comment from douglaslMS): github.com/MicrosoftDocs/azure-docs/issues/18481 Going to try to see if I can capture output from Azure Monitor as they suggest.
It worked in the past. I don't think they would remove this feature; in my opinion they are talking about returning the error as the output of an ADF pipeline, which isn't supported.
Nice work Adam! Could you please make a video on how to check for 0 KB CSV files / zero-row records from a source, and trigger an email if there is a zero KB file / zero records in the source, in Azure Data Factory? Thanks in advance.
Hi Sandeep, if you are aiming to be a data engineer then these will be your everyday challenges to solve. A typical data engineer should understand the technology, be able to search documentation online, explore, play around with it, and finally use those skills to combine and build a working solution. I can't create a video on every single scenario. You already stated what needs to be done; split it into separate items and search for each:
1. How to check file size with ADF: will probably point you to the Get Metadata activity.
2. How to create an "if" in ADF: will point you to conditions.
3. My video on bulk load shows how to iterate.
4. My video on emails shows how to send an email.
Just combine these and build the solution. You got this.
Thanks much for the video Adam. I have implemented this in my solution, and I took it a little further by passing the values to the elements using a Lookup activity over an Excel file (which has columns for the JSON element name and value). For example: "Title": "@{activity('Lookup1').output.FirstRow.Title}". This way I can modify the values outside the pipeline without touching the code. But the problem is, in my Excel the title value is: Failed at @{utcnow()}, and ADF is not converting that into a time; it is passed as-is to the email. My email body looks like: Failed at @{utcnow()}. How do I get the evaluated expression value printed in the email? Please help, thanks much.
Hey, thank you for watching. Glad to hear that you enjoyed the video. Unfortunately I can't help you debug your solutions. One thing that comes to mind is to ensure that you clicked "add dynamic content" and put this in the expression window, not in the field itself. Otherwise it's treated as text.
Am I the only one struggling because Microsoft has changed the Logic Apps UI? First of all, I needed to create a workflow, which wasn't the case for Adam. Now I am struggling to find the run history. The pipeline debug was successful, but the workflow Run History is completely empty...
@@AdamMarczakYT Thank you Adam. I am planning to pass the blob path from ADF to Logic Apps and then let the Logic App get the content from blob for the attachment. Please let me know if you have different ideas.
Thanks!!! You are the best! Keep producing these videos. A future suggestion - a video applying SCD techniques in ADF
Thanks Jeff. SCD is surely a good topic, I'll try to get that video somehow on the schedule :) Thanks for watching!
Adam, This is an awesome tutorial to learn sending emails from ADF. Thank you.
Glad you enjoyed it!
Thank you for the great example of how to use logic app and ADF.
My pleasure!
Thanks Adam, this video was a blessing in disguise. Thanks a-lot for posting this video.
Thanks! I hope I don't disguise my videos too much, I want them to be inviting ;)
Adam, Excellent tutorial. Very precise and shown clearly with examples. Thank you.
Glad it was helpful!
All your videos are crystal clear concept explanation and fantastic demo videos including this.
Thank you, appreciated! :)
Very much useful topic and good hands-on eyeopening experience
Hi Adam. Your videos have excellent explanations. This helped me a lot. Don't stop! Continue and go ahead! Congrats!
Regarding parent child handling of email success/failure subtly makes me appreciate event handlers in SSIS but also the parent-child package reusability story as well......without the complexity of bubble up behavior and scopes, of course!
I agree. Flow-control is something I definitely hope that MS will expand on in the future. Same with no 'terminate' block for 'throw error' scenarios. ;)
My favourite channel to learn Azure.. Thanks Adam for awesome videos on Azure .. keep it up.. my best wishes to you :)
Thanks a ton!
Thanks! I used the same logic in azure synapse but had issue to capture dynamic error message from data flow. The fix for this is : "@{ replace(activity('Execute Pipeline1').error.Message,'"' ,'' ) } ",
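To sketch how that fix sits in the Web activity body (the activity name 'Execute Pipeline1' and the 'message' field are taken from the comment above; this is what you would type in the body textbox, so it only becomes valid JSON once the expression is evaluated):

```
{
    "message": "@{replace(activity('Execute Pipeline1').error.message, '"', '')}"
}
```

Stripping the double quotes keeps the interpolated error text from breaking the JSON payload that the Logic App deserializes.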
Hi Adam. Awesome content with real-time use case 👏
Glad you liked it! Thank you :)
Awesome video. I love it. Thanks for the tutorial. I can't wait to test this.
Glad it was helpful! Thanks :)
Wonderful work Adam! This is exactly what I was looking for.
Perfect! Thanks for stopping by!
Hi Adam, your tutorials are excellent. Can you do a tutorial on using Git and moving ADF code from development, to test, to production?
Definitely a topic on the list. It's a bit complicated topic so I need to figure it out couple of items, but I hope to get around to this topic soon. :) Thanks for stopping by!
Hi Adam, can you please explain how we perform version control of ADF pipelines and how to deploy an Azure Data Factory pipeline into dev, UAT and prod?
@@AdamMarczakYT We are now trying to figure that one with DevOps... hard piece of cake. Greets from Poland ;-)
Great video ... One question is it possible just one email for several pipelines I would like to have daily an email with the pipeline's last status failure or success ??
I have the same requirement. Did you find a solution?
Thank you Adam for such a crystal clear demo. It really helped me
Hi Adam, this is a super clear and helpful tutorial. Thanks for posting it!
You're very welcome!
Hi Adam, thanks for the video, it has valuable information. The way you illustrated it is really awesome. I really appreciate your efforts. Requesting you to do more videos on data engineering 🙏🙏
Awesome, thank you!
@Adam, my question is how to send a notification when a pipeline's execution duration is >10 mins within a data factory. Your help is much appreciated.
Thanks, Adam! Great tutorial. What do you recommend for outgoing notification emails: an Office 365 service account or an SMTP relay service?
Sometimes it's more a question of what your company policy is. In many enterprises SMTP is the way to go; I typically use the one that works best for my scenario. If I have an SMTP relay I use that, because an O365 account requires an Exchange license.
Excellent demo
Cheers! :D
Hey Adam.. Thanx a lot for such a cool and clear explanation!!
My pleasure!
Damn! I'm from the Power Platform world. Nice demo.
One thing:
on the technical account for the Logic App, 'password never expires' should be configured to maintain consistency of the Outlook connection.
Your videos are very helpful, and your way of presenting and explaining each and every topic makes everything so simple.
Thank you Adam, this is what I was looking for at a starting level.
One question: can we get the pipeline start and end time in any way in ADF?
Thank you so much 🙂
Excellent Video Adam. Thanks
Thanks for this my friend!!
No problem 👍
Hi Adam, I ran across this video, thanks for sharing. I wanted to know: instead of a pipeline parameter such as "example": "@{pipeline().parameters.example}", can you use a variable? Something like "example": "variables('example')"? I haven't been able to make it work, though. Let me know your thoughts or where to look for help.
Awesome video! Relevant, well designed.
Wish there were more content like yours on youtube and less Flat Earthers!
Cheers from Brazil.
Much appreciated!
Really appreciate your videos, such good quality!!
I appreciate that!
Hi Adam,
I tried to reproduce your scenario. Unfortunately I encountered problems with the following line: "message": "@{activity('POC Alerting Failure').error.message}".
The activity 'POC Alerting Failure' is a pipeline, and the mail won't be sent. When I replace the pipeline with another activity like a Lookup (which produces a failure), then the mail is sent correctly.
Do you maybe know what has changed here?
Your video helped me a lot. Thank you very much.
Hi Adam,
how can I send failed ADF notifications to multiple email addresses, a DL, or all email IDs within a resource group?
Hi Adam, Great video! If I wanted to send a different email (different body of the email) for different pipelines, would I need more than one logic app?
Or you can parametrize the body and construct it in ADF, but be careful with this so that no one can impersonate you :) For safety reasons I wouldn't do that.
Excellent video man. Great work. Thank you.
Much appreciated!
Excellent stuff, Adam! Invaluable, better than online courses I paid for (just pronounce 'debug' as 'dee-bug' =) ).
Wow, thanks! Sometimes I forget, and then I pronounce it wrong for the rest of the video :D
@@AdamMarczakYT You speak great, very clearly.
Well done! Thanks Adam for the good walk-through exercise :)
My pleasure!
Hi Adam!! Thanks a lot for your tutorial!! I'm having a problem: I have the 2 pipelines, however when I run the DEMO-PIPELINE, the master (email) pipeline is not executed. I followed your whole tutorial but I'm not able to make it work. The email pipeline is invoking the DEMO-PIPELINE; should I do something else, or add something else on the DEMO-PIPELINE side?
Trigger the master pipeline, not the demo pipeline. The parent pipeline won't be called if you run the demo pipeline directly; that's why it's called master.
Hello Adam, thank you for making these videos. One request: could you please make another video about how to add attachments to these notification emails? The attachment could again be a document from blob storage.
You can use the Logic Apps blob connector to get the file content and add it as an attachment to an email.
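A rough sketch of the Logic App side, assuming the Office 365 Outlook 'Send an email (V2)' action and a blob action named 'Get_blob_content' (both names are placeholders here); the email action accepts an Attachments array of Name/ContentBytes pairs:

```
"Attachments": [
    {
        "Name": "report.csv",
        "ContentBytes": "@{base64(body('Get_blob_content'))}"
    }
]
```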
How do you send email using Outlook if you have a Microsoft Exchange Outlook account, assuming Office 365 says the REST API is not enabled?
Thank you for your video! May I ask a question, you said it wasn't secure to send the recipient as a part of the JSON body. But isn't the HTTPS body by definition encrypted? Would a malefactor still be able to see the contents of the request?
It's encrypted, but if you use Logic Apps like this then it's a publicly available URL which, if exposed, can be used by anyone, anywhere in the world. So if someone got hold of it, they would be able to use that URL to send emails to anyone they want. Sometimes it's an employee who commits code to a public repo by mistake, etc. Things happen. It's a low-risk thing, but why take the risk?
@@AdamMarczakYT Thanks for the quick response! I see what you mean. But the malefactor would be able to use the app to send emails only if (s)he knows the JSON structure, that there is a field called "recipient". Maybe it's wise to use code names for the fields instead.
Excellent!! Clear and precise
Glad you think so!
Hi, I tried this and I am getting an error: 'After parsing a value an unexpected character was encountered: \"'. When I look at the input I see \ before every ":". Any idea what the reason may be?
Hey Marcin. I've run this demo like 10 times, so I'm very positive it works. That means you've made a mistake somewhere. My advice: it's best just to redo the last few steps; sometimes it's easier to find a mistake this way.
Good afternoon, your explanation was fantastic, but I think it is insane not to have a direct component for shooting. If you have the programming of only one trigger, could you tell me? thankful
A direct component for shooting? Not sure what you mean by that :) But thanks for watching, glad you enjoyed the video. Also, "If you have the programming of only one trigger, could you tell me? thankful": I'm not sure I understand this question either. Could you please rephrase it?
Much informative, thanks Adam!
My pleasure!
Hey Adam, there was an error while logging into the Outlook account. The error states: "Failed with error: 'The browser is closed.' Please sign in again." Can you help me with this? I tried with my corporate mail as well, but it's not working.
Cool stuff! One question - how about instead of creating a wrapping pipeline for error reporting, chain error email action on success email action with dependency condition 'skipped'?
This could potentially work, but it would be tricky to figure out which block raised the error, and it would require custom logic to get the error message. It would increase the effort needed when modifying pipelines and could result in missed error messages, especially if you have many parallel branches in one pipeline. But for simple cases you could potentially do it.
Hi. Is an Outlook email required to do this? Can I use my corporate Gmail to send the message?
Cannot be better than this
Great to hear!
Great tutorial !
Hi Mark,
I've been following your videos. Thank you for the rich content.
Just one doubt about notifications: in the email, your name can be seen in the From address. Is there a concept of a service account for configuring the From email address, just like SQL Server Agent, so that we can avoid showing an individual email address?
Much appreciated.
Excellent!
Thank you! Cheers!
Hey Adam, thanks for the great content. My problem is that I want to compare two column values in Azure PostgreSQL and send an email notification if there is any extraordinary mathematical difference between them. Which tool should I use on Azure? I am quite confused. Any help would be appreciated. Thanks in advance.
interested in this
Excellent tutorial..
Hi Adam, thanks for such an informative video. Could you please suggest how we can configure sending an email if any pipeline gets stuck/queued or runs long? Thanks in advance!
Hi Priya, may I know how you solved this problem, please?
Awesome video as always !!!
Thanks again!
Thanks Adam!
My pleasure!
Excellent one, thanks very much. Do you have any video on reading multiple csv filenames from Azure blob storage to databricks using pyspark or python.?
Check out my Databricks tutorial. I have a few samples there using Python and Scala. The difference between Python and Scala for basic tasks is so minimal that you can easily grab my Scala scripts and, with minor changes, use them for Python.
Hey Adam, great work as usual. Quick question: do you know any tool to perform database sharding on Azure without using Cloud Services?
Hi, thanks! Not sure I understand: what do you mean by using Azure but not Cloud Services?
@@AdamMarczakYT Right, I didn't clarify that; I meant the Cloud Services PaaS docs.microsoft.com/en-us/azure/cloud-services/cloud-services-choose-me. I've seen that I can only use this tool to perform sharding (split-merge), but that's going to be deprecated in the near future.
Where did you find the info that this will be deprecated? Nowhere in the documentation does it say so.
Can I use Functions instead of Logic Apps?
Hi, I'm getting this error while triggering the Logic App:
Invoking endpoint failed with HttpStatusCode - '404 : NotFound', message - 'The requested endpoint (url) does not exist on the server. Please verify the request server and retry.'
Your guidance would be a great help.
Hi, I need to add the recipient email address dynamically, possibly more than one. But not hard-coded: I need to pass it to the parameter from a CSV file or from a table. Can you help me with this?
You can use Lookup action and pass that output to Logic Apps as one of the parameters.
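As an illustration, the Web activity body could pick the address up from the lookup output; the activity name 'LookupRecipients' and the 'Email' column are made up for this sketch:

```
{
    "recipient": "@{activity('LookupRecipients').output.firstRow.Email}"
}
```

For multiple recipients, one option is to store a semicolon-separated list in a single column, since the Outlook connector's To field accepts that format.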
Hey Adam,
can we write integration test cases for the Logic App and pipeline? If yes, please could you share something?
Thanks in advance.
Not that I'm aware of at this time :( Good question though!
Simply awesome, thanks a ton Adam. Can you please help me with how to send an email notification in an Azure DevOps release pipeline if one stage fails? For example, I have QA, DEV, DEV1 stages and I want to send an email if any of these stages fails for any reason. I don't see any in-built option in Azure for specific stages. Your guidance will really help me.
Thanks! Check your notification settings because normally you should get emails when pipeline fails docs.microsoft.com/en-us/azure/devops/notifications/manage-team-group-global-organization-notifications?view=azure-devops?WT.mc_id=AZ-MVP-5003556
Can we do this with Azure alerts?
How much will it cost to use Azure Logic Apps to send the ADF pipeline failure email notifications?
We had this Logic App approach configured several months ago by a contractor, with just 3 ADF pipelines running once per day (so 3 total per day), only sending on failures, which was only a couple per month. We have 3 subscriptions for our different deployment levels (dev/test/prod), and the Logic Apps, purely from the service charges (App Service Plan), have cost us $500 USD/mo. The pricing is a little misleading, as the calls/actions themselves cost _very_ little, but just having the app exist and listen costs quite a lot.
Hi Adam, could you please explain how to add a file processed in the pipeline as an attachment to the email?
Thanks a lot! Great tutorials!
You probably would need to save output to blob storage and pass the blob path to a logic app and in logic app read the file using path and add it as attachment.
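The ADF side of that hand-off could look something like this; the 'attachmentPath' field and the 'output' container are hypothetical, and the Logic App would read the blob at that path and attach it:

```
{
    "message": "@{activity('Execute Pipeline').error.message}",
    "attachmentPath": "output/@{pipeline().RunId}.csv"
}
```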
@@AdamMarczakYT Thanks Adam for your quick reply, I appreciate it! I am keeping the file in blob storage. I'm just struggling to pass the blob storage path as a parameter in the web activity settings body section. Is it possible to add an "Attachment" element to the JSON?
Hi Awesome Adam. I tried using @activity("activity_name").error.message to send the error message, but it gives an error: "The request content is not valid and could not be deserialized... After parsing a value an unexpected character was encountered... Path 'message'"
Hard to say without looking at it. Maybe you've made a mistake somewhere. Check the output of that activity in the debug view to verify that the error and message objects are there. Sometimes it's faster just to delete this and try again; it's also good for learning. :)
@@AdamMarczakYT Thanks, I'll check again. For now I hard-coded it.
@@avnish.dixit_ Please try pasting your body content into the body text field box WITHOUT clicking on 'Add Dynamic Content' in web activity calling Logic App
Super! What about this scenario: I am importing some data, and imagine there is a column named "Running" where each row is True or False. How can I send an alert email in case the "Running" column is False? Is there any way to do this? Thank you, Adam!
I'd probably try using Lookup to source data and create foreach with conditions to send emails. Note that lookup should be used for small datasets. Everything depends on details! Thanks for watching!
That was a great knowledge share.
Can you give me some ideas on using Azure SQL as a Data Factory source with an Excel file as the sink, uploaded to SharePoint?
How can this be achieved?
Thank you Adam for this great video. I have followed the whole video but am facing one issue: the color coding is not working in the email body.
Not sure, it worked for me last time I checked. Please check the video and ensure all steps in the logic app were done the same way, especially passing the HTML body as a variable. If it still won't work, let me know.
@@AdamMarczakYT Thank you. Now it's working for me. I just added a semicolon after the color and it started working.
Hello Adam, this is very helpful. But I am getting an error when I use the previous activity's error message syntax.
Make sure to pass proper JSON to Web Activity like
{"Message":"@activity('Execute Pipeline').error.message"}
Remember that with every tutorial I provide samples on github. Link is in the comments so you can quickly copy and paste the templates and request body.
How do I include the error messages in alerts?
Hi Adam, thank you for this tutorial. Can we add an attachment to the email, such as a file or a query result?
You can; logic apps allow attaching files to emails. Check the docs for details :) Although if the query result were big, I'd probably save it as a CSV on blob storage, generate a link to it from the logic app, and include that in the email. Small query results could be sent directly to the logic app in the body.
seriously good video!
Thank you Roger!
How do I send an Outlook email from a logic app without the 'From' field?
Is it possible to set up email alert for any pipeline failure in ADF?
Yes Satish, it can be done; jump to 21:14 in this video for more details.
Hi Adam, how do I send an email with an attachment (a blob storage file)?
Send email allows sending attachment. Check some blogs/docs for examples stackoverflow.com/questions/51473878/azure-logic-app-how-to-send-an-email-with-one-or-more-attachments-after-gettin
This is really informative. May I please ask if there is a way to send a text file to an office 365 account as an attachment? Could we add a trigger on that as well? Let's say if the txt file updates then send the attachment, else do nothing? Thank you!
Hey Chen. Yes you can. Check this guide, it shows briefly how to add attachment to email via logic app.
@@AdamMarczakYT Thanks Adam, which guide you are referring to?
Uhhh link is missing, here is the guide: docs.microsoft.com/en-us/azure/logic-apps/tutorial-process-email-attachments-workflow :)
Hey,
I want to perform a delete activity in ADF, but first I want to ask for user permission.
For example, if a pipeline fails, the user gets an email or alert asking: do you want to delete all the data?
And if the user opts for yes, then the delete activity is performed.
Can you help?
I have tried to use "send email with options" in a logic app, but it's not working.
I want my pipeline to run based on that answer:
if yes, then delete;
if no, then don't delete.
You can either use webhook activity in ADF or start new pipeline. For approval process you can use logic app with action called send approval email.
@@AdamMarczakYT Yup, it's working, thanks man!!
Hi Adam, what is an alternative to logic apps for sending notifications?
Why do you need an alternative?
@@AdamMarczakYT Since logic apps are chargeable, management is asking for an alternative, sir.
Everything costs money. Check out sendgrid free tier maybe. Or use Microsoft Graph REST api to send email via exchange.
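As a rough illustration of the Microsoft Graph option mentioned above: a minimal sketch (assuming you have already acquired an OAuth access token with `Mail.Send` permission, e.g. via MSAL, which is omitted here) that builds and posts the `sendMail` payload. The recipient address is a placeholder.

```python
import json
import urllib.request

# Real Graph endpoint for sending mail as the signed-in user.
GRAPH_SENDMAIL_URL = "https://graph.microsoft.com/v1.0/me/sendMail"


def build_mail_payload(subject, html_body, recipients):
    """Build the JSON payload shape expected by the Graph sendMail action."""
    return {
        "message": {
            "subject": subject,
            "body": {"contentType": "HTML", "content": html_body},
            "toRecipients": [
                {"emailAddress": {"address": addr}} for addr in recipients
            ],
        },
        "saveToSentItems": "true",
    }


def send_mail(access_token, payload):
    """POST the payload to Graph; HTTP 202 means the mail was accepted."""
    req = urllib.request.Request(
        GRAPH_SENDMAIL_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)  # raises on HTTP error responses
```

This could be wrapped in an Azure Function and called from an ADF Web Activity, sidestepping the Logic App entirely.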
@@AdamMarczakYT thanks Adam
Hi Adam, your tutorials are excellent. Can you guide me on how to connect ADF and Snowflake? I would like to copy flat files from Azure Blob to dynamic tables in Snowflake.
Hey thanks! Did you try using linked service for snowflake? It should work without any problems.
@@AdamMarczakYT I tried the linked service for Snowflake. By creating the SAS token URL for the blob storage, I established a connection between Snowflake and Azure Blob storage. However, the CSV files read using the copy activity couldn't be converted to dynamic tables in Snowflake; the files only get copied when the appropriate table pre-exists in Snowflake. Is there a way to create dynamic tables for the appropriate files?
Thank you for this, this is just the approach we need for our project. I am having one small issue though.
If I use "@{activity('Execute Pipeline').output}" for "message", I get this error: {"error":{"code":"InvalidRequestContent","message":"The request content is not valid and could not be deserialized: 'After parsing a value an unexpected character was encountered: p. Path 'message', line 3, position 14.'."}}.
If I use "@{activity('Execute Pipeline').error.message}" for "message", I get this error: {"error":{"code":"InvalidRequestContent","message":"The request content is not valid and could not be deserialized: 'After parsing a value an unexpected character was encountered: S. Path 'message', line 3, position 52.'."}}.
Wondering if I missed a step somewhere.
My pleasure! You probably missed some step, like linking error path (red line) to error message. Your error states that there is no property error.message on the object you are referring to.
@@AdamMarczakYT Thanks for the quick response. Let me check that.
@@AdamMarczakYT It looks like they don't allow us to pull the error messages from the pipeline any more. Found this as I've been researching it (scroll down to the comment from douglaslMS): github.com/MicrosoftDocs/azure-docs/issues/18481
Going to try to see if I can capture output from Azure Monitor as they suggest.
It worked in the past. I don't think they would remove this feature, in my opinion they are talking about returning error as the output of ADF pipeline, which isn't supported.
Make sure to pass proper JSON to Web Activity like
{"Message":"@activity('Execute Pipeline').error.message"}
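One related gotcha: if the error text itself contains double quotes, they can break the JSON body and trigger the same deserialization error. A workaround suggested by other commenters (shown here as a sketch, with the activity name assumed to match yours) is stripping the quotes with ADF's replace() function:

```json
{
    "Message": "@{replace(activity('Execute Pipeline').error.message, '\"', '')}"
}
```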
Nice work Adam!
Could you please make a video on how to check for 0 KB CSV files / zero-row records in the source, and trigger an email in Azure Data Factory if a zero-KB file / zero records are found?
Thanks in advance.
Hi Sandeep, if you are aiming to be a data engineer, then these will be your everyday challenges to solve. A typical data engineer should understand the technology, be able to search the documentation online, explore, play around with it, and finally use those skills to combine everything and build a working solution. I can't create a video for every single scenario. You already stated what needs to be done; split it into separate items and search Google for how to do them:
1. how to check file size with ADF > will probably point you to the Get Metadata activity
2. how to create an 'if' in ADF > will point you to conditions
3. my video on bulk loading shows how to iterate
4. my video on emails shows how to send an email
Just combine these and build the solution. You got this.
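As a sketch of how steps 1 and 2 above might fit together (activity names are hypothetical): a Get Metadata activity with the Size field selected, followed by an If Condition whose expression checks for an empty file, with the email Web Activity placed in the true branch:

```json
{
    "name": "If file is empty",
    "type": "IfCondition",
    "typeProperties": {
        "expression": {
            "value": "@equals(activity('Get Metadata1').output.size, 0)",
            "type": "Expression"
        }
    }
}
```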
@@AdamMarczakYT Thank you, Adam, for the step-by-step explanation. Happy to let you know I sorted out my posted query.
Thanks so much for the video, Adam. I have implemented this in my solution, and I took it a little further by passing the values to the elements using a Lookup activity.
I use an Excel file which has the JSON element names and values as columns. For example: "Title": "@{activity('Lookup1').output.FirstRow.Title}"
This way I can modify the values outside the pipeline without touching the code.
But the problem is: in my Excel, the title value is: Failed at @{utcnow()}
and ADF is not converting that into a time; it is populating it as-is into the email.
My email body looks like this:
Failed at @{utcnow()}
How do I get the evaluated expression value printed in the email?
Please help.
Thanks so much.
Hey, thank you for watching. Glad to hear that you enjoyed the video. Unfortunately I can't help you debug your solutions. One thing that comes to mind is to ensure that you clicked "add dynamic content" and put this in the expression window, not in the field itself; otherwise it's treated as text.
@@AdamMarczakYT 😊👍🏼 Thanks Adam. But really great channel to follow. Thank you
Thank you for sharing!
Am I the only one struggling because Microsoft has changed the Logic Apps UI? First of all, I needed to create a workflow, which wasn't the case for Adam. Now I am struggling to find the run history. The pipeline debug was successful, but the workflow's run history is completely empty...
Thank you
Hi Adam - can we attach a file?
You can attach files in logic app yes. It's a little tricky to do from ADF though.
@@AdamMarczakYT Thank you Adam. I am planning to pass the blob path from ADF to the Logic App and then let the Logic App get the content from Blob for the attachment. Please let me know if you have different ideas.
I think I just pooped.
Hard to say whether the video was that good or that bad, but thanks for watching! :)
Adam Marczak - Azure for Everyone Strong stuff!
Great video! Thanks Adam