Azure Data Factory Custom Email Notifications Tutorial

Video Statistics and Information

Captions
Hey guys, this is Adam. In this tutorial I want to show you how to build email notifications for your Azure Data Factory pipelines, both using the built-in functionality of Azure and by extending it with your own custom logic. All of that in today's episode, so stay tuned.

There are many scenarios in Azure Data Factory where you will want to send a custom email notification, like notifying your business users whenever you're syncing the data they uploaded, maybe pulling data out of their CRM systems, or simply letting them know that the reports have been refreshed. A second common scenario is notifications for your development and operations teams whenever something unexpected is happening in Data Factory. At this point in time you have two options to send email notifications from Data Factory: the first is using Azure Monitor alerts, and the second is using a Web activity directly from your pipelines and calling the Azure Logic Apps service. The second way gives you the highest flexibility when it comes to email notifications.

As for Azure Monitor, it's the out-of-the-box Azure solution for monitoring your applications, operating systems and other resources. Each Azure resource streams metrics to Azure Monitor, and you can respond to changes in those metrics by using the alerting capability of Azure Monitor, or integrate further with Azure Logic Apps. The same applies to Azure Data Factory: once you have pipelines created in Data Factory, whenever you run them their metrics are ultimately streamed to Azure Monitor, and each activity, each pipeline, each important event is visible within Azure Monitor. If you review all the available metrics you'll see there are quite a lot: cancelled and failed runs, integration runtime information, and all the succeeded runs. Once you have picked the metric that you want to track, you simply create a condition on that metric, for instance: whenever succeeded pipeline runs is greater than zero in a five-minute window, raise an alert. When raising an alert you can pick the action that you want to execute, and this can be an email, an SMS, a push notification, an integration with Azure Logic Apps, and many others.

A few things to note here. First of all, this is out of the box: Azure Monitor is already there, so if you're using Azure Data Factory you can take advantage of its monitoring and alerting capabilities. Second of all, it incurs very minimal cost; you will probably not even notice it. But there are downsides. First, there's quite limited customizability, because you cannot customize the emails that are being sent, and if you try, you will notice that this customization is quite complex even with Logic Apps, because of the format of the data that is being sent. You should also note that there will be a noticeable delay between the events within your Data Factory and the alerts being sent; we're talking several minutes here. And lastly, the emails that you get out of the box from Azure Monitor are not very readable. A technical team might be able to understand what happened based on such an email, but if you have business users you should not send them this email, because it's very hard to understand what its purpose is and what it really says. I wanted to cover Azure Monitor because it gives you the option of sending emails, but although it's good for out-of-the-box platform monitoring, it's not so good for sending customized email notifications.
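As a side note, if you do want to go the Azure Monitor route, an alert like the one described above (succeeded pipeline runs greater than zero in a five-minute window) can be created with the Azure CLI roughly as follows. This is only a sketch: the metric name, resource IDs and action group name are assumptions you would replace with your own values.

    # Sketch: metric alert on an Azure Data Factory (verify metric name and IDs for your environment)
    az monitor metrics alert create \
      --name adf-succeeded-runs-alert \
      --resource-group my-rg \
      --scopes /subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.DataFactory/factories/my-adf \
      --condition "total PipelineSucceededRuns > 0" \
      --window-size 5m \
      --evaluation-frequency 1m \
      --action /subscriptions/<sub-id>/resourceGroups/my-rg/providers/microsoft.insights/actionGroups/my-action-group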
Custom notifications are our topic today, which brings me to the second option: using Web activities within Azure Data Factory and calling the Azure Logic Apps service. Let's talk about how this works. You simply add an additional step within your pipeline and call the Logic App like a web service, so in this case you are using Logic Apps as a standard web service. Each Logic App that you create will start with an HTTP trigger and then use a step to send an email. The email itself is sent using an external service, because Logic Apps on its own cannot send emails, so you will need to connect to a separate service like, for instance, Outlook or Office 365. There you should set up a technical email account to send your emails (please don't use personal emails for this), and that technical mailbox will send the emails to your users. Once you're done with that, you can additionally use the Logic Apps HTTP response action if you want to return some information back to the pipeline. As you can see, the design is quite straightforward, so let's talk about the benefits. First of all, it's cheap and effective: in just a couple of minutes we'll get this done, and you will be able to send custom email notifications whenever and wherever you want within your pipeline. It's also quite easy to set up, because you don't need any coding knowledge; in a couple of minutes you leverage Azure Logic Apps, which is a visual way of building applications, and you're ready to go. Third of all, this approach gives you full customizability: it doesn't matter what you want to do, once you leverage Logic Apps you can do pretty much anything.

So let's go to the Azure portal, and let me show you what I already have set up. I have a storage account, and inside of it I created containers called input and output. My pipeline simply grabs three files (cars, movies and planes) out of the input container and copies them into the output container. A fairly simple workflow, but I want to show you how it works and then create a custom email notification for it. Second of all, I already have a Data Factory set up, so if I open it and go to Author and Monitor, you will see that I currently have only one pipeline. This is the pipeline from my presentation, called demo pipeline; it has three steps (copy cars, copy movies and copy planes) and only a single dataset, a generic dataset for my blob files. As per our diagram, what we need to do now is add an additional step which will call our Logic App. Go to General, find the Web activity and drag it onto the canvas. Once it's on the canvas, grab the line from the last step, copy planes, and drag it to the Web activity. You might give it a more meaningful name, like send email. What you need to do now, as per our diagram, is call a Logic App, but we don't have a Logic App yet. So go back to the portal, go to your resource group and hit Add to create a new resource. Type logic app in the marketplace to find the standard template for the Logic Apps service, select it, hit Create and give it a name; in this case mine will be send-email, and every other property can be left as default, so simply hit Review and create, and then Create. Creation of a Logic App takes about 10 seconds, so let's wait for that. Once it's done, select Go to resource to open our Logic App.
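For reference, a Web activity wired after the last copy step like this shows up in the pipeline's JSON definition roughly along these lines. This is only a sketch using the demo names; the URL and body get filled in during the next steps.

    {
        "name": "Send Email",
        "type": "WebActivity",
        "dependsOn": [
            { "activity": "Copy Planes", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "url": "<Logic App HTTP trigger URL goes here>",
            "method": "POST",
            "body": { "title": "...", "message": "..." }
        }
    }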
Again as per our diagram, what we need right now is to set up the HTTP trigger within the Azure Logic App. To do that, use the 'Start with a common trigger' panel and select 'When a HTTP request is received'. Once this is done, click Save to save the Logic App, at which point a URL is generated for calling it; this URL is available here, so simply hit copy. Once you've copied the URL, go back to the Data Factory, select your Web activity, go to the Settings tab at the bottom and paste in the URL of the Logic App. Set the method to POST; POST is used because we'll be sending over some details from Data Factory in order to customize our email notifications, and we'll be using the body payload to do so.

Now we need to customize this body, which will be a JSON document, and the data within this JSON is the data that we want to use to build our email. Because I want to send the details of my Data Factory dynamically, I select Add dynamic content here and paste the JSON object that I want to send. I prepared it beforehand because I don't want to spend time writing that JSON now, but what we're sending over is: a title for the email message, a color indicating whether this is a good or bad notification, the Data Factory name that generated this email, the pipeline name, the pipeline run ID, and the time that the pipeline started or finished, depending on what kind of notification we want to send. At this point we are good when it comes to sending data; I'm not yet filling in those properties, but if you hit Finish this will already work and send a POST request to your Logic App.

We also need to make sure that the Logic App expects this as an input. So go back to the Logic App, select 'Use sample payload to generate schema' and paste the same payload that you will use from Data Factory, so use the same sample. That's why I like to prepare the sample body beforehand, so I can quickly paste it here and there and make sure the Logic App gets the same input. When you're done, select Done, and it will generate the schema that the Logic App will expect when receiving this payload.

Now let's go to the Data Factory and send the proper information. Inside the Data Factory, go again to the dynamic content of your payload and start filling it in. For instance, you can set 'Pipeline run finished' as your title; you can set whatever message you want, for instance 'Demo pipeline finished running successfully'; the color in this case is green, because we were able to execute the pipeline correctly. Next is the Data Factory name. Here you can use static values, like I'm doing right now, or you can use expressions. If you want to use expressions, simply use the at sign with curly brackets, and within those brackets you can use Data Factory expressions. Since I need the Data Factory name, and at the bottom I see Data Factory name available as one of the system variables, I simply click on it, which fills in the expression that grabs the Data Factory name for me. You can do pretty much the same for the pipeline name: again the at sign and curly brackets, and within the curly brackets select the system variable called Pipeline Name. Do exactly the same thing for the run ID by selecting Run ID. For the time, I could use the trigger time for my pipeline, or I can use the current date via the date functions; go to date functions, and one of them is utcnow().
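Put together, the body pasted into the dynamic content editor ends up looking roughly like this. This is a sketch: the property names are simply the ones chosen for this demo, while pipeline().DataFactory, pipeline().Pipeline, pipeline().RunId and utcnow() are standard Data Factory system variables and functions.

    {
        "title": "Pipeline run finished",
        "message": "Demo pipeline finished running successfully.",
        "color": "green",
        "dataFactoryName": "@{pipeline().DataFactory}",
        "pipelineName": "@{pipeline().Pipeline}",
        "pipelineRunId": "@{pipeline().RunId}",
        "time": "@{utcnow()}"
    }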
You simply paste it in, hit Finish, and that's pretty much it: now we are able to call the Logic App using this payload and pass information from our Data Factory. Select Debug so we can test the run, then go back to the Logic App; before moving away from the Data Factory screen, remember to hit Save or you will lose what you already created. Select the Logic App name at the top and refresh to see the latest run history for your Logic App. You will see a Succeeded status from the current Data Factory run; select it and you can review what information was passed from Data Factory. Go to the outputs at the bottom and, because you used that sample payload schema inside the first step, the HTTP trigger, you can now easily review the payload that was sent from Data Factory. As you see, the color was green, there is the name of our Data Factory, our message, the pipeline name, the run ID, the current time and a title. Everything looks right, which means we can go back to the diagram and now use this information to send the email using our technical account.

Go back to your Logic App, select Edit to return to the designer, and add a new step; here we can use email. If you type Outlook you will find two connectors, and the main difference between them is that Outlook.com is for private Live ID accounts; if you're using a corporate account, make sure to use Office 365 Outlook, as it's a different licensing. Since I'm doing a demo and only have my private account, I'm using the Outlook.com connector. Select this connector and find 'Send an email', in this case version 2, 'Send an email (V2)'. Once you've selected it, the first thing to do is sign in to Outlook.com: select Sign in and, in the pop-up window, log into your Outlook account. In my case I just need to enter the password because I'm already connected with my private account. Select Sign in and it's ready to use; as you see, setting up the connection is fairly simple.

Now we need to specify to whom we will send the email; in this case I set up my private account as the recipient. Note that I did this on purpose: I did not want to pass the recipient as part of my Data Factory parameterization, because in the end it is very important to secure this Logic App, and exposing the ability to send your email to anyone in the world is very dangerous. Therefore I like to set up Logic Apps with the recipient emails pretty much hard-coded inside the Logic App, so that I can only send the intended content from Data Factory while staying secure. Here we also need to provide a subject, and the subject of our message can be found in the HTTP trigger properties, in the dynamic content on the right. If you click into the subject field you can select the dynamic content and, because in the first step we used the sample payload to generate a schema and we've got that entire schema specified, all of its properties are available as variables later in your Logic App. Therefore I have all the properties like color, data factory name, message, pipeline name and title; the title will be the subject of my email. Inside of the body I will just put 'hello world' to test this. Let's hit Save and check if everything works. To test it, go back to your Data Factory and hit Debug again; this will run the pipeline and send the email. In just a few seconds the pipeline finished running, the call to the Logic App was successful, and now you can go and verify whether you got the email.
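For reference, 'Use sample payload to generate schema' turns a sample body like the one above into a JSON schema for the Request trigger roughly like the following, and the dynamic content tokens used in the subject and body are just shorthand for expressions such as triggerBody()?['title']. This is a sketch based on the property names from this demo.

    {
        "type": "object",
        "properties": {
            "title": { "type": "string" },
            "message": { "type": "string" },
            "color": { "type": "string" },
            "dataFactoryName": { "type": "string" },
            "pipelineName": { "type": "string" },
            "pipelineRunId": { "type": "string" },
            "time": { "type": "string" }
        }
    }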
If I go to the demo email account that I created, I can see the new email, 'Pipeline run finished', with the hello world message, so everything works as expected: we are able to call the Logic App and send the email correctly. The last thing we need to do is customize the message that we're sending. To do that, go back to your Logic App. Here you have the body field, but the body, as you see, has a basic what-you-see-is-what-you-get editor, so we can't do much formatting, and most of the time what you will want is to create HTML-based emails. There's a pretty neat workaround for specifying HTML, because if you were to type HTML tags here directly it would not work: the text would not be bolded, you would just see the tags, because this field is treated as plain text.

What you need to do is add a new step between your trigger and your email. Add an action and search for the Variables actions, because you need to initialize a new variable. Select this action group, select Initialize variable, give it a name (I will call it email body), set the type to String, and provide a value. This value can now be an HTML message, and for that reason I created a small HTML sample that I want to send. First of all we send a header with a color and the title of the message, and I put two horizontal lines in, just for the looks. Here I need to replace the color; as you see it's a standard CSS style, so I can dynamically select the color of my text in the HTML: remove the placeholder and select the color from the dynamic content, then select the title placeholder and pick the title from the dynamic content, so we get a small header with the title. We probably also want to include the basic information passed from Data Factory, like the data factory name, pipeline name, run ID and the time, and do the same thing here: replace the placeholder with the data factory name, replace the pipeline name placeholder with the pipeline name content, the run ID the same way, and the time at the bottom. Lastly, you could add the full message as a separate block just to make it stand out; here I like to use a P tag, a paragraph, and also apply the color, just to be sure the message is nicely exposed in the email, and put the message there. I usually also add a very small note that this email was generated automatically, please do not respond to it, together with a contact email, so that users know who to contact when they receive this email, especially if they get it at unexpected moments or maybe too many times, so they can actually notify you.

Once you have all of that, you can use this variable as the payload for your send email action. Please note that the HTML sample I'm using here will also be available on GitHub, so if you want to pick it up from there you don't have to write it from scratch. Then go to the send email action, remove the entire body, go to dynamic content, go to Variables, See more, and simply select email body. Once you've done that, select Save, go back to your Data Factory and hit Debug again. The pipeline has finished, which means you can go to your email again to find the latest message sent from the pipeline, and as you see it's a pretty nice looking email compared with what you could achieve in the basic editor.
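For reference, the HTML placed in the email body variable follows this general shape. This is a sketch: the @{triggerBody()?['...']} tokens stand for the dynamic content picked in the designer, and the contact address is a placeholder.

    <h1 style="color:@{triggerBody()?['color']}">@{triggerBody()?['title']}</h1>
    <hr/>
    <p>
      Data factory: @{triggerBody()?['dataFactoryName']}<br/>
      Pipeline: @{triggerBody()?['pipelineName']}<br/>
      Run ID: @{triggerBody()?['pipelineRunId']}<br/>
      Time: @{triggerBody()?['time']}
    </p>
    <hr/>
    <p style="color:@{triggerBody()?['color']}">@{triggerBody()?['message']}</p>
    <p><i>This email was generated automatically, please do not respond to it.
    Contact: someone@example.com</i></p>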
You have a nice color that you're able to pass as a parameter from Data Factory, the standard header information, and the details from the Data Factory run that you wanted to send. You can now use this with pretty much any step along your Data Factory pipelines, so you could use the same step in a different way: disconnect it from here and maybe send a notification after the movies are copied, it's as simple as that.

Now, when it comes to good practices, there are a few things you will want to do here. Connecting an email at the end like that is fine, but say you want error-handling emails, so you copy this block and want to send an email whenever something bad happens. Usually what I've seen people do is go here, select Add output, select Failure, and drag a failure dependency to the email activity, which is fine if you have only one arrow. But what I've also seen people do is add multiple failure outputs and drag all of those lines to the same email activity. What is important here is that those dependency lines act as a logical AND, which means every step would have to fail for this email to be sent: copy cars would have to fail, but also movies and planes would have to fail during the copy. In practice this would usually never trigger, because each copy only continues after a successful run of the previous one, in which case you would have to copy the email block over and over. Let me scroll down, auto-align everything and copy one more block; you would have to do something like that, delete this line, drag it here, delete this one and drag it there, auto-align to get a slightly cleaner pipeline, and as you can see, we now have more blocks sending emails than actual logic, which is not a best practice.

My recommendation is to remove the email blocks from within the pipeline and instead nest the pipeline in a parent pipeline, like so: create a pipeline called, for instance, master pipeline, put the name in, use the Execute Pipeline activity from the General section and select your demo pipeline. Use Ctrl+X to cut the send email activity and paste it inside the master pipeline. Now go back to the demo pipeline and clean up the remaining blocks, to make sure there are no email blocks left in your original pipeline. In your master pipeline, drag the success output from the Execute Pipeline step to your web activity (the successful email), then copy that web activity, call it failed email, and drag the Failure output from the Execute Pipeline step to it. That means if anything fails within this pipeline, the failure email is sent.

What you can additionally do here is customize a few options. Go to the dynamic content for the failed run, where you will probably want to change the title from 'Pipeline run finished' to 'Pipeline run failed'. You will probably also want to change the message, and one of the cool things you can do here is grab a dynamic message from your Data Factory pipeline. Remove the static message and use an expression again: add the at sign and curly brackets, and use an expression to get the output of the pipeline. You can find this at the bottom; under activity outputs you see Execute Pipeline1, which is our current step, so select it. As you see, it inserts activity('Execute Pipeline1').output, but since this is an error run, you can type error.message instead of output, to get the message explaining why this pipeline failed. Change the color to red, leave everything else as default, click the Finish button and select Debug to test again. The correctly running pipeline finished, we can go review the email, and everything works as expected.
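The body for the failure email ends up almost identical to the success one; roughly like this, a sketch that assumes the Execute Pipeline activity is named 'Execute Pipeline1' (use whatever name your activity actually has):

    {
        "title": "Pipeline run failed",
        "message": "@{activity('Execute Pipeline1').error.message}",
        "color": "red",
        "dataFactoryName": "@{pipeline().DataFactory}",
        "pipelineName": "@{pipeline().Pipeline}",
        "pipelineRunId": "@{pipeline().RunId}",
        "time": "@{utcnow()}"
    }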
Now we want to test a failure run. In my case I go back to the portal; I can either navigate to resource groups or go from my dashboard into the storage account. Select Containers, go to the input container and delete one of the files; in this case I delete movies.csv so that the copy movies step of the demo pipeline fails, because I want to be sure it fails in the middle of the run. Once you've deleted it, go back to your Azure Data Factory, open your master pipeline and select Debug to run it again. As you can see, our pipeline failed, but the step sending the email was successful, so my expectation is that if I go back to my email right now, I should find a new message called 'Pipeline run failed'. When I review this email, you see not only the content rendered in red, because I was able to parameterize that, but also that the information here is the actual error from the pipeline: the operation on target copy movies failed, with the error message saying that the required blob is missing. So with Logic Apps I was able to customize the emails in such a way that I'm already getting the error message inside the email. If I'm sending an operational email, my team can review it, see what the error and the problem is directly from the email, and start investigating right away, and that was fairly easy to achieve.

Going back to the presentation, let's cover the last few points on the subject. Remember that you can use Azure Monitor if you need basic platform monitoring: if you need that out-of-the-box functionality and just want to set something up quickly, Azure Monitor is fine for that. If you need something custom, or you need to notify your business users, use Web activities with Logic Apps. And lastly, remember about enhancing security, because you're exposing a public endpoint for a Logic App that can send email from your organizational email address. Don't use a personal account here, because anyone who has access to the Logic App would effectively have access to your personal account. Second, Data Factory now supports static IP ranges, so you can add those ranges to the firewall settings on your Logic App to make sure that only Data Factory can call it. Next, you should be a bit careful about being too generic: as you've seen, I didn't add a recipient email as a parameter, because I don't want to expose too many possibilities from that Logic App; I want to be sure that only the recipients I expect will get the emails. Lastly, you can remove SAS tokens (that very long URL) from your Logic App if you use a new feature called Logic Apps authorization policies, so feel free to check it out, and if you're working in a highly secure enterprise environment you might want to use API Management to secure the Logic App even further. All of that depends on what kind of application this is and what kind of calls you are making from Data Factory to the Logic App; remember those options are there for you, but you don't have to adopt them right away.

I think all of us can agree that Logic Apps are a perfect companion for Data Factory, whether you're refreshing Analysis Services, calling external services or just sending emails; it's a perfect service to use and you should definitely take advantage of it. If you liked the video, hit thumbs up, leave a comment down below and subscribe if you want to see more videos, and see you next time.
Info
Channel: Adam Marczak - Azure for Everyone
Views: 32,963
Keywords: Azure, ADF, Data Factory, Email, Notification, Custom
Id: zyqf8e-6u4w
Length: 29min 22sec (1762 seconds)
Published: Tue Jun 16 2020