AZURE FUNCTION ⚡ - Integrate with Azure Queue Storage | Getting Started With ASP.NET Core Series

Captions
Azure Functions provides serverless compute on Azure: it lets us focus on the code that matters most to us, and Functions handles the rest. With Azure Functions we can build web APIs, respond to database changes, process streams, handle message queues, and more. We write less code, have less infrastructure to maintain, and in turn save on cost. In this video, let's write our first Azure Function to process messages coming into a queue, and learn more about how Azure Functions work and how to set this up. Hello everyone, my name is Rahul, and welcome back to this YouTube channel. If you're new here, I make videos on .NET, cloud, and DevOps; I'm sure one of these interests you, so make sure to hit that subscribe button and help me grow this channel. Without much delay, let's head right into Visual Studio and create our first Azure Function. Here I have an existing project that I used for my previous video, where I talked about Queue Storage. If you are new to Azure Queue Storage, I highly recommend checking out that video, which is linked here and in the description below. To give a quick recap of this project: it's an ASP.NET application with the Startup, Program.cs, and the default weather controller. I had added a new POST endpoint which sends messages to an Azure Storage queue. All it does is take in a WeatherForecast object, serialize it, and send it using the QueueClient. The QueueClient is injected via the constructor and wired up in Startup.cs using dependency injection; I showed how to use managed identity to connect the QueueClient, which removes any need for a username and password in the connection. I had also created a WeatherDataService, a background service that runs every 10 seconds to poll the queue for messages: as and when we post messages into the queue, this service polls, reads the message, and processes it.
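The POST endpoint recapped above might look roughly like this. This is a sketch based on the description, not the exact source: the controller and route names, and the use of System.Text.Json for serialization, are assumptions.

```csharp
// Sketch of the POST endpoint described above (names are assumptions).
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private readonly QueueClient _queueClient;

    // QueueClient is injected via the constructor, wired up in Startup.cs.
    public WeatherForecastController(QueueClient queueClient)
        => _queueClient = queueClient;

    [HttpPost]
    public async Task<IActionResult> Post(WeatherForecast forecast)
    {
        // Serialize the forecast and drop it on the Azure Storage queue.
        await _queueClient.SendMessageAsync(JsonSerializer.Serialize(forecast));
        return Ok();
    }
}
```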
In this particular example, all we did was log the message, to simulate processing it. This background service is also registered in Startup.cs. If we want more control over the processing of these messages, scaling on demand, and so on, a background service might not be the best option; a more appropriate choice is an Azure Function, which can run independently, on the same service plan or a different one, and scale separately as well. So let's move this background service to an Azure Function and learn more about Azure Functions. First, let's comment out the background service so it doesn't run on application startup. To create a new Azure Function, go to Solution Explorer, right-click, and select Add → New Project. If you have the right Azure workloads installed, you should see an Azure Functions option; you can search for it or filter using the drop-down boxes. I have this already filtered, so Azure Functions appears in the list; let's select it and click Next. Let's name the function app WeatherDataIngester, because it ingests weather data objects, leave the location as is, and click Create. This prompts a further dialog where you select from the different templates for building Azure Functions. There are several options, starting from Empty, where you build from a blank project, and there are different triggers on which you can base these functions. Triggers are what cause an Azure Function to run: a trigger defines how a function is invoked, and a function must have exactly one trigger. This is the triggering point for the function; it could be an API endpoint, in which case an HTTP request would be the trigger, or you can trigger based on various other events. This is exactly what we get to choose in this particular template
here. As you can see, you can pick a Service Bus queue trigger, an HTTP trigger (for API-based functions), a timer-based trigger, a queue trigger, a blob trigger, and so on. In our example we were dropping messages into Azure Queue Storage, so the Queue trigger is the most appropriate: it fires every time a message is dropped into our queue, which is exactly what we want. Let's select that, and make sure you're using the appropriate .NET version; in this case I'll use .NET Core 3. On the right we have the option to select a storage account: an Azure Function needs a storage account when it runs, which stores the metadata related to the function and its execution. In this case I can choose the Storage Emulator, since I'm running on my local machine, so let's leave it like that. The dialog also asks for a connection string setting name, which is the app settings key that will hold the connection string to the queue this queue trigger will listen on. Let's name it WeatherDataQueue, because that is where the messages are going to come in. If you remember from the previous video, I used the queue name add-weather-data; I'll show this again in the source code, but let's leave it like that for now and click Create. This creates a new Azure Functions project that is triggered whenever a message drops into a queue. The WeatherDataIngester project is now created, with a Function1.cs file, a host.json, and a local.settings.json, which we will soon look at. Even though we gave the function app a name, the function itself is still called Function1.
So let's rename it to something more appropriate: ProcessWeatherData, because this function is going to process weather data messages. If you look here, the function has a QueueTrigger attribute with the queue name and the connection string setting name we specified. This means that any time a message is dropped into this particular queue, on the connection identified by that setting, this function is invoked; that is exactly what the queue trigger handles for us. Here we have a static Run method that takes two parameters. For methods that are recognized as functions, the FunctionName attribute is what marks the function's entry point. Note that the name must be unique within the project, must start with a letter, and has certain constraints on the characters it can contain. The method signature needs exactly one trigger attribute, which we have with the queue trigger, and it can take a few other parameters: a logger or TraceWriter, a CancellationToken, and binding expression parameters. Also note that the order of parameters in the function signature doesn't matter, so you can switch the parameter order around. Coming back to our method, we have one trigger, the queue trigger, and an ILogger being injected in; this is injected automatically by the runtime when the function is invoked. By default the function just logs the queue message it receives: we receive the message as a string, which is the body of that particular message. When we use the QueueTrigger attribute it expects the message to be a Base64-encoded string. You can change this via options, but in this case let's make sure the messages we drop into the queue are Base64 encoded.
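Putting the pieces described above together, the renamed function would look roughly like this sketch (the queue and setting names are the ones chosen in the walkthrough; the exact generated code may differ slightly):

```csharp
// Sketch of the queue-triggered function after renaming Function1.
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessWeatherData
{
    // FunctionName marks the function's entry point; QueueTrigger is the
    // single required trigger attribute, bound to the add-weather-data
    // queue via the WeatherDataQueue connection string setting.
    [FunctionName("ProcessWeatherData")]
    public static void Run(
        [QueueTrigger("add-weather-data", Connection = "WeatherDataQueue")]
        string myQueueItem,
        ILogger log)
    {
        // The runtime injects ILogger; myQueueItem is the message body.
        log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
    }
}
```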
So if we go back into Startup.cs, where our QueueClient is registered, we can see it takes a QueueClientOptions; this is where we specify the Base64 encoding. Looking at the parameters, there is a QueueClientOptions being passed in, so let's name it options and set options.MessageEncoding to QueueMessageEncoding.Base64. To pass these options in, we hand them to the QueueClient after the credential. This makes every message this QueueClient sends Base64 encoded. Let's switch back to Function1.cs and look at the connection setting: it uses WeatherDataQueue as the connection string name for this queue trigger. In Solution Explorer, under local.settings.json, which is like a local configuration file, we can add a value for that connection string. So let's copy the name, come back to local.settings.json, and add a WeatherDataQueue entry along with the connection string for this particular queue. To get the connection string, switch over to the Azure portal and go to the storage account that holds our weather data queue: under Access Keys you can get a connection string, so click Show Keys and copy it. Note that in the previous video I showed how to use managed identity to connect to the queue when sending messages from ASP.NET; here we'll get started with the connection string, and later I'll show how to use managed identity with the queue trigger as well. So let's copy this connection string, come back to Visual Studio, and paste it as the WeatherDataQueue value. When this function runs, it will look up this connection string name.
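The Startup.cs registration with Base64 encoding enabled might look like this sketch. The queue URI placeholder and the DefaultAzureCredential usage follow the earlier video's managed-identity setup; the exact registration in the project may differ.

```csharp
// Sketch of registering QueueClient with Base64 message encoding
// (inside Startup.ConfigureServices; names are assumptions).
using System;
using Azure.Identity;
using Azure.Storage.Queues;

services.AddSingleton(provider =>
{
    var options = new QueueClientOptions
    {
        // Match the queue trigger's expectation of Base64-encoded bodies.
        MessageEncoding = QueueMessageEncoding.Base64
    };
    return new QueueClient(
        new Uri("https://<storage-account>.queue.core.windows.net/add-weather-data"),
        new DefaultAzureCredential(),
        options);
});
```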
It then reads the connection string and uses it to connect to that queue. With that, let's run this project and see it in action. Let's put a breakpoint here and run the application: right-click the solution, select Set Startup Projects, choose Multiple startup projects, set the action to Start for both projects, and click Apply. Now if I press F5, both projects run: one is our Web API, which launches the Swagger endpoint, and the other runs the function in my local environment. I have the Azure Functions Core Tools installed, which comes as part of the Visual Studio installation; you can also get it standalone, so if you haven't installed it, make sure to install it on your local machine. You can see it starting the Azure Storage Emulator, since we selected the Storage Emulator for the function's default storage needs. This is also set up in local.settings.json, where you can see AzureWebJobsStorage set to the local development storage; again, this is the storage that manages the metadata of the running function and has nothing to do with the actual queue we are listening on for messages. The Azure Function is now running successfully and waiting for a message to appear. Let's switch back to our Swagger endpoint, open the POST method, click Try It Out, give the summary as 'first function message', and click Execute. This uses managed identity to connect to the queue and drop a Base64-encoded message. The message was dropped successfully and was immediately picked up by our function, and we have hit the breakpoint; that's because the queue trigger is listening on the same queue the message was dropped into. If you look at myQueueItem, you can see the string message we just passed: the 'first function message' summary.
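The local.settings.json described above might look like this (the connection string value is a placeholder; the WeatherDataQueue key is the setting name chosen earlier):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "WeatherDataQueue": "<storage-account-connection-string>"
  }
}
```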
Let's click Close and continue execution of this function; it logs to the logger as expected, and here in the console we have the log statement saying the queue trigger function processed the message. If we remove the breakpoint and drop in a lot more messages, we can see all of them being processed: click Execute a couple of times, and as the messages arrive, the function automatically picks them up. So we have successfully added a new Azure Function to process the messages coming into the queue. If we switch back to the portal, go to Queues, and select add-weather-data, we can see the messages have also been removed, which means they were successfully processed. Earlier, with the WeatherDataService, we had to explicitly delete the message to indicate we had processed it; in this case the queue trigger does that automatically for us. But it marks the message as deleted only if the function returns successfully: if you throw an exception, the message is not considered processed. So let's add a condition: if myQueueItem contains a certain word, say 'exception', then throw a new exception inside the method, just to simulate an exceptional case for certain types of messages. Let's throw new Exception("Exception found in message"); now the function will not complete successfully any time a message contains the word 'exception'. Let's run this and see it in action. With both the Azure Function and the Web API running, go to POST again, click Try It Out, and drop in a message. This message does not contain 'exception', so it is processed successfully: you can see the message picked up, processed, and marked as successful. If you switch over to the queue and refresh it, you will no longer see the message there.
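The failure simulation added above, inside the Run method, might look like this sketch:

```csharp
// Inside the Run method: throw for messages containing the word
// "exception" so the queue trigger sees a failed invocation and
// the message is retried instead of being deleted.
if (myQueueItem.Contains("exception"))
{
    throw new Exception("Exception found in message");
}
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
```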
The queue trigger knows we have successfully processed the message, because there were no exceptions inside the handler. So let's come back to the API and drop a message with an exception: enter 'message with exception' and click Execute. This drops a message that will throw an exception inside the function handler. Let's continue execution; as soon as we continue, the message gets picked up again, and the exception is thrown again, because the message was reprocessed. If I keep continuing, this goes on for a few more times. If we go back to the console where the Azure Function is running, you can see the exception message logged five times. This is because the default maximum dequeue count on a message is 5: every time we dequeue the message, its dequeue count increases, and once it reaches 5 the message is moved to add-weather-data-poison. By convention the runtime appends '-poison'; this is similar to a dead-letter queue, where messages that could not be processed are dropped. Since we don't have that queue created, the message won't be there yet, so let's go back to the portal, create a new queue named add-weather-data-poison, and click OK. Back in Visual Studio, let's run the application, this time without debugging, using Ctrl+F5, which starts both projects without attaching the debugger. Click the weather forecast POST, Try It Out, enter 'message with exception' again, and click Execute. This drops the message into the queue, which is picked up by our locally running Azure Function. The message was posted successfully, the function picked it up, ran exactly five times, errored out at the maximum limit, and finally moved the message to the
add-weather-data-poison queue. If we go to the add-weather-data-poison queue under Queues, you can see the message is now available there. You can use this to manually monitor the messages, see what has gone wrong, and reprocess them if required; in a real-world application it could be some mishandled scenario or invalid data in your messages, and based on that you would do the appropriate processing. If you want to modify any of these default settings, the dequeue count and so on, you can do that through the host.json file. In the documentation we have the extensions → queues section with the default settings; you can override these if you want. Let's copy that, come back to our source code, and paste it into host.json after the version property. Let's specify only maxDequeueCount, because that's the only setting we are currently interested in, remove all the other settings, and set it to 2, explicitly stating that the maximum dequeue count must be 2. Let's run the application again and see what happens. With both applications running, click Try It Out, specify the summary as 'new exception', and click Execute. This successfully dropped the message to the queue, and our Azure Function picked it up as expected. You can see this time it picked the message up only two times, and we have a log message saying the maximum dequeue count of 2 has been reached and the message will be moved to the poison queue; that's because we explicitly set maxDequeueCount to 2 in our host.json file. If we go back to the poison queue and refresh, we can see the new message, 'new exception', in there as well. Now that we understand how Azure Functions work, let's see how we can use managed identity to connect to these Azure queues. Inside Function1.cs we had specified a connection string setting, whose value currently lives in the local.settings.json file.
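The host.json override described above, trimmed down to just the dequeue count, would look like this:

```json
{
  "version": "2.0",
  "extensions": {
    "queues": {
      "maxDequeueCount": 2
    }
  }
}
```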
This is used while you're running locally; when you have deployed to Azure Functions, you would specify it as part of the application settings or environment variables in the function environment. But let's see how we can avoid the need for this connection string entirely. To use identity-based connections, which include managed identity, you need specific versions of the NuGet packages used for Azure queues: 5.0.0-beta.1 or later. Once you have the latest NuGet package, you can use specific naming conventions to give it the queue it has to connect to; in this example, we'll specify a service URI suffix on the connection string setting name. Let's see how this works in action. If we come back to our function and look at the NuGet packages, you can see we use Microsoft.Azure.WebJobs.Extensions.Storage at version 3.0.10. Let's update this to the latest version first: right-click, select Manage NuGet Packages, go to Updates, and make sure to include prereleases, because, as you saw, it is a beta version we are looking for. Select Extensions.Storage and the latest version, in this case beta.5, which is greater than beta.1, click Update, and click Accept; the latest package is now installed in our Azure Function. Let's collapse this, come back to local.settings.json, and remove the connection string. The queue trigger, now coming from the latest NuGet package, still needs to know which URL to connect to for this queue. We will keep the connection name as WeatherDataQueue, but in local.settings.json we will indicate that it is a service URI: after the name, add a double underscore followed by serviceUri. This is exactly what
was specified in the documentation. Make sure the casing is right: in serviceUri, the u, r, and i are not capitalized, so let's leave it like that. Now let's navigate back to our Azure queues: go to the storage account, the youtubedemo one, and under Queues copy the full URL to this particular queue; come back and use that as the service URI. The Azure Function should now use DefaultAzureCredential to connect to this queue automatically. Earlier, in the Queue Storage video, when we registered the QueueClient in Startup.cs, we explicitly used DefaultAzureCredential and set up access for my user in Azure for the storage queue, which is why it is automatically able to connect; since this function also runs under the same context, it will connect automatically as well. To recap: if I go into the storage account, under Access Control (IAM) you can see the role assignments, which is where we granted the users that can connect to this queue. Here we specified that my user has the Storage Queue Data Contributor role; you can see two IDs, because one is the domain-specific ID and the other is the external user, my Hotmail ID, under which my Visual Studio is running. The Storage Queue Data Contributor role allows us to read, write, and edit messages in this queue. Before we run the application, in local.settings.json let's make sure the queue name is removed from the service URI: all it needs is the URL to the storage account's queue endpoint, ending in queue.core.windows.net, because the queue name, add-weather-data, is already specified in the queue trigger. Let's run both applications and see this in action. Both applications are running successfully, and you can see the function has already picked up a message from the queue. So let's drop one more message: go to POST, click Try It Out, and post a message.
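After these changes, the local.settings.json would look like this: no connection string, only the service URI convention on the WeatherDataQueue setting (the account name is a placeholder):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "WeatherDataQueue__serviceUri": "https://<storage-account>.queue.core.windows.net"
  }
}
```

With this in place, the queue trigger resolves the account from the service URI and authenticates with DefaultAzureCredential instead of a shared key.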
Since this message doesn't have any exception in it, it should be processed successfully. The message was posted, and you can see inside the Azure Function that it was picked up and processed as expected. Notice that we no longer have any connection string inside the local.settings.json file; everything is connected using managed identity. Let's now see how we can deploy this Azure Function to the Azure infrastructure. Go to Solution Explorer, right-click our Azure Function project, and select the Publish option. This publishes the function directly from Visual Studio; if you want to set up a build-deploy pipeline, I have a separate video that walks through the entire process, linked here and in the description below. In the Publish dialog, select Azure and click Next. Since this is an Azure Function, choose Azure Function App (Windows) and click Next. We don't have an existing function app, so let's create a new one inside our resource group: click the plus button to create a new function app. This lets us specify the name (let's remove the date suffix), select the appropriate subscription, and the resource group; in this case I'll choose 'delete'. Keep the plan as Consumption; there are other plans as well, and we'll do a separate video on plans and scaling later on this channel, but for now let's leave it as the Consumption plan. Set the location to Australia East. Now we need to select a storage account for this Azure Function: while running locally we used the Storage Emulator, but once we publish we need to explicitly specify a storage account, so let's create a new one for this function as well, accept the defaults, make sure the location is Australia, and click OK. Clicking Create creates this function app in the Azure infrastructure, and you can see the status in the dialog here as well.
Right now it is creating an App Service. If we switch back to the Azure portal, go to the overview, and open our resource group, 'delete', it shows all the resources in that resource group; once the creation succeeds, we will see the Azure Function in there. The function was created successfully, so if we switch back to the Azure portal and refresh, we can see the Azure Function and the associated storage account. Let's click Next to generate the publish profile. You can also generate a CI/CD workflow using GitHub Actions right from here, which produces a YAML file, but for now let's keep it simple, select the Publish option, and click Finish. This creates a publish profile for our application within Visual Studio, connected to the function app we just created; once we click Publish, it deploys the function to that Azure Function app. So let's click Publish, which starts the deployment process. In a real-world application I would use a build-deploy pipeline to deploy these Azure Functions, so that any time you push code to source control it automatically deploys; if you want to do that, check out my other video linked earlier. The publish was successful, so let's switch back to the Azure portal and go into the WeatherDataIngester function app we created. If we navigate to Functions, we see ProcessWeatherData; it has the queue trigger and is ready to run. Before we run it, we need to make sure managed identity is enabled for this Azure Function: on the left side, scroll down and select Identity. You can see this is already turned on, so let's now assign the storage queue access to this function as well. Let's navigate back to our resource group and go into the storage
account, the youtubevideodemo one, and go to IAM again. Here click Add → Add role assignment, select the same role, Storage Queue Data Contributor, and search for our function, which should be named weatherdataingester; you can see it is a function app, so select it and click Save. We have now given the identity of the Azure Function access to this storage account: you can see it has Storage Queue Data Contributor, and WeatherDataIngester, an app service / function app, is listed. With this access set up, the Azure Function should now be able to run successfully. Let's go back into 'delete', open the Azure Function, and under Configuration make sure to specify the configuration value for this storage account. Click New application setting; from our local.settings.json file, where we specified the queue connection name, copy that key, WeatherDataQueue__serviceUri, use it as the name, and specify the queue service URL as the value. Click OK, then Save to persist the settings; this restarts the application, so click Continue. The settings are all updated, so go to Functions, open ProcessWeatherData, and make sure it is running successfully: go to Code + Test, where we can see the details of the function.json, and expand the logs down below, which say it connected successfully. To drop a message into the queue, let's run our web application: in Solution Explorer, right-click the web project and select Debug → Start New Instance. I don't have the Azure Function running on my local machine, so it will not pick up the messages from the queue locally; the deployed Azure Function will. Normally you would
have this web app in the same environment you are testing in, but for this demo I have chosen to keep the web application in my local environment; I could use the same publish mechanism to publish this app into the Azure infrastructure. So let's go to POST, select Try It Out, and drop a message into the queue. The first time it drops a message, it takes a while to connect, because it's setting up DefaultAzureCredential and validating the whole connection; once that is in place, the token is cached and subsequent requests are faster. The message was successfully dropped. Click Execute a couple of times to drop a few more messages, then navigate back: you can see inside our Azure Function that all those messages were successfully picked up. If you were to drop a message with 'exception' in it, the Azure Function would also throw an exception. To test that, come back to the web application, enter 'message with exception', and click Execute. It was dropped successfully, and in the Azure Function you can see the error messages; it picked the message up two times, because we had overridden the maximum dequeue count to be 2. I hope this helps you get started with Azure Functions, how to set them up, and how to wire them up to queue messages. Using an Azure Function, you can scale message processing separately from your web app and your background services: in cases where you expect a load of messages to come in at certain times, you can set up appropriate scaling for your Azure Functions; I will cover how to do that in a separate video. If you liked this video, please hit the Like button. If you have any comments, questions, or feedback, make sure to drop a message below. If you want to be notified of similar videos in the future, don't forget to hit Subscribe; this also helps me grow this YouTube channel. Thank you, and see you soon.
Info
Channel: Rahul Nath
Views: 1,949
Keywords: azure functions c#, azure functions tutorial, getting started with azure functions, azure function queue, managed identity azure functions, queuetrigger with azure functions, how to use azure functions, azure functions .net core, queue to azure function, asp.net azure functions, process queue message azure function, azure function and managed identity
Id: 27OUTVdK2_0
Length: 31min 41sec (1901 seconds)
Published: Thu Oct 14 2021