Azure Queue Storage Tutorial

Reddit Comments

Very good presentation. I always find the different messaging services a bit confusing. Queue storage, service bus, event hubs, event grid, iot hubs and so on.

— u/simenk, Jan 23 2020
Captions
Hey guys, this is Adam, and I'm back with another video. This time I want to talk about Azure Queue Storage, a simple but very powerful messaging service in Azure. Stay tuned.

As always, let's start with the basics. Queue Storage is a service that was created for the purpose of storing large numbers of messages, and it is currently a sub-service of the storage account within Azure. Therefore, if you've already been using Blob Storage, then you already have access to the queue service itself. So how does it work? First of all, it's part of the storage account, so it gets all the best features of storage accounts, like security and high scalability. But one important thing to note is that it can only process 20,000 messages per second. "Only" may be a stretch, because that's a lot of messages per second, but it's 20,000 across the entire storage account; that's a limit you cannot exceed, and it assumes one-kilobyte messages.

Second of all, under each storage account you can have one or multiple queues, usually separated by your business logic: you can have incoming orders, outgoing orders, and returns, and process them separately. Each queue is a container for your messages, and it can store up to 500 terabytes of data, so capacity is pretty much limitless. But a single queue can only process 2,000 messages per second, so if you need more, take a different approach: maybe scale across multiple queues, or maybe even multiple storage accounts; it's up to you.

Under each queue you have messages, and a message is pretty much data in any kind of format: JSON, binary, text, CSV, and so on. A single message can only be up to 64 kilobytes of data, and you cannot exceed that; it's a hard limit, but it's a good limit, because messages should be small. They should carry only the key information, and you should pull everything else from your database or any other data storage solution. Additionally, there's something called time to live: when you put a message on a queue, by default, if no one picks it up, it will be there for seven days; after seven days it will expire and get deleted automatically. You do have an option for non-expiring messages: you simply set the TTL (time to live) to minus one, although it's not the greatest approach, so always reconsider that.

So why would you want to use queues? Let's take a simple scenario where you have a SQL database where you store your data, and you have your application where, let's say, for every hundred users you're going to have one instance of your web application, just to scale out; when new users start coming, you auto-scale with another instance. But during heavy traffic periods you might encounter some issues: maybe your database is too small, maybe your back-end API is not good enough. You will hit those limits and your application will start failing. How can you handle that scenario without simply scaling up the database? You could always, as they say, throw hardware at the problem, but that's not the best design. You can actually solve this challenge without paying more by adding a simple queue here, outputting all the messages to the queue, and then having some function handler or any other API that grabs those messages from the queue at a very steady rate that your database can handle. This will pretty much flatten out, or "load-level", your application's workload, so that you can keep everything as it is, introduce very small components that don't cost more, and pretty much get the scalability that you want.
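If you want to see what that steady-rate consumer looks like in code, here is a minimal sketch in Python with the azure-storage-queue SDK. The "orders" queue name, the environment variable, and the save_to_database stub are hypothetical, not from the video; the point is only that the worker drains the queue at a pace the database tolerates:

```python
# pip install azure-storage-queue
import os
import time

from azure.storage.queue import QueueClient

def save_to_database(payload: str) -> None:
    """Placeholder for the steady-rate database write."""
    print("writing:", payload)

# Hypothetical names: queue and env var are illustrative only.
queue = QueueClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], queue_name="orders"
)

while True:
    # Pull a small batch; anything we don't take simply waits on the
    # queue, so traffic spikes queue up instead of hitting the database.
    for msg in queue.receive_messages(messages_per_page=10):
        save_to_database(msg.content)
        queue.delete_message(msg)  # remove only after a successful write
    time.sleep(1)  # throttle to a rate the database can handle
```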
Of course, you would see some slowdown in the requests for the users, so that's definitely something you should consider. One of the additional scenarios you can build with queues is something called fan-out. You have your user sending a request to a web application, and the web application is putting messages on a queue, but you can then create functions that scale depending on the queue size, pick messages up separately, and process them in parallel. This is the so-called competing consumers pattern, and it's an amazing way to scale your applications independently; it's one of the core features of how serverless in Azure actually works.

But let's talk about one of the best benefits of using queues, which is retries. You have your web application and a queue, and your web application sends a request to get a message from the queue. What the queue service does is mark this message as hidden and then return the message to the client, together with something called a pop receipt: a receipt showing that this client is currently processing it. Every other client application cannot see this message; it's hidden from everyone else, but it's not deleted. That's very important to note. So what would happen if your web application got the message and then encountered an issue while processing it? In this state the message is still hidden, and the web application never processed it, so we have a problem. Queues solve this with something called the visibility timeout: after a certain period of time, the message becomes visible again, and at that point your web application, or any other application connected to this queue, can get that message again, hide it, get a new pop receipt, and process it. If you process it properly, you then send a delete request to the queue with the pop receipt that you got previously, and then, and only then, is the message deleted. What's important to notice here is that you actually have to respond back after getting the message to confirm that you processed it successfully; until then, it's still on the queue, just not visible. This gives you a very highly resilient architecture which lets you build applications that retry multiple times, because, you know, when programming something might go wrong, not only in your code but also in the cloud; in the cloud you can always get failed requests, because all the services in Azure have some sort of SLA. By having this queue you are making your architecture more resilient with very little effort and pretty much no cost at all, so it's definitely one of the best things about queues.
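That get / hide / delete cycle maps one-to-one onto SDK calls. A minimal sketch, assuming a "messages" queue and a placeholder process function: on success we delete with the pop receipt; on failure we deliberately do nothing and let the visibility timeout hand the message to the next consumer:

```python
# pip install azure-storage-queue
import os

from azure.storage.queue import QueueClient

def process(payload: str) -> None:
    """Placeholder for your business logic; may raise on failure."""
    print("processing:", payload)

queue = QueueClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], queue_name="messages"
)

# receive_message hides the message for `visibility_timeout` seconds
# and returns it together with a pop receipt.
msg = queue.receive_message(visibility_timeout=30)
if msg:
    try:
        process(msg.content)
        # Only this delete, carrying the pop receipt, removes the message.
        queue.delete_message(msg.id, msg.pop_receipt)
    except Exception:
        # Do nothing: once the visibility timeout lapses, the message
        # becomes visible again and any consumer can retry it.
        pass
```

Note the design choice: there is no explicit "retry" call anywhere; the retry falls out of simply not deleting the message.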
What are the supported authorization methods? Because you are using storage accounts, you pretty much get the same options. First of all, you have the account key; this is like a password for your entire storage account, including the queue service. You can also generate shared access signatures, where you define which actions, on which API, from which IP, over which protocol a client can connect, and how long the token is valid for, so you can get much more granular with your access. But the best way is using Azure Active Directory: you can give service accounts or actual user accounts RBAC roles, which is role-based access control. There are currently four built-in roles: Storage Queue Data Contributor, Reader, Message Processor, and Message Sender. As you can imagine, Contributor is like an admin; it can do everything on the queue. Reader can only read messages but cannot pop them, so it cannot process; it can only see what messages are on the queue. Message Processor, on the other hand, can read everything from the queue and can process messages and take them away, so it can delete messages, but it cannot create any. And on the opposite side, Message Sender can only send messages but cannot read or process the messages currently on the queue. With this you can get very granular, and it's very secure. I would say Azure Active Directory is always the recommended method when it comes to security in any kind of application; if the service in Azure supports it, always, always use it.

Before we go to the demos, let's talk about some key pieces of information that are very important. Every single operation within the queue is atomic, so you're always sure that your get-message call is an end-to-end transaction that cannot be split. But when it comes to transactions across operations, the queue service doesn't support those. What this means is that you cannot, for instance, send ten messages and be sure that all ten were committed; you can only send messages one by one, and if you send twenty, the nineteenth can fail, and it's not going to roll back the ones you already sent. There are no transactions across multiple operations and no batching like that, so if you need that, definitely look at different services like Service Bus. There are also SDKs available: .NET, Java, pretty much all of the popular languages have an SDK for storage accounts and for the queues, so definitely check those out so you don't have to call the REST API from the ground up.
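For comparison, here is a minimal Azure AD sketch using the azure-identity package. The account URL is hypothetical, and the identity running this code would need one of the roles above granted on the storage account, for example Storage Queue Data Message Sender to send:

```python
# pip install azure-storage-queue azure-identity
from azure.identity import DefaultAzureCredential
from azure.storage.queue import QueueClient

# Hypothetical account URL; no account key appears anywhere in code.
queue = QueueClient(
    account_url="https://amdemostorage01.queue.core.windows.net",
    queue_name="messages",
    credential=DefaultAzureCredential(),
)
queue.send_message("demo")
```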
In the demos today I will show you how to create a queue service, how to manage the data using Storage Explorer and the portal, and then we're going to do a very simple fan-out demo where I will create two logic apps and a queue, chain those together, and show you how you can put the data into blob storage in parallel using multiple logic apps reading from the same queue. Of course, in the diagrams you don't really draw all those instances; usually you draw it like this.

So let's go into the demos. Let's start by creating queue storage. Go to "Create a resource", and here you don't type "queue"; you actually go to storage account, because you need to create a new storage account. I'm going to create a new resource group for this queue storage introduction, and I'm going to call my storage account amdemostorage01. I'm going to choose North Europe; performance will be standard, general-purpose v2, everything else as default. Hit create, validate the selection, and hit create again. Once the storage account is provisioned, you can go to the resource, and you will see the classic storage account sections; one of the four services you will see is the queue service. When you go here, you actually don't see much. You can hit the "Add queue" button to create a new queue; I'm going to call it "messages" and hit OK. Notice that when I created "messages", a new URI was created with the address of my storage account, slash messages. You can open that queue, and as you see, there's not much you can do here: you can only add messages, dequeue messages, and clear the entire queue. This is the first place where you can actually manage data and see what is on your queue. You can hit "Add message", and here you fill in the message text, like "demo"; you can set the TTL, which is the expiration, seven days by default; and you can choose whether to encode the body in Base64. Hit OK, and you can add a couple of messages like that; you can even add messages with the same text if you want, it doesn't really matter at this point. You can dequeue a message if you want to remove something, or clear the entire queue. So this is the first way of interacting with the queue.

If you go back, you can also review it from Storage Explorer. In the portal you have Storage Explorer, or you can download the desktop version; I always advise downloading the desktop version because it has a lot more features and it's much quicker to work with. But you can open the queue "messages" in the portal and pretty much review the same stuff as just before, almost the same experience that we just had a second ago. You don't really get much more for queues from Storage Explorer; you get much more for other services, but for queues it's very simple.

Let's close this one and start creating the logic apps. We need two logic apps right now, so let's hit "Create a resource", type "logic app", hit create, call it logic app one, use the existing resource group, in this case our queue storage introduction, in North Europe as well, and hit create. And we need another logic app; I'm just going to create both of them at the same time: logic app, hit create, call it logic app two, also within the same resource group for the queue storage introduction, hit create, and we're good. If I go to my storage intro resource group, I can refresh here, and the refresh sometimes takes a couple of seconds; if you don't want to wait, you can click on the notifications and hit "Go to resource", which takes you directly to the logic app. Once you've created the logic app, let's start from the blank one. Since I want to create it and start it using the Run button, I'm just going to go to "All" and type "request", or actually pick it from the most common ones: "When an HTTP request is received". This will pretty much create a public URL for me to call, but I can also trigger it using the Run button; it's just any kind of trigger for me, it doesn't really matter which. What we really want to do is add a message to the queue, so let's search here to find the connector for the queues. Inside the queues connector, as you see, there are a couple of actions, but what we want to use is "Put a message on a queue". It will ask me which storage account I want to connect to, to create a connection.
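Before wiring up the logic app connection, it's worth noting that the portal steps above, creating the "messages" queue and adding a message with the default seven-day TTL, have a direct SDK equivalent. A sketch, assuming a connection string in a hypothetical environment variable:

```python
# pip install azure-storage-queue
import os

from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], queue_name="messages"
)
queue.create_queue()  # the SDK equivalent of the portal's "Add queue"

queue.send_message("demo")                     # default TTL: seven days
queue.send_message("demo", time_to_live=3600)  # expires after one hour
queue.send_message("demo", time_to_live=-1)    # never expires (reconsider!)
```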
For the connection name, I'm going to choose amdemostorage01 and hit create. This now saves the credentials to my storage account's queue service, and now I can choose the queue that I want to put the messages on, which is "messages", and give it a message. I'm going to type "demo" and maybe append something dynamic: I'm going to choose dynamic content, expression, and type rand for a randomly generated number from 1 to 10,000, and that will be the message I output there. Let's hit save and hit run. In a second tab I will also open the portal, so that I can edit the logic app but at the same time look at what is inside my queue. As you see, the logic app ran successfully, so we can now go and review the messages on the queue. Open the resource group that we are currently working in, go to the storage account, and, as I said, you can review either in the queues section or in Storage Explorer. Open the queue "messages" and, as you see, there's a randomly generated message. I'm going to clear the entire queue right now.

So let's do a bit more challenging demo, shall we? Let's open the designer and add a very small loop. In this case I'm going to use a loop, which is a control action, because I want to do a loop of a few iterations. Click on "Control" and "For each"; here you need to specify the object to iterate over, or you can simply give it an expression with an array literal. This pop-up is not the best, but what I did is a very small array with one, two, three, four, put it inside that for-each, and simply hit save. You can now run this, and if this works correctly, it will iterate four times over this array and send four random messages to our queue. Let's see the result: as you see, it ran successfully. You can review each iteration and see what kind of message was randomly put on our queue, and you can then go to the queue, refresh, and see the messages. So we have the first part done: we now have a logic app which is outputting messages to the queue in a loop.

Now we need the second logic app. I already created it, so I'm just going to close this, go back to the resource group, and use the second logic app to trigger based on the first logic app's output on the queue. I'm going to open it and start from blank. I'm going to type "queue", though it's still going to be visible in the recents. As you see, there are "When a specified number of messages are in a given queue" and "When there are messages in a queue"; there's not much difference between the two, except that in the first one you can also specify how many messages you want to wait for. I want the "When there are messages in a queue" trigger on my "messages" queue; since the connection was already saved, I just want it to check every ten seconds, mostly because I want a very quick demo, and that's good. At this point I'm going to save it, just to show you something interesting. I save this logic app, go to the logic app overview, and hit refresh. Notice it ran four times: it actually detected those four messages on the queue. If you go into each of these runs, you will see that the queue was checked and the run took that information, including the message text, so it's already getting the body of the message. But interestingly, if we go back to the queue and hit refresh, you will notice that the messages disappeared, yet at the bottom you'll see "showing 0 out of 4 messages". Those messages are still there, because we never actually deleted them after receiving them.
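You can reproduce exactly this "hidden but not deleted" behavior with a few SDK calls; a sketch assuming the same "messages" queue. peek_messages only shows visible messages, which is why the count drops to zero while the message is hidden:

```python
# pip install azure-storage-queue
import os
import time

from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], queue_name="messages"
)
queue.send_message("demo")

# Receiving hides the message; it is NOT deleted.
msg = queue.receive_message(visibility_timeout=10)
print(len(queue.peek_messages()))   # 0 - the message is invisible

time.sleep(11)                      # let the visibility timeout lapse
print(len(queue.peek_messages()))   # 1 - it reappeared, never deleted
```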
We need to send that delete request with the pop receipt that we got; otherwise, after 30 seconds, which is the default visibility timeout, the messages will reappear on the queue. Therefore, when you're designing a logic app like this, you need to go back, hit edit, and add an additional step: go to the queue connector and pick the "Delete message" action, select the "messages" queue, provide the message ID, which you have from the first step, and provide the pop receipt; as I said, we need to send that too. So let's save it, go back, and run our first logic app. Run the manual trigger and hit refresh a couple of times. As you see, the new messages are arriving in the queue: "showing 4 out of 8 messages", and as I keep refreshing, "showing 0 out of 8". So, as you see, they came back to life. Hit refresh, and after a couple of seconds the second logic app should run, so we can go and review its history; as you see, it did process the eight additional messages successfully, and if we go here and refresh, there are no messages in the queue. That's definitely a very important behavior of how queues work with other services. You need to be aware of it, and you need to be prepared for processing the same message twice if something happens. So make sure you don't blindly add to the database; you should always merge with existing entities if you find any, and so on. Make sure your logic is resilient enough that you can actually reprocess the same messages.

The very last thing we're missing is the saving to blob from our demo, where we're actually going to show how to process multiple messages at the same time and output them as files. Hit edit, add a step in the middle, add an action, type "blob", pick Azure Blob Storage, "Create blob", to the same storage account; name the blob connection and hit create. Let's grab the message that we're getting from the queue and save it as a file. Right now we don't have a container, so we need to go back to our Storage Explorer and create a new blob container called "demo"; inside this container we're going to store our files. We're going to name each file randomly; we can, for instance, use an expression like an ID plus ".txt", and then for the content we need to click "see more" and pick the message text to use as the blob content. Let's hit save and rerun logic app one: run the trigger, go to the storage account, go to the "demo" container, and you can start refreshing, because after ten seconds we should start seeing four blobs here. As you see, we got four more files; you can open the contents: "demo". This is how easily you can fan out from one logic app across multiple logic apps, which will process the messages in parallel and connect to external services, making very highly scalable architectures in your applications.

I think we can all agree that storage queues are a very simple but powerful service that allows you to do very cool stuff with your applications. They are a critical part of any web application and microservice architecture in Azure, so you should definitely use them. For today, that's it. Hit thumbs up and leave a comment, subscribe if you want to see more, and definitely see you next time. [Music]
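To close the loop on the demo, here is roughly what the second logic app does, expressed as a small Python worker: receive, write the message body to the "demo" container, then delete with the pop receipt. Using the message ID as the blob name is my own addition, in the spirit of the merge-with-existing advice above, so a retried message overwrites the same blob instead of creating a duplicate:

```python
# pip install azure-storage-queue azure-storage-blob
import os

from azure.storage.blob import ContainerClient
from azure.storage.queue import QueueClient

conn = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
queue = QueueClient.from_connection_string(conn, queue_name="messages")
blobs = ContainerClient.from_connection_string(conn, container_name="demo")

for msg in queue.receive_messages(visibility_timeout=30):
    # Idempotent blob name: reprocessing the same message overwrites
    # the same file rather than producing a second copy.
    blobs.upload_blob(name=f"{msg.id}.txt", data=msg.content, overwrite=True)
    # Delete only after the blob write succeeded; if the worker dies
    # first, the message reappears after 30 seconds and is reprocessed.
    queue.delete_message(msg)
```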
Info
Channel: Adam Marczak - Azure for Everyone
Views: 22,841
Keywords: Azure, Storage, Storage Account, Queue, Queue Storage, AQS, Load-leveling, Retry, azure storage
Id: JQ6KhjU5Zsg
Length: 22min 25sec (1345 seconds)
Published: Thu Jan 23 2020