Develop super fast micro-services with Redis cache

Video Statistics and Information

Captions
Hello guys, it's Krish here again. Today I'm going to do a slightly different kind of tutorial. Most of us somehow manage to get the backend services and the frontend up and running, but sometimes users are disappointed because they don't get the speed and performance they expect from the system. Sometimes when they click on a data grid it feels like it takes hours to load; not literally hours, but maybe 30 seconds or so, and that is not acceptable. Usually the first byte of any data should arrive within a second or two, and a data grid should ideally finish loading within three to four seconds. Things often work fine right after we develop and deploy, but once data accumulates on the backend and we have to deal with a large number of records, it stops working well. So today I'm going to show you how we can improve performance, latency and response times using caching, specifically Redis.

I'm not going to do a step-by-step, code-along walkthrough of everything. I will code along for the specific areas where we do the caching, because that's the easiest way to remember it, but things like creating the services and fetching the data I'm not going to do one by one, because I don't want to waste your time watching that. I'll publish this code on GitHub anyway, so you can take it and try it yourself. The demo is a basic NestJS application with a backend service, but even though I'm demonstrating with NestJS, you can use this approach with any language: Node.js, .NET, Java, anything, because the fundamental concept is the same. If you can find a Redis client for your language, you can do the same thing.

Why am I not coding along from scratch, creating the application and everything? Because I'd rather show you the problem and how to design a solution to it. I'm going to take you step by step through designing a solution to this kind of problem, talk about the mistakes and wrong directions developers usually take, and how to do it the right way. Solving performance issues is not a single puzzle to solve; you first need to find where the problem actually is. It may be where you group data, where you do calculations or sorting, or where you fetch the data from, whether that is a database or a third-party API. So this is not just one video that tells you how to improve performance. In this video we discuss how to use caching as the first step; if you still have a problem after that, you need to go further. In upcoming videos I'm going to show you how to do pagination and searching with the cache, and some more advanced topics.
If you're not subscribed yet, this is the right time to subscribe, and make sure you click the bell icon so you get a notification as soon as I post a video. And if you learn something from this video, please give it a thumbs up or add a comment saying you liked it.

Okay, let's start. I have a simple NestJS application here. We need a few dependencies, which I've already installed: the NestJS Redis module and ioredis, because ioredis is a really nice client library for dealing with Redis, and axios for calling the backend services. This is my use case: I have an order module with a controller. You send a GET request with an order category, and it's supposed to return all the orders for that category. What it does is invoke the backend service and fetch all the orders. This is a mock service; you can assume it's your ERP or whatever system holds the orders. The remote service delivers the orders, and we send them on to the frontend, say an Angular application.

Let's run this and see how it performs; it has a fairly large data load. You can see we get a bunch of results, and it took 1.9 seconds, almost two seconds. Let's put the response into a JSON viewer and format it: 47 different orders come back in the response. Now assume you have an Angular application showing these in a table. If you take one particular order, it has an order number, a zone and so on, stakeholder information saying who the order belongs to, and order lines, i.e. the items the order carries. Each item has delivery zones, order numbers, item codes, sales prices, discounts, delivery dates and so on, plus some other nested objects. That's the data model we have.
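Before we change anything, the starting point looks roughly like this: a controller that takes the category and a service that simply forwards the mock backend's payload. The class names, route and backend URL here are placeholders, not the exact code from the repo:

```typescript
// order.controller.ts
import { Controller, Get, Param } from '@nestjs/common';
import { OrderService } from './order.service';

@Controller('orders')
export class OrderController {
  constructor(private readonly orderService: OrderService) {}

  // GET /orders/:category -> all orders for that category
  @Get(':category')
  getOrders(@Param('category') category: string) {
    return this.orderService.getOrders(category);
  }
}

// order.service.ts
import { Injectable } from '@nestjs/common';
import axios from 'axios';

@Injectable()
export class OrderService {
  // Placeholder URL for the mock ERP/order system
  private readonly backendUrl = 'http://localhost:4000/orders';

  async getOrders(category: string) {
    // Fetch the raw orders and return them to the frontend as-is (for now)
    const { data } = await axios.get(`${this.backendUrl}/${category}`);
    return data;
  }
}
```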
To make this a little more complicated, which will help with future videos, let's say we don't just show these orders in a table; we need to group them. We'll group by store location, by a configuration property, and of course by order number, because multiple order numbers appear in the response: one ends in 1161 and another in 1694, and that multiplies the rows. So we're going to group by those three: order number, store location and configuration.

The usual approach when you see this response arrive in about two seconds is to send it to the UI and do the grouping there. If it's 10 or 20 records, that's fine. But if you have to deal with 500, 1,000 or 2,000 records, pulling the entire payload into the browser and grouping it there creates problems, because the browser doesn't have real computing power of its own: it depends on the client machine, and you don't know what configuration the user has. I'm not saying you should eliminate client-side logic entirely; you can use it at a certain level, but doing this kind of complex grouping on the client side can hurt performance. Also, when you're dealing with 500 or 1,000 records, the client will usually ask for pagination, showing, say, 25 records per page with next-page buttons. You've probably heard the saying that if you want to hide a dead body, you should put it on the second page of Google. Likewise, if you return a thousand items at 50 per page, the user probably won't go past the third page, so it's a waste to ship the entire thousand records and group them across all the pages for data the user will never look at. Therefore it's better to do this work on the backend.

Let's send the request again: now it's 1.47 seconds, probably because the backend side has a little caching of its own. If you take this and start grouping on the client side, the total process might take 3.5 or 4 seconds, and as the record count grows it only gets worse. So let's design a solution. Our problem is that we send the whole payload to the frontend and group it there; let's solve one problem at a time, starting with the grouping. Rather than sending the entire payload to the frontend, I want to group it on the backend exactly the way it should appear in the table. The frontend then just pulls this data and shows it in a table, which will be fast no matter what the client's performance is; it can be a very slow computer or a high-end one, it doesn't matter, because it isn't doing much processing.

How do we do that? Let's stop the app and go to the order service, since that's where the data is fetched, and create a private method, groupData, that receives the data as an array. For now I'll just type it as any, since we don't have a DTO yet.
For the grouping I'm going to use Lodash. Grouping an array can be done in multiple ways, and writing your own map/reduce is one option, but here I'll use Lodash. I haven't done any pre-work for this project, I don't even have a test project, so things may go a little wrong, but that's what real coding is like: in the real world you don't always have a test project to run, so let's do the grouping together and see where we end up.

First I'm going to use the groupBy function from Lodash; you can see it's imported at the top. I pass it the data, which I'll call orders. It would be better to create an orders DTO, and we may add one if we need it, but for now I'll write this loosely; otherwise TypeScript won't let me access the orderLines property, since without a DTO the order has no declared orderLine object. The response contains multiple orders, so first I map over the orders, and for each single order I call groupBy over its order lines, passing the whole orderLines array of that order. The way groupBy works in Lodash is that you give it the collection and the logic that produces the grouping key, so for each line I return a composite key built from the order number, the store location, and the line's configuration name (item.config.name). Then I assign the result to groupedOrders and return it. In the main method, instead of returning the backend data directly, I pass it to groupData and return the result. So the plan is: fetch the data from the backend, pass it through groupData, and return the grouped result. Let's see whether it comes out the way we need it for the UI.
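Roughly, that grouping step ends up looking like the sketch below; this already reflects the small fixes worked through next, and property names like orderLines, orderNumber, storeLocation and config.name are taken from this walkthrough, so they may differ in the real payload:

```typescript
import { groupBy } from 'lodash';

// Group each order's lines by a composite key built from the order number,
// the store location and the line's configuration name.
// Typed loosely (any) because there are no DTOs yet, just like in the video.
function groupData(orders: any[]) {
  return orders.map((order) =>
    groupBy(
      order.orderLines,
      // Implicit return: the arrow function's expression is the grouping key
      (line: any) => `${line.orderNumber}-${line.storeLocation}-${line.config?.name}`,
    ),
  );
}
```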
Okay, let's run this. For some reason everything comes back null; we'll see why, it should be something small. The easiest way to see the problem: I'm getting a bunch of nulls, which means execution does go through my logic and returns some sort of grouped result, but the group values themselves don't come back. The reason, I think, is that I don't have a return statement: either I remove the curly braces so the arrow function has an implicit return, or I keep them and add an explicit return. Let's remove the curly braces and see what happens. Now it works, but I have another problem: the key shows 102, which is the order line number; it should be the order number, so let's fix that. Now I'm getting grouped results keyed by order number, store location and the configuration value.

To check that this is working properly, let me add one more property to the grouping key: orderLine.deliveryLocation.name. Now I get an internal server error, "cannot read property name", because I wrote the property access wrong; let's run again with orderLine.deliveryLocation?.name. What I did is correct, but it seems certain records simply don't have a deliveryLocation, so we use it when it exists and ignore it otherwise. Now you can see the key carries the extra field.

Let's go back, put the response into the JSON viewer, and see how it looks. Last time it was 47 entries and it's still 47, so we've solved part of one problem. Our goal is to group this data so the UI can take it and show it in a table directly; ideally a single element should represent a single table row. But if you open one of these entries you'll see each object carries two or more different groups, because the outer result is per order: we have 47 orders, each with its own order number, and within each order we grouped by those address and config fields (I'm not sure exactly what they mean, but for demo purposes I grouped by them as well). To show this without any extra processing on the frontend, it has to arrive as a single flat array of rows, not grouped per customer order; only then can the UI take it and bind it.

So what I'm going to do is take what comes back from the grouping and map over it again using Object.keys, because each group is represented by its key. I take each grouped object, get its keys, map over each key, and return an object with the key as one property and the data, the group itself, as the other.
It looks like I made a small mistake: the key has to go inside each record. Let's see how this looks now. I could write this entire logic up front and just show you the result, but I want to show you how to solve each problem you hit, one problem at a time. This still isn't the final result, don't worry, we're getting there. Let's see whether the groups changed: we still have 47 entries, but each group is now split into its key and its data. Inside the data you still have the nested group object, and we can fix that: rather than sending the entire object, we take only the values that belong to that key, which eliminates the inner nesting. With that the structure is flatter, but each entry still holds multiple rows, so we're not done yet: we still have 47 entries, just with the key and the data separated.

Now it's just a matter of pulling those rows out of the groups, which means combining the arrays: we have an array of object arrays and we need to merge it into a single array. That's pretty easy to do. I'll use the array concat method: I start from an empty array and spread the grouped arrays into it with the spread syntax. There are other ways to copy arrays, but this is the simplest way to merge array elements into a single array, and it always produces a copy. Let's run it and paste the result into the JSON viewer. Now you can see we get 123 elements, and it's very clean: one object represents one row. The frontend can just take it and display it on the table.

So with this optimization we've solved one problem: the frontend used to group the data, and now it doesn't have to. We take the giant payload, group it, and merge the groups together so that each object represents one table row; all the frontend needs to do is fetch it and bind it to the table.
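Putting those last two steps together, extracting the group keys and flattening everything into one object per row, a helper might look roughly like this sketch (the shape of the grouped input comes from the groupBy step above):

```typescript
// 'grouped' is the output of the groupBy step: one object per order,
// mapping each composite key to the order lines that share that key.
function flattenGroups(grouped: Record<string, any[]>[]): { key: string; data: any[] }[] {
  // Turn each { [key]: lines } entry into a flat { key, data } row object...
  const rows = grouped.map((group) =>
    Object.keys(group).map((key) => ({ key, data: group[key] })),
  );
  // ...then merge the per-order arrays into a single array: one object per table row.
  return ([] as { key: string; data: any[] }[]).concat(...rows);
}
```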
But we've introduced a new problem: it still takes a good amount of time to execute this process, about 3.45 seconds, because now we're not just fetching from the backend, we're also grouping within the backend, and that takes more time. This is where caching comes into the picture, and for the caching we're going to use Redis.

Go to the app module. I'm going to use Redis inside the util service rather than directly here, because we already have a UtilService we can use to implement our caching; I'm trying to mimic a real production setup as much as I can. We need to import the Redis module into the app module, because that's where it belongs, so import RedisModule.forRoot with a config containing the URL; I'm using a local Redis on localhost:6379. That's my Redis configuration. It's pretty easy to try this out: just pull the Redis container and run it locally; I have a local Redis running here. The app module is then imported into the order module, and we use forwardRef because the app module needs the order module and the order module refers back to the app module, which creates a circular dependency; the forward reference tells Nest exactly in what order this needs to be resolved.
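As a rough sketch, the wiring looks something like this; I'm assuming the @nestjs-modules/ioredis package here, and the exact forRoot options shape depends on which Redis module and version you installed:

```typescript
// app.module.ts - register the Redis connection once at the application level
import { Module } from '@nestjs/common';
import { RedisModule } from '@nestjs-modules/ioredis';
import { OrderModule } from './order/order.module';

@Module({
  imports: [
    // Local Redis, e.g. started with: docker run -p 6379:6379 redis
    RedisModule.forRoot({ config: { url: 'redis://localhost:6379' } }),
    OrderModule,
  ],
})
export class AppModule {}

// order.module.ts - forwardRef resolves the AppModule <-> OrderModule circular reference
import { Module, forwardRef } from '@nestjs/common';
import { AppModule } from '../app.module';

@Module({
  imports: [forwardRef(() => AppModule)],
  // controllers and providers (OrderController, OrderService, UtilService) omitted here
})
export class OrderModule {}
```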
Now we go to the UtilService and implement the caching. In the constructor I inject Redis: private redis, of type Redis. Then I create an async method, cacheList, which takes a key (a string) and the data, which can be an array of any type; ideally you'd use proper models here, but I'm not going to for this demo. I'll do the caching in two different ways: first the way the usual engineer implements it, then I'll show you where and why that is bad and what the right way is.

Since we're storing a whole list of data, it's very important to choose which Redis data type you use. Most developers and engineers I've seen just take the payload, JSON.stringify it, and push it into Redis as a single string. That's not always good. In our case you could take this entire grouped payload, stringify it, and save it in Redis, and it would work: when the UI requests it, you read it back, parse the JSON, and return it. But the problem comes with the pagination we'll discuss in the next video. After grouping we have 123 objects, and we'll need to return the first 50, the second 50, and so on. If you store the entire payload as one stringified value, you always have to load the whole thing back into memory, slice out the first 50, and send it to the UI; when the UI asks for items 51 to 100, you again load the entire payload, slice it, and return that page. Every time you need a part of it, you load the whole thing back into memory, and that has a performance impact. So if you need to use parts of the object, it's better to store it as a list. And if you want to keep the object's own structure, for example you don't even need the whole row, just the order number and the destination, then it's better to use a hash, the hash data type, because then you can fetch just the properties you need instead of the entire object. In our case I want to use each row separately so we can do pagination in the next video, so I'm going to use the list. Most of the time I've seen people just stringify and dump everything, because Redis is a key-value store; in one of the next videos I'll show you some tips and tricks for using Redis effectively, because you can't really query Redis. If you store a bunch of employees, you can't ask "give me the employees who belong to this department" or "give me the employees older than 45" with native Redis; it doesn't work like that, although there are tricks to make it work. Anyway, for now let's do the list, the wrong way first and then the right way.

For the first version I iterate over the data with data.forEach, and for each element I call this.redis.rpush, pushing from the right side, with the key and JSON.stringify of the element. Now you might ask: you said JSON.stringify was bad, so why are you doing the same thing? Because here I'm only stringifying one row, not the entire payload; I'll show you how that looks inside Redis. But if your use case doesn't need the whole row, just the order number, the price and the discount, say, then this is bad too and you should go for hashing. This call needs an await because it's asynchronous, so the callback has to be async as well.

This works, but the problem is that if you have 500 rows, you're sending 500 separate commands to Redis, and you wait each time. That's a little problematic, because even though you could say Redis itself is lightning fast, in reality your service runs on a Kubernetes cluster and has to talk to the Redis server over the network, so there's network latency you cannot eliminate; with 500 rows you pay it 500 times. What you can do instead is use a Redis pipeline: const pipeline = this.redis.pipeline(), then instead of this.redis.rpush you call pipeline.rpush, and once you've queued everything you call await pipeline.exec(), which inserts the data into Redis. So even if you have a thousand lines, you don't contact Redis a thousand times; you create a pipeline and let it manage your data in one batch.
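Sketched out with the ioredis client (the @InjectRedis decorator comes from the Redis module assumed above), the two variants look roughly like this:

```typescript
import { Injectable } from '@nestjs/common';
import { InjectRedis } from '@nestjs-modules/ioredis';
import Redis from 'ioredis';

@Injectable()
export class UtilService {
  constructor(@InjectRedis() private readonly redis: Redis) {}

  // Naive version: one network round trip to Redis per row
  async cacheListNaive(key: string, data: any[]) {
    for (const element of data) {
      await this.redis.rpush(key, JSON.stringify(element));
    }
  }

  // Pipelined version: queue every RPUSH locally, then send them in a single batch
  async cacheList(key: string, data: any[]) {
    const pipeline = this.redis.pipeline();
    data.forEach((element) => pipeline.rpush(key, JSON.stringify(element)));
    await pipeline.exec();
  }
}
```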
This is good, but there's one more thing: you always need to be very careful with a cache, because cached data is not valid forever. These orders may be created very frequently, so caching them for two or three hours is not good; you need to make sure the cache expires automatically. Ideally you might keep it for 15 or 30 minutes, but here, for demo purposes, I want it to expire within one minute. So this method takes another parameter called ttl, time to live, a number that is my expiry time. After I add all the records, I call pipeline.expire on the same key with the given TTL. You could make the TTL optional if you want, but in this case I always want the key to expire, so I use the TTL. Now I have a generic method I can use for caching.

Now let's see how we can use this in the service. Before we go to the backend, I want to check whether anything is available in my cache. For that I create another method in the util service, async readListFromCache, a generic method that takes a key (a string); error handling I'll leave to the calling side. It does await this.redis.lrange with the key and a start and end index: I'm inserting from the right but reading from the left, basic data structures. With the start and end I can read the entire list or just part of it, and now you can already see how the pagination is going to work, and why we store this as a list rather than stringifying everything into one string; I'll show that in the next video. Anyway, now I can read from the cache.
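Extending the UtilService sketch, cacheList gains the TTL and a companion read method; the defaults of 0 and -1 read the whole list, and narrower ranges are what pagination will use later:

```typescript
// Write path: push every row, then let the whole key expire after `ttl` seconds.
async cacheList(key: string, data: any[], ttl: number) {
  const pipeline = this.redis.pipeline();
  data.forEach((element) => pipeline.rpush(key, JSON.stringify(element)));
  pipeline.expire(key, ttl); // e.g. 60 seconds for the demo
  await pipeline.exec();
}

// Read path: LRANGE returns the stored JSON strings; 0..-1 means the entire list.
async readListFromCache(key: string, start = 0, end = -1): Promise<string[]> {
  return this.redis.lrange(key, start, end);
}
```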
Back in the order service, the first thing I do is check whether I have cached data: const cached = await this.cache.readListFromCache(key), where cache is the injected util service. Now, the key plays a very crucial part in caching, and this is where many developers don't think it through. If each service has its own cache, like a sidecar, that's totally fine, but most of the time that's not a good design; attaching too many sidecars to your services doesn't do you any good. To get the real benefit of the cache you usually want distributed, centralized caching: Redis or some other hosted, managed caching service. The reason is scaling: you may have multiple instances of the order service, say a hundred of them running on your Kubernetes cluster, and if one instance caches something, the other instances should be able to use the same cache; otherwise there's no point in each one managing its own. The problem with this kind of centralized caching is that many services may use the same cache key, so one service's cache can be overwritten by another. To avoid that, the good practice is to prefix the key with your service name.

Ideally this prefix would come from your parameters or configuration, but here I'll use a template string to build the key. It starts with the service name, then says what you're caching (orders), then how you're caching it (groups), and then which group, which is the category name the user sends: accessories orders, consumer orders, medicine orders, and so on, so the category becomes part of the key as well. Now you have a key that is unique to you: it names your service, what you're caching, how you're caching it, and the specific key for this entry. This matters because in upcoming videos I'm going to extend this key: we'll create additional caches derived from this main cache, for example to do searching through the cache as I promised. For now this is fine. I hope you get a clear picture of why the key needs to be defined like this; if not, just comment below, and I'm pretty sure the next video will explain it.

For today I'm fetching everything from the cache, the whole range. This call has to be awaited. The first run didn't return anything, so I think I made a mistake; back in the service I change the check to if (cached && cached.length). Why check the length? Because I'm reading a list, so there can be a situation where I get an empty array. If I do have cached data, remember that what we stored is JSON strings, since we used JSON.stringify, so before returning to the frontend I map over the cache, convert each element back with JSON.parse, and return that. I also add a log, console.log('cache hit'); ideally you'd also send caching headers to indicate where the response came from. When I don't have cached data, I fetch the data from the backend, log 'cache miss', group the data as before into a constant (renamed, since groupData is already the method name), and then cache it with this.cache.cacheList, passing the key I created above and the grouped list. That call has to be awaited because it's async, and the compiler complains that a TTL is expected, so I set it to one minute. Then I log 'cache success' and return the grouped data.
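Pulling those pieces together, the read path in the order service looks roughly like this sketch; the key prefix, backend URL and service names are placeholders from this walkthrough, not the exact code from the repo:

```typescript
import { Injectable } from '@nestjs/common';
import axios from 'axios';
import { UtilService } from '../util/util.service';

@Injectable()
export class OrderService {
  constructor(private readonly cache: UtilService) {}

  async getOrders(category: string) {
    // service : what we cache : how we cache it : which group
    const key = `ORDER_SERVICE:orders:groups:${category}`;

    // 1. Try the cache first (whole list for now; ranges come with pagination)
    const cached = await this.cache.readListFromCache(key);
    if (cached && cached.length) {
      console.log('cache hit');
      return cached.map((row) => JSON.parse(row)); // rows were stored as JSON strings
    }

    // 2. Cache miss: fetch from the backend, group, cache for one minute, respond
    console.log('cache miss');
    const { data } = await axios.get(`http://localhost:4000/orders/${category}`);
    const groupedData = this.groupData(data);
    await this.cache.cacheList(key, groupedData, 60);
    console.log('cache success');
    return groupedData;
  }

  private groupData(orders: any[]): any[] {
    // grouping + flattening logic from earlier in the video
    return orders; // placeholder in this sketch
  }
}
```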
So overall: we read from the cache; if there's nothing in the cache, we go to the backend, fetch the data, group it, cache it, and respond to the frontend. Cool, let's run this and see how it works. Remember it was taking 3.45 seconds. The first request now takes 2.51 seconds and logs a cache miss, because there's nothing cached yet. But if you look at Medis, the Redis GUI I'm using, you can see there's now one key, the key we're writing to, with 99 entries on the first page and 123 in total on the next page. If I execute the request again, instead of roughly two seconds, two thousand milliseconds, it now comes back in 91 milliseconds; that's lightning fast. I know it shows slightly more because of rendering and everything, but the real time is 83 milliseconds. Keep in mind this cache expires after one minute; one minute is just for testing. Let's wait and refresh: now the key is gone, so this request again takes more than a second and prints a cache miss, about 2,930 milliseconds. The next one comes from the cache: 178 milliseconds instead of 2,900. So there's a huge difference between the original query and the cached one.

With the original approach, the user would wait at least three seconds to see everything in the grid, and if the grouping isn't done on the backend, probably four or five seconds, because the client browser's computing power is always less than the backend's. In our original use case, fetching to the frontend and grouping there, it might be five seconds; once we grouped on the backend and sent everything down it was about three seconds; and now that we've introduced caching it's around 84 milliseconds, a tenth of a second or less. Very, very fast. In the next video we'll push this further by adding pagination, and then users will really like working with the application: you click the button and the data is just there.

In the next video I'll show you more detail. If you have any questions regarding this video, please comment below or reach out through my Facebook page; I'm happy to answer all of your questions as soon as I can. Thank you very much for watching, and if you're not subscribed yet, please go ahead and subscribe and click the bell icon as well. Stay safe, take care.
Info
Channel: Krish Dinesh
Views: 809
Keywords: krish, krish dinesh, krishantha dinesh, codelabs microservices, codelabs, redis cache, redis nodejs, redis cache spring boot, redis spring boot, redis docker, redis as message queue, redis as message broker, redis authentication node js, redis app, redis and react, redis cli, redis cache java, redis cache spring boot example, nestjs redis cache, nestjs redis queue, nestjs microservices redis, redis with nestjs, microservice redis, nestjs redis, nestjs, nestjs tutorial
Id: Lmom8_CDKj0
Length: 52min 34sec (3154 seconds)
Published: Tue Nov 09 2021