.NET 7 💥 - ASP.NET Core ElasticSearch, Kibana & Serilog Integration

Captions
Thank you for watching this video. I am Mohamad, and today we're going to discuss how to integrate Elasticsearch and Kibana into our .NET 7 application: how to configure Elasticsearch and Kibana locally on our environment, how to connect them to our .NET application, and how to propagate the logs from our application so we can see them in Kibana. If you like this video please like, share and subscribe, it really helps the channel, and if you'd like to support me please consider supporting me on Patreon or buying me a coffee. With that said, grab a cup of coffee and let's get started.

We start by creating our web API. Inside our terminal we run dotnet new webapi and call it SampleApi; it takes just a few seconds. Perfect. I navigate into SampleApi and open it in Visual Studio Code, open a new terminal there, and run dotnet build, and we can see the application built successfully.

The first thing I want to do is install some NuGet packages, but before that let's initialize our ELK components. I want a full Elastic cluster available on my local machine, as well as Kibana, and for that I'm going to create a Docker Compose file so I can manage everything from a single place: it will pull the Elasticsearch and Kibana images from Docker Hub and initialize them on my machine. So inside my project directory I create a new file and call it docker-compose.yml.

We start by specifying the version, 3.1, and then the services. The first service is elasticsearch. I specify the container name, in this case els, and then the image; if you want to find an image you can go to Docker Hub and search for the one you need. Here it's docker.elastic.co/elasticsearch/elasticsearch with tag 8.7.1 (I forgot an "h" at first, so let me add it). Next I specify the ports, 9200 mapped to 9200, the default port Elasticsearch listens on. Then the volumes, which is where my data will be stored: by nature a Docker container is stateless, so any information stored inside it is deleted once the container stops. I want a way to persist the data, and a volume gives me exactly that, so under volumes I map one called elasticsearch-data to /usr/share/elasticsearch/data inside the container.

Once that's done I specify the environment variables, and these are straightforward: two items, one for security and one for configuration. For security I set xpack.security.enabled to false, because this is a local environment and I don't want to enable it right now, and I set discovery.type to single-node, meaning we run a single Elasticsearch node, which keeps things easy while developing locally. The last thing I add is a network. The network facilitates the communication between Elasticsearch and Kibana: both of them need to be on the same network so they can see each other's ports and talk to each other, because Kibana has to take the logs Elasticsearch holds and display them for us. Putting both on the same network inside this compose file gives us a bridge network where they can reach each other. I create the network and call it elastic.

That's the first service. The second service is kibana, and it's also pretty straightforward. First the container name, kibana, then the image, docker.elastic.co/kibana/kibana with the same version, 8.7.1. Then the ports, 5601 mapped to 5601. The next step is really important: depends_on. With depends_on we link the two containers together, meaning the elastic container needs to be fully running before Kibana starts and executes all the configuration it needs in order to communicate with Elasticsearch. If Kibana came up first while Elasticsearch wasn't running, its connection setup would fail, and it would not restart itself unless we stopped and restarted it. The dependency makes Docker Compose wait until Elasticsearch is fully running before initiating Kibana. The value of depends_on is the service name we defined on top, elasticsearch; that's how we chain the services together. Then we specify the environment variable ELASTICSEARCH_URL, allowing Kibana to connect to Elasticsearch at http://localhost:9200. Lastly I specify the network, which again is elastic.

The last two items are the top-level sections: networks, where I declare the elastic network and specify its driver, meaning the network type, which we'll make a simple bridge; and volumes, where I declare the elasticsearch-data volume my Elasticsearch service is using. And that's it. A quick summary of what we've done so far: we created the elasticsearch service and configured its container name, image, ports, volumes and environment variables; we did the same for kibana; and then we declared the networks and the volume.

Now let's bring this up in a separate terminal. To execute this you need Docker running on your machine: once you download Docker Desktop you should see the whale icon at the top, and when the whale icon is green, Docker is running as it should. Now all I run is docker-compose up, and we can see Kibana and Elasticsearch being pulled from Docker Hub; once they're downloaded onto our machine they'll start configuring themselves. This is going to take a couple of minutes.
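Put together, the compose file described above can be sketched roughly like this. The service names, the els container name, and the 8.7.1 tag follow the walkthrough; note that the ELASTICSEARCH_URL value mirrors the video, although inside a compose network Kibana would normally reach Elasticsearch by its service name:

```yaml
version: "3.1"

services:
  elasticsearch:
    container_name: els
    image: docker.elastic.co/elasticsearch/elasticsearch:8.7.1
    ports:
      - "9200:9200"   # default Elasticsearch HTTP port
    volumes:
      # persist index data across container restarts
      - elasticsearch-data:/usr/share/elasticsearch/data
    environment:
      - xpack.security.enabled=false   # fine locally, never in production
      - discovery.type=single-node
    networks:
      - elastic

  kibana:
    container_name: kibana
    image: docker.elastic.co/kibana/kibana:8.7.1
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch   # wait for Elasticsearch before starting Kibana
    environment:
      # as in the video; Kibana 8.x setups commonly use ELASTICSEARCH_HOSTS
      # with the service name, e.g. http://elasticsearch:9200
      - ELASTICSEARCH_URL=http://localhost:9200
    networks:
      - elastic

networks:
  elastic:
    driver: bridge

volumes:
  elasticsearch-data:
```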
Once it's up and ready, we resume a few moments later. Okay, perfect, now that it's running let's see what happened, which is really important. If I scroll all the way up we can see that our kibana and elasticsearch images were pulled successfully. The log starts populating right away, but basically Docker Compose attaches Elasticsearch and Kibana together and makes sure both instances are available before they actually do any work. We can see that Kibana is running, and scrolling up, that Elasticsearch, the els container, is also running.

To verify this, let me open my web browser and visit two URLs. The first is localhost on port 9200, and we can see that my Elasticsearch cluster is running as it should: it's on the version I specified, 8.7.1, the build type is docker, and all the information we need is there. And on the Kibana side we can see that Kibana is running and connected to Elasticsearch.

Now that all of this is set up, the next step is to install some packages inside the web API we created. Back in Visual Studio Code, in my terminal (let me clear it so it looks a bit better), I run dotnet add package for each one. The first package is Serilog.AspNetCore, which overrides the default logging system that exists within .NET Core. Then dotnet add package Serilog.Enrichers.Environment, which gives us more logging enrichment. Then dotnet add package Serilog.Sinks.Console: Serilog can output logs to different destinations, and it refers to these destinations as sinks; a sink could be Elasticsearch, the console, anything we want, which is why we add these sink packages. Then dotnet add package Serilog.Sinks.Debug, so we'll be able to see the debug output. Then dotnet add package Serilog.Sinks.Elasticsearch, the one we need for Elasticsearch. And the last one is dotnet add package Serilog.Exceptions. Perfect.

Once that's done, to verify the packages I just open my SampleApi.csproj and I can see everything I need there. Next I have to update my Program.cs and my app settings to take full advantage of Serilog and the logging mechanism. The first step is to go to appsettings.json and add all the configuration needed for Elasticsearch: the port Elasticsearch is running on, so my application can find it, and the configuration for Serilog, that is, the level of detail I want to log inside my application. So let's open appsettings.json.
There's some default content here; let's leave it for now, we can remove it later. The first thing I add is a Serilog section. Inside it goes the minimum level of information I want to record, MinimumLevel, which is another JSON object: I set Default to Information, then add an Override object with Microsoft set to Information and System set to Warning. Once that's added, the next item is the Elasticsearch configuration, which is as simple as adding the URI the application needs: http://localhost:9200 (and let me add a comma here). Perfect. Now let me do a dotnet build. Excellent, the application builds.
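The appsettings.json additions described above can be sketched like this. The ElasticConfiguration section name is the key the walkthrough reads back later when building the sink options; the AllowedHosts entry is the template default left in place:

```json
{
  "Serilog": {
    "MinimumLevel": {
      "Default": "Information",
      "Override": {
        "Microsoft": "Information",
        "System": "Warning"
      }
    }
  },
  "ElasticConfiguration": {
    "Uri": "http://localhost:9200"
  },
  "AllowedHosts": "*"
}
```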
The next step is Program.cs; I need to add everything that gets injected at the startup of my application, so the default logging mechanism is overridden and Elasticsearch is configured correctly, and all the logs that are collected get sent directly to Elasticsearch. First we create a function responsible for configuring logging: we go all the way down, create a new void function and call it ConfigureLogging, straight to the point. Now let's implement it.

First I want the environment, so: var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT"). Now that I have my environment, I start configuring my logger. I declare var configuration = new ConfigurationBuilder(), and inside the builder I add some JSON files. First AddJsonFile with appsettings.json, passing optional: false and, really important, reloadOnChange: true. Then a second AddJsonFile, which lets us add a settings file per environment, so appsettings.{environment}.json, again with optional: false. Perfect.

Now we build the actual logger. Add a semicolon, and then: Log.Logger = new LoggerConfiguration(), and let's fix the references so we get completion; as you can see it added the Serilog using. We start with .Enrich.FromLogContext(), that's the first item, then .Enrich.WithExceptionDetails() (let's fix this reference too). Then .WriteTo.Debug() so we can see the output while debugging, .WriteTo.Console() so we can see it in the console, and then .WriteTo.Elasticsearch(), which is the main thing we need. To it we pass ConfigureElasticSink, a helper we'll add manually in a moment, giving it the configuration we built and the environment; it will ask us to add references, so let's continue and figure it out after. Then .Enrich.WithProperty("Environment", environment), then .ReadFrom.Configuration(configuration), and lastly .CreateLogger() to close it off (this one needs a capital C).

Let's see why it's still not happy. Checking the project, the Serilog Elasticsearch package didn't actually install successfully, so let's install it again: dotnet add package Serilog.Sinks.Elasticsearch. Perfect, it's added, and now the WriteTo.Elasticsearch reference resolves. It still complains that ConfigureElasticSink does not exist, so now I need to create it so I can actually configure the Elasticsearch sink. I'll add this method right after ConfigureLogging, and it's going to be straightforward: within it we create the configuration Elasticsearch needs in order to work. What we did just now was configure the logger: we told Serilog how it's going to grab the logging information.
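Assembled, the ConfigureLogging method built up above can be sketched roughly like this, assuming the top-level Program.cs style of the .NET 7 template. ConfigureElasticSink is the helper defined next in the walkthrough; ReadFrom.Configuration comes with the Serilog.AspNetCore package:

```csharp
using System.Reflection;
using Serilog;
using Serilog.Exceptions;

static void ConfigureLogging()
{
    var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");

    // Rebuild configuration so the logger can read appsettings values,
    // including the per-environment overrides file.
    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        .AddJsonFile($"appsettings.{environment}.json", optional: false)
        .Build();

    Log.Logger = new LoggerConfiguration()
        .Enrich.FromLogContext()
        .Enrich.WithExceptionDetails()   // from Serilog.Exceptions
        .WriteTo.Debug()
        .WriteTo.Console()
        .WriteTo.Elasticsearch(ConfigureElasticSink(configuration, environment))
        .Enrich.WithProperty("Environment", environment)
        .ReadFrom.Configuration(configuration)
        .CreateLogger();
}
```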
Once it has grabbed that information, the logger needs to know how to propagate it within the system. Now we want to pass the configuration for Elasticsearch: that configuration feeds into the logger, the logger feeds into the ASP.NET application, and that way we have a full cycle of configuration. So, back in the code, the method returns ElasticsearchSinkOptions; the name matches what we referenced earlier, so let's just copy-paste it, that's easier. It takes an IConfigurationRoot that I'll call configuration, and a string, which is our environment. Once I have that I need to populate it: return new ElasticsearchSinkOptions(...). It requires us to add references and isn't happy; I think the issue is a missing using, and once we add it, it finds the correct type, so let's add the using at the top as well.

Now let's start with its configuration. First we pass a new Uri, and into it the URL for our Elasticsearch from our configuration: configuration["ElasticConfiguration:Uri"], which comes straight from the appsettings. Once we've passed the URI, we initialize the rest of the options. The first is AutoRegisterTemplate = true, which automatically registers the index template in the running Elasticsearch service.

Next we create our IndexFormat. The index format can take any form you want, any naming convention, but it's a really good idea to give it something that will be easy to find within the actual logs: if we're using one Elasticsearch service to log multiple applications, we want a way to figure out which log came from which application and from which environment of that application, a separation of concerns within our logs. So it's really good to give our index a naming convention we can refer to. The way we'll do it is with reflection, getting the name of the assembly our application is running from. It's straightforward: using string interpolation (the dollar sign), we put Assembly.GetExecutingAssembly() (let's fix the reference, we're using System.Reflection), then .GetName().Name, then .ToLower() so we get it in lower case. Then I want to replace every dot, because my application name could be compound, for example SampleApp.Api, SampleApp.DataService, SampleApp.Email; we don't want the dots, a dash makes it easier, so I add .Replace(".", "-").

After that I append the environment name the application is running on: a dash, then environment.ToLower(), again everything lower-cased. The last nice thing I want to add is the date format, which lets me know when this index was created; if the application crashes and runs again it will either create a new index or append to the current one, so it's really good to have the date attached. We don't really care about the exact local time, so let's use DateTime.UtcNow, formatted to give us the year as well as the month.

Once we have configured this index, there are a couple of extra items you can add, extra configuration for managing your Elasticsearch cluster. They're not really important and we're not going to dive into what they mean right now, but I'll give a quick high-level overview.
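The sink-options helper described above can be sketched like this. The ElasticConfiguration:Uri key matches the appsettings entry added earlier; the exact date format string and the example index name in the comment are illustrative assumptions:

```csharp
using System.Reflection;
using Serilog.Sinks.Elasticsearch;

static ElasticsearchSinkOptions ConfigureElasticSink(IConfigurationRoot configuration, string environment)
{
    return new ElasticsearchSinkOptions(new Uri(configuration["ElasticConfiguration:Uri"]))
    {
        // register the index template automatically in the running cluster
        AutoRegisterTemplate = true,
        // e.g. "sampleapi-development-2023-05" (hypothetical): assembly name
        // with dots replaced by dashes, then environment, then year-month
        IndexFormat = $"{Assembly.GetExecutingAssembly().GetName().Name!.ToLower().Replace(".", "-")}-{environment.ToLower()}-{DateTime.UtcNow:yyyy-MM}",
        NumberOfReplicas = 1,
        NumberOfShards = 2
    };
}
```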
The first one is the number of replicas, NumberOfReplicas, which I'll set to 1, and then NumberOfShards, which I'll set to 2. Now that we've added this, the next step is to inject it into my service. Before the builder in Program.cs I call ConfigureLogging(), and once that's added, all I need to do is enable Serilog through our services: builder.Host.UseSerilog(), and that's it. With these two lines, the first adds all the configuration I need, and the second enables Serilog inside my service structure, so Serilog is automatically available for me from the DI container.

Now, as you can see, there's one error: the configuration isn't working. Two things: this flag needs to be true, and we need to add .Build() at the end of the ConfigurationBuilder. Perfect, that fixes the problem. Let's do a dotnet build to make sure everything builds as it should, and we can see build succeeded.

Now let's test this. I go to my weather controller, and inside my Get endpoint I add a simple log: _logger.LogInformation("Hello from action"). Let's run the application with dotnet run. Oops, we have an issue: file not found, configuration file appsettings.json; maybe I misspelled it. Yes, I did, so let's fix the name and run it again. Now let's run our application, generate some logs, and start seeing them in Kibana, pulled from Elasticsearch. Inside Visual Studio Code I run dotnet run, it builds, and now it's running. I'm going to go back to my web browser.
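The wiring step just described comes down to two lines around the host builder. A minimal Program.cs sketch, assuming the controller-based template from the video:

```csharp
using Serilog;

ConfigureLogging();   // build the Serilog logger before the host starts

var builder = WebApplication.CreateBuilder(args);

builder.Host.UseSerilog();   // replace the default .NET logging with Serilog

builder.Services.AddControllers();

var app = builder.Build();

app.MapControllers();
app.Run();
```

Inside the controller action, a line like `_logger.LogInformation("Hello from action");` then produces the log entry that shows up in Kibana.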
What I do is execute the endpoint a few times, and looking back at the application output we can see that my log is popping up correctly: Hello from action. Now I go back to my web browser and into Kibana to see my index. First, take a look at the port we're visiting: localhost on 5601, which is exactly what we configured for Kibana inside the docker-compose file. And if I refresh the other tab, we can see that Elasticsearch is running on port 9200, which is exactly what I want.

Now I want to see the index the application generated when it ran and configured itself. I click the menu on the left-hand side, go all the way down, and we can see Stack Management; inside Stack Management, under Data, I click Index Management. Through this I can see an index for my sample application in development, where sampleapi is the name of my application DLL, followed by the environment and the date. And here we can see a couple of pieces of information: the status is open, which means it's accepting logs, and we have the primaries and the replicas. If I go back to Visual Studio Code and my Program.cs, I specified NumberOfReplicas equal to 1 when I configured it, and we can see it's 1; and NumberOfShards equal to 2, which is my primaries, also matching. And because I was testing this a bit, we can see there are already some logs populated here. So now, how can I actually see these logs?
What I can do is click back on the Elastic icon and, under the menu, go to Analytics and click Discover. This is a page where I can see my data views; I have one I created before. What I want to show you now is how to create your own data view for these logs. Under the view selector you should see something like Sample; click the small arrow and click Create a data view. You can give the view whatever name you want; I'll name it with a capital S. Then you'll see an index pattern: let's copy whatever index currently exists, paste it here, and then remove everything after the dash and put a wildcard.

Why did I put a wildcard? Because I want to catch all the logs coming out of my application no matter where it's running. In this example I'm using my application in a development environment on my local machine, but what happens when this application runs somewhere else, in production, pre-prod, or a sandbox environment? I want to capture those logs too. The wildcard lets me capture everything and then filter the incoming requests by environment. I'm not limiting myself to the date and the environment attached to my DLL; I'm capturing everything, and then through the search bar I can filter down to what I need. And this is the power of Kibana: I can create wildcard patterns to match whatever logs I want coming out of my application. So inside this dialog I'm going to keep it like this.
to click Save data view. Once I do that, we can see it automatically starts populating from all of the different indices that currently exist. What I want to do right now is widen this a bit: let's say I started this yesterday at 5 AM, and I want to keep it until the end of today, so I click on Update. Now we can see that we have 584 hits, and I can see the different logs that have been collected throughout the day since yesterday. You can see the record counts here and, based on the different times, what the different information is. If I click on one of these messages popping up here, and then click on these two arrows, I can see the different information that was provided and captured: the unique ID that came with this request, which index it's in, the timestamp that was generated, what endpoint was hit, what response was generated, the messages that were propagated, and so on. I can even see the unique action that was invoked. All of this information is coming out directly, just by injecting and configuring Elasticsearch inside our application and making sure that Elasticsearch and Kibana are working together through the Docker compose file. So what I want to do right now is create an error, or basically throw an exception, and see how that is displayed inside my Kibana. Back in my Visual Studio Code, inside my WeatherForecast controller (first let me stop my application from running), I'm going to create a simple try-catch, and I'm going to throw a new exception saying "Error in the
application", yeah, something along those lines. Inside my catch I'm going to catch the exception, and all I'm going to do is call logger.LogError and pass the message that I want, so I'm going to say "something went wrong". We can see there are different ways I can populate this: I can pass parameters, I can pass different items to make this work the way I want. So first I want to pass my exception, then a message, and then any parameters that I have. Currently I don't have any, but let's say we're passing something interesting; I'm just going to put 5, or any random number, as a parameter. Now that I have that, I'm going to run my application again with dotnet run; my application is building, and we can see it's running. First things first, I want to go back to my web browser: inside Swagger I'm just going to execute this once, and if we come back here and look up, we can see that we have an error generated, "something went wrong", and we can see the actual exception that was thrown, which is great. We can also still see the "Hello from action" log that we added previously. So now, inside Discover in Elasticsearch, let's stop this and refresh: we can see that the number of errors, sorry, the number of log entries, has gone up, and the red line that I had has now become a bit thicker because I have more exceptions (there was another one before, because I tested this earlier). Once we have all of that, let's see how we can search for the errors. So inside here, for example, I'll put something from the message, let's take "in the application" for example; let me put it here, and
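The try-catch described above could look like the following sketch. The `_logger` field name, the `{Parameter}` placeholder name, and the value 5 are assumptions based on the walkthrough, not the video's exact code:

```csharp
// Sketch of the try/catch added inside the WeatherForecast controller's action
try
{
    // Deliberately throw so we can watch the error surface in Kibana
    throw new Exception("Error in the application");
}
catch (Exception ex)
{
    // Passing the exception as the first argument lets Serilog capture the
    // full stack trace; {Parameter} is a structured-logging placeholder
    _logger.LogError(ex, "Something went wrong: {Parameter}", 5);
}
```

Passing the exception object itself, rather than only `ex.Message`, is what makes the full exception stack show up as a separate field in the Kibana document.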
let's search for it. Now we can see that the message has been filtered: "Error in the application". We can see the time of the error and the exception, and if I click on these two arrows again we can see a lot of different items. First of all, similar to the plain log, we can see the unique ID and the index, but the nice thing about this is that we can see the actual exception stack. This exception stack will allow us to see what was going on inside our application, and potentially what went wrong, in order for us to solve it. This is going to play a key role in debugging our application, figuring out the problem, and trying to solve it, because within the exception stack we're able to see the exact information about what was happening. Now if I scroll up and down in here, we can see the action ID, the action name that was called, the development environment, as well as the message that I populated, and we can see the level of what happened: the level is "Error", and the message is "something went wrong", which we added here within the logger. All of this information that we have populated is available directly to us within the Kibana dashboard. There's a lot more to how we can utilize this, but this is a very high-level overview of Kibana as well as Elasticsearch and how they work together with .NET, in order for us to capture all of these different logs and all of this different information. So we can see the power that Kibana and Elasticsearch provide when it comes to logging, working along with Serilog. This was a very high-level overview; if you'd like me to jump more into, for example, the language that Kibana utilizes, which is going to be KQL, or different features that Kibana provides, please
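As a taste of the KQL mentioned above, these are hypothetical queries you could type into the Discover search bar. The field names (`level`, `message`, `fields.ActionName`) are assumptions based on typical Serilog Elasticsearch documents; check the field list of your own documents and adjust:

```text
level : "Error"                       only error-level entries
message : "*application*"             wildcard match on the message text
level : "Error" and fields.ActionName : "*WeatherForecast*"
                                      combine conditions with "and"
```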
mention it in the comments down below and I will create a video around that as well. If you'd like to dive deeper into Elasticsearch within .NET 7, also please let me know and I'll make sure to cover those points. I hope this video was helpful; please like, share and subscribe if you found it helpful, it will really help the channel. Let's do a quick overview of what we have covered today. In today's session we configured Kibana and Elasticsearch locally on our machine using Docker Compose; we connected them together through the Docker compose file and made them run locally. Once we had done that, we created our .NET application, configured Serilog there, and added some enrichment to it so we could actually see what was happening. We also configured the Serilog, sorry, the Elasticsearch and Kibana integration directly in our application, and we were able to propagate all of the logging happening inside our application to Elasticsearch and Kibana. From there we went into Kibana and were able to see all of the information, all of the logs coming out of our application. Then we created an exception and were able to see that exception as well inside Kibana. Again, I hope this video was helpful; please like, share and subscribe, it will really help the channel, and have a great day.
Info
Channel: Mohamad Lawand
Views: 23,215
Keywords: .net, api, c#, dependency injection, entity framework c#, entity framework, asp.net core tutorial, .net core, asp.net core, web api, dotnet core web api, dotnet core, dotnet performance, logs, dotnet logging, dotnet logger, dotnet core elasticsearch, dotnet serilog configuration, serilog dotnet 6, dotnet serilog dependency injection, dotnet elasticsearch kibana, kibana setup, dotnet kibana setup, dotnet configuration, how to code, dot net
Id: zp6A5QCW_II
Length: 45min 50sec (2750 seconds)
Published: Sat May 20 2023