Logging into Elasticsearch using Serilog and viewing logs in Kibana | .NET Core Tutorial

Captions
Hello everybody, I'm Nick, and today I'm going to show you how you can integrate your ASP.NET Core application with Elasticsearch in order to push your logs, then visualize those logs in Kibana, search through them, and see exactly what's going on at a given moment in time. This is very useful for anybody running anything in production, because you want to be able to see exactly what happened to your system at a given moment in time, and you want that to be scalable and easily filterable. I won't be going in depth into what Elasticsearch is, exactly how it works, or how to optimize it, but I will show you how you can spin it up and integrate with it in order to start pushing logs; I am planning to make a dedicated video on Elasticsearch in the future. This video is part of my general ASP.NET Core series, so if you don't want to miss any episodes, please subscribe and hit the notification bell to get notified when I upload a new video. So let's dive straight into the code. What I have here is what you get out of the box when you create a web API in .NET Core: a WeatherForecastController, and if you run this, all it really does is return fake data. So it's running, and if I go to Postman and hit the endpoint, as you can see, I'm getting a bunch of fake temperature data. It doesn't really matter what this is doing; just imagine that for you this is your ASP.NET application, your Blazor application, your web API. It can be anything. So I'm going to go ahead and stop this. Now, as you can see, we're injecting a logger here, which you get out of the box, but if you log something with it, the only place the log will appear is the console. Let me just show you: if I do LogInformation("Hey, this is a request") and run this again, the only place you're going to see this log is in the console. So if I go back to Postman and run this, here you go: "Hey, this is a request". This is coming from our LogInformation call.
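A minimal sketch of what that out-of-the-box controller with the injected logger might look like; the shape follows the default Web API template, and the log message is the one from the video:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

public class WeatherForecast
{
    public DateTime Date { get; set; }
    public int TemperatureC { get; set; }
}

[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private readonly ILogger<WeatherForecastController> _logger;

    // ILogger<T> is provided by the framework's built-in DI container
    public WeatherForecastController(ILogger<WeatherForecastController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IEnumerable<WeatherForecast> Get()
    {
        // With the default configuration this only shows up in the console
        _logger.LogInformation("Hey, this is a request");

        var rng = new Random();
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 55)
        });
    }
}
```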
Now, obviously this isn't very useful on its own, but what could be more useful is logging something related to your request, or, for example, an exception thrown during this call. Ideally you'd want this catch to live in middleware, but for now I'm just going to keep it within the method to show you how it would work; you can very easily extract this logic into middleware later. So let me quickly change the return type to IActionResult, stop this, and either return Ok with the body, or return a new StatusCodeResult with the status code being 500, so we're not exposing the exception. We'll also say logger.LogError; one overload accepts an event id, which we don't care about, but what we do care about is the exception, so I'm going to log the exception and then say "Something bad happened". So what we're doing here is: this code runs, there might be an exception during the call, and if there is, we log it. In fact, since we have a Random here, I'm going to say rng.Next(0, 5), and if that value is less than 2, throw new Exception("Oops, what happened"). So if I run this now, a few of my requests should succeed and a few of them should fail. Going back to Postman: the first one worked, the second one didn't. "An error has occurred"; we caught it, we returned 500, and we logged "Oops, what happened" along with the failure and the exception. So this is nice, but the log is concentrated in our console and only our console. It's not going anywhere; I can't see it later, and I can't analyze it unless I'm logging into a file or something, which isn't convenient. This is where Elasticsearch and Kibana come into place. Elasticsearch is effectively a distributed version of Lucene, which lets you search through data in general, and it turns out that such a thing is also very useful for time-series data like logs or metrics. What I have in this docker-compose file, which is in the repo you'll find in the description down below, is a way to spin
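The updated action from that walkthrough might be sketched roughly like this; the random failure rate and the messages come from the video, while the exact body shape is my assumption:

```csharp
[HttpGet]
public IActionResult Get()
{
    try
    {
        var rng = new Random();

        // Artificially fail roughly 40% of requests so we have errors to look at
        if (rng.Next(0, 5) < 2)
        {
            throw new Exception("Oops, what happened");
        }

        var forecast = Enumerable.Range(1, 5).Select(index => new
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 55)
        });

        return Ok(forecast);
    }
    catch (Exception e)
    {
        // Log the exception, but return a bare 500 so its details
        // are not exposed to the caller
        _logger.LogError(e, "Something bad happened");
        return new StatusCodeResult(500);
    }
}
```

In a real application this catch would usually live in exception-handling middleware rather than inside each action, as mentioned above.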
up a single-node Elasticsearch cluster through Docker, along with Kibana. Kibana is effectively a service that connects to that Elasticsearch cluster, visualizes its data, and also manages it. So what I'm going to do is go to PowerShell, point to where the compose file is, and say docker-compose up. This will start my Elasticsearch and Kibana containers, and it might take a minute on your PC or your Mac depending on whether you already have the images; I already have them, so you can see the cluster spinning up. This output might look like nonsense, but it's Elasticsearch starting, and once Elasticsearch is ready it will also spin up Kibana. So we'll give that a minute, and in the meantime we need to start changing our application to push to Elasticsearch. How are we going to do that? Well, the first thing, and this is probably a personal preference and might not be the same for you, is that I like logging with Serilog because of its structured logging mechanism; I'm just so used to it, and it has so many sinks. A sink is effectively a way to dump the data to a different location, and for us that will be the Elasticsearch sink. It also has enrichers, which enrich your logs with extra information so you get more out of them. So we're going to use that. The first thing we want to do to add Serilog is go into NuGet and install Serilog.AspNetCore. This not only brings in Serilog, it also brings in enrichment for requests, and it starts automatically logging every request in our system, which is good. You might want to limit that, as there are scenarios where you could be over-logging, but in general you want to log as much useful information as possible. Then I'm also going to add a sink, the Elasticsearch sink package; this is where we're going to push the data, and
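The compose file itself isn't shown in the captions, but a minimal single-node Elasticsearch plus Kibana setup along these lines is a plausible sketch; the image versions and settings here are my assumptions, not the exact file from the repo:

```yaml
version: "3.1"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.2
    environment:
      # Single-node discovery: fine for local development, not production
      - discovery.type=single-node
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.9.2
    environment:
      # Inside the compose network Kibana reaches Elasticsearch by service name
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```

With this running, Elasticsearch answers on localhost:9200 and Kibana on localhost:5601, matching the addresses used later in the video.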
that's the dumping mechanism, in a sense. Last but not least, I'm going to grab an enricher just to show you how that works as well; I'm going to use the Serilog.Enrichers.Environment package and add that to the project. So that's three packages. The first thing I want to do is go to appsettings.json, remove the whole Logging section, and replace it with Serilog's own configuration. You can extend it; this is probably the minimum you require for it to run, and in fact you might not even need that, but all it really says is: what's the minimum level you want me to log (the default is Information), then override Microsoft's logs to Information as well and System's logs to Warning. I won't dive too much into Serilog; it's just a way for us to easily push the data into Elasticsearch and to log very nicely in our application. Now that we have that, we can go to Program.cs, and this is the only place where we need to change something for the mechanism to work; you'll see it's very easy to get working. First you go to Host.CreateDefaultBuilder, create a new line, and say UseSerilog; this comes from the Serilog.AspNetCore package we added. I'm going to use the overload that gives you the context and the logger configuration, and here we're going to configure our logger. The first thing you want to do is say configuration (this is the LoggerConfiguration), then Enrich.FromLogContext, which adds some information from the log context to the logs, then Enrich.WithMachineName, which comes from the environment enrichers, I think. Then you can specify WriteTo.Console, so we keep writing to the console, but also WriteTo.Elasticsearch, which comes from the sink we added. Now we're going to configure our sink; let me make this bigger, and we're going to say new ElasticsearchSinkOptions, and
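The replacement appsettings.json section isn't spelled out word for word in the captions, but based on the levels described it would look something like this; the exact keys beyond the Serilog block are assumptions that anticipate the configuration entries added later in the video:

```json
{
  "Serilog": {
    "MinimumLevel": {
      "Default": "Information",
      "Override": {
        "Microsoft": "Information",
        "System": "Warning"
      }
    }
  },
  "ElasticConfiguration": {
    "Uri": "http://localhost:9200"
  },
  "ApplicationName": "elasticsearchapp",
  "AllowedHosts": "*"
}
```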
the first thing we need here is the URL of Elasticsearch, and it's good to put this in configuration so you can override it if you want. So we're going to go back to appsettings, and I'm going to add an ElasticConfiguration section with a Uri entry holding the Elasticsearch address. Let me show you: by default it's localhost:9200, so that's what we have here; you can see it's the instance running in Docker. Kibana is also accessible; as you can see, this is Kibana, and it's what we're using to visualize the data in Elasticsearch. So I'm going to move back and set the value to http://localhost:9200; that's it, nothing else required here. Now, back in the application, I'll say new Uri(context.Configuration[...]), using the configuration path, which is the ElasticConfiguration section, then a colon, then Uri. Now I have my sink, but there are a few other things to configure, so I'll go back and expand on this. The one thing we obviously need is something called the index format; indexes are where data lives in Elasticsearch, and we need to provide a name for ours, which we do here. Now, I'm going to paste the index format in, because it's a long one; let me make this smaller and break it down for you. First, we get a configuration value from the config, which is the application name. I could get this with reflection from the assembly name itself, but personally I want more control over the value, so I'm going to create an entry called ApplicationName and name mine elasticsearchapp or something; you can name yours whatever you want. Then I'm going to add "logs" to the name, and then I'm going to get the hosting
environment, lowercase it, and replace any dots with hyphens, because you pretty much have to. Then I'm going to break my indexes down by year and month. We do that because if indexes get too big they can't be distributed nicely, and performance degrades over time; this way you get a fresh set of shards each month. I've actually seen this done on a per-day basis as well, if you have hundreds or thousands of logs coming in, and it keeps things efficient. Something else we want is to set AutoRegisterTemplate to true, and then NumberOfShards, which I'll keep at two. You could also set the number of replicas to one, but you don't really need to for the purposes of this video. This shard and replica terminology might not make any sense to you yet; we'll cover it in the Elasticsearch video. Now that I have that, I'll go ahead and expand with a few other enrichers; as you can see, we keep chaining. I'm going to say Enrich.WithProperty, with the property named Environment, getting the value from context.HostingEnvironment.EnvironmentName, and then ReadFrom.Configuration(context.Configuration). That last call uses our application's configuration, the section coming from the JSON file, to automatically configure things like the minimum level, the default, the overrides, and all that. Basically, what you see here is all you need to get started; I can actually run this and you'd start seeing logs. So now all we need to do is run the application, and it's running successfully. As you can see, the logging has changed: we're now using Serilog with structured logging, so the formatting looks a bit different, but we're still logging everything. If I go to Postman again and start clicking (error, error, success, success, success, error), we now have a few things logged, so I'm going to go to Kibana now,
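Putting the whole Program.cs change together, the builder described above might be sketched like this. The option names come from the Serilog.Sinks.Elasticsearch package and the index format follows the description in the video, but treat the details as a sketch rather than the exact code from the repo:

```csharp
using System;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;
using Serilog;
using Serilog.Sinks.Elasticsearch;

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .UseSerilog((context, configuration) =>
        {
            configuration
                .Enrich.FromLogContext()
                .Enrich.WithMachineName()   // from Serilog.Enrichers.Environment
                .WriteTo.Console()
                .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(
                    new Uri(context.Configuration["ElasticConfiguration:Uri"]))
                {
                    // e.g. "elasticsearchapp-logs-development-2020-08":
                    // app name + "logs" + lowercased environment + year-month
                    IndexFormat =
                        $"{context.Configuration["ApplicationName"]}-logs-" +
                        $"{context.HostingEnvironment.EnvironmentName.ToLower().Replace(".", "-")}-" +
                        $"{DateTime.UtcNow:yyyy-MM}",
                    AutoRegisterTemplate = true,
                    NumberOfShards = 2,
                    NumberOfReplicas = 1
                })
                .Enrich.WithProperty("Environment", context.HostingEnvironment.EnvironmentName)
                // Pull minimum level, default, and overrides from appsettings.json
                .ReadFrom.Configuration(context.Configuration);
        })
        .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>());
```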
which you can access at localhost:5601. I'm going to go to Discover, then manage spaces, then Index Patterns. An index pattern is basically a naming convention that tells Kibana which indexes to collect as a group, if you like. So I can go here and see that it's actually matching one of the sources, elasticsearchapp-logs-development plus the date, as you can see here. So I'm going to use that and set the index pattern name to elasticsearchapp-logs- followed by a wildcard, because we want to match the environment and the date. If you wanted one specific to development you could write "development" and then the hyphen and wildcard, but we don't really want to do that. So I'll accept that, click next step, and specify a time field, which is the timestamp; that effectively orders the data in a time-series fashion as it comes in. Then I say create index pattern, and now we have an index pattern, and these are all the properties we're actually logging, as you can see here. Now I can go back to the dashboard and click Discover, and as you can see, in Discover we have all the logs. If I want to search for all the errors, I can say level: Error, and that automatically gives me back all the error logs. You can see the exception, the stack trace, which environment it happened in, which machine it happened on, the trace id, the message, the message template: everything, which is very handy, because now you don't have to log into a machine and read the console or the files; anybody on your team can access that URL and see what's happening. And there are many cool things you can do with Kibana. You can even go to Logs, change the source, provide your index pattern there, apply it, and then go back to Stream, and you'll see all the results being streamed back. So if I run this you'll see, oh, I have to enable live streaming; here you go,
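As a quick illustration, the error search above is just a query in the Discover search bar; the first line is the one from the video, and the second is a hypothetical variant showing how the same syntax filters on any logged field (field names depend on what your enrichers and sink actually emit):

```
level : "Error"
fields.MachineName : "MY-DEV-BOX"
```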
this will now be updated with more logs. Remember that there's no significant performance hit from implementing this, because the logging doesn't actually go to the server every time you log something; it's buffered and happens asynchronously, so you don't have to worry about performance. Now the last thing left for you is to learn the Elasticsearch DSL and syntax so you can query for the information you want. For example, you could filter on the elapsed milliseconds field, say more than a hundred, and you'll see all the requests that took more than a hundred milliseconds to respond, so you can actually identify them and act on them to optimize them; it's very nice. And back in the application: you can obviously keep using MVC's ILogger, but you could also delete that, import the Serilog one, and with Serilog you can also use message templates. So now you can go here and say, in fact it's not LogInformation, it's Error, passing the exception, a message template, and then a custom property; I'll set the property's value to 50, but it can be anything. If I run this again and hit some exceptions (one, two, three, all exceptions, good), you can now go back to your logs, fetch the fresh ones, and as you can see there's also a visualization, which is pretty neat. What I want to show you is that I can go to this log, see my own custom property with the value 50, and do an exact search on it: find any logs with this value being 50, and there are a bunch of them. You might need to re-index for this to be searchable in different ways, but that doesn't really matter at this point; we'll go in depth in the Elasticsearch video. For now, this is a very easy and nice way to get started with Elasticsearch, start getting familiar with it, start logging into it, and be able to search through those logs. And
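A sketch of that switch to Serilog's static logger with a message template and a custom property; the property name is my placeholder, while the value 50 and the exception message come from the video:

```csharp
using System;
using Serilog;

try
{
    // ... the code that might throw ...
    throw new Exception("Oops, what happened");
}
catch (Exception e)
{
    // {CustomProperty} is captured as a structured field on the log event,
    // so in Kibana you can run an exact search on its value (50 here)
    Log.Error(e, "Something bad happened {CustomProperty}", 50);
}
```

Because Serilog logs the template and the property separately rather than a flattened string, the value stays queryable as its own field in Elasticsearch.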
you can run your own cluster, even a multi-node one, very quickly and very easily; there are tons of tutorials you can find around, but if you want to wait a bit, I'll have my own Elasticsearch video as well. As always, you can find the whole code for this video in the description down below. That's all I have for you today; thank you very much for watching. Special thanks to my GitHub sponsors for making these videos possible; if you want to support me as well, you'll find the link in the description down below. Leave a like if you liked this video, subscribe for more content like this, and ring the bell to get notified when I upload new videos. I'll see you in the next video. Keep coding!
Info
Channel: Nick Chapsas
Views: 79,331
Keywords: Elfocrash, elfo, coding, asp.net, .netcore, dot net, core, C#, how to code, tutorial, asp.net core, js, csharp, rest api, lesson, dev, microsoft, microsoft mvp, .net core, nick chapsas, chapsas, asp.net core 3, .net core for beginners, elasticsearch, logs, serilog, logging, kibana, visualisation, sink, Logging into Elasticsearch using Serilog, viewing logs in Kibana in .NET Core, .net core logging, dotnet, .net
Id: 0acSdHJfk64
Length: 19min 43sec (1183 seconds)
Published: Tue Aug 25 2020