Mastering Grafana Loki: Complete Guide to Installation, Configuration, and Integration | Part 1

Video Statistics and Information

Captions
Hey everyone, today we're going to take a look at a tool called Loki. If you don't know what Loki is, it's a log aggregation tool that's used to store and query logs. It was created by Grafana Labs, which you're probably already familiar with, and if you've ever worked with any of the other Grafana-adjacent tools like Prometheus, you'll find that working with Loki is very similar. So if you're looking to move away from a traditional ELK stack, you'll see that Loki provides a fantastic alternative that's a lot simpler to work with.

This is going to be a two-part series: this video will focus on the theory plus a quick basic demo, and in the next video we'll look at using Loki with Docker as well as Kubernetes, so you can see how to incorporate it into a containerized environment. We're going to start by going over what Loki is, why we need it, and how it differs from some of the pre-existing solutions. We'll then take a look at the Loki architecture; don't worry, it's pretty simple and straightforward, and I want to keep this at a high-level overview. After that we'll install our own instance of Loki so we can actually start collecting logs, look at some of the different configuration options so we can get Loki to operate the way we want, and finally I'll show you how to integrate it with Grafana.

So what is Loki? Loki is a log aggregation system designed to store and query logs. You grab the logs from all of your applications and all of your infrastructure and send them to Loki, which stores them so you can query them whenever you need to, especially if an issue comes up
and you want to figure out what exactly went wrong: you go to Loki, query the specific time frame when the issue occurred, and grab the logs from that window.

The thing about Loki is that it's designed to be cost effective and easy to operate. Most of the other logging solutions are, at best, only one of those two. Elasticsearch, for example, is well known to be fairly complex to run, to the point where you almost have to hire an engineer to manage it full time, while many of the managed solutions are very expensive. Loki is meant to be an alternative that's more cost effective and simpler to operate; that's what its designers prioritized.

The way it achieves both of those things is that Loki doesn't index the full text of your logs the way Elasticsearch does. Instead, Loki only indexes the labels it receives with the logs. Whenever logs get sent to Loki, we send labels along with them; labels are just metadata tags attached to your logs. Loki indexes only the labels, not the log contents, and this makes it more cost effective and performant, because you're not indexing all of the text from every log line. You'll also see that the architecture, the querying, and working with Loki in general are genuinely easy to operate. Another benefit of using Loki, and really any tool in the Grafana ecosystem, is that the configuration and query language are very similar to Prometheus, so if you've already worked with Prometheus, you'll find picking up Loki very simple.

Now let's take a look at the architecture of Loki and see how all of the pieces fit together
and how we actually get our logs to our Loki server. Say we have a couple of servers or applications that we want to collect logs from. To get logs from a server to your Loki instance, you need to install a client on that server. Which client is actually pretty flexible: Loki works with several different clients or agents. Grafana created one specifically for Loki called Promtail, so you can use that, but you can also use pre-existing solutions like Fluentd or Logstash. Once the agent is installed on the servers you want to collect from, it grabs the logs from the server and streams them to your Loki instance.

When the logs arrive at your Loki instance, you get the basic log line and, on top of that, the metadata: the labels. These labels are set by you; you get to dictate what they are. It's important to understand this, because, once again, the reason Loki is so performant and efficient is that it only indexes the labels. It doesn't actually care about the log data itself; it only cares about the labels. Once it has indexed those, it has to store the logs, and you'll see that Loki offers a variety of storage options for the log messages. You can store them on the local file system of your Loki server, or you can use object storage like S3 or any of the other object storage solutions from the various cloud providers. Because you can just drop the logs in object storage, it's very cheap and very easy to operate: S3 is managed by AWS and scales up effectively without limit, so you don't have to worry about managing an Elasticsearch cluster or anything like that.

Now, once we have the logs in Loki, how do we
actually get the logs back out when we want to analyze or troubleshoot an issue? We use a query language called LogQL. This is the language you use to interact with Loki: to fetch logs, to filter them, to grab logs from a specific date and time. On top of that, since Loki is part of the Grafana ecosystem, you can easily connect Grafana to use Loki as a backend data source, so you can visualize things a little better and run all of your queries through Grafana instead of using the API. They do also have a CLI to make it easy to query the Loki server directly, but I think most people prefer Grafana because it's nice and pretty and has a good GUI.

For this demo I've got three different servers. One is called loki; that's going to be this terminal up here, and if you ever get confused about which device I'm connected to, I've embedded the role of each server into its hostname. The one called loki is where we'll install our Loki instance, and then I've got node1 and node2, which are the servers we'll actually collect logs from and send to our Loki instance.

To get started with installing Loki (we're going to use Promtail as the agent that collects the logs), let's go to the documentation page for Loki and head to the installation section. There are a couple of different ways to install Loki: they've got a Helm chart if you're working in a Kubernetes environment, and you can run it in a Docker container, but this demo is all about Loki itself, so I don't want to focus too much on those specific deployment options; we're just going to do a local install. To do that, we select "Local" and we're going to
navigate to the releases page. I'll open that up, and here we can download our specific version; you can scroll down and look for the Loki asset matching your architecture, but I'm just going to grab the link right up here and copy it. On my loki server I pull it down with curl, and after that an ls shows the Loki archive; it's a zip file, so I run unzip, and now I've got my Loki executable.

The next thing you need is a configuration file for Loki. The documentation includes an example config, so you can just run the wget command it gives you, which downloads a basic configuration. After running it I've got this loki-local-config.yaml file, and if I open it in vi we can poke around and see what's there. Under the server configuration, this specifies which ports our Loki instance is going to listen on: the HTTP server listens on port 3100, and gRPC on port 9096.
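To make the shape of that file concrete, here's a trimmed sketch of a loki-local-config.yaml along the lines of the documented example. Treat it as illustrative rather than authoritative: the exact fields vary between Loki versions, the /tmp/loki paths are just the example defaults, and a fully runnable config also needs a schema_config section, so refer to the example in the docs for your version.

```shell
# Write a minimal Loki config modeled on the documented example.
# Field names can differ between Loki versions -- check your version's docs.
# (A runnable config also needs a schema_config section; see the docs example.)
cat > loki-local-config.yaml <<'EOF'
auth_enabled: false

server:
  http_listen_port: 3100   # HTTP API (pushes, queries, /metrics)
  grpc_listen_port: 9096   # internal gRPC

common:
  path_prefix: /tmp/loki
  storage:
    filesystem:                          # local-disk storage; S3 etc. go here instead
      chunks_directory: /tmp/loki/chunks
      rules_directory: /tmp/loki/rules
  replication_factor: 1
  ring:
    kvstore:
      store: inmemory
EOF

# Confirm the two listen ports described above
grep listen_port loki-local-config.yaml
```

You'd then start the server with `./loki -config.file=loki-local-config.yaml`, which is exactly what we do next.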
By default, Loki uses the local file system, that is, the file system of the server Loki is installed on, to store all of the logs: the storage type is filesystem, and you'll see that it stores the chunks for the logs in a specific location, which you can obviously tweak. If you wanted a different storage option like S3, this is where you'd configure it, and they've got examples of this in the documentation, so if you ever need a good starting point you can refer back to it. Just to show you where that might be: under the configuration section there are examples, including a local configuration example, which matches what we have at the moment, and an S3 cluster example if that's what you want to look at.

Let's go back to the installation section and finish up. You run the executable with ./loki and pass the -config.file flag with the location of your config file. So I'll run ./loki, pass in that config file, and start up our Loki server. It prints out a number of log lines; just make sure there are no errors or anything like that. If you don't see any errors, most likely everything is working and good to go.

If you want to test this just to be sure, go to your browser and visit http:// followed by the IP address of your Loki server; I have DNS set up, so I can just use loki, which resolves to my Loki instance, then port 3100, then /metrics. If this works and you get a result that
looks something like this, these are the metrics being exported by the Loki server. If you see a metrics output, everything is up and running and you most likely won't have any issues moving forward.

So our Loki server is up. Back in the terminal, remember we've got node1 and node2; these are the two servers we actually want to collect logs from, and we'll have to install an agent on them to collect the logs and forward them to our Loki instance. In the documentation, under Clients, you'll see all of the different clients Loki supports: Promtail, Fluent Bit, Fluentd, Logstash, so there are a couple of options, but we're going to use Promtail. The clients page has an example configuration, but we can skip that for now; if we go back to the installation page and select Local, it has the instructions for installing Promtail locally, and it points us back to the releases page. On the releases page, scroll down to Promtail and select your specific architecture; for me that's promtail-linux-amd64.zip, but pick whichever architecture and operating system you're using. Then, on each node, we run wget to download the file, unzip the Promtail zip file, and we've got our Promtail executable. I'm going to do the same thing on node2.
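Before we open it up on the nodes, here's roughly the shape of the example Promtail config we're about to download, a sketch based on the documented example; the `loki` hostname in the clients URL is a stand-in for whatever address your Loki server has.

```shell
# Minimal Promtail config mirroring the documented example.
# "loki" is a stand-in for your Loki server's IP or DNS name.
cat > promtail-local-config.yaml <<'EOF'
server:
  http_listen_port: 9080
  grpc_listen_port: 0

positions:
  filename: /tmp/positions.yaml   # where Promtail records how far it has read each file

clients:
  - url: http://loki:3100/loki/api/v1/push   # point this at YOUR Loki instance

scrape_configs:
  - job_name: system
    static_configs:
      - targets:
          - localhost
        labels:
          job: varlogs              # indexed label; this is what you'll query on
          __path__: /var/log/*log   # glob of files to tail
EOF

# Sanity-check the label and the path glob we expect Promtail to scrape
grep -E "job:|__path__" promtail-local-config.yaml
```

We walk through each of these sections next.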
So I'll copy the same command, run the unzip, and it looks like this node doesn't have unzip; let me quickly install unzip, run that command again, and there we go.

Just like with Loki, we need an example configuration file for Promtail, and the documentation has one for us. We can copy that wget line and run it on both node1 and node2. Let me open it up and walk you through the configuration; it's pretty straightforward, and if you've ever worked with Prometheus, it's going to look almost identical. First we can see the port that Promtail is going to listen on. Then there's the clients section: this is the URL of our Loki server, so we want to make sure we update it to point to our Loki instance. I'm going to change localhost, because Loki isn't running on this machine, and put in the address of my Loki instance; once again I've got DNS, so I can just type loki, but you'll want to use the IP address if you don't.

Then we've got the scrape configs, which tell Promtail which log files to collect from this server and ultimately send to Loki. Here we've created a job; you can have different jobs that collect different logs, and in this case the example gives the job a name of system. You can see the target, which just says the target is this specific server, though you can technically scrape targets that are other servers. Then you add labels. Remember, this is the metadata, and it's important because it's what Loki indexes on. Here the example adds a key-value pair of job: varlogs; Loki will index these labels, and you can add as many key-value pairs as you want, but keep in mind that storage costs grow with the number of labels you
add. Then there's the path, which tells Promtail which logs to actually collect. This one says: in the /var/log folder, grab any file whose name ends in "log"; that's what the asterisk means, anything before it, as long as it ends in log. So I'll save this, and remember, if we look inside /var/log, it will match anything ending in log: cloud-init-output.log, kern.log, syslog, and so on. That's what this configuration does. If you want to grab logs from another location, you just add another job, and we'll do exactly that later in this demo. Now let's go to node2 and update that same config, pointing it at the location of our Loki server.

Now let's start up Promtail. We run ./promtail and we have to pass in the config file; I don't remember the exact flag, so let's run it with -help, which usually gives us some instructions, and we look for something about a config file: there it is, -config.file. That should be everything we need. I'll run ./promtail with -config.file= pointing at that file, and I'm going to copy this line because we'll paste it on node1 as well. I recommend watching the output for a second just to make sure there are no errors, because if Promtail has an issue connecting to the Loki server, it will print that message in this window. It looks like everything's good, so I'll go to node1 and paste that in; let me actually move to the correct directory first and then run it. And let's make sure we run this with sudo, because Promtail is going to be reading the /var/log directory, which requires root privileges. I don't know why node2 didn't throw that error, but that's okay; it looks like everything is good to go.

So now let's test this out and see if we are in fact sending logs to our Loki server, and the best way to do that is in Grafana. I already have a Grafana instance running on the same server as the Loki instance; keep in mind they don't have to run on the same server, this is just a demo and I'm keeping it really simple. Because Grafana runs on the same server, when I point it at Loki, the address will be localhost. I'll open a new tab and go to the Grafana server, which for me is at loki:3000. I open the drop-down, go to Connections, and add a data source; there's an option for Loki. I'll name it loki, and here I provide the URL of our Loki server; remember it's running on the same machine, so I can just use localhost, port 3100. We hit Save & test, and we can see the data source was successfully connected, so everything looks good to go.

Now I can go to Explore. If you've already got several data sources, make sure you select Loki here. You'll know it's working if you go to the query builder, select Label, and see it populate; that means we've already received some logs. It found logs carrying two different labels, filename and job. We can use those label filters to narrow down which logs we want to see: if I want everything with the job of varlogs, that label filter will do exactly that.
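As a quick reference, here are the LogQL queries we build in Grafana over the next few steps, collected as plain text; the filename values match the /var/log files from this demo, so adjust them to your own paths.

```shell
# LogQL used in this demo, saved as plain text for reference.
# Stream selector matchers:  =  exact,  !=  not equal,  =~  regex,  !~  negated regex.
# |= is a line filter: keep only lines containing the given text.
cat > queries.logql <<'EOF'
{job="varlogs"}
{job="varlogs"} |= "docker"
{filename="/var/log/kern.log"}
{filename=~"/var/log/kern.log|/var/log/syslog"}
EOF

cat queries.logql
```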
And you can see the query the Builder creates; let me make sure I zoom in for you. Ultimately the query is just curly braces containing the labels you want, so here we have a label of job equals varlogs, and if you switch to Code you can see what the query looks like as text. You don't really need the extra part at the moment, so you can delete it if you want. We run the query, and we can see all of the syslog lines it collected and sent to Loki. This confirms we are able to successfully send logs from Promtail on those servers to our Loki server.

This language is what we call LogQL. If you've ever worked with Prometheus, it's pretty similar to PromQL, but it's really nothing more than curly braces with key-value pairs, followed by a part that lets us search for text. If you wanted to find any Docker logs, you could search for the word "docker" there (whoops, I didn't mean to hit enter), then run the query, and it returns all of the logs containing the word docker. That's how you search for specific logs carrying a specific label: the labels first filter down what you want, and then you search within those logs for specific keywords like docker.

If we expand one of these log lines with the little drop-down button, we can see the labels associated with it: the job, which is varlogs, and even the filename it was collected from; this one came from /var/log/syslog. If we want, we can change our query (I'll go back to the Builder to keep things simple) to filter on filename and select the specific files we want to see; maybe I only want kern.log. Now if I run
this query, we see just the logs from that specific file. Right now it's still filtering for docker, so I'll remove that since we don't want it, and now we see all of the logs from that file; once again you can drill down into any line and see the relevant information if you'd like.

If you want to select multiple files, you can match with a regular expression: the equals sign followed by a little tilde is a regex match. They've also got a not-equals, and a negated regex match as well. So if I switch to a regex match, I can add another file; whoops, not that one, I just click here, and maybe I want syslog too. Now I'm matching on two files, and running this query gets me all the logs from both. Once again you can look at the raw query: if you go into Code, you'll see it's just equals-tilde followed by the regular expression, where the pipe is basically an OR, saying I want this file or that file. So you'll see the query language is actually pretty straightforward, and with Grafana we can always use the Builder to assist us if we ever get stuck.

All right, let me go back to the servers. I'm going to open up new connections to node1 and node2 so I have a second terminal on each, and I'll rename them so you know which one is which. Now on node1, if I go into this app folder, I've got an application in here, an index.js; it's just a Node.js application, and you don't need to worry about the details, but this application, which is currently running, generates a log file in the same directory.
The log file is app.log, and if I cat that file, we can take a look at the logs it's generating. Since this application is running, I want our Promtail to collect these logs and forward them to our Loki server. The same goes for node2: I've got the same application running there, and if I go into app, there's an app.log, and catting it shows it is in fact generating logs; we could even tail it if we want, and every second or so it should write new lines.

So how exactly do we do that? We're going to have to change the Promtail configuration. I'll go back to the first two tabs, back to node1, and stop Promtail with Ctrl+C. Actually, before I edit it directly, I'll cat the file, copy it, and open it in my text editor as example.yaml, which makes it a little easier for you to see, and paste it in there.

Now, I want you to think about where exactly we'd configure this new information telling Promtail which additional logs to collect. It's not under server; that just sets the address and port we listen on. It's not under clients; that's just information about where our Loki server is running. It's under scrape_configs, right? We've already got a job that grabs all the logs in /var/log, but we want to set up a new job that looks for this new specific log file. Creating it is just a matter of copying and pasting the existing job. I'll paste it in and call it api, because the application is an API, but you could call it whatever you'd like. The target, once again, is
localhost. For the job label I'll give it a new name, api_logs. Then the path: where is our log file? If I bring up my terminal and run pwd, it's /home/vagrant/app, so I can just remove the old path and paste that in. We could do *log, or we could specify app.log exactly; it doesn't really matter, but if you had other log files, say from some sort of log rotation, or multiple different files, then *log will grab all the log files from that folder. That's all we need to do. If you wanted to add an extra label, maybe environment: production, you can add as many labels as you want; remember, that's what we'll be indexing on, but I'm going to remove that since we don't need it.

So this is what our config looks like now. I'll copy it, go back to node1, open promtail-local-config.yaml in nano, paste it in, save and exit, and we'll do the same thing on node2.
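The resulting scrape_configs section, with both jobs, ends up looking something like this; a sketch where the api / api_logs names and the /home/vagrant/app path follow this demo's choices, so swap in your own names and app location.

```shell
# Promtail scrape_configs after adding the second job for the Node.js app.
# Job and label names (api, api_logs) are just the ones chosen in this demo.
cat > promtail-scrape.yaml <<'EOF'
scrape_configs:
  - job_name: system
    static_configs:
      - targets:
          - localhost
        labels:
          job: varlogs
          __path__: /var/log/*log

  - job_name: api
    static_configs:
      - targets:
          - localhost
        labels:
          job: api_logs
          __path__: /home/vagrant/app/*log   # matches app.log in the app folder
EOF

grep -c "job_name" promtail-scrape.yaml   # two scrape jobs now
```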
So I paste that in and hit Ctrl+X. All right, we've updated the configuration, and we can start Promtail again by running the same command; just hit the up arrow a couple of times, and make sure there are no errors (there shouldn't be).

Now we can go back to our Grafana instance, and I'm going to clear out all of this old query, since we don't need it. I'll select Label, choose job, but this time pick the other job: you'll see api_logs (ignore the plain api entry; that's left over from when I ran this before). Select api_logs, and if I run this query, we can now see the logs from our application, my API. This is what I've set them up to look like: we can see information like the hostname, the specific method the API was hit with, the route, and the status code of the response. We can expand a line to see more information, and when you click the drop-down, you can actually select specific fields to filter on. If you want to remove job=api_logs, you can just filter it out, and it's removed from the filter; or you can add fields, say just app.log, and it adds filename equals that path, and then you just run the query. So you can easily customize your query by selecting which fields you want from your logs, drilling down to the exact logs you're after.

That's really how simple it is to work with Loki. You can see that getting it up and running isn't much of a challenge, and managing it isn't much more difficult than that either. All right, that's going to wrap things up for this video. In the next video, as I mentioned in the intro, we'll take a look at how to use Loki in a containerized environment with Docker as well as Kubernetes, and make use of some of the other features and functionality that come with Loki. I hope you enjoyed this video, and I'll see you in the next one.
Info
Channel: KodeKloud
Views: 28,985
Keywords: Grafana, Loki, Grafana Loki, Logging, Monitoring, Prometheus, Part 1, Tutorial, Installation, Configuration, Integration, Logging Backend, Open Source, Prometheus Ecosystem, Monitoring Stack, DevOps, Tech, Software, IT, Guide, How-to, Dashboard, Logs, Visualizations, Logging Architecture, SysAdmin, Installing loki, loki grafana, loki grafana tutorial, loki grafana kubernetes, loki grafana dashboard, how to use grafana loki, how to configure loki in grafana, how to install grafana loki, Kodekloud
Id: 0B-yQdSXFJE
Length: 30min 20sec (1820 seconds)
Published: Fri Jul 28 2023