04. Elastic Stack || Logstash Installation and Configuration

Captions
Hello and welcome to a new tutorial. In this one I'm going to talk to you about Logstash. Logstash is open source and is used for data processing with pipelines: you can ingest data from various sources, transform it, and send it to Elasticsearch. Elasticsearch is not the only possible destination; you can send the data using different output plugins. For example, if you have a Graylog system and you would like to send the data to Graylog, you can do that with the GELF output plugin. But in our case we are going to focus today on the UDP input plugin and the Elasticsearch output plugin.

To download Logstash for our CentOS system, we right-click the download link, copy the link address, go to the CLI, and fetch it there. We already have the previous packages downloaded, so this time we download the Logstash package. Next we download the SHA checksum file to verify that the package is valid. Everything checks out, so we can proceed with the installation. Let's install it with rpm -i; you could also use the long --install form, but I'm going to do it with -i, it's easier I would say. Type the package name, hit Enter, and the application is installed.

Now I'm going to create a directory called logstash, and in this directory I'm going to create a pipeline, which is actually a file with the .conf extension, so let's call it syslog.conf. In this pipeline we need to specify an input, a filter, and an output, and to make sure we don't reinvent the wheel, we can Google for some examples, for instance "logstash configuration example". There is a reference guide, and it has an example with two input blocks. Let's copy it, go to our CLI, and paste it in. Let's get rid of the TCP input, because I'm not going to receive syslog over TCP, and let's change the port number to something like 1514. The example also shows how to parse a message using a grok pattern: if the type is syslog, apply this filter. Further down, the output section is pretty much standard, so we could leave it as is, but let's modify it: I have configured the pfSense box to send its syslog to Logstash (not directly to Elasticsearch), and I would like to create an index called pfsense for it. So I will remove the existing output line; we could have kept and modified it, but let's write it from the beginning. I type elasticsearch, open a curly bracket, and specify the hosts option: localhost:9200, in quotes, since we are running Elasticsearch, Logstash, and Kibana on the same box. Then let's define our index: we will send everything to a pfsense index, and we can also append a date, with the year, month, and day. By the way, when you define your index name, as I have done here with pfsense, don't use capital letters, only lowercase; I had issues when I tried to use capital letters. After this we need to close the curly bracket we opened for the Elasticsearch configuration, save the file, and review it one more time; everything looks good.
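For reference, a minimal pipeline along these lines might look as follows. The port, type, and index name match what is built in the video; the grok pattern is adapted from the syslog example in the Logstash reference guide, so treat the filter as an illustration rather than the exact one used here:

    # syslog.conf - minimal sketch of the pipeline built in the video
    input {
      udp {
        port => 1514          # pfSense sends its syslog here
        type => "syslog"
      }
    }

    filter {
      if [type] == "syslog" {
        grok {
          # Generic syslog line; pfSense messages will still produce
          # _grokparsefailure until the pattern is tailored to them.
          match => { "message" => "%{SYSLOGLINE}" }
        }
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        # The leading + makes Logstash expand the date; the narrator
        # initially forgets it, which causes the failure described below.
        index => "pfsense-%{+YYYY.MM.dd}"
      }
    }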
Now we need to run this pipeline. Logstash itself sits in a directory, and I'm going to show you where: /usr/share/logstash, then the bin folder, then the logstash binary. That is what you need to execute to start the application and run it with a pipeline, but as you can see the command is pretty long, so we can make a temporary alias for it: logstash will be our command, then an equals sign, double quotes, and the full path inside. I tell you, it will work like a charm. Now we specify our pipeline, syslog.conf, and run it. It took some time to get running, and now we can see data coming into Logstash. This is how it looks: the type is syslog, this is the source IP address of the pfSense box that is sending the syslog, and here is our log message. Because our grok pattern is not tailored to the pfSense type of message, we get a grok parse failure; we cannot parse this kind of message yet. We will look at parsing messages at a later point, but right now we are interested in getting the data into Elasticsearch and seeing it in Kibana.

So let's go to the Kibana web interface, then Management, then Index Patterns, and try to create an index pattern. Curiously, there is nothing there, even though we should have data coming into Elasticsearch, so let's troubleshoot why it's not working. Let's look at the pipeline's configuration file (the pipeline is not in the Logstash directory at the moment, but that doesn't really matter). The input looks OK, we don't really care about the filter, and here is the index we should be sending our logs to... and I think I know what the problem is: we are missing the plus sign that I forgot to put in front of the date format. I save the fix and move the syslog file into the logstash directory. Let's stop the pipeline and run it one more time; hopefully everything now goes to Elasticsearch. We need to wait a little while for Logstash to start... and as you can see, there was indeed an error saying that the index name must be lowercase. Because we didn't have the plus sign in front of the date format, Logstash treated it as a literal, uppercase part of the index name. That is why nothing was reaching Elasticsearch and you could not see it in Kibana. Now everything should work fine. Let's check with netstat whether the port started... yes, 1514 is there. Let's see if we get some logs... and we do, the logs are coming in. Let's go to Kibana now, to the web interface, and check for new data. There it is, you can see the index called pfsense, so I create an index pattern called pfsense, click Next, and add my timestamp field; there are some advanced options that I'm not interested in right now, so I just click Create index pattern. If we want to see the logs, we can go to the Discover tab, and there you go: here are the pfSense logs.
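The commands from this part, roughly as typed in the video (the paths are the defaults for an RPM install; the alias name is just a convenience, and -f is the standard flag for pointing Logstash at a config file):

    # Temporary alias so we don't have to type the full path every time
    alias logstash="/usr/share/logstash/bin/logstash"

    # Run Logstash with our pipeline file
    logstash -f syslog.conf

And the corrected index setting; without the leading +, Logstash takes YYYY.MM.dd as literal text, and uppercase characters are not allowed in index names:

    index => "pfsense-%{+YYYY.MM.dd}"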
Now, the problem with the configuration we are running at this very moment is that if you close this terminal session, or do something else, or for example if your system reboots, Logstash will not load the pipeline. So we need to find a way to load the pipeline automatically. To do this we can check Logstash's configuration, which is in /etc/logstash: the logstash.yml file has some information about pipelines, such as the ID of the main pipeline, but it doesn't actually show you where to put the pipelines or where to save them. If we exit from here and open pipelines.yml instead, it shows where the main pipelines go: you put them into /etc/logstash/conf.d, and everything with a .conf extension will be read. If you have multiple pipelines, you can create another pipeline ID and specify the absolute path, including the file name, to differentiate between them. I tried once to run multiple pipelines, but it didn't work for some reason; somebody in the Elastic community said there might be a bug in that version of Logstash. If you would like to get data from multiple sources, though, you can create different inputs, and I'm going to show you later in this video how to do that.

So we need to move our pipeline to this directory. Let's see where we are exactly... we are in the logstash directory, so I move the syslog file to /etc/logstash/conf.d and hit Enter. Now let's go to /etc/logstash/conf.d, and there you go, you can see our pipeline is here. To automate the whole process, so that when your system restarts, Logstash starts automatically and your pipelines with it, we do systemctl enable on the logstash service, and then start it. It says here that it started, but we need to give it some time. If you want to see how and when exactly it started, you can tail -f the log file under /var/log/logstash. This is the first time it has started, so we need to wait for the log files to be created. Let's check one more time with status... OK, good, it is running, and now the log file exists: you can see logstash-plain.log here, and it looks like the ports have been started. Let's verify with netstat: port 1514 is listening, so the next time you reboot the server, your pipelines and the application will start automatically.

OK, let's generate some more logs, and we should be able to see them in Kibana. I'm going to refresh this one... and yes, you can see it here: we got six new messages. So the main reason it wasn't working earlier was that when I wanted to add the date to the index name, I was missing the plus sign, and Logstash then considered everything after it, uppercase characters included, to be a literal part of the index name. You need the plus sign followed by the date format characters to get an actual date. And where do you see that date? I don't know if you have noticed, but you can see it in the index name when you are browsing for it or creating a new index pattern; here you can see 2020 April 1st. It's much better to have the indices split by date, to get an idea of the information.
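The service-related commands from this part, roughly as run in the video (the paths are the RPM defaults; the exact netstat flags are my assumption, any option set that lists UDP sockets will do):

    # Move the pipeline to the directory Logstash reads on startup
    mv syslog.conf /etc/logstash/conf.d/

    # Start Logstash at boot, then start it now and check on it
    systemctl enable logstash
    systemctl start logstash
    systemctl status logstash

    # Watch the startup log and confirm the UDP port is open
    tail -f /var/log/logstash/logstash-plain.log
    netstat -ulpn | grep 1514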
Now, if you have multiple inputs, for example another device that is sending logs to you, you need to create another input in Logstash, and I'm going to show you how to add it. Let's add another input here, this time on port 1513, and let's call it... I don't know what to call it, let's call it "other", so I set the type to other; the name doesn't really matter. Then we need some if/else statements in the output, nothing too complicated. We will say that if the type is other, we send it to Elasticsearch under a different index: actually, what I can do is copy the existing elasticsearch block, and instead of pfsense I name the index othersyslog. We have an open curly bracket here, which we need to close again, and then an else branch for the other types; the first type we defined was syslog, on port 1514, so anything else goes to the block below, which sends it to the pfsense index. Let's close the curly bracket for the else as well. Indentation is nice to have, especially when you have a lot of text to work with; it makes it much easier to read the configuration. Let's keep it like that, save it, and restart the Logstash application to load the new pipeline; now we should have two ports open for syslog.

Now, to point out something: I have the same pfSense box sending the same logs to Logstash, but once to UDP port 1514 and once to UDP port 1513. This is just to show you how you can create different index patterns and then differentiate based on them in Kibana; I created the second input only to demonstrate how to separate the index patterns in case you have, say, routers or Linux servers that are also sending data to Logstash. It started and it's working, so let's go to the Kibana web interface and refresh the information here. We click on Create index pattern, and you can see there is a second index, for othersyslog. I type othersyslog, click Next step, add our timestamp, and create the index pattern. Now we can go into Discover, where we already have pfsense, but we can also select othersyslog, and here you can see the data that came in. It's the same data, but it doesn't really matter whether it's the same data or not; I just wanted to show you that you can configure multiple inputs, and if you'd like to send them to different indices, you can do that and then differentiate between them. Thanks for watching, I hope you liked the video; if you did, please hit the like button, subscribe, and talk to you guys in the next one.
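For reference, a sketch of the final two-input pipeline from this part of the video; the ports, types, and index names follow the transcript, while the exact shape of the conditionals is a reconstruction:

    input {
      udp {
        port => 1514
        type => "syslog"     # pfSense, the first input
      }
      udp {
        port => 1513
        type => "other"      # second input, e.g. routers or Linux servers
      }
    }

    output {
      if [type] == "other" {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "othersyslog-%{+YYYY.MM.dd}"
        }
      } else {
        # Everything else, including type "syslog", goes to the pfsense index
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "pfsense-%{+YYYY.MM.dd}"
        }
      }
    }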
Info
Channel: Bits Byte Hard
Views: 16,262
Keywords: mongo, mongodb, elasticsearch, graylog, logstash, kibana, centos, debian, fedora, ubuntu, redhat, suse, ELK, java, monitoring, splunk, arksight, qradar, logs, log, processing, linux, vmware, virtualbox, Pen, VIP, virtual IP, firewall, Linux, server, nxlog, syslog, UDP, patterns, lookup tables, graylog 3.0, table, active directory, mail, exchange, notification, condition, bash, shell, scripting, monitor, graylog3.1, graylog 3.1, backup, elasticsearch backup, elasticsearch snapshot, elasticsearch restore, elastichq, cerebro
Id: ipS2d7pDgqs
Length: 29min 30sec (1770 seconds)
Published: Tue Apr 14 2020