Install Logstash on Windows 10 & the basics of the ELK stack

Captions
Hi everyone, welcome back. In this video we will learn how to install Logstash on Windows 10. In the last video I showed you how to install Elasticsearch and Kibana.

For this you just need to download Logstash, which also ships as a zip file. Once you download and unzip it, it will look like this. You don't need to change anything inside, but before running the Logstash batch file you need to create one config file. As you can see over here, the docs say to just unzip it and create a logstash.conf file, and they give a sample: the input section says where the input will come from (for this basic example, the console), and the output section says the output will be pushed to Elasticsearch.

Since I have already created this config file, I'll show you what it looks like. It's exactly the same as the sample: input coming from the console, output going to Elasticsearch, plus the index name. When the data goes in, you need to give an index name; I'm just calling mine "log_index", but you can give it whatever name you want.

Then simply open a command prompt from this folder and run logstash -f followed by the config file path, the same command they mention in the docs. (You can also create a more complex config to use for other examples, with filters and other criteria; we will come to that in just a bit.) So first let me give the path of the config file, this basic one.

My Kibana is running over here, and the Kibana home screen is empty; you can see no index has been created so far. Let it run... now you can see Logstash is up and running. Since the config says the input will come from this console, I'll type "this is my first log message" and press Enter, and you can see the message go to Elasticsearch: if I refresh this page, there is an index with the same name I gave in the config file, and it has some data.

Now, to access that data, we need to create an index pattern in Kibana with the same name, filtering by the timestamp field. The index pattern is created; let's go to Discover, and that's where you will be able to see "this is my first log message". Let me show you: I removed everything, so this is the screen you will see the first time, with all the basic fields, and whatever you typed from the console becomes part of the "message" field. To find that out yourself, simply expand the document and you can see the property name "message" with the value "this is my first log message". Go back, type "message" in the field list, and it comes up. Since right now there is only one message, click the plus button and it gets added as a column. If I type another message, say "one two three", and press Enter, you can see it come through as well.
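For reference, here is a minimal sketch of that basic console-to-Elasticsearch config. The index name "log_index" is just the illustrative name used in this walkthrough, so substitute whatever you chose:

    input {
      stdin { }                              # read events typed into the console
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]   # local Elasticsearch on the default port
        index => "log_index"                 # illustrative index name
      }
    }

You would save this as something like basic.conf and run it from the Logstash folder with bin\logstash.bat -f followed by the path to the file.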
So that was the basic one. Now let's say you want to parse logs from a CSV file: Logstash can read that CSV file and, after reading it, push those logs to Elasticsearch. I'll show you that too. For this you have to create a new config file, which will look like this.

What we are doing in this case is saying the input will not come from the console; it will come from a file. Let me pick one: I have downloaded a diabetes dataset, so this is the CSV file it has to read the content from, with the start position always set to "beginning". Then comes the filter. A CSV has many columns, right? So if you want to apply a filter on the CSV columns, you have to write out all the column names; otherwise everything goes into the "message" field, so "message" will hold the entire row. If you want the row broken out into columns as well, you need to mention all the column names separated by commas; if every row has four values, then you should have four keys here, the column names.

The output is the same as before, no change: you simply say where the output should go. Since Elasticsearch is installed on my local machine, I'm saying localhost:9200, which is the default port, plus the index name this data should go into. Last time I was using "log_index"; now I'm using "csv_log_index", so it will create a new index altogether the moment I run Logstash with this new config file.

I have stopped the old instance. Let's go back: you can see there is only one index over here right now, which is the old one. The file name should end in "_csv", I hope that's right; let's run it. Once Logstash is up and running it should start reading this CSV file, which is this one... okay, looks like I made a mistake: it won't do anything because the file name doesn't match the config. Let me rename it to end in "_1", save, and restart. Now it matches "log_1"; hopefully this time it will read the CSV file and push it to Elasticsearch. Yes, it's able to read all those records; right now we have close to eight records. If I go to Kibana and refresh... yes, it's there, about 70 KB of data. Let me create an index pattern on it and go to Discover. Hmm, only three records; let me double-check... sorry, I didn't change the index over here. After switching the index you can see all eight records, ending with the last row of the file, and the 100 entry is there too.
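A sketch of that file-reading config, with a hypothetical file path and hypothetical column names (the real diabetes dataset has different columns; list one name per value in a row). The sincedb_path setting is a common Windows addition not shown on screen, so the file is re-read from the beginning on each run:

    input {
      file {
        path => "C:/data/diabetes_log_1.csv"   # hypothetical path; Logstash wants forward slashes even on Windows
        start_position => "beginning"          # read the file from the start
        sincedb_path => "NUL"                  # Windows equivalent of /dev/null: don't remember read position
      }
    }
    filter {
      csv {
        separator => ","
        columns => ["entry", "value", "reading", "time"]   # hypothetical column names, one per CSV field
      }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "csv_log_index"
      }
    }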
Now let me create a new record. I'll copy one row, paste it at the end with the entry value 999, and save the file. You should see a spike: it's over here. Now let's go back to Kibana; the count is eight, and if I refresh it becomes nine, and the new entry shows up as well.

So the picture is this: some application is generating logs into a CSV file; that CSV file is read by Logstash, which internally reads all the data and pushes it to Elasticsearch; and then we can see all the data in Kibana. We'd use this kind of architecture when the application is big, generating millions and millions of records, and we want to filter them depending on the use case.

You can put a filter over here in Kibana. Say I'm interested only in the records where the value is 100: I add that filter and only one record is shown. I just now created one more record, the 999 one; if I change the filter to 999, that record comes up. Say I want all the records other than this one: I simply edit the filter and negate it ("NOT"), and I get the other records. It's up to you what kind of filter you want to put here.

Now one more use case. Right now we are reading the logs from this one file, but what if our application has a setting where it rolls the log file every day, so it keeps creating new log files? In that case we can remove the fixed log file name from the config and simply use a wildcard (*). Before I do this, let me stop the execution so I can reload the Logstash config; the current count is nine. Before I rerun, I need to save the config file, so I'll go back, save it with the star, and then go to the folder where the files are. You can see there are two files, one and two, and the second file has two records, so nine plus two: it should push two more.

Let me start it; it usually takes five to ten seconds. It started successfully and pushed those two records too; 135 is the last one. Now go back to Kibana, refresh, and you can see 9 plus 2 is 11, which means 746 and 135 came in as new records. I hope you're following.

Now assume I go back and create a new file again: I take file two, tweak the values (135 becomes 235), and save it under a new name, three, at the same location, because only a file with a new name will be picked up. You should see a spike on that: 135 changes to 235, and the new records come in. Back in Kibana, it has picked up the new file: eleven plus two is thirteen, and you can see the two new records.
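The only change for rolled files is the wildcard in the path; a sketch, using the same hypothetical folder layout as above:

    input {
      file {
        path => "C:/data/diabetes_log_*.csv"   # wildcard: watch every file matching the pattern
        start_position => "beginning"
        sincedb_path => "NUL"
      }
    }

With this pattern, Logstash picks up any new file dropped into the folder, which is why saving a copy under a new name pushed its records automatically.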
I think I've been able to show you how you can use this ELK stack when your application generates a single CSV or multiple CSVs, depending on the application configuration, and you want to push all the logs to Elasticsearch. There are many ways to push them: you can push directly, or you can push through Logstash; it depends.

Here is why Logstash matters. Say your Elasticsearch is not running for some reason. If you are pushing the logs directly to Elasticsearch, you will lose your logs, and that is the worst thing; that is why the ELK stack comes into the picture. When Elasticsearch is down, at least you still have the CSV, so you are on the safer side: your CSV has all the data, and you can just restart Elasticsearch and push it again, so there is no loss. That is the benefit of using this stack.

Anyway, in the coming videos I'll show you how you can send logs directly from C# to Elasticsearch, and also through the ELK stack. I'll show you both scenarios, and depending on your use case you can use either of them. That's pretty much it for this video. Thank you very much.
Info
Channel: Infinite POC
Views: 858
Keywords: Install logstash in windows10 & basics of ELK stack, logstash, ELK Stack, elasticsearch, kibana
Id: IUFex9kS2Nk
Length: 14min 22sec (862 seconds)
Published: Sun May 02 2021