How to import CSV File data into ElasticSearch using Logstash | Visualize CSV data in Kibana

Video Statistics and Information

Captions
[Music] Hi guys, welcome to Simplifying Tech and Programming. In today's session we are going to talk about how we can import a CSV file into Elasticsearch using Logstash. In the previous session we looked at how to install and configure Logstash on Windows, so continuing from there, let's import some data into Elasticsearch using Logstash. We can import from multiple sources of information, like CSV data or database/SQL data.

Let's start with a brief overview of Logstash. Logstash is essentially a data pipeline: in layman's terms, we take an input, process that input in a filter, and transform the data into a format the output understands. Here our output is Elasticsearch, so we are transforming the data into an Elasticsearch-understandable format and saving it to the output, which is Elasticsearch. In more detail, the data source is whatever form the data arrives in, such as CSV, Excel, or SQL, and it acts as the input; then comes the filter; and the output destination can be anything as well, such as Elasticsearch or Kafka. This is how we convert one data format into another.

As I showed in the previous session, to start Elasticsearch, go to the Elasticsearch installation directory and run the elasticsearch command from the bin directory. I already have Elasticsearch started. Since we will also be visualizing the imported data, we are using Kibana, which I have started on port 5601.

Now let's take a sample CSV file. Go to google.com and search for "sample csv file for study". There are lots of sources where you can find sample CSV files; I am taking one from eforexcel.com, which hosts various sample files. I am downloading the CSV named "100 Sales Records", so click on it, go to the download directory, and extract the archive. Now put the extracted file into one directory; I am putting it into the C:\data directory.

Next let's write the Logstash configuration. I have downloaded the CSV and I want to load it into Elasticsearch, so we have to write the pipeline in between in Logstash. As I mentioned at the beginning, there are three sections: the input, the filter, and the output. The input is my CSV file, the column configuration describing what my columns look like goes into the filter, and in the output I write the Elasticsearch connection details. We have to follow the configuration syntax; I already have a sample, which I will explain. For a CSV we give a file section inside the input, and in it the path where the CSV exists; as I said, mine is in the C:\data directory. Then we have to give the start position and the sincedb path.
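Here is a minimal sketch of what the whole pipeline configuration can look like, assuming the sample file was extracted to C:\data and that we want an index named sales-records; the exact file path, column list, and index name depend on your own setup:

    input {
      file {
        path => "C:/data/100 Sales Records.csv"   # forward slashes also work on Windows
        start_position => "beginning"             # read the file from the start
        sincedb_path => "NUL"                     # "/dev/null" on Linux/Unix
      }
    }

    filter {
      csv {
        separator => ","
        # Column names taken from the header row of the sample file; yours may differ
        columns => ["Region", "Country", "Item Type", "Sales Channel", "Order Priority",
                    "Order Date", "Order ID", "Ship Date", "Units Sold", "Unit Price",
                    "Unit Cost", "Total Revenue", "Total Cost", "Total Profit"]
      }
    }

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "sales-records"
      }
      stdout {}                                   # also print each event to the console
    }

Note that with this configuration the header row itself is also ingested as a document, which is why the final document count ends up one higher than the number of data rows.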
start_position means where Logstash should start reading the file from, and sincedb_path is the path of the sincedb file, which records where Logstash previously stopped. If you want details, go to the Logstash documentation: path is a required setting, the path to the file used as input, so you have to give the path where the CSV exists; sincedb_path is the path of the sincedb database file, which keeps track of the current position of the monitored log files and is written to disk.

That is it for the input. For the filter, as in the sketch above, we give a csv section with the separator: my file is comma-separated, so the separator is a comma. Then I have to give the columns so that the column values can be mapped into Elasticsearch fields. To get the column names, go to the downloaded file, right-click it, and open it with Notepad; the first line of the CSV is the header. Copy the header and put every header name in quotes; to do that I replace each comma with a quoted separator, add the quotes at the ends, and copy the result into the Logstash configuration.

The filter is done; now in the output I write the connection details. The syntax is the elasticsearch output: we are connecting to Elasticsearch running on localhost on port 9200, and we give the index name, so whatever data I import will land in an index named sales-records. I am also adding a stdout output so the events are printed to the console.

Now save this file into the Logstash configuration directory and name it logstash.conf. Mine was initially saved as logstash.txt, so I had to rename it; let's verify it saved correctly — yes, it is logstash.conf now. One more note on sincedb_path: I am executing this on Windows, so the value is NUL in capitals, but on Linux or Unix you would give /dev/null instead. Again, this is where Logstash tracks the point at which it previously stopped, and pointing it at the null device disables that tracking.

Now we just have to start Logstash. Go to the Logstash directory and run bin\logstash -f logstash.conf, then hit Enter (I had to replace the forward slash with a backslash on Windows). You can see it starting the pipeline, then the pipeline is started, and the Logstash API endpoint starts successfully. The pipeline source shown is our logstash.conf, so the configuration we provided was executed and the CSV file got imported. Now we can verify that using Kibana. Open the Kibana URL and list all the indexes present in the cluster with the _cat/indices command.
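As a quick verification sketch in the Kibana Dev Tools console (the index name sales-records comes from the configuration above):

    GET _cat/indices?v

    GET sales-records/_search

The first request lists every index in the cluster along with its document count; the second returns the documents stored in our new index.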
In that list you can see the sales-records index that we named in the conf file has been created. Now let's verify what is inside it and confirm the data imported correctly. To see the records inside the sales index, run GET <index-name>/_search and send the request. You can see that 101 records were successfully imported into Elasticsearch from the CSV file: the 100 sales records plus one more for the header row. You can verify the column names as well: there is the country, the region, the order ID, total profit, total cost, total revenue, ship date, order date, and so on — exactly the column names you provided, and for each of them you get the data.

So we got our 100 records into the index. Let's go further and look at the index pattern. To create an index pattern for the new index, go to Stack Management, and inside Stack Management go to Index Patterns (I am talking about Elasticsearch version 7.10 here). Click Create index pattern; you can see your index name, so type sales-records and you get a successful match. Click Next, keep the default @timestamp field, and hit the Create index pattern button. The index pattern is now created, which means Kibana has mapped your column data — Country, Item Type, Order Date, and so on — into fields.

Now let's verify the data visually in Kibana's Discover tab. Go to the Discover tab; in the dropdown you can see whatever index pattern you created, in our case sales-records, so click on that. Kibana searches that index, and here we go: we get a spike of 101 documents, the 100 sales records we put in the CSV file plus the header row, all visible in the Discover tab. We can look at the details: each document corresponds to one row of the comma-separated file, and you can see it in JSON format with all the attributes — region, country, item type, total revenue, and so on — for that one record; this is how each CSV row is mapped to columns in the Discover tab. Likewise we can inspect any document here; there are a hundred documents, viewable both in table format and in JSON format, and in JSON format you can see all the data imported from the CSV file.

So just like that, you can import CSV data into Elasticsearch and visualize it in Kibana. That's all for this session; in further sessions we will continue with CRUD operations on the Elasticsearch database and many more Elasticsearch topics. Stay tuned, and thanks a lot for watching!
Info
Channel: Simplifying Tech
Views: 22,563
Keywords: elasticsearch, kibana, logstash, elasticsearch training, bigdata training, data analysis, data visualization, elk, elk stack, load csv file to elasticsearch, import csv file in kibana, kibana import csv into elasticsearch, kibana csv file upload, store csv in elasticsearch, elkstack, elasticstack, windows 10, pipeline, elastic stack tutorial, csv, csv files, using logstash, loading csv in elastic, import csv in elk
Id: RHkc4RQw7cg
Length: 16min 46sec (1006 seconds)
Published: Sun Dec 20 2020