How To Load Data from a MySQL table to Elasticsearch Using Logstash | 2023

Video Statistics and Information

Captions
Welcome to Simplifying Tech and programming. In today's session we are going to see how to migrate SQL data to NoSQL data; that is, we will demonstrate how to transfer, or migrate, the data of a MySQL table to an Elasticsearch index using a Logstash pipeline. There is often a requirement to move a relational database to a NoSQL or faster search-oriented store, and that brings certain challenges. For this we can use the ELK stack and convert one data format into another, for example MySQL table data into Elasticsearch index data.

The agenda for today's session: we will see how to download and install Logstash 8.8.0, then go through starting Elasticsearch and Kibana, then look at the sample MySQL table whose data we will migrate, then create the Logstash pipeline configuration along with the MySQL JDBC connector, place that pipeline in the Logstash setup and run it, and finally visualize the data in the Kibana dashboard and the Discover tab, so we can see how the data gets loaded into Elasticsearch. So let's get started.

Before starting, here is a one-liner on what Logstash is and how it helps with transferring or converting data formats. Logstash is a free and open-source server-side data processing pipeline: it ingests data from multiple sources, transforms it, and sends it to your favorite "stash", which in our case is Elasticsearch. If we look at the Logstash skeleton, there are three simple building blocks. The first is the input, where we define the source of the data; in our case that is MySQL, so the JDBC section holds the database connection details. Then there is the filter, which carries the business logic we want to apply, such as which fields to change or what metadata to add. Finally, in the output section we are simply sending the data to Elasticsearch, so the Elasticsearch connection details go there. The whole job of data transformation is done by Logstash.

Now go to the browser and download Logstash from the official Elastic downloads page; it provides a Logstash build for Windows, which is what we will use. We have already downloaded logstash-8.8.0, so we only need to extract the archive, which has also been done, giving us the logstash-8.8.0 folder. Before creating the Logstash config and starting Logstash, we need Elasticsearch and Kibana to be running. If you want to know how to download and install Elasticsearch and Kibana, you can refer to my earlier videos: one covers 7.9.3 and another covers 8.5, and 8.8 is similar to 8.5. Our Elasticsearch and Kibana are already started, so let's write the Logstash configuration and then start Logstash. As mentioned, we need to define the JDBC details in the input section and the Elasticsearch connection in the output section; the overall shape of the pipeline file is sketched below.
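As a rough sketch of what we build in this video, a Logstash pipeline file has these three building blocks; the bodies here are only placeholders that get filled in below:

input {
  jdbc {
    # database connection details for the source (MySQL in our case)
  }
}

filter {
  mutate {
    # business logic: add, remove or change fields and metadata
  }
}

output {
  elasticsearch {
    # connection details for the destination Elasticsearch cluster
  }
}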
So let's do that. First, the driver library: we need the MySQL Connector/J jar, so go to the Maven repository website and search for the MySQL connector. At the time of recording 8.0.33 is the latest version, but 8.0.30 is a stable choice, so click on the jar and download it. Then create a folder for it, for example a com folder inside the tools\logstash directory, and paste the jar there; this path and the jar file name are what we reference in the configuration file.

Next we define the driver class used to connect to the database, which is the MySQL JDBC driver, and then the connection string that points at the local MySQL server and our database, mysql_db. We also set the JDBC username and password, and the statement that fetches the data from the film table. Here is the sample data in MySQL Workbench: the film table has 1,000 records with information such as the film title, description, release year, language ID, rental rate, special features and last update. It is this MySQL data that we will connect to and transform into NoSQL data, i.e. JSON documents, so the statement is simply SELECT * FROM mysql_db.film. Then there is some metadata to define: use_column_value must be enabled, and tracking_column should be a column in date format, so we use last_update. With that, the input section is done.

Now we define the output section, which holds the connection to Elasticsearch. Provide the Elasticsearch details: the host is localhost on port 9200 with the HTTPS protocol, and we will insert the data into a film index, which I will name film_idx. Our username is elastic, and the password is the one you need to note down when Elasticsearch starts for the first time; keep it here as the password. One more thing is needed, because the latest versions of Elasticsearch are security-focused and use SSL by default: for now we only disable certificate validation, so SSL stays on but certificate verification is turned off. With that, the output section is also done.

The filter holds whatever we want to change in between, and for that we use mutate, since we are mutating the events on their way from the source to the destination. Inside it we will add a couple of basic things, which are explained after the complete configuration sketch below.
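Putting those pieces together, a film_config.conf along the following lines is a minimal sketch of what the video describes, not an exact copy of it: the jar path, database name, credentials (both MySQL and Elasticsearch) and the example migrated_by field are assumptions to adjust for your setup, while the option names are the standard ones of the jdbc input, mutate filter and elasticsearch output plugins. The filter part is explained right after.

input {
  jdbc {
    # path to the MySQL Connector/J jar downloaded from the Maven repository (assumed location)
    jdbc_driver_library => "C:/tools/logstash/com/mysql-connector-j-8.0.30.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    # connection string, user and password for the local MySQL server (assumed values)
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mysql_db"
    jdbc_user => "root"
    jdbc_password => "your_mysql_password"
    # query that fetches the rows to migrate
    statement => "SELECT * FROM mysql_db.film"
    # track the last processed value of a date/timestamp column
    use_column_value => true
    tracking_column => "last_update"
    tracking_column_type => "timestamp"
  }
}

filter {
  mutate {
    # example of adding a field, and dropping the @version field Logstash adds by default
    add_field => { "migrated_by" => "logstash" }
    remove_field => ["@version"]
    # optional: keep the SQL primary key as the Elasticsearch _id
    # (would also need document_id => "%{[@metadata][_id]}" in the output below)
    # copy => { "film_id" => "[@metadata][_id]" }
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "film_idx"
    user => "elastic"
    # the password printed when Elasticsearch starts for the first time
    password => "your_elastic_password"
    # Elasticsearch 8.x enables TLS by default; certificate validation is switched off for this demo only
    ssl => true
    ssl_certificate_verification => false
  }
}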
In the filter, suppose we need to add some fields: we can do that with add_field. We can also remove fields with remove_field; by default Logstash adds a @version field to every event, and since we don't need it we remove it. If we want the Elasticsearch _id to keep the primary key of the SQL table, we can express that too, for example by copying film_id into the [@metadata][_id] field, but for now we are not doing that.

With that, the configuration file is done, so just save it. Where do we save it? Inside the logstash-8.8.0 folder there is a config folder, so go to config and save the file as film_config.conf; that is our Logstash pipeline configuration file.

Now we are ready to start Logstash. Go to the Logstash bin folder and open a command prompt there. In the bin folder there is a logstash executable, and we invoke it with the -f flag followed by the path to our configuration file, film_config.conf. We also define one more parameter, --config.reload.automatic, because Logstash will not always pick up configuration changes on its own; with this flag it reloads the configuration file automatically. The full command looks like the sketch below.
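A sketch of the command as described, run from a command prompt in the Logstash bin folder; the install path under C:\tools is an assumption, and the config file path is given relative to bin:

cd C:\tools\logstash-8.8.0\bin
logstash.bat -f ..\config\film_config.conf --config.reload.automatic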
Let's start it; this launches the Logstash process that reads the data and pushes it through the pipeline. While it runs, go to Kibana, which is already started at localhost:5601, and log in with the elastic user and the password noted earlier. Logstash is starting in the meantime, so open Dev Tools in Kibana and verify: the Logstash logs show that it started successfully and that its API endpoint is listening on port 9600. At this point there is no index called film_idx yet — querying it in Dev Tools returns an index-not-found error — so Logstash will create the index and insert the data into it. Once the logs show that the pipeline has terminated, the Logstash job is complete, and in the output you can see the elasticsearch output creating the film_idx index.

Let's verify whether the data really landed in Elasticsearch. In Dev Tools, query the film_idx index we told Logstash to write to (the exact requests are sketched at the end of this section): we get 1,000 records back, and the documents contain a @timestamp plus the data inserted from SQL — the title, description, rental duration, special features such as deleted scenes and commentaries, and so on. So all 1,000 rows have been written into Elasticsearch.

There is another, more analytical way to verify this: the Discover tab. For that, go to the data view management page; the film_idx index exists, but we still need to create an index pattern (data view) for it, so give it the name film_idx and use film_idx as the pattern as well, which shows one matching source. Save and confirm, and you can see we have 1,000 hits. The histogram shows the count of records — all 1,000 records written at that particular time — and below it the individual documents: sample film titles with their details such as film ID, language ID, description and last update. This is the more analytical view, where we can look for patterns in the data.

To summarize: we were able to load and visualize the data in Elasticsearch, so whatever data was in MySQL in tabular or table format has been migrated to Elasticsearch as documents. One question that may come to mind: the MySQL table also has indexes and foreign keys, so how are those transferred? Elasticsearch is a NoSQL database, and it implements the ideas behind primary and foreign keys in a different manner — with separate indices and different attributes — so that is taken care of in the NoSQL way. In this way we can migrate our SQL data to Elasticsearch using a Logstash pipeline configuration. Stay tuned for further interesting sessions, and thanks a lot for watching.
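For reference, a minimal sketch of the Dev Tools checks described above: before the run, the first request returns an index_not_found_exception; after the pipeline terminates, the count should report 1000 documents and the search returns the migrated film documents.

GET film_idx
GET film_idx/_count
GET film_idx/_search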
Info
Channel: Simplifying Tech
Views: 5,328
Keywords: logstash, jdbc plugin, easticsearch, database, tutorial, mysql, kibana, elk stack, windows 10, elasticsearch training, visualization, dashboard, kibana dashboard, kibana visualization tutorial, kibana tutorial, elastic stack, kibana dashboard tutorial, elasticsearch, data analysis, data visualization, elk, elkstack, elasticstack, pipeline, elastic stack tutorial, MySQL table to Elasticsearch using Logstash, migrating sql data to elasticsearch, logstash pipeline, sql to nosql
Id: PcSNVTBhe0w
Length: 15min 6sec (906 seconds)
Published: Sun Jun 04 2023