ELK using Docker Compose | Elasticsearch Logstash Kibana Tutorial

Captions
Hello friends, welcome back to my channel. Today we are back with another exciting tutorial on Logstash, Elasticsearch and Kibana. If you have seen my playlist for Logstash, Elasticsearch and Kibana, in the last video I showed you how to set them up using the binary files, the normal way of doing it. In this tutorial we will see how to do the same thing in a much simpler way using Docker containers: we will set up the whole ELK stack (Logstash, Elasticsearch and Kibana) with a single Docker Compose file. Once you run docker compose, it creates the containers for Logstash, Elasticsearch and Kibana. We will also write the configuration file for Logstash so that it can send the logs to Elasticsearch and on to Kibana. So it will be pretty interesting; let's get started. If you are new to my channel or have not subscribed yet, please click the subscribe button, like and share the videos, and press the bell icon so that you are notified about new videos.

As I said, this tutorial is about Docker Compose. The overall agenda is this: we have some log files which are read by Logstash, running as a Docker container. From Logstash we send those events into Elasticsearch, which also runs as a container, and on top of Elasticsearch we have Kibana for visualization, again running as a container. We are going to create all of this with one Docker Compose
file. In it we will define the services for Logstash, Elasticsearch and Kibana, and we will also create volumes so we can map in config files for Logstash and other things. When we run the Compose file, those config files are picked up, all the containers come up, and the data becomes available in Kibana. Without further ado, let me go directly to the Docker Compose file. To save time I am not going to write the file from scratch; I have already written it, so I will walk through the whole content. I will also share it through my GitHub link, so check the video description: you can copy the Compose file from there instead of following every step and typing it line by line.

What we are trying to do is create three services: one for Elasticsearch, one for Logstash and one for Kibana. For Elasticsearch I use the elasticsearch image, version 7.16.2; I believe a newer 7.17 release is out, but I am using 7.16.2 here. I give the container the name elasticsearch (you can use any name) and set restart: always, so that if the container gets stopped it can restart. As I said, we will use some volumes and environment variables. For the volume, I create a Docker-managed volume called elastic_data and map it to the Elasticsearch data directory; if you do not want the data to persist, you can simply remove this mapping. You can also add other volumes. Here I am not mapping a volume for elasticsearch.yml, but if you want your own Elasticsearch config file, you can create
that file and map it into the container as a volume as well. I wanted to keep this simple, so I am not adding bigger config files, but if you need your own elasticsearch.yml mapped into the container, put the file somewhere on the host and add it as a volume mapping here. We also use some environment variables: discovery.type=single-node, because I want to run Elasticsearch as a single-node cluster rather than a multi-node cluster, and ES_JAVA_OPTS to restrict the Java heap and memory usage. The ports we open for Elasticsearch are 9200 and 9300, and I attach the service to a network called elk so that all these services and containers use the same network; again, you can give the network any name.

The next service is Logstash. Here I use the logstash image, and make sure you use compatible versions of Elasticsearch, Logstash and Kibana; I am using the same 7.16.2 version. The container name is logstash, and again restart: always. If you look at the volumes, I map a logstash folder next to my Compose file into a directory such as /logstash_dir inside the container. This matters because I also set a command so that when Logstash starts, it uses the config file inside that /logstash_dir. That logstash.conf is a file we will create as part of this tutorial and place in that location, so when Logstash starts it picks up logstash.conf. I will put the pipeline content in there so that Logstash can use it as part of its
processing. This Logstash container also depends on Elasticsearch, because some of the settings in logstash.conf point at the Elasticsearch container so that Logstash can send data to it. For Logstash the port we open is 9600, and I set an environment variable to cap the memory usage at 256m. The network is again elk, because all of these services must be on the same network.

The third service and container is Kibana. I use the kibana image, again version 7.16.2, with container name kibana and restart: always. The port for Kibana is 5601. Notice the environment variable ELASTICSEARCH_URL: the hostname in that URL, elasticsearch, comes from the Elasticsearch container name, so if you use a different container name you must change the URL accordingly. This service also depends on the Elasticsearch container, and the network is again elk. Under the top-level volumes, the only one I declare is the Docker-managed elastic_data; the Logstash volume is a bind mount mapping directly to the host. As I said, you can add more environment variables and other settings as your requirements demand; for example, you can map your own kibana.yml as a volume, or a logstash.yml with your specific configuration. But I do not want to change all of those things; the only custom file I want is logstash.conf.

Now for logstash.conf, I will use a simple pipeline whose input reads from a log file, in.log, which will live under /tmp; I will create that file shortly.
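Putting the narration together, the Compose file described above can be sketched roughly as follows. This is a reconstruction, not the author's exact file (get that from the GitHub link in the description): the data-directory path, heap sizes and the /logstash_dir mount point are assumptions based on what is said in the video.

```yaml
version: "3.7"

services:
  elasticsearch:
    image: elasticsearch:7.16.2
    container_name: elasticsearch        # referenced by the Kibana URL below
    restart: always
    volumes:
      - elastic_data:/usr/share/elasticsearch/data   # Docker-managed persistent volume
    environment:
      discovery.type: single-node                    # single-node cluster, no discovery
      ES_JAVA_OPTS: "-Xms512m -Xmx512m"              # restrict JVM memory (assumed values)
    ports:
      - "9200:9200"
      - "9300:9300"
    networks:
      - elk

  logstash:
    image: logstash:7.16.2               # keep versions compatible across the stack
    container_name: logstash
    restart: always
    volumes:
      - ./logstash/:/logstash_dir        # bind mount the folder holding logstash.conf
    command: logstash -f /logstash_dir/logstash.conf
    depends_on:
      - elasticsearch
    ports:
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"  # cap Logstash memory at 256m
    networks:
      - elk

  kibana:
    image: kibana:7.16.2
    container_name: kibana
    restart: always
    ports:
      - "5601:5601"
    environment:
      ELASTICSEARCH_URL: http://elasticsearch:9200   # hostname = elasticsearch container name
    depends_on:
      - elasticsearch
    networks:
      - elk

volumes:
  elastic_data: {}

networks:
  elk:
```

Note that for the Logstash container to read a file under the host's /tmp, that path also has to be visible inside the container (for example via an extra bind mount such as `/tmp/:/tmp/` on the logstash service); otherwise the file input will find nothing.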
The output section of the pipeline sends the events to Elasticsearch, with the Elasticsearch host configured there; that is the whole content of the logstash.conf I will use here.

So let's get started with the process. Let me go to my Docker machine. First I create a folder called elk and go into it. Inside it I create another folder called logstash and go into that. There I create a file called logstash.conf and paste the input/output content we just discussed, so logstash.conf now sits under the logstash folder. Then I go one step back: alongside the logstash folder I create a file called docker-compose.yaml, copy the whole content of our Compose file into it, and save. Now we have one Docker Compose file and one logstash folder. Next I go back to the root directory, create a folder called tmp, go into it, and inside it create a file called in.log (so it is in.log with the .log extension). I put in some simple lines, like "this is a test file" and "this is a second line", just some content so that Logstash can ship the log lines from this file into Elasticsearch. Then I go back to the elk folder, where I have the Docker Compose file. On this machine I already have Docker and Docker Compose installed; if you do not know how to install them, please check my tutorial, which I will link in the video description, so you can refer to it for setting up Docker.
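The setup steps above can be scripted as below. This is a sketch under assumptions: the pipeline details (the /tmp/in.log path, `start_position`, and the `http://elasticsearch:9200` host, which matches the container name) follow the narration, but the exact file the author shares on GitHub may differ.

```shell
#!/bin/sh
set -e

# Project layout: elk/ holds the compose file, elk/logstash/ holds the pipeline config
mkdir -p elk/logstash

# Minimal Logstash pipeline: read /tmp/in.log from the beginning,
# send every line to the Elasticsearch container on the elk network.
cat > elk/logstash/logstash.conf <<'EOF'
input {
  file {
    path => "/tmp/in.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
  }
}
EOF

# Sample log file that Logstash will pick up
printf '%s\n' "this is a test file" "this is a second line" > /tmp/in.log

# Show what we created
ls elk/logstash
cat /tmp/in.log
```

After this, the docker-compose.yaml goes into the elk/ folder next to the logstash/ directory, and the stack is started from there.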
Now what we need to do is run docker compose up, so let me do that. Once I run this, it downloads the images for Elasticsearch, Kibana and Logstash; since I already have them, it creates the containers straight away, but on a fresh machine it downloads all three images first and then starts creating the containers and services. You can see the startup beginning. I am not running it in detached mode, so the whole process is printed on the screen: it brings up the Elasticsearch process first, then Logstash, then Kibana. It may take some time, because it has to create the Elasticsearch container and its indexes, then create the Logstash container and connect Logstash to Elasticsearch as per our config file, and then create Kibana and attach it to Elasticsearch, so that you can see the logstash index in the Kibana dashboard. Let's wait for this to complete. You can see Elasticsearch coming up, then Logstash; there is a message that the Logstash APIs have started. It is going a little fast, so I cannot pause it. Now the Kibana container is starting as well. Let's wait until the whole process is complete; only then can we access Kibana from the web browser. One more point: since we are using several different ports here (9200, 5601, 9600, 9300), for these ports to be reachable and the connections to happen, you also need to make sure that
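The exact firewall commands are in the author's GitHub link; a plausible version, assuming a firewalld-based distribution (on Ubuntu you would use ufw instead), looks like this:

```shell
# Open the ELK ports (assuming firewalld; adjust for your distro)
sudo firewall-cmd --add-port=9200/tcp --permanent   # Elasticsearch HTTP
sudo firewall-cmd --add-port=9300/tcp --permanent   # Elasticsearch transport
sudo firewall-cmd --add-port=9600/tcp --permanent   # Logstash monitoring API
sudo firewall-cmd --add-port=5601/tcp --permanent   # Kibana web UI
sudo firewall-cmd --reload

# Quick sanity check once the stack is up
curl -s http://localhost:9200                    # cluster info JSON from Elasticsearch
curl -s 'http://localhost:9200/_cat/indices?v'   # a logstash-* index should appear
```

These commands need root privileges and a running firewalld/Docker host, so treat them as a template rather than a copy-paste script.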
these ports are open in the firewall. I already did that; you can run the commands, which I will put in my GitHub link, so you can copy them from there to open the ports and let the connections happen. If you have not done this, sometimes the connections will not work and you will not be able to access Kibana in the web browser.

Now I browse to port 5601 and you can see the Kibana page opening up; we are inside the Kibana home page. If you check the running containers, you can see three: one for Logstash, one for Kibana and one for Elasticsearch, all running perfectly fine. In Kibana, if you go to Stack Management and check the indices, you can see the index for Logstash is already there, and you should also be able to see the pipelines and so on. The data is already coming in, so you can now write your queries in Dev Tools and work with the data that came through Logstash and Elasticsearch into Kibana. I just wanted to show you how this connection happens; we are not going to talk much about querying the data or creating dashboards and charts, as that is not part of this tutorial. Here I wanted to show how easily we can set up the whole Logstash, Elasticsearch and Kibana stack using Docker Compose. We will also see,
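As a starting point for the Dev Tools queries mentioned above, a basic search against the Logstash index might look like this in the Kibana console (the index pattern logstash-* is an assumption; check Stack Management for the actual index name):

```
GET logstash-*/_search
{
  "query": { "match_all": {} },
  "size": 5
}
```

This returns the first few documents that Logstash shipped from in.log, which is enough to confirm the pipeline end to end.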
in an upcoming tutorial, how to use other components like Beats, which can collect data and send it through Logstash and Elasticsearch to Kibana. Once that data is there, we will look at how to use it for analytics and how to create more dashboards and everything in Kibana; that will be a future tutorial.

That is all for this one. To recap, we created a Docker Compose file with services for Logstash, Elasticsearch and Kibana, and in the logstash.conf we defined the input and where it should be sent, namely Elasticsearch, which then shows up as an index in Kibana. I hope it was an informative tutorial for you. If you liked this video and want to watch more like it, subscribe to my channel, like and share the videos, and leave your feedback in the comment section, including any topic you would like me to cover; I am happy to create those videos. Thank you for watching.
Info
Channel: Thetips4you
Views: 70,824
Keywords: elasticsearch, logstash, kibana, elk, elk stack, logstash elasticsearch kibana, logstash configuration tutorial, logstash pipeline configuration, logstash and kibana, logstash configuration for elasticsearch, logstash elasticsearch configuration, elasticsearch output plugin, logstash kibana elasticsearch, logs not showing in kibana, syslog input plugin example, logstash tutorial, logstash vs kafka, logstash vs fluentd, logstash vs filebeat, vs prometheus, vs fluentd vs filebeat
Id: VpAH2IoMzKw
Length: 17min 23sec (1043 seconds)
Published: Tue Feb 08 2022