Exploring Kafka with Spring Boot: A Step-by-Step Tutorial | Kafka Integration with Spring Boot

Video Statistics and Information

Captions
In this lecture we will be covering the basics of Kafka. For that we'll create a Spring Boot project: assume it is a weather forecast application that collects data from devices installed at different locations. First we will run ZooKeeper, then Kafka. After that we will create the Spring Boot project that reads data from the different devices and sends it into the Kafka pipeline. Then we will read that data back, assuming the reader is a mobile phone, a tablet, or some other user device on which the readings are displayed. I have already created a lecture on the basics of Kafka; you can go and check it out, the link is in the description.

So these are the steps; let's see how it is done. I have already installed Kafka (you can check out the video linked in the description). ZooKeeper is in the bin\windows folder of the Kafka installation, and the Kafka server script is there as well, so we will be running these two. Their configs are in the config folder: zookeeper.properties and server.properties. Now, in this Kafka folder, let me open a command prompt and run ZooKeeper first: go to the bin folder, then windows, then zookeeper-server-start.bat, passing the config from the config folder, which is zookeeper.properties. After that ZooKeeper starts; this is our first step. The second step is to start Kafka: again go to the same Kafka folder, then bin, then windows, where kafka-server-start.bat is, and pass server.properties from the config folder; this starts Kafka. Always run ZooKeeper first, because it coordinates the Kafka broker. Now that these two are running you could also inspect them and perform operations from the console alone, but for now let's move on and create the Spring Boot application.

I will create a Spring Starter Project; you can either generate it through the Spring Initializr website or use whatever IDE you prefer. Let's name it temperature-updater, and keep temperature-updater as the artifact name as well. This application will publish temperature updates for different devices to read from the Kafka pipeline. I am adding Web and Apache Kafka as dependencies, plus DevTools so changes are reloaded at runtime and I don't have to restart every time. Now the project has been created; I open it and go to the main class, which contains SpringApplication.run, i.e. the Spring Boot application entry point.

The first thing is to create the configuration. For that I will create a config class inside the config package, annotated with @Configuration, and in it I will define a bean for Kafka. The bean returns a NewTopic; the topic created in Kafka will be read by the different devices. I create it through the builder pattern, that is TopicBuilder, which is provided by Spring Kafka: .name() takes the name of the topic I am creating, so "weather" is the topic name, and then .build(). That is how the bean for the topic is created. TopicBuilder also lets you configure partitions and replicas: if you want to go down to that granular level you can provide the number of partitions and the replication factor here, but for now let's stick with the topic only and keep the defaults, so I remove the partitions and replicas calls from the config. This is the topic configuration created in the config folder.
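For reference, here is a minimal sketch of the topic configuration described above. The class name KafkaTopicConfig and the package layout are assumptions for illustration; the walkthrough only specifies the "weather" topic name, the @Configuration class, and the use of TopicBuilder.

```java
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig { // hypothetical class name

    // Declares the "weather" topic; Spring Kafka's KafkaAdmin creates it on the
    // broker at startup if it does not already exist.
    @Bean
    public NewTopic weatherTopic() {
        return TopicBuilder.name("weather")
                // .partitions(3) // optional: set the partition count explicitly
                // .replicas(1)   // optional: set the replication factor explicitly
                .build();
    }
}
```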
Now that our configuration is ready, let's create the service. For the service, create another class: go to New > Class, name it WeatherService, and place it in the service package. I create the class, click Finish, and annotate it with @Service because it is a service class. Inside it I will autowire the Kafka template and set up a logger. First the logger: private static final Logger, obtained from LoggerFactory.getLogger, passing WeatherService.class. Our logger is created. Next, autowire the Kafka template: @Autowired private KafkaTemplate, where the key type is String and the value type is the String that will be sent through the template. The Kafka template is now autowired.

Now I'll create the service method. Let it be public, returning a Boolean indicating whether the weather data was sent or not, and name it sendTemperature. Assume the data is being read from devices installed at different locations and fed into the Kafka pipeline, from where other devices will read it; this method sends the data to the topic we created so that it is available in the pipeline. I will also create, not a concrete class, but a constants class holding the name of the topic. I'll name it WeatherConstants, and the constant defined in it is the topic name "weather". I will use this constant everywhere, so in the service class the first argument to send is the topic name, WeatherConstants.TOPIC. The message I send is the temperature at that particular moment, so let the message be a random temperature value with a maximum of 50 degrees: I use the random function capped at 50, round it to the nearest whole number instead of a decimal, log "data sent successfully", and return that the temperature has been sent to the Kafka pipeline. With that, our service layer is ready.
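Here is a minimal sketch of the constants class and service described above, shown as a single file for brevity; the package layout and message format are assumptions, while the class names, the "weather" topic, the KafkaTemplate with String key and value, and the 50-degree cap come from the walkthrough.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Constants class holding the topic name so the producer and listener can reuse it.
class WeatherConstants {
    static final String TOPIC = "weather";
}

@Service
public class WeatherService {

    private static final Logger logger = LoggerFactory.getLogger(WeatherService.class);

    // String key, String value; matches the serializers configured in application.properties.
    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    // Simulates one device reading and publishes it to the "weather" topic.
    public Boolean sendTemperature() {
        long temperature = Math.round(Math.random() * 50); // random value capped at 50 degrees
        kafkaTemplate.send(WeatherConstants.TOPIC, "temperature = " + temperature);
        logger.info("data sent successfully");
        return true;
    }
}
```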
After the weather service, the next part is the controller, from which we will trigger sending the details each time. I create WeatherController inside the controller package and annotate it as a @RestController. The first thing is to autowire the service, and the second is the GET mapping. Let's go with a GET mapping of /updateTemperature; that will be the URL of the endpoint. It returns a ResponseEntity carrying a string saying that the update has been done successfully, something along those lines. I name the method updateTemperature, and it delegates to the autowired WeatherService created earlier: @Autowired private WeatherService. Let me also move this class into the controller package. Through the weather service I call the sendTemperature method, and then I simply return a new ResponseEntity with the string value along with an HTTP status code: let the string be "temperature updated" and the status be HttpStatus.OK, that is 200. That is how the controller is created, and the URL is /updateTemperature.

If you look at the project, these are the packages that have been created. This is the main class; this is the WeatherController we just created, which calls the sendTemperature service method; in sendTemperature we assume we are reading data from different devices and sending it through the Kafka template to the topic, saying that the temperature at this particular moment is such-and-such; after that we simply log "data sent successfully"; and the topic itself is the one defined through the TopicBuilder bean. Now for the resources part: in application.properties we need the Kafka bootstrap server address and port, plus the key and value serializers for Kafka.

Now let's run it as a Spring Boot application. This is the console of my Spring Boot application; let me minimize the other windows, and you can see it has started on port 8089. Everything is fine so far, so let's use the URL we created to send data. Before that, alongside the ZooKeeper and Kafka processes I started, I will create a listener, or rather a consumer: the Kafka console consumer, just to verify that whatever I am sending actually arrives in the Kafka pipeline. This is the command we need, with the topic and the bootstrap server: again go to the Kafka bin folder, then windows, then kafka-console-consumer.bat, then the topic name, which is the "weather" topic we created, and then the remaining parts you can see in the command: --from-beginning and --bootstrap-server with localhost and the port number of my Kafka broker. So I create and start this consumer.
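Pulling the controller pieces together, here is a minimal sketch under the same hypothetical package layout as above; the /updateTemperature GET mapping, the autowired WeatherService, and the "temperature updated" / HttpStatus.OK response come from the walkthrough.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class WeatherController {

    @Autowired
    private WeatherService weatherService; // the service that publishes to Kafka

    // GET /updateTemperature triggers one simulated device reading.
    @GetMapping("/updateTemperature")
    public ResponseEntity<String> updateTemperature() {
        weatherService.sendTemperature();
        return new ResponseEntity<>("temperature updated", HttpStatus.OK);
    }
}
```

The application.properties entries mentioned in the walkthrough are presumably the standard Spring Boot keys: server.port for the 8089 HTTP port, spring.kafka.bootstrap-servers for the broker address, and the producer key/value serializer properties.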
Now when I hit the endpoint through Postman, the data should appear in this pipeline. Let me copy the URL, /updateTemperature, go to Postman, and keep the console consumer open so we can watch the data arriving in real time. We need to switch the request to the GET method, then send. See, the temperature is coming through, but as a very large number, so let me adjust the service layer so it is rounded and capped at a maximum of 50 degrees. I go to the service layer and change that expression; now it will stay at 50 degrees at most. As you can see, the values coming in are 32, 31, 44, 41, at most around that, so everything is fine so far. These are the data being sent to the Kafka pipeline.

We should also have somewhere to read those details back out, and for that we need a Kafka listener. In real life this would be a different device or application, but just to demonstrate it I am writing it here in the same project, so that you can see how the Kafka listener works. I add @KafkaListener, then the topic it should listen to, which is WeatherConstants.TOPIC, the constant we already created; it is nothing but the topic name, "weather". Then there is the group: different groups of consumers can listen to the same topic, so let's assume this is the "temperature-1" group, and everyone in the temperature-1 group will receive these messages. I declare the method as public void, because it just listens, name it getTemperature, and give it the data delivered by the listener; I simply print the data, without a logger, just to test it out, using System.out.println.

Let me restart the application and expand the consoles so we can see the real-time data in both places. It has started, and as you can see the temperature is showing up in both: in the Kafka console consumer, which confirms the data is being sent into the pipeline, and also in our application. Whenever I send anything it goes to both places: a 4 has gone there; let me send a few more times, and a 1 has gone there, a 37 has gone there, and so on. All the data sent into that Kafka stream is being read back out by the Kafka listener. So with this we have created an application and seen how real-time data can be read out. Thank you all for watching my video; I will be creating a few more detailed videos on Kafka. Thank you all.
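For reference, here is a minimal sketch of the listener demonstrated at the end of the walkthrough. In the video the method is added inside the existing project for demonstration; it is shown here as a separate component, with the class name and package assumed, while the @KafkaListener annotation, the weather topic constant, the "temperature-1" group id, and the println come from the walkthrough.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class WeatherListener { // hypothetical class name

    // Consumes every message published to the "weather" topic as part of the
    // "temperature-1" consumer group and prints it, as shown in the video.
    // WeatherConstants.TOPIC is the constant from the service sketch above
    // (assumed to be in the same package).
    @KafkaListener(topics = WeatherConstants.TOPIC, groupId = "temperature-1")
    public void getTemperature(String data) {
        System.out.println(data);
    }
}
```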
Info
Channel: TrendingCode
Views: 234
Keywords: Kafka and Spring Boot, Spring Boot Kafka Integration, Spring Boot and Kafka Topics, Kafka Messaging with Spring Boot, Spring Boot Kafka Tutorial, Kafka Integration using Spring Boot, Spring Boot Kafka Consumer, Kafka Producer with Spring Boot, Kafka Pub-Sub with Spring Boot, Spring Kafka Examples, Spring Boot Microservices with Kafka, Kafka Streams with Spring Boot, Spring Boot for Big Data with Kafka, Kafka and Spring Boot Best Practices, Spring Boot Kafka Configuration
Id: i78_Ulp2A8k
Length: 16min 41sec (1001 seconds)
Published: Sat Sep 16 2023