Spring Boot Kafka Microservices - Spring Boot Kafka Real-World Project Tutorial

Captions
Hi, welcome back. In this lecture we'll take a look at the overview of the real-world project. Basically, we are going to develop a real-world project to demonstrate the usage of Spring Boot and Apache Kafka. This course is also useful if you use event-driven architecture in your microservices — say you use Apache Kafka as the messaging system between multiple microservices — because you will learn the fundamentals of Apache Kafka and how to use it in a Spring Boot project.

Here is our project goal: we are going to move a huge amount of real-time stream data from Wikimedia into a database. Looking at the architecture, we will create a Kafka producer that reads the real-time stream data from Wikimedia and writes it to the Kafka broker, and then a Kafka consumer that consumes the stream from the Kafka broker and writes it to the database. That is the workflow for this project. This real-world project is a good use case for demonstrating Apache Kafka, because Kafka is designed to deal with large amounts of data.

Let's head over to the link and see what the real-time stream data looks like. Let me hit this link in the browser — you can see a large amount of data. Whatever changes users make in Wikimedia can be retrieved using this REST API. If you observe the real-time Wikimedia data closely, it consists of events: whenever a user makes a change in Wikimedia, that entry is stored as an event on the Wikimedia server, and with this REST API we can fetch all the recent changes users have made.

Let me show you a couple of demos built on this real-time Wikimedia stream. Here someone has created a user interface that uses the stream behind the scenes — you can see the real-time data records displayed, and charts reflecting the changes as they happen. In another demo, someone has written HTML and JavaScript to create a user interface that reads the real-time Wikimedia stream and displays the data. In this project we'll see how to use the Apache Kafka broker to read that real-time stream from Wikimedia and write it to the database. All right, great — I will see you in the next lecture.

Hi, welcome back. In this lecture we'll take a look at the Spring Boot Kafka project setup. Basically, we are going to create two projects, and you can think of them as microservices. First we create kafka-producer-wikimedia; this project acts as a Kafka producer, reading the real-time Wikimedia stream and writing that data to the Kafka broker. Next we'll create a kafka-consumer-database project, which consumes the real-time stream from the Kafka broker and writes the data to the database. This is how, at a high level, the Spring Boot Kafka project setup looks.
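As a quick aside, the stream inspected in the lecture can also be peeked at from a terminal. The URL below is my assumption of the endpoint shown in the video (Wikimedia's public EventStreams recent-change feed); the lecture only refers to "this link":

```shell
# Peek at the Wikimedia recent-change stream (Server-Sent Events).
# URL is an assumption: Wikimedia's public EventStreams endpoint.
curl -s https://stream.wikimedia.org/v2/stream/recentchange | head -n 20
```

Each `data:` line is one JSON event describing a single wiki edit — exactly what the producer we build will forward to Kafka.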
Basically, we are going to create two projects as microservices: kafka-producer-wikimedia is microservice one, kafka-consumer-database is microservice two, and we use Apache Kafka as the messaging system to exchange messages between microservice 1 and microservice 2. All right, great.

Hi, welcome back. In this lecture we'll install and set up Apache Kafka locally. It's pretty simple — just follow the steps I implement in this lecture. Head over to the browser, and in a new tab type "apache kafka download" and hit Enter. Click the first link, "Download Apache Kafka", which navigates to the Apache Kafka downloads page. Under the Get Started tab you can see a Quickstart link — click it; we are going to use the Apache Kafka quickstart page a lot in this course. Scroll down and you can see the steps, which are pretty useful for installing and setting up Kafka locally.

Step one is "Get Kafka": we need to download the latest Kafka release and extract it. Click the download link to get Apache Kafka as a tar file; it will take a couple of seconds depending on your internet speed. Once Kafka has downloaded, open it in the file manager and extract the archive. Then rename the folder — I'm going to remove the version and just call it "kafka".

Now the second step: start the Kafka environment. Before starting it, make sure you have Java 8 installed on your machine. First we need to start ZooKeeper. You might be wondering what ZooKeeper is — in the Kafka ecosystem diagram you can see a Kafka cluster with three Kafka brokers (broker 1, 2, 3) and one ZooKeeper. ZooKeeper is a service that manages the state of all the Kafka brokers within a Kafka cluster, and it also manages configuration for topics, producers, and consumers. In our case we have only one Kafka broker installed locally, and we start ZooKeeper first so that it can manage the state of that broker.

To start the ZooKeeper service we execute the zookeeper-server-start.sh script, located in the bin folder — expand the kafka folder and inside bin you can see all the .sh files. The .sh files are for Mac and Linux; if you are on Windows, use the .bat files instead (I am on a Mac, so I'll run the .sh files). Find zookeeper-server-start.sh — this is the script we run to start ZooKeeper. Go back to the browser, copy the command from the quickstart page, and open a terminal. Before executing the command we move into the kafka folder: type cd Downloads, then cd kafka, and run bin/zookeeper-server-start.sh config/zookeeper.properties — the script reads all its properties from that zookeeper.properties file. Hit Enter, and there we go: the ZooKeeper service has started.

Once ZooKeeper is running we can start the Kafka broker service. Open a new terminal session (on Windows, a new Command Prompt) and again move into the folder: cd Downloads, cd kafka. The broker is started by another script in the bin folder, kafka-server-start.sh, and we must pass it the server.properties file, because the script reads all its configuration from there. Run bin/kafka-server-start.sh config/server.properties and hit Enter.
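To recap, the two startup commands look like this. The path assumes Kafka was extracted to ~/Downloads/kafka and renamed as in the lecture; on Windows, the .bat equivalents live under bin\windows:

```shell
cd ~/Downloads/kafka

# Terminal 1: start ZooKeeper (manages broker state and topic metadata)
bin/zookeeper-server-start.sh config/zookeeper.properties

# Terminal 2: start the Kafka broker (listens on localhost:9092 by default)
bin/kafka-server-start.sh config/server.properties
```

Keep both terminals open — stopping either process shuts down that part of the Kafka environment.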
Apache Kafka broker service will run on localhost:9092, so let me maximize the window and search the log for 9092 — there we go. You can see the log statement: a new controller was registered, and from now on the broker service at localhost:9092 will be used, which means our Apache Kafka broker has started on port 9092. We will use localhost:9092 in our Spring Boot application to interact with Kafka. Back to the quickstart steps: "Once all services have successfully launched, you will have a basic Kafka environment running and ready to use." We have started the ZooKeeper service as well as the Kafka broker service, so we are good to use the Kafka broker on our local machine. Let me recap this lecture: we downloaded the latest Kafka release from the official website, installed it locally, and started the ZooKeeper and Kafka broker services. Now we can use this Kafka environment from our Spring Boot application.

Hi, welcome back. In this lecture we will create the Spring Boot project for the Wikimedia producer. Basically, we are going to create a multi-module Maven project and create submodules within it. First we'll create a Spring Boot project using Spring Initializr, then import it into IntelliJ IDEA, and then create submodules inside it. Go to the browser and in a new tab type start.spring.io — this brings up Spring Initializr.

Let's fill in the project information. There are two project types, Maven and Gradle — keep Maven selected — choose language Java, and Spring Boot version 2.6.7 (that's the version as of recording; it may differ in your case). Next, the project metadata: group net.javaguides, artifact something like spring-boot-kafka-real-world-project, name the same as the artifact, description "Demo project for Spring Boot and Kafka", and package net.javaguides.springboot. For packaging, we ultimately want the parent to be a Maven parent project with packaging type pom, but we'll change that later — for now keep jar selected. For the Java version, keep the default selected; you can choose Java 8, 11, 17, or 18 depending on the JDK installed on your machine.

Next, the dependencies section: click Add Dependencies and search for "Spring for Apache Kafka". This is the dependency that provides support for integrating Apache Kafka into Spring-based applications, and it is enough for us — but let's also add the Lombok dependency to reduce boilerplate code. If you want, use the Explore option to review the generated pom.xml and confirm the dependencies we added: Spring for Apache Kafka and Lombok. Close it and click Generate; this downloads the Spring Boot project as a zip file. Open it in a folder and unzip it.

Now open IntelliJ IDEA. You'll see options like New Project, Open, and Get from VCS — since we are opening an existing project, click Open, navigate to where the Spring Boot project was downloaded (in my case the Downloads folder), select it, and click Open. This opens the Spring Boot project in IntelliJ IDEA.

Now that we have generated the project with Spring Initializr and imported it, let's make it a multi-module Maven project. First we make it the parent project: go to pom.xml and add a packaging tag with the value pom — whenever we create a multi-module Maven project, the parent project's packaging type must be pom. Save it and click the icon to load the Maven changes. The parent project is ready, so let's create a module inside it: right-click the project, choose New → Module, tick the checkbox, click Next, give the module name kafka-producer-wikimedia, and click Finish. As soon as the module is created, the parent's pom.xml is updated — open it and you can see a modules section listing the new module. Click the icon to load the Maven changes again.

If you look at the module we just created, it is not itself a Spring Boot project, so let's make it one. Create a package named net.javaguides.springboot, and inside it create the Spring Boot main entry point class: right-click, New → Class, and name it SpringBootProducerApplication. Annotate the class with @SpringBootApplication and add a main method that calls SpringApplication.run(SpringBootProducerApplication.class). That's it — we have created the Spring Boot main entry point class for this module, and if you run the project there is no error in the console.

So far we have created a parent project, created a module inside it, and made the module a Spring Boot project. To verify the setup, open the Maven tool window and run mvn clean install — the Maven build succeeds, which means the multi-module Maven project we created works as expected. One more thing is missing: we need to package the kafka-producer-wikimedia module as a jar file. Go to the module's pom.xml, set the packaging type to jar, save, click Load Maven Changes, and run mvn clean install again — and there we go, the build succeeds again, so the multi-module project works as expected. In the next lecture we'll see how to configure the Kafka producer.
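The pom.xml changes described above amount to two small edits — here is an illustrative fragment (module name as chosen in the lecture; surrounding tags omitted):

```xml
<!-- Parent pom.xml: a multi-module parent must use pom packaging;
     IntelliJ adds the <modules> entry when the module is created. -->
<packaging>pom</packaging>
<modules>
    <module>kafka-producer-wikimedia</module>
</modules>

<!-- Module pom.xml (kafka-producer-wikimedia): package the module as a jar -->
<packaging>jar</packaging>
```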
All right, I will see you in the next lecture.

Hi, welcome back. In the previous lecture we created a multi-module Maven project: a parent project, and within it a child project, kafka-producer-wikimedia, which acts as our Kafka producer. In this lecture let's configure the Wikimedia producer and also create a topic within our Spring Boot project. Go to IntelliJ IDEA, open the kafka-producer-wikimedia module, expand src, and go to the resources folder. There we create an application.properties file: choose File, give the file name application.properties, and hit Enter.

Within application.properties we configure the producer. First add the property spring.kafka.producer.bootstrap-servers — this tells Spring Boot which Kafka broker to use, the one running on port 9092. Whenever our Kafka producer reads the real-time stream data from Wikimedia, it connects to this Kafka broker and writes the data to it. Next, configure the key serializer and value serializer for the producer: set spring.kafka.producer.key-serializer to org.apache.kafka.common.serialization.StringSerializer, then copy that property, paste it, and change key to value to configure spring.kafka.producer.value-serializer the same way. Now we have configured the key and value serializer classes — whenever we develop a Kafka producer, it internally uses these serializer classes to serialize the input data.
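Put together, the application.properties from this lecture should look like the following (broker address and serializers exactly as configured above):

```properties
# Kafka broker the producer connects to
spring.kafka.producer.bootstrap-servers=localhost:9092

# Serializers for message keys and values (plain strings in this project)
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
```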
Next we need to create a Kafka topic. Go to the main package of this module, right-click it, choose New → Java Class, and give the class name KafkaTopicConfig. Hit Enter and annotate the class with @Configuration so that Spring treats it as a Java configuration class. Inside it we create a Spring bean using Java-based configuration: a public method named topic with return type NewTopic (from the org.apache.kafka.clients.admin package), which returns a topic built with TopicBuilder from the org.springframework.kafka.config package. Call its name() API, give the topic name wikimedia_recentchange, and call build() to build the object. You could also set the partition count with partitions(), but I want to keep the default partitions created by the Kafka broker. Save it and annotate the method with @Bean. Perfect — we have created a Kafka topic.

Let me recap this lecture: we created the application.properties file, configured the key and value serializer classes for the Kafka producer, pointed it at the Kafka broker URL, and created a Kafka topic named wikimedia_recentchange. In the next lecture we will start implementing the Wikimedia Kafka producer. All right, great — I will see you in the next lecture.

Hi, welcome back. In this lecture we'll start implementing the Wikimedia Kafka producer. Head over to IntelliJ IDEA and go to the main package — note that we are in the module, not the parent project.
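Assembled from the steps in the previous lecture, the KafkaTopicConfig class is a short sketch like this (package and topic names as chosen in the course; presented as a sketch, since it only runs inside a Spring Boot application with spring-kafka on the classpath):

```java
package net.javaguides.springboot;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    // Declares the topic; Spring Kafka's admin support creates it on startup
    // if it does not already exist. Partition count is left to broker defaults,
    // as in the lecture (TopicBuilder.partitions(n) would override it).
    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("wikimedia_recentchange")
                .build();
    }
}
```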
Right-click the main package, choose New → Class, and give the class name WikimediaChangesProducer — this is the name we give the producer. Hit Enter and annotate the class with @Service from the Spring package, which makes it a Spring bean.

Within this class, first create a logger instance: private static final Logger (from SLF4J) logger = LoggerFactory.getLogger(WikimediaChangesProducer.class). Perfect — we have created a logger to log the messages. Next, inject KafkaTemplate from the Spring Kafka library — we use KafkaTemplate to send messages to the Kafka broker. Declare a KafkaTemplate<String, String> field named kafkaTemplate (key and value types are both String), then use constructor-based dependency injection: right-click, Generate → Constructor, select the field, and click OK. Whenever Spring finds a single constructor on a Spring bean, it injects the dependency automatically — you don't need the @Autowired annotation here.

Next, create a method that will read the real-time Wikimedia stream data: public void sendMessage(). Within it, define the topic name: String topic = "wikimedia_recentchange" — the same name we gave the Kafka topic in KafkaTopicConfig. Back in WikimediaChangesProducer, I'll add a comment: to read real-time stream data from Wikimedia, we use an EventSource.

To create an EventSource and read the event data from Wikimedia we need a couple of libraries, so let's add them. Go to the browser, and in a new tab search for "okhttp eventsource maven", click the first link, and go to version 2.5.0. This is the Maven dependency we'll use to create the EventSource that reads events from Wikimedia: okhttp-eventsource. Copy the dependency, go to IntelliJ IDEA, and open the pom.xml of this particular module — we are not adding it to the parent project; we add it to the module. Create a dependencies section and paste the dependency inside it.

The real-time stream data from Wikimedia is JSON, so we also need the Jackson libraries to deal with it. Search for "jackson json maven", open the first link, go to version 2.13.2, grab the jackson-core dependency, and paste it into the module's pom.xml. Let's also add one more Jackson dependency, jackson-databind: choose the latest version, copy it, and paste it into the pom.xml as well.

Now we have added these three dependencies: okhttp-eventsource to create an EventSource that reads the real-time event data from Wikimedia, and the two Jackson dependencies to deal with the JSON data. Click the Load Maven Changes icon — and as soon as the Maven changes load, we get an error: a dependency cannot be resolved. The okhttp-eventsource dependency needs a transitive dependency, so let's add it. Go to the Maven repository in the browser and search for okhttp — the error mentions com.squareup.okhttp3, so click that dependency, scroll down, click version 4.9.3 (the latest release), copy the Maven dependency, paste it right below the others, and click the Load Maven Changes icon so it downloads from the internet. Now we have added all the required dependencies to the pom.xml file and we are ready to write the EventSource code that reads the real-time event data from Wikimedia.

Back in our producer class, WikimediaChangesProducer, we can write the EventSource code. To handle the events we create a separate class, WikimediaChangesHandler: right-click the package, New → Java Class, name it WikimediaChangesHandler, and make this handler class implement the EventHandler interface — make sure you choose the EventHandler interface from the com.launchdarkly.eventsource package.
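Before moving on, the module's dependency section assembled in this lecture ends up roughly as follows. Versions are the ones picked in the video; the group IDs are the usual published Maven coordinates for these artifacts, stated here as an assumption since the video only shows them on screen:

```xml
<dependencies>
    <!-- EventSource client for reading Wikimedia's SSE stream -->
    <dependency>
        <groupId>com.launchdarkly</groupId>
        <artifactId>okhttp-eventsource</artifactId>
        <version>2.5.0</version>
    </dependency>
    <!-- Jackson, for working with the JSON event payloads -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>2.13.2</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.13.2</version>
    </dependency>
    <!-- Transitive requirement of okhttp-eventsource -->
    <dependency>
        <groupId>com.squareup.okhttp3</groupId>
        <artifactId>okhttp</artifactId>
        <version>4.9.3</version>
    </dependency>
</dependencies>
```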
 couple of methods from this interface and we need   to provide the implementation so just mouse around  this class and then click on implement methods   and then choose all these methods and click  on ok all right now in this class we need to   only implement on message method okay so these  other methods like on open and closed on comment   on error so these are the methods we are not going  to implement because these are the methods that we   are not going to use we are going to only use  onmessage method well whenever there is a new   event in a wikimedia then this on message method  will be triggered and then it will read that event   okay now what we'll do we'll create a constructor  for this class so before that let's declare   Kafka template and just pass key type  as a string value type as a string   and then kafka template and also let's declare  one string variable it takes a topic value   okay perfect now let's create a parameterized  constructor right click generate constructor   and select these two variables and then click  on ok now we have parameterized constructor and   these two parameters we pass whenever we create  a instance of this class all right perfect now   just save it and let's go to on message method so  within our message method we're going to basically   call the Kafka template so before that here i'm  going to add a logger private static final and   then logger so logger from cell 4j and then  this should be a logger instance and then   logger factory dot get logger and then pass  the class name wikimedia changes handler dot   class perfect so go to on message method over here  and just add the logo statement so logger dot info   and then pass the message let's say string  dot format and type the message something like   event data something like this and  then placeholder that is percentage yes   and then pass the message so let's say message  even dot get data okay now we have added a logo   statement to lock the messages now 
let's  go and let's use Kafka template to send   the event to the topic so let's say kafka template  dot call send method so you can see here there are   a lot of overloaded send methods we are going to  call appropriate one so let's choose second one   it takes a topic so go ahead and call this  second one and then pass post argument as a topic   second argument as message event dot get data  okay perfect now we have created wikimedia   changes handler so this will basically trigger  whenever there is a new event in a wikimedia so   basically if you can see here we have implemented  on message method right so whenever there is a   new event in a wikimedia then this handler will  be triggered and this within this handler method   this on message method will be called and within  this on message method we have written the logic   to send this message to the topic using kafka  template provided send method okay it's pretty   simple now let's go back to our producer class  that is wikimedia changes producer so within this   producer we are going to call this wikimedia  changes handler class let's create a event   handler object so let's type event handler make  sure that you choose event handler from com dot   launch directly dot event source package and then  they should be event handler equal to new and then   call the implementation class that is wikimedia  changes handler and then pass first parameter as a   template second parameter as a topic perfect  now we have created event handler next we need   to define the event source url that is rest  api url so let me simply copy that url from   here so this is the rest api url it provides  real time wikimedia stream data right so let   me copy this link and let me pass here perfect  now we need to pass this url to the event source   next we need to create a event source  which will basically connect to the   source that is wiki media source and it will read  all the uin data so let's create a event source   and make 
Make sure that you choose EventSource from the com.launchdarkly.eventsource package, and then call the builder. First we need to create a builder, EventSource.Builder, and then we create an EventSource object from the event source builder: new EventSource.Builder, passing the event handler as the first parameter and the URI as the second parameter — just call URI.create and pass the URL. Perfect, now we have created the event source builder. Now let's create the EventSource object from this builder: EventSource eventSource = builder.build(). Perfect, we have created the EventSource object. Well, if you look at the internal implementation of the event source, it internally uses an ExecutorService to create threads, so it will start this event source in a separate thread — just call eventSource.start(). Okay, perfect. I am going to make the thread sleep for 10 minutes, so call TimeUnit.MINUTES.sleep(10); after 10 minutes this will be stopped. Okay, let me see what the error here is — this method should handle (or declare) the InterruptedException. Okay, perfect. Now, what have we done? Basically we have created a WikimediaChangesHandler, and this handler will be triggered whenever there is an event in Wikimedia; this EventSource will connect to the source, that is the Wikimedia source (the source URL is this one), pull all the real-time stream data from it, and trigger the respective handler. And within the handler we have written the code to send that event to the topic, isn't it? Well, in order to run this WikimediaChangesProducer we need to call its sendMessage method, right? So just go to the main entry point class of this project — this is the Spring Boot main entry point class — and let's implement the CommandLineRunner interface.
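The producer described above can be sketched like this (an illustrative sketch; it assumes the okhttp-eventsource library, a topic name matching the one used later in the tutorial, and the Wikimedia recent-change stream URL):

```java
package net.javaguides.springboot;

import java.net.URI;
import java.util.concurrent.TimeUnit;

import com.launchdarkly.eventsource.EventHandler;
import com.launchdarkly.eventsource.EventSource;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class WikimediaChangesProducer {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public WikimediaChangesProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage() throws InterruptedException {
        String topic = "wikimedia_recentchange";

        // Handler that forwards each stream event to the Kafka topic
        EventHandler eventHandler = new WikimediaChangesHandler(kafkaTemplate, topic);
        String url = "https://stream.wikimedia.org/v2/stream/recentchange";

        // EventSource connects to the stream and runs the handler on its own threads
        EventSource eventSource = new EventSource.Builder(eventHandler, URI.create(url)).build();
        eventSource.start();

        // Keep the application alive while events are being consumed
        TimeUnit.MINUTES.sleep(10);
    }
}
```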
This interface provides a run method, and this run method will be executed whenever the application starts. Here, what I want to do is inject the producer: private WikimediaChangesProducer, with the @Autowired annotation. Now we have injected this producer, so just call its sendMessage method, right? So whenever we run this Spring Boot application via this main entry point class, this object will be instantiated and its sendMessage method will be called. Now let's go and try it: run the Spring Boot application and see how this producer works. Well, you can see in the console — we haven't started the Kafka broker. If the Kafka broker is not running on the local machine, you will get this kind of warning message. So let's go and run the ZooKeeper service as well as the Kafka broker. Let me open the terminal and start the ZooKeeper service as well as the Kafka broker service. Well, I am using a Mac, so I am going to open the Terminal; if you are using Windows, make sure you open the Command Prompt. Let me slightly maximize and zoom it. Okay, we have Kafka in the Downloads folder, so let me first go into the Downloads folder and then into the Kafka folder, and then we need to execute the command. In order to run ZooKeeper we need to trigger this command, that is bin/zookeeper-server-start.sh, and then we need to pass the properties file. Just hit Enter — now our ZooKeeper service is up and running on the local machine. Similarly, let's go ahead and start the Kafka broker service. In order to do that, open a new shell (Shell → New Window); if you are using Windows, make sure you open a new Command Prompt. Let me maximize and zoom it a bit, and let me go into the Kafka folder. Perfect, and now let me run the command to start the Kafka broker.
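The CommandLineRunner wiring described at the start of this passage might look like this (a sketch with assumed class and package names):

```java
package net.javaguides.springboot;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringBootProducerApplication implements CommandLineRunner {

    @Autowired
    private WikimediaChangesProducer wikimediaChangesProducer;

    public static void main(String[] args) {
        SpringApplication.run(SpringBootProducerApplication.class, args);
    }

    // Runs once the application context has started
    @Override
    public void run(String... args) throws Exception {
        wikimediaChangesProducer.sendMessage();
    }
}
```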
We need to run this command, bin/kafka-server-start.sh, and then we need to pass the server.properties file. All right, hit Enter — now our Kafka broker is up and running on port 9092. In order to confirm whether the Kafka broker is running or not, you can just look at this log: "Recorded new controller, from now on will use broker localhost:9092". Once you get this log, you can be sure that your Kafka broker is up and running on port 9092. All right, now let's go back to our Spring Boot project. Let's rerun the Spring Boot application and see how this producer works. So let me start the Spring Boot application — and again it has some error, so let me stop the Spring Boot project and check what the error is. It says "caused unexpected error" — so what is the error? Let me see: "failed to construct Kafka producer". Well, there must be some issue in the Kafka configuration, so let me go to the application.properties file and check what is missing. Here you can see "localhost" — this should be "localhost", right? So there is a typo; let me fix it and save this file. Now let's run the Spring Boot application and see how this works. And there we go — you can see in the console that event data is being retrieved from the source. We can see the log here, "event data": in our WikimediaChangesHandler we logged this statement, right? "event data" followed by the message. It means that the Kafka producer that we have written to retrieve real-time Wikimedia event data from Wikimedia is working as expected. You can see a lot of data is being retrieved — there is basically a huge amount of real-time Wikimedia stream data. Okay, let me stop this, otherwise it will run indefinitely; let me stop the server. Now what we'll do is verify whether the Kafka producer that we have written has actually sent the real-time stream data to the Kafka topic. In order to verify that, we can trigger a command from the command line and check.
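For reference, the two startup commands used above look like this (assuming Kafka was extracted into ~/Downloads/kafka; on Windows, use the equivalent .bat scripts under bin\windows):

```shell
cd ~/Downloads/kafka

# Start ZooKeeper (first terminal)
bin/zookeeper-server-start.sh config/zookeeper.properties

# Start the Kafka broker on port 9092 (second terminal)
bin/kafka-server-start.sh config/server.properties
```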
Well, basically from the next lecture onwards we'll create a Kafka consumer to read the real-time stream data from the Kafka broker, but in order to verify quickly we can check from the command line. So go to the terminal and open a new shell. Let me slightly zoom it out, go to the Downloads folder and then the Kafka folder, and run the command, that is bin/kafka-console-consumer.sh, and we need to pass the topic name, right? So let me quickly change the topic name for this command: right now the topic name is javaguides_json, but we have given the topic name as wikimedia_recentchange. Okay, hit Enter. And now here you can see that the consumer is reading the real-time stream data from the Kafka topic, isn't it? Now let's go ahead and run our Spring Boot project and see how the Kafka producer retrieves the real-time stream data from Wikimedia and writes it to the Kafka topic, and then we will watch this terminal. So let's go to the Spring Boot project and run it — and you can see the Kafka producer starts reading the real-time stream data from Wikimedia. Similarly, if you go to the terminal, you can see here, all right, the consumer is reading the events, the real-time stream data, from the Kafka topic, isn't it? You can take a look at the console as well as the terminal: as soon as the Kafka producer reads the real-time stream data from Wikimedia and writes it to the Kafka topic, this console consumer reads that data from the Kafka topic and prints it here in the terminal, isn't it? It means that the Kafka producer that reads the real-time stream data from Wikimedia is working as expected. In the next section of lectures we'll start implementing the Kafka consumer to read this data from the Kafka topic.
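The console-consumer command used for this quick check might look like this (assuming the broker runs on localhost:9092 and the topic name given in the tutorial):

```shell
cd ~/Downloads/kafka

# Print events from the topic as they arrive
bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic wikimedia_recentchange \
  --from-beginning
```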
Then we'll see how we can write that data to the MySQL database. All right, I will see you in the next section of lectures. Hi, welcome back. In the previous couple of lectures we have seen how to implement a Wikimedia Kafka producer, which basically reads real-time event stream data from Wikimedia and writes it to the Kafka broker, right? In the upcoming lectures we'll see how to implement a Kafka consumer, which will consume that data from the Kafka broker and write it to the MySQL database. Okay, in this lecture we create a Spring Boot project for the Kafka consumer, and from the next lecture onwards we'll start implementing the Kafka consumer to consume data from the broker, and we'll also see how to store that data in the MySQL database. Well, let's head over to IntelliJ IDEA. Here we have the parent project, and within it we have the module kafka-producer-wikimedia. So let's create one more module: right-click on the parent project, New → Module, tick this option, click Next, give the module name as kafka-consumer-database, something like this, and click Finish. This will create the kafka-consumer-database module within the parent project. As soon as we create a module, it adds one entry in the pom.xml of the parent project — if you go to the pom.xml of the parent project, you can see one entry added in the modules section, right? So go ahead and click on this "Load Maven Changes" icon. Okay, now let's make the necessary changes to turn this module into a Spring Boot project. First of all, go to the pom.xml file, add the packaging type as jar, save it, and click on the "Load Maven Changes" button. Next, let's create a main entry point class for this project: right-click on the java folder, New → Package, and give the package name as
net.javaguides.springboot; hit Enter. Within this package we are going to create the Spring Boot main entry point class: right-click on the package, New → Java Class, give the class name as SpringBootConsumerApplication, and hit Enter. Let's make this class the Spring Boot main entry point class: in order to do that, annotate it with the @SpringBootApplication annotation, then create a main method within this class, and within the main method call SpringApplication.run, passing the SpringBootConsumerApplication class. All right, perfect — now we have created the Spring Boot main entry point class, and this module becomes a Spring Boot project. Okay, now, before we create the consumer, let's build this project and verify whether the setup is correct or not. In order to do that, go to Maven here, click on the "Execute Maven Goal" icon, and run the mvn clean install command. And there we go: well, look — the spring-boot-kafka-real-world-project is the parent project, and within it we have kafka-producer-wikimedia and kafka-consumer-database, the two modules within this parent project, and all these projects are BUILD SUCCESS. It means the module that we have just created under this parent project is working as expected. Okay, great. In the next lecture we will configure the Kafka consumer in the application.properties file. All right, I will see you in the next lecture. Hi, welcome back. In the previous lecture we created a Spring Boot project for the Kafka consumer; in this lecture we will configure the Kafka consumer in the application.properties file. So let's go to IntelliJ IDEA and expand this kafka-consumer-database module. Okay, make sure that you are in the right project: go to main, and within main we have the resources folder; within the resources folder we are going to
create the application.properties file. So right-click on this folder, New → File, and call this file application.properties; hit Enter. Within this application.properties file we are going to configure the Kafka consumer details. First we need to configure the Kafka broker URL, that is the Kafka broker address: just type the property spring.kafka.consumer.bootstrap-servers and set it to localhost:9092. Well, our Kafka broker is running on port 9092, right? That is what we have mentioned here. Next we need to configure the consumer group that the consumer belongs to. In order to do that, add the property spring.kafka.consumer.group-id and give it some unique group name, say myGroup, something like this. Okay, now we have provided the consumer group this Kafka consumer basically belongs to. Next we need to provide the offset behavior: type the property spring.kafka.consumer.auto-offset-reset and give earliest as the value. Next we need to configure the deserializer classes, right? Basically we need to configure two deserializer classes, one for the key and another for the value. Just type the property spring.kafka.consumer.key-deserializer with the value org.apache.kafka.common.serialization.StringDeserializer — this is the deserializer class that deserializes the key of a message. Let's do the same for the value: copy this property, paste it here, and change key to value. Okay, now we have configured the deserializer classes for key and value; the Kafka consumer will use these deserializer classes to deserialize the key and value of a message. All right, great. In the next lecture we'll implement the Kafka consumer to read the real-time stream data from the Kafka topic. All right, I will see you in the next lecture.
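Put together, the consumer's application.properties might look like this (a sketch using the group id chosen in the tutorial):

```properties
spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=myGroup
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
```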
Hi, welcome back. In the previous lecture we configured the Kafka consumer in the application.properties file, right? In this lecture we'll implement the Kafka consumer to consume data from the Kafka topic. Well, let's head over to IntelliJ IDEA and go to the main package — make sure that you are in the right project; we are going to create the Kafka consumer in the kafka-consumer-database project. So go to the main package over here, right-click on it, New → Java Class, and give the class name as KafkaDatabaseConsumer, something like this; hit Enter. Let's annotate this class with the @Service annotation — this will make the class a Spring bean. Then let's go and create a logger instance: private static final Logger (the Logger from SLF4J), then the variable name, then LoggerFactory.getLogger, passing the class name, that is KafkaDatabaseConsumer.class. Perfect, now we have created the logger instance. Next, create a method: public void, with the method name consume, and annotate this method with @KafkaListener — well, make sure that you choose the KafkaListener annotation from the org.springframework.kafka.annotation package. Next we need to pass the topic and group id to this annotation: just use the topics attribute and pass the topic name — well, we have given the topic name as wikimedia_recentchange, right? So this is the topic name we have given. Next we also need to provide the group id — the group id that we have given is myGroup, right? So use the groupId attribute here and pass myGroup. Okay, perfect. If you go to the application.properties file, we have given the group id as myGroup, right? Let's go back to our consumer class, and here you can see that this consume method needs a parameter of type String, say eventMessage. Perfect, now let's log this message.
In order to do that, let's use the logger instance with its info method, and use the format method provided by String to format the string: String.format, then the message, let's say "message received", then a placeholder, and then pass eventMessage. Well, here is a typo — this should be eventMessage, "event message received", something like this. Perfect, now we have implemented the consumer to consume data from this particular Kafka topic. Okay, now let's run this Kafka consumer and see how it works. In order to run it, we just need to run this Spring Boot project: go to the main entry point class, all right, and just run the project, and let's see how this Kafka consumer reads the data from the Kafka topic. And there we go — you can see in the console "event message received". This is the log statement that we wrote in the Kafka consumer, right? So let me go to the consumer, and here you can see the statement "event message received"; and in the console you can see "event message received". It means that the Kafka consumer that we have written to consume data from the topic is working as expected, and you can see the event data. Now let's go ahead and run the Kafka producer and the Kafka consumer together and see the result. In order to run the Kafka producer, go to the kafka-producer-wikimedia project, okay, and run the project via its main entry point class. Now the Kafka producer starts reading the real-time stream data from Wikimedia, right? Now, in parallel, let's run the Kafka consumer: well, in order to run the Kafka consumer, go to the kafka-consumer-database project, go to the main entry point class, that is SpringBootConsumerApplication, and just run this project. You can see two tabs in the console: the first tab is the Spring Boot producer application, the second tab is the Spring Boot consumer application, and you can see the logs of the Spring Boot consumer application.
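The listener implemented above might be sketched like this (assuming the topic and group id configured earlier, and an illustrative package name):

```java
package net.javaguides.springboot;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaDatabaseConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaDatabaseConsumer.class);

    // Invoked for every record that arrives on the topic
    @KafkaListener(topics = "wikimedia_recentchange", groupId = "myGroup")
    public void consume(String eventMessage) {
        LOGGER.info(String.format("Event message received -> %s", eventMessage));
    }
}
```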
You can see the "event message received" entries, okay? So in parallel you can see both the Kafka producer and Kafka consumer results here: well, in the case of the Kafka producer you can see the log "event data" followed by the event, and in the case of the consumer you can see the log "event message received" followed by the event. Okay, it means that the Kafka producer and Kafka consumer are both working as expected. Okay, great. In the next lecture, what we'll do is configure the MySQL database in our consumer project, and we will see how to save these events into the MySQL database. All right, great, I will see you in the next lecture. Hi, welcome back. In the previous lecture we implemented the Kafka consumer to consume the data from the Kafka topic; next we need to store that data in the database, right? For that, in this lecture we will configure the MySQL database in the Spring Boot project, so that our Spring Boot application can connect to the MySQL database. All right, so let's head over to IntelliJ IDEA, go to the kafka-consumer-database project, and go to the resources folder; within the resources folder we have the application.properties file, and within this application.properties file we are going to configure the MySQL details. Well, before that, first we need to create a database. So let's go to MySQL Workbench, and here just type the SQL statement CREATE DATABASE followed by the database name, let's say wikimedia — so wikimedia is going to be our database name. Let's run this SQL statement and refresh the schemas, and there we go, the wikimedia database is created. So once we create the database, next we need to add the required dependencies to our Spring Boot project: basically we need to add the Spring Data JPA dependency as well as the MySQL JDBC driver. In order to get these dependencies, we can go to the start.spring.io website, that is the Spring Initializr website, go to the "Add dependencies" section over here, and just type "spring jpa" — you will get the option Spring Data
JPA, so go ahead and choose Spring Data JPA. Next we need the MySQL JDBC driver, right? So just type "mysql" and you will get MySQL Driver over here; just choose it. Now we have these two dependencies. Okay, so whenever you want to connect your Spring Boot application to a MySQL database, you have to add these two dependencies. All right, Spring Data JPA by default uses Hibernate as the JPA provider; we use Spring Data JPA to simplify the data access (repository) layer, and we use the MySQL driver to connect our Spring Boot application to the database — the database being a MySQL database. Go to the "Explore" option here, go to the pom.xml, and just go ahead and grab these two dependencies: copy these two dependencies, go to IntelliJ IDEA, and go to the pom.xml of the kafka-consumer-database project — okay, make sure that you are in the right project. So go here into the pom.xml, create the dependencies section, and just paste these two dependencies. Perfect, and click on this "Load Maven Changes" icon over here. Now we have added the required dependencies. Okay, now let's go to the application.properties file and configure the MySQL database. Okay, so let me go to the resources folder; within the resources folder we have the application.properties file, and within this application.properties file we are going to configure the MySQL details. So just type the property spring.datasource.url=jdbc: — we are going to configure the JDBC URL to connect to the MySQL database, right? So jdbc:mysql:// and then localhost — well, we are connecting our Spring Boot application to a local MySQL database, hence localhost — followed by the port of the MySQL server, that is 3306, followed by the database name, that is wikimedia. Then let's configure the database username and password: type the property spring.datasource.username=root. Well, in my case I have given the database username as root.
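The two dependencies grabbed from Spring Initializr look roughly like this in the pom.xml (a standard fragment; versions are managed by the Spring Boot parent, and depending on your Spring Boot version the driver artifact may instead be com.mysql:mysql-connector-j):

```xml
<dependencies>
    <!-- Spring Data JPA (uses Hibernate as the JPA provider) -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <!-- MySQL JDBC driver -->
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <scope>runtime</scope>
    </dependency>
</dependencies>
```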
But make sure that whatever database username you have given, you mention it here. Similarly, let's configure the database password: spring.datasource.password= — well, whatever database password you have given to your database, make sure that you give it here. After that we need to configure a couple of Hibernate properties. Let's first configure the Hibernate dialect: just type the property spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL8Dialect — make sure that you choose the latest dialect; MySQL8Dialect is the latest dialect as of now. This dialect is required by Hibernate because Hibernate will refer to it to create the SQL statements for the respective database vendor — in this case MySQL is the target database vendor, right? So based on this dialect, Hibernate will basically create the SQL statements. Next, let's configure one more Hibernate property to automatically create the database table: spring.jpa.hibernate.ddl-auto=update — this will create the database table automatically. Well, we also want to see the Hibernate-generated SQL statements in the console, right? For that, let's configure a few more properties over here: spring.jpa.properties.hibernate.show_sql=true, which will basically show the SQL statements generated by Hibernate in the console; spring.jpa.properties.hibernate.use_sql_comments=true, which will basically print the comments; and spring.jpa.properties.hibernate.format_sql=true, which will basically tell Hibernate that whatever SQL statement it prints in the console should be well formatted. All right, so these are the few properties that we have to configure in order to connect our Spring Boot application to the MySQL database.
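The resulting datasource and Hibernate configuration might look like this (a sketch; replace the username and password with your own MySQL credentials):

```properties
spring.datasource.url=jdbc:mysql://localhost:3306/wikimedia
spring.datasource.username=root
spring.datasource.password=<your-password>

spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL8Dialect
spring.jpa.hibernate.ddl-auto=update
spring.jpa.properties.hibernate.show_sql=true
spring.jpa.properties.hibernate.use_sql_comments=true
spring.jpa.properties.hibernate.format_sql=true
```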
Now let's run the Spring Boot application and verify whether our Spring Boot application is able to connect to the MySQL database or not. So go to the main entry point class — make sure that you are in the right project: open this kafka-consumer-database project, go to the main entry point class of this project, that is SpringBootConsumerApplication, and run this class. And you can see there are no errors in the console; it means we have successfully configured the MySQL database in our Spring Boot project. Okay, great. In the next lecture we will write the code to save the data into the MySQL database. All right, I will see you in the next lecture. Hi, welcome back. In the previous lecture we configured the MySQL database in our Spring Boot project; in this lecture we will see how to save the Wikimedia data into the MySQL database. Well, we have already written a Kafka consumer that consumes the data from the topic; next we need to save that data into the MySQL database, right? So in this lecture let's see how to save that Wikimedia data in a database table. Let's head over to IntelliJ IDEA and quickly create a JPA entity to store the records in the database table. We'll go to the kafka-consumer-database project, go to the main package, right-click, New → Package, and give the package name as entity; hit Enter. Within the entity package we are going to create a class; let's give the class name as WikimediaData, something like this; hit Enter. And let's create fields like private Long id and private String wikiEventData. All right, let's keep these two fields. Now let's annotate this class with the @Entity annotation — in order to make this class a JPA entity we have to annotate it with the @Entity annotation from JPA. Okay, next let's use the @Table annotation in order to provide the table details: let's give the table name something like wikimedia_recentchange. Next, let's add the
@Id annotation to make this field the primary key, and let's also use the @GeneratedValue annotation to provide the primary key generation strategy — let's give IDENTITY. And here we need to provide the @Lob annotation, because the event data is quite huge, right? So in order to store large data we can use the @Lob annotation. Okay, great. Now we need to create getter and setter methods, right? We have already added the Lombok library, so we can leverage the Lombok-provided annotations to automatically create the getter and setter methods. Here I am going to use the @Getter annotation from the Lombok library to create getter methods for these fields, and also choose the @Setter Lombok annotation. Okay, so these two annotations will basically create getter and setter methods for these two private fields. Okay, great. Now what we'll do is create a JPA repository, that is a Spring Data JPA repository, for this JPA entity: go to the main package, right-click, New → Package, give the package name as repository, and then within this repository package we are going to create an interface; let's call it WikimediaDataRepository. Perfect. Let's extend this interface from the JpaRepository interface, passing the first type argument as the entity type, that is WikimediaData, followed by Long as the second type argument (the primary key type). Okay, perfect. Now we have created the Spring Data JPA repository; it will basically give us CRUD methods to perform database operations on the given entity. Now let's go to the Kafka consumer class, that is KafkaDatabaseConsumer; here we'll inject the Spring Data JPA repository and then call the respective method to save the event data. So here, just declare the Spring Data JPA repository, that is WikimediaDataRepository; let's call it dataRepository, and let's use constructor-based dependency injection — let me generate the constructor over here. Okay, we don't need to add the @Autowired annotation, because this Spring bean contains only one parameterized constructor.
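The entity and repository described above might look like this (a sketch; the two types would normally live in separate files under the entity and repository packages, and Lombok's @Getter/@Setter generate the accessors — on Spring Boot 3.x the imports would be jakarta.persistence instead of javax.persistence):

```java
// entity/WikimediaData.java
import javax.persistence.*;
import lombok.Getter;
import lombok.Setter;

@Entity
@Table(name = "wikimedia_recentchange")
@Getter
@Setter
public class WikimediaData {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // Event payloads are large, so store them in a LOB column
    @Lob
    private String wikiEventData;
}

// repository/WikimediaDataRepository.java
import org.springframework.data.jpa.repository.JpaRepository;

// Inherits save(), findAll(), deleteById(), etc. for WikimediaData
interface WikimediaDataRepository extends JpaRepository<WikimediaData, Long> {
}
```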
Now we have injected the WikimediaDataRepository. Okay, now let's go and call its save method to save the event message: so here I will create an object of the WikimediaData JPA entity, new WikimediaData(), and then I am going to set the data, that is the event message; next call dataRepository.save and pass the wikimediaData object. Okay, perfect. Now, what have we done? We have injected the WikimediaDataRepository and then called its save method to save this WikimediaData object. Okay, great. Now let's go ahead and run the Spring Boot project and see how the data is stored in the MySQL database. We'll go to the main entry point class, that is SpringBootConsumerApplication, and from here just run the Spring Boot project — and you can see the Kafka consumer is running. Similarly, let's go ahead and run the producer: go to the Kafka producer project, that is kafka-producer-wikimedia, then go to the main entry point class, that is SpringBootProducerApplication, and just run this project. So let me select the Spring Boot producer application over here and just start it. And here you can see two tabs, Spring Boot producer application and Spring Boot consumer application. The Spring Boot producer application basically retrieves the real-time stream data from Wikimedia, right? And here you can see its logs in the console; and in the Spring Boot consumer application you can see the events being stored in the database — you can see the statements, right? INSERT INTO wikimedia_recentchange, with the values being passed. So let me stop both the instances, and we can see the log over here: you can see the insert statement, INSERT INTO this table, and this is the data, isn't it? It means that we are successfully storing the Wikimedia data in the database. So let's go to MySQL Workbench, and here let's refresh the schemas, go to Tables, and select the rows from the table.
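With the repository injected, the final consumer might look like this (a sketch combining the logging and save steps described above):

```java
package net.javaguides.springboot;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaDatabaseConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaDatabaseConsumer.class);

    private final WikimediaDataRepository dataRepository;

    // Single constructor, so @Autowired is not required
    public KafkaDatabaseConsumer(WikimediaDataRepository dataRepository) {
        this.dataRepository = dataRepository;
    }

    @KafkaListener(topics = "wikimedia_recentchange", groupId = "myGroup")
    public void consume(String eventMessage) {
        LOGGER.info(String.format("Event message received -> %s", eventMessage));

        // Wrap the raw event in an entity and persist it
        WikimediaData wikimediaData = new WikimediaData();
        wikimediaData.setWikiEventData(eventMessage);
        dataRepository.save(wikimediaData);
    }
}
```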
And you can see that the Wikimedia event data is successfully stored in this table. Well, let me recap what we have done so far: we have created a multi-module Maven project, and within it we have created two more projects, one for the Kafka producer and another for the Kafka consumer. Well, we created the Kafka producer project to implement a Kafka producer that reads the real-time stream data from Wikimedia and writes that data to the Kafka topic, and then we created the Kafka consumer project to consume the real-time stream data from the topic and write the data to the MySQL database. I hope you understood how to use Apache Kafka as a broker to exchange messages between a producer and a consumer in a Spring Boot project.
Info
Channel: Java Guides
Views: 103,852
Keywords: spring boot, kafka, microservices, java, java guides
Id: TkhU8d-uao8
Length: 79min 35sec (4775 seconds)
Published: Sun Jun 12 2022