How To Load Test Apache Kafka With LoadRunner

Captions
This episode is sponsored by OpenText, the Information Company. We power and protect information. Our purpose is to elevate every person and every organization to gain the information advantage. OpenText works to power the ubiquitous information around us and to elevate everyone. We strive to help customers simplify their systems with frictionless automation and equip them to thrive and grow in the digital world. The LoadRunner family is one of many solutions where OpenText can help. Did you know that LoadRunner has the largest community of load testing professionals in the world? Check out the QR code or go to opentext.com to find out more about the LoadRunner family.

Hello, it's time for the SMC Journal podcast. This is the show that is all about making software better, faster, and more efficient. How do we monitor it? How do we scale it? How do we make sure it performs well? Is it secure? All of these things we cover on this show. I'm Scott Moore, your host, and I appreciate you being with me.

A couple of things you'll probably notice: we're back in our regular studio, and I'm so glad to be back. I've been on the road for quite some time, and I'm going to hit the road again on the performance tour; we've got more interviews to do, live events, etc. I can't stand still too long, but I am glad to be back in the regular studio. I do notice that Agent 503 is going to need a little bit of a refresh, so expect him to change in the next episode or so.

A couple of shout-outs before we really begin. I want to thank FluxNinja for sending me these handy, flashy socks, which I found in the mail when I got here. FluxNinja was on the podcast several episodes ago. I will be showing the socks off in the near future. They also sent me a carry bag, which I put on the Wall of Swag because it matched the lampshade so well
that they get a free little plug here. Now, if you'd like to get on the Wall of Swag, you can send me a private message and we can talk.

Another thing that came in the mail was this book: not a book on testing, but the book on testing, by Alex Rodolph of QA Consultants. I've done a lot of work with QA Consultants; Brian Burknoff and James Pulley have been on the show quite a few times. I want to read this book and am looking forward to it. I'm also hoping for an autograph from Alex when we meet in person, which I hope is very soon. So thank you, everybody, and let's talk about today's subject: Apache Kafka.

Some of you have already experienced Apache Kafka because you work in one of the Fortune 500 companies, and probably half of them use Kafka. Some of you are wondering: what is it? It is a platform for event streaming. "Well, that's great, Scott, but what is event streaming?" According to the documentation from Apache, you can think of Kafka as a central nervous system for your software. You have servers that talk to their clients, but if you want something that spans all your applications to bring everything together and provide real-time data, so you can make decisions, monitor, or act based on that data, you need this event streaming platform.

There are plenty of use cases for it. Some of the most popular: if you are getting an Uber ride, how does Uber know where those cars are at any given time? They're constantly changing their position, their speed, and their direction. How does Uber know who's near you? And once they start sending messages on their platform between you and the Uber driver that's been selected, you're getting updates on how close they are. That's a lot of real-time information that needs to go back and forth; it needs to be scalable, and it needs to come back in a short period of time. Same thing for Netflix: you watch a movie, and after you're done with
the movie, because you watched a Sasquatch movie, Netflix figures you would like the Sharknado movies, right? How do they know you just watched that? They're giving you real-time updates and recommendations based on it; that's how they pull this off. So the big companies use it, and it needs to be scalable; that's hint number one. These messages need to come back with very low latency, meaning 10 milliseconds or less, and that's another red flag. So if there ever was a use case for load testing, stress testing, and performance testing, Kafka would be one of them.

So how do you approach load testing an event streaming platform, where you have something publishing information and messages all the time, but you also have consumers of that information across your entire enterprise? How do you even do that? Well, that's the topic today. Our sponsor OpenText, the owners of the LoadRunner family, have a solution for that. LoadRunner recently added a new protocol just for Kafka. It includes not only the actual recording and functions to create a script against a Kafka application, but also a Kafka monitor that tracks the producers, the throughput, the message rates, and so on. So you can use LoadRunner the way you have done load testing for web applications, Citrix, or any other applications in the past, but specifically for Kafka. It can be done, and I would like to show you a demo of how that actually works on this podcast. We've got a demo application, and we're going to walk through an eight-to-ten-minute demo of how to use VuGen, the script recorder, and LoadRunner to create the script and play it back, and then how to run it within a load test and monitor the load profiles. So I'm going to show this demo, and then we'll come back with some references for you as well.
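To make the producer/consumer vocabulary concrete before the demo, here is a minimal, broker-free sketch in Python. This is an in-memory stand-in for a Kafka topic, not the Kafka client API; all names are illustrative:

```python
class InMemoryTopic:
    """Toy stand-in for a Kafka topic: producers append, consumers subscribe."""
    def __init__(self, name):
        self.name = name
        self.messages = []        # the append-only log
        self.subscribers = []     # one callback per consumer

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, message):
        self.messages.append(message)
        for callback in self.subscribers:
            callback(message)     # fan out to every consumer

# A "scanner" producer and a "dashboard" consumer, as in the demo
topic = InMemoryTopic("scanner-events")
received = []
topic.subscribe(received.append)  # the dashboard just collects messages

topic.publish({"order": 1001, "truck": 7, "state": "CO", "pallet": 3})
topic.publish({"order": 1002, "truck": 2, "state": "TX", "pallet": 1})

print(len(received))  # 2: both events reached the consumer
```

The key property this toy shares with Kafka is decoupling: the producer never talks to the dashboard directly, only to the topic.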
Let's talk through the flow first. We have a topic, and data is sent to that topic for different applications to subscribe to. A data producer might be a warehouse scanner; in fact, we're going to use exactly that for our demo. The producer sends data to a topic, and then a consumer, in our case VuGen, captures the messages and deserializes them, which means deciphering, understanding, and parsing each message so it becomes something the application can process. This is done through a recording process, just like any other VuGen protocol. On playback, however, you still have a topic, but now the Virtual User Generator becomes a producer: it serializes the messages, in other words it packages them up in a way that's understandable to the consumer, and sends them out to the topic. In this case, the consumer will be a warehouse dashboard, which again we'll use for the demo.

Let's start by describing the demo environment. I have a Kafka server running in the background, and I also have an application comprised of two major components. The first part is a producer that generates messages out to a topic; think of it as a client, in this case a scanning application that scans a product in a warehouse. Each message includes an order number, a truck number, a destination state, and the pallet where the product will be placed. So that side is producing messages, but the consumer on the other end is a dashboard. The dashboard takes the messages on that topic and processes them: it provides a status, an assigned route, an assigned driver, and an expected date of arrival. For example, if I generate a message from the scanning application on the left-hand side, then when I refresh the screen, I will see the message received by the dashboard.
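The serialize/deserialize step described above is easy to picture with the demo's scanner message. Here is a hedged sketch using plain JSON over strings, standing in for whatever serializer classes the real configuration names; the field names are taken from the demo:

```python
import json

def serialize(message: dict) -> bytes:
    """Producer side: package a message so a consumer can understand it."""
    return json.dumps(message).encode("utf-8")

def deserialize(payload: bytes) -> dict:
    """Consumer side: take the wire bytes apart into usable fields."""
    return json.loads(payload.decode("utf-8"))

# A scanner event like the one in the demo
event = {"order": "ORD-1001", "truck": "T-07", "state": "CO", "pallet": 3}

wire = serialize(event)          # what a producer puts on the topic
decoded = deserialize(wire)      # what a consumer (or VuGen, on record) reads back

assert decoded == event          # the round trip is lossless
print(decoded["state"])          # CO
```

On record, VuGen performs the deserialize half; on playback, it performs the serialize half.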
It will show the route, the warehouse status, the driver (Lisa, in this case, will be driving the truck), and the expected arrival in May of 2023.

Part of this, of course, is capturing those messages, that traffic, as it crosses the wire from producer to consumer across the various topics, and for that we're going to use the Virtual User Generator (VuGen) from LoadRunner. Let's create a new script using VuGen, and the script I'm going to create, of course, is a Kafka script. This is a capture-and-replay type of operation, so it's very simple. Here's our plain script; I'm going to start recording, and there are a couple of things I want to point out right away.

The first is the configuration files. The configuration holds things such as the server information and the different serializers and deserializers, in other words, the way that Kafka is going to recognize the type of data and be able to decipher the data coming across. We can point to a file we've already created, or, as you'll see shortly, we can define the file within the recording options. The next thing is the different topics: we have a scanner and we have a dashboard. Then we have router events, messages that pertain to the routing of the products in our warehouse, and scanner events, the events related to the scanning activity by the employees in the warehouse.

Let's look at some of the recording options. We have a logging option, where we can set different levels of logging, brief or extended, if we're debugging the script. Then we have code generation: where do you want the configuration to reside once we create the script? Probably the simplest option is to save it alongside the script as a new file, or we can insert it directly into the script, in other words, have these statements generated to show the operations within the script
itself. But I like to keep things simple, so I have a file named "scanner" that will include the information; you'll notice that it's for the producer in this case.

The next thing I want to show is the configuration files. Here's the config I created earlier. It includes information such as the bootstrap server, which is basically your Kafka server. Then you have the deserializer, in other words, how we take data apart and recognize it, because Kafka can work with a wide range of data, from plain text to JSON and more complex types. This entry names the Java class that will be able to deserialize and make sense of that data from a consumer perspective. The other entry is the serializer, which is the opposite operation: how do we package this data so it makes sense to the consumer when we send these messages out to the different topics? Then we have a couple of other options, things like the virtual machine settings for Java and the classpath for any additional libraries and JAR files the script needs to work. So it's very simple and logically laid out, but also very powerful.

Once we have these options set, it becomes a simple capture-and-record operation. When we hit Record, the window minimizes and we get the VuGen recording bar, where you can see the events. Here's our application: we have a client, the scanner application, sending messages to a topic, and of course our dashboard that's going to consume them. Let's select a couple of items here: we're going to select a truck and a destination state (this one is going to Colorado), and I'm going to assign a pallet. As soon as I click Send, you see the messages being sent to the topic, and VuGen is picking them up. In this case, VuGen is acting as a consumer.
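For reference, a minimal Kafka client configuration of the kind described above might look like the following. This is an illustrative .properties sketch using Kafka's standard property names, not the exact file from the demo:

```properties
# Kafka server to connect to (the "bootstrap server")
bootstrap.servers=localhost:9092

# Consumer side: classes that take the wire data apart
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer

# Producer side: classes that package data for the topic
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
```

For JSON or more complex payloads, the String classes would be swapped for an appropriate serializer/deserializer pair on both sides.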
As a consumer, VuGen reads those messages, and, if you recall, we have the deserializer defined so it knows how to take the data apart; in this case, of course, it's very simple data. We see the information reflected in our dashboard, and we see it reflected as an event. Let's send a few more messages so we have some good data to work with; in fact, we'll click Send a couple more times so we have a nice list of orders and statuses. Now let's stop the recording and look at the data that was generated.

We have a script just like any other script. We see the different actions on the left-hand side, along with an init section and an end section. So even though this is a Kafka protocol script, it still behaves like any other virtual user: we have runtime settings, parameterization, and different actions and functions you can define. More importantly, here is your script, and here are the different messages. Notice the deserialize function: it took each message apart and read its internal parts. We know it's string data, and we can see the order number, the truck that was assigned, the destination state, and the pallet. These are all values that can be parameterized, and it's all directly within the script.

Playing this back is very simple, just like any other VuGen script: I click Run, and it regenerates those messages as it plays them back. The key here is that VuGen is now acting as a producer. Before, we were acting as a consumer: reading data, subscribed to a particular topic, and listening to the messages as they hit the topic. Now we're on the opposite side, acting as a producer: we are essentially regenerating the data that we captured in the script.
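The parameterization idea mentioned above, replacing the captured literal values with variables so each replayed message differs, can be sketched like this. This is plain Python standing in for VuGen's parameter substitution; the field names come from the demo, everything else is illustrative:

```python
import itertools
import random

order_numbers = itertools.count(1001)  # unique order number per message
states = ["CO", "TX", "NY", "WA"]      # a value list, like a VuGen parameter file

def next_message():
    """Fill the captured message shape with fresh values for each iteration."""
    return {
        "order": f"ORD-{next(order_numbers)}",
        "truck": f"T-{random.randint(1, 20):02d}",
        "state": random.choice(states),
        "pallet": random.randint(1, 8),
    }

# Three replayed messages, each with a unique order number
batch = [next_message() for _ in range(3)]
print([m["order"] for m in batch])  # ['ORD-1001', 'ORD-1002', 'ORD-1003']
```

Without this substitution, every virtual user would replay identical recorded values, which rarely reflects production traffic.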
As a matter of fact, if we go to our dashboard, clear it, and rerun our script, we should see those messages regenerated; when I click Refresh, you see that the data is back. The playback is bypassing the client, the scanner in this particular case, and publishing the messages directly to the topic, serializing them as it goes. I need to stress that, because VuGen needs to know what to do with the data; that's where the serialization comes into play.

Now that we have a working script, the next step is to take this single virtual user and create a Controller scenario. Again, this just happens to be a Kafka script; it behaves like any other virtual user. I'm going to go back and create a Controller scenario; we're going to run, let's say, ten virtual users, and we'll just leave the group name as "Kafka". We click OK, which brings up the Controller and generates the scenario for us. Before we kick this off, I want to show you three monitors that are specific to Kafka, which we're going to add to the scenario: Kafka producer duration, Kafka throughput, and Kafka message rate, that is, how many messages we're pumping through the application. Let's kick this off and watch what happens. We click Start and let the graphs populate as the scenario runs. We're starting to see some good data being built by the different monitors: the number of running virtual users, the ten we specified slowly ramping up, and the transaction response time, just as in any other typical performance test. But we also have the various Kafka graphs, specific to the Kafka application. And as this runs, I want to show you our consumer, the dashboard; remember, the running script is now the producer, generating data and sending messages to the various topics.
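The Kafka message-rate and throughput monitors just mentioned come down to simple rate arithmetic over a sampling window. A minimal sketch of the idea, not the monitor's actual implementation:

```python
def message_rate(message_count: int, window_seconds: float) -> float:
    """Messages per second observed over a sampling window."""
    return message_count / window_seconds

def throughput(total_bytes: int, window_seconds: float) -> float:
    """Bytes per second observed over a sampling window."""
    return total_bytes / window_seconds

# Example: 1,500 messages averaging ~200 bytes over a 10-second window
msgs, avg_size, window = 1500, 200, 10.0
print(message_rate(msgs, window))           # 150.0 messages/sec
print(throughput(msgs * avg_size, window))  # 30000.0 bytes/sec
```

Watching these two figures together is useful: a rising message rate with flat throughput suggests messages are shrinking, while flat rate with rising producer duration suggests the broker is becoming a bottleneck.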
When I click Refresh on the dashboard, you can see the sheer amount of data being generated by our performance test.

So, in summary: we created a virtual user by consuming the messages generated by the producer, in this case the scanning application. We captured that data as part of a script by listening to the topic where those messages were sent. Next, we turned things around and had VuGen generate the messages, essentially turning it into a producer, with a consumer on the other end, in our case the dashboard application, capturing those messages. Then we finished by creating a complete load testing scenario, running the test with multiple virtual users and monitoring the Kafka-specific traffic you see on the screen. That's load testing Apache Kafka.

If you want more information about how all this works, it's easy to find in the ADM documentation from OpenText. The first page I recommend is the Kafka protocol documentation for LoadRunner. It describes what the protocol is for: it lets you create a virtual user script to monitor the health of a Kafka deployment and run those scripts. There are several areas to it, such as how to manually create a script in VuGen and what all the functions are; the function reference actually has a dedicated section for Kafka functions. There's also a page for Kafka monitoring: the Apache Kafka monitor watches the producers and the consumers, and you can see the main monitors it provides. Of course, you'd add standard monitors as well for underlying things like the operating system. Then I'm also going to share a couple of videos I found on YouTube; just go to YouTube and
look for "Apache Kafka", and you'll find videos like "Kafka in 5 Minutes" and "Kafka in 6 Minutes" that explain Kafka in more detail than we can here. I'm assuming you are experiencing a need to load test Kafka and have some understanding of it but need a load testing solution; if you don't have that context, these videos will definitely help. Another place to start is the Apache documentation itself, found at kafka.apache.org/documentation. It takes you through the beginning stages: what is event streaming, what do I use it for, and how do I do this? It gives you a breakdown of all that and then walks you through doing more with it.

I thought this was an interesting topic to discuss on today's show, but you may say, "No, it's not." Tell me! I want to hear from you; give me some feedback. You can find me online in many different places: just scan the QR code or go to the link list I have there, and you'll find me on LinkedIn, Twitter, Instagram, and elsewhere. You can also send me a quick email, hey Scott at smcjournal.com, real easy to find, and I would love to hear what you think about this. And if you know about other solutions, I'm sure there are other load testing tools people have used to load test Kafka, since the need existed before this protocol ever came out, how did you load test Kafka in the past, and what were your results?

If you like this kind of content, it would be great if you subscribed to our YouTube channel. You can go directly to it from that URL, scan the QR code, or just search for "Scott Moore Consulting" on YouTube. So until the next episode of the SMC Journal, this is Scott Moore thanking you once again for watching, and we'll see you next time. Bye-bye!
Info
Channel: Software Engineering With Scott Moore
Views: 1,529
Keywords: performancetesting, performanceengineering, softwareengineering, observability, tech, techreview, technews, techtips, techyoutuber, mobile, techie, creators, travel, micro focus, opentext, Saltworks
Id: NEZ_m-yh0eo
Length: 21min 2sec (1262 seconds)
Published: Tue Jul 11 2023