The Ultimate Guide to Using Kafka with Node.js

Captions
What's up guys! In this video you will learn everything you need to know about using Kafka with Node.js. And no, I'm not talking about this guy — I'm actually referring to Apache Kafka, the event streaming platform. I will shortly explain how Kafka works and why you should even use it, how you can use it in combination with Node.js, and finally we will also build a real-time cryptocurrency wallet tracker with Kafka as its backbone. So let's get started.

As you might know, most databases encourage you to store your data as a state which can change over time. Kafka, however, encourages you to store your data as an immutable stream of events. So, for example, instead of storing and updating the current price of a cryptocurrency in a table, you create an event each time the price changes. In Kafka, events are organized into topics, to which producers can write events and from which consumers can read events. Topics can also be split into partitions. For example, a producer might write events with the current price of a cryptocurrency to a price topic. The key of the event, used for partitioning, might be the ticker symbol of the cryptocurrency, and the value might be the current price in US dollars. And best of all, Kafka guarantees that events inside one partition are always read in exactly the same order as they have been written to the partition. Nonetheless, there can be as many producers, consumers, and partitions as you want.

The events of a topic are read by a so-called consumer group, which might have one or many members. If there is only one member, then this member will receive all the events from the topic. If, however, there are multiple members in one consumer group, then the events will be distributed across the members using the topic's partitions, meaning that each partition will be uniquely assigned to one of the members. Hence, you can either use Kafka to build a publish-subscribe system by putting each of your consumers in a unique consumer group — that way, all of the events will be broadcast to all consumers — or you can use Kafka to build a message queue that processes events in parallel, which can be done by putting all consumers into one consumer group. But be aware: the parallel processing only works when your topic has multiple partitions.

Okay, that's nice, but you might ask yourself: why should I even use Kafka? The first point is that thinking of problems in terms of events gives you an entire history of your data, not just the current state. With Kafka you also have the ability to decouple services using an event-driven architecture. If you have multiple services that need to exchange data with each other, the number of connections gets out of hand quickly if every service talks to every other service. In that case, Kafka can serve as the data exchange backbone, drastically reducing the number of connections. And lastly, Kafka can scale to almost any demand: no matter if you are processing one or one million events per second, Kafka can handle it because of its scaling abilities.

So now we are going to dive into the actual topic of this video: using Kafka with Node.js. To interact with Kafka through Node.js, you can use the KafkaJS library. With the help of this library, it's very easy to produce and consume events, so let's build a small example. After you've installed KafkaJS, we will start by creating a Kafka instance; you have to specify at least one Kafka broker to connect to. From there, we can instantiate a producer and connect to it using await. To publish an event to a topic, you call the send method and pass in the topic name, a value, and optionally a key for partitioning. In my case, that's the example topic, a random emoji, and a random username. But we also want to consume the messages, and to do that you simply instantiate a consumer with a group ID. Now we can connect to it, subscribe to our topic, and do something on each message. For this example, I will just log the key and the value of each message.
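As a rough sketch, the whole example might look something like this with KafkaJS. The topic name, the group ID, and the KAFKA_BROKER environment variable are my own placeholders, not necessarily the exact names used in the video:

```typescript
import { Kafka } from "kafkajs";

// One Kafka instance per app; at least one broker address is required.
const kafka = new Kafka({
  clientId: "example-app",
  brokers: [process.env.KAFKA_BROKER ?? "localhost:29092"],
});

const producer = kafka.producer();
const consumer = kafka.consumer({ groupId: "example-group" });

async function run() {
  await producer.connect();
  await consumer.connect();

  // Read the topic from the start and log every message's key and value.
  await consumer.subscribe({ topic: "example-topic", fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ message }) => {
      console.log(`${message.key?.toString()}: ${message.value?.toString()}`);
    },
  });

  // Publish one event; the key (a username) is used for partitioning.
  await producer.send({
    topic: "example-topic",
    messages: [{ key: "some-user", value: "🚀" }],
  });
}

run().catch(console.error);
```

Note that the same app both produces and consumes here purely for demonstration; as mentioned later in the video, these two halves would normally live in separate services.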
To actually run Kafka for our example, we will be using Docker Compose. Usually, running Kafka requires at least one broker instance and at least one ZooKeeper instance, but there is an alternative Kafka implementation called Redpanda, which is much faster and doesn't require ZooKeeper, so it's much easier to set up. We can start Redpanda easily by using this Docker configuration; if you prefer the standard Kafka implementation with ZooKeeper and a broker, you can take a look at the legacy Docker Compose file. To run the example app, we will use ts-node, and we also have to specify the Kafka broker address as an environment variable, which in this case is redpanda:29092. After starting the Docker Compose file, you should see a random username and a random emoji being printed to the terminal.

To make it a little bit more fancy, we can create a simple loop that produces many messages. I would also recommend disconnecting the consumer and the producer before stopping the app, which can be done by listening to the SIGTERM event. Re-running the app should now result in a lot of messages being printed. And as a quick side note: you usually wouldn't produce and consume the same messages in one service. This producer and this consumer could just as well run in different services. In fact, you could have as many services as you want producing and consuming messages simultaneously.

As promised, I will also show you how you can build a real-time cryptocurrency wallet tracker with Kafka as its backbone. That way, you can admire your crypto riches all day while also flexing your developer skills. Isn't that nice? The final application is just a command line tool which displays live updates for the price and the wallet balance of a Bitcoin or an Ethereum wallet. The application consists of five components: a CLI, which displays the balance and the price; a server, which communicates with the CLI via websockets; a balance service, which crawls wallet balances; a price service, which emits live price updates; and finally Kafka, which handles the communication between the backend services.

So let's look at the actual implementation. The CLI application is very simple: it reads the wallet address from the user's input and establishes a bi-directional connection to the websocket server, and each time the balance or the price changes, the console output is updated. The user can also force a wallet balance update by pressing the enter key.

Then we have the price service, which uses the Binance API to get the price data of Bitcoin and Ethereum in real time. Each time a price update is received from the API, a new message is published to the currency price topic. The message key is the ticker of the currency, so either BTC or ETH, and the value is just a JSON object containing the current US dollar value.

The balance service is a little bit more complicated than the price service because it has both a consumer and a producer. The consumer listens for tasks to crawl a wallet's balance, and since we want each task to be executed only once, we choose a specific group ID for the consumer group. By doing this, we effectively have a message queue which is able to distribute those tasks across multiple instances of the balance service. And of course, each time we receive a task, we load the wallet balance — in this case using the BlockCypher API — and publish a message to the wallet balance topic.
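A few sketches before moving on. The Redpanda compose file mentioned in the quickstart above isn't shown in full, but a minimal single-node setup might look roughly like this; the image tag and the start flags are assumptions based on common Redpanda quickstarts, not taken from the video:

```yaml
# docker-compose.yml — minimal single-node Redpanda sketch
services:
  redpanda:
    image: docker.redpanda.com/redpandadata/redpanda:latest
    command:
      - redpanda
      - start
      - --smp=1
      - --overprovisioned
      - --kafka-addr=PLAINTEXT://0.0.0.0:29092
      - --advertise-kafka-addr=PLAINTEXT://redpanda:29092
    ports:
      - "29092:29092"
```

The advertised address is why the app's broker environment variable is redpanda:29092 when it runs inside the same compose network.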
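Likewise, the produce loop and the SIGTERM shutdown from the quickstart could be sketched like this, continuing the earlier KafkaJS example (the one-second interval and the random helpers are my own stand-ins):

```typescript
// Continuing the producer/consumer sketch from above.
const emojis = ["🚀", "🔥", "💎", "🎉"];
const randomEmoji = () => emojis[Math.floor(Math.random() * emojis.length)];
const randomUsername = () => `user-${Math.floor(Math.random() * 1000)}`;

// Produce a message every second.
const interval = setInterval(() => {
  producer
    .send({
      topic: "example-topic",
      messages: [{ key: randomUsername(), value: randomEmoji() }],
    })
    .catch(console.error);
}, 1000);

// Disconnect cleanly before the process is stopped.
process.on("SIGTERM", async () => {
  clearInterval(interval);
  await producer.disconnect();
  await consumer.disconnect();
  process.exit(0);
});
```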
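Moving on to the tracker itself, the producing side of the price service might look something like the following. The specific Binance trade stream and the topic name currency-price are my assumptions, not necessarily what the video's code uses:

```typescript
import { Kafka } from "kafkajs";
import WebSocket from "ws";

const kafka = new Kafka({
  clientId: "price-service",
  brokers: [process.env.KAFKA_BROKER ?? "redpanda:29092"],
});
const producer = kafka.producer();

async function run() {
  await producer.connect();

  // Binance pushes a trade event for every executed BTC/USDT trade;
  // the `p` field carries the trade price as a string.
  const stream = new WebSocket("wss://stream.binance.com:9443/ws/btcusdt@trade");

  stream.on("message", (raw) => {
    const trade = JSON.parse(raw.toString());
    producer
      .send({
        topic: "currency-price",
        // Key = ticker, so all BTC updates land in the same partition
        // and are therefore consumed in order.
        messages: [{ key: "BTC", value: JSON.stringify({ usd: Number(trade.p) }) }],
      })
      .catch(console.error);
  });
}

run().catch(console.error);
```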
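And the balance service's queue-style consumer could be sketched as follows; the topic names crawl-balance and wallet-balance and the exact BlockCypher response handling are assumptions on my part:

```typescript
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "balance-service",
  brokers: [process.env.KAFKA_BROKER ?? "redpanda:29092"],
});
const producer = kafka.producer();
// Every instance uses the same group ID, so each crawl task is
// delivered to exactly one instance — effectively a message queue.
const consumer = kafka.consumer({ groupId: "balance-service" });

async function run() {
  await producer.connect();
  await consumer.connect();
  await consumer.subscribe({ topic: "crawl-balance" });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const address = message.value!.toString();

      // BlockCypher returns the confirmed balance in satoshis
      // (global fetch requires Node 18+).
      const res = await fetch(
        `https://api.blockcypher.com/v1/btc/main/addrs/${address}/balance`
      );
      const { balance } = (await res.json()) as { balance: number };

      await producer.send({
        topic: "wallet-balance",
        messages: [{ key: address, value: JSON.stringify({ btc: balance / 1e8 }) }],
      });
    },
  });
}

run().catch(console.error);
```

Because all instances share one group ID, simply starting more containers of the balance service spreads the crawl tasks across them — provided the crawl topic has enough partitions.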
Finally, there is the server, which communicates with the CLI through a websocket. The server listens for price and balance updates using two different consumers and then notifies all relevant clients about the updates. To figure out which client is tracking which wallet, we also initialize an in-memory key-value database. Of course, the server also handles incoming connections from new clients. Feel free to take a closer look at the code at your own pace, but now we will see the application in action.

To start the app, just run docker compose up — and make sure to run npm install beforehand. Once that's done, you can watch any Bitcoin or Ethereum wallet by running the CLI with the wallet address as the last argument. You should also be able to watch multiple addresses simultaneously, and pressing the enter key will force the balance to be re-crawled.

And that's everything you need to know about using Kafka with Node.js. Of course, there are more advanced topics, like retrying failed messages and adding them to a dead letter queue in case of consecutive failures, or deploying Kafka and the services to the cloud to make use of auto-scaling and load balancing, but those topics are out of scope for this video. By the way, the wallet tracker that we have built in this video only works because Bitcoin and Ethereum have a completely transparent blockchain; that wouldn't be possible with anonymous cryptocurrencies like Monero or Dero. So if this video helped you, consider sending me some coins privately, or at least like this video and subscribe to my channel. With that said, thank you so much for watching and have a great time!
Info
Channel: Florian Ludewig
Views: 7,121
Keywords: kafka, node.js, apache kafka, ultimate kafka guide, using kafka with node.js, nodejs kafka, redpanda, zookeeper, docker, docker compose, kafka tutorial, kafka node.js tutorial, typescript, node.js typescript kafka guide, cryptocurrency, wallet tracker, bitcoin, ethereum, monero, dero, how to use kafka with node.js
Id: gTwXG8lC2GM
Length: 9min 10sec (550 seconds)
Published: Mon Dec 19 2022