How to filter and route events using a Splunk forwarder

Captions
Okay, today we'll be discussing how to route and filter events using a heavy forwarder. A similar concept applies to the universal forwarder as well, but there are some subtle differences, which we will also discuss. So what are we trying to achieve here? Let me open this diagram. We have a heavy forwarder with this routing feature, and it is monitoring a file, just a normal file. Based on the data in that file, that is, the events in the file, the heavy forwarder forwards some events to indexer 1 and other events to indexer 2. As we know, a heavy forwarder has the capability of seeing the individual events, much like an indexer does, which is why this kind of event-level routing is possible. If we had a universal forwarder here instead of a heavy forwarder, it would not have a view of the events; in that case it can only route the data at the source type, source, or host level. We will see how those configurations work.

If you remember from my previous videos, we created this heavy forwarder instance, and we also created a Splunk receiver instance. I have now created another Splunk receiver instance, called Splunk receiver 2, which will act as a second indexer. Our job is to take some file and point the heavy forwarder at it. At the server level, this is our heavy forwarder server; the server ending in .82 is our first receiver, and whatever events are generated with the string "error" in them will be forwarded to it. The other server, ending in .54.3, is our receiver 2, and whatever events are generated with a "success" message will be forwarded there.

First we log into our heavy forwarder; this may take a moment. Once logged in, we run sudo su so we are at a root prompt. The first thing is to create our input. We will use a plain text file, say test.log, which we will put somewhere and monitor, because if we understand the small, simple case it is much easier to understand the more complex ones. So we go to cd /opt/splunk and then into the var/log/splunk directory; if you remember, all the Splunk-generated log files are stored there, and we will create our own log file alongside them. Next I log into this heavy forwarder instance, the one ending in .224; if I copy the URL, this is our heavy forwarder web UI. I need some sample events, so I search the _internal index. Nothing fancy; I just need one event containing "error", which I copy and save somewhere, and one event containing "success". Basically, I am taking two sample events which I will be continuously pushing into our test.log file.
First we create the test.log file itself: vi test.log, save it with no data, and if I run ls, test.log has been created. We will be putting our events in there eventually. Now we create the input in the heavy forwarder. Go to Settings, then Data inputs. You could also create the input from the back end by editing inputs.conf directly; that is another way to do it. Since we are monitoring a simple file, we do not need anything dedicated, so I will show the UI way. Choose New, then Local file and directory, and select our test.log, which is /opt/splunk/var/log/splunk/test.log. We choose to continuously monitor it and click Next. For the source type, we set the source type to test; this is important, so remember the source type name. Click Next, leave it in the Search app, let it store to the default index (that's fine), review, and submit. Our input has been created.

If you want to check it, go to cd /opt/splunk/etc/apps/search, because we saved the input in the Search app, and look in the local folder: there is an inputs.conf containing the monitor stanza we just created for test.log. Currently test.log does not have any data, because we have not pushed anything there yet, and that is fine. Our input is ready.
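For reference, the monitor stanza that the UI writes into /opt/splunk/etc/apps/search/local/inputs.conf should look roughly like the following; this is a sketch rather than a copy of the actual file, and the exact set of attributes may differ slightly:

    [monitor:///opt/splunk/var/log/splunk/test.log]
    # the source type we typed in the UI -- props.conf will key on this name later
    sourcetype = test
    disabled = false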
Now let us construct our outputs.conf and the configuration related to this TCP routing; after that we will push some data into test.log, some error events and some success events, and see how it works. For the heavy forwarder we will create outputs.conf in the system/local folder itself; you can maintain your outputs.conf in any app as well, but just for simplicity I am keeping it in /opt/splunk/etc/system/local. There is already an outputs.conf here, created when we installed the heavy forwarder in my previous video; back then we created it using a Splunk CLI command, but this time we will create it by editing the outputs.conf file directly. So I copy the existing contents, because we need some of the configuration names from it, save that aside, and then completely remove the old outputs.conf so we can build a fresh one.

Now let us understand what goes into this file. As you remember from my previous video, outputs.conf was specifically about TCP output, and we need three stanzas. Among them, the first is the global [tcpout] stanza, in which defaultGroup is optional; even if you do not set a defaultGroup, you can still create target group stanzas and Splunk will treat each of them as a group. For our purpose we will remove the defaultGroup. We have two groups now, one error group and one success group, because we have two indexers. For the error group, I go to the server list: the error receiver is the .82 machine, so I copy its external IP, and it is listening on port 9997. I also remove the server-specific stanza, because that is only needed when you want some server-specific configuration, which we do not need here. Then I copy the whole group again and rename it as the success group; from the server list, this is the machine that will receive our success events, so I copy that IP address. Always remember to copy the external IP, not the internal IP. So we have two groups, each containing a single indexer, and we will see next how to route to each group. I paste this into outputs.conf and save it. Our outputs.conf is ready.
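A minimal sketch of the resulting outputs.conf, assuming the group names errorGroup and successGroup and placeholder IP addresses (use your receivers' actual external IPs, and whatever group names you prefer; the names only have to match what we put in transforms.conf next):

    [tcpout]
    # no defaultGroup here -- events are routed explicitly by transforms.conf

    [tcpout:errorGroup]
    # external IP of the error receiver (the .82 machine), listening on 9997
    server = <error-receiver-external-ip>:9997

    [tcpout:successGroup]
    # external IP of the success receiver (the .54.3 machine), listening on 9997
    server = <success-receiver-external-ip>:9997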
For TCP routing you need two more things: a props.conf and a transforms.conf. In props.conf you are basically saying: for this source type (or source, or host), apply these transforms. If you are not sure how props.conf and transforms.conf work, I have created a few videos on that; please check them out, and I will link them from this video. In transforms.conf you then put the configuration that actually does the TCP routing. First we create props.conf, then transforms.conf.

Back in the editor: when we created the input we set the source type to test, and in props.conf we need a stanza name. If you give a name directly, it always refers to a source type; if you want to key on a source or a host instead, there is a separate syntax for that. Since we are monitoring a single file whose source type is test, the stanza will be [test]. Inside it we reference the transforms.conf stanzas with a TRANSFORMS setting, whose value is a list of transform names; I will use two, one for error routing and one for success routing. As covered in that earlier video, you can list more than one transform here, and they are applied one by one: for every event the first transform is applied, then the second. That is our very simple props.conf; I go to system/local, run vi props.conf, insert the stanza, and save the file. Our props.conf is also ready.

Now the most important part, transforms.conf. The stanza names in transforms.conf must match exactly what we referenced in props.conf, so there will be two stanzas, one for error routing and one for success routing. There are three important settings in each stanza that tell Splunk how to route the data. The first is REGEX: it tells Splunk to look for events matching this particular regular expression and then do something with them. Here is the distinguishing factor between a heavy forwarder and a universal forwarder: a heavy forwarder has event-level access, so you can provide any regex here. A universal forwarder does not have access to the event contents, so an event-matching regex will not work there; in that case you must set the regex to a single dot, and the universal forwarder will simply go by what you specified in props.conf, that is, the source type, source, or host. Since we are using a heavy forwarder, we can use a real regex: for the error events we simply look for the string "error" in the event, and for the success events we look for the string "success". I would suggest you try these same settings with a universal forwarder and see whether they work; remember that there you would use a dot instead of "error" and "success". The second important key is the destination key, which must be set to _TCP_ROUTING. The third is FORMAT, where you give the group name you set up in outputs.conf: for the error-routing stanza the format is the error group, and for the success-routing stanza the same configuration applies except that the group is the success group. That is our transforms.conf, nice and simple. I copy it, come back to system/local, create a new file called transforms.conf, paste the same content, and save with Shift-ZZ. Our transforms.conf is also ready. If we have done everything correctly, then whatever data we push into test.log should be routed to the corresponding servers.
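Putting the two files together, a minimal sketch of what goes into /opt/splunk/etc/system/local, assuming the transform names errorRouting and successRouting and the group names from the outputs.conf sketch above (the class name after TRANSFORMS-, here "routing", is an arbitrary label of my choosing):

    # props.conf -- the stanza name matches the source type of our input
    [test]
    TRANSFORMS-routing = errorRouting, successRouting

    # transforms.conf -- stanza names must match the list in props.conf
    [errorRouting]
    # heavy forwarder only: matches any event containing "error"
    # (add (?i) in front if you want a case-insensitive match)
    REGEX = error
    DEST_KEY = _TCP_ROUTING
    FORMAT = errorGroup

    [successRouting]
    REGEX = success
    DEST_KEY = _TCP_ROUTING
    FORMAT = successGroup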
Since we changed these configuration files manually, we need a Splunk restart. If you remember, in my previous video we made the changes through CLI commands, which is why a restart may not have been needed, but this time it has to be done. So we go to the Splunk home directory, into the bin directory, and run ./splunk restart. While it is restarting, let us check how our two receivers are doing. First we log into the error receiver, in the second tab. We go to Settings, then Forwarding and receiving, and check whether receiving is enabled. We already have the receiving port created, but it is currently disabled (I had disabled it earlier), so I simply enable it; it does not get re-enabled automatically. Similarly, for the success receiver we do the same thing: Settings, Forwarding and receiving, Configure receiving; the receiving port is already there, so we just enable it. The .224 machine is our heavy forwarder, the one doing the forwarding, so I log in there and go to Forwarding and receiving, then Configure forwarding: you can see it is reading from its outputs.conf, which is why these two hosts, the ones we will be sending data to, were created automatically. We will also check the main index on both receivers for any old data, because we want to start fresh. Currently there is no data in the main index. Remember that if you run the delete command it does not remove data from the index, it only hides it; for a real index cleanup there are other commands, and maybe I will make a separate video on that. So our main index is clean.

Our outputs.conf is ready, our transforms.conf is ready, and our props.conf is ready, so now we go back to the heavy forwarder and to test.log. I clear the console, go to cd /opt/splunk, then to var/log/splunk, and here is our test.log file. We open it; currently it has nothing, so we push some data manually. In a real scenario your log files will be updated automatically by some application, but we will not go into that complexity and will just add data by hand. First we push an error event. Remember that the timestamp in this event is not today's date, so we need to search our index accordingly. We save the file, and since our heavy forwarder is continuously monitoring test.log, and this is an error event (there is the string "error" inside it), it should route the data to our error receiver, the .82 machine. Let us see. This is our .82 server, and if I search main, you can see the heavy forwarder has already routed the data there. But we also need to check that it is not routing to .54.3, and when I run the search there, it is routing there as well, which means something is wrong at the heavy forwarder level. So let us check: on the heavy forwarder, in system/local, the outputs.conf looks fine, so let us look at transforms.conf. For the error-routing transform we set the regex, and here is the issue: the destination key name has a spelling mistake; it must be DEST_KEY, exactly. I go to insert mode and fix it; hopefully this is going to work now. Since we changed the file again, I have to restart Splunk once more with ./splunk restart. While it restarts, we will also delete that wrongly routed event, because this is not how it is supposed to work, and start fresh. Now we go back to our log file in /opt/splunk/var/log/splunk; we already have a single error event in test.log, so let us see whether anything was sent after the restart. I think, since it is the same event, it does not get sent again.
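As a side note, instead of opening the file in vi each time, you can append test lines from the shell; a minimal sketch, where the event text is only placeholder data standing in for whatever sample events you copied from _internal:

    # append one "error" and one "success" test event to the monitored file
    echo 'sample test event with an error message'  >> /opt/splunk/var/log/splunk/test.log
    echo 'sample test event with a success message' >> /opt/splunk/var/log/splunk/test.log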
So we go back into test.log, go to insert mode, push another error event by copying the error sample again, and save. Now we have pushed another event, so ideally our error server, .82, should receive it, and it does; .54.3 should not receive it, and this time it does not. That means the error event has been routed only to the error receiver, not to the success receiver, exactly according to the configuration. Now let us push a success event: I copy it, open test.log again, go to insert mode, add the success event, and save. Ideally .54.3 should receive this event, and you can see it has already received it, while the error receiver does not receive that particular success event. So this is how the routing works: based on the event, the heavy forwarder routes the data to the corresponding indexer.

Now, can you do this exercise where, instead of a heavy forwarder, you have a universal forwarder and want to achieve something similar? In that case you may need two separate files, because you will be applying your transforms at the source, source type, or host level; you cannot apply the transform at the event level. So perhaps you have two separate source types whose data you want to send to two separate indexers. And remember, as I said before, with a universal forwarder you cannot use event-level details in transforms.conf: the regex has to be a dot, but the _TCP_ROUTING configuration still has to be there in transforms.conf for TCP routing to work. Try that one; otherwise I will make a separate video for it. See you in the next video.
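For the universal forwarder exercise described above, here is a sketch of what the configuration could look like, assuming two hypothetical inputs with source types test_error and test_success (names of my choosing) and the same two output groups; since the universal forwarder cannot see event contents, the regex is just a dot and the routing decision comes entirely from the source type stanzas in props.conf:

    # inputs.conf -- two separate files, two separate source types (paths and names are assumptions)
    [monitor:///opt/splunkforwarder/var/log/error_app.log]
    sourcetype = test_error

    [monitor:///opt/splunkforwarder/var/log/success_app.log]
    sourcetype = test_success

    # props.conf
    [test_error]
    TRANSFORMS-routing = errorRouting

    [test_success]
    TRANSFORMS-routing = successRouting

    # transforms.conf -- REGEX must be a dot on a universal forwarder
    [errorRouting]
    REGEX = .
    DEST_KEY = _TCP_ROUTING
    FORMAT = errorGroup

    [successRouting]
    REGEX = .
    DEST_KEY = _TCP_ROUTING
    FORMAT = successGroup

An alternative the video does not cover is the _TCP_ROUTING setting that can be placed directly on each monitor stanza in inputs.conf, which routes a whole input to a tcpout group without touching props.conf or transforms.conf.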
Info
Channel: Splunk & Machine Learning
Views: 8,587
Keywords: splunk, how to, route event, filter event, heavy forwarder, admin
Id: AxHetwfLC0Y
Length: 30min 19sec (1819 seconds)
Published: Thu May 23 2019