Applied Security Orchestration with Splunk & Palo Alto Networks

Captions
Hi everyone, thanks for showing up so early here in the morning. How many of you saw this black card in your bag when you checked into the conference and are here because you're expecting to see some guy from Splunk talk about next-gen security teaming together to prevent advanced attacks and protect your data? One, that's good, because as you can see we've completely changed the presentation on you, to make it infinitely better.

What happened is that I was going to stand up here and talk about Splunk for quite a bit of time, and we decided it would be much better if you experience exactly what a customer's success looks like when integrating Palo Alto Networks data and Splunk. So Kevin here from Lennar Corporation is actually going to do the presentation, and I'm going to fill in some of the blanks when it comes to the Splunk technology.

My name is James Brodsky. I've been at Splunk for about four years, I like long walks on the beach, and I am a huge Lifehouse fan (I didn't know that until last night). Anyway, this is the best technology job I've ever had, and one of the best things about Splunk is that it's constantly surprising me and our customers with what it can do. One of the best parts is being able to see the success that customers like Kevin have, so I'm really proud to present up here with him today. Hit the next slide there and we'll get started.

So I am Kevin Dollars, security operations manager at Lennar Corporation. Just to clue you in on what Lennar does: Lennar is a bit of everything in terms of industry. We are a home builder; we work in financial services with a mortgage and a title division; we have international presence; we're in the energy industry with SunStreet; we have asset management, with our own desk on Wall Street through one of our subsidiaries as well. We're a public company, Fortune 200, with revenue of over 11 billion dollars a year. We have over 15,000 endpoints, over 11,000 employees, and we're home to over 40 Palo Alto Networks firewalls.

Right, so as I said before, I'm James Brodsky, based out of Denver, Colorado. I manage a team of sales engineers; that's my day job, and my afternoon and evening job is to be a security SME for Splunk. That works.

All right, so here's our agenda for today. We're going to go over a little bit about Splunk Enterprise Security, Palo Alto Networks, and threat intelligence, and pretty much how Lennar Corporation has integrated those three. Really, the main integration is Splunk Enterprise Security with Palo Alto Networks, and then how we utilize the threat intelligence data models within Splunk ES to populate the threat intelligence portions of the Palo Alto Networks firewalls.

The next aspect we're going to go over is User-ID: how can you populate User-ID utilizing Splunk? I know a lot of customers have issues with User-ID and actually getting that user affinity out onto their networks, so we're going to go over what our dilemmas were, some use cases we actually applied, and how we used Splunk to solve those use cases.

The next aspect is the Splunk Universal Forwarder. The forwarder is deployed at every endpoint, and we're going to go a little more into that, but most people don't realize just how powerful the forwarder is and what kind of data you can collect with it. We're going to go into what you can use the forwarder for as it relates to Palo Alto Networks, and if we have time we'll have one more thing.
Awesome. So how many of you guys, show of hands, have heard of Splunk? Everybody. How many of you are Splunk customers? About half of you, OK, cool. So everybody's heard of Splunk; for those of you that are not customers, just a quick review of what Splunk is, and we'll also talk about Enterprise Security for a minute or two.

Splunk is basically an infinitely scalable platform to gather machine data, store it, and analyze it, all in one place. That has really interesting security implications, because if we have a record of humans interacting with machines and machines interacting with machines, all in Splunk, and we can search all of that at scale, that's a really wonderful thing for your security operations center, its incident response, forensics, and so on. Any data with a timestamp can go into Splunk. Splunk can be hosted on premises, it can be hosted in the cloud, and it can also work in a hybrid manner, so you can have data in both places and search it all from one place.

On top of Splunk (what we call Splunk Enterprise, or Splunk Cloud) we have a product called Enterprise Security. It's one of our premium products; this is our SIEM-like functionality. It's up there in the top of the Gartner Magic Quadrant, and it really adapts Splunk and gives significant capabilities for incident investigations and forensics, security compliance reporting, and monitoring of known threats. Since it sits on top of Splunk core technology, the use cases that Kevin is going to get into leverage Enterprise Security, the data models underneath Enterprise Security, and of course core Splunk as well.

So why did Lennar choose Splunk, and why did we choose Palo Alto Networks? To go a little bit into that: we were in the middle of standing up our enterprise security operations center, and we pretty much needed something that would fulfill our incident response needs. At the same time we had very outdated firewalls, very basic rule-based firewalls with some basic web proxies, nothing crazy, and we needed to upgrade all that. For the firewall, the solution was a unified threat management platform, and Palo Alto Networks fit that for us. In terms of incident response and a security operations center role, Splunk Enterprise Security definitely fulfilled that for us.

To go a little more into that: what Splunk Enterprise Security gave us is a single pane of glass. Not just Palo Alto Networks rules or Palo Alto Networks alerts; every single platform, every single agent, everything that we have in our environment, we can look at within one pane of glass and correlate that information. If you're only looking at the firewall, you see the logs and the alerts it generated, but you don't know what happened on the endpoint, you don't know what happened within the access points, you don't know what happened anywhere else, because you're only looking at one location. What Splunk Enterprise Security allows us to do is create those notable events and actually correlate that information with everything else.

Another key aspect: with a new security operations center, you're limited on the amount of time you have to investigate all the incidents, false positives or true positives, that you may have to go through. Adaptive response with Enterprise Security actually gave us a really competitive advantage there.
One of the key things to note is that when you're creating a notable event, creating a correlation search, you can add adaptive response measures, such as auto-submitting this IP address or this URL to WildFire, so you get an automatic analysis right out of Splunk. Another thing you can do is tagging: say a botnet alert comes in, one that you've customized within Splunk ES, and you want to tag that IP address into a dynamic block list. You can do that right out of the gate, or you can have it happen automatically if you have trusted threat sources or anything like that. We're going to go a little more into that.

With the new security operations center we also had threat intelligence needs. We wanted a centralized platform; we wanted to be able to manage that platform in terms of filtering it, in terms of deduping it, being able to pick and choose exactly what we wanted to utilize within our firewalls. And really this integration, which we're going to go into, and for which I'm going to show a couple of the searches we use, is pretty neat. (Can you hit the button there? Sorry. No problem.)

All right, so going into those botnet alerts. This is one of the key items that we actually plugged into Splunk Enterprise Security. This is not out of the box, but it just goes to show that you can do anything you want: if the data is there, you can create it. Going into the auto WildFire submissions and notable event tagging for Palo Alto Networks: what that screenshot on the right shows is a botnet alert that we created using Splunk's query language to recreate what the firewalls show you on their devices. So instead of getting an email alert, or having to go into the firewall to investigate an action, you can create that with Enterprise Security and correlate it with other logs and other information, so you get a better picture of what's going on in your environment.

Like I said, you can go into automated actions via adaptive response, or you can do manual submissions. For example, maybe you don't want everything tagged into a dynamic block list, because you're probably going to break some stuff, right? In the screenshot all the way to the right there's a little drop-down: your security operations center analyst can click on it and say, I want to tag this IP address, or whatever is in this notable event, with this tag, because I know this tag is in this dynamic block list, and now I know it's going to be blocked in the future.

Another neat thing you can do with Splunk Enterprise Security is leverage the threat intelligence data models. Within the Palo Alto Networks threat intelligence portion, unless you have MineMeld of course, you don't have a lot of customization ability in terms of picking, choosing, and filtering. You kind of just get one large data dump into the firewall, and you're stuck there: the firewall is going to use everything that's in there, and you don't have any management over that. Utilizing Splunk Enterprise Security's threat intelligence management portion and the data models, you're able to choose all your threat intelligence sources, dedupe them, filter them, pick and choose time frames and how you want to utilize them, and auto-submit or auto-tag that into your firewalls. You can also just publish it down to an IIS server, and then your firewalls only have to read three files rather than hundreds of files, or tens of files, whatever it may be. So what we did is publish an IP file, a domain file, and a URL file, so that our firewalls only have to read three things that we manage.
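A minimal sketch of that three-file publishing pattern, not Lennar's actual search: ip_intel is the IP intelligence lookup name as it ships with the Splunk ES threat intel framework (verify against your ES version), and the output file name is made up. A scheduled search like this writes a deduplicated IP list to CSV; the domain and URL files follow the same pattern.

    | inputlookup ip_intel
        ``` ES threat intel framework: aggregated IP indicators ```
    | dedup ip
    | fields ip
    | outputcsv pan_blocklist_ip.csv
        ``` lands under $SPLUNK_HOME/var/run/splunk/csv ```

Something outside Splunk (a scheduled copy or a shared path) still has to move the file to the IIS document root so the firewalls can fetch it as an external list.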
So going into a little bit of that botnet alert, this is the back-end portion of that alert. As you can see here, you create a correlation search: you name it what you want, write a description, and put in the search query that's actually running (that's the query in that box, nicely formatted and colored with Crayola for you). You can choose a time range for the search to run on; in our case, every 15 minutes gave us what we considered a good amount of time for a host to contact enough external IP addresses, external threats, to be able to determine a botnet. And then we have our adaptive response actions. At the bottom there you can see we wanted this to create a notable event. We also used it to create a risk analysis: if the user field is in here, maybe put a higher risk on this. Within Splunk Enterprise Security, for those of you who don't know, you can assign higher risk to, say, your executives, so in the scale of things you know which ones to react to first. You can also, like I said, auto-submit the URL to WildFire so you get your automated analysis, and you can do the tagging to dynamic address groups for dynamic block lists, and so on.

So going into that search a little more: what does that search do? (Maybe I should grab this giant green laser pointer thing. Yes.) To start off, most of you may be familiar with the index pan. As an organization we're very diverse, with a lot of subsidiaries, so we've actually parsed each subsidiary's firewalls into their own index. For Lennar we have len_pan, we have rcm_pan, we have uamc_pan; we had to parse them out. This is how we look at every single firewall within our organization, so whenever you see asterisks within my searches, it means I'm searching the entire enterprise and all of its subsidiaries, not that I created some sort of weird index or anything like that.

We're going over the pan threat sourcetype, looking for anything that's tagged as potential malware from internal IP space going outbound. From there, just because we don't want to lose any information to empty fields, we fillnull: any source user that's not tagged, for whatever reason, gets filled with a blank value so we don't lose that data. Then we parse it into a stats command, so we can get a count of how many times a particular device's source IP contacted an external site that's potentially malicious. From there we dump that into a table, just to sort it neatly, and then we run eventstats. Eventstats is essentially stats within a stats, in a sense, so we can take the count that came from the first stats and sum those counts by client IP. The initial stats gave us how many contacts per individual site; now we know how many across all the sites, so we actually know how many malicious sites a certain IP, or a certain user, visited within X amount of time. And from there, if it was greater than four, we determined that to be a potential botnet, or at least something for our security operations analysts to actually look at.
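The correlation search itself isn't reproduced in the captions, but a minimal sketch of the shape Kevin describes might look like the following. The category value, the RFC 1918 range, and the field names are assumptions; the threshold of four is from the talk.

    index=*_pan sourcetype=pan:threat category=malware src_ip="10.0.0.0/8"
        ``` all subsidiary firewall indexes; outbound potential-malware events ```
    | fillnull value="" src_user
        ``` keep events whose user field is missing ```
    | stats count by src_ip, src_user, dest_ip
        ``` contacts per internal host, per external destination ```
    | eventstats sum(count) as total_contacts by src_ip
        ``` "stats within a stats": total across all destinations per host ```
    | where total_contacts > 4

In ES, a search body like this would be saved as a correlation search on a 15-minute schedule, with the notable-event, risk, WildFire, and tagging actions attached as adaptive responses.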
After that we have a lookup, and we're going to go into what lookups are. This is utilizing the URL Toolbox app from Splunkbase, and really the purpose of this piece is that when that notable event gets created, our security operations center analysts have a few more fields to work with. In case a false positive keeps being generated, we can utilize those fields added via URL Toolbox and maybe create a very intricate, very specific suppression, rather than suppressing too much, because that's always bad.

All right, so I went over how you can automatically tag your trusted threat sources; I briefly described that a couple of slides ago, and I wanted to quickly go over the search behind it. This runs off the same thing we just went over, but instead of creating a notable event at the end, we just have it running in the background. So we have Splunk collecting all the threat intelligence (in this case we're going over IP intelligence), aggregating it and deduping it, and now we want to choose our most trusted threat sources. Of course you can always add a couple more pieces to filter the data a little more and dedupe it; I just wanted to give the basic search commands so you can see how it works and how you can utilize it. You use the pantag command, which comes in the Palo Alto Networks app on Splunkbase, and really what you're doing is pulling all the IPs within your threat intelligence data model, choosing your most trusted threat sources, and then tagging those addresses with whatever tag. "Mal" means "not good" in Spanish, so that's what we call it; we're in Miami, so we have to. We add all those IP addresses to that tag so they're automatically blocked at the firewall. So we're having Splunk ES manage what we want blocked, all the time, every day. It's just easier that way with a small team, especially when we really trust these threat sources: we've reviewed the data, we know it's always good, or 99% good, and we can trust it.
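A hedged sketch of that background search. The feed names are placeholders, and the pantag arguments (device, action, tag, ip_field) are as the Palo Alto Networks app documents them to my recollection, so check them against the version you run.

    | inputlookup ip_intel
    | search threat_key="trusted_feed_1" OR threat_key="trusted_feed_2"
        ``` keep only the sources we've vetted ```
    | dedup ip
    | pantag device="192.0.2.10" action="add" tag="mal" ip_field="ip"
        ``` add each IP to the dynamic address group that matches tag "mal" ```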
So, there are a couple of concepts in Kevin's last couple of slides, the ones with the searches, that I want to go into in a little more detail, just so you understand them better. This is a modified version of the first slide I had up, in terms of what Splunk is, but if you look at the bottom there are these external lookups. One of the magical things about Splunk is that not all the data you're going to analyze and search on in Splunk necessarily needs to be indexed in Splunk. Splunk has very powerful ways of taking existing data in Splunk and correlating it against other external data sources. Those can take many different forms; one of the things we were seeing before is that Kevin does a lookup against something from the URL Toolbox app on Splunkbase (which we'll talk about as well) to add some additional fields that don't exist in the actual data. They exist at search time, parsed out by that special lookup.

Lookups take many different forms. The most basic kind is a CSV lookup. Also, underneath more recent versions of Splunk these days, we have something called the KV store, which is actually MongoDB underneath; it gives us a little more power and flexibility for lookups. You can also do dynamic and scripted lookups, so you can actually run some commands, maybe a Python script underneath, to enrich your data. Lookups can also be fed from a database. Lookups are really highly performant; they can get really, really big, and we do some clever things with indexing lookup data underneath to make sure they stay performant. They can also be used to store state between searches. And they are used extensively in Enterprise Security, and also in the Palo Alto Networks app that's out there on Splunkbase, which we'll talk about a little at the end.

Two really powerful apps are available on Splunkbase. These are free community apps, but they're developed by people that work at Splunk: URL Parser and URL Toolbox. As analysts, there's an awful lot of work we want to do with things like URLs and domain names: being able to parse out their different components, the top-level domains, and so on. We also might want to do analysis against those domain names. Maybe you're trying to detect domain generation algorithms, or maybe you're trying to do some fuzzy matching and you want to run a Levenshtein algorithm against them, or Shannon entropy, those types of things. All of those are built into these two apps, and we see that people doing analysis of intricate strings in their data, especially from a security perspective, really love these two apps.
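For a feel of what those lookups do, here is a sketch using the lookup and output field names as they commonly appear in Splunk's own URL Toolbox write-ups (not verified against a specific app version): parse a URL into components, then score the registered domain's Shannon entropy.

    index=*_pan sourcetype=pan:threat url=*
    | eval list="mozilla"
        ``` URL Toolbox wants a suffix-list name to split TLDs correctly ```
    | lookup ut_parse_extended_lookup url, list
        ``` yields ut_domain, ut_subdomain, ut_tld, and friends ```
    | lookup ut_shannon_lookup word as ut_domain
        ``` yields ut_shannon, the entropy score; high values hint at DGAs ```
    | table url, ut_domain, ut_shannon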
Finally, one of the searches Kevin showed also talked about Splunk data models. This is essentially a data exploration and acceleration technology that we introduced back in version 6.0 of Splunk, and it's incredibly powerful. It allows you to pre-publish a schema on top of the raw data, and once you've pre-published that schema you can accelerate it, and so you get the ability to search many, many gigabytes, even terabytes, of data very quickly within Splunk, usually using an accelerated data model underneath. So if we want to get threat intelligence out of Splunk really quickly, we might use the threat intelligence data model that comes with Enterprise Security, which is exactly what Kevin did.

And lastly, one of the searches used the pantag command. If you download the Splunk app for Palo Alto Networks, one of the things it has in it is the pantag command. There are a couple of other custom commands we're going to use too, but that's where you get this; essentially it allows you to move IP addresses in and out of dynamic address groups right from a Splunk search, like was done in the last search Kevin showed you.

All right, so let's talk a little bit about User-ID. The marketing is great: shut up and take my money, right? Improved visibility, policy control, logging, reporting, and forensics. Every time you get an HR request or a legal request, you can say: hey, I know exactly who that user was three months ago, this was his IP address, I know everything about it. All right. Well, we had some problems. Being a very diverse shop with different subsidiaries, non-Microsoft endpoints, a lot of SaaS applications, and no centralized point for the User-ID agent, it was very difficult for us to fill that User-ID agent with the actual user-to-IP mappings, especially when one of the most recommended places to put a User-ID agent is an Exchange server. Well, Office 365 customers are kind of dead in the water for that, right? And another thing we lacked was historical data. We didn't have some sort of log that already gave this to us, where we could just point a User-ID agent at it and say: hey, just ingest all this and we're good; at least then we'd have everything since the beginning of time. So I set out to see how we could populate this agent and get all the user-to-IP mappings since the beginning of time, or at least since implementation time.

So this is a high-level solution overview of what we're utilizing today. We have other solutions, but due to time constraints these are the main use cases we're going to go over. Basically, what this diagram describes is the relationship between our environment and Splunk in the middle. There are a couple of things to note here. One: we are collecting all logs from everywhere in our organization. In this case we're talking about Meraki access points for wireless devices; we have our endpoints (MacBook Pros, Macs, Windows devices, a little bit of everything, plus Linux servers; we have everything); and then we have our RADIUS servers. We also have our firewalls populating Splunk. But the two main things to note here are that we're using Palo Alto's own data to populate itself back, we're using these three sets of data to populate the firewall's User-ID agent, and we're utilizing the Meraki data to enhance the RADIUS server data to give us one complete picture. I'm going to go over that now.

So, updating User-ID with RADIUS logs. Given X amount of incomplete sources, you can depict a full picture. If one set of data has half the information, another set of data has the other half, and there's at least one identifier that links those two sets of data, then you can create one full picture and use that information to populate User-ID, in this case. Within our wireless access point logs we had the device MAC address and the device IP address. Great, we had the IP address, but we were missing the user field. Within our RADIUS logs we had the user field and the device MAC address, but we didn't have the device IP; what the RADIUS servers were giving us were the IP addresses of the access points themselves, rather than what the user was getting. So what we did was grab the AP logs and use the outputlookup command (or outputcsv, sorry), running every X amount of time, outputting a CSV onto the Splunk servers. Then we utilize that CSV to do an automatic lookup back into the RADIUS server logs, so now every time we search the RADIUS server logs, we know the user tied to each individual log, based on that MAC address. The search behind that is the first one here: this search runs every X amount of time, outputting the lookup of the Meraki source CSV into Splunk. From there we utilize that CSV to create an automatic lookup (like I said, within the settings) and apply it to our RADIUS server logs. So now, when we run this search here, we'll always have the user ID.
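Here is a hedged sketch of the two halves; every index, sourcetype, field name, and regex below is a placeholder, since Meraki and RADIUS log formats vary. First, the scheduled search that snapshots MAC-to-IP pairs from the access point logs into a CSV lookup:

    index=meraki sourcetype=meraki:ap
    | rex field=_raw "mac=(?<client_mac>\S+)\s+ip=(?<client_ip>\d{1,3}(?:\.\d{1,3}){3})"
    | dedup client_mac
    | table client_mac, client_ip
    | outputlookup meraki_mac_ip.csv

Then the RADIUS search that recovers the missing IP through that lookup (in production this was wired up as an automatic lookup in Settings, so it applies on every search of the RADIUS sourcetype):

    index=radius sourcetype=radius
    | lookup meraki_mac_ip.csv client_mac AS calling_station_id OUTPUT client_ip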
Let's go a little more into that, because there are a couple of commands that maybe some people are not familiar with. In this one, rex is just regex statements for field extractions; in this case we're extracting the source IP from the logs, because our Meraki data didn't have an extraction for it. Then outputlookup, as I mentioned, actually outputs that CSV as a lookup table.

And here we have mvexpand, which is actually a very powerful command. Sometimes you get logs where there are multiple values in one particular field. This happens in Windows event logs all the time: you'll have an issuer security ID and a target security ID, and when you look at the Security ID field it'll have two values for one log. But if you output that into a table, it drops one of those values, so you've just cut your information in half. What multi-value expand, or mvexpand, does is un-dedupe that log, in a sense: it takes that log and creates it again with the second value in that field, so when you parse it into a table you're getting 100% of your data. So from here we grab the security ID and we mvexpand it, and now we have two logs, or multiple logs depending on how many security IDs there are, within a table. We do the same for source IP, just in case, and then we dedup, in case there are multiple values that equate to the same thing. From there we utilize panuserupdate (which we'll go into) to push the user field and the IP address into the Palo Alto Networks firewall, so we get that user affinity.

Right, so panuserupdate is the second of three commands that come in the Palo Alto Networks app on Splunkbase. This one is pretty simple: just like Kevin mentioned, essentially it allows us to take a couple of different fields, specifically the IP address and user fields, which are the defaults, and feed them either directly into a Palo Alto Networks firewall device or into Panorama, dynamically updating right there from the Splunk search interface. By the way, the third command, which actually isn't listed in the presentation, is an update command: there are a few lookup tables that come with the app, for app data and threat data, and that third command, which we're not going to talk about, pulls data out of firewalls or Panoramas and updates those lookup tables right within Splunk.
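To make the Windows example concrete, a before-and-after sketch; the multivalue Security_ID field is how the Splunk Windows add-on commonly presents the issuer and target SIDs on logon events, though your field names may differ.

    index=wineventlog sourcetype="WinEventLog:Security" EventCode=4624
    | table _time, host, Security_ID
        ``` one row; two SIDs squeezed into one cell ```

    index=wineventlog sourcetype="WinEventLog:Security" EventCode=4624
    | mvexpand Security_ID
    | table _time, host, Security_ID
        ``` one row per SID; nothing lost on the way to the table ```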
So now let's move into the next piece: updating User-ID with Session Initiation Protocol (SIP) and other best-guess scenarios. This is where I was talking about utilizing Palo Alto Networks' own data to populate its own User-ID agent. One thing about your environment: you have to know all the applications, all the logs those applications output, and the information within those logs, in order to best utilize it. In our case, we use Skype a lot (formerly known as Lync), and what Skype does is pass the SIP username within URL traffic. We noticed that, and we started extracting those usernames, tying each one to the IP address it was coming from, parsing that into a table, and then updating the firewall with it. That's this search right here: we're searching our firewalls, looking for any URL that has that SIP URI, extracting the username within that field, doing a multi-value expand (because it's always good to do that), deduping that data, searching where the user is not equal to "unknown", and then updating Panorama with the user-to-IP mappings.

The second piece here is more of our best-guess scenarios. Now that we had learned from our SIP data that we could extract usernames from our URL logs, what else were we sending across our network that contained that same kind of information? It wasn't going to be as exact as SIP, because it could be anything: apps, anything people pass any kind of username through, for whatever reason. Maybe they're using somebody else's username, or something that doesn't get tagged correctly. So what we did was create the same kind of search, looking for all of our email addresses, any domain that we owned being passed within the URL parameters, and we extracted that, and we put an asterisk in the middle of the user. The reason we didn't put it at the beginning or the end is because the Palo Alto Networks firewalls won't actually read it; you can't update it with an asterisk at the beginning or the end. So we're using the asterisk as kind of a wildcard, saying: hey, this is a best guess. The benefit is that we're only populating IP addresses that don't already have a user: if there's no user, this is what we think it is. And what that gave us is a point of reference on where to begin when we got a request from HR, or from legal, or when we just needed to investigate an incident. It says: hey, we believe this is the user tied to this; we need to validate that, but at least we know where to start.

Another huge benefit of this best-guess and SIP URI traffic is that you get a lot of user-to-IP mappings you wouldn't expect to get. You get unmanaged devices, you get mobile devices, you get people utilizing their personal iPads and personal Android tablets, pretty much anything where people are using their company email addresses or company usernames and passing that traffic via URLs. It gave us a huge amount of information we would never have collected if we didn't utilize this.
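A hedged sketch of both searches. The sourcetype carrying URL logs, the regexes, example.com, and the Panorama address and serial are all placeholders, and the panuserupdate arguments should be checked against the app's documentation.

The SIP extraction:

    index=*_pan sourcetype=pan:threat url="*sip:*"
    | rex field=url "sip:(?<user>[^@]+)@"
    | mvexpand user
    | dedup user, src_ip
    | search user!="unknown"
    | panuserupdate panorama="192.0.2.1" serial="0001A1B2C3D4" ip_field="src_ip" user_field="user"

The best guess from company email addresses seen in URLs, with the wildcard asterisk inserted mid-string because the firewall rejects leading or trailing wildcards:

    index=*_pan sourcetype=pan:threat url="*@example.com*"
    | rex field=url "(?<user>[A-Za-z0-9._%+-]+)@example\.com"
    | eval user=substr(user,1,1) . "*" . substr(user,2)
        ``` asterisk in the middle marks this mapping as a best guess ```
    | dedup user, src_ip
    | panuserupdate panorama="192.0.2.1" serial="0001A1B2C3D4" ip_field="src_ip" user_field="user"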
OK, so this one is my favorite. We did the first two and got a lot of data there, about 50 to 60% of our User-ID agent populated. The question was: how do we get the rest of it? We were kind of going around the sources, gathering from servers, from agents, from URL logs, but going around where the true source was. We can all agree that endpoints are the true source of user-to-IP mappings, right? They contain the user that's logged in, all the LAN IPs, WLAN IPs, or any other IPs they have; if they have extra NIC cards with IP addresses tied to them, you can gather that too.

So what we did was utilize the Splunk Universal Forwarder. The Splunk Universal Forwarder is your friend; it does whatever you need it to, and when I say that, I mean it wholeheartedly. You can do anything you want with the forwarder. In our case we created a PowerShell script, which I've posted here since most people use Microsoft, and we created a Python script that does the same thing for Macs. The script runs at the endpoint: the forwarder executes it every X amount of time, and it collects the username that's currently logged in using wmic, then runs the WMI object command to pull the network adapter configuration, and writes all the IP addresses within that network adapter configuration to standard out. The forwarder just reads that and sends it up to the indexers, and it's there for you to read with your search heads.

Any custom script that we create, we put in a scripts index, just because it's easier that way and our other indexes don't get filled up or cluttered with data. And then we ran this search: our sourcetype is the scripted user-affinity one, plus anything that has an IP address, just to make sure it filters correctly. We extract the username, and we extract all the IP addresses with a regex statement carrying a max_match value, to pull maybe up to five IP addresses from the device. Then we do a multi-value expand, because when we extract two, three, four, five IP addresses, all those values get dumped into one field in one log, and now you have one log with five values in one field; you can't really do anything with that, and we'd lose a good majority of the data we're collecting. So we use mvexpand to parse that out into X amount of logs, depending on how many IP addresses you grabbed, put that into a table, and then update the firewall with all that information.

This got us to 99% of User-ID capacity. We have very few logs that don't have a user mapped to an IP; the ones we don't have are basically unmanaged mobile devices that may be connecting to our network, or something we missed for whatever reason. It's always the unknowns, right? And then, yeah, I could keep going.
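The search over that scripted input, sketched with placeholder names; the index, sourcetype, and the key=value output format of the script are assumptions:

    index=scripts sourcetype=script:user_affinity
    | rex field=_raw "user=(?<user>\S+)"
    | rex max_match=5 field=_raw "ip=(?<ip>\d{1,3}(?:\.\d{1,3}){3})"
        ``` up to five addresses land in one multivalue field ```
    | mvexpand ip
        ``` one event per user/IP pair ```
    | dedup user, ip
    | table user, ip
    | panuserupdate panorama="192.0.2.1" serial="0001A1B2C3D4" ip_field="ip" user_field="user"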
All right, just a little bit about the Splunk Universal Forwarder. This technology is a very stripped-down copy of Splunk, extremely lightweight. It runs on pretty much any flavor of Windows you can think of, including embedded ones, POS ones, and so on, as well as Linux and a couple of other operating systems. As Kevin said, it has a significant amount of flexibility. Its main purpose in life is to collect things like Windows event logs or flat files that you might have out there on an endpoint, but there are so many more things you can do with it. We've seen an awful lot of popularity using it to collect Microsoft Sysmon data. We have another free app that goes onto those forwarders, called Stream, that will turn the NIC card into promiscuous mode and capture network data directly off of anything that endpoint is seeing. It can collect perfmon counters, registry values and how they change over time, and, as we saw with Kevin's example, it can also run scripts and put the output of those scripts into Splunk. Very, very interesting technology; the world is your oyster when you have these things deployed across your network. And we'll see another very creative thing that Kevin did with the forwarder on the next slide.

Before we get to the creative thing, let's talk a little more about how you can utilize the forwarder on your endpoints to make a better integration with your Palo Alto Networks firewalls. There are a couple of things we're currently doing today. We're doing GlobalProtect install validations on the forwarder: the forwarder is constantly checking, hey, is GlobalProtect installed? Grab the version the user has on that device. We check all of their settings, so we know exactly which gateway, portal, and User-ID agent they have, everything that's saved on that agent, and we store it all in Splunk. That way, if there's ever a VPN issue, if there's ever anything, we know all the information on that user, all the settings they have, everything they're utilizing, before we even contact them. That saves us a lot of time, because more than likely we can correlate it with other information; maybe we see, oh, this version is generating X amount of calls, or we keep noticing this same version is having this problem, and we can correlate that information and actually get true value out of it. It also saves us time because we know all this information before we contact somebody.

Another cool thing you can do is assess your egress points, which we're utilizing the forwarder to do now. This isn't directly a Palo Alto Networks integration, but it's definitely a good pointer to where you need to put a firewall, or where you need to make a network configuration change so your traffic goes through a firewall. This is a PowerShell script (I'm going to do Python for them as well), and unfortunately, due to time constraints, I'm not going to go into the code of it, but what it does is this: every single forwarder checks its external IP and validates whether there's a firewall between itself and the public internet. If it sees there is no firewall in between, it sends a log saying: hey, this is my external IP, I have no firewall, we need to fix this. This helps a lot when networking starts making changes that we don't like (because networking and security don't like each other), and it also helps us know the unknowns.

Let's roll back a little. An official security program at Lennar was established not more than three years ago, and 15 years of bad networking can lead to a lot of holes. So the first thing we did when we had the forwarder was implement this, so we could say: OK, we don't know all of our unknowns, so let's find them. Let's use the forwarders to tell us all the points within our network that we're not detecting. We implemented this, and we pushed a firewall to all those locations, or routed them back to a place that had a firewall. This plugged every hole in our environment, and allowed us to do it very rapidly.
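The script itself wasn't shown, but on the Splunk side the report over its output could be as simple as this sketch, with every name hypothetical:

    index=scripts sourcetype=script:egress_check firewall_detected=false
    | stats latest(external_ip) as external_ip, latest(_time) as last_seen by host
        ``` endpoints whose path to the internet shows no firewall in between ```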
The next thing you can do is bug workarounds. As James said, you can utilize the forwarder to collect things; you can collect anything you want with a custom script. But what about fixing things? How can you use the forwarder to fix something? Let's go into a very specific example as it relates to Palo Alto Networks.

We have GlobalProtect installed on all of our endpoints, with ubiquitous VPN turned on, that is, VPN always-on. And we noticed an issue on version 3.0.x in our environment under these particular circumstances: ubiquitous VPN, specifically on Windows, plus an ungraceful network disconnect. For whatever reason a computer maybe ran out of battery and just shut off, and GlobalProtect was unable to reset the DNS settings; or whatever the case may be, anything that qualifies, maybe they disabled the NIC card or something like that. The DNS settings get frozen, because GlobalProtect isn't able to reset them on that version under those circumstances.

So we said, OK, there's an issue. Once the DNS settings get frozen, the user isn't able to connect to their home network, they're not able to connect to the corporate network, they're not able to connect to any network. They'd go home, turn on their computer, say "hey, I have a lot of work to do", but they can't connect to the internet. They call the help desk: "hey, I have a problem, I can't connect to the internet". We know what the issue is, we know their DNS settings are frozen, but they don't have administrative privileges over their own laptops, so they can't reset their own settings; we can't remote in and assist them; and we're not going to give them the administrator password. So what do you do? Nothing, right? You're just dead in the water until they get back into the office the next day, or whenever, and you assist them then. We were getting a lot of calls, a lot of issues, and spending a lot of time actually dealing with this, and the only way we could think to handle it was utilizing the forwarder to fix it for us.

That's where this script comes in. More PowerShell (well, no Python in this one, because it's Windows-only). We created this script, the forwarder executes it very frequently, and all it does is check the system. It makes sure GlobalProtect is installed before it does anything, a check step. Once GlobalProtect is confirmed installed, there's a version check; on this particular one, just so we know which version the issue is occurring on. Then it validates network connectivity: it checks DNS first, against external DNS; if that fails, it checks internal IP ranges. If it can't connect to anything on an internal IP, then it resets all the settings back to what they should be, and gets that person live again without them ever even noticing.

This is a real, true-value example of how you can use the forwarder. And you can report on this: how much time you're saving, how many issues you're resolving without even having to take a phone call. There was so much value to this, and it was pertinent to our situation; we could have lived without it, but it really helped us out. Realistically, just not having to call anybody, and having the issue fixed before anybody even notices, was a great advantage.
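Since Kevin notes you can report on the time saved, a sketch of that report over the script's output; again, the index, sourcetype, and field values are hypothetical:

    index=scripts sourcetype=script:gp_dns_fix action=reset
    | timechart span=1d count as dns_auto_fixes
        ``` roughly one help desk call avoided per reset ```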
Yeah. So, a couple of slides here to finish up. But before I go into this, are there any questions anybody wants to ask at the moment, just to make sure we have enough time? We'll also take questions at the very end, and we'll be available afterwards.

OK, all right, just a few things then. The Splunk app for Palo Alto Networks that we've been talking about throughout this entire presentation is now at version 5.4, released last month. It's one of the most intricate and well-developed apps, in my opinion, on Splunkbase, because Palo Alto Networks develops it and has a few people devoted to its development, and they've used various technologies along the way, like the data models and the custom commands, to make it as good as it can be. The most recent version has better support for Traps data coming into it, better support for AutoFocus, and some cross-search functionality. If you're interested in hearing more about it, come stop by the booth.

How many of you (I know we had some people raise their hands in terms of being users of Splunk) have started to get into that Splunk search interface and ended up with searches that look something like the one here? They can get very complex and intricate, sort of hard to understand for the layman that isn't used to typing searches into Splunk. Raise your hand... there you go, at least one person, a couple. OK.

So the question is: if you had a search like that, wouldn't it be better to type something like this instead, and actually get a usable answer out of Splunk? Just use some natural language; ask an actual question like I would ask it in English. Back at our user conference in October, one of our partners, Insight Engines, released something called the Cyber Security Investigator. It works specifically against the Enterprise Security data models, but they've recently expanded it to work against the Palo Alto data models as well. I'm just going to spend about 90 seconds showing you what that looks like, and if you get really excited about it, come down to the booth and we'll let you type your own question into it and see what it comes up with.

So, going over here real quick; wait for this to sync. OK, cool. Standard Splunk interface, right, with an app on top of it. You can see this is the Insight Engines app, and this is their demo system. You can search any of the data models that are within the Palo Alto Networks app. I'm going to stick with some of the pan threat data, but we could certainly do searches against traffic and endpoint, and so on. Maybe I want to know what threats the Palo Alto detected yesterday. So what would I ask Splunk? I could say: "what pan threats were seen yesterday". Go. And as you can see, it's working against the data models: it's taking that natural-language question I just asked and converting it on the fly into a series of Splunk searches, which are all visible here if you want to reuse them in other places. You can see the data models it's querying: there's the firewall logs data model, and the sub-dataset that is the threat portion of that data model. And then here are all the answers to the question I asked, returned from the data model.

That's a pretty simple question; I can get a little more advanced with it. I could say: "what critical pan threats were seen yesterday". And now it's automatically going to detect "critical", filter this down, and show just the criticals. You can see there's a time range here; I said "yesterday", right? I can also say things like "three days ago", and it'll figure that out, do the automatic calculation, and return that for me. (Eventually it might come back... hopefully... there we go.) You can see it has adjusted the time frame for me. And then I might say something like "seen outside of business hours", and sure enough, it's going to say: OK, I understand what "outside of business hours" means, and it'll modify the search to show me threat data that happened that wasn't between 9:00 and 5:00. Of course, that's all configurable as well.

All right, I'll switch back over... there we go.
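For reference, the kind of accelerated data-model query such a question compiles down to looks roughly like this. The data model and field names here are guesses at the Palo Alto Networks app's firewall model, so verify them against the version you run:

    | tstats summariesonly=true count from datamodel=pan_firewall
        where nodename=pan_firewall.threat pan_firewall.severity="critical"
          earliest=-1d@d latest=@d
        by pan_firewall.threat_name
        ``` "what critical pan threats were seen yesterday", written by hand ```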
Questions, comments, concerns? (Oops. Oh, F5, right.) Did I make it all the way to the end without a technical problem? There we go; the slide literally says "questions, comments, concerns" on it, right there. Very technical.

[Audience question:] This question is for you, Kevin. In your setups and processes, I noticed you were pulling things like MAC addresses and machine names, and I'm sure in your environment people have more than one device. That's something I'm struggling with in my environment, and I don't have Splunk yet. I'm doing the User-ID with the Palos directly, and the only information I'm able to get is IP and username, and I can't really do anything with that until I tie it to a machine name and a MAC address. How do you notify the desktop support person which one of that person's devices is the one with a problem? I'm wondering how you folks solved that.

Actually, the answer for that is Splunk; that's the short answer, because otherwise you can't. The Palo Alto firewalls' User-ID agent uses IP address and user; those are the only two fields it accepts. In our example, we utilized the MAC address on the device to tie two different data sources together. That was the correlating data field we could use to put those two things together to give us the IP address and give us the user. From there, we can parse that out into a table and say: now this is a user, IP, MAC address, hostname, and everything else we may need, and keep that as our historical evidence file since the beginning of time, or since implementation time. That way, if we ever get a request, we can go back to a date: OK, this was their username, this was their IP address, these were the hosts they were logged into, these were their MAC addresses; maybe they spoofed the MAC address, and that's the new MAC address. That's really where Splunk comes into play.

[Follow-up:] So you take, like, the botnet report from the Palos, the User-ID from the Palos, you feed that back into Splunk, and Splunk does the output and gives it to whoever needs to go clean up that device? All right, thank you.

You're welcome. Right, and it's populating the firewall back, so we're getting that information back, but we're also outputting the extra sets of information that we can't push back into the firewall, and putting that somewhere else for our purposes in terms of asset management and inventory. Anyone else?

[Audience question, partly inaudible: about threat intelligence feeds. Do you use only the ones that are embedded in ES, and how do you manage false positives?]

Yeah. So we utilize the Palo Alto Networks intelligence feeds that are published on the PANW site. We also utilize several other open-source feeds; we're heavily focused on open source, because we haven't had the budget for a more holistic intelligence platform. We do have some industry-specific ones that are paid for and registered, like Information Sharing and Analysis Centers, and we do utilize some of the ones within ES, plus every open-source one that we've validated and deemed to have useful data.
Now, in terms of false positives: what we did is add a little more to the search than what it normally does. We noticed that sometimes we'd get a false positive where it was trying to block, or alerting on, internal IP ranges, those reserved private IP spaces. So we added exclusions for those, excluding them from the data models (or excluding them when we search the data models) by default, so that we never see those false positives.

[Follow-up:] Is there anything beyond just excluding the private IP spaces?

We removed certain ranges that we knew for a fact were going to be false positives, or maybe we trusted them a little too much. I'm not saying that's best practice, because technically anything can be compromised, but you do have that flexibility with Enterprise Security, to make those changes and customize it to your environment. Does that answer your question? Awesome.

[Audience question, partly inaudible: about the User-ID integration from Splunk. You're only sending the user ID to Panorama; how do you distribute that to the other firewalls, if at all, and were there any issues distributing it at scale? We get about a terabyte of data a day and have around 185 firewalls worldwide, so I want to understand how you distribute your user IDs.]

Yeah, definitely. So my recommendation is publishing it to Panorama. I'm not saying that's Palo Alto's recommendation, it's just what works for us, because we populate the User-ID agent via Panorama, and Panorama pushes it directly to the serial number of your primary firewall. Now, if you have multiple Panoramas because of the number of firewalls you have deployed around the world, you can run that same search and say: OK, this is the Panorama for, say, the United States; run the search against that Panorama device, or that primary device, and then have the same search do the same thing for all the other main regions, or main firewalls for that matter. Does that answer it?

[Follow-up, partly inaudible: so you expect Panorama to automatically push the user ID to the firewalls? We just push it to Panorama, and my theory is that the logs coming from the firewall might not have the user ID.]

Right, so let me go into panuserupdate. What panuserupdate does: it doesn't actually populate Panorama; it utilizes Panorama to push to the firewall. You choose your primary firewall; maybe it's in a data center or something like that. Let's actually go back to one of the searches. Look at that bottom search, where it says panuserupdate, panorama=X.X.X.X, and then a serial number. That serial number isn't the Panorama device; that serial number is a firewall. It's utilizing the Panorama IP address to go and use Panorama to push to a firewall; we just like that centralized point. Now, you can also push directly to the firewalls, like in this example: you can use panuserupdate (instead of pantag, say) and push directly to the device's IP address. So you can do it one of two ways. Actually, I think in the newest version of Panorama there's a weird issue where it might bug out if you try to push through Panorama, so we're changing all these sorts of searches to push directly to the firewalls rather than going through Panorama. OK, thank you.
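To pin down the distinction being drawn here, the two forms look roughly like this, with argument names as the app documents them to my recollection and placeholder addresses, serial, and test values:

Through Panorama, where the serial names the target firewall, not Panorama itself:

    | makeresults | eval ip="10.1.2.3", user="jdoe"
    | panuserupdate panorama="192.0.2.1" serial="0001A1B2C3D4" ip_field="ip" user_field="user"

Straight to a single firewall:

    | makeresults | eval ip="10.1.2.3", user="jdoe"
    | panuserupdate device="198.51.100.7" ip_field="ip" user_field="user"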
OK, I think we'll be getting the yank off the stage, but we can do one more. OK, thank you.

[Audience question:] With that being said, for pantag: with a single pantag event, can you push to multiple firewalls, or do we run the search multiple times, like you said?

Yeah, you can do that. If the tagging is being distributed across all your firewalls, meaning you're utilizing one central point of reference and all your firewalls are feeding off of that, the primary template or whatever you're utilizing in Panorama, then you can do that. If you have different tags, or different options, or just a different deployment of firewalls depending on your location or something like that, you can tag things differently by utilizing multiple searches, or you can even do it within the same search: you can utilize the same pantag or panuserupdate and just keep specifying different firewalls, if you like.

OK, thank you. All right, that's all. Thank you very much for attending; appreciate it. Have a great rest of the conference. [Music]
Info
Channel: Palo Alto Networks Ignite
Views: 4,076
Id: Ztmyy_6yUt4
Length: 54min 53sec (3293 seconds)
Published: Mon Jul 10 2017