Azure Sentinel Lab Series | 100 ways to get data into Azure Sentinel | EP4

Captions
What's up guys, episode number four of the lab tutorial series. I'm going to talk about all the ways to ingest logs into Azure Sentinel, all the methods available to you. I'm going to go over everything I know, so if I don't cover something you're curious about, feel free to comment and I'll respond. Log ingestion is quite a big topic, so if you're only interested in specific methods, go down to the description and find the one you want; I'll put timestamps so you can jump straight to it. I'm not going to lab all of them, but I'll show you as best I can how to ingest each kind of log so you can do it yourself; consider it your homework assignment. Some things I can show and some I can't, because it's just a lot of material.

So let's get started. When you think about Azure Sentinel, you first have to understand at a high level how it works. Say we have our cloud here; we'll call it the Microsoft cloud. Within this Azure cloud we have logs that we want to ingest, and Sentinel is going to be the brain. I call it the brain because it does a lot of things automatically for us with its AI and automation: it ingests all these logs, understands what we call entities (the IPs, domains, hosts), and correlates that data across your third-party data, your Microsoft data, whatever data you're bringing in. You can correlate across many different entities, and you can even customize your own data points and parameters that you want available in the alert.

Now, Sentinel doesn't actually store the logs; it relies on Log Analytics. Log Analytics already exists and is already doing the work well. It's built on a database architecture that's very scalable and very robust, so there's no need to build another data ingestion service when Log Analytics is already doing a good job. Sentinel is the brain using Log Analytics as the resource, so when you think about ingestion, you're not ingesting into Sentinel, you're ingesting into Log Analytics. (Let me erase this and write it a little better... "Log Analytics"... ah, my handwriting is bad.) So Sentinel is built on top of Log Analytics.

Now, how do we actually ingest logs? We use what we call connectors. A connector is a piece of code or software that connects your data to the workspace. A connector could be an agent: the agent connects a data source, say syslog from a Linux machine or event logs from a Windows server, to Log Analytics. A connector could also be an app, a piece of middleware that acts as a medium to feed data to Sentinel. That app could be a Logic App, which is GUI-based programming; that's the best way I can explain it.
Flow (now Power Automate) is another way, and Power BI would be another way to hydrate data into Sentinel. You can also have a Python script that connects to the API and feeds in custom data, or a PowerShell script. It's really endless: once you have an API, you can connect any kind of app, even custom software you program yourself. It's very flexible. A Power App would be another good one. So there are many ways to connect data through that API interface.

Another way to connect data is a webhook. Maybe the third-party app has a way for you to subscribe to a feed as a consumer: you establish a web connection, and any messages coming in on that webhook you then feed to Sentinel as you listen. A rough sketch of that listener pattern follows below.
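Here's a minimal sketch of that webhook-listener idea, not something shown in the video: a tiny Flask app that accepts pushed events and buffers them for forwarding. Flask, the /hook path, and the payload handling are all my assumptions for illustration.

```python
# Hypothetical webhook receiver: the vendor pushes events to /hook, we buffer
# them, and a separate job would forward batches to the Log Analytics
# ingestion API (shown later in this episode). Assumes `pip install flask`.
from flask import Flask, request

app = Flask(__name__)
event_buffer = []

@app.route("/hook", methods=["POST"])
def hook():
    event = request.get_json(force=True)   # vendor-specific payload shape
    event_buffer.append(event)             # queue it for a forwarding job
    return "", 202                         # acknowledge receipt quickly

if __name__ == "__main__":
    app.run(port=8080)
```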
So there are many different ways to connect the data, and the data sources can be any number of sources too, because a connector just connects the data; you still have to identify the sources. For syslog and event logs the source would be a Windows server or a Linux server, if you're using an agent. With an API, the source could be anything you want to collect from. It could be a server; it could be an Event Hub, where the connector subscribes to the Event Hub as a consumer, ingests that feed, and passes it over to Sentinel. It could also be a storage bucket: an S3 bucket or Blob Storage where you store the data as CSV, JSON, whatever you want, and the goal is to ingest those logs from that source file. Whether the data has been normalized or not, we can ingest it and have it populate in Sentinel. If it's already normalized, it's of course a better experience when you build queries; if it's not, there are a lot of methods and functions you can use to normalize and structure that data, then save the query as a function or alias so you can reuse it to join other tables, correlate the data somewhere else, or simply retrieve the query you already built.

Now that you know that, let's go into some flows. I'm not going to go into syslog and CEF logs again; I'll show you the other flows and how you'd get them into Sentinel. Let's start with something very simple: storing the data in a bucket. Say you store the data in an S3 bucket or Blob Storage. That's just storing the data, not moving it, so you need a connector on one end to either pull the data from the storage or push it to Azure Sentinel. Either way you're going to need credentials, both to pull the data and to push it out; you need the authorization to perform that task.

The connectors we provide are pre-built: when you deploy one, you populate the parameters, the workspace ID, the Sentinel workspace, whatever that deployment script requires, and it spins up the connector and hopefully connects the services together. You might also need an API key and things like that. Again, the connector is what connects the data and makes this easy.

So let's go into Sentinel and look at what's available. If I click on Data connectors on the left side, here are all the connectors you have access to. Let's pick a random one: Akamai Security Events. Akamai provides DDoS mitigation and cloud scrubbing services, so you may want to ingest that data into Sentinel for alerts and to correlate it with other tables. This connector says it needs a Linux syslog agent configuration, so I'm assuming it wants you to build a syslog collector, which you already know how to do, and then send your Akamai security events to that collector. You'll probably have to poke a hole in your firewall rules and whitelist Akamai, so only Akamai can send syslog, on either port 514 or a special port like 1514 that you map down to 514 on the collector. When the events come in, the syslog collector sees them and sends them up to Azure Sentinel.

The docs say to follow these steps to configure a CEF collector and forward the syslog messages, so on your SIEM side you provision the CEF sample connector, point the product at the collector, and work through the documentation. Akamai sends its events over the internet, so the traffic hits your firewall first; that's why you open 514 or map something like 1514 to 514 on the collector. If you want to take it further, you can encrypt the session: it's UDP by default, so enable listening on TCP and add TLS certificates so that only Akamai can send logs to it. I don't actually know whether Akamai supports TLS here, and you only want TLS if the sender supports it, but you can run UDP on 514 and TCP with TLS side by side and use either one.

One thing to keep in mind: if the data comes in as plain syslog you'll have to do a bit more parsing, and it lands in the Syslog table; if it comes in as CEF, it shows up in a different table, CommonSecurityLog. Syslog and CEF land in two different tables.
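To see that split from code, here's a hedged sketch using the azure-monitor-query package (my choice of tooling, not shown in the video) that samples a few rows from each of the two tables; the workspace ID is a placeholder.

```python
# Sketch: confirm where collector traffic lands. Assumes
# `pip install azure-monitor-query azure-identity` and reader rights
# on the workspace.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
for table_name in ("Syslog", "CommonSecurityLog"):   # plain syslog vs CEF
    result = client.query_workspace(
        "<workspace-id>", f"{table_name} | take 5", timespan=timedelta(hours=24)
    )
    for table in result.tables:
        print(table_name, len(table.rows), "rows sampled")
```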
That was Akamai. Now let's look at a Microsoft service. We've already pre-built those data connectors, so you don't have to do much: you go to the open connector page, check what you want, and apply the changes or click Connect. You just need the necessary privileges to enable the connector, both in Sentinel and in the product you're connecting, so privileges are very important here. With any Microsoft product there's usually a data connector, and on some of them you can also check a box to create incidents, so when an incident is created by the other product, Sentinel ingests it and creates it here too.

Let me scroll down and find another connector: Proofpoint. Proofpoint is pretty popular with my customers, so say you want Proofpoint On Demand security events. What does that connector look like? Is it syslog? No, it's an API/websocket connector, and to deploy it you use a function app. It actually deploys an app; you populate the keys, and it connects to Proofpoint and ingests the data into Sentinel for you. There's a deploy button that brings you to a custom ARM template where you populate the variables you need: the resource group; the name of the function app (that's the name you'll search for within Azure); the workspace ID of your Sentinel environment; the workspace key; and the Proofpoint cluster ID and token, which I don't have, so you'd have to supply those yourself. The workspace ID and key you get back in Azure Sentinel: go into Settings, then Workspace settings, click Agent management, and they're right there. Populate the parameters, spin it up, and hopefully the services connect and your data starts ingesting.

Once the data is ingesting, go back to the Proofpoint connector page and the next steps tab: we already provide a workbook and Proofpoint sample queries, so you have the ability to query your logs right away. As you can see, a custom connector that goes through an API will often store the data in a custom table you can simply query; this one is called ProofpointPOD, so you just type that into KQL and build your query. So that's another way to connect data: through an API with a connector like this.

We have 98 connectors today and we're constantly building more. But with a connector like that Proofpoint API connector, I usually go in, look at the code, pull the snippets I want, and fashion my own connector, my own function app, out of it. If you don't want to wait for a connector to become available, crafting your own might not be too difficult if you know PowerShell or Python or something like that; the skeleton looks something like the sketch below.
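Those function-app connectors mostly boil down to this pattern: poll the vendor's API, then forward the JSON to the Log Analytics ingestion API. The endpoint, auth header, and "events" key below are hypothetical placeholders, and the signed POST they feed into is demonstrated later in this episode; the real Proofpoint app also handles paging, checkpoints, and retries.

```python
# Skeleton of a roll-your-own API connector. URL, token, and payload shape
# are placeholders for illustration, not a real vendor API.
import requests

API_URL = "https://api.example-vendor.com/v1/events"   # hypothetical
API_TOKEN = "<token>"

def fetch_events():
    resp = requests.get(
        API_URL, headers={"Authorization": f"Bearer {API_TOKEN}"}, timeout=30
    )
    resp.raise_for_status()
    return resp.json().get("events", [])

events = fetch_events()
# ...then POST `events` to Log Analytics with the signed request shown
# in the HTTP Data Collector section below.
```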
Moving down the list: the Cisco connectors are usually syslog or CEF; Common Event Format goes through syslog; DNS comes from your DNS servers (or DNS service) via the server agent; Forcepoint and F5 can be syslog; and some connect over an API. Another API connector would be Amazon: getting your AWS CloudTrail logs in here is important so you can query them. Open that connector page and the premise is similar, but a little different: you provide your Microsoft account ID and external ID, then inside AWS IAM you create a role and allow it read-only access to CloudTrail. You put the AWS role ARN here, add it, and the data gets connected.

How else can we get data into Sentinel? Let's talk about workbooks; that's another place where I can show you the data sources available to you. Go to Add workbook and click Edit; the reason I'm doing this is just to show you everything available in workbooks. Logs is right here and pretty straightforward: you have access to your logs and build a query the same as normal; once you have it, you can add something like summarize count() by SourceSystem (capital S) and render it as a pie chart. I don't have much data, so it's going to be 100%, but you get the gist.

Here's the cool thing: Azure Resource Graph. You can query the Resource Graph and type in whatever you want; here's an example, a Resources query filtered to virtual machines. Paste that query and you can literally find out what resources are available. Let's see what I have, which is none, because I'm pointed at a subscription with no resources; let me use another tenant. We'll add a workbook, go to Query, pick Azure Resource Graph, paste that query, run it, and there's a virtual machine that you can now query and use in your workbook. Does that make sense? Imagine querying all of this and getting a picture of all your machines, what the utilization is, correlating that with other tables; you can see where this is going. And it's a data source you get without having to ingest any logs.
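For reference, the same Resource Graph query can be run outside the workbook too. This is a sketch using the azure-mgmt-resourcegraph package (my assumption; `pip install azure-mgmt-resourcegraph azure-identity`), with the subscription ID as a placeholder.

```python
# Sketch: query Azure Resource Graph from Python instead of the workbook UI.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest

client = ResourceGraphClient(DefaultAzureCredential())
request = QueryRequest(
    subscriptions=["<subscription-id>"],
    query="Resources | where type =~ 'microsoft.compute/virtualmachines' "
          "| project name, location",
)
result = client.resources(request)
print(result.total_records)
print(result.data)   # rows come back as a list of dicts
```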
Another data source you have available is the Azure Resource Manager provider. It requires a certain path, but the cool thing is that if you provide the subscription and details, you can query your features, management groups, subscriptions, resource groups, tags, and policies. You can see which resources have certain billing or policies set up, which policies you still need to configure, and which subscriptions aren't adhering to policy. From a security perspective you get a much better grasp of all your policies just by leveraging this, and it's all free to query.

Azure Data Explorer is in here too. You can define a cluster name, so type in "help", which is a public data set you can query, the Samples database, and the table, which is Covid19, usually what I query; limit 10 and there's the Covid19 table. Then you can summarize count() by country, and sort by count. Does that make sense? Data Explorer lets you query those data points right through the workbook.

JSON: you can provide whatever static JSON data point you want and it shows up here, so if you have your own JSON elements you want to use in the workbook, you can literally populate a JSON table. Type something like name Bob, run it, and you've got Bob. If you need another one, it wants an array, which is square brackets, then comma-separated objects: name Bob, name Sam, then more data points, an Age of 30 on one, an Age of 20 on the other, with a comma after each key/value pair and each object; if you know JSON, you know the drill. And there we go, Bob and Sam, and now we can visualize that query as a pie chart.

Merge just merges two tables together; it's like a join. Parameters let you define a parameter and then query against it, exactly as it says. And then there's the custom endpoint. What's a custom endpoint? You can literally query an API and have it populate in the workbook. One caveat is CORS: if the service has that protection where it only allows certain web origins to access it, you might have trouble getting in, but for the most part you just set the method (GET, POST, whatever), identify the URL, and pull the data set.

Matter of fact, let's go find a public API data point we can query. This URL-scanning anti-malware one needs an API key; the anime one doesn't, so let's do Naruto. Copy the API URL, go over to the custom endpoint, paste the URL, and run the query. If you hit an error, check the CORS setting: turn off the standard correlation header, since it's a public API and we don't need it, and there's the data point. (You always want to watch that setting with a private API; with a public one we don't really care.) Now, this raw payload isn't usable data yet, so go to the Result Settings tab and set the JSON path. Looking at the response, you see the hash of the request, but the rows we want are under results, so $.results is the path I'm thinking of, and yes, that gives us the results we want. Change it over to grid mode so we can see what's coming in: the image URL, the URL, the title, airing, synopsis; there's the data. Now that we've mapped the data, it can be used. And if you want the image to be clickable, go to Column Settings, pick the image URL column, change the column rendering to Link, set the link to open the URL, give it a link label, and save. There we go: a link on every single row. That literally was a JSON payload we brought in and can now query.
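Outside the workbook, the custom endpoint step is just an HTTP GET plus a JSON path. Here's a sketch using the public Jikan anime API as a stand-in for the one in the video (the exact service and URL are my assumption); on Jikan v4 the path is $.data, where older APIs often use $.results, so adjust to whatever your API returns.

```python
# Sketch of what the workbook's custom endpoint does: GET a public API,
# then pick rows out with a JSON path equivalent.
import requests

resp = requests.get(
    "https://api.jikan.moe/v4/anime", params={"q": "naruto"}, timeout=30
)
resp.raise_for_status()
for item in resp.json().get("data", []):    # workbook JSON path: $.data
    print(item.get("title"), "-", item.get("url"))
```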
So that's a custom endpoint. It could be a private API too; you just have to make sure you post the right API key and headers, which I'm not going to go into, but that gives you the idea. There are also custom resource providers: you can build a custom resource provider and then pull data from it and query it here as well. And look at Azure Health, another query source: you can look at the health of your nodes and devices. Pretty cool, right? Workload health and change analysis are all available to query within this workbook too.

Another thing I want to mention: querying Azure Data Explorer from within Azure Sentinel. It took me a while to figure out, so here's the tip. In the Logs query you type adx and then the URL: help.kusto.windows.net/Samples, then the table name. So it's cluster, then database, then table. And if you're thinking about privileges and privileged access, you can scope access by cluster, database, and table as well, so you can say these people have access to this and those people have access to that. Then you type Covid19 and the limit like we did before, and there's the data, same as before. So if you've got your data stored over in Azure Data Explorer, in another repository or storage area, you can query it from here.
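The same cross-service adx() query can, in principle, be run programmatically. This is a hedged sketch with azure-monitor-query; I'm assuming cross-service queries are permitted from your workspace through the API, and Covid19 is the sample table named in the video (StormEvents is another well-known sample on the help cluster if that one is gone).

```python
# Sketch: cross-service query from a Log Analytics workspace into ADX.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

query = "adx('help.kusto.windows.net/Samples').Covid19 | take 10"

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace(
    "<workspace-id>", query, timespan=timedelta(days=1)
)
for table in result.tables:
    for row in table.rows:
        print(row)
```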
You can also query external tables here. An external table could be a JSON data point that you pull in, but you're going to have to map the columns: it doesn't know the schema, so it doesn't know how to populate the query, and you may have to provide the schema to tell it which columns to fill. So look up external tables and externaldata; you can query those as well.

Other ways to ingest data: let's talk about a PowerShell script and a Python script; I'll show you that too. Search for "azure monitor http data collector api"; Azure Monitor is what sits over Log Analytics, and a lot of the ready-made code lives under that name. Here we go: Azure Monitor HTTP Data Collector API. This is us connecting to the Log Analytics API and feeding data into a custom log table: you make an HTTP request, provide the data, and it's stored in the Log Analytics workspace, where Azure Sentinel then does all the mapping, correlation, automation, integration, and all that jazz.

If you look at the docs, you need authorization to feed logs into Sentinel or Log Analytics: the request has to be signed with the workspace's shared key, not just an API key, so keep that in mind. For the request body, you populate your key/value pairs and separate them with commas: name and value, age and value, and so on, building one object per row. The objects sit inside an array, each row item separated by a comma, so one row looks like the next with a comma between. You're feeding JSON to the API.
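In Python terms, that request body is just a list of dicts serialized to JSON, which may be easier to see than the prose:

```python
# The request body is a JSON array of objects: one object per row,
# commas between key/value pairs and between rows.
import json

records = [
    {"Name": "Bob", "Age": 42},    # one row item
    {"Name": "Sam", "Age": 30},    # another row, comma-separated in the array
]
body = json.dumps(records)         # this string becomes the POST body
print(body)
```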
There's a pre-provided script available for you to adapt, so let's go ahead and build it; if you want to follow along, go ahead, and if not, just skip along. Paste the sample in. We need a customer ID: that's the workspace ID, so go to workspace settings in Log Analytics, grab it, and paste it in. Then we need the shared key, the key that authorizes uploading the data; copy that into the PowerShell script as well. You with me? We have the workspace ID and the shared key; now we need data, and they've already provided sample JSON, which means those values will populate exactly the way they appear.

Let's not mess with the values right now, but I'll tell you one thing: the record type. Let's change the name to TeachingDemo so you'll be able to find it; it's not going to show up under the sample's name, and when we run it you'll see why. The timestamp field is optional; you can use it to mark the start of the submission, when you actually submitted the data versus when it was ingested, so you can see the timeline and how big that ingestion gap is for Log Analytics. Populate it if you want.

The Build-Signature function is already written for you, so click the collapse button and ignore it: it builds that shared-key signature. Post-LogAnalyticsData is pre-built as well: it builds the signature, the method, the URI string, and the header, and then this bad boy makes the actual call. After everything is set up, it calls the URI with the POST method, the application/json content type, the header carrying the signed key, and the body, which is the JSON you built up top. That's what posts the data to the workspace. And the last line of the script? You might think that's doing the work; you're right that it's the trigger: it's calling the Post-LogAnalyticsData function and providing the parameters, or arguments: the customer ID, the shared key, the body (your JSON), and the log type, which we named TeachingDemo. So literally, if you want to post your own data, you need the signature function and the post function, then you can post whatever you want; stick the script in a scheduled job and boom, it's going. That's it.

So let's post this data. Run it. Done, it's posted. Don't believe me? Let's go look... and it doesn't show up. Let's give it a bit... oh, what do we have here? It didn't show up in the list, but it is available. If you have no data yet, it takes a bit of time because it's building the table; the log is already there, just waiting in a queue in a pending state. Once the table is done being built, it ingests the log and gets a lot faster, but the table has to be built first because it's never been built before. Alright, it took a few minutes to build that table and start ingesting, but now the logs are coming in: TimeGenerated came in, the string values came in, and there are the custom columns that got built out.
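The video walks through the PowerShell version; here's the same flow as a hedged Python 3 sketch following the documented HTTP Data Collector API. The TeachingDemo log type mirrors the demo, and the workspace ID and key are placeholders for the values from Agent management.

```python
# Python 3 sketch of the HTTP Data Collector API: build an HMAC-SHA256
# signature from the workspace shared key, then POST the JSON body.
import base64
import datetime
import hashlib
import hmac
import json
import requests

WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<primary-key>"
LOG_TYPE = "TeachingDemo"            # surfaces as a custom table in the workspace

def build_signature(date: str, content_length: int) -> str:
    # The string-to-hash format is fixed by the API contract.
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(SHARED_KEY),
        string_to_hash.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_data(body: str) -> None:
    rfc1123date = datetime.datetime.utcnow().strftime(
        "%a, %d %b %Y %H:%M:%S GMT"
    )
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123date,
    }
    uri = (
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01"
    )
    resp = requests.post(uri, data=body, headers=headers)
    print(resp.status_code)          # 200 means the batch was accepted

post_data(json.dumps([{"Name": "Bob", "Age": 42}]))
```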
What do we want to do now? Let's add our own custom columns to the same data set. Don't make another log type, because you don't want too many custom logs showing up; just use the same one. Keep or delete the sample element, then add a string value, name Bob, an Age of 42, Married true, a date value, and a field we'll just call ID. Copy and paste the object and call this person Sam; whatever you want, the goal is just to populate data. Submit it a few times. We're sending custom logs to the collector, and they'll show up in the same table, but these new entries can have different columns; one table can carry many different columns, and you're sending the data already normalized and structured.

While we wait for that to load, go back to that documentation: if you like C#, there's C# code that does the same thing. There's also a Python example; it's Python 2, so you'll have to change it to Python 3 if you want that format, but it's the same premise: build the JSON, build the signature, and post the data via Python. If you like Python, you could put it in a cron job or scheduled job and send data directly through a Python script instead of routing through syslog or CEF, injecting it as a custom log. And there's JavaScript as well, so many different ways to ingest that data.

Another way to ingest data is Azure Data Explorer; see, there are so many ways to collect data. If you want to store data longer but still have it available through Log Analytics, Azure Data Explorer might be a better alternative: it's used by the AI and data analytics folks to query large data, petabytes of data, and the pricing model might be to your advantage. Really it's just another sort of data cluster where you store your logs. If you're thinking about putting logs there, I suggest normalizing them first, and don't send all your logs: if you have metrics or samplings, maybe keep just the daily sampling and filter the logs down to a smaller number, because you might not need every single entry if a daily average is good enough for you.

A lot of the stuff you've seen me query in my teaching content is stored in an Azure Data Explorer cluster, so let me show you what that looks like. Go to the aka.ms KQL Data Explorer link; again, this is a free area where you can mess around with the sample data. And if you add my cluster, teaching.westus.kusto.windows.net, you have access to my sample data, like the MITRE catalog, which you can query there. You don't actually have to add it as a cluster, either: you can also write cluster('teaching.westus.kusto.windows.net').database('SecurityData') and then the MitreCatalog table. Both ways work, and there are other variations too; table references, dot notation, whatever method you like, you can query it.
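The same query works from code with the azure-kusto-data package (my assumption; `pip install azure-kusto-data`), here with Azure CLI authentication. The cluster, database, and table names are the ones spoken in the video and may differ in reality.

```python
# Sketch: query an ADX cluster directly from Python.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://teaching.westus.kusto.windows.net"
)
client = KustoClient(kcsb)
response = client.execute("SecurityData", "MitreCatalog | take 10")
for row in response.primary_results[0]:
    print(row)
```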
Now that you know how to query the data, how do you add your own data here, and what kind of data can you add? Again, you can keep the data in a storage repository and just pull from that, but if you want it stored in this cluster, indexed and normalized, then you have to ingest it here. You'll need to build your own Data Explorer cluster (type "data explorer" in the portal), which gives you your cluster URL string. Then you add data: choose Ingest new data, pick the database (if you want a new database, go build it in the cluster you stood up; you can build more databases and set privileges on them), pick an existing table or create a new one, and pick the source type. It can be a file, a blob container, ADLS (Azure Data Lake Storage), or maybe an Event Hub, in which case it basically subscribes to that Event Hub feed and ingests it right into Data Explorer.

We'll do a file so you can see what file types it supports: you can add up to 10 items, up to 1 GB each compressed. It's a lot of formats: Apache Avro, CSV, JSON, multi-line JSON, Parquet, PSV, raw, TSV, TXT... and if it's zipped up, that's supported too; it has to unzip and parse it, which may take a little longer. The goal is that the static data, or data you've brought from somewhere else that you want to use as a data point, goes here, and then you set the schema of that data point, mapping out which column is which.
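Besides the portal wizard, ingestion can be scripted. Here's a sketch with the azure-kusto-ingest package (my assumption; `pip install azure-kusto-ingest`); note that queued ingestion goes to an endpoint with an "ingest-" prefix, and the cluster, database, table, and file names are placeholders.

```python
# Sketch: queued ingestion of a local CSV into your own ADX cluster.
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://ingest-<yourcluster>.<region>.kusto.windows.net"
)
client = QueuedIngestClient(kcsb)
props = IngestionProperties(
    database="<database>", table="<table>", data_format=DataFormat.CSV
)
client.ingest_from_file("events.csv", ingestion_properties=props)
print("queued for ingestion")   # queued ingestion completes asynchronously
```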
Then you go back into Azure Sentinel and query it the same way as before: adx, then your cluster URL, a slash, and the database; with the help cluster that's help.kusto.windows.net/Samples, then Covid19, limit 10. Yours is going to be different; your string is your own Kusto cluster string. Does that make sense? And if you want to go the other direction and query Log Analytics data from Azure Data Explorer, you'll have to figure out the workspace path; there's documentation around that.

Now let me show you Logic Apps, because I want to show you how easy it is if you don't know programming and PowerShell is kind of scary for you. We're going to go from scratch and log right into the workspace. Open up Logic Apps (if you don't know where it is, just type "logic apps"), click Add, and pick Consumption. Logic Apps charges on a consumption-based model, so don't worry: if you don't run it, it won't cost anything. Pick a resource group (I'll use a test one), name the thing, hit Review + create, and once it's done validating your parameters, click Create and it builds the logic app. A logic app is GUI-based programming; like I said, that's the simplest way I can put it.

Go to the resource. Let me go back to the templates real quick: there are many trigger points that can kick off a logic app: when a message lands on the service bus, when you get an email, when a file is stored, when something hits an Event Hub, when you get a tweet. You can even make a logic app an API endpoint, send it JSON data, and have that flow into Sentinel, doing all the complexity in the app and exposing your own easy API. We're going to do Recurrence, the easiest one, every three minutes.

And now let's think: where do we log? We're not going to log to Sentinel directly; we log to Log Analytics, so type "log analytics" and pick the right connector. There are two of them: one is for querying data, but we want the Data Collector connector because we want to send data. You'll need a connection name; that's fine, create one, we'll call it "log analytics connector". You remember how to get your workspace ID and key: go to Agent management and go get that key. Put the ID here, paste the key there, and now your connection is established. How easy was that?

For the JSON request body, let's cheat: go into our earlier script, copy that JSON body, and paste it in. For the custom log name, we don't want to put it in another table, we want the same table, but we need a way to identify these records, so set the name to "logic app", a way to see that it's flowing through the logic app. You can also add the time-generated field here; we're not going to. Maybe you want to add another step, like an Outlook action that emails you when the upload is done; you'd just click on Outlook and add Send email right after. The flow runs top-down, each step one by one, and if one fails, it stops. But we're not going to add the Outlook connector; we're going to keep it
simple. Save it, and then run it. Wait here, do not click around, don't be fast; watch what happens. You get check marks: it ran through the three-minute recurrence interval, then Send Data. Here's the input, the JSON body I provided, and here's the raw output: it made the API call and got a 200 response back. We just logged to Log Analytics, to Sentinel, with a logic app, and it shows up where you can query it.

Alright, so what did you think? How was it? Hopefully you now know various methods to ingest logs into Sentinel. We labbed a few, so have fun: your homework is to ingest some logs into Sentinel, make some queries, and make a workbook. Feel free to like, subscribe, and comment below on anything else you want me to talk about or any questions you have, and click the bell so you get notified. Thank you for watching, and stay safe.
Info
Channel: TeachJing
Views: 2,244
Keywords: Teaching, TeachJing, syslog server, Azure Sentinel Syslog, Azure Sentinel Tutorial, Azure Sentinel Lab, microsoft security, Azure Sentinel, azure sentinel setup, azure sentinel syslog collector, azure sentinel syslog, azure sentinel automated playbook, azure sentinel playbook, logic apps, azure sentinel, Azure Sentinel Usage, Azure Data Explorer, Powershell Sentinel, Python Sentinel, Azure Sentinel Workbooks, Logic Apps Sentinel, Proofpoint Sentinel, Akamai Sentinel
Id: z9xPUblfQjs
Length: 57min 45sec (3465 seconds)
Published: Thu Mar 18 2021