The AI Data Protection Platform 2024 | Empowering data protection with a purpose-built DSPM

Video Statistics and Information

Captions
Hello everyone, my name is Ofer and I'm the Director of Product Management for DSPM. In today's session we're going to talk about data protection for the public cloud. We'll start by discussing the challenges of the public cloud — its agility and its complexity — and the challenges data security teams face when they try to figure out what type of information they have there and how to protect it. Then we'll look at the Zscaler approach and how we decided to secure that data. Finally, to help focus the discussion, we ran a survey within our customer base and picked the three topics that came up at the top of the agenda for our customers with regard to DSPM. We'll walk through those and show a demo in DSPM of how these use cases are resolved and how we help those teams.

First, data protection for the public cloud: why do we care about the data stored in the cloud? Let's start with the volume of data. Looking at all of the companies that are either in the middle of a digital transformation or have already completed one, the prediction is that within the next year close to 200 billion terabytes of data will be stored in the public cloud. That is a huge amount of data, and with that volume it is no surprise that the number one type of data breach reported last year was an attack on cloud data. Each of these breaches averages about $4.5 million, and that's not just the direct cost or liability — it's also the loss of public opinion and the trust of your customers. This makes data security for the public cloud one of the most important areas for data security teams in 2024.

Data applications and services touch every aspect of the enterprise today. With cloud services ranging from SaaS to full IaaS, we're seeing more and more adoption everywhere. The public cloud is complex and extremely agile, which means the customer's data could be pretty much anywhere, and the applications and services they subscribe to are provisioned by developers and business owners without necessarily going through the data security organization. That leaves those teams struggling with even the simplest questions: what types of data do we have in the public cloud, where is that data stored, who has access to it, and what are the risks associated with it? In the agile world of the public cloud, even these basic questions are hard to answer.

Over time the industry has moved from single-point solutions that were first focused on premises and on channels like web DLP and email DLP, and evolved toward the public cloud: CSPM and CNAPP solutions, SSPM solutions that look at SaaS applications, CASB that covers some of the aspects, and many more. The challenge for customers with all of these single-point solutions or small frameworks is that most were built as a dedicated solution and then gradually expanded. That brings two challenges to the data security team. Number one, each single-point solution gives you only one aspect of the data — you only see what it is interested in.
If you're trying to get the overall insight and understand the context of a specific issue within your enterprise, you won't see it — you'll see it only within the context of your email, or your web traffic, or your CASB. Understanding data lineage within the organization, or understanding the importance of those files, is not something these frameworks can provide. The other aspect is that DLP is complicated. It's not just that the public cloud is complex; defining and classifying your data takes time and a lot of effort before you reach the point where you trust the system and the information it delivers. Having to do that again and again across multiple products, each speaking a different data classification language, and maintaining it over time, is a huge challenge for data security teams — and not something any of the customers we talked to was happy or willing to go through again.

So what is the Zscaler approach to this challenge? At Zscaler we have the advantage of running billions and billions of web transactions through our system on a daily basis. Through that we evolved our DLP solution so that it can train and use AI and machine learning algorithms to improve accuracy over a volume that almost no other company in the world can match. Data classification inside Zscaler is done either by automatic, AI-based algorithms or through more advanced classification methods like EDM, IDM, and your own custom DLP engines — methods that let you not only use our predefined capabilities but also tune them, improve them, and adjust them to your needs as an organization. Zscaler then made sure these capabilities are used all across our portfolio: you define your DLP once — you define what you care about and what you would like to classify — and we will look for it and provide the same accuracy, actions, and protection across all of your channels: web, email, endpoint, mobile, CASB, and the latest addition, securing the data in your public cloud as well.
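As a rough illustration of the kind of logic a custom classification engine pairs with a pattern match — this is a toy sketch, not Zscaler's EDM/IDM implementation, and the function names are invented — a credit-card detector typically combines a digit pattern with a checksum validation:

```python
import re

# Candidate card numbers: 13-19 digits, optionally separated by spaces or dashes.
PAN_PATTERN = re.compile(r"\b(?:\d[ -]?){13,19}\b")

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def find_credit_cards(text: str) -> list[str]:
    """Return candidate credit-card numbers that pass Luhn validation."""
    hits = []
    for match in PAN_PATTERN.finditer(text):
        raw = re.sub(r"[ -]", "", match.group())
        if luhn_valid(raw):
            hits.append(raw)
    return hits

# Example: two candidates, only the Luhn-valid test number survives validation.
print(find_credit_cards("order 4111 1111 1111 1111 ref 1234-5678-9012-3456"))
```

Production engines layer proximity keywords, confidence scoring, and exact- or indexed-data matching on top of this kind of basic check, which is why tuning them per organization matters.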
will scan your clouds and locate uh the uh data storing services that are out there you don't need to point us and tell us I want you to scan this S3 ET I want you to scan uh these two databases you could do that but first we will go and we will scan everything and figure out what data stories do you have databases uh volumes uh virtual machines uh uh storage accounts we go and we scan and we find those Services next we will go and we will analyze two things number one the data on these Services understand whether it is sensitive data uh non-sensitive data or types of information is it PCI or whatever and do you care about this type of information yes or no the the second thing that we will do is we will go and we will look at those services from the uh misconfiguration or uh exposure perspective uh and figure out all of these uh bad configurations all of these mistakes or all of these uh excessive permissions that were granted so that we can calculate that and understand understand whether that specific data store is at some uh risk whether it it's a data loss risk or a data theft risk or just uh exposure of your data we're looking into all of these last once we have all of that information we will go uh and we will check uh that information against our set of insights uh and predefined uh policies to give you though that understanding of what is the risk for your data uh to go over everything that you have prioritize those risks for you uh and present to you all of the information uh that you need and we'll we'll look at that as part of the uh demo in a few minutes now more importantly than uh what we do is also how we do that one of the biggest challenges uh that our customers have voiced is the fact that well we are scanning the most sensitive data which is out there in the public Cloud taking that information out of their Cloud uh or uh out of their accounts in order to scan that uh is a big challenge both from a security perspective uh because this is well it goes against any uh Regulatory Compliance guideline but but also it uh introduces latency and extra costs uh into the system so our first line of business was Let's scan that data in inside the customer account account and avoid those issues what we do is once you onboard uh Z scaler uh we provide you with the template for the dspm scanner uh you deploy that inside your account uh it does not uh work as an agent meaning it does not connect to your virtual machines uh it does not impact the performance of the VMS or the databases or anything else it runs as a standalone Service uh and then the the entire thing is completely controlled uh by you as a customer uh everything that you've configured whatever you asked us to uh scan monitor or do uh goes uh into that uh scanner from the data or the dspm portal all of the configurations go here uh and then that scanner goes it reads the files it reads the database tables and it does uh local scanning and classification within the customer account and then last only the metadata of the findings go back outside of your uh account and into the uh dspn portal for reporting meaning that if we've detected a file with uh certain credit card numbers for example or financial information we will notify the user that we found this file in this path uh and we detected p C and we detected credit card numbers and there are four of these uh but we will not be uh actually moving the row information outside the customer account which is super important so once we've established what is DPM and how we 
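To make the "metadata only" point concrete, here is a sketch of the kind of finding record such a scanner might send back. The field names are invented for illustration and are not the actual DSPM report schema; the point is that the record carries a path, a classification, and a match count, but never the matched values themselves:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class Finding:
    """Metadata-only result of a local classification pass (illustrative fields)."""
    account_id: str       # cloud account the scanner ran in
    datastore: str        # e.g. an S3 bucket or database identifier
    object_path: str      # where the sensitive file or table was found
    classification: str   # engine that fired, e.g. "PCI / credit card"
    match_count: int      # how many values matched -- the values stay in the account

# What leaves the account: counts and locations only, no raw credit card numbers.
finding = Finding(
    account_id="123456789012",
    datastore="s3://finance-exports",
    object_path="reports/2024/q1-invoices.csv",
    classification="PCI / credit card",
    match_count=4,
)
print(json.dumps(asdict(finding), indent=2))
```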
Once we've established what DSPM is and how we operate, the first business flow we'd like to walk through is: I've deployed DSPM — what is my data situation? Where is my data, what types of data do I have, what services is it running on, and in what geography? That's the flow users typically start with. The starting point is our auto-discovery and auto-classification: out of the box we come with dozens of predefined engines, document types, and AI-based classifications, and we run those over your data. We won't necessarily alert based on them, because you haven't configured that yet, but we will scan through everything. Once we've done that, you can go into our dashboard and look at the data discovery side of things to understand which types of data reside in your cloud and what the data types are. If you'd like to learn more, you can click through and see what types of records we found and where. Did we find these medical records everywhere? No — we've seen them in storage buckets and on virtual machines, but we did not find any medical data stored in your databases, which is important to know. We can also show you the data region view, giving you clarity into where your data resides. If you're a European company and you want to make sure all of your sensitive data stays within the European Union, this is one view where you can detect those anomalies — and you can of course also set up a policy and get that as a notification.

The next step, once I've understood the breakdown of the data, is to drill down into specific data stores and understand what's on them. Here's an example of an EC2 instance, a virtual machine. The first thing you see is a visualization — a simplification of the posture and the information we know about that virtual machine. We can see the EC2 instance in the middle, that it is publicly exposed to the internet, that it has two volumes with these types of data detected on them, and that there is one user and 29 services that can access the virtual machine. There are some vulnerabilities, and if I scroll down I can see those breakdowns and learn more; there is a lot more information here. I can also see which sensitive data is stored there, and if I'm interested in a deeper level than this breakdown by engine, I can click on the sensitive data and get the complete picture: which files were detected, in which path, which engines fired, what the classification for each file is, and so on. You get the complete, granular view of everything we found inside your cloud.

Now you understand where your data is and what types of services it runs on, and the next question is: what is the risk to my data? What are the challenges, what is the risk of losing that data or of somebody hacking in and stealing my information?
Your entry point for this is similar: you go to our dashboard, and this time you focus on the risk side of things. There are multiple widgets showing different ways of slicing and dicing the information; we'll focus on the top risky data stores shown here. Here we highlight the top issues you need to take care of — if you have just two hours today and you want to fix something and reduce the risk to your organization, go here, take the top item on the list, and handle it. So why did we decide this one is critical and put it at the top of the list? If you click through, you'll see that we look at two aspects when determining the risk. The first is the likelihood of something bad happening: if my data is stored there and it is publicly exposed, or external users have access to it, or risky credentials are in use, or it is potentially not backed up — whatever the reason — all of these make it more likely that something will happen to my data. Then we look at the potential impact if that actually happens: if somebody takes advantage of these posture issues, what would I lose? Here we look at the sensitivity of the data and the volume of data that resides on the data store. We take those two pieces of information, the likelihood and the impact, push them into our risk algorithm, and that is what lets us prioritize these items above other issues you may have. Now, if you want to reduce the risk, you need to go and handle these four alerts.
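Before looking at an alert, here is a toy version of the likelihood-and-impact idea just described. The weights, factor names, and the simple product used to combine the two scores are assumptions made for illustration, not Zscaler's actual risk algorithm:

```python
def likelihood(publicly_exposed: bool, external_access: bool,
               risky_credentials: bool, no_backup: bool) -> float:
    """Toy likelihood score in [0, 1]: each posture issue raises the odds."""
    score = 0.1  # small baseline even for a well-configured store
    score += 0.3 if publicly_exposed else 0.0
    score += 0.2 if external_access else 0.0
    score += 0.2 if risky_credentials else 0.0
    score += 0.2 if no_backup else 0.0
    return min(score, 1.0)

def impact(sensitivity: float, volume_gb: float) -> float:
    """Toy impact score: data sensitivity (0-1) scaled by how much data is there."""
    volume_factor = min(volume_gb / 1000.0, 1.0)   # saturate at 1 TB
    return sensitivity * volume_factor

def risk_score(lik: float, imp: float) -> float:
    """Combine the two, e.g. as a simple product on a 0-100 scale."""
    return round(lik * imp * 100, 1)

# Example: a publicly exposed, unbacked-up store holding 500 GB of sensitive data.
lik = likelihood(publicly_exposed=True, external_access=False,
                 risky_credentials=False, no_backup=True)
imp = impact(sensitivity=0.9, volume_gb=500)
print(risk_score(lik, imp))   # this store outranks low-impact, low-likelihood ones
```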
Let's take a look at what an alert looks like inside DSPM. When you investigate an alert, you again see a visualization of what is going on — what the alert is all about. I'm going to pause for a second and explain why we invested so much in making everything look simple. The reason is that our tool has multiple users, and those data protection or infosec teams are not always cloud-savvy. They know the cloud, but not always to the level of the cloud architect they will need to talk to later in order to fix and handle the issues. To enable that conversation between two teams — one side more cloud-savvy and less security-oriented, the other very security-knowledgeable but less cloud-knowledgeable — we present all of the information in a simple form while still using cloud terminology, and that is what allows the conversation.

This alert is about an S3 bucket, a storage service in the cloud. It has sensitive data, as we can see here, and it can be accessed by 10 EC2 instances that are publicly exposed and vulnerable. We can see all of that in the alert description and details, including the details for those EC2 instances if we want to know a little more, but let's investigate further. First, I'd like to understand which EC2 instances can access that bucket. By clicking through, I can browse those EC2 instances one by one, get their details, or get a summary of all of them. Once I look at a given EC2 instance, I see that it carries some vulnerabilities. By scrolling down I can look at those vulnerabilities, figure out which packages they reside in, whether or not a fix exists, and decide how I'd like to resolve them — by removing those packages, updating them, or whatever is appropriate.

Now that I understand the vulnerability state, remember that this EC2 instance is not just vulnerable, it is also publicly exposed, so let's look at that a little further. In the investigation view I can click on the public exposure path and get a visualization: this time my virtual machine is on the right, and here is the access path from the internet. I can see pretty easily that there is an internet gateway along the way and a load balancer — I can click on them and get all the details — but I'm focusing on this security group, where an exclamation mark highlights that there is an issue. Clicking on the security group shifts me into its context and shows the details we know about it, and if I scroll a little you'll see that the root cause of this entire public exposure is an inbound rule within the security group that is configured to allow remote access from the internet. I can click on the rule, look at it, and share all of that information with my cloud architect so they can remove that risk from my system.

At the same time, recall that this virtual machine had access to my S3 bucket — the lateral movement that allowed stealing that information. If I go through the path selection again and look at the access path, we again simplify the very complex permission model of the public cloud: we show that the EC2 instance is associated with this instance profile, through that role, which allows it to use these three policies, all of which grant access to the storage bucket with the sensitive information. Again, I can click through all of these, get more details, and talk to the peers who oversee the cloud so they can help me remove those issues. So in the alert I can see all of these paths, understand the details, and look at the sensitive data as we've seen before. When I decide I'd like to remediate, in most cases we also provide guidance on how to solve the issue and what you should be doing — both for console users and for code or CLI users — so they can immediately implement those best practices or guidelines, reduce the risk, and get the alert resolved.
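The exact console and CLI remediation steps aren't spelled out here, but as a hedged sketch of the kind of fix a cloud architect would apply for that inbound rule, the snippet below uses boto3 to find security-group ingress rules open to 0.0.0.0/0 and revoke just those ranges. The group ID is hypothetical, and in practice you would review the rules with the owning team before revoking anything:

```python
import boto3

ec2 = boto3.client("ec2")

def open_ingress_rules(group_id: str) -> list[dict]:
    """Return ingress permissions on the security group that allow 0.0.0.0/0."""
    group = ec2.describe_security_groups(GroupIds=[group_id])["SecurityGroups"][0]
    return [
        perm for perm in group["IpPermissions"]
        if any(r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", []))
    ]

def revoke_open_ingress(group_id: str) -> None:
    """Revoke only the 0.0.0.0/0 ranges flagged as the exposure root cause."""
    for perm in open_ingress_rules(group_id):
        revoke = {"IpProtocol": perm["IpProtocol"],
                  "IpRanges": [{"CidrIp": "0.0.0.0/0"}]}
        if "FromPort" in perm:               # absent when the protocol is "all traffic"
            revoke["FromPort"] = perm["FromPort"]
            revoke["ToPort"] = perm["ToPort"]
        ec2.revoke_security_group_ingress(GroupId=group_id, IpPermissions=[revoke])

# Hypothetical group ID taken from the alert details.
revoke_open_ingress("sg-0123456789abcdef0")
```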
Once we've completed that part — understanding the risk — we know where our data is and what risks it faces, and the next challenge is operationalizing all of that: how do I take this and bring it to resolution within the ecosystem I have in my organization? First and foremost, you get our list of predefined DSPM posture policies — data posture policies that are all correlated and all smart. We're not just looking at a single open port to raise a flag; we target interesting attack scenarios, either ones that are more popular and more likely to happen, or, at the extreme end, more complicated attack vectors. And it's not only the malicious hacker we care about — we're also focused on potential data loss from user mistakes. If somebody accidentally deletes extremely valuable information that was not backed up and does not have versioning, that data is lost to me, and I incur the cost regardless of whether it went to a hacker or I lost it on my own. We have predefined policies covering all of these, strongly tuned not to give you false alerts or a lot of noise, but to focus on the things that matter. Even so, some of our customers say, "I would like to build my own policy," or "I'd like to take your policy and add that extra nuance to tune it to my needs." With DSPM, users can write any policy in the same language we use; they can see the logic behind our policies, optimize them, and create the same policies in their own environment. Everything is fully customizable and fully transparent to everyone.

Once we've gotten through that, we have an alert in the system, and the next step is to get it to the cloud architects, the IT teams, or the developers who are going to resolve the issue. For that we have automatic integration with ITSM tools like ServiceNow and Jira, and with Slack — which is less of an ITSM tool but used a lot in development organizations that are more agile. You can set up a notification and we can open tickets automatically for those cases — and not just open them, but also update them: if we later detect that the issue was resolved, we can close the ticket, or add more details as we see updates or changes to the configuration. And of course we provide access directly from that ITSM system — there is an embedded deep link that takes you to DSPM and shows you all of the information.
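The ticket automation is configured in the product rather than scripted by hand; purely as a sketch of what "open a ticket automatically" amounts to, the snippet below creates a Jira issue for a finding through Jira's REST API. The site URL, credentials, and project key are placeholders, and this is not the actual DSPM connector:

```python
import requests

JIRA_URL = "https://example.atlassian.net"       # placeholder Jira Cloud site
AUTH = ("bot@example.com", "api-token")          # placeholder email + API token

def open_ticket(summary: str, description: str) -> str:
    """Create a Jira issue for a DSPM finding and return its issue key."""
    payload = {
        "fields": {
            "project": {"key": "SEC"},           # hypothetical security project
            "issuetype": {"name": "Task"},
            "summary": summary,
            "description": description,          # would carry the deep link back to the alert
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue",
                         json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]

print(open_ticket(
    "Publicly exposed S3 bucket with PCI data",
    "Security group inbound rule allows 0.0.0.0/0; see the DSPM alert for remediation steps.",
))
```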
The other side of operationalizing this is tying it into your SIEM or SOC system. There as well, we have developed a fully flexible export mechanism that lets you push the information out and use it in any common SIEM solution out there. Both the formats we use and the export methodology are standard and allow a direct connection from all of these systems. On top of our alerting, we also push the MITRE ATT&CK tactics and techniques and all of the related information, in case you would like to build your dashboards based on MITRE.

So we've talked about understanding your data, then understanding the risk to your data, then resolving it — these are the three main use cases we see with the system. To conclude, just a few takeaways from what we've covered in the last 30 minutes. The Zscaler approach to DSPM is a holistic one: we treat the public cloud as another layer that requires data protection, just like we protect data in email, on the web, at the endpoint, in your SaaS solutions, and in the CASB — data protection in the public cloud is simply another layer. To let you hit the ground running, we come out of the box with all of your classifications: everything you've used before in other Zscaler data protection solutions will be there, and if you're new to Zscaler data protection, we come out of the box with a set of classifications, both AI-based and custom, that let you classify the data. All you need to do is connect us to your environment: we will do the automatic classification, run the baseline for you, and provide you with those smart insights, sharing them in a way that lets your team communicate with other teams to get to resolution, track it, and use the complete set of business flows you have inside your enterprise. If you would like to learn more, please contact your sales rep or go directly to zscaler.com/dspm. Thank you very much for your time, and I hope this was useful.
Info
Channel: Zscaler Inc.
Views: 323
Keywords: security as a service, cloud security, zscaler, sase, secure access service edge, digital transformation, secure cloud transformation, zero trust security, zero trust exchange, zscaler private access, zscaler internet access, data protection
Id: riWPW5EpPJs
Length: 31min 41sec (1901 seconds)
Published: Wed May 15 2024