CSRF is Dead - Stephen Rees-Carter

Video Statistics and Information

Reddit Comments

Some click bait bullshit. Saving you all 30 minutes of your time.

👍︎︎ 18 👤︎︎ u/kisuka 📅︎︎ Jul 24 2020 🗫︎ replies

When do the ads end and the actual video begin?

👍︎︎ 8 👤︎︎ u/vomitHatSteve 📅︎︎ Jul 24 2020 🗫︎ replies
Captions
Do you want your online store to be ultra fast, fully customized to your needs, and scalable to more than one billion items? Aimeos is an ultra-fast PHP e-commerce framework. Aimeos: next-level e-commerce. Try it now.

If you're running an application at scale in public or private clouds, you have to tame almost boundless complexity. Datadog brings together all this observability data, with infrastructure metrics, traces, and logs, in one integrated platform.

Vonage is a cloud communication platform that lets you integrate voice, video, and messaging into your applications. We have a load of helper libraries, including a Laravel package and a client library for PHP. You can find out more about us at developer.nexmo.com.

It's finished. It's no more. You don't have to worry about it, you don't have to think about it, you don't have to develop for it. Right? Well... maybe. Let's have a look.

CSRF stands for cross-site request forgery, and it's the name given to an attack where you forge a request as if you're the user, through their browser, in order to perform some action. A good way to explain it is through this simple request. Say you've got a POST request updating the password on a user account, so mysite.com/account: you send the POST request through with the new password, the server accepts it, sets the password, updates the database, and everyone is happy.

But if there is no CSRF protection on this request, then as an attacker you can do this instead. First of all, and this is the important bit, you need to trick the user into visiting a site that you control, a site you can run JavaScript on, because if you can get JavaScript running in the user's browser, you can make the user's browser do what you want. So if you've got JavaScript running in the user's browser on anothersite.com, it can make a POST request to mysite.com/account and give it a new password, and if the site doesn't have CSRF protection, the server is going to accept that new password and change it, because the browser has helpfully added the cookies. When the POST request goes through to mysite.com, the browser says "oh, I know this site, I have cookies, here you go". It authenticates, lets you in, and the password is changed. Now the hacker has updated your password and can log in directly to your site, and the victim may not even notice this for days, weeks, months, or even years, because of the long life of cookies or the infrequency at which they use the site. That is a CSRF attack in a nutshell.

So how do we defend against it? The standard method, and the one you'd be most familiar with from working with Laravel, is a CSRF token. The way that works is: the server generates a token and attaches it to the response sent to the client's browser, in the form somewhere. When the form is submitted, the browser has to include the token that was generated and send it back to the server. The server receives the token and validates it against the token it generated, makes sure it matches or is valid in some way, and as long as the token matches, it knows the request is valid, that it has come from the server, to the browser, and back to the server, and it can update the password.

This works because of the security feature in browsers that prevents JavaScript on one site from reading the content of responses from a different site. You can make a POST request from anothersite.com to mysite.com; the request will get through, and you can send the password in the request, but the browser won't let the JavaScript read the response that comes back. That protects you, and it protects the token, because the token cannot be retrieved. If you can't use CSRF tokens, or you want to use a different method, you have other options.
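To make the token flow concrete, here is a minimal server-side sketch in Python. This is my own illustration, not Laravel's actual implementation: the HMAC-of-session-ID scheme, the `SECRET_KEY`, and the function names are all assumptions for the example.

```python
import hashlib
import hmac
import secrets

# Server-side secret; in a real app this would be persistent configuration.
SECRET_KEY = secrets.token_bytes(32)

def generate_csrf_token(session_id: str) -> str:
    # Derive a per-session token; the server embeds this in the form.
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def validate_csrf_token(session_id: str, submitted: str) -> bool:
    # Recompute the expected token and compare in constant time,
    # so the comparison itself doesn't leak the token via timing.
    expected = generate_csrf_token(session_id)
    return hmac.compare_digest(expected, submitted)
```

A forged request from anothersite.com fails here because the attacker's JavaScript can never read the token out of the form response, so `validate_csrf_token` is called with a value that doesn't match.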
You can use the Origin or Referer headers. These are considered protected by the browser: the browser won't let JavaScript modify them at all, so you cannot set or override them, and therefore you can trust them on the server when they come through, because you know the browser only attaches that information on a legitimate request. If you can predict the value of the Origin or Referer header, you can rely on it to defend against CSRF attacks.

Another option is client-side cryptographic magic. By that I mean public/private key encryption, or some proof of work, something that makes it really hard or impossible for the attacker to generate or replicate on anothersite.com, so they cannot produce the result the server needs to verify the request. You'd use this method if you can't retrieve tokens from the server reliably, or if you can't trust the headers, that sort of thing. It depends on your application and how it works whether you need to go down that route.

But there is also a fourth option to look at: SameSite cookies. SameSite is an attribute on a cookie, and it specifies the behavior of the cookie in relation to cross-site requests. There are three values: Strict, None, and Lax. Support for SameSite has been around for a number of years in the main browsers. Safari does have a problem at the moment where SameSite=None is treated as Strict, so it will block the cookies (I'll explain that behavior in a second), but apart from Safari it's supported in all the main browsers, so you should be good to use it on your sites.

SameSite=Strict works by blocking everything: cookies cannot be sent on cross-site (third-party) requests. That covers embedded content such as iframes and images. You can use iframes for things like transparency (clickjacking) attacks, where you overlay an invisible iframe over something else; the user thinks they're clicking on anothersite.com, but they're actually clicking inside the frame, which sends those clicks through to mysite.com, causing users themselves to perform malicious actions on their own accounts. Strict also blocks unsafe requests, so PUT, POST, and DELETE requests; they're considered unsafe because they perform an action, they modify things. But Strict blocks safe requests too, GET and HEAD. It simply blocks cookies on every cross-site request, which makes it the most secure option, and a great option if you want to lock things down.

From a usability point of view, though, it has a big problem. Consider a site where your users want to remain logged in every time they visit. If they're over at facebook.com and click a link to your site, when they first arrive they won't have any cookies, because the browser says "this is a cross-site request, we cannot send them". Even though it's a simple GET request and they clicked a link, it's still considered cross-site, and the cookies are blocked. So when the user lands, they're not logged in, and as far as they're concerned they need to log in again. Maybe they find the login button and click it, but that subsequent request is a same-site request, so the browser adds the cookies, and suddenly the user is logged in. They'll be really confused: they get to the site, they're not logged in, they try to log in, and suddenly they are. So Strict is the most secure option, but it does have that problem, and because it is the most secure option,
it will block all CSRF attacks. CSRF requires cookies on third-party, cross-site requests in order to work, and Strict blocks cookies on every one of those, so any CSRF attack you attempt will fail when Strict is set on the cookie, which, as I said, is great from a security point of view.

SameSite=None is the opposite end of the spectrum. None says: please send my cookies on every request. Embedded content, unsafe requests, safe requests, it always sends the cookies. That's great if you need to, say, embed a form on other sites, or if you've got a multi-tenancy system where everyone has custom domains and they need to send requests to your authentication domain; SameSite=None will let those through. But because None allows cookies on all requests, the browsers have decided to require the Secure flag as well. The Secure flag says the cookie can only be sent on a secure request, so HTTPS; SameSite=None therefore only works over HTTPS, when the connection is encrypted. None, as you would expect, blocks no CSRF attacks, because it attaches the cookie to every request. There is no CSRF protection in there at all; you need your own CSRF protection for those requests. If you try to use SameSite=None without Secure, or over HTTP instead of HTTPS, the browser will reject the cookie and it will never be sent. That's something to watch out for if you're working with custom domains in local development: without HTTPS locally, the browser will drop your SameSite=None cookies, so you need to account for that when you're doing local dev and need SameSite=None working.
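As a concrete illustration of the Secure requirement, here is a small standard-library sketch (my own example, not from the talk's demo) of how the attributes appear in the Set-Cookie header. The `samesite` key on `http.cookies` morsels requires Python 3.8 or newer.

```python
from http.cookies import SimpleCookie

# Build a session cookie intended to be sent on cross-site requests.
cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["samesite"] = "None"  # send on all cross-site requests...
cookie["session"]["secure"] = True      # ...but browsers require Secure too
cookie["session"]["httponly"] = True    # keep it away from JavaScript

header = cookie.output(header="Set-Cookie:")
print(header)
```

If the `secure` flag were omitted here, the header would still be emitted by the server, but a current browser would refuse to store the cookie, exactly as described above.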
Now, the nice middle ground between Strict and None is Lax. Lax allows cookies on safe requests only; it blocks them on embedded content and on unsafe requests, because those two categories are where we generally see cross-site attacks that rely on cookies. Safe requests, according to the way you should be developing, meaning GET and HEAD requests, should never perform an action. They shouldn't change state, they shouldn't have side effects, so they should always be safe to call as many times as you need without changing anything. And if an attacker can make those requests from an external site, it doesn't matter, because they can't change things. That's why Lax lets safe requests through, and why it's a sensible default: it's a nice balance, blocking the routes we would normally expect attacks to come from while allowing the behavior users expect. If the user is on facebook.com and clicks a link to your site, the cookies will be added, because it's a cross-site GET request and Lax allows it, so the user is happy: everything loads as expected and they're logged in. But a POST request from facebook.com to your site would have its cookies blocked, because Lax considers that an unsafe request. So with Lax, as long as all of your actions sit behind unsafe requests, it will block all CSRF attacks, provided your routes are set up properly and you're not accepting any actions on safe requests at all.
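The three values can be summarized in a small decision function. This is my own simplified model of the rules described above, not actual browser source; real browsers add more nuance (redirects, top-level navigation detail, and so on).

```python
SAFE_METHODS = {"GET", "HEAD"}

def cookie_sent(same_site: str, cross_site: bool, method: str = "GET",
                embedded: bool = False) -> bool:
    """Rough model of whether a browser attaches a cookie to a request."""
    if not cross_site:
        return True        # same-site requests always get cookies
    if same_site == "None":
        return True        # sent everywhere (Secure flag also required)
    if same_site == "Strict":
        return False       # never sent on any cross-site request
    # Lax: safe-method navigations only, never embedded content
    return method in SAFE_METHODS and not embedded
```

For example, `cookie_sent("Lax", cross_site=True, method="GET")` models the facebook.com link click (cookie sent), while `cookie_sent("Lax", cross_site=True, method="POST")` models the forged POST (cookie blocked).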
So, time for a bit of a story. Last year in August I did a conference talk; it was less of a talk and more of a live hacking session. I got up and spent some time hacking into WordPress, demonstrating a number of different vulnerabilities, how they work, and how you protect against them. One such vulnerability was a CSRF attack, which I used to upload a PHP backdoor to the site, and it worked really well: in August the demo went perfectly.

Then at the start of February I was refreshing the talk for another conference. In the middle of practicing, I got to my CSRF attack, executed it, checked for my payload... and it wasn't there. The attack had failed. A bit confused, I checked the browser console, and I saw this message: "A cookie associated with a cross-site resource at hack-wp was set without the SameSite attribute. It has been blocked, as Chrome now only delivers cookies with cross-site requests if they are set with SameSite=None and Secure." So for some reason Chrome had blocked the authentication cookie required to execute my CSRF attack in my testing. This was odd, because I didn't have SameSite set on those cookies, but it reminded me that Chrome was rolling this out.

Back in May last year, Chrome announced they were improving the privacy and security of the web, and that they would be using the SameSite cookie attribute as part of that process. In October they provided details: all cookies created without SameSite set would default to Lax, and Secure would be required on SameSite=None. The plan was to push this out in Chrome 80, in February this year, which is exactly when I was practicing my talk; except I had actually enabled the behavior manually, and that's what had kicked in. The announcement came out in October last year, and they also said, as
part of this, that you could manually enable the new behavior in the browser via flags. I had done that to test my sites, to see what would break and what wouldn't, and I'd completely forgotten about it. A couple of sites broke at the beginning, including a few Google things, and the Australian tax department's website broke when I was trying to pay my tax bill, which was fun. The payment form was embedded in a frame, and it just went blank: I typed in my payment details, clicked the button, and the frame went blank, so I wasn't sure whether the payment had gone through or which part of the redirect flow had the problem. I had to wait a few days to see if it cleared. But most of the sites I visited worked fine with those flags enabled, which is great, and that's why I forgot about them.

In February, Google re-announced this, reminding everyone what they were rolling out: SameSite=Lax by default, that month, in Chrome 80. They announced it as a gradual rollout: rather than everyone getting Chrome 80 and the new default at once, they would enable it across the world in very small batches, so that if things started breaking they could quickly roll back or fix things, and people could identify problems, without the entire world combusting because a Lax default broke some vital piece of infrastructure nobody realized depended on the old behavior.

But then in April, when COVID was really building up and everything was shutting down, people were moving to working from home and needed stable connections and stable applications. The big software companies, Google, Microsoft, and the rest, all announced they would pull back on feature updates and focus on security updates, bug fixes, and stability. So in April, Chrome rolled back their SameSite cookie changes, and as of now they haven't re-announced the rollout. Nobody on mainline Chrome, unless you've manually enabled it, has SameSite=Lax as the default anymore. They'll be doing it again at some point, but it gives us developers more time to figure out which of our applications will be broken by SameSite=Lax, so we can fix our flows. A lot of people didn't know this was coming; there wasn't much notice, and word didn't spread very widely. I'm guessing quite a few of you watching this hadn't heard of SameSite cookies before, and yet we were supposed to be well into the rollout by now.

When Chrome announced the Lax default, Microsoft raised their hand and said: wait a second, you're going to break our flow. As part of the OpenID Connect authentication flow used by Azure Active Directory and the Microsoft account authentication system, a login request goes to mysite.com, which sets a CSRF token in a cookie and redirects the user to an external authentication site, which then redirects the user back to mysite.com via a POST. That final POST request needs the cookie in order to validate against the authentication tokens in the POST body, but because it's a cross-site POST request, SameSite=Lax by default was blocking the cookie, and the authentication flow stopped working. This is a problem not because they're Microsoft sites, but because this is code documented for other people to implement on their own sites. Enterprises, corporates, probably small shops as well, all have this sort of flow, and getting them all to update their cookies is a big task. You can't simply put out a notification with two months' notice saying "hi everyone, change your cookie to SameSite=None". It wasn't going to work, and it was going to break potentially a lot of sites.

So Chrome thought about this for a while and realized that this flow should only take, what, two minutes or less? So they defaulted to something called "Lax plus POST", a special value that only applies when SameSite isn't set. It means: let through safe GET requests, as with Lax, but also let through cookies that were created less than two minutes ago. That allows authentication flows like this to work, because the cookie is created for this purpose, for this flow, and it's a fast flow within a contained time. They enabled this as part of their SameSite=Lax-by-default change, as a temporary feature; they will remove it at some point, but for now it's how it works, and it resolves the problem until developers have a chance to update their cookies and flows to work with SameSite=Lax.

One more detail: "same site" isn't just the primary domain; it also encompasses subdomains. mysite.com is considered the same site as static.mysite.com and account.mysite.com. That's great if you have multi-tenancy with subdomain authentication and need to send a POST request between different subdomains: you can do that even with SameSite=Strict enabled, because they're all considered the same site. But what about public subdomains, like github.io? Well, they're
considered cross-site, because there's a thing called the Public Suffix List, which defines the public domains whose subdomains are registrable by different, unrelated users. If same-site requests were allowed between, say, laravel.github.io and valorin.github.io, you would have a problem; but because github.io is on the Public Suffix List, those are treated as cross-site requests, which keeps the cookies secure and those requests safe.

So let's have a quick look at how this works in a demo app I built. Here we've got samesitetest.net, and we go to the cookie-set page. What it's doing is setting a few different cookies: a "strict" cookie set with SameSite=Strict, a "lax" cookie set with Lax, a "secure-none" cookie set with the Secure flag and SameSite=None, and a "none" cookie set with SameSite=None but without Secure. If we check the response, we can see they've all been set except one: the one highlighted in yellow has been rejected by the browser, because it has SameSite=None without Secure. The browser is informing us that the cookie was sent by the server but the browser is not going to save it, which is really, really helpful for debugging when you're trying to figure out why a cookie isn't being set. They've also added a checkbox to show only the requests with blocked cookies in the response, to help you drill down into what's going on if you've got a lot of noise in the network tab. So I'll jump over to the external site, and now we're going to make a couple of requests.
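Before the demo requests, the temporary Lax+POST default described above can be sketched in the same style as before. Again this is my own simplified model, assuming the two-minute threshold Chrome described; the real implementation also keeps the other Lax restrictions, such as blocking embedded content.

```python
SAFE_METHODS = {"GET", "HEAD"}
TWO_MINUTES = 120  # seconds

def default_cookie_sent(cross_site: bool, method: str,
                        cookie_age_seconds: float) -> bool:
    """Chrome's temporary 'Lax + POST' default for cookies with no
    SameSite attribute: behave like Lax, but also allow unsafe
    cross-site requests while the cookie is under two minutes old."""
    if not cross_site:
        return True
    if method in SAFE_METHODS:
        return True                          # normal Lax behavior
    return cookie_age_seconds < TWO_MINUTES  # temporary POST allowance
```

This is why the OpenID Connect flow keeps working under the new default: the redirect dance completes well inside the two-minute window, so the final cross-site POST still carries the freshly created cookie.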
We'll make a GET request first, a cross-site GET request going from samesitetest.net through to samesitetest.com... and my demo is not working; I think I'm on the wrong site. Okay, here we go again: now we're on samesitetest.com/cookies/set, we set the cookies the same way, and go to the external site, which is the .net domain. This should work better now.

So we go to the GET request, and there we go, cookies are coming through. It tells us the Strict cookie wasn't sent, but Lax and Secure-None were, and if we check the browser, it lists Lax and Secure-None as sent and the yellow one as rejected: the browser filtered out the Strict cookie because this was a cross-site request, and Strict blocks cookies on every cross-site request. Again we've got the checkbox to hide and show the filtered-out request cookies, to make it easy to see what's going on with our requests.

That's a GET request. Now we go back to the external site and load up the iframe. In the iframe, Strict wasn't sent and Lax wasn't sent, but Secure-None was, because an iframe is embedded content and Lax blocks that as well. The only cookie that gets through in an iframe is the Secure-None cookie, and even if we navigate around inside it and go to a GET request, which would normally carry the Lax cookie, it isn't added, because we're still inside the iframe. Then back on the external site we run the POST request, and as we can see, the only cookie sent is Secure-None again, because POST requests are considered unsafe: Lax won't have it, and Strict will never have it, obviously. So that's a very basic demo of the cookies, when they're added and when they're not, across the different cross-site request types. You can go to samesitetest.com/cookies/set to set the cookies, then follow the link to the external site and watch the values going through. I'll have a link to this at the end of the talk, in case you can't write it down now, so you can go play with it later.

Alongside that little tester I also built an audit tool, to look at the behavior of the different cookie values in different browsers. This view is from Chrome with the flags set. At the top we've got the same-site requests, and as you would expect, cookies have been added to all of them, except for the invalid cookies: the ones with SameSite=None without Secure, or with Secure set on an HTTP request. At the bottom you've got the cross-site requests, and most of them are blocked. In the iframe column, the only "yes" is under the external None+Secure cookie, which is the only cookie that will get through in an iframe; an image behaves exactly the same. There are a few more yeses in the GET column, which are the Lax cookies and the default or invalid values that fall back to Lax, and then a couple in the POST column, which is the Lax+POST behavior. The left column shows the requests made immediately after the cookies were set, and the boxes I've highlighted are POST requests where the cookie was sent: SameSite wasn't set, or was set to an invalid value, which as far as this browser is concerned means it wasn't set, so the cookie was sent for both the GET request and the POST request in these
instances, because the cookie had been created less than two minutes before, and so the POST request was allowed to have cookies. But if you look in the right column, after the two minutes, those turn to "no": the cookie is no longer sent on the POST request, because it's more than two minutes old, and only the GET requests get cookies for the values defaulting to Lax. And as you can see, the SameSite=None plus Secure line lets the cookie through on all requests, as you would expect.

So how does this relate to Laravel? How does Laravel implement SameSite? It's pretty easy: in the session config file there's a same_site key, and you can set it to whatever value you'd like for all of your cookies. Any time Laravel creates a cookie, it will use this as the SameSite value. By default, in version 7 and above it's set to lax; in versions before that it was null, meaning it wasn't set at all. But this only covers the cookies Laravel creates. If you want to create a cookie with a different value, you have to go through the cookie builder, and because SameSite was added most recently, it's at the end of the parameter list (we're tracking Symfony here). You've got to get through name, value, and minutes, which is fine, you want to set those anyway, but then path, domain, secure, httpOnly, and raw before you finally reach the sameSite parameter, and you've got to remember how many nulls there are and which of true and false comes first, which is kind of frustrating. So I would recommend wrapping it in a helper function where you only have to pass the name, the value, and the SameSite value; that'll make your life a
lot easier if you have to set this on more than one cookie. Ultimately, though, it's pretty easy to set, because it's right there.

So, wrapping up: which option do you use? Use Strict if users shouldn't be automatically logged in, or if you need to perform actions over GET requests and need that extra security. If for some reason your application requires GET requests to perform actions, you either need solid CSRF protection or you need SameSite=Strict, because it adds the protection you need there. But you only need Strict if you need that lockdown, as I said, or if users shouldn't be automatically logged in.

Use SameSite=None if you need POST requests or embedded content between third-party domains. If your site is embedded in an iframe and needs to remember the user, you need None, to allow the cookies on those embedded requests. Likewise if you've got multi-tenancy with custom domains and an auth domain receiving POST requests. In these cases you need alternate methods of CSRF protection to defend those requests.

And use nothing, don't set it at all, if you like unexpected behavior, because at some point the default is going to change. The browsers will implement SameSite=Lax by default: Chrome will be doing it soon, they've announced it; Firefox and Edge have both said they will implement it as well, and I'm sure Safari will follow eventually. So you need to learn about SameSite now, apply it now, and work out the values your site needs now, so that things don't break and users don't get confused.

The sensible default, really, is SameSite=Lax. It works in the majority of cases, and it allows safe requests through. As long as you've built your application properly, Lax will do what you want, because it works for same-site requests and it only
blocks cookies on the requests where you would expect to find CSRF attacks.

So, is CSRF dead? I asked the question on Twitter, and a pen tester replied: unlikely; even with SameSite=Lax, 90% of his proofs of concept still worked. He found that even though sites were using SameSite=Lax to block cookies on POST requests, he could just change the attack to a GET request and it would still get through, because the applications weren't built properly: they weren't designed to keep all actions and behavior behind POST. Laravel helps us with this, because the routing makes you explicit about methods, and resource controllers, for example, get you to write the code for unsafe requests in update and create, which are covered by PUT, POST, and so on. That makes our job a lot easier as Laravel developers, because the support is already in the framework and the framework encourages us to do the right thing. But it is still a problem, and still something you need to think about.

So, back to the question: is it dead? No, I don't think it is. Even though we can build our applications properly and securely, and we can have SameSite=Lax preventing CSRF attacks on our sites, it's still going to be a viable attack against websites and applications out there, and if you accidentally allow an action on a GET request, because it's simpler and quicker and easier, then you're still vulnerable to CSRF. I think it's going to remain a problem in the same way SQL injection is still a problem: you can use a tool like Eloquent, which protects you and helps you write properly parameterized queries that aren't vulnerable to SQL injection, yet it's still an attack used in the wild, and it still works on many sites, because they aren't built properly.

So, unfortunately, no, I don't think CSRF is dead. But I do think that, as developers, we can make sure our sites are protected by using SameSite cookies, so that we don't have to worry about it. Thank you very much for listening; I hope you learned something. I've got notes and links to the demo app and my slides on my site; feel free to email me or hit me up on Twitter if you have any questions, and I'll be around in the chat for a bit longer. Thank you very much.
Info
Channel: Laracon EU
Views: 2,827
Keywords: laracon, laravel, php, laraconeu, csrf, samesite, cookies
Id: wVYyEx6yics
Length: 33min 24sec (2004 seconds)
Published: Mon Jul 20 2020