Why do browsers allow CSRF?
I am pretty new to web security, and as I read more about the different attack vectors, my mind boggles that they are allowed in the first place. It's like the web was designed with a broken security model and to be vulnerable.

I am also amazed at the amount of vague and imprecise information. For example, at first the Same-Origin Policy sounds pretty good; then I read that it only applies to XHR, and, oh, by the way, it doesn't actually prevent a cross-origin POST from being sent, which is the classic CSRF attack. Glad I kept reading.

There is also an Origin header that the server can use to make sure the request is coming from the right place -- but oops, it is set inconsistently across browsers, and if it is NOT set, you can't be quite sure if it was because of a same-origin request, or a request type that just didn't get it for certain browsers (maybe an IMG tag?). Keep reading.

So the right way seems to be to set a CSRF token in the session cookie, also add that token to forms/links, and then compare them server side on submission. In theory (and let's exclude all XSS attacks for the purpose of this question), a CSRF attempt from another tab may make a POST request to a form that includes the cookie, but without a form input element set to the matching token (because it can't read the token from the cookie), so the server will reject the request. Works, but it's kludgy, and you must make sure you never, ever forget to check!
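The token scheme described above can be sketched in a few lines. This is a minimal illustration, not any particular framework's implementation; the function names are hypothetical, and the session storage and form rendering are assumed to exist elsewhere:

```python
import hmac
import secrets

def issue_csrf_token() -> str:
    # Random, unguessable token: store it in the user's session and
    # embed the same value in each rendered form.
    return secrets.token_urlsafe(32)

def is_valid_csrf(session_token: str, form_token: str) -> bool:
    # Constant-time comparison to avoid timing side channels; an empty
    # session token is always rejected.
    return bool(session_token) and hmac.compare_digest(session_token, form_token)
```

The attacker's cross-site form can trigger the POST with the cookie attached, but cannot read the token to put in the form body, so `is_valid_csrf` fails.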

Holding that thought in mind for a second, here is my question -- why does the browser send the session cookie in a request that originates from a page that is not the origin of the cookie?

I mean, browsers will refuse to send cookies to different domains for good reason, but are quite happy to send them from different origins? Would stuff break if they didn't? Would it be a robust defence against CSRF, only requiring servers to do what they are doing anyway -- checking for a valid session cookie?

Edit: maybe this is an attempt to improve the situation? https://datatracker.ietf.org/doc/html/draft-west-origin-cookies-01

Urease answered 1/6, 2016 at 4:59 Comment(7)
A lot of stuff would break. For example all these analytics and advertisement scripts.Didymous
It's not like browsers were designed, from day one, to allow CSRF to take place. CSRF was discovered later, at a point where there were already lots of websites already out there. Definitely more than ten. Changing the rules after the fact and expecting every website to change to accommodate the rule change is expecting a lot - especially when a lot of cross-site requests may have no harmful effects, only desirable ones.Blacktop
It's kinda irrelevant. A website is responsible to protect itself, NOT rely upon "correctly" designed/developed/maintained browsers. Which is why the CSRF token (even if kludgy) is necessary. I recommend building CSRF into the website architecture (or use a framework that already has it). That way, it's always there AND always checked (assuming you use the framework correctly ;)Aurora
You'll have to call Netscape and ask themDamselfly
@Aurora is it not the user that needs protection, not the website? And the user expects their browser to protect them by keeping their cookies from one site safe from requests from another site? Just like it expects the browser to protect them in many other ways too.Urease
@Urease No. The website receiving a GET or POST without a session unique token opens the user up to damage to his/her account on the website. So it may be clearer to say that it is the website's responsibility to protect its users.Aurora
This post is relatively old, but just wanted to say - brilliantly put!Palpitant

I am pretty new to web security, and as I read more about the different attack vectors, my mind boggles that they are allowed in the first place. It's like the web was designed with a broken security model and to be vulnerable.

All true. It was never designed to be secure in the first place. The web was originally designed as a static document management and sharing system which allowed direct links to resources on different machines.

The dynamic web you see today is a kludge. We can fix it with CSRF tokens, HTTP headers and the like, but if you make a dynamic website without doing any of these things then chances are it's vulnerable (and keeps people like me in a job).

Check out its history in the Wikipedia article.

I am also amazed at the amount of vague and imprecise information. For example, at first the Same-Origin Policy sounds pretty good; then I read that it only applies to XHR, and, oh, by the way, it doesn't actually prevent a cross-origin POST from being sent, which is the classic CSRF attack. Glad I kept reading.

Also mainly true. The Same-Origin Policy applies to windows and frames too (e.g. example.com cannot alter the content of example.org by JavaScript if example.com includes an IFrame to example.org). Yes, cross-domain XHRs can be made, but without CORS being enabled the responses cannot be read. This does protect CSRF tokens from being stolen, but, as you say, the request itself still goes through, so if you're not using CSRF protection then this presents a CSRF vulnerability.

Other defences such as requiring a custom header can be used to mitigate CSRF, because custom headers cannot be attached to cross-domain requests by forms or simple XHRs; attempting to send one from another origin triggers a CORS preflight that the server can refuse.
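As a sketch of that defence, the server-side check amounts to one line. The header name below is the conventional `X-Requested-With` (any custom name works), and the dict of request headers is an assumption standing in for whatever your framework provides:

```python
def passes_custom_header_check(headers: dict) -> bool:
    # A cross-site <form> or <img> cannot attach arbitrary headers, and a
    # cross-origin XHR/fetch that tried to would first need a CORS
    # preflight the server can refuse. So the header's presence implies
    # the request was made by our own JavaScript.
    return headers.get("X-Requested-With") == "XMLHttpRequest"
```

This only holds if your CORS configuration does not allow that header from other origins.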

Originally, XHRs could not access anything cross-domain at all, which came to be seen as too big a restriction, hence the advent of CORS. Since plain forms could already submit to different domains anyway, allowing the request (but not the response) was not seen as a particularly risky maneuver. It still isn't, provided the appropriate controls are put into place.

There is also an Origin header that the server can use to make sure the request is coming from the right place -- but oops, it is set inconsistently across browsers, and if it is NOT set, you can't be quite sure if it was because of a same-origin request, or a request type that just didn't get it for certain browsers (maybe an IMG tag?). Keep reading.

Quite true. See this answer.
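The ambiguity the question describes shows up directly when you try to write the check. A sketch, with a hypothetical allow-list; the key point is the three-way outcome when the header is absent:

```python
# Hypothetical allow-list of this site's own origin(s), for illustration.
ALLOWED_ORIGINS = {"https://example.com"}

def origin_check(headers: dict) -> str:
    origin = headers.get("Origin")
    if origin is None:
        # The ambiguous case: same-origin requests and some request
        # types/browsers simply omit Origin, so we cannot decide here
        # and must fall back to another defence (e.g. a CSRF token).
        return "inconclusive"
    return "allow" if origin in ALLOWED_ORIGINS else "deny"
```

A missing header therefore cannot safely be treated as either "allow" or "deny" on its own.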

why does the browser send the session cookie in a request that originates from a page that is not the origin of the cookie?

Because lots of things would break otherwise. There are countless forms that are designed to be submitted from static sites to dynamic sites that do the back-end handling.

There is a new standard for "same-site" cookies. A less dry explanation is here.

Basically, cookies can be set with a new attribute, SameSite. In strict mode, cookies are never sent on cross-site requests. In lax mode, they are still sent on top-level GET navigations (e.g. following a link), but withheld for cross-site subresource requests and for unsafe methods such as POST, which is where CSRF vulnerabilities mainly lie.
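Concretely, the attribute is just part of the Set-Cookie header value. A small sketch of building one; the cookie name "session" is a placeholder, and Secure/HttpOnly are included as the usual companions to SameSite:

```python
def session_set_cookie(value: str, samesite: str = "Lax") -> str:
    # Build a Set-Cookie header value with the SameSite attribute.
    if samesite not in ("Strict", "Lax"):
        raise ValueError("samesite must be 'Strict' or 'Lax'")
    return f"session={value}; Path=/; Secure; HttpOnly; SameSite={samesite}"
```

With `SameSite=Lax`, a cross-site form POST from an attacker's page no longer carries this cookie at all.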

The one you linked to was an early draft of this.

Tania answered 2/6, 2016 at 11:46 Comment(5)
"There are countless forms that are designed to be submitted from static sites to dynamic sites that do the back-end handling" -- but cross domain? And if so, they just assume a cookie has been set by another window to the target site? I was trying to think of such examples and could not find any. The SameSite cookie sounds interesting, hope it becomes standard soon.Urease
Federated single-sign-on is one example where sometimes a nonce is created per session, that can be validated when the flow is redirected back to the site. e.g. Site --> Set Cookie -> Redirect to OpenID Provider --> Authenticate --> Redirect back with claim --> Site checks nonce cookie and claim.Tania
@Urease The same-origin policy never prevents the request from happening; it only prevents the response from being read by JavaScript.Lawerencelawes
@Tania Isn't this part wrong: 'at first the Same-Origin Policy sounds pretty good, then I read that it only applies to XHR'? It applies to images too, and many other things: you cannot read cross-origin image data, and if you try to read it via a canvas it will throw an error.Lawerencelawes
@sur It applies to images also.Tania

© 2022 - 2024 — McMap. All rights reserved.