
What changes to the HTTP protocol spec, and to browser behaviour, would be required to prevent dangerous cases of cross-site request forgery?

I am not looking for suggestions as to how to patch my own web app. There are millions of vulnerable web apps and forms. It would be easier to change HTTP and/or the browsers.

If you agree to my premise, please tell me what changes to the HTTP and/or browser behaviour are needed. This is not a competition to find the best single answer, I want to collect all the good answers.

Please also read and comment on the points in my 'answer' below.

Sam Watkins
  • "CSRF Isn’t A Big Deal - Duh!" ... "A huge percentage of the fraud on the Internet (TOS fraud, not actual hacking) is related to CSRF abuse (click fraud, affiliate fraud, etc…). We’re talking about hundreds of millions of dollars lost to a single exploit and only in those two variants." ha.ckers.org – Sam Watkins Mar 22 '12 at 05:01

6 Answers


Roy Fielding, author of the HTTP specification, disagrees with your opinion that CSRF is a flaw in HTTP that would need to be fixed there. As he wrote in a reply in a thread named The HTTP Origin Header:

CSRF is not a security issue for the Web. A well-designed Web service should be capable of receiving requests directed by any host, by design, with appropriate authentication where needed. If browsers create a security issue because they allow scripts to automatically direct requests with stored security credentials onto third-party sites, without any user intervention/configuration, then the obvious fix is within the browser.

And in fact, CSRF attacks have been possible right from the beginning using plain HTML. The introduction of later technologies like JavaScript and CSS only added further attack vectors and techniques that made request forgery easier and more efficient.

But they didn’t change the fact that a legitimate, authentic request from a client is not necessarily based on the user’s intention: browsers send requests automatically all the time (e.g. for images and style sheets) and send any authentication credentials along with them.

Again, CSRF attacks happen inside the browser, so the only complete fix would have to be implemented there, inside the browser.

But as that is not entirely possible (see above), it’s the application’s duty to implement a scheme that makes it possible to distinguish authentic requests from forged ones. The commonly recommended CSRF token is such a technique. And it works well when implemented properly and protected against other attacks (many of which, again, are only possible due to the introduction of modern technologies).

Gumbo
  • I agree that the problem needs to be fixed in the browser. I'm suggesting that the HTTP spec should say a few extra 'MUST' here and there so that the browsers must do the right thing. If HTTP specifies cookies, authentication, POST and GET, it should also specify what to do with cookies, authentication, POST and GET in the case of cross-site requests. These points that have been left unspecified (or wrongly specified) are handled wrongly by most browsers. – Sam Watkins Mar 22 '12 at 04:43
  • I am interested to hear critique of the proposed changes, in my 'answer'. I think they could be implemented across-the-board only if the HTTP spec is altered, and the protocol version bumped. But, if we can do it without changing the HTTP specs, that's even better. Please, I need more feedback than "this break absolutely everything". In particular, would the concept of local (by default) versus remote cookies be useful, or would it 'break everything'? Any such broken thing could easily change to use a 'remote' cookie, with no additional vulnerability but perhaps additional awareness. – Sam Watkins Mar 22 '12 at 04:46

I agree with the other two; this could be done on the browser side, but it would make it impossible to perform authorized cross-site requests. In any case, a CSRF protection layer can be added quite easily on the application side (and perhaps even on the web-server side, to avoid making changes to pre-existing applications) using something like this:

  • A cookie is set to a random value known only to the server (and, of course, to the client receiving it, but not to a third-party server).
  • Each POST form must contain a hidden field whose value matches the cookie. If it does not, the submission must be rejected and a 403 page returned to the user.
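The two steps above can be sketched as a pair of server-side helpers. This is a minimal illustration (function names are mine, not from any framework):

```python
import hmac
import secrets

def issue_token() -> str:
    """Generate a random token; the server sets it as a cookie and
    also embeds it in a hidden field of every POST form it serves."""
    return secrets.token_hex(16)

def is_request_authentic(cookie_token: str, form_token: str) -> bool:
    """Double-submit check: accept the POST only when the hidden field
    matches the cookie. A third-party page cannot read the victim's
    cookie, so it cannot forge a matching hidden field."""
    # compare_digest avoids leaking information via timing differences
    return hmac.compare_digest(cookie_token, form_token)

token = issue_token()
print(is_request_authentic(token, token))    # legitimate submission
print(is_request_authentic(token, "guess"))  # forged submission rejected
```

Note that the security comes from the cookie+field pair matching, not from the attacker being unable to guess either value in isolation.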
redShadow
  • -1 the attacker does not need to know the cookie value to perform a CSRF attack. If he knew the cookie value he would just hijack the session. Please read about the basics of CSRF before suggesting a fix. The CSRF prevention cheat sheet is a good place to start. – rook Mar 21 '12 at 14:55
  • Read this before downvoting: https://www.owasp.org/index.php/Cross-Site_Request_Forgery_%28CSRF%29_Prevention_Cheat_Sheet#General_Recommendation:_Synchronizer_Token_Pattern -- security is not based only on cookie but on the cookie+post var pair, and it's the method used by many frameworks including django. – redShadow Mar 21 '12 at 15:09
  • That -1 was wrong, so I'll +1 it. Double cookie submission is a reasonable way to protect against CSRF. I am asking about protecting by fixing the protocol and browsers, however. – Sam Watkins Mar 21 '12 at 15:12
  • @redShadow ok well -1 for undermining httponly cookies, also it wasn't clear that both methods should be used together. – rook Mar 21 '12 at 15:18
  • It's an interesting idea to protect against CSRF in the webserver, thanks for that. I found this relevant article: http://knol.google.com/k/preventing-cross-site-request-forgeries-csrf-using-modsecurity – Sam Watkins Mar 21 '12 at 15:24

It can already be done:

Referer header

This is a weaker form of protection. Some users may disable referer for privacy purposes, meaning that they won't be able to submit such forms on your site. Also this can be tricky to implement in code. Some systems allow a URL such as http://example.com?q=example.org to pass the referrer check for example.org. Finally, any open redirect vulnerabilities on your site may allow an attacker to send their CSRF attack through the open redirect in order to get the correct referer header.
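The substring pitfall mentioned above is easy to demonstrate. A small sketch (the host name is hypothetical) contrasting a naive check with a host comparison:

```python
from urllib.parse import urlparse

TRUSTED_HOST = "example.org"  # hypothetical host of the protected site

def referer_ok_naive(referer):
    # BROKEN: a substring test lets http://example.com?q=example.org
    # pass the check intended for example.org
    return TRUSTED_HOST in referer

def referer_ok(referer):
    # Safer: parse the URL and compare the host component exactly
    return urlparse(referer).hostname == TRUSTED_HOST

print(referer_ok_naive("http://example.com?q=example.org"))  # false positive
print(referer_ok("http://example.com?q=example.org"))        # rejected
print(referer_ok("http://example.org/form"))                 # accepted
```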

Origin header

This is a new header. Unfortunately you will get inconsistencies between browsers that support it and do not support it. See this answer.

Other headers

For AJAX requests only, adding a header that is not allowed cross-domain, such as X-Requested-With, can be used as a CSRF prevention method. Old browsers will not send XHR cross-domain, and new browsers will send a CORS preflight instead and then refuse to make the main request if the target domain does not explicitly allow it. The server-side code must verify that the header is still present when the request is received. As HTML forms cannot have custom headers added, this method is incompatible with them; however, that also means it protects against attackers using an HTML form in their CSRF attack.
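The server-side check this describes is a one-liner. A sketch (the helper name is illustrative; frameworks expose headers in different ways):

```python
def is_ajax_request(headers):
    """Accept only requests carrying the custom header. HTML forms
    cannot add custom headers, and a cross-origin XHR that adds one
    triggers a CORS preflight that the target server can refuse."""
    return headers.get("X-Requested-With") == "XMLHttpRequest"

print(is_ajax_request({"X-Requested-With": "XMLHttpRequest"}))  # accepted
print(is_ajax_request({}))  # plain HTML form submission: rejected
```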

Browsers

Browsers such as Chrome allow third-party cookies to be blocked. Although the setting's explanation says it stops cookies from being set by a third-party domain, it also prevents any existing cookies from being sent with such requests. This blocks "background" CSRF attacks. Attacks that open in a full page or in a popup will still succeed, but they are much more visible to the user.

SilverlightFox

Enforce the Same Origin Policy for form submission locations. Only allow a form to be submitted back to the place it came from.

This, of course, would break all sorts of other things.

  • Yes I agree, that would help (to prevent e.g. home router hijack). CSRF can usually be done with GET + ?query too, as few web apps check the request method. We can hardly outlaw remote GET + ?query. So I'm suggesting the user should be prompted to allow that. There would be an option to always allow between a certain pair of domains. This would encourage people to use PATH instead of query string for linkable GET requests. – Sam Watkins Mar 21 '12 at 15:44

If you look at the CSRF prevention cheat sheet you can see that there are ways of preventing CSRF that rely on the HTTP protocol itself. A good example is checking the HTTP Referer header, which is commonly used on embedded devices because it doesn't require additional memory.

However, this is a weak form of protection. A vulnerability like HTTP response splitting on the client side could be used to influence the Referer value, and this has happened.

rook
  • CSRF is a major problem because browsers send too much information with cross-site requests. So I am looking at how to change the HTTP specs, so that a conforming browser would not send authentication cookies, etc., with such a request. We need to do this without upsetting valid cross-site requests too much. – Sam Watkins Mar 21 '12 at 15:37
  • cookies should be declared 'local' (default) or 'remote'
  • the browser must not send 'local' cookies with a cross-site request
  • the browser must never send http-auth headers with a cross-site request
  • the browser must not send a cross-site POST or GET ?query without permission
  • the browser must not send LAN address requests from a remote page without permission
  • the browser must report and control attacks, where many cross-site requests are made
  • the browser should send 'Origin: (local|remote)', even if 'Referer' is disabled
  • other common web security issues such as XSHM should be addressed in the HTTP spec
  • a new HTTP protocol version 1.2 is needed, to show that a browser is conforming
  • browsers should update automatically to meet new security requirements, or warn the user
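The proposed 'Origin: (local|remote)' signal can be approximated today with the existing Origin header, which carries the requesting page's scheme and host. A hedged server-side sketch (the host name is hypothetical, and older browsers may not send Origin at all):

```python
from urllib.parse import urlparse

SITE_HOST = "example.org"  # hypothetical host, for illustration

def is_cross_site(origin_header):
    """Treat a request as cross-site when an Origin header is present
    and names a different host than our own."""
    if origin_header is None:
        # older browsers may omit Origin; no decision is possible
        return False
    return urlparse(origin_header).hostname != SITE_HOST

print(is_cross_site("http://example.org"))       # same-site request
print(is_cross_site("http://attacker.example"))  # cross-site request
```

A conforming browser under the proposal would always send such a signal, so the server could reject cross-site POSTs outright instead of guessing.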
Sam Watkins
  • This breaks absolutely everything. – rook Mar 21 '12 at 15:26
  • I explained these suggestions in more detail, here: http://sswam.wordpress.com/2012/03/16/not-secure-will-fail-how-to-stop-csrf-cross-site-request-forgery/#content – Sam Watkins Mar 21 '12 at 15:47
  • @Rook ... well, I intended that it should break nothing. Can you explain how any of my suggestions would break anything? Minor changes would be needed to the small minority of web apps that actually need cookies to be sent with cross-site requests: the owning site must declare those as 'remote' cookies. The browser should request the user's permission to send cross-site POST or GET-with-?query; that may be (very) annoying but it doesn't break anything. The user might choose to allow any cross-site action, except in the local network, and except apparent repeating attacks, for example. – Sam Watkins Mar 21 '12 at 15:52