Why shouldn't data be modified on an HTTP GET request?

I know that using non-GET methods (POST, PUT, DELETE) to modify server data is The Right Way to do things. I can find multiple resources claiming that GET requests should not change resources on the server.

However, if a client were to come up to me today and say "I don't care what The Right Way to do things is, it's easier for us to use your API if we can just call URLs and get some XML back - we don't want to have to build HTTP requests and POST/PUT XML," what business-conducive reasons could I give to convince them otherwise?
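For concreteness, the two styles look roughly like this (the URLs and payloads below are made up for illustration, not our actual API):

```python
# Hypothetical illustration of the two styles; none of these URLs are real.
import requests

# What the client is asking for: every operation is a plain GET.
requests.get("https://api.example.com/deleteOrder?id=42")

# The verb-based equivalent of the same operations.
requests.delete("https://api.example.com/orders/42")
requests.post(
    "https://api.example.com/orders",
    data="<order><sku>ABC</sku><qty>1</qty></order>",
    headers={"Content-Type": "application/xml"},
)
```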

Are there caching implications? Security issues? I'm kind of looking for more than just "it doesn't make sense semantically" or "it makes things ambiguous."

Edit:

Thanks for the answers so far regarding prefetching. I'm not as concerned with prefetching, since this is mostly about internal network API use rather than visitable HTML pages whose links a browser might prefetch.

Lanky answered 1/4, 2009 at 14:27 Comment(1)
regarding prefetching, your browser may do this, not a bot. – Parapodium

  • Prefetch: many web browsers prefetch links, loading a page before you click it in anticipation that you will. If a GET deletes something, a prefetcher can destroy data without anyone clicking anything.
  • Bots: many bots scan and index the internet for information, and they issue only GET requests on the assumption that GETs are harmless. You don't want a crawler deleting records as it indexes your site.
  • Caching: GET requests are expected to be safe (no side effects) and idempotent, meaning that issuing the request once or many times gives the same result. Because of this, GET responses are aggressively cached by browsers, proxies, and CDNs; a state-changing GET may be answered from a cache and never reach your server at all.
  • The HTTP standard says so: the standard defines what each HTTP method is for, and a slew of programs are built assuming you use the methods the way you are supposed to. Ignore that and you get undefined behavior from all of them. A minimal sketch of the resulting GET/DELETE split follows this list.
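Here is that split as a minimal sketch, using only Python's standard library (the resource paths and the in-memory store are hypothetical, not anything from the asker's API):

```python
# Reads go through GET (safe, idempotent); destructive actions go through
# DELETE. The paths and in-memory store are made up for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer

ITEMS = {"1": "<item id='1'>widget</item>"}  # toy in-memory store

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Never mutates ITEMS, so prefetchers, bots, and caches can hit
        # this endpoint all day without side effects.
        item = ITEMS.get(self.path.rsplit("/", 1)[-1])
        if item is None:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "application/xml")
        self.end_headers()
        self.wfile.write(item.encode())

    def do_DELETE(self):
        # The state change lives here; crawlers and prefetchers never
        # issue DELETE on their own.
        if ITEMS.pop(self.path.rsplit("/", 1)[-1], None) is None:
            self.send_error(404)
            return
        self.send_response(204)  # deleted, no body to return
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ApiHandler).serve_forever()
```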
Parapodium answered 1/4, 2009 at 14:29 Comment(1)
"GET HTTP requests do not change state and they are idempotent" - that's rather the point - they're not idempotent in themselves, it's expected that they will be used in such a way as they are.Kisumu

How about Google finding a link to that page with all the GET parameters in the URL and revisiting it every now and then? That could lead to disaster.

There's a funny article about this on The Daily WTF.

Briar answered 1/4, 2009 at 14:31 Comment(0)

GETs can be forced on a user, resulting in cross-site request forgery (CSRF). For instance, suppose you have a logout function at http://example.com/logout.php that changes the server-side state of the user. A malicious person could place an image tag on any site with that URL as its source; merely loading the page would log the user out. Not a big deal in this example, but if that URL were a command to transfer funds out of an account, it would be a very big deal.

Rawlings answered 1/4, 2009 at 14:34 Comment(3)
POST requests and others can also be forged in this way, but it requires the ability to run scripts on the client (i.e. you can execute a GET via loading an image, but you need an AJAX call to make a POST), so it isn't a huge security increase, but it does help. – Porshaport
AJAX requests cannot be made across domains, but GET requests can. As a result, malicious-domain.xxx can embed <img src="good-domain.angel/transfer_funds?amt=all&to=bad_guy">. If you are authenticated with good-domain and bad_guy can get you to visit that page, he just got all your money. – Cordelia
Note that there are defenses against CSRF that still allow you to have actionable URLs, but that isn't really the point of the original question. – Cordelia
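One common shape of such a defense is the synchronizer (CSRF) token. A minimal sketch, assuming a Flask app; the endpoints and form here are invented for illustration:

```python
# Hypothetical synchronizer-token defense: the server issues a secret the
# legitimate form must echo back, which a cross-site attacker cannot read.
import secrets
from flask import Flask, abort, request, session

app = Flask(__name__)
app.secret_key = secrets.token_bytes(32)

@app.get("/transfer-form")
def transfer_form():
    # Issue a fresh per-session token alongside the real form.
    session["csrf_token"] = secrets.token_urlsafe(32)
    return (
        '<form method="POST" action="/transfer">'
        f'<input type="hidden" name="csrf_token" value="{session["csrf_token"]}">'
        '<button>Transfer</button></form>'
    )

@app.post("/transfer")
def transfer():
    # A forged request (image tag, auto-submitting form) lacks the token.
    token = request.form.get("csrf_token", "")
    expected = session.get("csrf_token")
    if not expected or not secrets.compare_digest(token, expected):
        abort(403)
    return "funds transferred"
```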

Good reasons to do it the right way...

HTTP verbs used as intended are an industry standard, well documented, and easy to secure. While you fully support making life as easy as possible for the client, you don't want to implement something that's easier in the short term in preference to something that's not quite so easy for them but offers long-term benefits.

One of my favourite quotes:

Quick and Dirty... long after the Quick has departed the Dirty remains.

For you, this one is a case of "a stitch in time saves nine" ;)

Antilogy answered 1/4, 2009 at 14:43 Comment(0)

Security: CSRF is so much easier in GET requests.

Using POST won't protect you by itself, but GET makes exploitation easier, and it enables mass exploitation via forums and other places that accept image tags.

Depending on what you do server-side, GET can also help an attacker launch a DoS (denial of service) attack. An attacker can spam thousands of websites with your expensive GET request in an image tag, and every single visitor of those websites will then issue that expensive request against your web server, costing you a lot of CPU cycles.

I'm aware that some pages are heavy anyway and this is always a risk, but the risk is much bigger if every single GET request also inserts 10 big records.
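A minimal sketch of one mitigation, assuming the expensive endpoint is yours to change (standard library only; the report resource is hypothetical): move the expensive work behind POST, which an image tag cannot issue, and let GET serve a cheap cacheable copy.

```python
# Expensive work behind POST; GET returns a cached copy that proxies can
# absorb. The report content and endpoint are made up for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer

REPORT = {"xml": "<report>stale but cheap</report>"}  # hypothetical cache

class ReportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Cheap read; Cache-Control lets browsers and proxies soak up
        # the image-tag traffic instead of your CPU.
        self.send_response(200)
        self.send_header("Content-Type", "application/xml")
        self.send_header("Cache-Control", "public, max-age=300")
        self.end_headers()
        self.wfile.write(REPORT["xml"].encode())

    def do_POST(self):
        # The expensive regeneration runs only here, out of reach of an
        # <img>-tag spam campaign.
        REPORT["xml"] = "<report>freshly computed</report>"
        self.send_response(202)  # accepted
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ReportHandler).serve_forever()
```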

Aerodyne answered 1/4, 2009 at 15:3 Comment(0)

Security, for one. What happens if a web crawler comes across a delete link, or a user is tricked into clicking a hyperlink? A user should know what they're doing before they actually do it.

Antifederalist answered 1/4, 2009 at 14:31 Comment(0)

I'm kind of looking for more than just "it doesn't make sense semantically" or "it makes things ambiguous."

...

I don't care what The Right Way to do things is, it's easier for us

Tell them to think of the worst API they've ever used. Can they not imagine how that was caused by a quick hack that got extended?

It will be easier (and cheaper) in 2 months if you start with something that makes sense semantically. We call it the "Right Way" because it makes things easier, not because we want to torture you.

Heinrick answered 24/4, 2009 at 15:5 Comment(0)
