I am trying to write a Bash command that uses curl to send GET requests to two different web pages over the same connection. In my case, the first request goes to a login page to authenticate to the server, and the second request mimics the automatic redirect to the home page that would have happened in a web browser (via a meta refresh tag). I need to chain the requests because the content of the home page (generated by the server) will be different for a guest user than for an authenticated user.
I tried this command first, based on a recommendation from a Stack Overflow post (assume that the variables $IP and $PORT were already defined with valid values):
curl -u user:pass ${IP}:${PORT}/login.php && curl ${IP}:${PORT}/index.php
However, I always get something like this between the end of the first GET and the start of the second:
* Connection #0 to host 10.0.3.153 left intact
* Closing connection #0
So was the Stack Overflow post wrong? Anyway, the following command does successfully keep the connection open between the two requests:
curl -u user:pass ${IP}:${PORT}/login.php ${IP}:${PORT}/index.php
However, I would really prefer a solution closer to the former command than the latter. The main reason is that I want to separate the output of the first page from that of the second page into two different files. So I want to do something like:
curl page1.html > output1 && curl page2.html > output2
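(One thought: if I am reading the curl man page correctly, -o can be given once per URL within a single invocation, which would keep the connection and still split the output; output1 and output2 are just placeholder filenames here:

curl -u user:pass ${IP}:${PORT}/login.php -o output1 ${IP}:${PORT}/index.php -o output2

But I would still like to know whether two separate commands can share a connection at all.)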
Of course, I need to reuse the same connection because the contents of page2.html depend on my also having requested page1.html in the same HTTP session.
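If it turns out the server tracks the session with a cookie rather than with the TCP connection itself (I have not confirmed which it is), I suppose two separate invocations sharing a cookie jar might be enough; cookies.txt is just a scratch filename I made up:

curl -u user:pass -c cookies.txt ${IP}:${PORT}/login.php > output1 && curl -b cookies.txt ${IP}:${PORT}/index.php > output2

Here -c writes any Set-Cookie headers from the login response into cookies.txt, and -b sends them back on the second request.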
I am also open to solutions that use netcat or wget, BUT NOT PHP!
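For example, with netcat I imagine I could hand-write two pipelined HTTP/1.1 requests over a single TCP connection, something like the sketch below (the Authorization header assumes the same user:pass as above, and whether this works depends on the server honoring keep-alive and pipelining):

AUTH=$(printf 'user:pass' | base64)
printf 'GET /login.php HTTP/1.1\r\nHost: %s\r\nAuthorization: Basic %s\r\n\r\nGET /index.php HTTP/1.1\r\nHost: %s\r\nAuthorization: Basic %s\r\nConnection: close\r\n\r\n' \
    "$IP" "$AUTH" "$IP" "$AUTH" | nc "$IP" "$PORT"

The downside is that both responses come back as one concatenated stream, so I would still have to split them into two files myself.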
&&: That's just Bash notation for running two commands one right after the other, with the restriction that the second command is only run if the first one succeeded. (In other words, if curl -u user:pass ${IP}:${PORT}/login.php returns an error, then curl ${IP}:${PORT}/index.php will not be run.) It doesn't have anything to do with keeping a connection open, or anything like that. – Oca
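A quick way to see that behavior in isolation:

false && echo 'never printed'   # first command fails, so the echo is skipped
true && echo 'printed'          # first command succeeds, so the echo runs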