PowerShell HttpWebRequest GET method CookieContainer problem?

I'm trying to scrape a website that requires user authentication. I am able to do a POST that sends my login and stores a cookie. However, after the login I get a 403 error when trying to access the protected page.

$url = "https://some_url"

$CookieContainer = New-Object System.Net.CookieContainer

$postData = "User=UserName&Password=Pass"

$buffer = [text.encoding]::ascii.getbytes($postData)

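# Build the login POST request; the shared CookieContainer will capture the session cookie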
[net.httpWebRequest] $req = [net.webRequest]::create($url)
$req.method = "POST"
$req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"
$req.Headers.Add("Accept-Language: en-US")
$req.Headers.Add("Accept-Encoding: gzip,deflate")
$req.Headers.Add("Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7")
$req.AllowAutoRedirect = $false
$req.ContentType = "application/x-www-form-urlencoded"
$req.ContentLength = $buffer.length
$req.TimeOut = 50000
$req.KeepAlive = $true
$req.Headers.Add("Keep-Alive: 300");
$req.CookieContainer = $CookieContainer
$reqst = $req.getRequestStream()
$reqst.write($buffer, 0, $buffer.length)
$reqst.flush()
$reqst.close()
[net.httpWebResponse] $res = $req.getResponse()
$resst = $res.getResponseStream()
$sr = new-object IO.StreamReader($resst)
$result = $sr.ReadToEnd()
$res.close()



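# Request the protected page with a GET, reusing the same CookieContainer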
$url2 = "https://some_url/protected_page"

[net.httpWebRequest] $req2 = [net.webRequest]::create($url2)
$req2.Method = "GET"
$req2.Accept = "text/html"
$req2.AllowAutoRedirect = $false
$req2.CookieContainer = $CookieContainer
$req2.TimeOut = 50000
[net.httpWebResponse] $res2 = $req2.getResponse()
$resst = $res2.getResponseStream()
$sr = new-object IO.StreamReader($resst)
$result = $sr.ReadToEnd()

WORKAROUND: After trying almost everything, I ended up trying something different, and it actually works.

After posting the login and getting the session cookie, I use a WebClient to access the secure page by adding the cookie string to its headers.

$web = new-object net.webclient
$web.Headers.add("Cookie", $res.Headers["Set-Cookie"])
$result = $web.DownloadString("https://secure_url")

One of the cool things about this is that the WebClient keeps the cookie. To access another secure page, you can just call $web.DownloadString("https://another_secure_url") :)
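For example, the same $web instance can be pointed at several protected pages in a row (the URLs here are just placeholders):

# Per the workaround above, the Cookie header set on $web persists, so further
# requests to protected pages stay authenticated (URLs are placeholders)
$page1 = $web.DownloadString("https://secure_url")
$page2 = $web.DownloadString("https://another_secure_url")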

Kick answered 29/3, 2011 at 9:31
Can you post your complete solution for this? I am in the same situation but I don't seem to have this working quite yet. – Mareah
I used Fiddler2 to capture the traffic between my browser and the server, then grabbed the cookie out of the request header in Fiddler2. I added that cookie to the request as you show, and now DownloadString doesn't continually redirect to the login page. Thanks! – Tinishatinker

I found that since cookies can have additional attributes attached (like the path or the HttpOnly flag), $res.Headers["Set-Cookie"] didn't work for me. But using your $CookieContainer variable, you can easily change it to use GetCookieHeader($url), which strips out the extra attributes and leaves you with a properly formatted cookie string:

$web = new-object net.webclient
$web.Headers.add("Cookie", $CookieContainer.GetCookieHeader($url))
$result = $web.DownloadString($url)
Crossbred answered 22/10, 2012 at 14:6

People have been asking for the complete application, so here it is:

$url = "https://some_url"

$CookieContainer = New-Object System.Net.CookieContainer

$postData = "User=UserName&Password=Pass"

$buffer = [text.encoding]::ascii.getbytes($postData)

[net.httpWebRequest] $req = [net.webRequest]::create($url)
$req.method = "POST"
$req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"
$req.Headers.Add("Accept-Language: en-US")
$req.Headers.Add("Accept-Encoding: gzip,deflate")
$req.Headers.Add("Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7")
$req.AllowAutoRedirect = $false
$req.ContentType = "application/x-www-form-urlencoded"
$req.ContentLength = $buffer.length
$req.TimeOut = 50000
$req.KeepAlive = $true
$req.Headers.Add("Keep-Alive: 300");
$req.CookieContainer = $CookieContainer
$reqst = $req.getRequestStream()
$reqst.write($buffer, 0, $buffer.length)
$reqst.flush()
$reqst.close()
[net.httpWebResponse] $res = $req.getResponse()
$resst = $res.getResponseStream()
$sr = new-object IO.StreamReader($resst)
$result = $sr.ReadToEnd()
$res.close()


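# Reuse the session cookie from the login response for the follow-up request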
$web = new-object net.webclient
$web.Headers.add("Cookie", $res.Headers["Set-Cookie"])
$result = $web.DownloadString("https://secure_url")
Photochromy answered 10/1, 2013 at 18:50
I
0

I would use IE automation. With this approach you don't have to work with cookies, headers, etc. It's much easier.
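For reference, here is a rough sketch of what that could look like with the InternetExplorer.Application COM object. The element IDs (User, Password), the form index and the URLs are assumptions for illustration, not taken from the actual site, and the polling loop is deliberately simplistic:

# Drive a real IE instance so the browser handles cookies and headers itself
$ie = New-Object -ComObject "InternetExplorer.Application"
$ie.Visible = $false
$ie.Navigate("https://some_url")
while ($ie.Busy -or $ie.ReadyState -ne 4) { Start-Sleep -Milliseconds 200 }

# Fill in and submit the login form (element IDs and form index are assumptions)
$ie.Document.getElementById("User").value = "UserName"
$ie.Document.getElementById("Password").value = "Pass"
$ie.Document.forms.item(0).submit()
while ($ie.Busy -or $ie.ReadyState -ne 4) { Start-Sleep -Milliseconds 200 }

# The browser keeps the session cookie, so the protected page can be loaded directly
$ie.Navigate("https://some_url/protected_page")
while ($ie.Busy -or $ie.ReadyState -ne 4) { Start-Sleep -Milliseconds 200 }
$result = $ie.Document.body.innerHTML
$ie.Quit()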

Ihs answered 29/3, 2011 at 10:41
I gave IE automation a try before this, but it's just too slow for scraping. I did find a solution to my problem, though. – Kick
