TL;DR
What libraries or calls are available to handle query strings containing semi-colons differently from parse_qs?
>>> urlparse.parse_qs("tagged=python;ruby")
{'tagged': ['python']}
Full Background
I'm working with the StackExchange API to search for tagged questions.
Search is laid out like so, with tags separated by semi-colons:
/2.1/search?order=desc&sort=activity&tagged=python;ruby&site=stackoverflow
Interacting with the API is just fine. The problem comes in when I want to test the calls, particularly when using httpretty to mock HTTP.
Under the hood, httpretty uses urlparse.parse_qs from the Python standard library to parse the querystring.
>>> urlparse.parse_qs("tagged=python;ruby")
{'tagged': ['python']}
Clearly that doesn't work well. That was the small example; here's a snippet using httpretty (outside of a testing context).
import requests
import httpretty

# Turn on httpretty and stub out the search endpoint.
httpretty.enable()
httpretty.register_uri(httpretty.GET, "https://api.stackexchange.com/2.1/search", body='{"items":[]}')

# requests serializes params into the querystring of the mocked call.
resp = requests.get("https://api.stackexchange.com/2.1/search", params={"tagged":"python;ruby"})

# httpretty records the request; querystring is the dict parse_qs produced.
httpretty_request = httpretty.last_request()
print(httpretty_request.querystring)

httpretty.disable()
httpretty.reset()
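The parse I actually want looks something like this rough sketch (parse_qs_keep_semicolons is just a name I made up, and it skips parse_qs's blank-value/strict-parsing options): split pairs on '&' only, so the semi-colon survives inside the value.

import urllib

def parse_qs_keep_semicolons(qs):
    # Like parse_qs, but split pairs on '&' only so ';' stays inside values.
    result = {}
    for pair in qs.split("&"):
        if "=" not in pair:
            continue
        key, value = pair.split("=", 1)
        result.setdefault(urllib.unquote_plus(key), []).append(urllib.unquote_plus(value))
    return result

# parse_qs_keep_semicolons("tagged=python;ruby") -> {'tagged': ['python;ruby']}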
I want to use the machinery from httpretty, but need a workaround for parse_qs. I can monkey-patch httpretty for now, but would love to see what else can be done.
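For the record, the monkey patch I have in mind is roughly the following, reusing the parse_qs_keep_semicolons sketch above. The big assumption is where httpretty binds the parse_qs name; httpretty.core is a guess on my part, so check the installed version before relying on it. The non-patching alternative just re-parses the raw querystring from the recorded request.

import httpretty.core

# ASSUMPTION: httpretty.core binds parse_qs at module level; if your version
# imports it under a different name or module, patch that binding instead.
httpretty.core.parse_qs = parse_qs_keep_semicolons

# Alternative without patching: re-parse the raw querystring of the recorded
# request (assumes the request object exposes the raw path).
raw_qs = httpretty.last_request().path.split("?", 1)[-1]
print(parse_qs_keep_semicolons(raw_qs))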
parse_qsl. – Lenorelenox