I am reading through the AlchemyAPI documentation here:
http://www.alchemyapi.com/api/image-tagging/image.html
It says that the image must be URI-encoded... what exactly does that mean?
Does it mean converting the image to a base64 string and then passing that in the request?
I've tried that, but I receive an HTTP 414 error ("Request-URI Too Large").
Here is my code where the request is made:
@IBAction func analyzeImage(sender: UIButton) {
    let imageData = UIImagePNGRepresentation(mainImage.image)
    let base64ImageString = imageData.base64EncodedStringWithOptions(.allZeros)
    let requestString = ENDPOINT + "?apikey=" + API_KEY + "&image=" + base64ImageString + "&outputMode=json"
    let url = NSURL(string: requestString)
    let task = NSURLSession.sharedSession().dataTaskWithURL(url!) { (data, response, error) in
        println(NSString(data: data, encoding: NSUTF8StringEncoding))
    }
    task.resume()
}
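As far as I can tell, "URI-encoded" here means percent-encoding (RFC 3986), not base64: base64 output contains `+`, `/`, and `=`, which are not safe in a query string (`+` in particular tends to get decoded server-side as a space), so the string would have to be escaped before going into the URL. A minimal sketch in current Swift syntax (the base64 value is a made-up stand-in, not a real image):

```swift
import Foundation

// Base64's alphabet includes '+', '/', and '='; escape them for a query string.
let base64 = "iVBORw0KGgo+/="           // stand-in for a real base64-encoded PNG
var allowed = CharacterSet.urlQueryAllowed
allowed.remove(charactersIn: "+/=")     // these would otherwise pass through unescaped
let encoded = base64.addingPercentEncoding(withAllowedCharacters: allowed)!
print(encoded)                          // iVBORw0KGgo%2B%2F%3D
```

Even with proper escaping, though, a base64-encoded PNG (base64 inflates the data by roughly 4/3) is far too long for a URL, which would explain the 414.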
EDIT: I took Dijkgraaf's recommendation to use a POST request instead of GET to work around the URI length limit. I am using the Alamofire library to do this. Here is my code:
@IBAction func analyzeImage(sender: UIButton) {
    let imageData = UIImagePNGRepresentation(mainImage.image)
    let base64ImageString = imageData.base64EncodedStringWithOptions(.allZeros)
    let params = [
        "apikey": API_KEY,
        "image": base64ImageString,
        "outputMode": "json"]
    let manager = Manager.sharedInstance
    // Passing all the headers you want!
    manager.session.configuration.HTTPAdditionalHeaders = [
        "Content-Type": "application/x-www-form-urlencoded"
    ]
    manager.request(.POST, ENDPOINT, parameters: params, encoding: .URL)
        .response { (request, response, data, error) in
            println(request)
            println(response)
            println(error)
        }
}
However, I get a "cannot-analyze:downstream-issue" error when I try this.
Here is the console output:
<NSMutableURLRequest: 0x1742040c0> { URL: http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords }
Optional(<NSHTTPURLResponse: 0x17082c1e0> { URL: http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords } { status code: 200, headers {
"Access-Control-Allow-Origin" = "*";
"Cache-Control" = "no-cache";
Connection = "keep-alive";
"Content-Length" = 326;
"Content-Type" = "application/json";
Date = "Mon, 08 Jun 2015 05:59:22 GMT";
Server = nginx;
"X-AlchemyAPI-CurrentVersion" = "12.15";
"X-AlchemyAPI-Error-Msg" = "cannot-analyze:downstream-issue";
"X-AlchemyAPI-Key" = [API KEY HIDDEN];
"X-AlchemyAPI-Params" = "sentiment=0&knowledgeGraph=0&detectedLanguage=unknown&submitLanguage=detect";
"X-AlchemyAPI-Status" = ERROR;
"X-AlchemyAPI-Total-Transactions" = 0;
} })
nil
Not sure what is going wrong, but the Alchemy documentation does state that POST requests should have the "Content-Type" header set to "application/x-www-form-urlencoded", and that header doesn't seem to be applied no matter what I try. Could this be the issue?
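One likely culprit (my reading, not something the docs spell out): `NSURLSession` copies its configuration when the session is created, so mutating `manager.session.configuration.HTTPAdditionalHeaders` after `Manager.sharedInstance` already built its session has no effect. Setting the header on the request itself sidesteps that. A sketch of that shape, written against current Swift/Foundation syntax (the endpoint is from my console output; `API_KEY` and the body are placeholders):

```swift
import Foundation

// Build the POST by hand so Content-Type lives on the request itself,
// instead of relying on session configuration mutated after session creation.
let endpoint = "http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords"
var request = URLRequest(url: URL(string: endpoint)!)
request.httpMethod = "POST"
request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
// "API_KEY" and "BASE64" are placeholders for the real key and encoded image.
request.httpBody = "apikey=API_KEY&outputMode=json&image=BASE64".data(using: .utf8)
```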
EDIT: I tried POSTing just the raw image data, again as Dijkgraaf suggested:
@IBAction func analyzeImage(sender: UIButton) {
    let imageData = UIImagePNGRepresentation(mainImage.image)
    //let base64ImageString = imageData.base64EncodedStringWithOptions(.allZeros)
    let request = HTTPTask()
    request.requestSerializer = HTTPRequestSerializer()
    request.requestSerializer.headers["Content-Type"] = "application/x-www-form-urlencoded"
    let params: Dictionary<String, AnyObject> = [
        "apikey": API_KEY,
        "imagePostMode": "raw",
        "image": imageData,
        "outputMode": "json"]
    request.POST(ENDPOINT, parameters: params, completionHandler: { (response: HTTPResponse) in
        println(response.headers)
    })
}
but I still get the same "cannot-analyze:downstream-issue" error.
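For what it's worth, my reading of the raw-post examples in the docs is that "raw" mode wants the unencoded image bytes as the entire POST body, with `apikey`, `imagePostMode`, and `outputMode` in the query string, rather than the image passed as one form parameter among the others. A sketch of that request shape in current Swift syntax (the key is a placeholder, and the four bytes stand in for real `UIImagePNGRepresentation(...)` data):

```swift
import Foundation

// Sketch: in "raw" mode the image bytes are the whole POST body;
// the remaining parameters travel in the query string.
let endpoint = "http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords"
let query = "apikey=API_KEY&imagePostMode=raw&outputMode=json"   // placeholder key
var request = URLRequest(url: URL(string: "\(endpoint)?\(query)")!)
request.httpMethod = "POST"
// Stand-in for UIImagePNGRepresentation(...) bytes (just the PNG magic number here).
request.httpBody = Data([0x89, 0x50, 0x4E, 0x47])
```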