How to URI-encode an image?
I am reading through the AlchemyAPI documentation here:

http://www.alchemyapi.com/api/image-tagging/image.html

They say that the image must be URI-encoded... what exactly does that mean?

Does it mean converting the image to a base64 string and then passing that to the request?

I've tried that, but I receive an HTTP 414 error (Request-URI Too Large).
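For scale: base64 output is roughly 4/3 the size of the raw bytes, and the base64 alphabet itself contains characters (`+`, `/`, `=`) that are reserved in URLs and would need percent-encoding on top. A rough check of the numbers (sketched in Python purely for the arithmetic; the 150 KB figure is an illustrative assumption):

```python
import base64
import math
import os
from urllib.parse import quote

# Stand-in for a ~150 KB PNG; random bytes are fine for a size check.
image_bytes = os.urandom(150 * 1024)
encoded = base64.b64encode(image_bytes)

# base64 turns every 3 input bytes into 4 output characters.
assert len(encoded) == 4 * math.ceil(len(image_bytes) / 3)
print(len(encoded))  # 204800 -- far past the ~2,083-character URI limit of some systems

# "URI-encoding" on top of that means percent-escaping the reserved
# characters that base64 output can contain.
assert quote("+/=", safe="") == "%2B%2F%3D"
```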

Here is my code where the request is made:

    @IBAction func analyzeImage(sender: UIButton) {

        var imageData = UIImagePNGRepresentation(mainImage.image)
        let base64ImageString = imageData.base64EncodedStringWithOptions(.allZeros)
        let requestString = ENDPOINT + "?apikey=" + API_KEY + "&image=" + base64ImageString + "&outputMode=json"

        let url = NSURL(string: requestString)
        let task = NSURLSession.sharedSession().dataTaskWithURL(url!) { (data, response, error) in
            println(NSString(data: data, encoding: NSUTF8StringEncoding))
        }

        task.resume()
    }

EDIT: I took into account Dijkgraaf's recommendation to use a POST request instead of GET to work around the URI length limit. I am using the Alamofire library to do this. Here is my code:

    @IBAction func analyzeImage(sender: UIButton) {

        var imageData = UIImagePNGRepresentation(mainImage.image)
        let base64ImageString = imageData.base64EncodedStringWithOptions(.allZeros)

        let params = [
            "apikey" : API_KEY,
            "image" : base64ImageString,
            "outputMode" : "json"]

        var manager = Manager.sharedInstance
        // Passing all the headers you want!
        manager.session.configuration.HTTPAdditionalHeaders = [
            "Content-Type": "application/x-www-form-urlencoded"
        ]
        manager.request(.POST, ENDPOINT, parameters: params, encoding: .URL)
            .response { (request, response, data, error) in
                println(request)
                println(response)
                println(error)
        }
    }

However, I get a "cannot-analyze:downstream-issue" error when I try this.

Here is the console output:

<NSMutableURLRequest: 0x1742040c0> { URL: http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords }
Optional(<NSHTTPURLResponse: 0x17082c1e0> { URL: http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords } { status code: 200, headers {
    "Access-Control-Allow-Origin" = "*";
    "Cache-Control" = "no-cache";
    Connection = "keep-alive";
    "Content-Length" = 326;
    "Content-Type" = "application/json";
    Date = "Mon, 08 Jun 2015 05:59:22 GMT";
    Server = nginx;
    "X-AlchemyAPI-CurrentVersion" = "12.15";
    "X-AlchemyAPI-Error-Msg" = "cannot-analyze:downstream-issue";
    "X-AlchemyAPI-Key" = [API KEY HIDDEN];
    "X-AlchemyAPI-Params" = "sentiment=0&knowledgeGraph=0&detectedLanguage=unknown&submitLanguage=detect";
    "X-AlchemyAPI-Status" = ERROR;
    "X-AlchemyAPI-Total-Transactions" = 0;
} })
nil

I'm not sure what is going wrong, but the Alchemy documentation does state that POST requests should have the "Content-Type" header set to "application/x-www-form-urlencoded", which doesn't seem to happen no matter what I try. Could this be the issue?

EDIT: I tried POSTing just the raw image data, again as Dijkgraaf suggested:

    @IBAction func analyzeImage(sender: UIButton) {

        var imageData = UIImagePNGRepresentation(mainImage.image)
        //let base64ImageString = imageData.base64EncodedStringWithOptions(.allZeros)
        var request = HTTPTask()
        request.requestSerializer = HTTPRequestSerializer()
        request.requestSerializer.headers["Content-Type"] = "application/x-www-form-urlencoded"
        let params: Dictionary<String, AnyObject> = [
            "apikey" : API_KEY,
            "imagePostMode" : "raw",
            "image" : imageData,
            "outputMode" : "json"]
        request.POST(ENDPOINT, parameters: params, completionHandler: { (response: HTTPResponse) in
            println(response.headers)
        })
    }

but I still get the same "cannot-analyze:downstream-issue" error.
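A plausible reason, sketched here in Python with a fake PNG header as input: form-urlencoding a binary value percent-escapes most of its bytes, so the server receives escaped text rather than a valid image, which is why raw mode wants the unencoded bytes in the request body instead.

```python
from urllib.parse import urlencode

# First 8 bytes of any PNG file (the magic number), as a stand-in image.
fake_png = bytes([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])

# Form-urlencoding escapes the non-printable bytes, so the payload is no
# longer a PNG at all by the time it reaches the server.
body = urlencode({"image": fake_png})
print(body)  # image=%89PNG%0D%0A%1A%0A
```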

Airdry answered 8/6, 2015 at 3:42 Comment(7)
"Requested image documents can be a maximum of 1 megabyte. Larger documents will result in a "content-exceeds-size-limit" error response." What is the size of the image you are trying to send? – Touchy
If the image is large you will probably want to POST a raw image as mentioned in that documentation, rather than try to URI-encode it and use GET with a query string. The URI max length is 2,083 characters in some systems, and even 255 characters in some others. – Momentarily
@Touchy my test image is 156 KB – Airdry
@Momentarily ahh I see... so the encoded string is probably larger than what the URI allows... I'll take a look – Airdry
@Momentarily I am trying a POST request now, but getting a "cannot-analyze:downstream-issue" error. I've updated my original question with more details. Any ideas? – Airdry
@Karuna-bdc "imagePostMode = raw - pass an unencoded image file using POST" Try it without base64 encoding and setting that parameter. – Momentarily
@Karuna-bdc I have exactly the same issue. Did you solve it? – Moreira

When using imagePostMode raw, you need to send the image data as the body of the POST request, and the other parameters should be included in the endpoint URL (for example, ENDPOINT = "http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords?apikey=API_KEY&outputMode=json&imagePostMode=raw"). I haven't worked with Swift so I don't know the best way to do this, but it's a little different from what you might expect.
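In outline, the request would look like this (a sketch in Python rather than Swift, since I don't know Swift; the endpoint comes from the question and the key and image bytes are placeholders):

```python
from urllib.parse import urlencode
from urllib.request import Request

ENDPOINT = "http://access.alchemyapi.com/calls/image/ImageGetRankedImageKeywords"

def build_raw_post(endpoint, api_key, image_bytes):
    # All parameters go into the query string; the body is just the
    # unencoded image bytes.
    query = urlencode({
        "apikey": api_key,
        "outputMode": "json",
        "imagePostMode": "raw",
    })
    return Request(endpoint + "?" + query, data=image_bytes, method="POST")

req = build_raw_post(ENDPOINT, "API_KEY", b"\x89PNG")  # placeholder bytes
print(req.full_url)
```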

Dilate answered 11/6, 2015 at 14:1 Comment(0)

The code below works for me.

    let image = UIImage(named: "your-image.png")
    getImageTag(image!)

    func getImageTag(image: UIImage) {
        let apiKey = "xxx-xxx-xxx-xxx-xxx"
        let url = "https://gateway-a.watsonplatform.net/calls/image/ImageGetRankedImageKeywords?imagePostMode=raw&outputMode=json&apikey=" + apiKey
        let myURL = NSURL(string: url)!
        let request = NSMutableURLRequest(URL: myURL)
        request.HTTPMethod = "POST"
        request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
        let imageData = UIImagePNGRepresentation(image)

        request.HTTPBody = imageData!
        let task = NSURLSession.sharedSession().dataTaskWithRequest(request) { data, response, error in
            // Your completion handler code here
            if let error = error {
                print("error: \(error)")
            }
            print(response)
            print(NSString(data: data!, encoding: NSUTF8StringEncoding))
        }
        task.resume()
    }

You can also test the image-tagging request with a curl command like the one below.

    curl --data-binary @your_image.png "https://gateway-a.watsonplatform.net/calls/image/ImageGetRankedImageKeywords?imagePostMode=raw&apikey=d3a529b15ac9ebe550a51006815xxxxxx"
Urbas answered 30/11, 2015 at 13:26 Comment(0)
