"We can not access the URL currently."
Asked Answered

I call the Google Vision API and it returns "We can not access the URL currently.", but the resource definitely exists and can be accessed.

https://vision.googleapis.com/v1/images:annotate

request content:

{
  "requests": [
    {
      "image": {
        "source": {
          "imageUri": "http://yun.jybdfx.com/static/img/homebg.jpg"
        }
      },
      "features": [
        {
          "type": "TEXT_DETECTION"
        }
      ],
      "imageContext": {
        "languageHints": [
          "zh"
        ]
      }
    }
  ]
}

response content:

{
  "responses": [
    {
      "error": {
        "code": 4,
        "message": "We can not access the URL currently. Please download the content and pass it in."
      }
    }
  ]
}
Sean answered 15/7, 2017 at 15:0 Comment(1)
did you get a fix / workaround for this? It's 2019 and still it does not work. Throws same error. – Ebert

As of August 2017, this is a known issue with the Google Cloud Vision API (source). It appears to reproduce for some users, but not deterministically, and I've run into it myself with many images.

Current workarounds include either uploading your content to Google Cloud Storage and passing its gs:// URI (note it does not have to be publicly readable on GCS), or downloading the image locally and passing it to the Vision API in base64 format.

Here's an example in Node.js of the latter approach:

// The Vision client comes from @google-cloud/vision; both setup lines
// below were implied but missing from the original snippet.
const vision = require('@google-cloud/vision')
const client = new vision.ImageAnnotatorClient()

const request = require('request-promise-native').defaults({
  encoding: 'base64'
})

// Inside an async function, fetch the image as base64 …
const data = await request(image)

// … and pass it inline to the Vision API
const response = await client.annotateImage({
  image: {
    content: data
  },
  features: [
    { type: vision.v1.types.Feature.Type.LABEL_DETECTION },
    { type: vision.v1.types.Feature.Type.CROP_HINTS }
  ]
})
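For completeness, the former (GCS) approach only changes the request shape. This is a minimal sketch; the bucket and object names are hypothetical placeholders, not values from the question:

```javascript
// Reference an object already uploaded to Google Cloud Storage
// instead of an http(s) URL the API has to fetch itself.
const gcsRequest = {
  image: {
    source: { imageUri: 'gs://my-bucket/homebg.jpg' }
  },
  features: [{ type: 'TEXT_DETECTION' }]
};

// With the client from the snippet above:
// const [response] = await client.annotateImage(gcsRequest);
```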
Millisent answered 14/8, 2017 at 14:52 Comment(1)
I've run tests and found that passing a Google Cloud Storage image URI, instead of sending base64 content, is >1.5x faster. So I would like to see a real solution instead of a workaround. – Obelisk

I faced the same issue when trying to call the API using a Firebase Storage download URL (although it worked initially).

After looking around, I found the example below in the API docs for Node.js.

Node.js example

// Imports the Google Cloud client libraries
const vision = require('@google-cloud/vision');

// Creates a client
const client = new vision.ImageAnnotatorClient();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Bucket where the file resides, e.g. my-bucket';
// const fileName = 'Path to file within bucket, e.g. path/to/image.png';

// Performs text detection on the GCS file
// (wrapped in an async function so that `await` is valid)
async function detectText() {
  const [result] = await client.textDetection(`gs://${bucketName}/${fileName}`);
  const detections = result.textAnnotations;
  console.log('Text:');
  detections.forEach(text => console.log(text));
}
Sublieutenant answered 10/3, 2019 at 12:19 Comment(0)

For me, the only thing that worked was uploading the image to Google Cloud Storage and passing its URI in the request.

Outboard answered 22/2, 2018 at 10:37 Comment(0)

In my case, I tried retrieving an image hosted on Cloudinary, our main image hosting provider.

When I accessed the same image hosted on our secondary, Rackspace-powered CDN, Google OCR was able to access it.

I'm not sure why Cloudinary didn't work when I could access the image in my web browser, but that was my little workaround.

Purcell answered 13/11, 2019 at 17:17 Comment(0)

I faced the same issue several days ago.

In my case, the problem was caused by using queues and issuing API requests at the same time from the same IP. After reducing the number of parallel processes from 8 to 1, the rate of these errors dropped from ~30% to less than 1%.

Maybe it will help somebody. I think there are internal limits on Google's side for fetching remote images (because, as people have reported, using Google Storage also solves the problem).
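A minimal sketch of serializing the requests, which is what reducing parallelism to 1 amounts to. `worker` is a stand-in for whatever per-image Vision API call you make:

```javascript
// Run one request at a time instead of firing them all in parallel.
async function processSequentially(items, worker) {
  const results = [];
  for (const item of items) {
    // Awaiting inside the loop serializes the requests.
    results.push(await worker(item));
  }
  return results;
}

// Usage sketch (annotateOne is a hypothetical per-URI helper):
// const results = await processSequentially(imageUris, annotateOne);
```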

Swell answered 21/4, 2021 at 13:51 Comment(1)
Thanks, that was exactly our issue. We were sending many requests in parallel, making them in series mostly fixed the problem. Not sure if Google is to blame here, making 10 requests to the vision API means Google is making 10 requests to the origin server concurrently, which might employ some rate-limiting itself. – Pied

I believe the error is caused by the Cloud Vision API refusing to download images on a domain whose robots.txt file blocks Googlebot or Googlebot-Image.

The workaround that others mentioned is in fact the proper solution: download the images yourself and either pass them in the image.content field or upload them to Google Cloud Storage and use the image.source.gcsImageUri field.
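Relative to the request in the question, the two working shapes look like this (the base64 content and bucket path below are placeholders, not real values):

```json
{
  "requests": [
    {
      "image": { "content": "<base64-encoded image bytes>" },
      "features": [{ "type": "TEXT_DETECTION" }]
    },
    {
      "image": { "source": { "gcsImageUri": "gs://your-bucket/homebg.jpg" } },
      "features": [{ "type": "TEXT_DETECTION" }]
    }
  ]
}
```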

Ventricose answered 11/3, 2019 at 8:56 Comment(0)

For me, this issue was resolved by passing the gs:// URI (e.g. gs://bucketname/filename.jpg) instead of the public URL or authenticated URL.

const vision = require('@google-cloud/vision');

function uploadToGoogleCloudlist(req, res, next) {
  const originalfilename = req.file.originalname;
  const bucketname = "yourbucketname";
  const imageURI = "gs://" + bucketname + "/" + originalfilename;

  const client = new vision.ImageAnnotatorClient({
    projectId: 'yourprojectid',
    keyFilename: './router/fb/yourprojectid-firebase.json'
  });

  async function getimageannotation() {
    const [result] = await client.imageProperties(imageURI);
    console.log("vision result: " + JSON.stringify(result));
    return result;
  }

  getimageannotation().then(function (result) {
    var datatoup = {
      url: imageURI || ' ',
      filename: originalfilename || ' ',
      available: true,
      vision: result,
    };
  })
  .catch(err => {
    console.error('ERROR CODE:', err);
  });

  next();
}
Auxin answered 16/1, 2021 at 22:46 Comment(0)

My hypothesis is that an overall (short) timeout exists on Google's side, which limits the number of files that can actually be retrieved.

Sending 16 images for batch labelling is possible, but only 5 or 6 will be labelled, because the origin web server hosting the images was unable to return all 16 files within <Google-Timeout> milliseconds.
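If that hypothesis holds, keeping batches small should help. A sketch of splitting the URIs into smaller batches before sending; the batch size of 5 is only a guess based on the observation above:

```javascript
// Split a list of image URIs into batches small enough that the
// origin server can serve each batch within Google's fetch window.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// e.g. send each batch from chunk(imageUris, 5) as its own
// batch-annotate request instead of all 16 images at once.
```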

Digiacomo answered 31/5, 2022 at 16:17 Comment(0)

In my case, the image URI that I specified in the request pointed at a large image, roughly 4000 × 6000 px. When I changed it to a smaller version of the image, the request succeeded.

Hylo answered 25/6, 2022 at 10:47 Comment(0)

The very same request works for me. It is possible that the image host was temporarily down and/or had issues on their side. If you retry the request, it will most likely work for you.

Shriner answered 15/7, 2017 at 18:30 Comment(0)
