google.api_core.exceptions.Forbidden: 403 Missing or insufficient permissions

Similar issues were submitted but none of the solutions work.

When trying to do this tutorial from the Google Cloud doc, I'm getting the following error when trying to access the datastore:

google.api_core.exceptions.Forbidden: 403 Missing or insufficient permissions.

The executed file can be found here.

I did execute the following commands:

gcloud auth application-default login
export GOOGLE_APPLICATION_CREDENTIALS="file.json"

Please note that I'm executing the file on a local computer. The goal is to perform reads/writes on the Datastore directly from a Google App Engine app.
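
For reference, the failing call boils down to something like this minimal sketch of the quickstart (the kind and entity names are illustrative):

from google.cloud import datastore

# Instantiate a client; it picks up credentials from the environment.
client = datastore.Client()

# Build a sample entity, mirroring the quickstart's Task example.
key = client.key('Task', 'sample_task')
entity = datastore.Entity(key=key)
entity['description'] = 'Buy milk'

# This is the call that raises 403 Forbidden when permissions are missing.
client.put(entity)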

Krystenkrystin answered 6/12, 2017 at 10:12 Comment(1)
Do you have a service account that allows you to access the resources? Have you followed the guide in this link? – Absquatulate

I was also having the same error message when running the tutorial from a local computer. I am using a service account (and not "gcloud auth application-default login"), as this is the preferred approach recommended in the Google tutorials.

However, after a lot of investigation I found that the problem was occurring due to an error in Google's documentation (it seems that the documentation is not up-to-date).

Setting up authentication

To run the client library, you must first set up authentication by creating a service account and setting an environment variable. Complete the following steps to set up authentication. For more information, see the GCP authentication documentation.

  1. In the GCP Console, go to the Create service account key page.
  2. From the Service account drop-down list, select New service account.
  3. In the Service account name field, enter a name.
  4. From the Role drop-down list, select Project > Owner.

The error in the documentation has to do with step 4 of the instructions. In the current implementation of the GCP console, the Role cannot be set directly from the Service Account Key page. Instead, you must go to the "IAM & admin" page to set the 'Owner' role:

In your Google Cloud console, select “IAM & admin” -> “IAM”.

You will see the “ADD” option, which allows you to set permissions for your new Service Account. Click “ADD”.

You can then enter the service account and role ('Owner' if you are following the instructions in the tutorial).
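
Alternatively, you can grant the role from the command line. A minimal sketch, with PROJECT_ID and SA_NAME as placeholders for your own project ID and service account name (roles/owner matches the tutorial; a narrower role such as roles/datastore.user is safer in practice):

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/owner"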

The following article, "The Missing Guide To Setting Up Google Cloud Service Accounts For Google BigQuery", provides more information. The article is written in the context of BigQuery, but it is equally applicable to Google Datastore:

https://blog.openbridge.com/the-missing-guide-to-setting-up-google-cloud-service-accounts-for-google-bigquery-6301e509b232

Schumann answered 9/12, 2018 at 2:1 Comment(2)
I tried to connect to Firestore from my own server, and this answer was handy. – Ashes
This is exactly what I was looking for. Thanks. – Humdinger

You're trying to use two different forms of authentication, which I wouldn't recommend.

From Google's documentation, gcloud auth application-default login is for when you want your local application to temporarily use your own user credentials for API access.

When you use export GOOGLE_APPLICATION_CREDENTIALS='file.json', per Google's documentation, you are pointing the environment variable at file.json. This means you need to create a Service Account, assign it the proper permissions, create/download a key (which in this case is file.json), and then the environment variable takes effect when your code is executed.
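
For illustration, a minimal Python sketch of this second approach, assuming file.json is a downloaded service-account key (datastore.Client() also picks up the GOOGLE_APPLICATION_CREDENTIALS variable automatically):

from google.cloud import datastore

# Explicitly load the service-account key instead of relying on the
# GOOGLE_APPLICATION_CREDENTIALS environment variable.
client = datastore.Client.from_service_account_json('file.json')

# Simple smoke test: this raises 403 Forbidden if the service account
# lacks Datastore permissions.
print(list(client.query(kind='Task').fetch(limit=1)))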

Since you're just getting started, I would recommend starting out using your Cloud Shell that's available in the Google Cloud Console and using an account that has full Owner rights on your Google Project. This will make it much easier for you to learn the basics (and then you can run it more securely later and/or in production). The Cloud Shell has everything installed and updated.

If you absolutely have to run this Quickstart through a local computer, I'd recommend the first option above: gcloud auth application-default login. You will need to have the Google Cloud SDK installed for your operating system. When you run the command, it opens a browser and prompts you to log into your Google Cloud account, which gives you permission to run the script locally.
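
Since Application Default Credentials checks the GOOGLE_APPLICATION_CREDENTIALS environment variable before falling back to user credentials, make sure a previously exported variable isn't overriding your login; a minimal sketch:

unset GOOGLE_APPLICATION_CREDENTIALS
gcloud auth application-default login

Hope this helps!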

Ellingson answered 15/2, 2018 at 16:57 Comment(1)
Executing gcloud auth application-default login instead of gcloud auth login definitely solved this 403 issue for me. – Croom

Forbidden Error

Actually, it's simpler: I have a solution that should work in any environment, be it:

  1. Local Development Environment
  2. Heroku
  3. AWS
  4. Azure
  5. Docker

or whatever cloud environment you fancy.

Define a function to get the NDB client with credentials as follows (imports are shown for completeness; is_heroku and config_instance come from this app's own code):

import json
import os

from google.cloud import ndb
from google.oauth2 import service_account


def get_client() -> ndb.Client:
    if is_heroku():
        # NOTE: when hosted on Heroku, or any platform other than GCP, the
        # service key should be saved as an environment variable
        app_credentials = json.loads(os.environ.get('GOOGLE_APPLICATION_CREDENTIALS'))
        credentials = service_account.Credentials.from_service_account_info(info=app_credentials)
        ndb_client: ndb.Client = ndb.Client(namespace="main", project=config_instance.PROJECT, credentials=credentials)
    else:
        # NOTE: on GCP, or any environment where default credentials are available
        ndb_client: ndb.Client = ndb.Client(namespace="main", project=config_instance.PROJECT)
    return ndb_client

Then create a Python wrapper (decorator) like this:


import functools
from typing import Callable, Optional  # Optional is used by the save_model example below


def use_context(func: Callable) -> Callable:
    """
    **use_context**
        inserts an ndb context so the wrapped function can work with ndb cloud databases.
    **NOTE**
        functions/methods need to be wrapped by this decorator whenever they
        interact with the database.

    :param func: function to wrap
    :return: function wrapped with ndb.context
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # each call gets a fresh client and its own context
        ndb_client = get_client()
        with ndb_client.context():
            return func(*args, **kwargs)
    return wrapper

Whenever you need the GCP NDB context, just apply the wrapper as follows:


@use_context
def save_model(model: Optional[ndb.Model]) -> Optional[ndb.Key]:
    """save ndb model to store and return ndb.Key"""
    return model.put() if isinstance(model, ndb.Model) else None

NOTE: if you are on Heroku, or any cloud offering other than GCP, the contents of the JSON key file need to be set as the GOOGLE_APPLICATION_CREDENTIALS environment variable (the variable holds the file's contents, not its path).

On local development you can save the file on your local drive.

On Docker you can set the environment variable or use the file.

Control which is which with this logic:

if is_heroku():

In my case it's just a function that tries to read an environment variable to see if the app is running on Heroku or not. In your case it could be anything, as long as it tells you which environment you are running in, so you can choose to load your key from a local file or from an environment variable.
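
A minimal sketch of such a check (is_heroku is this answer's own helper; Heroku sets a DYNO environment variable inside its dynos):

import os

def is_heroku() -> bool:
    # Heroku sets the DYNO environment variable on its dynos.
    return 'DYNO' in os.environ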

Loading JSON files from environment variables

This is just so you can load the contents of a JSON file from an environment variable:

app_credentials = json.loads(os.environ.get('GOOGLE_APPLICATION_CREDENTIALS'))

The above allows you to save the actual contents of a JSON file to an environment variable and then load it back as JSON, which avoids having to save the file in the src folder or any other folder.
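
For example, on Heroku you could set the variable to the key file's contents like this (a sketch, assuming the Heroku CLI and a local key file named file.json):

heroku config:set GOOGLE_APPLICATION_CREDENTIALS="$(cat file.json)"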

Bream answered 11/9, 2021 at 10:57 Comment(0)

If the above solutions do not work, go to your Firebase console, click on the settings icon, then:

  1. Navigate to the service account tab
  2. Click on Generate new private key

and set the GOOGLE_APPLICATION_CREDENTIALS variable to the JSON file's path. This worked for me :)

Aplasia answered 29/4, 2021 at 15:58 Comment(0)

I was facing the same issue until I added the "Cloud Datastore" permission to my service account.

Rota answered 5/2, 2022 at 13:48 Comment(2)
Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center. – Endomorphic
As it's currently written, your answer is unclear. Please edit to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers in the help center. – Endomorphic

You need to check that your service account's permissions were properly selected. After you add a new permission, stop the instance/cluster, go to Edit, go to Service account, and reselect the account; this makes sure the instance doesn't keep the old permission config in cache and picks up all the newly added permissions. Then restart the VM and run your task normally.
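
To verify which roles the service account actually holds, one option is the following sketch (PROJECT_ID and SA_NAME are placeholders for your own values):

gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
    --format="table(bindings.role)"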

Jehial answered 11/1 at 9:56 Comment(0)