Credentials Error when integrating Google Drive with BigQuery
I am using Google BigQuery and want to integrate it with Google Drive. In BigQuery I give the Google spreadsheet URL to upload my data, and the table updates fine, but when I run this query in the Google add-on (OWOX BI BigQuery Reports):

Select * from [datasetName.TableName]

I am getting an error:

Query failed: tableUnavailable: No suitable credentials found to access Google Drive. Contact the table owner for assistance.

Lepidolite answered 23/5, 2016 at 1:6 Comment(3)
I am just giving a basic SQL query like this --Lepidolite
select * from [datasetname.tablename], in the Google add-on (OWOX BI BigQuery Reports). The code is right, but I am getting the error: Query failed: tableUnavailable: No suitable credentials found to access Google Drive. Contact the table owner for assistance. Can you please help me with how to add credentials in console.cloud.google.com/apis/credentials? --Lepidolite
This uses Federated Data Sources. When following these steps and creating the table, you are asked to allow the Google Drive View and manage scope and the BigQuery View and manage scope. If you allowed these, was the BigQuery table then created properly? Any errors using the Web UI? --Scummy

I just faced the same issue in some code I was writing - it might not directly help you here, since it looks like you are not responsible for the code, but it might help someone else, or you can ask the person who does write the code you're using to read this :-)

So I had to do a couple of things:

  1. Enable the Drive API for my Google Cloud Platform project in addition to BigQuery.
  2. Make sure that your BigQuery client is created with both the BigQuery scope AND the Drive scope.
  3. Make sure that the Google Sheets you want BigQuery to access are shared with the service-account email address that your Google Cloud Platform project identifies itself as.

After that I was able to successfully query the Google Sheets-backed tables from BigQuery in my own project.
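
For reference, here is a minimal sketch of step 2 in Python, assuming the google-cloud-bigquery and google-auth packages and a service-account key file; the file name, dataset and table names are placeholders, not the asker's actual setup:

# Minimal sketch: create a BigQuery client that carries both the BigQuery and Drive scopes.
# Assumes google-cloud-bigquery and google-auth are installed and that key.json is a
# service-account key for your project (all names below are placeholders).
from google.cloud import bigquery
from google.oauth2 import service_account

scopes = [
    'https://www.googleapis.com/auth/bigquery',
    'https://www.googleapis.com/auth/drive',  # required for Sheets-backed tables
]

credentials = service_account.Credentials.from_service_account_file(
    'key.json', scopes=scopes)

client = bigquery.Client(credentials=credentials,
                         project=credentials.project_id)

# Remember to share the Google Sheet with the service account's email address.
for row in client.query('SELECT * FROM `my_dataset.my_sheet_table`').result():
    print(row)

The same idea applies to other client libraries: whatever you use, the credentials have to carry the Drive scope in addition to the BigQuery scope.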

Cheriecherilyn answered 1/6, 2016 at 8:11 Comment(0)

What was previously said is right:

  1. Make sure that your dataset in BigQuery is also shared with the Service Account you will use to authenticate.
  2. Make sure your Federated Google Sheet is also shared with the service account.
  3. The Drive API should be enabled as well.
  4. When using the OAuth client you need to inject both scopes, one for Drive and one for BigQuery.

If you are writing Python:

  1. credentials = GoogleCredentials.get_application_default() does not work here, because you can't inject scopes into it (at least I didn't find a way :D).

  2. Build your request from scratch:

    # Assumes oauth2client, httplib2 and google-api-python-client are installed.
    from httplib2 import Http
    from oauth2client.service_account import ServiceAccountCredentials
    from googleapiclient.discovery import build

    # Both scopes are needed: Drive for the Sheets-backed table, cloud-platform for BigQuery.
    scopes = (
        'https://www.googleapis.com/auth/drive.readonly',
        'https://www.googleapis.com/auth/cloud-platform')

    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        '/client_secret.json', scopes)

    http = credentials.authorize(Http())

    bigquery_service = build('bigquery', 'v2', http=http)

    # Run a synchronous query against the federated (Sheets-backed) table.
    query_request = bigquery_service.jobs()
    query_data = {
        'query': (
            'SELECT * FROM [test.federated_sheet]')
    }

    query_response = query_request.query(
        projectId='hello_world_project',
        body=query_data).execute()

    print('Query Results:')
    for row in query_response['rows']:
        print('\t'.join(field['v'] for field in row['f']))
    
Bans answered 5/7, 2016 at 15:11 Comment(0)

This likely has the same root cause as: BigQuery Credential Problems when Accessing Google Sheets Federated Table

Accessing federated tables in Drive requires additional OAuth scopes, and your tool may only be requesting the BigQuery scope. Try contacting your vendor and asking them to update their application.

Behlke answered 27/5, 2016 at 23:35 Comment(0)

If you're using pd.read_gbq() as I was, then this would be the best place to get your answer: https://github.com/pydata/pandas-gbq/issues/161#issuecomment-433993166

import pandas_gbq
import pydata_google_auth
import pydata_google_auth.cache

# Instead of get_user_credentials(), you could do default(), but that may not
# be able to get the right scopes if running on GCE or using credentials from
# the gcloud command-line tool.
credentials = pydata_google_auth.get_user_credentials(
    scopes=[
        'https://www.googleapis.com/auth/drive',
        'https://www.googleapis.com/auth/cloud-platform',
    ],
    # Use reauth to get new credentials if you haven't used the drive scope
    # before. You only have to do this once.
    credentials_cache=pydata_google_auth.cache.REAUTH,
    # Set auth_local_webserver to True to have a slightly more convenient
    # authorization flow. Note, this doesn't work if you're running from a
    # notebook on a remote server, such as with Google Colab.
    auth_local_webserver=True,
)

sql = """SELECT state_name
FROM `my_dataset.us_states_from_google_sheets`
WHERE post_abbr LIKE 'W%'
"""

df = pandas_gbq.read_gbq(
    sql,
    project_id='YOUR-PROJECT-ID',
    credentials=credentials,
    dialect='standard',
)

print(df)
Ralli answered 22/4, 2019 at 16:9 Comment(0)
