Load schema json file to create table or job schema [closed]
If I already have a schema file, for example schema.json, how can I load it to create a table or job schema using the google-cloud-python API?
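
For context, a BigQuery schema file is a JSON array of field definitions, each with a name, a type, an optional mode, and (for RECORD columns) nested fields; this is the format `bq show --schema` emits. A minimal example with made-up field names:

```json
[
  {"name": "id", "type": "INTEGER", "mode": "REQUIRED"},
  {"name": "first_name", "type": "STRING", "mode": "NULLABLE"},
  {"name": "dob", "type": "DATE", "mode": "NULLABLE"}
]
```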

Graehl answered 28/11, 2016 at 20:0 Comment(0)
You can try this solution:

import json
from google.cloud import bigquery

schema = []
with open('schema.json') as f:
    for col in json.load(f):
        # Carry the mode through as well; SchemaField defaults to NULLABLE,
        # so REQUIRED/REPEATED would otherwise be lost.
        schema.append(
            bigquery.SchemaField(col['name'], col['type'],
                                 mode=col.get('mode', 'NULLABLE')))

client = bigquery.Client()
table_ref = "myproject.mydataset.mytable"
table = bigquery.Table(table_ref, schema=schema)
table = client.create_table(table)
Amphictyony answered 6/9, 2020 at 18:4 Comment(2)
This won't work for schemas with nested RECORD type fields, unfortunately. – Ladybug
#67459105 – Ladybug
I don't think this is currently possible. This is why I tend to use the bq CLI when I want to load complicated JSON files with many different columns.

Something like this:

bq load --source_format=NEWLINE_DELIMITED_JSON \
  [PROJECT_ID]:[DATASET].[TABLE] \
  gs://[BUCKET]/[FILENAME].json \
  [PATH TO SCHEMA FOLDER]/schema.json

Espinoza answered 18/1, 2018 at 0:43 Comment(0)
In case anyone finds this question 3 years later: this can now be done in Cloud Shell, found here: https://console.cloud.google.com/cloudshell/

If you are not comfortable using the command line to upload files, you can click the editor icon and upload via drag and drop.

Google Cloud Platform documentation on uploading JSON data on command line, including with a schema file, can be found here: https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-json#bigquery_load_table_gcs_json-cli

Quietus answered 18/4, 2019 at 15:26 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.