Google BigQuery Delete Rows?
Does anyone know of any plans to add support for deleting part of the data in a Google BigQuery table? The issue we have right now is that we are using it for analytics of data points we collect over time. We want to run queries over the last X days of data, but beyond that window we no longer need to keep the data in BigQuery.

The only way we can currently think of to delete the data is to delete the entire table, then recreate it and reload it with the last X days of data. That, however, would require us to also store our data in daily CSV files, which isn't optimal.

Any recommendations on how to handle this, or is a delete-rows query coming in the near future?

Distich answered 15/5, 2012 at 15:36 Comment(0)
2016 update: BigQuery can now delete and update rows -- Fh

https://cloud.google.com/bigquery/docs/reference/standard-sql/dml-syntax


Thanks for describing your use case. BigQuery is append-only by design. We currently don't support deleting single rows or a batch of rows from an existing dataset.

Currently, to implement a "rotating" log system you must either:

  1. Create a new table each day (and delete older tables if necessary), or
  2. Append your data to a single table and filter by time/date in your queries.

I would actually recommend creating a new table for each day. Since BigQuery charges by the amount of data scanned per query, this would be the most economical option for you, rather than querying over the entire massive dataset every time.
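The table-per-day rotation described above can be sketched as plain naming logic: derive today's table name from the date, and drop any table whose date suffix has fallen outside the retention window. The `events_` prefix and the 30-day retention below are illustrative assumptions, not part of the answer.

```python
from datetime import date, timedelta

# Assumed, hypothetical naming convention and retention window.
PREFIX = "events_"
RETENTION_DAYS = 30

def table_for(day: date) -> str:
    # BigQuery-style date-sharded name, e.g. events_20120515
    return PREFIX + day.strftime("%Y%m%d")

def expired_tables(existing: list[str], today: date) -> list[str]:
    # Tables whose date suffix falls before the retention cutoff can be deleted.
    cutoff = today - timedelta(days=RETENTION_DAYS)
    expired = []
    for name in existing:
        suffix = name[len(PREFIX):]
        day = date(int(suffix[:4]), int(suffix[4:6]), int(suffix[6:8]))
        if day < cutoff:
            expired.append(name)
    return expired
```

A daily job would create `table_for(today)` and delete everything returned by `expired_tables(...)`, e.g. via the `bq` CLI or the API.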

By the way - how are you currently collecting your data?

Telium answered 15/5, 2012 at 15:48 Comment(7)
Good suggestion. We currently store the data in MySQL, dump what we actually process to a CSV, and upload it. I've been looking for any limit on the number of tables in a specific dataset but can't find one. Is it correct that there is no limit?Distich
Just an FYI if you are taking the rotating tables approach -- BigQuery now supports table expiration time. You can update the table with the bq tool using bq update --expiration <time_from_now_in_seconds> dataset.table.Scratches
@Distich BigQuery doesn't have a limit on the number of tables you can create per dataset.Telium
Is deletion of data based on user-specified filters still not possible? I'm wondering: if I'm streaming data into BigQuery and I get some duff data, is it possible to clear it out? Are there any patterns for handling duff data?Toffee
Created a related question, we might have found a way on how to do it: #34839122Selfexistent
Finally I can delete and update! I've been waiting for this feature for yearsGraceless
There are some constraints on deleting data from a table. If a streaming write operation is in progress on the table, then you can't delete data.Sepulchral
To delete records in BigQuery, you first have to enable standard SQL.

Steps for enabling standard SQL:

  1. Open the BigQuery web UI.
  2. Click Compose Query.
  3. Click Show Options.
  4. Uncheck the Use Legacy SQL checkbox.

This enables the BigQuery Data Manipulation Language (DML), which lets you update, insert, and delete data in BigQuery tables.

Now you can write a plain SQL query to delete the record(s):

DELETE [FROM] target_name [alias] WHERE condition
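For instance, filling in the syntax above (the project, dataset, table, and column names here are hypothetical):

```sql
-- Delete all rows older than a cutoff date; names are illustrative.
DELETE FROM `my-project.my_dataset.events`
WHERE event_date < '2017-01-01';
```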

You can refer: https://cloud.google.com/bigquery/docs/reference/standard-sql/dml-syntax#delete_statement

Surfboat answered 16/3, 2017 at 10:51 Comment(0)
#standardSQL If you want to delete all the rows, use the code below:

delete from `project-id.data_set.table_name` where 1=1;

If you want to delete particular rows, use the code below:

delete from `project-id.data_set.table_name` where (your condition)
Zalea answered 29/7, 2020 at 14:21 Comment(2)
deleting all rows does not work.Hobard
did you try using delete from `project-id.data_set.table_name` where 1=1; ?Zalea
If you want to delete all rows in a table then

DELETE FROM {dataset}.{table} WHERE TRUE
Montespan answered 23/4, 2020 at 22:31 Comment(3)
I get an error message telling me UPDATE or DELETE statement over table tenor.trending_terms would affect rows in the streaming buffer, which is not supportedRhoades
In case anyone else is wondering, it takes BigQuery time to ingest/distribute the streaming buffer. If you are getting the would affect rows in the streaming buffer, which is not supported error, simply wait a few minutes. This answer seems to indicate it can take up to 90 minutes. I found in my case ~5 minutes was enough to clear it out.Hopkins
I'm excited to announce that mutating DML statements (UPDATE, DELETE, MERGE) over recently streamed data via the BigQuery Storage Write API* is now supported in public preview!! Check it out the feature and how to allowlist your project here: cloud.google.com/bigquery/docs/…. *This feature only supports recently streamed data via the BigQuery Storage Write API, not the legacy insertAll streaming API.Unavoidable
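The wait-and-retry workaround from the comments above can be sketched as a small helper. This is a generic retry loop, not an official API: `run_delete` is any callable you supply (for example, one wrapping `client.query(...).result()` from the `google-cloud-bigquery` client), and the matched error text and wait time are assumptions based on the comments.

```python
import time

def delete_with_retry(run_delete, attempts=5, wait_seconds=300):
    """Retry a DELETE that fails because rows are still in the streaming buffer.

    run_delete: callable that executes the DELETE and raises on failure.
    Errors not mentioning the streaming buffer are re-raised immediately.
    """
    for attempt in range(attempts):
        try:
            return run_delete()
        except Exception as exc:
            if "streaming buffer" not in str(exc) or attempt == attempts - 1:
                raise
            # The buffer typically drains within minutes; wait and retry.
            time.sleep(wait_seconds)
```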
Note: this only works with standard SQL; DML statements such as DELETE are not supported in legacy SQL.

You could try the following:

DELETE FROM {dataset}.{table} WHERE {constraint}
Fullerton answered 27/2, 2019 at 16:33 Comment(1)
delete from dataset.table where trueMorganstein
What worked for me:

TRUNCATE TABLE `project_id.dataset.table_name`
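For removing every row, TRUNCATE TABLE and an unconditional DELETE give the same result, but TRUNCATE is a metadata operation and does not scan the table, so it avoids the bytes-scanned cost of the DML form (table name below is illustrative):

```sql
-- Removes all rows without scanning the table:
TRUNCATE TABLE `my-project.my_dataset.events`;

-- Equivalent result via DML, billed as a scan:
DELETE FROM `my-project.my_dataset.events` WHERE TRUE;
```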
Brominate answered 27/4, 2021 at 19:24 Comment(0)
Also, if applicable, you can try BigQuery's OMIT RECORD IF (legacy SQL) to return all records except those you want to delete, then create a new table from the query result.

(example taken from Google reference docs)

SELECT * FROM
  publicdata:samples.github_nested

OMIT RECORD IF
  COUNT(payload.pages.page_name) <= 80;

Source: https://cloud.google.com/bigquery/query-reference

Littlejohn answered 15/7, 2016 at 13:57 Comment(0)