Is there a way to filter pg_dump by timestamp for PostgreSQL?

I have a database that needs backing up, but only for specific tables and a specific timestamp range, for example from the 1st of October to the 15th of October. After searching multiple sites, I have not found any method that suits my requirements.

Let's say I have database_A, and database_A has 15 tables. I want to use pg_dump to back up 10 tables from database_A, restricted to the 1st of October through the 15th of October, all into one file. Below is what I have managed so far, but I haven't worked out the date portion yet, as I'm not entirely sure how.

pg_dump -U postgres -t"\"table_1\"" -t"\"table_2\"" database_A > backup.csv

The command above works if I want to back up multiple tables into one file, but it backs up each table in its entirety, from start to end.

I would much appreciate it if someone could help me with this, as I am still mostly a beginner. Thank you!

Glossolalia answered 2/11, 2021 at 10:01 Comment(3)
If the options of pg_dump can't implement the data selection on timestamps, you can use COPY instead; see the manual. – Newmodel
pg_dump has no functionality for that. – Furfural
@Colin did you get any solution to this? – Agio

If the data you're copying has a column named timestamp, you can use psql and the COPY command to accomplish this:

# Optional: clear existing table since COPY FROM will append data
psql -c "TRUNCATE TABLE my_table" target_db

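# Stream only the matching rows from the source database straight into the target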
psql -c "COPY (SELECT * FROM my_table WHERE timestamp >= '...' AND timestamp <= '...') TO STDOUT" source_db | psql -c "COPY my_table FROM STDIN" target_db

You can repeat this pattern for as many tables as necessary. I've used this approach before to copy a subset of live data into a development database and it works quite well, especially if you put the above commands into a shell script.
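For example, here is a minimal sketch of such a shell script. The database names (source_db, target_db), the table list, the column named timestamp, and the date range are all assumptions, so substitute your own values:

#!/bin/bash
# Sketch: copy a date-filtered subset of several tables between databases.
# Assumes every table has a column named "timestamp"; adjust to your schema.
set -euo pipefail

SOURCE_DB=source_db
TARGET_DB=target_db
START='2021-10-01'   # hypothetical range matching the question
END='2021-10-15'

for TABLE in table_1 table_2 table_3; do
    # Clear the target table first, since COPY FROM appends data
    psql -c "TRUNCATE TABLE ${TABLE}" "$TARGET_DB"

    # Stream only the rows in the date range from source to target
    psql -c "COPY (SELECT * FROM ${TABLE} WHERE timestamp >= '${START}' AND timestamp <= '${END}') TO STDOUT" "$SOURCE_DB" \
        | psql -c "COPY ${TABLE} FROM STDIN" "$TARGET_DB"
done

One caveat: each table is copied in its own pair of psql calls, so the tables are not read from a single consistent snapshot. On a busy live source database, keep that in mind.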

Bracknell answered 26/8, 2022 at 22:32 Comment(0)
