I used this command to back up a 200 GB database (PostgreSQL 9.1, Windows 7 x64):
pg_dump -Z 1 db_name > backup
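As far as I understand, because no -F format option was given, this produces a plain-text SQL dump that is gzip-compressed as a whole (that is what -Z does for the plain format), so the usual way to load it back would be roughly the following rather than pg_restore (a sketch, assuming gzip and psql are on the PATH and db_name is the target database):
gunzip -c backup | psql -d db_name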
It created a 16 GB file, which I think is fine, because previous backups that worked (and were packed by external tools) had a similar size. Now, when I try to restore it into PG 9.2 using pg_restore, I get this error:
input file does not appear to be a valid archive
With pg_restore -Ft:
[tar archiver] corrupt tar header found in ▼ (expected 13500752, computed 78268) file position 512
Gzip also reports that it is corrupted. When I open the backup file in Total Commander, the inner file is only 1.8 GB.
While looking for a solution, I read that the dump should probably have been done with the -Cf parameter.
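If that refers to the custom format (-Fc), which is my assumption, the dump/restore pair would look roughly like this (a sketch with db_name as a placeholder, not what I actually ran):
pg_dump -Fc -Z 1 db_name -f backup.dump
pg_restore -d db_name backup.dump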
What format is the file in right now? Is it only tar, or also gzip (WinRAR shows gzip)? Is there any way to restore it properly, or is it somehow corrupted (there was no error during the dump)? Could it be due to file size limitations of tar or gzip?
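In case it helps, this is roughly how I would peek at the file's format and test the compressed stream (a sketch, assuming a Unix-like shell such as Git Bash with xxd and gzip available):
head -c 5 backup | xxd   # 1f 8b means a gzip stream; the text PGDMP means a custom-format archive
gzip -t backup           # tests the gzip stream integrity end to end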
With pg_dump -Z1 my_new_database > backup the backup is corrupted every time, but against the default postgres database, pg_dump -Z1 postgres > backup works fine. How is that possible? – Legumin