mysqldump - exclude some table data

11

72

Is it possible, using mysqldump, to export the entire database structure but exclude certain tables' data from the export?

Say the database has 200 tables. I wish to export the structure of all 200 tables, but ignore the data of 5 specific tables.

If this is possible, how is it done?

Compliment answered 27/11, 2012 at 21:12 Comment(3)
I think you would need to do two backups - one with just the structure for all tables, and another with just the data for all but the 5 tables.Meletius
But then you risk inconsistent results.Yarvis
Marty, please help the forum. Look at @Kantholy's answer down below and strongly consider making it the accepted answer. The one you have accepted is NOT the most useful or even relevant.Kelter
13

As per the mysqldump docs:

mysqldump name_of_db --ignore-table=name_of_db.name_of_table
Lynea answered 27/11, 2012 at 21:17 Comment(5)
It is only the data to be ignored; I need the structure.Compliment
Then do one dump with structure only (--no-data), then another dump that skips the structure but dumps the data, skipping your unwanted tables. Or simply append some DELETE FROM ... queries at the end of the global dump to clear the data from those tables.Lynea
Yeah, so export all that data, store all that data, import all that data, and then just delete. A strange definition of exclusion.Yarvis
This answer does not really do what the OP is asking for (ie. dump all tables' structure but skip one table's data)Faena
Yeah not what is requested, i do like AmitP's answer, since it will be usable in one command and result in one fileConservative
147

This will produce export.sql with the structure of all tables and the data of all tables except table_name:

mysqldump --ignore-table=db_name.table_name db_name > export.sql
mysqldump --no-data db_name table_name >> export.sql
Sacculate answered 7/4, 2015 at 15:10 Comment(6)
This is a better answer because it skips the unwanted data but gets the wanted schema, and I like that it still makes one file. Note that you can use the --ignore-table option multiple times to ignore multiple tables.Onomastic
This export.sql will create a database with empty tables. The calls should be executed the other way around: First the one with --no-data, then the one with --ignore-tableSmith
@Smith The first command simply dumps the whole db to export.sql, excluding a table called 'table_name'. The second command just adds the structure of 'table_name' without its data. Please test it and see how it works, or at least explain your statement about empty tables.Sacculate
@Sacculate Thank you for your quick reply. When I tested it the first time I forgot to add table_name to the --no-data command. This added a DROP TABLE and a CREATE TABLE for every table - which was not what I wanted :)Smith
For multiple tables: mysqldump --ignore-table=db_name.table1 --ignore-table=db_name.table2 [...] db_name > export.sqlGibbie
In the event that you have 100 tables, you want the data+structure for 80, but want structure-only for 20, @Lhary 's answer should be followed instead of this one.Wu
82

I think AmitP's solution is already great. To improve it even further, it makes sense to create all tables (structure) first and then fill them with data, except for the "excluded" ones:

mysqldump --no-data db_name > export.sql
mysqldump --no-create-info --ignore-table=db_name.table_name db_name >> export.sql

If you want to exclude more than one table, simply use the --ignore-table directive multiple times (in the 2nd command); see the mysqldump help:

--ignore-table=name   Do not dump the specified table. To specify more than one
                      table to ignore, use the directive multiple times, once
                      for each table.  Each table must be specified with both
                      database and table names, e.g.,
                      --ignore-table=database.table
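With many excluded tables, the flag list for the second command gets long. A small helper can generate it; this is only a sketch, and the function name plus the example table names (logs, sessions) are invented for illustration:

```shell
# Sketch: build the --ignore-table flags for the second (data) pass from
# a database name plus a list of excluded tables. All names are placeholders.
build_ignore_flags() {
  db="$1"; shift
  flags=""
  for tbl in "$@"; do
    flags="$flags --ignore-table=${db}.${tbl}"
  done
  # drop the leading space before printing
  printf '%s' "${flags# }"
}

# Intended usage (not run here, needs a live server):
#   mysqldump --no-data db_name > export.sql
#   mysqldump --no-create-info $(build_ignore_flags db_name logs sessions) db_name >> export.sql
```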
Lhary answered 18/11, 2015 at 12:56 Comment(11)
@dspjm, can you explain why this answer is not as good as AmitP's answer? :)Jacie
In fact this is far superior to AmitP's answer. It is the more logical way to approach the problem and creates a file which has a more readable order, should that be required.Singlet
I agree this answer is nicer than AmitP's but it shares the same issue that the autoincrement value stored in the schema portion will not be correct if a row gets added to a table between the two mysqldumps. I'm not sure if there is a "real" fix for that though...Agonist
@Agonist execute before: FLUSH TABLES WITH READ LOCK; execute after: UNLOCK TABLES;Lhary
@Agonist to avoid the autoincrement/table modifications between the two dumps -> you can simply lock the tables, do the backup and release them again.Lhary
@Lhary I see. Would this not result in making the application unresponsive during the db dump time?Agonist
@Agonist no, the database is only locked for create/update/delete commands. read is still possible. Of course, if you have a write intensive application, you need a "maintenance" window for doing the backups my way.. if you have this requirement, mysqldump is not the way to go anyway :)Lhary
I wouldn't say this is "far superior" to @AmitP's answer, but it's definitely the cleaner approach. All structure first and then only the data you need.Hairworm
Obviously superior to @Sacculate s answer for the fact that it doesn't write CREATE statements for tables that already exist in the target.Dewie
would be nice if they had adopted more of an include/exclude table pattern in mysqldump for the inverse situation that ignore-table is useful.Calycine
Thank you for this, I tried improving it by not writing uncompressed data, see another answer hereAdrieneadrienne
19

I am a new user, and do not have enough reputation to vote or comment on answers, so I am simply sharing this as an answer.

@kantholy clearly has the best answer.

@AmitP's method dumps all structure and data to a file, and then appends a DROP/CREATE TABLE statement at the end. The resulting file will still require you to import all of your unwanted data before simply destroying it.

@kantholy's method dumps all the structure first, and then only the data for the tables you do not ignore. This means your subsequent import will not have to take the time to import all the data you do not want, which is especially important if you have very large amounts of data to skip.

To recap, the most efficient answer is:

mysqldump --no-data db_name > export.sql
mysqldump --no-create-info --ignore-table=db_name.table_name1 [--ignore-table=db_name.table_name2, ...] db_name >> export.sql
Eleonoreeleoptene answered 31/3, 2016 at 4:24 Comment(1)
I'm confused, what are you adding to this Q&A? There's not 50 answers on here that need to be summarized by yet another answer. There are 3 answers, two of which are very similar and can be easily read by any person landing here looking for a solution. It's too bad you can't comment, but to be honest, even if you could, this doesn't even belong as a comment. It's just adding noise.Hairworm
10

In mysqldump from MariaDB 10.1 or higher you can use --ignore-table-data:

mysqldump --ignore-table-data="db_name.table" db_name > export.sql

For multiple tables repeat the --ignore-table-data option:

mysqldump --ignore-table-data="db_name.table_1" --ignore-table-data="db_name.table_2" db_name > export.sql

From MariaDB mysqldump docs:

--ignore-table-data=name

Do not dump the specified table data (only the structure). To specify more than one table to ignore, use the directive multiple times, once for each table. Each table must be specified with both database and table names. From MariaDB 10.1.46, MariaDB 10.2.33, MariaDB 10.3.24, MariaDB 10.4.14 and MariaDB 10.5.3. See also --no-data.

Lewallen answered 7/7, 2021 at 10:25 Comment(2)
Did not work with mysqldump 8 against MariaDB 10.5.13Encratis
I updated to the mariadb-client in version 10 which just adds a symlink to mysqldump. This worked :)Encratis
3

Previous answers don't fix the AUTO_INCREMENT issue when exporting the structure, and don't show how to export only specific rows from a table.

To go further, we must:

1/ Export the structure

mysqldump --no-data db_name | sed 's/ AUTO_INCREMENT=[0-9]*\b//g' > export-structure.sql

2/ Export only data and ignores some tables

mysqldump --no-create-info --ignore-table=db_name.table_name1 [--ignore-table=db_name.table_name2, ...] db_name >> export-data.sql

3/ Export specific data in one table

mysqldump --no-create-info db_name --tables table_name --where="id not in ('1', '2', ...)" > export-table_name-data.sql

I tried to use the --skip-opt option to reset AUTO_INCREMENT, but it also deletes the AUTO_INCREMENT definition on the field, the CHARSET, and other things.
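To see what the sed expression in step 1 actually strips, here it is applied to a single made-up CREATE TABLE line instead of a live mysqldump stream (the table definition is invented; the \b word boundary requires GNU sed):

```shell
# Sketch: the AUTO_INCREMENT-stripping filter from step 1, run on a
# fabricated line so the effect is visible without a server (GNU sed).
line='CREATE TABLE `t` (`id` int) ENGINE=InnoDB AUTO_INCREMENT=4242 DEFAULT CHARSET=utf8mb4;'
echo "$line" | sed 's/ AUTO_INCREMENT=[0-9]*\b//g'
# → CREATE TABLE `t` (`id` int) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
```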

Abbeyabbi answered 30/4, 2019 at 8:39 Comment(0)
2

Another possibility I use is to filter out the lines that insert data into the unwanted table.

The principle is to filter out the INSERT INTO lines using grep -v:

mysqldump name_of_db | grep -v 'INSERT INTO \`name_of_table\` VALUES'

or

mysqldump name_of_db | grep -v 'INSERT INTO \`name_of_db\`.\`name_of_table\` VALUES'

You can easily pipe that into a gzipped file, with a separate error file:

mysqldump name_of_db 2> /path/name_of_db.err | grep -v 'INSERT INTO \`name_of_db\`.\`name_of_table\`' | gzip > /path/dumpfile.sql.gz

and therefore get a nice backup of what you want, plus a record of anything that failed :-)
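As a quick illustration of what the filter keeps and drops, here it is run over two fabricated dump lines (the table name big_log and its row are invented):

```shell
# Sketch: the grep -v filter applied to two made-up dump lines; the
# INSERT for the excluded table is dropped, the CREATE TABLE survives.
printf '%s\n' \
  'INSERT INTO `big_log` VALUES (1,"x");' \
  'CREATE TABLE `big_log` (`id` int);' \
  | grep -v 'INSERT INTO `big_log` VALUES'
# → CREATE TABLE `big_log` (`id` int);
```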

Casimir answered 22/11, 2018 at 8:14 Comment(0)
2

To further improve on kantholy's answer, adding compression and removing most of the disk writes by not writing uncompressed data:

#!/bin/bash
echo -n "db name:"
read -r db_name
echo -n "username:"
read -r username
echo -n "Exclude data from table:"
read -r exclude_table_data

{
 mysqldump "$db_name" --user="$username" --password --no-tablespaces --no-data \
 && \
 mysqldump "$db_name" --user="$username" --password --no-tablespaces --no-create-info \
 --ignore-table="${db_name}.${exclude_table_data}";
} \
| bzip2 -c9 \
> "${db_name}_$(date +%y%m%d_%H%M).sql.bz2"
Adrieneadrienne answered 16/11, 2020 at 14:17 Comment(0)
1

In my opinion the best answer is from Steak, the only answer that really works in every case.

All the answers suggesting two dumps are wrong, or at least they only work under certain premises.

As many have pointed out above, you can have problems with sequences.

But I find it more critical that the database can have triggers that validate or process information (suppose a trigger that inserts records into table B when inserting into table A). In that case, creating the full schema (including triggers) first and then inserting the data will produce a different set of results.

Moralez answered 13/8, 2019 at 15:53 Comment(0)
1

The command below will export the database structure and ignore the data:

mysqldump --no-data --databases -u[db_user] -p[db_password] [schema] > File.sql

Then export the data, ignoring the table:

mysqldump --ignore-table=[schema.table_name] --databases -u[db_user] -p[db_password] [schema] >> File.sql

Tragedian answered 24/2, 2022 at 6:23 Comment(0)
0

How about removing the inserts after exporting? https://gist.github.com/lionslair/b867e95aad4703e9cd1333dbe3f36024
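The gist itself is not reproduced here, but a minimal sketch of the idea (stripping one table's INSERT statements from an already-exported dump; the file paths, table names, and sample contents are placeholders invented for illustration) might look like:

```shell
# Sketch: remove one table's INSERT statements from an existing dump.
# A tiny stand-in dump is written first so the example is self-contained.
cat > /tmp/full_dump.sql <<'EOF'
CREATE TABLE `name_of_table` (`id` int);
INSERT INTO `name_of_table` VALUES (1);
CREATE TABLE `other` (`id` int);
EOF

# keep everything except the INSERT lines for the unwanted table
grep -v 'INSERT INTO `name_of_table`' /tmp/full_dump.sql > /tmp/trimmed_dump.sql
```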

Permanent answered 19/3 at 5:25 Comment(2)
meta.stackexchange.com/a/8259/997587 just inline itThickening
While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes.Iridize

© 2022 - 2024 — McMap. All rights reserved.