How to skip certain database tables with mysqldump?

Is there a way to restrict certain tables from the mysqldump command?

For example, I'd use the following syntax to dump only table1 and table2:

mysqldump -u username -p database table1 table2 > database.sql

But is there a similar way to dump all the tables except table1 and table2? I haven't found anything in the mysqldump documentation, so is brute-force (specifying all the table names) the only way to go?

Baking answered 8/1, 2009 at 17:30 Comment(0)

You can use the --ignore-table option. So you could do

mysqldump -u USERNAME -pPASSWORD DATABASE --ignore-table=DATABASE.table1 > database.sql

There is no whitespace after -p (this is not a typo).

To ignore multiple tables, use this option multiple times; this is documented to work since at least version 5.0.
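
For example, repeating the option (database and table names are placeholders):

mysqldump -u USERNAME -pPASSWORD DATABASE --ignore-table=DATABASE.table1 --ignore-table=DATABASE.table2 > database.sql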

If you want an alternative way to ignore multiple tables, you can use a script like this:

#!/bin/bash
PASSWORD=XXXXXX
HOST=XXXXXX
USER=XXXXXX
DATABASE=databasename
DB_FILE=dump.sql
EXCLUDED_TABLES=(
table1
table2
table3
table4
tableN   
)
 
IGNORED_TABLES_STRING=''
for TABLE in "${EXCLUDED_TABLES[@]}"
do
   IGNORED_TABLES_STRING+=" --ignore-table=${DATABASE}.${TABLE}"
done

echo "Dump structure"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} --single-transaction --no-data --routines ${DATABASE} > ${DB_FILE}

echo "Dump content"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} ${DATABASE} --no-create-info --skip-triggers ${IGNORED_TABLES_STRING} >> ${DB_FILE}
Aramen answered 8/1, 2009 at 17:30 Comment(17)
is there any way to just skip the table contents? I want to back up the structure.Winton
You can use the --no-data=true option, but I don't know if you can do that on a per table level.Saltant
if the database name is not set for each --ignore-table then you will get an 'Illegal use of option --ignore-table=<database>.<table>' error. Make sure you always declare the database!Beetlebrowed
if you want to ignore some tables' data, but still dump their structure, you can run mysqldump again for those tables and concatenate it onto the backup you just createdIdle
I've been constantly backing up a table that stores deleted info and it's taking up gigabytes of space... the deleted info tables aren't a necessity to backup and when I discovered this trick just now the backups went down to about 2 Megs total. BEST... TRICK... EVER!Niven
Is there a reason --single-transaction is used on the structure dump but not the data dump?Mellissamellitz
note: it is OK to add a --ignore-table=db.table where the table does not existGerick
According to https://mcmap.net/q/40849/-dump-only-the-data-with-mysqldump-without-any-table-information dump content (second command) should be mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} --no-create-info ${DATABASE} ${IGNORED_TABLES_STRING} >> ${DB_FILE}Disqualification
maybe you can use this instead of the loop: IGNORED_TABLES_STRING="$(IFS=" "; echo "${EXCLUDED_TABLES[*]/#/--ignore-table=$DATABASE.}")"Unawares
for a compressed version, use | gzip -c > ${DB_FILE}.gz instead of > ${DB_FILE}Proceed
where is that AI? I grepped man mysqldump for "except", then for "exclude", but forgot about "ignore"...Yaws
Instead of building up a bunch of --ignore-table options with a loop, it's much neater to use your shell's expansion. In Bash you could do --ignore-table=mydb.{table1,table2,table3}. Note for this to work you need to use = between option name and value rather than a space.Corticate
Thanks. Works for me as well. My use case is that I need to dump a db to a file and import it to my local sql server. The db has a few special tables that cannot be directly imported together with the rest in one go. I had to remove --single-transaction and --routines to get it to work. But it worked. Thanks!Retractor
I recommend passing --no-create-db to the second command for data, so that no attempt is made to create the database a second time. Although this is just cosmetic with the IF NOT EXISTS clause.Authorization
For security reasons, it's better to leave the -p parameter empty so the password is prompted for in the console; that way the password won't be saved in the console historyLatta
Don't put passwords on the command line or in scripts. Create a .my.cnf file with mode 0600 containing user = myuser \n password = mypassword. Then on the command line use --defaults-file=/home/myuser/.my.cnf (or whatever). That keeps your password secure (a sketch follows these comments).Inconvenience
use --ignore-table-data=db.table_A --ignore-table-data=db.table_B to ignore just the data of the tablesHyperbole
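
A minimal sketch of the .my.cnf approach from the comment above (the filename, user, and password are the commenter's placeholders; the [client] group header is standard for MySQL option files, and --defaults-file must be the first option on the command line):

# /home/myuser/.my.cnf  (chmod 0600)
[client]
user = myuser
password = mypassword

mysqldump --defaults-file=/home/myuser/.my.cnf DATABASE --ignore-table=DATABASE.table1 > database.sql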

Building on the answer from @Brian-Fisher and addressing several of the comments on this post: I have a bunch of huge (and unnecessary) tables in my database, so I wanted to skip their contents when copying but keep their structure:

mysqldump -h <host> -u <username> -p <database> --no-data > db.sql
mysqldump -h <host> -u <username> -p <database> --no-create-info --skip-triggers --ignore-table=schema.table1 --ignore-table=schema.table2 >> db.sql

The resulting file is structurally sound, but the dumped data is now ~500 MB rather than 9 GB, which is much better for me. I can now import the file into another database for testing without having to worry about manipulating 9 GB of data or running out of disk space.
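
As a quick sanity check (a sketch; table1 stands in for one of the skipped tables), the dump should contain the table's definition but none of its rows:

grep -c 'CREATE TABLE `table1`' db.sql   # expect 1
grep -c 'INSERT INTO `table1`' db.sql    # expect 0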

Bala answered 8/1, 2009 at 17:30 Comment(2)
Tested and used under MySQL 5.5.43 (for debian-linux-gnu (x86_64)) ThanksXerarch
great solution. I had to add --skip-triggers to the second statement for the dump to work later (assuming you have triggers), but otherwise: perfectFourpence

Another example for ignoring multiple tables

/usr/bin/mysqldump -uUSER -pPASS --ignore-table={db_test.test1,db_test.test3} db_test > db_test.sql

This uses --ignore-table with shell brace expansion to build the list of tables, with syntax like

--ignore-table={db_test.table1,db_test.table3,db_test.table4}

Extra:

Import database

 # if file is .sql
 mysql -uUSER  -pPASS db_test < backup_database.sql
 # if file is .sql.gz
 gzip -dc < backup_database.sql.gz | mysql -uUSER -pPASSWORD db_test

A simple script to ignore tables and export to .sql.gz to save space:

#!/bin/bash

#tables to ignore
_TIGNORE=(
my_database.table1
my_database.table2
my_database.tablex
)

#create text for ignore tables
_TDELIMITED="$(IFS=" "; echo "${_TIGNORE[*]/#/--ignore-table=}")"

#don't forget to include user and password
/usr/bin/mysqldump -uUSER -pPASSWORD --events ${_TDELIMITED} --databases my_database | gzip -v > backup_database.sql.gz


Note: tested on Ubuntu Server with mysql Ver 14.14 Distrib 5.5.55

Unawares answered 8/1, 2009 at 17:30 Comment(3)
Great way to avoid using a script when you want to ignore multiple tables. That answer should receive more "+1"Moiety
My top tip for importing is to use pv, e.g. pv db.sql.zst | unzstd | mysql db (swap out for gzip/xz/...); then you get a visual chart showing progress and ETAMurray
@Murray yes, you get progress info, but it takes longer than running without it.Unawares

For multiple databases:

mysqldump -u user -p --ignore-table=db1.tbl1 --ignore-table=db2.tbl1 --databases db1 db2 ..
Hydrargyrum answered 8/1, 2009 at 17:30 Comment(4)
There's already a reply to this thread that says that, better.Edmondson
Thats right, but this solution works with multiple databases.Barque
Interesting - I first thought mysqld and mysqldump would be the same programs.Unscramble
that's what I'm looking for - a one-liner answer which doesn't force me to read something for 2-3 mins :PToddle

To exclude some table data, but not the table structure, here is how I do it:

Dump the database structure of all tables, without any data:

mysqldump -u user -p --no-data database > database_structure.sql

Then dump the database with data, except the excluded tables, and do not dump the structure:

mysqldump -u user -p --no-create-info \
    --ignore-table=database.table1 \
    --ignore-table=database.table2 database > database_data.sql

Then, to load it into a new database:

mysql -u user -p newdatabase < database_structure.sql
mysql -u user -p newdatabase < database_data.sql
Haines answered 8/1, 2009 at 17:30 Comment(1)
This is a nice way to do it. Just a quick tip: you could dump the structure to database.sql using > database.sql as you do, and then have the second dump statement simply append to that file using >> database.sql instead. That way you have only a single file to import.Memphis

You can use the mysqlpump command with the

--exclude-tables=name

option, which specifies a comma-separated list of tables to exclude.

The syntax of mysqlpump is very similar to mysqldump's, but it is far more performant. You can read more about how to use the exclude option here: https://dev.mysql.com/doc/refman/5.7/en/mysqlpump.html#mysqlpump-filtering
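
A minimal sketch (user, database, and table names are placeholders); per the linked documentation, the list is comma-separated and entries may use % and _ as wildcards:

mysqlpump -u USERNAME -p --exclude-tables=table1,table2 DATABASE > database.sql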

Tatter answered 8/1, 2009 at 17:30 Comment(0)

In general, you need this feature when you don't want, or don't have time, to deal with a huge table. If that is your case, it's better to use the --where option of mysqldump to limit the result set. For example:

mysqldump -uuser -ppass database --where="1 = 1 LIMIT 500000" > resultset.sql

Aswarm answered 8/1, 2009 at 17:30 Comment(0)

For the sake of completeness, here is a script which actually could be a one-liner to get a backup of a database, excluding (ignoring) all the views. The database name is assumed to be employees:

ignore=$(mysql --login-path=root1 INFORMATION_SCHEMA \
    --skip-column-names --batch \
    -e "select 
          group_concat(
            concat('--ignore-table=', table_schema, '.', table_name) SEPARATOR ' '
          ) 
        from tables 
        where table_type = 'VIEW' and table_schema = 'employees'")

mysqldump --login-path=root1 --column-statistics=0 --no-data employees $ignore > "./backups/som_file.sql"   

You can adapt the logic of the query: in general, using group_concat and concat you can generate almost any desired string or shell command.
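
For example, a hypothetical variant that ignores every table whose name ends in _log instead of the views (keeping the employees schema from above):

ignore=$(mysql --login-path=root1 INFORMATION_SCHEMA \
    --skip-column-names --batch \
    -e "select
          group_concat(
            concat('--ignore-table=', table_schema, '.', table_name) SEPARATOR ' '
          )
        from tables
        where table_name like '%\_log' and table_schema = 'employees'")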

Gullet answered 8/1, 2009 at 17:30 Comment(0)

I like Rubo77's solution; I hadn't seen it before I modified Paul's. This one will back up a single database, excluding any tables you don't want. It will then gzip it and delete any files over 8 days old. I will probably use two versions of this: one that does a full backup (minus the logs table) once a day, and another that just backs up the most important, most frequently changing tables every hour, via a couple of cron jobs (a sketch follows the script).

#!/bin/sh
PASSWORD=XXXX
HOST=127.0.0.1
USER=root
DATABASE=MyFavoriteDB

now="$(date +'%d_%m_%Y_%H_%M')"
filename="${DATABASE}_db_backup_$now"
backupfolder="/opt/backups/mysql"
DB_FILE="$backupfolder/$filename"
logfile="$backupfolder/"backup_log_"$(date +'%Y_%m')".txt

EXCLUDED_TABLES=(
logs
)
IGNORED_TABLES_STRING=''
for TABLE in "${EXCLUDED_TABLES[@]}"
do
   IGNORED_TABLES_STRING+=" --ignore-table=${DATABASE}.${TABLE}"
done

echo "Dump structure started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} --single-transaction --no-data --routines ${DATABASE}  > ${DB_FILE} 
echo "Dump structure finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "Dump content"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} ${DATABASE} --no-create-info --skip-triggers ${IGNORED_TABLES_STRING} >> ${DB_FILE}
gzip ${DB_FILE}

find "$backupfolder" -name ${DATABASE}_db_backup_* -mtime +8 -exec rm {} \;
echo "old files deleted" >> "$logfile"
echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "*****************" >> "$logfile"
exit 0
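
A hypothetical crontab pairing for the daily/hourly plan described above (the script paths are placeholders):

# full backup (minus logs) daily at 02:00, important tables hourly
0 2 * * * /opt/scripts/full_backup.sh
0 * * * * /opt/scripts/hourly_backup.sh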
Ellingson answered 8/1, 2009 at 17:30 Comment(0)

For example, using --ignore-table, you can export the schema and data of all tables of the apple database except the person and animal tables to backup.sql, as shown below. *You must use --ignore-table multiple times to specify multiple tables, and you must specify the database and table together as <database>.<table> for --ignore-table; otherwise there is an error:

mysqldump -u john -p apple --ignore-table=apple.person --ignore-table=apple.animal > backup.sql

And, using -B (--databases) and --ignore-table, you can export the schema and data of the apple and orange databases, except the apple database's person table and the orange database's animal table, to backup.sql as shown below:

mysqldump -u john -p -B apple orange --ignore-table=apple.person --ignore-table=orange.animal > backup.sql

And, using -A (--all-databases) and --ignore-table, you can export the schema and data of all databases, except the apple database's person table and the orange database's animal table, to backup.sql as shown below:

mysqldump -u john -p -A --ignore-table=apple.person --ignore-table=orange.animal > backup.sql
Trunnion answered 8/1, 2009 at 17:30 Comment(0)

Skip certain tables with mysqldump

Suppose there are some test tables in some databases and you want to exclude them from the backup; you can specify them with the --exclude-tables option, which will exclude tables named test across all databases:

shell> mysqldump --exclude-tables=test --resultfile=backup_excluding_test.sql

Danie answered 8/1, 2009 at 17:30 Comment(1)
Doesn't look like this is available in later versions. But --ignore-table=database.table is.Alburnum

Dump all databases with all tables but skip certain tables

On GitHub: https://github.com/rubo77/mysql-backup.sh/blob/master/mysql-backup.sh

#!/bin/bash
# mysql-backup.sh

if [ -z "$1" ] ; then
  echo
  echo "ERROR: root password Parameter missing."
  exit
fi
DB_host=localhost
MYSQL_USER=root
MYSQL_PASS=$1
MYSQL_CONN="-u${MYSQL_USER} -p${MYSQL_PASS}"
#MYSQL_CONN=""

BACKUP_DIR=/backup/mysql/

mkdir $BACKUP_DIR -p

MYSQLPATH=/var/lib/mysql/

# tables to skip; each entry must end with a comma, which the match below relies on
IGNORE="database1.table1, database1.table2, database2.table1,"

# strpos $1 $2 [$3]
# strpos haystack needle [optional offset of an input string]
# NB: the offset branch calls a substr helper that is not defined in this script;
#     it is harmless here because no caller passes an offset
strpos()
{
    local str=${1}
    local offset=${3}
    if [ -n "${offset}" ]; then
        str=`substr "${str}" ${offset}`
    else
        offset=0
    fi
    str=${str/${2}*/}
    if [ "${#str}" -eq "${#1}" ]; then
        return 0
    fi
    echo $((${#str}+${offset}))
}

cd $MYSQLPATH
for i in */; do
    if [ "$i" != 'performance_schema/' ] ; then 
    DB=`basename "$i"` 
    #echo "backup $DB->$BACKUP_DIR$DB.sql.lzo"
    mysqlcheck "$DB" $MYSQL_CONN --silent --auto-repair >/tmp/tmp_grep_mysql-backup
    grep -E -B1 "note|warning|support|auto_increment|required|locks" /tmp/tmp_grep_mysql-backup>/tmp/tmp_grep_mysql-backup_not
    grep -v "$(cat /tmp/tmp_grep_mysql-backup_not)" /tmp/tmp_grep_mysql-backup
        
    tbl_count=0
    for t in $(mysql -NBA -h $DB_host $MYSQL_CONN -D $DB -e 'show tables') 
    do
      found=$(strpos "$IGNORE" "$DB"."$t,")
      if [ "$found" == "" ] ; then 
        echo "DUMPING TABLE: $DB.$t"
        mysqldump -h $DB_host $MYSQL_CONN $DB $t --events --skip-lock-tables | lzop -3 -f -o $BACKUP_DIR/$DB.$t.sql.lzo
        tbl_count=$(( tbl_count + 1 ))
      fi
    done
    echo "$tbl_count tables dumped from database '$DB' into dir=$BACKUP_DIR"
    fi
done

With a little help from https://mcmap.net/q/41502/-dump-all-mysql-tables-into-separate-files-automatically

It uses lzop, which is much faster; see: http://pokecraft.first-world.info/wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO
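
To restore one of the resulting .sql.lzo files (a sketch; the path and names follow the script's output naming):

lzop -dc /backup/mysql/database1.table1.sql.lzo | mysql -u root -p database1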

Gerick answered 8/1, 2009 at 17:30 Comment(1)
Document you share says gzip is faster than lzop.Touched
