I have a large SQL file with one database and about 150 tables. I would like to use mysqlimport
to import that database; however, I would like the import process to ignore or skip over a couple of tables. What is the proper syntax to import all tables but ignore some of them? Thank you.
mysqlimport
is not the right tool for importing SQL statements. This tool is meant to import formatted text files such as CSV. What you want to do is feed your SQL dump directly to the mysql
client with a command like this one:
mysql -D your_database < your_sql_dump.sql
Neither mysql
nor mysqlimport
provides the feature you need. Your best option is to import the whole dump, then drop the tables you do not want.
If you have access to the server where the dump comes from, you could instead create a new dump with mysqldump --ignore-table=database.table_you_dont_want1 --ignore-table=database.table_you_dont_want2 ...
Check out this answer for a workaround to skip importing some tables.
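If the list of unwanted tables is long, the --ignore-table flags can be generated from a list. A minimal sketch (database and table names here are invented for illustration; on a real server you would run the echoed command instead of printing it):

```shell
# Build one --ignore-table flag per unwanted table (names are examples).
db="your_database"
skip_tables="logs sessions cache"

flags=""
for t in $skip_tables; do
    flags="$flags --ignore-table=$db.$t"
done

# Show the resulting command line.
echo "mysqldump$flags $db > partial_dump.sql"
```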
The accepted answer by RandomSeed could take a long time! Importing a table (just to drop it later) can be very wasteful, depending on its size.
For a file created using
mysqldump -u user -ppasswd --opt --routines DBname > DBdump.sql
I currently get a file about 7GB, 6GB of which is data for a log table that I don't 'need' to be there; reloading this file takes a couple of hours. If I need to reload (for development purposes, or if ever required for a live recovery) I skim the file thus:
sed '/INSERT INTO `TABLE_TO_SKIP`/d' DBdump.sql > reduced.sql
And reload with:
mysql -u user -ppasswd DBname < reduced.sql
This gives me a complete database, with the "unwanted" table created but empty. If you really don't want the tables at all, simply drop the empty tables after the load finishes.
For multiple tables you could do something like this:
sed '/INSERT INTO `TABLE1_TO_SKIP`/d' DBdump.sql | \
sed '/INSERT INTO `TABLE2_TO_SKIP`/d' | \
sed '/INSERT INTO `TABLE3_TO_SKIP`/d' > reduced.sql
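As a quick sanity check, the deletion can be tried on a fabricated three-line dump (table names invented for illustration):

```shell
# A tiny fake dump: one table to skip, one to keep.
dump='CREATE TABLE `audit_log` (id int);
INSERT INTO `audit_log` VALUES (1);
INSERT INTO `orders` VALUES (2);'

reduced=$(printf '%s\n' "$dump" | sed '/INSERT INTO `audit_log`/d')
printf '%s\n' "$reduced"
# The audit_log INSERT is gone; its CREATE TABLE and other tables survive.
```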
There IS a 'gotcha' - watch out for procedures in your dump that might contain "INSERT INTO TABLE_TO_SKIP".
Watch out for CREATE TABLE statements too. – Byssus
Use the sed -r option and a regexp in case you want to eliminate more tables in one go, like: sed -r '/INSERT INTO `(TABLE1_TO_SKIP|TABLE2_TO_SKIP)`/d' DBdump.sql > reduced.sql – Precessional
Note that TABLE_TO_SKIP is case-sensitive; it's usually table_to_skip :) – Prosaic
If sed is not an option or you need case-insensitive ignoring: grep -viE 'INSERT INTO `(TABLE1_TO_SKIP|TABLE2_TO_SKIP)`' DBdump.sql > reduced.sql – Adz
The d flag in the sed command deletes the lines which match the pattern between /../ – Uis
Lines such as -- Table structure for table <TABLE_TO_SKIP>, DROP TABLE IF EXISTS <TABLE_TO_SKIP>;, CREATE TABLE <TABLE_TO_SKIP> (, -- Dumping data for table <TABLE_TO_SKIP> and LOCK TABLES <TABLE_TO_SKIP> WRITE; are not removed by this sed command if they exist. Some were not aware of this. – Minh
Careful: this sed line deletes only the INSERT statements, not the DROP TABLE IF EXISTS and CREATE TABLE statements that mysqldump includes by default, so a table I had already dropped from the database existed again after the import! – Gryphon
Since the skipped INSERTs may cause errors while restoring some triggers and procedures, I propose using the -f flag in the MySQL restore command to skip errors. Afterwards you may check for corrupted triggers and procedures. – Spud
For anyone working with .sql.gz files: I found the following solution to be very useful. Our database was 25GB+ and I had to remove the log tables.
gzip -cd "./mydb.sql.gz" | sed -r '/INSERT INTO `(log_table_1|log_table_2|log_table_3|log_table_4)`/d' | gzip > "./mydb2.sql.gz"
Thanks to the answer of Don, the comment of Xosofox, and this related post: Use zcat and sed or awk to edit compressed .gz text file
Is there a way to match log_table_* or such? – Deduce
The -r in sed -r '/INSERT INTO.../d' is short for --regexp-extended (linux.die.net/man/1/sed), so you should be able to use regular expressions. – Bot
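Both comment suggestions can be combined and tried on a fabricated dump. Note that the trailing I address modifier used here is a GNU sed extension (table names invented):

```shell
dump='INSERT INTO `Log_Table_1` VALUES (1);
INSERT INTO `log_table_2` VALUES (2);
INSERT INTO `orders` VALUES (3);'

# -r enables extended regexps (alternation); the I after the address makes
# the match case-insensitive (GNU sed only).
kept=$(printf '%s\n' "$dump" | sed -r '/INSERT INTO `(log_table_1|log_table_2)`/Id')
printf '%s\n' "$kept"
```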
A little old, but I figure it might still come in handy...
I liked @Don's answer (https://mcmap.net/q/325275/-how-to-ignore-certain-mysql-tables-when-importing-a-database) but found it very annoying that you'd have to write to another file first;
in my particular case this would take too much time and disk space.
So I wrote a little bash script:
#!/bin/bash
tables=(table1_to_skip table2_to_skip ... tableN_to_skip)
tableString=$(printf "|%s" "${tables[@]}")
trimmed=${tableString:1}
grepExp="INSERT INTO \`($trimmed)\`"
zcat "$1" | grep -vE "$grepExp" | mysql -uroot -p
This does not generate a new SQL script but pipes the dump directly into the database.
It also still creates the tables; it just doesn't import the data (which was the problem I had with huge log tables).
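The pattern construction in the script can be verified without a database by piping a fabricated dump through the same grep, with the mysql stage replaced by stdout (table names invented):

```shell
#!/bin/bash
# Same pattern-building steps as the script above, with example table names.
tables=(big_log_1 big_log_2)
tableString=$(printf "|%s" "${tables[@]}")
trimmed=${tableString:1}
grepExp="INSERT INTO \`($trimmed)\`"

# Feed a two-line fake dump through the filter; only `orders` should survive.
kept=$(printf 'INSERT INTO `big_log_1` VALUES (1);\nINSERT INTO `orders` VALUES (2);\n' \
  | grep -vE "$grepExp")
printf '%s\n' "$kept"
```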
Unless you ignored the tables during the dump with mysqldump --ignore-table=database.unwanted_table
, you have to use some script or tool to filter out the data you don't want to import from the dump file before passing it to the mysql
client.
Here is a bash/sh function that would exclude the unwanted tables from a SQL dump on the fly (through pipe):
# Accepts one argument, the list of tables to exclude (case-insensitive).
# Eg. filt_exclude '%session% action_log %_cache'
filt_exclude() {
local excl_tns;
if [ -n "$1" ]; then
# trim & replace /[,;\s]+/ with '|' & replace '%' with '[^`]*'
excl_tns=$(echo "$1" | sed -r 's/^[[:space:]]*//g; s/[[:space:]]*$//g; s/[[:space:]]+/|/g; s/[,;]+/|/g; s/%/[^\`]\*/g');
grep -viE "(^INSERT INTO \`($excl_tns)\`)|(^DROP TABLE (IF EXISTS )?\`($excl_tns)\`)|^LOCK TABLES \`($excl_tns)\` WRITE" | \
sed 's/^CREATE TABLE `/CREATE TABLE IF NOT EXISTS `/g'
else
cat
fi
}
Suppose you have a dump created like so:
MYSQL_PWD="my-pass" mysqldump -u user --hex-blob db_name | \
pigz -9 > dump.sql.gz
And want to exclude some unwanted tables before importing:
pigz -dckq dump.sql.gz | \
filt_exclude '%session% action_log %_cache' | \
MYSQL_PWD="my-pass" mysql -u user db_name
Or you could pipe into a file or any other tool before importing to DB.
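The function can also be smoke-tested without a server. The block below repeats the function so the snippet is self-contained, then feeds it a fabricated dump fragment (table names invented):

```shell
# filt_exclude as defined in the answer above.
filt_exclude() {
    local excl_tns;
    if [ -n "$1" ]; then
        # trim & replace /[,;\s]+/ with '|' & replace '%' with '[^`]*'
        excl_tns=$(echo "$1" | sed -r 's/^[[:space:]]*//g; s/[[:space:]]*$//g; s/[[:space:]]+/|/g; s/[,;]+/|/g; s/%/[^\`]\*/g');
        grep -viE "(^INSERT INTO \`($excl_tns)\`)|(^DROP TABLE (IF EXISTS )?\`($excl_tns)\`)|^LOCK TABLES \`($excl_tns)\` WRITE" | \
        sed 's/^CREATE TABLE `/CREATE TABLE IF NOT EXISTS `/g'
    else
        cat
    fi
}

# Fabricated dump fragment: `sessions` data should be filtered, `orders` kept.
dump='DROP TABLE IF EXISTS `sessions`;
CREATE TABLE `sessions` (id int);
INSERT INTO `sessions` VALUES (1);
CREATE TABLE `orders` (id int);
INSERT INTO `orders` VALUES (2);'

out=$(printf '%s\n' "$dump" | filt_exclude '%session%')
printf '%s\n' "$out"
```

As with the sed-based answers, the excluded tables are still created (now with IF NOT EXISTS), but their DROP, LOCK, and INSERT statements are gone.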
If desired, you can do this one table at a time:
mysqldump -p sourceDatabase tableName > tableName.sql
mysql -p -D targetDatabase < tableName.sql
Here is my script to exclude some tables from a mysql dump. I use it to restore a DB when I need to keep orders and payments data.
exclude_tables_from_dump.sh
#!/bin/bash
if [ ! -f "$1" ];
then
echo "Usage: $0 mysql_dump.sql"
exit 1
fi
declare -a TABLES=(
user
order
order_product
order_status
payments
)
CMD="cat $1"
for TBL in "${TABLES[@]}";do
CMD+="|sed 's/DROP TABLE IF EXISTS \`${TBL}\`/# DROP TABLE IF EXISTS \`${TBL}\`/g'"
CMD+="|sed 's/CREATE TABLE \`${TBL}\`/CREATE TABLE IF NOT EXISTS \`${TBL}\`/g'"
CMD+="|sed -r '/INSERT INTO \`${TBL}\`/d'"
CMD+="|sed '/DELIMITER\ \;\;/,/DELIMITER\ \;/d'"
done
eval "$CMD"
It avoids dropping and re-creating those tables and skips inserting data into them. It also strips all FUNCTIONs and PROCEDUREs stored between DELIMITER ;; and DELIMITER ;
Besides the great answers in this topic, it could be that your .sql file does not have an INSERT like this on every line:
INSERT INTO `log`
but instead inserts all rows in one extended statement, like this:
LOCK TABLES `log` WRITE;
INSERT INTO `log` (`id`, `data`)
VALUES
(1,'The log content'),
(2,'The log content'),
(3,'The log content'),
(4,'The log content');
UNLOCK TABLES;
You'll want to exclude everything between LOCK TABLES `log` WRITE; and the next UNLOCK TABLES; statement:
gzip -dc "$1" | sed \
-e '/LOCK TABLES `log` WRITE;/,/^UNLOCK TABLES/d;' \
-e '/LOCK TABLES `some_other_table` WRITE;/,/^UNLOCK TABLES/d;' \
| gzip > reduced_$(date '+%F_%T').sql.gz
Put that in a reduce_db.sh somewhere on your computer, and every time you need it, execute it like this:
bash /folder/to/reduce_db.sh /path/to/big_db_file.sql.gz
Good to know: only exclude data on which no other data in the database depends; otherwise your reduced database will end up inconsistent.
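The range address can be checked on a fabricated two-block dump (table names invented):

```shell
dump='LOCK TABLES `log` WRITE;
INSERT INTO `log` (`id`) VALUES (1),(2);
UNLOCK TABLES;
LOCK TABLES `orders` WRITE;
INSERT INTO `orders` (`id`) VALUES (7);
UNLOCK TABLES;'

# The two-address form /start/,/end/d deletes the whole block, inclusive.
kept=$(printf '%s\n' "$dump" | sed '/LOCK TABLES `log` WRITE;/,/^UNLOCK TABLES/d')
printf '%s\n' "$kept"
```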
I would not use it in production, but if I had to quickly import some backup that contains many smaller tables and one big monster table that might take hours to import, I would most probably run grep -v unwanted_table_name original.sql > reduced.sql
and then mysql -f < reduced.sql
Beware that grep -v on a bare table name also removes every other line that merely contains it, for example a foreign key named another_table_SALES_fk
. Your answer as it stands could lead to a world of pain. ALSO, if you are going to go down this route, surely fgrep would be much faster than grep? – Rhodium
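The pitfall this comment describes can be reproduced on two fabricated lines (the constraint name is taken from the comment; everything else is invented):

```shell
dump='INSERT INTO `sales` VALUES (1);
CONSTRAINT `another_table_SALES_fk` FOREIGN KEY (`sales_id`) REFERENCES `sales` (`id`),'

# grep -v matches the bare string anywhere on the line, so the foreign-key
# definition belonging to a different table is silently dropped too.
kept=$(printf '%s\n' "$dump" | grep -v 'sales')
echo "lines kept: $(printf '%s' "$kept" | grep -c .)"
```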
Based on the other answers, I've created a BASH script:
https://gist.github.com/martin-rueegg/e247d99cc7402a4f1456c059c5d050ac
Examples
Show what will be filtered:
$ mysql_filter_tables --show "log"
$ mysql_filter_tables --show "%session%" "action_log" "%_cache"
$ mysql_filter_tables --show -- "-h"   # in case your table name is "-h"
Usage with plain SQL file:
$ mysql_filter_tables "log" <big.sql >clean.sql
$ cat big.sql | mysql_filter_tables "log" >clean.sql
Import plain SQL dump into mysql:
$ MYSQL_PWD=secret mysql_filter_tables -i big.sql "%_cache" | mysql -u user -D database
If source is compressed:
$ mysql_filter_tables -i big.sql.zip --unzip -o clean.sql "%session%" "%_cache"
$ mysql_filter_tables -i big.sql.gz --gunzip -o clean.sql "%session%" "%_cache"
More Options
Command Options:
--show
show generated sed command, rather than executing it
-i | --input file
read from file, rather than standard input
-o | --output file
write result to file, rather than standard output
-f | --force
overwrite output file, if exists
-u | -ui | --unzip
unzip input stream. Use with -i to read from file
-z | -zo | --zip
zip output stream. Use with -o to write to file
-g | -gi | --gunzip
gunzip input stream. Use with -i to read from file
-G | -go | --gzip
gzip output stream. Use with -o to write to file
Info Options:
-h|--help show help
-V|--version show version
-1|--one-line show a one-liner that can be easily used in other scripts.
(Does not support any info options, except -V)
--copyright show copyright, license, and credits
--credits show copyright, license, and credits
--license show copyright, license, and credits
Installation
Download file from github.com
$ wget https://gist.github.com/martin-rueegg/e247d99cc7402a4f1456c059c5d050ac/raw/mysql_filter_tables.sh
Inspect the file for your own safety
Source the file, and then use the function in the shell, or ...
$ source mysql_filter_tables.sh
$ mysql_filter_tables --help
... mark the file executable and use it as a script
$ chmod a+x mysql_filter_tables.sh
$ ./mysql_filter_tables.sh --help
Dependencies
bash
sed, cat
zip, unzip
gzip, zcat