How to ignore certain MySQL tables when importing a database?

I have a large SQL file with one database and about 150 tables. I would like to use mysqlimport to import that database, however, I would like the import process to ignore or skip over a couple of tables. What is the proper syntax to import all tables, but ignore some of them? Thank you.

Gleeful answered 24/5, 2013 at 12:48 Comment(4)
I don't know if that is possible. Can you create a backup of the .sql, seek them out in the text and clobber them? Or is this something you are going to be doing again and again? You could do a Ctrl-H find-and-replace with a REM stmt, or at the end run a little "delete from table1;" "delete from table2;" sort of script. – Ngocnguyen
Are there a lot that need to be omitted? And how are they identified? Maybe consider importing the whole thing, then run a delete table on the ones you don't want to keep. – Biquarterly
I think that there may be a problem with one of the tables in the SQL file, so my import process does not complete. Do you know whether using the import process from the command line is different from using the import utility from MySQL Workbench? – Gleeful
Related: #425658 – Formally

mysqlimport is not the right tool for importing SQL statements. That tool is meant to import formatted text files such as CSV. What you want to do is feed your SQL dump directly to the mysql client with a command like this one:

bash > mysql -D your_database < your_sql_dump.sql

Neither mysql nor mysqlimport provide the feature you need. Your best chance would be importing the whole dump, then dropping the tables you do not want.

If you have access to the server where the dump comes from, then you could create a new dump with mysqldump --ignore-table=database.table_you_dont_want1 --ignore-table=database.table_you_dont_want2 ....


Check out this answer for a workaround to skip importing some tables.

Byssus answered 24/5, 2013 at 12:53 Comment(2)
Thanks for the comment. My problem seems to be that when I use the import command from within MySQL Workbench, the process works but only imports about a third of my tables and then hangs on a specific table. I can't get all the tables imported because of the problem table. – Gleeful
Try to import your dump directly from the command line. Are you getting the same issue? At least you might be able to see some error message. – Byssus

The accepted answer by RandomSeed could take a long time! Importing the table (just to drop it later) could be very wasteful depending on size.

For a file created using

mysqldump -u user -ppasswd --opt --routines DBname > DBdump.sql

I currently get a file about 7GB, 6GB of which is data for a log table that I don't 'need' to be there; reloading this file takes a couple of hours. If I need to reload (for development purposes, or if ever required for a live recovery) I skim the file thus:

sed '/INSERT INTO `TABLE_TO_SKIP`/d' DBdump.sql > reduced.sql

And reload with:

mysql -u user -ppasswd DBname < reduced.sql

This gives me a complete database, with the "unwanted" table created but empty. If you really don't want the tables at all, simply drop the empty tables after the load finishes.

For multiple tables you could do something like this:

sed '/INSERT INTO `TABLE1_TO_SKIP`/d' DBdump.sql | \
sed '/INSERT INTO `TABLE2_TO_SKIP`/d' | \
sed '/INSERT INTO `TABLE3_TO_SKIP`/d' > reduced.sql

There IS a 'gotcha' - watch out for procedures in your dump that might contain "INSERT INTO TABLE_TO_SKIP".
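The approach is easy to sanity-check before pointing it at a multi-gigabyte file. A minimal sketch on a toy dump (table and file names are hypothetical), filtering and then verifying that no INSERT for the skipped table survived:

```shell
# Toy dump with an INSERT for the table we want to skip (names hypothetical).
cat > DBdump.sql <<'EOF'
CREATE TABLE `TABLE_TO_SKIP` (id INT);
INSERT INTO `TABLE_TO_SKIP` VALUES (1),(2);
INSERT INTO `orders` VALUES (3);
EOF

# Filter as above, then check that no INSERT for the skipped table remains.
sed '/INSERT INTO `TABLE_TO_SKIP`/d' DBdump.sql > reduced.sql
grep 'INSERT INTO `TABLE_TO_SKIP`' reduced.sql && echo "still there" || echo "clean"
```

Note that the CREATE TABLE statement for the skipped table survives, which is exactly why the table comes back empty after the load.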

Rhodium answered 15/10, 2014 at 9:53 Comment(12)
Wow, awesome! I've just reduced a backup from 1GB to 72MB. Saved me a lot of time, thanks. – Meagan
This makes sense, absolutely. You might want to trim the CREATE TABLE statements too. – Byssus
Consider the sed -r option and a RegExp in case you want to eliminate more tables in one go, like: sed -r '/INSERT INTO `(TABLE1_TO_SKIP|TABLE2_TO_SKIP)`/d' DBdump.sql > reduced.sql – Precessional
Just FYI: TABLE_TO_SKIP is case-sensitive; it's usually table_to_skip :) – Prosaic
For Windows, search for "sed for windows" (it can be found at gnuwin32.sourceforge.net/packages/sed.htm) and use double quotes instead of single quotes for the command. – Dowdell
If sed is not an option or you need case-insensitive ignoring: grep -viE 'INSERT INTO `(TABLE1_TO_SKIP|TABLE2_TO_SKIP)`' DBdump.sql > reduced.sql – Adz
I am still running the import. Don't know if it worked yet. If it works, I could not express how thankful I would be. – Drift
FYI: the sed command's d flag deletes the lines which match the pattern between /../ – Uis
Just beware that "-- Table structure for table `TABLE_TO_SKIP`", "DROP TABLE IF EXISTS `TABLE_TO_SKIP`;", "CREATE TABLE `TABLE_TO_SKIP` (", "-- Dumping data for table `TABLE_TO_SKIP`" and "LOCK TABLES `TABLE_TO_SKIP` WRITE;" are not removed by this sed command if they exist. Some were not aware, but I'm not going to be naming myself. – Minh
After running this and importing, the TABLE_TO_SKIP that had already been deleted from the DB exists again after the import! That's because this line deletes only the INSERTs, not the DROP TABLE IF EXISTS and CREATE TABLE statements that mysqldump includes by default. Be careful with this approach. – Gryphon
@CT I can concur that some were not aware, including myself. Cheers for that. – Franko
Since deleting INSERTs may cause errors while restoring some triggers and procedures, I propose using the -f flag in the MySQL restore command to skip errors. Afterwards you may check for corrupted triggers and procedures. – Spud

For anyone working with .sql.gz files, I found the following solution to be very useful. Our database was 25GB+ and I had to remove the log tables.

gzip -cd "./mydb.sql.gz" | sed -r '/INSERT INTO `(log_table_1|log_table_2|log_table_3|log_table_4)`/d' | gzip > "./mydb2.sql.gz"

Thanks to the answer of Don and comment of Xosofox and this related post: Use zcat and sed or awk to edit compressed .gz text file
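The same pipeline can be sanity-checked on a toy file before pointing it at a 25GB dump. A minimal sketch (table and file names are hypothetical):

```shell
# Build a tiny compressed "dump", filter it, and recompress.
printf '%s\n' \
  'INSERT INTO `log_table_1` VALUES (1);' \
  'INSERT INTO `orders` VALUES (2);' | gzip > mydb.sql.gz

gzip -cd "./mydb.sql.gz" | \
  sed -r '/INSERT INTO `(log_table_1|log_table_2)`/d' | \
  gzip > "./mydb2.sql.gz"

gzip -cd mydb2.sql.gz   # only the `orders` INSERT survives
```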

Bot answered 23/8, 2018 at 8:6 Comment(3)
Thank you. This is awesome! – Heck
Hmm, can you use a wildcard in the sed line too? log_table_* or such? – Deduce
The -r in sed -r '/INSERT INTO.../d' is short for --regexp-extended (linux.die.net/man/1/sed). So you should be able to use regular expressions. – Bot
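To the wildcard question: with -r you write an extended regex rather than a shell-style glob, so log_table_* becomes something like log_table_[0-9]+. A small sketch (table names hypothetical):

```shell
cat > sample.sql <<'EOF'
INSERT INTO `log_table_1` VALUES (1);
INSERT INTO `log_table_42` VALUES (2);
INSERT INTO `orders` VALUES (3);
EOF

# One extended-regex pattern instead of listing every numbered log table.
sed -r '/INSERT INTO `log_table_[0-9]+`/d' sample.sql > reduced.sql
cat reduced.sql   # only the `orders` line remains
```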

A little old, but figured it might still come in handy...

I liked @Don's answer (https://mcmap.net/q/325275/-how-to-ignore-certain-mysql-tables-when-importing-a-database) but found it very annoying that you'd have to write to another file first.
In my particular case this would take too much time and disk space.

So I wrote a little bash script:

#!/bin/bash

tables=(table1_to_skip table2_to_skip ... tableN_to_skip)


tableString=$(printf "|%s" "${tables[@]}")
trimmed=${tableString:1}
grepExp="INSERT INTO \`($trimmed)\`"

zcat $1 | grep -vE "$grepExp" | mysql -uroot -p

This does not generate a new SQL script but pipes the dump directly into the database.
Also, it still creates the tables; it just skips importing the data (which was the problem I had with huge log tables).
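Before running the script against a real dump, the pattern-building steps can be checked in isolation. A sketch with hypothetical table names:

```shell
# Reproduce the pattern-building steps from the script above.
tables=(log_table_1 sessions cache)

tableString=$(printf "|%s" "${tables[@]}")   # "|log_table_1|sessions|cache"
trimmed=${tableString:1}                     # strip the leading "|"
grepExp="INSERT INTO \`($trimmed)\`"

echo "$grepExp"   # INSERT INTO `(log_table_1|sessions|cache)`
```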

Sacrilegious answered 26/10, 2018 at 12:23 Comment(0)

Unless you have ignored the tables during the dump with mysqldump --ignore-table=database.unwanted_table, you have to use some script or tool to filter out the data you don't want to import from the dump file before passing it to mysql client.

Here is a bash/sh function that would exclude the unwanted tables from a SQL dump on the fly (through pipe):

# Accepts one argument, the list of tables to exclude (case-insensitive).
# Eg. filt_exclude '%session% action_log %_cache'
filt_exclude() {
    local excl_tns;
    if [ -n "$1" ]; then
        # trim & replace /[,;\s]+/ with '|' & replace '%' with '[^`]*'
        excl_tns=$(echo "$1" | sed -r 's/^[[:space:]]*//g; s/[[:space:]]*$//g; s/[[:space:]]+/|/g; s/[,;]+/|/g; s/%/[^\`]\*/g');

        grep -viE "(^INSERT INTO \`($excl_tns)\`)|(^DROP TABLE (IF EXISTS )?\`($excl_tns)\`)|^LOCK TABLES \`($excl_tns)\` WRITE" | \
        sed 's/^CREATE TABLE `/CREATE TABLE IF NOT EXISTS `/g'        
    else
        cat
    fi
}

Suppose you have a dump created like so:

MYSQL_PWD="my-pass" mysqldump -u user --hex-blob db_name | \
pigz -9 > dump.sql.gz

And want to exclude some unwanted tables before importing:

pigz -dckq dump.sql.gz | \
filt_exclude '%session% action_log %_cache' | \
MYSQL_PWD="my-pass" mysql -u user db_name

Or you could pipe into a file or any other tool before importing to DB.
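To see how the '%' wildcards are translated before running this against a real dump, the sed inside filt_exclude can be exercised on its own (the patterns below are hypothetical; GNU sed is assumed):

```shell
# '%' becomes the regex [^`]*, and whitespace/commas become alternation.
excl_tns=$(echo '%session% action_log %_cache' | \
  sed -r 's/^[[:space:]]*//g; s/[[:space:]]*$//g; s/[[:space:]]+/|/g; s/[,;]+/|/g; s/%/[^\`]\*/g')

echo "$excl_tns"
```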

Adz answered 11/7, 2019 at 11:34 Comment(0)

If desired, you can do this one table at a time:

mysqldump -p sourceDatabase tableName > tableName.sql
mysql -p -D targetDatabase < tableName.sql
Lattonia answered 26/7, 2013 at 2:13 Comment(0)

Here is my script to exclude some tables from a MySQL dump. I use it to restore a DB when I need to keep the orders and payments data:

exclude_tables_from_dump.sh

#!/bin/bash

if [ ! -f "$1" ];
then
    echo "Usage: $0 mysql_dump.sql"
    exit
fi

declare -a TABLES=(
user
order
order_product
order_status
payments
)

CMD="cat $1"
for TBL in "${TABLES[@]}";do
    CMD+="|sed 's/DROP TABLE IF EXISTS \`${TBL}\`/# DROP TABLE IF EXISTS \`${TBL}\`/g'"
    CMD+="|sed 's/CREATE TABLE \`${TBL}\`/CREATE TABLE IF NOT EXISTS \`${TBL}\`/g'"
    CMD+="|sed -r '/INSERT INTO \`${TBL}\`/d'"
    CMD+="|sed '/DELIMITER\ \;\;/,/DELIMITER\ \;/d'"
done

eval $CMD

It avoids the DROP and re-CREATE of these tables and skips inserting data into them. It also strips all FUNCTIONS and PROCEDURES stored between DELIMITER ;; and DELIMITER ;
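The routine-stripping step at the end can be seen in isolation: everything from a line matching DELIMITER ;; through the next DELIMITER ; is deleted. A toy sketch (file contents hypothetical):

```shell
cat > dump_sample.sql <<'EOF'
CREATE TABLE `orders` (id INT);
DELIMITER ;;
CREATE PROCEDURE p() BEGIN SELECT 1; END ;;
DELIMITER ;
INSERT INTO `orders` VALUES (1);
EOF

# Range delete: from the DELIMITER ;; line to the next DELIMITER ; line.
sed '/DELIMITER ;;/,/DELIMITER ;/d' dump_sample.sql > no_routines.sql
cat no_routines.sql   # CREATE TABLE and INSERT survive; the procedure is gone
```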

Overmaster answered 30/5, 2018 at 14:11 Comment(0)

Besides the great answers in this topic, it could be that your .sql file does not contain a standalone INSERT like this for every table:

INSERT INTO `log`

But they are actually inserted like this:

LOCK TABLES `log` WRITE;

INSERT INTO `log` (`id`, `data`)
VALUES
  (1,'The log content'),
  (2,'The log content'),
  (3,'The log content'),
  (4,'The log content');

UNLOCK TABLES;

You'll want to exclude everything between LOCK TABLES `log` WRITE; and the next UNLOCK TABLES;

gzip -dc "$1" | sed \
-e '/LOCK TABLES `log` WRITE;/,/^UNLOCK TABLES/d;' \
-e '/LOCK TABLES `some_other_table` WRITE;/,/^UNLOCK TABLES/d;' \
| gzip > reduced_$(date '+%F_%T').sql.gz

Put that in a reduce_db.sh somewhere on your computer, and every time you need it, execute it like this:

bash /folder/to/reduce_db.sh /path/to/big_db_file.sql.gz

Good to know: only exclude data on which no other data in the database depends; otherwise your reduced database will end up inconsistent.

Circassia answered 9/3, 2023 at 18:47 Comment(0)

I would not use it in production, but if I had to quickly import some backup that contains many smaller tables and one big monster table that might take hours to import, I would most probably run grep -v unwanted_table_name original.sql > reduced.sql

and then mysql -f < reduced.sql
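As written, grep -v drops every line that merely mentions the table name anywhere. Anchoring on the backquoted name narrows the match considerably; a sketch with hypothetical table names:

```shell
cat > original.sql <<'EOF'
INSERT INTO `SALES` VALUES (1);
INSERT INTO `BIGSALES` VALUES (2);
EOF

# Match the exact backquoted table name so `BIGSALES` is untouched.
grep -v 'INTO `SALES`' original.sql > reduced.sql
cat reduced.sql   # only the `BIGSALES` INSERT remains
```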

Sociolinguistics answered 8/9, 2018 at 3:48 Comment(1)
You might want to watch out for a situation where your unwanted_table_name is 'SALES' but you have another table called 'BIGSALES', or maybe a foreign key field called another_table_SALES_fk. Your answer as it stands could lead to a world of pain. Also, if you are going to go down this route, surely fgrep would be much faster than grep? – Rhodium

Based on the other answers, I've created a BASH script:
https://gist.github.com/martin-rueegg/e247d99cc7402a4f1456c059c5d050ac

Examples
  • Show what will be filtered:

    $ mysql_filter_tables --show "log"
    $ mysql_filter_tables --show "%session%" "action_log" "%_cache"
    $ mysql_filter_tables --show -- "-h"  # in case your table name is "-h"
    
  • Usage with plain SQL file:

    $ mysql_filter_tables "log" <big.sql >clean.sql
    $ cat big.sql | mysql_filter_tables "log" >clean.sql
    
  • Import plain SQL dump into mysql:

    $ MYSQL_PWD=secret mysql_filter_tables -i big.sql "%_cache" | mysql -u user -D database
    
  • If source is compressed:

    $ mysql_filter_tables -i big.sql.zip --unzip -o clean.sql "%session%" "%_cache"
    $ mysql_filter_tables -i big.sql.gz --gunzip -o clean.sql "%session%" "%_cache"
    
More Options
Command Options:
  --show
    show generated sed command, rather than executing it

  -i | --input file
    read from file, rather than standard input

  -o | --output file
    write result to file, rather than standard output

  -f | --force
    overwrite output file, if exists

  -u | -ui | --unzip
    unzip input stream. Use with -i to read from file

  -z | -zo | --zip
    zip output stream. Use with -o to write to file

  -g | -gi | --gunzip
    gunzip input stream. Use with -i to read from file

  -G | -go | --gzip
    gzip output stream. Use with -o to write to file

Info Options:
  -h|--help       show help
  -V|--version    show version
  -1|--one-line   show a one-liner that can be easily used in other scripts.
                  (Does not support any info options, except -V)
  --copyright     show copyright, license, and credits
  --credits       show copyright, license, and credits
  --license       show copyright, license, and credits
Installation
  • Download file from github.com

    $ wget https://gist.github.com/martin-rueegg/e247d99cc7402a4f1456c059c5d050ac/raw/mysql_filter_tables.sh
    
  • Inspect the file for your own safety

  • Source the file, and then use the function in the shell, or ...

    $ source mysql_filter_tables.sh
    $ mysql_filter_tables --help
    
  • ... mark the file executable and use it as a script

    $ chmod a+x mysql_filter_tables.sh
    $ ./mysql_filter_tables.sh --help
    
Dependencies
  • bash
  • sed, cat
  • zip, unzip
  • gzip, zcat


Catherin answered 21/10, 2023 at 22:11 Comment(0)
