Export a large MySQL table as multiple smaller files
A

7

13

I have a very large MySQL table on my local dev server: over 8 million rows of data. I loaded the table successfully using LOAD DATA INFILE.

I now wish to export this data and import it onto a remote host.

I tried LOAD DATA LOCAL INFILE to the remote host. However, after around 15 minutes the connection to the remote host fails. I think that the only solution is for me to export the data into a number of smaller files.

The tools at my disposal are PhpMyAdmin, HeidiSQL and MySQL Workbench.

I know how to export as a single file, but not multiple files. How can I do this?

Allimportant answered 28/10, 2012 at 20:27 Comment(4)
You could try mysqldump'ing just the required table and reconstructing it via import.Polenta
Can you connect to the mysql machine from your local machine?Raggletaggle
@Raggletaggle - yes, of course. I have been using HeidiSQL for database maintenance.Allimportant
Then just dump locally, split the file, then upload it in increments.Raggletaggle
M
21

I just did an import/export of a (partitioned) table with 50 million records; it took just 2 minutes to export from a reasonably fast machine and 15 minutes to import on my slower desktop. There was no need to split the file.

mysqldump is your friend, and since you have a lot of data it's better to compress it:

 @host1:~ $ mysqldump -u <username> -p <database> <table> | gzip > output.sql.gz
 @host1:~ $ scp output.sql.gz host2:~/
 @host1:~ $ rm output.sql.gz
 @host1:~ $ ssh host2
 @host2:~ $ gunzip < output.sql.gz | mysql -u <username> -p <database>
 @host2:~ $ rm output.sql.gz
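
If disk space is tight, the same idea can be streamed straight over ssh with no intermediate file. A hedged one-liner sketch (not from the original answer; note that the remote mysql cannot prompt for a password through the pipe, so credentials must come from ~/.my.cnf or the command line):

 @host1:~ $ mysqldump -u <username> -p<password> <database> <table> | gzip | ssh host2 "gunzip | mysql -u <username> -p<password> <database>"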
Motheaten answered 30/5, 2013 at 9:33 Comment(0)
S
6

Take a look at mysqldump

Your commands (run from the terminal) should be:

Export db_name to backupfile.sql:

mysqldump -u user -p db_name > backupfile.sql

Import backupfile.sql back into db_name:

mysql -u user -p db_name < backupfile.sql

You have two options for splitting the data:

  1. Split the output text file into smaller files (as many as you need; many tools can do this, e.g. Unix split, see the sketch after this list).
  2. Export one table at a time by adding the table name after db_name, like so:

    mysqldump -u user -p db_name table_name > backupfile_table_name.sql
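
For the first option, here is a minimal sketch using the Unix split utility (user, database and table names are illustrative). --skip-extended-insert writes one INSERT per row so no statement gets cut in half, and --no-create-info omits the CREATE TABLE statement so every piece can be imported on its own:

mysqldump -u user -p --no-create-info --skip-extended-insert db_name table_name > rows.sql
split -l 500000 rows.sql rows_part_      # 500,000 INSERTs per file: rows_part_aa, rows_part_ab, ...
mysql -u user -p db_name < rows_part_aa  # import each piece separately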

Compressing the file(s) is very effective for a text file and can shrink it to about 20%-30% of its original size.

Copying the files to remote servers should be done with scp (secure copy), and interactive work should usually go through ssh.
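
For example (file and host names are hypothetical):

gzip backupfile.sql                        # produces backupfile.sql.gz at a fraction of the size
scp backupfile.sql.gz user@remote_host:~/  # copy it to the remote server
ssh user@remote_host                       # then gunzip and import it there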

Good luck.

Sera answered 28/10, 2012 at 20:34 Comment(2)
"split the output text file into smaller files (as many as you need, many tools to do this)." - what tools? This was the question: how do I do it?Allimportant
Take a look at the edit. Added Unix's "split" as an example, should be available on most systems.Sera
A
3

I found that the advanced options in phpMyAdmin allow me to select how many rows to export, plus the start point. This allows me to create as many dump files as required to get the table onto the remote host.

I had to adjust my php.ini settings, plus phpMyAdmin's 'ExecTimeLimit' config setting, as generating the dump files takes some time (500,000 rows in each).

I use HeidiSQL to do the imports.

Allimportant answered 29/10, 2012 at 0:33 Comment(0)
P
2

As an example of the mysqldump approach for a single table:

mysqldump -u root -ppassword yourdb yourtable > table_name.sql

Importing is then as simple as

mysql -u username -ppassword yourotherdb < table_name.sql
Polenta answered 28/10, 2012 at 20:36 Comment(3)
I've found this particularly fast for large exports/imports. Good suggestion for the OP.Coot
Yeah, that's nice; however, I do not have SSH access to the server. It does not answer the actual question.Allimportant
@Allimportant You can still run a phpMyAdmin import on the dump file, if you have phpMyAdmin access. You have the option to do a partial import, then continue by setting the number of queries to skip to the number that already ran successfully.Polenta
D
1

Use mysqldump to dump the table into a file. Then use tar with the -z option to compress it. Transfer it to your remote server (with ftp, sftp or another file transfer utility), untar the file on the remote server, and use mysql to import it.

There is no reason to split the original file or to export it as multiple files.
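
A minimal sketch of that workflow, assuming shell access on both machines (names are illustrative, and scp stands in for whichever transfer utility you prefer):

mysqldump -u user -p db_name big_table > big_table.sql
tar -czf big_table.tar.gz big_table.sql
scp big_table.tar.gz user@remote_host:~/
ssh user@remote_host
tar -xzf big_table.tar.gz
mysql -u user -p db_name < big_table.sql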

Demerit answered 28/10, 2012 at 20:43 Comment(3)
@Vyktor Please explain your question to me a little bit more; I may be missing some information here. My first quick answer is that I do not expect any query timeouts. The "mysql" command will process the "xxx.sql" file (the result of the export) on the local machine where the import is going to take place. The "xxx.sql" file contains "insert" commands for the table being imported (expect about 8 million such commands/statements, i.e. equal to the number of rows of the table at hand). I do not see any reason for these "insert" commands to fail due to any timeout.Demerit
This is a shared host - I don't have access to MySQL on the command line.Allimportant
The question was: HOW to split the data.... whether you think I should or not. Clearly for me, this is the simplest solution as I can easily import smaller files thru HeidiSQLAllimportant
H
1

If you are not comfortable with using the mysqldump command-line tool, here are two GUI tools that can help you with that problem, although you have to be able to upload them to the server via FTP!

Adminer is a slim and very efficient DB manager tool that is at least as powerful as phpMyAdmin and consists of ONE SINGLE FILE that has to be uploaded to the server, which makes it extremely easy to install. It works way better with large tables / DBs than PMA does.

MySQLDumper is a tool developed especially to export / import large tables / DBs, so it will have no problem with the situation you describe. The only downside is that it is a bit more tedious to install, as there are more files and folders (~350 files in ~1.5MB), but it shouldn't be a problem to upload it via FTP either, and it will definitely get the job done :)

So my advice would be to first try Adminer and if that one also fails go the MySQLDumper route.

Hasan answered 12/3, 2014 at 11:48 Comment(0)
D
0

How do I split a large MySQL backup file into multiple files?

You can use mysql_export_explode: https://github.com/barinascode/mysql-export-explode

<?php
# Include the class

include 'mysql_export_explode.php';
$export = new mysql_export_explode;

$export->db = 'dataBaseName'; # -- Set your database name
$export->connect('host','user','password'); # -- Connect to the database
$export->rows = array('Id','firstName','Telephone','Address'); # -- Set which fields you want to export
$export->exportTable('myTableName',15); # -- Table name and the number of files to split the table into
?>

At the end, the SQL files are created in the directory where the script was executed, in the following format:
---------------------------------------
myTableName_0.sql
myTableName_1.sql
myTableName_2.sql
...
Decencies answered 21/2, 2016 at 2:53 Comment(0)
