How to make a copy of a large database from phpMyAdmin?

I want to create a dev environment of my website on the same server, but I have a 7 GB database containing 479 tables, and I want to copy that database to the new DB.

I have tried this with phpMyAdmin's Operations >> "Copy database to" functionality, but every time it fails and returns the error

Error in processing request Error code: 500 Error text: Internal Error. 

Please let me know if there is any other method/solution to copy this database to a new database from cPanel. Please advise.

Broch answered 28/9, 2017 at 7:15 Comment(19)
Create a backup from phpMyAdmin. – Sliding
@Sliding Already created, but how do I restore that backup to the new DB? – Broch
Well, you've created it by clicking on "export"... so what about logging into the new database, clicking "import" there, and importing your SQL file? ;) – Sliding
You can use the terminal for big databases. – Reclamation
@Sliding It failed in the middle of the process with a timeout error. – Broch
mysql -u username -p database_name < /path/to/file/on/server/textfilewithsqlstatments.sql – Reclamation
You can increase the timeout by changing your server settings. – Sliding
@Sliding He said a 7 GB database... he can't import such a big database using the phpMyAdmin import. – Reclamation
@Reclamation cPanel does not provide that much privilege. – Broch
@Sliding We can't even export the whole DB; same timeout issue. – Broch
@Reclamation And he can't do that because of...? As I said, increase the running time for scripts so you won't get the timeout. It's all in your php.ini settings. Nothing more. – Sliding
Do you have access to the command line? @Broch – Czarism
@DanyalSandeelo No, I don't have command-line access; it's shared hosting. – Broch
Are you working on Mac or Windows? – Limnetic
@Limnetic My PC runs Windows 7, and the server is Linux shared hosting. – Broch
Did you ask your hosting provider? Maybe they would do it for you. – Strafford
@PaulSpiegel Yes, yesterday I contacted them and they solved my issue. But is there any other process to do this from phpMyAdmin or cPanel? – Broch
SHOW VARIABLES LIKE "%timeout%"; Which of those approximates the time before you timed out? – Gyral
@Narayan Could you at least try my solution? I've run into this before, in environments where I had access to the CLI but others didn't, and they needed to solve this very problem. It's not the cleanest solution, but the cleanest solutions will require the CLI. – Spoliation

Create an export of your database; this is easily done through the phpMyAdmin interface. Once you have downloaded the DB export, create a new DB where you will put your exported data. This, too, is easily done through the phpMyAdmin user interface.

To upload it, we cannot use Import -> "Browse your computer", because it has a limit of 2 MB (the default PHP upload limit). One solution is to use Import -> "Select from the web server upload directory /var/lib/phpMyAdmin/upload/". Upload your exported data into this directory. After that, your uploaded file should be listed in the dropdown next to it.

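If the server-upload option does not appear in your Import tab at all, the UploadDir directive may be unset. A minimal sketch of enabling it, assuming shell (or file-manager) access and a config file at /etc/phpmyadmin/config.inc.php (both are assumptions; paths vary by host):

    # Assumption: the phpMyAdmin config lives at /etc/phpmyadmin/config.inc.php.
    # UploadDir is a standard phpMyAdmin directive; files placed in that
    # directory are offered in the Import tab's server-upload dropdown.
    echo "\$cfg['UploadDir'] = '/var/lib/phpMyAdmin/upload';" >> /etc/phpmyadmin/config.inc.php

    # Put the exported dump where phpMyAdmin can see it:
    cp ~/export.sql /var/lib/phpMyAdmin/upload/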

If this fails too, you can use the command line import.

mysql -u user -p db_name < /path/to/file.sql
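
If the host happens to have pv installed (an assumption; it often isn't present on shared hosting), the same import can show progress while it runs:

    # pv streams the dump into mysql while printing a progress bar and ETA
    pv /path/to/file.sql | mysql -u user -p db_name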
Vesicate answered 28/9, 2017 at 7:22 Comment(2)
There is only one option available, i.e. browse from computer; check i.imgur.com/UNqxqYN.png – Broch
You must be using an older phpMyAdmin. Do not fear; I updated my answer so you can upload through the command line. – Vesicate
G
2

NOTE: I have just read your comment, and as I understand it, you don't have access to the command line. Please check Solution Two; this will definitely work.

The only solution that will work for you (which worked for me on a 12 GB database) is directly from the command line:

Solution One

mysql -u root -p

-- Set network buffer length to a large byte value
set global net_buffer_length=1000000;

-- Set maximum allowed packet size to a large byte value
set global max_allowed_packet=1000000000;

-- Disable foreign key checking to avoid delays, errors and unwanted behavior
SET foreign_key_checks = 0;

-- Import your SQL dump file
source file.sql

-- Remember to re-enable foreign key checks when the procedure is complete!
SET foreign_key_checks = 1;

If you have root access, you can create a bash script:

#!/bin/sh 

# store start date to a variable
imeron=`date`

echo "Import started: OK"
dumpfile="/home/bob/bobiras.sql"

ddl="set names utf8; "
ddl="$ddl set global net_buffer_length=1000000;"
ddl="$ddl set global max_allowed_packet=1000000000; "
ddl="$ddl SET foreign_key_checks = 0; "
ddl="$ddl SET UNIQUE_CHECKS = 0; "
ddl="$ddl SET AUTOCOMMIT = 0; "
# if your dump file does not create a database, select one
ddl="$ddl USE jetdb; "
ddl="$ddl source $dumpfile; "
ddl="$ddl SET foreign_key_checks = 1; "
ddl="$ddl SET UNIQUE_CHECKS = 1; "
ddl="$ddl SET AUTOCOMMIT = 1; "
ddl="$ddl COMMIT ; "

echo "Import started: OK"

time mysql -h 127.0.0.1 -u root -proot -e "$ddl"

# store end date to a variable
imeron2=`date`

echo "Start import:$imeron"
echo "End import:$imeron2"

Source

Solution Two

Also, there is another option which is very good for those on shared hosting without command-line access. This solution worked for me on 4-5 GB files:

  1. MySQL Dumper: Download (you will be able to back up/restore SQL files directly from MySQL Dumper; you won't need phpMyAdmin anymore).
  2. Big Dump: Download (it restores from a compressed file or a plain SQL file; for big imports you need to edit the BigDump PHP file, changing $linespersession = 3000; to $linespersession = 30000; see the sketch below).
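
If you can't edit the file by hand, a one-line sed sketch for that BigDump change (the variable name is taken from the stock bigdump.php; verify it against your copy):

    # Raise BigDump's lines-per-session from 3000 to 30000 in place
    sed -i 's/\$linespersession *= *3000;/\$linespersession = 30000;/' bigdump.php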

Solution Three:

This solution definitely works; it is slow, but it works.

Download the trial version (32 or 64 bit) of Navicat for MySQL (version 12).

Install it and run it as a trial.

After that, add your computer's IP (public internet IP, not local IP) to "Remote MySQL" in cPanel (on the new database/hosting). You can use a wildcard IP in cPanel to allow MySQL access from any IP.

Go to Navicat: click on "Connection" and enter a connection name.

In the next field, "Hostname/IP", add your hosting IP address (don't use localhost). Leave the port as it is (if your hosting defined a different port, enter that one here).

Add your database username and password.

Click "Test Connection"; if it's successful, click "OK".

Now, on the main screen, you will see all the databases available to that username in the left-hand column.

Double-click the database into which you want to import the SQL file:

The database's icon color will change, and you will see "Tables/Views/Functions etc.".

Now right-click the database and select "Execute SQL file" (http://prntscr.com/gs6ef1), choose the file, tick "continue on error" if you want, and finally run it. It takes some time, depending on your network connection speed and computer performance.

Gondar answered 2/10, 2017 at 5:47 Comment(4)
I have tried this 2nd solution as well, but it fails partway through the download, or returns only half the data. – Broch
Which one (MySQL Dumper or BigDump)? – Gondar
@Broch Try "MySQL Dumper" with $linespersession = 30000;. I have just added solution #3; try that as well. – Gondar
MySQLDumper: this project is no longer maintained and is inactive. – Foldaway

Limited to phpMyAdmin? Don't do it all at once

Large data sets shouldn't be dumped (unless it's for a backup). Instead, export the database without data, then copy one table at a time (DB to DB directly).

Export/Import Schema

First, export only the database schema via phpMyAdmin (uncheck data in the export options). Then import that onto a new database name.

Alternatively, once you've created the DB, you could generate statements like the one below, one per table. The catch with this method is that you're likely to lose constraints, stored procedures, and the like.

CREATE TABLE `devDB`.`table` LIKE `prodDB`.`table`;
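
For instance, a sketch that generates one such statement per table from information_schema (user, devDB, and prodDB are placeholders):

    # Emit one CREATE TABLE ... LIKE statement per table found in prodDB
    mysql -u user -p -N -e "SELECT CONCAT('CREATE TABLE \`devDB\`.\`', table_name, '\` LIKE \`prodDB\`.\`', table_name, '\`;') FROM information_schema.tables WHERE table_schema = 'prodDB';" > create_tables.sql
    # Review create_tables.sql, then run it against the server:
    mysql -u user -p < create_tables.sql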

Copy data, one table at a time.

Use a good editor to create the 479 INSERT statements you need: start with a list of table names, and use good old find-and-replace.

INSERT INTO `devDB`.`table` SELECT * FROM `prodDB`.`table`;

This may choke, depending on your environment. If it does, drop and recreate the dev database (or empty all its tables via phpMyAdmin), then run the INSERT commands a few tables at a time.
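
If you do eventually get even temporary shell access, the same table-at-a-time copy can be scripted. A sketch, with credentials and DB names as placeholders:

    # Copy each table's rows from prodDB into the already-created devDB schema,
    # one table at a time (assumes table names contain no spaces)
    for t in $(mysql -u user -p'secret' -N -e 'SHOW TABLES' prodDB); do
        echo "copying $t"
        mysql -u user -p'secret' -e "SET foreign_key_checks = 0; INSERT INTO \`devDB\`.\`$t\` SELECT * FROM \`prodDB\`.\`$t\`;"
    done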

Database Administration requires CLI

The real problem you're facing here is that you're trying to do database administration without access to the command line. There are significant complications in migrating large data sets efficiently, most of which can only be solved with tools like mysqldump.

Spoliation answered 6/10, 2017 at 15:14 Comment(0)

The easiest way is to try exporting the data from phpMyAdmin; it will create a backup of your data.

But sometimes, transferring a large amount of data via import/export results in errors.

You can try mysqldump to back up the data as well.

I found a few links for you here and here.

This is the mysqldump database backup documentation.

Hope it helps. :D

Gang answered 28/9, 2017 at 7:31 Comment(3)
After taking the backup, how do I restore it? – Broch
Depending on the OS you use, open cmd and navigate to the folder where the dump resides. Check this SO link: #106276 – Gang
I am using cPanel shared hosting, and it does not provide that much privilege from the command line. – Broch

I suspect that phpMyAdmin won't handle databases of that size (PHP upload/download limits, memory constraints, script execution time). If you have access to the console, I would recommend doing the export/import via the mysql command line:

Export:

    $ mysqldump -u <user> -p<pass> <liveDatabase> | gzip > export.sql.gz

And Import:

    $ gunzip < export.sql.gz | mysql -u <user> -p<pass> <devDatabase>

after you have created the new dev database, e.g. in phpMyAdmin or via the command line.
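
Since both databases live on the same server here, you can also skip the intermediate file and pipe the dump from one database straight into the other (same placeholder style as above):

    $ mysqldump -u <user> -p<pass> <liveDatabase> | mysql -u <user> -p<pass> <devDatabase>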

Otherwise, if you only have access to an Apache/PHP environment, I would look for an export utility that splits the export into smaller chunks. MySQLDumper comes to mind, but it's a few years old and, AFAIK, no longer actively maintained and not compatible with PHP 7+. But I think there is at least a pull request out there that makes it work with PHP 7 (untested).

Edit based on your comment:

If the export already exists and the error occurs on import, you could try increasing the limits on your PHP environment, via entries in .htaccess, changes to php.ini, or ini_set, whichever is available in your environment. The relevant settings, for example when set via .htaccess, are (keep in mind this works only for Apache environments with mod_php, and can also be restricted by your hoster):

      php_value max_execution_time 3600
      php_value post_max_size 8000M
      php_value upload_max_filesize 8000M
      php_value max_input_time 3600
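
To check which values are actually in effect, a rough sanity check (note that CLI values can differ from the web server's):

    $ php -i | grep -E 'max_execution_time|max_input_time|post_max_size|upload_max_filesize'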

This may or may not work, depending on 32/64-bit issues and/or your hoster's restrictions. Additionally, you need to adjust the phpMyAdmin setting ExecTimeLimit, usually found in config.default.php of your phpMyAdmin installation: replace

      $cfg['ExecTimeLimit'] = 300;

with

      $cfg['ExecTimeLimit'] = 0;

And finally, you probably need to adjust your MySQL config to allow larger packets and get rid of the 'lost connection' error, in the [mysqld] section of my.ini:

      max_allowed_packet=256M
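
After restarting MySQL, you can confirm the new value took effect; the same query also works from phpMyAdmin's SQL tab if you have no shell:

    $ mysql -u <user> -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"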
Niche answered 28/9, 2017 at 7:48 Comment(2)
There is already a backup of that database on the server, and it's available to download, but due to the timeout issue the full file cannot be downloaded. – Broch
It gives an error like "mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table". – Broch

You can use mysqldump as follows:

mysqldump --user= --password= --default-character-set=utf8
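
A filled-in sketch of the same command (dbuser, secret, and mydb are placeholders):

    # Dump a single database to a file (use -p with no value to be prompted
    # for the password instead of putting it on the command line)
    mysqldump --user=dbuser --password='secret' --default-character-set=utf8 mydb > mydb_backup.sql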

You can also make use of my shell script, which I actually wrote long ago for creating backups of a MySQL database on a regular basis using a cron job.

#!/bin/sh
now="$(date +'%d_%m_%Y_%H_%M_%S')"
filename="db_backup_$now".gz
backupfolder=""
fullpathbackupfile="$backupfolder/$filename"
logfile="$backupfolder/"backup_log_"$(date +'%Y_%m')".txt
echo "mysqldump started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
mysqldump --user= --password= --default-character-set=utf8  | gzip > "$fullpathbackupfile"
echo "mysqldump finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
chown  "$fullpathbackupfile"
chown  "$logfile"
echo "file permission changed" >> "$logfile"
find "$backupfolder" -name "db_backup_*" -mtime +2 -exec rm {} \;
echo "old files deleted" >> "$logfile"
echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "*****************" >> "$logfile"
exit 0
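
To schedule it, an entry can be appended to the crontab. A sketch, assuming the script was saved as /home/bob/db_backup.sh (a placeholder path):

    # Append a nightly 02:30 run of the backup script to the current crontab
    (crontab -l 2>/dev/null; echo "30 2 * * * /home/bob/db_backup.sh >> /home/bob/backup_cron.log 2>&1") | crontab -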

I have already written an article on scheduling MySQL database backups on cPanel or Linux.

Karaganda answered 2/10, 2017 at 11:11 Comment(0)

Here's how I handled this problem when I faced it. Unfortunately, this only works on macOS.

  • Download Sequel Pro. It's completely free, and it has worked really well for me for over a year now.
  • Connect remotely to your server's database. You will probably need to add your IP address to the "Remote MySQL" section in cPanel. If you don't have the credentials, you can probably get them from your website's config file.
  • Once you're in the server, you can select all of your tables, secondary-click, and select Export > As SQL Dump. You probably won't need to edit any of the settings. Click "Export".
  • Log in to your local server's database, and select "Query" from the top menu.
  • Drag and drop the file that was downloaded from the export, and it will automatically set up the database from the SQL dump.

I hope this helps. It's a bit of a workaround, but it has worked really well for me, especially when phpMyAdmin has failed.

Limnetic answered 2/10, 2017 at 21:33 Comment(0)

Since the requirements include PHPMyAdmin, my suggestion is to:

  1. select the database you need
  2. go to the "Export" tab
  3. click the "Custom - display all possible options" radio button
  4. in the "Save output to a file" radio button options, select "gzipped" for "Compression:"
  5. Remove the "Display comments" tick (to save some space)
  6. Finish the export

Then try to import the generated file into the new database you have (if you have sufficient resources, this should be possible).

Note: My previous experience shows that using compression allows larger DB export/import operations, but I have not tested the upper limit in shared hosting environments (I am assuming shared hosting from your comment about cPanel).

Edit: When your export file is created, select the new database (assuming it is already created), go to the "Import" tab, select the file created from the export and start the import process.

Dermatoglyphics answered 3/10, 2017 at 9:30 Comment(0)

If you have your database on your local server, you can export it and use BigDump to insert it into the new database on the remote server: BigDump

Flavour answered 8/10, 2017 at 5:1 Comment(0)
