MySQL workbench table data export extremely slow

I just downloaded the newest version of MySQL Workbench (6.3.6) and attempted to export a remote table (on Google Cloud SQL) to CSV using the new "table data export" wizard. The table had about 600,000 rows, and the final downloaded file was about 75 MB. The export took 7.5 hours.

I realize I can use the Google Developer Console to perform this export (which I did; it took about 15 seconds), but something seems wrong with MySQL Workbench. Could a configuration issue be causing it to run this slowly?

Foreandafter answered 7/1, 2016 at 19:5 Comment(3)
Curious, did you export the file to the local or remote machine? Does changing this have any effect? - Dim
Can you run a traceroute (mtr is awesome) between yourself and the instance IP to diagnose the network latency? - Smoking
Refer to this link: #33297069 - Consequent

I know this question is quite old, but I'm answering because I recently had this issue. I was trying to export 2 million+ rows, and after two days it had completed only half, even after trying several different export methods. Then I found this:

-- Server-side export: MySQL writes the CSV itself, bypassing the
-- wizard's row-by-row processing entirely.
SELECT *
FROM my_table
INTO OUTFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/my file.csv'
FIELDS TERMINATED BY ';'   -- column separator
ENCLOSED BY '"'            -- quote every field
ESCAPED BY '"'             -- embedded quotes are doubled, CSV-style
LINES TERMINATED BY '\r\n';

And it completed in 80 seconds!

Please note: if you hit a secure_file_priv error, set the file path to the directory returned by:

SHOW VARIABLES LIKE 'secure_file_priv';
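
One caveat worth adding: INTO OUTFILE writes the file on the database server's filesystem, so for a remote instance (like the Cloud SQL setup in the question) you may need a client-side export instead. Here is a minimal sketch of one in Python, assuming the mysql-connector-python package; the connection details, table name, and fetch size are placeholders:

# Client-side CSV export: stream rows in bounded batches rather than
# loading the whole result set into memory at once.
import csv
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="your-host", user="your-user",
    password="your-password", database="your-db",
)
cursor = conn.cursor()
cursor.execute("SELECT * FROM my_table")

with open("my_file.csv", "w", newline="", encoding="utf-8") as f:
    # Match the server-side example: ';' separators, every field quoted.
    writer = csv.writer(f, delimiter=";", quoting=csv.QUOTE_ALL)
    writer.writerow(cursor.column_names)   # header row
    while True:
        rows = cursor.fetchmany(10000)     # fetch in chunks
        if not rows:
            break
        writer.writerows(rows)

cursor.close()
conn.close()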
Corri answered 9/5, 2020 at 18:10 Comment(0)

Description: Workbench is very slow when exporting large datasets through the CSV export wizard, disproportionately slow compared to a smaller set. However, this is something I've come across before with .NET.

How to repeat: Take a table with 15k or more records and export it through the wizard. Note how long it takes, then export a subset of that data and observe that the time taken does not scale linearly with the number of rows.

Suggested fix: Something I've noticed when building CSV export applications is that the MS .NET framework can't deal with huge strings very well and tends to perform badly as a result.

I found a solution, though. Instead of building one huge string and writing it to the file all at once when the export is done, I get much better performance by generating only a few hundred rows of CSV at a time, writing them to the file, and flushing the buffer I've been writing to.

I'd also recommend writing to a temp file, then renaming/moving it to the user's specified path when done. Write-to-temp-then-move is how Photoshop and some other applications save their data, and in my own development I've found that writing x rows and flushing is much faster than trying to get .NET to manage a 20 MB string.
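
Here is a minimal sketch of that write-in-batches-then-rename pattern. Python is used purely for illustration (the original point is about .NET), and the function name, batch size, and paths are assumptions:

# Write CSV in small batches with flushes, then move the finished temp
# file into place, instead of building one giant in-memory string.
import csv
import os
import tempfile

def export_rows(rows, dest_path, batch_size=500):
    # Create the temp file next to the destination so the final
    # os.replace stays on one filesystem and is effectively atomic.
    dest_dir = os.path.dirname(os.path.abspath(dest_path))
    fd, tmp_path = tempfile.mkstemp(suffix=".csv", dir=dest_dir)
    try:
        with os.fdopen(fd, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            batch = []
            for row in rows:
                batch.append(row)
                if len(batch) >= batch_size:
                    writer.writerows(batch)  # write a few hundred rows...
                    f.flush()                # ...then flush the buffer
                    batch.clear()
            if batch:
                writer.writerows(batch)      # write any remainder
        os.replace(tmp_path, dest_path)      # rename into place when done
    except BaseException:
        os.unlink(tmp_path)
        raise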

Consequent answered 6/1, 2018 at 7:12 Comment(0)

Try using an ETL tool such as Pentaho ETL

or

https://www.mycli.net/

Psaltery answered 14/5, 2020 at 4:29 Comment(0)
