Importing a CSV into MySQL via the command line

I'm trying to import a very large .csv file (~4 GB) into MySQL. I was considering using phpMyAdmin, but it has a max upload size of 2 MB. Someone told me that I have to use the command line.

I was going to use these directions to import it: http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html#c5680

What would be the command to set the first row of the .csv file as the column names in the MySQL table? This option is available through phpMyAdmin, so there must be a MySQL command-line version too, right? Please help me. Thank you.

-Raj

Linzer answered 7/7, 2011 at 4:18 Comment(0)

Try this command:

 load data local infile 'file.csv' into table table_name
 fields terminated by ','
 enclosed by '"'
 lines terminated by '\n'
 (column1, column2, column3, ...)

The columns listed are the actual table columns the data needs to go into. The enclosed by and lines terminated by clauses are optional, and they help if your columns are wrapped in double quotes, as in Excel exports, etc.

For further details check the manual.

To handle a first row that holds the column names, stop it from being read as data with ignore 1 lines, and add the names to the column list in the command yourself.
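
Putting it together, a full statement along those lines might look like this (the file, table, and column names are placeholders to replace with your own):

 load data local infile 'file.csv' into table table_name
 fields terminated by ','
 enclosed by '"'
 lines terminated by '\n'
 ignore 1 lines
 (column1, column2, column3);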

Salesclerk answered 7/7, 2011 at 4:23 Comment(18)
A few questions: firstly, do column1, column2, etc. need to have quotes around them? And my issue is that there are 50+ column names I need to import. The first line contains all of that data, so if there were some way for MySQL to read the first line and set those as the column names, that would be best. It would be way too tedious to write each name one by one. Thank you.Linzer
You don't need to write them out; the column names are comma-separated, so just cut the first line from your csv file and paste it into the command. And as far as I know, in phpMyAdmin the csv is read first to generate a query like this, and then the import is done.Salesclerk
Hmm, did not think of that. Let me give that a go. Thank you. Do you, by any chance, know how to copy from a text editor and paste into an Ubuntu terminal?Linzer
Either right-click and you'll get a Paste option, or use Ctrl+Shift+VSalesclerk
Okay, well I need to upload the 4 GB file to the server and try it. Thanks for the help!Linzer
I got an error when I tried it. This is what I typed in: load data local infile 'filepath' into table tester fields terminated by ',' enclosed by '"' lines terminated by '\n' "column1", "column2", "column3". I tried putting parentheses around the last part, but it also gave me an error. What am I doing wrong?Linzer
I think there need to be parentheses around the column names. What is the error you are getting?Salesclerk
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"NPI","Entity Type Code","Replacement NPI","Employer Identification Number (EIN)' at line 5. Basically this is the line with the parenthesesLinzer
Check this; you might have a similar problem. And always use backticks for column names instead of double quotesSalesclerk
I wrote a Python script to change all the double quotes to single quotes, then I recopied the data back, and I got the same error. Any other ideas?Linzer
It's not single quotes; you need to use backticks.Salesclerk
AHHH, those are different. Hmm, well I think I'm getting somewhere. Now it's saying that the table doesn't exist. Shouldn't it create the table automatically? What do I do?Linzer
No, you need to create the table first. This command just transfers the content at a very high speedSalesclerk
Hmm, okay, I think I have an idea. I'm first going to make a table in phpMyAdmin using just the table header file, then try the import afterwards. Another error I'm running into is that some column names are over 64 chars. Damn. Thank you for your help. :)Linzer
mysql> load data local infile '/home/alhelal/HandicraftStore/Data/provider.data' into table provider fields terminated by '|' lines terminated by '\n' ignore 1 lines (name, code,address,mobile,email); ERROR 1148 (42000): The used command is not allowed with this MySQL versionCocainize
Worked for me just fine, thnx!Pyrophoric
ERROR 1290 (HY000): The MySQL server is running with the --secure-file-priv option so it cannot execute this statement - did not work for meAylmer
How do I display the warnings caused by load data? The command just shows Warnings: 1Puritanical

Try this:

mysql -uusername -ppassword --local-infile scrapping -e "LOAD DATA LOCAL INFILE 'CSVname.csv'  INTO TABLE table_name  FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"
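
Note that LOAD DATA LOCAL must be allowed on the server as well as the client (see the comment below about ERROR 3948); one way to check and, if you have the privileges, enable it on the server:

mysql> SHOW GLOBAL VARIABLES LIKE 'local_infile';
mysql> SET GLOBAL local_infile = 1;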
Manual answered 24/10, 2016 at 10:20 Comment(1)
This worked for me; however, I was getting the error ERROR 3948 (42000) at line 1: Loading local data is disabled; this must be enabled on both the client and server sides, and the resolution was at https://mcmap.net/q/47426/-error-loading-local-data-is-disabled-this-must-be-enabled-on-both-the-client-and-server-sides (added here so that other folks can go directly to that link)Swainson

You could do a

mysqlimport --columns="$(head -n 1 $yourfile)" --ignore-lines=1 dbname $yourfile

That is, if your file is comma-separated rather than semicolon-separated. Otherwise you might need to run it through sed first.

Thermodynamic answered 27/8, 2013 at 15:12 Comment(4)
does the table have to be created already with the headers? And what format is $yourfile, would *.csv work?Indefinable
Your import file needs the headers; head -n 1 $yourfile returns the first comma-separated row of your CSV file, and --ignore-lines=1 then skips that one row, since otherwise it would be inserted into your table. The default delimiter is tab (\t), so also add a --fields-terminated-by=',' clause to use a comma delimiter instead. Please read linux.die.net/man/1/mysqlimport and dev.mysql.com/doc/refman/5.7/en/load-data.htmlThermodynamic
Shouldn't this be mysqlimport --columns=$(head -n 1 FILE) --ignore-lines=1 dbname FILE, where FILE is the data file? Having --columns='head -n 1 $yourfile' produces a syntax error. Also you may have to add the --local option, since many MySQL servers are configured with the --secure-file-priv option by default.Deathblow
--columns='head -n 1 $yourfile' raises a syntax error; --columns=$(head -n 1 FILE) workedLaundrywoman

You can simply import with:

mysqlimport --ignore-lines=1 --lines-terminated-by='\n' --fields-terminated-by=',' --fields-enclosed-by='"' --verbose --local -uroot -proot db_name csv_import.csv

Note: the CSV file name (minus its extension) and the table name must match, because mysqlimport derives the table name from the file name.
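
If your export's file name doesn't match the table, one workaround (a sketch for a Unix-like shell; the path is hypothetical) is to symlink or copy the file under the table's name first:

# expose the data under the table's name so mysqlimport loads it into the table csv_import
ln -s /path/to/export.csv csv_import.csv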

Ger answered 14/9, 2018 at 7:56 Comment(0)

For importing a csv with a header row using mysqlimport, just add

--ignore-lines=N

(ignores the first N lines of the data file)

This option is described in the page you've linked.
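
mysqlimport can't promote the header row to column names automatically (see the comments below), but you can pass the column list yourself with --columns; a sketch with hypothetical database, table, and column names:

mysqlimport --local --ignore-lines=1 --fields-terminated-by=',' --columns='id,name,email' dbname dbtable.csv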

Eucaine answered 7/7, 2011 at 4:24 Comment(2)
I'm not trying to ignore the first line. I want to use the first line as column headers.Linzer
You can't do this with mysqlimport, but you can add the option --columns=column_list to give the command the order of your csv fields for your table.Eucaine

Another option is to use the csvsql command from the csvkit library.
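
csvkit is a Python package; if you don't already have it, it is typically installed with pip:

pip install csvkit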

Example usage, directly on the command line:

csvsql --db mysql:///test --tables yourtable --insert yourfile.csv

This can be executed directly on the command line, or built into a Python or shell script for automation if you need to do this for a number of files.

csvsql can create database tables on the fly based on the structure of your csv, so it is a low-code way of getting the first row of your csv automatically turned into the MySQL column names.

Full documentation and further examples here: https://csvkit.readthedocs.io/en/1.0.3/scripts/csvsql.html

Dichlorodiphenyltrichloroethane answered 20/10, 2018 at 0:8 Comment(0)

I know this says command line, but here's a quick tidbit that might work: if you've got MySQL Workbench and the csv isn't too large, you can simply

  • SELECT * FROM table
  • Copy entire CSV
  • Paste csv into the query results section of Workbench
  • Hope for the best

I say hope for the best because this is MySQL Workbench. You never know when it's going to explode.


If you want to do this on a remote server, you would do:

mysql -h<server|ip> -u<username> -p --local-infile bark -e "LOAD DATA LOCAL INFILE '<filename.csv>'  INTO TABLE <table>  FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"

Note: I didn't put a password after -p, as putting one on the command line is considered bad practice.

Tuchman answered 22/5, 2020 at 12:31 Comment(0)

Most answers are missing an important point: if your csv file was exported from Microsoft Excel on Windows and you import it in a Linux environment, you will get unexpected results.

The correct syntax would be:

load data local infile 'file.csv' into table table_name fields terminated by ',' enclosed by '"' lines terminated by '\r\n'

The difference here is '\r\n' as opposed to simply '\n', because Windows line endings are CRLF rather than LF.
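
If you're unsure which line endings a file has, the file utility found on most Unix-like systems reports them; for a Windows-style file the output looks something like:

$ file file.csv
file.csv: ASCII text, with CRLF line terminators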

Aboral answered 23/11, 2020 at 22:22 Comment(0)
