Import CSV file into Sqlite3 Database in command-line or via Batch File

I would like to ask whether there is any way to import a CSV file containing the output of my SELECT statements in SQLite3 into a new database. Here is the code I have so far:

sqlite3.exe -csv logsql.sqlite "SELECT local_port AS port, COUNT(local_port) AS hitcount FROM connections  WHERE connection_type = 'accept' GROUP BY local_port ORDER BY hitcount DESC;" > output.csv
sqlite3.exe -csv test.sqlite "CREATE TABLE test (name varchar(255) not null, blah varchar(255) not null);" .import ./output.csv test

As you can see, the first command dumps out the query results.

In the second command I'm creating a new database and attempting to import the CSV file into the table "test".

Thanks in advance for any help! :D

Banish answered 8/8, 2011 at 3:36 Comment(2)
The big question I need to ask here is: why do you need to do this? Is it a one-time operation, or is this part of a program?Moneybags
Yes, this is part of a program. I intend to run the batch script to generate my own table, to reduce the processing time for the graphical presentation in my PHP program.Banish

I'd recommend doing your import from a flat file, which will create your schema and then perform the actual import:

Like so:

sqlite3.exe test.sqlite < import.sql

Where the content of import.sql is:

CREATE TABLE test (name varchar(255) not null, blah varchar(255) not null);
.separator ,
.import output.csv test

One other approach you might not have considered is the ATTACH command. You can attach a new database, create the table in it, and import into it directly, so you skip the extra step of exporting to CSV and then reparsing. The table can come from a CREATE TABLE ... AS SELECT ... query or just a plain INSERT.

So basically, you'd run (from your PHP Page):

"ATTACH 'c:\directory\to\database\test.db' as TESTDB;"
"CREATE TABLE TESTDB.test AS SELECT local_port AS port, COUNT(local_port) AS hitcount FROM connections  WHERE connection_type = 'accept' GROUP BY local_port ORDER BY hitcount DESC;"

Or:

"ATTACH 'c:\directory\to\database\test.db' as TESTDB;"
"CREATE TABLE TESTDB.test (name varchar(255) not null, blah varchar(255) not null);"
"INSERT INTO TESTDB.test SELECT local_port AS port, COUNT(local_port) AS hitcount FROM connections WHERE connection_type = 'accept' GROUP BY local_port ORDER BY hitcount DESC;"
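The same ATTACH approach can be sketched with Python's built-in sqlite3 module; the `connections` sample data below is invented for illustration, and `:memory:` stands in for real file paths such as c:\directory\to\database\test.db:

```python
import sqlite3

# Source database with a sample `connections` table
# (an in-memory stand-in for logsql.sqlite).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE connections (local_port INTEGER, connection_type TEXT)")
src.executemany(
    "INSERT INTO connections VALUES (?, ?)",
    [(80, "accept"), (80, "accept"), (443, "accept"), (22, "refuse")],
)

# Attach the destination database; with a file path, the file is
# created if it does not yet exist.
src.execute("ATTACH DATABASE ':memory:' AS TESTDB")

# Create and fill the summary table in one statement.
src.execute("""
    CREATE TABLE TESTDB.test AS
    SELECT local_port AS port, COUNT(local_port) AS hitcount
    FROM connections
    WHERE connection_type = 'accept'
    GROUP BY local_port
    ORDER BY hitcount DESC
""")
src.commit()

rows = src.execute("SELECT * FROM TESTDB.test ORDER BY hitcount DESC").fetchall()
print(rows)  # [(80, 2), (443, 1)]
```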
Moneybags answered 8/8, 2011 at 3:41 Comment(8)
I'm sorry, I don't quite understand your first paragraph. Since I'm using a batch script, am I unable to use the dot commands?Banish
Thank you very much for the prompt response. It came back with an error saying Error: expected 2 columns of data but found 1. Is it because my first command generates only one column? I opened the CSV file and saw two columns filled.Banish
Sorry, I forgot that .import and -csv create different separators. Please see my edits.Moneybags
I can't seem to run a script from a file against a sqlite3 database in this way - it complains Error: incomplete SQL: .separator , .import output.csv itemsEspagnole
@MichaelForrest items is not an SQL command. If you need further help, you'd better start a new question.Moneybags
items is the equivalent of 'test' in the command discussed (as in .import output.csv test) - it's an argument to .import. I think this is valid commentary for this question.Espagnole
Perhaps I'm just missing a trailing line break.Espagnole
No it was because everything in the file was indented when I generated the script from Ruby! Ok never mind.Espagnole

A single bash command, using a here-document, that worked for me:

sqlite3 inventory.sqlite.db << EOF
delete from audit;
.separator "\t"
.import audit-sorted-uniq.tsv audit
EOF

Hope that helps.

Bendix answered 18/1, 2018 at 8:35 Comment(0)

The simplest way I've found is to make sure that in the CSV file the first row contains the column names and the values start from row 2.

sqlite3 database.db ".import --csv /user/name/work/somedata.csv tab1"
  • database.db is the database in which the table will be created.
  • The schema is created using the values of the first row as the column names.
  • /user/name/... is the path to the CSV file (Linux file system).
  • tab1 is the name of the table being created.
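If the sqlite3 CLI isn't available, the same first-row-as-column-names behaviour can be emulated with Python's standard library; the file name and data below are invented stand-ins:

```python
import csv
import io
import sqlite3

# Inline stand-in for /user/name/work/somedata.csv; values are invented.
data = io.StringIO("port,hitcount\n80,10\n443,7\n")

reader = csv.reader(data)
header = next(reader)                         # row 1 supplies the column names
cols = ", ".join('"%s"' % c for c in header)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tab1 (%s)" % cols)
db.executemany(
    "INSERT INTO tab1 VALUES (%s)" % ", ".join("?" * len(header)),
    reader,
)
rows = db.execute("SELECT * FROM tab1").fetchall()
print(rows)  # [('80', '10'), ('443', '7')]
```

Note that, like the CLI's .import, this stores every value as text unless you declare column types.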
Kmeson answered 9/4, 2023 at 20:53 Comment(0)

For large CSV files it may be more efficient to use the sqlite3 shell's .import command rather than parse the file in Python and insert rows with the sqlite3 module. It can be done via os.system (on Linux, Unix, macOS, or Cygwin on Windows):

cmd = '(echo .separator ,; echo .import ' + csv_file + ' ' + table + ')'
cmd += '| sqlite3 ' + db_name
os.system(cmd)
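One caveat: os.system with plain string concatenation breaks if a file name contains spaces or shell metacharacters. A hedged sketch of the same pipeline with shlex.quote applied to each value (the names here are illustrative):

```python
import shlex

# Illustrative values; in practice these come from the surrounding program.
csv_file = "my data.csv"
table = "audit"
db_name = "inventory.sqlite.db"

# Same pipeline as above, but with every value safely quoted for the shell.
cmd = "(echo .separator ,; echo .import %s %s) | sqlite3 %s" % (
    shlex.quote(csv_file), shlex.quote(table), shlex.quote(db_name)
)
print(cmd)
# (echo .separator ,; echo .import 'my data.csv' audit) | sqlite3 inventory.sqlite.db
```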
Ashcan answered 18/4, 2014 at 14:43 Comment(0)

You can do a lot with the SQLite command shell and command-line switches... ... but I'd strongly urge you to find a SQLite-aware scripting language that'll work on Windows and that you feel comfortable with.

Perl and Python are two excellent choices. Both support SQLite, and both are freely available for Windows.

And both can handle this - and many other kinds of tasks.

Burleson answered 8/8, 2011 at 3:46 Comment(0)

First you need a proper CSV file with comma separators and a header line. I used the dump tool from PhpStorm for this.

Put all the CSV files into one folder.

Open cmd.exe.

Create the import statements, where %~na is the file name without its extension, used as the table name:

for /R %a in (*.csv) do @echo .import %~na.csv %~na >> import.txt

Start sqlite3 with a dummy database:

sqlite3 test.sqlite

Set the import mode to CSV:

.mode csv

Run the commands from the file:

.read import.txt

List all tables:

.tables

Quit sqlite3:

.quit
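The same loop can be sketched in Python when cmd.exe isn't at hand; this hypothetical helper creates one table per CSV file, named after the file, with the header line supplying the column names (mirroring .mode csv plus .import):

```python
import csv
import sqlite3
from pathlib import Path


def import_folder(folder, db_path=":memory:"):
    """Create one table per CSV file, named after the file (minus extension)."""
    db = sqlite3.connect(db_path)
    for path in Path(folder).glob("*.csv"):
        with open(path, newline="") as fh:
            reader = csv.reader(fh)
            header = next(reader)  # the header line supplies the column names
            cols = ", ".join('"%s"' % c for c in header)
            marks = ", ".join("?" * len(header))
            db.execute('CREATE TABLE "%s" (%s)' % (path.stem, cols))
            db.executemany(
                'INSERT INTO "%s" VALUES (%s)' % (path.stem, marks), reader
            )
    db.commit()
    return db
```

Querying sqlite_master afterwards plays the role of .tables for checking what was created.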
Timisoara answered 18/1, 2020 at 11:52 Comment(0)

I needed to import many CSV files, so I wrote the following Python script, which creates and loads SQLite tables from CSV files, using the first line of each CSV as the field names for the table:

#!/usr/bin/env python
import sqlite3
from csv import DictReader

class SQLiteDB():
    def __init__(self, dbname=':memory:'):
        self.db = sqlite3.connect(dbname)

    def importFromCSV(self, csvfilename, tablename, separator=","):
        with open(csvfilename, 'r') as fh:
            dr = DictReader(fh, delimiter=separator)
            # Build the column list and the "?,?,..." placeholder string
            # from the CSV header row.
            fieldlist = ",".join(dr.fieldnames)
            ph = ",".join("?" * len(dr.fieldnames))
            self.db.execute("DROP TABLE IF EXISTS %s" % tablename)
            self.db.execute("CREATE TABLE %s(%s)" % (tablename, fieldlist))
            ins = "insert into %s (%s) values (%s)" % (tablename, fieldlist, ph)
            for line in dr:
                self.db.execute(ins, [line[k] for k in dr.fieldnames])
        self.db.commit()

if __name__ == '__main__':
    db=SQLiteDB("mydatabase.sqlite")
    db.importFromCSV("mydata.csv", "mytable")

For importing a large amount of data, you should implement transactions.
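As a sketch of that point: the per-row execute loop above can be replaced with a single executemany call, so the whole load happens inside one transaction (the table and sample data below are invented stand-ins for mydatabase.sqlite and mydata.csv):

```python
import sqlite3
from csv import DictReader
from io import StringIO

# In-memory stand-ins for mydatabase.sqlite and mydata.csv.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE mytable (name, blah)")

dr = DictReader(StringIO("name,blah\nalice,1\nbob,2\n"))

with db:  # the connection context manager wraps the load in one transaction
    db.executemany(
        "insert into mytable (name, blah) values (?, ?)",
        ([row[k] for k in dr.fieldnames] for row in dr),
    )

count = db.execute("SELECT COUNT(*) FROM mytable").fetchone()[0]
print(count)  # 2
```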

hth

Pater answered 23/8, 2013 at 19:39 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.