(Note: there is an existing question called "SQLite3 and Multiprocessing", but both that question and its accepted answer are about multithreading, so this is not a duplicate.)
I'm implementing a multiprocess script in which each process writes some results to an SQLite table. My program keeps crashing with "database is locked"
(SQLite allows only one writer to modify the database at a time).
Here's an example of what I have:
import sqlite3
import sys
from multiprocessing import Pool

# Connection and cursor are created at module level, so every
# forked worker process ends up sharing them.
con = sqlite3.connect("results.db")  # placeholder filename
cur = con.cursor()

def scan(n):
    n = n + 1  # some calculation
    # parameterized query instead of string concatenation
    cur.execute("INSERT INTO hello (n) VALUES (?)", (n,))
    con.commit()
    return True

if __name__ == '__main__':
    pool = Pool(processes=int(sys.argv[1]))
    for status in pool.imap_unordered(scan, range(0, 9999)):
        if status:
            print "ok"
    pool.close()
I've tried declaring a lock in the main block and using it as a global in scan(), but that didn't stop the "database is locked" errors.
What is the proper way to make sure only one INSERT statement is issued at a time in a multiprocess Python script?
EDIT:
I'm running on a Debian-based Linux.