Will SQLite performance degrade if the database size is greater than 2 gigabytes?
Last year when I checked about SQLite on their web site, the recommended SQLite database size was 2 gigabytes. But now, I could not find that recommendation again.

So has anyone tried to work with an SQLite database that is bigger than 2 gigabytes using the latest version? How well did SQLite perform?

P.S.: I would like to make a mobile application that requires a big database (for example, storing Wikipedia articles) and works locally.

Swanherd answered 22/1, 2013 at 4:32 Comment(1)
Most apps use APIs, but I commend you for trying. I wonder what you did in the end: did you go with SQLite, or did you use an API? I want my app to work offline too. You also have to think about how you would get a 2 GB+ file onto someone's phone; you can't, really.Featheredge
There is no 2 GB limit.

SQLite database files have a maximum size of about 140 TB.

On a phone, the size of the storage (a few GB) will limit your database file size, while the memory size will limit how much data you can retrieve from a query. Furthermore, Android cursors have a limit of 1 MB for the results.


The database size will, by itself, not affect your performance. Your queries will be fast as long as they do not access more data than fits into the DB's page cache (2 MB by default).
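If the working set is larger than the default page cache, the cache can be enlarged per connection with `PRAGMA cache_size`. A minimal sketch using Python's built-in `sqlite3` module (the 8 MiB figure is just an arbitrary example; negative values mean KiB, positive values mean a number of pages):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Negative values set the cache size in KiB; positive values set a page count.
conn.execute("PRAGMA cache_size = -8192")  # roughly an 8 MiB page cache

# Reading the pragma back returns the value just set.
size = conn.execute("PRAGMA cache_size").fetchone()[0]
print(size)  # -8192

conn.close()
```

The setting applies only to the current connection; it is not persisted in the database file.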

Fellows answered 22/1, 2013 at 8:45 Comment(5)
Thanks for your useful answer. But what I want to know is how SQLite will perform depending on its database size.Swanherd
I’ve read other answers on StackOverflow with people successfully using much larger databases (even up to 100GB), so I’m pretty sure SQLite can handle 2GB just fine.Unprofessional
I am assuming this limit does not hold for the Python sqlite module, as long as you iterate one row at a time with the cursor.Intend
@Intend Yeah, it's unclear to me what "Android cursors have a limit of 1 MB for the results" means... does that mean each row can be up to 1 MB in size, since you can iterate the cursor one row at a time? And if not, how do you deal with larger data sets?Andalusia
As of release 3.33.0 (2020-08-14), the maximum size is 281 TB. sqlite.org/changes.html Extinguisher
Usually, the larger the database, the more data you have in it. The more data you have, the longer searches may take. They don't have to; it depends on the search.

As for inserts, they may take longer if you have many indexes on a table. Updating an index takes time, so expect insert speed to degrade as the amount of data grows.

Updates may also be slower: matching rows must be found first (a search), then values have to be changed (which may trigger index updates).
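To soften the index cost, one common pattern is to bulk-load rows inside a single transaction and create the secondary index only afterwards. A minimal sketch with Python's `sqlite3` module (the table, column names, and row count are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE tiles (id INTEGER PRIMARY KEY, x INTEGER, y INTEGER, data BLOB)"
)

rows = [(i, i % 100, i // 100, b"\x00" * 16) for i in range(10_000)]

# Bulk-insert inside one transaction, with no secondary indexes yet,
# so each INSERT does not pay the cost of updating an index.
with conn:
    conn.executemany("INSERT INTO tiles VALUES (?, ?, ?, ?)", rows)

# Build the index once, after loading; this is typically cheaper than
# maintaining it during millions of individual inserts.
conn.execute("CREATE INDEX idx_tiles_xy ON tiles (x, y)")

count = conn.execute("SELECT COUNT(*) FROM tiles").fetchone()[0]
print(count)  # 10000
```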

I am telling you this from experience: if you expect a lot of data in your database, consider splitting it into multiple databases. This works if your data is gathered daily and you can create a database for each day. It may make your search code more complex, but it will speed things up for limited searches.
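A sketch of the per-day split using Python's `sqlite3` and `ATTACH DATABASE` to query several daily files at once (the file naming scheme and schema here are hypothetical):

```python
import os
import sqlite3
import tempfile
from datetime import date

# Hypothetical layout: one database file per day, named by date.
base = tempfile.mkdtemp()

def db_path_for(day: date) -> str:
    return os.path.join(base, f"events-{day.isoformat()}.db")

# Create two daily databases with the same schema.
for day in (date(2013, 1, 21), date(2013, 1, 22)):
    c = sqlite3.connect(db_path_for(day))
    c.execute("CREATE TABLE events (ts TEXT, payload TEXT)")
    c.execute("INSERT INTO events VALUES (?, ?)", (day.isoformat(), "x"))
    c.commit()
    c.close()

# For a search spanning both days, ATTACH the older file and
# query both databases with a single statement.
conn = sqlite3.connect(db_path_for(date(2013, 1, 22)))
conn.execute("ATTACH DATABASE ? AS prev", (db_path_for(date(2013, 1, 21)),))
rows = conn.execute(
    "SELECT ts FROM main.events UNION ALL SELECT ts FROM prev.events"
).fetchall()
print(len(rows))  # 2
conn.close()
```

Searches limited to one day touch only one small file; cross-day searches pay the extra ATTACH and UNION complexity.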

Roughandready answered 23/1, 2013 at 8:31 Comment(5)
Although I agree with your statements, I think that if you're creating an app on a mobile phone, creating many files is probably not the best idea... It uses more disk space to have many separate files.Quito
@AlexisWilke The amount of space lost per file is negligible on most filesystems for files of reasonable size (say, 1 MB+).Roughandready
@Roughandready Hi, if I want to save image tiles where each tile is about 20 KB and the total size will be about 40 TB, how can I handle adding the data into an SQLite database? Do you have any idea how to increase insert performance? The total number of tiles is about 1,831,292,252.Diwan
@MajidHojati if you have no indexes, FKs, etc. then in this case insert performance should be capped solely by your disk performanceRoughandready
@Roughandready So I have to insert the data first and enable indexes on the tables later, right?Diwan

© 2022 - 2024 — McMap. All rights reserved.