While what @Marc-Audet said is true, if you take a look at the code, you can see it is a really lousy way to clean up sessions.
The constructor calls the _sess_gc function every time the class is instantiated, which means on basically every request to your server if you have the session library autoloaded.
Then it generates a random number below 100 and checks whether it falls below a certain threshold (5 by default). If it does, it deletes any rows in the session table whose last_activity value is less than the current time minus your session expiration.
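Roughly, the relevant part of the CI 2.x session library looks like this (paraphrased from the library source, so check the exact code in your own version):

```php
function _sess_gc()
{
    if ($this->sess_use_database != TRUE)
    {
        return;
    }

    srand(time());
    if ((rand() % 100) < $this->gc_probability) // gc_probability defaults to 5
    {
        // delete everything older than (now - expiration)
        $expire = $this->now - $this->sess_expiration;

        $this->CI->db->where("last_activity < {$expire}");
        $this->CI->db->delete($this->sess_table_name);

        log_message('debug', 'Session garbage collection performed.');
    }
}
```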
While this works in most cases, it is technically possible (if the world is truly random) that the random number generator does not produce a number below 5 for a long stretch of requests, in which case your sessions will not be cleaned up.
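To put a rough number on it, here is a quick back-of-the-envelope check (not CI code) of how likely GC is to be skipped over a run of requests, assuming the default 5% probability:

```php
<?php
// With a 5% chance per request, how likely is it that GC never fires
// over N consecutive requests?
foreach (array(10, 100, 1000) as $n) {
    printf("P(no GC in %4d requests) = %.4f%%\n", $n, pow(0.95, $n) * 100);
}
// ~59.9% for 10 requests, ~0.59% for 100, effectively 0% for 1000.
```

So on a busy site it will almost certainly fire, but on a low-traffic site the gap between cleanups can get long, and there is never a guarantee.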
Also, if you have your session expiry set to a long time (if you set it to 0, CI sets it to 2 years), those rows are not going to get deleted anyway. And if your site is popular enough to get a decent number of visitors, your DBA will be pointing fingers at the session table some time soon :)
It works for most cases, but I would not call it a proper solution. Their session id regeneration really should have been built to remove the records belonging to the previous ids, and garbage collection really should not be left to a random number: in theory, the required number may not come up as often as you would like.
In our case, I have removed the session garbage collection from the session library and I take care of it manually once a day (with a cron job and a reasonable session expiration time). This reduces the number of unnecessary hits to the DB and does not leave a massive table behind. It is still a big table, but a lot smaller than it used to be.
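For reference, the daily cleanup does not need to be anything fancy. A minimal sketch, assuming the default ci_sessions table name; the DSN, credentials, script path and expiry below are made up, so match them to your own config:

```php
<?php
// clean_sessions.php -- standalone daily cleanup, run from cron, e.g.:
//   0 3 * * * /usr/bin/php /path/to/clean_sessions.php
// Table name and expiry are assumptions; keep them in sync with config.php.

$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'db_user', 'db_pass');

$expiration = 7200; // seconds; keep this >= sess_expiration in config.php

$stmt = $pdo->prepare('DELETE FROM ci_sessions WHERE last_activity < :cutoff');
$stmt->execute(array(':cutoff' => time() - $expiration));

printf("Removed %d stale session rows\n", $stmt->rowCount());
```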