This is essentially a more constrained version of this question.
Suppose we have a very large text file, containing a large number of lines.
We need to choose a line at random from the file, with uniform probability, but there are constraints:
- Because this is a soft realtime application, we cannot iterate over the entire file. The choice should take a constant-ish amount of time.
- Because of memory constraints, the file cannot be cached.
- Because the file is permitted to change at runtime, the length of the file cannot be assumed to be a constant.
My first thought is to use an lstat() call to get the total file size in bytes. fseek() can then be used to directly access a random byte offset, getting something like O(1) access into a random part of the file.
The problem is that we can't then just read to the next newline and call it a day, because that would produce a distribution biased toward long lines: a random byte offset is ten times as likely to land inside a 100-byte line as inside a 10-byte line.
My first stab at solving this issue is to read the next n lines (wrapping back to the file's beginning if required), and then choose a line with uniform probability from this smaller set. It is safe to assume the file's contents are randomly ordered, so this sub-sample should be uniform with respect to line length, and, since its starting point was selected uniformly from all possible byte offsets, it should represent a uniform choice from the file as a whole. So, in pseudo-C, our algorithm looks something like:
struct stat filestat;
lstat(filepath, &filestat);
fseek(file, (long)(filestat.st_size * drand48()), SEEK_SET); // st_size is the file size in bytes
char sample[n][BUFSIZ];
for (int i = 0; i < n; i++)
    fgets(sample[i], BUFSIZ, file); // plus some stuff to deal with file wrap around...
return sample[(int)(n * drand48())];
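The wrap-around alluded to in that comment might look something like this (a sketch only, reusing file, sample, and n from above; it assumes the file is non-empty and that every line fits in BUFSIZ):

// Sketch of the elided wrap-around: skip the partial line we landed in,
// then collect n full lines, rewinding to the top if EOF arrives first.
fgets(sample[0], BUFSIZ, file); // discard the tail of the line containing the seek target
for (int i = 0; i < n; ) {
    if (fgets(sample[i], BUFSIZ, file))
        i++;            // collected one full line
    else if (feof(file))
        rewind(file);   // hit end of file: wrap around and keep reading
    else
        break;          // read error: bail out
}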
This doesn't seem like an especially elegant solution, and I'm not completely confident it will be uniform, so I'm wondering if there's a better way to do it. Any thoughts?
EDIT: On further consideration, I'm now pretty sure that my method is not uniform, since the starting point is more likely to fall inside a longer line, and the resulting sample thus isn't uniform. Tricky!
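For the record, one standard way to cancel exactly this kind of length bias is rejection sampling: land on a random byte, back up to the start of the line containing it, and accept that line with probability inversely proportional to its length, retrying otherwise. A line of length L is hit with probability L/filesize and accepted with probability 1/L, so every line wins a given trial with probability 1/filesize; the expected number of retries is about the average line length, and each trial's backward scan costs O(line length), so this stays constant-ish when line lengths are bounded. A rough sketch (random_line is a hypothetical helper; it assumes lines fit in BUFSIZ and that the file ends with a newline):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/stat.h>

// Sketch: rejection sampling. A random byte lands in a line with
// probability proportional to that line's length; accepting the line
// with probability 1/length cancels the bias exactly.
char *random_line(const char *filepath, char *buf)
{
    struct stat st;
    if (lstat(filepath, &st) != 0 || st.st_size == 0)
        return NULL;
    FILE *f = fopen(filepath, "r");
    if (!f)
        return NULL;
    for (;;) {
        long pos = (long)(st.st_size * drand48());
        // walk back to the start of the line containing pos
        while (pos > 0) {
            fseek(f, pos - 1, SEEK_SET);
            if (fgetc(f) == '\n')
                break;
            pos--;
        }
        fseek(f, pos, SEEK_SET);
        if (!fgets(buf, BUFSIZ, f))
            continue;
        size_t len = strlen(buf);           // includes the trailing '\n'
        if (len > 0 && drand48() < 1.0 / len) {
            fclose(f);
            return buf;                     // accepted: uniform over lines
        }
    }
}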
Use mmap to map the file into memory. This will save you memory and avoid swapping; non-dirty unused pages will simply be dropped, since they are file-backed. – Runoff
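A sketch of what that suggestion might look like (map_file is a hypothetical helper, with minimal error handling):

#include <fcntl.h>
#include <stddef.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

// Sketch of the mmap suggestion: map the whole file read-only, so the
// kernel pages it in on demand and can drop clean pages under memory
// pressure instead of the program caching the file itself.
static const char *map_file(const char *path, size_t *len_out)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return NULL;
    struct stat st;
    if (fstat(fd, &st) != 0) {
        close(fd);
        return NULL;
    }
    const char *data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd); // the mapping remains valid after the descriptor is closed
    if (data == MAP_FAILED)
        return NULL;
    *len_out = (size_t)st.st_size;
    return data; // pick a random offset into data[] and scan for newlines in memory
}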