Since I see srand(time(NULL)) so prevalent (even on this site), should it be discouraged?
It depends on how you want to use the output from your generator (in this case, the output of rand()).
If you only need a uniform distribution for single runs of your program, then srand(time(NULL)) is fine. This is acceptable in a simulation where you just need uniformly distributed numbers quickly.
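For that single-run case, the usual pattern is a one-time seed at startup; a minimal sketch:

    #include <cstdlib>   // std::srand, std::rand
    #include <ctime>     // std::time
    #include <iostream>

    int main()
    {
        // Seed once at startup; good enough for a single casual run.
        std::srand(static_cast<unsigned>(std::time(nullptr)));

        for (int i = 0; i < 5; ++i)
            std::cout << std::rand() << ' ';
        std::cout << '\n';
    }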
If you submit a batch job so that multiple instances of your program run at once (and are effectively started at the same time), then srand(time(NULL)) will probably result in two or more instances producing the same random stream.
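You can see the collision with a small test program like the sketch below: launch two copies of it within the same second, and both print the same seed and the same draws.

    #include <cstdlib>
    #include <ctime>
    #include <iostream>

    int main()
    {
        std::time_t now = std::time(nullptr);       // 1-second resolution
        std::srand(static_cast<unsigned>(now));

        std::cout << "seed: " << now << "  draws:";
        for (int i = 0; i < 3; ++i)
            std::cout << ' ' << std::rand();
        std::cout << '\n';
    }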
If you need secure output, then you should not use srand(time(NULL)) at all, because rand() is often a Linear Congruential Generator (LCG), and LCG output is predictable. Joan Boyar taught us how to break them years ago. See Inferring sequences produced by a linear congruential generator missing low-order bits.
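If you genuinely need unpredictable values, skip rand() entirely and pull bytes from the operating system's CSPRNG. A minimal sketch, assuming a POSIX system that exposes /dev/urandom:

    #include <cstdint>
    #include <fstream>
    #include <iostream>

    int main()
    {
        std::uint32_t value = 0;

        // /dev/urandom is the kernel CSPRNG on Linux and most POSIX systems.
        std::ifstream urandom("/dev/urandom", std::ios::binary);
        if (!urandom.read(reinterpret_cast<char*>(&value), sizeof value)) {
            std::cerr << "could not read /dev/urandom\n";
            return 1;
        }

        std::cout << value << '\n';
    }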
As for the problem with time_t, just fold it to fit the argument expected by srand if time_t is too large. You might even fold in the process ID (PID) so that batch simulation jobs work as intended.
The <random> header, with its new random number generation capabilities, improves the situation substantially. – Torquay
the <random> header – Clevey
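For reference, the <random> approach mentioned in the comments looks roughly like this (std::random_device's quality is implementation-defined, but it is the usual non-deterministic seed source):

    #include <random>
    #include <iostream>

    int main()
    {
        std::random_device rd;                        // non-deterministic seed source
        std::mt19937 gen(rd());                       // Mersenne Twister engine
        std::uniform_int_distribution<int> dist(1, 6);

        for (int i = 0; i < 5; ++i)
            std::cout << dist(gen) << ' ';
        std::cout << '\n';
    }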