I am creating a file of a specified size - I don't care what data is in it, although random would be nice. Currently I am doing this:
var sizeInMB = 3; // Up to many GB
using (FileStream stream = new FileStream(fileName, FileMode.Create))
{
    using (BinaryWriter writer = new BinaryWriter(stream))
    {
        while (writer.BaseStream.Length <= sizeInMB * 1000000)
        {
            writer.Write("a"); // This could be random. Also, larger strings obviously improve performance.
        }
        writer.Close(); // Redundant: the using block already disposes the writer.
    }
}
This isn't efficient, or even the right way to go about it. Are there any higher-performance solutions?
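For context on why this is slow: BinaryWriter.Write(string) writes a length-prefixed string, so each iteration of the loop above appends only two bytes. Here is a rough sketch of the same loop writing a larger random block per iteration; the 64 KB block size, the dummy.bin path and the variable names are my own choices, not from the question:

using System;
using System.IO;

var fileName = "dummy.bin";                  // hypothetical path, stands in for the question's fileName
var sizeInMB = 3;
var rng = new Random();
var block = new byte[64 * 1024];             // one 64 KB block, refilled and reused each iteration
long targetBytes = (long)sizeInMB * 1000000; // same size calculation as the question

using (var stream = new FileStream(fileName, FileMode.Create))
using (var writer = new BinaryWriter(stream))
{
    while (writer.BaseStream.Length < targetBytes) // may overshoot by up to one block
    {
        rng.NextBytes(block);  // random contents, as requested
        writer.Write(block);   // Write(byte[]) emits the raw bytes, no length prefix
    }
}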
Thanks for all the answers.
Edit
Ran some tests on the following methods for a 2 GB file (times in ms):
Method 1: Jon Skeet
byte[] data = new byte[sizeInMB * 1024 * 1024];
Random rng = new Random();
rng.NextBytes(data);
File.WriteAllBytes(fileName, data);
N/A - OutOfMemoryException for a 2 GB file (the whole file has to fit in a single in-memory array)
Method 2: Jon Skeet
byte[] data = new byte[8192];
Random rng = new Random();
using (FileStream stream = File.OpenWrite(fileName))
{
    for (int i = 0; i < sizeInMB * 128; i++) // 128 writes of 8 KB per MB
    {
        rng.NextBytes(data);
        stream.Write(data, 0, data.Length);
    }
}
@1 KB buffer - 45,868, 23,283, 23,346
@8 KB buffer - 30,426, 22,936, 22,936
@128 KB buffer - 24,877, 20,585, 20,716
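For reference, here is a minimal sketch of how Method 2 can be parameterised by buffer size, which is what the timings above vary; the WriteRandomFile helper and its signature are my own, not from the original answer:

using System;
using System.IO;

static class RandomFileHelper // hypothetical helper, not from the original answer
{
    public static void WriteRandomFile(string fileName, int sizeInMB, int bufferSize)
    {
        byte[] data = new byte[bufferSize];
        Random rng = new Random();
        long remaining = (long)sizeInMB * 1024 * 1024; // total bytes, kept as long for multi-GB files

        using (FileStream stream = File.OpenWrite(fileName))
        {
            while (remaining > 0)
            {
                rng.NextBytes(data); // fresh random bytes for each chunk
                int chunk = (int)Math.Min(remaining, data.Length);
                stream.Write(data, 0, chunk);
                remaining -= chunk;
            }
        }
    }
}

Usage for the 128 KB row above would be something like RandomFileHelper.WriteRandomFile(fileName, 2048, 128 * 1024).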
Method 3: Hans Passant (super fast, but the data isn't random)
using (var fs = new FileStream(fileName, FileMode.Create, FileAccess.Write, FileShare.None))
{
    fs.SetLength((long)sizeInMB * 1024 * 1024); // long arithmetic avoids Int32 overflow at 2 GB and above
}
257, 287, 3, 3, 2, 3 etc.
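Method 3 is fast because SetLength just asks the file system to extend the file; no random data is written, and the contents read back as zeros (at least on NTFS). A minimal usage sketch, where the dummy.bin path and the length check at the end are my own additions:

using System;
using System.IO;

var fileName = "dummy.bin";             // hypothetical path
long sizeInBytes = 2048L * 1024 * 1024; // 2 GB, kept as long to avoid Int32 overflow

using (var fs = new FileStream(fileName, FileMode.Create, FileAccess.Write, FileShare.None))
{
    fs.SetLength(sizeInBytes);
}

Console.WriteLine(new FileInfo(fileName).Length); // prints 2147483648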