Code:
static void MultipleFilesToSingleFile(string dirPath, string filePattern, string destFile)
{
    string[] fileAry = Directory.GetFiles(dirPath, filePattern);
    Console.WriteLine("Total File Count : " + fileAry.Length);

    using (TextWriter tw = new StreamWriter(destFile, true))
    {
        foreach (string filePath in fileAry)
        {
            using (TextReader tr = new StreamReader(filePath))
            {
                tw.WriteLine(tr.ReadToEnd());
                tr.Close();
                tr.Dispose();
            }
            Console.WriteLine("File Processed : " + filePath);
        }

        tw.Close();
        tw.Dispose();
    }
}
I need to optimize this, as it's extremely slow: it takes about 3 minutes for 45 XML files averaging 40-50 MB each.

Please note: 45 files of an average 45 MB is just one example; it can be n files of size m, where n is in the thousands and m can average 128 KB. In short, it can vary.
Could you please provide any views on optimization?
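One direction that usually helps (a sketch, assuming .NET 4+ for Stream.CopyTo and that the files can be concatenated byte-for-byte without re-encoding): ReadToEnd decodes each entire file into a string, so a 50 MB file briefly occupies roughly twice that as UTF-16 before becoming garbage; copying raw bytes in chunks avoids those allocations entirely.

static void MultipleFilesToSingleFileByBytes(string dirPath, string filePattern, string destFile)
{
    string[] fileAry = Directory.GetFiles(dirPath, filePattern);
    Console.WriteLine("Total File Count : " + fileAry.Length);

    // The original WriteLine appended a newline after each file;
    // keep that behavior explicitly since CopyTo does not add one.
    byte[] newline = Encoding.UTF8.GetBytes(Environment.NewLine);

    using (var output = new FileStream(destFile, FileMode.Append, FileAccess.Write))
    {
        foreach (string filePath in fileAry)
        {
            using (var input = File.OpenRead(filePath))
            {
                // Copies in fixed-size chunks; never materializes
                // the whole file as a string the way ReadToEnd does.
                input.CopyTo(output);
            }
            output.Write(newline, 0, newline.Length);
            Console.WriteLine("File Processed : " + filePath);
        }
    }
}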
Dispose is superfluous, as the objects you're disposing are already in a using block (which will take care of Dispose for you). – Parsayen
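For context on that comment: a using statement compiles down to a try/finally that calls Dispose, and Dispose on a reader also closes it, so the explicit Close/Dispose pairs in the question are no-ops by the time they run. Roughly what the compiler emits for the inner block:

// Equivalent expansion of: using (TextReader tr = new StreamReader(filePath)) { ... }
TextReader tr = new StreamReader(filePath);
try
{
    tw.WriteLine(tr.ReadToEnd());
}
finally
{
    if (tr != null)
        tr.Dispose(); // also closes the underlying stream
}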
The more data you're moving, the longer it's going to take for just the disk I/O. On top of that, you have the actual overhead of object creation, memory allocation, GC, and so forth. – Tarpeia
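If per-call overhead is the suspect, one knob to try (an assumption to verify by measuring, not a guaranteed win) is a larger buffer plus a sequential-access hint, so the OS can read ahead and each read moves more data. A hypothetical helper, as a drop-in for the inner copy in the sketch above:

static void AppendFileTo(string filePath, Stream output)
{
    const int BufferSize = 1 << 20; // 1 MB; tune by measurement

    // FileOptions.SequentialScan tells the OS we read front to back,
    // which can improve read-ahead behavior for large files.
    using (var input = new FileStream(filePath, FileMode.Open, FileAccess.Read,
                                      FileShare.Read, BufferSize,
                                      FileOptions.SequentialScan))
    {
        input.CopyTo(output, BufferSize); // chunk size for the copy loop
    }
}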