Check If File Is In Use By Other Instances of Executable Run

Before I go into too much detail: my program is written in Visual Studio 2010 using C# / .NET 4.0.

I wrote a program that generates a separate log file for each run. The log file is named after the time, accurate to the millisecond (for example, 20130726103042375.log). The program also generates a master log file for the day if it does not already exist (for example, *20130726_Master.log*).

At the end of each run, I want to append the log file to the master log file. Is there a way to check whether the append will succeed, and retry after sleeping for a second or so if it won't?

Basically, I have 1 executable, and multiple users (let's say there are 5 users).

All 5 users will access and run this executable at the same time. Since it's nearly impossible for all users to start at the exact same time (down to the millisecond), there is no problem generating the individual log files.

However, the issue comes in when I attempt to merge those log files into the master log file. Though it is unlikely, I think the program will crash if multiple users are appending to the same master log file at once.

The method I use is

File.AppendAllText(masterLogFile, File.ReadAllText(individualLogFile));

I have looked into the lock statement, but I don't think it works in my case, as there are multiple instances running rather than multiple threads in one instance.

Another way I looked into is try/catch, something like this:

try
{
    stream = File.Open(masterLogFile, FileMode.Open, FileAccess.ReadWrite, FileShare.None);
}
catch {}

But I don't think this solves the problem on its own, because the status of masterLogFile can change in the brief interval between the check and the write.

So my overall question is: Is there a way to append to masterLogFile if it's not in use, and retry after a short timeout if it is? Or if there is an alternative way to create the masterLogFile?

Thank you in advance, and sorry for the long message. I want to make sure I get my message across and explain what I've tried or looked into so we're not wasting anyone's time.

Please let me know if there's any more information I can provide to help you help me.

Auto answered 26/7, 2013 at 15:48 Comment(2)
this might be helpful: #51244 – Phidippides
this previous post might help as well: #10210764 – Degeneration

Your try/catch is the way to do things. If the call to File.Open succeeds, then you can write to the file. The idea is to keep the file open. I would suggest something like:

bool openSuccessful = false;
while (!openSuccessful)
{
    try
    {
        using (var writer = new StreamWriter(masterlog, true)) // append
        {
            // successfully opened file
            openSuccessful = true;
            try
            {
                foreach (var line in File.ReadLines(individualLogFile))
                {
                    writer.WriteLine(line);
                }
            }
            catch (IOException)
            {
                // something unexpected happened while writing.
                // mark the failure, handle the error, and exit the loop.
                openSuccessful = false;
                break;
            }
        }
    }
    catch (IOException)
    {
        // couldn't open the file.
        // If the exception is because it's opened in another process,
        // then delay and retry.
        // Otherwise exit.
        Thread.Sleep(1000);
    }
}
if (!openSuccessful)
{
    // notify of error
}

So if you fail to open the file, you sleep and try again.
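To retry only when the file really is held by another process, you can inspect the Win32 error code behind the IOException. This is a sketch rather than part of the answer above, and the method and parameter names are made up; note that Exception.HResult only became publicly readable in .NET 4.5, so on .NET 4.0 you would go through Marshal.GetHRForException as shown here:

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Threading;

static class LogAppender
{
    const int ERROR_SHARING_VIOLATION = 32; // file is open in another process
    const int ERROR_LOCK_VIOLATION = 33;    // a byte range in the file is locked

    public static bool TryAppend(string masterLogFile, string individualLogFile,
                                 int maxAttempts = 10)
    {
        for (int attempt = 0; attempt < maxAttempts; attempt++)
        {
            try
            {
                using (var writer = new StreamWriter(masterLogFile, true)) // append
                {
                    foreach (var line in File.ReadLines(individualLogFile))
                    {
                        writer.WriteLine(line);
                    }
                }
                return true;
            }
            catch (IOException ex)
            {
                // The low 16 bits of the HRESULT hold the Win32 error code.
                int code = Marshal.GetHRForException(ex) & 0xFFFF;
                if (code != ERROR_SHARING_VIOLATION && code != ERROR_LOCK_VIOLATION)
                {
                    throw; // missing directory, bad path, etc. -- don't retry
                }
                Thread.Sleep(1000); // another instance has the file; wait and retry
            }
        }
        return false; // gave up after maxAttempts
    }
}
```

Capping the retries also avoids the infinite loop you would otherwise get if the file stays locked.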

See my blog post, *File.Exists is only a snapshot*, for a little more detail.
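As an alternative to retrying, the instances could serialize their appends with a named system Mutex, which (unlike the C# lock statement mentioned in the question) is visible across processes. This is only a sketch, not something from the answer above: the mutex name and method name are made up, and the name just has to be the same string in every instance.

```csharp
using System;
using System.IO;
using System.Threading;

static class MasterLog
{
    public static void AppendExclusive(string masterLogFile, string individualLogFile)
    {
        // The "Global\" prefix makes the mutex machine-wide; the rest of
        // the name is arbitrary, as long as every instance uses the same string.
        using (var mutex = new Mutex(false, @"Global\MyApp_MasterLogMutex"))
        {
            mutex.WaitOne(); // block until no other instance holds the mutex
            try
            {
                File.AppendAllText(masterLogFile, File.ReadAllText(individualLogFile));
            }
            finally
            {
                mutex.ReleaseMutex();
            }
        }
    }
}
```

With this approach only one process touches the master log at a time, so the file open itself should never fail with a sharing violation. (In production code you would also want to handle AbandonedMutexException, which WaitOne throws if another instance died while holding the mutex.)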

Pictish answered 26/7, 2013 at 16:0 Comment(6)
If I put the whole try/catch inside a while loop, break when I reach the end of the outer try, and sleep in the outer catch, will I get the expected result? I put the method in my code and it's not giving a compile error, but like I said, this is rarely going to happen; I'm just afraid it breaks while I'm not aware. – Auto
@sora0419: See my updated code example. The key will be getting the exception handling correct. – Pictish
This solution is a poor design. At least put in the specific exception he should be looking for when it comes to file locking. The file may not even exist, or access may be denied, etc. BUT this code will keep you on a never-ending, blind, inefficient rollercoaster. – Dinodinoflagellate
@user1132959, see the comment in the exception handler. I specifically recommend checking the exception type and acting accordingly. – Pictish
@JimMischel Please specify those exceptions. The answer is incomplete. – Dinodinoflagellate
@user1132959, the documentation describes which exceptions the constructor is expected to throw and under what circumstances. It seems redundant for me to list them here. – Pictish

I would do something along the lines of this, as I think it incurs the least overhead. Try/catch is going to generate a stack trace (which could take a whole second) if an exception is thrown. There has to be a better way to do this atomically; if I find one I'll post it.

Dinodinoflagellate answered 26/7, 2013 at 16:17 Comment(4)
First, "a whole second" is no time at all in this context. He's reading and writing large files. The other process is likely to have that file open for much longer than "a whole second", so worrying about that time is ridiculous. Second, the technique in the link is idiotic because he's just replacing a simple and effective exception handler with convoluted code that does an end-run around the .NET runtime and doesn't provide any particular benefit in the process. – Pictish
@JimMischel "A whole second" is a lot of time for something that is supposed to take a few milliseconds. Your try/catch would never scale. If 1,000 processes tried to access it, your performance would be shot. So worrying about that time is not "ridiculous", it's just not being ignorant. Just surrounding it with an exception handler is blind and "idiotic". The whole purpose is to wait for exclusive access, so why not do it efficiently and correctly? Checking for an error before you let it happen is much better than just catching the error and ignoring it (which takes a second). – Dinodinoflagellate
If you have 1,000 or even 100 processes competing for an exclusive lock on a file, you have much bigger problems. No solution that depends on exclusive access will scale. And even the linked solution encounters the error. It just handles the error differently. – Pictish
Either way, the link has a better, more maintainable design and works more efficiently. In this sense, it will scale "better", but of course not perfectly. – Dinodinoflagellate
