
C# and the FileSystemWatcher

Stack Overflow Asked by Thorsten Schroeer on October 18, 2020

I have written a service in C# that should move backup files (*.bak and *.trn) from a database server to a dedicated backup server. This works quite well so far. The problem is that it tries to move single files twice, which of course fails. I have configured the FileSystemWatcher as follows:

try
{
    m_objWatcher = new FileSystemWatcher();
    m_objWatcher.Filter = m_strFilter;
    m_objWatcher.Path = m_strSourcepath.Substring(0, m_strSourcepath.Length - 1);
    m_objWatcher.IncludeSubdirectories = m_bolIncludeSubdirectories;
    m_objWatcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.LastAccess; // | NotifyFilters.CreationTime;
    m_objWatcher.Changed += new FileSystemEventHandler(objWatcher_OnCreated);
}
catch (Exception ex)
{
    m_objLogger.d(TAG, m_strWatchername + "InitFileWatcher(): " + ex.ToString());
}

Is it possible that the Watcher produces an event twice for the same file? If I set the filter to CreationTime only, it does not react at all.

How do I have to set the Watcher to fire an event only once per file?

Thanks in advance for your help

One Answer

The documentation states that common file system operations might raise more than one event. Check under the Events and Buffer Sizes heading.

Common file system operations might raise more than one event. For example, when a file is moved from one directory to another, several OnChanged and some OnCreated and OnDeleted events might be raised. Moving a file is a complex operation that consists of multiple simple operations, therefore raising multiple events. Likewise, some applications (for example, antivirus software) might cause additional file system events that are detected by FileSystemWatcher.

It also offers a few guidelines, including:

Keep your event handling code as short as possible.

To that end, you could use your FileSystemWatcher.Changed event to queue files for processing, then process them later. This is a quick example of what that might look like using an instance of System.Threading.Timer to process the queue.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;

public class ServiceClass
{
    public ServiceClass()
    {
        _processing = false;
        _fileQueue = new ConcurrentQueue<string>();
        _timer = new System.Threading.Timer(ProcessQueue);
        // Schedule the timer to run in 5 seconds, then again every 5 seconds.
        _timer.Change(5000, 5000);
    }

    private void objWatcher_OnChanged(object sender, FileSystemEventArgs e)
    {
        // Just queue the file to be processed later. If the same file is added multiple
        // times, we'll skip the duplicates when processing the files.
        _fileQueue.Enqueue(e.FullPath);
    }

    private void ProcessQueue(object state)
    {
        if (_processing)
        {
            return;
        }
        _processing = true;
        var failures = new HashSet<string>();
        try
        {
            while (_fileQueue.TryDequeue(out string fileToProcess))
            {
                if (!File.Exists(fileToProcess))
                {
                    // Probably a file that was added multiple times and it was
                    // already processed.
                    continue; 
                }
                var file = new FileInfo(fileToProcess);
                if (IsFileLocked(file))
                {
                    // File is locked. Maybe you got the Changed event, but the file
                    // wasn't done being written.
                    failures.Add(fileToProcess);
                    continue;
                }
                try
                {
                    file.MoveTo(/*Your destination*/);
                }
                catch (Exception)
                {
                    // File failed to move. Add it to the failures so it can be tried
                    // again.
                    failures.Add(fileToProcess);
                }
            }
        }
        finally
        {
            // Add any failures back to the queue to try again.
            foreach (var failedFile in failures)
            {
                _fileQueue.Enqueue(failedFile);
            }
            _processing = false;
        }
    }

    private bool IsFileLocked(FileInfo file)
    {
        try
        {
            using (FileStream stream = file.Open(FileMode.Open, FileAccess.Read,
               FileShare.None))
            {
                stream.Close();
            }
        }
        catch (IOException)
        {
            return true;
        }
        return false;
    }

    private System.Threading.Timer _timer;
    private bool _processing;
    private ConcurrentQueue<string> _fileQueue;
}
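
For completeness, here is a minimal sketch of how the watcher could be wired to that handler. This is not part of the original answer: it assumes the wiring lives inside ServiceClass (for example at the end of the constructor) with an added _watcher field, and the path and filter values are placeholders.

// Assumed extra field on ServiceClass:
// private FileSystemWatcher _watcher;

_watcher = new FileSystemWatcher
{
    Path = @"D:\SqlBackups",       // placeholder source directory
    Filter = "*.bak",              // a second watcher could cover *.trn
    IncludeSubdirectories = true,
    NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName | NotifyFilters.Size
};

// Duplicate Changed events are harmless here: the handler only enqueues the path,
// and ProcessQueue skips paths whose file has already been moved.
_watcher.Changed += objWatcher_OnChanged;
_watcher.Created += objWatcher_OnChanged;
_watcher.EnableRaisingEvents = true;

Subscribing to both Changed and Created means a brand-new backup file is still picked up even if it only ever raises a Created event.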

Credit where it's due: I took IsFileLocked from this answer.

Some other things you might need to consider:

What happens if your FileSystemWatcher misses an event? [The documentation] does state that it is possible.

Note that a FileSystemWatcher may miss an event when the buffer size is exceeded. To avoid missing events, follow these guidelines:

Increase the buffer size by setting the InternalBufferSize property.

Avoid watching files with long file names, because a long file name contributes to filling up the buffer. Consider renaming these files using shorter names.

Keep your event handling code as short as possible.
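
As a concrete example of the first guideline, enlarging the buffer is a one-line change on the watcher from the question (m_objWatcher); the documentation caps the value at 64 KB.

// A larger internal buffer reduces the chance of dropped events during bursts of
// file activity. The default is 8 KB; the documented maximum is 64 KB.
m_objWatcher.InternalBufferSize = 64 * 1024;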

What happens if your service crashes, but the process writing backup files continues to write them? When you restart your service, will it pick those files up and move them?
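
One way to cover that case is to seed the queue on startup with anything already sitting in the source directory, so the timer callback moves those files on its next tick. A minimal sketch, reusing _fileQueue from above; sourcePath and the patterns are placeholders, not names from the original post.

// Hypothetical startup scan: enqueue backup files written while the service was down.
foreach (var pattern in new[] { "*.bak", "*.trn" })
{
    foreach (var existing in Directory.EnumerateFiles(sourcePath, pattern,
        SearchOption.AllDirectories))
    {
        _fileQueue.Enqueue(existing);
    }
}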

Answered by Joshua Robinson on October 18, 2020
