Why doesn't a foreach loop work in certain cases? [for-loop]

Accepted answer
Score: 13

Your problem is that the constructor of List<T> that creates a new list from an IEnumerable (which is what you call) isn't thread-safe with respect to its argument. What happens is that while this:

 new List<string>(Foo.Requests)

is executing, another thread changes Foo.Requests. You'll have to lock it for the duration of that call.

[EDIT]

As pointed out by Eric, another problem is that List<T> isn't guaranteed safe for readers to read while another thread is changing it, either. I.e. concurrent readers are okay, but a concurrent reader and writer are not. And while you lock your writes against each other, you don't lock your reads against your writes.
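
A minimal sketch of what that locking could look like, taking the same lock for the copy (a read) as for the removal (a write); it assumes Foo.Requests is the object the writers already lock on:

List<string> snapshot;

// Take the copy under the lock, so no other thread can change the
// list while the List<string> copy constructor is reading it.
lock (Foo.Requests)
{
    snapshot = new List<string>(Foo.Requests);
}

foreach (string data in snapshot)
{
    // Process the data. The snapshot is private to this thread,
    // so reading it needs no lock.

    lock (Foo.Requests)
    {
        Foo.Requests.Remove(data);   // write under the same lock
    }
}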

Score: 5

Your locking scheme is broken. You need to lock Foo.Requests for the entire duration of the loop, not just when removing an item. Otherwise the item might become invalid in the middle of your "process the data" operation, and the enumeration might change in between moving from item to item. And that assumes you don't need to insert into the collection during this interval as well. If that's the case, you really need to re-factor to use a proper producer/consumer queue.
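
A minimal sketch of that kind of refactor, assuming BlockingCollection<string> (System.Collections.Concurrent, .NET 4 and later) is acceptable; the class and member names here are illustrative, not from the question:

using System;
using System.Collections.Concurrent;
using System.Threading;

static class ProducerConsumerDemo
{
    // Thread-safe queue; the consumer blocks until an item is available.
    private static readonly BlockingCollection<string> Requests =
        new BlockingCollection<string>();

    public static void AddRequest(string request)
    {
        Requests.Add(request);
    }

    public static void Main()
    {
        Thread consumer = new Thread(() =>
        {
            // GetConsumingEnumerable blocks while the collection is empty
            // and ends once CompleteAdding has been called.
            foreach (string request in Requests.GetConsumingEnumerable())
            {
                Console.WriteLine("Processing: " + request);
            }
        });
        consumer.Start();

        AddRequest("first");
        AddRequest("second");

        Requests.CompleteAdding();   // no more requests will be added
        consumer.Join();
    }
}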

Score: 5

After seeing your exception, it looks to me like Foo.Requests is being changed while the shallow copy is being constructed. Change it to something like this:

List<string> requests;

lock (Foo.Requests)
{
    requests = new List<string>(Foo.Requests);
}

foreach (string data in requests)
{
    // Process the data.

    lock (Foo.Requests)
    {
        Foo.Requests.Remove(data);
    }
}

Not the question, but...

That being said, I somewhat doubt the above is what you want either. If new requests are coming in during processing, they will not have been processed when your foreach loop terminates. Since I was bored, here's something along the lines of what I think you're trying to achieve:

using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

class RequestProcessingThread
{
    // Used to signal this thread when there is new work to be done
    private AutoResetEvent _processingNeeded = new AutoResetEvent(true);

    // Used for request to terminate processing
    private ManualResetEvent _stopProcessing = new ManualResetEvent(false);

    // Signalled when thread has stopped processing
    private AutoResetEvent _processingStopped = new AutoResetEvent(false);

    /// <summary>
    /// Called to start processing
    /// </summary>
    public void Start()
    {
        _stopProcessing.Reset();

        Thread thread = new Thread(ProcessRequests);
        thread.Start();
    }

    /// <summary>
    /// Called to request a graceful shutdown of the processing thread
    /// </summary>
    public void Stop()
    {
        _stopProcessing.Set();

        // Optionally wait for thread to terminate here
        _processingStopped.WaitOne();
    }

    /// <summary>
    /// This method does the actual work
    /// </summary>
    private void ProcessRequests()
    {
        WaitHandle[] waitHandles = new WaitHandle[] { _processingNeeded, _stopProcessing };

        Foo.RequestAdded += OnRequestAdded;

        while (true)
        {
            while (Foo.Requests.Count > 0)
            {
                string request;
                lock (Foo.Requests)
                {
                    request = Foo.Requests.Peek();
                }

                // Process request
                Debug.WriteLine(request);

                lock (Foo.Requests)
                {
                    Foo.Requests.Dequeue();
                }
            }

            if (WaitHandle.WaitAny(waitHandles) == 1)
            {
                // _stopProcessing was signalled, exit the loop
                break;
            }
        }

        Foo.RequestAdded -= OnRequestAdded;

        _processingStopped.Set();
    }

    /// <summary>
    /// This method will be called when a new request gets added to the queue
    /// </summary>
    private void OnRequestAdded()
    {
        _processingNeeded.Set();
    }
}


static class Foo
{
    public delegate void RequestAddedHandler();
    public static event RequestAddedHandler RequestAdded;

    static Foo()
    {
        Requests = new Queue<string>();
    }

    public static Queue<string> Requests
    {
        get;
        private set;
    }

    public static void AddRequest(string request)
    {
        lock (Requests)
        {
            Requests.Enqueue(request);
        }

        if (RequestAdded != null)
        {
            RequestAdded();
        }
    }
}

There are still a few problems with this, which I will leave to the reader:

  • Checking for _stopProcessing should probably be done after every time a request is processed
  • The Peek() / Dequeue() approach won't work if you have multiple threads doing processing
  • Insufficient encapsulation: Foo.Requests is accessible, but Foo.AddRequest needs to be used to add any requests if you want them processed.
  • In case of multiple processing threads: need to handle the queue being empty inside the loop, since there is no lock around the Count > 0 check (see the sketch below).
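
For the last two points, a minimal sketch of the kind of change needed; TryTakeRequest is a hypothetical helper that combines the Count check and the Dequeue under one lock, so an empty queue is handled and two threads cannot take the same request:

private bool TryTakeRequest(out string request)
{
    lock (Foo.Requests)
    {
        if (Foo.Requests.Count > 0)
        {
            request = Foo.Requests.Dequeue();
            return true;
        }
    }

    request = null;
    return false;
}

The inner while loop in ProcessRequests would then look like this, with the processing itself still done outside the lock:

string request;
while (TryTakeRequest(out request))
{
    // Process request
    Debug.WriteLine(request);
}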

Score: 2

Three things:
- I wouldn't put the lock within the for(each) statement, but outside of it.
- I wouldn't lock the actual collection, but a local static object.
- You cannot modify a list/collection that you're enumerating.

For more information, check:
http://msdn.microsoft.com/en-us/library/c5kehkcz(VS.80).aspx

private static readonly object lockObject = new object();

// ...

lock (lockObject) {
   foreach (string data in new List<string>(Foo.Requests))
        Foo.Requests.Remove(data);
}

Score: 2

To be completely honest, I would suggest refactoring that. You are removing items from the object while also iterating over it. Your loop could actually exit before you've processed all items.

Score: 2

The problem is the expression

new List<string>(Foo.Requests)

inside your foreach, because it's not under a lock. I assume that while .NET copies your requests collection into a new list, the list is modified by another thread.

Score: 1

foreach (string data in new List<string>(Foo.Requests))
{
    // Process the data.
    lock (Foo.Requests)
    {
        Foo.Requests.Remove(data);
    }
}

Suppose you have two threads executing this code.

at System.Collections.Generic.List`1[System.String]..ctor

  • Thread1 starts processing the list.
  • Thread2 calls the List constructor, which takes a count for the array to be created.
  • Thread1 changes the number of items in the list.
  • Thread2 has the wrong number of items.
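
A minimal, illustrative repro of that interleaving (not code from the question; it is timing-dependent, so it may take a moment to throw):

using System;
using System.Collections.Generic;
using System.Threading;

static class CopyRaceRepro
{
    private static readonly List<string> Shared = new List<string>();

    public static void Main()
    {
        // Writer: keeps changing the count of the shared list.
        Thread writer = new Thread(() =>
        {
            while (true)
            {
                Shared.Add("item");
                if (Shared.Count > 1000) Shared.Clear();
            }
        });
        writer.IsBackground = true;
        writer.Start();

        // Reader: copies the list without a lock. Eventually the count the
        // constructor read no longer matches the list's contents and the
        // copy throws from inside the List<T> constructor.
        while (true)
        {
            try
            {
                List<string> copy = new List<string>(Shared);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.GetType().Name + ": " + ex.Message);
                return;
            }
        }
    }
}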

Your locking scheme is wrong. It's even wrong in the for loop example.

You need to lock every time you access the shared resource - even to read or copy it. This doesn't mean you need to lock for the whole operation. It does mean that everyone sharing this shared resource needs to participate in the locking scheme.

Also consider defensive copying:

List<string> todos = null;
List<string> empty = new List<string>();
lock (Foo.Requests)
{
    todos = Foo.Requests;
    Foo.Requests = empty;
}

//now process local list todos

Even so, all those that share Foo.Requests must participate in the locking scheme.
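
A minimal sketch of what that participation could look like, using a dedicated lock object so the lock does not depend on the list reference that gets swapped; RequestStore, _requestsLock and TakePendingRequests are illustrative names, not from the question:

using System.Collections.Generic;

static class RequestStore
{
    private static readonly object _requestsLock = new object();

    // The shared list; only ever touched while holding _requestsLock.
    private static List<string> _requests = new List<string>();

    // Producer side: every add takes the lock.
    public static void AddRequest(string request)
    {
        lock (_requestsLock)
        {
            _requests.Add(request);
        }
    }

    // Consumer side: the defensive-copy swap takes the same lock.
    // The returned list is then owned by the caller alone and can be
    // processed without any further locking.
    public static List<string> TakePendingRequests()
    {
        lock (_requestsLock)
        {
            List<string> todos = _requests;
            _requests = new List<string>();
            return todos;
        }
    }
}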

Score: 0

You are trying to remove objects from the list as you are iterating through it. (OK, technically, you are not doing this, but that's the goal you are trying to achieve.)

Here's how you do it properly: while iterating, construct another list of entries that you want to remove. Simply construct another (temp) list and put all the entries you want to remove from the original list into the temp list.

List<string> entriesToRemove = new List<string>();

foreach (string entry in originalList)
{
    if (SomeCondition(entry))   // whatever marks this entry for removal
    {
        entriesToRemove.Add(entry);
    }
}

// Then, when done iterating, do:
originalList.RemoveAll(entriesToRemove.Contains);

Using "removeAll" method 1 of List class.

Score: 0

I know it's not what you asked for, but just for the sake of my own sanity, does the following represent the intention of your code:

private object _locker = new object();

// ...

lock (_locker) {
    Foo.Requests.Clear();
}
