
Test case

I have a project that demonstrates the issue. I don't know if you will find it helpful but I'll include it here.

There is a Producer.cs class that puts a message on a queue then sleeps. The Consumer receives the message then sleeps.

Using the normal connection factory or the SingleConnectionFactory, it works fine. Switching to the CachingConnectionFactory leads to the original error.
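For readers without the attached solution, the shape of the test case is roughly the following sketch using Apache.NMS directly (the broker URL, queue name, and sleep interval are assumptions; the real project wires the connection factory through Spring.NET, and the consumer side mirrors this with a CreateConsumer/Receive loop):

```csharp
using System;
using System.Threading;
using Apache.NMS;
using Apache.NMS.ActiveMQ;

class Producer
{
    static void Main()
    {
        // Plain NMS factory pointing at a local broker (URL is an assumption).
        IConnectionFactory factory = new ConnectionFactory("tcp://localhost:61616");
        using (IConnection connection = factory.CreateConnection())
        using (ISession session = connection.CreateSession(AcknowledgementMode.AutoAcknowledge))
        using (IMessageProducer producer = session.CreateProducer(session.GetQueue("test.queue")))
        {
            connection.Start();
            while (true)
            {
                producer.Send(session.CreateTextMessage("ping"));
                Thread.Sleep(1000); // put a message on the queue, then sleep
            }
        }
    }
}
```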

In our case we are currently using the standard connection factory. Thankfully our message volume is low and we can afford the inefficiency (for now).

We tried using the SingleConnectionFactory and it works for some amount of time (hours or more), but then it seems like some of the consumers get stuck. The admin console shows there are pending messages and registered consumers, but no delivery takes place. I have not determined whether this is a Spring.NET problem; I suspect we are doing something on our side to "poison" the connection and make it unusable.

If we do get into such a state, the producers/consumers are essentially blocked: the service continues to run, but no message activity takes place, and so far I have not found anything amiss in the logs.

I'm hoping that if we can get the CachingConnectionFactory working we can do additional testing and see which of our components is misbehaving.

Hi,
I didn't get a chance to look into the issue, but SingleConnectionFactory is incredibly simple, so I suspect the issue may be with ActiveMQ. Since it seems to take hours to reproduce, and I don't have the time today to set that up and let it bake, I hate to bug you, but: if you rewrite the example using the straight-up NMS API and cache the connection, I suspect you will see the same issue. How about trying a 5.2 snapshot?
Cheers,
Mark
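A plain-NMS version of "cache the connection" might look like the following sketch (broker URL and queue name are assumptions): one connection is created up front and shared, and each unit of work opens its own session on it. If the same failure shows up here, that would point at ActiveMQ rather than Spring.NET.

```csharp
using System;
using Apache.NMS;
using Apache.NMS.ActiveMQ;

class CachedConnectionDemo
{
    // One connection created up front and reused everywhere (the "cached" part).
    static readonly IConnectionFactory Factory =
        new ConnectionFactory("tcp://localhost:61616");
    static readonly IConnection SharedConnection = Factory.CreateConnection();

    static void Main()
    {
        SharedConnection.Start();
        // Each unit of work gets its own session from the shared connection.
        using (ISession session = SharedConnection.CreateSession(AcknowledgementMode.AutoAcknowledge))
        using (IMessageProducer producer = session.CreateProducer(session.GetQueue("test.queue")))
        {
            producer.Send(session.CreateTextMessage("ping"));
        }
        SharedConnection.Close();
    }
}
```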

The test case I posted can be used to reproduce the problem with no delay. The error message occurs within seconds at most -- probably within the first second.

That issue involves the CachingConnectionFactory, and since the error occurs immediately and every time, you should be able to reproduce it easily by running that solution against AMQ.

The second half of the original post was discussing a separate problem -- using the SingleConnectionFactory gave us issues after "hours" of use. However, I'm not ready to point the finger at Spring.NET or AMQ yet, because I feel the most likely cause is that our multi-threaded service is deadlocked or otherwise "churning", and that there is no actual problem with the queues.

The problem is intermittent, which makes writing a test case slightly more difficult. So far we have observed the problem most often when the system is under moderate to heavy load. We are using queueing in our load process, which starts with several dozen GB of text files and ends up with millions of rows in the database. The messages themselves are small (just filenames) and there are a few thousand per load that move from queue to queue.

Is there a known practical limit to the number of producers/consumers that can share a single connection? Also, I'm using client acknowledge, but some of the asynchronous receivers wait in a Thread.Sleep() loop until a condition is true, at which point the message is processed. I have a theory that this might be part of the problem.
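The delayed-ack pattern described above looks roughly like this sketch (broker URL, queue name, poll interval, and the `ready` flag are hypothetical stand-ins for our actual condition and handler); the key point is that the message stays unacknowledged for as long as the loop spins:

```csharp
using System;
using System.Threading;
using Apache.NMS;
using Apache.NMS.ActiveMQ;

class DelayedAckConsumer
{
    static volatile bool ready;  // hypothetical external condition

    static void Main()
    {
        IConnectionFactory factory = new ConnectionFactory("tcp://localhost:61616");
        using (IConnection connection = factory.CreateConnection())
        using (ISession session = connection.CreateSession(AcknowledgementMode.ClientAcknowledge))
        using (IMessageConsumer consumer = session.CreateConsumer(session.GetQueue("test.queue")))
        {
            consumer.Listener += message =>
            {
                // The unacknowledged message is held while we wait on the condition.
                while (!ready)
                    Thread.Sleep(100);

                Console.WriteLine(((ITextMessage)message).Text); // process
                message.Acknowledge();  // ack only after processing completes
            };
            connection.Start();
            Console.ReadLine();
        }
    }
}
```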

Hi,
Thanks, mate, for the ol' college try. I don't know what the practical limits in ActiveMQ are for the number of sessions/consumers/producers per connection, but it should be quite high, and it doesn't sound like you are using that many. If you are doing very long processing tasks before sending the client ack, I can see how that might cause some issues -- waiting a very long time before acknowledging is probably not good practice. The possible retry of a long-running data load (due to an error) should be separated from the fact that the message was delivered successfully to the process.
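The suggestion above -- decoupling delivery acknowledgement from the long-running load -- could be sketched like this (the in-process hand-off queue, queue name, and retry placement are assumptions, not part of the original posts): acknowledge on receipt, then let a worker own the load and any retries.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using Apache.NMS;
using Apache.NMS.ActiveMQ;

class AckThenProcessConsumer
{
    // In-process hand-off queue; retries of the data load live here,
    // independently of broker delivery (an assumption, not the original design).
    static readonly BlockingCollection<string> Work = new BlockingCollection<string>();

    static void Main()
    {
        new Thread(() =>
        {
            foreach (string fileName in Work.GetConsumingEnumerable())
            {
                // Long-running load with its own retry policy goes here.
                Console.WriteLine("loading " + fileName);
            }
        }) { IsBackground = true }.Start();

        IConnectionFactory factory = new ConnectionFactory("tcp://localhost:61616");
        using (IConnection connection = factory.CreateConnection())
        using (ISession session = connection.CreateSession(AcknowledgementMode.ClientAcknowledge))
        using (IMessageConsumer consumer = session.CreateConsumer(session.GetQueue("load.queue")))
        {
            consumer.Listener += message =>
            {
                message.Acknowledge();                   // confirm delivery right away
                Work.Add(((ITextMessage)message).Text);  // defer the heavy work
            };
            connection.Start();
            Console.ReadLine();
        }
    }
}
```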
Mark