I'm in the process of translating an odd part of our legacy system that does some IPC via sockets with a process on another machine when certain kinds of transactions occur. In the legacy system, the client application was effectively "listening" to each transaction and queuing up the required transmissions for later dispatch via a timer.
However, when moving from 2 tiers to 3 or more, this won't work any more. What does seem to work, however, is to effectively do the same operations on the server side of the portal, just by listening to the same transactions via the DataPortal_Update method of the object being monitored.
I don't want to do anything that interferes with the transaction being processed, so when one of these occurs I package up the data to be transmitted, add it to a list of pending transmissions, and start a timer (if it hasn't already been started). When the timer fires, the data gets transmitted on another thread (with failover to store the data in its own table should the IPC fail for any reason).
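For what it's worth, here's a rough, self-contained sketch of the pattern I'm describing (all type and member names here are hypothetical, and the socket/database calls are stubbed out): transactions enqueue payloads, and a single one-shot timer flushes the queue on a worker thread, with the failover path on any IPC error.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Hypothetical sketch: pending transmissions are queued, and one
// System.Threading.Timer flushes them later on a pool thread.
public class TransmissionQueue
{
    private readonly object _sync = new object();
    private readonly Queue<byte[]> _pending = new Queue<byte[]>();
    private Timer _timer; // null until a flush is scheduled

    public int PendingCount
    {
        get { lock (_sync) return _pending.Count; }
    }

    public void Enqueue(byte[] payload)
    {
        lock (_sync)
        {
            _pending.Enqueue(payload);
            // Start the timer only if it isn't already running, so
            // several transactions share a single deferred flush.
            if (_timer == null)
                _timer = new Timer(Flush, null, 500, Timeout.Infinite);
        }
    }

    private void Flush(object state)
    {
        List<byte[]> batch;
        lock (_sync)
        {
            batch = new List<byte[]>(_pending);
            _pending.Clear();
            _timer.Dispose();
            _timer = null;
        }
        foreach (byte[] payload in batch)
        {
            try
            {
                Transmit(payload);      // the IPC socket send
            }
            catch (Exception)
            {
                StoreForRetry(payload); // failover: persist to a table
            }
        }
    }

    private void Transmit(byte[] payload) { /* socket send goes here */ }
    private void StoreForRetry(byte[] payload) { /* DB insert goes here */ }
}
```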
In general, is doing your own asynchronous processing like this via server-side data portal functions a good idea? Sorry if this seems like it has an obvious answer, but sometimes with C# you can easily do something that you really shouldn't be doing.
The .NET data portal assumes a synchronous connection from client to server (regardless of whether you use the async or sync data portal - the actual client->server call is assumed to be a blocking call).
As long as you preserve that blocking behavior you are good. So on the server side, as long as the primary thread doesn't complete until the task is done, you can spin up all the threads or other processing you'd like to spin up.
Now whether it is a good idea to use a timer to spin up those other threads is debatable, and I think the general answer is no. You should consider using the BackgroundWorker as an abstract and reliable way to implement this sort of behavior - as long as the background task isn't going to take minutes.
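To make that concrete, a minimal sketch of the BackgroundWorker approach for a short server-side task might look like this (TransmitPending and StoreForRetry are hypothetical stand-ins for the real transmission and failover logic; the ManualResetEvent keeps the server call blocking until the work completes):

```csharp
using System;
using System.ComponentModel;
using System.Threading;

class Program
{
    // Hypothetical placeholders for the real IPC and failover code.
    static void TransmitPending(byte[] payload) { /* socket send */ }
    static void StoreForRetry(byte[] payload) { /* persist to table */ }

    public static void Main()
    {
        using (var done = new ManualResetEvent(false))
        {
            var worker = new BackgroundWorker();
            worker.DoWork += (s, e) =>
                TransmitPending((byte[])e.Argument); // runs on a pool thread
            worker.RunWorkerCompleted += (s, e) =>
            {
                if (e.Error != null)
                    StoreForRetry(null); // failover if the work threw
                done.Set();              // standardized completion callback
            };
            worker.RunWorkerAsync(new byte[] { 1, 2, 3 });
            done.WaitOne(); // preserve the data portal's blocking contract
        }
    }
}
```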
If the background task will take minutes, then you should spin your own thread. The reason is that BackgroundWorker will use the thread pool (which is usually ideal), but threads from the thread pool in ASP.NET shouldn't be used for long-running tasks because (as I understand it) this can cause overall scaling issues with ASP.NET itself because it also uses the thread pool. You should verify this though - because I'm thinking of scenarios from a few years ago, and ASP.NET's thread management may be different in .NET 3.0 or 3.5.
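Spinning your own thread for long-running work is straightforward; something along these lines (TransmitLargeBatch is a hypothetical placeholder for the long-running task):

```csharp
using System;
using System.Threading;

class Program
{
    static void TransmitLargeBatch()
    {
        // Long-running IPC work would go here; because this is a
        // dedicated thread, it doesn't tie up the ASP.NET thread pool.
    }

    public static void Main()
    {
        Thread worker = new Thread(TransmitLargeBatch);
        worker.IsBackground = true; // don't keep the process alive on its own
        worker.Start();
        worker.Join(); // block until done, preserving the data portal contract
    }
}
```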
OK, thanks for the ideas. I've used BackgroundWorker extensively on the UI side (WinForms), and I don't think the task will take more than a few seconds at most. (One of my co-workers always uses System.Threading.QueueUserWorkItem, which I think would be fine here.)
I tend to agree that the timer might not be the best way to accomplish what I was really trying to do, which was to defer the transmission until after the current transaction is committed, and also ensure that I don't spin up multiple threads. (In fact, if the transaction gets rolled back, the transmission should be cancelled. I don't think the legacy code dealt with this properly either.)
QueueUserWorkItem is fine - it puts the task on the thread pool too. The advantage of the BackgroundWorker is that it manages the completion callback in a more standardized way - you don't have to invent your own completion scheme.
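For comparison, here's a sketch of the QueueUserWorkItem version with a homegrown completion scheme (a ManualResetEvent in this case; TransmitPending is a hypothetical placeholder) — this is the part BackgroundWorker would otherwise handle for you:

```csharp
using System;
using System.Threading;

class Program
{
    static void TransmitPending() { /* socket send goes here */ }

    public static void Main()
    {
        using (var done = new ManualResetEvent(false))
        {
            // Runs the delegate on the thread pool, same as BackgroundWorker.
            ThreadPool.QueueUserWorkItem(state =>
            {
                try { TransmitPending(); }
                finally { done.Set(); } // invented completion signal
            });
            done.WaitOne(); // the server thread blocks until the task finishes
        }
    }
}
```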
Off the server, BackgroundWorker is almost always better, because it takes care of getting the call back onto the UI thread - something that's really hard to do yourself. But of course on the server that consideration is completely different.
Copyright (c) Marimer LLC