Transaction timeout problem

IgorB posted on Thursday, August 26, 2010

I created a pretty simple business object (I use CSLA 4.0 with a WinForms front end) with one child collection. There are complex computations on the database side when a child element is inserted – it can take several seconds per row. I increased CommandTimeout to 600, so it should be enough for saving the required amount of information (about 2-3 minutes in total). When I try to save the main business object with several rows, the transaction stops after 60 seconds with this message:


DataPortal.Update failed (Csla.DataPortalException: ChildDataPortal.Update failed on the server ---> Csla.Reflection.CallMethodException: Child_Update method call failed ---> Csla.DataPortalException: ChildDataPortal.Update failed on the server ---> Csla.Reflection.CallMethodException: Child_Insert method call failed ---> System.InvalidOperationException: The transaction associated with the current connection has completed but has not been disposed.  The transaction must be disposed before the connection can be used to execute SQL statements. …)


I tried to increase the transaction timeout in Component Services on both the client and server machines, but that didn't help. Do you have any idea why I cannot run any transaction for more than one minute?

Thank you very much in advance.

ajj3085 replied on Thursday, August 26, 2010

Well the error is that the transaction hasn't been disposed... so are you disposing it when you're done?

IgorB replied on Friday, August 27, 2010

As I mentioned, the process was interrupted after 60 seconds automatically. The transaction didn't finish (it would take about 2-3 minutes). The question is WHY it was interrupted.

ajj3085 replied on Friday, August 27, 2010

Well, the only information you've posted so far is the error message, which indicates that the transaction did complete but wasn't disposed properly.  My first instinct would be to ensure I was properly disposing the transaction before continuing to troubleshoot anything else.  You should probably also post the code where you're setting up your transaction.

60 seconds is the default transaction timeout.
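For reference, a minimal sketch of the difference, using plain System.Transactions (nothing CSLA-specific, and the work inside the scope is a placeholder):

```csharp
using System;
using System.Transactions;

class TimeoutExample
{
  static void Main()
  {
    // new TransactionScope() uses TransactionManager.DefaultTimeout,
    // which is 60 seconds unless overridden in configuration.
    Console.WriteLine(TransactionManager.DefaultTimeout);

    // To allow a longer transaction, pass TransactionOptions explicitly.
    var options = new TransactionOptions
    {
      IsolationLevel = IsolationLevel.ReadCommitted,
      Timeout = TimeSpan.FromMinutes(5)
    };
    using (var ts = new TransactionScope(TransactionScopeOption.Required, options))
    {
      // ... long-running data access would go here ...
      ts.Complete(); // commit; omitting this rolls the transaction back
    }
  }
}
```

Note that machine.config's &lt;system.transactions&gt; maxTimeout setting (10 minutes by default) still caps whatever value you request in code.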

ayubu replied on Sunday, August 29, 2010

Is your update code enclosed in a using block? If not, that might be the reason.

IgorB replied on Monday, August 30, 2010

This is my Insert method:

    protected override void DataPortal_Insert()
    {
      using (var ctx = ContextManager<PMDalLinq.TestContext>.GetManager(PMDalLinq.Database.Test))
      {
        int? uploadID = null;
        using (BypassPropertyChecks)
        {
          // the name of the data-context method was lost from the original post;
          // it passed uploadID by reference along with the object's properties:
          ctx.DataContext.InsertUpload( // hypothetical name
            ref uploadID, this.CompanyID, this.PeriodType, this.Status, this.UserID);
        }
        this.ID = (int)uploadID;
      }
    }


You can find a lot of methods like this in the ProjectTrackercs solution.

tmg4340 replied on Monday, August 30, 2010

Well... the first thing I'm wondering is whether you shouldn't be using the EnterpriseServices DataPortal, since you talk about setting timeouts in Component Services.  However, I know that TransactionScope does "play nice" with Component Services, presuming you have it configured correctly, so perhaps that doesn't matter.

However, configuration setup may be your problem.

The transactional DataPortal wraps your calls inside a TransactionScope created with default options.  That includes things like the timeout value (which I'm pretty sure is defaulted to 60 seconds).  That's likely what's causing your problem.  I can't say whether moving to the EnterpriseServices DataPortal will solve this issue or not - I have never used it.

If it were me, I'd probably not utilize CSLA's built-in transaction support and manage your transactions manually.  That way you can control everything.
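A minimal sketch of that manual approach with plain ADO.NET, assuming SQL Server; the connection string parameter and the stored-procedure name are placeholders, not from the original post:

```csharp
using System.Data;
using System.Data.SqlClient;

class ManualTransactionExample
{
  public static void Save(string connectionString)
  {
    using (var cn = new SqlConnection(connectionString))
    {
      cn.Open();
      using (var tr = cn.BeginTransaction())
      {
        using (var cm = cn.CreateCommand())
        {
          cm.Transaction = tr;            // enlist the command explicitly
          cm.CommandTimeout = 600;        // per-command timeout, in seconds
          cm.CommandType = CommandType.StoredProcedure;
          cm.CommandText = "dbo.LongRunningInsert"; // placeholder name
          cm.ExecuteNonQuery();
        }
        tr.Commit(); // no ambient TransactionScope, so no 60-second cap
      }
    }
  }
}
```

With a local SqlTransaction like this, only the CommandTimeout you set governs how long each statement may run.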


- Scott

ajj3085 replied on Monday, August 30, 2010

I agree with Scott.  At the very least, you can't alter the TransactionScope's timeout unless you create it yourself.  I don't know if TransactionScope will honor the DTC timeouts or not.

IgorB replied on Tuesday, August 31, 2010


Of course, I re-created my process so it doesn't use CSLA's built-in transaction support. I did that at the same time I hit the problem (I didn't have time for research). But I don't like that solution, so I am trying to find a way to increase the timeout beyond 60 seconds.


tmg4340 replied on Tuesday, August 31, 2010

Without changing the CSLA codebase, I don't think you're going to get what you want.  As I said, the transactional DataPortal wraps the DP calls inside a TransactionScope object initialized with default values.  Nothing in CSLA gives you access to change any of those values.  As you've seen, changing timeout values on your database connection strings or Component Services setup doesn't matter, because the TransactionScope object doesn't care about whatever timeout values may have been established on any attached resource.


- Scott

IgorB replied on Tuesday, August 31, 2010

So, do you think only Rocky can help to solve this problem?

tmg4340 replied on Tuesday, August 31, 2010

If you don't want to handle your transactions manually (which I still think is the best suggestion), there is nothing stopping you from pulling the CSLA code, modifying it to suit your purposes, and using that in your project.  Several people have done that for their projects.  It certainly complicates upgrades, but it's a viable solution in certain scenarios.

However, if you want an "official" solution, then yes, it's up to Rocky to decide how (and if) he wants to resolve this.

- Scott

ajj3085 replied on Tuesday, August 31, 2010

Well, the thing is that the "built in" support is saving you about two lines of code:

    using (var ts = new TransactionScope())
    {
      // your data access code
      ts.Complete();
    }

That's it.  No magic to it.  Doing an Enterprise transaction using ServicedComponent is a bit more complex, but still fairly easy I believe.

RockfordLhotka replied on Tuesday, August 31, 2010

I had considered completely reworking the data portal for CSLA 4 - opening it up as a chain of command pattern - basically make it a sequential workflow where people could plug in whatever they wanted to the pipeline, and swap out their component for any of mine.

That turns out to be somewhat challenging - at least to provide that level of flexibility while still keeping the data portal "just working" for the 99% of people who wouldn't care or use that extensibility.

So I didn't do it. It would have been a huge amount of work, to address some edge cases that have decent workarounds today.

Arguably the most common edge case is where the default TransactionScope settings aren't sufficient, but that really doesn't save you much code when you get right down to it, so there's a decent workaround that is less complex than the alternative.

The alternative is either to make you implement some sort of custom Transactional handler, that you'd plug into the data portal pipeline via config. Or to have the existing TransactionScope handler optionally invoke some "TransactionScope factory" component - again that you'd create and plug into the pipeline via config.

Of course the next thing somebody would want is to have different settings for different business objects - and then we're back to the bigger chain of command approach - where the call context is passed through the chain so you can interrogate the request and do whatever you'd like based on the criteria/request/object/whatever.

I'm not saying I'll never do this work - I might - but it keeps falling lower than other things in the priority list...

tmg4340 replied on Wednesday, September 01, 2010

I know this is getting beyond the OP topic, but this brought to mind a (obvious to me) question:

Would this DP chain exist both on the client and server side?

I know that makes a potentially complicated piece of tech even more complicated, but I could see some real value in having a pluggable chain on the client side.  I know that some environments would limit what you could effectively do on the client side, but it could open up some real opportunities for others.

But I really like the idea in terms of bringing some AOP-style concepts to the table...

- Scott

RockfordLhotka replied on Wednesday, September 01, 2010

Yes, it would be the same on client and server.

Of course writing components to put into the pipeline would not be trivial, and would require extensive knowledge of data portal internals. Not for the faint of heart.

Which is why I also keep considering the much lower-impact approach of having a TransactionScope factory...

AaronH replied on Tuesday, October 19, 2010

I'm also having this issue.  Any takers?

AaronH replied on Tuesday, October 19, 2010

Never mind, I figured it out.  I applied the TransactionalTypes.Manual attribute to my DP_Insert and specified my own TransactionScope with a timeout of 5 minutes, and it worked.
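A sketch of that workaround, assuming CSLA 4's Transactional attribute; the class name and the body of the method are placeholders:

```csharp
using System;
using System.Transactions;
using Csla;

[Serializable]
public class Upload : BusinessBase<Upload>
{
  // TransactionalTypes.Manual tells the data portal NOT to wrap this call
  // in its own default TransactionScope, so the scope created here governs.
  [Transactional(TransactionalTypes.Manual)]
  protected override void DataPortal_Insert()
  {
    var options = new TransactionOptions
    {
      IsolationLevel = IsolationLevel.Serializable, // TransactionScope's default
      Timeout = TimeSpan.FromMinutes(5)             // instead of 60 seconds
    };
    using (var ts = new TransactionScope(TransactionScopeOption.Required, options))
    {
      // ... the normal insert and child-update logic goes here ...
      ts.Complete();
    }
  }
}
```

You get the same transactional behavior as TransactionalTypes.TransactionScope, but with full control over the timeout and isolation level.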

cardasim replied on Monday, February 28, 2011

I've also encountered the exception "The transaction associated with the current connection has completed but has not been disposed. The transaction must be disposed before the connection can be used to execute SQL statements." both with and without CSLA, so my conclusion is that this error is not caused by CSLA.

I believe it is caused by TransactionScope together with low resources (RAM and/or CPU). I've only encountered the issue on low-resource VMs, and I've read on the internet that others saw the same behavior in similar configurations. The solution was to change the code to manual transactions.

It seems that TransactionScope (or some component that it calls) and/or the resource manager (SQL Server) have some issue communicating during heavy transaction loads on low resources.

Copyright (c) Marimer LLC