Case for new TransactionalTypes enumeration?

Old forum URL: forums.lhotka.net/forums/t/9026.aspx


RockyRocks posted on Thursday, June 03, 2010

A colleague decided to use the TransactionScope option for his transactional support for insert/update of an object graph.

I changed the app to make use of another object during the data access for an update. The fetch of that object failed, because the transaction mechanism detected that two connections would be open at the same time and tried to promote the transaction to a distributed transaction. That doesn't work in our environment because we don't run the DTC.
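Roughly the shape of what happened (a sketch only; the class, factory, and connection names below are placeholders, not our actual code):

```csharp
// Fragment of the root object's data access (names are placeholders).
[Transactional(TransactionalTypes.TransactionScope)]
protected override void DataPortal_Update()
{
  // The first connection enlists in the ambient TransactionScope
  // created by the data portal.
  using (var cn = new SqlConnection(Database.AppConnection)) // placeholder connection string
  {
    cn.Open();

    // Part way through the update we fetch a read-only lookup object...
    var rates = TaxRateList.GetTaxRateList(); // hypothetical read-only factory

    // ...whose DataPortal_Fetch opens a second connection while this one is
    // still open. System.Transactions then tries to promote the ambient
    // transaction to a distributed (DTC) transaction, which fails because
    // we don't run MSDTC.

    // ... remaining update commands ...
  }
}
```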

It made me think that what I should be doing for the vast majority of our DataPortal_Fetch methods (where we know the read will not need to be part of a transaction) is to use TransactionScopeOption.Suppress. However, it'd be nice if I could do this not by adding code, but by targeting the data access method with an attribute, as per the other transactional options. It would read something like Transactional(TransactionalTypes.TransactionScopeSuppress), which would cause the appropriate DataPortal to create a TransactionScope, but with the Suppress option.
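A minimal sketch of the intended usage (the TransactionScopeSuppress value is purely hypothetical and doesn't exist in the framework today, and the criteria type is just an example):

```csharp
// Hypothetical: TransactionalTypes.TransactionScopeSuppress is the proposed
// value; CSLA does not currently provide it.
[Transactional(TransactionalTypes.TransactionScopeSuppress)]
private void DataPortal_Fetch(SingleCriteria<ProductInfo, int> criteria)
{
  // The data portal would wrap this call in
  // new TransactionScope(TransactionScopeOption.Suppress), so opening a
  // connection here never joins, or promotes, an ambient transaction.
  // ... plain ADO.NET read, with no transaction code in the BO ...
}
```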

I think in general Fetch operations should not be transactional. In the case that they might be performed during another operation that is transactional, should it not be possible to easily exclude them like this?

Therefore I suggest a new TransactionalTypes enumeration value of TransactionScopeSuppress.

I look forward to your thoughts.

RockfordLhotka replied on Thursday, June 03, 2010

I have considered this sort of thing in the past. There are three problems.

First, the Transactional attribute also supports Manual and EnterpriseServices transactions, and it isn't clear what Suppress would mean there.

Second, this is a slippery slope. If I add a parameter for Suppress, do I also add a parameter for isolation level? What about other concepts unique to either EnterpriseServices or TransactionScope?

Third, is there an expectation that this metadata be somehow made available to your code inside your methods? This seems like it would be important for the Manual transaction option. And does this mean the metadata would need to be available to child objects as well? I think so, because your manual transaction code would need to know this at all levels.

Instead of facing all these issues, my answer for the past several years has been that if you don't like the very basic on/off behavior provided by the Transactional attribute, you should use Manual transactions and manage them yourself.
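For reference, the manual approach looks something like this (a minimal sketch; the class and criteria types are just placeholders):

```csharp
using System.Transactions;
using Csla;

[Serializable]
public class CustomerInfo : ReadOnlyBase<CustomerInfo>
{
  // Manual means the data portal adds no transactional wrapper at all,
  // so the method controls its own TransactionScope.
  [Transactional(TransactionalTypes.Manual)]
  private void DataPortal_Fetch(SingleCriteria<CustomerInfo, int> criteria)
  {
    // Suppress any ambient transaction started by a parent insert/update,
    // so opening a connection here can't trigger promotion to a
    // distributed (DTC) transaction.
    using (var ts = new TransactionScope(TransactionScopeOption.Suppress))
    {
      // ... open a connection and load the object's fields ...
      ts.Complete();
    }
  }
}
```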

RockyRocks replied on Tuesday, June 15, 2010

Hi Rocky,

I agree that the framework can't possibly hope to do everything that a developer might dream up, so I understand the slippery slope to which you refer. However, I was thinking along very simple lines to avoid the slope getting too steep.

I'm moderately ashamed of my suggestion, in that the suggested implementation was flawed. However, it does offer a way to avoid many of the problems you highlight: were this offered as a separate enumeration value, the new option could not be used in conjunction with any other value, including Manual.

However, having thought some more about it, I think it would be cleaner to add a second (optional) parameter to the Transactional attribute for this purpose, instead of adding a technology-specific value. I suspect your brain had already reinterpreted my suggestion along these lines, which is where your point about invalid combinations came from. A few technology-agnostic values for the new parameter's enumeration would suffice:

- Default (allows a technology-specific default)
- Include (equivalent to TransactionScope's Required)
- Exclude (equivalent to TransactionScope's Suppress)
- CreateNew (equivalent to TransactionScope's RequiresNew)
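A purely illustrative sketch of how that might look (the enum name TransactionFlow and the two-argument attribute constructor are made up; nothing like this exists in the framework today):

```csharp
// Purely illustrative; neither this enum nor a two-argument Transactional
// constructor exists in CSLA today.
public enum TransactionFlow
{
  Default,    // technology-specific default (e.g. Required for TransactionScope)
  Include,    // equivalent to TransactionScopeOption.Required
  Exclude,    // equivalent to TransactionScopeOption.Suppress
  CreateNew   // equivalent to TransactionScopeOption.RequiresNew
}

// Proposed usage: the data portal would map Exclude to
// TransactionScopeOption.Suppress when creating its TransactionScope, and
// throw for combinations it cannot honour (e.g. Manual plus Exclude).
[Transactional(TransactionalTypes.TransactionScope, TransactionFlow.Exclude)]
private void DataPortal_Fetch(SingleCriteria<OrderInfo, int> criteria)
{
  // ... plain read, with no transaction code in the BO ...
}
```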

I intended these options to be available ONLY to the framework and not to custom code. Use of a second parameter opens the can of worms you were suggesting, in that someone might try to use invalid/unsupported combinations. As far as I'm concerned, throwing an exception wherever a combination of values is not appropriate/supported would be fine. The exception would be raised every time the data access code ran, including during (manual/system) testing. Obviously some careful wording of the exception message would be required to avoid lots of forum questions!

A third (optional) parameter to support control of the isolation level would offer even finer control, but I can sense a man with a bucket of grease standing next to me at the top of that slope ...

I think what is attractive about adding the ability to control exclusion from a transaction is that it increases the number of cases where code can be left out of the BOs, and that is of course what you strive so hard to allow in the framework. The Manual option is always available where the support offered by the framework isn't advanced enough.

As it is, if we were to use TransactionScope, I'd want to add code to almost EVERY DataPortal_Fetch to suppress its involvement in any transaction that might be ongoing, thereby allowing for the situation where I want to use a read-only object during checks/calculations that take place in an insert/update data portal operation. That makes nearly all of my BOs contain more code. That code would be better placed in the framework, if it is simple to do (and I think it can be).


tiago replied on Sunday, June 13, 2010

Hi Rocky

RockyRocks

I think in general Fetch operations should not be transactional. In the case that they might be performed during another operation that is transactional, should it not be possible to easily exclude them like this?

Therefore I suggest a new TransactionalTypes enumeration value of TransactionScopeSuppress.

I look forward to your thoughts.

Your question didn't get much feedback.

DataPortal_Fetch is normally used to display stuff on the screen, or on the printer. The point is the same, although the printer might be a bigger source of problems. Suppose I'm building my report by fetching a big collection of read-only info objects and then fetching the details of each item. I know this isn't a good technique, as it should be done in a single query so objects don't get changed or deleted in between.

Most of the time we don't need transactions on fetches. But sometimes we might.

RockyRocks replied on Tuesday, June 15, 2010

Hi Tiago,

Thanks for your input. I agree that the developer needs to have control over whether any read is included in a transaction. I wasn't suggesting that the decision is taken out of the developer's hands, but that the developer gets an additional option. As you'll see from my (very belated, after much thought) response to Rocky, I'm simply suggesting that the number of cases where the framework offers all the support needed could be increased, without intending to add complexity - keep it simple is always my motto.

In my case I'd want to avoid running into the problem I just had by excluding all fetches from transactions by default; unless I found in specific instances that I needed transactional fetches, defaulting to excluded would be better for me. As we use static/one-off code generation, I would generate the code to include the additional parameter I'm describing, and could remove it or change its value after generation in the cases that fall outside the norm.

I could of course generate all the code needed to manually suppress transactions during fetches, using the Manual value of the TransactionalTypes enumeration. But the fact that I could does not mean I should! That's copy and paste all over again, by a different name. I think that striving to minimise the amount of code in our custom classes is a very good thing; at some point any custom code will need to be maintained/changed as technology changes (and this is most true in the area of data access). Hence my suggestion that a tiny bit more code in the framework - once - would save lots of code in the custom classes lots of times.

Copyright (c) Marimer LLC