Suggested transactions enhancement for CSLA.NET

Old forum URL: forums.lhotka.net/forums/t/449.aspx


hurcane posted on Thursday, June 22, 2006

I have a suggestion for an enhancement to how TransactionScope transactions are used. Before I provide the solution, let me explain what drove me to make these changes.

I have been writing integration tests with NUnit. All my work had been done with a local data portal and database. Our customers are going to have a remote data portal and database, so I decided I should set up that configuration and make sure our tests all run. Our customers are already using VB6-based components through COM+, so I decided to set up the data portal using the EnterpriseServicesProxy and EnterpriseServicesHost.

When I ran my tests, some of them worked, but most of them failed. :( It turned out there were three general reasons my tests were failing: bad test code, bad business code, and database transaction issues.

The bad test code was usually due to using old references after a Save. This works with a local data portal, but not a remote one. The bad business code was usually caused by setting shared data in the DataPortal_Xxx methods and then trying to use it on the client. These were relatively easy to fix.
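
As an illustration of that second failure mode, here is a minimal sketch (the Shared field and the fetch body are hypothetical):

' Anti-pattern: with a remote data portal, DataPortal_Fetch runs in the
' server process, so this Shared field is set on the server and the
' client's copy of it never changes.
Private Shared mLastFetched As Date

Private Overloads Sub DataPortal_Fetch(ByVal criteria As Criteria)
    mLastFetched = Date.Now ' visible only on the server
    ' ... load the object's fields from the database ...
End Sub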

However, a lot of my tests were failing with the same error: a distributed transaction timed out. This turned out to be a consequence of how I had configured my testing environment. I want my tests to have no residual effect on the database data, so I have a test fixture in NUnit that looks like this:

Imports System.Transactions
Imports NUnit.Framework

Public MustInherit Class TransactionFixture
    Private tx As CommittableTransaction

    <SetUp()> _
    Public Sub Setup()
        tx = New CommittableTransaction(New TimeSpan(0, 5, 0))

        ' Make the transaction the ambient transaction.
        ' ADO.NET and COM+ will automatically participate with this ambient transaction.
        Transaction.Current = tx

        ' Log in to gain access to the database.
        BusinessPrincipal.Login( _
            My.Settings.Item("TestUserID").ToString, _
            My.Settings.Item("TestPW").ToString)
    End Sub

    <TearDown()> _
    Public Sub TearDown()
        tx.Rollback()
        Transaction.Current = Nothing
    End Sub
End Class

My unit tests inherit from this transaction fixture. A typical test follows these general steps: insert the test data directly with ADO.NET statements, exercise the business objects against that data, and verify the results.

Because the transaction is rolled back at the end of each test, all data manipulation, whether through the test or through the business objects, is rolled back, leaving a consistent database state between tests.

These are the tests that timed out when I changed to the remote data portal. The insert data statement runs in a transaction (serializable by default) which is active throughout the test. The problem was that the business object would attempt to read the data that had just been inserted, but the business object was not participating in the transaction, so it couldn't read the data.

I needed a way for the server side to participate in the same transaction as the client. In theory, Transaction objects can escalate to distributed transactions, including transactions that span machines. How do I share the transaction?

My solution is to pass the "ambient" transaction through the data portal. Since the framework already passes culture information automatically, couldn't it also pass the transaction? The transaction is serializable, so this was easy to do with a few lines of code in the Server.DataPortalContext and Server.DataPortal.

In Server.DataPortalContext, I capture the client's ambient transaction so it is serialized to the server along with the rest of the context.
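
A minimal sketch of that change (the existing members of DataPortalContext are omitted; AmbientTransaction is the property name used in SetContext below):

' Added to Server.DataPortalContext. The context is constructed on the
' client, so this field initializer captures the client's ambient
' transaction; System.Transactions.Transaction is serializable, so it
' travels to the server with the rest of the context.
Private mAmbientTransaction As Transactions.Transaction = _
    Transactions.Transaction.Current

Public ReadOnly Property AmbientTransaction() As Transactions.Transaction
    Get
        Return mAmbientTransaction
    End Get
End Property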

In Server.DataPortal, I added the following code in SetContext:
      ' Use the transaction from the client, if one is in effect.
      If Not IsNothing(context.AmbientTransaction) Then
        Transactions.Transaction.Current = context.AmbientTransaction
      End If

I don't automatically set the current transaction on the server because there might already be an ambient transaction in effect on the server's thread. I wouldn't want to reset it to nothing if the client didn't have an active transaction. However, if the client has an active transaction, that trumps all other transactions.

Just a few lines of code, and my tests are working through the remote data portal. It was such an easy change, and I'm not sure that it has any negative impact. Perhaps somebody has already tried something like this and can alert me to any potential issues.

Could this be a candidate for inclusion in a future update of the framework?

ajj3085 replied on Thursday, June 22, 2006

Since this is a testing issue, and not actually an issue with the Csla framework itself, I don't think this would be a good enhancement.

Instead of running your tests in a transaction so that you don't leave the DB in an intermediate state, just write a cleanup stored procedure that clears all the tables and inserts any data that must always be present.

This will have the other advantage of being faster than using distributed transactions.

As a side note, I wouldn't go the route of continuing to use Enterprise Services just because your old VB6 components do; this will also have a negative performance impact. I'd stick to either manual transactions or TransactionScope if your database is SQL Server 2005.

HTH
Andy

hurcane replied on Thursday, June 22, 2006

ajj3085:
Since this is a testing issue, and not actually an issue with the Csla framework itself, I don't think this would be a good enhancement.


I think this can be applied in more than testing scenarios. I think integration projects could be helped by this enhancement. Suppose I have CSLA objects that are utilized by other existing systems in an enterprise. One of the requirements might be that the CSLA object has to participate in a distributed transaction.

This can be solved by passing the transaction explicitly within the business object code instead of the framework. However, wouldn't it be nicer if the framework handled it automatically?

Enhancements generally need to be evaluated against two major criteria. Does it break existing code? Does it affect performance? For the change I have made, the answer to both is no.

ajj3085:
Instead of running your tests in a transaction so that you don't leave the DB in an intermediate state, just write a cleanup stored procedure that clears all the tables and inserts any data that must always be present.


I considered this. Many of our tests generate a lot of data. Cleanup scripts are not a trivial task. If I could be convinced that the performance improvement is enough to justify the extra time spent writing cleanup code, I would do it that way.

ajj3085:
As a side note, I wouldn't go the route of continuing to use Enterprise Services just because your old VB6 components do; this will also have a negative performance impact. I'd stick to either manual transactions or TransactionScope if your database is SQL Server 2005.


Well, my CSLA objects must integrate with the VB6-based objects. It's part of a long-term migration project. The .NET-based objects have to integrate with the COM-based objects, and vice versa. We also have to support SQL 2000, as 95% of our customers are using that version and we can't force them to upgrade yet. :(

I wish that I could be working with a clean slate, instead of with legacy integration requirements. I would be doing a lot of things very differently!

Brian Criswell replied on Thursday, June 22, 2006

We have set up all of our CSLA 2.0 tests to run through a single command object. The command object's data access method is marked with [Transactional(TransactionScope)] (or whatever it is; I don't have access to the code right now). The command object also takes a method as its criteria. Inside the data access method, the method is invoked and then a RollbackDtcException is thrown. A test is made up of two methods:

[Test]
public void TestMethod()
{
    // Run the test through the command object
    TestCommand.RunTest(TestMethodCore);
}

private void TestMethodCore()
{
    // Add, update, fetch and delete objects here
}

I do not know if this would work for your situation, but it has worked well for us. All our objects manipulate their data within a single transaction.

hurcane replied on Thursday, June 22, 2006

Brian Criswell:
We have set up all of our CSLA 2.0 tests to run through a single command object.


This is a very interesting idea. I will be exploring this a little bit. It does mean that I'd have to rewrite all our data-access tests (over 4000 of them), but that project could be assigned to a summer intern. ;)

rasupit replied on Thursday, June 22, 2006

Brian,

This is very interesting; would you mind sharing how you do this? Does it mean your test classes live in the same project as your library classes, or do you have an NUnit project that references your library project?

Some code would be very helpful. Thanks.

Ricky

Brian Criswell replied on Thursday, June 22, 2006

The NUnit tests are in their own library which references the business object library.

I would have to get permission to post code, but the command object looks something like this (doing this from memory now, so don't blame me if it does not compile ;) ):

internal class TestCommand : CommandBase
{
    private TestCommand()
    {}

    // Define RollbackDtcException

    // Define delegate

    // Define criteria

    internal static void RunTest(TestHandler testMethod)
    {
       try
       {
          DataPortal.Execute(new Criteria(testMethod));
       }
       catch (RollbackDtcException)
       {}
    }

    [Transactional(TransactionalTypes.TransactionScope)]
    private void DataPortal_Execute(Criteria criteria)
    {
       criteria.TestMethod.Invoke();
       throw new RollbackDtcException();
    }
}

rasupit replied on Friday, June 23, 2006

Brian,

Thanks for the insight. I came up with something like the following:

using System;
using System.Collections.Generic;
using System.Text;
using System.Transactions;
using Csla;
using Csla.Data;

namespace CslaTest
{
    [Serializable] // required for the command to cross a remote data portal
    internal class TestCommand : Csla.CommandBase 
    {
        private TestCommand(TestHandler testMethod) 
        {
            TestMethod = testMethod;
        }
        public delegate void TestHandler();
        private TestHandler TestMethod;

        public static void RunTest(TestHandler method)
        {
            try
            {
                DataPortal.Execute(new TestCommand(method));
            }
            catch (Csla.DataPortalException ex)
            { 
                if (!(ex.BusinessException is RollbackDtcException)) 
                    throw; 
            }
        }
        [Transactional(TransactionalTypes.TransactionScope)]
        protected override void DataPortal_Execute()
        {
            TestMethod.Invoke();
            throw new RollbackDtcException();
        }

        [Serializable] // the exception must also serialize back across remoting
        private class RollbackDtcException : ApplicationException { }
    }
}

I didn't get a chance to test this using remoting, but it works without remoting using the following test:

using System;
using System.Collections.Generic;
using System.Text;
using NUnit.Framework;
using CslaTest.CslaObjects;

namespace CslaTest
{
    [TestFixture]
    public class SingleTestReadOnly
    {
        [Test]
        public void TestReadOnlyList()
        {
            TestCommand.RunTest(new TestCommand.TestHandler(TestReadOnlyListCore));
        }
        private void TestReadOnlyListCore()
        {
            ReadOnlyRootList list = ReadOnlyRootList.GetReadOnlyRootList();
            foreach (ReadOnlyChild item in list)
            {
                Console.WriteLine(string.Format("{0} {1}", item.SupplierID, item.ContactName));
            }
        }

        [Test]
        public void TestReadOnlyRoot()
        {
            TestCommand.RunTest(TestReadOnlyRootCore);
        }
        private void TestReadOnlyRootCore()
        {
            ReadOnlyRoot root = ReadOnlyRoot.GetReadOnlyRoot(1);
            Console.WriteLine("Supplier={0} {1}:{2}", root.SupplierID, root.CompanyName, root.ContactName);
            foreach (ReadOnlyChildItem item in root.ReadOnlyChildList)
            {
                Console.WriteLine("- Product={0} {1}", item.ProductID, item.ProductName);
            }
        }

    }
}

Let me know if this is what you described.

Once again thanks for sharing.

Ricky.

Brian Criswell replied on Monday, June 26, 2006

That is pretty much it.  Did you ever get a chance to try it with remoting?  I have never gotten around to setting up remoting.

rasupit replied on Monday, June 26, 2006

Brian,

Thanks for confirming. I haven't had a chance to try this with remoting, but I think it would not work. Thinking about it, we are really passing a pointer to a test method that resides locally.

Ricky

Brian Criswell replied on Tuesday, June 27, 2006

Okay, what if we used a MethodInfo object or passed a string and used CallByName instead of the delegate?
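
A hedged VB sketch of that idea (ReflectionTestCommand and its members are hypothetical names, the rollback-exception plumbing from the earlier examples is omitted, and the test assembly would still have to be deployed where the server can load it):

Imports System
Imports Csla

<Serializable()> _
Friend Class ReflectionTestCommand
    Inherits CommandBase

    ' Strings are serializable, so unlike a delegate they can cross a
    ' remoting boundary inside the command object.
    Private mTypeName As String
    Private mMethodName As String

    Private Sub New(ByVal typeName As String, ByVal methodName As String)
        mTypeName = typeName
        mMethodName = methodName
    End Sub

    Friend Shared Sub RunTest(ByVal typeName As String, ByVal methodName As String)
        DataPortal.Execute(New ReflectionTestCommand(typeName, methodName))
    End Sub

    <Transactional(TransactionalTypes.TransactionScope)> _
    Protected Overrides Sub DataPortal_Execute()
        ' Re-create the fixture on the server and invoke the named test
        ' method late-bound; CallByName comes from Microsoft.VisualBasic.
        ' mTypeName must be assembly-qualified for Type.GetType to find it.
        Dim fixture As Object = _
            Activator.CreateInstance(Type.GetType(mTypeName, True))
        CallByName(fixture, mMethodName, CallType.Method)
    End Sub
End Class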

RockfordLhotka replied on Monday, June 26, 2006

Yes, the problem with this entire discussion is that it assumes a 2-tier physical model. It isn't realistic to start a transaction outside the data portal in a 3-tier physical scenario.

My suggestion, if you want to have repeatable, testable, transactional data access, is to put the data access into a formal data access layer. You can then write unit tests for the DAL to test it. You could then also write a mock DAL if that's your thing, and use that to test the objects themselves.

hurcane replied on Wednesday, June 28, 2006

RockfordLhotka:

Yes, the problem with this entire discussion is that it assumes a 2-tier physical model. It isn't realistic to start a transaction outside the data portal in a 3-tier physical scenario.



If by a 3-tier physical scenario you mean having three machines, but still using ADO.NET within the DataPortal_Xxx methods, I have tested this scenario and it works just fine. ADO.NET code automatically enlists in the thread's current transaction on the application server, if one has been set. When you use an implicit TransactionScope, it automatically sets the thread's current transaction.
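
For example, in a DataPortal_Xxx method on the server (a minimal sketch; the connection string name is hypothetical):

' Because SetContext has already made the client's transaction the
' ambient transaction, this connection auto-enlists in it on Open and
' can therefore see the uncommitted rows the test inserted.
Using cn As New SqlClient.SqlConnection(My.Settings.Item("MainDb").ToString)
    cn.Open() ' enlists in Transactions.Transaction.Current, if set
    ' ... execute commands as usual ...
End Using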

I'm afraid that the unit testing scenario is clouding the original intent of my post. What negatives are there to including the thread's current transaction in the data portal context and setting it on the server thread? The serialized data is not significantly increased. If the client has no transaction on the thread, there is no performance penalty. You won't be creating a distributed transaction unless the client passes the transaction. The client shouldn't be starting a transaction unless there is activity that needs to be synchronized.

Not passing the transaction can also break CSLA-based apps when they are reconfigured from a local data portal to a remote data portal. Suppose I have an object graph that involves data from two databases. The root object uses a local database, but a command object uses a remote database for logging activity during the update of the root object. The root object has the RunLocal attribute applied, but the command object does not. Both objects have the Transactional attribute set to use System.Transactions.

With a local portal, everything appears to be working great. If you generate a database error on the client or the server, neither database is updated.

When you switch to the remote data portal, you start getting reports that a bug in the code is causing an error when saving. The object is not updated in the local database, but the command object appeared to successfully log the activity.

With the local data portal, the command object was running on the same thread as the root object, and the ADO.NET code automatically enlisted with the transaction that was initiated by the root object's data portal. When the logging data portal was configured to be remote, the transaction was not carried through. The command object successfully updated the database because it was unaware of the transaction on the client.

In CSLA.NET 1.x, the common practice was to explicitly pass a SqlTransaction to synchronize database activity across objects. With TransactionScope, the ProjectTracker sample doesn't include any transactions in parameters. It relies on TransactionScope and ADO.NET magically working together. My suggestion extends this "magic" through the data portal (and across threads), if needed.
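
For contrast, the 1.x-style pattern looked roughly like this (a hedged sketch; the LogActivity helper and table are hypothetical):

' CSLA 1.x style: the caller owns the SqlTransaction and passes it
' explicitly so that every object's commands share one transaction.
Friend Shared Sub LogActivity(ByVal tr As SqlClient.SqlTransaction, _
                              ByVal message As String)
    Dim cm As SqlClient.SqlCommand = tr.Connection.CreateCommand()
    cm.Transaction = tr
    cm.CommandText = "INSERT INTO ActivityLog (Message) VALUES (@message)"
    cm.Parameters.AddWithValue("@message", message)
    cm.ExecuteNonQuery()
End Sub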

I could resolve the issue described above by explicitly including the transaction as a parameter in the factory method of the logging command object. Does it not make sense to bake it in, and keep the technical details hidden as much as possible from the business developer?

Copyright (c) Marimer LLC