CSLA 3.7, Polymorphism and DataPortal.Fetch

Old forum URL: forums.lhotka.net/forums/t/8905.aspx


danderson00 posted on Monday, May 10, 2010

Hi,

I was hoping someone may be able to assist with a particular problem I am experiencing with CSLA version 3.7 and inheritance.

As a contrived example, let's say we have an abstract "Contact" class that has two derived classes - TelephoneContact and EmailContact. I want to be able to load a Contact object by its primary key without knowing what the derived type is. It is possible to determine the correct type from data loaded from the database, but I'm not sure how to achieve this with CSLA.

The problem is that the DataPortal_Fetch method that one would usually override to implement the data access is an instance method - i.e., an instance of the class must already have been created before it can execute, and we don't know exactly what type of object to create until we've loaded data from the database. Chicken or egg?

It's possible to do this with collections - override the DataPortal_Fetch method on a custom collection class, load the data, determine the correct type for each instance, instantiate it and populate the data. I'm just not sure how to do this with individual instances. Is there some static method I can implement on a class that CSLA will "magically" figure out how to call and return the appropriate type from? A different solution altogether? Is this sort of thing even possible with CSLA?
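To make the scenario concrete, here is a minimal sketch of the kind of classes being described (the abstract Contact base from the question is omitted for brevity, and member names, criteria types and the missing plumbing are illustrative only):

```csharp
using System;
using Csla;

[Serializable]
public class EmailContact : BusinessBase<EmailContact>
{
    // DataPortal_Fetch is an instance method: by the time it runs, the data
    // portal has already instantiated EmailContact, so nothing read here can
    // change the concrete type that is handed back to the caller.
    private void DataPortal_Fetch(SingleCriteria<EmailContact, int> criteria)
    {
        // load fields for the given primary key...
    }
}

// TelephoneContact would look the same. The open question in this thread is
// how to ask for "the contact with id 42" when the caller does not know
// which of the two concrete types it will turn out to be.
[Serializable]
public class TelephoneContact : BusinessBase<TelephoneContact> { /* ... */ }
```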

It's worth noting that this is also a well-known problem with many ORM frameworks, but there are invariably solutions.

 

Thanks in advance!

Dale Anderson 

JonnyBee replied on Monday, May 10, 2010

Hi Dale,

This could easily be accomplished by using the ObjectFactory pattern and moving the data access into separate object factories.

The object factory is responsible both for creating the instance and for the data access - loading the data and moving it into the business object.

It would be hard to accomplish with the DataPortal_XYZ methods, as these are instance methods.
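A rough sketch of that approach, assuming the object factory support that shipped in CSLA 3.6+ (the type, method and DAL names here are purely illustrative):

```csharp
using System;
using Csla;
using Csla.Server;

// On the business class that callers pass to DataPortal.Fetch, something like
//   [Csla.Server.ObjectFactory("MyApp.ContactFactory, MyApp")]
//   public abstract class Contact : ...
// routes persistence calls to ContactFactory instead of DataPortal_XYZ methods.

public class ContactFactory : ObjectFactory
{
    // Invoked by the data portal for DataPortal.Fetch<Contact>(criteria).
    // Because the factory (not the data portal) creates the instance, it can
    // look at the data first and then decide which concrete type to build.
    public object Fetch(SingleCriteria<Contact, int> criteria)
    {
        var row = ContactData.GetById(criteria.Value); // hypothetical DAL call

        Contact result;
        if (row.ContactType == "Email")
            result = new EmailContact();
        else
            result = new TelephoneContact();

        // Copy row values into the object (via LoadProperty, an internal Load
        // method, or similar), then mark it as an existing object.
        MarkOld(result);
        return result;
    }
}
```

The caller still writes an ordinary DataPortal.Fetch<Contact>(...) call and simply gets back whichever concrete type the factory chose.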

 

Curelom replied on Monday, May 10, 2010

I'm not sure how the business list could handle this - I don't think it currently has the capability to hold more than one object type. If you could subclass the list so that it could, I think you would handle loading the different object types in the list's DataPortal_Fetch: read each record from the database, then determine from the loaded data which type of object to instantiate and populate.

xAvailx replied on Monday, May 10, 2010

You could create a factory class that handles this for you.

e.g.

IContact aContact = ContactFactory.Create(id);

ContactFactory.Create delegates creation to the appropriate class. Inside the factory, it can query the database to know what type to create. 
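A hedged sketch of such a factory, assuming a discriminator column can be queried cheaply and that both concrete classes implement a common IContact interface (all names here are illustrative):

```csharp
using System;
using Csla;

public static class ContactFactory
{
    // Decide the concrete type first, then ask the data portal for it.
    public static IContact Create(int id)
    {
        // Hypothetical lightweight query that only returns the discriminator.
        string contactType = ContactData.GetContactType(id);

        switch (contactType)
        {
            case "Email":
                return DataPortal.Fetch<EmailContact>(
                    new SingleCriteria<EmailContact, int>(id));
            case "Telephone":
                return DataPortal.Fetch<TelephoneContact>(
                    new SingleCriteria<TelephoneContact, int>(id));
            default:
                throw new NotSupportedException("Unknown contact type: " + contactType);
        }
    }
}
```

The trade-off is an extra round trip to the data layer (one query for the discriminator, one for the real fetch), which is part of why the object factory approach above keeps the decision on the server side.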

 

RockfordLhotka replied on Monday, May 10, 2010

CSLA isn't an ORM, so it isn't likely it will have ORM-style solutions to the question.

Jonny is right - if you want to have the data layer decide which type of object to return, the object factory model is the right choice.

But if you want a polymorphic result, you need to remember that generics aren't polymorphic, and CSLA is designed so the base types will be generic with exactly one non-generic type as the "leaf node" of the inheritance chain. If you don't follow that pattern many things will start to fall apart.

BusinessListBase is constrained such that its child objects must implement IEditableObject. This means you can define your own interface for your child objects that derives from IEditableObject - thus allowing you to create a polymorphic collection. Normally this is used to create a heterogeneous collection - one that contains different types of child object at the same time. But you could also use it to create different homogeneous collections - where each collection is homogeneous, but different collections might have different child types (though all the children will implement your common interface, of course).
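A sketch of that idea. Note that the exact child constraint differs by CSLA version (IEditableObject versus Csla.Core.IEditableBusinessObject), so check the BusinessListBase<T, C> signature you are compiling against; the DAL helper and the child factory methods below are hypothetical:

```csharp
using System;
using Csla;
using Csla.Core;

// Common contract for all contact children; deriving it from the interface
// that BusinessListBase constrains its children to lets the list hold a mix
// of EmailContact and TelephoneContact instances.
public interface IContact : IEditableBusinessObject
{
    int Id { get; }
}

[Serializable]
public class ContactList : BusinessListBase<ContactList, IContact>
{
    private void DataPortal_Fetch(SingleCriteria<ContactList, int> criteria)
    {
        RaiseListChangedEvents = false;
        // Hypothetical DAL call returning rows with a discriminator column.
        foreach (var row in ContactData.GetForCustomer(criteria.Value))
        {
            // Hypothetical internal child factory methods on each concrete type.
            IContact child = row.ContactType == "Email"
                ? (IContact)EmailContact.GetEmailContact(row)
                : TelephoneContact.GetTelephoneContact(row);
            Add(child);
        }
        RaiseListChangedEvents = true;
    }
}
```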

danderson00 replied on Monday, May 10, 2010

It would seem that using the ObjectFactory attribute on the classes in question would solve the problem. Thanks for your prompt responses!

Not sure what you mean by "generics aren't polymorphic". I think what you mean is that generics are invariant. There is nothing to stop you from storing anything that derives from a class A in a List<A> and treating the items polymorphically, but I can't convert that List<A> to a List<DerivedFromA>, or even to a List<object> (and .NET 4.0's variance only helps for interfaces and delegates such as IEnumerable<T>). All of our business objects eventually derive from BusinessBase<T>, and we're using a single generic class that derives from BusinessListBase<T, C> rather than a custom collection class for each different business object.

It worries me somewhat that there is no interface wrapping these ObjectFactory classes, or the DataPortal_Fetch pattern in general. The signatures of these methods are somewhat arbitrary, and a developer consuming these factories has to know precisely which parameters are required to retrieve an object. There is no compile-time validation that the correct parameters are being passed, and no refactoring support. There is a lot of nasty reflection going on under the covers. Using criteria objects mitigates this slightly, but I still find the lack of compile-time checking uncomfortable.
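A sketch of the criteria-object mitigation mentioned here, combined with the conventional strongly typed static factory method that CSLA classes usually expose so only one place deals with the loosely typed data portal call (member names are illustrative):

```csharp
using System;
using Csla;

[Serializable]
public class EmailContact : BusinessBase<EmailContact>
{
    // Consumers call this instead of the data portal directly, so the
    // parameter list is checked by the compiler and participates in
    // refactoring; only this method knows about the criteria object.
    public static EmailContact GetEmailContact(int id)
    {
        return DataPortal.Fetch<EmailContact>(
            new SingleCriteria<EmailContact, int>(id));
    }

    private EmailContact()
    { /* force use of the factory method */ }

    private void DataPortal_Fetch(SingleCriteria<EmailContact, int> criteria)
    {
        // load fields for criteria.Value...
    }
}
```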

Thanks again.

Dale

RockfordLhotka replied on Monday, May 10, 2010

I'd be interested to hear any suggestions (other than IL-munging with a post-compiler, because that's nastier than reflection) on how to have a client call a method, and to have the method invocation be on the app server, potentially two network hops away.

A lot has changed since I started down this road in 1996 with COM, and even since 2000 when I wrote the initial .NET data portal. While reflection is used for discovery, it hasn't been used for invocation since 2005, and now in CSLA 4 the invocation is handled using lambda expression trees.

But ultimately the "method call" needs to be intercepted by CSLA so the call context can be serialized across the network to the physical location where it becomes an actual method call.

It is an interesting problem space, and one that can be solved by writing your own compiler, altering the IL post-compilation, making all method calls cross a context boundary, or explicitly using Remoting. All of those solutions are at least as bad as, if not worse than, using reflection and expression trees, so I've kept on the same basic track, slowly improving and enhancing the basic concept over time (preserving backward compatibility as much as possible).
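For readers unfamiliar with the distinction, the two invocation styles look roughly like this outside of CSLA - a generic illustration of reflection versus compiled expression trees, not CSLA's actual implementation:

```csharp
using System;
using System.Linq.Expressions;
using System.Reflection;

public static class InvokerDemo
{
    private class Widget
    {
        private void DataPortal_Fetch(int id)
        {
            Console.WriteLine("Fetching " + id);
        }
    }

    public static void Main()
    {
        var target = new Widget();
        MethodInfo method = typeof(Widget).GetMethod(
            "DataPortal_Fetch", BindingFlags.Instance | BindingFlags.NonPublic);

        // Reflection-based invocation: simple, but pays the reflection cost
        // on every call.
        method.Invoke(target, new object[] { 42 });

        // Expression-tree-based invocation: build and compile the call once,
        // then reuse the resulting delegate for subsequent calls.
        var objParam = Expression.Parameter(typeof(object), "obj");
        var argParam = Expression.Parameter(typeof(object), "arg");
        var call = Expression.Call(
            Expression.Convert(objParam, typeof(Widget)),
            method,
            Expression.Convert(argParam, typeof(int)));
        Action<object, object> invoker =
            Expression.Lambda<Action<object, object>>(call, objParam, argParam).Compile();

        invoker(target, 42);
    }
}
```

The reflection-based call is re-resolved on every invocation, while the compiled delegate can be cached per method and reused at near-direct-call speed, which is the trade-off being described here.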

CSLA 4 actually removes the "criteria object" restriction, allowing you to pass any single serializable value as criteria - which includes int, string, or more complex serializable types.
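Based on that description, the difference at the call site would look something like this (EmailContact and its DataPortal_Fetch overloads are assumed, not shown):

```csharp
using Csla;

public static class FetchExamples
{
    public static void Demo()
    {
        // CSLA 3.x style: the key is wrapped in a criteria object and matched
        // against a DataPortal_Fetch(SingleCriteria<...>) method.
        var pre4 = DataPortal.Fetch<EmailContact>(
            new SingleCriteria<EmailContact, int>(42));

        // CSLA 4 style, as described above: any single serializable value can
        // be passed directly and is matched against a DataPortal_Fetch(int id)
        // overload on the business class.
        var v4 = DataPortal.Fetch<EmailContact>(42);
    }
}
```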

danderson00 replied on Tuesday, May 11, 2010

I really need to look at what CSLA 4 provides before stating too many opinions and making a fool of myself, but it sounds like it may solve some of the issues I mentioned.

Many of the solutions that come to mind (like wrapping classes and factories in interfaces to cover various data access requirements) are somewhat "opinionated" and would remove some of the flexibility that CSLA provides, so they would be better built as a layer on top of CSLA for specific cases. I'm also a fan of the repository pattern (either with methods written on each repository to cover specific data access needs, or query objects (similar to criteria objects) that are passed to a generic repository), but I'm not sure how well this will fit with CSLA.

I'll have a look at CSLA 4 before I go any further.

RockfordLhotka replied on Tuesday, May 11, 2010

CSLA 4 does not substantially change the way the data portal works.

CSLA is not an ORM, nor does it pretend to be one. The "data portal" is actually poorly named, but after 14 years I'm not planning to change the name any time soon...

The "data portal" is actually an object portal. Its job is not unlike a REST service - you give it specific input (business object type and criteria) and it gives you back an object of that type. Of course the data portal predates REST by quite a few years too, so it isn't truly RESTful...

The repository pattern is a great pattern for providing indirection for things like accessing a DAL. Create numerous DALs and select the one you want.

The repository pattern is less useful for indirecting the selection of your domain model, because there's less need to do this. If your object model is actually behavioral and single-responsibility, then it is somewhat difficult to see where you'd be swapping out different models within your use case.

The biggest hang-up people face when coming to CSLA is that they view objects as data containers, not as autonomous behavioral constructs. I don't know if that's what's driving your train of thought, but it is pretty common.

Certainly DAL-level objects (entity objects, DTOs and the like) are usually data-centric. And most of the common "OO" design patterns apply directly to that type of object. But hopefully your actual domain objects (the ones you create with CSLA) directly match the behavioral and responsibility requirements of your use case or user story or usage scenario (whatever term you'd like to use). The fact that these objects have data is secondary to the reality that they fulfil a specific and active role in the usage scenario.

This means that some concepts, like mocking the business objects, are almost entirely meaningless. That'd be like creating a "unit test" for a job interview and coming up with a mock applicant. The test might pass, but it would be meaningless, because a real applicant would never be as simplistic as your mock applicant.

The same is true with these business objects. By the time you've mocked the object to the level where it is meaningful, you'll have recreated the actual object.

Please note that I'm not saying you shouldn't test the objects. Nor am I saying mocking is bad, or even out of the question at some levels - but to really mock a business object your mock will almost certainly need to be a CSLA business object, otherwise you won't get the appropriate rule/eventing/notification/persistence behaviors required to do any meaningful test.

So you are better off always using the actual objects, and mocking the DAL. Mocking the DAL is easily done in several ways, and I talk about how to do this in the Core 3.8 video series. With mock data, your business objects become entirely predictable and testable, without the need to incur the huge expense of mocking the business objects themselves.
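One common shape for a mockable DAL is an interface plus a swappable provider; the names below are illustrative and not taken from the Core 3.8 videos:

```csharp
using System;

// Abstraction that DataPortal_XYZ methods (or object factories) talk to.
public interface IContactDal
{
    ContactDto GetContact(int id);
}

// Plain data transfer object returned by the DAL.
public class ContactDto
{
    public int Id { get; set; }
    public string ContactType { get; set; }
    public string Value { get; set; }
}

// Test double that returns predictable data, so business-object tests are
// deterministic and fast; a production implementation would hit the database.
public class MockContactDal : IContactDal
{
    public ContactDto GetContact(int id)
    {
        return new ContactDto { Id = id, ContactType = "Email", Value = "test@example.com" };
    }
}

// Minimal provider; real code would typically read the concrete DAL type
// name from configuration rather than having tests set it directly.
public static class DalFactory
{
    private static IContactDal _contactDal = new MockContactDal();

    public static IContactDal ContactDal
    {
        get { return _contactDal; }
        set { _contactDal = value; }
    }
}
```

A DataPortal_Fetch (or object factory) then asks DalFactory.ContactDal for its data, and a unit test can swap in whatever IContactDal implementation it needs without touching the business object itself.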

Copyright (c) Marimer LLC