Is there any general advice on using CSLA with some ORM tool?

Old forum URL: forums.lhotka.net/forums/t/10456.aspx


buaaytt posted on Tuesday, June 21, 2011

I understand that CSLA is a framework that helps write business objects with logic and behaviors, while ORM tools like EF, NH, abstract the layer of data retrieval and persistence.

That sounds to me like I should have 2 sets of objects defined for the 2 purposes respectively: DTO and BO. Surely I can have all the data properties as well as business logic code inside the same class definition, but would it be better to have 2 classes?

And if the answer is yes, is it convenient to switch from one to the other? For example, I guess I need to bind UI controls to DTO objects, but when users want to save a new object, I have to call the BO's Save method. My question is: what is the best way to handle these kinds of scenarios, where the difference between DTO and BO should be transparent?

Thanks for any reply.

Nico

RockfordLhotka replied on Tuesday, June 21, 2011

For my part, I view all ORM tools as purely a data access technology or concept. Therefore, their use should be restricted to the Data Access layer, and none of their artifacts should contaminate the Business layer, much less the Interface Control or Interface layers.

I use EF quite a lot, and it creates that "second set of classes" when I aim the VS designer at my database. I don't think of those as "real" classes though - they are just a code-generated artifact of the EF designer. A low-level tool used to persist data within my DAL implementation.

To me this is important. I assume EF will be like every other Microsoft data access technology over the past 20 years. They'll use it for a while, then the PM who owns it will move on, and the next PM who gets control of the data access stack will invent something completely new. The most you can hope for is maybe 5-7 years (that's what we got out of the DataSet, and the ADO Recordset before that). Most data access technologies only last 2-3 years - with the occasional longer run...

I also assume my business logic will be meaningful for a very long time. Most businesses don't change that fast. I've worked for a lot of different companies over the years, and I've never seen their core business processes or concepts change radically. Lots of changes on the periphery (business rules, etc), but the big stuff is unchanging over a very long period of time.

Heck, the company I worked for at the start of my career still does the same thing now as then - so no business change over a 22 year period. Sure, the technology changed out from under them twice (at least) over that time, but that just meant rewriting the same business process automation over and over...

What I'm getting at, is that CSLA is intended to help you create an encapsulated layer for your business logic, so you can preserve that investment for as long as possible. The less you break the architecture, and the less you contaminate the business layer with DAL or UI artifacts, the longer your investment will last. You must assume that the DAL and UI technologies will change out from under you at least a few times during the lifetime of your app.

I discuss some specific approaches around the DAL implementation, including the use of EF, in the Using CSLA 4: Data Access ebook.

buaaytt replied on Tuesday, June 21, 2011

Hi Rock,

Thanks for your reply and the illuminating points. But I'm afraid I might not have expressed my question clearly enough.

First, I'm not saying EF is perfect, and I don't mean EF specifically when I say ORM tools.

I also agree with you completely on the importance of not letting the DAL get mixed up with either the business logic or the presentation layer.

Now my question is: if I use CSLA to implement my business logic, is it better to introduce separate DTO classes for data retrieval and persistence purposes only, rather than having data properties mingled with the business logic implementation? Those DTO classes would also be the ones that work with the ORM tools (EF, NH, etc.).

A related concern: if I have defined data properties in separate DTO classes, is it possible to completely remove the duplicate property definitions in the BO? The BO could be injected with the DTO classes.

RockfordLhotka replied on Tuesday, June 21, 2011

You do need to decide on the interface you will expose from your DAL. There are numerous options, including a DataSet, IDataReader, and DTOs.

In the Data Access ebook I demonstrate how to create an IDataReader and a DTO interface to the DAL.

You can't avoid defining properties in the business classes. CSLA has a specific structure for property definitions that you must follow to leverage the business rules engine and other CSLA features.
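That structure is the managed property pattern. A minimal sketch in the standard CSLA 4 style (the `Customer` class and `Name` property here are illustrative, not taken from the ebook):

```csharp
using System;
using Csla;

[Serializable]
public class Customer : BusinessBase<Customer>
{
    // Static metadata field: this is what lets CSLA manage the value,
    // run business rules, and serialize the object without reflection.
    public static readonly PropertyInfo<string> NameProperty =
        RegisterProperty<string>(c => c.Name);

    public string Name
    {
        get { return GetProperty(NameProperty); }
        set { SetProperty(NameProperty, value); } // runs rules, tracks changes
    }
}
```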

You might consider using your DTOs as "internal storage", where each BO property delegates storage to the DTO. This works as long as the DTO types are serializable and you don't want to support Silverlight or WP. This is because CSLA uses a serializer in SL/WP that adds some constraints on complex types, and it would be a little challenging to get your DTO types to work with the MobileFormatter. Not impossible - but they won't be POCO types at that point, because they must either implement IMobileObject or subclass MobileObject.
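A rough sketch of that "internal storage" idea, assuming a hypothetical `CustomerDto` and the CSLA overloads that accept a private backing value. Since the value bypasses the managed field manager, the setter has to notify CSLA itself:

```csharp
using System;
using Csla;

[Serializable]
public class Customer : BusinessBase<Customer>
{
    // Hypothetical DTO used as backing storage; it must be serializable,
    // and for SL/WP it would also have to work with the MobileFormatter.
    private CustomerDto _data = new CustomerDto();

    public static readonly PropertyInfo<string> NameProperty =
        RegisterProperty<string>(c => c.Name);

    public string Name
    {
        get { return GetProperty(NameProperty, _data.Name); }
        set
        {
            if (!Equals(_data.Name, value))
            {
                _data.Name = value;
                PropertyHasChanged(NameProperty); // mark dirty, run rules
            }
        }
    }
}

[Serializable]
public class CustomerDto
{
    public string Name { get; set; }
}
```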

My approach is to use DTOs as a "service contract" to the DAL - so I just use them to ferry data between the business and data access layers. The reason for using DTOs is to provide pure decoupling. Using the DTOs within the business layer itself causes coupling, thereby defeating the reason for using a DTO (imo anyway).
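In that approach the DTO only ever appears inside the data portal methods. A sketch, where `DalFactory`, `ICustomerDal`, and `CustomerDto` are hypothetical stand-ins for whatever contract the DAL exposes:

```csharp
private void DataPortal_Fetch(int id)
{
    // DalFactory/ICustomerDal/CustomerDto are illustrative names
    var dal = DalFactory.GetDal<ICustomerDal>();
    CustomerDto dto = dal.Fetch(id);
    using (BypassPropertyChecks) // load values without triggering rules
    {
        Id = dto.Id;
        Name = dto.Name;
    }
    // the DTO goes out of scope here - it never leaves the DAL boundary
}
```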

If you want a more optimal performance solution, use an IDataReader interface to the DAL. This largely precludes the use of an ORM - but since I'm not a big fan of ORMs I don't necessarily count that as a negative :)
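For illustration, an IDataReader-based DAL method with plain ADO.NET might look like this (the table and column names are made up):

```csharp
using System.Data;
using System.Data.SqlClient;

public class CustomerDal
{
    private readonly string _connectionString; // supplied elsewhere

    public CustomerDal(string connectionString)
    {
        _connectionString = connectionString;
    }

    public IDataReader GetCustomer(int id)
    {
        var cn = new SqlConnection(_connectionString);
        cn.Open();
        var cm = cn.CreateCommand();
        cm.CommandText = "SELECT Id, Name FROM Customers WHERE Id = @id";
        cm.Parameters.AddWithValue("@id", id);
        // CloseConnection ties the connection's lifetime to the reader,
        // so the caller just disposes the reader when done
        return cm.ExecuteReader(CommandBehavior.CloseConnection);
    }
}
```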

buaaytt replied on Tuesday, June 21, 2011

Thanks Rock. Now I think I got what I wanted to know.

Yes, I reckoned defining properties in the BO is a necessity. Features like object state management and validations are implemented using reflection to find property values. CSLA can't just treat the values of a complex member property as the values of the business object itself and then handle things like change tracking based on the values of that complex type property.

Thanks for your hints on using DTOs as internal storage. Since duplicate property definitions are inevitable, I won't go that way, as it doesn't really give me any benefit and it somewhat violates the intention of DTOs, which is, as you pointed out, purely to ferry data between the business layer and the DAL.

Thanks again. Have a good night!

RockfordLhotka replied on Tuesday, June 21, 2011

buaaytt wrote:

Features like object state management and validations are implemented using reflection to find property values.

CSLA itself doesn't use reflection for this purpose. The PropertyInfo<T> field is used to avoid reflection (and to enable serialization to SL/WP).

But you are right, data binding in all UI technologies does use reflection (at least by default) to interact with the properties of the object.

buaaytt replied on Thursday, June 23, 2011

Thanks for the corrections. Cool

SNiedinger replied on Thursday, October 20, 2011

I agree 100% with your post! (Rocky's post on Wed, Jun 22 2011 2:44 AM)

I just want to add that serializable DTOs also become the message objects on a MOM, ESB, etc., or just an argument or result of a web service method call in an SOA environment.

Regarding using an ORM.

I don't see why one should use an ORM when you have a good SQL (CRUD) code generator (and the stored procs), plus CSLA with a DAL, of course! It is not difficult to write your own generator for your DAL classes with the SQL (ADO.NET). I find this much leaner and fail to understand why I should use an ORM. I also find that an ORM (EF) adds a huge level of complexity and learning curve to get it right. (It also feels a bit like losing control over your SQL.)

At the moment we only use EF in the DAL for updates because EF can generate the UPDATE statements to only update the fields that are different from what is on the database, though it does not deal with concurrency properly. (Last user wins).

The way I see concurrency working is:

The DTOs would also have to be smart enough to track which fields have changed (a boolean array, with a method to map indexes to properties?). The BO would initialise the DTO accordingly on Save. A stored proc could then generate the SQL dynamically for this... however, SQL Server does not have array types! (I have successfully created stored procedures that work like this in PostgreSQL.)
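One way to sketch that "smart DTO" in C# (all names are illustrative): a set of changed property names can stand in for the boolean array, and the DAL can build the UPDATE from it in code instead of passing an array to a stored proc:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch: a DTO that records which fields changed,
// so the DAL can issue an UPDATE touching only those columns.
[Serializable]
public class CustomerDto
{
    private readonly HashSet<string> _changed = new HashSet<string>();
    private string _name;
    private string _city;

    public int Id { get; set; } // key; not tracked

    public string Name
    {
        get { return _name; }
        set { _name = value; _changed.Add("Name"); }
    }

    public string City
    {
        get { return _city; }
        set { _city = value; _changed.Add("City"); }
    }

    public IEnumerable<string> ChangedFields
    {
        get { return _changed; }
    }

    // Builds a parameterized UPDATE for the changed columns only.
    public string BuildUpdateSql()
    {
        var sets = _changed.Select(f => f + " = @" + f);
        return "UPDATE Customers SET " + string.Join(", ", sets) +
               " WHERE Id = @Id";
    }
}
```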

Sorry for going a bit off track...

Siegfried


Curelom replied on Thursday, October 20, 2011

Concurrency does work with EF, though you have to do a little manual modifying of the EF file. On your version (timestamp) property, you set Concurrency Mode to Fixed and StoreGeneratedPattern to Computed. The designer doesn't set StoreGeneratedPattern in all the places it needs to be in the EF file, so you have to go in and make sure it is set in both the StorageModels and the ConceptualModels sections. There are a number of articles on the internet on how to do this. Microsoft considers this a "feature", though it is a feature nobody wants and they don't document it. Supposedly it will be fixed in the next version of EF, but they said that about the last version of EF as well.
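For illustration, the manually edited edmx fragments might look roughly like this (`RowVersion` is a made-up column name, and exact attribute forms vary between EF versions):

```xml
<!-- StorageModels (SSDL): this is the entry the designer misses -->
<Property Name="RowVersion" Type="timestamp" Nullable="false"
          StoreGeneratedPattern="Computed" />

<!-- ConceptualModels (CSDL) -->
<Property Name="RowVersion" Type="Binary" MaxLength="8" FixedLength="true"
          Nullable="false" ConcurrencyMode="Fixed"
          annotation:StoreGeneratedPattern="Computed" />
```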

Curelom replied on Thursday, October 20, 2011

Also, there is a code generator for CSLA and EF called CslaExtension.

Copyright (c) Marimer LLC