CSLA vs Datasets


Old forum URL: forums.lhotka.net/forums/t/479.aspx


tazzmann posted on Tuesday, June 27, 2006

Not sure if this goes here or is a question for Rocky, but we have a programmer here who thinks that with the new VS2005 and .NET 2.0 there is no longer a need to use CSLA, and that regular business objects with datasets will serve the same function using the .NET framework and technologies such as Windows Workflow, with a lot less coding. Can anyone list the disadvantages of moving off of CSLA? I think Rocky has done a wonderful job and that we should continue to use CSLA, but it seems that every argument I bring up he is able to shoot down with existing or soon-to-come technologies. We have a staff of about 5 programmers who are just learning .NET and three of us who are already very well versed in .NET. One of our .NET programmers thinks it will be too much to bring them up to speed so that they can compose business objects on the CSLA framework.

 

Thanks!

guyroch replied on Tuesday, June 27, 2006

Take a look at this http://www.lhotka.net/Articles.aspx?id=60e99288-0998-4c4b-a950-4e19bd6c0bbd

The classic case is simple.  If you go with datasets, your end result will be a data-centric application where one table will usually mean one or more forms in your UI.  If you go with business objects (like CSLA), then your use cases drive your business objects as they should, including all business rules and access points.  A dataset is nothing but a data container with very little room for business validation.

Did your developer read the book?  If not, then how can he make such a strong assessment?

Q Johnson replied on Tuesday, June 27, 2006

"Composing business objects on the CSLA framework" should really be an automated task at this point.  Perhaps getting one of those guys up to speed with a code generation tool (like CodeSmith or MyGeneration - my own favorite) would be worthwhile.  I'm pretty sure that with that many programmers, having one do code gen work will reduce the coding effort of the others by far more than enough to offset the fact that the one isn't spending full time "coding." 
 
Any one else care to comment on this estimate or have metrics to back it up?
 
[Sidebar: Frankly, I'm a bit jealous.  I code by myself or with a two- or three-man team at most. I think a staff of your size could actually devote a full-time person to implementation and maintenance of the harness that Kathleen Dollard developed for us all in her book "Code Generation in Microsoft .NET".  I frankly feel it is too much work for a shop of one or two folks building modest applications.  I think the products mentioned above provide somewhat less functionality but with a LOT less effort, and are a better fit for smaller shops.  But with eight folks, you can afford to go for the gold!!  Other opinions are welcome here, of course.]
 
Regarding the choice of CSLA vs. Datasets, read chapter one.  If you can afford to ship much larger software objects over your wire and give up the convenience of SmartDates and SafeDataReader, then maybe Datasets are acceptable (though with code gen I think you really get these benefits for free).  
 
Maybe you should give some thought to the human nature issues at this point instead of the technical ones and come up with a way for this guy to save face and let HIM discover why datasets are a poorer choice.  He sounds like somebody who really hates to be wrong.
 
Perhaps if it can look like his idea, you can all move on with getting work done in the most efficient manner.  With some folks you could do this by making some stupid claims like, "yeah, datasets are better because there's a lot less overhead in their object model, so there's less to marshal and serialize and put on the wire, right George?"  Then George is in the position of being able to look it up and look smarter than you by saying something like "Actually that isn't true.  I think there may be some benefits to datasets, but that isn't one of them.  CSLA is actually a much leaner object."  Then you make another dumb claim and he defends CSLA again.  Pretty soon, he's the genius saying that you should stick with CSLA.
 
This could backfire, of course, if he isn't willing to pick up the book.  So be sure of your strategy before you pick it.  Maybe something else will work better, like just recommending that he read chapter one (or at least page 23 - of the VB book at least).
 
Or maybe you can "agree with him" by saying "yeah, CSLA is really only for applications that have real complexity; it isn't needed here."  This will irk a lot of people who will feel as if you are trivializing their development projects.  Pretty soon he could be pointing out all the complexities of his app that make CSLA such a great fit.  Again - pick the approach you think will be most likely to be successful, but be careful how you "agree" with him.
 
[My apologies to all our forum members named George <g>] 

Michael Hildner replied on Wednesday, June 28, 2006

>> code generation tool (like CodeSmith or MyGeneration - my own favorite)

Are there any good templates available for MyGeneration and CSLA? I searched, but the ones I found were kinda old.

Thanks,

Mike

Q Johnson replied on Wednesday, June 28, 2006

The MyGeneration templates I've got (for ERO, ECC, ECO, and ROL only) are all for CSLA 1.x.  I only recently received the VB version of the CSLA 2.0 book and haven't begun 2.0 development yet.  I probably won't get to that for another six weeks at least.

If you'd be interested in seeing those, I could pass them along.  I could use some more feedback on them anyway. 

Please keep in mind that they implement my own decisions regarding naming and commenting conventions and the like.  You may not have the same appreciation for the goals of these templates.  (For example, I pass the parent's key field value, not the whole parent, to the child collection's Update method.)  But they generate compilable CSLA business objects of the types described.  They use the older validation technique (calling CheckRules in your Property Set, etc.)  I implemented those in such a manner that you can generate Rules for max string length automatically, and get StringRequired and Max and Min values for numeric types just by adding a metadata value for the source field (e.g. MaxValue = 255).

Interested?

guyroch replied on Wednesday, June 28, 2006

MyGeneration is also my favorite.  If you look closely at MyGeneration, you'll notice that the Zeus API and the MyMeta API are compiled against .NET 1.1.  So you can even do without the MyGeneration UI and reference those APIs in your own VS solution for code generation.  But what does all that mean to you?  Well, you'll get code completion while writing your templates.

Glad to see you're still enjoying MyGeneration, Q.

Michael Hildner replied on Wednesday, June 28, 2006

Yes, I'm interested. I'll send a PM with my email.

I'd like to take a look at what you're doing differently than what the book's templates do. The CodeSmith templates do things a little differently too. I'd like to study them and see why the discrepancies exist.

Thanks,

Mike

DansDreams replied on Wednesday, June 28, 2006

Man, I love your post Q.  I've also had fantasies of being able to develop a Dollardish generation system that would be business layer centric, but it's hard to justify that when there's only a couple of us.

One of the hurdles to get over I find is the assumption that CSLA is just another geek's pet hobby project rather than the result of years of evolution and experience in both Rocky and Magenic's use of it.  And let's not forget all of us.

There are some advantages to the dataset based approach.  You can discover a lot of metadata directly from the database, for example.  It's not difficult to make a very powerful (and potentially dangerous of course) querying system based on that discoverable metadata.

But I would never ever EVER in a million years base an enterprise application on the current flavor of Microsoft's data access technology.  When ADO.NET gets replaced (remember RDO and then ADO?) with whatever comes next, your CSLA-based application keeps humming along after a few isolated changes confined to a few methods.  Meanwhile your colleague's application using datasets all the way to the UI is 100% broken and has to be fundamentally redesigned, let alone recoded. 

To get around that potential nightmare you need to do what?  Well, wrap that data with an object that will provide more longevity, of course.  Something like...uh... well, how about CSLA-based business objects?  It's kind of a no-brainer if you ask me.

Many of the powerful new features of Visual Studio 2005 / .NET 2.0 apply directly to CSLA-based objects.  If you like out-of-the-box Microsoft stuff, creating windows forms bound to CSLA business objects is a snap (well, outside of the quirks and instability of the VS forms designer).

Q Johnson replied on Wednesday, June 28, 2006

>>Man, I love your post Q. <<

Many thanks.  I thought some of the ideas for getting "George" to let go of his ownership of the dataset view were the good part.  Maybe not, eh?  <g>  Seriously, though.  I'm so swamped this month I haven't had time to offer much here.  But I thought I could offer something useful in response to that post, and the fact that you found it worthwhile makes me all the happier that I took the time.  Everyone's opinion is important, of course.  But there are a few of you here who have really earned the respect of the rest of us by your comments and other contributions to the community over the long haul.

>> I've also had fantasies of being able to develop a Dollardish generation system that would be business layer centric, but it's hard to justify that when there's only a couple of us. <<

Yeah, for me the killer is the work I'd have to do getting up to speed on the XSLT techniques.  Even if I climbed that learning curve, I'd be starting over if I only returned to it rather infrequently (only at start of new projects and major enhancements).  Her web site used to say that she had a couple of book projects in mind, one of which was a book focused on using her generation harness rather than building it.  That might change my mind.  But I'm afraid it may not come to pass.  She changed her site a great deal last month and it looks to me as if her only book project right now is on generics (not an altogether bad idea, of course).  I'm going to e-mail her and ask her about the other one.

>> (remember RDO and then ADO?) with whatever comes next your CSLA-based application keeps humming along after a few isolated changes confined to a few methods.  Meanwhile your colleague's application using the datasets all the way to the UI is 100% broken and has to be fundamentally redesigned, let alone recoded. 

To get around that potential nightmare you need to do what?  <<

This is the real benefit of using CSLA and code generation.  You can leave the CSLA business logic stuff alone completely and just change your templates for the persistence methods.  Properly maintained template-source file sets could be completely regenerated for LOTS of applications in an hour or two after the day or so it takes to come up with the new template work.  How many weeks or months will those MS-Data-driven folks spend doing the same thing?

>> If you like out-of-the-box Microsoft stuff, creating windows forms bound to CSLA business objects is a snap (well, outside of the quirks and instability of the VS forms designer). <<

This (along with my current workload) is a big incentive for me to hold off on that 2.0 development.  I think I can wait for some of you pioneers to let your arrow-holes heal before I ramp up to it. <g>

Good to hear from you over here in the New World.

 

RanceDowner1234 replied on Tuesday, June 27, 2006

Sounds like more of an OOP versus SpaghettiCode (although "DataCentric" is a more PC term, LOL!) argument.  Sounds like he's just trying to dress up DataCentric programming in an OOP envelope.  For years I argued against OOP myself.  I think it's because it can be so abstract.  Rocky's done a great job of simplifying the abstract for mere mortals. 

Finally... after switching to VB.NET from Visual Basic 6.x, I decided it was time to learn OOP and to do it using best practices.  Personally, I think CSLA was a great way for me to get up to speed on OOP best practices... or at least introductory practices (practices I could understand as a newbie).

As for CSLA itself... I considered Strongly-Typed datasets, and even started down that road for a while, but I soon found out that I needed certain functionality like n-Level Undo and Serialization.

As for Serialization/Cloning ...

For instance, I was writing an accounting application that voided certain records.  It would do so by creating an exact duplicate of the record and then changing the entry Amount to a negative number.  Using my own "business objects" I ran into the pickle of having to copy each and every field.  By using CSLA, with built-in serialization, I simply "cloned" the object and then changed the ForeignKey & entryAmount.  Three lines of code.
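
The clone-and-negate idea can be sketched in a few lines. This is not the poster's actual code: `JournalEntry`, its fields, and the hand-rolled `Clone()` (standing in for CSLA's serialization-based `BusinessBase.Clone()`) are all illustrative.

```csharp
using System;

// Minimal sketch of the clone-and-negate pattern described above.
// CSLA's BusinessBase.Clone() does a deep copy via serialization; here a
// simple MemberwiseClone stands in since the class holds only value types.
[Serializable]
class JournalEntry
{
    public int Id;
    public int ForeignKey;
    public decimal EntryAmount;

    public JournalEntry Clone() => (JournalEntry)MemberwiseClone();
}

class Demo
{
    static void Main()
    {
        var original = new JournalEntry { Id = 42, EntryAmount = 100.00m };

        // Void the record: exact duplicate, then flip the amount.
        var reversal = original.Clone();
        reversal.ForeignKey = original.Id;       // point back at the voided record
        reversal.EntryAmount = -original.EntryAmount;

        Console.WriteLine(reversal.EntryAmount); // prints -100.00
    }
}
```

Without a framework-supplied clone, each of those fields would have to be copied by hand, which is exactly the pickle the post describes.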

As for Undo...

Every once in a while it is nice to actually undo changes made to a record instead of reloading it from the database.  Doing this manually, field by field, can be annoying.

But OOP/CSLA is not for everybody.  Heck, .NET isn't for everybody.  There are actually developers out there signing petitions to stay on VB6.  But I know I was sometimes stubborn in the past... so don't come down too hard on the guy... there might be hope for him! :0)  The best thing might just be to have him code a small project using strongly-typed datasets and then one using CSLA, and let him prove to himself the value of OOP.


david.wendelken replied on Tuesday, June 27, 2006

Before I found CSLA a few weeks back, I had been trying to extend Datasets to incorporate a business rules/authorization rules approach. 

This was with ASP.NET v1 in VS 2003; I haven't tried to replicate my experiment with VS2005 and ASP.NET v2.

Basically, I used the strongly typed DataSet generator to build a DataSet to wrap a table.

(I'm a data guy, so of course I started from there. :) You OO guys, go ahead and laugh. I'm learning!)

I created a subclass of the strongly typed DataSet.  Its constructor wired up the row-changing and column-changing events of the strongly typed DataSet to a CheckRules procedure.

The row-changing wire-up worked great in my Windows test screen.  Changing data on the screen, then navigating to a new row, caused the row-changing event to fire the CheckRules procedure. I forced a rule to fail, which then threw an exception with a user-friendly error message in it.  The field value in the on-screen DataGrid reverted to its pre-change value and the screen automagically popped up a message showing the exception text.  Sweet!

Then I tried testing a column level rule.  The rule was fired, it threw an exception, and the field value displayed in the data grid was reverted to the pre-change value.  But the exception was consumed by the DataSet, which meant that there was no pop-up message box showing the exception.  So, from the user's perspective, they entered a field value which was blanked out and replaced by the original value with no explanation whatsoever. 

That was one killer "feature".
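
The wiring described above can be sketched as follows. This is a reconstruction, not the poster's code: the `Invoice` table, the `Amount` column, and both rules are invented, and a plain `DataSet` subclass stands in for the generated strongly typed one.

```csharp
using System;
using System.Data;

// Sketch of the experiment: a DataSet subclass whose constructor wires
// the RowChanging and ColumnChanging events to validation rules.
class ValidatingDataSet : DataSet
{
    public ValidatingDataSet()
    {
        var table = Tables.Add("Invoice");
        table.Columns.Add("Amount", typeof(decimal));

        // Row-level rule: fires when the row change is committed
        // (e.g. when the user navigates to another row in a grid).
        table.RowChanging += (s, e) =>
        {
            if (e.Row.HasVersion(DataRowVersion.Proposed) &&
                (decimal)e.Row["Amount", DataRowVersion.Proposed] > 10000m)
                throw new ArgumentException("Amount exceeds limit");
        };

        // Column-level rule: fires as each field is set.  As the post
        // observes, under Windows Forms data binding an exception thrown
        // here is swallowed: the cell silently reverts, with no message box.
        table.ColumnChanging += (s, e) =>
        {
            if (e.Column.ColumnName == "Amount" && (decimal)e.ProposedValue < 0m)
                throw new ArgumentException("Amount must be non-negative");
        };
    }
}
```

Called directly (outside data binding) the exception does reach the caller; it is the binding layer that consumes it, which is what produced the silent-revert behavior described above.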

Now, I went to all that work because I **know** the value of a business rules-based analysis of an object.  I've had **very** successful results using that approach in non-object-oriented technologies.  I understand the **full** project and application life-cycle costs of not having the business rules fully integrated into the usage of the object, so clearly defined and so cleanly separated from one another in their coding implementation.  I know that I'll come out ahead on all but the most trivial of projects - not only over the lifetime of the application, but on the initial build as well.

I would wager that your colleague's objections are rooted in the fact that they do **NOT** understand the full cost of not developing this way.  They see that they can get the first screen iteration up and running faster without all this "extra" coding.   What they don't see is that they still have to do all the coding; they just don't concentrate all that work in one pass while building the business object.  They do a bit to get the screen up and running, then do a bit more to add some validation, then a bit more, then a bit more, then a few fixes or twenty, then a few more, all interspersed between work on other programs.  Psychologically, they don't make the connection between all those "trivial" amounts of work, so they don't count in their tally of relative costs.   As Senator Proxmire once said, "A billion here, a billion there, pretty soon it adds up to real money."

You can try the mind games suggested in a previous post (a truly great book for learning how to deal with problem colleagues is "Dealing with People you Can't Stand: How to Bring Out the Best in People at Their Worst"). 

Or you can try to help your colleague see the full cost of your two approaches.  Good luck, because getting people to see the obvious is not an easy task. :) 

You may have to rub their noses in how many times someone has to go back and fiddle with a screen after it's "done".  If you go that route, remember to include all collateral damage in the nose-rubbing.

 

ajj3085 replied on Tuesday, June 27, 2006

VS2005 and .NET 2.0 don't change the fact that ADO.NET is for manipulating data, while the point of OOD is to model behavior. 

The simple fact is that as soon as you change the database tables at all, your entire application is now broken.  Using Csla, you only need modify code in the DataPortal_xxx methods; your UI is safe from the changes to the database.  This is a HUGE advantage.
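
The isolation Andy describes can be sketched without a real database. Everything here is invented for illustration: `Customer`, `Fetch`, the `cust_name` column, and an in-memory dictionary standing in for the data reader a real CSLA `DataPortal_Fetch` would use.

```csharp
using System;
using System.Collections.Generic;

// Sketch of the layering point above: the UI talks only to the business
// object, and every storage-side name is confined to one fetch method.
class Customer
{
    public int Id { get; private set; }
    public string Name { get; private set; }

    // Stand-in for a CSLA DataPortal_Fetch: the one and only place that
    // knows the table/column names.  A fake in-memory "database" replaces
    // a real SqlDataReader so the sketch runs on its own.
    public static Customer Fetch(int id)
    {
        var row = FakeDatabase[id];
        // If the "cust_name" column were renamed, only this line changes;
        // the UI, which binds to Customer.Name, is untouched.
        return new Customer { Id = id, Name = (string)row["cust_name"] };
    }

    static readonly Dictionary<int, Dictionary<string, object>> FakeDatabase =
        new Dictionary<int, Dictionary<string, object>>
        {
            { 1, new Dictionary<string, object> { { "cust_name", "Acme" } } }
        };
}
```

With datasets flowing all the way to the UI, by contrast, the storage-side column names leak into every form that binds to them, which is why a schema change breaks the whole application.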

Also, where do you put your business rules?  You have a rule that you can only have one discount, or only one item of a certain type in a collection.  Where do you put the code to enforce that?  Usually it ends up in the UI.  Want to create a web UI for your application?  Get ready to rewrite much of the code, and retest ALL the business logic again.  When Vista becomes more popular and you need to move to Windows Presentation Foundation in .NET 3.0, get ready to throw away your UI and business rules yet again.  With Csla, you keep the business rules (which are perhaps the most complicated part of the application) and can easily recreate the UI in WPF. 

Learning Csla isn't too terribly difficult; at worst, you can say RTFB.  At best, you can train them yourself on how to build Csla-based business objects.

The reasons to keep on Csla are outlined in Chapter 1, so there's not really any reason for me to continue to rehash the arguments.  What 'solutions' to the problems with data centric programming does he present?

Andy

malloc1024 replied on Wednesday, June 28, 2006

Datasets aren't all bad.  When Microsoft changes ADO.NET, you will have to rewrite all projects that use datasets.   Further, you will also spend more time on maintenance using datasets. Job security, baby!

Copyright (c) Marimer LLC