Not sure if this goes here or is a question for Rocky, but we have a programmer here who thinks that with the new VS 2005 and .NET 2.0 there is no longer a need for CSLA, and that regular business objects with datasets will serve the same function, using the .NET Framework and technologies such as Windows Workflow, with a lot less coding. Can anyone list the disadvantages of moving off of CSLA? I think Rocky has done a wonderful job and that we should continue to use CSLA, but it seems that every argument I bring up he is able to shoot down with existing or soon-to-come technologies. We have a staff of about five programmers who are just learning .NET and three of us who are already very well versed in it. One of our .NET programmers thinks it will be too much work to bring them up to speed so that they can compose business objects on the CSLA framework.
Thanks!
Take a look at this http://www.lhotka.net/Articles.aspx?id=60e99288-0998-4c4b-a950-4e19bd6c0bbd
The classic case is simple. If you go the dataset route, your end result will be a data-centric application where one table will usually mean one or more forms in your UI. If you go the business-object route (like CSLA), then your use cases drive your business objects, as they should, including all business rules and access points. A dataset is nothing but a data container with very little room for business validation.
Did your developer read the book? If not, then how can he make such a strong assessment?
>> code generation tool (like CodeSmith or MyGeneration - my own favorite)
Are there any good templates available for MyGeneration and CSLA? I searched, but the ones I found were kinda old.
Thanks,
Mike
The MyGeneration templates I've got (for ERO, ECC, ECO, and ROL only) are all for CSLA 1.x. I only recently received the VB version of the CSLA 2.0 book and haven't begun 2.0 development yet. I probably won't get to that for another six weeks at least.
If you'd be interested in seeing those, I could pass them along. I could use some more feedback on them anyway.
Please keep in mind that they implement my own decisions regarding naming and commenting conventions and the like. You may not have the same appreciation for the goals of these templates. (For example, I pass the parent's key field value, not the whole parent, to the child collection's Update method.) But they generate compilable CSLA business objects of the types described. They use the older validation technique (calling CheckRules in your Property Set, etc.). I implemented them in such a manner that you can generate rules for max string length automatically, and get StringRequired and Max and Min values for numeric types just by adding a metadata value for the source field (e.g. MaxValue = 255).
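For anyone who hasn't seen that older style, here's a rough sketch of the pattern I mean. This is based on my reading of the CSLA 2.0-era API (Csla.Validation.CommonRules and friends); the class, property, and rule-argument names are just illustrative, so check them against your version of the framework:

```csharp
using System;
using Csla;
using Csla.Validation;

[Serializable]
public class Customer : BusinessBase<Customer>
{
    private string _name = string.Empty;

    protected override void AddBusinessRules()
    {
        // Both rules could be generated straight from the source
        // column's metadata (e.g. a MaxValue = 255 meta data entry).
        ValidationRules.AddRule(CommonRules.StringRequired, "Name");
        ValidationRules.AddRule(CommonRules.StringMaxLength,
            new CommonRules.MaxLengthRuleArgs("Name", 255));
    }

    public string Name
    {
        get { return _name; }
        set
        {
            if (_name != value)
            {
                _name = value;
                // The "older" technique: re-run this property's rules
                // from the setter itself, then mark the object dirty.
                ValidationRules.CheckRules("Name");
                MarkDirty();
            }
        }
    }
}
```

The point is that the rule registrations are exactly the kind of boilerplate a template can emit from column metadata, while the setter pattern stays identical for every property.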
Interested?
MyGeneration is also my favorite. If you look closely at it, you'll notice that the Zeus API and the MyMeta API are compiled against .NET 1.1. So you can even do without the MyGeneration UI and reference those APIs in your own VS solution for code generation. What does all that mean to you? Well, you get code completion while writing your template.
Glad to see you're still enjoying MyGeneration, Q.
Yes, I'm interested. I'll send a PM with my email.
I'd like to take a look at what you're doing differently from what the book's templates do. The CodeSmith templates do things a little differently too. I'd like to study them and see why the discrepancies exist.
Thanks,
Mike
Man, I love your post, Q. I've also had fantasies of being able to develop a Dollard-ish generation system that would be business-layer centric, but it's hard to justify that when there are only a couple of us.
One of the hurdles to get over, I find, is the assumption that CSLA is just another geek's pet hobby project rather than the result of years of evolution and experience in both Rocky's and Magenic's use of it. And let's not forget all of us.
There are some advantages to the dataset based approach. You can discover a lot of metadata directly from the database, for example. It's not difficult to make a very powerful (and potentially dangerous of course) querying system based on that discoverable metadata.
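To be concrete about what I mean by discoverable metadata: a few lines of plain ADO.NET against the INFORMATION_SCHEMA views (SQL Server assumed here; the connection string is only a placeholder) hand you every table, column, type, and length, which is enough to drive a query builder or a code generator:

```csharp
using System;
using System.Data.SqlClient;

class MetadataDemo
{
    static void Main()
    {
        // Placeholder connection string -- point it at your own database.
        using (SqlConnection cn = new SqlConnection(
            "Data Source=.;Initial Catalog=Northwind;Integrated Security=True"))
        {
            cn.Open();
            SqlCommand cmd = new SqlCommand(
                "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, " +
                "       CHARACTER_MAXIMUM_LENGTH " +
                "FROM INFORMATION_SCHEMA.COLUMNS " +
                "ORDER BY TABLE_NAME, ORDINAL_POSITION", cn);
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                    Console.WriteLine("{0}.{1} ({2})",
                        dr["TABLE_NAME"], dr["COLUMN_NAME"],
                        dr["DATA_TYPE"]);
            }
        }
    }
}
```

Of course, that same discoverability is exactly what makes an ad-hoc querying system built on it potentially dangerous.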
But I would never, ever, EVER in a million years base an enterprise application on the current flavor of Microsoft's data access technology. When ADO.NET gets replaced (remember RDO, and then ADO?) with whatever comes next, your CSLA-based application keeps humming along after a few isolated changes confined to a few methods. Meanwhile, your colleague's application, which uses datasets all the way up to the UI, is 100% broken and has to be fundamentally redesigned, let alone recoded.
To get around that potential nightmare you need to do what? Well, wrap that data with an object that will provide more longevity, of course. Something like...uh... well, how about CSLA-based business objects? It's kind of a no-brainer if you ask me.
Many of the powerful new features of Visual Studio 2005 / .NET 2.0 apply directly to CSLA-based objects. If you like out-of-the-box Microsoft stuff, creating Windows Forms bound to CSLA business objects is a snap (well, outside of the quirks and instability of the VS forms designer).
>>Man, I love your post Q. <<
Many thanks. I thought some of the ideas for getting "George" to let go of his ownership of the dataset view were the good part. Maybe not, eh? <g> Seriously, though, I'm so swamped this month I haven't had time to offer much here. But I thought I could offer something useful in response to that post, and the fact that you found it worthwhile makes me all the happier that I took the time. Everyone's opinion is important, of course. But there are a few of you here who have really earned the respect of the rest of us through your comments and other contributions to the community over the long haul.
>> I've also had fantasies of being able to develop a Dollardish generation system that would be business layer centric, but it's hard to justify that when there's only a couple of us. <<
Yeah, for me the killer is the work I'd have to do getting up to speed on the XSLT techniques. Even if I climbed that learning curve, I'd be starting over if I only returned to it infrequently (only at the start of new projects and major enhancements). Her web site used to say that she had a couple of book projects in mind, one of which was a book focused on using her generation harness rather than building it. That might change my mind. But I'm afraid it may not come to pass. She changed her site a great deal last month, and it looks to me as if her only book project right now is on generics (not an altogether bad idea, of course). I'm going to e-mail her and ask her about the other one.
>> (remember RDO and then ADO?) with whatever comes next your CSLA-based application keeps humming along after a few isolated changes confined to a few methods. Meanwhile your colleague's application using the datasets all the way to the UI is 100% broken and has to be fundamentally redesigned, let alone recoded.
To get around that potential nightmare you need to do what? <<
This is the real benefit of using CSLA and code generation. You can leave the CSLA business logic completely alone and just change your templates for the persistence methods. Properly maintained template/source-file sets could be completely regenerated for LOTS of applications in an hour or two, after the day or so it takes to come up with the new template work. How many weeks or months will those MS-data-driven folks spend doing the same thing?
>> If you like out-of-the-box Microsoft stuff, creating windows forms bound to CSLA business objects is a snap (well, outside of the quirks and instability of the VS forms designer). <<
This (along with my current workload) is a big incentive for me to hold off on that 2.0 development. I think I can wait for some of you pioneers to let your arrow-holes heal before I ramp up to it. <g>
Good to hear from you over here in the New World.
Before I found CSLA a few weeks back, I had been trying to extend Datasets to incorporate a business rules/ authorizations rules approach.
This was with ASP.NET v1 in VS 2003; I haven't tried to replicate my experiment with VS 2005 and ASP.NET v2.
Basically, I used the strongly typed DataSet generator to build a DataSet to wrap a table.
(I'm a data guy, so of course I started from there. You OO guys, go ahead and laugh. I'm learning!)
I created a subclass of the strongly typed DataSet. Its constructor wired up the row-changing and column-changing events of the strongly typed DataSet to a CheckRules procedure.
The row-changing wire-up worked great in my Windows test screen. Changing data on the screen, then navigating to a new row, caused the row-changing event to fire the CheckRules procedure. I forced a rule to fail, which then threw an exception with a user-friendly error message in it. The field value in the on-screen DataGrid reverted to its pre-change value, and the screen automagically popped up a message showing the exception text. Sweet!
Then I tried testing a column-level rule. The rule fired, it threw an exception, and the field value displayed in the data grid reverted to the pre-change value. But the exception was consumed by the DataSet, which meant there was no pop-up message box showing it. So, from the user's perspective, they entered a field value which was blanked out and replaced by the original value with no explanation whatsoever.
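In case it helps anyone reproduce (or avoid) this, the wiring looked roughly like the following. CustomersDataSet, the Customers table, and the Name column are stand-in names for whatever the typed DataSet generator produced in your project, and the rule itself is just a dummy:

```csharp
using System;
using System.Data;

// Subclass of the generated typed DataSet; the constructor hooks the
// table's changing events to a single rule-checking routine.
public class ValidatingCustomersDataSet : CustomersDataSet
{
    public ValidatingCustomersDataSet()
    {
        // Generated typed event for row changes.
        Customers.CustomersRowChanging += OnRowChanging;
        // ColumnChanging is inherited from plain DataTable.
        Customers.ColumnChanging += OnColumnChanging;
    }

    private void OnRowChanging(object sender, CustomersRowChangeEvent e)
    {
        // Throwing here works as hoped: the bound grid reverts the row
        // and pops up the exception message.
        CheckRules(e.Row);
    }

    private void OnColumnChanging(object sender, DataColumnChangeEventArgs e)
    {
        // Throwing here reverts the value, but the exception gets
        // swallowed, so the user never sees the message -- the killer
        // "feature" described above.
        CheckRules((CustomersRow)e.Row);
    }

    private void CheckRules(CustomersRow row)
    {
        // Dummy business rule for demonstration purposes.
        if (row.IsNameNull() || row.Name.Length == 0)
            throw new ApplicationException("Name is required.");
    }
}
```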
That was one killer "feature".
Now, I went to all that work because I **know** the value of a business-rules-based analysis of an object. I've had **very** successful results using that approach in non-object-oriented technologies. I understand the **full** project and application life-cycle costs of not having the business rules fully integrated into the usage of the object, so clearly defined and so cleanly separated from one another in their coding implementation. I know that I'll come out ahead on all but the most trivial of projects, not only over the lifetime of the application, but on the initial build as well.
I would wager that your colleague's objections are rooted in the fact that they do **NOT** understand the full cost of not developing this way. They see that they can get the first screen iteration up and running faster without all this "extra" coding. What they don't see is that they still have to do all the coding; they just don't concentrate all that work in one pass while building the business object. They do a bit to get the screen up and running, then a bit more to add some validation, then a bit more, then a bit more, then a few fixes or twenty, then a few more, all interspersed between work on other programs. Psychologically, they don't make the connection between all those "trivial" amounts of work, so they don't count them in their tally of relative costs. As Senator Proxmire once said, "A billion here, a billion there, pretty soon it adds up to real money."
You can try the mind games suggested in a previous post (a truly great book for learning how to deal with problem colleagues is "Dealing with People you Can't Stand: How to Bring Out the Best in People at Their Worst").
Or you can try to help your colleague see the full cost of your two approaches. Good luck, because getting people to see the obvious is not an easy task.
You may have to rub their noses in how many times someone has to go back and fiddle with a screen after it's "done". If you go that route, remember to include all collateral damage in the nose-rubbing.
Datasets aren't all bad. But when Microsoft changes ADO.NET, you will have to rewrite all projects that use datasets. Further, you will also spend more time on maintenance using datasets.
Copyright (c) Marimer LLC