How does Linq to Sql fit with CSLA?

Old forum URL: forums.lhotka.net/forums/t/3635.aspx


dotnetbrix posted on Tuesday, October 02, 2007

Hi,

We are planning on using LINQ to SQL and I was wondering whether it would fit around CSLA.

My first conclusion is that by using LINQ to SQL we have to program a different kind of business object altogether, a much lighter one.

I am also struggling to work out where LINQ actually fits in an n-tier application, as I cannot seem to make a clear distinction between a business layer and a data layer. So far my conclusion is that LINQ is both a DAL and a business layer.

What are your views?

 

ajj3085 replied on Tuesday, October 02, 2007

I think it'd show up in the DP_xyz methods mainly.  Possibly your BO could keep the resulting objects around and use the update features of the entities, but that's a wild guess; I haven't started with Linq yet.

It also might be useful to filter business lists (thinking of searching through ROLB objects).
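
For example, a LINQ to Objects query over an in-memory list looks something like this (a minimal, self-contained sketch; Customer here is just a stand-in for a CSLA child object, since a ReadOnlyListBase-derived list is enumerable the same way):

using System;
using System.Collections.Generic;
using System.Linq;

// "Customer" stands in for a CSLA child object; the same query syntax works
// over a ReadOnlyListBase-derived collection because it implements IEnumerable<T>.
class Customer
{
    public string Name { get; set; }
}

class Example
{
    static void Main()
    {
        var customers = new List<Customer>
        {
            new Customer { Name = "Acme" },
            new Customer { Name = "Initech" }
        };

        // LINQ to Objects: filter and sort the in-memory business list.
        var matches = from c in customers
                      where c.Name.StartsWith("A")
                      orderby c.Name
                      select c;

        foreach (var c in matches)
            Console.WriteLine(c.Name);
    }
}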

Just my guess from what I know so far. :-)

triplea replied on Tuesday, October 02, 2007

I would probably wait for Rocky to release his next book on the topic :-) But in the meantime I would also probably look at utilising it in the dataportal methods simply as an OR mapper (though I know Linq is so much more), nothing fancier.

ajj3085 replied on Tuesday, October 02, 2007

We could, but Rocky has already given us a clue as to where he sees EF and Linq fitting in.

dotnetbrix replied on Tuesday, October 02, 2007

Thanks for your replies

Reading the post that Rocky wrote with regards to LINQ and EF, my understanding is that he doesn't see a place for it in CSLA.

I have been asked to provide a framework around LINQ to SQL, but I am struggling to abstract the business layer from the data layer.

The more I look at it, the more it looks like LINQ to SQL "is" the data layer and whatever is generated by sqlmetal is the actual business layer, which means not having a physical data layer.

If you know of a way to abstract the LINQ to SQL data layer from the business layer, please let me know how you do it.

 

JoeFallon1 replied on Tuesday, October 02, 2007

Check out Rick Strahl's blog  - he has been working on this exact issue and has hit many of the pain points already. In fact he made some interesting conclusions yesterday about disconnected data contexts. We won't know the final result until the next build is released.

http://west-wind.com/weblog/

 

Joe

ajj3085 replied on Wednesday, October 03, 2007

dotnetbrix:
Reading the post that Rocky wrote with regards to LINQ and EF, my understanding is that he doesn't see a place for it in CSLA.


Not quite what I would say; he sees a use for it for data access, similar to how NHibernate fits in today.

dotnetbrix:
I have been asked to provide a framework around LINQ to SQL, but I am struggling to abstract the business layer from the data layer.

You should be able to build a data layer that uses Linq.  The business layer would use your data layer, although I'm not sure how helpful that would be.

dotnetbrix:
The more I look at it, the more it looks like LINQ to SQL "is" the data layer and whatever is generated by sqlmetal is the actual business layer, which means not having a physical data layer.

No, sqlmetal would generate DTOs which your business layer would use to move data into or out of the database.  Those objects would also be part of the data layer.

dotnetbrix:
If you know of a way to abstract the LINQ to SQL data layer from the business layer, please let me know how you do it.

Well, you'd have to write a layer that would take some data and use Linq behind the scenes.  Again though, I'm not sure what use that would be.  I have a data layer now that I maintain, to avoid embedding SQL in my application and to give me a strongly typed object to code against... but Linq will fill that need for me, so over time I will replace my data layer with Linq.
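
As a rough sketch of what I mean, the business layer could call something like this instead of touching the DataContext directly (Customer and OrderSystemDataContext stand in for sqlmetal-generated types; the class and connection handling are made up):

using System.Collections.Generic;
using System.Linq;

// Hypothetical data-layer class: LINQ to SQL stays behind this one seam,
// so the business objects never see the DataContext.
public class CustomerData
{
    private readonly string _connectionString;

    public CustomerData(string connectionString)
    {
        _connectionString = connectionString;
    }

    public List<Customer> GetByName(string namePrefix)
    {
        using (var ctx = new OrderSystemDataContext(_connectionString))
        {
            // ToList() materializes the entities before the context is disposed.
            return (from c in ctx.Customers
                    where c.Name.StartsWith(namePrefix)
                    select c).ToList();
        }
    }
}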

dotnetbrix replied on Wednesday, October 03, 2007

Thanks for taking the time to reply to my queries.

ajj3085: No, sqlmetal would generate DTOs which your business layer would use to move data into or out of the database.  Those objects would also be part of the data layer.

I would not call them DTOs, as they have logic in them and you can add partial methods and so on.

ajj3085: You should be able to build a data layer that uses Linq.  The business layer would use your data layer, although I'm not sure how helpful that would be.

I can build the DAL using sqlmetal via the command line and put it in one assembly (MyCompany.DAL). The generated entities contain logic and interfaces, in a much lighter form than a CSLA object would have.
What do I put in my business layer if the generated entities that reside in the DAL already have that logic?

ajj3085: Well, you'd have to write a layer that would take some data and use Linq behind the scenes.  Again though, I'm not sure what use that would be.

I agree in the sense that LINQ is both your DAL and your business layer.
However, if you think differently and it's simple for you, can you post an example of how you would put the generated entities in MyCompany.BLL and all the sprocs/DataContext stuff in MyCompany.DAL (separate assemblies), and make them talk to each other?

My problem is that I was trying to separate the two, but I don't think you can.

Also, regarding how it fits into CSLA, you're right; I misread the last bit of his note, where he says he sees it working in the DataPortal a bit like NHibernate does.

Has anybody tried that?

 

thanks again for your time

 

ajj3085 replied on Wednesday, October 03, 2007

dotnetbrix:
I would not call them DTOs, as they have logic in them and you can add partial methods and so on.


What kind of logic?  I certainly wouldn't put any business logic in there.  From my limited exposure, sqlmetal generates classes based off of database tables... which I would classify as DTOs.

dotnetbrix:
I can build the DAL using sqlmetal via the command line and put it in one assembly (MyCompany.DAL). The generated entities contain logic and interfaces, in a much lighter form than a CSLA object would have.
What do I put in my business layer if the generated entities that reside in the DAL already have that logic?

If you're using Sqlmetal to put business logic into the generated classes, that's not really an approach that would be recommended here.  Your business objects should be modeled to fit your use cases.  This means they almost never match your database design.  So again, what kind of logic would be in these classes?

dotnetbrix:
I agree in the sense that LINQ is both your DAL and your business layer.
However, if you think differently and it's simple for you, can you post an example of how you would put the generated entities in MyCompany.BLL and all the sprocs/DataContext stuff in MyCompany.DAL (separate assemblies), and make them talk to each other?

I wasn't trying to say that Linq would BE the business layer.  It would be USED by the data layer, just like NHibernate would. 

dotnetbrix:
My problem is that I was trying to separate the two, but I don't think you can.

Well, the business layer needs to talk to the data layer somehow, so they are coupled.  It would be hard to totally decouple them, but I think Linq + entity classes are a good enough separation.

dotnetbrix:
Also, regarding how it fits into CSLA, you're right; I misread the last bit of his note, where he says he sees it working in the DataPortal a bit like NHibernate does.

I think you misunderstood; NHibernate would be used in the DataPortal_xyz methods contained within the business object.  The DataPortal itself doesn't care how you access data, whether you use LINQ, ADO.NET, or NHibernate.
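
For illustration, a minimal sketch of what that looks like with LINQ to SQL inside a business object's data access method (MyDataContext, the Customer entity and the field names are hypothetical; Criteria and MarkOld are the usual CSLA pieces):

// Inside a CSLA editable root. The data portal calls this method; whether the
// data comes from LINQ to SQL, raw ADO.NET or NHibernate is entirely up to it.
private void DataPortal_Fetch(Criteria criteria)
{
    // Hypothetical sqlmetal-generated context (assumes a constructor wired to
    // the configured connection string).
    using (var ctx = new MyDataContext())
    {
        var data = (from c in ctx.Customers
                    where c.Id == criteria.Id
                    select c).Single();

        // Copy entity values into the business object's fields.
        _id = data.Id;
        _name = data.Name;
    }

    MarkOld(); // CSLA: the object now represents existing data
}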

But using NHibernate in your business object's data access code certainly has been done.  I think there's a sample floating around if you search the forums.

Hope that clears things up.

dotnetbrix replied on Wednesday, October 03, 2007

Thanks for your reply.

I actually disagree on many points with regards to Linq but I really appreciate your help and time in replying.

E.g., LINQ to SQL entities (generated by sqlmetal) implement interfaces, their properties raise events, and there are methods you can override in your own partial class via partial methods. So it's not a DTO in the pure sense, with just simple getters and setters.
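
For anyone who hasn't looked yet, the extension point looks roughly like this (the Customer entity and the OnNameChanging hook are hypothetical examples of what the generator emits; the partial-method names follow the column names):

using System;

// Your own file, compiled alongside the sqlmetal-generated Customer class.
// The generated half declares partial methods such as OnNameChanging /
// OnNameChanged that you can implement here.
public partial class Customer
{
    partial void OnNameChanging(string value)
    {
        // The kind of logic people hang off these hooks.
        if (string.IsNullOrEmpty(value))
            throw new ArgumentException("Name is required");
    }
}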

LINQ is your DAL, no doubt. But much of the stuff that gets generated is stuff that I would have coded in the business object.

Thanks again

 

 

RockfordLhotka replied on Wednesday, October 03, 2007

dotnetbrix:

Thanks for your reply.

I actually disagree on many points with regards to Linq but I really appreciate your help and time in replying.

E.g., LINQ to SQL entities (generated by sqlmetal) implement interfaces, their properties raise events, and there are methods you can override in your own partial class via partial methods. So it's not a DTO in the pure sense, with just simple getters and setters.

LINQ is your DAL, no doubt. But much of the stuff that gets generated is stuff that I would have coded in the business object.

Thanks again

There's no doubt that Microsoft's code-gen for LINQ is putting some code beyond a simple DTO into their generated objects. Don't make the mistake of thinking this is anything other than code-gen though. In an earlier post someone discounted code-gen, but then accepted Microsoft's code-gen. That kind of thinking is dangerous and can lead to odd results, like thinking a Recordset isn't just an object because it came from Microsoft.

Not that I'm saying you are doing that. But I spent a lot of time explaining how and why custom objects (à la CSLA) are faster and lighter weight than a Recordset - to people who figured that since the Recordset came from Microsoft it must be "blessed".

The same with the DataSet.

LINQ's tools are merely code generators. Not open or flexible like most code-gen tools, and so technically not nearly as good. But what they DO have going for them is that they code-gen in a way that is expected by VS, and so there's an element of integration. That often offsets the lack of flexibility, but not always.

In any case, there's a core philosophical question you need to answer for yourself. Are your objects defined by data, and then behavior is grafted onto them? Or are your objects defined by behavior (responsibility), and then they use the data necessary to perform that responsibility?

If you are comfortable with the former answer, then L4S, nHibernate, ADO.NET EF and most other ORM tools can be made to work for you. Generate the entity objects, graft on the business logic and away you go.

If you are comfortable with the second answer, then ORM tools fall short. I have yet to see one that works for this. CSLA is intended to help with this second approach: to help you create single, responsibility-driven objects that match your use case requirements.

I strongly prefer the second answer. 12 years ago I was solidly in the data-centric object camp. I lived and breathed that for years, increasingly frustrated that the "OO promise" never came true. I never reaped the benefits of the promised land.

About 7 years ago I started learning about this behavioral/responsibility based world. Slowly I moved more and more that direction, and I am far happier for it. I really do think I get major benefits from the OO promise now - in ways I never did before.

I think the data-centric approach is a dead end. Easier, more seductive, but not more powerful.

Which is not to say that ORM tools are bad. Far from it! Even though they don't solve the impedance mismatch problem (see David Taylor's books for this term), they do make it far easier to get data into/out of databases in many cases.

And it turns out to be much easier to solve object-to-object mapping than object-to-relational mapping when it comes to getting data into/out of responsibility-driven business objects. The ORM tools do the hard part of mapping from ugly relational tables into objects, and mapping those ugly (and inconsistent) database types into nice CLR types.

So by the time the ORM has done its job, our data is in nice CLR objects, ready for easy consumption. And that's a happy thing.

Unfortunately most ORMs (including L4S) kind of ignore the n-tier distributed computing world. Key bits of state don't move across tier boundaries (IsNew, IsDirty, IsDeleted, etc.). This complicates things and is very sad.

But don't look to Microsoft for an answer in the near term. They are so SOA focused that it doesn't matter. None of that stuff will ever flow across a service boundary, and there's a bias against n-tier within Microsoft these days. Sad but true.

CSLA does track this sort of state, and brings it with the object - thus solving the n-tier issue nicely, and still working with the SOA world too.

The hardest part of making CSLA play nicely with L4S turns out to be this exact issue: how best to convince L4S to work efficiently when it is simply unable to manage this extra state data. There are answers - but the question is which answer is both maintainable and performant. Where's the tradeoff, and can CSLA help automate the work involved in any solution (thus increasing maintainability)?

webjedi replied on Wednesday, October 03, 2007

RockfordLhotka:

I think the data-centric approach is a dead end. Easier, more seductive, but not more powerful.

So it's like the Dark Side of the Force? :-)

RockfordLhotka replied on Wednesday, October 03, 2007

That was the intended reference :-)

 

(though that is just a paraphrase – but I have this theory that all of life can be described as a series of Star Wars quotes, and that has yet to be proven wrong)

 

Rocky

 




RockfordLhotka replied on Thursday, October 04, 2007

I am not a big fan of reflection either. It is just a tool that is sometimes useful.

CSLA doesn't use it as much as people seem to think. There are a limited number of calls in the data portal to achieve dynamic behaviors. And n-level undo uses it to avoid you having to write tons of repetitive, hard-to-debug code. All other uses are strictly optional.

L4S doesn't use much (if any) reflection. They avoid using reflection by using code-gen.

And CSLA could do this too - except in the data portal (which is a non-issue anyway). For example, n-level undo without reflection requires a lot of code. But code-gen could create that code, thus eliminating the use of reflection in that common scenario.

So the question is whether you want to rely on extensive code-gen, or the use of some reflection.

I've consistently opted for the limited use of reflection, because the alternative would require the use of code-gen to use CSLA. While I encourage the use of code-gen, I'm not quite ready to require it.

Microsoft has a long history of going the other way - all the way back to the VB4 wizard days. Fortunately they've gotten quite good at it, and most people don't even realize that the Windows Forms, Web Forms, WPF and other designers (including the LINQ designers) are all just code-gen tools built into VS.

However, if I do build a CSLA Light (as discussed in another thread), I would almost certainly end up going down that mandatory-code-gen path. This is because the kind of reflection required for n-level undo isn't in Silverlight, and so the huge, ugly, hard-to-debug code becomes unavoidable. And the way to make that a non-issue is to go with code-gen.

dotnetbrix replied on Friday, October 05, 2007

hi,

Thanks for your reply. If you use stored procedures with LINQ to SQL then reflection is used.

You might say it defeats the object of using LINQ to SQL. Yes, it does. But it's inevitable with legacy systems and real-world apps where the stored procedures are far too complex, or where many more network trips would be required using LINQ to SQL or any other ORM tool for that matter.

[Function(Name="dbo.Customer_READALL")]
public ISingleResult<Customer> Customer_READALL()
{
    // The generated method reflects on itself (MethodInfo.GetCurrentMethod)
    // so the DataContext can find the mapping metadata before calling the sproc.
    IExecuteResult result = this.ExecuteMethodCall(this, (MethodInfo)MethodInfo.GetCurrentMethod());
    return (ISingleResult<Customer>)result.ReturnValue;
}

Thanks again

RockfordLhotka replied on Friday, October 05, 2007

Well yes. But that kind of method-level reflection is a non-issue. Just like the data portal using it.

 

It is important to keep these things in perspective. Calling a database is incredibly expensive. A small number of reflection calls to make it happen are totally immaterial – they have no meaningful impact on performance.

 

Now, if you tell me that L4S, in some scenarios, uses reflection to copy all the fields of data from the columns into some other location – then THAT can be an issue.

 

Do some tests. You’ll find that in a typical real app (like a web app), making a small number (1-5) of method calls via reflection when loading data from the database doesn’t even show up on load tests.

 

But if you try to load all your fields using reflection you’ll probably see a 15% to 20% slowdown when compared to setting the fields directly like I do in the book.
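
To make that concrete, here is a toy sketch of the two styles (the types and field names are made up; the cost shows up when every field of every object is copied the reflection way):

using System;
using System.Reflection;

class CustomerDto
{
    public string Name { get; set; }
}

class Customer
{
    private string _name;

    // Direct assignment - the style used in the book's DataPortal_Fetch code.
    public void LoadDirect(CustomerDto dto)
    {
        _name = dto.Name;
    }

    // Reflection-based copy - a PropertyInfo lookup and invocation per field.
    public void LoadViaReflection(CustomerDto dto)
    {
        PropertyInfo prop = typeof(CustomerDto).GetProperty("Name");
        _name = (string)prop.GetValue(dto, null);
    }
}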

 

I did a lot of research in this area early on, because I did consider making the data portal do the data copies on your behalf. And for a lot of apps, that 15% perf hit is totally acceptable, so I did seriously consider including my mapping code. But in the end I did not include it, because that much use of reflection would have discredited the framework in the eyes of too many people – even if it was optional…

 

Times have changed though. ASP.NET, Windows Forms and WPF use reflection all over the place. Continually. And (almost) no one worries. The amount of reflection used in a normal app is staggering, and people just happily go about their business.

 

So CSLA 3.5 may, in fact, include some heavier (though still optional) use of reflection to help reduce the amount of code you need to write when working with DTO/entity objects behind your business objects (because L4S more properly creates a form of entity object).

 

Rocky

dotnetbrix replied on Monday, October 08, 2007

Rocky,

what you are saying makes perfect sense.

However, I have put together a quick test as this guy suggested:

http://alexpinsker.blogspot.com/2007/07/benchmarking-linq-vs.html

Results:
So, comparing to the raw ADO.NET - DAAB is 8% slower and LINQ is 28% slower.
Comparing to DAAB - LINQ is 18% slower.
CPU usage intensity is about 2% average for raw ADO.NET vs. about 8% for DAAB vs. about 20% for LINQ - an order of magnitude worse than raw ADO.NET (CPU graphs captured with perfmon are in the linked post).

Thanks again for the thorough reply.

 

ajj3085 replied on Monday, October 08, 2007

Well, it's good that you have some benchmarking tests, but Linq is still in beta.  It's likely to be faster at RTM, when they don't have debug code and such.  So I'd try again once it's actually released.

RockfordLhotka replied on Monday, October 08, 2007

As has been pointed out, you are running beta software, so perf tests are pretty unreliable. Also, they are probably in violation of the EULA, so be careful :-)

 

There’s no doubt that _anything_ will be slower than raw ADO.NET (with a  datareader). This is because _everything else_ uses a datareader to get data, and thus adds overhead. That’s inescapable unless you find a data access technology that doesn’t build on ADO.NET…

 

L4S ultimately gets its data from a datareader. And if you follow the pattern in my books your CSLA objects will too. But you can bet L4S has to do more work than your DataPortal_XYZ() methods, because it is a more general technology, where your methods are very specific to your needs.

 

All that said, there are people on this forum who use a DataSet to load their objects with data. That is _terribly_ slow compared to a datareader. But if it is fast enough for their needs then who cares?

 

That’s the tradeoff. It is like CanReadProperty(). You can use the fast-but-less-maintainable overload or the slower-but-more-maintainable overload. Which you choose depends on whether, for your specific app, maintainability or performance matters most. I can’t (and shouldn’t) make that decision for you, I should just allow you to make that decision with as little penalty (either way) as possible.
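
In code, the tradeoff looks roughly like this (property and field names are made up; the overload shapes follow the CSLA 2.x pattern, so check them against your version):

// Inside a BusinessBase-derived class.

// Faster, less maintainable: the property name is a string literal, so a
// rename silently breaks the authorization check.
public string Name
{
    get
    {
        CanReadProperty("Name", true);
        return _name;
    }
}

// Slower, more maintainable: the name-less overload discovers the calling
// property via the stack/reflection, so renames stay safe.
public string City
{
    get
    {
        CanReadProperty(true);
        return _city;
    }
}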

 

So as I integrate LINQ into CSLA, that’s my guiding principle. You _must_ still be able to use raw datareaders if you need maximum performance. But if you can live with (whatever) overhead LINQ imposes then you will most likely have to write a lot less code. Same kind of tradeoff.

 

Rocky

 


 



dotnetbrix replied on Tuesday, October 09, 2007

Thanks a lot for your reply. It all makes sense. I only wish my boss listened to me.

 

 

 

robert_m replied on Tuesday, October 09, 2007

Rocky wrote:

"So as I integrate LINQ into CSLA, that’s my guiding principle....."

What kind of integration do you have in mind, if it's not too early to ask?

 

RockfordLhotka replied on Tuesday, October 09, 2007

You can envision a number of possibilities (and if you have other thoughts I’d love to hear them).

 

L4S returns DTOs (basically) that can be used to load the fields of your object. DataMapper can (optionally) be used to copy this data to/from the DTO if it is enhanced along that line.
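
Something along these lines, for example (CustomerEntity is a hypothetical L4S entity, and this uses Csla.Data.DataMapper.Map as it exists today; any enhancement would build on this):

using Csla.Data;

// Inside the business object's data access code: copy matching public
// properties from the L4S entity (acting as a DTO) onto the business object,
// skipping anything listed in the ignore list to handle by hand.
private void LoadFrom(CustomerEntity data)
{
    DataMapper.Map(data, this, "Id");
    MarkOld();
}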

 

LINQ itself can be used in two modes – with projections and without projections. Depends on your select clause.

 

If you do a projection you are on your own – you get a new list of new objects that have no link to the original objects at all.

 

But if you do a “select item” then you get a new list that contains references to the real objects. Even this, by default, is bad, because this new list isn’t linked to the old list, so adding/removing items from it has no effect on the real list – and that would be counter-intuitive, or at least not nearly as powerful as today’s SortedBindingList and FilteredBindingList.
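
In code, the two shapes look like this (a self-contained toy example; Project stands in for a business object held in a list):

using System;
using System.Collections.Generic;
using System.Linq;

class Project { public int Id; public string Name; public bool IsActive; }

class Demo
{
    static void Main()
    {
        // Stand-in for a BusinessListBase-derived collection.
        var projectList = new List<Project>
        {
            new Project { Id = 1, Name = "Alpha", IsActive = true },
            new Project { Id = 2, Name = "Beta",  IsActive = false }
        };

        // "select item": the result holds references to the SAME objects, but
        // the result itself is detached - adding/removing items in it does not
        // affect projectList.
        var active = from p in projectList where p.IsActive select p;

        // Projection: brand-new anonymous-typed objects, no link back at all.
        var summary = from p in projectList select new { p.Id, p.Name };

        Console.WriteLine(active.Count() + " / " + summary.Count());
    }
}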

 

It may be possible to make LINQ return a list that IS linked to the original list. Basically allowing you to do a query (without a projection) that returns a live view of the original list (much like SBL and FLB). That’d be cool.

 

I work with Aaron Erickson, the author of the open source indexing library for LINQ. We may implement a version of that indexing code directly in CSLA. Because CSLA objects are guaranteed to be richer than standard objects, we can make LINQ do more with CSLA objects than it can do with standard objects – and that’d be cool too.

 

Finally, and this is the hard part, to really work nicely with L4S in a distributed scenario (remote data portal) it must be possible to do insert/update/delete operations efficiently. Since L4S has a context idea that isn’t n-tier, CSLA needs to help L4S overcome that limitation without making the business developer write a lot of extra code. Challenging issue…

 

Rocky

 

robert_m replied on Tuesday, October 09, 2007

Thanks for the quick reply.

We are starting a new project using CSLA and are considering using LINQ primarily to load data from db (and save it back, of course). The idea for now is just to replace SqlCommand and SqlDataReader functionality with L4S (in DataPortal_XXX methods)....

RockfordLhotka replied on Tuesday, October 09, 2007

Yes, and that works great – no problems there that I’m aware of – other than dealing with context for insert/update/delete operations.

 

Rocky

 

 




ajj3085 replied on Tuesday, October 09, 2007

RockfordLhotka:
Finally, and this is the hard part, to really work nicely with L4S in a distributed scenario (remote data portal) it must be possible to do insert/update/delete operations efficiently. Since L4S has a context idea that isn’t n-tier, CSLA needs to help L4S overcome that limitation without making the business developer write a lot of extra code. Challenging issue…


I've read an intro to Linq book, so I'm not as up on this as I'd like.  Would it be possible to have the business objects hold on to Linq objects?  I'm talking about classes marked with Table, Column, etc. attributes.  Since I already have such objects (my DAL uses them so I can have strongly typed data objects), my hope was I could just flag them as Serializable, add the additional Linq attributes, and possibly have the BO use those to store the data.

Any reason that wouldn't work?  I think it's more difficult if you don't have a timestamp column on your table, but it does seem possible to unhook the context and re-attach one later.
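
Roughly the kind of class I mean (a hand-written sketch with made-up names; the mapping attributes live in System.Data.Linq.Mapping):

using System;
using System.Data.Linq;          // Binary
using System.Data.Linq.Mapping;  // Table / Column attributes

// Serializable so it can travel with the business object through a remote
// data portal, and attribute-mapped so a DataContext can persist it.
[Serializable]
[Table(Name = "dbo.Customers")]
public class CustomerData
{
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public string Name { get; set; }

    // A timestamp/rowversion column makes optimistic concurrency (and
    // re-attaching to a new context later) much simpler.
    [Column(IsVersion = true)]
    public Binary LastChanged { get; set; }
}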

RockfordLhotka replied on Tuesday, October 09, 2007

The problem is that the context can’t be serialized, so it is lost in a remote data portal scenario, or stateless web scenario or a service-oriented scenario. You can recreate the context – but that means re-querying the database and that is inefficient.

 

So instead, you need to call some other methods to “trick” an empty context into realizing that your DTOs are “new”, “old” or “for deletion”.
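
A rough sketch of that “tricking” (all names are hypothetical, the Table<T> method names are the later RTM ones, and the branching is collapsed into one method for brevity – CSLA normally routes IsDeleted/IsNew to separate DataPortal_DeleteSelf/Insert/Update methods for you):

// Server-side update for a business object that carries a detached entity (_data).
private void Update()
{
    using (var ctx = new MyDataContext())
    {
        if (this.IsDeleted)
        {
            // Attach as unchanged, then schedule the DELETE.
            ctx.Customers.Attach(_data);
            ctx.Customers.DeleteOnSubmit(_data);
        }
        else if (this.IsNew)
        {
            // Never been persisted: schedule an INSERT.
            ctx.Customers.InsertOnSubmit(_data);
        }
        else if (this.IsDirty)
        {
            // Existing data with changes: attach "as modified" so an UPDATE is
            // generated (much easier when the table has a timestamp column).
            ctx.Customers.Attach(_data, true);
        }

        ctx.SubmitChanges();
    }

    MarkOld();
}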

 

Rocky

 

 



komminane replied on Friday, October 05, 2007

Will the LINQ anonymous classes replace the CSLA DTO concept?

dotnetbrix replied on Thursday, October 04, 2007

Hi there,

Thank you very much for such a lengthy and good reply. Apologies for not replying earlier.

I am a big fan of CSLA and I have used it in personal projects. I am trying to persuade my boss not to go down the LINQ to SQL route for the time being, until it becomes a more mature product, but to embrace CSLA instead, as it would be crazy to write a framework from scratch when there is one already available with a fairly big community too.

I am not a particular fan of ORM tools either, for all the reasons you mentioned above.

I believe too that data-centric is not the way to go; however, I am not the decision maker.

Because we are going for LINQ to SQL, I have been asked to put a framework together around it, and I am still struggling with some bits. My conclusion is that it's not ready for enterprise usage and multi-tier scenarios, but hey, I have to make it work.

I believe MS did a fantastic job with sqlmetal, but not a finished job, as there are so many limitations to the tool, and I don't have the time to put a GUI on it.

All this prompted me to ask this community for their views on integrating with CSLA. As already pointed out, it can be done, but at what cost?

I am not a big fan of reflection either, and even though it is also used in CSLA, I tend to avoid it when I can to keep performance up. There is nothing faster than raw ADO.NET, yet these ORM tools make very heavy use of reflection.

I will stop here, and I will be keeping an eye out in case somebody in the community posts an example of integration with LINQ to SQL.

Thanks again

 

 

 

Tolomaüs replied on Wednesday, October 03, 2007

> The more I look at it, the more it looks like LINQ to SQL "is" the data layer and whatever is generated by sqlmetal is the actual business layer, which means not having a physical data layer.

Indeed, I have come to the same conclusion with NHibernate as the ORM solution, and I think this conclusion is valid for any ORM. In fact, the ORM itself contains a "generic" data layer which it hides from the business layer. Instead, it provides access to the business objects themselves.

Even with this in mind, it still seems worthwhile to me to abstract away the behaviour of your specific ORM behind an interface so your business layer doesn't depend on it. You could call this additional layer a "Repository", as it groups all the methods used to access your business objects.
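
For example, a small repository seam of that kind might look like this (all names are hypothetical; the LINQ to SQL class is just one possible implementation behind the interface):

using System.Collections.Generic;
using System.Linq;

// The business layer depends only on this interface.
public interface ICustomerRepository
{
    CustomerData GetById(int id);
    IList<CustomerData> GetAll();
}

// One implementation, backed by LINQ to SQL; NHibernate or raw ADO.NET could
// sit behind the same interface without the business layer noticing.
public class LinqCustomerRepository : ICustomerRepository
{
    public CustomerData GetById(int id)
    {
        using (var ctx = new MyDataContext())
            return ctx.Customers.Single(c => c.Id == id);
    }

    public IList<CustomerData> GetAll()
    {
        using (var ctx = new MyDataContext())
            return ctx.Customers.ToList();
    }
}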

I found some interesting reads about this:

http://krisvandermotten.wordpress.com/2006/11/19/creating-a-data-access-layer-with-linq-to-sql-part-1/

and

http://debasishg.blogspot.com/2007/02/domain-driven-design-inject.html

robert_m replied on Wednesday, October 03, 2007

I am quite new to LINQ but I would agree with ajj3085's post. LINQ can be used in DataPortal_XYZ methods to load and save data. In standard scenarios there should be no need for an extra data access tier, since one of the main features of the CSLA architecture is that CSLA objects are mobile, i.e. they exist on both the client and server side as needed. So, instead of working with SqlCommands and SqlDataReaders, you can use strongly typed LINQ queries to achieve the same functionality with less code (and much more maintainable code).

Copyright (c) Marimer LLC