I would like to bring back a list of editable root objects. I don't want the objects to be children, as I don't want the entire list to be sent back to the server to save changes to one object's graph.
I also do not want the behavior of DynamicListBase, as I don't want the objects to be automatically persisted as the user exits edited rows. I want saving data to be a conscious effort on the user's part.
Given these requirements, it seems that creating a list that inherits BusinessListBase is the best route, but what is the best way to load my editable roots? Just an internal GetObject(data) method that bypasses the DataPortal_Fetch/Child_Fetch methods?
Any thoughts / suggestions / existing discussion links would be greatly appreciated!
The data portal works against any serializable type.
Therefore, if you don't need the features of BLB or DLB, then you could just create a serializable List<T> or ObservableCollection<T> or BindingList<T> as your collection type and use that.
If you actually keep the objects in the collection on the client side though, you'll need to reimplement much of DLB. Although DLB performs the automatic save operations, the most important thing it does is manage saving an individual object in the list without disrupting the list itself.
But if you just need a list to bring back a bunch of objects to the client, and you aren't going to keep the objects in the list, or use the list for anything on the client, then List<T> is a good option.
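As a rough sketch of that approach (the names `Customer`, `CustomerDal`, and the criteria-less fetch are placeholders, not actual CSLA types):

```csharp
using System;
using System.Collections.Generic;
using Csla;

// A plain serializable list used only to carry editable roots to the client.
[Serializable]
public class CustomerList : List<Customer>
{
    public static CustomerList GetCustomerList()
    {
        // The data portal works against any serializable type,
        // so this round-trips like any other root fetch.
        return DataPortal.Fetch<CustomerList>();
    }

    private void DataPortal_Fetch()
    {
        // CustomerDal is a placeholder for your data access code.
        foreach (var data in CustomerDal.GetAll())
            Add(Customer.GetCustomer(data)); // factory that loads one editable root
    }
}
```

On the client you'd then pull objects out of the list and save each one individually through its own Save() call, rather than saving the list.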
If you want to support Silverlight - use the Csla.Core.MobileXYZ types (like MobileList<T>).
Is it recommended practice to actually invoke DataPortal_Fetch for each editable root in this case, or is it acceptable to expose an internal static Fetch() method to which I pass the data directly?
In other words, is DataPortal_Fetch doing anything for me behind the scenes (invoking business rules, etc.) that I wouldn't otherwise get if I bypass it to load an object's data?
I see that authentication is being checked and each object is being set as clean and old, but it seems like DataPortal_Fetch is performing a lot of work that could be easily bypassed in the scenario that I'm after.
Any thoughts on this? Am I missing something that DataPortal_Fetch is doing for me that I would otherwise have to handle myself if I bypass it and call my own internally defined static Fetch() method?
The data portal, by default, manages the object's metastate (IsNew, IsDirty, IsChild, etc). It also clones the object graph before an update, ensuring that if the update fails, you still have a deterministic object graph in memory (otherwise you must discard and reload from scratch - if that's even possible).
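Conceptually, the clone-before-update behavior works like this (an illustration of the idea, not CSLA's actual implementation):

```csharp
using System;

public static class UpdateHelper
{
    // Sketch of the clone-before-update idea: the update runs against a copy,
    // so a failure leaves the caller's original graph untouched.
    public static T SaveCopy<T>(T original, Func<T, T> update)
        where T : ICloneable
    {
        var copy = (T)original.Clone(); // snapshot the graph first
        // If update throws, 'original' is still valid in memory;
        // on success the caller replaces its reference with the result.
        return update(copy);
    }
}
```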
The Using CSLA 4: Data Access and Using CSLA 4: Data Portal Configuration ebooks are pretty thorough in covering the data portal and data access behaviors provided by CSLA .NET.
You can entirely replace the data portal of course. In that case, you need to devise your own way to manage metastate as CSLA expects, and you need to devise some way to deal with failure during the update process. I suggest that you also need some way to abstract the persistence of objects to retain the maintainability and consistency provided by the data portal.
You can do all those things - you just need to understand what the data portal is doing so you can replicate the minimum behaviors.
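As a minimal example, an internal factory that bypasses the data portal would at least need to set the metastate itself (a sketch; `LoadData` and `CustomerData` are hypothetical placeholders for your own loading code and data shape):

```csharp
using System;
using Csla;

[Serializable]
public class Customer : BusinessBase<Customer>
{
    // Hypothetical internal factory that bypasses DataPortal_Fetch.
    internal static Customer Fetch(CustomerData data)
    {
        var obj = new Customer();
        obj.LoadData(data); // placeholder for your property-loading code
        obj.MarkOld();      // what the data portal would normally do for you:
                            // marks the object as not-new and not-dirty
        return obj;
    }
}
```

Note that MarkOld() is a protected member of BusinessBase, which is why the factory has to live inside the business class itself; that is part of what the data portal normally handles on your behalf.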
Copyright (c) Marimer LLC