How to pass external values and return values from Data Portal

Old forum URL: forums.lhotka.net/forums/t/7230.aspx


Vinodonly posted on Monday, July 06, 2009

What is the best way to pass external values to, and return values from, the data portal?

When I say external values, I mean values that come from outside the BO.

There are some settings that are set globally in my app, and I want to access them from data portal calls.

Secondly, I want to return values from the data portal. For example, in certain cases I don't want to throw an error but instead display a message, so how can I return that message back to the UI from the data portal?

triplea replied on Monday, July 06, 2009

If I understand correctly, this applies only to DataPortal_Fetch(). If that is the case, then I would use the criteria object that is passed in for such things.

"There are some settings which are set globally in my App, so I want to access this from DataPortal Calls."
Are you using a remote data portal? Otherwise your app settings should be available directly in the data portal code. If not, you could create a criteria base class where these values are defined, and then extend this base criteria class in each of your BOs.

"Secondly I want to return values from DataPortal"
Again, in your base criteria class you could have a public string property Result { get; set; } which you set in your DataPortal_XYZ method. Then it is up to you how you want to propagate the value to the UI.
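
For illustration, a minimal sketch of that idea in CSLA 3.x style (AppCriteriaBase, Region and TenantId are illustrative names, not part of CSLA):

    [Serializable]
    public abstract class AppCriteriaBase : Csla.CriteriaBase
    {
        // Global settings carried into the data portal call.
        public string Region { get; set; }
        public int TenantId { get; set; }

        // Set inside a DataPortal_XYZ method on the server side.
        public string Result { get; set; }

        protected AppCriteriaBase(Type objectType)
            : base(objectType)
        { }
    }

Keep in mind that with a remote data portal the server receives a deserialized copy of the criteria, so a Result set there does not automatically travel back to the client; that limitation is what pushes the rest of the thread toward GlobalContext.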

HTH

Vinodonly replied on Monday, July 06, 2009

Kindly note:

1. The values are used in all data portal calls for some BOs.

2. At first I thought of using criteria, but there are more than 15 global values, so I dropped the idea.

3. I'm not using a remote data portal for the time being, but I want to keep my app flexible, because if I add one at some point I would otherwise have to change my app. That is actually a goal of CSLA as well.

After going through the book again, I think the solution is GlobalContext, which is passed from client to server, and any changes made on the server are passed back.

Secondly, I'm looking at the Roles object, which uses hooks like OnDeserialized, OnDataPortalInvoke, etc., so I think the key is there.

But if somebody who has already used or tried this can help, it would be very helpful.

RockfordLhotka replied on Monday, July 06, 2009

Yes, if you have values that need to be available (and modifiable) on both client and server during each data portal call, then ApplicationContext.GlobalContext is the answer.
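
For illustration, a minimal sketch of that round trip (the keys and the Order.GetOrder factory are hypothetical):

    // Client side, before the data portal call:
    Csla.ApplicationContext.GlobalContext["Region"] = "EMEA";

    Order order = Order.GetOrder(42);   // triggers DataPortal_Fetch on the server

    // Client side, after the call - changes made on the server have come back:
    string message = (string)Csla.ApplicationContext.GlobalContext["ResultMessage"];

    // Server side, inside DataPortal_Fetch:
    //   string region = (string)Csla.ApplicationContext.GlobalContext["Region"];
    //   Csla.ApplicationContext.GlobalContext["ResultMessage"] = "loaded from cache";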

You really must be aware, however, that this is not free. All the objects in GlobalContext are serialized and deserialized from client to server, and again from server to client, on every data portal call. That serialization takes time, and of course the size of the byte stream on the wire matters.

So while the data portal does provide the plumbing to allow you to pass values back and forth like this, it is absolutely up to you to be responsible in using that plumbing!

Remember that it is almost always cheaper to recreate/reload/recalculate values than it is to transfer them over the wire. For example, this is why CSLA reloads business/validation/authz rules rather than transferring them over the wire. Also, LINQ to CSLA indexes are maintained on both client and server, because that's cheaper than transferring the data all the time.

In other words, think long and hard, and only transfer data that you absolutely can't get/create/calculate/generate in other ways.

Vinodonly replied on Monday, July 06, 2009

I thought it was an easy way out and had already started my implementation, but I never thought about this point.

That is why you are the framework developer! I also take this opportunity to thank you from the bottom of my heart. I have learned so much from your books and the advice provided on this forum. I have no words to express that.

I have a small question here: when the local data portal is used, is the serialization still done, or is it simply skipped?

RockfordLhotka replied on Monday, July 06, 2009

In version 3.5 and higher, by default the object graph is serialized when you invoke the data portal on a root object.

Even in local mode the object graph is serialized once, to handle the case where the database throws an error in the middle of updating a set of objects. In that case, your object graph would be broken because some objects would have new (but now invalid) primary key values and timestamp values.

That scenario has never been a problem with a remote data portal, because the original object graph was still on the client, and the broken graph on the server is essentially discarded.

But for a long time (everything up to 3.5 basically) the local data portal would leave you with a broken object graph. In some of my older books I recommended writing UI code to clone the object graph and to have a try..catch block to solve this issue - but that's really not good, because it isn't a UI concern, and because most people didn't do this extra work.

So in 3.0 I added the option for the data portal to make the problem go away, but you had to turn it on (with the AutoCloneOnUpdate setting). And in 3.5 I changed the AutoCloneOnUpdate default to true, so the data portal does the right thing (cloning the object graph) by default, and you need to turn it off if you want the older (arguably broken) behavior.
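
For reference, a configuration sketch for turning that default back off, assuming the CSLA 3.x appSettings key is CslaAutoCloneOnUpdate (verify the key name against your version of the framework):

    <configuration>
      <appSettings>
        <!-- Default in CSLA 3.5+ is true; false restores the pre-3.5 behavior
             and the risk of a broken graph after a failed local update. -->
        <add key="CslaAutoCloneOnUpdate" value="false" />
      </appSettings>
    </configuration>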

Rocky

Vinodonly replied on Tuesday, July 07, 2009

Thanks for the detailed reply. This is very clear now.

tdrake replied on Wednesday, September 02, 2009

Correct me if I'm wrong.

I'm saving my object, and the save action gets messages back from DataPortal_Update that are not available on a DataPortal_Fetch. I need to return these to the user, so I'm using the context to pass the object back.

This works really well, and it saves me from using a hack (saving on a fetch, returning the result into the BO, and then manually marking it clean, etc.).

Is this an OK pattern for this problem?

RockfordLhotka replied on Wednesday, September 02, 2009

Using GlobalContext to transfer “real data” is a hack. That is absolutely outside the design intent for the context values.

If you have a save operation that needs to retrieve more data than just the updated object graph, then I suggest using a command object to wrap the more complex operation. This is often called a unit of work object, a process object, or an orchestration object, and it allows you to build your object model to clearly manage more complex operations.
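
For illustration, a minimal sketch of such a unit of work command in CSLA 3.x style (SaveOrderCommand, Order and OrderDal are hypothetical names; depending on your CSLA version, DataPortal_Execute may be an override or a private method located by reflection):

    [Serializable]
    public class SaveOrderCommand : Csla.CommandBase
    {
        private Order _order;           // the object graph being saved
        private string _resultMessage;  // extra information produced on the server

        public Order Order { get { return _order; } }
        public string ResultMessage { get { return _resultMessage; } }

        private SaveOrderCommand(Order order)
        {
            _order = order;
        }

        public static SaveOrderCommand Execute(Order order)
        {
            // Runs the command through the data portal, local or remote.
            return Csla.DataPortal.Execute<SaveOrderCommand>(new SaveOrderCommand(order));
        }

        protected override void DataPortal_Execute()
        {
            // Server side: perform the update and capture any message the back
            // end returns, instead of smuggling it through GlobalContext.
            _resultMessage = OrderDal.Update(_order);   // hypothetical DAL call
        }
    }

The UI then reads both the updated Order and the ResultMessage from the returned command instance.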

tdrake replied on Wednesday, September 02, 2009

Thanks for the quick reply!

I've used commands a lot for this too, but in this case I'm using your CRUD model against an ERP system (CSLA over Microsoft Dynamics AX, anyone?). The AX class (which is interfaced via a simple XML gateway) sometimes passes back a rich error status on save (less than 1% of the time), and I'm using a simple token object to pass that back via the ApplicationContext. All the 'saves' are successful from everyone's point of view (there is no such thing as 'invalid' in this business context), so a fetch is called and the core BO is rebuilt. The ERP treats the client as stateless and doesn't know about the return message from the original save.

By using the basic Create/Update/New data portal operations I can tie it to WPF data binding using the built-in WPF commanding (Save) in this application (a public weighbridge WPF UI with the ERP on the back end as its 'business logic and database'!).

I just found ApplicationContext tonight and the overhead is well worth it; it's very simple to use and provides me with the exception handling I need (most data is not returned into a CSLA object, but presented in a standard read-only WPF UI).

I've found command objects very useful for heavy and lazy data (I've used CSLA to build a simple pub/sub message bus which passes heavy XML around) in other solutions, as you point out.

I do struggle crossing the ERP and custom .NET app domains; there are a lot of people in each camp, but not many working across both. Most solutions to this type of problem would build a stand-alone app for the weighbridge (or POS, or warehouse management) which passes data back and forth to the ERP via a separate process, and they never quite get it right (duplicated data and business logic).

Thanks again for a fab framework. I'm currently introducing three experienced .NET devs (3 to 5 years' experience) to it on this project, and they are starting to see its power.

Copyright (c) Marimer LLC