Hi All,
I have a strange problem at the moment.
We are writing a toolkit to let us easily use NHibernate as the data layer for CSLA, and we have come across an issue when consuming a collection within Silverlight 5: the serialized size of a collection of objects grows massively when we try to save it.
For example, the collection I am using contains 4,000 items with approximately 10 fields per item. When loading this object into Silverlight, Fiddler reports the data size as 750,000 bytes; however, when I then edit one property of one object from 5 to 6 and save, the data passed back to the server is now 3.5 MB.
I have tried a number of collections and they all show the same behaviour. Is this normal in Silverlight and CSLA? I have to admit I never noticed it in any of the previous projects we have written.
Also, what is the best way to capture the data before it gets binary encoded and zipped, so I can see what the difference between the two payloads is?
If I override the OnDeserialized method the objects look the same; it seems to be adding extra fluff that is then stripped out again.
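For reference, something like this is roughly what I had in mind for measuring the payload on the client before the proxy compresses it (just a sketch; I'm assuming MobileFormatter's Serialize(Stream, object) overload is the right entry point, and the class and method names here are my own placeholders):

using System.IO;
using Csla.Serialization.Mobile;

public static class PayloadInspector
{
    // Serialize a business object with MobileFormatter directly so we can
    // look at (and measure) the raw payload before it is compressed by the
    // data portal proxy.
    public static long GetSerializedSize(object businessObject)
    {
        var formatter = new MobileFormatter();
        using (var buffer = new MemoryStream())
        {
            formatter.Serialize(buffer, businessObject);
            return buffer.Length; // bytes before compression/encoding
        }
    }
}

Calling that on the list right after it loads, and again after the edit, should show where the extra bytes are coming from.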
Any help would be appreciated
Cheers
Daniel
That doesn't sound normal and would probably require looking at your BO to determine the cause.
Would DynamicListBase<T> be a better option for handling edits to your large collection? Then your updates would be for just the items that changed.
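Something along these lines is what I mean; this is only a sketch, and the type and property names are placeholders rather than anything from your toolkit:

using System;
using Csla;

[Serializable]
public class ItemList : DynamicListBase<ItemEdit>
{
    // Each ItemEdit is an editable root. When a row is committed
    // (for example from a DataGrid), only that item goes through
    // the data portal instead of the whole list.
}

[Serializable]
public class ItemEdit : BusinessBase<ItemEdit>
{
    public static readonly PropertyInfo<string> NameProperty =
        RegisterProperty<string>(c => c.Name);
    public string Name
    {
        get { return GetProperty(NameProperty); }
        set { SetProperty(NameProperty, value); }
    }
}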
You should also consider:
Loading that many objects for the purpose of a single edit or a few edits in the list may not be an optimal solution, as CSLA will send the entire list back to the server even when only one or a few items have been modified.
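One common pattern is a read-only list for display plus a single editable root that you fetch and save only when the user actually edits a row, roughly like this (again just a sketch with placeholder names):

using System;
using Csla;

[Serializable]
public class ItemInfoList : ReadOnlyListBase<ItemInfoList, ItemInfo>
{
    // Lightweight, read-only rows for binding the grid.
}

[Serializable]
public class ItemInfo : ReadOnlyBase<ItemInfo>
{
    public static readonly PropertyInfo<int> IdProperty =
        RegisterProperty<int>(c => c.Id);
    public int Id
    {
        get { return GetProperty(IdProperty); } // loaded via LoadProperty in the fetch
    }
}

// When the user edits a row, fetch just that item as an editable root
// (the factory method below is hypothetical) and save it, so only the
// one object crosses the data portal:
//
// ItemEdit.GetItemEdit(info.Id, (o, e) =>
// {
//     var item = e.Object;
//     item.Name = "new value";
//     item.BeginSave();
// });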