Large Data with Silverlight

hanspret posted on Thursday, February 03, 2011


I have a fetch method that fetches 5000 rows. The query takes about 2 seconds to execute (measured with SQL Server Profiler), but the Silverlight application takes about 1 minute 30 seconds to display the results in a datagrid (the datagrid only displays 20 items at a time). I used the performance analyzer to see where the problem was. The method that took almost all of the time was:

| Function Name | Inclusive Samples | Exclusive Samples | Inclusive Samples % | Exclusive Samples % | Module Name |
| --- | --- | --- | --- | --- | --- |
| Csla.Serialization.Mobile.MobileFormatter.Serialize(object) | 8 368 | 0 | 69.73 | 0.00 | Csla.DLL |

Is this normal, or am I doing something wrong? And if this is normal, what strategies can I use to improve the performance?

I am using compression as explained in the Silverlight videos.
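[Editor's note: the compression hook mentioned here shrinks the payload on the wire but does nothing to reduce the serialization time itself, which is where the profile above points. A minimal Python sketch of the idea, with hypothetical names, not CSLA code:]

```python
# Illustrative sketch: serialize an object graph, then gzip the bytes
# before transfer. The names compress_payload/decompress_payload are
# hypothetical stand-ins for the data portal's compression hook.
import gzip
import pickle

def compress_payload(obj) -> bytes:
    """Serialize an object graph and gzip the resulting bytes."""
    raw = pickle.dumps(obj)
    return gzip.compress(raw)

def decompress_payload(blob: bytes):
    """Reverse: gunzip, then deserialize."""
    return pickle.loads(gzip.decompress(blob))

rows = [{"id": i, "name": f"Customer {i}"} for i in range(5000)]
blob = compress_payload(rows)
# The transfer is smaller, but the pickle.dumps (serialize) step still
# runs over the full 5000 rows -- compression cannot fix that cost.
assert decompress_payload(blob) == rows
assert len(blob) < len(pickle.dumps(rows))
```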

RockfordLhotka replied on Thursday, February 03, 2011

Serializing that much data can take some time, no doubt about it. While this isn't ideal, it is probably "normal".

There are a couple of strategies you might consider.

  1. Background loading of the data
  2. Using a direct service call from the local data portal

No matter what you do, the data will be serialized from server to client. The first option simply breaks this into chunks and does the work in the background so the user sees at least the initial data almost instantly. Look at the Samples\Silverlight\cs\PagedList sample to see how this is done.
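[Editor's note: a rough sketch of the paged-loading pattern the PagedList sample demonstrates, in Python for brevity. The function names are hypothetical; the point is that the first page is fetched synchronously so the grid has data immediately, while the remaining pages arrive on a background thread:]

```python
# Sketch: show the first page at once, load the rest in the background.
import threading

PAGE_SIZE = 20

def fetch_page(all_rows, page_index, page_size=PAGE_SIZE):
    """Stand-in for one server round trip returning a single page."""
    start = page_index * page_size
    return all_rows[start:start + page_size]

def load_in_background(all_rows, target):
    """Append pages 1..n to `target` off the UI thread."""
    def worker():
        page = 1
        while True:
            chunk = fetch_page(all_rows, page)
            if not chunk:
                break
            # In Silverlight this append would be marshalled back to
            # the UI thread so the bound datagrid updates safely.
            target.extend(chunk)
            page += 1
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t

source = list(range(100))
visible = fetch_page(source, 0)        # user sees the first 20 rows immediately
t = load_in_background(source, visible)
t.join()                               # background load completes over time
assert visible == source
```

Each page is a small serialization unit, so the user-perceived latency is one page, not the whole 5000 rows.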

The second option might help, in that the data would be serialized by a less complex serializer. You'd expose the data from the server via a standard asmx or WCF service, and then run the business object's data portal locally on the client. Your client-side data portal method would call the service to get the data. This still might not be really fast, because no matter what you do the serialization must occur. But directly using the DataContractSerializer via WCF might be enough faster than the MobileFormatter to make a difference.
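[Editor's note: the shape of that second option, sketched in Python. None of these names are CSLA APIs; the idea is only that the fetch logic runs locally on the client and delegates the wire transfer to a plain service call, so the payload goes through the service's simpler serializer instead of MobileFormatter:]

```python
# Sketch: a client-side "data portal fetch" that loads its data from a
# plain service call (JSON here stands in for DataContract/WCF).
import json

def call_data_service():
    """Stand-in for an asmx/WCF service returning serialized rows."""
    wire = json.dumps([{"id": i, "name": f"Row {i}"} for i in range(5)])
    return json.loads(wire)

class CustomerList(list):
    def data_portal_fetch(self):
        # Runs on the client because the data portal is configured to
        # be local; only the service payload crosses the network.
        for dto in call_data_service():
            self.append((dto["id"], dto["name"]))

customers = CustomerList()
customers.data_portal_fetch()
assert customers[0] == (0, "Row 0")
assert len(customers) == 5
```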

Jack replied on Friday, February 11, 2011

Just throwing out an idea here without looking into the code, but would it be possible to enhance the serializer so that it is a bit smarter: chunk up the serialized data and do almost lazy deserialization on the client end? I know you serialize the whole object graph, but would something be possible so that on the client end you deserialize the bare minimum, return the data to the client, and do the rest in the background?

// Enhanced attribute to prioritize the first 20 records, then lazy-load 20 records at a time
// [Serializable(PriorityCountSize, LazyLoadChunkSize)]

The server-side serialize process creates 'chunks'.

The client-side deserialize:

MyDataList.BackGroundDeserializeAddAppend(SerializedChunk[MyDataRecords21.x], pageSize)

Where pageSize is the number of items to deserialize in the background and add to the collection before raising CollectionChanged.

It would be up to the developer to ensure there was a proper sort on the collection before serializing it.
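[Editor's note: a Python sketch of this proposal, using entirely hypothetical names (this API does not exist in CSLA): the server splits the serialized graph into a priority chunk plus follow-up chunks, and the client deserializes only the first chunk before returning, finishing the rest in the background:]

```python
# Sketch: chunked serialization with lazy background deserialization.
import pickle
import threading

def serialize_in_chunks(rows, priority_count=20, chunk_size=20):
    """Server side: first `priority_count` rows, then fixed-size chunks."""
    chunks = [pickle.dumps(rows[:priority_count])]
    for i in range(priority_count, len(rows), chunk_size):
        chunks.append(pickle.dumps(rows[i:i + chunk_size]))
    return chunks

def background_deserialize(chunks, target):
    """Client side: deserialize the bare minimum, then the rest lazily."""
    target.extend(pickle.loads(chunks[0]))    # shown to the user immediately
    def worker():
        for blob in chunks[1:]:
            # Each chunk lands here; this is where CollectionChanged
            # would be raised per `pageSize` worth of items.
            target.extend(pickle.loads(blob))
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t

rows = [{"id": i} for i in range(100)]
chunks = serialize_in_chunks(rows)
grid_items = []
t = background_deserialize(chunks, grid_items)
t.join()
assert grid_items == rows
```

As the text notes, this only makes sense if the collection is sorted before serializing, so the priority chunk really is what the user sees first.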

Just a thought.. I know the whole process is complicated, but it is definitely a bottleneck, especially when clients have slow PCs.



RockfordLhotka replied on Friday, February 11, 2011

This isn't really something the data portal or MobileFormatter can do, because there's no way to predict which parts of the object graph the application might use at any given point in time.

But look at the PagedList sample in Samples\Silverlight\cs to see how to do background loading of a large collection using async page loading. This works great when the developer has a reasonable expectation that the user won't move off the first 1-2 pages of data until the async load has had time to complete.

Copyright (c) Marimer LLC