Massive Memory Growth when saving using DataPortal (CSLA)

xenoputtss posted on Thursday, March 06, 2014

We have been using CSLA for the better part of two years now with no real issues.  We have discovered an issue with some new code when we save a lot of rows.

Our app's memory usage typically remains below 400 MB.  With this new code (albeit with more data) our memory usage jumps from ~350 MB to just over 2.5 GB.  I traced this down, and what appears to be happening is that the data portal is doing binary XML serialization of the object (creating a 2.5 GB XML payload) and sending this over the network to the DataPortal host.


I'm not sure what the resolution is.  I'm wondering if I should change from using an IIS-hosted DataPortal to .NET Remoting.  Should I upgrade CSLA to a newer version?  Should I do something else?


We are using .NET 4.0, CSLA, an IIS-hosted DataPortal, and IIS 7.

Let me know what additional information I can provide.  I do not have a working example that I can post (if one is needed, I can try to create one).

RockfordLhotka replied on Saturday, March 08, 2014

There are a couple things to consider.

First, if you upgrade to CSLA 4 or higher you'll be able to use a new formatter we wrote to shrink the size of the serialized data. If that's an option for you, that might be the best solution.
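If memory serves, in CSLA 4 the compact formatter (MobileFormatter) is selected through an appSettings entry on both the client and the data portal host. The key name below is an assumption based on the CSLA configuration docs; verify it against your CSLA version before relying on it:

```xml
<!-- app.config / web.config, on BOTH client and data portal host -->
<appSettings>
  <!-- Assumed key name; switches serialization from BinaryFormatter
       to the compact MobileFormatter introduced with CSLA 4 -->
  <add key="CslaSerializationFormatter" value="MobileFormatter" />
</appSettings>
```

Note that MobileFormatter places requirements on how business classes store their field data (managed backing fields), so existing classes may need review before switching.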

Second, if you can't upgrade you can look at using compression. This won't help memory consumption, but it can radically reduce the amount of data transferred over the network. There's a perf cost to running compression, but it is usually more than offset by the bandwidth savings. Unfortunately it isn't until CSLA 4 that there's a really easy way to do compression in the .NET data portal, but you can find some WCF bindings that do compression.
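The reason compression pays off so well here is that serialized XML is extremely repetitive (element names, namespaces, attribute text). A quick language-neutral sketch in Python's gzip module illustrates the kind of ratio involved; in the actual app the hook would be a compressing WCF binding or a custom data portal proxy, not this code:

```python
import gzip

# Simulate a large serialized object graph: the same element names
# repeated for every row, which is what makes XML compress so well.
payload = b"<Customer><Name>Example</Name><Status>Active</Status></Customer>" * 10000

compressed = gzip.compress(payload)

print(f"original: {len(payload)} bytes, compressed: {len(compressed)} bytes")
# Highly repetitive XML typically shrinks by well over 90%,
# at the cost of CPU time on both ends of the wire.
```

The tradeoff Rocky describes is exactly this: CPU spent compressing/decompressing versus bytes saved on the network. Memory on the client is unchanged, since the full graph is still serialized before it is compressed.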

Third, and this is perhaps the most extreme option, you can look at the DiffGram sample (I think it is in the Silverlight samples folder), which shows how to send only changed data over the network. If you do this you give up the mobile object concept and many of the benefits of the data portal, but if you have a specific object graph that is very large this can be a worthwhile tradeoff to get better performance (just apply it to that one object graph).
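The essence of the diffgram idea can be sketched in a few lines. This is not CSLA's actual implementation (see the DiffGram sample for that); it's just a hypothetical illustration of sending deltas instead of the whole graph:

```python
def diff(original: dict, current: dict) -> dict:
    """Return only the fields whose values changed.

    The point: instead of serializing the entire object graph,
    only this (usually tiny) delta crosses the wire, and the
    server applies it to its own copy of the data.
    """
    return {k: v for k, v in current.items() if original.get(k) != v}

before = {"id": 42, "name": "Widget", "qty": 10, "notes": "x" * 1000}
after = dict(before, qty=12)

print(diff(before, after))  # prints {'qty': 12}
```

The cost is what Rocky notes: the server no longer receives the full mobile object, so server-side logic that expects the whole graph must be restructured around applying deltas.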


RockfordLhotka replied on Saturday, March 08, 2014

This FAQ page may be helpful:

xenoputtss replied on Monday, March 10, 2014

I think upgrading is our most logical step, since compression won't reduce memory usage on the client end.


I'm not incredibly knowledgeable in this area, but I thought I read somewhere about using .NET Remoting; would that offer any benefit for this issue?

RockfordLhotka replied on Wednesday, March 12, 2014

Remoting won't help here because it uses serialization too, and it is serialization that's consuming your memory.

Copyright (c) Marimer LLC