CSLA Remoting fails with too much data

Old forum URL: forums.lhotka.net/forums/t/2438.aspx


jgianni posted on Tuesday, February 27, 2007

I have a Visual Basic project that is validating a large amount of data from a spreadsheet. I am using CSLA 1.5 and Visual Studio 2003. After validation I am writing to about 6 tables in an Oracle database. The total number of records written from one spreadsheet is about 2000.

I am using IIS on an application server with binary formatting. If I run this locally, not going through the application server, it works great. When I run it through the application server, I get a binary formatting error; a packet sniffer shows that underneath it is an HTTP exception, "Maximum request length exceeded". I increased the 'maxRequestLength' attribute in the httpRuntime section of the application server's web.config file, and that lets more data through (more columns of the spreadsheet get processed before it fails), but it still fails. (I physically edit the spreadsheet to remove columns while testing.)

So basically, if I reduce the amount of data enough it finally works, so it is a data volume problem in remoting.

Also, like I said, if I run locally I have no problem at all, since the serialized data is never sent to the application server, which is where the problem lies.

Has anyone had similar problems?

Thanks

xal replied on Tuesday, February 27, 2007

I've had this issue before. It's not only about the max request length; it's also about the time it takes to process. Increase the value for the timeout in httpRuntime and see if that works (I don't remember the exact name of the attribute right now).

It may still be that your stream is bigger than the value you defined. Try serializing the object to disk and see how big it is...


Andrés

jgianni replied on Tuesday, February 27, 2007

I will try 'executionTimeout' 

How would I go about serializing to disk? That seems like a great idea for troubleshooting.

Thanks..

xal replied on Tuesday, February 27, 2007

Take a look at how CSLA implements cloning (if I remember correctly, Csla.Core.ObjectCloner).

It would be exactly the same but with a file stream instead of a memory stream. Of course, you wouldn't need the deserializing step :)

It's about 5 lines of code.
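
Something along these lines, as a rough sketch assuming .NET 1.1 / VB.NET (the module name, DumpToFile and the file path are just placeholders):

    Imports System.IO
    Imports System.Runtime.Serialization.Formatters.Binary

    Public Module SerializationSizeCheck
        ' Serialize a business object to a file so you can see how big the
        ' remoting payload would be. Same idea as the ObjectCloner, but with a
        ' FileStream instead of a MemoryStream and no deserialize step.
        Public Sub DumpToFile(ByVal obj As Object, ByVal path As String)
            Dim formatter As New BinaryFormatter
            Dim stream As New FileStream(path, FileMode.Create)
            Try
                formatter.Serialize(stream, obj)
            Finally
                stream.Close()
            End Try
        End Sub
    End Module

Call it with your root business object, e.g. DumpToFile(myList, "C:\temp\payload.bin"), and then just check the file size.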

Andrés

jgianni replied on Wednesday, February 28, 2007

OK,

I was able to get the object serializing to a file.

It looks like the limit is 66 MB.

If my data is over 66 MB, I get the failure.

It doesn't matter what maxRequestLength or executionTimeout are; mine are:

maxRequestLength = 25600 (kb)
executionTimeout = 900 (seconds)
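
In the application server's web.config that looks roughly like this (httpRuntime sits under system.web; maxRequestLength is in KB, executionTimeout in seconds):

    <configuration>
      <system.web>
        <httpRuntime maxRequestLength="25600" executionTimeout="900" />
      </system.web>
    </configuration>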

So I wonder if there is a problem with .NET 1.1.

xal replied on Wednesday, February 28, 2007

Have you considered processing the spreadsheet inside the data portal (on the server) instead of on the client? That way you save the overhead of serializing / deserializing all that data...
Or maybe you could send the data in chunks of n objects.
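
For the chunking idea, very roughly something like this (RecordList, NewList and AddRecord are placeholders for whatever your editable root collection looks like, and 500 is an arbitrary chunk size):

    Imports System.Data

    Public Module ChunkedSave
        ' Rough sketch: save the validated rows in batches instead of one huge
        ' object graph, so each remoting call stays well under the size limit.
        ' RecordList / NewList / AddRecord are placeholders for your own classes.
        Public Sub SaveInChunks(ByVal validatedRows As DataTable)
            Const ChunkSize As Integer = 500
            Dim batch As RecordList = RecordList.NewList()
            Dim row As DataRow

            For Each row In validatedRows.Rows
                batch.AddRecord(row)
                If batch.Count = ChunkSize Then
                    batch.Save()                  ' each Save is a separate trip to the app server
                    batch = RecordList.NewList()  ' start a fresh batch
                End If
            Next

            If batch.Count > 0 Then
                batch.Save()
            End If
        End Sub
    End Module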

Andrés

pelinville replied on Tuesday, February 27, 2007

I also remember a bug in .NET version 1.1 that had to do with serialization and size.  2.0 fixed it.

Copyright (c) Marimer LLC