Transfer big data in CSLA?

Old forum URL: forums.lhotka.net/forums/t/5761.aspx


Cuong posted on Friday, November 07, 2008

I am developing a Windows Forms app; clients communicate with the server through IIS. My app needs to transfer very large data (several GBs) between clients and the server. I tried to use BOs to transfer the data but was not successful because of the limits of IIS. Does anyone have experience with this scenario? Are there any solutions or tricks to get past the limits of IIS?

sergeyb replied on Friday, November 07, 2008

You can bump up the limit in IIS.  The size is in KB (100 MB in the example below):

<system.web>
    <httpRuntime maxRequestLength="102400"/>
</system.web>

However, you may still run into other issues with this size of transfer, such as timeout issues.
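For uploads of that size, the execution timeout usually has to be raised along with the request length, and on IIS 7 the integrated pipeline enforces a separate byte-based limit as well. A sketch with illustrative values (100 MB, one-hour timeout); adjust to your own environment:

```xml
<!-- web.config: illustrative values, not recommendations -->
<system.web>
  <!-- maxRequestLength is in KB; executionTimeout is in seconds -->
  <httpRuntime maxRequestLength="102400" executionTimeout="3600" />
</system.web>
<!-- IIS 7 integrated pipeline also has its own limit, in bytes -->
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="104857600" />
    </requestFiltering>
  </security>
</system.webServer>
```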

 

Sergey Barskiy

Principal Consultant

office: 678.405.0687 | mobile: 404.388.1899

Microsoft Worldwide Partner of the Year | Custom Development Solutions, Technical Innovation

 


SonOfPirate replied on Friday, November 07, 2008

The problem you are going to run into (that I've run into) when transferring large amounts of data using BOs is related more to the type of serialization used than to the use of BOs per se.  That said, depending on the design, BOs do tend to get heavy as the number of non-static/non-shared methods and properties increases.  I've found that loading a large amount of data into business objects contained in a business collection became unwieldy very fast and made the application inefficient.

For transfer scenarios, I always use lightweight DTOs to hold the data - a class with nothing more than the bare minimum of properties and no attributes, so it consumes as little memory as possible.  This helps some.

The other issue has to do with how you serialize the objects and pass them across the wire.  XML serialization can be expensive because the payload is text and carries all of the markup overhead.  Binary serialization is the most efficient way to transfer your data.  Depending on the protocol you are using, such as SOAP, you may have to wrap the binary image in an XML envelope, but you can still accomplish the same goal.
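A rough sketch of both ideas combined (the ChunkDto type and its fields are hypothetical, not part of CSLA): a bare DTO with hand-rolled binary serialization via BinaryWriter/BinaryReader, so the wire format is raw bytes rather than XML markup.

```csharp
using System;
using System.IO;

// Hypothetical lightweight DTO: bare fields only, no CSLA base class,
// no attributes - just enough to carry one piece of the payload.
public class ChunkDto
{
    public string FileName;
    public long Offset;
    public byte[] Data;

    // Binary serialization by hand: no XML markup, just the raw bytes.
    public void Write(BinaryWriter writer)
    {
        writer.Write(FileName);
        writer.Write(Offset);
        writer.Write(Data.Length);
        writer.Write(Data);
    }

    public static ChunkDto Read(BinaryReader reader)
    {
        var dto = new ChunkDto();
        dto.FileName = reader.ReadString();
        dto.Offset = reader.ReadInt64();
        int length = reader.ReadInt32();
        dto.Data = reader.ReadBytes(length);
        return dto;
    }
}
```

If the transport requires XML (e.g. SOAP), the serialized byte array can still be carried as a single base64-encoded element, which keeps the markup overhead to one wrapper.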

Hope that helps...

 

Cuong replied on Friday, November 07, 2008

@sergeyb: Increasing the maxRequestLength value is a good trick if the data size is only about 100 MB. If the data size runs to GBs, the error still occurs.

@vdhant: The links you gave address other problems, perhaps big lists of BOs of some tens of MBs. But the data I transfer is very big (several GBs), and it is raw data.

@SonOfPirate: Thanks for your suggestions; they are very helpful. Maybe I will write the data to a temporary file and use CommandBase objects to transfer it piece by piece.
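That piece-by-piece approach can be sketched roughly like this (Send and its uploadChunk callback are illustrative stand-ins; in CSLA the callback would execute a CommandBase-derived object per chunk), so that each server call stays well under the IIS request limit:

```csharp
using System;
using System.IO;

public static class ChunkedTransfer
{
    // 4 MB per round trip - well under a 100 MB maxRequestLength.
    public const int ChunkSize = 4 * 1024 * 1024;

    // Reads the temporary file in fixed-size chunks and hands each chunk
    // (with its offset) to uploadChunk. Returns the total bytes sent.
    public static long Send(string path, Action<long, byte[]> uploadChunk)
    {
        long offset = 0;
        using (var stream = File.OpenRead(path))
        {
            var buffer = new byte[ChunkSize];
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Copy only the bytes actually read for the final short chunk.
                var chunk = new byte[read];
                Array.Copy(buffer, chunk, read);
                uploadChunk(offset, chunk); // e.g. run one CommandBase per chunk
                offset += read;
            }
        }
        return offset;
    }
}
```

Tracking the offset on each call also gives you a natural resume point if a chunk fails partway through the transfer.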

richardb replied on Saturday, November 08, 2008

Bulk loads of data can mean that you need to bypass the business objects and use an alternative solution.  We had the same problem, and creating some optimized objects specifically designed to validate and load helped performance, but then one day they wanted to import 20 million records (sales orders) and that didn't really work; SQL Server bulk import had to be used.

vdhant replied on Friday, November 07, 2008

This issue tends to come up every now and then, so have a quick search through the posts. But I think the general consensus is to find another way of solving the problem: break the data into smaller data sets, or, if you are displaying data, have a list that looks really big (the scroll bar suggests that lots of records are available) but only load the next records as the user scrolls down (note that pattern has a name, but I can't remember it).

Here are some of the posts I found just quickly looking:
http://forums.lhotka.net/forums/thread/4044.aspx - Performance of creating large collection
http://forums.lhotka.net/forums/thread/126.aspx - Handling large lists
http://forums.lhotka.net/forums/thread/23075.aspx - Caching large lists of data

Cheers
Anthony

RockfordLhotka replied on Saturday, November 08, 2008

I don't think I'd try to use the data portal or business objects to transfer GBs of data. With that much data you probably want to exploit resume support and other features that are part of native HTTP or advanced FTP protocols, so technologies like web services or WCF just aren't the right fit.

Copyright (c) Marimer LLC