Unable to write data to the transport connection

Old forum URL: forums.lhotka.net/forums/t/4171.aspx


mr_lasseter posted on Monday, January 14, 2008

I am getting the following error when trying to save an object using remoting (CSLA 3.0.2). Does anyone have any ideas?

System.IO.IOException: Unable to write data to the transport connection: An established connection was aborted by the software in your host machine. ---> System.Net.Sockets.SocketException: An established connection was aborted by the software in your host machine
   at System.Net.Sockets.Socket.Send(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
   at System.Net.Sockets.NetworkStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   --- End of inner exception stack trace ---

Thanks,
Mike

RockfordLhotka replied on Monday, January 14, 2008

Do other objects work?

mr_lasseter replied on Monday, January 14, 2008

Yes. I think it must have to do with the size of the object. I can save small objects of this type, but when I save a large one (a collection with 75,000 children) it fails with this error. This was just a test case, as I shouldn't have collections that large in production, but it would be nice to know what is causing the error.

Thanks,

Mike

RockfordLhotka replied on Monday, January 14, 2008

I’m surprised you didn’t get an exception about size. But I imagine this is the issue. If you are using the http channel, you just need to expand the allowed size for an http upload – that’s been discussed on the forum in the past and is easy to do.

 

If you are using the TCP channel then I am not sure how to adjust that setting. Maybe Ingo Rammer’s book has a clue? Or someone else here?

 

Rocky

 

 


TheSquirrelKing replied on Tuesday, May 13, 2008

Hey, sorry to bump an old thread, but could anyone tell me how to expand the allowed http upload size as Rocky suggests? I have not been able to find the posts he refers to.

Thanks!

GlutenBoy replied on Wednesday, May 14, 2008

I've been running into something similar lately. Has anyone found a solution yet?

Thx

sergeyb replied on Wednesday, May 14, 2008

How long does it take from the time that the call is initiated until the time you get the error?

 

 

Sergey Barskiy

Senior Consultant

office: 678.405.0687 | mobile: 404.388.1899

Magenic ®

Microsoft Worldwide Partner of the Year | Custom Development Solutions, Technical Innovation

 



GlutenBoy replied on Wednesday, May 14, 2008

The time varies from less than an hour to 5 hours, using the same data source every time.

sergeyb replied on Wednesday, May 14, 2008

That is one long-running process!

I had to deal with remoting timeouts on long-running processes before (not in CSLA, though). Here is what I would try:

In the web.config for the remoting web site, add a timeout like so:

 

<channel ref="http" timeout="30">

 

Unfortunately, you also have to modify the CSLA framework. In RemotingProxy.vb (or .cs), add another entry to the properties bag:

Below properties("name") = "HttpBinary"

add properties("timeout") = "30"

 

Here 30 is the timeout in seconds for a single call, so if your call can last up to 5 hours, set the timeout to 18000. You can also set the timeout to -1 for infinite; I am not sure you want to do that permanently, although you certainly can for testing.
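For reference, here is a minimal sketch of where that extra entry goes, modeled on the HTTP channel registration in the CSLA 3.x RemotingProxy. The class layout, the registration call, and the timeout unit are assumptions and should be verified against the RemotingProxy.cs/.vb in your copy of the framework.

using System.Collections;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Http;

// Sketch only: modeled on the CSLA 3.x RemotingProxy channel setup.
static class RemotingChannelSetup
{
    public static void Register()
    {
        Hashtable properties = new Hashtable();
        properties["name"] = "HttpBinary";

        // Added per the suggestion above: raise the client-side call timeout.
        // The post treats the value as seconds (18000 for a 5-hour call);
        // confirm the unit against the HTTP client channel documentation.
        properties["timeout"] = "18000";

        BinaryClientFormatterSinkProvider formatter =
            new BinaryClientFormatterSinkProvider();
        HttpChannel channel = new HttpChannel(properties, formatter, null);
        ChannelServices.RegisterChannel(channel, false);
    }
}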

 

Please let me know if this works at all for you.

Thanks.

 

Sergey Barskiy

Senior Consultant

office: 678.405.0687 | mobile: 404.388.1899

Magenic ®
Microsoft Worldwide Partner of the Year | Custom Development Solutions, Technical Innovation

 



GlutenBoy replied on Friday, May 16, 2008

Unfortunately this did not fix the problem. The process we are running creates a large number of child objects by importing information from an equally large number of files. To break up the load, we save the parent object (which contains the child object collection) every x number of files. When using remoting, the error occurs after roughly 14,000 files, depending on the value of x. When connecting directly to the database, this number varies greatly, from about 50,000 to almost 100,000 files, before we get a transaction-aborted exception. We are desperate and confused.

sergeyb replied on Friday, May 16, 2008

If you tried a timeout of -1, I am not really sure how to solve this problem traditionally. I think your best bet at this point is to convert the import process to a Windows Service that exposes a command to start the import (or you can have it poll the database for a start flag). Make sure you install the service on the same machine as the data portal. Essentially, you are converting your process to be asynchronous and using the database to poll for its status. I believe this will work. This is all based on the assumption that the import is not running on the client, but on the server (data portal).
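As a rough illustration only, here is a hypothetical sketch of that approach: a Windows Service on the data portal machine polling the database for a pending-import flag. The table, column, and method names are invented; the import itself would call your existing data access code.

using System.Data.SqlClient;
using System.ServiceProcess;
using System.Timers;

public class ImportService : ServiceBase
{
    // Poll once a minute for a pending import request.
    private readonly Timer _pollTimer = new Timer(60000);

    protected override void OnStart(string[] args)
    {
        _pollTimer.Elapsed += (s, e) => CheckForPendingImport();
        _pollTimer.Start();
    }

    protected override void OnStop()
    {
        _pollTimer.Stop();
    }

    private void CheckForPendingImport()
    {
        // Hypothetical schema: an ImportRequest table with a Status column.
        using (var cn = new SqlConnection("...connection string..."))
        using (var cm = new SqlCommand(
            "SELECT COUNT(*) FROM ImportRequest WHERE Status = 'Pending'", cn))
        {
            cn.Open();
            if ((int)cm.ExecuteScalar() > 0)
                RunImport();
        }
    }

    private void RunImport()
    {
        // Call the existing import/data access code here and write progress
        // back to the ImportRequest row so clients can poll for status.
    }
}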

Hopefully, this is an option for you.

 

Sergey Barskiy

Senior Consultant

office: 678.405.0687 | mobile: 404.388.1899

Magenic ®
Microsoft Worldwide Partner of the Year | Custom Development Solutions, Technical Innovation

 



ajj3085 replied on Friday, May 16, 2008

I did something similar, and we also had to go the service route. Our process would create about a million rows at a shot. We also had to "batch" the SQL commands; there was a point where the SQL text was too big to send to the server.
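As an illustration of what such batching might look like, here is a hypothetical sketch; the table name, column, and batch size are invented, and real code would go through your own data access layer.

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

public static void InsertInBatches(
    string connectionString, IEnumerable<string> fileNames, int batchSize)
{
    using (var cn = new SqlConnection(connectionString))
    {
        cn.Open();
        var sql = new StringBuilder();
        int count = 0;
        foreach (var name in fileNames)
        {
            // Accumulate INSERTs and flush every batchSize rows so no single
            // command's text grows too large for the server to accept.
            sql.AppendFormat(
                "INSERT INTO ImportedFile (FileName) VALUES (N'{0}');",
                name.Replace("'", "''"));
            if (++count % batchSize == 0)
            {
                using (var cm = new SqlCommand(sql.ToString(), cn))
                {
                    cm.ExecuteNonQuery();
                }
                sql.Length = 0;
            }
        }
        if (sql.Length > 0)
        {
            using (var cm = new SqlCommand(sql.ToString(), cn))
            {
                cm.ExecuteNonQuery();
            }
        }
    }
}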

TheSquirrelKing replied on Tuesday, May 27, 2008

I just thought I would give an update in case anyone else is interested or having the same problem:

In our case, the import does run on the client due to the distributed nature of our architecture. We ended up being able to partially solve the problem by modifying the <system.web> section of the Web.config as follows:

<httpRuntime maxRequestLength="2097151" executionTimeout="43210000" />

This fixed things for us in the short term, but ultimately the 2097151 KB maximum imposed by the .NET Framework ended up being too small for some Save operations (usually once the file count got up into the 100,000 to 1,000,000 range).

This has led us to our current hackish solution of just calling our stored procs for insert operations directly and thus avoiding the overhead of the CSLA Insert/Save process. I'm not too happy with having to "cheat" and go around the framework like this, but the performance requirements have made this a necessary evil.

RockfordLhotka replied on Tuesday, May 27, 2008

If you are doing an import of a large (huge?) file, you might consider streaming the file to the server and then doing the import. WCF supports the concept of streaming, or multi-block transfers, of data. The data portal doesn’t – but you could easily add another svc to your host for the purpose of “uploading” the large data to a temporary location on the server, and then triggering the import process from there – on the server.

 

Architecturally that seems like a much better overall solution.
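As a hypothetical sketch of that "upload svc" idea (the contract and member names are invented; the binding would also need its transferMode set to Streamed, and file metadata would typically travel in a message header rather than a separate call):

using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IImportUpload
{
    // Streams the raw file content to the server; the implementation would
    // write it to a temporary location on the data portal machine.
    [OperationContract]
    void Upload(Stream data);

    // Kicks off the server-side import once the upload has completed.
    [OperationContract]
    void StartImport();
}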

 

Rocky

TheSquirrelKing replied on Wednesday, May 28, 2008

Well, it's not so much a single huge file as it is roughly 1,010,000 files in this particular case, and we are looking to keep the processing of these files on the distributed clients. And as for WCF, as much as I've been looking forward to giving that a spin, we're still stuck using version 2.1.2 of the framework (long story, not my choice...).

In any case, directly calling the objects' Insert() methods and skipping the Save() overhead seems to be working well enough for now for getting the data/objects into the DB. The problem we're having now is fetching them back out again. Reconstituting a collection of a million-plus objects turns my client machines into quivering, unresponsive wrecks before they can get through the whole collection. Thus far, I haven't been able to actually complete a Fetch() operation on collections of this size, though I guess I could leave it going overnight tonight to see if it'll eventually slog through to the end.

On that note, I was wondering if anyone else has had any experience with pulling down such large collections and how to get past the performance-crippling effects of doing so.

Thanks!

RockfordLhotka replied on Wednesday, May 28, 2008

Oh, I doubt very much that you can create that many objects – that’s just not the right design.

 

You should consider using the flyweight and/or iterator design pattern if you need to process all that data. Or better yet, when processing huge amounts of data I always start by trying to do the work in stored procedures. (Not that I'm a fan of T-SQL programming, but for processing huge amounts of data it is the right tool for the job.)
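As a purely hypothetical sketch of the iterator idea (the type, table, and column names below are invented for illustration): instead of materializing a million-item collection, stream the rows through one lightweight object at a time.

using System.Collections.Generic;
using System.Data.SqlClient;

// A throwaway read-only carrier; only one instance needs to be alive at a time.
public class ImportedFileInfo
{
    public string FileName { get; private set; }
    public long Size { get; private set; }

    public ImportedFileInfo(string fileName, long size)
    {
        FileName = fileName;
        Size = size;
    }
}

public static class ImportedFileReader
{
    public static IEnumerable<ImportedFileInfo> Read(string connectionString)
    {
        using (var cn = new SqlConnection(connectionString))
        using (var cm = new SqlCommand(
            "SELECT FileName, Size FROM ImportedFile", cn))
        {
            cn.Open();
            using (var dr = cm.ExecuteReader())
            {
                while (dr.Read())
                {
                    // yield return keeps memory flat: the caller processes each
                    // row as it streams in, rather than loading them all first.
                    yield return new ImportedFileInfo(
                        dr.GetString(0), dr.GetInt64(1));
                }
            }
        }
    }
}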

 

Rocky

 

Copyright (c) Marimer LLC