Worker Processes and Large Objects

Old forum URL: forums.lhotka.net/forums/t/4806.aspx


Brian O posted on Tuesday, May 06, 2008

Two related questions here:

  What is a good number of worker processes to assign to the App Pool used by the Remoting Host on IIS 6.0?

Second part is:

  If you have a BusinessBase object that contains a large byte array in a variable, is there a usage recommendation for hosting such an object? For example, a business object that retrieves the contents of a file and stores it in the byte array during DataPortal_Fetch. We are seeing IIS run out of memory periodically, and under Task Manager w3wp.exe is gobbling up memory.

Thanks in advance,

Brian

tmg4340 replied on Tuesday, May 06, 2008

Brian O:

  What is a good number of worker processes to assign to the App Pool used by the Remoting Host on IIS 6.0?

I can't speak to this one - sorry.

Brian O:

Second part is:

  If you have a BusinessBase object that contains a large byte array in a variable, is there a usage recommendation for hosting such an object? For example, a business object that retrieves the contents of a file and stores it in the byte array during DataPortal_Fetch. We are seeing IIS run out of memory periodically, and under Task Manager w3wp.exe is gobbling up memory.

It sort of depends on what you intend to use the byte array for.  I am presuming that you're getting the file to stream it back to the browser/client/etc.

In any event, the standard technique is to read the file in chunks and stream those chunks as you get them.  If you look at the "GetBytes" method on a DataReader, it lets you read the field in pieces, so you can get by with a much smaller in-memory buffer.  However, depending on how you are getting that back to the client, you may have some issues.
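
For what it's worth, here's a rough sketch of that chunked GetBytes approach, using SqlDataReader with CommandBehavior.SequentialAccess.  The table, column and parameter names (Documents, Content, @id) are just made up for the example:

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class FileStreamer
{
  public static void CopyFileContent(string connectionString, int documentId, Stream output)
  {
    const int BufferSize = 64 * 1024; // reuse one 64 KB buffer instead of one huge allocation
    byte[] buffer = new byte[BufferSize];

    using (SqlConnection cn = new SqlConnection(connectionString))
    using (SqlCommand cm = new SqlCommand("SELECT Content FROM Documents WHERE Id = @id", cn))
    {
      cm.Parameters.AddWithValue("@id", documentId);
      cn.Open();
      // SequentialAccess keeps the reader from buffering the entire BLOB in memory
      using (SqlDataReader dr = cm.ExecuteReader(CommandBehavior.SequentialAccess))
      {
        if (dr.Read())
        {
          long offset = 0;
          long bytesRead;
          while ((bytesRead = dr.GetBytes(0, offset, buffer, 0, BufferSize)) > 0)
          {
            output.Write(buffer, 0, (int)bytesRead);
            offset += bytesRead;
          }
        }
      }
    }
  }
}

The point is that the worker process only ever holds one 64 KB chunk at a time - as long as you write each chunk out (to a response stream, a file, whatever) instead of accumulating them all into a single byte array.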

Ultimately, you have to get the whole thing back in your BusinessBase object, right?  What you might look into is creating a "placeholder" property for your file contents.  When the property is accessed, you execute a command object through the DataPortal that retrieves the file.  That way, you have a separate execution path for all that data, which can make the "chunking" process back to your business object easier.  Yes, it introduces a potential performance hit on your object, and depending on how you use it, it might be a little cumbersome.  But it's the best way I can think of to manage such large files.  And if the user never needs the file, it's never loaded unnecessarily.
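
A bare-bones sketch of that placeholder idea, using CSLA's CommandBase and DataPortal.Execute (the class name, the Id property, and the file-store path are all hypothetical, and the exact DataPortal_Execute signature can vary a bit between CSLA versions):

[Serializable]
public class GetFileContentCommand : Csla.CommandBase
{
  private int _documentId;
  private byte[] _content;

  public byte[] Content
  {
    get { return _content; }
  }

  public GetFileContentCommand(int documentId)
  {
    _documentId = documentId;
  }

  protected override void DataPortal_Execute()
  {
    // Runs on the server side of the data portal - load the file contents here.
    // The path below is a stand-in; you'd pull from your database or file store.
    string path = System.IO.Path.Combine(@"C:\FileStore", _documentId + ".dat");
    _content = System.IO.File.ReadAllBytes(path);
  }
}

// In the business object, the placeholder property only pays the cost when it's used:
private byte[] _fileContents;
public byte[] FileContents
{
  get
  {
    if (_fileContents == null)
    {
      GetFileContentCommand cmd = new GetFileContentCommand(this.Id); // Id is assumed here
      cmd = Csla.DataPortal.Execute<GetFileContentCommand>(cmd);
      _fileContents = cmd.Content;
    }
    return _fileContents;
  }
}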

There is a school of thought that says you shouldn't store large objects in a database - you should store pointers to the object instead.  That's a whole other discussion, though.

HTH

- Scott

Copyright (c) Marimer LLC