SqlCommand cm;
//... open the connection and set up the command ...
using (SqlDataReader dr = cm.ExecuteReader())
{
    while (dr.Read())
        yield return GetExportPiece(dr); // stream one row at a time instead of buffering the whole result set
}
This reduces my peak memory usage from over 2 GB to a quite comfortable 150 MB.
However, this breaks the "use DataPortal and DataPortal_* to perform all database communication" rule.
Does CSLA contain anything that looks like this, or that otherwise supports processing quantities of data too large to fit in memory?

First, do you really need to bring all those rows back to the client? In a 3- or 4-tier deployment that'd be amazingly slow! And even in a 2-tier environment it seems very iffy.
Second, does the user actually see or interact with this? Or is this a non-interactive batch job? I can't imagine a user paging through a million rows - they'd be there for days, weeks, or maybe months!
If this is non-interactive, it is probably best to encapsulate the work in a Command object's DataPortal_Execute() method and just use ADO.NET there - skip the business objects entirely (or use your IEnumerable approach).
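The non-interactive shape would look roughly like this - a minimal sketch in the older CommandBase style this thread uses, where ExportJob and the work inside DataPortal_Execute() are just placeholders:

[Serializable]
public class ExportJob : Csla.CommandBase
{
  public static void RunExport()
  {
    // the whole export runs server-side (on the app server in a 3-/4-tier deployment)
    Csla.DataPortal.Execute<ExportJob>(new ExportJob());
  }

  protected override void DataPortal_Execute()
  {
    // plain ADO.NET here: open the connection, loop over the
    // SqlDataReader and write the export output directly -
    // no business objects are created or serialized back
  }
}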
If this data really does need to come to the client, ouch! In that case I'd suggest you use paging with a ReadOnlyBase command object, and a variation of your IEnumerable implementation.
What you have now restricts you to a 2-tier deployment. There's no way that code can ever work in a 3- or 4-tier deployment scenario.
Here's the basic idea:
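In very rough terms, something like this - the names (ExportPage, PageCriteria, ExportPiece) and the paging query are just placeholders:

using System;
using System.Collections.Generic;

[Serializable]
public class ExportPage : Csla.ReadOnlyBase<ExportPage>
{
  [Serializable]
  private class PageCriteria
  {
    public int PageIndex;
    public int PageSize;
    public PageCriteria(int pageIndex, int pageSize)
    {
      PageIndex = pageIndex;
      PageSize = pageSize;
    }
  }

  private int _pageIndex;
  private List<ExportPiece> _items = new List<ExportPiece>();

  // ExportPiece itself must be serializable so it can cross the data portal
  public List<ExportPiece> Items
  {
    get { return _items; }
  }

  protected override object GetIdValue()
  {
    return _pageIndex;
  }

  // factory - the only way the client gets a page
  public static ExportPage GetPage(int pageIndex, int pageSize)
  {
    return Csla.DataPortal.Fetch<ExportPage>(new PageCriteria(pageIndex, pageSize));
  }

  // runs on the server (or app server) via the data portal
  private void DataPortal_Fetch(PageCriteria criteria)
  {
    _pageIndex = criteria.PageIndex;
    // plain ADO.NET here: run a paged query (TOP/ROW_NUMBER or similar)
    // and load at most criteria.PageSize ExportPiece items into _items
  }
}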
So your actual list object will never go through the data portal - but this page loader (ReadOnlyBase) object will use the data portal to bring back a page of data at a time - giving you the same basic result you have now, but in a way that works with the data portal and 3- or 4-tier deployments.
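The "variation of your IEnumerable implementation" then just walks the pages on the client - again, illustrative code only:

public static IEnumerable<ExportPiece> GetAllPieces(int pageSize)
{
  int pageIndex = 0;
  while (true)
  {
    // one data portal round trip per page
    ExportPage page = ExportPage.GetPage(pageIndex, pageSize);
    if (page.Items.Count == 0)
      yield break;               // no more data
    foreach (ExportPiece item in page.Items)
      yield return item;         // stream items to the caller
    pageIndex++;
  }
}

Each iteration holds only one page's worth of ExportPiece objects in memory, so the client keeps roughly the same memory profile as the original reader-based loop.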
Yes, I do need to bring the rows back; the user may want to see the individual rows. Granted, this is much more likely on smaller result sets -- e.g. the items created on a certain day or in a certain week instead of all-time -- but I don't want to deal with multiple code paths.
I think I understand what's supposed to happen here now. Thanks.