I'm hoping that someone can point me in the right direction for my SL4 / CSLA4 app.
A SQL recordset containing 2,500 rows (about 16 columns) is returned from the local DP. The proc executes in under one second; however, looping through each record in the ROLB DP method and passing it to the ROB via DataPortal.FetchChild (e.g. var item = DataPortal.FetchChild<ItemStatusInfo>(data);) takes approximately 8 seconds.
To make matters worse, when I point the app to our remote DP on an application server, the WCF service generally times out after 30 seconds when trying to run the exact same query. I don't know if this second issue is due to trying to send that much info across the wire, or something else. Regardless, neither scenario is acceptable. (Note - I am using compression following Rocky's example in his e-book.)
I've followed the examples that I've seen for ROLB / ROB objects and am wondering if there is any way to significantly reduce the time it takes to load all this data in order to show in a data grid. I remember reading about a lazy load pattern, but am hoping to get some sense of direction before going down the wrong path. Thanks.
Lazy load won't help here - it would just push the problem to later.
I guess for me the first question is "what does your Child_Fetch method look like?" After that would be questions about the DataPortalInvoke methods...
(I guess I'm saying that some code might help us diagnose the issue. )
FWIW, I also think this is a rather large amount of data to be pumping back and forth. A 2,500-row grid with 16 columns just doesn't strike me as being very useful to a user. I would investigate a paging/searching scheme to reduce that row count. That won't fix your 8-second load time, but it will likely help with the timeout issues you're seeing in your remote tests.
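A paging scheme along those lines could be sketched as follows. This is a framework-agnostic illustration, not CSLA API: PagedCriteria and FetchPage are hypothetical names, and the in-memory range stands in for the real SQL result set (in practice the proc would take page parameters, e.g. via OFFSET/FETCH).

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical criteria object carried to the server so each DataPortal
// round trip returns only one page instead of the full 2,500 rows.
public class PagedCriteria
{
    public int PageIndex { get; set; }
    public int PageSize { get; set; }
}

public static class Program
{
    // Stand-in for the server-side fetch: slice one page out of the result set.
    public static IEnumerable<int> FetchPage(IEnumerable<int> rows, PagedCriteria criteria)
    {
        return rows.Skip(criteria.PageIndex * criteria.PageSize)
                   .Take(criteria.PageSize);
    }

    public static void Main()
    {
        var all = Enumerable.Range(1, 2500); // stand-in for the 2,500-row result set
        var page = FetchPage(all, new PagedCriteria { PageIndex = 2, PageSize = 50 }).ToList();
        Console.WriteLine(page.Count);   // 50
        Console.WriteLine(page.First()); // 101
    }
}
```

With a page size of 50, each WCF call carries 50 rows instead of 2,500, which directly attacks the remote timeout.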
HTH
- Scott
Could you post the code for the fetch?
Small test on my virtual Win7 dev computer loads a readonly list with 5 properties as:
20,000 rows: 0.343 s
8,000 rows: 0.109 s
2,000 rows: 0.031 s
Note - this test uses no data access - just in-memory handling.
So the most likely cause would be running new SQL queries against the database for each item (lookups?).
Test:
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace Csla.Test
{
  [TestClass]
  public class ReadOnlyListTest
  {
    [TestMethod]
    public void Load20Rows()
    {
      int count = 20;
      var list = ItemList.GetList(count);
      Assert.IsTrue(list.Count == count);
    }

    [TestMethod]
    public void Load2000Rows()
    {
      int count = 2000;
      var list = ItemList.GetList(count);
      Assert.IsTrue(list.Count == count);
    }

    [TestMethod]
    public void Load8000Rows()
    {
      int count = 8000;
      var list = ItemList.GetList(count);
      Assert.IsTrue(list.Count == count);
    }

    [TestMethod]
    public void Load20000Rows()
    {
      int count = 20000;
      var list = ItemList.GetList(count);
      Assert.IsTrue(list.Count == count);
    }
  }

  [Serializable]
  public class Item : ReadOnlyBase<Item>
  {
    public static readonly PropertyInfo<int> Name1Property = RegisterProperty<int>(c => c.Name1);
    public int Name1
    {
      get { return GetProperty(Name1Property); }
      internal set { LoadProperty(Name1Property, value); }
    }

    public static readonly PropertyInfo<int> Name2Property = RegisterProperty<int>(c => c.Name2);
    public int Name2
    {
      get { return GetProperty(Name2Property); }
      internal set { LoadProperty(Name2Property, value); }
    }

    public static readonly PropertyInfo<int> Name3Property = RegisterProperty<int>(c => c.Name3);
    public int Name3
    {
      get { return GetProperty(Name3Property); }
      internal set { LoadProperty(Name3Property, value); }
    }

    public static readonly PropertyInfo<int> Name4Property = RegisterProperty<int>(c => c.Name4);
    public int Name4
    {
      get { return GetProperty(Name4Property); }
      internal set { LoadProperty(Name4Property, value); }
    }

    public static readonly PropertyInfo<int> Name5Property = RegisterProperty<int>(c => c.Name5);
    public int Name5
    {
      get { return GetProperty(Name5Property); }
      internal set { LoadProperty(Name5Property, value); }
    }

    private void Child_Fetch(int i)
    {
      Name1 = i;
      Name2 = i;
      Name3 = i;
      Name4 = i;
      Name5 = i;
    }
  }

  [Serializable]
  public class ItemList : ReadOnlyListBase<ItemList, Item>
  {
    public static ItemList GetList(int count)
    {
      return DataPortal.Fetch<ItemList>(count);
    }

    protected override void DataPortal_Fetch(object criteria)
    {
      var cnt = (int)criteria;
      this.IsReadOnly = false;
      this.RaiseListChangedEvents = false;
      for (int i = 0; i < cnt; i++)
      {
        var item = DataPortal.FetchChild<Item>(i);
        this.Add(item);
      }
      this.RaiseListChangedEvents = true;
      this.IsReadOnly = true; // re-lock the list once loading is complete
    }
  }
}
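The per-item-lookup cause suggested above can be sketched in isolation. This is an illustration, not CSLA API: the tuple array stands in for a real SQL lookup table, and the point is to fetch the lookup once into a dictionary before the child loop rather than issuing a query per child.

```csharp
using System;
using System.Linq;

public static class Program
{
    public static void Main()
    {
        // Stand-in for a SQL lookup table (e.g. status code -> status name).
        var lookupRows = new[] { (1, "Open"), (2, "Closed"), (3, "Pending") };

        // Fetch the lookup once, up front, instead of once per child...
        var statusNames = lookupRows.ToDictionary(r => r.Item1, r => r.Item2);

        // ...then each "child" resolves its code with an O(1) dictionary hit
        // instead of a fresh database round trip.
        var childStatusCodes = new[] { 2, 1, 3, 2 };
        var names = childStatusCodes.Select(c => statusNames[c]).ToList();

        Console.WriteLine(string.Join(",", names)); // Closed,Open,Pending,Closed
    }
}
```

With 2,500 children, this turns 2,500 lookup queries into one.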
Thank you for your responses. As I was putting together examples of my code, I found the issue (which was entirely my fault). The following line appeared at the start of the Child_Fetch method:
WCFUserEventProvider ev = new WCFUserEventProvider();
So each of the 2,500 ROB objects being loaded unnecessarily instantiated a new WCFUserEventProvider. When I moved that line into the catch block, the time dropped from about a minute (on the remote DP) to about three seconds.
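The general pattern behind that fix is hoisting expensive object creation out of the per-child loop. A minimal sketch, where ExpensiveProvider is a stand-in for WCFUserEventProvider and the 1 ms constructor delay is an assumption used only to make the cost visible:

```csharp
using System;
using System.Diagnostics;

// Stand-in for an object that is costly to construct (like WCFUserEventProvider).
public class ExpensiveProvider
{
    public ExpensiveProvider() { System.Threading.Thread.Sleep(1); }
}

public static class Program
{
    public static void Main()
    {
        const int rows = 200;

        // Anti-pattern: one provider constructed per child.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < rows; i++) { var ev = new ExpensiveProvider(); }
        var perChild = sw.Elapsed;

        // Fix: construct it once outside the loop (or only in the error path,
        // as described above), so the hot loop pays nothing.
        sw.Restart();
        var shared = new ExpensiveProvider();
        for (int i = 0; i < rows; i++) { /* children use "shared" only if needed */ }
        var hoisted = sw.Elapsed;

        Console.WriteLine(perChild > hoisted); // True
    }
}
```

The per-child cost multiplies by the row count, which is why a seemingly small constructor turned 2,500 rows into a minute-long fetch.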
Additionally, as Scott pointed out, a user is unlikely to make constructive use of 2,500 rows of data, so I will ask the project lead to revisit this. It was late when I posted and I was mildly panicked, because this is the first team app done in SL/CSLA after all my pushing over the last year to use this technology.
Copyright (c) Marimer LLC