Root Object & Root List Guidance

Old forum URL: forums.lhotka.net/forums/t/7694.aspx


SonOfPirate posted on Monday, September 28, 2009

I have an application under development that is designed to support my primary business object as a root object (saving, updating, and deleting itself).  However, our requirements have changed and we are now considering "caching" the objects, in other words maintaining a list of objects, so that we can access items quickly from the list rather than re-loading them each time they are needed.

The application runs behind a service interface, so data-binding or other interaction with the list is not required.  We simply need to iterate through the list for our new operation and be able to retrieve a single item by key value for others.

My initial thought was to go with a root list and have the list maintain the objects.  For instance, creating a new BO would be done by calling the Add method on the list.  However, I don't want to incur the overhead of calling Save on the list each time I perform a singular action (Add/Delete/Modify).  Plus, this is a multi-threaded app so blocking access is a concern.

I'm open to any suggestions at this point.  I have the 4 basic CRUD operations plus another that requires us to iterate through the list of BOs.  How would you approach this keeping performance and multi-threading in mind?

 

RockfordLhotka replied on Monday, September 28, 2009

Remember that CSLA .NET (like 99% of .NET) is not threadsafe. Using a single object (list or otherwise) from multiple threads at the same time will cause you problems.

In other words, caching read-only data is usually fine - as long as you don't have authorization rules, because those are thread-sensitive: the .NET principal is carried on the thread.

But caching read-write data in a multi-threaded or multi-user server setting will be problematic.

SonOfPirate replied on Monday, September 28, 2009

99+% of our operations will be working against the list as read-only objects. Only during system setup and the very rare configuration change will the CRUD operations be used. This is why we are looking to cache the objects to achieve better performance.

Maybe a better approach would be to maintain a read-only list of items in a root list as well as our separate root business object which supports the CRUD operations. When we execute one of the CRUD methods, we could simply invalidate the "cache" (list), forcing it to reload the next time it is used. My only concern is the performance hit of reloading the entire list rather than simply performing an add/remove/update on the individual item within the list.
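The invalidate-and-reload idea described above can be sketched roughly as follows. This is a minimal illustration in Java rather than C#, and the class name and the loadFromStore method are invented stand-ins for the real data-access call:

```java
import java.util.List;

// Hypothetical read-mostly cache: any CRUD write calls invalidate(),
// and the next reader pays the cost of one full reload.
class ItemListCache {
    private volatile List<String> cached; // null means "invalidated"

    List<String> getAll() {
        List<String> local = cached;
        if (local == null) {
            local = loadFromStore(); // full reload after a write
            cached = local;
        }
        return local;
    }

    // Called by the CRUD operations after a successful save.
    void invalidate() {
        cached = null;
    }

    // Stand-in for the real data portal / database fetch.
    private List<String> loadFromStore() {
        return List.of("item-1", "item-2");
    }
}
```

Note that volatile only guarantees visibility: two threads can race and both reload, which is harmless here because the loaded data is read-only.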

As for thread safety, we would put locks around the Add/Remove operations on the list to prevent problems, but this in and of itself creates performance issues, as threads will block on the lock.

If we did go with a read-only list and invalidate it when we perform a CRUD operation, who should be responsible for invalidating the cache? The BO or the caller?

RockfordLhotka replied on Monday, September 28, 2009

You can look at the ReaderWriterLock class, which might allow for better performance when locking your lists to change them.

The only problem with such a locking strategy is that it only takes one mistake in some service code to cause a problem. Taking the lock is entirely voluntary for each consumer of the list, so your only enforcement will be code reviews.

In .NET 4.0 some higher-performance locking objects are coming, including (I think) a reader/writer lock that uses a spin-wait instead of a kernel lock - which will be nice.
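For readers unfamiliar with the reader/writer pattern being suggested: many readers proceed concurrently, and only a writer takes an exclusive lock. A minimal sketch follows, in Java, whose ReentrantReadWriteLock plays the same role as .NET's ReaderWriterLock; the CustomerCache name and String payload are invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantReadWriteLock;

// Many concurrent readers; a writer briefly blocks everyone.
class CustomerCache {
    private final Map<Integer, String> items = new HashMap<>();
    private final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();

    String get(int key) {
        lock.readLock().lock(); // shared: readers do not block each other
        try {
            return items.get(key);
        } finally {
            lock.readLock().unlock();
        }
    }

    void put(int key, String value) {
        lock.writeLock().lock(); // exclusive: blocks readers and writers
        try {
            items.put(key, value);
        } finally {
            lock.writeLock().unlock();
        }
    }
}
```

This fits the 99+% read workload described earlier: reads never contend with each other, and only the rare configuration change pays for exclusive access.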

If you do go with invalidating the cache, my first instinct is to have the object do it. The reason is to centralize the code - otherwise you'll have both locking and cache sync code scattered everywhere you touch your lists.
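The centralization point above can be sketched as follows: the list object owns both the lock and the invalidation, so service code cannot forget either. This is an illustrative Java sketch, not CSLA code; all names and the backing-store methods are invented:

```java
import java.util.ArrayList;
import java.util.List;

// All synchronization and cache-sync logic lives in this one class;
// callers just use add/getAll and cannot forget to lock or invalidate.
class SelfInvalidatingList {
    private final Object sync = new Object();
    private final List<String> store = new ArrayList<>(); // stand-in backing store
    private List<String> cached; // guarded by sync; null = stale

    List<String> getAll() {
        synchronized (sync) {
            if (cached == null) {
                cached = new ArrayList<>(store); // stand-in for the real reload
            }
            return new ArrayList<>(cached); // defensive copy for callers
        }
    }

    void add(String item) {
        synchronized (sync) {
            store.add(item);  // stand-in for the real save
            cached = null;    // invalidation happens here, not in the caller
        }
    }
}
```

Compare this with leaving locking and invalidation to each caller: one forgotten invalidate() in some service method leaves the cache silently stale, which is exactly the scattered-code risk described above.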

Copyright (c) Marimer LLC