Optional Validation for large data loading application.

Old forum URL: forums.lhotka.net/forums/t/7874.aspx


jamie.clayton posted on Monday, October 26, 2009

G'day,

I've got a data loading application that I've recently updated to use a CSLA business object to save the data. In my original application I had users run validation code that looped through all 65k records to be loaded, fix the data issues, then save the data.

Now, in this new hybrid approach, the CSLA business rules provide an excellent check of the previous application's code, simply by calling .IsValid for each record.

Now I use the CSLA business object to save the records, and the business rules run a second time, which slows down the process significantly. Can anyone suggest a way I can suppress the ValidationRules code from firing in a BO for this scenario? Note: I'm reloading the BO property values between my validate and save code.

RockfordLhotka replied on Monday, October 26, 2009

There are a couple of options.

BypassPropertyChecks can be used to change the GetProperty/SetProperty methods so they work like ReadProperty/LoadProperty (and so don't run rules).

There's also a SuppressRuleChecking property on ValidationRules that you can set to turn off just business/validation rule processing.
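
The first option can be sketched like this inside the business object (a minimal sketch, assuming CSLA 3.6+, where BypassPropertyChecks is a protected member of BusinessBase; the property and parameter names here are hypothetical):

```vb
' Inside a BusinessBase-derived class: while the Using block is active,
' SetProperty behaves like LoadProperty, so no rules or authorization run.
Public Sub LoadFromRecord(ByVal productName As String, ByVal investedAmount As Decimal)
    Using BypassPropertyChecks
        Product = productName    ' setter runs, but no rules fire
        Invested = investedAmount
    End Using
    ValidationRules.CheckRules() ' optionally run all rules once at the end
End Sub
```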

jamie.clayton replied on Wednesday, November 18, 2009

Rocky,

Sorry for the delayed reply. I had to upgrade a lot of projects from CSLA 3.0.3 to 3.7.0.0 and from .NET 2.0 to 3.5, so I had a little refactoring to do. Thanks for the suggestion. In my business object I added:

Public Sub BeginBatchLoad()
    ValidationRules.SuppressRuleChecking = True
End Sub

Public Sub EndBatchLoad()
    ValidationRules.SuppressRuleChecking = False
    ValidationRules.CheckRules()
End Sub

In the DataPortal_Create() method I also added:

Sub DataPortal_Create(ByVal criteria As Criteria)
    Initialize()
    msalesID = System.Threading.Interlocked.Decrement(mNextID)

    ' Set normal default properties

    ' Provide a way for bulk records to skip the initial validation checks,
    ' so they are done at the end of the property population.
    If Not criteria.SuppressValidation Then
        ValidationRules.CheckRules()
    End If
End Sub

These code changes have resulted in a 73% improvement in bulk data loading using this BO. Much of the improvement comes from the change to DataPortal_Create to skip validation (30+ validation rules).

I must admit I could not find a way to use BypassPropertyChecks from an external project in VB because of the way it's been coded.

Jamie

rsbaker0 replied on Wednesday, November 18, 2009

Something else to watch out for when loading data: don't use the property setters directly (that causes PropertyChanged events and your business rules to fire); instead use LoadProperty() or access the backing fields directly. I couldn't tell from your post which way you were loading the data, but this is something to keep in mind as well.
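
The difference can be sketched like this with CSLA 3.5+ managed properties (a minimal sketch; the "Customer" property name is hypothetical):

```vb
' Inside the business object.
Private Shared ReadOnly CustomerProperty As PropertyInfo(Of String) = _
    RegisterProperty(Of String)(New PropertyInfo(Of String)("Customer"))

Public Property Customer() As String
    Get
        Return GetProperty(CustomerProperty)
    End Get
    Set(ByVal value As String)
        ' Setter path: raises PropertyChanged and runs validation rules.
        SetProperty(CustomerProperty, value)
    End Set
End Property

Public Sub LoadCustomer(ByVal value As String)
    ' Load path: writes the value only - no events, no rules.
    LoadProperty(CustomerProperty, value)
End Sub
```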

ajj3085 replied on Wednesday, November 18, 2009

You can use the setters if you wrap the code in a block like this:

using (BypassPropertyChecks)
{
    // set values here
}

That property exists to disable validation / security checks while loading your objects.

rsbaker0 replied on Wednesday, November 18, 2009

ajj3085:
You can use the setters if you wrap the code in a block like this:

using (BypassPropertyChecks)
{
    // set values here
}

That property exists to disable validation / security checks while loading your objects.

You're right -- I just looked in the CSLA 3.6 code (I really have to upgrade from 3.5 before too long). I didn't realize how this feature was implemented.

BypassPropertyChecks not only bypasses the authorization rules, but it also prevents the PropertyChanging/Changed events from firing at all when the setter is used, which of course takes care of not firing the validation rules.

jamie.clayton replied on Wednesday, November 18, 2009

ajj3085,

Thanks for the clarification on the Using BypassPropertyChecks feature. The application is a financial one, and I'm calling our "Sale" BO, via its properties, from another B2B application that gives other businesses a way to load data into our database. I did need the security checks to run, because I'm fairly paranoid about open-slather data loading in the finance industry.

My BOs have the BO and DAL integrated into one project (class). The call in the B2B application, which references the BO assembly, looks something like this:

Dim mySale As Sale = Sale.NewRecord(x, y, z)
With mySale
    .BeginBatchLoad()
    .Product = "Investment A"
    .Customer = "Jamie Clayton"
    .Invested = 1000 ' Does lots of calculations in the property change events.
    ' ... lots more data
    .EndBatchLoad()
End With


Hope that explains the approach I've adopted. I'm very happy with the 73% performance improvement I've gained from these few code changes. Hopefully all the users of this software (500+) will be too.

Jamie.

jamie.clayton replied on Wednesday, November 18, 2009

Rsbaker0,

Yes, the LoadProperty() method looked promising, but the application(s) are financial, so I need the PropertyChanged events to fire calculations in the business object. The BO is a "Sales" one.

I used Red Gate ANTS 5 to profile the application and confirm the behaviour and performance changes. That was the main reason for wanting to change the validation behaviour of the BO: data loading doesn't involve visual components (a progress bar only), so the users don't need the constant validation feedback normally found in CSLA BOs.

Jamie.

rsbaker0 replied on Wednesday, November 18, 2009

Jamie Clayton:
Rsbaker0,

Yes, the LoadProperty() method looked promising, but the application(s) are financial, so I need the PropertyChanged events to fire calculations in the business object. The BO is a "Sales" one...

Yes, I can see how one could go either way with this design choice.

In our case, we opted not to re-check the business rules in objects as they are fetched from the database in most cases (but not all). Presumably they could not have been written if they were invalid at the time they were saved, so that is how we justified this approach. It was mainly done for performance: with large data sets, even going through a trivial property setter versus loading a backing field directly makes a noticeable difference in fetch speed.
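
That fetch-time approach might look roughly like this (a sketch only; SafeDataReader is CSLA's Csla.Data.SafeDataReader, while GetReader, Criteria, and the registered property fields are hypothetical):

```vb
Private Overloads Sub DataPortal_Fetch(ByVal criteria As Criteria)
    Using dr As Csla.Data.SafeDataReader = GetReader(criteria)
        dr.Read()
        ' Load the backing fields directly - no setters, no rules.
        LoadProperty(CustomerProperty, dr.GetString("Customer"))
        LoadProperty(InvestedProperty, dr.GetDecimal("Invested"))
    End Using
    ' Deliberately no ValidationRules.CheckRules() here - the data
    ' is presumed valid because it was valid when it was saved.
End Sub
```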

jamie.clayton replied on Thursday, November 19, 2009

rsbaker0,

I've also found significant performance gains from using constants vs variables in my applications.

In the VB world, the upcoming .NET 4.0 release (22 March 2010) will also change the way properties can be defined (a single code line, rather than a Get/Set pair), which I might need to performance-test as well. But I'm not sure I'll use them in CSLA BOs.

I remember back in the VB6 days property performance was painful, so you ended up writing lots of methods that passed all the values in and then set the private variables, skipping properties completely. That was just boring code to write.

Copyright (c) Marimer LLC