OT: logging and Database question

Old forum URL: forums.lhotka.net/forums/t/165.aspx


esteban404 posted on Monday, May 22, 2006

I'm building a logging routine in CSLA 1.52 using a clone of the BO (users want *every field* included in the change history). I have History : BusinessBase and will only do a simple listing, then display it in a grid grouped by field name, so they'll see

   FieldName   WAS      NEW      WHO      WHEN
 + Material
               Value    Value    Value    Value

--Grouped by FieldName, order by When
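
The query feeding that grid might look something like this (a sketch only; the history table and column names here are hypothetical, not from the actual schema):

```sql
-- One row per field-level change; the grid groups by FieldName
-- and orders by the change timestamp.
select FieldName,
       OldValue  as WAS,
       NewValue  as NEW,
       ChangedBy as WHO,
       ChangedOn as [WHEN]
from T_History
where BusinessObjectId = @BusinessObjectId
order by FieldName, ChangedOn
```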

This BO is invoked by the user on demand, not by default, and is keyed on the DateTimeGuid of the BO.

To get the actual human readable value from the table, I'll need to use a lookup mechanism like the Resources cache (yes, from waaay back in the CSLA 1.x days). That sounds like a lot of overhead.

This first one has 24 fields and 2 child collections. Each collection will have this History building class also. I'm trying to make sure I'm not wasting too many resources doing this. The objects are all disposing properly, so the footprint has not been a huge problem (yet). I see trouble though.

I've read all the threads I could get to today, but I'm still wondering. Underthinking can be just as bad as overthinking sometimes.

Thanks in advance,

_E

ajj3085 replied on Monday, May 22, 2006

Are you using stored procedures?

For auditing, it's probably best to do this in the stored procedure.  On every insert and update, insert the new data into an audit table.

For a delete, copy the current row's values into the audit table and then delete the row.

For example:

T_Person ( PersonId, FName, LName, SSN )
T_Person_Audit ( AuditId, PersonId, FName, LName, SSN )

create procedure apPersonInsert
    @FName varchar(50),
    @LName varchar(50),
    @SSN   char(11)
as
declare @PersonId int

insert into T_Person ( FName, LName, SSN )
values ( @FName, @LName, @SSN )

set @PersonId = scope_identity()

insert into T_Person_Audit ( PersonId, FName, LName, SSN )
values ( @PersonId, @FName, @LName, @SSN )


This is pseudo-code, but you should get the gist.
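
The delete case described above might look like this (again a sketch, assuming the same tables):

```sql
-- Copy the current row into the audit table, then delete it.
create procedure apPersonDelete
    @PersonId int
as
insert into T_Person_Audit ( PersonId, FName, LName, SSN )
select PersonId, FName, LName, SSN
from T_Person
where PersonId = @PersonId

delete from T_Person
where PersonId = @PersonId
```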

HTH
Andy

esteban404 replied on Tuesday, May 23, 2006

ajj3085:
Are you using stored procedures?

Yes, I am. I forget about the DB side a lot. Instead of creating a lookup field in the object, I could put some logic into the sproc to grab the field value I want using a function in the database. I don't think it belongs there, though.

Because it's building the same string for each entry, I thought there'd be a lot of chatter with the builder code, so I considered doing it differently. This event will fire only in the Dispose() method, where the record is also unlocked, so I'll be able to build the history from the final version of the session's changes. Maybe make the History object a child collection and update it with the same datareader. Hmm, I think I like that. It resolves the excessive chatter, keeps the CPU burden on the local machine instead of the server, and I just need to consider the children in the collection. Roughly the same mechanism should work for them, though.

I think that's enough thinking out loud. Sorry for the ramble, but do give me feedback if I'm baked and need a day off. :-]

_E

DavidDilworth replied on Tuesday, May 23, 2006

esteban404:

I'm trying to make sure I'm not wasting too many resources doing this. The objects are all disposing properly, so the footprint has not been a huge problem (yet). I see trouble though.

When you say you want to make sure you're not "wasting" too many resources, what resources are you talking about?  Could you clarify?

Also, have you looked at the log4net project to see if it could help with your application's logging needs?

esteban404 replied on Tuesday, May 23, 2006

DavidDilworth:
When you say you want to make sure you're not "wasting" too many resources, what resources are you talking about?  Could you clarify?

Also, have you looked at the log4net project to see if it could help with your application's logging needs?

"Wasting resources" meaning building lists and initializing objects where they are not needed or using the data server to do work that it shouldn't need to do. I think my idea in the other reply may work.

Regarding log4net, I've had mono for a while now. :-] I'm originally a Mac programmer who was tricked by the dark side and haven't been able to save myself. I saw some log4net mention come through on one of the Apache lists, but didn't check it out until now. It seems a bit more than we need for this wee bit of logging, if my idea for the child collection works.

With OS X's capabilities, I'm porting applications over to it, but I haven't tried CSLA stuff there. I'm sure I will when the framework is ported over (it may already be, but I work on a 27-hour clock as it is) and I have time.

Thanks for the reply. Gets the brain juices flowing.

_E

ajj3085 replied on Tuesday, May 23, 2006

Esteban,

Just one more thought: I wouldn't use log4net to track field-level changes.  If you really need to audit data changes, your best bet is to roll your own and store the history in a mirror table as I described earlier.  More than likely you'll need to get at this audit data later, to do diffs, show a history of changes, whatever.

Use log4net to log other information about your application, especially debugging information, warnings, and the like.  For a Forms application, how much you need to log will vary; most failures you'd probably want to display to the user rather than just log silently.  For a web application, you'll probably want to log more, since there may be errors that you don't want to display to the user.

Andy

esteban404 replied on Tuesday, May 23, 2006

Indeed I *am* doing field-level changes, so I've created a History base class from BusinessBase, and for each of the objects I just identify the fields that need to be tracked and build the entries from their field names. log4net seems interesting, so I'll read up.

Now, I'm investigating tagging each property with a custom attribute, something like:
[LogField(true, "MaterialNumber")] // where true = log this field
public long MaterialID { get; set; }

Then use reflection to collect the field names whose attribute says IsLogEntry = true. I'm looking at how to get the human-readable value as a string from reference fields so the routine can build the entries. I don't have properties in the object nor static fields like the Roles or Resource fields to use to get values.

I do use a DevExpress grid and bind the object to a lookup that resolves the issue by showing the value in the control. I may use that control's display text instead of crawling around the object to get it. I'm sure it won't be that easy, though, since I also need the readable value for the old setting, and that one isn't loaded into a control. So I'm back to using the object's datasource to do it. It's already initialized, so it should work.

Apparently I do a lot of thinking when I type. Thanks for reading and the input on the log4net option. It's very useful for the arsenal.

_E

DansDreams replied on Tuesday, May 23, 2006

I've also entertained this idea from time to time, but I've thought of a different approach.  My design is just to add two fields to the normal table something like EffectiveStartDate and EffectiveEndDate.

The "current" record is the one with a null EffectiveEndDate.

For any other specified point in time it's rather trivial to find the record effective as of that time.  It's also fairly easy to do diff comparisons, rolling through the records to create a change report.
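
Under that scheme, the point-in-time lookup is a one-liner like this (a sketch; the table and column names are hypothetical):

```sql
-- The current row has a null EffectiveEndDate; any historical
-- version is found by bracketing the requested time.
select PersonId, FName, LName
from T_Person
where PersonId = @PersonId
  and EffectiveStartDate <= @AsOf
  and ( EffectiveEndDate is null or EffectiveEndDate > @AsOf )
```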

Yes, no, maybe so?

http://www.amazon.com/gp/product/1558604367/sr=8-1/qid=1148405378/ref=pd_bbs_1/002-5007753-3371233?%5Fencoding=UTF8

ajj3085 replied on Tuesday, May 23, 2006

Dan,

Your approach is acceptable, but it does have some drawbacks.  For one, there's quite a bit of 'dead' data that isn't relevant.  This will make the table larger, slowing down data operations against it.

It also doesn't sound like you can tie changes together into a single logical operation.  For example, the user may have worked on an invoice and several records changed as a result.  You don't have a good way to tie all those changes together.

What I've done that worked very well is have a master Audit table, which recorded an AuditId, UserId, date/time, and possibly a session id.

All tables which needed auditing had a mirror that had AuditId as the primary key, which is also a foreign key to the Audit table.

When a data operation was done, the root BO created an audit record, and then all updates passed the AuditId to the stored procedures they were calling.  The procs were responsible for creating a copy of the record in the audit table (as I discussed before).
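
In the shorthand used earlier, that layout might look something like this (a sketch; all names are illustrative):

```sql
-- One master row per logical save; every mirrored row points back to it.
Audit ( AuditId, UserId, AuditDate, SessionId )
T_Person ( PersonId, FName, LName, SSN )
T_Person_Audit ( AuditId, PersonId, FName, LName, SSN )
-- AuditId in T_Person_Audit is a foreign key to Audit, so all
-- rows written in one operation share a single AuditId.
```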

I really recommend against trying to do the auditing in the business layer; it will add quite a bit of overhead, and will be extremely complex to implement and maintain.  All you're really doing is inserting into a mirror table whenever data is changed; let the database handle this.

Andy

esteban404 replied on Tuesday, May 23, 2006

I've done that, too, Dan. Then, on an interval defined by the luser base, a job runs to move some of the records (directly targeted or based on activity level) into another table to speed things up. Unfortunately, a trigger to create a dup of the record won't work for what the users need to see, and the object's fields and properties create a big graph if I'm not paying attention.

For financials and other such data, I do audit tables as suggested. This just doesn't fit that criteria, IMHO.

I've just not really addressed doing it in the BO before now. It seems a good fit. Now if this application were an ASP.NET project, I might rethink my approach, but I think this will work. The other developer here thinks it will work.

I just can't seem to get a good attribute way to do it all. But I can at least identify the properties to put into the history table.

Thanks for the link. I'm downloading the PDF now. Looks like an excellent read.

_E

Copyright (c) Marimer LLC