3 monkeys (Utilities.CallByName vs MethodCaller.CallPropertyGetter)

Old forum URL: forums.lhotka.net/forums/t/6741.aspx


rfcdejong posted on Saturday, April 04, 2009

As an addition to my question, here is the story of the 3 monkeys:
http://www.codeproject.com/KB/database/DynamicMethod_ILGenerator.aspx?fid=435990&df=90&mpp=25&noise=3&sort=Position&view=Quick&fr=76&select=2130593

It seems that CSLA is using both monkey 2 (slow reflection) and monkey 3 (fast DynamicMethod) code.

However, most of the code uses the Utilities class, which relies on reflection a lot, while MethodCaller generates dynamic methods in memory.

Am I looking at this wrong?

Is MethodCaller really faster, like monkey 3 in the story?
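
To make the comparison concrete, here is a minimal sketch of what the article contrasts: plain .NET reflection versus a precompiled delegate (hypothetical helper names, not CSLA's actual code):

using System;
using System.Reflection;

// Illustration only -- not CSLA source code.
public static class PropertyReadDemo
{
    // "Monkey 2": look the property up and invoke it through reflection on every call.
    public static object ReadViaReflection(object target, string propertyName)
    {
        PropertyInfo info = target.GetType().GetProperty(propertyName);
        return info.GetValue(target, null);
    }

    // "Monkey 3" (simplified): build a strongly typed delegate once and reuse it,
    // so each later call is an ordinary delegate invocation instead of reflection.
    public static Func<T, TResult> BuildGetter<T, TResult>(string propertyName)
    {
        MethodInfo getMethod = typeof(T).GetProperty(propertyName).GetGetMethod();
        return (Func<T, TResult>)Delegate.CreateDelegate(typeof(Func<T, TResult>), getMethod);
    }
}

The point is that BuildGetter is expensive once, but the delegate it returns can be called over and over at close to normal method-call speed.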

RockfordLhotka replied on Saturday, April 04, 2009

CSLA predates the dynamic method concept, and so had to use some reflection early on.

The parts of CSLA where performance matters most (n-level undo and the data portal) were converted to use dynamic methods in 3.5 thanks to a lot of help from Ricky Supit.

Eventually the other parts may get converted too, but honestly it is a cost/benefit issue. There's been more benefit in supporting WPF, Silverlight and so forth than in optimizing the use of reflection in areas where it has little real impact.

Premature optimization is an anti-pattern. We had to optimize the data portal to support some of the 3.5 concepts or they wouldn't have been practical. And it was necessary to optimize n-level undo to handle larger object graphs in complex UI scenarios. It may be necessary to optimize other areas at some point, and I'll do so if necessary.

rfcdejong replied on Sunday, April 05, 2009

Ok, in time perhaps.
I'm not trying to 'flame' CSLA of course.

Just wanted to know if I should use MethodCaller.CallPropertyGetter or Utilities.CallByName.

Perhaps I'll create a small application to test both methods, if that doesn't put me in the anti-pattern, hehe.

Thanks for the reply.

RockfordLhotka replied on Sunday, April 05, 2009

What I was hoping to get at is that you need to understand the situation
in which you are making the call.

If you are going to make relatively few calls, and/or the calls occur
relatively infrequently, then reflection is probably fine, and requires few
resources.

If you are going to make a lot of calls, and/or the calls occur frequently,
then it is probably worth the overhead of establishing and caching the
dynamic method information to make the calls faster.

The data portal, prior to 3.5, fit into the category of relatively few
calls, and so reflection was (imo) fine. The cost of talking to the database
was many, many, many times higher than the cost of reflection, so the use of
reflection (or not) made no practical (measurable) difference in any real
application.

The 3.5 data portal, with its support for child objects, changed the
equation. With the change, the data portal does a lot more invocation of
your objects. Due to this, the use of reflection did make a measurable
difference in performance for many common scenarios. This made it worth the
overhead of creating and caching dynamic method delegates.
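
A rough sketch of that create-once-and-cache idea (hypothetical names; an expression tree is used here instead of the DynamicMethod/ILGenerator approach CSLA actually uses, just to keep the sketch short):

using System;
using System.Collections.Generic;
using System.Linq.Expressions;
using System.Reflection;

// Illustration only: pay the compile cost once per property, reuse the delegate after that.
// Locking is omitted for brevity.
public static class GetterCache
{
    private static readonly Dictionary<string, Func<object, object>> _getters =
        new Dictionary<string, Func<object, object>>();

    public static object GetValue(object target, string propertyName)
    {
        string key = target.GetType().FullName + "." + propertyName;
        Func<object, object> getter;
        if (!_getters.TryGetValue(key, out getter))
        {
            // One-time cost: build and compile a delegate for this property.
            PropertyInfo info = target.GetType().GetProperty(propertyName);
            ParameterExpression obj = Expression.Parameter(typeof(object), "obj");
            Expression body = Expression.Convert(
                Expression.Property(Expression.Convert(obj, target.GetType()), info),
                typeof(object));
            getter = Expression.Lambda<Func<object, object>>(body, obj).Compile();
            _getters[key] = getter;
        }
        // Every later call for the same property is just a delegate invocation.
        return getter(target);
    }
}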

Rocky

rfcdejong replied on Sunday, April 05, 2009

I'm using the DataMapper in my factory base classes, where a custom mapping definition is analysed and parsed. It maps a DTO by using DataMapper.Map with a dictionary as the destination.
Next the dictionary is iterated and LoadProperty is called with the correct PropertyInfo. The PropertyInfo is marked as internal, and the factory methods are in a friend assembly.

Inserting is a slightly different story. The generated identifiers in the DTO have to be read and set in the BO, and of course my internal "UpdateChildren" lookalike must pass any identification on to the BO children's factories.

It works great, but I'm a bit afraid that all the extra mapping might turn the application into a CPU- and memory-eating monster. Anti-pattern in place?

We'll see..

Greetings,
Raymond

RockfordLhotka replied on Sunday, April 05, 2009

The DataMapper support for an IDictionary originates with ASP.NET Web Forms,
because that data binding model provides postback form data in the form of
an IDictionary. The use of reflection in that scenario is incidental,
because data binding is already so reflection-intensive that DataMapper
makes no difference (and the amount of data in a postback is necessarily
limited).

I can't say I anticipated the idea of using an IDictionary as a DTO for a
DAL. You'd be forced to use a loosely typed dictionary, and so would incur a
lot of boxing/unboxing costs, and probably a lot of casting costs. It
doesn't seem like a particularly great model overall, DataMapper aside.

But if performance is adequate and you don't run into maintenance issues
with such a loosey-goosey DTO model then all is well :)

Rocky

rfcdejong replied on Sunday, April 05, 2009

Our DTO objects are generated; they aren't real DTO objects. The ORM fills them, and in the case of a fetch operation they can even be anonymous objects.

No loosey-goosey DTOs, just objects generated from the data model :)

What i do for fetch:

1) Get DTO (can be with Linq to ORM)
Repository.GetPersistent(..)

2) DTO to a new Dictionary()

3) Dictionary to BO
With LoadProperty() doing type coercion.

Update & Insert:

1) BO to dictionary
2) Dictionary to DTO
3) Persist DTO
4) For each identity in the DTO, do a LoadProperty() against the BO
5) call child factory..

I thought that the DataMapper would help speed things up, since GetValue and SetValue are delegated to a DynamicMethod.
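
Roughly, steps 1-3 of the fetch above look like this (Repository, criteria, mappingDefinition and LoadPropertyByName are placeholder names for our own code; MethodCaller.CallPropertyGetter and LoadProperty are the CSLA calls discussed in this thread):

// Illustrative sketch only.
var dto = Repository.GetPersistent(criteria);                    // 1) DTO from the ORM

var values = new Dictionary<string, object>();                   // 2) DTO to dictionary
foreach (string propertyName in mappingDefinition)
    values[propertyName] = MethodCaller.CallPropertyGetter(dto, propertyName);

foreach (KeyValuePair<string, object> entry in values)           // 3) dictionary to BO, with
    LoadPropertyByName(entry.Key, entry.Value);                  //    LoadProperty doing the coercion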

RockfordLhotka replied on Sunday, April 05, 2009

Why the dictionary in the middle?

You go from a strong-typed model to a weak-typed model back to a
strong-typed model. If this is all code-genned on both sides, I agree that
there's no maintenance issue. But there's still a perf issue due to all this
data copying and type coercion and boxing and casting.

Rocky

rfcdejong replied on Monday, April 06, 2009

Because the DataMapper wants a dictionary with field names. But thinking about it again, I could just call MethodCaller directly, going from DTO to BO. I didn't see that much of a problem with using a Dictionary.

I didn't think that going from strongly typed to weakly typed and back to strongly typed again would be that much of a perf hit.
I see that DataMapper.Map(fromObject, toDictionary) doesn't do type coercion.
Type coercion certainly happens in LoadProperty, but I don't see any other way than doing that.

The dictionary in between has one major advantage for us: the property names can be case-insensitive :)
The developer doesn't have to think about casing in the mapping definition.
Our developers will only have to define a mapping, and most property names will already be cased correctly because lambda expressions are used for them. Only property alias names are hand-coded.
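
For what it's worth, the case-insensitivity comes straight from the BCL dictionary (standard .NET, nothing CSLA-specific):

var values = new Dictionary<string, object>(StringComparer.OrdinalIgnoreCase);
values["CustomerName"] = "Acme";
object v = values["customername"];   // resolves to the same entry regardless of casing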

rfcdejong replied on Monday, April 06, 2009

Hmm, the property value is already weakly typed when called?
I might be wrong, but it seems that there isn't any boxing and unboxing?

object value = MethodCaller.CallPropertyGetter(dto, propertyName);
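
One general .NET detail worth noting (not CSLA-specific): because the getter returns System.Object, value-type properties do get boxed at that point, while reference-type properties do not:

int id = 42;                 // imagine the DTO property is an int
object boxed = id;           // returning it as System.Object boxes the value
int unboxed = (int)boxed;    // consuming it as an int again unboxes it
// string and other reference-type properties come back as-is; only a cast is needed.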

RockfordLhotka replied on Monday, April 06, 2009

You are aware that DataMapper will map from object to object, right?

 

Again, the dictionary<->object support is there for ASP.NET. But there’s object<->object support for working with DTOs.
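
A minimal sketch of that object-to-object usage (this assumes the DataMapper.Map(source, target) overload Rocky refers to; PersonDto and PersonData are hypothetical types with matching property names):

var dto = new PersonDto { Id = 1, Name = "Raymond" };   // hypothetical source DTO
var person = new PersonData();                          // hypothetical target
Csla.Data.DataMapper.Map(dto, person);                  // copies matching public properties, no dictionary in between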

 

Rocky

Copyright (c) Marimer LLC