I haven't posted much lately, and not at all on the new forums. But I have an app running for about 30 users on the original (1.5) version of the framework. Everything works well, except I think my data portal is becoming too congested. It seems the data portal is my one bottleneck, and sometimes it even seems to be skipping transactions.
I am working on porting this app to version 2.0 and am seriously considering all of my other options in the data portal area. I am currently using remoting over IIS.
What is everyone else doing data portal-wise? And what are the pros and cons of each approach? I read the blurb at the back of the book, and it seems that Enterprise Services may be my best option. Any comments on this? I am looking for the option that will run fastest for users.
Thanks,
Ryan
Before going too far, you should identify the core of your performance issue. It could be coming from several possible areas, and the right solution will depend on the actual problem. Ideas include:
1. Serialization overhead: your object graphs may be large, so a lot of data moves across the wire on every data portal call.
2. Channel overhead: the remoting/IIS transport itself adds cost to each call.
3. Inefficient data access code.
4. Database performance (queries, indexes and so forth).
Using the Enterprise Services data portal channel would help with #2, but won't do much for #1, #3 or #4. Numbers 3 and 4 can be solved by well-known tuning, refactoring and other techniques. Number 1 is the hardest, because if serialization overhead is the issue then you need to look at your object graphs and determine whether you are moving too many objects. Solving #1 typically requires reworking your object model to minimize object movement, and that can be tricky.
Thank you for your timely response. I now have another question for you: what is the best way to look at my object graphs? I ask because, of all the possible problems you listed, I am most likely hitting number 1. Some of my objects are very large (parent and nested children included), so quite a bit of data has to be serialized/deserialized to go through the data portal. When I run the app without a data portal (local, connecting straight to the database), everything is 2 to 3 times quicker. I am also going to investigate the other issues, but I really need to look at my object graphs.
Thanks,
Ryan
I've also had something very similar, where users were editing a report that could contain up to 1,000 objects. The users needed all the data in one go, since they work on and edit the entire report.
One thing we noticed was that the data transfer was always the slowest link; we have roughly 20 users using this application daily. The first thing I did was monitor the HTTP channels to see how much data was being transferred. I then implemented the RemotingCompressionSink described in a .NET Remoting book (sorry, I can't remember the name; I'll dig it out and let you know if you need it).
This was with CSLA 1.5. The report loading time more than halved with the compression sink turned on (it can be turned on or off via a config file change). Some of our reports were taking up to 12 seconds to load without compression; with compression on, they would load in 4-6 seconds. A massive improvement. If you monitor the HTTP channel, you can see that the change in the amount of data transferred is huge. (I haven't got exact figures on me, but can fish them out if needed.)
Although the compression idea is good and will sort out problems with large objects, you have to be careful about how often you use it. We haven't had any issues with 20 or so users, but if that were multiplied by 10, say 200 users, the compression might cause performance to degrade on the server, since compressing every call costs server CPU.
My current implementation only allows compression to be turned on or off for the whole application; a good improvement would be the ability to turn compression on or off per object.
I've not played with .NET 2.0 much yet, and have only just got the book, but I'm sure I have seen compression mentioned somewhere, so maybe it's built into CSLA 2.0 (perhaps someone could chime in if they know).
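For anyone who wants to estimate the savings before wiring up a sink, here is a rough C# sketch of the idea. This is not the RemotingCompressionSink itself, just an illustration of what such a sink does: GZip-compress the BinaryFormatter output and compare sizes. It assumes .NET 2.0 (System.IO.Compression), and the repetitive payload is a made-up stand-in for report data.

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Runtime.Serialization.Formatters.Binary;

public class CompressionEstimate
{
    // GZip-compress a byte array, roughly what a compression sink
    // does to the serialized message before it hits the wire.
    public static byte[] Compress(byte[] data)
    {
        using (MemoryStream output = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(data, 0, data.Length);
            }
            // GZipStream is closed above, so the output is fully flushed.
            return output.ToArray();
        }
    }

    public static void Main()
    {
        // Hypothetical payload: repetitive report rows compress well.
        string[] rows = new string[1000];
        for (int i = 0; i < rows.Length; i++)
            rows[i] = "report row with mostly repeated text";

        BinaryFormatter formatter = new BinaryFormatter();
        MemoryStream buffer = new MemoryStream();
        formatter.Serialize(buffer, rows);

        byte[] raw = buffer.ToArray();
        byte[] packed = Compress(raw);
        Console.WriteLine("Uncompressed: " + raw.Length + " bytes");
        Console.WriteLine("Compressed:   " + packed.Length + " bytes");
    }
}
```

Running this against a graph shaped like your real objects gives a feel for whether a compression sink is worth the server CPU it costs.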
Hope this helps
Craig
Thank you for your responses. I am going to look into the RemotingCompressionSink; if you could find the title of the book, that would be great. Also, how do you monitor the HTTP channels?
Thanks,
Ryan
Hi Ryan,
The tool I used for the HTTP monitoring was tcpTrace from PocketSoap.com, http://www.pocketsoap.com/tcptrace/
The .NET Remoting book was Advanced .NET Remoting in VB.NET by Ingo Rammer: http://www.amazon.com/gp/product/1590590627/102-8693462-1339323?v=glance&n=283155
Regards
Craig
You can't really look at the serialized objects directly; the result is a binary stream or byte array. But you can easily find their size by running the BinaryFormatter manually and then examining the length of the resulting byte array.
Chapter 3 shows how to implement the Clone() method, which is a good guide for using the BinaryFormatter manually.
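To make that concrete, here is a minimal C# sketch that runs the BinaryFormatter by hand and reports the size of the resulting stream. The `Customer` class is a hypothetical stand-in; any [Serializable] business object in your graph works the same way.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

// Hypothetical stand-in for a business object with nested data.
[Serializable]
public class Customer
{
    public string Name = "Test";
    public string[] Orders = new string[100];
}

public class GraphSize
{
    // Serialize the graph to a MemoryStream and return its length,
    // roughly the number of bytes that cross the data portal.
    public static long Measure(object graph)
    {
        BinaryFormatter formatter = new BinaryFormatter();
        using (MemoryStream buffer = new MemoryStream())
        {
            formatter.Serialize(buffer, graph);
            return buffer.Length;
        }
    }

    public static void Main()
    {
        Console.WriteLine("Serialized size: " + Measure(new Customer()) + " bytes");
    }
}
```

Measuring your largest parent-plus-children graphs this way shows which objects dominate the payload, and whether trimming the graph or compressing it is the better fix.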
Copyright (c) Marimer LLC