I'm looking for a way to send CommandObjects to different servers. My command objects perform tasks on things like the file system, so they need to run on servers other than the application server. All my regular objects need to be loaded/saved through the application server using default CSLA functionality, but I need a way to send command objects to a machine that is not the application server. I'm trying to do this without changing any more of the CSLA code than necessary, so I can easily migrate to new versions of the framework as they are released. It's easy enough at runtime to pass the target server name to the command object; it's just a matter of getting that command object routed to the proper destination. Any ideas? Have I made my requirement clear?
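To make the shape of what I'm after concrete, here is a rough sketch (in Python just for brevity; the real thing would be .NET remoting, and every name here is made up for illustration): a command object carries its target server name, and a small router serializes it, sends it to that server's endpoint, and hands back the executed command.

```python
import pickle

class CompressFilesCommand:
    """A command object that should run on a specific target server."""
    def __init__(self, target_server, pattern):
        self.target_server = target_server  # set at runtime by the caller
        self.pattern = pattern
        self.result = None

    def execute(self):
        # Would touch the file system on the machine it runs on.
        self.result = f"compressed files matching {self.pattern}"
        return self

class CommandRouter:
    """Routes serialized commands to the right server endpoint.
    Here the 'servers' are in-process stubs; in reality each entry
    would be a remoting or socket endpoint on another machine."""
    def __init__(self):
        self.endpoints = {}

    def register(self, name, executor):
        self.endpoints[name] = executor

    def dispatch(self, command):
        payload = pickle.dumps(command)            # serialize, as remoting would
        executor = self.endpoints[command.target_server]
        result = executor(payload)                 # "send" to the target server
        return pickle.loads(result)                # executed command comes back

def fake_file_server(payload):
    """Stands in for the remote server: deserialize, execute, return."""
    cmd = pickle.loads(payload)
    return pickle.dumps(cmd.execute())

router = CommandRouter()
router.register("fileserver01", fake_file_server)
done = router.dispatch(CompressFilesCommand("fileserver01", "*.log"))
print(done.result)  # → compressed files matching *.log
```

The point is just that the routing decision lives outside the command itself, keyed on the server name the command carries.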
Thanks for any input.
Thanks bayu.
Prior to adopting CSLA I implemented this functionality with a custom TcpListener and TcpClient, which worked, but since platform independence wasn't an issue I eventually changed it to use remoting. There is so much plumbing already done in CSLA that I was hoping to leverage it now that I am using the framework, but it sounds like implementing my own remoting code outside of CSLA may be the best way to make this happen. I just wanted to make sure there wasn't some existing hook in the framework for something like this before I went off on my own.
The jobs that I am running will run asynchronously. Here is a simple example: every evening I want to compress files on a file server based on criteria the user has defined through the front-end. The code on the application server handles starting this compression job when it is due, but the job needs to run on the file server for performance reasons. So, at the designated time the app server sends a command object to the file server, the file server runs the compression job, then calls back to the app server with the results. The app server stores the results in the database so that the next time the user opens the front-end he can see that the job ran successfully, how long it took, etc.

Sounds like what I'll do is have the app server invoke the job through remoting on the file server, then have the file server update the status of the job (a CSLA business object) so the results are saved to the database. So, the job is a CSLA object and the file server can update it using existing CSLA functionality, but I'll have to launch the job outside of CSLA.
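The flow I just described could be wired up roughly like this (again a Python sketch of the shape only, not actual CSLA/.NET code; the `JobStatus` class, its `save` method, and the dictionary standing in for the database are all hypothetical): the file server runs the work, then updates a status object whose save goes back through the app server's normal persistence path.

```python
import time

class JobStatus:
    """Stands in for the CSLA business object that holds job results;
    save() would go through the normal data portal on the app server."""
    def __init__(self, job_name):
        self.job_name = job_name
        self.succeeded = False
        self.duration = 0.0

    def save(self, database):
        # In CSLA this would be the object's normal Save() round-trip.
        database[self.job_name] = (self.succeeded, self.duration)

def run_compression_job(status, database):
    """Runs on the file server; does the work, then reports back
    by updating and saving the status object."""
    started = time.time()
    # ... compress the files selected by the user's criteria ...
    status.succeeded = True
    status.duration = time.time() - started
    status.save(database)   # file server persists the result via the app server

db = {}
status = JobStatus("nightly-compress")
run_compression_job(status, db)
succeeded, duration = db["nightly-compress"]
print(succeeded)  # → True
```

Only the launch of the job happens outside CSLA; the result reporting rides on the existing business-object plumbing.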
davidk: I just wanted to make sure there wasn't some existing hook in the framework for something like this before I went off on my own.

None that I know of ... so I think you will have to do it yourself. However, you are in fact leveraging a framework ... it's just not CSLA. ;-) Having experience with TCP listeners (which you apparently have too), this is already quite a time-saver.
Copyright (c) Marimer LLC