Code Generators and UML Modeling tool


Old forum URL: forums.lhotka.net/forums/t/215.aspx


Moh_Abed posted on Sunday, May 28, 2006

Hi,

Using a code generator to generate CSLA classes is a productivity boost, but it violates the concept of object-oriented design: it generates classes from tables, forcing a data-oriented design. So if we need an application to be fully object-oriented, no matter what the database schema is, then we have to write the classes by hand.

What I am raising is that it would be better to find a code generator for CSLA that generates code from class diagrams. We would design our application in a fully object-oriented way through class diagrams, and the tool would then generate the classes and source files, leaving developers to concentrate on writing business logic - and gaining some benefits from MDA.

Most modeling/CASE tools generate XMI files defining the schema of a class diagram, and some of them generate code in a specific language like C#, but they don't have a customizable code generator. The level of customization needed may exceed plain code generation (like defining a check box to determine whether a class is a root or a child...).

Does anybody know of a tool that generates code from class diagrams (or XMI files) and could be customized to generate CSLA code?

DavidDilworth replied on Tuesday, May 30, 2006

I'm not aware of any tools that do that right now.  But that doesn't mean there aren't any :)

Your post raises important questions about how we should use code generation to create our problem domain BO classes.

I'm currently reading a book entitled "Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools (by Jack Greenfield et al)" that discusses the concept of completely automating the software development process.  As part of the discussion in the book it talks about the issues of how code should be generated from models as part of the whole development process.

However, the current tools we have (VS 2005 for example) don't raise the abstraction level of the problem domain sufficiently to make these types of tools useful just yet.  So we're stuck with using ORM tools or code generation templates that work from relational databases.

It's an interesting topic, even though I don't fully agree with all the points made by the authors.

 

Moh_Abed replied on Monday, June 05, 2006

Currently I am investigating modeling (MDA) tools. Enterprise Architect looks like a very good candidate, as it has a flexible MDA engine with code generation and transformation templates, but the template language seems to be very complicated.

It would be great to reach the point of designing the object model using a UML class diagram, adding stereotypes like Root, Child Collection, etc., and then having the tool generate all the CSLA-customized code, leaving the developer to concentrate on writing business logic only.

DennisWelu replied on Monday, June 05, 2006

I think you have raised a question I've asked myself many times on this forum. The code generators I've seen are generally built from the database schema. Depending on the project, that may be convenient. It may be a more practical approach given that there is probably more community knowledge and standards for working against an RDBMS data source. But I thought it would be nicer to codegen from a UML model also, and that led me also to Enterprise Architect.

I like EA a lot because 1) it's cheap, 2) it's relatively easy, 3) it doesn't try to do everything but 4) it does a lot of things across the development life cycle, and finally 5) it's extensible. The template language is indeed proprietary and complicated, but I like the automation model it exposes a lot. It was relatively easy to build an add-in that would bring up my codegen dialog based on the current diagram. From there I could generate/regenerate my C# business classes with my own generator logic. The built-in class properties and generic tags gave me enough flexibility for metadata to generate the classes into my Visual Studio solution the way I wanted them.

Now, I've successfully used this on only a small non-production prototype so far. But I'm about to find out how well the process works for a larger from-scratch development project. We'll see how it goes!

Dennis

 

SolPub replied on Sunday, November 26, 2006

Hi Dennis,

I'm just about to get started with EA. Do you know of any pre-built CSLA 2.1 templates for this tool? Thanks for your help.

 

Byron

DansDreams replied on Monday, November 27, 2006

"I like EA a lot because ...5) it's extensible"

Are we talking about Visio for Enterprise Architects?

By extensibility do you mean that you could get it to the point where you add a property in the UML and it writes the proper CSLA property accessors, including the CanReadProperty() check?

I continue to struggle with the legitimate ROI of code generation once you spend hundreds of hours tweaking things.  Just one example.... I have a root business object that has several properties that are collection objects.  I want two of them to lazy load but not the other two.  The collections all come from different stored procedure calls between two different back end databases.

Things like that are fairly trivial to code by hand, but it seems there are hundreds and hundreds of permutations of how BOs are related to each other and to the database.  Obviously, the value of generation is only realized when there's a fairly obvious and significant imbalance between time to tweak the generator and to generate the code.

There is then the case of needing to make a breaking change once you have a complex application.  In that case the imbalance is fairly obvious since we're comparing one change to the generator vs. numerous changes directly to the BO code.  But then the question becomes when we make a big breaking change like implementing LINQ are we essentially going to have to rewrite the whole generator paradigm again anyway?

Ok, I'm rambling.  This is one of those conversations I'd really like to have over a steak with somebody that's beat the daylights out of code generation and still finds it valuable.

Q Johnson replied on Monday, November 27, 2006

The fact that code generators run from RDBMS table metadata has often been held up as a liability here.  I am a bit confused by this.  Sure, we want to define our objects according to their needed behavior and responsibility.  But they will still boil down to properties and methods.

And it's so easy to make tables with our database tools that it doesn't seem a real stretch to me to just build tables for the express purpose of code generation, whether or not the BO generated from them will use those particular tables for persistence.  (And it helps to begin the names of all these with a 'cg' prefix or some such, so that we don't confuse them with tables that will actually store data for the application.)

I find myself doing this frequently, for example, for BOs whose properties are found in more than one table.  I just join the tables in a view, change the view to a make-table query, and generate the table.  Then I run the code generator.  It's still FAR less work than coding the BO by hand.

And regarding persistence, whatever T-SQL actually goes in the sProc whose name was generated for my BO will take care of that quite independently of how the BO was generated.  So we can store the BO in the most useful way for the database without regard to the BO structure.

What's not to like?

It sounds like many folks want to start with a modeling tool (e.g., EA) that generates some sort of useful file output.  Then that output could be used by a code generator to create both BO code and DDL code for creating the DB schema.  But it seems to me that this pesky ORM issue is probably going to make consistently achieving this a real challenge.  How would your definition of the object in the modeling environment know whether that object was to be persisted to a single table or might require more than one?

Until I find an easy way to answer that problem, I'm just going to be content saving lots of time building my database to store what my application requires and to provide metadata for BO generation (even if special tables need to be created for that express purpose).

Am I missing something important here?

Thanks,

 

 

DansDreams replied on Monday, November 27, 2006

Q, that's an interesting approach.  You're right, it solves the big question of what tool to use to model the business layer, given that it could be quite different from the data layer schema.

But that still seems so inadequate.  You can model easily enough that there's a property called FullName in the Customer BO... but what about the fact that it doesn't come from a database column?  I can add a property lickity split with a snippet, so I don't see that as being the big value of generating.  It's all the appropriate dr.GetString("propertyName") mundane plumbing that goes with it that really brings the value.  And to be able to know when to do that in a business object that may load from several tables across multiple databases, using a stored procedure in one case and an ad hoc query in another, it seems to me you need a whole lot more metadata in the model than just the fact that a property exists.
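To make that plumbing point concrete, here is a toy sketch of the kind of metadata-driven generation being discussed. Every name in it (the templates, the metadata keys, the emitted C#-style identifiers) is hypothetical, not taken from any of the tools mentioned in this thread:

```python
# Toy metadata-driven generator: the value is in emitting the mundane
# plumbing (accessor + data-reader mapping), not the property name itself.
# All template shapes and metadata keys here are made up for illustration.
PROPERTY_TEMPLATE = """\
public string {name}
{{
    get
    {{
        CanReadProperty("{name}", true);
        return _{field};
    }}
}}"""

FETCH_TEMPLATE = '_{field} = dr.GetString("{column}");'

def generate(prop):
    """Emit the accessor and the fetch-plumbing line for one property."""
    field = prop["name"][0].lower() + prop["name"][1:]
    accessor = PROPERTY_TEMPLATE.format(name=prop["name"], field=field)
    # A derived property (no backing column in the metadata) gets no fetch
    # plumbing at all -- exactly the distinction the post is asking for.
    if prop.get("column"):
        fetch = FETCH_TEMPLATE.format(field=field, column=prop["column"])
    else:
        fetch = None
    return accessor, fetch

acc, fetch = generate({"name": "LastName", "column": "last_name"})
print(acc)
print(fetch)  # _lastName = dr.GetString("last_name");
```

The interesting branch is the `column` check: a derived property like FullName carries no backing column in the metadata, so the generator emits the accessor but skips the data-reader plumbing entirely.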

Q Johnson replied on Monday, November 27, 2006

DansDreams:

it seems to me you need a whole lot more metadata in the model than just the fact that a property exists.

That's one of the two major reasons I chose MyGeneration over CodeSmith.  The MyMeta utility available in MyGeneration can be very powerful.  You have to support it with code in your templates that reacts to the presence of those Metadata values, but it is often a very worthwhile trade-off.  At the time that I was comparing the two, that metadata seemed to do all you can do with CodeSmith's properties and quite a bit more.  And it can use Views and sProcs as well as tables for metadata sources.  Since you can add your own metadata to tables as well as fields, you can get properties added without having to add those fields, if that's important to you.  And it's pretty easy to protect hand-crafted code so that subsequent re-generation doesn't trash it (very handy for .NET 1 folks who don't have partial classes available!).

The other major reason (I'm guessing someone might be curious) is that you can write the templates in VB (or in C#).  I have managed to program many years without ever having been asked to do a web app.  So the ASP-like language used for CodeSmith isn't particularly appealing to me.  Developing my templates right in Visual Studio (with intellisense and all the other code support) on the other hand, is a real benefit.

DansDreams replied on Tuesday, November 28, 2006

That all sounds very interesting Q.  Are there MyGeneration templates for CSLA 2 available - is there as much community support in that sense as there seems to be for CodeSmith?  Where's the starting point for MyGeneration?

I currently have a contract developer working on a design for a metadata generator.  It's not a from-scratch project because he'd be leveraging some work he's already done, but it would be a significant cost to me. 

The reason I'm doing that is because my perception is that everybody is thrilled to get a 5% productivity boost and to me it hardly seems worth the trouble.  (Exaggeration intended.)  I looked at CslaGen, and while it is interesting being a purpose-built generator, the metadata editing UI is a long way from what I think is necessary to really take this generation idea to its fruition.

I say all that to say that some of what you describe with MyGeneration sounds similar to what we've been discussing here.

Could you elaborate a little more on "...the metadata seemed to do all you can do with CodeSmith's properties and quite a bit more."?

simon_may replied on Tuesday, November 28, 2006

Like most who have contributed to this thread, I find code generation a fascinating subject, because if it could be implemented efficiently it would provide huge improvements in development productivity. I have in the past encountered MDA approaches and products from a management perspective rather than as a user, but concluded that they require an awful lot of work to get them working successfully in a particular application domain. The idea of sticking a domain specification in one end of a machine and getting a fully working system out of the other end is tantalising.

However, when I chucked in the management job and decided to get my hands dirty again, working for myself, I took a look at automating what I could. My first piece of luck was discovering CSLA, a solid framework around which business logic could be implemented, but it was obvious that code generation was required.

I spent some time with CodeSmith but found the template syntax a pain (being based on ASP - who dreamed up a tag opener that requires you to type a <, pressing Shift plus a key at the bottom of the keyboard, followed by %, again Shift and now a key in the top row? User friendly? I don't think so, Microsoft). I also found it challenging to generate conditional code segments (where the code to be generated depends on a specific attribute in the model).

Like others here, I also found generating from the schema a pain, given that I was mostly writing new systems. Anyway, long story short, I decided to create my own metadata model, allowing me to describe my domain in a set of XML files and from those generate the business objects, DDL, and stored procedures.

Version 1 was a good pre-prototype, as I was able to prove the concept to myself, and with the advent of CSLA version 2 I moved to .NET 2 and rewrote it. This time I attempted to bring in support for OO concepts such as inheritance and composition - concepts necessary if I was ever to map from UML or XMI (which is outside the current scope). I have abandoned CodeSmith in favour of StringTemplate for my template engine.

Sorry about the ramble and life history. What prompted this contribution were the questions and statements made by others about adhering to the purity of object-oriented design in the solution space. Having been there, and having attempted to model it and generate CSLA-based code, I have come to the conclusion that as soon as you bring a relational database into the equation you are going to have to compromise on that purity.

The big one is how to support polymorphism, particularly polymorphic collections of business objects with virtual methods, where the actual override lives in a subclass and needs to access data only available to that subclass.

Also there is a little bit of work required to get CSLA to support business object inheritance.

If anyone is doing the same sort of stuff and working on these and other problems then I would be pleased to talk to them.

Regards

Simon

 

 

Q Johnson replied on Tuesday, November 28, 2006

DansDreams:

That all sounds very interesting Q.  Are there MyGeneration templates for CSLA 2 available - is there as much community support in that sense as there seems to be for CodeSmith?  Where's the starting point for MyGeneration?

www.mygeneration.com

There isn't nearly as much community support.  AFAIK, it's just me (using VB) and Guy Rochon (using C#).  He evangelized me and I'm sort of continuing that idea here.  I only have templates for 1.53.  He only has templates for 2.x (again, AFAIK).  He might jump in here and offer more or you could ask him directly. 

....The reason I'm doing that is because my perception is that everybody is thrilled to get a 5% productivity boost and to me it hardly seems worth the trouble.  (Exaggeration intended.)  I looked at CslaGen, and while it is interesting being a purpose-built generator, the metadata editing UI is a long way from what I think is necessary to really take this generation idea to its fruition.

I think the boost is a little closer to 25-30% to tell you the truth.  But it's pretty darned hard to quantify for me.  The project I just finished used JET instead of SQL Server so I had to "tweak" my templates considerably for the different back end.  MyGen supports it just fine, but the OleDb instead of Sql object calls and the syntax for the sProcs I generate all needed to be output differently for my BOs.  Still, after spending the better part of a day modifying four BO templates, I generated about 2800 lines of code for the major BOs of the project.  And I think it would have taken me at least several days using another quasi-automated tool.  By that I mean either using a BO template and adding all the properties and DP_xxx methods by hand, or even adding that same code to the output produced by the .NET macros that have been offered here in the past.

After the initial generation, there was still a heck of a lot of UI stuff and report work to do for which I don't have any templates developed (or even planned yet).  But regeneration worked great when it was needed.

I say all that to say that some of what you describe with MyGeneration sounds similar to what we've been discussing here.

Could you elaborate a little more on "...the metadata seemed to do all you can do with CodeSmith's properties and quite a bit more."?

I think the easiest way for me to communicate the metadata capabilities would be to demo them for you over a remote connection.  I'd be happy to offer 30 minutes or so of that if you have the time for it.  Just email me at QJohnson at TeamNFP dot nospam com to set up a time convenient for us both.  I'm on Central Time and am in the office from about 8 to 8 every day this week.

I did this for Chris Denslow a while back, by the way.  I haven't touched base with him on the subject in some time, but my guess is that he has decided to stick with CodeSmith for which there is obviously so much more community support.  You just have to decide whether it's worth "working alone on it" because you like the tool enough for that.  I was reluctant myself at first.  But when I finally realized how much I was going to want to customize any tool's templates for my own use, I knew I wanted to be doing it with MyGen.  YMMV of course.

Q Johnson replied on Tuesday, November 28, 2006

DAN!

YIKES!  SORRY!  I gave you the wrong URL for MyGeneration in response to your question about the starting point for it.  It should have been:

www.mygenerationsoftware.com

 

MvdHoning replied on Tuesday, November 28, 2006

As the title of this thread implies: using UML to generate business objects for CSLA. An idea that appeals to me! Generating business objects from the database just feels wrong to me (it also misses dynamic properties that are not stored in the db).

The Enterprise Architect that is talked about - is that this one: http://www.sparxsystems.com/ ?

Maybe then we could generate both the database structure (if needed) and the business objects from the UML, making the UML the central point in development.


DansDreams replied on Tuesday, November 28, 2006

I will definitely take you up on that Q but probably not until the beginning of next year.

DennisWelu replied on Tuesday, November 28, 2006

There have been some good questions and points raised here!

 

The familiarity of developers with relational technology, the rich set of tools built around it, and the type of project/organization may make starting from a db schema a fine choice. Other things being equal I would prefer not to because it seems that a conceptual/domain model is a more natural canvas for expressing these business objects. Outside of codegen, for us the model serves as a centering point for requirements discussions as we kick off each iteration.

 

I struggled with the various “languages” and templates for codegen in the tools out there. I really like Enterprise Architect (the product from Sparx Systems) but found their built-in codegen framework similar to CodeSmith: another language to learn and hard to do more than simple tweaks. It would sure be nice to work with csharp/dotnet and have full control, I thought.

 

Another observation is that many of the “off the shelf” codegen templates/tools try to do more than what I need on any given specific project. There are a few basic patterns of business objects on a project that cover the majority of ground. Rather than trying to get something that does everything I thought we could solve for the more common scenarios and get the process useful – then evolve it from there.

 

That is how I came to building a csharp addin for EA. We use custom tags in the UML model to give guidance to our codegen addin on what to generate. We also are transforming the domain model to E/R which we can export as a DDL script. Does it generate 100% of the output that could be generated? Definitely not. What doesn’t get generated gets done in other ways. But it handles the mainstream in a comfortable and controllable manner.

 

I was interested to see in a post somewhere when someone asked Rocky what he used for codegen and he said they had a custom solution. There certainly isn’t “one” tool that works for everyone.

 

mr_lasseter replied on Tuesday, November 28, 2006

Q,

I would be interested in seeing your myGeneration Templates if you don't mind posting them somewhere.

Thanks,
Mike

Q Johnson replied on Tuesday, November 28, 2006

Here are the four I use for BOs (ERO, ECO, ECC and ROC).  I also have some for sProcs and my BindControls routine.  I'm just going to include those here, too.  But keep in mind, this is only CSLA 1.x.  There's also an excel file in which I tried to start documenting my use of the metadata there so that Parent-type ERO BOs would know which Child and ChildCollection objects they had (and Children types would know their parent).

If you start another thread to get Guy's attention, he might let you see his C# templates for 2.x.

I have resisted the urge to post these in MyGen's template library because not everyone wants to generate the way I do.  But I guess that's always going to be true so I may as well get them out there and see if anyone can do anything worthwhile with them.  Post back here if you need any help.

 

guyroch replied on Saturday, December 02, 2006

Q Johnson:

There isn't nearly as much community support.  AFAIK, it's just me (using VB) and Guy Rochon (using C#).  He evangelized me and I'm sort of continuing that idea here.  I only have templates for 1.53.  He only has templates for 2.x (again, AFAIK).  He might jump in here and offer more or you could ask him directly. 

Nice to see you're still using MyGeneration Q. :)

However, due to legal issues I’m not at liberty to disclose my templates - sorry :(.  But Q nailed it pretty darn good - ah, if only I were a good poet like Q :).

I’ll get straight to the point.  MyGeneration was developed in .NET 1.1 and comes with its own UI.  You can do a decent job with the built-in UI, but I’ve learned quickly that IntelliSense is one thing I cannot do without.  Instead, I’m referencing the MyGeneration assemblies from my own .NET project and coding my MyGeneration templates using Visual Studio - it’s awesome.

I’m not intimate with all the other generation tools, but I can guarantee you that if you can code in .NET (VB or C#) then you can code templates in MyGeneration - it’s the same thing.

Like any other tool, though, it does need some TLC to get used to.  I remember last year when Q approached me about MyGeneration.  It took several emails between him and me and a WebEx session of about 30 minutes, and he was hooked - I’m glad to see he’s still going at it a year later.

The issue of code generation has been rehashed time and time again - and I’m sure we will be having this same conversation for years to come.  But that’s fine; we can’t change the world all at once, but if we try we can change it one person at a time.

At the end of the day, though, you must choose the tool that best suits your needs, and if MyGeneration is it, then I’m glad; if not, that’s fine too.  The fact remains that _most_ code generation tools read database schemas for input, and most CSLA’ers want some sort of tool that reads from XML or UML.  So until someone is willing to undertake this project and actually create such a tool, we will have to make do with what we have.  For the time being, I’m using MyGeneration and it does the job for me, it really does.

Oh, one more thing... Mike Griffin, the creator of MyGeneration, is much like Rocky in that his tool is free and he is as active in his forum (http://www.mygenerationsoftware.com/phpbb2/index.php) as Rocky is in the Csla forum.

Enjoy... 

xal replied on Monday, December 04, 2006

Anybody wanting to try something new could go to http://groups.google.com/group/CslaGenerator/ and try that one out.

Yes, you must have a db schema in order to use it, but you can easily design BOs to be just the way you want them.  Add whichever properties you like from each table.  In many of these cases, the tool will figure out how to generate the correct stored procs; for the rest, you can always adjust them yourself.

I won't say much about the tool because I'm biased, so download it and draw your own conclusions.  What I will say is that I use it for my everyday tasks and it's an incredible aid.

Andrés

xAvailx replied on Monday, November 27, 2006

>>

Using a code generator to generate CSLA classes is a productivity boost, but it violates the concept of object-oriented design: it generates classes from tables, forcing a data-oriented design. So if we need an application to be fully object-oriented, no matter what the database schema is, then we have to write the classes by hand.

<<

I guess this depends on what tool you are using for code generation. I use CodeSmith. Out of the box, CodeSmith supports databases (SchemaExplorer) and XML as metadata sources. You can even create a custom metadata source. So with CodeSmith, you are not constrained to generating classes from tables.

http://www.codesmithtools.com/usersguide/

>>

Does anybody know of a tool that generates code from class diagrams (or XMI files) and could be customized to generate CSLA code?

<<

I have looked into this in the past. Visio has an XMI export add-on, but I quickly gave up after much frustration with the XMI schema. Instead we created a custom metadata provider in CodeSmith.

http://msdn2.microsoft.com/en-us/library/aa140339(office.10).aspx


HTH

RBrown replied on Tuesday, November 28, 2006

Since Rockford's book states that the CSLA .NET framework lends itself well to codegen tactics, I'd love to get his input on this very interesting thread.

Also, since CodeSmith has published codegen templates specifically for CSLA and has a substantial user community, one would think that it is a better choice for those of us just in the beginning stages of finding a codegen solution.  It sounds like others have done fine rolling their own with other codegen products as well.  For me, I'm going to be more interested in finding something that doesn't require starting the template creation process from scratch; after all, the reason for codegen in the first place is to increase a project's ROI.  And because of the differences between table structures and object design, I think you'll always need to "tweak" the end result regardless of what product you use.  Just my 2 pennies...

david.wendelken replied on Saturday, December 02, 2006

The key to using code generation productively is to use it for the right tasks.

That said, how does one know what the right tasks are?

Here's how I do the math:

  1. How long would it take me to do one of these by hand?
  2. How many of them are there to do?
  3. How boring are they to do?  (I.e., will I slow down out of boredom if I'm doing a lot of them at one time?)
  4. How many mistakes per item will I make on average?
  5. How long will it take to correct each actual mistake?
  6. How long will it take to correct the collateral damage from the mistakes?  (I.e., the downstream damage caused by allowing bad data, destroying test data, etc.)

So, the basic formula goes something like this:

(((Time to do one item by hand) +  (Extra boredom time factor))
* (Number of items to do) )

+ ( (Avg Number of Mistakes per item) * (Number of Items to do) * (Time to Correct Each Mistake))

+ (Correcting Collateral Damage Time)

So, let's say that I have 100 business objects to do.  I can do one in ten minutes, except that as I slog thru dozens of them I start to think about more interesting things like my wife's smile, so we'll allow for twelve minutes apiece.  That's (10 + 2) * 100 or 1200 minutes for the base work.

I'll estimate one mistake per business object (leaving out a column, misspelling one, wrong datatype, etc.)  Each mistake will take 1 minute to fix and 1 minute to check in and out of source control, for a total of 2 minutes per mistake.  That's 1 mistake * 100 items * 2 minutes per mistake, or 200 minutes.  Thankfully, I don't work in a paperwork-intensive shop, or it might be an additional 5 minutes of paperwork per mistake...

Out of those 100 errors, at least 2% are bound to cause some downstream damage.  One will cause another programmer to waste half a day trying to figure out what's wrong with their code before they find out it's my fault.  That's 4 hours, or 240 minutes.  The other mistake may only cause about an hour's worth of lost time while I research it and reload some trashed test data, for another 60 minutes.

So, my planned cost to do it by hand is 1200 + 200 + 240 + 60, or 1700 minutes - more than 40% above the base work alone.  Of course, if just one of those mistakes takes a day or two to sort out, we could easily double or triple the true time to do this task by hand.

1700 minutes is 28 and a third hours.  That's my time budget to build a code generator to do that work.
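Plugging the example numbers into the formula, a quick sketch (the variable names are mine, for illustration only) reproduces that budget:

```python
# Cost of hand-coding 100 business objects, per the formula above.
time_per_item = 10            # minutes to hand-code one object
boredom_factor = 2            # extra minutes per object due to tedium
items = 100
mistakes_per_item = 1
fix_time_per_mistake = 2      # 1 min fix + 1 min source control
collateral_damage = 240 + 60  # two downstream incidents, in minutes

base_work = (time_per_item + boredom_factor) * items             # 1200
mistake_cost = mistakes_per_item * items * fix_time_per_mistake  # 200
total = base_work + mistake_cost + collateral_damage             # 1700

print(total, round(total / 60, 1))  # 1700 minutes, ~28.3 hours
```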

Unless, of course, I think it's likely that we'll make systemic changes to the objects as time goes by - during the first release of the product only.  Then I get to add in the "upgrade time" to the initial time budget.

I follow the 80-20 rule.  80% of the benefit of code generation can be gained for 20% of the work.  The next 20% takes way more effort to get additional results. 

So, if I've analyzed my work and realized that 80% of my mistakes are due to one or two items, and I can build a code generator to do just that portion of the code, then the payback on generator prep time vs hand-coded time will be higher! 

If I've been careful to clearly separate generated stuff from non-generated stuff, I can continually re-run the generators as the specifications change without fear of overwriting manually coded changes.  This makes the payback higher still.  It also means that I can relax my change control processes because, for many types of change, I simply do not need to care that one took place.  The code will automagically get updated the next time the generator gets run.  Otherwise, I have to track each change to the specifications and make sure that it gets rippled thru by hand.  (And the expected cost of that tracking should be added to the formula that determines my time budget to generate code.  Not taking this into account is where many code generators lose value.)

The reality is that many code generators are extremely simple to build provided they focus on solving 80% of the problem and not the entire problem.  Going back to my example, let's assume that I don't try to generate the entire business object, just key portions of it.

Let's say I can save 5 minutes per object plus cut my error rate by 80% by building a generator to do the boring but simple parts. 

So, that's (5 minutes per object + 2 minutes boredom factor) * 100 objects, or 700 minutes saved on the base work.  Add in the 80 errors I avoid (80% * 100 errors) and I save another 80 * 2 minutes, or 160 minutes.  We'll assume the collateral damage remains the same.  That gives me a budget of 860 minutes, or about 14.3 hours, to build those generators.
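The same arithmetic for the partial-generation scenario, again with illustrative variable names:

```python
# Time budget for a generator that covers only the boring 80% of the work.
items = 100
saved_per_item = 5 + 2           # 5 min coding + 2 min boredom saved per object
errors_avoided = int(100 * 0.8)  # 80% of the one-error-per-object rate
fix_time_per_mistake = 2         # minutes per avoided error

budget = saved_per_item * items + errors_avoided * fix_time_per_mistake
print(budget, round(budget / 60, 1))  # 860 minutes, ~14.3 hours
```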

As an example, I can build a stored procedure generator in 2 to 4 hours if I know the database meta-data tables well.  (Then again, I've had a lot of practice building code generators!)

Hope this helps!

PS - This technique of making the full cost of a task visible can be extremely effective in keeping over-optimistic colleagues from getting management's approval to lead you down a bad development path.

 

 

Copyright (c) Marimer LLC