I have an application that requires us to populate an object model with external data at start-up and then, when the user elects to, save the entire object model in one pass to the database. Because the populated object model can contain anywhere from 1 to 10,000 objects spread across its 4 levels of depth and several siblings of width, the insert takes forever!
Once the first save is performed, everything is fine, because we can rapidly navigate the object model and commit only the objects that have changed to the db. But it is this initial save (INSERT) that is killing us.
So, I began thinking about the possibility of having some kind of bulk/batch insert controlled by the containing collection when the collection is flagged as "new". But, for one, I'm not sure how to even approach the SQL part of it and, two, how to propagate this down through the object model the way we do now.
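For the SQL side, one common pattern is a parameterized batch insert: while walking the "new" collection, gather each object's column values into a list, then send them to the database as a single multi-row batch inside one transaction instead of one round trip per object. A minimal sketch in Python with sqlite3 (the `widget` table and its columns are made up purely for illustration):

```python
import sqlite3

# Hypothetical schema: one table standing in for one level of the object model.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE widget (id INTEGER PRIMARY KEY, name TEXT, parent_id INTEGER)"
)

# Instead of one INSERT per object, first gather all "new" rows...
rows = [(i, f"widget-{i}", None) for i in range(1, 1001)]

# ...then commit them in a single batch inside one transaction.
with conn:
    conn.executemany(
        "INSERT INTO widget (id, name, parent_id) VALUES (?, ?, ?)", rows
    )

count = conn.execute("SELECT COUNT(*) FROM widget").fetchone()[0]
print(count)  # 1000
```

The big win is the single transaction: committing 10,000 rows once is dramatically cheaper than 10,000 autocommitted statements. Most databases also offer a dedicated bulk-load path (e.g. bulk copy APIs) that goes further still.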
I've also thought about multi-threading the save process, but I'm worried about data concurrency and collisions if we allow the user access to the data through the UI while we are still committing changes. So I considered creating a clone of the root object (and all of its children, grandchildren, etc.) and saving that, but how then do we mark the original objects as clean after the save?
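The clone idea can be sketched as: deep-copy the root, hand the copy to a background save, and let the UI keep working on the original. A hedged illustration (the `Node` class and `save_graph` function are hypothetical stand-ins, not anyone's real API):

```python
import copy
import threading

class Node:
    """Hypothetical stand-in for a business object in the graph."""
    def __init__(self, name):
        self.name = name
        self.is_dirty = True   # new objects start out dirty
        self.children = []

def save_graph(root, saved):
    # Placeholder for the real INSERT logic; records what was persisted.
    saved.append(root.name)
    for child in root.children:
        save_graph(child, saved)

root = Node("root")
root.children = [Node("child-1"), Node("child-2")]

snapshot = copy.deepcopy(root)   # the UI keeps editing `root` untouched
saved = []
t = threading.Thread(target=save_graph, args=(snapshot, saved))
t.start()
t.join()
print(saved)  # ['root', 'child-1', 'child-2']
```

The open problem this leaves, as noted, is reconciling state afterwards: the snapshot was saved, but the originals still carry their dirty flags (and any edits made during the save would need merging).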
Does anyone have any thoughts on how to address this or had any experience with this issue?
Thx in advance.
#2 is exactly the direction I was thinking, but how then do we clear the IsDirty flag from all of the objects?
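Assuming each object exposes its child collections, clearing the flags can be a simple post-save traversal: once the batch INSERT commits, walk the graph once more and reset every flag. A minimal sketch (class and method names here are illustrative, not a real CSLA API):

```python
class BusinessObject:
    """Hypothetical stand-in for a CSLA-style business object."""
    def __init__(self, children=None):
        self.is_dirty = True          # unsaved objects start out dirty
        self.children = children or []

def mark_clean(obj):
    # After the save commits, recursively clear every dirty flag.
    obj.is_dirty = False
    for child in obj.children:
        mark_clean(child)

def any_dirty(obj):
    # Helper to verify the whole graph is clean.
    return obj.is_dirty or any(any_dirty(c) for c in obj.children)

root = BusinessObject([BusinessObject(), BusinessObject([BusinessObject()])])
mark_clean(root)
print(any_dirty(root))  # False
```

The key point is to run the traversal only after the transaction commits, so a failed save leaves the flags intact and the objects still eligible for retry.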
Copyright (c) Marimer LLC