We're trying to work out how to QC and replicate data between our editable databases and our enterprise database. We essentially have three databases:
- an enterprise database (mostly state plane) that is the organization's master db for all enterprise users
- a read-only db (web mercator) largely for use by web mapping applications
- a read/write db (web mercator) for use by editing web mapping applications (this one isn't set up yet -- it's why we're looking for best practices)
We are not using versioning, networks, topologies, or any of the more advanced geodatabase features; we are essentially using non-versioned feature classes in various projections. All are SQL Server geodatabases, some at 9.3 and some at 10.0, and our goal is to upgrade all of them to 10.1.
Currently there is a fair amount of processing built into the enterprise database in the form of triggers, stored procedures, etc. Our experience with offline mobile editing has been that when edited features are loaded back into the db, all triggered processes work as expected. Feature classes are published from the enterprise db to the read-only web mapping db on a regular basis using Python scripts and Windows Task Scheduler.
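For context, the publish step is essentially a full overwrite per feature class. A minimal sketch of what such a script looks like, assuming 10.1's arcpy; the connection file paths, feature class names, and the 3857 output projection below are placeholders, not our actual setup:

```python
# Sketch of a scheduled publish script: overwrite the read-only web db's
# copies of selected feature classes from the enterprise db.
# All paths and names here are placeholder assumptions.
import arcpy

ENTERPRISE_SDE = r"C:\connections\enterprise.sde"  # master db (state plane)
WEB_RO_SDE = r"C:\connections\webmapping_ro.sde"   # read-only web db

# Publish only these feature classes, not the whole geodatabase
PUBLISH_LIST = ["Parcels", "Roads", "Hydrants"]

arcpy.env.overwriteOutput = True
# Reproject to web mercator on output (WKID 3857 assumed for the web db)
arcpy.env.outputCoordinateSystem = arcpy.SpatialReference(3857)

for fc in PUBLISH_LIST:
    source = ENTERPRISE_SDE + "\\" + fc
    # A full overwrite is acceptable here because the target is read-only,
    # so no edits in the web db can be lost.
    arcpy.FeatureClassToFeatureClass_conversion(source, WEB_RO_SDE, fc)
```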
The next nut we have to crack is to enable editing via mobile and web applications while allowing for QC of the data before it is loaded into the enterprise db. This is the reason for the third, read/write db.
We're trying to nail down a workflow or workflows that:
- minimize potential for errors
- are entirely automated
- allow for QC of edited data when necessary
- allow for timely -- in some cases immediate -- viewing of edited data in the read-only and enterprise databases
- avoid/minimize editing conflicts
A logical workflow would be to QC the read/write web mapping db after edits are made there, replicate to the enterprise db once the data is QC'ed, and then propagate the QC'ed data to the read-only instances (web mapping or otherwise). Our concerns with automating this are integration with the processing in the enterprise geodb (triggers, etc.) and maintaining (at least) three copies of the data. We would not necessarily replicate all feature classes in the editing db; more typically we would copy only a subset of feature classes at any given time (in real time where QC is not necessary). A sketch of the kind of post-QC copy step we have in mind follows below.
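To make that concrete, here is a hypothetical sketch of the post-QC promotion step. It assumes a whole-feature-class overwrite is acceptable (i.e., the editing db is the only source of edits for that class) and uses delete-and-append rather than a drop/recreate, so the enterprise db's triggers still fire on the inserted rows. The connection paths and QC'ed list are placeholders; in practice the list would presumably be driven by a QC-status table:

```python
# Hypothetical post-QC promotion: push QC'ed feature classes from the
# read/write editing db up to the enterprise db. Placeholder paths/names.
import arcpy

EDIT_SDE = r"C:\connections\webmapping_rw.sde"     # read/write editing db
ENTERPRISE_SDE = r"C:\connections\enterprise.sde"  # master db

# Hardcoded for illustration; a QC-status table would drive this in practice
QCED = ["Hydrants"]

for fc in QCED:
    source = EDIT_SDE + "\\" + fc
    target = ENTERPRISE_SDE + "\\" + fc
    # Delete Rows + Append go through normal DML, so insert/delete triggers
    # on the enterprise tables fire just as they do for mobile check-ins.
    arcpy.DeleteRows_management(target)
    arcpy.Append_management(source, target, "NO_TEST")
```

That only covers the blunt overwrite case; change-only propagation is where we would expect geodatabase replication (or some change-tracking scheme) to earn its keep.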
We have not worked with database replication, but if it lends itself to a workable, reliable solution then we are definitely open to it.
Have others had similar issues to overcome and set up their geodb replication accordingly? Any best practices to recommend? Anything that made it easier to meet these requirements?
Any info would be greatly appreciated. Thanks in advance.