I have talked to a number of users who are asking for the capability to manage the Process Preparation needs of multiple locations around the world, and I would like to hear from you about which requirements you consider important to this discussion. For example, we have some customers running a single database across multiple locations, allowing them to share data across all their facilities. One downside of this approach is that each site depends on an external network connection back to the host server in order to continue developing new programs. Another possible downside is that data must be common across all sites, so data that needs to vary from site to site becomes harder to manage.
An alternative is to have each site run its own database, making each site more self-sufficient if and when external network connections go down. The downsides raised above for a single server do not apply to this approach, but multiple databases bring their own challenges. In particular, this scenario raises the need to share different types of data across the sites, such as design-specific data and library-specific data. Should this data be shared synchronously, or should it be more of a push/pull from site to site?
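To make the push/pull alternative more concrete, here is a minimal sketch of one possible site-to-site sync model. All names here (Site, Record, push_to) are hypothetical illustrations, not an existing product API; the example assumes a simple last-writer-wins rule based on per-record version numbers, and a real implementation would need proper conflict resolution, deletions, and transport.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    key: str
    value: str
    version: int  # incremented on every local update

@dataclass
class Site:
    name: str
    store: dict = field(default_factory=dict)

    def update(self, key: str, value: str) -> None:
        # Bump the version so other sites can tell this copy is newer.
        current = self.store.get(key)
        version = current.version + 1 if current else 1
        self.store[key] = Record(key, value, version)

    def push_to(self, other: "Site") -> None:
        # Push only records the other site is missing or holds an
        # older version of (last-writer-wins by version number).
        for key, rec in self.store.items():
            theirs = other.store.get(key)
            if theirs is None or theirs.version < rec.version:
                other.store[key] = rec

# Hypothetical sites sharing a library record.
paris = Site("paris")
tokyo = Site("tokyo")
paris.update("lib/fixture-42", "rev A")
paris.push_to(tokyo)
```

The key design question this sketch exposes is exactly the one above: a synchronous scheme would block the update until every site acknowledged it, while this push/pull style lets each site keep working offline and reconcile later.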
I would appreciate anyone posting comments here so that we can weigh the different advantages and disadvantages of each approach.