June 2, 2008 at 8:00 am
We are redesigning an application for use by multiple business units. The business units can share the same code (stored procedures, etc.) but will have different data (reference tables and transactional data). One goal of the project is to centralize the database programming into one database -- we would like to manage our stored procedures and functions in one database. One option we have considered is creating the stored procedures to use dynamic sql and pass parameters for the 'data' database (specifically the 'data' database name and schema). Are there any other methods of having a single 'code' database utilized by multiple 'data' databases?
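Roughly, the dynamic sql idea would look something like the sketch below (object names are placeholders, and a real version would validate @DataDb against sys.databases before executing anything):

-- Sketch only: a proc in the 'code' database that builds its query against a
-- caller-supplied 'data' database. Table and column names are placeholders.
CREATE PROCEDURE dbo.usp_GetCustomers
    @DataDb     sysname,
    @DataSchema sysname = N'dbo'
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql nvarchar(max);
    SET @sql = N'SELECT CustomerID, CustomerName FROM '
             + QUOTENAME(@DataDb) + N'.' + QUOTENAME(@DataSchema) + N'.Customers;';
    EXEC sys.sp_executesql @sql;
END;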
June 2, 2008 at 8:25 am
There are plenty of ways to do that. You could pass the business name as a parameter in each case and have IF...ELSE statements. You could have different connections for different businesses and use the various USERNAME functions. You could use UNION statements with those functions in the WHERE clauses (SELECT c1, c2 FROM business1.dbo.table1 WHERE USER_NAME() = 'business1' UNION ALL SELECT c1, c2 FROM business2.dbo.table1 WHERE USER_NAME() = 'business2').
However, you might want to take into account that having your procs in one database and your tables in another will result in a 5-6% reduction in performance for all actions in your database. (Cross-database queries, even on the same server, usually carry about that much of a speed penalty.)
You will also end up with poor execution plans in some cases, because you'll be using the same execution plan for different databases with tables with different statistics. That might also impact your performance (possibly quite badly). Maybe using recompile on everything would help with that, maybe not.
There are other ways to manage your procs and code in a multi-database environment like this. One is database comparison tools (Red Gate and Apex both have those). Another is to loop your ALTER scripts through all the relevant databases on a server. (I'd recommend that last option.)
That way, you're not having to build every query so it can use multiple databases (simplifies your code), your procs can store appropriate execution plans, and you're not doing cross-database queries.
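As a very rough sketch of that loop (the LIKE pattern and the ALTER body are placeholders for however you name the business-unit databases and whatever script you are actually deploying):

-- Sketch only: run the same ALTER script in every business-unit database.
DECLARE @db sysname, @sql nvarchar(max);
DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases WHERE name LIKE N'BU[_]%';
OPEN dbs;
FETCH NEXT FROM dbs INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- The nested EXEC gives ALTER PROCEDURE its own batch inside the target database.
    SET @sql = N'USE ' + QUOTENAME(@db) + N'; '
             + N'EXEC (N''ALTER PROCEDURE dbo.usp_GetCustomers AS SELECT 1;'');';
    EXEC (@sql);
    FETCH NEXT FROM dbs INTO @db;
END;
CLOSE dbs;
DEALLOCATE dbs;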
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
June 2, 2008 at 8:39 am
Thanks for the response. One inherent advantage of this application is that the number of transactions will be minimal and some delay in response time is acceptable, so the additional overhead may be mitigated. Your insight into possible negative effects on execution plans is a concern, and I will check into that should this option be pursued.
June 2, 2008 at 8:44 am
If you're going with ad hoc queries, save yourself the headaches and just write the code directly against each database. The performance won't be any worse and the development and maintenance will be considerably better.
I'd simply get the code into source control and automate your builds and deployments such that you can build X number of identical databases with the click of a button. My personal choice for something like this is Visual Studio Team Edition for Database Professionals, but it's damned expensive. Red Gate's SQL Compare will work off of individual files in source control, or you might look at DBGhost. Either of these will do what you need.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 2, 2008 at 8:49 am
Thanks for your response. I'm not exactly sure what you mean by 'ad hoc' queries, but the stored procedures will be exactly the same for each business unit. We have the sql utilities from Red Gate - thanks for the insight into using those tools.
June 2, 2008 at 9:03 am
Sorry, I said ad hoc when I meant dynamic. I didn't mean to be unclear.
Seriously though, you'd be better off with dynamic queries running on each database individually rather than managed through a central database. I'd even go so far as to say that you should use prepared statements in the application code to run the queries rather than stored procedures if you're going that far.
Otherwise, you really should be able to automate builds such that you can easily maintain multiple databases from identical code sets.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 2, 2008 at 9:28 am
You should consider using SYNONYMS in your 'data' DBs that point to the objects in your 'code' DB.
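For example, something along these lines in each 'data' DB (CodeDB and the proc name are placeholders):

-- Sketch only: gives callers a local name instead of hard-coding the
-- 'code' database everywhere.
CREATE SYNONYM dbo.usp_GetCustomers FOR CodeDB.dbo.usp_GetCustomers;

-- Callers in the data database then just write:
EXEC dbo.usp_GetCustomers;

-- Caveat: the proc still executes in CodeDB's context, so it still has to be
-- told (or otherwise work out) which data database to read from.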
* Noel
June 2, 2008 at 10:17 am
If the code is the same, why not deploy it to each database and call the code directly without worrying about dynamic SQL?
Having 40 copies of stored procedures in 40 databases doesn't really hurt you. There are any number of ways, including Red Gate tools (or others), to deploy to multiple locations.
June 2, 2008 at 10:24 am
Thanks to all for the responses. I am planning on using the database we are developing in as the source master and replicating to the other databases using the Red Gate tools.
June 2, 2008 at 11:33 am
If this is a total re-design, why wouldn't you just create a single database that included composite keys in the tables - one of the key fields indicating the business unit?
If you want to ensure you maintain the same code base and there is not a pressing business need for individual databases, I would lean toward having one database. This would also allow you to design in the flexibility of sharing some of the data across the business units where appropriate. If the company can see the benefit to centralization of some of the database, they may be convinced to centralize everything.
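As a rough sketch of what that might look like (table and column names are placeholders):

-- Sketch only: one shared table, with the business unit as the leading key column.
CREATE TABLE dbo.Orders
(
    BusinessUnitID int           NOT NULL,
    OrderID        int           NOT NULL,
    OrderDate      datetime      NOT NULL,
    Amount         decimal(12,2) NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY (BusinessUnitID, OrderID)
);

-- Every proc then takes the unit as a parameter and filters on it:
-- SELECT OrderID, OrderDate, Amount FROM dbo.Orders WHERE BusinessUnitID = @BusinessUnitID;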
June 3, 2008 at 10:36 am
Michael Earl (6/2/2008)
If this is a total re-design, why wouldn't you just create a single database ...
This solution gets my vote. If you have the same code then you must have the same schema for each database. It is a lot easier to execute the same code against the same database, just selecting the data according to which application or user is executing the code. We have done that where I work, merging similar databases into one database with similar but distinct applications "seeing" only the data they are supposed to see. It has worked so well we are actively looking for other candidates for merging.
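One sketch of the "seeing only their data" part, assuming a mapping table from logins to business units (dbo.LoginBusinessUnit and the Orders columns here are hypothetical; substitute whatever mapping and tables you actually keep):

-- Sketch only: each application login is mapped to a business unit, and the view
-- returns only that unit's rows.
CREATE VIEW dbo.OrdersForCaller
AS
SELECT o.BusinessUnitID, o.OrderID, o.OrderDate
FROM dbo.Orders AS o
JOIN dbo.LoginBusinessUnit AS m
    ON  m.BusinessUnitID = o.BusinessUnitID
    AND m.LoginName      = SUSER_SNAME();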
Tomm Carr
--
Version Normal Form -- http://groups.google.com/group/vrdbms
June 3, 2008 at 11:09 am
Tomm Carr (6/3/2008)
Michael Earl (6/2/2008)
If this is a total re-design, why wouldn't you just create a single database ...
This solution gets my vote. If you have the same code then you must have the same schema for each database. It is a lot easier to execute the same code against the same database, just selecting the data according to which application or user is executing the code. We have done that where I work, merging similar databases into one database with similar but distinct applications "seeing" only the data they are supposed to see. It has worked so well we are actively looking for other candidates for merging.
The problem with merging is that some maintainability/availability requirements could be different, forcing you to adapt to the "highest"... I do understand, though, that if those are not compelling reasons then merging is probably the wisest thing to do.
* Noel
June 3, 2008 at 11:43 am
Michael Earl (6/2/2008)
If this is a total re-design, why wouldn't you just create a single database that included composite keys in the tables - one of the key fields indicating the business unit?
If you want to ensure you maintain the same code base and there is not a pressing business need for individual databases, I would lean toward having one database. This would also allow you to design in the flexibility of sharing some of the data across the business units where appropriate. If the company can see the benefit to centralization of some of the database, they may be convinced to centralize everything.
There are some distinct disadvantages to the one-database-to-rule-them-all method.
First, if the database grows enough, or comes under heavy enough load, to require adding servers to it, that's much more difficult to do with one database of commingled data than with multiple databases.
Second, you can run into a situation whereby one company wants you to restore a backup from the prior day (or whatever) because some idiot did something horrible to their data. Not really possible with multiple companies in the same database.
Third, with the data separated, you don't have to worry as much about a proc slipping through, or an SQL injection attack, accidentally exposing one company's data to another company. At least it's not a simple matter of a missing line in a Where clause.
Other, similar situations can be thought of quite easily.
The only real advantage to putting it all in one database is that it's easier to maintain the code. Usually, that's not enough to outweigh the advantages of keeping them split up.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
June 3, 2008 at 11:53 am
I agree with much of that, but this is a case of consolidating business units, not data from several companies (according to the original post). So, the security concerns (at least to me) go way down. This also can take a lot of the "restore part of" pressure off.
It also seems likely to me that there will be at least some shared data, so consolidation makes a lot of sense. I do not think we should have any fear of large databases anymore. SQL Server has gotten to the point where I would almost always prefer one big table over three identical small tables.
If it was several companies, I would tend to split things into individual databases because even the procedure code may need to be a little different. It is hard to get two companies to agree on a date when a patch that changes some code can be applied.
June 3, 2008 at 11:56 am
GSquared (6/3/2008)
Michael Earl (6/2/2008)
If this is a total re-design, why wouldn't you just create a single database that included composite keys in the tables - one of the key fields indicating the business unit?
If you want to ensure you maintain the same code base and there is not a pressing business need for individual databases, I would lean toward having one database. This would also allow you to design in the flexibility of sharing some of the data across the business units where appropriate. If the company can see the benefit to centralization of some of the database, they may be convinced to centralize everything.
There are some distinct disadvantages to the one-database-to-rule-them-all method.
First, if the database grows enough, or comes under heavy enough load, to require adding servers to it, that's much more difficult to do with one database of commingled data than with multiple databases.
Second, you can run into a situation whereby one company wants you to restore a backup from the prior day (or whatever) because some idiot did something horrible to their data. Not really possible with multiple companies in the same database.
Third, with the data separated, you don't have to worry as much about a proc slipping through, or an SQL injection attack, accidentally exposing one company's data to another company. At least it's not a simple matter of a missing line in a Where clause.
Other, similar situations can be thought of quite easily.
The only real advantage to putting it all in one database is that it's easier to maintain the code. Usually, that's not enough to outweigh the advantages of keeping them split up.
Totally agree!!
* Noel