October 20, 2006 at 9:13 am
hi
i need to take data out of one central database and distribute it out to one of a number of databases that hold a subset of the data in the central one. i developed a trigger that does this - but it is not automatically scalable.
i used an insert into databasename.dbo.tablename select values_list from inserted to do this.
i tried writing a dynamic sql statement so i could set the databasename based on some criteria - but it failed miserably.
any ideas? is a trigger the best way to do this transfer of data?
what i have done works - but is there a better way
October 20, 2006 at 9:36 am
IMO A trigger is the worst place.
It holds your transaction open until the copy to the other server is complete. So the question would be: what happens if one of the other servers is offline?
The insert would fail, and data would be lost.
I would recommend either replication, replication using DTS, or some other system where you use a trigger to insert the keys into a table in the same database, then use some other mechanism to push or pull the data to the correct server.
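To illustrate the last option, here's a minimal sketch of the queue-table pattern. All names (dbo.Orders, dbo.TransferQueue, OrderID) are made up for the example; the point is that the trigger only records keys locally, so the original transaction never waits on, or fails because of, an unreachable target.

```sql
-- Hypothetical names throughout. The trigger does cheap, local work only;
-- a scheduled job drains the queue and pushes rows to the right database later.
CREATE TABLE dbo.TransferQueue
(
    OrderID   int      NOT NULL,
    QueuedAt  datetime NOT NULL DEFAULT (GETDATE()),
    Processed bit      NOT NULL DEFAULT (0)
);
GO

CREATE TRIGGER trg_Orders_Queue ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Safe even if the target databases are offline: nothing remote happens here.
    INSERT INTO dbo.TransferQueue (OrderID)
    SELECT OrderID FROM inserted;
END;
GO
```

The job that processes the queue marks rows Processed = 1 only after a successful copy, so an offline target just means the rows wait.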
October 20, 2006 at 10:02 am
Jono,
A trigger is not the kind of solution you should be looking at. DTS or SQL replication is what you want; otherwise your data might be lost (due to non-availability of the second database).
October 20, 2006 at 11:13 pm
i should have been more specific:
the databases are all on the same server. so our sql server has multiple databases - only one of which receives data from remote data via UDP. what we need is to take all the data into the one db and then split it into the correct db. does this change anything?
October 21, 2006 at 9:26 pm
I can't help but wonder why you would have multiple databases on the same server that need the same information. Why aren't these databases all looking at the same data? Not the answer you were looking for, but the question remains
--Jeff Moden
Change is inevitable... Change for the better is not.
October 22, 2006 at 1:23 am
well it is a business decision - we do not own the data - we just store it and render it in html format
October 22, 2006 at 1:39 am
How bloody familiar!
They believe that the best way to isolate data to be displayed to different customers is to have those data in separate databases.
You know what's common between databases and politics? Everybody knows how to do it.
For some reason those "big bosses" don't dare to tell BMW or Mercedes how to make their cars. They trust professionals.
I wonder who or what makes them believe they are the best professionals in database design and security?
_____________
Code for TallyGenerator
October 22, 2006 at 8:51 am
I love it... Spot on, Sergiy.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 22, 2006 at 9:33 am
I've just got to start selling the "T"-Shirts that say...
SELECT *
FROM Users
WHERE Clue > 0
(0 row(s) affected)
Jono,
You should probably be a bit disgusted with "the business decision" but I appreciate what you guys are going through. You mention that it would be a "subset" that gets transferred to the other databases and that it should be "automatically scalable"...
I guess those are the two most important points... what criteria defines a "subset" and what do you mean by "automatically scalable"? Is there a company ID of some form in the data in the central database to identify a "subset"? Does "automatically scalable" mean that when you add a company database, the SQL is smart enough (bit of an oxymoron in this case) to automatically figure out what data it has to transfer to the new database?
--Jeff Moden
Change is inevitable... Change for the better is not.
October 22, 2006 at 3:47 pm
Yes Jeff, that's what it means.
The way we use it (no, I wish we used it this way; some parts are implemented in some freaky ways):
1) Create a VIEW in all targeted databases with a set of fields matching the set of inserted data;
2) Create an INSTEAD OF INSERT trigger on this view which will take all data from the table INSERTED and distribute it over tables according to the data design;
3) Create a table in the "integration" database containing the list of targeted databases and the set of criteria used to select a database;
4) Include in the script checks for targeted database availability;
5) Create a dynamic statement:
SELECT @sql = 'INSERT INTO [' + TargetedDatabase + '].dbo.UploadView SELECT .... FROM <table-mediator> WHERE <your criteria, may be dynamic as well>'
FROM <the table from step 3>
WHERE <specify which data go to this database, e.g. CustomerName = 'Cosa Nostra'>
6) Include this call in a WHILE loop or cursor which will loop through the TargetedDatabase values and execute the script above.
Then, to create a new customer you need just add a new record in the "distributing" table.
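The loop in steps 3-6 above could be sketched like this. It's only a sketch: dbo.TargetDatabases, dbo.UploadStage, and the column names are all hypothetical stand-ins for whatever your config table actually looks like.

```sql
-- Hypothetical config table: dbo.TargetDatabases (TargetedDatabase, CustomerName, IsActive)
DECLARE @db sysname, @sql nvarchar(4000);

DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT TargetedDatabase
    FROM dbo.TargetDatabases
    WHERE IsActive = 1
      -- step 4: skip databases that are not available right now
      AND DATABASEPROPERTYEX(TargetedDatabase, 'Status') = 'ONLINE';

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- step 5: build and run the per-database INSERT, parameterizing the criteria
    SET @sql = N'INSERT INTO ' + QUOTENAME(@db) + N'.dbo.UploadView
                 SELECT ColA, ColB, CustomerName
                 FROM dbo.UploadStage
                 WHERE CustomerName = (SELECT CustomerName
                                       FROM dbo.TargetDatabases
                                       WHERE TargetedDatabase = @db)';
    EXEC sp_executesql @sql, N'@db sysname', @db = @db;

    FETCH NEXT FROM db_cursor INTO @db;
END;

CLOSE db_cursor;
DEALLOCATE db_cursor;
```

Adding a customer then really is just one new row in the config table; nothing in the loop changes.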
_____________
Code for TallyGenerator
October 22, 2006 at 5:57 pm
so here is the set-up at the moment
i have no control over the design of the data storage db - it is set in stone
i have another database where i store all my custom information, including a list of customers and which db they refer to
in the trigger i use an sp that takes a unique piece of data in the insert that allows me to determine the destination db.
i had intended using a dynamically constructed sql string - but it failed every time. i was using something like
select @sql = 'insert '+@dbname+'.dbo.tablename select * from inserted' (need []?)
exec(@sql)
it failed - did not like the inserted part of the string - did not recognise the object
thanks everyone for their comments too!
is there any reason i should NOT use triggers?
October 22, 2006 at 6:09 pm
My approach does not touch the design of the data storage db.
I just add a view with a trigger - the trigger is effectively an SP with the table "inserted" as its parameter.
The table inserted exists only inside the trigger.
To reference it from SPs you are calling from trigger use this:
SELECT *
INTO #inserted
FROM inserted
You may do all RTRIM(LTRIM(...)) for all varchar columns in this statement and set up indexes on #inserted if you expect big sets of data.
_____________
Code for TallyGenerator
October 22, 2006 at 7:43 pm
i had intended using a dynamically constructed sql string - but it failed everytime. i was using something like select @sql = 'insert '+@dbname+'.dbo.tablename select * from inserted' (need []?)
How does it fail? What are some examples of the dbnames and table names you are using? Please give full information when you make a statement like this.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 22, 2006 at 7:58 pm
Table inserted does not exist in the scope of dynamic SQL.
It must be mentioned only inside of trigger itself.
_____________
Code for TallyGenerator
October 22, 2006 at 8:12 pm
Doesn't seem to be "automatically scalable" unless you create the views and the trigger dynamically. Maybe that doesn't matter much... non-dynamic solutions do require some human intervention but they usually run better after they are set up. I do like the idea of a config table and have used them with great success but, again, you don't need it if all these databases are the same as it sounded in the original post... dynamic SQL should be able to do all this.
And no one said whether the central db is updated realtime or by batch... if by batch, then there's no need for triggers at all... add the transfers to the end of the batch.
If it were me, I'd do kinda like you said but a lot simpler... create an updateable passthrough view in each target database and point it back to the central database... the GUI guys will never figure out the difference between an actual table and a view that carries the name of the table they want. Will it create a hot-spot? Not if they write good SQL... oops... that's already not happening with this whole thing.
Could make it so it's automatically scalable by creating the view if it doesn't exist using the loop you were talking about.
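A sketch of that passthrough-view idea, made "automatically scalable" with the same kind of loop. Everything here is hypothetical: CentralDB, dbo.CustomerData as the table the GUI reads, and dbo.TargetDatabases as the list of customer databases.

```sql
-- Run from the central database. For each target database, create a view
-- named like the table the GUI expects, pointing back at the central data.
DECLARE @db sysname, @sql nvarchar(4000);

DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT TargetedDatabase FROM dbo.TargetDatabases;

OPEN dbs;
FETCH NEXT FROM dbs INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- USE inside EXEC only lasts for that dynamic batch; CREATE VIEW must be
    -- the only statement in its batch, hence the nested EXEC.
    SET @sql = N'USE ' + QUOTENAME(@db) + N';
        IF OBJECT_ID(''dbo.CustomerData'', ''V'') IS NULL
            EXEC(''CREATE VIEW dbo.CustomerData AS
                   SELECT * FROM CentralDB.dbo.CustomerData'');';
    EXEC (@sql);
    FETCH NEXT FROM dbs INTO @db;
END;
CLOSE dbs; DEALLOCATE dbs;
```

No data ever moves, so there's nothing to keep in sync; adding a customer database just means the next run of the loop creates its view.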
--Jeff Moden
Change is inevitable... Change for the better is not.